
Tracking Kinect


Kinect SDK Hello World - All

In this tutorial I am using the Kinect for Windows. This Kinect has the model number 1517, which means it is the Windows version: it is licensed for commercial use but sadly doesn't work with other operating systems. You may have the Xbox version (models 1414 and 1473), and these should also work. The first thing to do is to install the Kinect SDK and Development Kit; you can get them from these links: You will also need Visual Studio for this Instructable, so head over and download the VS2015 Community edition. It's free and packed with so many features it will make your mind boggle!


Create Interactive Electronic Instruments with MaxMSP: Xbox Kinect and MaxMSP

The Xbox Kinect is a hackable depth-sensing camera and gesture-tracking device. Since the Kinect was first hacked, it's been a popular choice for tons of interactive projects, and eventually the Max community developed a bunch of ways to pull Kinect information into Max. If you are going to buy a Kinect, be aware that the newer model 1473 does not work with any of the options listed below as of September 2013; the older model 1414 will work. Also make sure it is the one called "Kinect for Xbox". Once you have Synapse doing skeleton tracking, click on a joint in the Kinect-Via-Synapse Max patch to start pulling in info about its location. You can toggle between x, y, and z coordinates, and you can also select the coordinate system you would like to reference: double-click the object labeled "User 1" on the left side of the patch.
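As an illustration of the coordinate-system choice described above, here is a minimal Python sketch. The joint values and the torso-relative convention are hypothetical, for illustration only; they are not Synapse's actual output format.

```python
# Hypothetical sketch of the body-relative vs. world coordinate choice:
# a joint's body-relative position is its world position minus the torso's.
# All numbers below are made up for illustration.

def world_to_body(joint_xyz, torso_xyz):
    """Express a joint's world position relative to the torso."""
    return tuple(j - t for j, t in zip(joint_xyz, torso_xyz))

right_hand = (120.0, 250.0, 1500.0)  # hypothetical world coords (mm)
torso = (100.0, 200.0, 1600.0)

print(world_to_body(right_hand, torso))  # (20.0, 50.0, -100.0)
```

Body-relative coordinates like these are handy when you want a gesture to mean the same thing wherever the user stands in the room.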

Guide to Camera Types for Interactive Installations / Guest post by Blair Neal (@laserpilot)

Choosing the right type of camera for your interactive installation is one of the most important technical choices you can make in your initial planning phases. Making the incorrect choice can really impact how well your installation reacts to its visitors, and it can also impact its ability to perform robustly in a wide range of environments. You can always correct for certain things in software, but the hardware setup can often be the first line of defense against undesired behavior. Whether you're working in Processing, openFrameworks, Max/MSP/Jitter, Quartz Composer, Cinder, VVVV, or really any artistically geared programming environment, your choice of camera can impact your work no matter the software. Some environments will give you more options with different cameras (maybe you need a Blackmagic capture card, or an IP cam, or a DSLR, or a Point Grey Firefly). Questions to consider in the planning phase: Where is it being set up?


Kinect Physics Tutorial for Processing

Jitter version of Animata

Recipe 50: Branching - Cycling 74 Makers of Max Visual Programming Software for Artists, Musicians & Performers

General Principles: Animating OpenGL objects. Using audio to drive motion. Working with jit.path for smooth motion.

Commentary: After the RoboMoves recipe, I wanted to try a slightly more complex structure and work with a simple audio-responsive motion control. The repeated, branching structure of a tree provides the basis for this animation patch, and we use the amplitude of an incoming audio signal to scrub around the animation. While this recipe follows much of the same structure as RoboMoves, the effect is really different.

Ingredients / Technique: First, we'll look at the structure of the animation nodes.
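Outside of Max, the idea of using audio amplitude to scrub an animation can be sketched in a few lines of Python. The smoothing factor here is an arbitrary assumption for illustration; in the patch itself this job is done by objects in the Max signal chain.

```python
# Stand-in for amplitude-driven scrubbing: smooth the incoming peak
# amplitude and use it as a 0..1 position along the animation path.
# The smoothing factor of 0.9 is an arbitrary choice for illustration.

def scrub_position(amplitude, smoothed, factor=0.9):
    """One-pole smoothing of a clamped peak amplitude."""
    clamped = min(max(amplitude, 0.0), 1.0)
    return factor * smoothed + (1.0 - factor) * clamped

pos = 0.0
for amp in [0.0, 0.8, 0.8, 0.1]:  # hypothetical peak-amplitude readings
    pos = scrub_position(amp, pos)
print(round(pos, 4))  # 0.1468
```

The smoothing matters: raw peak amplitude jumps around frame to frame, and feeding it straight into an animation position makes the motion jitter instead of swaying.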

To animate the tree, we use jit.path, just like in RoboMoves, but instead of a looping counter, we have a peakamp~ object driving the eval message.

Kimchi and Chips' blog » Blog Archive » Kinect + Projector experiments

Jpbellona

simpleKinect Application

simpleKinect is an interface application for sending data from the Microsoft Kinect to any OSC-enabled application.
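To give a sense of what "sending data to any OSC-enabled application" means on the wire, here is a stdlib-only Python sketch of building and parsing a raw OSC message carrying one joint's position. The /head address and the values are hypothetical, not simpleKinect's actual address scheme.

```python
import struct

def osc_pad(b):
    """Null-terminate and pad bytes to a multiple of 4, per the OSC spec."""
    return b + b"\x00" * (4 - len(b) % 4)

def build_joint_message(address, x, y, z):
    """Build a raw OSC message carrying one joint's x/y/z as three floats."""
    return (osc_pad(address.encode()) + osc_pad(b",fff")
            + struct.pack(">fff", x, y, z))

def parse_joint_message(data):
    """Parse the message back into (address, (x, y, z))."""
    end = data.index(b"\x00")
    address = data[:end].decode()
    offset = (end // 4 + 1) * 4  # skip the padded address string
    offset += 8                  # skip b",fff" padded to 8 bytes
    return address, struct.unpack(">fff", data[offset:offset + 12])

msg = build_joint_message("/head", 0.1, 0.2, 1.5)
addr, (x, y, z) = parse_joint_message(msg)
print(addr, round(x, 3), round(y, 3), round(z, 3))  # /head 0.1 0.2 1.5
```

In practice you would hand the raw bytes to a UDP socket rather than parse them yourself; the point is that any OSC-enabled host sees a typed address-plus-arguments message like this one.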


The application attempts to improve upon similar software by offering more OpenNI features and more user control. The interface was built with Processing, utilizing the controlP5, oscP5, and simple-openni libraries.

Kinect via OSCeleton Max interface

Frequently Asked Questions

What is NI mate?

NI (Natural Interaction) mate is a small but powerful application that takes real-time motion-capture data from an OpenNI-compliant device such as the Microsoft Kinect, Asus Xtion or PrimeSense Carmine and turns it into two industry-standard protocols: OSC (Open Sound Control) and MIDI (Musical Instrument Digital Interface). Available for Windows, Mac OS X and Ubuntu Linux, NI mate offers easy installation and a powerful yet user-friendly configuration interface. Keeping to standard protocols for output makes NI mate an extremely flexible piece of software that can be applied in a vast number of scenarios.
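Since NI mate's whole job is turning motion data into OSC and MIDI, the core of such a mapping can be sketched as a simple scaling step. The assumption here (made up for illustration) is that a joint coordinate arrives as a float in a known 0..1 range.

```python
def osc_to_midi_cc(value, lo=0.0, hi=1.0):
    """Clamp an incoming float and scale it to the 0..127 MIDI CC range."""
    clamped = min(max(value, lo), hi)
    return round((clamped - lo) / (hi - lo) * 127)

# hypothetical normalized joint height -> MIDI controller value
print(osc_to_midi_cc(0.0), osc_to_midi_cc(0.5), osc_to_midi_cc(1.0))  # 0 64 127
```

Clamping before scaling is the important detail: tracking data occasionally spikes outside its nominal range, and MIDI controller values must stay within 0 to 127.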

You can read more about the software here.

Where can I find documentation for NI mate?

We are in the process of creating proper documentation for NI mate and all officially supported add-ons.

How can I have my licence email re-sent?

You can have your licence email re-sent by going to the re-send licence email page.

Feature requests and official add-ons?

Kinect+projector.v4p

OpenFrameworks