Synapse for Kinect Introduction. Synapse for Kinect Quartz Composer Tutorial. Synapse for Kinect.
SYNAPSE for Kinect. Update: There’s some newer Kinect hardware out there, “Kinect for Windows”.
This hardware is slightly different and doesn’t work with Synapse. Be careful when purchasing: Synapse only supports “Kinect for Xbox”. Update to the update: There appears to also be newer “Kinect for Xbox” hardware out there. Model 1414 Kinects work with Synapse, but I’m getting reports that the newer 1473 models do not. Update the third: Synapse doesn’t work on Windows 8, sorry.
Synapse is an app for Mac and Windows that lets you easily use your Kinect to control Ableton Live, Quartz Composer, Max/MSP/Jitter, and any other application that can receive OSC events.
Kinect hack for easy 3D modeling - exciting potential in le
Kinecting the dots. Kinecting the dots part 2.
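Synapse, described above, talks to Ableton Live, Quartz Composer, and Max/MSP/Jitter by broadcasting OSC events. As a hedged sketch of what is actually on the wire, here is a minimal OSC 1.0 message encoder/decoder in pure Python; the address “/righthand_pos_body” and its three-float payload are an assumed example for illustration, not a documented Synapse message:

```python
# Minimal OSC 1.0 message round trip in pure stdlib Python.
# OSC strings are null-terminated and padded to a multiple of 4 bytes;
# float arguments are big-endian 32-bit floats.
import struct

def pad4(b: bytes) -> bytes:
    """Null-terminate and pad a string to a multiple of 4 bytes."""
    return b + b"\x00" * (4 - len(b) % 4 if len(b) % 4 else 4)

def encode_message(address: str, *floats: float) -> bytes:
    tags = "," + "f" * len(floats)  # type tag string: one 'f' per float
    return (pad4(address.encode()) + pad4(tags.encode())
            + b"".join(struct.pack(">f", f) for f in floats))

def decode_message(data: bytes):
    def read_string(offset: int):
        end = data.index(b"\x00", offset)
        n = end - offset  # string length; advance past padding too
        return data[offset:end].decode(), offset + (n // 4 + 1) * 4

    address, off = read_string(0)
    tags, off = read_string(off)
    values = []
    for tag in tags.lstrip(","):
        if tag == "f":
            values.append(struct.unpack(">f", data[off:off + 4])[0])
            off += 4
    return address, values

# Assumed example: a hand-position event carrying x, y, z floats.
packet = encode_message("/righthand_pos_body", 0.12, -0.3, 1.85)
addr, xyz = decode_message(packet)
print(addr, [round(v, 2) for v in xyz])  # /righthand_pos_body [0.12, -0.3, 1.85]
```

Real projects would use an OSC library or the built-in OSC support in Max/MSP and Quartz Composer; the point here is only that the format is simple enough that any application can consume it.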
Scan your world in 3D with MatterPort.
Truebones Motions Free BVH files and Character Animation Software.
6D SLAM with RGB-D Data from Kinect.
How Motion Detection Works in Xbox Kinect. The prototype for Microsoft’s Kinect camera and microphone famously cost $30,000.
At midnight Thursday morning, you’ll be able to buy it for $150 as an Xbox 360 peripheral. Microsoft is projecting that it will sell 5 million units between now and Christmas. We’ll have more details and a review of the system soon, but for now it’s worth taking some time to think about how it all works. Kinect’s camera is powered by both hardware and software, and it does two things: generate a three-dimensional (moving) image of the objects in its field of view, and recognize (moving) human beings among those objects. Older software programs used differences in color and texture to distinguish objects from their backgrounds. One way to measure depth directly is time-of-flight, which works like sonar: if you know how long the light takes to return, you know how far away an object is. Using an infrared generator also partially solves the problem of ambient light.
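The time-of-flight principle described above reduces to one line of arithmetic: the pulse covers the distance twice, so range is the speed of light times the round-trip time, halved. A minimal sketch (the pulse timing below is illustrative, not an actual Kinect measurement):

```python
# Time-of-flight ranging as described above: measure how long an emitted
# light pulse takes to bounce back, then halve the round trip.
C = 299_792_458.0  # speed of light in m/s

def distance_from_round_trip(t_seconds: float) -> float:
    """Distance to the reflecting object; the pulse covers the path twice."""
    return C * t_seconds / 2.0

# A pulse returning after ~13.34 nanoseconds bounced off something ~2 m away.
print(round(distance_from_round_trip(13.342e-9), 3))  # 2.0
```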
PrimeSense and Kinect go one step further and encode information in the near-IR light.
FaceCube: Copy Real Life with a Kinect and 3D Printer. This project is a tangent off of something cool I’ve been hacking on in small pieces over the last few months.
I probably would not have gone down this tangent had it not been for the recent publication of Fabricate Yourself. Nothing irks (and inspires) me more than when someone does something cool and then releases only a description and pictures of it. Thus, I’ve written FaceCube, my own open source take on automatic creation of solid models of real life objects using the libfreenect Python wrapper, pygame, NumPy, MeshLab, and OpenSCAD.
FaceCube: Copy Real Life with a Kinect and 3D Printer by nrp. The process is currently multi-step, but I hope to have it down to one button press in the future.
First, run facecube.py, which brings up a psychedelic preview image showing the closest 10 cm of stuff to the Kinect. Use the up and down arrow keys to adjust that distance threshold. Pressing spacebar toggles pausing capture, to make it easier to pick objects. Click on an object in the preview to segment it out. Everything else will disappear; clicking elsewhere will clear the choice. You can then open the PLY file in MeshLab to turn it into a solid STL. From there, open the STL in OpenSCAD or Blender and scale and modify it to your heart’s (or printer’s) content. Since all of the cool kids are apparently doing it, I’ve put this stuff into a GitHub repository: git clone email@example.com:nrpatel/FaceCube.git.
Kinect, OSC, OpenFrameworks, Visuals.
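The near-plane segmentation facecube.py performs above (keeping only the closest 10 cm of the depth image) comes down to masking a depth array. A hedged sketch with NumPy, using a synthetic frame in place of libfreenect’s live depth data; the function name and the millimeter units are assumptions for illustration:

```python
# The depth-window step described above: keep only pixels whose depth falls
# inside a 10 cm band nearest the camera, zeroing everything else.
import numpy as np

def threshold_depth(depth_mm, near_mm, window_mm=100):
    """Mask out everything outside [near_mm, near_mm + window_mm)."""
    mask = (depth_mm >= near_mm) & (depth_mm < near_mm + window_mm)
    return np.where(mask, depth_mm, 0), mask  # 0 conventionally marks "no data"

# Synthetic 4x4 depth frame in millimeters.
frame = np.array([[500, 620, 900, 560],
                  [555, 700, 580, 1200],
                  [540, 590, 610, 650],
                  [2000, 505, 595, 480]])
kept, mask = threshold_depth(frame, near_mm=500)
print(mask.sum())  # 8: eight pixels lie within the 10 cm window
```

In the real capture loop the same mask would be applied to each incoming depth frame, and the surviving pixels exported as a point cloud like the PLY file mentioned above.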
Open Frameworks + Kinect + Sound. While attending FITC San Francisco, I saw Theo Watson talk about his work with the creative coding library known as openFrameworks. While I didn’t have much experience in C++ beyond a couple of simple test apps written to learn OpenGL, I thought it would be a good experience to learn a language completely different from ActionScript. Below is the result I got after a very productive day of tinkering: Kinect Sound Experiment with Open Frameworks from Ben McChesney on Vimeo. What is openFrameworks? The most accurate term I’ve heard to describe it is “Processing on crack”.
Download and install Xcode for Mac. Download openFrameworks FAT 0061 and unzip it into your work folder (though it will work anywhere). Compile any example project under openframeworksFolder/apps/examples/ to make things easier for yourself and to get developing quickly.
OpenFrameworks. 2 Kinects 1 Box. Software Release and Kinect Camera Calibration. Software Release (and a little surprise). Oliver Kreylos’ Research and Development Homepage - Kinect Hacking. 3D Video Capture with Kinect. Diarmuidwrenne. Kinect Point Cloud Viewer in 3D. iPhone 4 vs Kinect controller?