# Blender

The good folks at ZigFu have created a great way for people to set up everything they need to start using their Kinect right away. The package installs OpenNI, NITE, and SensorKinect with the click of a button. This will definitely come in handy if you're new to the game or have to configure multiple workstations in a short amount of time. They have also developed a Unity package for gaming, as well as their own Portal that you can use on your desktop.


Turn Kinect into a 3D Scanner, Explained: Full Tutorial with Code — posted by David Leonard on February 2, 2011. The Kinect is here, and thousands around the world have been hacking the game cam. One of the most exciting capabilities the device offers is its ability to display depth information on the screen. Open Frameworks + Kinect + Sound: While attending FITC San Francisco I saw Theo Watson talk about his work with the creative coding library known as openFrameworks. While I didn't have much experience in C++ beyond a couple of simple test apps written to learn OpenGL, I thought it would be a good experience to learn a language completely different from ActionScript. Below is the result I got after a very productive day of tinkering:
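Displaying depth information on screen usually comes down to mapping raw depth readings onto pixel brightness. A minimal sketch in Python with NumPy — the 11-bit (0–2047) range is the raw depth convention used by libfreenect-style drivers, and the tiny test frame is made up for illustration:

```python
import numpy as np

def depth_to_gray(depth_raw):
    """Map raw 11-bit Kinect depth values (0-2047) to 8-bit grayscale,
    drawing nearer surfaces brighter and farther surfaces darker."""
    scaled = (depth_raw.astype(np.float32) / 2047.0) * 255.0
    return (255 - scaled).astype(np.uint8)

# A hypothetical one-row "depth frame": nearest, middle, farthest.
frame = np.array([[0, 1023, 2047]])
print(depth_to_gray(frame))  # → [[255 127   0]]
```

A real viewer would feed each grayscale frame to a window or texture every tick; the mapping itself is the whole trick.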

Install FAAST on your PC for full-body control and VR applications. Disclaimer: this comprehensive guide to FAAST installation was taken from the Institute for Creative Technologies website.

Processing - V // Pixelnerve: I've made an example application for this thread showing how to compute and evaluate a cubic Bézier curve using a geometry shader. The formula is pretty straightforward, as described in the Wikipedia article (look for "cubic Bézier curve"). I will not go over the Bézier math or theory. I assume you have some knowledge of shader programming (GLSL in this case); some math background would help, though it isn't strictly necessary. All that said, let's get to work. In this case we need 4 points: 2 anchor points (the line endpoints) and 2 control points.
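The cubic Bézier formula the post refers to can be sketched CPU-side in plain Python; this illustrates the same evaluation the geometry shader would perform per vertex (the point values here are made up for illustration):

```python
def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve at parameter t in [0, 1].

    p0 and p3 are the anchor (end) points; p1 and p2 are the control points.
    B(t) = (1-t)^3*p0 + 3(1-t)^2*t*p1 + 3(1-t)*t^2*p2 + t^3*p3
    """
    u = 1.0 - t
    return tuple(
        u**3 * a + 3 * u**2 * t * b + 3 * u * t**2 * c + t**3 * d
        for a, b, c, d in zip(p0, p1, p2, p3)
    )

# Sample the curve at 11 evenly spaced parameter values (hypothetical 2D points).
anchor0, ctrl0, ctrl1, anchor1 = (0, 0), (1, 2), (3, 2), (4, 0)
points = [cubic_bezier(anchor0, ctrl0, ctrl1, anchor1, i / 10) for i in range(11)]
print(points[0], points[5], points[10])  # → (0.0, 0.0) (2.0, 1.5) (4.0, 0.0)
```

In the shader version, the same polynomial runs on the GPU: the geometry shader receives the 4 points and emits a strip of sampled vertices along the curve.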

Kinect to STL sketch for Processing, by johngomm: Here's my Processing sketch to interface with the Kinect, capture the depth data, and render it as a solid STL file. I've included controls to adjust two thresholds, near and far, which lets you set up a "Han Solo in carbonite" type effect. I am not going to hold your hand through setting up Processing, and this write-up is still in progress, so if you get frustrated, realize that this might not be for you yet.

FaceCube: Copy Real Life with a Kinect and 3D Printer, by nrp: The process is currently multi-step, but I hope to have it down to one button press in the future. First, run facecube.py, which brings up a psychedelic preview image showing the closest 10 cm of stuff in front of the Kinect. Use the up and down arrow keys to adjust that distance threshold. Pressing the spacebar pauses and resumes capture, making it easier to pick objects. Click on an object in the preview to segment it out.
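The near/far thresholding step both projects describe can be sketched with NumPy. This is a stand-in for a real Kinect depth frame — the array and the millimeter thresholds are synthetic, and a real sketch would run this per frame:

```python
import numpy as np

def threshold_depth(depth_mm, near_mm, far_mm):
    """Keep only pixels whose depth lies between the near and far planes.

    Pixels outside the band are zeroed out, which is essentially how the
    "Han Solo in carbonite" slab gets carved out of a depth frame before
    it is turned into STL geometry.
    """
    mask = (depth_mm >= near_mm) & (depth_mm <= far_mm)
    return np.where(mask, depth_mm, 0)

# A tiny synthetic 2x3 "depth frame" in millimeters.
frame = np.array([[500, 800, 1200],
                  [900, 2000, 700]])
print(threshold_depth(frame, 600, 1000))
# → [[  0 800   0]
#    [900   0 700]]
```

Everything that survives the mask becomes surface; the zeroed pixels are dropped (or clamped to a backing plane) when the mesh is generated.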

Flexible Action and Articulated Skeleton Toolkit (FAAST). Contributors: Evan A. Suma, Belinda Lange, Skip Rizzo, David Krum, and Mark Bolas. Project email address: faast@ict.usc.edu. Downloads: 32-bit (recommended for most users), 64-bit (for advanced users).

Jean-Christophe Naour — Interaction Design — About: I'm a French interaction designer based in Seoul, South Korea. Over the past years I've been developing interaction concepts, interfaces, programs, and motion design for mobile phones, MP3 players, television, GPS, installations, etc. My scope is continually expanding in an effort to, ultimately, go beyond the screen.

FaceCube: Copy Real Life with a Kinect and 3D Printer. This project is a tangent off of something cool I've been hacking on in small pieces over the last few months. I probably would not have gone down this tangent had it not been for the recent publication of Fabricate Yourself. Nothing irks me more than when someone does something cool and then releases only a description and pictures of it. Thus, I've written FaceCube, my own open-source take on the automatic creation of solid models of real-life objects, using the libfreenect Python wrapper, pygame, NumPy, MeshLab, and OpenSCAD.

Kinect Open Source Programming Secrets: Kinect Open Source Programming Secrets (KOPS) is the only book that explains the official Java wrappers for OpenNI and NITE. (If you want installation instructions, scroll down this page a little.) The main drawback of using the PrimeSense Java wrappers is their lack of documentation. As I explain in chapter 1, I had to decompile the libraries' JAR files and work out the correspondences between the Java source and the somewhat better documented C++ OpenNI/NITE APIs. This is why including "secrets" in the book's title isn't too excessive :). This book covers programming topics not found elsewhere.

Kinect OpenNI-NITE ofxOpenNI Skeleton with openFrameworks on Linux: At last, after some setting up, we got the NITE library accessible from openFrameworks on Linux, thanks to the work of the independent community. To make it work, we followed these instructions closely to install the OpenNI libs, PrimeSense's Sensor drivers, and the NITE library. We hit the "InitFromXml failed: Can't create any node of the requested type!" error reported by others, but it was solved after rebooting, deleting the three folders (OpenNI, Sensor, and NITE), and repeating everything.

Kinect Experiments — an overview of available Kinect libraries and drivers: Over the last two years the Kinect camera has become an influential tool in a variety of situations, ranging from interactive installations to virtual user interfaces. Hence we've been following the development of different Kinect software solutions from the beginning and comparing the different processing libraries.

How Motion Detection Works in Xbox Kinect: The prototype for Microsoft's Kinect camera and microphone famously cost \$30,000. At midnight Thursday morning, you'll be able to buy it for \$150 as an Xbox 360 peripheral. Microsoft is projecting that it will sell 5 million units between now and Christmas. We'll have more details and a review of the system soon, but for now it's worth taking some time to think about how it all works.

Kinect & HTML5 using WebSockets and Canvas — Vangos Pterneas blog: Kinect defined Natural User Interaction; HTML5 redefined the Web. Currently, there are various tutorials describing how to interact with a Kinect sensor using Windows Forms or WPF for the user interface. But what about using a web interface for handling Kinect data? Trying to combine these two hot, cutting-edge technologies, I came up with a neat, open-source solution, which I am going to describe in this blog post.
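The bridge pattern the post describes — Kinect data pushed to an HTML5 canvas over WebSockets — hinges on serializing joint positions into JSON messages that the browser can parse and draw. A minimal server-side sketch in Python; the joint names, the `"skeleton"` message type, and the coordinates are made-up stand-ins, not the post's actual wire format:

```python
import json

def skeleton_to_json(joints):
    """Serialize a dict of joint name -> (x, y, z) tuples into a compact
    JSON message suitable for pushing over a WebSocket to a canvas client."""
    payload = {
        "type": "skeleton",
        "joints": [
            {"name": name, "x": x, "y": y, "z": z}
            for name, (x, y, z) in sorted(joints.items())
        ],
    }
    return json.dumps(payload, separators=(",", ":"))

# Hypothetical tracked joints (meters, in sensor space).
msg = skeleton_to_json({"head": (0.1, 0.9, 2.0), "hand_right": (0.4, 0.5, 1.8)})
print(msg)
```

On the browser side, an `onmessage` handler would `JSON.parse` each frame and plot the joints on a `<canvas>`; the serialization above is the glue between the two worlds.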
