Nicolas Burrus Homepage - Using the nestk library Developing your own software based on the nestk library Nestk is the core library used by the demo programs within RGBDemo. It is designed to be easy to integrate into existing CMake-based software and to provide quick access to the Kinect features. The library is built on top of OpenCV, with Qt for the graphical parts.
Working with Kinect 3D Sensor in Robotics - Setup, Tutorials, Applications - Into Robotics A revolutionary 3D sensor from the gaming industry, designed to capture the motion of players, is now used effectively in robotics for a wide range of applications, including object recognition and tracking, 3D environment mapping, distance detection, and voice recognition and control. All these features make the Kinect the subject of this article, which collects a set of setup and application tutorials. Below you will find setup tutorials for different operating systems and versions, including Windows, Linux and Mac. The Kinect features listed above are used in many robotic applications, and a series of tutorials on how to use them follows.
Use Kinect with Mac OSX In this article I will show how you can use the Microsoft Kinect for Xbox 360 on Mac OSX. In this way you can create applications that interact with the body's movements. Introduction This article is intended for people with solid experience in information technology, both as developers and as systems engineers, especially on Unix systems. The installation of the drivers can be a little tricky, especially if something does not work the first time. Be warned: there are some commands to run in the terminal, and I take no responsibility if these commands (or connecting the Kinect) damage your Mac.
Coordinate Spaces Kinect for Windows 1.5, 1.6, 1.7, 1.8 A Kinect streams out color, depth, and skeleton data one frame at a time. This section briefly describes the coordinate spaces for each data type and the API support for transforming data from one space to another. Each frame, the color sensor captures a color image of everything visible in its field of view.

Kinect Depth vs. Actual Distance The depth array that comes in from the Kinect has a precision of 11 bits, i.e. 2048 different values. The Kinect is just a sensor, or measurement tool, and every scientist wants to know the limitations of their equipment. So the obvious questions are: what is the accuracy of these measurements, and what do these values represent in physical space? I conducted a very simple experiment to answer these questions.
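The mapping from raw 11-bit readings to physical distance that such an experiment probes is often approximated with an empirical tangent fit derived by the OpenKinect community (Stéphane Magnenat's formula). A minimal sketch, assuming that fit and the convention that raw value 2047 marks an invalid reading:

```python
import math

def depth_to_meters(raw_depth: int) -> float:
    """Convert an 11-bit Kinect raw depth reading to meters.

    Uses the empirical tangent fit popularized by the OpenKinect
    community; raw value 2047 (2**11 - 1) conventionally means
    'no reading', so it maps to NaN here.
    """
    if not 0 <= raw_depth < 2047:
        return float("nan")
    return 0.1236 * math.tan(raw_depth / 2842.5 + 1.1863)

# Raw values grow with distance: larger readings map to farther points.
print(round(depth_to_meters(658), 2))   # roughly 0.8 m
print(round(depth_to_meters(1000), 2))  # roughly 3.8 m
```

Note how strongly nonlinear the curve is: the sensor's depth resolution is much finer close to the camera than several meters away, which is exactly the kind of limitation the experiment above sets out to measure.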
Getting started with Kinect on Ubuntu 12.04 – OpenNI, NITE, SimpleOpenNI and Processing The objectives of this tutorial are: 1) to install OpenNI and NITE (the drivers that help get data from the Kinect) on Ubuntu 12.04; 2) to set up Processing and SimpleOpenNI (an OpenNI and NITE wrapper for Processing), with which you can get started with Kinect coding. What do you need? A Kinect with its USB cable and a computer with Ubuntu installed.

How to use Quartz Composer, Synapse & Xbox Kinect on your Mac If you’re looking to kickstart your Kinect programming and create some magic on the Mac, then this is the place to be. In this tutorial we use Synapse, Quartz Composer and the Kinect sensor to create a cool motion-activated particle effect that lets you move an animation around your screen using only your hands. Please note that this tutorial was completed on Mac OS X 10.8 (Mountain Lion). It is also important that your Xbox Kinect be model #1414.
Getting started with Microsoft Kinect SDK At this point we could start with something complicated, but as with all things, simple is better - at first at least. So rather than digging into the details of the depth field and body-skeletonization methods, let's just make use of the video camera. It may not be as exciting, but it does illustrate the general idea of using the SDK. Every Kinect program starts in the same general way. First you need to retrieve the collection of Kinects connected to the machine - since beta 2 there can be more than one. To do this we use the static object Runtime and its Kinects collection.
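As a hedged sketch of what that first step looks like in the C# beta SDK the article describes (names follow the Kinect for Windows SDK beta 2 API; a plugged-in sensor and a reference to Microsoft.Research.Kinect.dll are assumed):

```csharp
using Microsoft.Research.Kinect.Nui;

// Grab the first sensor from the static Kinects collection
// (since beta 2 there may be more than one attached).
Runtime nui = Runtime.Kinects[0];

// We only want the video camera for this simple example.
nui.Initialize(RuntimeOptions.UseColor);
nui.VideoStream.Open(ImageStreamType.Video, 2,
                     ImageResolution.Resolution640x480,
                     ImageType.Color);
```

From here the usual pattern is to subscribe to the runtime's frame-ready event and copy each frame's bits into a displayable bitmap, which is what the rest of the tutorial builds up.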
The Kinect Thread Hey there, I just came across vvvv, downloaded it and looked at some of the video tutorials. My interest in vvvv is primarily in using it for my VJ shows.

Setting up Kinect for programming in Linux (part 2) Last month we had a look at how to set up the Kinect on a Linux (Ubuntu) machine (if you missed it, you can find it here). Today we will continue exploring the Kinect's possibilities by using the device in a program written in Qt. We will first design a class that will be our interface between the OpenKinect/NITE libraries and Qt, then instantiate it in a little example program. A word on the Kinect API we'll use: OpenKinect offers a standard C++ API to operate the device.
Setup Microsoft Kinect on Mac OS X 10.9 (Mavericks) If you want to get the Microsoft Kinect setup and working on your Mac using OS X 10.9 Mavericks, then you’ve come to the right place. Since posting the first tutorial, a number of new software updates have been released, so it’s a good idea to recap from the start. This tutorial will detail all the steps necessary to get the Kinect working in Mavericks, so buckle up and let’s get this party started.