Main Page
Welcome to the OpenKinect project. OpenKinect is an open community of people interested in making use of the amazing Xbox Kinect hardware with our PCs and other devices. We are working on free, open source libraries that will enable the Kinect to be used with Windows, Linux, and Mac. The OpenKinect community consists of over 2000 members contributing their time and code to the project. Our members have joined this project with the mission of creating the best possible suite of applications for the Kinect.

http://openkinect.org/wiki/Main_Page

Nicolas Burrus Homepage - Using the nestk library Developing your own software based on the nestk library Nestk is the core library used by the demo programs within RGB Demo. It is designed to be easy to integrate into existing CMake-based software and to provide access to the Kinect features quickly. The library is built on top of OpenCV and Qt for the graphical parts.

Parallel Programming and Computing Platform What is CUDA? Enroll today! Intro to Parallel Programming An open, online course from Udacity Instructors: Dr.

Working with Kinect 3D Sensor in Robotics - Setup, Tutorials, Applications - Into Robotics A revolutionary 3D sensor from the gaming industry, designed to capture the motion of players, is now used effectively in robotics for a wide range of applications, including object recognition and tracking, 3D environment mapping, distance detection, and voice recognition and control. All of these features make the Kinect the subject of this article, which collects a set of setup and application tutorials. Below you will find setup tutorials for different operating systems and versions, including Windows, Linux, and Mac. All of the Kinect features listed above are used in various robotic applications, and a series of tutorials on how to use them follows.

Coordinate Spaces Kinect for Windows 1.5, 1.6, 1.7, 1.8 A Kinect streams out color, depth, and skeleton data one frame at a time. This section briefly describes the coordinate spaces for each data type and the API support for transforming data from one space to another. Each frame, the color sensor captures a color image of everything visible in its field of view.

Kinect Depth vs. Actual Distance The depth array that comes in from the Kinect has a precision of up to 11 bits, or 2,048 different values. The Kinect is just a sensor, or measurement tool, and every scientist wants to know the limitations of their equipment. So the obvious questions are: what is the accuracy of these measurements, and what do these values represent in physical space? I conducted a very simple experiment to answer these questions.
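To make the second question concrete: the raw 11-bit values are not metres, and they are usually converted to physical distance with an empirical formula. The sketch below uses a commonly cited approximation from the OpenKinect community; it is not the author's experimental result, and the exact coefficients vary per sensor and calibration.

```cpp
#include <cmath>

// Rough conversion from an 11-bit Kinect raw depth value (0..2047) to a
// distance in metres, using an approximation popularised in the OpenKinect
// community. Coefficients are indicative only; individual sensors differ.
double rawDepthToMeters(int rawDepth)
{
    if (rawDepth >= 2047)                 // 2047 is commonly a "no reading" marker
        return 0.0;
    return 0.1236 * std::tan(rawDepth / 2842.5 + 1.1863);
}
```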

Getting started with Kinect on Ubuntu 12.04 – OpenNI, Nite, SimpleOpenNI and Processing The objective of this tutorial is: 1) to install OpenNI and Nite (the drivers that help get data from the Kinect) on Ubuntu 12.04, and 2) to set up Processing and SimpleOpenNI (an OpenNI and Nite wrapper for Processing), with which you can get started with Kinect coding. What do you need? A Kinect with a USB cable and a computer with Ubuntu installed.

Getting started with Microsoft Kinect SDK At this point we could start with something complicated, but as with all things, simple is better - at first at least. So rather than digging into the details of the depth field and body skeletonization methods, let's just make use of the video camera. It may not be as exciting, but it does indicate the general idea of using the SDK. Every Kinect program starts in the same general way. First you need to retrieve the collection of Kinects connected to the machine - since beta 2 there can be more than one. To do this we use the static Runtime object and its Kinects collection.
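The Ubuntu excerpt above stops before any code, so here is a minimal sketch (not taken from the tutorial) of reading depth frames directly through the OpenNI 1.x C++ API, the same layer that SimpleOpenNI wraps for Processing. Initialisation details vary between OpenNI releases, and error handling is reduced to bare status checks.

```cpp
#include <XnCppWrapper.h>
#include <cstdio>

int main()
{
    xn::Context context;
    if (context.Init() != XN_STATUS_OK)
        return 1;

    xn::DepthGenerator depth;
    if (depth.Create(context) != XN_STATUS_OK)
        return 1;

    context.StartGeneratingAll();

    for (int i = 0; i < 30; ++i)          // grab a handful of frames
    {
        context.WaitOneUpdateAll(depth);  // block until a new depth frame is ready

        xn::DepthMetaData meta;
        depth.GetMetaData(meta);
        std::printf("frame %d: %ux%u, centre depth %u mm\n",
                    i, (unsigned)meta.XRes(), (unsigned)meta.YRes(),
                    (unsigned)meta(meta.XRes() / 2, meta.YRes() / 2));
    }

    context.Shutdown();
    return 0;
}
```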

The Kinect Thread Hey there, I just came across vvvv, downloaded it and looked at some of the video tutorials. My interest in vvvv is primarily for using it in my VJ shows.

Setting up Kinect for programming in Linux (part 2) Last month, we had a look at how to set up the Kinect on a Linux (Ubuntu) machine (if you missed it, you can find it here). Today, we will continue exploring the Kinect's possibilities by using the device in a program written in Qt. We will first design a class that will be our interface between the OpenKinect/NITE libraries and Qt, then instantiate it in a little example program. A word on the Kinect API we'll use: OpenKinect. OpenKinect offers a standard C++ API to operate the device.
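As a rough illustration of what such an interface class can look like, here is a minimal sketch of a QObject wrapping libfreenect (the OpenKinect C API) that emits a Qt signal whenever a depth frame arrives. The class and member names are illustrative, not taken from the tutorial, which may structure things differently.

```cpp
#include <QObject>
#include <QVector>
#include <cstring>
#include <libfreenect.h>

class KinectBridge : public QObject
{
    Q_OBJECT
public:
    // Open the first Kinect and start streaming 11-bit depth frames.
    bool open()
    {
        if (freenect_init(&m_ctx, nullptr) < 0) return false;
        if (freenect_open_device(m_ctx, &m_dev, 0) < 0) return false;
        freenect_set_user(m_dev, this);   // lets the C callback find this object
        freenect_set_depth_mode(m_dev,
            freenect_find_depth_mode(FREENECT_RESOLUTION_MEDIUM, FREENECT_DEPTH_11BIT));
        freenect_set_depth_callback(m_dev, &KinectBridge::depthCallback);
        return freenect_start_depth(m_dev) >= 0;
    }

    // Call regularly (e.g. from a QTimer) so libfreenect can service USB
    // events and invoke the depth callback.
    void pump() { freenect_process_events(m_ctx); }

signals:
    void depthFrame(QVector<quint16> frame);

private:
    static void depthCallback(freenect_device *dev, void *data, uint32_t /*timestamp*/)
    {
        KinectBridge *self = static_cast<KinectBridge *>(freenect_get_user(dev));
        QVector<quint16> frame(640 * 480);          // medium-resolution depth map
        std::memcpy(frame.data(), data, frame.size() * sizeof(quint16));
        emit self->depthFrame(frame);
    }

    freenect_context *m_ctx = nullptr;
    freenect_device  *m_dev = nullptr;
};
```

In a real program the pump() call would typically be driven from a QTimer or a worker thread, and the depthFrame signal connected to whatever widget renders or processes the data.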

How-to: Benefit from Kinect.Toolbox and Coding4Fun on Kinect Programming Introduction Since July 2011, the publishing date of the beta version of the Kinect SDK, the number of programmers, students and fans interested in this new technology has kept growing, and so has the development of tools and APIs that make Kinect programming much easier; the most widely used APIs for Kinect programming are Kinect.Toolbox and Coding4Fun. How to use these two APIs is what we will see and study in this article. Background: C#, Visual Studio 2010, a Kinect device, and the Kinect SDK (download and install the Kinect SDK Beta). Step 0: Download and install the APIs.

Nicolas Burrus Homepage - Kinect Calibration Calibrating the depth and color camera Here is a preliminary semi-automatic way to calibrate the Kinect depth sensor and the RGB output to enable a mapping between them. You can see some results there.
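The output of such a calibration is typically a set of pinhole intrinsics for the depth and colour cameras plus a rotation and translation between them. The sketch below shows the usual way those parameters are applied to map a depth pixel onto the colour image; the structure and variable names are illustrative, not the article's code, and lens distortion is ignored.

```cpp
// Pinhole intrinsics for one camera; real values come from calibration.
struct Intrinsics { double fx, fy, cx, cy; };

// Map a depth pixel (u, v) with metric depth z onto the colour image, given
// the depth and colour intrinsics and the rigid transform (R, t) that takes
// depth-camera coordinates into colour-camera coordinates.
void depthPixelToColorPixel(double u, double v, double z,
                            const Intrinsics &depthK, const Intrinsics &colorK,
                            const double R[3][3], const double t[3],
                            double &uc, double &vc)
{
    // Back-project the depth pixel to a 3D point in the depth camera frame.
    double x = (u - depthK.cx) * z / depthK.fx;
    double y = (v - depthK.cy) * z / depthK.fy;

    // Rigid transform into the colour camera frame.
    double xc = R[0][0]*x + R[0][1]*y + R[0][2]*z + t[0];
    double yc = R[1][0]*x + R[1][1]*y + R[1][2]*z + t[1];
    double zc = R[2][0]*x + R[2][1]*y + R[2][2]*z + t[2];

    // Project with the colour camera intrinsics.
    uc = colorK.fx * xc / zc + colorK.cx;
    vc = colorK.fy * yc / zc + colorK.cy;
}
```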

Setting up Kinect for programming in Linux (part 1) The Kinect, a Microsoft device originally made for the Xbox 360 (a gaming console), has become incredibly popular among developers in the past few months; it allows easy tracking of predefined movements without having to wear special clothes or complicated sensors. Once you're set up, you will be able to access features such as hand tracking, scene analysis (counting how many people are in the room and where they are), and much more. The first part of this tutorial will guide you through all the required steps to set up the Kinect in your Ubuntu environment. The Kinect at a glance: the device, with its two cameras, and the IR projector on the left. How does it work?
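Once libfreenect is installed, a quick way to confirm the machine actually sees the Kinect before moving on is a tiny probe like the one below (a minimal sketch, not part of the tutorial itself).

```cpp
#include <cstdio>
#include <libfreenect.h>

int main()
{
    freenect_context *ctx = nullptr;
    if (freenect_init(&ctx, nullptr) < 0) {
        std::fprintf(stderr, "freenect_init failed\n");
        return 1;
    }
    // Should report at least 1 if the Kinect is plugged in and powered.
    std::printf("Kinect devices detected: %d\n", freenect_num_devices(ctx));
    freenect_shutdown(ctx);
    return 0;
}
```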

Using the Kinect hardware to implement user interfaces. by kaspervandenberg Feb 27
