
Getting Started with Kinect and Processing

So, you want to use the Kinect in Processing. Great. This page documents the current state of my Processing Kinect library, with some tips and info.

The current state of affairs

Since the Kinect launched in November 2010, several models have been released. Here's a quick list of what is out there and what is supported in Processing for Mac OS X:

Kinect 1414: This is the original Kinect and works with the library documented on this page in Processing 2.1.
Kinect 1473: This looks identical to the 1414, but is an updated model.

Before you proceed, you could also consider using the SimpleOpenNI library and reading Greg Borenstein's Making Things See book.

I'm ready to get started right now

What hardware do I need? First, you need a "stand-alone" Kinect (model 1414 only for now!): the Standalone Kinect Sensor. If you have a previous Kinect that came with an Xbox, it will not include the USB adapter; you will also need the Kinect Sensor Power Supply. Related projects include ofxKinect and the Kinect CinderBlock.

Kinect + Arduino | Tanner's Website

With an Arduino Ethernet, Processing, and a Kinect, I was able to easily create this little demo where hand movement controls a servo. This is just a tiny step in my master plan to create a robot clone so that I don't have to leave my chair. The following libraries and drivers made this work, and also made it super easy for me to create: OpenKinect; Daniel Shiffman's Processing Kinect library (he knows his stuff and has great examples on his site); the Arduino Ethernet UDP send/receive string example. Servo: EMAX ES08A.

How it works: The Arduino Ethernet acquires an IP address and waits for UDP packets on a certain port. The machine with the Kinect sends packets to the Arduino that contain hand coordinate data. The Arduino then takes this data (an integer) and maps it to the range 0 to 180 degrees. The mapped value is sent to the servo.
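The pipeline above can be sketched in Python for illustration (the original code isn't shown, so the host address, port, and plain-ASCII packet format are assumptions): the Kinect machine sends the hand's x coordinate as a UDP packet, and the mapping step mirrors Arduino's map() function.

```python
import socket

def map_range(x, in_min, in_max, out_min, out_max):
    """Mirror of Arduino's map(): linearly rescale x into the output range."""
    return (x - in_min) * (out_max - out_min) // (in_max - in_min) + out_min

def send_hand_x(hand_x, host="192.168.1.50", port=8888):
    """Send the hand's x coordinate (pixels) to the Arduino as ASCII text.
    Host and port are placeholders for whatever the sketch is configured with."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(str(hand_x).encode("ascii"), (host, port))
    sock.close()

# On the Arduino side, a coordinate in the Kinect's 0-639 pixel range
# would be mapped to a 0-180 degree servo angle:
angle = map_range(320, 0, 639, 0, 180)  # center of frame -> 90 degrees
```

The same map_range arithmetic runs on the Arduino; doing it there keeps the packet format trivial (one raw integer per datagram).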

Getting started with Kinect on Ubuntu 12.04 – OpenNI, NITE, SimpleOpenNI and Processing | ramsrigoutham

The objective of this tutorial is: 1) to install OpenNI and NITE (drivers that help get data from the Kinect) on Ubuntu 12.04; 2) to set up Processing and SimpleOpenNI (an OpenNI and NITE wrapper for Processing), with which you can get started with Kinect coding. What do you need? A Kinect with a USB cable, and a computer with Ubuntu installed. It is not recommended to run Ubuntu as a Wubi installer from Windows when working with the Kinect; better to install Ubuntu in a new partition and run it from there. 1) Installing OpenNI and NITE: I highly recommend installing the 32-bit versions of everything, even if yours is a 64-bit system. Tip: Instead of navigating to different folders with the cd command, you can enable the "Open in terminal" option when you right-click in any folder: sudo apt-get install nautilus-open-terminal. After installing, type killall nautilus && nautilus in a terminal to activate the change immediately. Testing the installation: Connect the Kinect and ensure that the green LED on it is blinking. Errors in my case:

Kinect - Medien Wiki

The Microsoft® XBOX 360 Kinect is a motion controller built on depth-sensing technology developed by PrimeSense. It projects a pattern with infrared light and calculates a depth image using a camera. It also has a color camera and four microphones. About: blogs and portals. Software, applications: CocoaKinect App; Freenect by Robert Pointon; Synapse generates skeleton data and provides it as OSC (Open Sound Control); ofxFaceTracker provides face metrics (orientation, eye and mouth open/closed) over OSC; codelaboratories.com/kb/nui; TUIO Kinect lets you define a depth range in which multiple blobs can be detected. SDKs, frameworks and libraries, depth image: openkinect.org (drivers, installation); pix_freenect for Pd (Pure Data), a dataflow programming environment (incl. binaries that work without any compiling); fux_kinect object for Pd; ofxKinect, the openFrameworks Kinect integration; vvvv Kinect integration. Skeleton data. Successors/Competitors.
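The depth-range idea behind TUIO Kinect can be sketched as a simple threshold (Python for illustration; this is not the library's actual code): keep only pixels whose depth falls inside a chosen band, then count connected regions in the resulting mask as blobs.

```python
def depth_mask(depth, near, far):
    """Keep only pixels whose depth (mm) lies inside [near, far]."""
    return [[near <= d <= far for d in row] for row in depth]

def count_blobs(mask):
    """Count 4-connected regions of True pixels via iterative flood fill."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    blobs = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                blobs += 1
                stack = [(y, x)]
                while stack:
                    cy, cx = stack.pop()
                    if 0 <= cy < h and 0 <= cx < w and mask[cy][cx] and not seen[cy][cx]:
                        seen[cy][cx] = True
                        stack += [(cy + 1, cx), (cy - 1, cx), (cy, cx + 1), (cy, cx - 1)]
    return blobs

# Toy frame: two hands at ~800 mm in front of a ~2000 mm background.
depth = [[2000, 800, 2000, 810],
         [2000, 805, 2000, 2000]]
print(count_blobs(depth_mask(depth, 600, 1000)))  # -> 2
```

Anything outside the band (the room, the user's body) simply disappears from the mask, which is why a depth window is such a robust way to isolate hands.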

Mobile Autonomous Robot using the Kinect

Given a priori knowledge of the environment and the goal position, mobile robot navigation refers to the robot's ability to safely move towards the goal using its knowledge and sensory information about the surrounding environment. In practice, for a mobile robot operating in an unstructured environment, knowledge of the environment is usually absent or partial. Therefore, obstacle detection and avoidance are essential for mobile robot missions. The Kinect is not only a normal camera sensor but also a special device that can provide a depth map. The depth map is acquired through the OpenNI library, then processed with the Point Cloud Library to extract accurate information about the environment. Here is the link to the full project (code and references in English; the rest in Vietnamese, but still easy to follow from the source code). Some fun stuff using the Kinect is available on my channel.
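The depth-map-to-point-cloud step that the Point Cloud Library performs can be sketched with the pinhole camera model (Python for illustration; the focal lengths and optical center below are assumed values, roughly typical for a Kinect, not taken from the project): each depth pixel (u, v, z) back-projects to a 3D point the robot can reason about.

```python
def depth_pixel_to_point(u, v, z_mm, fx=594.0, fy=591.0, cx=320.0, cy=240.0):
    """Back-project one depth pixel to a 3D point in meters using the
    pinhole model: x = (u - cx) * z / fx, y = (v - cy) * z / fy.
    fx, fy, cx, cy are assumed intrinsics (typical Kinect ballpark)."""
    z = z_mm / 1000.0          # Kinect depth is reported in millimeters
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return (x, y, z)

# A pixel at the optical center maps straight ahead of the camera:
print(depth_pixel_to_point(320, 240, 1500))  # -> (0.0, 0.0, 1.5)
```

Running this over every pixel of a frame yields the point cloud; obstacle detection then reduces to asking which points fall inside the robot's intended path.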

Global Health, Local Knowledge

Marker color: Marker color reflects the noteworthiness of events at a particular location during a given time window. An event's degree of noteworthiness is based on the significance rating of the alert provided by HealthMap users. Marker size: The large circle indicates a country-level alert, while state, province, and local alerts are indicated by the small circle.

Vizual Invaders - blog

U.F.O. by VJ ZERO. Here is the new creation by visual artist ZERO (www.zero.com), U.F.O (Unknow Flashing Object), produced by the label and quite certainly the most beautiful project we have taken part in. ∞INFINITY∞ by Vizual Invaders. Here is the label's new stage design; polymorphic, it is designed to adapt to any stage size.

rosnodejs - Program robots with JavaScript

Rosnodejs is currently deprecated. Most of my efforts in JavaScript and robotics have shifted to the Robot Web Tools project; I highly recommend taking a look at Robot Web Tools if you are interested in putting your robot on the web. Features include: a JavaScript interface to ROS functionality; 2D tools for mapping and more; 3D tools for robot visualization in a 3D environment. I still feel a Node.js interface into ROS is important. Rosnodejs is a Node.js module that lets you use JavaScript to interact with the Robot Operating System (ROS), an open-source robot framework used by many of the top universities and research programs around the world. Perform a range of robotic tasks, from controlling the motors on an Arduino to processing Kinect sensor data, using JavaScript and Node.js. The goal is to make the field of robotics more accessible to the countless intelligent web developers out there. One of the top frameworks for programming robots today is the Robot Operating System (ROS).

Kinect | Doc-Ok.org

I just read an interesting article, a behind-the-scenes look at the infamous "Milo" demo Peter Molyneux did at 2009's E3 to introduce Project Natal, i.e., Kinect. This article is related to VR in two ways. First, the usual progression of overhyping the capabilities of some new technology and then falling flat on one's face, because not even one's own developers know what the new technology's capabilities actually are, should be very familiar to anyone working in the VR field. But here's the quote that really got my interest (emphasis is mine): "Others recall worrying about the presentation not being live, and thinking people might assume it was fake." Gee, sounds familiar? With the "Milo" demo, the problem was similar. The take-home message here is that mainstream games are slowly converging towards approaches that have been embodied in proper VR software for a long time now, without really noticing it, and are repeating old mistakes.