
Skeleton tracking with the Kinect and OSCulator


kinect_ProjectorDance | princeMio This is a lightweight Kinect-projector calibration method – easy to use, fast to set up, and fun to play with ;) It's hacked for Processing and works without OpenCV. I created this tool because I needed an interactive wall for another project. What this application basically does is transform the user's data (pixels, skeleton) orthogonally onto a self-defined projector wall. Many artists, and especially beginners within the Processing community, have tried the wonderful applications presented there. This is a zip folder containing the calibration sketch and a demo sketch. OK, this should be very easy! First, calibrate the wall's orientation: place your camera pointing at the screen/projection at a distance of about 2–5 meters. Once you have defined the orientation of the wall, you need to define the exact dimensions of the actual screen/projection displaying the content. The calibration can of course have some calculation offset, rooting from inaccuracy and noise.
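The core idea – transforming tracked points orthogonally onto a calibrated wall plane – can be sketched without any Kinect hardware. The following standalone Java illustration (not the tool's actual code; the wall origin and axes here are made-up calibration values) projects a 3D point onto a wall defined by an origin and two orthonormal in-plane axes:

```java
// Illustrative sketch: orthogonal projection of a tracked 3D point onto a
// calibrated wall plane, as a kinect-projector calibration would produce.
public class WallProjection {
    // wall origin and two orthonormal in-plane axes (hypothetical calibration)
    static double[] origin = {0.0, 0.0, 2.5};
    static double[] uAxis  = {1.0, 0.0, 0.0};  // along wall width
    static double[] vAxis  = {0.0, 1.0, 0.0};  // along wall height

    static double dot(double[] a, double[] b) {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }

    // Map a 3D point (meters) to 2D wall coordinates (meters from origin).
    static double[] projectToWall(double[] p) {
        double[] d = {p[0] - origin[0], p[1] - origin[1], p[2] - origin[2]};
        return new double[]{dot(d, uAxis), dot(d, vAxis)};
    }

    public static void main(String[] args) {
        // a hand 0.8 m right and 0.3 m up from the wall origin
        double[] wall = projectToWall(new double[]{0.8, 0.3, 1.9});
        System.out.println(wall[0] + " " + wall[1]);  // 0.8 0.3
    }
}
```

Mapping those wall coordinates to screen pixels is then just a scale by the projection's measured width and height, which is why the second calibration step asks for the exact screen dimensions.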

Use Kinect with Mac OS X In this article I will show how you can use a Microsoft Kinect for Xbox 360 on Mac OS X. In this way you can create applications that interact with the body's movements. Introduction This article is intended for people who have a lot of experience in information technology, both as a developer and as a systems engineer, especially on Unix systems. In fact, the installation of the drivers may be a little tricky, especially if something does not work the first time. I warn you... there are some commands to run in the terminal, and I do not take any responsibility if these commands (or connecting the Kinect) damage your Mac. The version of the Kinect that I have is the one sold separately from the Xbox. I connected the Kinect to an iMac running OS X 10.7.4 64-bit. Well, now that I have described the tools used for testing, we can install and configure the required software and drivers. Driver and SDK Before proceeding you should know that there are several APIs and SDKs available for the Kinect.

Setup Microsoft Kinect on Mac OS X 10.9 (Mavericks) If you want to get the Microsoft Kinect set up and working on your Mac using OS X 10.9 Mavericks, then you've come to the right place. Since posting the first tutorial, a number of new software updates have been released, so it's a good idea to recap from the start. This tutorial will detail all the steps necessary to get the Kinect working in Mavericks, so buckle up and let's get this party started. As always, if you have any questions, issues, or feedback, please feel free to post them in the comments section at the bottom, and to keep abreast of any new updates and posts you can follow me on Twitter, or subscribe using the new email form in the sidebar. Oh, and if you don't own a Kinect yet, there are a few things you'll need to know, so please check out the buyers guide. If you followed my earlier tutorial and/or had your Kinect running in Mac OS X 10.8 Mountain Lion, then you'll want to complete this step before moving ahead. When it comes to hacking the Kinect, cleaner is better.

OpenKinect How to use Quartz Composer, Synapse & Xbox Kinect on your Mac If you're looking to kickstart your Kinect programming and create some magic on the Mac, then this is the place to be. In this tutorial we use Synapse, Quartz Composer and the Kinect sensor to create a cool motion-activated particle effect that lets you move an animation around your screen using only your hands. Please note, this tutorial has been completed using Mac OS X 10.8 (Mountain Lion). It is also important to note that your Xbox Kinect should be model #1414. Download the Project Files This tutorial will guide you through all the steps necessary to install and use Quartz Composer and Synapse with the Xbox Kinect on Mac. Download Project Files Step 1: Setup the Xbox Kinect First things first, before we can move forward, you'll need to make sure you have your Xbox Kinect set up. Step 2: Install Quartz Composer Quartz Composer is an amazing app created by Apple, and distributed via the Apple Developer network. Navigate in the menu bar to 'More Developer Tools…'. Not so fast!

ITP Spring Show 2011 » Capturing Dance: Exploring Movement with Computer Vision We've created a series of software "tools" for capturing and manipulating Kinect footage of a live dancer through sound and gesture cues. With these tools, we've produced a set of pre-recorded videos that explore each of these tools in a short choreographic "study." Each study touches upon a different aspect of using visual imagery to underline and transform the live dance performance. We've also begun to experiment with manipulating the abstracted Kinect imagery through sound as a way of visualizing the interaction between musician and dancer. Background: Kinect; sound libraries in Processing: Sonia, Minim. Audience: Kids and adults. User Scenario We would like to show the videos alongside a live installation that allows viewers to interact with the tools we've created so they can experience some of the effects for themselves. Viewers enter a controlled space where they see a video recording of our Kinect dance studies. Implementation: Kinect camera.

Pioneers of net art The girl on the left of the photo is Olia Lialina, and she has just sold one of her works: with this handshake she is closing the transaction. Olia Lialina is one of the pioneers of internet art: all her works have been created specifically to be distributed on the net. The emergence of net art – an immaterial art that is created and distributed on the internet – points to future transformations in the reception of artistic practices. The first attempts to sell internet art, during what we now know as the Heroic Period, seem especially interesting to us because they were carried out not by gallerists or market agents, but by the artists themselves, in an attempt to reflect on the nature of their works. 1. We use the term to refer to those artistic practices created specifically for distribution on the net. José Luis Brea, editor of Aleph, defined it. A small navigation guide: four articles, two books, a wiki.

Identifying People in a Scene with the Kinect - Learning We'll start with the sketch we wrote to draw the depth image from the Kinect in Drawing Depth with the Kinect:

import SimpleOpenNI.*;

SimpleOpenNI context;

void setup(){
  // instantiate a new context
  context = new SimpleOpenNI(this);
  // enable depth image generation
  context.enableDepth();
  // create a window the size of the scene
  size(context.depthWidth(), context.depthHeight());
}

void draw(){
  // update the camera
  context.update();
  // draw depth image
  image(context.depthImage(), 0, 0);
}

The OpenNI library does all the work for us to identify people or other moving objects in a scene. It does this with the scene analyser, which is turned on by adding context.enableScene() to setup(). The depth image also needs to be enabled in order for the scene to be analysed. We then need to draw the scene image, rather than the depth image, in the draw() function:

void draw(){
  // update the camera
  context.update();
  // draw scene image
  image(context.sceneImage(), 0, 0);
}

Try running the sketch.
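Under the hood, the scene analyser labels each depth pixel with a user ID (0 for background, 1, 2, … for detected people), which is what gives sceneImage() its per-person tinting. As a standalone illustration (plain Java with a hard-coded label map, not live SimpleOpenNI data), colouring pixels by user ID might look like:

```java
// Illustration only: tint depth pixels by a per-pixel user label of the
// kind a scene analyser produces. The label map is hard-coded, not live.
public class SceneLabels {
    // background black, then one colour per user ID
    static final int[] PALETTE = {0x000000, 0xFF0000, 0x00FF00, 0x0000FF};

    static int[] colorize(int[] userLabels) {
        int[] out = new int[userLabels.length];
        for (int i = 0; i < userLabels.length; i++) {
            out[i] = PALETTE[userLabels[i] % PALETTE.length];
        }
        return out;
    }

    public static void main(String[] args) {
        int[] labels = {0, 0, 1, 1, 2};  // 0 = background, 1..n = users
        for (int c : colorize(labels)) System.out.printf("%06X%n", c);
    }
}
```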

Kinect / Kinect The Kinect is a camera that can capture three different types of images: a depth image, an RGB image and an infrared image. The depth image can be used to assemble 3D meshes in real time. The RGB image can be mapped onto this 3D mesh, or the mesh can be deformed to create interesting effects. By applying analysis modules to the depth image, it is possible to track several interactors and interpret their movements: scene, skeleton and gesture analysis modules. Using the Kinect requires a driver. OpenNI: made by the designers of the Kinect and very rich in features. libfreenect: made by the community and very basic. Sometimes the drivers must be downloaded and installed, and sometimes they are already included and compiled.
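Assembling a 3D mesh from the depth image boils down to back-projecting each depth pixel through a pinhole camera model. The following Java sketch illustrates that step; the intrinsics are common nominal values for the Kinect's 640×480 depth camera (focal length ≈ 525 px), used purely for illustration, not calibrated values:

```java
// Back-project a depth pixel to a 3D point with a pinhole camera model.
// Intrinsics are illustrative nominal values for the Kinect depth camera.
public class DepthToPoint {
    static final double FX = 525.0, FY = 525.0;  // focal lengths (pixels)
    static final double CX = 319.5, CY = 239.5;  // principal point

    // (u, v) are pixel coordinates, depthMm is depth in millimeters;
    // returns {x, y, z} in meters in the camera frame.
    static double[] toPoint(int u, int v, int depthMm) {
        double z = depthMm / 1000.0;
        double x = (u - CX) * z / FX;
        double y = (v - CY) * z / FY;
        return new double[]{x, y, z};
    }

    public static void main(String[] args) {
        // pixel near the image center, 2 m away
        double[] p = toPoint(320, 240, 2000);
        System.out.printf("%.3f %.3f %.3f%n", p[0], p[1], p[2]);
    }
}
```

Doing this for every pixel of a frame yields the point cloud that the mesh (and any RGB texture mapping) is built on.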

Skeleton Tracking with the Kinect - Learning This tutorial will explain how to track human skeletons using the Kinect. The OpenNI library can identify the position of key joints on the human body such as the hands, elbows, knees, head and so on. These points form a representation we call the 'skeleton'. Enabling Skeleton Tracking Let us start with the code that we had by the end of the tutorial called Drawing Depth with the Kinect:

import SimpleOpenNI.*;

SimpleOpenNI context;

void setup(){
  // instantiate a new context
  context = new SimpleOpenNI(this);
  // enable depth image generation
  context.enableDepth();
  // create a window the size of the depth information
  size(context.depthWidth(), context.depthHeight());
}

void draw(){
  // update the camera
  context.update();
  // draw depth image
  image(context.depthImage(), 0, 0);
}

First we must tell the OpenNI library to determine joint positions by enabling the skeleton tracking functionality:

// enable skeleton generation for all joints
context.enableUser(SimpleOpenNI.SKEL_PROFILE_ALL);
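Once joint positions are available, simple vector math turns them into pose features. This standalone Java illustration (the coordinates are made up, not real tracker output) computes the angle at the elbow from shoulder, elbow and hand positions:

```java
// Illustration: compute a joint angle from three 3D joint positions,
// e.g. the elbow angle from shoulder, elbow and hand. Coordinates here
// are invented, standing in for tracked skeleton data.
public class JointAngle {
    // angle at b between rays b->a and b->c, in degrees
    static double angleAt(double[] a, double[] b, double[] c) {
        double[] u = {a[0]-b[0], a[1]-b[1], a[2]-b[2]};
        double[] v = {c[0]-b[0], c[1]-b[1], c[2]-b[2]};
        double dot = u[0]*v[0] + u[1]*v[1] + u[2]*v[2];
        double nu = Math.sqrt(u[0]*u[0] + u[1]*u[1] + u[2]*u[2]);
        double nv = Math.sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2]);
        return Math.toDegrees(Math.acos(dot / (nu * nv)));
    }

    public static void main(String[] args) {
        double[] shoulder = {0.0, 0.0, 0.0};
        double[] elbow    = {0.3, 0.0, 0.0};   // arm out to the side
        double[] hand     = {0.3, -0.3, 0.0};  // forearm hanging down
        System.out.println(angleAt(shoulder, elbow, hand));  // 90.0
    }
}
```

The same pattern works for any joint triple, which is how gesture cues (arm raised, knee bent, and so on) are usually derived from the skeleton.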

COS429: Computer Vision Overview: On your one-minute walk from the coffee machine to your desk each morning, you pass by dozens of scenes – a kitchen, an elevator, your office – and you effortlessly recognize them and perceive their 3D structure. But this one-minute scene-understanding problem has been an open challenge in computer vision since the field was first established 50 years ago. In this class, we will learn the state-of-the-art algorithms and study how to build computer systems that automatically understand visual scenes, both inferring the semantics and extracting 3D structure. This course requires programming experience as well as basic linear algebra. Instructor: Jianxiong Xiao. TAs: Yinda Zhang (yindaz [at] princeton), Mingru Bai (mingru.bai [at] princeton). Time: Tuesday and Thursday, 3:00PM–4:20PM. Lecture location: CS 105. Office hour: Friday 1:00PM–2:00PM (CS 003).

Computer Vision: Algorithms and Applications © 2010 Richard Szeliski Welcome to the Web site for my computer vision textbook, which you can now purchase at a variety of locations, including Springer (SpringerLink, DOI), Amazon, and Barnes & Noble. The book is also available in Chinese and Japanese translations. This book is largely based on the computer vision courses that I have co-taught at the University of Washington (2008, 2005, 2001) and Stanford (2003) with Steve Seitz and David Fleet. You are welcome to download the PDF from this Web site for personal use, but not to repost it on any other Web site. The PDFs should be enabled for commenting directly in your viewer. If you have any comments or feedback on the book, please send me e-mail. This Web site will also eventually contain supplementary materials for the textbook, such as figures and images from the book, slide sets, pointers to software, and a bibliography. Electronic draft: September 3, 2010. Errata. Slide sets.

therenect - A virtual Theremin for the Kinect Controller The Therenect is a virtual theremin for the Kinect controller. It defines two virtual antenna points, which allow controlling the pitch and volume of a simple oscillator. The distance to these points can be adjusted by freely moving the hand in three dimensions or by reshaping the hand, which allows gestures that should be quite similar to playing an actual theremin. This video was recorded prior to this release; an updated video introducing the improved features of the current version will follow soon. Configuration Oscillator: theremin, sine wave, sawtooth, square wave. Tonality: continuous mode, or chromatic, Ionian and pentatonic scales. MIDI: optionally send MIDI note on/off events to the selected device & channel. Kinect: adjust the sensor camera angle. Acknowledgments This application has been created by Martin Kaltenbrunner at the Interface Culture Lab.
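The pitch-antenna idea maps a hand-to-antenna distance onto a continuous frequency range. The numbers below are illustrative, not the Therenect's actual mapping; the sketch just shows one plausible exponential mapping (closer hand, higher pitch) of the kind a continuous-mode theremin uses:

```java
// Illustrative sketch of a theremin-style pitch mapping: hand distance to
// a virtual antenna point drives oscillator frequency. The range and
// travel values here are invented, not taken from the Therenect.
public class ThereminPitch {
    static final double MIN_FREQ = 110.0;   // A2, farthest hand position
    static final double MAX_FREQ = 1760.0;  // A6, closest hand position
    static final double MAX_DIST = 0.6;     // meters of useful hand travel

    static double pitchFor(double distMeters) {
        // clamp the distance, then interpolate exponentially so equal hand
        // movements produce equal musical intervals (here, 4 octaves total)
        double d = Math.max(0.0, Math.min(MAX_DIST, distMeters));
        double t = 1.0 - d / MAX_DIST;  // 1 = closest, 0 = farthest
        return MIN_FREQ * Math.pow(MAX_FREQ / MIN_FREQ, t);
    }

    public static void main(String[] args) {
        System.out.println(pitchFor(0.6));  // 110.0 (farthest)
        System.out.println(pitchFor(0.0));  // 1760.0 (closest)
    }
}
```

Snapping the output to the nearest note of a scale instead of returning it directly would give the chromatic, Ionian or pentatonic modes the configuration section mentions.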