
Skeleton tracking with the Kinect and OSCulator


kinect_ProjectorDance | princeMio This is a lightweight Kinect-projector calibration method – easy to use, fast to set up, and fun to play with ;) It is hacked together for Processing and works without OpenCV. I created this tool because I needed an interactive wall for another project. What this application basically does is transform the user's data (pixels, skeleton) orthogonally onto a self-defined projector wall. Many artists, and especially beginners in the Processing community, have tried the wonderful applications presented there. The download is a zip folder containing the calibration sketch and a demo sketch. The process is straightforward: first, calibrate the wall's orientation. Place your camera pointing at the screen/projection at a distance of about 2–5 meters. Once you have defined the orientation of the wall, define the exact dimensions of the actual screen/projection displaying the content. The calibration can of course carry some offset, rooting from inaccuracy and noise.
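The orthogonal mapping described above amounts to a four-corner calibration: given the four wall corners as seen by the camera, a normalized camera coordinate is interpolated onto the wall. A minimal sketch in Python — the function name and corner layout are illustrative, not taken from the original tool:

```python
def map_to_wall(u, v, corners):
    """Map normalized camera coords (u, v in [0, 1]) onto the quad
    spanned by four calibrated wall corners, given in order:
    top-left, top-right, bottom-right, bottom-left."""
    (x0, y0), (x1, y1), (x2, y2), (x3, y3) = corners
    # interpolate along the top and bottom edges, then between them
    top = (x0 + u * (x1 - x0), y0 + u * (y1 - y0))
    bottom = (x3 + u * (x2 - x3), y3 + u * (y2 - y3))
    return (top[0] + v * (bottom[0] - top[0]),
            top[1] + v * (bottom[1] - top[1]))
```

The calibration sketch's job is then just to let you pick those four corners; noise in picking them is the "calculation offset" the author mentions.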

motej - a Wiimote library for Java. Use Kinect with Mac OS X. In this article I will show how you can use the Microsoft Kinect for Xbox 360 on Mac OS X, so that you can create applications that interact with the body's movements. Introduction This article is intended for people who have solid experience in information technology, both as a developer and as a systems engineer, especially on Unix systems. In fact, the installation of the drivers can be a little tricky, especially if something does not work the first time. Be warned: there are some commands to run in the terminal, and I take no responsibility if these commands (or connecting the Kinect) damage your Mac. The version of the Kinect that I have is the one sold separately from the Xbox. I connected the Kinect to an iMac running OS X 10.7.4 (64-bit). Now that I have described the tools used for testing, we can install and configure the required software and drivers. Driver and SDK Before proceeding you should know that there are several APIs and SDKs available for the Kinect.

TouchOSC Modular OSC and MIDI control surface for iPhone / iPod touch / iPad. Send and receive Open Sound Control and MIDI messages over Wi-Fi, and control CoreMIDI-compatible software, hardware and mobile apps. Also available for Android. Features: remote control of, and feedback from, any software or hardware that implements the OSC or MIDI protocols. Total Control: choose from a wide variety of controls and configure each to fit your preferences and the requirements of the software or hardware you are working with. Built-in Logic Pro & Express Support: TouchOSC is an officially supported Apple Logic Pro and Express control surface. Cross-platform Support: TouchOSC is available for iOS and Android devices. TouchOSC Editor and TouchOSC Bridge: TouchOSC Bridge is a standalone tool for Windows and Mac OS X that relays MIDI messages sent from TouchOSC to any MIDI-capable application on your computer, and vice versa. Requirements and manual: complete online documentation can be found here.
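The OSC messages TouchOSC exchanges over Wi-Fi are small binary packets: a null-padded address pattern, a type-tag string, then big-endian arguments. A minimal encoder sketching the OSC 1.0 wire format — the address `/1/fader1` is just a typical example of a TouchOSC control address, not a fixed part of the protocol:

```python
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address: str, *floats: float) -> bytes:
    """Encode a minimal OSC message carrying float32 arguments."""
    msg = osc_pad(address.encode("ascii"))
    msg += osc_pad(("," + "f" * len(floats)).encode("ascii"))
    for f in floats:
        msg += struct.pack(">f", f)  # big-endian float32
    return msg

# A fader value like the ones a TouchOSC layout might send over UDP:
packet = osc_message("/1/fader1", 0.5)
```

Sending such a packet to the right UDP port is all a control surface needs to do; the receiving application decodes it the same way in reverse.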

Setup Microsoft Kinect on Mac OS X 10.9 (Mavericks) If you want to get the Microsoft Kinect set up and working on your Mac using OS X 10.9 Mavericks, then you've come to the right place. Since posting the first tutorial, a number of new software updates have been released, so it's a good idea to recap from the start. This tutorial will detail all the steps necessary to get the Kinect working in Mavericks, so buckle up and let's get this party started. As always, if you have any questions, issues, or feedback, please feel free to post them in the comments section at the bottom, and to keep abreast of any new updates and posts you can follow me on Twitter, or subscribe using the new email form in the sidebar. Oh, and if you don't own a Kinect yet, there are a few things you'll need to know, so please check out the buyer's guide. If you followed my earlier tutorial and/or had your Kinect running in Mac OS X 10.8 Mountain Lion, then you'll want to complete this step before moving ahead. When it comes to hacking the Kinect, cleaner is better.

The Official YAML Web Site

OpenKinect

Tools | Showsync
Livegrabber, free plugins to connect all parameters of Ableton Live to external devices with Open Sound Control.
ClipSMPTE, a free plugin for Ableton Live that outputs the active clip position as a SMPTE audio signal.
Livesync, free plugins that accurately synchronize the tempos, playheads and clip playback positions of two Ableton Live sets via a network, without MIDI.
The tools below are not in a downloadable state yet, but we're keen to get your approval, and we hope that you can help us develop them into final products by suggesting partnerships or intriguing test projects:
Beattracker (WIP), an intelligent audio-based BPM and beat tracker.
Lightsync (WIP), a new lighting controller based on Ableton Live.
Videosync (WIP), an add-on that accurately links the playback position of matching video tracks to audio clips in Ableton Live.
CDJ2Live (WIP), a Max For Live plugin that connects a CDJ to Ableton Live.
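Turning a clip position into SMPTE timecode, as ClipSMPTE does, boils down to simple frame arithmetic. A hedged sketch — the function name and the 25 fps default are illustrative, and real SMPTE also has drop-frame variants this ignores:

```python
def to_smpte(seconds: float, fps: int = 25) -> str:
    """Render a position in seconds as SMPTE hh:mm:ss:ff timecode."""
    total_frames = int(round(seconds * fps))
    ff = total_frames % fps   # frame count within the current second
    s = total_frames // fps   # whole seconds elapsed
    return f"{s // 3600:02d}:{s % 3600 // 60:02d}:{s % 60:02d}:{ff:02d}"
```

ClipSMPTE then encodes each timecode frame as an audio signal (LTC), so any timecode-aware device can follow Live's clip position.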

How to use Quartz Composer, Synapse & Xbox Kinect on your Mac If you're looking to kickstart your Kinect programming and create some magic on the Mac, then this is the place to be. In this tutorial we use Synapse, Quartz Composer and the Kinect sensor to create a cool motion-activated particle effect that lets you move an animation around your screen using only your hands. Please note, this tutorial has been completed using Mac OS X 10.8 (Mountain Lion). It is also important to note that your Xbox Kinect should be model #1414. Download the Project Files This tutorial will guide you through all the steps necessary to install and use Quartz Composer and Synapse with the Xbox Kinect on Mac. Download Project Files Step 1: Set up the Xbox Kinect First things first, before we can move forward, you'll need to make sure you have your Xbox Kinect set up. Step 2: Install Quartz Composer Quartz Composer is an amazing app created by Apple, and distributed via the Apple Developer network. In the menu bar, navigate to 'More Developer Tools…'. Not so fast!
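Synapse reports tracked joints as OSC messages (for example, a hand position as three floats), which Quartz Composer or any other OSC-capable tool can read. A minimal decoder for such a message, assuming the plain OSC 1.0 float format — the `/righthand` address is typical of Synapse's output, but this code is an illustrative sketch, not Synapse's own API:

```python
import struct

def parse_osc(packet: bytes):
    """Parse a simple OSC message with float32 args (no bundles)."""
    def read_str(buf: bytes, i: int):
        # OSC strings are null-terminated, padded to a 4-byte boundary
        end = buf.index(b"\x00", i)
        s = buf[i:end].decode("ascii")
        i = end + 1
        i += (-i) % 4
        return s, i

    addr, i = read_str(packet, 0)   # address pattern, e.g. /righthand
    tags, i = read_str(packet, i)   # type tags, e.g. ,fff
    args = []
    for t in tags[1:]:
        if t == "f":
            args.append(struct.unpack(">f", packet[i:i + 4])[0])
            i += 4
    return addr, args
```

Feeding a joint-position packet through this yields the address and the x/y/z values the patch can map onto the particle emitter.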

Agentgroup ITP Spring Show 2011 » Capturing Dance: Exploring Movement with Computer Vision We've created a series of software "tools" for capturing and manipulating Kinect footage of a live dancer through sound and gesture cues. With these tools, we've produced a set of pre-recorded videos that explore each tool in a short choreographic "study." Each study touches upon a different aspect of using visual imagery to underline and transform the live dance performance. We've also begun to experiment with manipulating the abstracted Kinect imagery through sound as a way of visualizing the interaction between musician and dancer. Background: Kinect; sound libraries in Processing (Sonia, Minim). Audience: kids and adults. User scenario: we would like to show the videos alongside a live installation that lets viewers interact with the tools we've created, so they can experience some of the effects for themselves. Viewers enter a controlled space where they see a video recording of our Kinect dance studies. Implementation: Kinect camera.

Pioneers of net art The girl on the left of the photo is Olia Lialina, and she has just sold one of her works: with this handshake she is closing the transaction. Olia Lialina is one of the pioneers of internet art: all of her works have been created specifically to be distributed on the net. The appearance of net.art – an immaterial art that is created and distributed on the internet – points to future transformations in how artistic practices are received. The first attempts to sell internet art, during what we now know as the Heroic Period, strike us as especially interesting because they were carried out not by gallerists or market agents, but by the artists themselves, in an attempt to reflect on the nature of their works. 1. We use the term to refer to those artistic practices created specifically to be distributed on the net. José Luis Brea, editor of Aleph, (defined it): A short navigation guide: four articles, two books, one wiki.

SMIL Synchronized Multimedia Integration Language From Wikipedia, the free encyclopedia. Synchronized Multimedia Integration Language (SMIL) is a W3C specification whose goal is to enable the integration of diverse multimedia content (images, sounds, text, video, animations, text streams) by synchronizing it, in order to allow the creation of multimedia presentations. SMIL is a language in the XML family. The XML structure of a SMIL document describes the temporal and spatial arrangement of the various integrated components. In other words, SMIL makes it possible to specify when a piece of content will be displayed, for how long, and in which part of the display window. SMIL is the format used by MMS. Software for viewing or editing SMIL. Other links: (en) The W3C SMIL page.
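The temporal and spatial arrangement described above can be seen in a minimal SMIL document: a hedged sketch of a presentation that shows an image for five seconds while an audio track plays alongside it (the file names are placeholders):

```xml
<smil xmlns="http://www.w3.org/ns/SMIL" version="3.0">
  <head>
    <layout>
      <!-- spatial arrangement: the display window -->
      <root-layout width="320" height="240"/>
    </layout>
  </head>
  <body>
    <!-- temporal arrangement: <par> plays its children in parallel;
         <seq> would play them one after another -->
    <par>
      <img src="slide1.png" dur="5s"/>
      <audio src="voiceover.mp3" dur="5s"/>
    </par>
  </body>
</smil>
```

The `dur` attribute answers "for how long", the `<par>`/`<seq>` containers answer "when", and the layout section answers "in which part of the window".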