
DA Kinect


Mensch-Computer-Interaktion: Basiswissen für Entwickler und Gestalter - Andreas M. Heinecke.

www.billbuxton.com/multitouchOverview.html. Bill Buxton, Microsoft Research. Original: Jan. 12, 2007; version: June 12, 2014. Keywords / search terms: multi-touch, multitouch, input, interaction, touch screen, touch tablet, multi-finger input, multi-hand input, bi-manual input, two-handed input, multi-person input, interactive surfaces, soft machine, hand gesture, gesture recognition. An earlier version of this page is also available in Belorussian, thanks to the translation by Martha Ruszkowski. Preamble: Since the announcements of the iPhone and Microsoft's Surface (both in 2007), an especially large number of people have asked me about multi-touch.

Multi-touch technologies have a long history. Westerman, Wayne (1999). In making this statement about their awareness of past work, I am not criticizing Westerman, the iPhone, or Apple. Please do not be shy in terms of sending me photos, updates, etc.

Get SDK | Kinesis - build gesture apps.

Accessing the Kinect in JavaScript through websockets « Aboutme – blog. Good morning all! (or evening, night, … depending on when you read this post of course). As you might know, I've been working on AIRKinect (as3nui.com) and I also have a side project, AIRServer (which allows you to set up AIR as a socket server, including websocket support).

Wouldn't it be fun to combine these two projects in a demo, so you can access the Kinect information through a websocket? That's exactly what I did. You run a desktop application on your computer, which is responsible for accessing the Kinect and exposing the skeleton information over a websocket. In this demo, I'm just rendering the skeleton points in a canvas element, using three.js. I've uploaded the sources and included binary installers for the desktop application (Windows 7, OS X Lion). Using the JavaScript client, you connect to your IP (if you're testing on the same machine, 127.0.0.1 should be fine), and you can start dancing in the canvas element :-). (A minimal JavaScript client sketch along these lines appears a little further down, after the TUIO notes.)

Implementations. TUIO hardware support: In order to support the further development of the TUIO platform, we are looking for donations of various hardware: iOS and Android tablets, touchscreen hardware, Windows 7 multitouch notebooks, netbooks and tablets, MS Surface, Samsung SUR40, Magic Trackpad or any other devices to evaluate their existing or potential TUIO support.

Please get in touch with martin_at_tuio_dot_org for further information!

TUIO Tracker Implementations

TUIO Server Reference Implementations
- C++: TUIO_CPP.zip (source, all platforms)

TUIO Output Bridges
- Touch2Tuio: forwards native Windows 7 touch messages to TUIO clients
- mtdev2tuio: converts Linux touch events from libmtdev to TUIO 1.1
- TouchToTuio: N-Trig panel TUIO driver

TUIO Input Bridges
- Multi-Touch Vista: Windows HID driver and input management layer with a TUIO input provider

TUIO Simulators. TUIO Gateways.

Patricio Gonzalez Vivo. Kinect Core Vision Finger Detection Update.
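To make the listings above concrete, here is a minimal sketch of the receiving end: a tiny TUIO client that listens for cursor updates. It assumes the usual TUIO 1.1 transport (OSC bundles over UDP on the default port 3333) and only decodes /tuio/2Dcur "set" messages; none of this is taken from the implementations listed above, it just illustrates the kind of traffic the bridges and trackers emit.

```js
// Minimal TUIO 1.1 listener sketch (Node.js, no external dependencies).
// Assumptions: the tracker sends OSC bundles over UDP to the default TUIO
// port 3333, and only /tuio/2Dcur "set" messages are of interest
// (session id, normalised x/y position, velocity, acceleration).
const dgram = require('dgram');

// Read a null-terminated, 4-byte-padded OSC string; return [string, nextOffset].
function readOscString(buf, offset) {
  const end = buf.indexOf(0, offset);
  const str = buf.toString('ascii', offset, end);
  return [str, offset + Math.ceil((end - offset + 1) / 4) * 4];
}

// Decode one OSC message and report 2D cursor updates.
function readMessage(msg) {
  let [address, pos] = readOscString(msg, 0);
  let typetags;
  [typetags, pos] = readOscString(msg, pos);
  const args = [];
  for (const tag of typetags.slice(1)) {          // skip the leading ','
    if (tag === 'i') { args.push(msg.readInt32BE(pos)); pos += 4; }
    else if (tag === 'f') { args.push(msg.readFloatBE(pos)); pos += 4; }
    else if (tag === 's') { let s; [s, pos] = readOscString(msg, pos); args.push(s); }
  }
  if (address === '/tuio/2Dcur' && args[0] === 'set') {
    const [, sessionId, x, y] = args;             // x/y are normalised to [0..1]
    console.log(`cursor ${sessionId}: x=${x.toFixed(3)} y=${y.toFixed(3)}`);
  }
}

const socket = dgram.createSocket('udp4');
socket.on('message', (buf) => {
  if (buf.toString('ascii', 0, 7) === '#bundle') {
    let pos = 16;                                 // skip "#bundle\0" + 8-byte time tag
    while (pos < buf.length) {
      const size = buf.readInt32BE(pos);
      readMessage(buf.subarray(pos + 4, pos + 4 + size));
      pos += 4 + size;
    }
  } else {
    readMessage(buf);                             // bare OSC message, no bundle wrapper
  }
});
socket.bind(3333);
```

In practice you would use one of the TUIO client reference implementations rather than hand-parsing OSC; the sketch is only meant to show the shape of the data.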

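Returning to the AIRKinect websocket demo above: on the client, the whole idea is to open a WebSocket, parse the joint positions out of each message, and update a small point cloud every frame. The sketch below shows that shape with current three.js; the port (8080), the JSON payload format and the joint count are assumptions made for illustration, not the actual AIRServer protocol.

```js
// Browser-side sketch: render Kinect skeleton joints received over a websocket
// as points in a three.js scene. Port, payload format and joint count are
// assumptions, not the actual AIRKinect/AIRServer protocol.
import * as THREE from 'three';

const MAX_JOINTS = 20;                            // assumption: one 20-joint skeleton

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(60, window.innerWidth / window.innerHeight, 0.1, 100);
camera.position.z = 3;

const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// One point per joint, updated in place whenever a skeleton frame arrives.
const positions = new Float32Array(MAX_JOINTS * 3);
const geometry = new THREE.BufferGeometry();
geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
scene.add(new THREE.Points(geometry, new THREE.PointsMaterial({ size: 0.05, color: 0x00ff00 })));

// Connect to the desktop application that exposes the skeleton data.
const socket = new WebSocket('ws://127.0.0.1:8080');
socket.onmessage = (event) => {
  // Assumed payload: [{ x, y, z }, ...], one entry per joint, in metres.
  const joints = JSON.parse(event.data);
  joints.forEach((joint, i) => {
    positions[i * 3] = joint.x;
    positions[i * 3 + 1] = joint.y;
    positions[i * 3 + 2] = joint.z;
  });
  geometry.attributes.position.needsUpdate = true;
};

(function animate() {
  requestAnimationFrame(animate);
  renderer.render(scene, camera);
})();
```

Any server that pushes joint coordinates as JSON over a websocket could feed a client like this; in the demo above, that job is done by the AIR desktop application that talks to the sensor.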
Leap Motion.

Kinect Open Source Programming Secrets. Kinect Open Source Programming Secrets (KOPS) is the only book that explains the official Java wrappers for OpenNI and NITE. (If you want installation instructions, scroll down this page a little.) The main drawback of using the PrimeSense Java wrappers is their lack of documentation. As I explain in chapter 1, I had to decompile the libraries' JAR files and work out the correspondences between the Java source and the somewhat better documented C++ OpenNI/NITE APIs. This is why including "secrets" in the book's title isn't too excessive :).

This book covers programming topics not found elsewhere. I start off with the basics, with chapters on depth, infrared, and RGB imaging, point clouds, skeletal user tracking, hand tracking, and gesture support. Moving beyond that, I look at several novel and unusual features. Early (sometimes very early) draft versions of KOPS's chapters can be downloaded from here (see the links below). What this Book is Not About.

FAAST. Install FAAST on your PC. Install FAAST on your PC for full body control and VR applications: Disclaimer: This comprehensive guide to FAAST installation was taken from the Institute for Creative Technologies website. The FAAST toolkit was also taken from the same site. By sharing their guide with the community, Kinecthacks.com wishes to promote the value of innovation and creativity with Microsoft's Kinect.

Summary: FAAST is middleware to facilitate integration of full-body control with games and VR applications. FAAST is free to use and distribute for research and noncommercial purposes (for commercial uses, please contact us). The preliminary version of FAAST is currently available for Windows only.

Installation: To use FAAST, you will need to download and install the following software:
FAAST should then run out of the box; no additional installation or setup is necessary.

Skeleton Usage: Currently, FAAST streams the entire skeleton for the first calibrated user that is visible to the sensor.

Flexible Action and Articulated Skeleton Toolkit (FAAST). Contributors: Evan A. Suma, Belinda Lange, Skip Rizzo, David Krum, and Mark Bolas. Project email address: faast@ict.usc.edu. Downloads: 32-bit (recommended for most users), 64-bit (for advanced users).

Note from Evan Suma, the developer of FAAST: I have recently transitioned to a faculty position at USC, and unfortunately that means I have very limited time for further development of the toolkit. Future updates may occur but will likely be sporadic. You may also view our online video gallery, which contains videos that demonstrate FAAST's capabilities, as well as interesting applications that use the toolkit. Have a Kinect for Windows v2? We have developed an experimental version of FAAST with support for the Kinect for Windows v2, available for download here (64-bit only).

Recent News (December 12, 2013): FAAST 1.2 has been released, adding compatibility for Windows 8. Summary: FAAST is free to use and distribute for both commercial and noncommercial purposes. FAAST is currently available for Windows only.

Kinect & HTML5 using WebSockets and Canvas - Vangos Pterneas blog. Kinect defined Natural User Interaction. HTML5 redefined the Web. Currently, there are various tutorials describing how to interact with a Kinect sensor using Windows Forms or WPF for the user interface. But what about using a web interface for handling Kinect data? Trying to combine those two hot, cutting-edge technologies, I came up with a pretty and open-source solution, which I am going to describe in this blog post. I am going to use the official Kinect SDK for my demo, but the same principles apply to the OpenNI SDK, too. Download source code and binaries. Prerequisites. Results. The project consists of two sub-projects: a server-side application which uses the Kinect SDK, and a client-side web page displaying the skeleton joints on an HTML5 canvas.

Client application. Server application.

Tutorial: Here is, step by step, a way to achieve the above functionality.

Step 1: Server application. Considering the data transmission, I highly recommend the use of Fleck.

Step 2: Client application. Time for HTML5 bits! (A client-side sketch follows below.)

KinectJS - HTML5 goes motion. Kinect Development in HTML, Unity3D and Flash.
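As a sketch of what Step 2 of the tutorial above boils down to (the Fleck-based server application is C# and not shown here): open a WebSocket to the server and redraw the joints on the canvas every time a frame arrives. The port, the JSON shape, and the assumption that the server already maps joint positions to pixel coordinates are mine, not taken from the original tutorial.

```js
// Client-side sketch: draw Kinect skeleton joints on an HTML5 canvas.
// Assumes <canvas id="skeleton" width="640" height="480"></canvas> in the page
// and a server broadcasting JSON frames; port and payload shape are assumptions.
const canvas = document.getElementById('skeleton');
const ctx = canvas.getContext('2d');

const socket = new WebSocket('ws://127.0.0.1:8181');
socket.onmessage = (event) => {
  // Assumed payload: { skeletons: [{ joints: [{ x, y }, ...] }, ...] },
  // with x/y already scaled to canvas pixels by the server.
  const frame = JSON.parse(event.data);
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  ctx.fillStyle = 'red';
  for (const skeleton of frame.skeletons || []) {
    for (const joint of skeleton.joints) {
      ctx.beginPath();
      ctx.arc(joint.x, joint.y, 5, 0, 2 * Math.PI); // one dot per tracked joint
      ctx.fill();
    }
  }
};
socket.onerror = () => console.log('Could not reach the Kinect websocket server.');
```

On the server side, the role of Fleck (recommended above for the data transmission) is to accept the websocket connections and push each serialised skeleton frame from the Kinect SDK out to every connected client.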