
Flexible Action and Articulated Skeleton Toolkit (FAAST)

Contributors: Evan A. Suma, Belinda Lange, Skip Rizzo, David Krum, and Mark Bolas. Downloads: 32-bit (recommended for most users) and 64-bit (for advanced users). Note from Evan Suma, the developer of FAAST: I have recently transitioned to a faculty position at USC, and unfortunately that means I have very limited time for further development of the toolkit. You may also view our online video gallery, which contains videos that demonstrate FAAST's capabilities, as well as interesting applications that use the toolkit. Have a Kinect for Windows v2? We have developed an experimental version of FAAST with support for the Kinect for Windows v2, available for download here (64-bit only). Recent news: December 12, 2013 - FAAST 1.2 has been released, adding compatibility with Windows 8. Summary: FAAST is middleware that facilitates the integration of full-body control with games and VR applications, using either OpenNI or the Microsoft Kinect for Windows skeleton tracking software.
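The idea behind middleware like FAAST is to map tracked skeleton poses to emulated input events. The sketch below illustrates that mapping style only; the pose names, the 15-degree threshold, and the binding shape are invented for illustration and are not FAAST's actual configuration format.

```typescript
// Illustrative sketch of a pose-to-key binding table, the general technique
// FAAST-style middleware uses. All names and thresholds here are assumptions.

interface JointAngles {
  leanAngle: number;      // torso lean in degrees; negative = leaning left
  leftArmRaised: boolean; // simplified stand-in for a tracked arm pose
}

interface KeyBinding {
  pose: (j: JointAngles) => boolean; // condition on the tracked skeleton
  key: string;                       // virtual key held while the pose holds
}

const bindings: KeyBinding[] = [
  { pose: j => j.leanAngle < -15, key: "a" },    // lean left  -> strafe left
  { pose: j => j.leanAngle > 15,  key: "d" },    // lean right -> strafe right
  { pose: j => j.leftArmRaised,   key: "space" } // raise arm  -> jump
];

// Return the virtual keys that should be down for the current skeleton frame.
function activeKeys(frame: JointAngles): string[] {
  return bindings.filter(b => b.pose(frame)).map(b => b.key);
}
```

A real system would feed each tracked frame through `activeKeys` and synthesize key-down/key-up events as poses start and stop matching.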

Install FAAST on your PC for full-body control of games and VR applications. Disclaimer: this comprehensive FAAST installation guide was taken from the Institute for Creative Technologies website.

RINIONS - Network System Laboratory - Rinions (ex-SLKinect2): RINIONS stands for Real-time Input from NI/NUI and Output to the Network and Shared memory System. Glossary: NI: Natural Interaction; NUI: Natural User Interface. Rinions transfers animation data from a Kinect/Xtion sensor to the Second Life/OpenSim viewer, enabling real-time animation in Second Life or OpenSim. Rinions also supports BVH animation data files. Technical data: ppt
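For context on the BVH animation files Rinions supports: a BVH file ends with a MOTION section listing one line of channel values per frame. Below is a minimal sketch of pulling those per-frame values out, assuming the text passed in starts at the "Frames:" line; real files prefix this with the MOTION keyword and a HIERARCHY section that defines how many channels each joint contributes.

```typescript
// Minimal BVH MOTION-section reader (sketch). Assumes input begins at the
// "Frames:" header line, e.g.:
//   Frames: 2
//   Frame Time: 0.0333
//   0.0 1.0 2.0
//   3.0 4.0 5.0
function parseBvhMotion(motionText: string): number[][] {
  const lines = motionText.trim().split(/\r?\n/);
  // "Frames: 2" -> 2 frames of animation data
  const frameCount = parseInt(lines[0].split(":")[1], 10);
  // Skip the "Frames:" and "Frame Time:" headers, then read one
  // whitespace-separated float per channel on each frame line.
  return lines
    .slice(2, 2 + frameCount)
    .map(line => line.trim().split(/\s+/).map(Number));
}
```

Mapping each frame's floats back to joint rotations requires the channel order declared in the file's HIERARCHY section, which this sketch deliberately leaves out.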

Kinect Open Source Programming Secrets: Kinect Open Source Programming Secrets (KOPS) is the only book that explains the official Java wrappers for OpenNI and NITE. (If you want installation instructions, scroll down this page a little.) The main drawback of using the PrimeSense Java wrappers is their lack of documentation. As I explain in chapter 1, I had to decompile the libraries' JAR files and work out the correspondences between the Java source and the somewhat better documented C++ OpenNI/NITE APIs. This is why including "secrets" in the book's title isn't too excessive :). This book covers programming topics not found elsewhere.

John McCaffery: Armadillo is a Virtual World client aimed at supporting immersive interactions. It aims to support hands-free input (via the Kinect or other NUI devices) and immersive output (via multiple screens and fisheye projection). Its purpose is to serve as a research platform for experimenting with immersive technology, and also as a practical platform for educators, museum curators, or anyone else wishing to explore immersive interaction in Virtual Worlds.

Kinect & HTML5 using WebSockets and Canvas - Vangos Pterneas blog: Kinect defined Natural User Interaction. HTML5 redefined the Web. Currently, there are various tutorials describing how to interact with a Kinect sensor using Windows Forms or WPF for the user interface. But what about using a web interface for handling Kinect data? Trying to combine these two hot, cutting-edge technologies, I came up with a neat, open-source solution, which I am going to describe in this blog post.
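The browser side of a Kinect-over-WebSocket setup like the one the blog post describes boils down to receiving joint positions as JSON and drawing them on a canvas. The sketch below assumes a made-up message shape (joints with coordinates normalized to 0..1); it is not the blog post's actual protocol, and the minimal `Ctx2D` interface just mirrors the subset of the canvas 2D context used here.

```typescript
// Sketch of the browser side of Kinect-over-WebSocket. Message format and
// names are assumptions for illustration.

interface Joint { x: number; y: number; } // normalized 0..1 skeleton coords

// Minimal subset of CanvasRenderingContext2D used below, so the sketch also
// type-checks outside a browser.
interface Ctx2D {
  canvas: { width: number; height: number };
  clearRect(x: number, y: number, w: number, h: number): void;
  beginPath(): void;
  arc(x: number, y: number, r: number, a0: number, a1: number): void;
  fill(): void;
}

// Map a normalized joint into canvas pixel space.
function toCanvas(j: Joint, width: number, height: number): [number, number] {
  return [j.x * width, j.y * height];
}

// Draw one dot per tracked joint for the current skeleton frame.
function drawSkeleton(ctx: Ctx2D, joints: Joint[]): void {
  ctx.clearRect(0, 0, ctx.canvas.width, ctx.canvas.height);
  for (const j of joints) {
    const [px, py] = toCanvas(j, ctx.canvas.width, ctx.canvas.height);
    ctx.beginPath();
    ctx.arc(px, py, 4, 0, 2 * Math.PI);
    ctx.fill();
  }
}

// Wiring it up in a browser (assumed port and payload shape):
// const ws = new WebSocket("ws://localhost:8181");
// ws.onmessage = e => drawSkeleton(ctx, JSON.parse(e.data).joints);
```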

The standard framework for 3D sensing

TUIO hardware support: In order to support the further development of the TUIO platform, we are looking for donations of various hardware: iOS and Android tablets, touchscreen hardware, Windows 7 multitouch notebooks, netbooks and tablets, MS Surface, Samsung SUR40, Magic Trackpad, or any other devices, so that we can evaluate their existing or potential TUIO support. Please get in touch with martin_at_tuio_dot_org for further information! TUIO Tracker Implementations. TUIO Server Reference Implementations.

Accessing the Kinect in JavaScript through WebSockets - Aboutme blog: Good morning all! (Or evening, night, ... depending on when you read this post, of course.) As you might know, I've been working on AIRKinect, and I also have a side project, AIRServer (which allows you to set up AIR as a socket server, including WebSocket support). Wouldn't it be fun to combine these two projects in a demo, so you can access the Kinect information through a WebSocket?

Bill Buxton, Microsoft Research. Original: Jan. 12, 2007. Version: June 12, 2014. Keywords / search terms: multi-touch, multitouch, input, interaction, touch screen, touch tablet, multi-finger input, multi-hand input, bi-manual input, two-handed input, multi-person input, interactive surfaces, soft machine, hand gesture, gesture recognition.

KinectJS - HTML5 goes motion. Kinesis - build gesture apps.
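On the TUIO side: TUIO 1.1 trackers describe touch cursors with OSC messages of the form `/tuio/2Dcur set s x y X Y m` (session id, normalized position, velocity, motion acceleration). The sketch below decodes one such "set" message, assuming an OSC library has already parsed the message arguments into a plain array; the `TuioCursor` shape is my own naming, not from a particular TUIO client library.

```typescript
// Decode a TUIO 1.1 /tuio/2Dcur "set" message whose OSC arguments have
// already been parsed into an array, e.g. ["set", 3, 0.5, 0.25, 0, 0, 0].
// Argument order per the TUIO 1.1 spec: set s x y X Y m.

interface TuioCursor {
  sessionId: number;
  x: number;   // position, normalized 0..1
  y: number;
  vx: number;  // velocity
  vy: number;
  accel: number; // motion acceleration
}

function decode2DcurSet(args: (string | number)[]): TuioCursor | null {
  // "alive" and "fseq" messages share the same OSC address; skip them here.
  if (args[0] !== "set") return null;
  const [, s, x, y, vx, vy, m] =
    args as [string, number, number, number, number, number, number];
  return { sessionId: s, x, y, vx, vy, accel: m };
}
```

A full client would also track "alive" messages to detect removed cursors and "fseq" frame numbers to drop out-of-order bundles.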