
Flexible Action and Articulated Skeleton Toolkit (FAAST)

Contributors: Evan A. Suma, Belinda Lange, Skip Rizzo, David Krum, and Mark Bolas. Project email address: faast@ict.usc.edu. Downloads: 32-bit (recommended for most users), 64-bit (for advanced users).

Note from Evan Suma, the developer of FAAST: I have recently transitioned to a faculty position at USC, and unfortunately that means I have very limited time for further development of the toolkit. You may also view our online video gallery, which contains videos that demonstrate FAAST's capabilities, as well as interesting applications that use the toolkit. Have a Kinect for Windows v2? We have developed an experimental version of FAAST with support for the Kinect for Windows v2, available for download (64-bit only).

Recent news: December 12, 2013 - FAAST 1.2 has been released, adding compatibility with Windows 8.

Summary: FAAST is middleware that facilitates integrating full-body control with games and VR applications, using either the OpenNI or the Microsoft Kinect for Windows skeleton tracking software.

http://projects.ict.usc.edu/mxr/faast/
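FAAST also includes a VRPN server that streams the tracked skeleton joints, so any VRPN-aware application can consume them as tracker sensors. Below is a minimal C++ client sketch; the device name Tracker0@localhost and the one-joint-per-sensor layout are assumptions that depend on how the FAAST server is actually configured.

```cpp
#include <vrpn_Tracker.h>
#include <vrpn_Shared.h>
#include <cstdio>

// Called once per sensor report: each skeleton joint arrives as a separate
// sensor carrying a position (meters) and an orientation quaternion.
void VRPN_CALLBACK handlePose(void* /*userData*/, const vrpn_TRACKERCB t) {
    std::printf("sensor %d  pos (%.3f, %.3f, %.3f)\n",
                static_cast<int>(t.sensor), t.pos[0], t.pos[1], t.pos[2]);
}

int main() {
    // Assumed device name; replace with whatever the FAAST server exposes.
    vrpn_Tracker_Remote tracker("Tracker0@localhost");
    tracker.register_change_handler(nullptr, handlePose);

    while (true) {
        tracker.mainloop();    // pump the connection and fire callbacks
        vrpn_SleepMsecs(1);    // avoid busy-waiting
    }
    return 0;
}
```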


VRPN - Virtual Reality Peripheral Network. New! A One-Euro filter that reads in a tracker's position/orientation and filters it, producing a less jittery tracker output (Jan Ciger). Updated manufacturer-supplied Polhemus drivers.

Using Kinect + OpenNI to Embody an Avatar in Second Life. Download the software to connect the Microsoft Kinect to Second Life. At the MxR Lab at the University of Southern California Institute for Creative Technologies, we are developing methods of recognizing social gestures in order to explore the transference of emotion and gesture between a virtual world and the real world. Thai Phan, an engineer at the MxR Lab, has developed new software, built on the OpenNI toolkit, that uses the Kinect to read gestures and trigger corresponding server-side scripts within Second Life. These methods may allow the user to feel a deeper emotional connection to the social gesture performed by their virtual avatar, regardless of the bond that already exists between the user and the recipient. Instead of having to think about pressing the right sequence of keys to make a 'wave' gesture, the user can simply raise their hand and wave.
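The One-Euro filter mentioned above is a simple adaptive low-pass filter: it smooths heavily when the tracked value changes slowly (suppressing jitter) and smooths lightly when it changes quickly (limiting lag). Here is a minimal single-axis sketch in C++, written from the published algorithm rather than taken from the VRPN sources; the class names and the default minCutoff/beta values are illustrative and would need tuning per device.

```cpp
#include <cmath>

// Exponential smoothing: y = alpha * x + (1 - alpha) * y_prev.
class LowPass {
public:
    double filter(double x, double alpha) {
        if (!initialized_) { value_ = x; initialized_ = true; }
        value_ = alpha * x + (1.0 - alpha) * value_;
        return value_;
    }
private:
    bool initialized_ = false;
    double value_ = 0.0;
};

// One-axis One-Euro filter: the cutoff frequency adapts to the signal speed.
class OneEuroFilter {
public:
    OneEuroFilter(double rateHz, double minCutoff = 1.0,
                  double beta = 0.007, double dCutoff = 1.0)
        : rate_(rateHz), minCutoff_(minCutoff), beta_(beta), dCutoff_(dCutoff) {}

    double filter(double x) {
        double dx = first_ ? 0.0 : (x - prevX_) * rate_;      // estimated speed
        first_ = false;
        prevX_ = x;
        double edx = dxFilter_.filter(dx, alpha(dCutoff_));   // smoothed speed
        double cutoff = minCutoff_ + beta_ * std::fabs(edx);  // adaptive cutoff
        return xFilter_.filter(x, alpha(cutoff));
    }

private:
    double alpha(double cutoff) const {
        const double kPi = 3.14159265358979323846;
        double tau = 1.0 / (2.0 * kPi * cutoff);  // filter time constant
        double te = 1.0 / rate_;                  // sampling period
        return 1.0 / (1.0 + tau / te);
    }

    double rate_, minCutoff_, beta_, dCutoff_;
    bool first_ = true;
    double prevX_ = 0.0;
    LowPass xFilter_, dxFilter_;
};
```

In a tracker context one instance would typically run per position axis (with orientation handled separately); raising beta reduces lag during fast motion, while raising minCutoff reduces residual jitter at rest.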

Programming for Kinect: 9 Excellent Programming Resources for Kinect. Sunday, November 13th, 2011 (last edited on 12/6/2011). Students, educators, and enthusiasts are creating amazing things with Microsoft's Kinect for education. While new Kinect development resources are constantly emerging, here are 9 people-driven and digital resources that offer assistance for developing Kinect applications.

The Need for More Natural Man/Machine User Interfaces. Advances in the fields of 3D computer graphics, sensor technologies, and display technologies are converging to enable the development of highly immersive, fully interactive 3D simulation environments, typically known as "Virtual Reality". However, the ability to become physically immersed in these virtual worlds is currently limited by game-controller and window-based user interfaces that require character movements and actions to be controlled with the hands, for example by moving a joystick or mouse, and/or pressing keys and buttons. Not only do these hand-based interfaces fail to take advantage of the full capabilities of the human body, but they also limit the number of degrees of freedom a user can control simultaneously. Another limitation is that users must typically face the television or computer screen when interacting with the computer or game system.

KinEmote: Kinect gesture control for Boxee and XBMC media centers now available (video) We've seen plenty of Kinect hacks over the last few weeks -- trouble is, beyond the initial wow factor they're just not very useful on a daily basis. That situation just changed, however, with the release of KinEmote, a free public beta that lets Windows users navigate XBMC and Boxee menus using nothing but hand gestures. Better yet, the software is built around OpenNI and NITE middleware from PrimeSense, the company behind the Project Natal reference gear. It certainly looks impressive in the video after the break. Good enough that we suspect many of you will hit up the source link below instead of finishing up your last minute holiday shopping -- hey, Santa can wait, this is progress!

A Review of iPi Soft's Markerless Motion Capture System (NYC Production & Post News). Motion capture, or mocap, has made its place as part of the modern animator's toolkit. For many styles of animation, going with mocap instead of traditional animation saves time and cuts budgets. However, until recently, only productions capable of investing in many thousands of dollars' worth of cameras and software from companies such as Polhemus and Vicon Systems could even consider using this approach.

Matt's Webcorner - Kinect Sensor Programming. The Kinect is an attachment for the Xbox 360 that combines four microphones, a standard RGB camera, a depth camera, and a motorized tilt. Although none of these components is individually new, depth sensors have previously cost over $5,000, and the comparatively cheap $150 price tag of the Kinect makes it highly accessible to hobbyists and academics. This has spurred a lot of work on creating functional drivers for many operating systems so the Kinect can be used outside of the Xbox 360. You can find a decent overview of the current state of people working on the Kinect here.
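As a concrete example of what those drivers make possible, here is a short sketch using the open-source libfreenect driver's synchronous wrapper (an assumption on my part: it is not the only option, and include paths vary between installations). It grabs a single 640x480 depth frame, where each pixel holds an 11-bit raw disparity value.

```cpp
#include <libfreenect/libfreenect.h>
#include <libfreenect/libfreenect_sync.h>
#include <cstdint>
#include <cstdio>

int main() {
    uint16_t* depth = nullptr;   // will point at 640*480 raw 11-bit values
    uint32_t timestamp = 0;

    // Blocks until a depth frame is available from device 0.
    if (freenect_sync_get_depth(reinterpret_cast<void**>(&depth),
                                &timestamp, 0, FREENECT_DEPTH_11BIT) != 0) {
        std::fprintf(stderr, "No Kinect found\n");
        return 1;
    }

    // Raw value at the image centre; 2047 means "no reading" in this format.
    uint16_t centre = depth[240 * 640 + 320];
    std::printf("centre raw depth = %u (timestamp %u)\n",
                static_cast<unsigned>(centre), static_cast<unsigned>(timestamp));

    freenect_sync_stop();   // shut down the background capture thread
    return 0;
}
```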

Immersive Game Makes You Duck and Dodge IRL. For those looking for a truly immersive gaming experience -- where you have to actually duck and dodge every bullet -- the demo above will amaze you. A UK-based developer built the demo around the Razer Hydra, a motion controller for PC gaming, which makes the project possible. HydraDeck is currently a work in progress for "Teddy0k", who has been posting updates on his project on the Oculus VR developer forums. The full demo shows him crouching and using cover while blasting enemies in the game.

How to Sync Your Media Across Your Entire House with XBMC. XBMC is an awesome media center solution, but when you're using it all over your house, your library updates and watched-media lists get out of sync. Read on as we show how to keep all your media centers on the same page. Note: This how-to guide was originally published in September 2011 and detailed how to set up whole-house media syncing for XBMC "Dharma" 10.0. We've updated the guide for the newer, more user-friendly MySQL integration included in XBMC "Eden" 11.0.

IKLONE ZERO - 3D animation with iClone: ICLONE WORLD - MUNDO ICLONE. iClone 5 PRO - what's new in this version. iClone G5 NEXT GEN & Street Dance: amazing STREET DANCE motions to apply to avatars.

Processing - V // Pixelnerve. I've made an application as an example for this thread on how to compute/evaluate a cubic Bézier curve using a geometry shader. The formula is pretty straightforward, as described by this Wikipedia article (look for Cubic Bézier Curve). I will not go over the Bézier math or theory. I assume you have some knowledge of shader programming (GLSL in this case); some math background would help, though it isn't really needed. All that said, let's get to work. In this case we will need 4 points: 2 anchor points (the line end points) and 2 control points.
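The geometry-shader trick in that post boils down to evaluating the cubic Bernstein blend of the four input points many times per primitive and emitting the results as a line strip. Here is a CPU-side C++ sketch of the same evaluation (the control-point values and the 32-segment tessellation are illustrative, not from the original post).

```cpp
#include <array>
#include <cstdio>
#include <vector>

struct Vec2 { float x, y; };

// Cubic Bezier: B(t) = (1-t)^3 P0 + 3(1-t)^2 t P1 + 3(1-t) t^2 P2 + t^3 P3,
// with P0/P3 the anchor points and P1/P2 the control points.
Vec2 cubicBezier(const std::array<Vec2, 4>& p, float t) {
    float u = 1.0f - t;
    float b0 = u * u * u;
    float b1 = 3.0f * u * u * t;
    float b2 = 3.0f * u * t * t;
    float b3 = t * t * t;
    return { b0 * p[0].x + b1 * p[1].x + b2 * p[2].x + b3 * p[3].x,
             b0 * p[0].y + b1 * p[1].y + b2 * p[2].y + b3 * p[3].y };
}

int main() {
    // Two anchors at the ends, two control points pulling the curve upward.
    std::array<Vec2, 4> pts = { Vec2{0.0f, 0.0f}, Vec2{0.25f, 1.0f},
                                Vec2{0.75f, 1.0f}, Vec2{1.0f, 0.0f} };

    // A geometry shader would EmitVertex() once per iteration of this loop.
    std::vector<Vec2> polyline;
    const int segments = 32;
    for (int i = 0; i <= segments; ++i)
        polyline.push_back(cubicBezier(pts, static_cast<float>(i) / segments));

    std::printf("sampled %zu points, midpoint (%.3f, %.3f)\n",
                polyline.size(), polyline[16].x, polyline[16].y);
    return 0;
}
```

In the shader version the four points would typically arrive as a lines-adjacency primitive and the loop would run in the geometry stage, emitting the sampled vertices as a line strip.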
