KinEmote User Forums


Flexible Action and Articulated Skeleton Toolkit (FAAST). Contributors: Evan A. Suma, Belinda Lange, Skip Rizzo, David Krum, and Mark Bolas. Downloads: 32-bit (recommended for most users) and 64-bit (for advanced users). Note from Evan Suma, the developer of FAAST: I have recently transitioned to a faculty position at USC, and unfortunately that means I have very limited time for further development of the toolkit. You may also view our online video gallery, which contains videos that demonstrate FAAST's capabilities, as well as interesting applications that use the toolkit. Have a Kinect for Windows v2? We have developed an experimental version of FAAST with support for the Kinect for Windows v2, available for download here (64-bit only). Recent News (December 12, 2013): FAAST 1.2 has been released, adding compatibility for Windows 8. Summary: FAAST is middleware that facilitates integrating full-body control with games and VR applications, using either OpenNI or the Microsoft Kinect for Windows skeleton tracking software.
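The core idea behind FAAST's bindings, testing skeleton poses each frame and emitting emulated input events when they match, can be sketched as follows. This is an illustrative Python sketch, not FAAST's actual engine; the joint names, coordinate convention, and threshold are hypothetical.

```python
# Illustrative sketch (not FAAST's actual code): evaluate a simple skeletal
# pose condition against the current frame and fire a bound key event.
# Joints are (x, y, z) tuples in meters; names and thresholds are hypothetical.

def lean_left(skeleton, threshold=0.15):
    """True if the head is offset to the left of the torso by more than
    `threshold` meters along the x axis."""
    return skeleton["head"][0] < skeleton["torso"][0] - threshold

def fire_bindings(skeleton, bindings):
    """Return the key events triggered by the current skeleton frame.

    `bindings` is a list of (pose_test, key) pairs, mirroring the way
    FAAST lets users map a gesture to a keyboard press.
    """
    return [key for pose_test, key in bindings if pose_test(skeleton)]

bindings = [(lean_left, "a")]
frame = {"head": (-0.3, 1.6, 2.0), "torso": (0.0, 1.1, 2.0)}
print(fire_bindings(frame, bindings))  # ['a']
```

A real system would run this test on every tracked frame and debounce repeated triggers; the sketch only shows the pose-to-key mapping itself.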

OpenNI - Home. MouseTrap Mission Statement: The MouseTrap mission is to provide an exceptional alternative input system for GNOME, with a particular eye toward aiding the physically impaired community. Our aim is to give users the option to replace a mouse with a low-cost webcam that can interpret a user's head movements as computer input. What is MouseTrap? MouseTrap is a standalone GNOME application that allows users with physical impairments to move a mouse cursor. It is written in Python, based on the OpenCV library, and uses image processing to translate the user's head movements into mouse events (movements, clicks), allowing users to interact with different desktop managers and applications. Need to see it working? Several videos demonstrate how MouseTrap works.
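The translation step MouseTrap performs, turning a tracked face position into pointer movement, can be sketched as follows. This is an illustrative Python sketch, not MouseTrap's actual code; the gain and dead-zone values are hypothetical tuning parameters, and the face position is assumed to come from an OpenCV detector run on each webcam frame.

```python
# Illustrative sketch of head-to-cursor mapping: convert the offset of the
# detected face center from the frame center into a relative cursor move.
# Gain and dead-zone values are hypothetical tuning parameters.

def head_to_cursor_delta(face_center, frame_center, gain=4.0, dead_zone=10):
    """Convert the face's pixel offset from the frame center into a cursor delta.

    Offsets smaller than `dead_zone` pixels are ignored so that small,
    involuntary head movements do not make the pointer jitter.
    """
    dx = face_center[0] - frame_center[0]
    dy = face_center[1] - frame_center[1]
    move_x = int(dx * gain) if abs(dx) > dead_zone else 0
    move_y = int(dy * gain) if abs(dy) > dead_zone else 0
    return move_x, move_y

# Face detected 20 px right of center: cursor moves right, no vertical motion.
print(head_to_cursor_delta((340, 240), (320, 240)))  # (80, 0)
```

In the real application this delta would be handed to the desktop's pointer API; clicks are typically generated from dwell time or a deliberate gesture.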

The Simple Idea behind This Mind-Blowing 3D Interactive Sandbox. Playing in the sandbox used to be my favorite activity as a small child. I remember making pretend volcanoes, rivers, lakes, and tunnels in the sand. Well, researchers at UC Davis have come up with a way to bring those imaginary landscapes to life with interactive 3D projection technology. The results are simply breathtaking! When you were a kid, did your sandbox have active volcanoes? This amazing interactive sandbox responds to your actions, and it can be built at home using commonplace technology: all it takes is a digital projector and a Kinect. The projector displays an interactive topographic map, with contour lines and elevation updated in real time. You can make hills and valleys, and the computer changes the projection to match the landscape! Museums around the world are starting to create their own interactive sandboxes… It's not only fun… it's a great way to teach kids about geography! Watch the full demo video here… I can't wait to try this!
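The sandbox's core loop is simple to sketch: the Kinect reports a depth value for each point of the sand surface, and the software quantizes those depths into elevation bands that the projector colors like a topographic map. The sketch below is illustrative only; the sensor distances, band count, and sample values are hypothetical, not taken from the UC Davis project.

```python
import numpy as np

# Illustrative sketch of the sandbox pipeline: quantize a Kinect depth map
# (millimeters from the sensor) into discrete elevation bands for the
# projector to color. Sensor range and band count are hypothetical.

def elevation_bands(depth_mm, min_mm=800, max_mm=1200, n_bands=5):
    """Map raw depth to an elevation band index per pixel.

    The Kinect looks down at the sand, so smaller depth means higher sand;
    depth is inverted so band 0 is the lowest terrain and band
    n_bands - 1 the highest.
    """
    clipped = np.clip(depth_mm, min_mm, max_mm)
    height = (max_mm - clipped) / (max_mm - min_mm)  # 0.0 low .. 1.0 high
    return np.minimum((height * n_bands).astype(int), n_bands - 1)

depth = np.array([[1200, 1000],
                  [900,  800]])
print(elevation_bands(depth))  # [[0 2]
                               #  [3 4]]
```

Rendering contour lines then amounts to drawing the boundaries between adjacent bands, and the projection updates as fast as new depth frames arrive, which is why piling up sand changes the map immediately.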

Kinect for Windows Software and Multitouch Monitors
avin2/SensorKinect - GitHub
The Khronos Group Inc.
Johnny Chung Lee - Projects - Wii. As of June 2008, Nintendo has sold nearly 30 million Wii game consoles. This significantly exceeds the number of Tablet PCs in use today, even by the most generous estimates of Tablet PC sales, which makes the Wii Remote one of the most common computer input devices in the world. It also happens to be one of the most sophisticated. Any software on this page is primarily meant for developers and may not run without the proper development tools installed. NOTE: For most of these projects, you don't need the Nintendo Wii console. Coming Later: 3D tracking, and more... Unfortunately, time constraints in the next couple of months have significantly reduced my ability to work on more projects.

Ultrasound Used To Create 3D Shapes In Mid-Air That Can Be Seen And Felt. You may not have heard of it before, but haptic technology is all around us. The buzz of your smartphone as you tap the keys, or the rumble of your Wii controller as you smash a tennis ball, are both haptic effects. But this touch-feedback technology has uses far beyond enhancing your gaming experience; it is used in the rehabilitation of stroke patients and even in surgical training. Now, scientists have invented a new method of haptic feedback using ultrasound, which creates 3D haptic shapes in mid-air that can be seen and felt. The researchers, who are based at the University of Bristol, envisage that this innovative technology could transform the way we use 3D haptic shapes. The method, described in ACM Transactions on Graphics, exploits an effect produced by ultrasound called acoustic radiation force, which arises from the scattering and absorption of the acoustic wave. By adding these invisible 3D shapes to 3D displays, scientists can create something that can be both seen and felt.
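The principle behind focusing ultrasound at a point in mid-air is straightforward to sketch: each transducer in a phased array is driven with a phase offset that compensates for its distance to the focal point, so all the waves arrive there in phase and their pressure adds up. The sketch below is illustrative only; the array geometry and the 40 kHz frequency are typical values for airborne ultrasound hardware, not figures from the Bristol paper.

```python
import math

# Illustrative sketch of phased-array focusing, the basis of mid-air haptic
# feedback: compute a per-transducer drive phase so all emitted waves
# arrive in phase at a chosen focal point. Geometry values are hypothetical.

SPEED_OF_SOUND = 343.0  # m/s in air
FREQ = 40_000.0         # Hz, common for airborne ultrasound transducers

def focus_phases(transducers, focal_point):
    """Return the drive phase (radians, in [0, 2*pi)) for each transducer.

    Driving element i with phase k * d_i (k = 2*pi / wavelength, d_i its
    distance to the focus) makes every wave's phase at the focus identical.
    """
    wavelength = SPEED_OF_SOUND / FREQ
    phases = []
    for tx in transducers:
        dist = math.dist(tx, focal_point)
        phases.append((2 * math.pi * dist / wavelength) % (2 * math.pi))
    return phases

# A 10-element linear array with 1 cm pitch, focusing 20 cm above its center.
array = [(x / 100, 0.0, 0.0) for x in range(10)]
phases = focus_phases(array, (0.045, 0.0, 0.20))
print([round(p, 2) for p in phases])
```

Moving the focal point and modulating the output over time is what lets an array "draw" a shape the hand can feel; note that elements placed symmetrically about the focus get identical phases, as expected.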

Trackmate
1. Print Tags. Trackmate uses a small, specially designed circular barcode that stores information which can be easily decoded by the Trackmate Tracker. The tag measures less than 1" x 1", contains a six-byte unique ID (over 280 trillion unique IDs are possible), and is entirely open source. Download the Trackmate Tagger for Windows, Mac OS X, or Linux to create your own sets of randomly generated tags and print them from your computer.
2. There are a lot of different ways that you can build a Trackmate system.
3. The Trackmate Tracker reads Trackmate tags (by processing images from a webcam) and then sends the corresponding data to any spatial application via LusidOSC.
4. Trackmate sends object data via LusidOSC (a protocol layer for unique spatial input devices), allowing any LusidOSC-based application to work with the system.
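The six-byte ID space mentioned in step 1 is easy to verify and to work with in code. The sketch below is illustrative only, showing the 48-bit ID arithmetic, not Trackmate's actual barcode geometry or decoder.

```python
import os

# Illustrative sketch of Trackmate's six-byte tag ID space (not the actual
# circular-barcode encoding): generate, pack, and unpack 48-bit IDs, and
# confirm the "over 280 trillion unique IDs" figure quoted above.

def random_tag_id() -> bytes:
    """Generate a random six-byte tag ID, as a tagger tool might."""
    return os.urandom(6)

def id_to_int(tag: bytes) -> int:
    return int.from_bytes(tag, "big")

def int_to_id(value: int) -> bytes:
    return value.to_bytes(6, "big")

total_ids = 2 ** 48
print(f"{total_ids:,}")  # 281,474,976,710,656 - indeed over 280 trillion

tag = random_tag_id()
assert int_to_id(id_to_int(tag)) == tag  # round-trips losslessly
```

In a full system the decoded integer would be sent to the listening application inside a LusidOSC message along with the tag's position and rotation.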

Interactive Media Division. Throughout the course of my degree, one debate raised in our very first class meeting of our first year was the concept of traditional authorial narrative vs. emergent narrative. Traditional authorial narrative is what we've come to know from film-based, non-interactive media, whereas emergent narrative is procedurally generated by way of a designed system. As I head toward the end of my second year, it's less of a balanced argument: traditional narrative in games (ludus, as Roger Caillois named it in Man, Play and Games) seems to be leaning on its predecessors as a crutch, while systemic narrative (paidia) is beginning to show the uniqueness of the new medium that we're witnessing mature before our eyes. This actually has to do with the concept of "free will". It makes sense that earlier, more primitive games are more ludus-based, as most of them pre-date multilateral gameplay. Before the advent of the home console, we would go to the coin-operated video game arcade.

MinGW | Minimalist GNU for Windows
Open Sources
Domain Name Does Not Exist In The Database. DotNetNuke supports multiple portals from a single database/codebase. It accomplishes this by converting the URL of the client browser Request to a valid PortalID in the Portals database table. The following steps describe the process: When a web server receives a Request from a client browser, it compares the file name extension of the target URL resource to the Application Extension Mappings defined in IIS. Based on the corresponding match, IIS then sends the Request to the defined executable path (aspnet_isapi.dll in the case of ASP.NET Requests). The aspnet_isapi.dll engine processes the Request in an ordered series of events, beginning with Application_BeginRequest. The Request URL is parsed on the "/" character, and a Domain Name is constructed from the relevant parsed URL segments.
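The parsing step at the end can be sketched as follows. This is a hypothetical Python illustration of the idea, not DotNetNuke's actual (VB/C#) code; the example URLs and the portal table contents are invented for the demonstration.

```python
# Illustrative sketch (hypothetical, not DotNetNuke's code) of the step
# above: split the request URL on "/", rebuild the domain-name key, and
# look it up against a Portals-style table to resolve a PortalID.

def url_to_portal_key(url: str) -> str:
    """Strip the scheme, then keep the host plus any virtual-directory
    segments up to (but not including) the requested .aspx resource."""
    without_scheme = url.split("://", 1)[-1]
    segments = without_scheme.split("/")
    kept = [s for s in segments if s and not s.lower().endswith(".aspx")]
    return "/".join(kept)

# A hypothetical portal lookup table keyed by constructed domain name.
portals = {"www.example.com": 0, "www.example.com/portal1": 1}

key = url_to_portal_key("http://www.example.com/portal1/default.aspx")
print(key, "->", portals.get(key))  # www.example.com/portal1 -> 1
```

When the constructed key matches no row in the table, the lookup fails, which is exactly the "Domain Name Does Not Exist In The Database" error this article is about.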