Vision platforms - tools (e.g. Processing, a.k.a. proce55ing)
Autodesk 123D - easy 3D capture
Groups: Sensors and MEMS are everywhere. Automobiles, computers, medical devices, a staggering array of consumer electronics: these are just a few of the places you'll find sensor technology around you. Join element14's Sensors group today and stay current.
faceAPI | Seeing Machines. Track and understand faces like never before with faceAPI from Seeing Machines, now available for license. faceAPI lets you incorporate Seeing Machines' world-class face-tracking technology into your own product or application. It provides a suite of image-processing modules created specifically for tracking and understanding faces and facial features. These tracking modules are combined into a complete API toolkit that delivers a rich stream of information you can incorporate into your own products or services. Seeing Machines faceAPI is a comprehensive, integrated solution for developing products that leverage real-time face tracking: all image processing is handled internally, so no computer-vision experience is required. Version 3.2.6 now available.
Do It Yourself - Hacking Real Life
CNC - Rapid prototyping
EyesWeb
Handy AR - Overview. Handy AR presents a vision-based user interface that tracks a user's outstretched hand and uses it as the reference pattern for augmented reality (AR) inspection, providing 6-DOF camera pose estimation from the tracked fingertip configuration. A hand pose model is constructed in a one-time calibration step by measuring the fingertip positions relative to each other in the presence of ground-truth scale information. Through frame-by-frame reconstruction of the camera pose relative to the hand, 3D graphics annotations can be stabilized on top of the hand, allowing the user to inspect such virtual objects conveniently from different viewing angles in AR. Fingertip Detection
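The pose-estimation idea above rests on the pinhole camera model: given the calibrated 3D fingertip positions and a camera pose (R, t), each fingertip projects to a predictable image point, and the tracker inverts that relation from the detected fingertips. A minimal forward-projection sketch in plain Python (the fingertip coordinates, pose, and intrinsics are made-up illustration values, not data from Handy AR):

```python
# Minimal pinhole-projection sketch: project calibrated 3D fingertip
# positions into the image for a given camera pose (R, t).
# All numbers here are illustrative placeholders, not Handy AR's data.
import math

def mat_vec(M, v):
    """3x3 matrix times 3-vector."""
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

def project(point_3d, R, t, fx, fy, cx, cy):
    """Project a 3D point through a pinhole camera with pose (R, t)."""
    Xc = [a + b for a, b in zip(mat_vec(R, point_3d), t)]  # camera frame
    return (fx * Xc[0] / Xc[2] + cx, fy * Xc[1] / Xc[2] + cy)

# Hypothetical calibrated fingertip model (metres, hand frame).
fingertips = [[0.00, 0.09, 0.0], [0.03, 0.10, 0.0], [-0.03, 0.08, 0.0]]

# Camera rotated 10 degrees about Y, half a metre from the hand.
a = math.radians(10.0)
R = [[math.cos(a), 0.0, math.sin(a)],
     [0.0, 1.0, 0.0],
     [-math.sin(a), 0.0, math.cos(a)]]
t = [0.0, 0.0, 0.5]

pixels = [project(p, R, t, fx=600.0, fy=600.0, cx=320.0, cy=240.0)
          for p in fingertips]
print(pixels)
```

Pose estimation is the inverse problem: find the (R, t) whose projected model points best match the detected fingertips (a perspective-n-point problem in computer-vision terms).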
He promised it, and it's done: the young Indian prodigy Pranav Mistry is making the source code and hardware schematics of his revolutionary invention, SixthSense, available to everyone starting today. Anyone, and especially the most tech-savvy, can now set about building tomorrow's revolutionary product in their own workshop, because SixthSense has just been made public and completely open. SixthSense goes open source: technological genius set free.
OpenViBE - Brain-computer interaction
How to Control Animata With OSC from Max/MSP and Pure Data « Månsteri [mons-te-ri]. If you haven't heard of Animata yet, you should head over to http://animata.kibu.hu/index.html and educate yourself. Download the software and go through the tutorials. I also recommend reading through the mailing list; it has tons of useful information. Controlling Animata with a mouse and doing real-time animations is pretty cool by itself, but Animata really shows its true potential when you control it with OSC.
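Because OSC is a simple binary format over UDP, you can drive an OSC-controlled application like Animata from almost any language, not only Max/MSP or Pure Data. Below is a minimal sketch of encoding and sending an OSC 1.0 message using only the Python standard library; the "/joint" address, argument layout, and port 7110 are hypothetical placeholders, so check Animata's documentation for the address patterns and port it actually listens on:

```python
# Minimal OSC 1.0 message encoder and UDP sender (stdlib only).
# The "/joint" address, arguments, and port below are hypothetical
# placeholders; consult Animata's docs for its real OSC namespace.
import socket
import struct

def _pad(data: bytes) -> bytes:
    """OSC strings are null-terminated and padded to a 4-byte boundary."""
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address: str, *args) -> bytes:
    """Encode an OSC message with string/float/int arguments."""
    tags, payload = ",", b""
    for arg in args:
        if isinstance(arg, str):
            tags += "s"
            payload += _pad(arg.encode())
        elif isinstance(arg, float):
            tags += "f"
            payload += struct.pack(">f", arg)  # big-endian float32
        elif isinstance(arg, int):
            tags += "i"
            payload += struct.pack(">i", arg)  # big-endian int32
        else:
            raise TypeError(f"unsupported OSC type: {type(arg)}")
    return _pad(address.encode()) + _pad(tags.encode()) + payload

msg = osc_message("/joint", "head", 100.0, 200.0)

# Fire it at a (hypothetical) Animata instance on localhost.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(msg, ("127.0.0.1", 7110))
sock.close()
```

The same encoder works for any OSC receiver; only the address patterns and argument types change per application.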
> exTouch Created at the Tangible Media Group at MIT in collaboration with Sony Corporation, exTouch enables spatially-aware embodied manipulation of actuated objects, mediated by augmented reality. > Duration Created by James George and co-developed by YCAM InterLab during the Guest Research Project v. > Reactor for Awareness in Motion (RAM) by Yoko Ando and YCAM
Contributors: Evan A. Suma, Belinda Lange, Skip Rizzo, David Krum, and Mark Bolas. Project email address: firstname.lastname@example.org. Note: the new version of FAAST only supports discrete mouse events at this time.