
Gesture recognition


Leap Motion gesture control technology hands-on. Leap Motion unveiled its new gesture control technology earlier this week, along with videos showing the system tracking ten fingers with ease and a single digit slicing and dicing a grocery store's worth of produce in Fruit Ninja. Still, doubts persisted about the claim that the Leap is 200 times more accurate than existing tech. So, we decided to head up to San Francisco to talk with the men behind Leap, David Holz and Michael Buckwald, and see it for ourselves. Join us after the break to learn a bit more about Leap, read our impressions of the technology, and see a video of the thing in action. Before diving into the more technical details of the device he created, Holz told us about the genesis of his idea to create a better way for humans to interact with their computing devices.

In practice, the Leap is impressive. What's next for Leap Motion?

Qualcomm shows off the Snapdragon S4's potential on video. Chipmaker Qualcomm has posted two demonstration videos highlighting the performance of its Snapdragon S4 and Snapdragon S4 Prime chips. A new variant of the Snapdragon architecture, the S4 Prime is a chip dedicated to the connected-TV space, as the accompanying video illustrates. It shows off, in quick succession, the chip's gaming, multimedia, and social capabilities, with facial recognition and Kinect-style advanced motion detection at the center of it all. The second video gives a clearer sense of the Snapdragon S4's power in a Windows RT environment. It features a multiplayer game played across the globe between two Qualcomm test devices, with two (or more) players shown competing online over the 4G/LTE connectivity built into the chipmaker's silicon.

Computer learning to read lips to detect emotions. Open the pod bay doors, HAL. Scientists in Malaysia are teaching a computer to interpret human emotions based on lip patterns. The system could improve the way we interact with computers and perhaps allow disabled people to use computer-based communications devices, such as voice synthesizers, more effectively and more efficiently, says Karthigayan Muthukaruppan of Manipal International University. The system uses a genetic algorithm that improves with each iteration to match irregular ellipse-fitting equations to the shape of a human mouth displaying different emotions.
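To make the idea concrete, here is a minimal sketch of how a genetic algorithm can iteratively fit ellipse parameters to mouth-contour points. This is an illustrative toy, not the researchers' actual system: the axis-aligned ellipse model, the mutation-only evolution loop, and the fitness function are all simplifying assumptions.

```python
import math
import random

def ellipse_error(params, points):
    """Fitness: sum of squared algebraic residuals of points
    against an axis-aligned ellipse (cx, cy, a, b)."""
    cx, cy, a, b = params
    return sum((((x - cx) / a) ** 2 + ((y - cy) / b) ** 2 - 1.0) ** 2
               for x, y in points)

def fit_ellipse_ga(points, generations=200, pop_size=40, seed=0):
    rng = random.Random(seed)
    # Random initial population of candidate ellipses (cx, cy, a, b).
    pop = [[rng.uniform(-1, 1), rng.uniform(-1, 1),
            rng.uniform(0.5, 3.0), rng.uniform(0.2, 2.0)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: keep the fitter half of the population.
        pop.sort(key=lambda p: ellipse_error(p, points))
        survivors = pop[:pop_size // 2]
        # Mutation: each survivor spawns a slightly perturbed child.
        children = []
        for parent in survivors:
            child = [g + rng.gauss(0, 0.05) for g in parent]
            child[2] = max(child[2], 1e-3)  # keep semi-axes positive
            child[3] = max(child[3], 1e-3)
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda p: ellipse_error(p, points))

# Synthetic "mouth" contour: points sampled from a wide, flat ellipse.
cx, cy, a, b = 0.0, 0.0, 2.0, 0.6
pts = [(cx + a * math.cos(t), cy + b * math.sin(t))
       for t in [i * 2 * math.pi / 30 for i in range(30)]]
best = fit_ellipse_ga(pts)
```

Because selection is elitist (the best candidate always survives), the fit can only improve from one generation to the next, which is the "gets better with each iteration" behavior the article describes. A real system would fit several irregular ellipse segments (upper and lower lip) and feed the recovered parameters to an emotion classifier.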

They have used photos of individuals from Southeast Asia and Japan to train a computer to recognize the six commonly accepted human emotions — happiness, sadness, fear, anger, disgust, surprise — and a neutral expression. The team's algorithm can successfully classify the six emotions and the neutral expression described, the scientists say. The Most Manipulative Use of Kinect Imaginable. The good people of GeekWire spotted a patent application from Microsoft that envisions using Kinect to figure out your mood, and target ads at you accordingly. The application, filed back when Kinect was rather new (in December of 2010), was made public this week. (It's not the first Microsoft patent application expressing an interest in tracking users' moods.) How exactly would it work? The idea is that Kinect's motion and facial recognition technology could figure out whether you're sad or happy, and serve up ads that jibe with your mood.

The patent application contains unusually colorful language about how exactly the Kinect (or other computing device) might infer mood. Good thinking, Kinect! It's undoubtedly true that we respond differently to ads depending on our mood. Still, this is just a patent application; there's no indication that a real product is necessarily in the works. Then again, advertisements are an inescapable feature of modern life.