Gestures

Augmented Reality Meets Gesture Recognition

To make its business software more effective, HP recently paid $10 billion for Autonomy, a U.K. software company that specializes in machine learning.

But it turns out that Autonomy has developed image-processing techniques for gesture-recognizing augmented reality—the type of technology that could be more attractive to consumers than IT managers. Augmented reality involves layering computer-generated imagery on top of a view of the real world as seen through the camera of a smart phone or tablet computer. So someone looking at a city scene through a device could see tourist information on top of the view. Autonomy’s new augmented reality technology, called Aurasma, goes a step further: it recognizes a user’s hand gestures.
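To make the idea concrete, here is a toy sketch of camera-based gesture recognition of the kind an augmented-reality app might use. This is not Autonomy's algorithm; a frame here is just a 2-D grid of brightness values, and the moving hand is found by differencing successive frames and tracking the centroid of the changed pixels.

```python
# Toy gesture recognition: frame differencing plus centroid tracking.
# Illustrative only; real systems use far more robust computer vision.

def motion_centroid(prev, curr, threshold=30):
    """Centroid of pixels that changed noticeably between two frames."""
    xs, ys = [], []
    for y, (row_p, row_c) in enumerate(zip(prev, curr)):
        for x, (p, c) in enumerate(zip(row_p, row_c)):
            if abs(c - p) > threshold:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def classify_swipe(centroids):
    """Classify a sequence of motion centroids as a left or right swipe."""
    pts = [c for c in centroids if c is not None]
    if len(pts) < 2:
        return "none"
    dx = pts[-1][0] - pts[0][0]
    return "swipe-right" if dx > 0 else "swipe-left"

# Synthetic 4x8 frames: a bright blob (the "hand") moves left to right.
blank = [[0] * 8 for _ in range(4)]
def frame_with_blob(x):
    f = [row[:] for row in blank]
    f[1][x] = f[2][x] = 255
    return f

frames = [frame_with_blob(x) for x in (1, 3, 5)]
cents = [motion_centroid(a, b) for a, b in zip(frames, frames[1:])]
print(classify_swipe(cents))  # swipe-right
```

An app could map the recognized swipe onto the virtual layer, so the overlaid content responds as the hand moves in front of the camera.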

This means a person using the app can reach out in front of the device to interact with the virtual content. Autonomy’s core technology lets businesses index and search data that conventional, text-based search engines struggle with.

Taking Touch beyond the Touch Screen

A tablet computer developed collaboratively by researchers at Intel, Microsoft, and the University of Washington can be controlled not only by swiping and pinching at the screen, but by touching any surface on which it is placed.

Finding new ways to interact with computers has become an important area of research among computer scientists, especially now that touch-screen smart phones and tablets have grown so popular. The project that produced the new device, called Portico, could eventually result in smart phones or tablets that take touch beyond the physical confines of the device. “The idea is to allow the interactive space to go beyond the display space or screen space,” says Jacob Wobbrock, an associate professor at the University of Washington’s Information School, in Seattle, who helped develop the system. This is achieved with two foldout cameras that sit above the display on either side, detecting and tracking motion around the screen.
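The core software problem in a system like this is routing camera-detected touches: positions inside the screen's footprint become ordinary touch events, while positions on the surrounding surface become "extended" input. A minimal sketch, assuming the cameras report touches in a shared desk coordinate space (the names and dimensions are illustrative, not Portico's actual API):

```python
# Sketch: routing touches detected on and around a tablet's screen.
# Assumes cameras report positions in a shared "desk" coordinate space.

from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px, py):
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

# The physical screen occupies only part of the camera-tracked desk area.
SCREEN = Rect(200, 100, 400, 300)  # screen resting on an 800x500 desk
DESK = Rect(0, 0, 800, 500)

def classify_touch(px, py):
    """Route a camera-detected touch to on-screen or off-screen handling."""
    if SCREEN.contains(px, py):
        # Convert to screen-local coordinates for normal touch events.
        return ("screen", px - SCREEN.x, py - SCREEN.y)
    if DESK.contains(px, py):
        # Touches on the surrounding surface become extended input.
        return ("desk", px, py)
    return ("ignored", px, py)

print(classify_touch(250, 150))  # ('screen', 50, 50)
print(classify_touch(50, 50))    # ('desk', 50, 50)
```

The interesting design consequence is that the interactive area is defined by what the cameras can see, not by the hardware bezel, which is exactly the "beyond the display space" idea Wobbrock describes.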

The Universe in Your Hands

Sometimes, there’s no such thing as too much visual information.

An astronomer, for instance, parsing images of distant galaxies, will never complain about a picture that is too high-resolution. Neither will a microbiologist, who may need to zoom into the microscopic universe to learn more about what makes a cell work, or fail. We have the technology to take massively high-resolution images today (so-called gigapixel images, those containing a billion or more pixels), but what we have lacked, until now, is a suitable and intuitive way to navigate those images.
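Why is navigating a gigapixel image hard, and how do viewers make it tractable? The standard trick (used by "deep zoom"-style viewers generally, not necessarily by the system described below) is to pre-cut the image into fixed-size tiles at successively halved resolutions, so a viewer only ever fetches the handful of tiles currently in view. The math is simple enough to sketch:

```python
# Tile-pyramid math behind gigapixel image viewers: the image is cut
# into fixed-size tiles at successively halved resolutions, and only
# the tiles covering the current viewport are loaded.

import math

TILE = 256  # tile edge length in pixels

def pyramid_levels(width, height):
    """Zoom levels needed: halve the image until it fits in one tile."""
    return max(1, math.ceil(math.log2(max(width, height) / TILE)) + 1)

def tiles_in_view(level_width, level_height, view):
    """Tile indices (col, row) covering a view rectangle (x, y, w, h)."""
    x, y, w, h = view
    c0, r0 = x // TILE, y // TILE
    c1 = min((x + w - 1) // TILE, (level_width - 1) // TILE)
    r1 = min((y + h - 1) // TILE, (level_height - 1) // TILE)
    return [(c, r) for r in range(r0, r1 + 1) for c in range(c0, c1 + 1)]

# A 1-gigapixel image (40,000 x 25,000 pixels) needs only nine levels,
# and a 1024x768 viewport touches a few dozen tiles, not a billion pixels.
print(pyramid_levels(40000, 25000))                              # 9
print(len(tiles_in_view(40000, 25000, (10000, 8000, 1024, 768))))  # 20
```

Whatever the interface on top looks like, this is why panning and zooming a billion-pixel image can feel instantaneous: the work per frame depends on the viewport, not the image.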

Samuel Cox, a master’s student in digital imaging at Lincoln University, offers what may be a solution, reports The Engineer. Cox isn’t an astronomer or a biomedical student, but the system he devised might someday apply to those fields. Megapixel panoramas like these are nothing new, of course; what is somewhat novel, however, is the means of experiencing them that Cox created.

An Invisible Touch for Mobile Devices

Today, the way to interact with a mobile phone is by tapping its keypad or screen with your fingers. But researchers are exploring ways to use mobile devices that would be far less limited. Patrick Baudisch, professor of computer science at the Hasso Plattner Institute in Potsdam, Germany, and his research student, Sean Gustafson, are developing a prototype interface for mobile phones that requires no touch screen, keyboard, or any other physical input device. A small video recorder and microprocessor attached to a person’s clothing can capture and analyze hand gestures, sending an outline of each gesture to a computer display. The idea is that a person could use an “imaginary interface” to augment a phone conversation by tracing shapes with their fingers in the air.
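Once the camera yields a sequence of fingertip positions, recognizing the traced shape is a classic stroke-classification problem. Here is an illustrative sketch (not Baudisch and Gustafson's implementation): a stroke whose endpoints meet after a long, roundabout path reads as a circle, while anything else reads as a line.

```python
# Illustrative stroke classification for an "imaginary interface":
# distinguish a traced circle from a traced line using the ratio of
# endpoint distance to total path length. Not the researchers' method.

import math

def classify_stroke(points):
    """Classify a list of (x, y) fingertip positions as circle or line."""
    if len(points) < 3:
        return "line"
    path = sum(math.dist(a, b) for a, b in zip(points, points[1:]))
    closure = math.dist(points[0], points[-1])
    # A closed, roundabout stroke: endpoints near each other, long path.
    if path > 0 and closure / path < 0.2:
        return "circle"
    return "line"

# Synthetic fingertip samples standing in for camera output:
circle = [(math.cos(t / 10 * 2 * math.pi), math.sin(t / 10 * 2 * math.pi))
          for t in range(11)]
line = [(t, t) for t in range(10)]
print(classify_stroke(circle))  # circle
print(classify_stroke(line))    # line
```

A real system would add smoothing and a richer shape vocabulary, but the principle is the same: the air in front of the wearer becomes an input surface defined entirely in software.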

Baudisch and Gustafson have built a prototype device in which the camera is about the size of a large brooch, but they predict that within a few years, components will have shrunk enough to allow a much smaller system.