UI Devs - Lenses / HUD

Microsoft shows displays with built-in Kinect, predicts future laptop integration. Microsoft's Kinect sensor may one day be integrated into a TV or a laptop. Speaking at Microsoft's TechForum in Seattle this week, Craig Mundie, senior advisor to the CEO, revealed where Microsoft is looking to take Kinect. The software giant recently unveiled a new envisioning center at its campus, complete with a host of Kinect-powered demonstrations. During my own look at the center, virtually every demo took advantage of the Kinect sensor in some shape or form. "It's not gonna happen tomorrow, but we can see a path towards that sort of thing." Microsoft's goal in research is to miniaturise Kinect, says Mundie. Microsoft still faces challenges in integrating Kinect into devices: although the company is taking steps toward that integration, Mundie believes there are a number of hurdles in bringing this type of technology to a laptop or tablet.

Leap Motion gesture controller will sell through Best Buy in the US over the coming months. Leap Motion 'Minority Report' Computer Interface Preps For Big 2013. In preparation for launch, thousands of developers are writing Leap-enabled applications using a beta version of the Leap device. In mid-2012, Leap Motion unveiled the most accurate motion-sensing technology on the market to date. The Leap detects every subtle finger flick and hand gesture with up to 1/100th of a millimeter of precision and then translates it into a movement or command on your computer screen. All that for just $70. A round of Minority Report and Prometheus references ensued—and then a round of skepticism. The sense is that we may be on the cusp of a big change in the way we interact with our computers.

So, until then, what else is there to say? At the end of 2012, Leap Motion entered a critical phase—application development. For the inside scoop on app development and the Leap's upcoming release, Singularity Hub contacted Michael Zagorsek, Vice President of Product Marketing at Leap Motion. Why Leap? That's no small statement. The guys were impressed by the hardware too. New interactive system detects touch and gestures on any surface. Public release date: 9-Oct-2012. Contact: Emil Venere, venere@purdue.edu, 765-494-4709, Purdue University. WEST LAFAYETTE, Ind. – People can let their fingers - and hands - do the talking with a new touch-activated system that projects onto walls and other surfaces and allows users to interact with their environment and each other.

The system identifies the fingers of a person's hand while it touches any plain surface. It also recognizes hand posture and gestures, identifying individual users by their unique traits. "Imagine having giant iPads everywhere, on any wall in your house or office, every kitchen counter, without using expensive technology," said Niklas Elmqvist, an assistant professor of electrical and computer engineering at Purdue University. The new "extended multitouch" system allows more than one person to use a surface at the same time and also enables people to use both hands, distinguishing between the right and left hand.
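The excerpt doesn't include the Purdue group's actual pipeline, but the core trick behind this class of depth-camera touch sensor is easy to sketch: calibrate the depth of the bare surface, then treat pixels that sit a few millimetres above it as touch candidates and cluster them into fingertips. The function below is only a toy illustration under that assumption; the depth units, thresholds, and blob-size cutoff are all invented for the example.

```python
import numpy as np
from scipy import ndimage

def detect_touches(depth_mm, surface_mm, band=(2.0, 8.0), min_pixels=20):
    """Find fingertip-sized blobs hovering a few millimetres above a
    calibrated flat surface, given a per-pixel depth image in mm."""
    height = surface_mm - depth_mm              # height of each pixel above the surface
    touching = (height > band[0]) & (height < band[1])
    labels, count = ndimage.label(touching)     # connected blobs = candidate fingertips
    touches = []
    for i in range(1, count + 1):
        ys, xs = np.nonzero(labels == i)
        if xs.size >= min_pixels:               # reject isolated speckle noise
            touches.append((float(xs.mean()), float(ys.mean())))
    return touches
```

Attributing each fingertip to a left or right hand, as the Purdue system does, would additionally require tracing each blob back to the larger hand and arm region it connects to.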

LG begins mass production of flexible e-paper display. A Concept For Taking Pics Simply By Looking And Blinking. You can almost see the story playing out. An old man sits on his porch, showing off his dusty Nikon D4 to his intrigued grandson. He tells the tale of how cameras went from bulky devices with huge lenses to sleek phones in our pockets. But then most of us stopped carrying cameras at all, opting to capture our experience invisibly, through glasses, or maybe something like Iris. Iris is a prototype camera by Mimi Zou, a recent graduate of the Royal College of Art in London (the polished product pictured here is a concept). The name isn't just clever; the camera uses iris tracking to identify a user and follow where they look.

When a photographer wants to snag a shot, they simply focus on that part of the frame--they look at it--and they blink to take the photo. (This idea might seem familiar: innumerable sci-fi and spy thrillers have posited cameras in our eyes, activated in the same way.) That's the cleverness of Iris. [Hat tip: The Creators Project]
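Zou hasn't said how the prototype tells a deliberate, shutter-pressing blink from an ordinary reflex blink, but a standard approach with any eye tracker is the eye aspect ratio (EAR): the eye's vertical extent divided by its horizontal extent, which collapses toward zero as the lid closes. The sketch below is hedged accordingly; the six-landmark convention and the thresholds are illustrative, not taken from Iris.

```python
import numpy as np

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks around one eye, in the common order
    p1..p6 = outer corner, upper-left, upper-right, inner corner,
    lower-right, lower-left."""
    p = np.asarray(eye, dtype=float)
    vertical = np.linalg.norm(p[1] - p[5]) + np.linalg.norm(p[2] - p[4])
    horizontal = np.linalg.norm(p[0] - p[3])
    return vertical / (2.0 * horizontal)

def blink_triggered(ear_history, threshold=0.2, min_frames=3):
    """Fire the shutter only after several consecutive low-EAR frames,
    so a quick reflex blink is less likely to take a photo."""
    recent = ear_history[-min_frames:]
    return len(recent) == min_frames and all(e < threshold for e in recent)
```

Requiring the eye to stay closed for several frames is the simplest way to separate an intentional blink from the involuntary kind.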

O2Amp, eyewear, vein finder, trauma detector, health monitor. How New 'Mood Ring' Glasses Let You See Emotions. Evolution has tailored the human eye for detecting red, green, blue and yellow in a person's skin, which reveals areas where that person's blood is oxygenated, deoxygenated, pooled below the surface or drained. We subconsciously read these skin color cues to perceive each other's emotions and states of health.

Rosy cheeks can suggest good health, for example, while a yellowish hue hints at fear. Now, researchers have created new glasses, called O2Amps, which they say amplify the wearer's perception of blood physiology, augmenting millions of years of eye evolution. "Our eyes have been optimized to sense spectral changes in skin color," said Mark Changizi, an evolutionary anthropologist and director of human cognition at 2AI Labs in Boise, Idaho.

Based on Changizi's color perception research, he and his colleagues have designed three versions of O2Amps, which are currently being sold to medical distributors and will hit wider markets in 2013. "If you're angry, you get red." 3D images that are 'indistinguishable from reality' could be just 40 years away. Founders of Leap Motion: Our Amazing 3D Tracking Will Be Everywhere. In the short time since its debut, the Leap Motion has inspired zombie-like devotion in many gadget lovers, but can the device live up to the hype? (Yes, yes it can). In the past few weeks the Leap Motion device has sent shudders of delight through gadget lovers and computer designers alike by promising a new kind of ultra-accurate, and very cheap, optical 3D tracking for your desktop or laptop computer. Forget the Kinect: Leap Motion is cheaper ($70), more precise (down to 0.01 mm), and much smaller (think "pack of gum" proportions).

The incredible demo for the Leap Motion (see below) shows how the desktop device can quickly detect hand motion, so that a user need merely wiggle their fingers in front of their computer to intuitively control what happens on the screen. Currently taking pre-orders, the Leap Motion is scheduled to ship between December and February, and with it will come a new market of third-party apps designed to take full advantage of the device.
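Those third-party apps talk to the device through the Leap SDK. As a minimal sketch, assuming the Python bindings that shipped with the original v1 SDK (Leap.Controller, Leap.Listener), here is a listener that prints every fingertip position the device reports:

```python
import Leap  # Python bindings from the original Leap Motion SDK (Python 2 era)

class FingerPrinter(Leap.Listener):
    """Print each fingertip the Leap sees, once per tracking frame."""
    def on_frame(self, controller):
        frame = controller.frame()
        for hand in frame.hands:
            for finger in hand.fingers:
                tip = finger.tip_position  # millimetres, origin at the device
                print("finger at (%.1f, %.1f, %.1f) mm" % (tip.x, tip.y, tip.z))

listener = FingerPrinter()
controller = Leap.Controller()
controller.add_listener(listener)
try:
    raw_input("Tracking; press Enter to quit...")  # Python 2, as the v1 SDK used
finally:
    controller.remove_listener(listener)
```

The coordinates arrive in millimetres straight from the driver, which is the raw material app developers build gestures on top of.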

SoftKinetic's ten-finger virtual puppet show demo video walkthrough. Microsoft's Kinect for Windows wasn't the only gesture control system to tout "near mode" this CES. SoftKinetic offers an alternate solution with an eye towards OEMs, and this week it released a public alpha of new firmware for its DepthSense 311 that it claims will detect finger movement from as close as 15cm (vs. Kinect's 50cm) and as far away as about three feet. Those numbers seem about right; SoftKinetic let us try out the firmware first-hand in two applications — a barebones tech demo that showed exactly what the software was detecting, and a "puppet show" app that let you control two cartoon puppets with ragdoll arms — and detection seemed to work fine within the stated range.

The puppets could move to and fro, nod their heads, twist around, and open their mouths when you un-balled a fist. We also had a rather brief look at the upcoming DS320, with a higher-resolution QVGA depth sensor, HD video, and a wider 73-degree field of view. SoftKinetic announces pocket-sized DepthSense 325 gesture recognition camera kit. It seems like just yesterday that we were sampling SoftKinetic's gesture control systems at CES, but the company is back today with its latest developer-targeted hardware. The DepthSense 325 is a pocket-sized camera (SoftKinetic claims it's the smallest such device in the world) that can "see" in high-definition 2D and also analyze 3D depth. Whereas the DS311 we played with in January could detect finger movement from as close as 15 centimeters, the company has managed to shrink that distance even further, to just 10cm.

That's something Microsoft can't say about its current Kinect hardware, though we'll need to see SoftKinetic's technology in action to get a sense of how accurate tracking is. Dual microphones are also integrated within the DS325 for audio-based interaction. As for how the DS325 might be used in the real world, the company cites theoretical examples that include video games, "edutainment" applications, video conferencing, online shopping, and social media implementations. First video sample from Google's Project Glass. Project T(ether) creates virtual workspace using mo-cap technology. Kinect for Windows 1.5 released with '10-joint' skeletal tracking and Kinect Studio. Microsoft is improving the functionality of its Kinect for Windows hardware this week with the release of version 1.5 of its Software Development Kit (SDK).

The Kinect for Windows 1.5 software includes support for a new "seated" 10-joint skeletal tracking mode that lets developers track the head, neck, and arms of seated and standing users, in both default and near mode. Kinect for Windows 1.5 also introduces the Kinect Studio developer tool, which developers can use to record and play back Kinect data in their applications to assist with debugging and app improvements. Microsoft is also supporting additional languages with its 1.5 release, adding French, Spanish, Italian, and Japanese speech recognition.
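The seated skeleton drops the lower body and reports ten upper-body joints: the head, the neck (shoulder center), and the shoulder, elbow, wrist, and hand on each arm. The actual SDK is consumed from C# or C++; purely as an illustrative Python sketch of that joint set (the enum values and the 20-joint skeleton layout below are hypothetical, not SDK names):

```python
from enum import Enum

class SeatedJoint(Enum):
    """The ten upper-body joints reported by seated-mode tracking
    (hypothetical names; the real SDK is C#/C++)."""
    HEAD = 0
    SHOULDER_CENTER = 1   # base of the neck
    SHOULDER_LEFT = 2
    ELBOW_LEFT = 3
    WRIST_LEFT = 4
    HAND_LEFT = 5
    SHOULDER_RIGHT = 6
    ELBOW_RIGHT = 7
    WRIST_RIGHT = 8
    HAND_RIGHT = 9

def to_seated(full_skeleton):
    """Reduce a full 20-joint skeleton, modelled here as a dict of
    joint name -> (x, y, z), to the seated-mode subset."""
    wanted = {joint.name for joint in SeatedJoint}
    return {name: pos for name, pos in full_skeleton.items() if name in wanted}
```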

Leap 3D motion control system is 100 times more accurate than Kinect, will cost $69.99. Motion control startup Leap Motion has demoed its Leap 3D motion control system, which can track motion to around 0.01mm accuracy — 100 times more accurate than the Kinect. Rather than taking Microsoft's approach, Leap Motion creates a personal 3D workspace of about four cubic feet. The Leap consists of a small USB device with industry-standard sensors and cameras that, in tandem with the company's software, can track multiple objects and recognize gestures. In a demo given to CNET, Leap's designers showed off OS navigation and web browsing using a single finger, writing, pinch-to-zoom, precision drawing, 3D modeling, and gaming.

From what we can see, it looks to be a very precise system, capable of recognizing objects in your hands and tracking them instead of your digits. Although Leap Motion is a startup, it has significant funding behind it, and the system is scheduled to launch early next year at $69.99. Leap Aims to Make Motion Control Ubiquitous -- and Awesome. You know that bit in The Avengers where Tony Stark spreads his fingers apart in mid-air and the stuff on the screen in front of him instantly appears on displays throughout the room? A company called Leap Motion wants to make that kind of gesture control a reality, and it hopes to take the first step with a new type of motion controller. The Leap is a simple motion controller that you can plug into any USB port on your computer. Once it's plugged in and you've installed the Leap software, it turns the 8 cubic feet of air in front of it into "3D interaction space" — basically, it'll track any and all motion within that space, letting you use your hands to do whatever you could do with a mouse.

How is that different from Microsoft Kinect? Precision — the company claims the Leap is 200 times more sensitive than current gesture-based tech, able to track movements down to a hundredth of a millimeter. Users will be able to fine-tune the sensitivity, Leap says. Can't wait to try it out? This Gizmo Lets You Draw A UI On Paper, Then Turns It Into A Touch Screen. You know those huge multichannel mixers--the massive boards that audio engineers manage during concerts to control everything from sound to lights? It’s the sort of highly specialized hardware that the average person would never come into contact with, because why would they? But what if you could just draw it? That’s the idea behind the SketchSynth, by Carnegie Mellon student Billy Keyes.

It allows you to draw your own specialized piece of sound hardware--in this case, a MIDI board--on any random piece of paper. "Ever since I was little, I've been fascinated by control panels," Keyes explains on his blog. His approach is a compromise between boundless childhood imagination and practical human factors: he designed three distinct controls that anyone could draw. A simple webcam picks up the shapes and sends them to a computer; a projector then lays extra data on top of the drawing, like virtual nubs to control the sliders. [Hat tip: Creative Applications]
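The excerpt doesn't detail Keyes's recognition step, but classifying hand-drawn controls can be approximated with stock OpenCV: threshold the pen ink, find contours, and guess a control type from each blob's geometry. Everything below (the area cutoff, the aspect-ratio rules, the mapping to three control types) is an illustrative guess, not SketchSynth's code.

```python
import cv2

def find_drawn_controls(frame_bgr):
    """Toy recognition pass: isolate dark pen strokes on white paper and
    guess a control type from each blob's shape."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    _, ink = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV | cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(ink, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    controls = []
    for c in contours:
        if cv2.contourArea(c) < 500:   # ignore pen specks and paper noise
            continue
        x, y, w, h = cv2.boundingRect(c)
        aspect = w / float(h)
        corners = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
        if aspect > 3.0 or aspect < 1 / 3.0:
            kind = "slider"            # long thin stroke
        elif len(corners) > 6:
            kind = "button"            # roundish blob
        else:
            kind = "toggle"            # boxy shape
        controls.append((kind, (x, y, w, h)))
    return controls
```

In the installation the projector then draws live state (slider nubs, button highlights) back onto the paper at those same coordinates, which is what makes the drawing feel like a working panel.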

Implanted User Interface Gives Patients New Options. Placed just under the skin, implanted UIs could accept touch input, giving users with implanted medical equipment such as pacemakers more control over their device's operation. Pacemakers and other implanted medical devices have become commonplace.

But being able to directly interact with implants--via user interfaces that are implanted as well--still might strike some as science fiction a la the Terminator. Researchers who are testing implanted user interfaces say such interfaces will enable people who have implanted medical devices such as pacemakers to recharge and reprogram them without the use of wireless transmissions, which are considered vulnerable to hacking. "So far, people have only been able to get those implants checked by making a trip to a physician or by interacting with wireless technologies such as Bluetooth." Beyond Kinect: Gestural computer spells keyboard death - tech - 15 May 2012. THE advent of multi-touch screens and novel gaming interfaces means the days of the traditional mouse and keyboard are well and truly numbered.

With Humantenna and SoundWave, you won't even have to touch a computer to control it; gesturing in its direction will be enough. These two technologies are the latest offerings from Microsoft, which gave us the Kinect controller. But the Kinect hardware looks clunky next to the Humantenna and SoundWave setups, which their inventors say could be built into a watch or laptop. As the name suggests, Humantenna uses the human body as an antenna to pick up the electromagnetic fields - generated by power lines and electrical appliances - that fill indoor and outdoor spaces. By studying how the signal changes as users move through these electromagnetic fields, the team was able to identify gestures, such as a punching motion or a swipe of the hand. All sorts of applications would open up if Humantenna can be commercialised.
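The excerpt doesn't describe the recognition pipeline, but the shape of the problem is clear: each gesture perturbs the signal picked up through the body in a characteristic way, so a window of samples can be summarised into features and matched against recorded examples. As a deliberately toy sketch (the feature choice and the nearest-neighbour matcher are illustrative, not from the Microsoft Research system):

```python
import numpy as np

def em_features(window):
    """Summarise one window of body-antenna samples: overall signal
    energy plus the three strongest frequency bins of its spectrum."""
    w = np.asarray(window, dtype=float)
    spectrum = np.abs(np.fft.rfft(w - w.mean()))
    rms = np.sqrt(np.mean(w ** 2))
    strongest = np.sort(np.argsort(spectrum)[-3:])  # dominant frequency bins
    return np.concatenate(([rms], strongest))

def classify_gesture(window, templates):
    """Nearest-neighbour match against labelled gesture templates, e.g.
    {"punch": em_features(punch_sample), "swipe": em_features(swipe_sample)}."""
    f = em_features(window)
    return min(templates, key=lambda name: np.linalg.norm(f - templates[name]))
```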

Shhh... touch lips to turn off phone. Microsoft creates Kinect-like motion control for laptops using sound waves. Laser System Paints Information on the Road Ahead. On the road: A mock-up shows a driver's view of Microvision's head-up display. Head-up displays, which project visual data onto the windshield and the driver's view of the road, are debuting in a growing number of car models. But more vibrant, compact, and efficient displays being developed by Microvision, a company based in Redmond, Washington, could help the technology become much more common. Japan's Pioneer Corporation plans to release its first head-up display product based on Microvision's novel technology this year.

Major carmakers in Detroit are also planning to integrate the technology into their vehicles by 2016, says Lance Evans, a director of business development at the company. The company’s head-up display is already in some concept cars but has so far been too costly for commercial models, says Evans. Most existing head-up displays generate images using LCDs. The final cost of Microvision’s product will hinge on the price tag of advanced green lasers.

Pentagon places order for iOptik dual focus augmented reality contact lenses. Samsung's flexible, 'unbreakable' AMOLED displays to be called 'Youm'. These Contact Lenses Give You Super-Human Vision. Innovega Inc. Kinect boss on the future of computer interfaces - tech - 28 March 2012. Google Unveils Augmented-Reality Glasses, Its Vision Of The Post-PC Era. 4 Problems Google Glasses Have To Solve Before Becoming A Hit. Could Google's Project Glass be used in contact lenses? Kinect for Windows 1.5 will feature '10-joint' skeletal tracking and four new speech recognition languages. 5 Exciting Innovations That Will Change Computing in 2012. Samsung introduces transparent LCD displays. Microsoft's transparent 3D desktop puts a virtual computing environment at your fingertips.

Lumus OE-31 optical engine could add augmented reality to any eyewear. The Long and Winding Road to Personal Heads-Up Displays. Lumus high-definition transparent video glasses hands-on. Vuzix transparent 'smart glasses' prototype hands-on. Vuzix using Nokia IP to build 'Smart Glasses' with transparent displays. Bionic contact lens. Augmented reality lens from Microsoft and the University of Washington in the final stages of development.