
HMI


Researchers developing new systems to improve voice recognition.

Graduate students and researchers at UT Dallas have developed novel systems that can identify a speaker's voice despite conditions that make it harder to recognize, such as whispering, speaking through various emotions, or talking with a stuffy nose. By recognizing voices reliably under these changing conditions, the research could support voice-recognition applications such as signing into a bank, entering locked rooms, logging onto a computer, or verifying purchases online. The researchers work in the Center for Robust Speech Systems (CRSS) under the direction of Dr. John Hansen, associate dean for research in the Erik Jonsson School of Engineering and Computer Science. The group's algorithms and modeling techniques are being sought after by other researchers in the signal processing field.
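The excerpt does not detail CRSS's methods, but a common back-end for this kind of speaker verification is cosine scoring of fixed-length utterance representations (for example i-vectors or averaged spectral features). The sketch below is a minimal illustration of that idea, not the CRSS system; the 128-dimensional "embeddings" and the 0.7 threshold are assumptions.

```python
# A minimal sketch of cosine-scoring speaker verification, the kind of
# back-end used with i-vector or embedding front ends. NOT the CRSS system;
# the fixed-length "embeddings" are assumed to come from some feature
# extractor (MFCC statistics, i-vectors, etc.).
import numpy as np

def enroll(embeddings: list[np.ndarray]) -> np.ndarray:
    """Average several enrollment utterances into one speaker model."""
    model = np.mean(embeddings, axis=0)
    return model / np.linalg.norm(model)

def verify(model: np.ndarray, test_embedding: np.ndarray, threshold: float = 0.7) -> bool:
    """Accept the claimed identity if the cosine score clears the threshold."""
    test = test_embedding / np.linalg.norm(test_embedding)
    score = float(np.dot(model, test))
    return score >= threshold

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    speaker_a = rng.normal(size=(3, 128)) + 2.0   # three enrollment utterances
    imposter = rng.normal(size=128) - 2.0
    model = enroll(list(speaker_a))
    print(verify(model, speaker_a[0]))   # same speaker -> True
    print(verify(model, imposter))       # different speaker -> False
```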

Last fall, CRSS lab work was recognized with high rankings in the National Institute of Standards and Technology Speaker Recognition Evaluation.

The Future of Design.

A Wearable Computer More Powerful than Glass, And Even More Awkward.

Steve Mann, a pioneer in the field of wearable computing, has been touting the benefits of head-mounted computers for decades.

Now the University of Toronto professor is also lending his weight and experience to a company hoping to loosen Google Glass’s grip on the nascent market with a different take on computer glasses that merges the real and the virtual. The company, Meta, is building computerized headwear that can overlay interactive 3-D content onto the real world. While the device is bulky, Meta hopes to eventually slim it down into a sleek, light pair of normal-looking glasses that could be used in all kinds of virtual activities, from gaming to product design. The company, which was founded by Meron Gribetz and Ben Sand, counts Mann as its chief scientist. Meta’s clunky-looking initial product, called Space Glasses, is meant more as a tool for app developers than as a gadget you’d want to actually wear. That’s a long way away, though.

Meet MYO - a Revolution in Motion-Sensing Technology (Tech & Innovation Daily).

The way we interact with the real world and the digital world is about to change dramatically. If you have any doubt, consider the stir that Google (GOOG) Glass is already creating. And the device hasn't even hit the market yet. But Google isn't the only company breaking new ground in this exciting area. As you know, I'm keeping you abreast of the most promising new companies in the technology sector in our Top 10 Startups to Watch in 2013 feature.

And today, we're adding another company to the list – Thalmic Labs…

The Armband That Reads Your Mind

The company was only founded in May 2012. Its armband is called MYO (from the Greek for "muscle"). Simply put, MYO detects the electrical impulses traveling from your brain to your hand and arm muscles. In other words, it's a mind-reading armband. I know, I know… it sounds like something from "fantasy land." Do you see why Reuters calls this technology an opportunity to "unleash your inner Jedi"? How is that even possible?
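The article does not explain Thalmic Labs' actual signal processing, but gesture recognition from forearm EMG is commonly done by windowing the muscle signals, computing simple amplitude features per electrode, and matching against calibrated templates. The sketch below illustrates that general approach under assumed values (8 channels, 50-sample windows, a nearest-centroid classifier); it is not MYO's algorithm.

```python
# A hedged sketch of how an EMG armband's gesture recognition might work:
# window the raw muscle signals, compute an energy feature per channel, and
# match against per-gesture templates. Channel count, window size and the
# nearest-centroid classifier are illustrative assumptions.
import numpy as np

WINDOW = 50          # samples per analysis window (assumed 200 Hz sampling)
N_CHANNELS = 8       # assumed number of EMG electrodes around the forearm

def features(window: np.ndarray) -> np.ndarray:
    """Root-mean-square amplitude of each EMG channel over one window."""
    return np.sqrt(np.mean(window ** 2, axis=0))

def classify(window: np.ndarray, templates: dict[str, np.ndarray]) -> str:
    """Return the gesture whose stored RMS template is closest to this window."""
    f = features(window)
    return min(templates, key=lambda g: np.linalg.norm(f - templates[g]))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Fake calibration data: a "fist" activates all channels, a "wave" only half.
    fist = np.abs(rng.normal(1.0, 0.1, (WINDOW, N_CHANNELS)))
    wave = np.abs(rng.normal(1.0, 0.1, (WINDOW, N_CHANNELS)))
    wave[:, 4:] *= 0.1
    templates = {"fist": features(fist), "wave": features(wave)}
    new_window = np.abs(rng.normal(1.0, 0.1, (WINDOW, N_CHANNELS)))
    print(classify(new_window, templates))   # -> "fist"
```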

Haptix Is A Gesture-Sensing Laptop Clip That Wants To Be Kinect For Accountants.

As BlackBerry will now readily admit, gesture-based touchscreens have taken over from physical keyboards in the mobile space. But there are still plenty of plastic keys on and around PCs. Not that people aren't trying to change that. The Leap Motion controller is one well-funded device that's attempting to move things along by letting you swipe in mid-air to interact with a terminal, rather than clicking on a boring old mouse. Well, here's another: Haptix uses twin cameras to peek at what your hands are doing and turn their actions into input signals. It supports in-air gestures — a la Leap Motion — and also interprets 2D gestures (such as pinch to zoom) made on any flat surface.

In other words, it can turn a tabletop into a touchscreen. Haptix's creators, who have just kicked off a Kickstarter campaign seeking $100,000 to help fund manufacturing costs, are Darren Lim and Lai Xue. The Kickstarter campaign is offering the Haptix device itself for an early bird price of $59.
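Haptix's computer-vision pipeline is not described in the excerpt, but the final step of a gesture such as pinch to zoom reduces to tracking the distance between two fingertips over time. The toy sketch below shows only that step; the fingertip coordinates are assumed to come from some camera-based tracker.

```python
# A minimal sketch of turning two tracked fingertip positions into a
# pinch-to-zoom signal. How the twin cameras locate the fingertips is not
# covered here; the (x, y) coordinates are assumed inputs.
import math

def pinch_zoom(prev_points, curr_points):
    """Return a zoom factor: >1 means fingers spread apart, <1 means pinched."""
    def spread(points):
        (x1, y1), (x2, y2) = points
        return math.hypot(x2 - x1, y2 - y1)
    return spread(curr_points) / spread(prev_points)

if __name__ == "__main__":
    before = [(100, 200), (140, 200)]   # fingertips 40 px apart
    after = [(80, 200), (160, 200)]     # fingertips 80 px apart
    print(f"zoom factor: {pinch_zoom(before, after):.2f}")   # -> 2.00
```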

EMBRACE+, a smart piece of wearable technology by Paul & Rudy.

Hi - We are Paul and Rudi and we want to tell you about a product we really believe in. It's called the EMBRACE+, and we think it can simplify your life. Nowadays smartphones affect every aspect of our personal and professional lives. We use them for organizing, planning and interacting with each other. But our busy lives require us to focus on work, activities and obligations. The EMBRACE+ is a smart piece of wearable technology that alerts you to changes in your environment. After a simple set-up of some basic parameters on the smartphone you already have, the EMBRACE+ shows you what you want to know, when you want to know it. With the EMBRACE+ app you can match colors and vibration patterns with your contacts, updates, reminders, and messages. The EMBRACE+ band is made of the highest-quality transparent silicone produced in the U.S.: 100% silicone with no harmful additives.
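As a rough illustration of the configuration the app describes (matching colors and vibration with contacts and notification types), here is a hypothetical mapping; the field names, hex colors and vibration patterns are invented for the example and are not the EMBRACE+ app's actual format.

```python
# An illustrative sketch of the kind of mapping the EMBRACE+ app describes:
# each notification source gets a colour and a vibration pattern. All names
# and values here are assumptions, not the actual app's format.
ALERT_PROFILES = {
    "message_from_partner": {"color": "#00C2FF", "vibration": [200, 100, 200]},  # ms on/off/on
    "calendar_reminder":    {"color": "#FFB400", "vibration": [400]},
    "email_work":           {"color": "#FF3B30", "vibration": [100, 50, 100, 50, 100]},
}

def alert_for(event_type: str) -> dict:
    """Look up the colour/vibration profile for an incoming notification."""
    return ALERT_PROFILES.get(event_type, {"color": "#FFFFFF", "vibration": [150]})

print(alert_for("calendar_reminder"))
```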

DesigningCX | Human-Centered Design & Customer Experience Innovation Tools.

Experience maps, user journeys and more… | UX Lady.

An experience map is an important design tool for understanding our product or service interactions from the users' point of view. It is basically a visual representation that illustrates users' flow within a product or service, along with their needs, wants, expectations, and overall experience for a particular goal. Besides "experience map", different names are used for similar representations: Customer Journey, User Journey, and sometimes Blueprint or Service Ecology. Although there are some nuances in the latter two, I prefer to include them in the group of multidimensional maps. If you search the internet you will find many different examples of experience maps, with some common elements between them.

After reviewing many of them, investigating the existing methodology, and designing one for the company I work for, I have concluded that there are some more or less clear design patterns; here I will share some insights about them.
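To make the "common elements" concrete, here is one possible way to capture an experience map as a data structure: stages of the journey with the user's actions, needs and emotional state at each stage. The schema is illustrative only, not a standard.

```python
# A minimal sketch of the elements an experience map typically captures, as
# described above: journey stages, and for each stage the actions,
# needs/expectations and overall feeling. Field names are illustrative.
from dataclasses import dataclass, field

@dataclass
class Stage:
    name: str
    actions: list[str] = field(default_factory=list)
    needs: list[str] = field(default_factory=list)
    emotion: str = "neutral"        # e.g. "frustrated", "delighted"

@dataclass
class ExperienceMap:
    goal: str                       # the particular goal the journey serves
    persona: str
    stages: list[Stage] = field(default_factory=list)

checkout = ExperienceMap(
    goal="buy a concert ticket",
    persona="first-time visitor",
    stages=[
        Stage("discover", ["search for event"], ["clear pricing"], "curious"),
        Stage("purchase", ["enter card details"], ["trust, speed"], "anxious"),
    ],
)
print(len(checkout.stages))
```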

Dave Ferguson - Self-driving cars - Google X.

Ross Young - Project Loon - Google [x]

An EEG That Fits Inside Your Ear.

Neuroscientists often use electroencephalography (EEG) as an inexpensive way to record electrical signals in the brain. Though it would be useful to run these recordings for long periods of time, that usually isn't practical: EEG recording traditionally involves attaching many electrodes and cables to a patient's scalp. Now engineers at Imperial College London have developed an EEG device that can be worn inside the ear, like a hearing aid. They say the device will allow scientists to record EEGs for several days at a time; this would allow doctors to monitor patients who have regularly recurring problems like seizures or microsleep. "The ideal is to have a very stable recording system, and recordings which are repeatable," explains co-creator Danilo Mandic. "It's not interfering with your normal life, because there are acoustic vents so people can hear. After a while, they forget they're having an EEG." "Different modalities will have different applications."
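The article does not describe the device's signal processing, but one thing multi-day recordings make possible is automatically flagging unusual segments for later review. The sketch below shows a deliberately simple version of that idea (a power threshold over sliding windows); the sampling rate, window length and threshold are assumptions, not the Imperial College system.

```python
# A hedged sketch of the kind of monitoring a multi-day in-ear EEG enables:
# scan a long single-channel recording in windows and flag segments whose
# power jumps well above the median baseline, as candidates for a clinician
# to review. All parameters are illustrative.
import numpy as np

def flag_events(eeg: np.ndarray, fs: int, window_s: float = 2.0, factor: float = 5.0):
    """Return start times (s) of windows whose power exceeds factor x median power."""
    win = int(window_s * fs)
    n_windows = len(eeg) // win
    powers = np.array([np.mean(eeg[i*win:(i+1)*win] ** 2) for i in range(n_windows)])
    baseline = np.median(powers)
    return [i * window_s for i, p in enumerate(powers) if p > factor * baseline]

if __name__ == "__main__":
    fs = 250                                  # assumed sampling rate (Hz)
    rng = np.random.default_rng(2)
    eeg = rng.normal(0, 1, fs * 60)           # one quiet minute of fake signal
    eeg[fs*30:fs*33] *= 8                     # inject a high-amplitude burst at t = 30 s
    print(flag_events(eeg, fs))               # -> [30.0, 32.0]
```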

Wireless devices go battery-free with new communication technique.

(Phys.org) —We might be one step closer to an Internet-of-things reality. University of Washington engineers have created a new wireless communication system that allows devices to interact with each other without relying on batteries or wires for power. The new communication technique, which the researchers call "ambient backscatter," takes advantage of the TV and cellular transmissions that already surround us around the clock.

Two devices communicate with each other by reflecting the existing signals to exchange information. The researchers built small, battery-free devices with antennas that can detect, harness and reflect a TV signal, which is then picked up by other similar devices. The technology could enable a network of devices and sensors to communicate with no power source or human attention needed. "Our devices form a network out of thin air," said co-author Joshua Smith, a UW associate professor of computer science and engineering and of electrical engineering.
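As a way to see the idea, the toy simulation below has a "tag" encode bits by either reflecting or absorbing an ambient carrier, and a receiver recover them from the average energy per bit interval. The carrier, reflection gain and thresholding are all illustrative stand-ins for the real RF hardware described by the UW researchers.

```python
# A toy simulation of the ambient-backscatter idea: a battery-free tag sends
# bits by switching between reflecting and absorbing an ambient TV carrier,
# and a receiver recovers them by thresholding the average received energy
# per bit. Real hardware is far more involved; all numbers are illustrative.
import numpy as np

def backscatter_tx(bits, carrier):
    """Reflect (scale up) the ambient carrier for 1-bits, absorb for 0-bits."""
    samples_per_bit = len(carrier) // len(bits)
    gain = np.repeat([1.2 if b else 1.0 for b in bits], samples_per_bit)
    return carrier[: len(gain)] * gain

def backscatter_rx(received, n_bits):
    """Average energy per bit interval, then threshold at the midpoint."""
    samples_per_bit = len(received) // n_bits
    energy = np.array([np.mean(received[i*samples_per_bit:(i+1)*samples_per_bit] ** 2)
                       for i in range(n_bits)])
    threshold = (energy.max() + energy.min()) / 2
    return [int(e > threshold) for e in energy]

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    t = np.arange(100_000)
    ambient = np.sin(0.05 * t) + 0.05 * rng.normal(size=t.size)   # stand-in for a TV signal
    message = [1, 0, 1, 1, 0, 0, 1, 0]
    received = backscatter_tx(message, ambient)
    print(backscatter_rx(received, len(message)))                 # -> [1, 0, 1, 1, 0, 0, 1, 0]
```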

Beyond Google Glass: Researcher looks to the future.

(Phys.org) —A wearable display being developed by UA optical scientist Hong Hua could have capabilities even more advanced than those of the recently unveiled Google Glass, a pair of glasses with smartphone capabilities. University of Arizona associate professor of optical sciences Hong Hua is developing technology that could make a wearable display that is lighter, easier to use, and has finer and more varied capabilities than the recently rolled-out Google Glass.

Imagine strolling down the street wearing a new pair of glasses – but these are no ordinary shades. A minuscule computer lodged in the frame projects text onto the lenses before your eyes, reflecting the light so that the information appears to be at arm's distance away from you, or a little farther, but only you can read it. You can control the functions of the device by voice, generate a map giving you directions, read text messages, and take photographs and video. The device has medical applications, too.