iBrain to allow Stephen Hawking to communicate through brainwaves alone Tech startup Neurovigil announced last April that Stephen Hawking was testing the potential of its iBrain device to allow the astrophysicist to communicate through brainwaves alone. Next week Professor Hawking and iBrain inventor Dr Philip Low of Stanford University will present their findings at the Francis Crick Memorial Conference in Cambridge, England. In anticipation, Gizmag spoke to Dr Low about the potential applications of the iBrain. When Dr Low first met Stephen Hawking, he asked when the renowned astrophysicist would like to begin trialing his new product. Low waited a moment as Hawking entered his response through the pair of infrared glasses he uses to send messages via the muscles in his cheek. At 70, Hawking's amyotrophic lateral sclerosis (ALS), or Lou Gehrig's disease, has progressed to the point where it takes him several minutes to communicate a simple message. "We were looking for a change in the signal," says Dr Low.
Inexpensive device could allow the disabled to control computers with their eyes Bioengineers at Imperial College London have developed a new computer controller for paraplegics that is not only more accurate and easier to use than current methods, but also uses inexpensive, off-the-shelf components. The GT3D device uses a pair of eyeglass frames fitted with two fast video game console cameras costing less than £20 (US$30) each, which scan the wearer's eyes from outside the field of vision and provide 3D control at much lower cost and without invasive surgery. As computers become smaller, cheaper and more powerful, they gain greater potential for liberating paraplegics and other heavily impaired people. The computer's potential to enhance a paraplegic's quality of life is tremendous, but it depends on finding a practical means of control. Publishing in the Journal of Neural Engineering on Friday, the Imperial College team explained that the GT3D device uses its two cameras to collect images of each of the wearer's eyes.
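Camera-based gaze control of this kind ultimately reduces to a calibration problem: mapping measured pupil positions to screen coordinates. The sketch below shows the simplest possible version, a per-axis least-squares linear fit; the GT3D system's real model is more sophisticated (two cameras enable 3D gaze estimation), and all function names and numbers here are illustrative, not taken from the published work.

```python
# Minimal sketch of eye-tracker calibration: fit screen = a * pupil + b
# per axis from calibration samples, then map new pupil positions to
# screen coordinates. Purely illustrative; not the GT3D algorithm.

def fit_axis(pupil, screen):
    """Least-squares fit of screen = a * pupil + b for one axis."""
    n = len(pupil)
    mean_p = sum(pupil) / n
    mean_s = sum(screen) / n
    cov = sum((p - mean_p) * (s - mean_s) for p, s in zip(pupil, screen))
    var = sum((p - mean_p) ** 2 for p in pupil)
    a = cov / var
    b = mean_s - a * mean_p
    return a, b

def calibrate(samples):
    """samples: list of ((pupil_x, pupil_y), (screen_x, screen_y)) pairs
    gathered while the user looks at known on-screen targets."""
    px = [p[0] for p, _ in samples]
    py = [p[1] for p, _ in samples]
    sx = [s[0] for _, s in samples]
    sy = [s[1] for _, s in samples]
    return fit_axis(px, sx), fit_axis(py, sy)

def gaze_to_screen(pupil_xy, calib):
    """Map a pupil position to an estimated on-screen gaze point."""
    (ax, bx), (ay, by) = calib
    return ax * pupil_xy[0] + bx, ay * pupil_xy[1] + by
```

In practice a short calibration session (looking at a few known dots) supplies the samples, after which every camera frame yields an estimated gaze point.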
Mental Scanner Lets Paralyzed People Spell Their Thoughts A new system allows paralyzed people to communicate by mentally selecting letters of the English alphabet. People trained to use the system think certain thoughts for each letter, which causes blood to flow to the brain in characteristic patterns. A functional magnetic resonance imaging (fMRI) scanner then captures and interprets what's happening in users' brains. Right now, the system is still at the proof-of-concept stage, but it's a promising addition to research on giving people with so-called "locked-in syndrome" a way to reach the outside world again, Scientific American reported. One reason the system isn't ready for widespread use yet is that it's a little cumbersome to use. The system's creators, a team of neuroscientists in the Netherlands and Germany, tested the system successfully in six healthy adults. In one test, the scanner interpreted a tester's reply as "INDCONERCA," from which the researchers guessed the tester's intended answer, "INDONESIA." Sources: Scientific American, BBC
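The core decoding step can be pictured as template matching: each letter has a characteristic activation pattern, and the system picks the stored pattern closest to what the scanner measured. The toy below is a nearest-template classifier standing in for the team's actual (far more involved) pipeline; the template vectors are invented for illustration.

```python
# Minimal sketch of letter decoding as nearest-template classification:
# compare a measured activation vector against one stored template per
# letter and pick the closest. Not the researchers' actual method.

def distance(a, b):
    """Squared Euclidean distance between two activation vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def decode_letter(pattern, templates):
    """templates: dict mapping letter -> template activation vector."""
    return min(templates, key=lambda letter: distance(pattern, templates[letter]))

def decode_word(patterns, templates):
    """Decode a sequence of measured patterns into a string."""
    return "".join(decode_letter(p, templates) for p in patterns)
```

Noisy measurements can land closer to the wrong template, which is why a raw decoded string like "INDCONERCA" still needs human or language-model cleanup to recover "INDONESIA".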
Sensing Cyborg Tissues Now Feasible Scientists have developed a technique for constructing silicon nanowire tissue scaffolds that contain nanoscale electrodes capable of monitoring intra- and extracellular function within living biological tissues grown through them. The porous three-dimensional (3D) biocompatible scaffolds, termed nanoelectronic scaffolds (nanoES), can be generated as a mesh or planar construct and manipulated into just about any shape required before seeding with living cells. Embedded in the framework are silicon nanowire field-effect transistor (FET) detectors that can monitor and detect changes in physicochemical parameters within tissues grown through the scaffold. Initial experiments demonstrated that the nanoES supported the 3D growth of heart and nerve cells seeded into them, and that the platform could monitor electrical responses in tissues grown from cardiac and neural cells, as well as pH changes in synthetic blood vessels constructed from smooth muscle cells.
CAVE a high-tech research, education tool published in 2002 Advanced microscope technology allows you to view cells and molecules in 3D by pressing your eye to a lens or looking at the view on a computer screen. CAVE technology goes further, letting you create a room-size projection and walk around inside of a cell. Virginia Tech's University Visualization and Animation Group helps researchers use the CAVE, which stands for Computer Augmented Virtual Environment. VT-CAVE is a multidisciplinary computer graphic visualization research and educational facility that is part of the new ACITC. When objects become extremely large and complex, a virtual reality CAVE can be used to literally walk inside of these structures. Many departments on campus are using the CAVE for both education and research projects: the Virtual Jamestown project; the USDA project "Putting Bugs in a CAVE Room" (entomology); and the "Virtual Dandelion" project (plant pathology). Remote-site CAVE labs have been created in architecture, interior design, and materials science.
Paralyzed woman controls robotic arm, sips coffee Performing even a simple movement is a rather complicated process. First, the brain has to signal its intent to perform an action, which then gets translated into the specific motions that are required to achieve that intention. Those motions require a series of muscle contractions; the signals for these need to be sent out of the brain, through the spinal cord, and to the appropriate destination. For most people who suffer from paralysis, it's really these later steps that are affected—most of the setup can still go on in the brain, but damage keeps the signals from making their way to the muscles. If those intent signals could be read directly from the brain, the damaged pathway could be bypassed and the signals used to drive a robotic limb instead. This may sound like science fiction, but significant progress has been made in the area, and now researchers have taken the next big step. The two individuals involved were implanted with the same device (termed "BrainGate") that had been used in earlier experiments in which some individuals controlled a cursor. The success rates in those earlier experiments weren't enormous, but they were really pretty limited in scope.
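A common way systems in this line of work turn recorded neural activity into movement is a linear decoder: a weight matrix maps the firing rates of the recorded units to an intended 2D velocity, which is then integrated into a cursor or arm position. The sketch below shows only that generic idea; the weights are invented, and actual BrainGate decoders are considerably more elaborate (e.g. Kalman-filter based).

```python
# Minimal sketch of a linear neural decoder: firing rates -> 2D velocity,
# velocity integrated over time into position. Weights are made up for
# illustration; this is not the BrainGate decoder itself.

def decode_velocity(rates, weights):
    """rates: per-unit firing rates. weights: two rows of per-unit
    weights, one for vx and one for vy."""
    vx = sum(w * r for w, r in zip(weights[0], rates))
    vy = sum(w * r for w, r in zip(weights[1], rates))
    return vx, vy

def track_cursor(rate_stream, weights, dt=0.1, start=(0.0, 0.0)):
    """Integrate decoded velocities over time steps of dt seconds."""
    x, y = start
    for rates in rate_stream:
        vx, vy = decode_velocity(rates, weights)
        x += vx * dt
        y += vy * dt
    return x, y
```

The weights themselves are typically fit during a calibration phase in which the user imagines movements while the system records which units fire.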
MIT creates glucose fuel cell to power implanted brain-computer interfaces Neuroengineers at MIT have created an implantable fuel cell that generates electricity from the glucose present in the cerebrospinal fluid that flows around your brain and spinal cord. In theory, this fuel cell could eventually drive low-power sensors and computers that decode your brain activity to interface with prosthetic limbs. The glucose-powered fuel cell is crafted out of silicon and platinum, using standard semiconductor fabrication processes. The platinum acts as a catalyst, stripping electrons from glucose molecules, similar to how aerobic animal cells (such as our own) strip electrons from glucose with enzymes and oxygen. Size-wise, the MIT engineers have created glucose-powered fuel cells as large as 64 x 64 mm (2.5 x 2.5 in) and as small as just a few millimeters across. This work is exciting for two main reasons: the fuel cell is completely synthetic, and it can be produced using low-tech, decades-old chip fabrication processes.
Play World of Warcraft... With Your Mind! World of Warcraft may be slowly losing players, but it's gaining new ways to play the game -- specifically, thanks to G.Tec Medical Engineering in Austria, you will soon be able to play WoW with your mind. A video explains in detail how the process works, but if you're impatient, the Warcraft action starts at around 1:50. According to G.Tec's Armin Schnürer, this system, called intendiX®SOCI (for Screen Overlay Control Interface), will be commercially available later this year. G.Tec is the same company that created a similar innovation last year with Second Life, and when I blogged about it then, people wondered if this could actually be a commercially viable product. "Of course we need the electrode cap to measure your brain waves," Armin answers, "but for the future we can think of using our dry electrodes (you don't need gel anymore) or mounting the electrodes in a baseball cap."
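EEG overlay controllers of this kind typically rely on evoked responses: each on-screen control flashes repeatedly, the control the user attends to evokes a small characteristic deflection in the EEG, and averaging across many repetitions lifts that response out of the noise. The toy below picks the control whose averaged epoch has the largest peak; this is only the generic averaging idea, with invented numbers, not g.tec's classifier.

```python
# Minimal sketch of evoked-response selection: average the EEG epochs
# recorded after each control's flashes, then pick the control with the
# strongest averaged response. Illustrative only; real systems use
# trained classifiers rather than a raw peak.

def average_epochs(epochs):
    """Element-wise mean of equally sized EEG epochs (lists of samples)."""
    n = len(epochs)
    return [sum(vals) / n for vals in zip(*epochs)]

def pick_control(epochs_per_control):
    """epochs_per_control: dict control_name -> list of epochs.
    Returns the control whose averaged epoch peaks highest."""
    return max(epochs_per_control,
               key=lambda c: max(average_epochs(epochs_per_control[c])))
```

The need for many flash repetitions per selection is a big part of why such interfaces feel slow compared with a mouse, and why better electrodes (like the dry ones Schnürer mentions) matter: cleaner signals mean fewer repetitions.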
A Light Bulb with a Computer and Projector Inside from the MIT Media Lab Augments Reality Desk toy: A computer with a camera and projector fits into a light bulb socket, and can make any surface interactive. Powerful computers are becoming small and cheap enough to cram into all sorts of everyday objects. Natan Linder, a student at MIT’s Media Lab, thinks that fitting one inside a light bulb socket, together with a camera and projector, could provide a revolutionary new kind of interface—by turning any table or desk into a simple touch screen. The LuminAR device, created by Linder and colleagues at the Media Lab, can project interactive images onto a surface, sensing when a person’s finger or hand points to an element within those images. Linder describes LuminAR as an augmented-reality system because the images and interfaces it projects can alter the function of a surface or object. Linder’s system uses a camera, a projector, and software to recognize objects and project imagery onto or around them, and also to function as a scanner.
World's Smallest 3D Sensor Ready to Go Anywhere When we look back at this year's CES, we may remember it as the year of the sensor. Wearable and embeddable sensor technology will be everywhere, and right there at the heart of at least some of it will be PrimeSense. The Israel-based company created the 3D environmental mapping tech behind Microsoft's Kinect motion sensor for the Xbox 360 and, at CES 2013 in Las Vegas, PrimeSense will unveil what it calls the "World's Smallest" 3D sensor: the Capri. As thin as a pencil and no larger than a stick of gum, Capri could herald a new era in 3D-sensing-capable products and services. PrimeSense's 3D sensor technology works by first bathing the area in front of it in a sophisticated near-infrared light mesh. While using many of the same 3D sensing technologies found in Kinect, Capri is 10 times smaller, includes a new system on a chip (SoC) and, PrimeSense representatives told me, features more powerful algorithms.
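Structured-light sensors of this family infer depth from how far each dot of the projected infrared pattern appears shifted (the disparity) between the projector and the camera. For a calibrated rig, depth follows the standard triangulation relation z = f · b / d, where f is the focal length in pixels, b the projector-camera baseline, and d the disparity in pixels. The sketch below applies that relation; the numbers in the test are illustrative and are not Capri's actual specifications.

```python
# Minimal sketch of structured-light depth recovery via triangulation:
# z = focal_length * baseline / disparity. Illustrative values only;
# not PrimeSense's calibration parameters.

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Depth in meters for one matched pattern dot; None if no match."""
    if disparity_px <= 0:
        return None  # zero/negative disparity means no valid match
    return focal_px * baseline_m / disparity_px

def depth_map(disparities, focal_px, baseline_m):
    """Convert a list of per-pixel disparities into depths."""
    return [depth_from_disparity(d, focal_px, baseline_m) for d in disparities]
```

The inverse relationship explains a familiar property of such sensors: disparity shrinks with distance, so depth resolution degrades quickly for far-away objects.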
World's First Eye-Tracking PC Accessory to Launch in 2013 Eye-tracking technology has slowly been emerging as a viable technology over the last couple of years, and it comes in really handy when you want to know which parts of a Facebook profile people actually look at. One of the leaders in the space, Tobii, is set to bring the tech to consumers in 2013 with a peripheral that works with any Windows 8 PC. Tobii will show off its eye tracker, called the REX, next week at CES. The REX is a strip that attaches beneath your monitor (desktops and laptops are welcome), and it plugs into a USB port. Once it's in place, the device works with special software called Tobii Gaze to track exactly what you're looking at on the screen, letting you do things as mundane as scrolling sideways or as exciting as blasting asteroids — all with a glance. We don't know how much the final product will cost. Tobii expects to launch the REX in the fall.
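One common way gaze trackers turn "looking" into "clicking" is dwell selection: if the gaze point stays inside a target region long enough, the target activates. The sketch below shows that generic mechanism; Tobii Gaze's actual interaction model is not described in the article, so the function names, region format, and timings here are all assumptions for illustration.

```python
# Minimal sketch of dwell-based gaze selection: a target fires once the
# gaze point has remained inside its rectangle for a set duration.
# Generic idea only; not Tobii's implementation.

def inside(point, region):
    """region: (left, top, right, bottom) rectangle in screen pixels."""
    (x, y), (left, top, right, bottom) = point, region
    return left <= x <= right and top <= y <= bottom

def dwell_select(gaze_samples, region, dwell_needed, dt):
    """gaze_samples: sequence of (x, y) points sampled every dt seconds.
    Returns True once gaze has stayed in region for dwell_needed seconds;
    leaving the region resets the timer."""
    held = 0.0
    for point in gaze_samples:
        held = held + dt if inside(point, region) else 0.0
        if held >= dwell_needed:
            return True
    return False
```

Tuning the dwell threshold is the classic trade-off: too short and every glance triggers accidental clicks (the "Midas touch" problem), too long and the interface feels sluggish.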