Force sensor in simulated skin and neural model mimic tactile SAI afferent spiking response to ramp-and-hold stimuli. The next generation of prosthetic limbs will restore sensory feedback to the nervous system by mimicking how skin mechanoreceptors, innervated by afferents, produce trains of action potentials in response to compressive stimuli.
From left, Gerard Francisco, José Luis Contreras-Vidal and Marcia O’Malley work with a University of Houston (UH) graduate student testing MAHI-EXO II, a robotic rehabilitation device developed at Rice and being used at TIRR Memorial Hermann to help spinal-cord-injury patients recover. In a new project, a similar device will be matched with a noninvasive neural interface under development at UH to help rehabilitate stroke survivors. Brain wave-reading robot might help stroke patients
Can Billionaires Achieve Immortality by 2045? What's the Latest Development?
Paralysed woman drinks coffee with thought-guided robot arm
Swiss scientists show partially paralyzed person can control robot using brain signals
Voicegrams transform brain activity into words
Electronic Tattoo-Like Devices Monitor Brain, Heart and Muscles [VIDEO] We might one day be able to monitor our bodies' internal functions — and prevent things like epileptic seizures before they happen — using a flexible circuit attached to the surface of skin.
Artificial Super-Skin Could Transform Phones, Robots and Artificial Limbs Touch sensitivity on gadgets and robots is nothing new. A few strategically placed sensors under a flexible, synthetic skin and you have pressure sensitivity. Add a capacitive, transparent screen to a device and you have touch sensitivity. However, Stanford University’s new “super skin” is something special: a thin, highly flexible, super-stretchable, nearly transparent skin that can respond to touch and pressure, even when it’s being wrung out like a sponge.
This article was taken from the February 2012 issue of Wired magazine. Wearable robot puts paralysed legs through their paces
Connecting to the brain: Thinking about it
Mind-Reading Computer: US Scientists Manage To Decode Brain Activity And Put It Into Words. Scientists believe they have found a way to read people's minds in what could be the first step towards helping brain-damaged patients who cannot speak. US researchers used a computer programme to decode brain activity and put it into words using a form of electronic telepathy.
CES: A laptop that follows your eyes - 1/13 Touch control, voice control, gesture control: alternative interfaces - or those that aren't mice and keyboards - are all the rage at this year's Consumer Electronics Show (CES).
Photo: Chris Clark. Paul Thacker gets a hug from his friend Sandy Burns before walking with the help of an Ekso, a bionic, battery-powered wearable robot. Thacker, an extreme athlete and professional snowmobiler, lost the use of his legs after a snowmobile accident in 2010. Video: See how new wearable robot technology helps paralyzed patients walk
It sounds like a sci-fi movie – doctors growing body parts to cure our ills. Dream of making artificial body parts becoming a reality