Force sensor in simulated skin and neural model mimic tactile SA-I afferent spiking response to ramp-and-hold stimuli
The next generation of prosthetic limbs will restore sensory feedback to the nervous system by mimicking how skin mechanoreceptors, innervated by afferents, produce trains of action potentials in response to compressive stimuli.
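The idea can be illustrated with a minimal sketch (not the authors' actual model): a leaky integrate-and-fire neuron driven by a ramp-and-hold force stimulus, showing the slowly adapting (SA-I) signature of firing during both the ramp and the sustained hold. All parameter values here are illustrative assumptions.

```python
def ramp_and_hold(t, ramp_end=0.1, hold_end=0.5, peak=1.0):
    """Force (arbitrary units) at time t: linear ramp up, then constant hold."""
    if t < ramp_end:
        return peak * t / ramp_end
    elif t < hold_end:
        return peak
    return 0.0

def simulate(dt=1e-4, t_end=0.6, tau=0.02, threshold=0.5, gain=1.2):
    """Integrate dv/dt = (-v + gain*force)/tau; record a spike time and
    reset the membrane variable v each time it crosses threshold."""
    v, spikes = 0.0, []
    for i in range(int(t_end / dt)):
        t = i * dt
        v += dt * (-v + gain * ramp_and_hold(t)) / tau
        if v >= threshold:
            spikes.append(t)
            v = 0.0  # reset after spike
    return spikes

spikes = simulate()
# SA-I-like behaviour: firing begins during the ramp and persists through the hold
print(f"{len(spikes)} spikes, first at {spikes[0]:.3f} s, last at {spikes[-1]:.3f} s")
```

The key qualitative point is that a sustained drive keeps the neuron spiking throughout the hold phase, which is what distinguishes slowly adapting afferents from rapidly adapting ones that fire mainly at stimulus onset and offset.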
Brain wave-reading robot might help stroke patients
From left, Gerard Francisco, José Luis Contreras-Vidal and Marcia O’Malley work with a University of Houston (UH) graduate student testing MAHI-EXO II, a robotic rehabilitation device developed at Rice and being used at TIRR Memorial Hermann to help spinal-cord-injury patients recover. In a new project, a similar device will be matched with a noninvasive neural interface under development at UH to help rehabilitate stroke survivors.
Can Billionaires Achieve Immortality by 2045? What's the Latest Development?
Paralysed woman drinks coffee with thought-guided robot arm
Voicegrams transform brain activity into words
We might one day be able to monitor our bodies' internal functions — and prevent things like epileptic seizures before they happen — using a flexible circuit attached to the surface of skin.
Artificial Super-Skin Could Transform Phones, Robots and Artificial Limbs
Touch sensitivity on gadgets and robots is nothing new. A few strategically placed sensors under a flexible, synthetic skin and you have pressure sensitivity. Add a capacitive, transparent screen to a device and you have touch sensitivity. However, Stanford University’s new “super skin” is something special: a thin, highly flexible, super-stretchable, nearly transparent skin that can respond to touch and pressure, even when it’s being wrung out like a sponge.
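The capacitive principle behind such skins can be sketched with a toy model (illustrative only, not Stanford's design): treat each sensing element as a parallel-plate capacitor whose elastic dielectric compresses under load, so pressure can be recovered from the measured capacitance. All material constants below are assumed values.

```python
EPS0 = 8.854e-12   # vacuum permittivity, F/m
EPS_R = 3.0        # assumed relative permittivity of the elastomer dielectric
AREA = 1e-6        # electrode area, m^2 (1 mm^2)
D0 = 50e-6         # rest thickness of the dielectric, m
E_MOD = 1e5        # assumed elastic modulus of the dielectric, Pa

def capacitance(pressure):
    """Plate gap shrinks linearly with strain = pressure / E_MOD,
    so capacitance C = eps0 * eps_r * A / d rises as the skin is pressed."""
    d = D0 * (1.0 - pressure / E_MOD)
    return EPS0 * EPS_R * AREA / d

def pressure_from_capacitance(c):
    """Invert the model: recover the applied pressure from a measured C."""
    d = EPS0 * EPS_R * AREA / c
    return E_MOD * (1.0 - d / D0)

c = capacitance(2000.0)  # 2 kPa applied, roughly a light touch
recovered = pressure_from_capacitance(c)
print(f"C = {c:.3e} F, recovered pressure ~{recovered:.1f} Pa")
```

In a real stretchable sensor the relationship is nonlinear and the electrodes themselves deform, but this linear sketch captures why measuring capacitance change gives a pressure readout.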
Wearable robot puts paralysed legs through their paces
This article was taken from the February 2012 issue of Wired magazine.
Connecting to the brain: Thinking about it
Mind-Reading Computer: US Scientists Manage To Decode Brain Activity And Put It Into Words
Scientists believe they have found a way to read people's minds, in what could be the first step towards helping brain-damaged patients who cannot speak. US researchers used a computer programme to decode brain activity and put it into words using a form of electronic telepathy.
CES: A laptop that follows your eyes - 1/13 Touch control, voice control, gesture control: alternative interfaces – or those that aren’t mice and keyboards – are all the rage at this year’s Consumer Electronics Show (CES). With electronics gaining ever more computing power, it’s understandable that old inputs don’t necessarily apply to new gadgets.
Video: See how new wearable robot technology helps paralyzed patients walk
ROCKFORD, MI -- An athlete who was paralyzed in an accident was able to walk again Monday morning during a demonstration of Ekso, a wearable exoskeleton robot, at DMC Rehabilitation Institute of Michigan's Center for Spinal Cord Injury Recovery. The Ekso technology, named for its exoskeleton-like properties, was developed by California-based Ekso Bionics and aims to help those with lower-extremity paralysis or weakness to stand and walk.
It sounds like a sci-fi movie – doctors growing body parts to cure our ills.