
The man who hears colour
15 February 2012, last updated at 15:37. Artist Neil Harbisson is completely colour-blind. Here, he explains how a camera attached to his head allows him to hear colour. Until I was 11, I didn't know I could only see in shades of grey. When I was diagnosed with achromatopsia [a rare vision disorder], it was a bit of a shock, but at least we knew what was wrong. When I was 16, I decided to study art. I was allowed to do the entire art course in greyscale, using only black and white. At university I went to a cybernetics lecture by Adam Montandon, a student from Plymouth University, and asked if we could create something so I could see colour. If we were all to hear the frequency of red, for example, we would hear a note between F and F sharp. I started using it 24 hours a day, carrying it around in a backpack and feeling that the cybernetic device, the eyeborg, and my organism were completely connected.
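Harbisson's colour-to-sound mapping can be sketched as a simple frequency transposition: take the frequency of the incoming light and shift it down into the audible range. The snippet below is a minimal sketch of that idea, not the eyeborg's actual firmware; the 40-octave shift is an assumption chosen so that deep red lands between F and F sharp, as he describes.

```python
# Hypothetical sketch of a "sonochromatic" mapping: transpose the
# frequency of visible light down into the audible range. The constants
# are illustrative assumptions, not the eyeborg's real parameters.

SPEED_OF_LIGHT = 3.0e8  # m/s
OCTAVES_DOWN = 40       # assumed transposition from light to sound

def wavelength_to_pitch_hz(wavelength_nm: float) -> float:
    """Convert a light wavelength (nm) to an audible frequency (Hz)."""
    light_hz = SPEED_OF_LIGHT / (wavelength_nm * 1e-9)
    return light_hz / 2 ** OCTAVES_DOWN

if __name__ == "__main__":
    # Deep red (~760 nm) comes out near 359 Hz, between F4 (~349 Hz)
    # and F#4 (~370 Hz), matching the description of red above.
    print(f"red  (760 nm): {wavelength_to_pitch_hz(760):.1f} Hz")
    print(f"blue (470 nm): {wavelength_to_pitch_hz(470):.1f} Hz")
```

Shifting by whole octaves preserves the frequency ratios between colours, so the spacing of hues maps directly onto musical intervals.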

World Wide Mind. [H]uman nature was originally one and we were a whole, and the desire and pursuit of the whole is called love. —Plato, The Symposium. When my BlackBerry died I took it to a cell phone store in San Francisco’s Mission district. “JVM 523,” I said mournfully. The clerk called tech support while I wandered around the store, peering at cell phone covers and batteries. “It’s dead,” he said. “You can’t just reload the operating system?” “They say not.” “How can a software bug kill a BlackBerry?” He shrugged. “All right,” I said, and walked out, minus BlackBerry. The stores were full of avocados and plantains, $15 knapsacks hanging from awnings, and rows of watches in grimy windows. Except for my email, and the Internet. Most of all, I couldn’t ask it, “Who is this person?” I had asked it that question a few months earlier while visiting Gallaudet University, a school for the deaf in Washington, D.C. The professor was blond and flamingo-slender, with a snub nose. Nosy?

The remote control helicopter you can control with your mind. A special headband monitors brainwaves and can trigger commands when certain states are met, such as the user being relaxed. The helicopter can help users train their mind to enter a relaxed state. By Mark Prigg. Published: 18:14 GMT, 16 November 2012 | Updated: 18:22 GMT, 16 November 2012. If you've ever struggled to control a remote control helicopter and sent it crashing into a wall, help could be at hand. A US firm is raising money for a remote control helicopter maneuvered not by a joystick, but by the mind. The Puzzlebox Orbit uses a headset to monitor brainwave readings. The spherical helicopter, designed to withstand being flown into objects, is controlled by a headset which can read brainwaves. The user wears a headset which monitors their brain activity. When they enter a set state (e.g. relaxed), flight patterns such as a preprogrammed path, or commands such as 'hover', can be triggered. 'We are building and selling this crazy new toy,' its founders say.
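The trigger logic the article describes can be sketched in a few lines: watch a brainwave-derived relaxation score and fire a flight command once it stays above a threshold. The read_meditation_score and send_command functions below are hypothetical stand-ins, not the Puzzlebox Orbit's real API, and the threshold values are assumptions.

```python
# Minimal sketch of state-triggered flight commands. All names and
# constants here are illustrative assumptions, not Puzzlebox's API.
import random
import time

THRESHOLD = 70    # assumed 0-100 relaxation score
HOLD_TICKS = 3    # consecutive high readings needed (short for the demo)

def read_meditation_score() -> int:
    """Placeholder headset reading (e.g. a 0-100 'meditation' value)."""
    return random.randint(0, 100)

def send_command(cmd: str) -> None:
    """Placeholder for the radio link to the helicopter."""
    print(f"-> helicopter: {cmd}")

def control_loop(ticks: int = 300) -> None:
    streak = 0  # consecutive ticks above threshold
    for _ in range(ticks):
        if read_meditation_score() >= THRESHOLD:
            streak += 1
            if streak >= HOLD_TICKS:
                send_command("hover")  # trigger the flight pattern
                streak = 0
        else:
            streak = 0
        time.sleep(0.01)

control_loop()
```

Requiring several consecutive high readings, rather than a single one, is a simple way to keep a noisy signal from triggering commands by accident.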

Brain Computer Interface used to control the movement and actions of an android robot. Researchers at the CNRS-AIST Joint Robotics Laboratory and the CNRS-LIRMM Interactive Digital Human group are working on ways to control robots via thought alone. "Basically we would like to create devices which would allow people to feel embodied in the body of a humanoid robot. To do so we are trying to develop techniques from Brain Computer Interfaces (BCI) so that we can read people's thoughts and then try to see how far we can go from interpreting brain-wave signals, to transform them into actions to be done by the robot." The interface uses flashing symbols to control where the robot moves and how it interacts with the environment around it. "And the applications targeted are for tetraplegics or paraplegics to use this technology to navigate using the robot, and for instance, a paraplegic patient in Rome would be able to pilot a humanoid robot for sightseeing in Japan."
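The flashing-symbol interface is consistent with an SSVEP-style BCI, in which each on-screen symbol flickers at its own frequency and the strongest matching frequency in the user's EEG selects the command. The sketch below illustrates that selection step with an assumed sampling rate and command mapping; it is a generic illustration, not the CNRS-AIST system's actual decoder.

```python
# Generic SSVEP-style command selection: each symbol flickers at its
# own frequency; the frequency with the most EEG power wins. Sampling
# rate and command mapping are assumptions for illustration.
import numpy as np

FS = 250.0  # assumed EEG sampling rate in Hz
COMMANDS = {6.0: "walk forward", 8.0: "turn left",
            10.0: "turn right", 12.0: "grasp object"}

def classify(eeg: np.ndarray) -> str:
    """Pick the command whose flicker frequency carries the most power."""
    power = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(eeg.size, d=1.0 / FS)

    def power_at(f: float) -> float:
        return power[np.argmin(np.abs(freqs - f))]

    return COMMANDS[max(COMMANDS, key=power_at)]

# Fake one second of EEG dominated by the 10 Hz flicker response:
t = np.arange(0, 1.0, 1.0 / FS)
eeg = np.sin(2 * np.pi * 10.0 * t) + 0.5 * np.random.randn(t.size)
print(classify(eeg))  # -> turn right
```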

Reading Visual Braille with a Retinal Prosthesis | Frontiers in Neuroprosthetics. Affiliations: 1. Second Sight Medical Products, Sylmar, CA, USA; 2. Brigham Young University – Idaho, Rexburg, ID, USA; 3. UMR-S 968, Institut de la Vision, Paris, France; 4. CIC INSERM DHOS 503, National Ophthalmology Hospital, Paris, France. Retinal prostheses, which restore partial vision to patients blinded by outer retinal degeneration, are currently in clinical trial. The Argus II retinal prosthesis system was recently awarded CE approval for commercial use in Europe. Keywords: retina, epiretinal prosthesis, sensory substitution, retinitis pigmentosa, blindness, perception, degeneration, sight restoration. Citation: Lauritzen TZ, Harris J, Mohand-Said S, Sahel JA, Dorn JD, McClure K and Greenberg RJ (2012) Reading visual braille with a retinal prosthesis. Received: 07 July 2012; Accepted: 01 November 2012; Published online: 22 November 2012. Copyright: © 2012 Lauritzen, Harris, Mohand-Said, Sahel, Dorn, McClure and Greenberg. *Correspondence: Thomas Z.
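A braille character is a 3-by-2 pattern of dots, which maps naturally onto a small patch of electrodes on the implant's array. The sketch below shows that mapping for a few letters; the coordinate scheme is an illustrative assumption, not the Argus II's actual electrode addressing.

```python
# Sketch of the idea behind the study: a braille character's six dots
# correspond to a 3x2 patch of electrodes to stimulate. Dot patterns
# are standard braille; the (row, col) layout is an assumption.
BRAILLE = {
    # dots numbered 1-6, column-major: 1-2-3 left, 4-5-6 right
    "a": {1}, "b": {1, 2}, "c": {1, 4}, "d": {1, 4, 5}, "e": {1, 5},
}

def electrodes_for(letter: str) -> list[tuple[int, int]]:
    """Return (row, col) electrode coordinates to stimulate."""
    coords = {1: (0, 0), 2: (1, 0), 3: (2, 0),
              4: (0, 1), 5: (1, 1), 6: (2, 1)}
    return [coords[d] for d in sorted(BRAILLE[letter])]

print(electrodes_for("d"))  # [(0, 0), (0, 1), (1, 1)]
```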

Neuroscience: The mind reader. Adrian Owen still gets animated when he talks about patient 23. The patient was only 24 years old when his life was devastated by a car accident. Alive but unresponsive, he had been languishing in what neurologists refer to as a vegetative state for five years when Owen, a neuroscientist then at the University of Cambridge, UK, and his colleagues at the University of Liège in Belgium put him into a functional magnetic resonance imaging (fMRI) machine and started asking him questions. Incredibly, he provided answers. Patients in these states have emerged from a coma and seem awake. Owen's discovery, reported in 2010, caused a media furore. Many researchers disagree with Owen's contention that these individuals are conscious. Still, he shies away from asking patients the toughest question of all — whether they wish life support to be ended — saying that it is too early to think about such applications.
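In the reported paradigm, the patient imagines one activity (such as playing tennis) to answer "yes" and another (such as navigating a familiar place) to answer "no", and the two imagery tasks activate different brain regions. The sketch below caricatures only the final comparison step; the signals, names, and margin are illustrative assumptions, not Owen's actual fMRI analysis.

```python
# Caricature of a yes/no fMRI decoder: compare task-evoked activation
# in the two regions of interest. All values here are hypothetical.
def decode_answer(motor_area_signal: float, spatial_area_signal: float,
                  margin: float = 0.5) -> str:
    """Decide which imagery task dominated the scan."""
    if motor_area_signal - spatial_area_signal > margin:
        return "yes"   # motor imagery (e.g. tennis) dominated
    if spatial_area_signal - motor_area_signal > margin:
        return "no"    # spatial imagery (e.g. navigation) dominated
    return "inconclusive"

print(decode_answer(motor_area_signal=2.1, spatial_area_signal=0.4))
```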

The revolutionary 'contact lens' loaded with stem cells that restores sight, by helping the eye heal itself naturally. The biodegradable implant is loaded with stem cells, which then multiply in the eye and allow the body to heal the eye naturally. It is hoped the implant will help millions of people across the world retain, or even regain, their sight. By Anna Hodgekiss. Published: 12:39 GMT, 6 December 2012 | Updated: 18:25 GMT, 6 December 2012. A 'contact lens' loaded with stem cells could be a way to naturally repair or retain sight. Scientists hope the biodegradable implant, loaded with stem cells that then multiply, will allow the body to heal the eye naturally. Stem cells are the building blocks of tissue growth. The scientists at the University of Sheffield who developed the implant now hope the new technique could help millions of people across the world retain, or even regain, their sight. Laboratory tests have shown that the membranes will support cell growth.

Technology - Will we ever… have cyborg brains? After recent triumphs showing that implants could repair lost brain function, Martin W. Angler explores how soon we can use this technology to create enhanced humans. For the first time in over 15 years, Cathy Hutchinson brought a coffee to her lips and smiled. In both cases the implants convert brain signals into digital commands that a robotic device can follow. Yet it's still a far cry from the visions of man fused with machine, or cyborgs, that grace computer games or sci-fi. Creating implants that improve cognitive capabilities, such as an enhanced vision 'gadget' that can be taken from a shelf and plugged into our brain, or implants that can restore or enhance brain function, is understandably a much tougher task. The media proclaimed one such achievement as an 'artificial cerebellum' and a 'cyborg rat'. In September this year, American scientists said they had created a way of enhancing a monkey's decision making by about 10%.

Labor of Love. For a good long while, I let myself think that the slender platinum blonde behind the counter at Pret A Manger was in love with me. How else to explain her visible glow whenever I strolled into the shop for a sandwich or a latte? Then I realized she lit up for the next person in line, and the next. Radiance was her job. Pret A Manger—a London-based chain that has spread over the past decade to the East Coast and Chicago—is at the cutting edge of what the Berkeley sociologist Arlie Hochschild calls "emotional labor." The British journalist Paul Myerscough flagged Pret's reliance on emotional labor in a fascinating recent essay for the London Review of Books. Pret doesn't merely want its employees to lend their minds and bodies; it wants their souls, too. Emotional labor is not itself new.

Emotions are included. The New Republic has an interesting piece on how corporations enforce 'emotional labour' in their workforce, checking that staff are being sufficiently passionate about their work and caring to their customers. It focuses on the UK sandwich chain Pret, which sends a mystery shopper to each outlet weekly: "If the employee who rings up the sale is appropriately ebullient, then everyone in the shop gets a bonus. If not, nobody does." The concept of 'emotional labour' was invented by sociologist Arlie Hochschild, who used it to describe how some professions require people to present as expressing certain emotions regardless of how they feel. The idea is that the waiter who smiles and tells you to 'have a nice day' doesn't really feel happy to see you and doesn't particularly care how your day will go, but he's asked to present as if he does anyway. Surface emotional labour is known to be particularly difficult when it conflicts too much with what you really feel. For example:

Insect drives robot to track down smells (w/ video). A small, two-wheeled robot has been driven by a male silkmoth to track down the sex pheromone usually given off by a female mate. The robot has been used to characterise the silkmoth's tracking behaviours, and it is hoped that these can be applied to other autonomous robots so they can track down the sources of environmental spills and leaks when fitted with highly sensitive sensors. The results were published on 6 February in the journal Bioinspiration and Biomimetics. The male silkmoth was chosen as the 'driver' of the robot because of its characteristic 'mating dance' when reacting to the sex pheromone of the female. Lead author of the research, Dr Noriyasu Ando, said: "The simple and robust odour tracking behaviour of the silkmoth allows us to analyse its neural mechanisms from the level of a single neuron to the moth's overall behaviour." An 1,800 mm wind tunnel was used in the experiments; the pheromone and robot were placed at opposite ends.
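The "simple and robust" routine Dr Ando refers to is often summarised as surge, cast, and loop: drive straight upwind on a pheromone hit, zigzag with growing amplitude when the scent is lost, and circle as a last resort. The sketch below is a behavioural caricature of that strategy under assumed timings, not the actual controller used in the experiments.

```python
# Behavioural caricature of moth-style odour tracking: surge on a hit,
# cast (zigzag) when recently lost, loop when long lost. Timings and
# the fake sensor are assumptions for illustration.
import random
import time

class Robot:
    """Stand-in for the two-wheeled robot, with the moth's manoeuvres."""
    def surge(self) -> None:
        print("surge: drive straight upwind")
    def zigzag(self, amplitude: float) -> None:
        print(f"zigzag: cast side to side, amplitude {amplitude:.1f}")
    def loop(self) -> None:
        print("loop: circle to reacquire the plume")

def pheromone_detected() -> bool:
    """Placeholder for a pheromone sensor reading."""
    return random.random() < 0.2

def track(robot: Robot, steps: int = 50, dt: float = 0.1) -> None:
    since_hit = 10.0  # seconds since the last pheromone hit
    for _ in range(steps):
        if pheromone_detected():
            since_hit = 0.0
        if since_hit == 0.0:
            robot.surge()             # fresh hit: head straight upwind
        elif since_hit < 2.0:
            robot.zigzag(since_hit)   # recently lost: widen the search
        else:
            robot.loop()              # long lost: circle
        since_hit += dt
        time.sleep(0.01)

track(Robot())
```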

Is bionic vision about to become a reality? Visionary: is groundbreaking technology close to restoring sight to the visually impaired without the need for surgery? Bionic vision was once the preserve of futuristic technology shows. But those Tomorrow's World days could soon be reality, with a pair of glasses linked to a computer offering hope to thousands of visually impaired people in Britain. Current technology that can give profoundly blind people a form of sight through a retinal implant is expensive and invasive, so developing an affordable, non-invasive alternative would be a welcome improvement. And this is exactly what scientists at the University of Oxford are working on and will be showcasing at the Science Uncovered event at the Natural History Museum in London this week. 'Our technology is aimed at those who are classed as legally blind but can still see a small amount of light, such as those with age-related macular degeneration and diabetic retinopathy.'

untitled. February 19th, 2013 | by Charles Q. Choi. Temporary electronic tattoos could soon help people fly drones by thought alone and talk, seemingly telepathically, over smartphones without speaking, researchers say. Commanding machines using the brain is no longer the stuff of science fiction. In recent years, brain implants have enabled people to control robotics using only their minds, raising the prospect that one day patients could overcome disabilities using bionic limbs or mechanical exoskeletons. But brain implants are invasive technologies, probably of use only to people in medical need of them. Coleman's team is instead developing wireless flexible electronics that can be applied to the forehead just like temporary tattoos to read brain activity. "We want something we can use in the coffee shop to have fun," Coleman says. The devices are less than 100 microns thick, the average diameter of a human hair. These devices can also be put on other parts of the body, such as the throat.

The Amazing Story Of The $300 Glasses That Correct Colorblindness. If it were up to academia, Changizi's story might have ended there. "I started out in math and physics, trying to understand the beauty in these fields," he says. "You are taught, or come to believe, that applying something useful is inherently not interesting." Not only did Changizi manage to beat that impulse out of himself, but he and Tim Barber, a friend from middle school, teamed up several years ago to form a joint research institute. 2AI Labs allows the pair to focus on research into cognition and perception in humans and machines, and then to commercialize it. The most recent project? A pair of glasses with filters that just happen to cure colorblindness. Changizi and Barber didn't set out to cure colorblindness. When they started thinking about commercial applications, both admit their minds went straight to television cameras. Changizi knew this was a possibility, as the filter concentrates enhancement exactly where red-green colorblind people have a block.

A Brain-to-Brain Interface for Real-Time Sharing of Sensorimotor Information : Scientific Reports. In our training paradigm, animals learned basic elements of the tasks prior to participating in any BTBI experiments. First, prospective encoder rats were trained to respond to either tactile or visual stimuli until they reached 95% correct-trial accuracy. Meanwhile, decoder rats were trained to become proficient while receiving ICMS as a stimulus. A train of ICMS pulses instructed the animal to select one of the levers/nose pokes, whereas a single ICMS pulse instructed a response to the other option. Decoder rats reached a correct-trial performance level of 78.77% ± 2.1%. After this preliminary training was completed, the animals were run in pairs, each one in a separate operant box. The next phase of training began with the encoder rat performing ~10 trials of the motor or tactile task, which were used to construct a cortical ensemble template, i.e. the mean cortical neuronal activity for one of the responses.
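The template step described above can be sketched as follows: average the encoder rat's ensemble activity over the training trials, score each new trial against that template, and translate the result into the ICMS pattern (a pulse train for one choice, a single pulse for the other). The correlation measure and threshold below are assumptions for illustration, not the paper's exact decoding method.

```python
# Hedged sketch of template-based BTBI decoding: build a mean-activity
# template, score new trials against it, and map the score to an ICMS
# pattern. The correlation measure and threshold are assumptions.
import numpy as np

def build_template(trials: np.ndarray) -> np.ndarray:
    """Mean ensemble firing over ~10 training trials (trials x neurons)."""
    return trials.mean(axis=0)

def icms_pulses(trial: np.ndarray, template: np.ndarray,
                threshold: float = 0.8, train_length: int = 20) -> int:
    """Similar to the template -> pulse train; otherwise a single pulse."""
    r = np.corrcoef(trial, template)[0, 1]
    return train_length if r >= threshold else 1

rng = np.random.default_rng(0)
train = rng.poisson(5.0, size=(10, 32))       # 10 trials, 32 neurons
template = build_template(train)
new_trial = template + rng.normal(0, 0.3, 32) # a matching trial
print(icms_pulses(new_trial, template))       # likely the full pulse train
```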
