
Non-invasive BCI


Technology - 3D printing powered by thought. Imagine if you could print objects just by thinking about them. Camila Ruz visits one company to see whether this is a far-fetched dream or a real possibility. It’s definitely not a bird. Nor is it a plane. The garish orange piece of plastic, small enough to hold in the palm of a hand, could pass for a missing limb of a toy tyrannosaurus. It may not look all that impressive, but it’s notable for two reasons. One is that the monster arm has emerged from a 3D printer; the other is that it was designed by thought. This milestone was reached with little fanfare last month at the Santiago MakerSpace, a technology and design studio in the Chilean capital. Engineers and designers have been using 3D printers for more than two decades. “What is the point of these printers if my son cannot design his own toy?”

That’s where Emotional Evolutionary Design (EED), the software that allows Thinker Thing to interpret its users’ thoughts, comes in. Lipson’s lab is also working on evolving 3D models with the mind. Dream maker: the future of brain-computer interfaces revealed. You may already be having basic conversations with your smartphone, desktop PC, games console, TV and, soon, your car, but such voice recognition is – in the scientific community, at least – firmly in a folder marked 'dumb' technology. New ways of controlling consumer electronics with basic voice and gesture commands are suddenly common, but we could soon be operating computers not by barking out instructions or waving, but purely by thinking. Long-running research into the brain-computer interface (BCI) – also known as the 'mind-machine' interface – is becoming so advanced that it is set to create a whole new symbiotic relationship between man and machine.

It could even lead to a situation where speech is rendered obsolete, with people communicating wirelessly through universal translator chips. No more complaining about loud music in nightclubs, then. Forget about the wireless revolution – this revolutionary tech demands cables. DARPA combines human brains and 120-megapixel cameras to create the ultimate military threat detection system. After more than four years of research, DARPA has created a system that successfully combines soldiers, EEG brainwave scanners, 120-megapixel cameras, and multiple computers running cognitive visual processing algorithms into a cybernetic hivemind.

Called the Cognitive Technology Threat Warning System (CT2WS), it will be used in a combat setting to significantly improve the US Army’s threat detection capabilities. There are two discrete parts to the system: The 120-megapixel camera, which is tripod-mounted and looks over the battlefield (pictured below); and the computer system, where a soldier sits in front of a computer monitor with an EEG strapped to his head (pictured above). Images from the camera are fed into the computer system, which runs cognitive visual processing algorithms to detect possible threats (enemy combatants, sniper nests, IEDs).
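The division of labour described above (machine vision proposes, the human brain disposes) can be sketched in a few lines. The idea, hedged heavily: images flagged by the vision algorithms are flashed to the soldier, and any image followed by a P300-like EEG deflection (the brain's involuntary recognition response, roughly 250–450 ms after the stimulus) is marked as a threat. The function name, window, and threshold below are illustrative assumptions, not DARPA's implementation.

```python
def flag_threats(stim_times, eeg, fs, threshold=5.0):
    """Flag stimuli followed by a P300-like EEG deflection.

    stim_times: stimulus onset times in seconds
    eeg: one EEG channel as a list of samples
    fs: sample rate in Hz
    """
    flagged = []
    for i, t in enumerate(stim_times):
        start = int((t + 0.25) * fs)            # P300 window: ~250-450 ms post-stimulus
        window = eeg[start:start + int(0.2 * fs)]
        if window and max(window) > threshold:  # deflection -> brain "noticed" something
            flagged.append(i)
    return flagged
```

Even this toy version shows why the approach is fast: the soldier never consciously reports anything; the flagging rides on an automatic brain response.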

In short, CT2WS taps the human brain’s unsurpassed ability to recognize objects. Now read: Changing the world: DARPA’s top inventions. Reading the World through the Skin and Ears: A New Perspective on Sensory Substitution. Comparison of consumer brain–computer interfaces. This is a comparison of brain-computer interface devices available on the consumer market. Open-source projects: Emokit is an open-source Python library for reading out sensor data from the EPOC (Emotiv Systems), written by Cody Brocious. It was built by reverse-engineering the headset's encrypted protocol.[40] Emokit has since been deprecated.[41] Open-source Matlab toolboxes such as EEGLAB, Fieldtrip, and the Neurophysiological Biomarker Toolbox (NBT) can be used to process electroencephalography (EEG) data.
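To give a flavour of what toolboxes like EEGLAB, Fieldtrip and NBT actually compute from raw EEG, here is a minimal, dependency-free sketch of one of the most basic features: signal power in a frequency band (for example the 8–12 Hz alpha band), via a naive DFT. The function is illustrative only; real toolboxes use windowed, FFT-based spectral estimators that are far faster and more robust.

```python
import math

def band_power(samples, fs, f_lo, f_hi):
    """Signal power in the [f_lo, f_hi] Hz band via a naive DFT (illustrative)."""
    n = len(samples)
    total = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n                      # frequency of DFT bin k
        if f_lo <= freq <= f_hi:
            re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            total += (re * re + im * im) / (n * n)
    return total
```

A one-second window of a 10 Hz oscillation, for instance, will show almost all of its power in the alpha band and essentially none in the 20–30 Hz band.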

The Fieldtrip toolbox also offers a real-time plugin.[42] OpenViBE is an LGPL software platform (C++) used to design, test and use BCIs.[43] The software comes with an acquisition server that is currently compatible with many EEG devices, including the NeuroSky MindSet, Emotiv EPOC (Research Edition or above) and OpenEEG. Several open-source computer programs are also available from EPFL's CNBI project.[44][45] NeuroSky Mindwave Mobile Myndplay Bundle. Google Glass - will we love it or hate it? 5 May 2013, last updated at 20:16 ET. By Jane Wakefield, Technology reporter. The glasses have a small camera and display built in. Google's smart glasses project has been causing excitement in the tech world for months as speculation about what it will finally look like and be able to do reaches fever pitch.

Prototype devices are being tested by around 1,000 so-called Glass Explorers and are expected to go on sale to the public next year. While some see such wearable computing as the obvious next step for the digital age, others regard the idea of even more intimate connections with the network as quite scary. The BBC has gathered the views of those who have tried Glass and others who have strong views about the project to see what a smart-glassed future might look like. I was the first person on the West Coast to pick up my device and, having had Glass for a few weeks now, I'm mostly surprised at how much there was to learn about using it, and how much more there is to discover.

Don't get me wrong. EEG to help prevent strokes; petting a cat and the prefrontal cortex; beatboxing – this week's News Roundup! This week in our news roundup: the Discovery Channel highlights how the UPMC Rehabilitation Institute is using single unit recording to achieve greater accuracy and control with brain-computer interfaces; an EEG headset to help prevent strokes has been developed in Israel; measuring pleasure stimuli using a consumer EEG headset from neurofeedback company MyndPlay; and beatboxing as seen through an MRI. 1// Using single unit recording to achieve greater control with a BCI. The team that helped a woman lift a cup using a brain-computer interface, at the UPMC Rehabilitation Institute, is now working on a second study that uses single unit recording.

What is single unit recording? It uses a small electrode grid implanted in the brain that allows researchers to record activity from individual neurons, to achieve an even greater degree of control and accuracy of movement than the first study allowed. 2// NeuroKeeper develops EEG headset to help prevent strokes. 3// On a scale of 1 to 100, measure your pleasure. Technology - 3D printing powered by thought. Eunoia. Reading the World through the Skin and Ears: A New Perspective on Sensory Substitution.
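Once individual neurons can be recorded this way, a classic scheme for turning their activity into movement commands is the population vector: each neuron has a "preferred direction", and its firing rate votes for that direction. The scheme itself is standard in motor BCI work, but the code below is my own minimal sketch, not the UPMC team's decoder.

```python
import math

def population_vector(rates, preferred_dirs):
    """Decode a movement direction (radians) from single-unit firing rates.

    Each neuron's preferred direction is weighted by its firing rate;
    the angle of the summed vector is the decoded direction.
    """
    x = sum(r * math.cos(d) for r, d in zip(rates, preferred_dirs))
    y = sum(r * math.sin(d) for r, d in zip(rates, preferred_dirs))
    return math.atan2(y, x)
```

With four neurons tuned to the four cardinal directions, activity concentrated in the "up" neuron decodes to straight up; mixed activity decodes to intermediate angles, which is what gives the smooth control the article describes.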

EEG to help prevent strokes; petting a cat and the prefrontal cortex; beatboxing – this week's News Roundup! | InteraXon Blog. Introducing Touchy, A Human Camera from Japan. Touchy is a camera that is worn on one’s head and only takes photos when the person wearing it is physically touched. Every time someone makes contact with the wearer, it will open its shutter-like eye-holes and take photos.
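Touchy's trigger rule (one photo at the moment of contact, then one every ten seconds for as long as contact is held) is simple enough to state as code. The interval representation and function name below are my own illustration; only the timing behaviour comes from the article.

```python
def shutter_times(touch_intervals, period=10.0):
    """Return the times at which Touchy would take a photo.

    touch_intervals: list of (start, end) contact intervals in seconds.
    One shot fires at contact onset, then every `period` seconds while held.
    """
    shots = []
    for start, end in touch_intervals:
        t = start
        while t <= end:
            shots.append(t)
            t += period
    return shots
```

So a 25-second hug yields three photos: at 0, 10 and 20 seconds.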

If the contact is maintained, the camera will take a shot every ten seconds. Developed by Eric Siu, Tomohiko Hayakawa, and Carson Reynolds, Touchy (the human camera) is a phenomenological social-interaction experiment that focuses on the relationship of giving and receiving by literally transforming a human into a camera. This human camera, with its unique interpersonal properties, aims at healing social anxiety by creating joyful interactions. “It is common for humans to be separated into social bubbles, to avoid sharing social space and connecting with strangers,” say its creators. Via Artnau. Hearing Through Your Skin, and Other Adventures in Sensory Substitution | In Their Own Words.

We’re entering a very interesting stage of human history right now where we can start importing technology to enhance our natural senses or perception of the world. So as it stands now, as biological creatures, we only see a very small strip of what's going on. Obviously, the infinitely large and the infinitely small - our brains aren't even wired to be able to understand that.

But even on the space scales that we live at, we don't see most of what's going on. So, for example, take electromagnetic radiation: there's a little strip of it that we can see, and we call it visible light. But the whole rest of that spectrum (radio waves, television, cell phone signals, gamma rays, X-rays) is invisible to us because we don't have biological receptors for it.

So CNN is passing through your body right now and you don't know it because you don't have the right receptors for it. Actually, I’ll start with sensory substitution. In Their Own Words is recorded in Big Think's studio. InteraXon - Thought-controlled computing - Interaxon. Highlights of NeuroGaming 2013. Google Glass - will we love it or hate it?

Sensors and Actuators A: Physical - A 3D printed dry electrode for ECG/EEG recording. a Centre for Microsystems Technology (CMST), Faculty of Engineering, University of Ghent, 914A Technologiepark, B-9052 Ghent-Zwijnaarde, Belgium; b Department of Neurology, Ghent University Hospital, 185 De Pintelaan, 9000 Ghent, Belgium; c Faculty of Applied Engineering Sciences, CPMT Research Group, University College Ghent, Voskenslaan 362, B-9000 Ghent, Belgium; d Department of Materials Science and Engineering, University Ghent, Technologiepark 903, B-9052 Ghent, Belgium. Received 4 May 2011; revised 8 December 2011; accepted 8 December 2011; available online 17 December 2011. Abstract: In this paper, the design, fabrication and testing of a 3D printed dry electrode are presented. 3D printing represents an authentic breakthrough for the development and mass production of dry medical electrodes.

Keywords: 3D printed dry electrode; bioelectric signal recording; electrocardiogram (ECG); electroencephalogram (EEG). 2010 Garguilo Clinical Neurophysiology.pdf. Dry electrodes - gtec's newest development. Brain Computer Interfaces Inch Closer to Mainstream. (Image: Cadeau Creative.) Muse, a lightweight, wireless headband, can engage with computers, iPads and smartphones.

Last week, engineers sniffing around the programming code for Google Glass found hidden examples of ways that people might interact with the wearable computers without having to say a word. Among them: a user could nod to turn the glasses on or off, and a single wink might tell the glasses to take a picture. But don't expect these gestures to be necessary for long. Soon, we might interact with our smartphones and computers simply by using our minds. In a couple of years, we could be turning on the lights at home just by thinking about it, or sending an e-mail from our smartphone without even pulling the device from our pocket. Farther into the future, your robot assistant will appear by your side with a glass of lemonade simply because it knows you are thirsty. Google Glass Could Make Snapping Pics as Easy as Winking | Gadget Lab. Google Glass will include more features than what meets the eye. Image: Google. Code tucked away in the MyGlass Google Glass companion app reveals Google is working on a handful of cool new features for its smart frames, including two-finger touch-to-zoom and winking to take a photo.
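Stripped of the hardware, these hidden hooks amount to a small gesture-to-action mapping. A toy sketch of that idea follows; the identifiers are invented for illustration and are not Google's actual API, and only head-wake, wink-to-photo and two-finger zoom are among the reported findings.

```python
# Illustrative gesture-to-action table; names are hypothetical, not Google's.
GESTURE_ACTIONS = {
    "head_wake": "toggle_display",    # nod to turn the glasses on or off
    "wink": "take_photo",             # single wink takes a picture
    "two_finger_touch": "zoom",       # two-finger touch-to-zoom
}

def handle_gesture(name):
    """Return the action bound to a recognized gesture, or 'ignored'."""
    return GESTURE_ACTIONS.get(name, "ignored")
```

The point of such a table is that adding a new silent interaction is just one more entry, which is presumably why these features could sit dormant in the shipped code.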

The code, discovered by Reddit user Fodawim, suggests users will be able to use their eyes, fingers, and head to accomplish various tasks. For example, a “head wake” function, listed in the code as a head gesture, could turn Glass on or off. Google unveiled Project Glass in April 2012. Via TheNextWeb. Samsung’s (Very) Early Attempts At Thought-Controlled Mobile Devices. Samsung’s Galaxy smartphones are controlled by touch, gesture, eye movement – and your mind.

Well, not exactly that last bit. At least, not yet. Perhaps half in the name of science, half for publicity, Samsung has teamed up with Roozbeh Jafari (an assistant professor at the University of Texas at Dallas and a wearable computing expert) to translate thoughts into common computing tasks using an electroencephalogram (EEG) cap. The EEG cap fits snugly onto the user’s head and uses electrodes to pick up the brain’s faint electrical signals. These signals fall into repetitive patterns when confronted with repetitive visual stimuli, like blinking icons. The initial challenge was detecting and separating the right signals to accurately control the device. In an MIT Technology Review video, a user sporting an EEG cap is shown manipulating a tablet – launching a music application, selecting the artist, and pausing and resuming music. The news comes with a standard disclaimer: it’s still very early going. Forget your password: The future is 'passthoughts'. (Phys.org) – Instead of typing your password, in the future you may only have to think your password, according to School of Information researchers.

A new study explores the feasibility of brainwave-based computer authentication as a substitute for passwords. The project was led by School of Information professor John Chuang, along with Hamilton Nguyen, an undergraduate student in electrical engineering and computer science; Charles Wang, a first-year I School MIMS student; and Benjamin Johnson, formerly a postdoctoral scholar at the I School. Chuang presented the team's findings this week at the 2013 Workshop on Usable Security at the Seventeenth International Conference on Financial Cryptography and Data Security in Okinawa, Japan.
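At its core, brainwave authentication is template matching: at enrollment the system stores an EEG feature vector for the user's chosen mental task, and at login it checks whether a fresh recording is similar enough to that template. The sketch below uses cosine similarity; the feature representation and the 0.95 threshold are illustrative assumptions, not the Berkeley team's actual method.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def authenticate(enrolled, attempt, threshold=0.95):
    """Accept the 'passthought' if the fresh EEG feature vector is close
    enough to the enrolled template (threshold is illustrative)."""
    return cosine_similarity(enrolled, attempt) >= threshold
```

The threshold is the whole security story in miniature: set it too low and impostors get in (false accepts); too high and the legitimate user is locked out by normal day-to-day EEG variability (false rejects).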

Since the 1980s, computer scientists have proposed the use of biometrics for computer authentication. All that has changed, though, with recent developments in biosensor technologies. Brain-computer interfaces inch closer to mainstream, raising questions. Non-invasive brain-to-brain interface: links between two brains. Block diagram of brain-to-brain interface (BBI). Left: steady-state visual evoked potential (SSVEP)-based brain-to-computer interface; right: focused ultrasound-based computer-to-brain interface (CBI).

(Credit: Yoo S-S et al./PLoS ONE) We reported last month how Duke University researchers remotely linked the brains of two rats. Now researchers at Brigham and Women’s Hospital and Harvard Medical School have set up a system intended to allow a human to remotely make a rat’s tail flick. The BBI system had two parts: a BCI, using EEG sensors and a computer to pick up intention from the human; and transcranial sonication with focused ultrasound (FUS) to modulate the neural activity of specific brain regions in the rat’s brain.
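Functionally, the whole chain reduces to a one-bit link: when the human attends to a flickering stimulus, an SSVEP at the flicker frequency appears in their EEG; when that frequency's power crosses a threshold, the CBI side fires the focused-ultrasound pulse at the rat. Here is a deliberately simple, stdlib-only sketch of that control loop; the naive DFT power estimate, the threshold, and the command strings are all illustrative, not the published pipeline.

```python
import math

def ssvep_power(samples, fs, freq):
    """Power at the stimulus frequency's nearest DFT bin (naive estimate)."""
    n = len(samples)
    k = round(freq * n / fs)
    re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
    im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
    return (re * re + im * im) / (n * n)

def bbi_step(eeg_window, fs, stim_freq, threshold=0.1):
    """One cycle of the brain-to-brain link: human intent in, CBI command out.

    'fire_fus_pulse' stands in for triggering the focused ultrasound
    that makes the rat's tail flick; 'idle' does nothing.
    """
    if ssvep_power(eeg_window, fs, stim_freq) >= threshold:
        return "fire_fus_pulse"
    return "idle"
```

Attending to the flicker drives the SSVEP bin's power up and fires the pulse; looking away leaves the link idle, which is exactly the on-off character the excerpt notes below.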

Wagging the rat. The 'Brainstorm' scenario. Brainstorm (credit: MGM). This experiment was limited to a simple on-off signal. “Stop the Cyborgs” launches public campaign against Google Glass. Less than two weeks ago, Seattle’s 5 Point Cafe became the first known establishment in the United States (and possibly the world) to publicly ban Google Glass, the highly anticipated augmented reality device set to be released later this year. The “No Glass” logo that the café published on its website was developed and released (under a Creative Commons license) by a new London-based group called “Stop the Cyborgs.” The group is composed of three young Londoners who decided to make a public case against Google Glass and other similar devices.

“If it's just a few geeks wearing it, it's a niche tool [and] I don't think it's a problem,” said Adam, 27, who prefers to be identified only by his first name. He communicated with Ars via Skype and an encrypted Hushmail e-mail account. “But if suddenly everyone is wearing it and this becomes as prevalent as smartphones, you can see it becomes very intrusive very quickly.” “Most people [have] no idea what they were looking at.” A workout for your self-control: Jordan Silberman at TEDxFlourCity. Quasar USA. Video: Mind reading system may help us drive better. Google reveals tech specs for Glass. Google Glass: how it works (infographic)

The blind rock climber who sees with his tongue. A tablet controlled by your brain. Home - Non-Invasive Brain-Computer Interface. Scientists Create ‘Star Trek’ Visor, Helps Blind See.