DARPA combines human brains and 120-megapixel cameras to create the ultimate military threat detection system
After more than four years of research, DARPA has created a system that successfully combines soldiers, EEG brainwave scanners, 120-megapixel cameras, and multiple computers running cognitive visual processing algorithms into a cybernetic hivemind. Called the Cognitive Technology Threat Warning System (CT2WS), it will be used in combat settings to significantly improve the US Army's threat detection capabilities. There are two discrete parts to the system: the 120-megapixel camera, which is tripod-mounted and looks out over the battlefield (pictured below), and the computer station, where a soldier sits in front of a monitor with an EEG strapped to his head (pictured above). Images from the camera are fed into the computer system, which runs cognitive visual processing algorithms to detect possible threats (enemy combatants, sniper nests, IEDs). In short, CT2WS taps the human brain's unsurpassed ability to recognize objects.
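The two-stage design described above can be sketched in a few lines of Python. This is a minimal illustration of the human-in-the-loop idea only: the function names, scores, and thresholds are all hypothetical, not the actual CT2WS implementation. The machine pre-filters frames, and the operator's EEG response (a recognition signal of the kind CT2WS exploits) confirms or rejects each candidate.

```python
# Hypothetical sketch of a human-in-the-loop threat-detection pipeline.
# All names, scores, and thresholds here are illustrative assumptions.

def machine_prefilter(frames, score_fn, threshold=0.5):
    """Stage 1: visual-processing algorithms flag candidate frames."""
    return [f for f in frames if score_fn(f) >= threshold]

def human_confirm(candidates, eeg_response_fn, eeg_threshold=0.7):
    """Stage 2: the operator views each candidate; a strong EEG
    recognition response confirms it as a real threat."""
    return [c for c in candidates if eeg_response_fn(c) >= eeg_threshold]

# Toy data: frames with precomputed algorithm and EEG scores.
frames = [
    {"id": 1, "algo": 0.9, "eeg": 0.8},  # flagged and confirmed
    {"id": 2, "algo": 0.6, "eeg": 0.3},  # flagged, rejected by operator
    {"id": 3, "algo": 0.2, "eeg": 0.9},  # never shown to the operator
]
threats = human_confirm(
    machine_prefilter(frames, lambda f: f["algo"]),
    lambda f: f["eeg"],
)
print([f["id"] for f in threats])  # [1]
```

The point of the staging is that the cheap algorithmic filter reduces the operator's workload, while the brain's object recognition catches what the algorithms alone would misjudge.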


MindWave Education turns your computer into a private tutor. The headset takes decades of laboratory brainwave technology and puts it into a bundled software package for under $100. It safely measures brainwave signals and monitors students' attention levels as they interact with math, memory, and pattern-recognition applications. Ten apps are included, with experiences ranging from light entertainment to serious education.
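Headsets in this family expose their readings as a stream of JSON messages. As a rough sketch of how an application might monitor attention, the snippet below parses such a stream; the field names follow NeuroSky's published ThinkGear format, but treat them (and the sample data) as assumptions rather than a verified API.

```python
import json

def attention_levels(lines):
    """Extract eSense attention values (0-100) from a ThinkGear-style
    JSON line stream. Field names are assumed from NeuroSky's format."""
    out = []
    for line in lines:
        try:
            msg = json.loads(line)
        except ValueError:
            continue  # skip malformed packets
        att = msg.get("eSense", {}).get("attention")
        if att is not None:
            out.append(att)
    return out

# Toy stream: two eSense packets, one raw-EEG packet, one garbled line.
stream = [
    '{"eSense":{"attention":53,"meditation":61}}',
    '{"rawEeg":-34}',
    'garbage',
    '{"eSense":{"attention":77,"meditation":40}}',
]
print(attention_levels(stream))  # [53, 77]
```

An education app would then react to these values, for example pausing an exercise when attention stays low.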

Amazing video shows us the actual movies that play inside our mind. Upon rereading the entire article, it appears that the second clip shown to the subjects was fully reconstructed from what was learned about the brain in the first mapping session.

AS3 Particle Node Sequencer › Experimenting with the Tonfall Audio Engine. "An experimental particle-based audio sequencer, created in Flash using Tonfall, the new open-source AS3 audio engine produced by Andre Michelle …" (You can drag each node and switch off the wander behaviour to create your own compositions.) At Flash on the Beach this year, I had the privilege of seeing Andre Michelle speak. It was great to hear him explain some of his fantastic work behind Audiotool and to see and hear more of his audio experiments.

The future of brain-computer interfaces revealed. You may already be having basic conversations with your smartphone, desktop PC, games console, TV and, soon, your car, but such voice recognition is, in the scientific community at least, firmly in a folder marked 'dumb' technology. New ways of controlling consumer electronics with basic voice commands and gestures are suddenly common, but we could soon be operating computers not by barking out instructions or waving, but purely by thinking. Research into the long-studied brain-computer interface (BCI), also known as the 'mind-machine' interface, is becoming so advanced that it is set to create a whole new symbiotic relationship between man and machine. It could even lead to a situation where speech is rendered redundant and people communicate wirelessly through universal translator chips. No more complaining about loud music in nightclubs, then.

Video: Remote helicopter controlled by brain waves. A team at the University of Minnesota, led by biomedical engineering professor Bin He, has learned to steer a flying robot around a gym using thought alone, making it turn, rise, dip, and even sail through a ring. Brain waves (EEG) are picked up by the electrodes of an EEG cap on the scalp. The system works thanks to the geography of the motor cortex, the area of the cerebrum that governs movement. When we move, or think about a movement, neurons in the motor cortex produce tiny electric currents; thinking about a different movement activates a different assortment of neurons. The 64 scalp electrodes of the EEG cap report the electrical activity (or its absence) they detect to a computer, which translates the pattern into an electronic command.
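One common way such signals are decoded, which the paragraph above gestures at, is motor imagery over the mu rhythm: imagining a right-hand movement suppresses 8-12 Hz power over the left motor cortex, and vice versa. The sketch below shows that decoding idea on synthetic data; it is a simplified assumption-laden toy, not Bin He's actual pipeline, and the thresholds and command names are invented for illustration.

```python
import numpy as np

def band_power(signal, fs, lo=8.0, hi=12.0):
    """Mean spectral power in a frequency band (default: the 8-12 Hz
    mu rhythm, which desynchronizes during imagined movement)."""
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

def decode_command(left_ch, right_ch, fs, baseline):
    """Map a lateralized drop in mu power to a steering command.
    left_ch/right_ch are one-second windows from electrodes over the
    left and right motor cortex; baseline is resting mu power."""
    left_drop = baseline - band_power(left_ch, fs)
    right_drop = baseline - band_power(right_ch, fs)
    if abs(left_drop - right_drop) < 0.1 * baseline:
        return "hover"  # no clear lateralization
    return "turn_right" if left_drop > right_drop else "turn_left"

# Synthetic demo: a 10 Hz "mu rhythm" suppressed on the left channel,
# as if the operator imagined a right-hand movement.
fs = 256
t = np.arange(fs) / fs
rest = np.sin(2 * np.pi * 10 * t)
baseline = band_power(rest, fs)
left = 0.3 * rest      # suppressed mu over left motor cortex
right = rest.copy()    # unchanged on the right
print(decode_command(left, right, fs, baseline))  # turn_right
```

A real system trains a classifier on many such feature windows per user, but the geometry is the same: which hemisphere's rhythm drops determines the command.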

Fluorescent protein lets us read a fish's thoughts - life - 31 January 2013. Video: Glowing protein lets us see into a fish's brain. The zebrafish spots its lunch. What goes through its brain? Now, for the first time, we can see exactly what it is thinking, thanks to a new way of studying single neurons that lets researchers track patterns of brain activity in a live animal.

3-D Printing Will Change the World. To anyone who hasn't seen it demonstrated, 3-D printing sounds futuristic, like the meals that materialized in the Jetsons' oven at the touch of a keypad. But the technology is quite straightforward: it is a small evolutionary step from spraying toner on paper to putting down layers of something more substantial (such as plastic resin) until the layers add up to an object. And yet, by enabling a machine to produce objects of any shape, on the spot and as needed, 3-D printing really is ushering in a new era. As applications of the technology expand and prices drop, the first big implication is that more goods will be manufactured at or close to their point of purchase or consumption.
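The "layers add up to an object" idea can be made concrete with a toy slicing step, the first thing any 3-D printing toolchain does before generating toolpaths. The numbers and function name below are illustrative, not any particular slicer's API.

```python
# Toy illustration of layer-by-layer fabrication: slice an object's
# height into the discrete Z heights at which material is deposited.

def slice_heights(object_height_mm, layer_height_mm):
    """Z heights for each deposited layer, bottom to top."""
    n_layers = int(round(object_height_mm / layer_height_mm))
    return [round((i + 1) * layer_height_mm, 3) for i in range(n_layers)]

# A 1 mm tall part at 0.2 mm layer height needs five passes.
print(slice_heights(1.0, 0.2))  # [0.2, 0.4, 0.6, 0.8, 1.0]
```

Finer layers mean smoother parts at the cost of proportionally longer print times, which is the basic trade-off every slicer exposes.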

Comparison of consumer brain-computer interfaces. This is a comparison of brain-computer interface devices available on the consumer market, including open-source projects.

Brain-computer Interfaces and the Social Order (Roland Schiefer, October 15, 2012). Brain-computer interfaces are entering the mass market. Their current uses are either benevolent or harmless, and they open a wide range of fascinating opportunities. However, they also pose serious risks that need to be identified and handled. One company that has recently garnered media attention in this field is NeuroVigil.

Athletes rapidly learn complex and neutral dynamic visual scenes. We tested a total of 102 professional players (mean age = 23.8 ± 5.5 SD, median 22) from three sports: 51 professional soccer players from the English Premier League (EPL), 21 professional ice hockey players from the National Hockey League (NHL), and 30 professional rugby players from the French Top 14 league (Top14). We also tested a total of 173 elite amateurs (mean age = 23.5 ± 5.8 SD, median 22), 136 from NCAA university sports programs in the US and 37 from a European Olympic sport-training center. We also tested 33 non-athlete university students (mean age = 23.8 ± 5.0 SD, median 22) from the Université de Montréal. We have previously reported that, under identical conditions, top professional soccer, ice hockey, and rugby teams generate very similar sensitivity profiles3. For this reason the professionals are presented as a single population group. The y values are arbitrary speed units.

Autodesk Labs Innovation Edge Newsletter - February 2013

Google Glass - will we love it or hate it? 5 May 2013, by Jane Wakefield, technology reporter. The glasses have a small camera and display built in. Google's smart-glasses project has been causing excitement in the tech world for months, as speculation about what it will finally look like and be able to do reaches fever pitch. Prototype devices are being tested by around 1,000 so-called Glass Explorers and are expected to go on sale to the public next year.

MindWave with MyndPlay - mind-controlled video. Spielberg, Cameron, and Scorsese aren't going to like this... MindWave Mobile is a brainwave-reading headset compatible with iOS, Android, Mac, and PC platforms. Bundled with MyndPlay, MindWave is the world's first mind-controlled video application, putting users in control of their own movie experiences. Similar to Edward Packard's "Choose Your Own Adventure" game books, MyndPlay lets users alter scenes and outcomes within a movie simply by focusing or relaxing when required. With more than 100 applications available, there are plenty of options for every age and interest. Whether you want to blow up a can of Red Bull on your iPad or levitate a cupcake on your Android, MindWave Mobile lets you train your brain on the go.