Mind

Brain Games & Brain Training.

The Autumn of the Multitaskers - The Atlantic (November 2007). Neuroscience is confirming what we all suspect: Multitasking is dumbing us down and driving us crazy. One man's odyssey through the nightmare of infinite connectivity.

I think your suggestion is, Can we do two things at once?

Well, we’re of the view that we can walk and chew gum at the same time. In the midwestern town where I grew up (a town so small that the phone line on our block was a “party line” well into the 1960s, meaning that we shared it with our neighbors and couldn’t use it while one of them was using it, unless we wanted to quietly listen in—with their permission, naturally, and only if we were feeling awfully lonesome—while they chatted with someone else), there were two skinny brothers in their 30s who built a car that could drive into the river and become a fishing boat.

My pals and I thought the car-boat was a wonder.

Sporcle.com: mentally stimulating diversions.

Digital Overload Is Frying Our Brains | Wired Science from Wired.

Paying attention isn’t a simple act of self-discipline, but a cognitive ability with deep neurobiological roots — and this complex faculty, says Maggie Jackson, is being woefully undermined by how we’re living. In Distracted: The Erosion of Attention and the Coming Dark Age, Jackson explores the effects of "our high-speed, overloaded, split-focus and even cybercentric society" on attention. It’s not a pretty picture: a never-ending stream of phone calls, e-mails, instant messages, text messages and tweets is part of an institutionalized culture of interruption, and makes it hard to concentrate and think creatively.

Of course, every modern age is troubled by its new technologies. "The telegraph might have done just as much to the psyche [of] Victorians as the BlackBerry does to us," said Jackson. "But at the same time, that doesn't mean that nothing has changed. The question is, how do we confront our own challenges?" Wired.com talked to Jackson about attention and its loss.

See Also: Games for the Brain.

How Google Is Making Us Smarter | Machine-Brain Connections | DI.

Our minds are under attack. At least that's what I keep hearing these days. Thumbing away at our text messages, we are becoming illiterate. (Or is that illiter8?) Blogs make us coarse, YouTube makes us shallow. I have a hard time taking these Cassandras of the Computer Age seriously. More significantly, the ominous warnings feed on a popular misconception of how the mind works.

The mind appears to be adapted for reaching out and making the world, including our machines, an extension of itself. This concept of the extended mind was first raised in 1998, right around the time Google was born, by two philosophers, Andy Clark, now at the University of Edinburgh, and David Chalmers, now at the Australian National University.

Clark and Chalmers asked their readers to imagine a woman named Inga, who recalls the address of a museum exhibition from memory, and a man named Otto, who has Alzheimer's disease and looks the same address up in a notebook he always carries. In the view of Clark and Chalmers, Inga's brain-based memory and Otto's notebook are fundamentally the same.

A walk on the Web is good for the brain.

Can Googling delay the onset of dementia? A UCLA study, part of growing research into the effects of technology on the brain, shows that searching the Internet may keep older brains agile -- it's like taking your brain for a walk. It's too early to conclude that technology will help vanquish Alzheimer's disease, but "our study shows that when your brain is on Google, your neural circuitry changes extensively," said psychiatrist Gary Small, director of UCLA's Memory & Aging Research Center.

The study, which will be published next month in the Journal of Geriatric Psychiatry, comes at a time when medical experts are forecasting that Alzheimer's cases will quadruple by 2050. In response to such projections, "brain-gyms" and memory-building computer programs have proliferated. The subjects in Small's nine-month study were 24 neurologically normal volunteers ages 55 to 76, with similar education levels. By focusing on older users, Small said, he aimed to fill a gap in brain research.