Learning Science

'Electronic telepathy' is around the corner

Along with all the hubbub about the victory of IBM's Watson supercomputer over human competitors on Jeopardy, there has been a lot of talk about the "Singularity," in which computers move closer to matching the intelligence of humans and humans embed more technology to augment their own bodies. So the timing couldn't have been better for the release of Michael Chorost's new book World Wide Mind: The Coming Integration of Humanity, Machines, and the Internet. Chorost proposes that technology is evolving to the point that it will bring our collective thinking and intellectual capacity into a kind of global "hive mind." Technology, he posits, may augment neural processes and thus enable "electronic telepathy," or digital communication between human minds.

He has some personal experience with this convergence: he wears cochlear implants, computer chips that send sound information directly to the brain and enable him to hear.

Brain Rules: Brain development for parents, teachers and business leaders

Anthropology

Anthropology (/ænθrɵˈpɒlədʒi/) is the study of humankind, past and present,[1][2] that draws and builds upon knowledge from the social and biological sciences, as well as the humanities and the natural sciences.[3][4] Since the work of Franz Boas and Bronisław Malinowski in the late 19th and early 20th centuries, anthropology in Great Britain and the US has been distinguished from ethnology[5] and from other social sciences by its emphasis on cross-cultural comparisons, long-term in-depth examination of context, and the importance it places on participant-observation or experiential immersion in the area of research.

In those European countries that did not have overseas colonies, ethnology (a term coined and defined by Adam F. Kollár) had more currency. The term anthropology originates from the Greek anthrōpos (ἄνθρωπος), "human being" (understood to mean humankind or humanity), and -λογία (-logia), "study."

Educational psychology

Educational psychology is the study of human learning.

The study of learning processes, both cognitive and affective, allows researchers to understand individual differences in behavior, personality, intellect, and self-concept. The field of educational psychology relies heavily on testing, measurement, assessment, evaluation, and training to enhance educational activities and learning processes.[1] This can involve studying instructional processes within the classroom setting.

Educational psychology can in part be understood through its relationship with other disciplines. It is informed primarily by psychology, bearing a relationship to that discipline analogous to the relationship between medicine and biology. It is also informed by neuroscience. The field of educational psychology involves the study of memory, conceptual processes, and individual differences (via cognitive psychology) in conceptualizing new strategies for learning processes in humans.

Computer science

Computer science deals with the theoretical foundations of information and computation, together with practical techniques for the implementation and application of these foundations. The earliest foundations of what would become computer science predate the invention of the modern digital computer.

Machines for calculating fixed numerical tasks such as the abacus have existed since antiquity, aiding in computations such as multiplication and division. Further, algorithms for performing computations have existed since antiquity, even before sophisticated computing equipment was created. The ancient Sanskrit treatise Shulba Sutras, or "Rules of the Chord," is a book of algorithms written around 800 BCE for constructing geometric objects like altars using a peg and chord, an early precursor of the modern field of computational geometry.
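As a concrete illustration of how old such numerical recipes are: the Baudhayana Shulba Sutra's rule for the diagonal of a square amounts to approximating the square root of 2 as 1 + 1/3 + 1/(3*4) - 1/(3*4*34). The short sketch below (Python is used here purely for illustration; it is not part of the texts quoted above) compares that ancient approximation with a modern floating-point square root.

```python
# Illustrative sketch: the Shulba Sutra approximation of sqrt(2)
# compared against the modern library value.
import math

shulba_sqrt2 = 1 + 1/3 + 1/(3 * 4) - 1/(3 * 4 * 34)
modern_sqrt2 = math.sqrt(2)

print(shulba_sqrt2)                      # 1.4142156862745099
print(modern_sqrt2)                      # 1.4142135623730951
print(abs(shulba_sqrt2 - modern_sqrt2))  # roughly 2.1e-06
```

The ancient rule agrees with the true value to about five decimal places, which is why it is often cited as an early worked algorithm.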

Time has seen significant improvements in the usability and effectiveness of computing technology.

Artificial Intelligence

AI research is highly technical and specialized, and is deeply divided into subfields that often fail to communicate with each other.[5] Some of the division is due to social and cultural factors: subfields have grown up around particular institutions and the work of individual researchers. AI research is also divided by several technical issues. Some subfields focus on the solution of specific problems; others focus on one of several possible approaches, on the use of a particular tool, or on the accomplishment of particular applications. The central problems (or goals) of AI research include reasoning, knowledge, planning, learning, natural language processing (communication), perception, and the ability to move and manipulate objects.[6] General intelligence is still among the field's long-term goals.[7] Currently popular approaches include statistical methods, computational intelligence, and traditional symbolic AI.
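To make "traditional symbolic AI" a little more concrete, here is a minimal sketch of forward-chaining inference over hand-written if-then rules. The facts and rules are invented for this example and are not drawn from any system mentioned above.

```python
# Illustrative only: a tiny forward-chaining rule engine in the spirit of
# traditional symbolic AI. Facts and rules below are made up for the example.

def forward_chain(facts, rules):
    """Apply (premises -> conclusion) rules until no new facts can be derived."""
    known = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= known and conclusion not in known:
                known.add(conclusion)
                changed = True
    return known

rules = [
    (["has_feathers", "lays_eggs"], "is_bird"),
    (["is_bird", "can_swim"], "is_waterfowl"),
]

print(forward_chain(["has_feathers", "lays_eggs", "can_swim"], rules))
# {'has_feathers', 'lays_eggs', 'can_swim', 'is_bird', 'is_waterfowl'}
```

Statistical methods, by contrast, would learn such regularities from data rather than encode them as explicit rules.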

KurzweilAI.net

Lewis Mumford

Lewis Mumford, KBE (October 19, 1895 – January 26, 1990) was an American historian, sociologist, philosopher of technology, and literary critic. Particularly noted for his study of cities and urban architecture, he had a broad career as a writer. Mumford was influenced by the work of the Scottish theorist Sir Patrick Geddes and worked closely with his associate, the British sociologist Victor Branford.

Mumford was born in Flushing, Queens, New York, and graduated from Stuyvesant High School in 1912.[2] He studied at the City College of New York and The New School for Social Research, but became ill with tuberculosis and never finished his degree.

In 1918 he joined the navy to serve in World War I and was assigned as a radio electrician.[1][3] He was discharged in 1919 and became associate editor of The Dial, an influential modernist literary journal. He later worked for The New Yorker, where he wrote architectural criticism and commentary on urban issues.

Cognitive science

Cognitive science is the interdisciplinary scientific study of the mind and its processes.[1] It examines what cognition is, what it does, and how it works. It includes research on intelligence and behavior, especially focusing on how information is represented, processed, and transformed (in faculties such as perception, language, memory, reasoning, and emotion) within nervous systems (human or other animal) and machines (e.g. computers).

Cognitive science consists of multiple research disciplines, including psychology, artificial intelligence, philosophy, neuroscience, linguistics, and anthropology.[2] It spans many levels of analysis, from low-level learning and decision mechanisms to high-level logic and planning, and from neural circuitry to modular brain organization. The fundamental concept of cognitive science is "that thinking can best be understood in terms of representational structures in the mind and computational procedures that operate on those structures."[2]

Neuroplasticity

Contrary to conventional thought, brain functions are not confined to certain fixed locations. Neuroplasticity, also known as brain plasticity, is an umbrella term that encompasses both synaptic plasticity and non-synaptic plasticity; it refers to changes in neural pathways and synapses due to changes in behavior, environment, and neural processes, as well as changes resulting from bodily injury.[1] Neuroplasticity has replaced the formerly held position that the brain is a physiologically static organ, and explores how, and in which ways, the brain changes throughout life.[2] Neuroplasticity occurs on a variety of levels, ranging from cellular changes due to learning to large-scale changes involved in cortical remapping in response to injury.

The role of neuroplasticity is widely recognized in healthy development, learning, memory, and recovery from brain damage.

Research shows that Internet is rewiring our brains / UCLA Today

The generation gap has been upgraded. In a world brimming with ever-advancing technology, the generations are now separated by a "brain gap" between young "digital natives" and older "digital immigrants," according to Dr. Gary Small, director of UCLA's Memory and Aging Research Center at the Semel Institute for Neuroscience and Human Behavior and UCLA's Parlow-Solomon Chair on Aging.

"We know that technology is changing our lives. It's also changing our brains," Small said during a recent Open Mind lecture for the Friends of the Semel Institute, a group that supports the institute's work in researching and developing treatment for illnesses of the mind and brain. Small's talk centered around his recently published book, "iBrain: Surviving the Technological Alteration of the Modern Mind. " The human brain is malleable, always changing in response to the environment, Small said. Photo.