
Lewis Mumford
Lewis Mumford, KBE (October 19, 1895 – January 26, 1990) was an American historian, sociologist, philosopher of technology, and literary critic. Particularly noted for his study of cities and urban architecture, he had a broad career as a writer. Mumford was influenced by the work of the Scottish theorist Sir Patrick Geddes and worked closely with Geddes's associate, the British sociologist Victor Branford. Mumford was born in Flushing, Queens, New York, and graduated from Stuyvesant High School in 1912.[2] He studied at the City College of New York and The New School for Social Research, but became ill with tuberculosis and never finished his degree. Mumford's earliest books, works of literary criticism, have had a lasting impact on contemporary American criticism. In his early writings on urban life, he was optimistic about human abilities and wrote that the human race would use electricity and mass communication to build a better world for all humankind.

Research shows that Internet is rewiring our brains / UCLA Today The generation gap has been upgraded. In a world brimming with ever-advancing technology, the generations are now separated by a "brain gap" between young "digital natives" and older "digital immigrants," according to Dr. Gary Small, director of UCLA's Memory and Aging Research Center at the Semel Institute for Neuroscience and Human Behavior and holder of UCLA's Parlow-Solomon Chair on Aging. "We know that technology is changing our lives. It's also changing our brains," Small said during a recent Open Mind lecture for the Friends of the Semel Institute, a group that supports the institute's work in researching and developing treatments for illnesses of the mind and brain. Small's talk centered on his recently published book, "iBrain: Surviving the Technological Alteration of the Modern Mind." The human brain is malleable, always changing in response to the environment, Small said.

Joseph Weizenbaum Joseph Weizenbaum (8 January 1923 – 5 March 2008) was a German and American computer scientist and a professor emeritus at MIT. The Weizenbaum Award is named after him. Born in Berlin, Germany, to Jewish parents, he escaped Nazi Germany in January 1936, emigrating with his family to the United States. Around 1952, as a research assistant at Wayne State University, Weizenbaum worked on analog computers and helped create a digital computer. His influential 1976 book Computer Power and Human Reason displays his ambivalence towards computer technology and lays out his case: while artificial intelligence may be possible, we should never allow computers to make important decisions, because computers will always lack human qualities such as compassion and wisdom. Weizenbaum was the creator of the SLIP programming language. In 1996, Weizenbaum moved back to Berlin and lived in the vicinity of his childhood neighborhood.[5][2] Weizenbaum was reportedly buried at the Jewish Cemetery in Berlin.

Artificial Intelligence AI research is highly technical and specialized, and is deeply divided into subfields that often fail to communicate with each other.[5] Some of the division is due to social and cultural factors: subfields have grown up around particular institutions and the work of individual researchers. AI research is also divided by several technical issues. Some subfields focus on the solution of specific problems; others focus on one of several possible approaches, on the use of a particular tool, or on particular applications. The central problems (or goals) of AI research include reasoning, knowledge, planning, learning, natural language processing (communication), perception, and the ability to move and manipulate objects.[6] General intelligence is still among the field's long-term goals.[7] Currently popular approaches include statistical methods, computational intelligence, and traditional symbolic AI.

Proust and the Squid: The Story and Science of the Reading Brain Neuroplasticity Contrary to conventional thought, brain functions are not confined to certain fixed locations. Neuroplasticity, also known as brain plasticity, is an umbrella term that encompasses both synaptic plasticity and non-synaptic plasticity; it refers to changes in neural pathways and synapses due to changes in behavior, environment, and neural processes, as well as changes resulting from bodily injury.[1] Neuroplasticity has replaced the formerly held position that the brain is a physiologically static organ, and explores how the brain changes throughout life.[2] Neuroplasticity occurs on a variety of levels, ranging from cellular changes due to learning to large-scale changes involved in cortical remapping in response to injury. The role of neuroplasticity is widely recognized in healthy development, learning, memory, and recovery from brain damage.

Computer Power and Human Reason Joseph Weizenbaum's influential 1976 book Computer Power and Human Reason: From Judgment To Calculation (San Francisco: W. H. Freeman, 1976; ISBN 0-7167-0463-3) displays his ambivalence towards computer technology and lays out his case: while artificial intelligence may be possible, we should never allow computers to make important decisions because computers will always lack human qualities such as compassion and wisdom. Weizenbaum makes the crucial distinction between deciding and choosing. Comments printed on the back cover illustrate how Weizenbaum's commentary and insights were received by the intelligentsia of the time: "Dare I say it?" — Keith Oakley, Psychology Today; "A thoughtful blend of insight, experience, anecdote, and passion that will stand for a long time as the definitive integration of technological and human thought." — American Mathematical Monthly. A further endorsement is credited to Theodore Roszak, writing in The Nation.

Anthropology Anthropology /ænθrɵˈpɒlədʒi/ is the study of humankind, past and present,[1][2] that draws and builds upon knowledge from the social and biological sciences, as well as the humanities and the natural sciences.[3][4] Since the work of Franz Boas and Bronisław Malinowski in the late 19th and early 20th centuries, anthropology in Great Britain and the US has been distinguished from ethnology[5] and from other social sciences by its emphasis on cross-cultural comparisons, long-term in-depth examination of context, and the importance it places on participant-observation or experiential immersion in the area of research. In those European countries that did not have overseas colonies, ethnology (a term coined and defined by Adam F. Kollár in 1783) was the more widely used term. The term anthropology originates from the Greek anthrōpos (ἄνθρωπος), "human being" (understood to mean humankind or humanity), and -λογία -logia, "study."

Extract from "Computer Power and Human Reason" That last posting jogged my memory and I dug out the following text that has been lying around forgotten in my filesystem for almost ten years. The following extract is taken from a chapter by Joseph Weizenbaum that originally appeared in his book "Computer Power and Human Reason". I came across it in a book that I am currently reading: Computerization and Controversy: Value Conflicts and Social Choices, edited by Charles Dunlop and Rob Kling, published by Academic Press, Inc. Weizenbaum's chapter is entitled "Against the Imperialism of Instrumental Reason" and appears in the section on Ethical Perspectives and Professional Responsibilities. I would recommend the book to every computer scientist. In particular, the following extract struck a few chords with me. It happens that programming is a relatively easy craft to learn. Unfortunately, many universities have "computer science" programs at the undergraduate level that permit and even encourage students to take this course.

Educational psychology Educational psychology is the study of human learning. The study of learning processes, both cognitive and affective, allows researchers to understand individual differences in behavior, personality, intellect, and self-concept. The field of educational psychology relies heavily on testing, measurement, assessment, evaluation, and training to enhance educational activities and learning processes.[1] This can involve studying instructional processes within the classroom setting. Educational psychology can in part be understood through its relationship with other disciplines. The field involves the study of memory, conceptual processes, and individual differences (via cognitive psychology) in conceptualizing new strategies for learning processes in humans. Educational psychology is a fairly new and growing field of study, though its concerns date back to the time of Aristotle and Plato.

Is Google Making Us Stupid? - The Atlantic (July/August 2008) "Dave, stop. Stop, will you? Stop, Dave." I can feel it, too. I think I know what’s going on. For me, as for others, the Net is becoming a universal medium, the conduit for most of the information that flows through my eyes and ears and into my mind. I’m not the only one. Bruce Friedman, who blogs regularly about the use of computers in medicine, also has described how the Internet has altered his mental habits. Anecdotes alone don’t prove much. It is clear that users are not reading online in the traditional sense; indeed, there are signs that new forms of “reading” are emerging as users “power browse” horizontally through titles, contents pages, and abstracts, going for quick wins. Thanks to the ubiquity of text on the Internet, not to mention the popularity of text-messaging on cell phones, we may well be reading more today than we did in the 1970s or 1980s, when television was our medium of choice. Reading, explains Wolf, is not an instinctive skill for human beings.

Computer science Computer science deals with the theoretical foundations of information and computation, together with practical techniques for the implementation and application of these foundations. The earliest foundations of what would become computer science predate the invention of the modern digital computer. Machines for calculating fixed numerical tasks, such as the abacus, have existed since antiquity, aiding in computations such as multiplication and division. Further, algorithms for performing computations have existed since antiquity, even before sophisticated computing equipment was created. The ancient Sanskrit treatise Shulba Sutras, or "Rules of the Chord", is a book of algorithms written in 800 BCE for constructing geometric objects like altars using a peg and chord, an early precursor of the modern field of computational geometry. Time has seen significant improvements in the usability and effectiveness of computing technology.

Class explores the good and evil of Google / UCLA Today It's one thing to hear Professor Todd Presner's shrewd insights into how Google is changing the world. It's another to sit in on his undergrad class, "The Googlization of Everything," and hear the surprising attitudes of his students, most of whom were born in 1990. "It doesn't matter if Google is making us stupid," said one student, before giving a rationale for her alarming comment. "A hunter might say that a grocery store makes us stupid, because we no longer know how to hunt, but if the grocery store means we don't need to hunt, then it doesn't really matter." Presner, an associate professor of Germanic Languages who has won accolades and awards for digital innovation, understands why his students think such things. "These are students who have always known the Internet." With a room full of students who've hardly known life without search engines, cell phones, and blogs, his class zeroes in on Google, possibly the biggest influence of all. Google is dangerous, Presner insists.

Cognitive science Cognitive science is the interdisciplinary scientific study of the mind and its processes.[1] It examines what cognition is, what it does, and how it works. It includes research on intelligence and behavior, especially focusing on how information is represented, processed, and transformed (in faculties such as perception, language, memory, reasoning, and emotion) within nervous systems (human or other animal) and machines (e.g. computers). Cognitive science consists of multiple research disciplines, including psychology, artificial intelligence, philosophy, neuroscience, linguistics, and anthropology.[2] It spans many levels of analysis, from low-level learning and decision mechanisms to high-level logic and planning; from neural circuitry to modular brain organization. Cognitive science is a large field, and covers a wide array of topics on cognition.

Brain Rules: Brain development for parents, teachers and business leaders
