
Lewis Mumford
Lewis Mumford, KBE (October 19, 1895 – January 26, 1990) was an American historian, sociologist, philosopher of technology, and literary critic. Particularly noted for his study of cities and urban architecture, he had a broad career as a writer. Mumford was influenced by the work of the Scottish theorist Sir Patrick Geddes and worked closely with Geddes's associate, the British sociologist Victor Branford. Life: Mumford was born in Flushing, Queens, New York, and graduated from Stuyvesant High School in 1912.[2] He studied at the City College of New York and The New School for Social Research, but became ill with tuberculosis and never finished his degree. Mumford's earliest books in the field of literary criticism have had a lasting impact on contemporary American literary criticism. In his early writings on urban life, Mumford was optimistic about human abilities and wrote that the human race would use electricity and mass communication to build a better world for all humankind.

Research shows that Internet is rewiring our brains / UCLA Today The generation gap has been upgraded. In a world brimming with ever-advancing technology, the generations are now separated by a "brain gap" between young "digital natives" and older "digital immigrants," according to Dr. Gary Small, director of UCLA's Memory and Aging Research Center at the Semel Institute for Neuroscience and Human Behavior and holder of UCLA's Parlow-Solomon Chair on Aging. "We know that technology is changing our lives. It's also changing our brains," Small said during a recent Open Mind lecture for the Friends of the Semel Institute, a group that supports the institute's work in researching and developing treatments for illnesses of the mind and brain. Small's talk centered on his recently published book, "iBrain: Surviving the Technological Alteration of the Modern Mind." The human brain is malleable, always changing in response to the environment, Small said.

Technics and Civilization Technics and Civilization is a 1934 book by the American philosopher and historian of technology Lewis Mumford. The book presents the history of technology and its role in shaping, and being shaped by, civilizations. According to Mumford, modern technology has its roots in the Middle Ages rather than in the Industrial Revolution. Background: Apart from its significance as a monumental work of scholarship spanning several disciplines, the book was explicitly positioned by Mumford as a call to action for the human race to consider its options in the face of the threats to its survival posed by possible ecological catastrophe or industrialised warfare. Synopsis: Mumford divides the development of technology into three overlapping phases: eotechnic, paleotechnic and neotechnic.[1] The first phase of technically civilized life (AD 1000 to 1800) begins with the clock, which Mumford considered the most important basis for the development of capitalism, because time thereby becomes fungible (and thus transferable).

Joseph Weizenbaum Joseph Weizenbaum (8 January 1923 – 5 March 2008) was a German and American computer scientist and a professor emeritus at MIT. The Weizenbaum Award is named after him. Life and career: Born in Berlin, Germany, to Jewish parents, he escaped Nazi Germany in January 1936, emigrating with his family to the United States. Around 1952, as a research assistant at Wayne State University, Weizenbaum worked on analog computers and helped create a digital computer. His influential 1976 book Computer Power and Human Reason displays his ambivalence towards computer technology and lays out his case: while artificial intelligence may be possible, we should never allow computers to make important decisions, because computers will always lack human qualities such as compassion and wisdom. Weizenbaum was the creator of the SLIP programming language. In 1996, Weizenbaum moved back to Berlin and lived in the vicinity of his childhood neighborhood.[5][2] Weizenbaum was reportedly buried at the Jewish Cemetery in Berlin.

Artificial Intelligence AI research is highly technical and specialized, and is deeply divided into subfields that often fail to communicate with each other.[5] Some of the division is due to social and cultural factors: subfields have grown up around particular institutions and the work of individual researchers. AI research is also divided by several technical issues. Some subfields focus on the solution of specific problems; others focus on one of several possible approaches, on the use of a particular tool, or on particular applications. The central problems (or goals) of AI research include reasoning, knowledge, planning, learning, natural language processing (communication), perception, and the ability to move and manipulate objects.[6] General intelligence is still among the field's long-term goals.[7] Currently popular approaches include statistical methods, computational intelligence, and traditional symbolic AI.

Forget IQ, Collective Intelligence is the New Measure of Smart (video) We may focus on the stories of individual genius, but it will be harnessing the intelligence of the collective that enables humanity to solve its future problems. Do you know your IQ, that little number that’s supposed to measure how smart you are? Forget it. Collective intelligence can include distributed computing. Another reason why CI will dominate IQ is that individual intelligence is subsumed by the collective. To this end, the CCI at MIT is working to understand and guide collective intelligence. Collective intelligence can also take the form of collective art or creativity. Kim Ung-yong might be the world’s smartest man; his IQ is reportedly 210. [sources: Indiana University, CCI at MIT]

Proust and the Squid: The Story and Science of the Reading Brain Neuroplasticity Contrary to conventional thought, brain functions are not confined to certain fixed locations. Neuroplasticity, also known as brain plasticity, is an umbrella term that encompasses both synaptic plasticity and non-synaptic plasticity; it refers to changes in neural pathways and synapses due to changes in behavior, environment, and neural processes, as well as changes resulting from bodily injury.[1] Neuroplasticity has replaced the formerly held position that the brain is a physiologically static organ, and explores how, and in which ways, the brain changes throughout life.[2] Neuroplasticity occurs on a variety of levels, ranging from cellular changes due to learning to large-scale changes involved in cortical remapping in response to injury. The role of neuroplasticity is widely recognized in healthy development, learning, memory, and recovery from brain damage.

Wikinews and Multiperspectival Reporting | MIT Center for Future Civic Media Wikinews is a wiki in which users write news articles collaboratively. The project, established in 2004, is run by the Wikimedia Foundation, the organization that also supports Wikipedia. Wikinews has produced over 37,000 articles in 22 languages, with roughly one quarter of those in the English-language version of the site. Comparing Wikinews to other “participatory” news sites such as OhmyNews and Indymedia, Axel Bruns contrasted “multiperspectival coverage of the news” with the Wikinews collaborative model. Other commentators have also blamed NPOV and its consensus requirement for Wikinews’ travails. An example from Wikinews illustrates this point. Aaron’s article was quickly and strongly attacked by several other editors on the site. Aaron’s selection of article topic was politically motivated. Google News currently aggregates news from more than 4,500 English-language news sources.

Computer Power and Human Reason - Wikipedia, the free encyclopedia Joseph Weizenbaum's influential 1976 book Computer Power and Human Reason: From Judgment to Calculation (San Francisco: W. H. Freeman, 1976; ISBN 0-7167-0463-3) displays his ambivalence towards computer technology and lays out his case: while artificial intelligence may be possible, we should never allow computers to make important decisions, because computers will always lack human qualities such as compassion and wisdom. Weizenbaum makes the crucial distinction between deciding and choosing. Comments printed on the back cover illustrate how Weizenbaum's commentary and insights were received by the intelligentsia of the time: "Dare I say it?" (Keith Oakley, Psychology Today); "A thoughtful blend of insight, experience, anecdote, and passion that will stand for a long time as the definitive integration of technological and human thought." (American Mathematical Monthly); and further praise from Theodore Roszak in The Nation.

Anthropology Anthropology /ænθrɵˈpɒlədʒi/ is the study of humankind, past and present,[1][2] that draws and builds upon knowledge from the social and biological sciences, as well as the humanities and the natural sciences.[3][4] Since the work of Franz Boas and Bronisław Malinowski in the late 19th and early 20th centuries, anthropology in Great Britain and the US has been distinguished from ethnology[5] and from other social sciences by its emphasis on cross-cultural comparisons, long-term in-depth examination of context, and the importance it places on participant-observation or experiential immersion in the area of research. Origin of the term: The term anthropology originates from the Greek anthrōpos (ἄνθρωπος), "human being" (understood to mean humankind or humanity), and -λογία (-logia), "study."

Canadian Internet Policy and Public Interest Clinic (CIPPIC): Extract from "Computer Power and Human Reason" That last posting jogged my memory, and I dug out the following text that has been lying around, forgotten, in my filesystem for almost ten years. The following extract is taken from a chapter by Joseph Weizenbaum that originally appeared in his book "Computer Power and Human Reason". I came across it in a book that I am currently reading: Computerization and Controversy: Value Conflicts and Social Choices, edited by Charles Dunlop and Rob Kling, published by Academic Press, Inc. Weizenbaum's chapter is entitled "Against the Imperialism of Instrumental Reason" and appears in the section on Ethical Perspectives and Professional Responsibilities. I would recommend the book to every computer scientist. In particular, the following extract struck a few chords with me. It happens that programming is a relatively easy craft to learn. Unfortunately, many universities have "computer science" programs at the undergraduate level that permit and even encourage students to take this course.

Educational psychology Educational psychology is the study of human learning. The study of learning processes, both cognitive and affective, allows researchers to understand individual differences in behavior, personality, intellect, and self-concept. The field of educational psychology relies heavily on testing, measurement, assessment, evaluation, and training to enhance educational activities and learning processes.[1] This can involve studying instructional processes within the classroom setting. Educational psychology can in part be understood through its relationship with other disciplines. The field of educational psychology involves the study of memory, conceptual processes, and individual differences (via cognitive psychology) in conceptualizing new strategies for learning processes in humans. History: Educational psychology is a fairly new and growing field of study; it dates back to the time of Aristotle and Plato.

Is Google Making Us Stupid? - The Atlantic (July/August 2008) "Dave, stop. Stop, will you? Stop, Dave." I can feel it, too. I think I know what’s going on. For me, as for others, the Net is becoming a universal medium, the conduit for most of the information that flows through my eyes and ears and into my mind. I’m not the only one. Bruce Friedman, who blogs regularly about the use of computers in medicine, has also described how the Internet has altered his mental habits. Anecdotes alone don’t prove much. It is clear that users are not reading online in the traditional sense; indeed, there are signs that new forms of “reading” are emerging as users “power browse” horizontally through titles, contents pages, and abstracts, going for quick wins. Thanks to the ubiquity of text on the Internet, not to mention the popularity of text-messaging on cell phones, we may well be reading more today than we did in the 1970s or 1980s, when television was our medium of choice. Reading, explains Wolf, is not an instinctive skill for human beings.
