'Language Gene' Speeds Learning A mutation that appeared more than half a million years ago may have helped humans learn the complex muscle movements that are critical to speech and language. The claim stems from the finding that mice genetically engineered to produce the human form of the gene, called FOXP2, learn more quickly than their normal counterparts. The work was presented by Christiane Schreiweis, a neuroscientist at the Max Planck Institute (MPI) for Evolutionary Anthropology in Leipzig, Germany, at the Society for Neuroscience meeting in Washington DC this week. Water-cooler moments may owe their existence partly to a mutation that wires our brain for faster learning. Scientists discovered FOXP2 in the 1990s by studying a British family known as 'KE', in which three generations suffered from severe speech and language problems. Those with language problems were found to share an inherited mutation that inactivates one copy of FOXP2.
Life's Extremes: Math vs. Language | Dyscalculia & Dyslexia | IQ Scores & Verbal & Math Skills | Nature vs. Nurture In this weekly series, LiveScience examines the psychology and sociology of opposite human behavior and personality types. Do you know what "abecedarian" means? What about the solution to 250 x 11? Most people would agree they are better at either verbal or math subjects in school, as grades usually attest. (As an adjective, abecedarian refers to something relating to the alphabet; 2,750 is the solution to the equation.) These extremes in ability speak (or equate) to the very makeup of our brains. By learning more about the regions of our brains responsible for language and math processing, researchers hope to someday better help those with severe deficits, such as in reading ability, called dyslexia, and in general numeracy, called dyscalculia. Wordly wise Verbal ability — reading, writing and speaking — arises from across much of our brain, requiring key elements to harmonize. For the roughly 5 to 12 percent of the population with dyslexia, reading is fraught with difficulty. A head for numbers
Baby apes' arm waving hints at origins of language - life - 10 November 2011 Actions speak louder than words. Baby chimps, bonobos, gorillas and orang-utans – our four closest living relatives – quickly learn to use visual gestures to get their message across, providing the latest evidence that hand waving may have been a vital first step in the development of human language. After a long search for the origins of language in animal vocalisations, some evolutionary biologists have begun to change tack. The emerging "gesture theory" of language evolution has it that our ancestors' linguistic abilities may have begun with their hands rather than their vocal cords. Katja Liebal and colleagues at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, have found new evidence for the theory by studying how communication develops in our closest living relatives. Look at me "Given that purely visual gestures require more advanced social cognition, we would have expected them to appear later in the apes' lives," says Liebal. Got a point
The Dictionary of Made-Up Languages: From Elvish to Klingon, The Anwa, Reella, Ealray, Yeht (Real) Origins of Invented Lexicons (9781440528170): Stephen D. Rogers How to organize a dictionary of made-up languages Having studied translation, I can accept a Universal Translator as a necessary fictional device, and I am not against its use in fiction, but in reality a Universal Translator would be near-impossible to make. First, not every semantic unit is shared in every language — on Earth, when a language lacks a word for something and its speakers notice the gap, the word is often borrowed from another language. So between alien races, if one race has no concept of a particular thing, it would have no linguistic way to express it. Second, the lexicon (the total stock of words) of any language is huge and constantly evolving. Third, a universal translator would have to translate not only the sense of individual words, but the sense of words combined with other words. Would it be theoretically possible? Right now, we have a hard time creating computer programs able to discern the semantics of sentences in a single language.
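The lexical-gap and polysemy problems described above can be made concrete with a toy word-for-word translator. This is a hypothetical sketch — the vocabulary table and language pair are invented for illustration — showing how naive lexicon lookup both leaves gaps and silently picks a wrong sense:

```python
# Toy word-for-word "translator": a single lookup table per language pair.
# It illustrates two failure modes the article names: lexical gaps
# (words with no entry) and polysemy (one source word, several senses).
LEXICON = {
    ("en", "es"): {
        "the": "el",
        "bank": "banco",      # financial sense only; riverbank would be "orilla"
        "is": "es",           # word-for-word also ignores grammar ("está" is correct here)
        "closed": "cerrado",
    },
}

def translate(sentence, src, dst):
    """Translate word by word; mark words missing from the lexicon as <?word?>."""
    table = LEXICON[(src, dst)]
    return " ".join(table.get(w, f"<?{w}?>") for w in sentence.lower().split())

print(translate("The bank is closed", "en", "es"))       # el banco es cerrado
print(translate("The riverbank is closed", "en", "es"))  # el <?riverbank?> es cerrado
```

Even when every word has an entry, the output above is ungrammatical Spanish — translating "the sense of words grouped in other words," as the article puts it, requires far more than a dictionary.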
Babies understand grammar long before they learn how to speak My experience is that if I come across a new concept (in Russian or Norwegian), I have to deconstruct my own way of saying it, think through the logic while simultaneously not questioning the logic (or lack of it), and learn to say it. After time, you accept it. Russian especially, for English speakers. What started out some years ago as "Why and how the fuck do you say this? It cannot be done," changed over time to, "There's no other way to say it, and I can give reasons why." There's a well-defined critical period for language acquisition early in life, when new neurons are actually being formed in the child's brain that are shaped by their linguistic experience. I've taught English to learners of all ages, and the philosophy of teaching children and adults is very different: with adults, it's the opposite.
Generative semantics Generative semantics is the name of a research program within linguistics, initiated by the work of various early students of Noam Chomsky: John R. Ross, Paul Postal, and later James McCawley. George Lakoff was also instrumental in developing and advocating the theory. The approach developed out of transformational generative grammar in the mid-1960s, but stood largely apart from, and in opposition to, work by Noam Chomsky and his later students. A number of ideas from later work in generative semantics have been incorporated into cognitive linguistics, Head-Driven Phrase Structure Grammar (HPSG), Construction Grammar, and mainstream Chomskyan linguistics. History The nature and genesis of the program are a matter of some controversy and have been extensively debated. Throughout the late 1960s and 1970s, there were heated debates between generative semanticists and more orthodox Chomskyans.
Poop-Throwing Chimps Provide Hints of Human Origins | Wired Science Pick up an object that's close at hand. Throw it at something, or even someone (but gently, of course!). You've just reenacted what appears to be a pivotal stage in human evolution, when a propensity for projectiles shaped cognitive powers that later became language and symbolic thought. That, at least, is one hypothesis for how humans became so smart. And now researchers have found support in chimpanzees, among whom the ability to throw goes hand-in-hand with increased intelligence and brain development. "Imagine you're an early hominid throwing at a rabbit." In a study published in the January Philosophical Transactions of the Royal Society B, Hopkins and colleagues tracked several years' worth of throwing behaviors in captive chimpanzees. The researchers were especially interested in relationships between throwing, cognition and lateralization, or the way certain activities are concentrated in the left or right hemispheres of our brains.
A Brief Guide to Embodied Cognition: Why You Are Not Your Brain | Guest Blog Embodied cognition, the idea that the mind is not only connected to the body but that the body influences the mind, is one of the more counter-intuitive ideas in cognitive science. In sharp contrast is dualism, a theory of mind famously put forth by Rene Descartes in the 17th century when he claimed that "there is a great difference between mind and body, inasmuch as body is by nature always divisible, and the mind is entirely indivisible... the mind or soul of man is entirely different from the body." In the centuries that followed, the notion of the disembodied mind flourished. Cognitive science calls this entire philosophical worldview into serious question on empirical grounds... What exactly does this mean? Embodied cognition has a relatively short history. Lakoff was kind enough to field some questions over a recent phone conversation, where I learned about his interesting history first hand. Metaphors We Live By was a game changer.
Do thoughts have a language of their own? - opinion - 08 December 2011 What is the relationship between language and thought? The quest to create artificial intelligence may have come up with some unexpected answers. THE idea of machines that think and act as intelligently as humans can generate strong emotions. Chief among these advances is a form of logic called computational logic. According to one school of philosophy, our thoughts have a language-like structure that is independent of natural language: this is what students of language call the language of thought (LOT) hypothesis. The LOT hypothesis contrasts with the mildly contrary view that human thinking is actually conducted in natural language, and thus we could not think intelligently without it. Research in AI lends little support to the first view, and some support to the second. Using this guide we can then try to express ourselves in a form of natural language that is closer to the LOT. Take a sign designed for London's underground train system:
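The "computational logic" the article invokes can be sketched as simple if-then rules plus forward chaining. This is a minimal illustration, not the article's own system; the rules paraphrase a London Underground emergency notice, and since the article's example is truncated here, the exact wording is assumed:

```python
# A minimal propositional forward-chaining sketch of computational logic
# as a candidate "language of thought". Each rule is (conditions, conclusion).
RULES = [
    ({"emergency"}, "press_alarm"),                            # In an emergency, press the alarm button.
    ({"press_alarm"}, "driver_alerted"),                       # Pressing the alarm alerts the driver.
    ({"driver_alerted", "train_in_station"}, "driver_stops"),  # The driver stops if the train is in a station.
]

def forward_chain(facts):
    """Repeatedly apply rules until no new conclusions can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in RULES:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

derived = forward_chain({"emergency", "train_in_station"})
print("driver_stops" in derived)  # the whole chain of rules fires
```

The point of casting the sign in this form is the article's closing suggestion: natural-language instructions become easier to follow (for people and machines alike) when they are phrased close to this rule-like structure.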