Cyc
The project was started in 1984 by Douglas Lenat at MCC and is developed by the Cycorp company. Parts of the project are released as OpenCyc, which provides an API, RDF endpoint, and data dump under an open source license.

Overview
The project began in 1984 as part of the Microelectronics and Computer Technology Corporation. The name "Cyc" (from "encyclopedia", pronounced [saɪk], like "syke") is a registered trademark owned by Cycorp. Typical pieces of knowledge represented in the database are "Every tree is a plant" and "Plants die eventually". Much of the current work on the Cyc project continues to be knowledge engineering: representing facts about the world by hand and implementing efficient inference mechanisms on that knowledge. Like many companies, Cycorp has ambitions to use the Cyc natural language understanding tools to parse the entire internet and extract structured data.

Knowledge base
The concept names in Cyc are known as constants.
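The knowledge-engineering idea above, hand-asserted facts plus inference over them, can be made concrete with a toy sketch. Everything here (class names, the tiny taxonomy) is invented for illustration; real Cyc assertions are written in CycL and run against a far richer inference engine.

```python
# Illustrative toy only: a minimal taxonomy with transitive "is-a" inference,
# loosely in the spirit of assertions such as "Every tree is a plant".
from collections import defaultdict

class TinyKB:
    def __init__(self):
        self.parents = defaultdict(set)  # e.g. Tree -> {Plant}

    def assert_isa(self, child, parent):
        """Record a hand-engineered assertion like 'Every tree is a plant'."""
        self.parents[child].add(parent)

    def isa(self, child, parent):
        """Transitive closure: does 'child' inherit from 'parent'?"""
        frontier, seen = [child], set()
        while frontier:
            c = frontier.pop()
            if c == parent:
                return True
            if c in seen:
                continue
            seen.add(c)
            frontier.extend(self.parents[c])
        return False

kb = TinyKB()
kb.assert_isa("Tree", "Plant")
kb.assert_isa("Plant", "LivingThing")
print(kb.isa("Tree", "LivingThing"))  # True, via Tree -> Plant -> LivingThing
```

The point of the sketch is only the division of labor the entry describes: facts are entered by hand, and an inference mechanism derives consequences that were never asserted directly.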
Related: Cognitive Architecture
ACT-R

Inspiration
Most of ACT-R's basic assumptions are also inspired by progress in cognitive neuroscience, and ACT-R can be seen and described as a way of specifying how the brain itself is organized such that individual processing modules can produce cognition.

What ACT-R looks like
Any researcher may download the ACT-R code from the ACT-R website, load it into a Lisp distribution, and gain full access to the theory in the form of the ACT-R interpreter. This also enables researchers to specify models of human cognition as scripts in the ACT-R language. Like a programming language, ACT-R is a framework: for different tasks (e.g., Tower of Hanoi, memory for text or for lists of words, language comprehension, communication, aircraft control), researchers create "models" (i.e., programs) in ACT-R.

Brief outline
There are two types of modules (perceptual-motor modules and memory modules); all modules can be accessed only through their buffers.
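The buffer constraint described above can be sketched in a few lines. This is a loose conceptual illustration, not ACT-R (which is a Lisp system with a detailed theory of chunks, activation, and timing); the module and slot names here are invented.

```python
# Toy sketch of the buffer idea: a module exposes a single buffer, and a
# "production" (an if-then rule) matches and acts only on buffer contents.
class Module:
    def __init__(self):
        self.buffer = None  # the only externally visible state

goal, retrieval = Module(), Module()
goal.buffer = {"task": "add", "a": 2, "b": 3}

def fire_add(goal_mod, retrieval_mod):
    """A production: if the goal buffer requests an addition, perform it."""
    g = goal_mod.buffer
    if g and g.get("task") == "add":
        retrieval_mod.buffer = {"sum": g["a"] + g["b"]}
        g["task"] = "done"

fire_add(goal, retrieval)
print(retrieval.buffer)  # {'sum': 5}
```

The design point mirrors the text: modules never touch each other's internals; all coordination happens through the small, typed contents of buffers.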
Organization theory (Castells)
The theory of the Information Age is deeply rooted in organization theory. This may come as a surprise, since Manuel Castells is perhaps more readily associated with the study of the Internet, cities and regions, or social movements. Two points can be made about the parallels with organization theory. First, Castells sees himself as picking up Max Weber's mantle, both in his use of historical sociology and in his style of theory. Network researchers (see social networks) do not seem to have adopted Castells' framework.
Copycat (software)
Copycat is a model of analogy making and human cognition based on the concept of the parallel terraced scan, developed in 1988 by Douglas Hofstadter, Melanie Mitchell, and others at the Center for Research on Concepts and Cognition, Indiana University Bloomington. The original Copycat was written in Common Lisp and has bitrotted (it relies on now-outdated graphics libraries); however, a Java port exists. Copycat produces answers to such problems as "abc is to abd as ijk is to what?" (abc:abd :: ijk:?). Hofstadter and Mitchell consider analogy making to be the core of high-level cognition, or high-level perception, as Hofstadter calls it, basic to recognition and categorization. High-level perception emerges from the spreading activity of many independent processes, called codelets, running in parallel, competing or cooperating. Copycat is Hofstadter's most popular model.
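To make the problem format concrete, here is a deliberately naive solver for letter-string analogies like abc:abd :: ijk:?. It hard-codes one rule ("replace the last letter with its successor") that I chose for this sketch; Copycat itself discovers such rules dynamically through many competing codelets, which this toy does not model at all.

```python
# Toy letter-string analogy solver: recognizes a single transformation rule.
def successor(ch):
    """Next letter in the alphabet (no wraparound handling in this toy)."""
    return chr(ord(ch) + 1)

def solve(source, target, probe):
    # If target is source with its last letter replaced by its successor,
    # apply the same transformation to the probe string.
    if target == source[:-1] + successor(source[-1]):
        return probe[:-1] + successor(probe[-1])
    raise ValueError("rule not recognized by this toy")

print(solve("abc", "abd", "ijk"))  # ijl
```

The gap between this sketch and Copycat is the interesting part: Copycat entertains many candidate mappings at once and lets them compete, which is why it can answer problems where the "obvious" rule breaks down.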
virtuallythere - Cognitive Domain (DIKW)
We have already made a distinction between learning as a process and learning as a product, that is, learning understood in terms of the outcome of the learning process. In this section we are concerned with learning as a product. More specifically, we are concerned with the various components of the cognitive domain. Cognition in its broadest sense refers both to the process of knowing and to the product of such a process. This means that cognition is not just a matter of knowing or of having knowledge.

2.1 Do We Really Want to Produce Wise Students?
The inclusion of wisdom in the process of learning might strike some educators as a little odd, but as Bruner pointed out in "The Process of Education", education is about more than learning.

6.1 Reasons for Revising the Taxonomy

6.2 The Knowledge Domain
Knowledge is not represented in the revised framework.

6.3 The Cognitive Processes
The revised domains can be explained as follows (Krathwohl, 2002).

6.4 The Knowledge Dimension and the Cognitive Processes
Why Cognition-as-a-Service is the next operating system battlefield
The Semantic Web may have failed, but higher intelligence is coming to applications anyway, in another form: Cognition-as-a-Service (CaaS). And this may just be the next evolution of the operating system. CaaS will enable every app to become as smart as Siri in its own niche. For example, your calendar will become a cognitive app: it will be able to intelligently interact with you to help you manage your time and scheduling like a personal assistant would, but the actual artificial intelligence that powers it will come from a third-party, cloud-based cognitive platform. Cognitive apps will not be as intelligent as humans anytime soon, and they probably will not be anything like the 20th-century idea of humanoid robots.

Cognition in the clouds
The key is that the intelligence that powers cognitive apps will come from cloud-based platforms that host their brains; the apps themselves won't really have to be that smart on their own.

The new OS battle
Everything is going to get smarter.
The Stanford NLP (Natural Language Processing) Group
A natural language parser is a program that works out the grammatical structure of sentences, for instance, which groups of words go together (as "phrases") and which words are the subject or object of a verb. Probabilistic parsers use knowledge of language gained from hand-parsed sentences to try to produce the most likely analysis of new sentences. These statistical parsers still make some mistakes, but commonly work rather well. Their development was one of the biggest breakthroughs in natural language processing in the 1990s. You can try out our parser online.

Package contents
This package is a Java implementation of probabilistic natural language parsers: both highly optimized PCFG and lexicalized dependency parsers, and a lexicalized PCFG parser. As well as providing an English parser, the package can be and has been adapted to work with other languages.
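The idea of "producing the most likely analysis" can be illustrated with a miniature probabilistic CKY parser over a hand-written toy grammar in Chomsky normal form. The grammar, its probabilities, and the sentences are all invented for this sketch; the Stanford parsers use large treebank-trained grammars and far more sophisticated models.

```python
# Miniature probabilistic CKY: score every parse of a span, keep the best.
lexical = {                        # A -> word : probability
    ("NP", "dogs"): 0.5,
    ("NP", "cats"): 0.5,
    ("VP", "bark"): 1.0,
}
binary = {("S", "NP", "VP"): 1.0}  # A -> B C : probability

def cky(words):
    n = len(words)
    # chart[i][j] maps nonterminal -> (prob, backpointer) for words[i:j]
    chart = [[dict() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        for (a, word), p in lexical.items():
            if word == w:
                chart[i][i + 1][a] = (p, w)
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):           # split point
                for (a, b, c), p in binary.items():
                    if b in chart[i][k] and c in chart[k][j]:
                        prob = p * chart[i][k][b][0] * chart[k][j][c][0]
                        if prob > chart[i][j].get(a, (0, None))[0]:
                            chart[i][j][a] = (prob, (b, c, k))
    return chart[0][n].get("S", (0, None))[0]   # best probability of an S

print(cky(["dogs", "bark"]))  # 0.5
```

A real parser's chart works the same way in outline: each cell holds the best-scoring analysis of a span, and the sentence's parse is read off the top cell.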
Semantic Web
The Semantic Web is an extension of the Web in which data is given explicit meaning. This allows the data to be integrated, processed, shared, and filtered with much greater ease than before. The idea was popularized by Tim Berners-Lee, who had pursued this goal since the very first draft of the World Wide Web itself. The technical foundation of the Semantic Web is given by the standards RDF, the Resource Description Framework (for data description); OWL, the Web Ontology Language (for giving RDF terms a formal meaning); and SPARQL, the SPARQL Protocol and RDF Query Language (a query language and protocol). A not-too-simplified view is that the Semantic Web is to traditional databases as the Web was to hypertext systems.
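The RDF data model mentioned above is simple enough to sketch: facts are (subject, predicate, object) triples, queried by pattern matching in the spirit of a single SPARQL triple pattern. The URIs and facts below are made up for illustration; real RDF stores use full IRIs, typed literals, and a complete query language.

```python
# Minimal triple-store sketch: RDF-style facts plus one-pattern matching.
triples = {
    ("ex:TimBL",  "ex:invented", "ex:WWW"),
    ("ex:WWW",    "rdf:type",    "ex:InformationSystem"),
    ("ex:SPARQL", "rdf:type",    "ex:QueryLanguage"),
}

def match(s=None, p=None, o=None):
    """Return triples matching a pattern; None acts like a SPARQL variable."""
    return sorted(t for t in triples
                  if (s is None or t[0] == s)
                  and (p is None or t[1] == p)
                  and (o is None or t[2] == o))

print(match(p="rdf:type"))  # the two rdf:type facts
```

Because every fact has the same three-part shape, data from independent sources can be merged by simply taking the union of their triple sets, which is the integration property the entry emphasizes.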
IBM launches functioning brain-inspired chip
IBM neurosynaptic chip (credit: IBM)
IBM announced today, August 7, the first computer chip to achieve one million programmable "neurons," 256 million programmable "synapses," and 46 billion "synaptic operations" per second per watt, simulating the function of neurons and synapses in the brain.

Neurosynaptic
The chip has a power density of 20 mW/cm², nearly four orders of magnitude less than today's microprocessors, says IBM. That is orders of magnitude less power than a modern microprocessor and the energy equivalent of a hearing-aid battery.

Non-von Neumann
The new "TrueNorth" chip architecture instead features a digital on-chip, two-dimensional mesh network of 4096 distributed neurosynaptic cores. Each core module integrates memory, computation, and communication, and operates in an event-driven, parallel, and fault-tolerant fashion. IBM's stated goal: 100 trillion synapses.
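"Event-driven" operation, the style attributed to TrueNorth's cores above, can be contrasted with clocked computation in a small sketch: work happens only when a spike event arrives, not on every tick. The two-neuron "network," weights, and threshold here are invented for illustration and bear no relation to TrueNorth's actual core design.

```python
# Conceptual sketch of event-driven spiking: process spikes from a queue.
from collections import deque

potentials = {"n1": 0, "n2": 0}   # membrane potential per neuron
threshold = 2
synapses = {"n1": ["n2"]}         # spikes from n1 drive n2

events = deque([("n1", 1), ("n1", 1)])  # incoming spikes with weights
fired = []

while events:
    neuron, weight = events.popleft()
    potentials[neuron] += weight
    if potentials[neuron] >= threshold:
        potentials[neuron] = 0            # reset after firing
        fired.append(neuron)
        for target in synapses.get(neuron, []):
            events.append((target, threshold))  # propagate a spike event

print(fired)  # ['n1', 'n2']
```

When no events are queued, nothing runs, which is the intuition behind the architecture's very low power draw compared with a conventionally clocked processor.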