
Neural Networks


Meet NELL. See NELL Run, Teach NELL How To Run (Demo, TCTV)

A cluster of computers on Carnegie Mellon's campus, formally known as the Never-Ending Language Learning system (NELL), has attracted significant attention this week thanks to a NY Times article, "Aiming To Learn As We Do, A Machine Teaches Itself." The eight-month-old system attempts to "teach" itself by perpetually scanning slices of the web, looking at thousands of sites simultaneously to find facts that fit into semantic buckets (such as athletes, academic fields, emotions, and companies) and to gather details related to these nouns.
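As a rough illustration of what fitting facts into "semantic buckets" might look like, here is a minimal, hypothetical sketch using a Hearst-style "C such as X" pattern. This is not NELL's actual pipeline; the pattern and the example sentence are invented for illustration.

```python
import re

# Illustrative sketch (not NELL's real extractor): harvest candidate
# (category, instance) pairs with a "C such as X" pattern.
PATTERN = re.compile(r"(\w+) such as ([A-Z]\w+(?: [A-Z]\w+)*)")

def extract_candidates(text):
    """Return (category, instance) pairs matched by the pattern."""
    return [(category.lower(), instance)
            for category, instance in PATTERN.findall(text)]

text = ("The survey covered athletes such as Serena Williams "
        "and companies such as Google.")
print(extract_candidates(text))
# → [('athletes', 'Serena Williams'), ('companies', 'Google')]
```

A real system would of course use many patterns and cross-check candidates against each other; this sketch shows only the basic pattern-matching idea.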

The project, supported by federal grants, a $1 million check from Google, and an M45 supercomputer cluster donated by Yahoo, is trying to break down the longstanding barrier between computers and semantics. And yet despite all of NELL's initiative and innovation, she needs help: she is accurate only 80-90% of the time, according to Professor Tom Mitchell, the head of the research team (see our demo with Mitchell above).

Neural Network Demo

I first learned about neural networks sometime around 1991. Ever since, I have been intrigued and confused by them. After reading peripherally and talking with colleagues about the subject occasionally for years, I found myself no closer to understanding them than when I first learned of them.

Starting earlier this week (around 10/24/2004), I finally got around to creating one for the first time. Doing so has really helped my understanding of how these things work and of what one can actually do with them. There has been so much hype about the subject of neural nets (NNs). I am obviously no expert in NNs, but I was surprised to see how much I was able to do with just a little code. Let me be quick to disclaim that everything I say here is surely subject to scrutiny. Here is where a lot of introductions stop short: the goal of a neuron of this sort is to fire when it recognizes a known pattern of inputs, and "firing" occurs when the output is above some threshold.

Java Demos

Java Applets for Neural Network and Artificial Life (also available in Japanese). Artificial Neural Networks Lab contents — Competitive Learning: Vector Quantizer (VQ), related to hard competitive learning; VQ 2 by the ELBG algorithm (in Japanese).
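The threshold neuron described in the blog excerpt above can be sketched in a few lines of Python. The weights and threshold here are hand-picked for illustration (they are not taken from the original demo); with these values the neuron behaves as an AND gate.

```python
# Minimal threshold neuron: weighted sum of inputs, "fires" (outputs 1)
# when the sum exceeds the threshold.
def neuron_fires(inputs, weights, threshold):
    activation = sum(x * w for x, w in zip(inputs, weights))
    return 1 if activation > threshold else 0

# Hand-tuned weights/threshold implementing a two-input AND gate.
weights = [1.0, 1.0]
threshold = 1.5
for pattern in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(pattern, neuron_fires(pattern, weights, threshold))
# Only (1, 1) produces an activation (2.0) above the threshold, so only
# that pattern makes the neuron fire.
```

Learning, in this framing, is just the process of adjusting the weights and threshold automatically instead of picking them by hand.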

Elastic Net for TSP — SOM and elastic nets can be regarded as competitive learning with a topological constraint; TSP is among the most notorious NP-complete problems. Backpropagation Learning. Neural Nets for Constraint Satisfaction and Optimization. Other Neural Networks.

Artificial Life — Genetic Algorithm: Biomorph & L-system. Life Game. Boids: Boids1, a simulation of flocking animals; Boids2 (dead?). Other AL: AL collection.

Other Related Applets: Mathtools.net. See also my link collection. 2004.7.15: Update. Akio Utsugi (home page)
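As a rough illustration of the hard competitive learning that the VQ applets above demonstrate, here is a minimal winner-take-all vector-quantization sketch. The data points, learning rate, and epoch count are invented for the example; this is not the ELBG algorithm, just the basic competitive update.

```python
import random

# Hard competitive learning (winner-take-all VQ): each input pulls only
# the nearest codebook vector toward itself.
def train_vq(data, codebook, rate=0.1, epochs=50, seed=0):
    rng = random.Random(seed)
    codebook = [list(c) for c in codebook]
    for _ in range(epochs):
        for x in rng.sample(data, len(data)):  # shuffled pass over the data
            # find the winning (closest) codebook vector
            winner = min(codebook,
                         key=lambda c: sum((ci - xi) ** 2
                                           for ci, xi in zip(c, x)))
            # move only the winner toward the input
            for i in range(len(winner)):
                winner[i] += rate * (x[i] - winner[i])
    return codebook

# Two clusters near (0, 0) and (1, 1); the two codebook vectors settle
# near the cluster centers.
data = [(0.0, 0.0), (0.1, 0.1), (1.0, 1.0), (0.9, 1.1)]
print(train_vq(data, [(0.5, 0.0), (0.5, 1.0)]))
```

The SOM and elastic-net applets add a topological constraint on top of this update, so that neighboring codebook vectors are pulled along with the winner.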

Rock-Paper-Scissors: You vs. the Computer

Read the Web :: Carnegie Mellon University

Browse the Knowledge Base! Can computers learn to read? We think so. "Read the Web" is a research project that attempts to create a computer system that learns, over time, to read the web. Since January 2010, our computer system, called NELL (Never-Ending Language Learner), has been running continuously, attempting to perform two tasks each day. First, it attempts to "read," or extract facts from, text found in hundreds of millions of web pages (e.g., playsInstrument(George_Harrison, guitar)). So far, NELL has accumulated over 50 million candidate beliefs by reading the web, and it is considering these at different levels of confidence.

20Q.net Inc.
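A candidate belief like playsInstrument(George_Harrison, guitar), held at some level of confidence, can be modeled with a small hypothetical sketch. The confidence values and the promotion threshold below are invented for illustration; they are not NELL's.

```python
# Hypothetical candidate-belief store: predicate(arg1, arg2) triples,
# each held at a confidence level; only high-confidence beliefs are
# "promoted" to the knowledge base.
beliefs = [
    ("playsInstrument", "George_Harrison", "guitar", 0.98),  # example from the text
    ("playsInstrument", "George_Harrison", "sitar", 0.75),   # invented confidence
]

def promoted(beliefs, threshold=0.9):
    """Return beliefs whose confidence meets the threshold."""
    return [(p, a, b) for p, a, b, conf in beliefs if conf >= threshold]

print(promoted(beliefs))
# → [('playsInstrument', 'George_Harrison', 'guitar')]
```

Keeping low-confidence candidates around, rather than discarding them, lets later evidence raise or lower their confidence over time.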