

An Introduction to Neural Networks
Prof. Leslie Smith, Centre for Cognitive and Computational Neuroscience, Department of Computing and Mathematics, University of Stirling. lss@cs.stir.ac.uk. Last major update: 25 October 1996; minor updates 22 April 1998 and 12 Sept 2001; links updated (they were out of date) 12 Sept 2001; fix to math font (thanks Sietse Brouwer) 2 April 2003. This document is a roughly HTML-ised version of a talk given at the NSYN meeting in Edinburgh, Scotland, on 28 February 1996, then updated a few times in response to comments received. Please email me comments, but remember that this was originally just the slides from an introductory talk! Contents: Why would anyone want a `new' sort of computer? What is a neural network? Some algorithms and architectures. Where have they been applied? What new applications are likely? Some useful sources of information. Some comments added Sept 2001. NEW: questions and answers arising from this tutorial.

http://www.cs.stir.ac.uk/~lss/NNIntro/InvSlides.html
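The talk's central question, "What is a neural network?", comes down to networks of very simple computing units. As a rough illustration only (this is not code from the talk or its slides; the weights, bias, and input values are arbitrary numbers chosen for the example), the sketch below shows a single artificial neuron in Python: it forms a weighted sum of its inputs and passes the result through a sigmoid activation.

```python
# Minimal sketch of one artificial neuron (illustrative only, not from the talk).
import math

def neuron(inputs, weights, bias):
    """Weighted sum of inputs plus bias, squashed by a sigmoid activation."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))  # sigmoid maps the sum into (0, 1)

# Two-input example; the numbers are arbitrary and only for illustration.
print(neuron(inputs=[0.5, 0.8], weights=[0.4, -0.6], bias=0.1))
```

A full network chains layers of such units together and adjusts the weights with a learning rule, which is what the "algorithms and architectures" part of the talk covers.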

Related: Neural Network

Google scientist Jeff Dean on how neural networks are improving everything Google does. Google's goal: a more powerful search that fully understands and answers commands like, "Book me a ticket to Washington DC." Jon Xavier, Web Producer, Silicon Valley Business Journal. If you've ever been mystified by how Google knows what you're looking for before you even finish typing your query into the search box, or had voice search on Android recognize exactly what you said even though you're in a noisy subway, chances are you have Jeff Dean and the Systems Infrastructure Group to thank for it. As a Google Research Fellow, Dean has been working on ways to use machine learning and deep neural networks to solve some of the toughest problems Google has, such as natural language processing, speech recognition, and computer vision.

Neural Network Library project in C# home page. Welcome to my Neural Network project home page. You can find here the first version of a .NET neural network library and its API documentation. This library was developed in C# as a .NET class library. I've also written a graphical interface to design neural networks and a few demos. I'm using the neural network to perform face detection and recognition on images. It's not fully functional at the moment, but you can find more on my face detection page.

A Non-Mathematical Introduction to Using Neural Networks. The goal of this article is to help you understand what a neural network is, and how it is used. Most people, even non-programmers, have heard of neural networks. There are many science fiction overtones associated with them. And like many things, sci-fi writers have created a vast, but somewhat inaccurate, public idea of what a neural network is. Most laypeople think of neural networks as a sort of artificial brain.

IBM Research creates new foundation to program SyNAPSE chips. Scientists from IBM unveiled on Aug. 8 a breakthrough software ecosystem designed for programming silicon chips that have an architecture inspired by the function, low power, and compact volume of the brain. The technology could enable a new generation of intelligent sensor networks that mimic the brain’s abilities for perception, action, and cognition. Dramatically different from traditional software, IBM’s new programming model breaks the mold of sequential operation underlying today’s von Neumann architectures and computers. It is instead tailored for a new class of distributed, highly interconnected, asynchronous, parallel, large-scale cognitive computing architectures.

Marvin Minsky Home Page. MIT Media Lab and MIT AI Lab; Professor of Media Arts and Sciences, MIT; Professor of E.E.C.S., M.I.T. minsky at media.mit.edu. Marvin Minsky has made many contributions to AI, cognitive psychology, mathematics, computational linguistics, robotics, and optics. In recent years he has worked chiefly on imparting to machines the human capacity for commonsense reasoning.

Universe Grows Like a Giant Brain. The universe may grow like a giant brain, according to a new computer simulation. The results, published Nov. 16 in the journal Nature's Scientific Reports, suggest that some undiscovered, fundamental laws may govern the growth of systems large and small, from the electrical firing between brain cells and growth of social networks to the expansion of galaxies. "Natural growth dynamics are the same for different real networks, like the Internet or the brain or social networks," said study co-author Dmitri Krioukov, a physicist at the University of California San Diego. The new study suggests a single fundamental law of nature may govern these networks, said physicist Kevin Bassler of the University of Houston, who was not involved in the study.

Examining the Society of Mind. To appear in the journal Computing and Informatics. Push Singh, 28 October 2003, push@mit.edu, Media Lab, Massachusetts Institute of Technology, 20 Ames Street, Cambridge, MA 02139, United States.

Connectivism. Editor’s Note: This is a milestone article that deserves careful study. Connectivism should not be confused with constructivism. George Siemens advances a theory of learning that is consistent with the needs of the twenty-first century. His theory takes into account trends in learning, the use of technology and networks, and the diminishing half-life of knowledge. It combines relevant elements of many learning theories, social structures, and technology to create a powerful theoretical construct for learning in the digital age.

A Brain Cell is the Same as the Universe, by Cliff Pickover, Reality Carnival. Physicists discover that the structure of a brain cell is the same as the entire universe.
