
Neural Net


Financial Predictor via Neural Network

Each year, the field of computer science becomes more sophisticated as new types of technologies hit the market. Despite that, the problem of developing intelligent agents that precisely simulate human brain activity is still unsolved. One of the most prominent models of intelligent agents built in computer memory is the neural network (NN). In this article, the reader is introduced to the basics of NNs, along with a prediction pattern that can be used successfully in different types of "smart" applications.

During my intellectual trip into the world of artificial intelligence, I was fascinated by how "magically" a correctly constructed artificial neural network (specifically, a feed-forward network) can predict values from those given at its input. Typical applications include:

Function interpolation and approximation
Prediction of trends in numerical data
Prediction of movements in financial markets

Background: Dow Jones Industrial Average, NASDAQ Composite

Deep Learning Portal

OpenNN - Open Neural Networks Library

Fast Artificial Neural Network Library (FANN)

Fast Artificial Neural Network Library is a free open source neural network library which implements multilayer artificial neural networks in C, with support for both fully connected and sparsely connected networks. Cross-platform execution in both fixed and floating point is supported. It includes a framework for easy handling of training data sets. The library is easy to use, versatile, well documented, and fast. Bindings to more than 20 programming languages are available. An easy-to-read introduction article and a reference manual accompany the library, with examples and recommendations on how to use it.

Java Neural Network Framework Neuroph

Computer Vision, Artificial Intelligence, Robotics

Encog Machine Learning Framework

HyperNEAT User's Page

Language Bindings

NeuroBox Neural Network Library

NeuroBox is a .NET OOP library written in C# to generate, propagate and train complex neural networks with technologies like backpropagation using weight decay, momentum term, Manhattan training, flat-spot elimination, etc.

What's coming, current development: a weblog about current thoughts, developments, and experiences around the NeuroBox project.

NeuroBox Designer: a graphical user interface for composing and manipulating neural networks in a convenient way.

QuickProp, RProp: improved learning algorithms that speed up network training compared with backpropagation in many scenarios.

Related people and projects, derived works, contributions: many thanks for the contributions from the community (see 'Demos' and 'Links'), in particular to Francois Vanderseypen, Tobias Finazzi, Leopold Rehberger, and Matt Jallo.

Questions, support: please use the contact form to contact me.

S Tech Journal: Neural Network

Modeling Neuron Behavior in C#

Neural Network Lab: James McCaffrey presents one of the basic building blocks of a neural network.

A perceptron is code that models the behavior of a single biological neuron. Perceptrons are the predecessors of neural networks, and they can be used to solve simple but practical pattern-recognition problems. The best way to see where this article is headed is to look at Figure 1; in the sections that follow, I'll carefully explain the code that generated that screenshot.

How perceptrons work: one way to visualize how perceptrons work is shown in Figure 2. The output of a perceptron, Y, is computed in three steps.

Once a perceptron has values for its weights and bias, computing an output is easy. The perceptron's prediction for the damaged 'B' bit pattern is computed as follows:

dp = (0)(0.075) + (1)(0.00) + (1)(0.00) + (0)(-0.75) + ... + (0)(-0.75) + 0.050
   = 0.525 + 0.050
   = 0.575
Y = (dp > 0.5 ? ...