LIBSVM FAQ
In machine learning and statistics, feature selection, also known as variable selection, attribute selection or variable subset selection, is the process of selecting a subset of relevant features for use in model construction. The central assumption when using a feature selection technique is that the data contains many redundant or irrelevant features. Redundant features are those which provide no more information than the currently selected features, and irrelevant features provide no useful information in any context. Feature selection
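As a hedged illustration of those two notions (not code from any page linked here), the Python sketch below drops irrelevant features via a simple variance threshold and redundant ones via pairwise correlation; both threshold values are arbitrary example choices.

import numpy as np

def select_features(X, var_threshold=1e-8, corr_threshold=0.95):
    """Filter-style feature selection on an (n_samples, n_features) array.
    'Irrelevant' here means (near-)constant columns; 'redundant' means
    columns highly correlated with a column that is already kept."""
    # Drop near-constant (irrelevant) features first.
    candidates = [j for j in range(X.shape[1]) if X[:, j].var() > var_threshold]
    selected = []
    for j in candidates:
        # Greedily skip features highly correlated with an earlier kept one.
        redundant = any(abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) > corr_threshold
                        for k in selected)
        if not redundant:
            selected.append(j)
    return selected

# Column 1 is constant (irrelevant); column 2 duplicates column 0 (redundant).
X = np.array([[1.0, 5.0, 2.0],
              [2.0, 5.0, 4.0],
              [3.0, 5.0, 6.0],
              [4.0, 5.0, 8.0]])
print(select_features(X))  # -> [0]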
Weka is a collection of machine learning algorithms for data mining tasks. The algorithms can either be applied directly to a dataset or called from your own Java code. Weka contains tools for data pre-processing, classification, regression, clustering, association rules, and visualization. It is also well-suited for developing new machine learning schemes. Found only on the islands of New Zealand, the Weka is a flightless bird with an inquisitive nature.

Weka 3 - Data Mining with Open Source Machine Learning Software in Java

by Kardi Teknomo. In this tutorial, you will discover step by step how an agent learns through training, without a teacher, in an unknown environment. Reinforcement learning is a training paradigm for agents in which we have examples of problems but do not have the immediate exact answers. Q-Learning By Examples
Sample code for Q-learning
www.acm.uiuc.edu/sigart/docs/QLearning.pdf
Reinforcement Learning - Algorithms. The parameters used in the Q-value update process include the learning rate, set between 0 and 1; setting it to 0 means that the Q-values are never updated, hence nothing is learned.
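As a minimal illustration of that update rule, here is a tabular Q-learning sketch in Python; the toy corridor environment, rewards, and parameter values are invented for the example (alpha is the learning rate discussed above, gamma the discount factor):

import random

def q_learning(n_states=6, n_episodes=500, alpha=0.5, gamma=0.9, eps=0.1):
    """Tabular Q-learning on a toy corridor: states 0..n-1, actions 0=left,
    1=right. Reward 1 for reaching the right end, 0 elsewhere."""
    Q = [[0.0, 0.0] for _ in range(n_states)]
    for _ in range(n_episodes):
        s = 0
        while s != n_states - 1:
            # Epsilon-greedy action selection.
            if random.random() < eps:
                a = random.randrange(2)
            else:
                a = max((0, 1), key=lambda x: Q[s][x])
            s2 = max(0, s - 1) if a == 0 else s + 1
            r = 1.0 if s2 == n_states - 1 else 0.0
            # Q-value update: with alpha = 0 nothing would ever be learned.
            Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
            s = s2
    return Q

for s, qs in enumerate(q_learning()):
    print(s, [round(q, 2) for q in qs])

After training, the "right" action should carry the higher Q-value in every state, since it leads toward the only reward.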
Introduction to Reinforcement Learning - Xin Chen, University of Hawaii, Fall 2006
DBSCAN.M - dmfa07 - MATLAB code for dbscan - Data Mining projects for the class CIS4930 Fall 2007, Data Mining with Sanjay Ranka
AGHC.m - Classical data mining algorithm, MATLAB code - Source Codes Reader - HackChina
K-Means Clustering Tutorial: Matlab Code By Kardi Teknomo, PhD. Purchase the latest e-book with the complete code of this k-means clustering tutorial here. For those who like to use Matlab, the Matlab Statistics Toolbox contains a function named kmeans. If you do not have the Statistics Toolbox, you may use my generic code below.
Kmeans: Matlab Code | Nirmal's Blog. My implementation of the k-means algorithm is highly customizable. The initial cluster centroids can be selected in various ways: • randomly initialized centroids chosen from the data rows; • the first three data rows used as the three cluster centers; • centroids provided as a parameter, which is especially helpful when you want to run the clustering from the same initial centers, so you don't have to worry about k-means naming the same cluster differently across runs.
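A minimal Python sketch in the same spirit (a re-implementation for illustration, not the MATLAB code from either of the pages above), supporting the three initialization options just listed:

import numpy as np

def kmeans(X, k, init="random", centroids=None, n_iter=100, seed=0):
    """Lloyd's k-means. init: 'random' picks k data rows at random,
    'first' uses the first k rows, 'given' uses the centroids argument
    (useful for reproducible runs from the same initial centers)."""
    rng = np.random.default_rng(seed)
    if init == "random":
        C = X[rng.choice(len(X), k, replace=False)].astype(float)
    elif init == "first":
        C = X[:k].astype(float)
    else:
        C = np.asarray(centroids, dtype=float)
    for _ in range(n_iter):
        # Assign each point to its nearest centroid.
        d = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        # Recompute centroids; keep the old one if a cluster goes empty.
        new_C = np.array([X[labels == j].mean(axis=0) if (labels == j).any()
                          else C[j] for j in range(k)])
        if np.allclose(new_C, C):
            break
        C = new_C
    return labels, C

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(5, 1, (20, 2))])
labels, centers = kmeans(X, 2, init="first")
print(centers.round(2))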
contents.m - Classical data mining algorithm, MATLAB code - Source Codes Reader - HackChina
k_means.m - Classical data mining algorithm, MATLAB code - Source Codes Reader - HackChina
Project

The project describes the teaching process of a multi-layer neural network employing the backpropagation algorithm. To illustrate this process, a three-layer neural network with two inputs and one output is used. Each neuron is composed of two units: the first unit sums the products of the weight coefficients and input signals; the second unit applies a nonlinear function, called the neuron activation function. Backpropagation
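A compact Python sketch of that structure (two inputs, one hidden layer, one output); the sigmoid activation, hidden-layer size, XOR training data, and learning rate are illustrative assumptions, not details from the project page, and a different random seed may need more epochs:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR data: 2 inputs, 1 output (illustrative choice).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 3)), np.zeros(3)   # input -> hidden (3 units)
W2, b2 = rng.normal(size=(3, 1)), np.zeros(1)   # hidden -> output
lr = 0.5

for epoch in range(10000):
    # Forward pass: each neuron sums weighted inputs, then applies the activation.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: propagate the output error back toward the input layer.
    d_out = (out - y) * out * (1 - out)   # error signal at the output
    d_h = (d_out @ W2.T) * h * (1 - h)    # error signal at the hidden layer
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(out.round(2))  # should approach [[0], [1], [1], [0]]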
FLD - Fisher Linear Discriminant. Let us assume we have sets X_1, ..., X_c representing classes, each containing N_i elements.
research.cs.tamu.edu/prism/lectures/pr/pr_l10.pdf
www.physics.ohio-state.edu/~gan/teaching/spring04/Chapter5.pdf
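For the two-class case, the standard closed form is w proportional to S_W^{-1}(m_1 - m_2), where S_W is the within-class scatter matrix and m_1, m_2 are the class means. A minimal Python sketch of this (the Gaussian sample data is made up for illustration):

import numpy as np

def fisher_lda(X1, X2):
    """Two-class Fisher linear discriminant: returns the projection
    direction w maximizing between-class over within-class scatter."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    # Within-class scatter: sum of centered outer products per class.
    Sw = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)
    w = np.linalg.solve(Sw, m1 - m2)   # w proportional to S_W^{-1}(m1 - m2)
    return w / np.linalg.norm(w)

rng = np.random.default_rng(0)
X1 = rng.normal([0, 0], 1.0, size=(50, 2))
X2 = rng.normal([3, 3], 1.0, size=(50, 2))
w = fisher_lda(X1, X2)
print("projection direction:", w.round(3))
print("projected class means:", X1.mean(0) @ w, X2.mean(0) @ w)

Projecting onto w separates the two projected class means as far as the within-class spread allows.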
CS 229: Machine Learning We've just added extra office hours for PS4 and the final project. Problem Set 4 has been released! There will be no Discussion Section on Friday 11/8. Problem Set 3 has been released!
CS340 Winter 2010 Lectures MWF 4.00-5.00, Dempster 301 Calendar entry Prerequisites: Linear algebra, calculus, probability theory, programming (Matlab). Tutorial T2A F 3.00-4.00, Dempster 101 Tutorial T2B M 11.00-12.00, Dempster 101 Instructor: Arnaud Doucet. Office hours: Monday 5.00-6.00.
For curve fitting using linear regression, there exists a minor variant of the Batch Gradient Descent algorithm, called Stochastic Gradient Descent. In Batch Gradient Descent, the parameter vector is updated as theta := theta + alpha * sum_i (y_i - h_theta(x_i)) * x_i (the sum loops over all elements of the training set in one iteration); Stochastic Gradient Descent instead applies the corresponding update using one training example at a time. Stochastic Gradient Descent
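A side-by-side Python sketch of the two update rules for linear regression (the synthetic data, learning rate, and iteration counts are arbitrary illustrative choices, not taken from the post):

import numpy as np

rng = np.random.default_rng(0)
X = np.c_[np.ones(100), rng.uniform(0, 10, 100)]   # bias column + one feature
y = 2.0 + 3.0 * X[:, 1] + rng.normal(0, 0.5, 100)  # true theta = (2, 3)
alpha = 0.005

# Batch gradient descent: one update per sweep over the whole training set.
theta = np.zeros(2)
for _ in range(20000):
    theta += alpha * X.T @ (y - X @ theta) / len(y)

# Stochastic gradient descent: one update per single training example.
theta_sgd = np.zeros(2)
for _ in range(100):
    for i in rng.permutation(len(y)):
        theta_sgd += alpha * (y[i] - X[i] @ theta_sgd) * X[i]

# Both should end up close to the true parameters (2, 3).
print("batch:", theta.round(2), "sgd:", theta_sgd.round(2))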
http://www.cogsci.ucsd.edu/~ajyu/Teaching/Cogs118A_wi10/Scribe/L12.pdf is not accessible
Machine Learning 10-701/15-781
Batch Gradient Descent I happened to stumble upon Prof. Andrew Ng's Machine Learning classes, which are available online as part of the Stanford Center for Professional Development. The first lecture in the series discusses the topic of fitting parameters for a given data set using linear regression. To understand this concept, I chose to take data from the top 50 articles of this blog, based on the pageviews in the month of September 2011.
Decision Tree
ID3 Decision Trees in Java