
Machine learning
Machine learning is a subfield of computer science[1] that evolved from the study of pattern recognition and computational learning theory in artificial intelligence.[1] It explores the construction and study of algorithms that can learn from and make predictions on data.[2] Such algorithms operate by building a model from example inputs in order to make data-driven predictions or decisions,[3]:2 rather than following strictly static program instructions. Machine learning is closely related to, and often overlaps with, computational statistics, a discipline that also specializes in prediction-making. It has strong ties to mathematical optimization, which delivers methods, theory and application domains to the field. Machine learning is employed in a range of computing tasks where designing and programming explicit, rule-based algorithms is infeasible. Example applications include spam filtering, optical character recognition (OCR),[4] search engines and computer vision.

http://en.wikipedia.org/wiki/Machine_learning
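The idea of "building a model from example inputs in order to make data-driven predictions" can be illustrated with a toy sketch (not from the article): fitting a least-squares line to noisy examples and using it to predict an unseen input.

```python
import numpy as np

# Toy illustration: "learn" a model from example data rather than
# following static rules. We fit a least-squares line to noisy (x, y)
# pairs and use it to predict an input the model never saw.
rng = np.random.default_rng(0)
x = np.arange(10, dtype=float)
y = 3.0 * x + 1.0 + rng.normal(scale=0.1, size=x.size)  # noisy examples

# Build the model: estimate slope and intercept from the data.
slope, intercept = np.polyfit(x, y, deg=1)

# Data-driven prediction for an unseen input.
prediction = slope * 20.0 + intercept
print(round(slope, 1), round(intercept, 1))  # close to 3.0 and 1.0
```

The fitted parameters recover the underlying relationship from examples alone, which is the core distinction the paragraph draws against rule-based programming.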


Elastic map: linear PCA versus nonlinear principal manifolds[1] for visualization of breast cancer microarray data. a) Configuration of nodes and the 2D principal surface in the 3D PCA linear manifold. The dataset is curved and cannot be mapped adequately onto a 2D principal plane; b) the distribution in the internal 2D nonlinear principal surface coordinates (ELMap2D), together with an estimation of the density of points; c) the same as b), but for the linear 2D PCA manifold (PCA2D). The "basal" breast cancer subtype is visualized more adequately with ELMap2D, and some features of the distribution are better resolved in comparison to PCA2D. Principal manifolds are produced by the elastic maps algorithm. Data are available for public competition.[2] Software is available for free non-commercial use.[3][4]
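The linear PCA2D baseline the caption compares against can be sketched in a few lines (the elastic-map nonlinear projection itself is not reproduced here; the synthetic matrix stands in for the microarray data):

```python
import numpy as np

# Sketch of a linear PCA2D projection: center the data, then project
# onto the top two principal components from the singular value
# decomposition. Rows are samples, columns are features.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))  # placeholder for real microarray data

Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
pca2d = Xc @ Vt[:2].T  # each row: 2D coordinates on the principal plane

print(pca2d.shape)  # (100, 2)
```

Elastic maps generalize this by letting the projection surface bend to follow curved data, which is why ELMap2D separates the subtypes better in the figure.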

Theory of Finite Automata Undergraduate course in finite automata theory with an introduction to formal languages. Lecturers J.A. Garcia and S.

Statistical classification In machine learning and statistics, classification is the problem of identifying to which of a set of categories (sub-populations) a new observation belongs, on the basis of a training set of data containing observations (or instances) whose category membership is known. An example would be assigning a given email to the "spam" or "non-spam" class, or assigning a diagnosis to a given patient as described by observed characteristics of the patient (gender, blood pressure, presence or absence of certain symptoms, etc.). In the terminology of machine learning,[1] classification is considered an instance of supervised learning, i.e. learning where a training set of correctly identified observations is available. The corresponding unsupervised procedure is known as clustering, and involves grouping data into categories based on some measure of inherent similarity or distance. Terminology across fields is quite varied.
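The supervised setting described above can be made concrete with a minimal sketch (illustrative, not from the article): a nearest-centroid classifier that learns class means from labeled training observations and assigns a new observation to the closest mean.

```python
import numpy as np

# Minimal supervised-classification sketch: nearest-centroid.
# The training set pairs each observation with a known category;
# a new observation gets the category whose mean (centroid) is closest.
train_X = np.array([[1.0, 1.0], [1.2, 0.8], [5.0, 5.0], [4.8, 5.2]])
train_y = np.array([0, 0, 1, 1])  # known category memberships

centroids = np.array([train_X[train_y == c].mean(axis=0) for c in (0, 1)])

def classify(x):
    # Assign x to the class with the nearest centroid (Euclidean distance).
    return int(np.argmin(np.linalg.norm(centroids - x, axis=1)))

print(classify(np.array([1.1, 0.9])))  # → 0
print(classify(np.array([5.1, 4.9])))  # → 1
```

The contrast with clustering is that here the labels in `train_y` are given; a clustering procedure would have to discover the two groups from the coordinates alone.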

The Future of Machine Intelligence Ben Goertzel, March 20, 2009. In early March 2009, 100 intellectual adventurers journeyed from various corners of Europe, Asia, America and Australasia to the Crowne Plaza Hotel in Arlington, Virginia, to take part in the Second Conference on Artificial General Intelligence, AGI-09: a conference aimed explicitly at the grand goal of the AI field, the creation of thinking machines with general intelligence at the human level and ultimately beyond.

Artificial Intelligence and Machine Learning
- A Gaussian Mixture Model Layer Jointly Optimized with Discriminative Features within a Deep Neural Network Architecture. Ehsan Variani, Erik McDermott, Georg Heigold. ICASSP, IEEE (2015).
- Adaptation algorithm and theory based on generalized discrepancy. Corinna Cortes, Mehryar Mohri, Andrés Muñoz Medina. Proceedings of the 21st ACM Conference on Knowledge Discovery and Data Mining (KDD 2015).
- Adding Third-Party Authentication to Open edX: A Case Study. John Cox, Pavel Simakov. Proceedings of the Second (2015) ACM Conference on Learning @ Scale, ACM, New York, NY, USA, pp. 277-280.
- An Exploration of Parameter Redundancy in Deep Networks with Circulant Projections. Yu Cheng, Felix X.

Heuristics A*'s ability to vary its behavior based on the heuristic and cost functions can be very useful in a game. The tradeoff between speed and accuracy can be exploited to make your game faster. For most games, you don't really need the best path between two points. You just need something that's close.

CS 43001 Compiler Construction Course (Autumn Semester 2005). Theory: Niloy Ganguly (niloy@cse.iitkgp.ernet.in). Laboratory: Chitta Ranjan Mandal (chitta@cse.iitkgp.ernet.in).

Cluster analysis The result of a cluster analysis shown as the coloring of the squares into three clusters. Cluster analysis, or clustering, is the task of grouping a set of objects in such a way that objects in the same group (called a cluster) are more similar (in some sense or another) to each other than to those in other groups (clusters). It is a main task of exploratory data mining, and a common technique for statistical data analysis, used in many fields, including machine learning, pattern recognition, image analysis, information retrieval, and bioinformatics.
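The grouping task described in the cluster-analysis clip can be sketched with a minimal k-means implementation (illustrative, not taken from any of the pages excerpted here): points are assigned to their nearest cluster mean, and each mean is moved to the average of its assigned points until the grouping settles.

```python
import numpy as np

# Minimal k-means sketch: group points so that members of a cluster are
# closer to their own cluster mean than to the other means.
def kmeans(X, k, iters=20, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest center.
        labels = np.argmin(
            np.linalg.norm(X[:, None] - centers[None], axis=2), axis=1)
        # Move each center to the mean of its assigned points.
        for c in range(k):
            if np.any(labels == c):
                centers[c] = X[labels == c].mean(axis=0)
    return labels, centers

X = np.array([[0.0, 0.0], [0.0, 1.0], [10.0, 10.0], [10.0, 11.0]])
labels, centers = kmeans(X, k=2)
print(labels)  # the two nearby pairs end up in the same cluster
```

Unlike the classification setting, no category labels are supplied: the similarity structure of the data alone determines the groups.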

Remembering objects lets computers learn like a child (tech, 05 June 2013) Always seeing the world with fresh eyes can make it hard to find your way around. Giving computers the ability to recognise objects as they scan a new environment will let them navigate much more quickly and understand what they are seeing. Renato Salas-Moreno at Imperial College London and colleagues have added object recognition to a computer vision technique called simultaneous localisation and mapping (SLAM). A SLAM-enabled computer has a camera to orient itself in new surroundings as it maps them. SLAM builds up a picture of the world out of points, lines and contours.

Gaussian Processes for Machine Learning: book webpage. Carl Edward Rasmussen and Christopher K. I. Williams, The MIT Press, 2006.

DeCasteljau Algorithm Hamburg (Germany), 19th September 1999. Written by Nils Pipenbrinck aka Submissive/Cubic & $eeN. Introduction: I learned a nice way to calculate bezier-curves a couple of weeks ago.

How to draw two graphs in one scatterplot? I think that's what you have in mind:

```python
import matplotlib.pyplot as plt
import numpy

def plot_me1():
    # generate 25 random triplets of points
    x, y1, y2 = numpy.random.random((3, 25))
    # create figure and axes
    fig = plt.figure()
    # split the page into a 1x1 array of subplots and put me in the first one (111)
    # (as a matter of fact, the only one)
    ax = fig.add_subplot(111)
    # plot scatter for x, y1
    ax.scatter(x, y1, color='red', marker='o', s=100)
    # plot scatter for x, y2
    ax.scatter(x, y2, color='green', marker='^', s=100)
    plt.show()

plot_me1()
```

You can accomplish that using sub-plots. You may superimpose your plots (as in the first example) or you can separate them in different regions:
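The answer above ends by promising a second, separated layout that was cut off in this clip. A minimal sketch of what it likely looked like (the function name `plot_me2`, the output filename, and the use of a headless backend are my assumptions, not part of the original answer):

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so the script runs headless
import matplotlib.pyplot as plt
import numpy

def plot_me2():
    # same two random scatters, but drawn in two separate subplot
    # regions side by side instead of superimposed on one axes
    x, y1, y2 = numpy.random.random((3, 25))
    fig, (ax1, ax2) = plt.subplots(1, 2, sharex=True, sharey=True)
    ax1.scatter(x, y1, color='red', marker='o', s=100)
    ax2.scatter(x, y2, color='green', marker='^', s=100)
    fig.savefig("two_regions.png")

plot_me2()
```

`sharex`/`sharey` keep the two regions on the same scale so the point clouds remain directly comparable.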
