
CS 229: Machine Learning (Course handouts)

Lecture notes 1 (ps) (pdf): Supervised Learning, Discriminative Algorithms
Lecture notes 2 (ps) (pdf): Generative Algorithms
Lecture notes 3 (ps) (pdf): Support Vector Machines
Lecture notes 4 (ps) (pdf): Learning Theory
Lecture notes 5 (ps) (pdf): Regularization and Model Selection
Lecture notes 6 (ps) (pdf): Online Learning and the Perceptron Algorithm (optional reading)
Lecture notes 7a (ps) (pdf): Unsupervised Learning, k-means Clustering
Lecture notes 7b (ps) (pdf): Mixture of Gaussians
Lecture notes 8 (ps) (pdf): The EM Algorithm
Lecture notes 9 (ps) (pdf): Factor Analysis
Lecture notes 10 (ps) (pdf): Principal Components Analysis
Lecture notes 11 (ps) (pdf): Independent Components Analysis
Lecture notes 12 (ps) (pdf): Reinforcement Learning and Control
Supplemental notes 1 (pdf): Binary classification with +/-1 labels
Supplemental notes 2 (pdf): Boosting algorithms and weak learning

Andrew Ng's Home page
Andrew Ng is a co-founder of Coursera and the director of the Stanford AI Lab. In 2011 he led the development of Stanford University's main MOOC (Massive Open Online Courses) platform and also taught an online Machine Learning class offered to over 100,000 students, which led to the founding of Coursera. Ng's Stanford research group focuses on deep learning, which builds very large neural networks that learn from labeled and unlabeled data.

Main Page - Wiki Course Notes
Summary of the course Machine Learning by Andrew Ng on Coursera – luckycallor. This is my summary of the course Machine Learning by Andrew Ng on Coursera. You can use it as a reference after finishing the course. I'm glad to communicate with you and learn from each other; if you find any mistakes in the article, I would appreciate it if you pointed them out. A pdf edition is available.

1. For m examples with n features, we can describe the data with a matrix X of m rows and n+1 columns (including the constant feature x0), where the row vector xi (with n+1 dimensions) represents an example and the column vector xj (with m dimensions) represents a feature; equivalently, the element xij (row i, column j) is the jth feature of the ith example. The parameters are a vector θ with n+1 elements, where θj corresponds to feature xj. The labels are a vector y with m elements, where yi is the label of the ith example. For every 1 ≤ i ≤ m, xi0 = 1 (the intercept term).
Hypothesis (vector form): hθ = Xθ
Hypothesis (element form): hθ(xi) = Σ_{j=0}^{n} θj xij
Cost function: J(θ) = (1/2m) Σ_{i=1}^{m} (hθ(xi) − yi)²
Gradient descent update: θj := θj − (α/m) Σ_{i=1}^{m} (hθ(xi) − yi) xij
Normal equation (closed form): θ = (XᵀX)⁻¹ Xᵀy
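To make the formulas above concrete, here is a minimal NumPy sketch; the data, variable names, and learning rate are illustrative assumptions of mine, not taken from the course notes:

    import numpy as np

    # m = 4 examples, n = 1 feature; prepend the constant column x0 = 1
    X_raw = np.array([[1.0], [2.0], [3.0], [4.0]])
    y = np.array([2.0, 4.1, 6.2, 7.9])
    m = X_raw.shape[0]
    X = np.hstack([np.ones((m, 1)), X_raw])        # shape (m, n+1)

    def h(theta, X):
        # Hypothesis, vector form: h_theta = X @ theta
        return X @ theta

    def J(theta, X, y):
        # Cost: J(theta) = 1/(2m) * sum_i (h_theta(x_i) - y_i)^2
        return ((h(theta, X) - y) ** 2).sum() / (2 * m)

    # Normal equation (closed form): theta = (X^T X)^(-1) X^T y
    theta_closed = np.linalg.solve(X.T @ X, X.T @ y)

    # Batch gradient descent: theta := theta - (alpha/m) * X^T (X theta - y)
    theta, alpha = np.zeros(X.shape[1]), 0.05
    for _ in range(5000):
        theta -= (alpha / m) * (X.T @ (h(theta, X) - y))

    print(J(theta_closed, X, y), J(theta, X, y))   # the two costs should nearly match

Both routes minimize the same cost: the normal equation is exact but scales poorly with many features, while gradient descent only needs the learning rate alpha to be small enough.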

Bit Twiddling Hacks
By Sean Eron Anderson (seander@cs.stanford.edu). Individually, the code snippets here are in the public domain (unless otherwise noted) — feel free to use them however you please. The aggregate collection and descriptions are © 1997-2005 Sean Eron Anderson. The code and descriptions are distributed in the hope that they will be useful, but WITHOUT ANY WARRANTY and without even the implied warranty of merchantability or fitness for a particular purpose.

About the operation counting methodology: when totaling the number of operations for the algorithms here, any C operator is counted as one operation.

Compute the sign of an integer:
    sign = -(v < 0);                              // if v < 0 then -1, else 0
    sign = v >> (sizeof(int) * CHAR_BIT - 1);     // same result, one less instruction (not portable)
The last expression above evaluates to sign = v >> 31 for 32-bit integers. Alternatively, if you prefer the result be either -1 or +1, then use:
    sign = +1 | (v >> (sizeof(int) * CHAR_BIT - 1));   // if v < 0 then -1, else +1
On the other hand, if you prefer the result be either -1, 0, or +1, then use:
    sign = (v != 0) | (v >> (sizeof(int) * CHAR_BIT - 1));   // -1, 0, or +1 (not portable)
    sign = (v > 0) - (v < 0);                                // -1, 0, or +1 (portable)

Detect if two integers have opposite signs:
    f = ((x ^ y) < 0);   // true iff x and y have opposite signs
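For readers following along in Python rather than C, here is a small sketch of the two portable tricks above (my own translation, not from Anderson's page); note that Python integers have arbitrary precision, so the sizeof/CHAR_BIT shift variants do not carry over directly:

    def sign(v: int) -> int:
        # -1, 0, or +1; same idea as the portable C form (v > 0) - (v < 0)
        return (v > 0) - (v < 0)

    def opposite_signs(x: int, y: int) -> bool:
        # True iff x and y have opposite signs; Python's ^ on negative ints
        # follows two's-complement semantics, so (x ^ y) < 0 still works
        return (x ^ y) < 0

    assert sign(-7) == -1 and sign(0) == 0 and sign(3) == 1
    assert opposite_signs(-2, 5) and not opposite_signs(4, 5)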

Google Launches Free Course on Deep Learning: The Science of Teaching Computers How to Teach Themselves
Last Friday, we mentioned how Google's artificial intelligence software DeepMind has the ability to teach itself many things: it can teach itself how to walk, jump and run, and even take professional pictures. The free course takes about 3 months to complete.
Related Content: Google's DeepMind AI Teaches Itself to Walk, and the Results Are Kooky, No Wait, Chilling; Learn Python: A Free Online Course from Google; Take a Free Course on Digital Photography from Stanford Prof Marc Levoy

1.4. Support Vector Machines — scikit-learn 0.17 documentation
The support vector machines in scikit-learn support both dense (numpy.ndarray, or anything convertible to that by numpy.asarray) and sparse (any scipy.sparse) sample vectors as input. However, to use an SVM to make predictions for sparse data, it must have been fit on such data. For optimal performance, use a C-ordered numpy.ndarray (dense) or scipy.sparse.csr_matrix (sparse) with dtype=float64.

1.4.1. Classification
SVC, NuSVC and LinearSVC are classes capable of performing multi-class classification on a dataset. SVC and NuSVC are similar methods, but they accept slightly different sets of parameters and have different mathematical formulations (see the section Mathematical formulation). Like other classifiers, SVC, NuSVC and LinearSVC take as input two arrays: an array X of size [n_samples, n_features] holding the training samples, and an array y of class labels (strings or integers) of size [n_samples]. After being fitted, the model can then be used to predict new values:
    >>> clf.predict([[2., 2.]])
    array([1])
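The excerpt shows only the predict call; for completeness, here is a minimal fit-then-predict sketch using the same toy data as the scikit-learn documentation example (the variable names follow that example):

    import numpy as np
    from sklearn.svm import SVC

    X = np.array([[0., 0.], [1., 1.]])   # training samples, [n_samples, n_features]
    y = np.array([0, 1])                 # class labels, [n_samples]

    clf = SVC()
    clf.fit(X, y)
    print(clf.predict([[2., 2.]]))       # -> [1], matching the output quoted above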

CS345: Data Mining, Winter 2010
Course information:
Instructors: Jure Leskovec (office hours: Wednesdays 9-10am, Gates 418); Anand Rajaraman (office hours: Tuesday/Thursday 5:30-6:30pm, after the class in the same room).
Room: Tuesday, Thursday 4:15PM - 5:30PM in 200-203 (History Corner).
Teaching assistants: Abhishek Gupta (abhig@cs.stanford.edu), Roshan Sumbaly (rsumbaly@cs.stanford.edu).
Staff mailing list: you can reach us at cs345a-win0910-staff@lists.stanford.edu
Prerequisites: CS145 or equivalent.
Materials: readings have been derived from the book Mining of Massive Datasets. Students will use the Gradiance automated homework system, for which a fee will be charged. You can see earlier versions of the notes and slides covering 2008/09 CS345a Data Mining.
Requirements: there will be periodic homeworks (some on-line, using the Gradiance system), a final exam, and a project on web mining.
Projects:
Course outline: see Handouts for a list of topics and reading materials.
Announcements: Important Dates

Introduction to Artificial Intelligence (AI)
What you will learn: in this course, you will learn how to build simple machine learning models with Azure Machine Learning; use Python and Microsoft cognitive services to work with text, speech, images, and video (a rough Python sketch of this follows below); and use the Microsoft Bot Framework to implement conversational bots.
Course syllabus: Introduction; Machine Learning – The Foundation of AI; Text and Speech – Understanding Language; Computer Vision – Seeing the World Through AI; Bots – Conversation as a Platform; Next Steps.
Do I need an Azure subscription to complete the course?
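As an illustration of the "Python and Microsoft cognitive services" item above, here is a hedged sketch of calling the Computer Vision analyze endpoint with requests; the endpoint region, API version, image URL, and key are placeholders and assumptions of mine, not taken from the course:

    import requests

    # Placeholder endpoint and key; substitute your own Azure resource values.
    ENDPOINT = "https://<your-region>.api.cognitive.microsoft.com/vision/v2.0/analyze"
    KEY = "<your-subscription-key>"

    resp = requests.post(
        ENDPOINT,
        params={"visualFeatures": "Description,Tags"},
        headers={"Ocp-Apim-Subscription-Key": KEY,
                 "Content-Type": "application/json"},
        json={"url": "https://example.com/some-image.jpg"},
    )
    resp.raise_for_status()
    print(resp.json().get("description", {}).get("captions", []))

The course's own labs may use different endpoints or SDKs; this only shows the general shape of a cognitive-services request.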
