
To Study


Reference request - What's new in purely functional data structures since Okasaki? (Theoretical Computer Science)

Barzilay2011

A Practical Guide to Support Vector Classification

Support Vector Machines (SVM) in Ruby

By Ilya Grigorik on January 07, 2008. Your Family Guy fan-site is riding a wave of viral referrals; the community has grown tenfold in the last month alone! First, you deployed an SVD recommendation system, then you optimized the site content and layout with the help of decision trees, but of course that wasn't enough, and you also added a Bayes classifier to help you filter and rank the content - no wonder the site is doing so well! The community is buzzing with action, but as with any honey pot with high traffic, the spam bots have also arrived on the scene.
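Grigorik's article goes on to train an SVM spam classifier in Ruby via libsvm bindings. As a rough sketch of the underlying idea only (in Python rather than the article's Ruby, with invented toy features and no real data), a linear SVM can be trained by sub-gradient descent on the hinge loss:

```python
# Minimal linear SVM via sub-gradient descent on the hinge loss.
# This is NOT the article's code: toy "spam" features and all names
# below are made up for illustration.

def train_linear_svm(samples, labels, lr=0.1, reg=0.01, epochs=300):
    """Labels must be +1 (spam) or -1 (ham)."""
    dim = len(samples[0])
    w = [0.0] * dim
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
            if margin < 1:
                # Point violates the margin: step toward classifying it.
                w = [wi + lr * (y * xi - reg * wi) for wi, xi in zip(w, x)]
                b += lr * y
            else:
                # Correct with room to spare: only apply regularization.
                w = [wi * (1 - lr * reg) for wi in w]
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1

# Invented features per comment: (link density, ALL-CAPS ratio).
ham  = [(0.1, 0.0), (0.2, 0.1), (0.0, 0.2)]
spam = [(0.9, 0.8), (1.0, 0.9), (0.8, 1.0)]
X, y = ham + spam, [-1] * 3 + [1] * 3
w, b = train_linear_svm(X, y)
```

A production setup would instead use a library such as libsvm (as the article does) rather than hand-rolled gradient descent, but the margin-maximizing update above is the core of the technique.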

7 Essential Resources & Tips To Get Started With Data Science

Data science is an umbrella term for a collection of techniques from many distinct areas, such as computer science, statistics, and machine learning, to name just a few. The main objective is to extract information from data and turn it into knowledge on which you can base your further decisions.

Functors, Applicatives, And Monads In Pictures

Updated: May 20, 2013. Here's a simple value: and we know how to apply a function to this value:
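The article's pictures do not survive text extraction; its opening example is applying a function to a plain value, then to a value wrapped in a context (Haskell's `Maybe`, where `fmap (+3) (Just 2)` gives `Just 5`). A sketch of that functor idea, in Python rather than the article's Haskell:

```python
# Sketch of the article's opening Functor example; class names
# mirror Haskell's Maybe (Just / Nothing).

class Just:
    """A plain value wrapped in a context."""
    def __init__(self, value):
        self.value = value
    def fmap(self, f):
        # A functor lets an ordinary function act inside the context.
        return Just(f(self.value))
    def __repr__(self):
        return f"Just {self.value!r}"

class Nothing:
    """The empty context: mapping over it changes nothing."""
    def fmap(self, f):
        return self
    def __repr__(self):
        return "Nothing"

print(Just(2).fmap(lambda x: x + 3))    # Just 5
print(Nothing().fmap(lambda x: x + 3))  # Nothing
```

The payoff, which the article develops further, is that code written against `fmap` works unchanged whether or not a value is present.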

Scheme DSL: SchemeUnit and SchemeQL

Programming in Schelog

Schelog is an embedding of Prolog-style logic programming in Scheme.

“Embedding” means you don’t lose Scheme: you can use Prolog-style and conventional Scheme code fragments alongside each other. Schelog contains the full repertoire of Prolog features, including meta-logical and second-order (“set”) predicates, leaving out only those features that could more easily and more efficiently be done with Scheme subexpressions. The Schelog implementation uses the approach to logic programming described in Felleisen [4] and Haynes [8]. In contrast to earlier Lisp simulations of Prolog [3], which used explicit continuation arguments to store failure (backtrack) information, the Felleisen and Haynes model uses the implicit reified continuations of Scheme as provided by the operator call-with-current-continuation (aka call/cc). This allows Schelog to be an embedding, i.e., logic programming is not built as a new language on top of Scheme, but is used alongside Scheme’s other features.
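Schelog itself implements failure and backtracking with Scheme's call/cc, which isn't shown here. To keep this page to one sketch language, here is only the flavor of Prolog-style search, approximated with Python generators (the facts and relation names are invented); exhausting a generator plays the role of failure-driven backtracking:

```python
# Prolog-flavored relations via generators: each yield is a choice
# point, and a caller moving to the next yield is a backtrack.
# Toy fact database, as in Prolog: parent(alice, bob). etc.
parent_facts = [("alice", "bob"), ("bob", "carol"), ("bob", "dave")]

def parents_of(child):
    """Enumerate every X satisfying parent(X, child)."""
    for p, c in parent_facts:
        if c == child:
            yield p

def grandparents_of(child):
    # grandparent(G, C) :- parent(P, C), parent(G, P).
    for p in parents_of(child):
        yield from parents_of(p)

print(list(grandparents_of("carol")))  # ['alice']
```

Generators give only a shallow imitation: Schelog's continuation-based approach also supports unification, logic variables, and cut, none of which appear in this sketch.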

Automata via Macros (Lisp)

Pattern Matching for Scheme

Deep Learning in a Nutshell: Core Concepts

This post is the first in a series I’ll be writing for Parallel Forall that aims to provide an intuitive and gentle introduction to deep learning.

It covers the most important deep learning concepts and aims to provide an understanding of each concept rather than its mathematical and theoretical details. While the mathematical terminology is sometimes necessary and can further understanding, these posts use analogies and images whenever possible to provide easily digestible bits comprising an intuitive overview of the field of deep learning. I wrote this series in a glossary style so it can also be used as a reference for deep learning concepts. Part 1 focuses on introducing the main concepts of deep learning.

Linear Algebra (Dover Books on Mathematics), Georgi E. Shilov, ISBN 9780486635187

Matrixcookbook.pdf

Reading List « Deep Learning

Linear_algebra