
AI


DiffEqFlux.jl – A Julia Library for Neural Differential Equations. In this blog post we will show you how to easily, efficiently, and robustly use differential equation (DiffEq) solvers with neural networks in Julia.

DiffEqFlux.jl – A Julia Library for Neural Differential Equations

The Neural Ordinary Differential Equations paper attracted significant attention even before it was awarded one of the Best Paper awards at NeurIPS 2018. The paper already gives many exciting results combining these two disparate fields, but this is only the beginning: neural networks and differential equations were born to be together. This blog post, a collaboration between authors of Flux, DifferentialEquations.jl, and the Neural ODEs paper, will explain why, outline current and future directions for this work, and start to give a sense of what's possible with state-of-the-art tools.
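To give a sense of the core idea, here is a framework-free sketch in Python (the post itself uses Julia and real adaptive solvers): the derivative of the hidden state is given by a small neural network, and a numerical integrator produces the layer's output. The network shape, the random untrained weights, and the plain Euler integration are all illustrative choices, not what DiffEqFlux.jl actually does.

```python
import math
import random

random.seed(0)

# A neural ODE replaces a hand-written dynamics function with a network:
# dh/dt = f_theta(h, t). Here f_theta is a tiny 2-layer MLP on a
# 2-dimensional state, with made-up random weights.

DIM, HIDDEN = 2, 8
W1 = [[random.gauss(0, 0.5) for _ in range(DIM + 1)] for _ in range(HIDDEN)]  # +1 input for t
W2 = [[random.gauss(0, 0.5) for _ in range(HIDDEN)] for _ in range(DIM)]

def f_theta(h, t):
    """Network-defined dynamics dh/dt = f_theta(h, t)."""
    x = h + [t]
    hidden = [math.tanh(sum(w * xi for w, xi in zip(row, x))) for row in W1]
    return [sum(w * hi for w, hi in zip(row, hidden)) for row in W2]

def odeint_euler(h0, t0, t1, steps=100):
    """Integrate dh/dt = f_theta(h, t) from t0 to t1 with Euler steps."""
    h, t = list(h0), t0
    dt = (t1 - t0) / steps
    for _ in range(steps):
        dh = f_theta(h, t)
        h = [hi + dt * dhi for hi, dhi in zip(h, dh)]
        t += dt
    return h

h1 = odeint_euler([1.0, 0.0], 0.0, 1.0)
print(h1)  # the "output" of the neural ODE layer for input [1.0, 0.0]
```

Training such a layer means differentiating the solve with respect to the network weights, which is exactly what the DiffEq-solver-plus-Flux pairing provides.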

MLOps

Jupyter Notebooks Gallery.

A Software Engineer’s Guide to Cybernetics.

AutoML 2.0: Is The Data Scientist Obsolete? It's an AutoML World. The world of AutoML has been proliferating over the past few years, and with a recession looming, the notion of automating the development of AI and machine learning is bound to become even more appealing.

AutoML 2.0: Is The Data Scientist Obsolete?

New platforms are available with increased capabilities and more automation. The advent of AI-powered feature engineering, which lets users automatically discover and create features for data science pipelines, is enabling a whole new approach to data science that, seemingly, threatens the role of the data scientist.

Will AutoML Be the End of Data Scientists? AutoML is exploding in popularity.
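As a toy illustration of what "AI-powered feature engineering" automates at its simplest: enumerate candidate transforms of the raw columns and rank them by how strongly they relate to the target. The column names, the transforms, and the correlation-based scoring below are all made up for this sketch; real systems search far richer feature spaces.

```python
import math
import random

random.seed(1)

# Toy automated feature engineering: generate candidate features from the
# base columns, then rank them by absolute correlation with the target.

n = 200
data = {"x1": [random.uniform(0, 10) for _ in range(n)],
        "x2": [random.uniform(1, 5) for _ in range(n)]}
# Hidden relationship the search should rediscover: y depends on x1 * x2.
y = [a * b + random.gauss(0, 0.1) for a, b in zip(data["x1"], data["x2"])]

def corr(u, v):
    """Pearson correlation of two equal-length lists."""
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = math.sqrt(sum((a - mu) ** 2 for a in u))
    sv = math.sqrt(sum((b - mv) ** 2 for b in v))
    return cov / (su * sv)

# Candidates: the raw columns plus pairwise products and ratios.
candidates = {name: col for name, col in data.items()}
for a in data:
    for b in data:
        if a < b:
            candidates[f"{a}*{b}"] = [p * q for p, q in zip(data[a], data[b])]
            candidates[f"{a}/{b}"] = [p / q for p, q in zip(data[a], data[b])]

ranked = sorted(candidates, key=lambda name: abs(corr(candidates[name], y)),
                reverse=True)
print(ranked[0])  # the interaction feature x1*x2 is rediscovered
```

The "automation" is just a search loop plus a scoring rule; the engineering effort in real products goes into the richness of the transform library and the efficiency of the search.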

Will AutoML Be the End of Data Scientists?

Here’s how that changes things. In 2012, an arXiv report on Auto-WEKA was released, describing an automated approach to selecting a machine learning algorithm, features, and hyperparameters, in the hopes that it would “help non-expert users” in the field. More recently, AutoML has exploded in popularity, with all the big tech players entering the space.

Neurons that fire together, wire together… Ok, but how? One of my little pet projects is a neurology book for psychologists, coaches and the like.
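The Auto-WEKA idea, jointly searching over which algorithm to use and how to configure it, can be sketched as a random search over (algorithm, hyperparameters) pairs scored on held-out data. The two toy models, the data, and the budget below are invented for illustration; real systems use Bayesian optimisation over much larger spaces.

```python
import random

random.seed(2)

# Toy combined algorithm selection and hyperparameter search: sample
# (algorithm, hyperparameters) pairs at random and keep the best scorer.

train = [(x / 10.0, (x / 10.0) ** 2) for x in range(-20, 21)]
test = [(x / 10.0 + 0.03, (x / 10.0 + 0.03) ** 2) for x in range(-19, 19)]

def predict(algo, params, x):
    if algo == "mean":                        # baseline: predict the train mean
        return sum(y for _, y in train) / len(train)
    if algo == "knn":                         # k-nearest-neighbour regression
        nearest = sorted(train, key=lambda p: abs(p[0] - x))[:params["k"]]
        return sum(y for _, y in nearest) / len(nearest)

def mse(algo, params):
    """Held-out mean squared error of one configuration."""
    return sum((predict(algo, params, x) - y) ** 2 for x, y in test) / len(test)

def sample_config():
    algo = random.choice(["mean", "knn"])
    params = {"k": random.randint(1, 15)} if algo == "knn" else {}
    return algo, params

best = min((sample_config() for _ in range(30)), key=lambda c: mse(*c))
print(best)  # e.g. ('knn', {'k': ...}): k-NN beats the mean baseline here
```

Everything Auto-WEKA and its successors add, from smarter search strategies to cross-validated scoring, is refinement of this same loop.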

Neurons that fire together, wire together… Ok, but how?

What most people in these fields imagine about the brain is somewhere between perplexing, preposterous and potentially poisonous. (Seriously, I could tell you stories…) So I kind of wanted to write something like this for them.

New AtHomeWithAI resources.

SOM tutorial part 1. Kohonen's Self Organizing Feature Maps: Introductory Note. This tutorial is the first of two related to self-organising feature maps.

SOM tutorial part 1

Initially, this was just going to be one big comprehensive tutorial, but work demands and other time constraints have forced me to divide it into two. Nevertheless, part one should provide you with a pretty good introduction. Certainly more than enough to whet your appetite anyway! I will appreciate any feedback you are willing to give - good or bad.

The Illustrated Self-Supervised Learning. Yann LeCun, in his talk, introduced the “cake analogy” to illustrate the importance of self-supervised learning.
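As a preview of the mechanics the SOM tutorial walks through, here is a minimal one-dimensional Kohonen map in Python: for each input, find the best-matching unit, then pull it and its neighbours on the map towards that input while the learning rate and neighbourhood radius decay. All sizes, rates, and schedules below are arbitrary choices for the sketch.

```python
import math
import random

random.seed(3)

# Minimal 1-D self-organising map over scalar inputs in [0, 1).

MAP_SIZE = 10
weights = [random.random() for _ in range(MAP_SIZE)]   # one weight per node
data = [random.random() for _ in range(200)]

def train(epochs=20, lr0=0.5, radius0=3.0):
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)                # decaying learning rate
        radius = max(radius0 * (1 - epoch / epochs), 0.5)
        for x in data:
            # Best-matching unit: the node whose weight is closest to x.
            bmu = min(range(MAP_SIZE), key=lambda i: abs(weights[i] - x))
            for i in range(MAP_SIZE):
                # Gaussian neighbourhood: nearby nodes move more.
                influence = math.exp(-((i - bmu) ** 2) / (2 * radius ** 2))
                weights[i] += lr * influence * (x - weights[i])

train()
print([round(w, 2) for w in weights])
# Neighbouring nodes end up holding similar values, i.e. the map becomes
# (roughly) topologically ordered along the line.
```

The same update generalises directly to 2-D grids and high-dimensional inputs, which is the setting the tutorial covers.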

The Illustrated Self-Supervised Learning

Though the analogy is debated (ref: Pieter Abbeel, Deep Learning for Robotics, slide 96), we have seen the impact of self-supervised learning in the natural language processing field, where recent developments (Word2Vec, GloVe, ELMo, BERT) have embraced self-supervision and achieved state-of-the-art results.
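The "self-supervised" part is easy to show concretely: in a Word2Vec-style skip-gram setup, the labels are manufactured from the raw text itself, with every (centre word, context word) pair inside a window becoming a free training example, no human annotation needed. A minimal pair generator (window size and corpus are arbitrary):

```python
# Skip-gram training pairs: the supervision signal comes from the text.

corpus = "the quick brown fox jumps over the lazy dog".split()

def skipgram_pairs(tokens, window=2):
    """Yield every (centre word, context word) pair within the window."""
    pairs = []
    for i, centre in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((centre, tokens[j]))
    return pairs

pairs = skipgram_pairs(corpus)
print(pairs[:4])
# [('the', 'quick'), ('the', 'brown'), ('quick', 'the'), ('quick', 'brown')]
```

A model trained to predict the second element of each pair from the first learns word embeddings as a by-product; that is the whole trick that BERT-style models scale up.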

Self-Driving Car

Emergent Tool Use from Multi-Agent Interaction. In our environment, agents play a team-based hide-and-seek game.

Emergent Tool Use from Multi-Agent Interaction

Introduction to Genetic Algorithms. When you're solving a problem, how do you know if the answer you've found is correct?

Introduction to Genetic Algorithms

In many domains, there is a single correct answer. A mathematical function may have a global maximum or other well-defined attributes. However, other problems, like how a cell behaves in a petri dish, do not have clear solutions. Enter evolution, which does not design towards a known solution but optimizes around constraints. Genetic algorithms are a specific approach to optimization problems that can estimate known solutions and simulate evolutionary behavior in complex systems. This article will briefly discuss the terms and concepts required to understand genetic algorithms and then provide two examples. Structure of a Genetic Algorithm: genetic algorithms vary in their structure based on their purpose, but all of them share a few common components.

Modern SAT solvers: fast, neat and underused (part 1 of N) — The Coding Nest.
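Those shared components, a population, a fitness function, selection, crossover, and mutation, fit in a short sketch. The task below (maximising the number of 1s in a bit string) and all the rates are toy choices made for illustration:

```python
import random

random.seed(4)

# A minimal genetic algorithm with the standard components.

GENOME_LEN, POP_SIZE, GENERATIONS, MUT_RATE = 20, 30, 40, 0.02

def fitness(genome):
    """Toy objective: count the 1 bits."""
    return sum(genome)

def select(population):
    """Tournament selection: the fitter of two random individuals."""
    a, b = random.sample(population, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(mum, dad):
    """Single-point crossover of two parent genomes."""
    cut = random.randrange(1, GENOME_LEN)
    return mum[:cut] + dad[cut:]

def mutate(genome):
    """Flip each bit independently with probability MUT_RATE."""
    return [1 - g if random.random() < MUT_RATE else g for g in genome]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    population = [mutate(crossover(select(population), select(population)))
                  for _ in range(POP_SIZE)]

best = max(population, key=fitness)
print(fitness(best))  # close to GENOME_LEN after a few dozen generations
```

Swapping in a different `fitness` function (say, a simulation score for the petri-dish case) is all it takes to point the same machinery at another problem.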

Before I started doing research for the Intelligent Data Analysis (IDA) group at FEE CTU, I saw SAT solvers as academically interesting but didn't think they had many practical uses outside of other academic applications.
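For readers who have not met them: a SAT solver takes a propositional formula in conjunctive normal form and either finds a satisfying assignment or reports that none exists. A toy DPLL-style solver fits in a page (clauses are lists of non-zero integers in the DIMACS style, so 3 means variable 3 is true and -3 that it is false); real solvers layer watched literals, clause learning, and restart heuristics on this skeleton.

```python
# A toy DPLL-style SAT solver over integer-literal clauses.

def dpll(clauses, assignment=()):
    # Simplify the clause set under the current partial assignment.
    simplified = []
    for clause in clauses:
        if any(lit in assignment for lit in clause):
            continue                                   # clause already satisfied
        rest = [lit for lit in clause if -lit not in assignment]
        if not rest:
            return None                                # clause falsified: backtrack
        simplified.append(rest)
    if not simplified:
        return assignment                              # every clause satisfied
    lit = simplified[0][0]                             # branch on some literal
    return (dpll(simplified, assignment + (lit,))
            or dpll(simplified, assignment + (-lit,)))

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
model = dpll([[1, 2], [-1, 3], [-2, -3]])
print(model)  # → (1, 3, -2): x1 true, x3 true, x2 false
```

An unsatisfiable input such as `dpll([[1], [-1]])` returns `None`, since x1 cannot be both true and false.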

Modern SAT solvers: fast, neat and underused (part 1 of N) — The Coding Nest

After spending ~1.5 years working with them, I have to say that modern SAT solvers are fast, neat and criminally underused by the industry.

Flux – What Is Differentiable Programming? “With four parameters I can fit an elephant, and with five I can make him wiggle his trunk.” – John von Neumann. The idea of “differentiable programming” is coming up a lot in the machine learning world.

Flux – What Is Differentiable Programming?

To many, it’s not clear if this term reflects a real shift in how researchers think about machine learning, or is just (another) rebranding of “deep learning”. This post clarifies what new things differentiable programming (or DP) brings to the machine learning table. Most importantly, differentiable programming is actually a shift in the opposite direction from the one deep learning has taken: from increasingly heavily parameterised models towards simpler ones that take more advantage of problem structure. Brute Force with Benefits. Differentiability is the core idea that makes deep learning so successful. What about biological neurons and y = σ(Wx + b)?

QUT Robot Academy.

How to visualize decision trees. Terence Parr and Prince Grover (Terence teaches in the University of San Francisco’s MS in Data Science program and Prince is an alumnus).
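Returning to the differentiable-programming point above, here is a tiny forward-mode toy in Python (not how Flux actually works; its Zygote engine does source-to-source reverse mode): run ordinary code on "dual numbers" that carry a derivative along with each value, and any program written against them becomes differentiable, loops included.

```python
# Forward-mode automatic differentiation via dual numbers.

class Dual:
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)
    __rmul__ = __mul__
    def __sub__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value - other.value, self.deriv - other.deriv)

def program(w):
    # An arbitrary little program: a loop and some arithmetic, computing a
    # squared error against points lying on y = 2x + 1.
    total = Dual(0.0)
    for x, y in [(1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]:
        err = w * x + 1.0 - y
        total = total + err * err
    return total

# Gradient descent on the program's output with respect to w.
w = 0.0
for _ in range(100):
    grad = program(Dual(w, 1.0)).deriv   # d(program)/dw at the current w
    w -= 0.01 * grad
print(round(w, 2))  # ≈ 2.0, the slope of the data
```

The point of DP is that `program` can be any structured code (a simulator, a renderer, an ODE solver), not just a stack of layers; the problem structure lives in the program, and the learned parameters can stay few.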

Datasets

NLP. Math. Python Stack. RL. Keras. Deployment. PyTorch. Watson. Courses. Gaming. Neural Networks.

Kaggle

TensorFlow. Deep Learning. Machine Learning. Business. Prolog. Issues. A Gentle Introduction to Data Structures: How Graphs Work.
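On the final graphs item: the usual starting point is the adjacency-list representation, where each vertex maps to its neighbours, after which traversals such as breadth-first search take only a few lines. A minimal sketch (the edge list is invented for the example):

```python
from collections import defaultdict, deque

# A graph as an adjacency list, plus shortest-path BFS.

edges = [("A", "B"), ("A", "C"), ("B", "D"), ("C", "D"), ("D", "E")]
adj = defaultdict(list)
for u, v in edges:            # undirected: record both directions
    adj[u].append(v)
    adj[v].append(u)

def bfs_path(start, goal):
    """Shortest path by edge count, via breadth-first search."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in adj[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(bfs_path("A", "E"))  # ['A', 'B', 'D', 'E']
```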