
Machine Learning (Theory)


The Future of Machine Intelligence Ben Goertzel March 20, 2009 In early March 2009, 100 intellectual adventurers journeyed from various corners of Europe, Asia, America and Australasia to the Crowne Plaza Hotel in Arlington, Virginia, to take part in the Second Conference on Artificial General Intelligence, AGI-09: a conference aimed explicitly at the grand goal of the AI field, the creation of thinking machines with general intelligence at the human level and ultimately beyond. While the majority of the crowd hailed from academic institutions, major firms like Google, GE, AT&T and Autodesk were also represented, along with a substantial contingent of entrepreneurs involved with AI startups, and independent researchers. Since I was the chair of the conference and played a large role in its organization – along with a number of extremely competent and passionate colleagues – my opinion must be considered rather subjective ... but, be that as it may, my strong feeling is that the conference was an unqualified success!

How to Get Rich Programming I originally discovered the fiendishly addictive Tower Defense as a multiplayer game modification for Warcraft III. It's a cooperative game mode where you, and a few other players, are presented with a simple maze. A group of monsters appears at the entrance and trudges methodically toward the exit. I can't explain exactly what makes Tower Defense so addictive, but man, is it ever. I suppose it was inevitable that this new, addictive Tower Defense game mode would jump from the select audience of gamers with gaming-class PCs to simpler Flash implementations everyone can enjoy. Warning: before clicking on that link, allow me to reiterate: tower defense is addictive! You'd be surprised how much money you can make by creating a Flash game and giving it away for free on the internet. So minus the time he spent programming, and his nominal hosting fees, Paul Preece is clearing almost $100,000 "salary" per year with Desktop Tower Defense.

Face mask detection with ML/AI on Cisco industrial hardware - Cisco Blogs Imagine you’ve been asked to create an architecture that can apply analytics to very voluminous data such as video streams generated from cameras. Given the volume and sensitivity of the data, you don’t want to send it off-premises for analysis. Also, the anticipated cost of centralizing the data might invalidate your business value assessment for your use case. You could apply machine learning (ML) or artificial intelligence (AI) at the edge—but only if you can make it work with the available compute resources. This is the exact challenge I recently tackled with the help of my colleague, Michael Wielpuetz. It’s not always easy or even possible to change or scale the available compute resources in a typical edge scenario. To provide food for thought, to incubate new ideas, and to prove feasibility, Michael Wielpuetz and I started in our free time to minimize the resource requirements of an exemplary setup. See how the standard Docker images compare to our base image.
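The constraint the excerpt describes — edge ML works "only if you can make it work with the available compute resources" — comes down to a throughput budget. A minimal back-of-envelope sketch, where the function name and every number (frame rate, per-frame cost, device rating, sustained-utilization discount) are illustrative assumptions, not figures from the Cisco post:

```python
def fits_on_edge(frames_per_sec, flops_per_frame, device_peak_flops,
                 sustained_fraction=0.5):
    """Rough feasibility check for running a vision model at the edge.

    sustained_fraction discounts the device's peak rating, since edge
    hardware rarely sustains its theoretical throughput.
    Returns (feasible, required FLOPS, available FLOPS).
    """
    required = frames_per_sec * flops_per_frame
    available = device_peak_flops * sustained_fraction
    return required <= available, required, available

# Hypothetical numbers: a small detector (~1 GFLOP per frame) at 15 fps
# on an industrial gateway rated at 50 GFLOPS.
ok, req, avail = fits_on_edge(15, 1e9, 50e9)
```

If the check fails, the levers are the ones the authors pull in the post: shrink the model, drop the frame rate, or slim the runtime (their Docker base-image comparison attacks the memory and storage side of the same budget).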

Disconnecting Distraction Note: The strategy described at the end of this essay didn't work. It would work for a while, and then I'd gradually find myself using the Internet on my work computer. I'm trying other strategies now, but I think this time I'll wait till I'm sure they work before writing about them. May 2008 Procrastination feeds on distractions. So one way to beat procrastination is to starve it of distractions. Chesterfield described dirt as matter out of place. Television, for example, has after 50 years of refinement reached the point where it's like visual crack. TV is in decline now, but only because people have found even more addictive ways of wasting time. I remember when computers were, for me at least, exclusively for work. After years of carefully avoiding classic time sinks like TV, games, and Usenet, I still managed to fall prey to distraction, because I didn't realize that it evolves. The problem is a hard one to solve because most people still need the Internet for some things. Wow.

Perspectives on Self-serve Machine Learning for Rapid Insights in Healthcare BigML users keep inspiring us with their creativity every day. Many take on Machine Learning with little to no background or education in the field. Why? Because they have access to relevant data and they are smart professionals with the kind of intuition only years of study in a certain field can bring about. It’s no surprise, then, that many of them come to suspect that there must be a better way than the good old descriptive spreadsheet analytics or plain vanilla business intelligence tool reports to solve their business problems. A natural curiosity and a self-starter attitude to actively experiment don’t hurt either! Long-time BigML user Dr. Can you please tell us about your background and how you first developed an interest in Machine Learning? Hope you enjoyed this interview and found something useful to directly apply in your projects.

The Generalized Anti-Zombie Principle Followup to: Zombies! Zombies? "Each problem that I solved became a rule which served afterwards to solve other problems." —René Descartes, Discours de la Méthode "Zombies" are putatively beings that are atom-by-atom identical to us, governed by all the same third-party-visible physical laws, except that they are not conscious. Though the philosophy is complicated, the core argument against zombies is simple: When you focus your inward awareness on your inward awareness, soon after your internal narrative (the little voice inside your head that speaks your thoughts) says "I am aware of being aware", and then you say it out loud, and then you type it into a computer keyboard, and create a third-party-visible blog post. Consciousness, whatever it may be—a substance, a process, a name for a confusion—is not epiphenomenal; your mind can catch the inner listener in the act of listening, and say so out loud. It appears to me that in the case above, the answer is yes. This is a large step.

Theodore Roosevelt - Biography Theodore Roosevelt (October 27, 1858–January 6, 1919) was born in New York into one of the old Dutch families which had settled in America in the seventeenth century. At eighteen he entered Harvard College and spent four years there, dividing his time between books and sport and excelling at both. After leaving Harvard he studied in Germany for almost a year and then immediately entered politics. He was elected to the Assembly of New York State, holding office for three years and distinguishing himself as an ardent reformer. In 1884, because of ill health and the death of his wife, Roosevelt abandoned his political work for some time. President Harrison, after taking office in 1889, appointed Roosevelt a member of the Civil Service Commission, of which he later became president. Elected governor of the state of New York in 1898, he invested his two-year administration with the vigorous and businesslike characteristics which were his hallmark. Copyright © The Nobel Foundation 1906

Rewriting the rules of machine-generated art | MIT News | Massachusetts Institute of Technology Horses don’t normally wear hats, and deep generative models, or GANs, don’t normally follow rules laid out by human programmers. But a new tool developed at MIT lets anyone go into a GAN and tell the model, like a coder, to put hats on the heads of the horses it draws. In a new study appearing at the European Conference on Computer Vision this month, researchers show that the deep layers of neural networks can be edited, like so many lines of code, to generate surprising images no one has seen before. “GANs are incredible artists, but they’re confined to imitating the data they see,” says the study’s lead author, David Bau, a PhD student at MIT. “If we can rewrite the rules of a GAN directly, the only limit is human imagination.” Generative adversarial networks, or GANs, pit two neural networks against each other to create hyper-realistic images and sounds. But the new study suggests that big datasets are not essential. “We’re like prisoners to our training data,” he says.
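The adversarial setup the article names — two networks pitted against each other — can be sketched at toy scale. Everything below is an illustrative assumption, not the MIT rewriting method: a 1-D "dataset" drawn from a Gaussian, a linear generator, a logistic discriminator, and hand-derived gradient-ascent updates on the standard non-saturating GAN objectives.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

def train_toy_gan(steps=2000, batch=64, lr=0.05, real_mean=4.0):
    """Minimal 1-D GAN with hand-derived gradients.

    Generator:     x_fake = a*z + b, with noise z ~ N(0, 1)
    Discriminator: p = sigmoid(w*x + c), p = P(x is real)
    """
    a, b = 1.0, 0.0        # generator parameters
    w, c = 0.1, 0.0        # discriminator parameters
    for _ in range(steps):
        x_real = rng.normal(real_mean, 0.5, batch)
        z = rng.normal(0.0, 1.0, batch)
        x_fake = a * z + b

        # Discriminator ascent on log p(real) + log(1 - p(fake)).
        p_real = sigmoid(w * x_real + c)
        p_fake = sigmoid(w * x_fake + c)
        dw = np.mean((1 - p_real) * x_real) - np.mean(p_fake * x_fake)
        dc = np.mean(1 - p_real) - np.mean(p_fake)
        w += lr * dw
        c += lr * dc

        # Generator ascent on log p(fake) (non-saturating objective):
        # push samples toward where the discriminator says "real".
        p_fake = sigmoid(w * x_fake + c)
        a += lr * np.mean((1 - p_fake) * w * z)
        b += lr * np.mean((1 - p_fake) * w)
    return a, b, w, c

a, b, w, c = train_toy_gan()
fake_mean = b  # E[a*z + b] = b, since z has zero mean
```

After training, the generator's output mean drifts toward the real distribution's mean — exactly the "confined to imitating the data they see" behavior the researchers are pointing at: the generator learns nothing the data (here, a single Gaussian) does not contain.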
