
Image evolution
What is this? A simulated-annealing-like optimization algorithm, a reimplementation of Roger Alsing's excellent idea. The goal is to represent an image as a collection of overlapping polygons of various colors and transparencies. We start with 50 random polygons that are invisible. In each optimization step we randomly modify one parameter (such as a color component or a polygon vertex) and check whether the new variant looks more like the original image. If it does, we keep it and continue mutating it instead. Fitness is the sum of pixel-by-pixel differences from the original image. This implementation is based on Roger Alsing's description, though not on his code.

How does it look after some time?

50 polygons (4-vertex): ~15 minutes, 644 beneficial mutations, 6,120 candidates, 88.74% fitness
50 polygons (6-vertex): ~15 minutes, 646 beneficial mutations, 6,024 candidates, 89.04% fitness
50 polygons (10-vertex): ~15 minutes, 645 beneficial mutations, 5,367 candidates, 87.01% fitness

Requirements
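The mutate-and-keep loop described above can be sketched in a few dozen lines. This is not the author's or Roger Alsing's code: to avoid needing a polygon rasterizer, the sketch uses axis-aligned rectangles as a stand-in for polygons, and all names (`render`, `mutate`, `evolve`, the canvas size) are illustrative assumptions. The core idea is the same: mutate one candidate parameter at random, keep the mutation only if fitness improves.

```python
import random

import numpy as np

W, H, N_SHAPES = 32, 32, 10   # tiny grayscale canvas; the article uses 50 polygons

def render(shapes):
    """Alpha-composite shapes onto a white canvas.

    Each shape is [x0, y0, x1, y1, gray, alpha]; rectangles stand in
    for the article's polygons so no rasterizer is needed."""
    img = np.full((H, W), 255.0)
    for x0, y0, x1, y1, gray, alpha in shapes:
        img[y0:y1, x0:x1] = (1 - alpha) * img[y0:y1, x0:x1] + alpha * gray
    return img

def fitness(shapes, target):
    """Sum of per-pixel absolute differences from the target (lower is better)."""
    return np.abs(render(shapes) - target).sum()

def random_shape():
    x0, y0 = random.randrange(W), random.randrange(H)
    x1, y1 = random.randrange(x0 + 1, W + 1), random.randrange(y0 + 1, H + 1)
    return [x0, y0, x1, y1, random.uniform(0, 255), random.uniform(0, 1)]

def mutate(shapes):
    """Copy the candidate and randomly change one parameter of one shape."""
    new = [s[:] for s in shapes]
    s = random.choice(new)
    i = random.randrange(6)
    if i < 4:
        s[:4] = random_shape()[:4]   # re-roll geometry together so it stays valid
    elif i == 4:
        s[4] = random.uniform(0, 255)   # gray level
    else:
        s[5] = random.uniform(0, 1)     # transparency
    return new

def evolve(target, steps=2000, seed=0):
    """Hill climb: accept a mutation only when it improves fitness."""
    random.seed(seed)
    best = [random_shape() for _ in range(N_SHAPES)]
    best_fit = fitness(best, target)
    for _ in range(steps):
        candidate = mutate(best)
        f = fitness(candidate, target)
        if f < best_fit:              # a "beneficial mutation" in the text's terms
            best, best_fit = candidate, f
    return best, best_fit
```

For example, `evolve(np.tile(np.linspace(0, 255, W), (H, 1)), steps=2000)` approximates a horizontal gradient; fitness never increases across steps, since only improving candidates are kept.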

Top 50 Free Open Source Classes on Computer Science : Comtechtor

Computer science is an interesting field to go into. There are a number of opportunities in computer science that you can take advantage of. With computers increasingly becoming a regular part of life, those who can work with them have good opportunities. A program in computer science can lead to a good salary, as long as you are careful to keep up your skills.

Introduction to Computer Science: Learn the basics of computer science, and get a foundation in how computer science works.
Introduction to Computer Science: Learn about the history of computing, as well as the development of computer languages.
Comprehensive Computer Science Collections: If you are interested in courses that are a little more comprehensive in nature, you can get a good feel for computer science from the following collections.
Programming and Languages: Get a handle on computer programming, and learn about the different computer languages used in programming.
Computer Software
Computer Processes and Data

The Running Man, Revisited

Comparison of mechanical energy expended between walking and running. Credit: Daniel Lieberman

Ann Trason, Scott Jurek, Matt Carpenter. These are the megastars of ultra-distance running, athletes who pound out not just marathons but dozens of them back-to-back, over Rocky Mountain passes and across the scorching floor of Death Valley. But a handful of scientists think that these ultra-marathoners are using their bodies just as our hominid forebears once did, a theory known as the endurance running hypothesis (ER). Our toes, for instance, are shorter and stubbier than those of nearly all other primates, including chimpanzees, a trait that has long been attributed to our committed bipedalism. "If you have very long toes, the moment of force acting on the foot's metatarsal phalangeal joint becomes problematic when running," explains Lieberman. The paper earned the cover of Nature and generated quite a stir within bio/anthro circles.

Ada Lovelace

Augusta Ada King, Countess of Lovelace (10 December 1815 – 27 November 1852), born Augusta Ada Byron and now commonly known as Ada Lovelace, was an English mathematician and writer chiefly known for her work on Charles Babbage's early mechanical general-purpose computer, the Analytical Engine. Her notes on the engine include what is recognised as the first algorithm intended to be carried out by a machine. Because of this, she is often described as the world's first computer programmer.[1][2][3] Ada described her approach as "poetical science" and herself as an "Analyst (& Metaphysician)". As a young adult, her mathematical talents led her to an ongoing working relationship and friendship with fellow British mathematician Charles Babbage, and in particular Babbage's work on the Analytical Engine.

Biography: Childhood. [Portrait: Ada, aged four.] On 16 January 1816, Annabella, at George's behest, left for her parents' home at Kirkby Mallory, taking one-month-old Ada with her. Adult years.

Psychology Today: Ten Politically Incorrect Truths About Human Nature

Human nature is one of those things that everybody talks about but no one can define precisely. Every time we fall in love, fight with our spouse, get upset about the influx of immigrants into our country, or go to church, we are, in part, behaving as a human animal with our own unique evolved nature—human nature. This means two things. First, our thoughts, feelings, and behavior are produced not only by our individual experiences and environment in our own lifetime but also by what happened to our ancestors millions of years ago. Second, our thoughts, feelings, and behavior are shared, to a large extent, by all men or women, despite seemingly large cultural differences. Human behavior is a product both of our innate human nature and of our individual experience and environment. The implications of some of the ideas in this article may seem immoral, contrary to our ideals, or offensive. Why Beautiful People Have More Daughters.

Entropy (information theory)

Entropy is a measure of the unpredictability of information content. This definition of "entropy" was introduced by Claude E. Shannon. Consider the example of a coin toss: a single toss of a fair coin has an entropy of one bit, and a series of two fair coin tosses has an entropy of two bits. English text has fairly low entropy. If a compression scheme is lossless (that is, you can always recover the entire original message by decompressing), then a compressed message has the same quantity of information as the original, but communicated in fewer characters. Shannon's theorem also implies that no lossless compression scheme can compress all messages. Named after Boltzmann's H-theorem, Shannon defined the entropy H (Greek letter Eta) of a discrete random variable X with possible values {x1, ..., xn} and probability mass function P(X) as H(X) = E[I(X)] = E[−log(P(X))], where E is the expected value operator and I is the information content of X.[8][9] I(X) is itself a random variable.
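The definition H(X) = E[−log P(X)] reduces, for a discrete distribution, to the familiar sum −Σ p log₂ p. A minimal sketch (the function name `entropy` is mine, not from any particular library) reproduces the excerpt's coin-toss figures:

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H(X) = -sum(p * log_b(p)) in units determined by base
    (base 2 gives bits). Zero-probability outcomes contribute nothing."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin: two equally likely outcomes -> 1 bit.
h_coin = entropy([0.5, 0.5])
# Two independent fair tosses: four equally likely outcomes -> 2 bits.
h_two = entropy([0.25, 0.25, 0.25, 0.25])
# A biased coin is more predictable, so its entropy is lower.
h_biased = entropy([0.9, 0.1])   # ≈ 0.469 bits
```

The biased-coin case illustrates why English text has low entropy: letter frequencies are far from uniform, so each character carries fewer bits than the alphabet size alone would suggest.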

Antikythera mechanism

The Antikythera mechanism (Fragment A – front and back). The Antikythera mechanism (/ˌæntɨkɨˈθɪərə/ ANT-i-ki-THEER-ə or /ˌæntɨˈkɪθərə/ ANT-i-KITH-ə-rə) is an ancient analog computer[1][2][3][4] designed to predict astronomical positions and eclipses. It was recovered in 1900–1901 from the Antikythera wreck, a shipwreck off the Greek island of Antikythera.[5] Although the computer's construction has been attributed to the Greeks and dated to the early 1st century BC, its significance and complexity were not understood until the 1970s, when it was analyzed with modern X-ray technology. Technological artifacts approaching its complexity and workmanship did not appear again until the 14th century, when mechanical astronomical clocks began to be built in Western Europe.[6] The mechanism was housed in a wooden box approximately 340 × 180 × 90 mm in size and comprised 30 bronze gears (although more could have been lost).

Origins and discovery. Gearing.

Analytical Engine

Trial model of a part of the Analytical Engine, built by Babbage, as displayed at the Science Museum (London).[1]

The Analytical Engine was a proposed mechanical general-purpose computer designed by English mathematician Charles Babbage.[2] It was first described in 1837 as the successor to Babbage's Difference Engine, a design for a mechanical computer. The Analytical Engine incorporated an arithmetic logic unit, control flow in the form of conditional branching and loops, and integrated memory, making it the first design for a general-purpose computer that could be described in modern terms as Turing-complete.[3][4] Babbage was never able to complete construction of any of his machines due to conflicts with his chief engineer and inadequate funding.[5][6] It was not until the 1940s that the first general-purpose computers were actually built.

Design: two types of punched cards were used to program the machine. Construction. Instruction set. Influence.

Gödel's incompleteness theorems

Gödel's incompleteness theorems are two theorems of mathematical logic that establish inherent limitations of all but the most trivial axiomatic systems capable of doing arithmetic. The theorems, proven by Kurt Gödel in 1931, are important both in mathematical logic and in the philosophy of mathematics. The two results are widely, but not universally, interpreted as showing that Hilbert's program to find a complete and consistent set of axioms for all mathematics is impossible, giving a negative answer to Hilbert's second problem. The first incompleteness theorem states that no consistent system of axioms whose theorems can be listed by an "effective procedure" (i.e., any sort of algorithm) is capable of proving all truths about the relations of the natural numbers (arithmetic).

Background. Many theories of interest include an infinite set of axioms, however. A formal theory is said to be effectively generated if its set of axioms is a recursively enumerable set. By the diagonal lemma, for each formula F there is a sentence p such that p ↔ F(G(p)) is provable, where G(p) denotes the Gödel number of p.
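The fragment p ↔ F(G(p)) is the self-reference step at the heart of the proof. Spelled out (a standard statement of the diagonal lemma, not taken from the excerpt itself):

```latex
% Diagonal lemma: for every formula F(x) with one free variable,
% an effectively generated theory T containing enough arithmetic
% proves, for some sentence p,
\[
  T \vdash \; p \leftrightarrow F(\ulcorner p \urcorner)
\]
% where \ulcorner p \urcorner is the Gödel number G(p) of p.
% Taking F(x) to mean "the formula with Gödel number x is not
% provable in T" yields a sentence asserting its own unprovability,
% from which the first incompleteness theorem follows.
```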
