Evolutionary computation

In computer science, evolutionary computation is a subfield of artificial intelligence (more particularly computational intelligence) that involves continuous and combinatorial optimization problems. Its algorithms can be considered global optimization methods with a metaheuristic or stochastic optimization character and are mostly applied to black-box problems (no derivatives known), often in the context of expensive optimization. Evolutionary computation uses iterative progress, such as growth or development in a population. This population is then selected in a guided random search using parallel processing to achieve the desired end. As evolution can produce highly optimised processes and networks, it has many applications in computer science.

History

The use of Darwinian principles for automated problem solving originated in the 1950s. Evolutionary programming was introduced by Lawrence J. Fogel.

Techniques

Evolutionary algorithms

Software

See also
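The population-based, derivative-free search described above can be illustrated with one of the simplest evolutionary methods, a (1+1) evolution strategy. This is a minimal sketch, not any specific system from the literature; the objective function and all parameter values are illustrative assumptions.

```python
import random

def one_plus_one_es(objective, x0, sigma=0.5, iterations=500, seed=0):
    """Minimize a black-box objective with a (1+1) evolution strategy:
    mutate the single parent and keep the child only if it is no worse.
    No derivatives of the objective are ever used."""
    rng = random.Random(seed)
    parent = list(x0)
    best = objective(parent)
    for _ in range(iterations):
        # Mutation: Gaussian perturbation of every coordinate.
        child = [x + rng.gauss(0, sigma) for x in parent]
        score = objective(child)
        if score <= best:  # Selection: the fitter point survives.
            parent, best = child, score
    return parent, best

# Black-box sphere function: the optimizer only sees input/output pairs.
solution, value = one_plus_one_es(lambda v: sum(x * x for x in v), [3.0, -2.0])
```

Because acceptance is monotone (a child replaces the parent only when it is no worse), the best objective value can only decrease over the iterations.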
Evolutionary art

Artificial Evolution of the Cyprus Problem (2005) is an artwork created by Genco Gulan.

Evolutionary art is created using a computer. The process starts with a population of many randomly generated individual representations of artworks. Evolutionary art is a branch of generative art, characterized by the use of evolutionary principles and natural selection as its generative procedure. In common with natural selection and animal husbandry, the members of a population undergoing artificial evolution modify their form or behavior over many reproductive generations in response to a selective regime. In interactive evolution the selective regime may be applied by the viewer explicitly, by selecting individuals which are aesthetically pleasing.

See also

Further reading

Conferences

"Evomusart. 1st International Conference and 10th European Event on Evolutionary and Biologically Inspired Music, Sound, Art and Design"

External links
Genetic algorithm

The 2006 NASA ST5 spacecraft antenna. This complicated shape was found by an evolutionary computer design program to create the best radiation pattern.

Genetic algorithms find application in bioinformatics, phylogenetics, computational science, engineering, economics, chemistry, manufacturing, mathematics, physics, pharmacometrics and other fields.

Methodology

In a genetic algorithm, a population of candidate solutions (called individuals, creatures, or phenotypes) to an optimization problem is evolved toward better solutions. A typical genetic algorithm requires: a genetic representation of the solution domain, and a fitness function to evaluate the solution domain. Once the genetic representation and the fitness function are defined, a GA proceeds to initialize a population of solutions and then to improve it through repetitive application of the mutation, crossover, inversion and selection operators.

Initialization

Selection

Genetic operators
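The methodology above (representation, fitness function, then repeated selection, crossover and mutation) can be sketched with a toy genetic algorithm. The bitstring representation, the OneMax fitness function, and all parameter values here are illustrative assumptions, not part of any particular GA library.

```python
import random

rng = random.Random(1)

def fitness(bits):
    """Fitness function: count of 1-bits (the classic OneMax problem)."""
    return sum(bits)

def tournament(pop, k=3):
    """Selection: the best of k randomly drawn individuals."""
    return max(rng.sample(pop, k), key=fitness)

def crossover(a, b):
    """One-point crossover of two parent bitstrings."""
    point = rng.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(bits, rate=0.02):
    """Mutation: flip each bit independently with a small probability."""
    return [1 - b if rng.random() < rate else b for b in bits]

def run_ga(length=32, pop_size=40, generations=60):
    # Initialization: a random population over the genetic representation.
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop = [mutate(crossover(tournament(pop), tournament(pop)))
               for _ in range(pop_size)]
    return max(pop, key=fitness)

best = run_ga()
```

Tournament selection biases reproduction toward fitter bitstrings, so the population drifts toward the all-ones optimum over the generations.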
Interactive evolutionary computation

Interactive evolutionary computation (IEC) or aesthetic selection is a general term for methods of evolutionary computation that use human evaluation. Usually human evaluation is necessary when the form of the fitness function is not known (for example, visual appeal or attractiveness, as in Dawkins, 1986) or when the result of optimization should fit a particular user preference (for example, the taste of coffee or the color set of a user interface).

IEC design issues

The number of evaluations that IEC can receive from one human user is limited by user fatigue, which has been reported by many researchers as a major problem. However, IEC implementations that can concurrently accept evaluations from many users overcome this limitation.

IEC types

IEC methods include interactive evolution strategy, interactive genetic algorithm (IGA), interactive genetic programming, and human-based genetic algorithm.

See also

References

External links
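The defining trait of IEC, a fitness function supplied by a human, can be sketched by making the evaluator a pluggable callable. This is a hypothetical skeleton: in a real IEC system `evaluate` would prompt a viewer for ratings; the stand-in used below simply simulates a consistent preference.

```python
import random

def interactive_evolve(population, evaluate, n_keep=2, mutate=None,
                       generations=5, seed=0):
    """Generic interactive-evolution loop. `evaluate` stands in for human
    judgment (e.g. aesthetic ratings); any callable works, which also lets
    many users' scores be merged before being passed in."""
    rng = random.Random(seed)
    mutate = mutate or (lambda x: x + rng.gauss(0, 1))
    for _ in range(generations):
        # Ask the (human) evaluator to score every candidate.
        ranked = sorted(population, key=evaluate, reverse=True)
        parents = ranked[:n_keep]  # keep the preferred candidates (elitism)
        # Refill the population by mutating the preferred candidates.
        population = parents + [mutate(rng.choice(parents))
                                for _ in range(len(population) - n_keep)]
    return max(population, key=evaluate)

# Simulated preference: the "viewer" likes values near 10.
best = interactive_evolve([0.0, 1.0, 2.0, 3.0],
                          evaluate=lambda x: -abs(x - 10))
```

Keeping `generations` small reflects the user-fatigue constraint discussed above: each generation costs a full round of human evaluations.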
Neuroevolution of augmenting topologies

NeuroEvolution of Augmenting Topologies (NEAT) is a genetic algorithm for the generation of evolving artificial neural networks (a neuroevolution technique) developed by Ken Stanley in 2002 while at The University of Texas at Austin. It alters both the weighting parameters and structures of networks, attempting to find a balance between the fitness of evolved solutions and their diversity. It is based on applying three key techniques: tracking genes with history markers to allow crossover among topologies, applying speciation (the evolution of species) to preserve innovations, and developing topologies incrementally from simple initial structures ("complexifying").

Performance

On simple control tasks, the NEAT algorithm often arrives at effective networks more quickly than other contemporary neuro-evolutionary techniques and reinforcement learning methods.

Complexification

Implementation

Extensions to NEAT

rtNEAT

Phased Pruning

HyperNEAT
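The first of the three techniques, history markers (innovation numbers) enabling crossover between different topologies, can be sketched as follows. This is a deliberately simplified illustration: genomes are reduced to maps from innovation number to connection weight, whereas real NEAT genes also carry endpoint nodes and enabled flags.

```python
import random

def neat_crossover(fitter, weaker, rng=random.Random(0)):
    """Align connection genes by their historical innovation numbers
    (NEAT's history markers). Matching genes are inherited randomly from
    either parent; disjoint and excess genes come from the fitter parent."""
    child = {}
    for innovation, gene in fitter.items():
        if innovation in weaker and rng.random() < 0.5:
            child[innovation] = weaker[innovation]  # matching gene, other parent
        else:
            child[innovation] = gene  # matching (this parent) or disjoint/excess
    return child

# Hypothetical genomes: innovation number -> connection weight.
parent_a = {1: 0.5, 2: -1.2, 4: 0.9}  # the fitter parent
parent_b = {1: 0.4, 2: 0.3, 3: 1.1}
child = neat_crossover(parent_a, parent_b)
```

Because genes are aligned by shared history rather than by position, two parents with different structures can still recombine meaningfully, which is what makes crossover among topologies possible.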
Condition of possibility Condition of possibility (Bedingungen der Möglichkeit) is a philosophical concept made popular by Immanuel Kant. A condition of possibility is a necessary framework for the possible appearance of a given list of entities. It is often used in contrast to the unilateral causality concept, or even to the notion of interaction. Gilles Deleuze presented it as a dichotomy in contradistinction to the classical phenomenon/noumenon dichotomy. Foucault would come to adapt it in a historical sense through the concept of "episteme".
Computational complexity theory

Complexity theory is a field of mathematics, and more precisely of theoretical computer science, which formally studies the amount of resources (in time and in space) needed to solve problems by executing an algorithm. It is thus the study of the intrinsic difficulty of mathematically posed problems.

An algorithm answers a problem. It consists of a set of simple steps needed for the solution, whose number varies with the number of elements to be processed. Moreover, several algorithms may answer the same problem. Complexity theory seeks to determine the difficulty (or complexity) of an algorithmic answer to a problem posed mathematically. Complexity theory mainly (but not exclusively) studies decision problems. An example of a decision problem is:

TIME(t(n))

NTIME(t(n))

SPACE(s(n))
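The idea that the number of steps varies with the input size, and that several algorithms can answer the same problem at different costs, can be made concrete by counting the comparisons two search algorithms perform on the same decision problem ("does the list contain the target?"). The step-counting functions below are an illustrative construction, not standard library routines.

```python
def linear_search_steps(items, target):
    """Count comparisons made by linear search: Theta(n) in the worst case."""
    steps = 0
    for item in items:
        steps += 1
        if item == target:
            break
    return steps

def binary_search_steps(items, target):
    """Count comparisons made by binary search on sorted input: O(log n)."""
    lo, hi, steps = 0, len(items) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if items[mid] == target:
            break
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

data = list(range(1024))
# Worst case for linear search (last element) vs. logarithmic binary search.
linear = linear_search_steps(data, 1023)
binary = binary_search_steps(data, 1023)
```

Both algorithms decide the same problem, but as n grows the gap between n comparisons and roughly log2(n) comparisons is exactly the kind of intrinsic-cost difference complexity theory formalizes with classes such as TIME(t(n)).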
Human-based computation

Human-based computation (HBC) is a computer science technique in which a machine performs its function by outsourcing certain steps to humans. This approach uses the differences in abilities and alternative costs between humans and computer agents to achieve symbiotic human-computer interaction. In traditional computation, a human employs a computer to solve a problem: the human provides a formalized problem description and an algorithm to the computer, and receives a solution to interpret. Human-based computation frequently reverses the roles: the computer asks a person or a large group of people to solve a problem, then collects, interprets, and integrates their solutions.

Early work

Human-based computation (apart from the historical meaning of "computer") research has its origins in the early work on interactive evolutionary computation. The concept of an automatic Turing test, pioneered by Moni Naor (1996), is another precursor of human-based computation.

Alternative terms
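The collect-interpret-integrate step of the role reversal described above can be sketched with a majority-vote aggregator. This is a hypothetical illustration: the worker pool, the task string, and the simulated human are all made up for the example; a real system would dispatch tasks to actual people.

```python
from collections import Counter

def integrate_answers(task, ask_human, workers=5):
    """Role reversal: the machine poses `task` to several humans (modelled
    here as a callable) and integrates their answers by majority vote."""
    answers = [ask_human(task, worker_id) for worker_id in range(workers)]
    winner, count = Counter(answers).most_common(1)[0]
    return winner, count / len(answers)  # consensus answer plus agreement level

# Hypothetical stand-in for human workers: worker 3 makes a mistake.
def simulated_human(task, worker_id):
    return "cat" if worker_id != 3 else "dog"

label, agreement = integrate_answers("label image #17", simulated_human)
```

Redundant assignment plus majority voting is one simple way the machine can tolerate individual human error while still exploiting abilities computers lack.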
Tierra (computer simulation)

Tierra is an abstract model, but any quantitative model is still subject to the same validation and verification techniques applied to more traditional mathematical models, and as such has no special status. The creation of more detailed models, in which more realistic dynamics of biological systems and organisms are incorporated, is now an active research field (see systems biology).
Procedural generation

Procedural generation is a widely used term in the production of media; it refers to content generated algorithmically rather than manually. Often, this means creating content on the fly rather than prior to distribution. It is often associated with computer graphics applications and video game level design.

Overview

The term procedural refers to the process that computes a particular function. The modern demoscene uses procedural generation to package a great deal of audiovisual content into relatively small programs. In recent years, there has been increasing interest in procedural content generation within the academic game research community, especially among researchers interested in applying artificial intelligence methods to the problems of PCG.

Contemporary application

Video games

RoboBlitz used procedurally generated textures in order to reduce the file size of the game. Furthermore, the number of unique objects displayed in a video game is increasing.

Film
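Generating content on the fly from a compact program can be sketched with a classic cellular-automata level generator, a technique widely used for cave-like game maps. The parameter values and the smoothing rule below are common illustrative choices, not taken from any specific game.

```python
import random

def generate_cave(width=40, height=20, fill=0.45, steps=4, seed=7):
    """Cellular-automata level generation: start from seeded random noise,
    then repeatedly smooth it with a 'become wall if at least 5 of the 9
    cells in the 3x3 neighbourhood are walls' rule (out-of-bounds counts
    as wall). The same seed always reproduces the same map, so only the
    seed, not the map, needs to be stored or distributed."""
    rng = random.Random(seed)
    grid = [[rng.random() < fill for _ in range(width)] for _ in range(height)]
    for _ in range(steps):
        nxt = [[False] * width for _ in range(height)]
        for y in range(height):
            for x in range(width):
                walls = sum(
                    grid[y + dy][x + dx]
                    if 0 <= y + dy < height and 0 <= x + dx < width else True
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                )
                nxt[y][x] = walls >= 5
        grid = nxt
    return grid

cave = generate_cave()
```

A few dozen lines plus a seed replace a stored map, which is exactly the size-versus-computation trade-off the RoboBlitz example relies on for textures.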
A biosemiotic conversation: between physics and semiotics | Howard Pattee

Sign Systems Studies 37(1-2): 311-331, 2009.

Abstract. We have started this conversation standing at the base of Massachusetts's highest mountain; the forest on the top was hidden by clouds from our sight.

1. ... choosing one's path, a characteristic feature of all life, can be embedded into the picture of the physical world which is based on inexorable physical laws.

open-ended evolution

Would you agree if we call it, equivalently, a freedom to establish new rules?

open-ended

I want to include any emergent structure, function or behavior that can be imagined, or perhaps even behavior that we can't imagine because of the limitations of our current brains. ... number of potential forms, and ... a basic unpredictability of the paths evolution will take.