
Singularitarianism
Singularitarianism is a technocentric ideology and social movement defined by the belief that a technological singularity—the creation of superintelligence—will likely happen in the medium future, and that deliberate action ought to be taken to ensure that the Singularity benefits humans. Singularitarians are distinguished from other futurists who speculate on a technological singularity by their belief that the Singularity is not only possible, but desirable if guided prudently. Accordingly, they might sometimes dedicate their lives to acting in ways they believe will contribute to its rapid yet safe realization.[1] Some critics argue that Singularitarianism is a new religious movement promising salvation in a technological utopia.[3] Others are concerned that the interest in the Singularity by corporate and military interests provides a clue as to the real direction and social implication of emerging technologies celebrated by Singularitarians.[4]

http://en.wikipedia.org/wiki/Singularitarianism

Related: Exponential Change - Technological Singularity - AI / Robotics et al.

Defining the Singularity

Q: So you mentioned that there is no widely accepted view of what the Singularity is and what exactly is going to happen, is that correct?

That's correct; there is very little consensus regarding what exactly the term "Singularity" refers to. A brilliant AI researcher by the name of Eliezer Yudkowsky has dissected and categorized these beliefs into three schools of thought: the Event Horizon Thesis, the Intelligence Explosion Thesis, and finally the Accelerating Change Thesis.

Q: Well, which school did Vernor Vinge fall into when he originally coined the term "Singularity"?

He would fall under the Event Horizon school of thought.

I Have Seen The Future, And Its Sky Is Full Of Eyes

Allow me just a little self-congratulatory chest-beating. Four years ago I started writing a near-fiction thriller about the risks of swarms of UAVs in the wrong hands. Everyone I talked to back then (including my agent, alas) thought the subject was implausible, even silly. Well, it's not like I'm the next Vernor Vinge — it always seemed like a pretty blatantly obvious prediction to me — but I am pleased to see that drones and drone swarms have finally become the flavor of the month.

Introducing the Singularity

Q: So what is this "Technological Singularity" I keep hearing about?

In the broadest sense it refers to "an event or phase brought about by technology that will radically change human civilization, and perhaps even human nature itself before the middle of the 21st century."1 Think of it as the tipping point where the accelerating pace of machine capability outruns all human capabilities and results in a smarter-than-human intelligence.

Q: Seriously? People actually believe this?

Peter Thiel's CS183: Startup - Class 17 - Deep Thought

Here is an essay version of class notes from Class 17 of CS183: Startup. Errors and omissions are mine. Three guests joined the class for a conversation after Peter's remarks.

The Coming Technological Singularity
====================================================================

The Coming Technological Singularity: How to Survive in the Post-Human Era

Vernor Vinge
Department of Mathematical Sciences
San Diego State University

(c) 1993 by Vernor Vinge (Verbatim copying/translation and distribution of this entire article is permitted in any medium, provided this notice is preserved.)

This article was for the VISION-21 Symposium sponsored by NASA Lewis Research Center and the Ohio Aerospace Institute, March 30-31, 1993. It is also retrievable from the NASA technical reports server as part of NASA CP-10129. A slightly changed version appeared in the Winter 1993 issue of _Whole Earth Review_.

Abstract

Within thirty years, we will have the technological means to create superhuman intelligence. Shortly after, the human era will be ended.

Extinction Timeline: what will disappear from our lives before 2050

When people talk about the future, they usually point to all the new things that will come to pass. However, the evolution of human society is as much about old things disappearing as new things appearing. This means it is particularly useful to consider everything in our lives that is likely to become extinct.

Accelerating change

In futures studies and the history of technology, accelerating change is a perceived increase in the rate of technological (and sometimes social and cultural) progress throughout history, which may suggest faster and more profound change in the future. While many have suggested accelerating change, the popularity of this theory in modern times is closely associated with various advocates of the technological singularity, such as Vernor Vinge and Ray Kurzweil.

Infinity Point Will Arrive by 2035

Eray Ozkural, December 23, 2013

While writing a paper for the 100 Year Starship Symposium, I wished to convince the starship designers that they should acknowledge the dynamics of the high-technology economy, which may be crucial for interstellar missions. Thus motivated, I have made a new calculation regarding the infinity point, also known as the singularity.

The best visuals to explain the Singularity to senior executives

Tomorrow morning I'm doing a presentation to the top executive team of a very large organization on the next 20 years. Most of what I will cover will be general societal, business and technological drivers, as well as specific strategic issues driving their business. However, as part of stretching their thinking, I'll also speak a bit about the Singularity. As such, I've been trying to find one good image to introduce my explanation, but I haven't been able to find one that is quite right for the purpose.

The Hivemind Singularity - Alan Jacobs

In a near-future science fiction novel, human intelligence evolves into a hivemind that makes people the violent cells of a collective being. [Image: slime mold network formation (Science).]

New Model Army, a 2010 novel by the English writer Adam Roberts, concerns itself with many things: the intimacy shared by soldiers at war, the motivating powers of memory and love, the rival merits of hierarchical and anarchic social structures, the legitimacy of the polity known as Great Britain, the question of European identity. Also giants.

Technological Singularity

The technological singularity is the hypothesis that accelerating progress in technologies will cause a runaway effect wherein artificial intelligence will exceed human intellectual capacity and control, thus radically changing civilization in an event called the singularity.[1] Because the capabilities of such an intelligence may be impossible for a human to comprehend, the technological singularity is an occurrence beyond which events may become unpredictable, unfavorable, or even unfathomable.[2] The first use of the term "singularity" in this context was by mathematician John von Neumann. Proponents of the singularity typically postulate an "intelligence explosion",[5][6] in which superintelligences design successive generations of increasingly powerful minds; this process might occur very quickly and might not stop until the agent's cognitive abilities greatly surpass those of any human.
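The "intelligence explosion" postulate above — successive generations of ever more powerful minds, arriving ever faster — can be sketched numerically. The model below is my own illustrative assumption, not anything from the excerpted article: suppose each generation designs a successor twice as smart, and a designer that is k times smarter finishes its work k times faster. The per-generation design times then form a convergent geometric series, so unboundedly many generations complete in finite wall-clock time:

```python
# Toy "intelligence explosion" sketch. Assumptions (mine, for
# illustration only): each successor is `gain` times smarter than its
# designer, and a designer with capability c takes base_years / c
# years to build it.

def explosion_time(generations, capability=1.0, gain=2.0, base_years=1.0):
    """Total years for `generations` rounds of recursive self-improvement."""
    total = 0.0
    for _ in range(generations):
        total += base_years / capability  # smarter designers finish sooner
        capability *= gain                # successor is `gain`x smarter
    return total  # bounded above by base_years * gain / (gain - 1)

# Under these assumptions, even 100 generations (a 2**100-fold jump in
# capability) finish in under 2 years, because the per-generation times
# 1, 1/2, 1/4, ... sum to a bounded geometric series.
print(explosion_time(100))
```

This is the sense in which proponents argue the process "might occur very quickly": if smarter minds really do design their successors proportionally faster, total elapsed time stays bounded no matter how many generations occur. Whether the proportionality assumption holds is, of course, exactly what is in dispute.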

Promises and Perils on the Road to Superintelligence

[Image: Global Brain / credit: mindcontrol.se]

In the 21st century, we are walking an important road. Our species is alone on this road, and it has one destination: superintelligence. The most forward-thinking visionaries of our species were able to get a vague glimpse of this destination in the early 20th century. Paleontologist Pierre Teilhard de Chardin called this destination the Omega Point. Mathematician Stanislaw Ulam called it "singularity":

The Acceleration of Acceleration: How The Future Is Arriving Far Faster Than Expected

This article was co-written with Ken Goffman. One of the things that happens when you write books about the future is you get to watch your predictions fail. This is nothing new, of course, but what's different this time around is the direction of those failures.

This was mentioned in a chat (the one that inspired me to do this pearltree because of my misery with chat). Looking it up, I can see this is indeed what we have been talking about, and I see there is much more focus on technology than I see as part of the evolution. However, technology is a key component in my futurist collection of ACC novels. It also goes with my envirobank ideas. It doesn't, however, mesh with my desire to get more basic and unplugged, which also seems an idea inspired partly by OP. by marynichols1 Jul 30
