
News, Augmented » societe


Program or Be Programmed: Ten Commands for a Digital Age. The debate over whether the Net is good or bad for us fills the airwaves and the blogosphere. But for all the heat of claim and counter-claim, the argument is essentially beside the point: it’s here; it’s everywhere. The real question is, do we direct technology, or do we let ourselves be directed by it and those who have mastered it? In this spirited, accessible poetics of new media, Rushkoff picks up where Marshall McLuhan left off, helping readers come to recognize programming as the new literacy of the digital age, and as a template through which to see beyond social conventions and power structures that have vexed us for centuries. World-renowned media theorist and counterculture figure Douglas Rushkoff is the originator of ideas such as “viral media,” “social currency” and “screenagers.” “Douglas Rushkoff is one of the great thinkers––and writers––of our time.”

Are predictive algorithms a risk to our free will? The informavore is the organism that consumes information in order to live, explains Frank Schirrmacher, co-editor of Germany's leading national daily, the Frankfurter Allgemeine Zeitung, in a fascinating interview with the review The Edge. We are apparently now in a situation where modern technology is changing the way people behave, speak, react, think and remember. We depend more and more on our gadgets to remember things for us: as Daniel Dennett put it, we are experiencing a population explosion of ideas that the brain cannot keep up with. Information feeds on attention: without enough attention, we lack the nourishment to retain all this information. In the age of the information explosion, then, what should we retain? The question is what to teach, what to learn, and how. Which information is worth keeping?

Michel Serres - Innovation and the Digital - Université Paris 1 Panthéon-Sorbonne. According to Michel Serres, the digital revolution under way will have effects at least as far-reaching as the invention of writing and, later, of the printing press. Our notions of time and space are being completely transformed by it, and our ways of accessing knowledge profoundly altered. In this respect, every great rupture in human history deprives man of certain faculties ("man loses"), but every revolution also brings him new ones ("man gains"). In exchange for the share of memory and mental information-processing capacity he loses with the general diffusion of digital technologies, man gains a new capacity for connection (between individuals, groups, networks and bodies of knowledge), as well as vastly multiplied powers of invention and creation.

Place de la toile. The Internet, media convergence, telephony: what are the consequences for information, communication, social ties and, ultimately, the organization of our lives? Screens are now familiar to us, yet we are only seeing the first effects of their dominance. Place de la toile is a programme that examines the many facets of the digital "revolution": its consequences for information, the media, communication, the social ties between individuals and, ultimately, the organization of our lives. Each week it surveys the state of knowledge, dwells on the key concepts of this metamorphosis, meets its actors, recounts the main events, and discusses the economics, politics and philosophy of this revolution. Every week you will find the "Reading of the week", a point of view generally drawn from the English-speaking world, which is prolific in digital analysis.

The problem with algorithms: magnifying misbehaviour | News By the time you read these words, much of what has appeared on the screen of whatever device you are using has been dictated by a series of conditional instructions laid down in lines of code, whose weightings and outputs are dependent on your behaviour or characteristics. We live in the Age of the Algorithm, where computer models save time, money and lives. Gone are the days when labyrinthine formulae were the exclusive domain of finance and the sciences - nonprofit organisations, sports teams and the emergency services are now among their beneficiaries. Even romance is no longer a statistics-free zone. But the very feature that makes algorithms so valuable - their ability to replicate human decision-making in a fraction of the time - can be a double-edged sword. The prejudiced computer As detailed in the British Medical Journal, staff at St George's Hospital Medical School decided to write an algorithm that would automate the first round of its admissions process.
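The St George's case is worth pausing on: the screening program was later found to have reproduced, and consistently applied, the biases of the human decisions it was modelled on, penalising women and applicants with non-European names. The sketch below is a hypothetical illustration of how such a rule-based screen can encode bias; the field names, weights and threshold are invented, not the actual St George's code.

```python
# Hypothetical illustration of a rule-based admissions screen that
# reproduces the biases of past human decisions. All field names,
# weights and the threshold are invented for this sketch.

def screening_score(applicant: dict) -> float:
    """Score an applicant the way past human reviewers did, bias included."""
    score = float(applicant["exam_grade"])
    if applicant["sex"] == "female":
        score -= 3.0   # past reviewers marked women down
    if applicant["name_origin"] != "european":
        score -= 5.0   # and penalised non-European names
    return score

def first_round(applicants: list[dict], threshold: float = 60.0) -> list[dict]:
    """Automated first cull: rejected applicants never reach a human,
    so the encoded bias is applied at scale, consistently, invisibly."""
    return [a for a in applicants if screening_score(a) >= threshold]
```

The article's point is visible in the structure itself: the algorithm is not more biased than the humans it imitates, it is exactly as biased, only faster and more uniform.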

Why Citizen Developers Are The Future Of Programming If you ever wondered what would scare the bejeezus out of a university's computer science department, try this: In June, Google revealed that it no longer considers GPA scores as a hiring criterion. “One of the things we’ve seen from all our data crunching is that GPA's are worthless as a criteria for hiring,” Laszlo Bock, senior vice president of people operations at Google, told the New York Times. To students who have been told their university grades are paramount, that's a big shock. But to anybody watching the tech industry, such a statement was inevitable. The “Citizen Developer” When Google executives look at prospective hires’ portfolios instead of test scores, it expands the playing field beyond “anyone with a degree” to “anyone with skills.” It’s a phenomenon that the tech world is calling the rise of the “citizen developer.” Back in 2011, Gartner predicted that by 2014, citizen developers would build at least 25 percent of new business applications. Tons of Jobs, Few Developers

Managing disruptive technologies: a conversation with Chamath Palihapitiya. Three technologies to watch closely. Let me tell you about the three innovations that excite me most. The first is sensor networks, which I am very impatient to see arrive. The second is the movement now getting under way toward the automation of transport. And the third is a very specific application of big data, very large volumes of data: the one concerning genetic engineering. Let's start with the first example. This is how we will see extremely concrete ways of improving our quality of life and our productivity emerge, in a manner that is truly tangible, simple and that speaks to people. Why, for instance, do people end up in the emergency room? And from there he begins to build a heuristic model. The second topic is autonomous vehicles, of which Google is today one of the pioneers. One can thus imagine a fleet of small electric vehicles delivering the mail.
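A minimal sketch of the kind of heuristic model the conversation gestures at: readings from cheap body-worn sensors flagging conditions that often precede an emergency-room visit. The signal names and thresholds below are illustrative assumptions, not medical guidance and not anything described in the interview itself.

```python
# Illustrative sketch only: hypothetical sensor fields and thresholds.

def er_risk_flags(reading: dict) -> list[str]:
    """Return risk flags for one reading from a body-worn sensor."""
    flags = []
    if reading["resting_heart_rate"] > 110:
        flags.append("sustained tachycardia")
    if reading["blood_glucose_mg_dl"] < 60:
        flags.append("hypoglycemia")
    if reading["overnight_movement_index"] > 0.8:
        flags.append("severe sleep disruption")
    return flags

# A sensor network streams such readings continuously; the "heuristic
# model" is just the accumulated set of rules that fire before trouble.
reading = {"resting_heart_rate": 118,
           "blood_glucose_mg_dl": 72,
           "overnight_movement_index": 0.4}
print(er_risk_flags(reading))  # -> ['sustained tachycardia']
```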

Five Creepiest Advances in Artificial Intelligence Already, the electronic brains of the most advanced robotic models surpass human intelligence at certain narrow tasks and are able to do things that will make some of us shudder uncomfortably. But what is your reaction going to be after learning about recent advances in robotics and artificial intelligence? 5. Schizophrenic robot Scientists at the University of Texas at Austin have simulated mental illness on a computer, testing schizophrenia on an artificial intelligence. The test subject is DISCERN, a supercomputer program that works like a biological neural network, operating on the principles of how the human brain functions. The researchers then emulated a schizophrenic brain in the artificial intelligence by overloading the computer with many stories. 4. Deceptive robots Professor Ronald Arkin of the School of Interactive Computing at the Georgia Institute of Technology presented the results of an experiment in which scientists were able to teach a group of robots to cheat and deceive. The experiment involved two robots.
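The published DISCERN work modelled "hyperlearning": storing stories with an excessively high learning rate, so that memories blur and distort. The toy below is my own construction under that idea, not the DISCERN code: the same associative memory trained twice, once with a modest learning rate and once with an excessive one, so that recall of a stored pattern degrades.

```python
import numpy as np

# Toy construction (not the DISCERN code): a linear associative memory
# trained with the delta rule. A modest learning rate stores the
# "stories" faithfully; an excessive one -- a crude stand-in for the
# hyperlearning manipulation -- corrupts what was stored.

rng = np.random.default_rng(0)
stories = rng.choice([-1.0, 1.0], size=(5, 64))  # five "stories" as patterns

def train(patterns: np.ndarray, lr: float, epochs: int = 50) -> np.ndarray:
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for _ in range(epochs):
        for p in patterns:
            err = p - w @ p                   # reconstruction error
            w += (lr / n) * np.outer(err, p)  # delta-rule update
    return w

def recall_fidelity(w: np.ndarray, p: np.ndarray) -> float:
    rec = w @ p
    return float(rec @ p / (np.linalg.norm(rec) * np.linalg.norm(p)))

for lr in (0.5, 3.0):  # stable vs. runaway learning rate
    w = train(stories, lr)
    print(f"lr={lr}: recall fidelity {recall_fidelity(w, stories[0]):+.2f}")
# The high-rate memory no longer reproduces the story it was given.
```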

All Can Be Lost: The Risk of Putting Our Knowledge in the Hands of Machines - Nicholas Carr We rely on computers to fly our planes, find our cancers, design our buildings, audit our businesses. That's all well and good. But what happens when the computer fails? On the evening of February 12, 2009, a Continental Connection commuter flight made its way through blustery weather between Newark, New Jersey, and Buffalo, New York. As is typical of commercial flights today, the pilots didn’t have all that much to do during the hour-long trip. The captain, Marvin Renslow, manned the controls briefly during takeoff, guiding the Bombardier Q400 turboprop into the air, then switched on the autopilot and let the software do the flying. The crash, which killed all 49 people on board as well as one person on the ground, should never have happened. The Buffalo crash was not an isolated incident. And that, many aviation and automation experts have concluded, is a problem. The experience of airlines should give us pause. Doctors use computers to make diagnoses and to perform surgery.

The Disconnectionists “Unplugging” from the Internet isn’t about restoring the self so much as it is about stifling the desire for autonomy that technology can inspire Once upon a pre-digital era, there existed a golden age of personal authenticity, a time before social-media profiles when we were more true to ourselves, when the sense of who we are was held firmly together by geographic space, physical reality, the visceral actuality of flesh. Without Klout-like metrics quantifying our worth, identity did not have to be oriented toward seeming successful or scheming for attention. According to this popular fairytale, the Internet arrived and real conversation, interaction, identity slowly came to be displaced by the allure of the virtual — the simulated second life that uproots and disembodies the authentic self in favor of digital status-posturing, empty interaction, and addictive connection. Baratunde Thurston writes,

The Jobs Smart Bots will Kick to the Curb | HomeFree America In the not-too-distant past, technological advances destroyed work selectively within the industrial bureaucratic system. At worst, they destroyed industries or a job category. At best, they merely eliminated drudgery by removing routine or time-consuming work tasks. That’s not true anymore. Here are the primary reasons this technological shift is possible: better decision making. The upshot of this technological change is that the industrial bureaucracy is going to shrink, and more quickly than we experienced with agriculture. To get some insight into the amount of work that is at risk right now, with existing technology, let’s take a look at a study by Carl Frey and Michael Osborne at Oxford University called “The Future of Employment: How Susceptible are Jobs to Computerisation?” Here’s the chart the team put together. The final section, 33% of occupations, represents skills that are protected by engineering bottlenecks that the authors deem likely to persist.
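Frey and Osborne estimate, for each occupation, a probability of computerisation based on how much it relies on "engineering bottleneck" skills: perception and manipulation, creative intelligence, social intelligence. The sketch below mimics that idea with an invented logistic scoring rule; the feature values, weights and bias are illustrative assumptions, whereas the paper itself fits a Gaussian process classifier to O*NET data.

```python
import math

# Illustrative scoring in the spirit of Frey & Osborne: the more a job
# leans on engineering-bottleneck skills, the lower its estimated
# automatability. Weights, bias and feature values are invented here.

BOTTLENECK_WEIGHTS = {
    "perception_and_manipulation": 2.0,
    "creative_intelligence": 2.5,
    "social_intelligence": 3.0,
}

def p_computerisation(features: dict) -> float:
    """Logistic score: bottleneck reliance (0..1 per skill) pushes
    the estimated probability of computerisation down."""
    z = 4.0 - sum(BOTTLENECK_WEIGHTS[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

telemarketer = {"perception_and_manipulation": 0.1,
                "creative_intelligence": 0.2,
                "social_intelligence": 0.3}
therapist = {"perception_and_manipulation": 0.4,
             "creative_intelligence": 0.8,
             "social_intelligence": 0.9}
print(f"telemarketer: {p_computerisation(telemarketer):.2f}")  # high risk
print(f"therapist:    {p_computerisation(therapist):.2f}")     # low risk
```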

The Algorithms of Our Lives [Figure caption (Phototrails.net): The author and several colleagues studied cultural differences using these computerized patterns of Instagram postings, arranged by hue and brightness, from Tokyo, New York, Bangkok, and San Francisco.] In 2002, I was in Cologne, Germany, and I went into the best bookstore in the city devoted to humanities and arts titles. Yet in the 1990s, software-based tools were adopted in all areas of professional media production and design. Thanks to practices pioneered by Google, the world now operates on web applications that remain forever in beta stage. Software has become a universal language, the interface to our imagination and the world. But while scholars and theorists of media and new media have covered all aspects of the IT revolution, creating fields like cyberculture studies, Internet studies, game studies, new-media theory, and the digital humanities, they have paid comparatively little attention to software, the engine that drives almost all they study. It's time they did.
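A minimal sketch, under stated assumptions, of the feature extraction behind a visualisation like Phototrails: reduce each photo to its mean hue and mean brightness, then order the collection by those two values. The directory name is hypothetical, Pillow is assumed for image loading, and averaging hue directly is a simplification (hue is a circular quantity).

```python
import colorsys
from pathlib import Path

from PIL import Image  # Pillow

def hue_brightness(path: Path) -> tuple[float, float]:
    """Mean hue and mean brightness (HSV value) of an image, in [0, 1]."""
    img = Image.open(path).convert("RGB").resize((32, 32))  # cheap summary
    hsv = [colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
           for r, g, b in img.getdata()]
    n = len(hsv)
    return (sum(h for h, _, _ in hsv) / n,
            sum(v for _, _, v in hsv) / n)

# Hypothetical directory of one city's Instagram photos.
photos = sorted(Path("instagram_tokyo").glob("*.jpg"))
features = [(p.name, *hue_brightness(p)) for p in photos]
# Order the collection by hue, then brightness: the ordering from which
# a radial montage like those on Phototrails.net can be laid out.
features.sort(key=lambda item: (item[1], item[2]))
```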
