
Automation, Algorithms & Manipulation


Behavioral design applied to work environments. Noam Scheiber (@noamscheiber), writing for the New York Times, has just thrown another stone into the pond of Uber’s many current difficulties (notably after the questions around Greyball that we discussed yesterday), by showing how the company – along with Lyft, its main competitor in the American market – uses behavioral design to optimize its drivers’ work; that is, as he puts it himself, how it “manipulates them in the service of the company’s growth”.


Toward a retro-design of attention. Taking user experience into account in digital systems raises the question of the attention of those who are led to use them.


Where does the Nudge stand (1/3)? Is everything “nudgable”? The association Nudge France – an association promoting the Nudge in France, that “little push” meant to steer people’s decisions (@nudgefrance) – recently organized a day devoted to behavioral science, inviting many experts in the field.


An opportunity to gauge where this topic stands, one that seemed, at the turn of the 2010s (see our dossier), like the great solution for transforming public policy. It must be said that the overall impression has proved rather disappointing.

'Our minds can be hijacked': the tech insiders who fear a smartphone dystopia. Justin Rosenstein had tweaked his laptop’s operating system to block Reddit, banned himself from Snapchat, which he compares to heroin, and imposed limits on his use of Facebook.


But even that wasn’t enough. In August, the 34-year-old tech executive took a more radical step to restrict his use of social media and other addictive technologies. Rosenstein purchased a new iPhone and instructed his assistant to set up a parental-control feature to prevent him from downloading any apps. He was particularly aware of the allure of Facebook “likes”, which he describes as “bright dings of pseudo-pleasure” that can be as hollow as they are seductive. And Rosenstein should know: he was the Facebook engineer who created the “like” button in the first place.

Will artificial intelligence remain impenetrable?

Artificial Intelligence Is Setting Up the Internet for a Huge Clash With Europe. Neural networks are changing the Internet.


Inspired by the networks of neurons inside the human brain, these deep mathematical models can learn discrete tasks by analyzing enormous amounts of data. They’ve learned to recognize faces in photos, identify spoken commands, and translate text from one language to another. And that’s just a start. They’re also moving into the heart of tech giants like Google and Facebook. They’re helping to choose what you see when you query the Google search engine or visit your Facebook News Feed.
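To make the phrase “learn discrete tasks by analyzing data” concrete, here is a deliberately tiny sketch (not code from any of the articles linked here): a single artificial neuron that learns the logical AND function from its truth table by gradient descent, the same basic mechanism that, scaled up by many orders of magnitude, underlies the networks described above.

```python
# A single sigmoid neuron trained to reproduce logical AND.
# Generic toy sketch for illustration only.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_neuron(samples, epochs=2000, lr=0.5):
    """Fit two weights and a bias with per-example gradient descent on log-loss."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            y = sigmoid(w[0] * x1 + w[1] * x2 + b)
            err = y - target          # gradient of the logistic loss w.r.t. z
            w[0] -= lr * err * x1
            w[1] -= lr * err * x2
            b -= lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if sigmoid(w[0] * x1 + w[1] * x2 + b) > 0.5 else 0

# The "enormous amounts of data" is, here, just the AND truth table.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_neuron(data)
```

The production systems the article refers to differ in scale, not in kind: millions of such units, stacked in layers, fitted to billions of examples.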

Why ‘Popping’ the Social Media Filter Bubble Misses the Point. Let’s be absolutely clear: social media filter bubbles are not responsible for the election of Donald Trump.


There are quite a few problems with this thinking. First, it draws a direct causal line between the outcome of the election and social media usage, as if every voter used social media; not every ballot cast was filled out by someone with a Facebook account, a Twitter account, or even internet access. Second, it suggests that social media is the only mechanism by which the forces that characterized this election – misinformation, extremism, radicalization, and paranoia – proliferate.

How Elon Musk and Y Combinator Plan to Stop Computers From Taking Over. They’re funding a new organization, OpenAI, to pursue the most advanced forms of artificial intelligence – and give the results to the public.


We Need Algorithmic Angels. Editor’s note: Jarno M. Koponen is a designer, humanist and co-founder of media discovery startup Random.

Our algorithms, ourselves. An earlier version of this essay appeared last year, under the headline “The Manipulators,” in the Los Angeles Review of Books.

Since the launch of Netscape and Yahoo twenty years ago, the story of the internet has been one of new companies and new products, a story shaped largely by the interests of entrepreneurs and venture capitalists. The plot has been linear; the pace, relentless. In 1995 came Amazon and Craigslist; in 1997, Google and Netflix; in 1999, Napster and Blogger; in 2001, iTunes; in 2003, MySpace; in 2004, Facebook; in 2005, YouTube; in 2006, Twitter; in 2007, the iPhone and the Kindle; in 2008, Airbnb; in 2010, Instagram and Uber; in 2011, Snapchat; in 2012, Coursera; in 2013, Tinder.

#Celerity: A Critique of the Manifesto for an Accelerationist Politics.

Red stack attack! Algorithms, capital and the automation of the common, by Tiziana Terranova.

Why the internet of things could destroy the welfare state. On 24 August 1965 Gloria Placente, a 34-year-old resident of Queens, New York, was driving to Orchard Beach in the Bronx.


Clad in shorts and sunglasses, the housewife was looking forward to quiet time at the beach. But the moment she crossed the Willis Avenue bridge in her Chevrolet Corvair, Placente was surrounded by a dozen patrolmen.

What does the Facebook experiment teach us? — The Message. I’m intrigued by the reaction that has unfolded around the Facebook “emotion contagion” study. (If you aren’t familiar with it, read this primer.) As others have pointed out, the practice of A/B testing content is quite common.

Corrupt Personalization. (“And also Bud Light.”) In my last two posts I’ve been writing about my attempt to convince a group of sophomores with no background in my field that there has been a shift to the algorithmic allocation of attention – and that this is important. In this post I’ll respond to a student question.

My favorite: “Sandvig says that algorithms are dangerous, but what are the most serious repercussions that he envisions?” What is the coming social media apocalypse we should be worried about? This is an important question, because people who study this stuff are NOT as interested in it as they should be.

And our field’s most common response to the query “what are the dangers?”