
Facebook unethical experiment: It made news feeds happier or sadder to manipulate people’s emotions.
Facebook has been experimenting on us. A new paper in the Proceedings of the National Academy of Sciences reveals that Facebook intentionally manipulated the news feeds of almost 700,000 users in order to study “emotional contagion through social networks.” The researchers, who are affiliated with Facebook, Cornell, and the University of California–San Francisco, tested whether reducing the number of positive messages people saw made those people less likely to post positive content themselves. They tweaked the algorithm by which Facebook sweeps posts into members’ news feeds, using a program to analyze whether any given textual snippet contained positive or negative words. One upshot: Facebook intentionally made thousands upon thousands of people sad. Facebook’s methodology also raises serious ethical questions. Ah, informed consent.
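The coverage above describes the classification step only at a high level: a program checks each snippet of text for positive or negative words. A minimal Python sketch of that kind of word-list tagging might look like the following; the tiny word lists and the classify_post helper are illustrative stand-ins, not the study’s actual tooling (which reportedly relied on the much larger LIWC dictionaries).

```python
# Illustrative sketch of word-list sentiment tagging, in the spirit of the
# approach described above. The word lists here are tiny placeholders.
import re

POSITIVE = {"happy", "love", "great", "wonderful", "excited"}
NEGATIVE = {"sad", "angry", "terrible", "awful", "lonely"}

def classify_post(text: str) -> str:
    """Label a post 'positive', 'negative', 'both', or 'neutral' based on
    whether it contains any words from either list."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    has_pos = bool(words & POSITIVE)
    has_neg = bool(words & NEGATIVE)
    if has_pos and has_neg:
        return "both"
    if has_pos:
        return "positive"
    if has_neg:
        return "negative"
    return "neutral"

print(classify_post("Feeling great about the weekend!"))  # -> positive
```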

Global study stresses importance of public Internet access July 10, 2013 Millions of people in low-income countries still depend on public computer and Internet access venues despite the global proliferation of mobile phones and home computers. A five-year, eight-country study recently concluded by the Technology & Social Change Group at the University of Washington Information School has found that community access to computer and Internet technology remains a crucial resource for connecting people to the information and skills they need in an increasingly digital world. The Global Impact Study of Public Access to Information & Communication Technologies surveyed 5,000 computer users at libraries, telecenters and cybercafés and 2,000 nonusers at home to learn about patterns of public access use.

Cornell ethics board did not pre-approve Facebook mood manipulation study Facebook’s controversial study that manipulated users’ newsfeeds was not pre-approved by Cornell University’s ethics board, and Facebook may not have had “implied” user permission to conduct the study as researchers previously claimed. In the study, researchers at Facebook tweaked what hundreds of thousands of users saw in their news feeds, skewing content to be more positive or negative than normal in an attempt to manipulate their mood. Then they checked users’ status updates to see if the content affected what they wrote. They found that, yes, Facebook users’ moods are affected by what they see in their news feeds. Users who saw more negative posts would write more negative things on their own walls, and likewise for positive posts. (For a refresher on the controversy, check out The Washington Post’s story from Monday.) Ethics board consulted after the fact: a statement issued Monday by Cornell University clarified that the experiment was conducted before the IRB was consulted.

Facebook's big problem: Ethical blindness | InfoWorld When you agreed to Facebook's terms and conditions, did you know you were agreeing to become a subject in a psychology experiment? This weekend, we learned that Facebook permitted an academic research team to conduct an experiment on a huge number of Facebook's users back in 2012. The researchers adjusted the contents of Facebook timelines for nearly 700,000 users so that either positive or negative news dominated. They found that positive news spread positive responses, and negative news spread negative responses. Some thought leaders dismiss the resulting criticism of Facebook, either because they expected worse or because they think it's no problem. I don't think this particular problem is an inherent consequence of Facebook's business model as an amoral corporation collecting personal data. I and most people around me are well aware of how Facebook hopes to monetize use of the system. I'm especially struck by this official Facebook response:

Are We Becoming too Dependent on the Internet? | Austin & Williams Unplugged The Internet as we know it today really came into its own in 1997, and even then most Internet sites were crude. In the last decade or so, broadband has become commonplace and mobile devices are now highly integrated with the Internet. That's changed everything. We have become increasingly dependent on the Internet for the things we need to maintain our normal lives. If this trend continues, as most expect it will, we may not be able to survive so easily without the Internet, and that is a huge risk we are taking. It seems very plausible that one day there may indeed be a catastrophic failure of the Internet, and it may be one that we cannot recover from quickly. Given this, our culture really needs to reassess our dependence on the Internet and the rush to put everything in the cloud. It's distinctly possible that we could, in one fell swoop, lose services like the electric grid, water and sewer, and almost all communications (telephone and television). Care to comment?

The legal and ethical issues behind Facebook's massive psychological experiment Unbeknownst to the rest of the world, Facebook’s data science team, in collaboration with Cornell University and the University of California, San Francisco's Center for Tobacco Control Research and Education, ran an experiment in 2012 to test how moods like happiness or depression can be transmitted through social media. They did this by tweaking the newsfeed algorithm of over 600,000 Facebook users so it would show fewer positive or negative posts than usual, then observed how this influenced the posts of the users in the study. The researchers released their findings in the Proceedings of the National Academy of Sciences this March. The study was conducted to observe and test a phenomenon called "emotional contagion," the transfer of long-term moods or emotions among people within a network; such studies usually involve real-life interactions. They observed that when Facebook users saw fewer positive posts (or more negative posts) in their newsfeeds, they were more likely to share negative posts and posted fewer positive posts themselves.
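The article does not spell out how the newsfeed tweak worked, but the mechanism it describes, withholding some share of a user's positive (or negative) posts from the feed, can be sketched roughly as below. The Post dataclass, the sentiment field, the condition names, and the 30% omission rate are all assumptions made for illustration, not details from the study.

```python
# Rough sketch of a feed manipulation of the kind described above: for users
# in a "reduced positive" condition, each positive post is dropped from the
# feed with some probability (and symmetrically for a "reduced negative" arm).
import random
from dataclasses import dataclass
from typing import List

@dataclass
class Post:
    author: str
    text: str
    sentiment: str  # "positive", "negative", or "neutral"

def filter_feed(posts: List[Post], condition: str,
                omit_rate: float = 0.3) -> List[Post]:
    """Return the feed with some posts of the targeted sentiment withheld."""
    target = "positive" if condition == "reduce_positive" else "negative"
    return [p for p in posts
            if p.sentiment != target or random.random() >= omit_rate]

feed = [
    Post("a", "Had a wonderful day!", "positive"),
    Post("b", "Feeling awful today.", "negative"),
    Post("c", "Posting a photo of lunch.", "neutral"),
]
print([p.text for p in filter_feed(feed, "reduce_positive")])
```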

Facebook’s study crosses an ethical line By Editorial Board July 2 FACEBOOK’S STUDY on emotional contagion may not have broken laws, but it has exposed the unfettered power of big data companies grounded in opaque user policies. For one week in 2012, researchers from Facebook, Cornell and the University of California skewed the emotional content of almost 700,000 news feeds to test how users would react. They found that people would write slightly more negative posts when exposed to negative feeds and vice versa. Facebook and its defenders have likened the experiment to the routine A/B testing that large websites run constantly, and Facebook is half right. But this study crossed an important line: unlike typical A/B testing, Facebook tried to directly influence emotions, not behaviors. Almost all academic research requires informed consent from participants, which Facebook assumed from acceptance of its terms of service. Recent lawsuits against Facebook and Google, including the European Court’s ruling in favor of a “right to be forgotten,” focus on the ownership and use of companies’ existing stores of data.
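For contrast, the “typical A/B testing” the editorial mentions usually means assigning users to interface variants and comparing behavioral metrics such as click-through, without trying to steer anyone’s mood. A minimal sketch of that sort of deterministic bucketing follows; the experiment name, variant labels, and hashing scheme are illustrative assumptions, not any company’s real pipeline.

```python
# Minimal sketch of conventional A/B bucketing: users are assigned to a
# variant deterministically from a hash of their ID, and only behavioral
# metrics (e.g. click-through) are compared. All names are illustrative.
import hashlib

VARIANTS = ["control", "new_button_color"]

def assign_variant(user_id: str, experiment: str = "button_test") -> str:
    """Deterministically bucket a user into one of the experiment variants."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

print(assign_variant("user_12345"))  # same user always lands in the same bucket
```

Hashing the user ID rather than drawing a fresh random number keeps each user's assignment stable across sessions, which is the usual requirement for this kind of test.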

In defense of Facebook’s newsfeed study Did Facebook overstep its bounds when it ran a secret psychological experiment on a fraction of its users two years ago? That's the question at the heart of an Internet firestorm of the past few days. The consensus is that Facebook probably did something wrong. If you're just coming to the story: for a week in 2012, Facebook took a slice of 689,000 English-speaking accounts across its userbase. The apparent goal was to find out whether emotions were contagious on Facebook, that is, whether happy (or sad) newsfeeds made users more likely to write happy (or sad) posts themselves. None of that has stopped a vigorous and healthy debate from taking place about the convergence of business and academic research, and whether Facebook acted irresponsibly or unethically with its users' data. The objections go roughly like this: it used people's data for an academic study; it manipulated people's newsfeeds to make them happy or sad; the study made it past an institutional review board; and it's creepy.
