
Eli Pariser: Beware online "filter bubbles"


Students Reject 'Fake News' To Write Footnoted, Neutral Wikipedia Entries : NPR Ed Fake news has been, well, in the news a lot lately. But for the world's largest crowdsourced encyclopedia, it's nothing new. "Wikipedia has been dealing with fake news since it started 16 years ago," notes LiAnna Davis, deputy director of the Wiki Education Foundation. To combat misinformation, Wikipedia has developed a robust corps of volunteer editors: anyone can write new entries and scrutinize existing ones for adherence to Wikipedia's rules on sourcing and neutrality. While it is not free of errors or pranks, the result is a resource that 50 million people turn to daily, on hundreds of thousands of topics, in a few dozen languages. Today, educators are among those more concerned than ever with standards of truth and evidence and with the lightning-fast spread of misinformation online. Through the Wiki Education Foundation's program, students write footnoted, neutral Wikipedia entries as coursework; this spring, 7,500 students are expected to participate. Since the program began six years ago, Davis says, students have collectively added more than 25 million words of content to Wikipedia.

Escape your search engine Filter Bubble!

How Google Impacts The Way Students Think by Terry Heick It's always revealing to watch learners research. Why do people migrate? Where does inspiration come from? How do different cultures view humanity differently? Students literally Google it, and they come to see knowledge as searchable, even though that's not how knowledge works. 1. Google is powerful, the result of a complicated algorithm that attempts to index human thought as it has been digitally manifested. 2. The result? When students are looking for an "answer," good fortune sees them arrive at whatever they think they're looking for, where they can (hopefully) evaluate the quality and relevance of the information, cite their source, and be on their merry way. But with the cold logistics of software, having found what they were looking for, learners are left with the back button, a link on the page they're on, or a fresh browser tab. 3. Having found an "answer," rabid Googlers are ready to "finish" the assignment, rarely seeking further sources, especially those that seem conflicting.

How to Burst the "Filter Bubble" that Protects Us from Opposing Views The term “filter bubble” entered the public domain back in 2011, when the internet activist Eli Pariser coined it to refer to the way recommendation engines shield people from certain aspects of the real world. Pariser used the example of two people who googled the term “BP”: one received links to investment news about BP, while the other received links to the Deepwater Horizon oil spill, presumably as a result of some recommendation algorithm. This is the filter bubble—being surrounded only by people you like and content that you agree with. It is an insidious problem, and the danger is that it can polarise populations, creating potentially harmful divisions in society. Today, Eduardo Graells-Garrido at the Universitat Pompeu Fabra in Barcelona, along with Mounia Lalmas and Daniel Quercia, both at Yahoo Labs, say they've hit on a way to burst the filter bubble. They found over 40,000 Twitter users who had expressed an opinion using hashtags such as #pro-life and #pro-choice.
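
The first step described above, labeling a user's stance from the opinion hashtags they tweet, can be sketched in a few lines of Python. This is a minimal illustration, not the researchers' actual pipeline; the hashtag-to-stance table and the tweet format are assumptions made for the example.

    from collections import Counter

    # Hypothetical stance markers, following the hashtags named in the article.
    STANCE_HASHTAGS = {
        "#pro-life": "pro-life",
        "#pro-choice": "pro-choice",
    }

    def infer_stance(tweets):
        """Label a user by the opinion hashtag they use most often.

        `tweets` is a list of tweet texts; returns a stance label, or None
        if the user never used a stance hashtag. The substring match is
        deliberately naive; a real pipeline would tokenize hashtags properly.
        """
        counts = Counter()
        for text in tweets:
            for tag, stance in STANCE_HASHTAGS.items():
                if tag in text.lower():
                    counts[stance] += 1
        if not counts:
            return None
        return counts.most_common(1)[0][0]

    print(infer_stance(["Marching today #pro-choice", "lunch was great"]))  # pro-choice

Once users carry stance labels like these, a bubble-bursting recommender can deliberately surface content from users with the opposite label, which is the spirit of the approach the article goes on to describe.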

Commentary: It’s Facebook’s algorithm vs. democracy, and so far the algorithm is winning — NOVA Next Over the last several years, Facebook has been participating—unintentionally—in the erosion of democracy. The social network may feel like a modern town square, but thanks to its tangle of algorithms, it’s nothing like the public forums of the past. The company determines, according to its interests and those of its shareholders, what we see and learn on its social network. Facebook’s algorithm—driven in part by likes and shares—has upended civil discourse. The result has been a loss of focus on critical national issues, an erosion of civil disagreement, and a threat to democracy itself. Facebook is just one part—though a large part—of the Big Data economy, one built on math-powered applications that rest on choices made by fallible human beings. In 2008, when the economy crashed, I witnessed the power of these “Weapons of Math Destruction” firsthand from my desk at a hedge fund in New York City. In many cases, WMDs define their own reality to justify their results.

Advanced Search - Search Help Narrow down search results for complex searches by using the Advanced Search page. For example, you can find sites updated in the last 24 hours or images that are in black and white. Tip: you can also apply many of these filters directly in the search box with search operators. Advanced Search filters include language, region, last-updated date, site or domain, where the search terms appear on the page, SafeSearch, reading level, file type, usage rights (pages you have permission to use), and, for images, size, aspect ratio, color, and type (face, animated, etc.).

Filter bubble Intellectual isolation involving search engines The term filter bubble was coined by internet activist Eli Pariser, circa 2010. A filter bubble or ideological frame is a state of intellectual isolation[1] that can result from personalized searches, when a website algorithm selectively guesses what information a user would like to see based on information about the user, such as location, past click behavior, and search history.[2] As a result, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles.[3] The choices made by these algorithms are only sometimes transparent.[4] Prime examples include Google Personalized Search results and Facebook's personalized news stream. Technology such as social media “lets you go off with like-minded people, so you're not mixing and sharing and understanding other points of view ... It's super important.”
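
The mechanism the encyclopedia entry describes, guessing what a user wants to see from signals like past clicks, is easy to see in miniature. The sketch below is a deliberately crude toy, not any real engine's ranking algorithm; the scoring rule and the sample data are invented to show how feeding click history back into ranking narrows what a user is shown.

    # Toy personalization loop: topics a user has clicked before get boosted,
    # so the ranking drifts toward what the user already engages with.
    def rank(results, click_history):
        # Sort results by how often the user clicked this topic in the past.
        return sorted(results, key=lambda r: -click_history.get(r["topic"], 0))

    results = [
        {"title": "BP oil spill investigation", "topic": "environment"},
        {"title": "BP quarterly earnings beat forecasts", "topic": "investing"},
    ]

    clicks = {"investing": 5}  # this user has mostly clicked investing stories

    for r in rank(results, clicks):
        print(r["title"])
    # The earnings story now outranks the spill story, mirroring Pariser's
    # example of two people getting different results for the same "BP" query.

Real systems weigh many more signals (location, search history, the social graph), but the feedback loop is the same: past behavior shapes future exposure, and disconfirming information quietly falls down the ranking.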

for Schools | AllSides Preparing students to participate thoughtfully in democracy - and in life. Students need to learn how to sort through mass media and social networks, think critically about the issues, and engage with each other in a healthy and positive way, even when there are differences in opinion and background. AllSides for Schools helps educators teach these valuable lessons and skills: let's teach the next generation how to see diverse perspectives, value differences, and benefit from everyone's best ideas. The climate for elections and political issues is so divisive; how can classrooms discuss hot-button issues effectively and with mutual understanding? Programs include Relationships First, which introduces students to civil dialogue and to appreciating others even when we disagree; the Dictionary Term Lesson Plan, a one-day or multi-day program that uses a Dictionary term of your choosing to teach about differences in opinion and perspective; and lessons on election issues.

No. 1 Position in Google Gets 33% of Search Traffic [Study] New findings from online ad network Chitika confirm it's anything but lonely at the top. According to the study, the top listing in Google's organic search results receives 33 percent of the traffic, compared to 18 percent for the second position, and the traffic only degrades from there through the rest of the top 10 results. A similar study by the Chitika team back in 2010 showed comparable results, and Chitika suggests the findings validate the importance of SEO for online businesses: "While being the number one result on a Google search results page is obviously important, these numbers show just how big of an advantage websites of this type have over any competitors listed below them." For many, it will come as no surprise that the findings also showed a significant drop in traffic from Page 1 to Page 2 results. And whether a page sits on Page 1 or Page 4, Chitika reports, the top position consistently sees more traffic than others on the same page.
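
To make those shares concrete, the arithmetic can be run directly. In the sketch below, the 33 and 18 percent figures for positions one and two come from the excerpt above; the shares for the remaining positions are placeholders invented for the example, not Chitika's reported numbers.

    # Estimated visits per ranking position for a query with 10,000 searches.
    SEARCHES = 10_000

    # Position -> share of organic clicks. Only positions 1 and 2 are taken
    # from the Chitika excerpt; the rest are illustrative placeholders.
    TRAFFIC_SHARE = {1: 0.33, 2: 0.18, 3: 0.11, 4: 0.08, 5: 0.06}

    for position, share in TRAFFIC_SHARE.items():
        print(f"Position {position}: ~{SEARCHES * share:,.0f} visits")

    # Moving from position 2 to position 1 is close to a doubling of traffic
    # (0.33 / 0.18 is roughly 1.8x), which is the study's headline point.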

Virala nyheter: Hur nyheter sprids och bemöts i sociala medier (Viral News: How News Is Spread and Received in Social Media) This is a Master's thesis from Göteborgs universitet/Institutionen för journalistik, medier och kommunikation. Abstract: Background: When people encounter news via social media such as Twitter and Facebook, other users on those platforms can influence how the content is perceived, for example by highlighting, downplaying, reinterpreting, or reframing the news. How this happens, and what consequences it has for how people perceive the news, has not previously been studied. There is also little knowledge of which mass-media news stories spread in social media, particularly in Sweden. Purpose: To describe and compare which news stories spread in social media (Facebook and Twitter) and to examine the psychological reasons why they are passed on. Method: Articles (N = 89,450) from the twelve largest Swedish news sites over two months in early 2014 were examined with a quantitative content analysis to see how they spread on Facebook and Twitter.
