
Assignment 2


Google News at 10: How the Algorithm Won Over the News Industry. Google's "billions of clicks" are only half of the story. In April 2010, Eric Schmidt delivered the keynote address at the conference of the American Society of News Editors in Washington, D.C. During the talk, the then-CEO of Google went out of his way to articulate -- and then reiterate -- his conviction that "the survival of high-quality journalism" was "essential to the functioning of modern democracy." This was a strange thing: the leader of the most powerful company in the world, informing a roomful of professionals how earnestly he would prefer that their profession not die. What a difference nine months make. "100,000 Business Opportunities." There is, on the one hand, an incredibly simple explanation for the shift in news organizations' attitude toward Google: clicks.

As a Google representative put it, "That's about 100,000 business opportunities we provide publishers every minute." Concession Stands. There have been product-level innovations. Harvesting the News.

BuzzFeed CEO Jonah Peretti: 'It's not just a site, it's a whole process' | The Verge. Nilay Patel: It sounds like you have some pretty exciting tech news going on. Jonah Peretti: I think that one of the interesting things about all the stuff we do in tech is that in general technology is harder for people to understand.

Some of the stuff that I'm most excited about I don't get a chance to talk about as much. It seems exciting to me and is probably interesting to readers of The Verge, but there are a lot of people who love BuzzFeed for entertainment and news, and how it gets to them isn't really as exciting as what is getting to them. Obviously we have a very big tech audience at The Verge, but I spent six months earlier this year working on the Vox.com team, and it was actually really interesting for me to step back and watch how our publishing platform helped that team scale so fast and hit a different big audience. We think about our platform so much that it almost circles back around to taking it for granted, and I suspect that that is true at BuzzFeed as well. Yes.

Personalization Algorithms, and Why They Don't Understand Us Creative Types.

“Personalization is a process of gathering, storing and analyzing information about site visitors and delivering the right information to each visitor at the right time.” – Algorithms for Web Personalization. In 2011, Eli Pariser uncovered the filter bubble. In front of our eyes, Google and Facebook have become geniuses at giving us what we “want,” based on algorithms that guess our interests and concerns. Today, nearly every digital news outlet, search engine and social media app engages in an “invisible, algorithmic editing of the web.” “A squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa.” – Mark Zuckerberg, Facebook.
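The gather/store/analyze/deliver loop in the definition above can be sketched in a few lines. This is a minimal illustration of the general idea, not any real platform's system; all names here (`UserProfile`, `personalize`, the topic labels) are hypothetical.

```python
from collections import Counter

class UserProfile:
    """Gather and store: count the topics a visitor clicks on."""
    def __init__(self):
        self.interests = Counter()  # topic -> click count; missing topics read as 0

    def record_click(self, topic):
        self.interests[topic] += 1

def personalize(profile, articles):
    """Analyze and deliver: rank articles by how often the visitor
    clicked the same topic, most 'relevant' first."""
    return sorted(articles,
                  key=lambda a: profile.interests[a["topic"]],
                  reverse=True)

profile = UserProfile()
for t in ["sports", "sports", "politics"]:
    profile.record_click(t)

feed = personalize(profile, [
    {"title": "Election recap", "topic": "politics"},
    {"title": "Match report",   "topic": "sports"},
    {"title": "Art review",     "topic": "culture"},
])
# Sports leads the feed because the profile clicked it most often -- the
# seed of a filter bubble: what you clicked decides what you see next.
```

Even this toy version shows the self-reinforcing loop Pariser describes: the ranking only ever promotes topics the visitor has already engaged with.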

According to Pariser, the idea is one of relevance – digital platforms racing to deliver the entertainment, opinions and news most relevant to you as the reader. Are you sure? So it appears that Facebook, Google, Netflix and even Twitter are not alone in valuing and chasing relevance to their consumers. References: The Filter Bubble. So how well does personalization work, anyway? Over at Yahoo, according to FastCompany, quite well. Since setting up their crack personalization team in 2009, clicks on Yahoo’s “Today” box have increased 270%. That is to say, personalization makes us nearly four times as likely to click on a link.
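As a quick sanity check on that figure (assuming the reported 270% is an increase over the pre-personalization baseline):

```python
# A 270% increase means clicks grow to baseline + 2.7x baseline = 3.7x,
# which the article rounds to "four times as likely to click".
baseline = 100                          # clicks before personalization (arbitrary unit)
increase_pct = 270                      # reported increase over that baseline
after = baseline * (100 + increase_pct) / 100
lift = after / baseline
print(lift)  # 3.7
```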

Whether you believe personalization makes the internet more efficient, more fractured or more mind-numbing, that’s a pretty impressive number. For those concerned about the self-looping and fragmenting effects of the filter bubble, the good news is that Yahoo’s algorithm is not entirely human-free. Editors are in charge of curating the 50-100 versions of the “Today” module that could pop up on your Yahoo home page; the bots just guide them to which stories work best and, ultimately, which take on “Today” you’ll see. Humans are also, thankfully, still in charge of deciding when civics trumps the bottom line.

‘Personalizing sites leads to manipulation.’ How algorithms determine which news we get to see • Numrush. Algorithms play a major role in our media consumption: we get recommendations for films or music on Spotify and Netflix, and for interesting pages on Facebook and in Google’s search results.
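The editors-curate, algorithm-selects setup described above for Yahoo's "Today" module is commonly implemented with a bandit-style chooser. The sketch below is an epsilon-greedy bandit, a standard illustration of that technique, not Yahoo's actual system; version names and click rates are made up.

```python
import random

random.seed(42)  # for a reproducible run

class EpsilonGreedy:
    """Mostly show the best-clicking curated version; occasionally explore."""
    def __init__(self, versions, epsilon=0.1):
        self.epsilon = epsilon
        self.shows = {v: 0 for v in versions}   # times each version was shown
        self.clicks = {v: 0 for v in versions}  # times it was clicked

    def choose(self):
        if random.random() < self.epsilon:       # explore a random version
            return random.choice(list(self.shows))
        # exploit: pick the highest observed click-through rate so far
        return max(self.shows, key=lambda v:
                   self.clicks[v] / self.shows[v] if self.shows[v] else 0.0)

    def record(self, version, clicked):
        self.shows[version] += 1
        self.clicks[version] += clicked

picker = EpsilonGreedy(["v1", "v2", "v3"])
true_ctr = {"v1": 0.05, "v2": 0.20, "v3": 0.10}  # simulated: v2 clicks best
for _ in range(1000):
    v = picker.choose()
    picker.record(v, random.random() < true_ctr[v])
```

Over many impressions the picker converges on the best-performing version, which is exactly the "bots guide editors to which stories work best" dynamic the article describes.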

But what does that mean for journalism, and for democracy in general? At SXSW on Sunday, Gilad Lotan of the start-up Betaworks and Kelly McBride of The Poynter Institute took up the question of how algorithms help us understand the world better, or instead cloud our picture of reality. McBride argues that journalism today is preoccupied with the question of whether democracy can still function.

“When we talk about democracy, we’re really talking about the marketplace of ideas, and whether your idea can surface in a marketplace of ideas.” In the twentieth century, this marketplace of ideas was the professional press, with journalists as its gatekeepers. According to McBride, young people especially now say: “if the news is important, it will find me.”

Daniele Quercia - Computational Social Science | Urban Informatics. What Happens to #Ferguson Affects Ferguson | The Message. Ferguson is about many things, starting first with race and policing in America.

But it’s also about the internet, net neutrality and algorithmic filtering. It’s a clear example of why “saving the Internet”, as it is often phrased, is not an abstract issue of concern only to nerds, Silicon Valley bosses, and a few NGOs. It’s why “algorithmic filtering” is not a vague concern. It’s a clear example of why net neutrality is a human rights issue; a free speech issue; and an issue of the voiceless being heard, on their own terms.

I saw this play out in multiple countries — my home country of Turkey included — but last night, it became even more heartbreakingly apparent in the United States as well. For me, last night’s Ferguson “coverage” began when people started retweeting pictures of armored vehicles with heavily armored “robocops” on top of them, aiming their muzzles at the protesters, who seemed to number a few hundred. I watched this interaction online. Nada, zip, nada. Algorithms have consequences.

Filter bubble broken in experimental system • Numrush. If you’re online, chances are you’re in a filter bubble. Virtually every browser is ‘personalized’, which narrows your online experience: you only see topics that fit your profile, and you are shielded from conflicting opinions.

Researchers have found a way to break through this filter bubble. Eduardo Graells-Garrido, Mounia Lalmas and Daniele Quercia studied on Twitter how the filter bubble can be broken. It is important that this happens, because filter bubbles lead to polarization around topics. The system these three researchers built brings precisely those people with conflicting views into contact with each other. In the system the researchers developed, recommended tweets are placed in between the tweets of the regular timeline. The system was tested on around 3,000 Twitter accounts. This new system has a lot of potential.

Research shows how the web discriminates on the basis of demographic data • Numrush. Organizations that advocate for freedom on the internet increasingly say that the commercialization of the internet is a growing problem. It has always been difficult to explain exactly how this happens.
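The core mechanism described in that experiment (recommended tweets from the opposing side interleaved into the regular timeline) can be sketched simply. This is a rough illustration of the interleaving idea only; the data model and the fixed insertion interval are assumptions, not the researchers' actual scoring method.

```python
def interleave(timeline, opposing_tweets, every=5):
    """Insert one opposing-view tweet after every `every` timeline tweets."""
    mixed, queue = [], list(opposing_tweets)
    for i, tweet in enumerate(timeline, start=1):
        mixed.append(tweet)
        if i % every == 0 and queue:
            # flag the injected tweet so the UI can mark it as a recommendation
            mixed.append({**queue.pop(0), "recommended": True})
    return mixed

timeline = [{"text": f"tweet {i}", "stance": "pro"} for i in range(10)]
opposing = [{"text": "counterpoint A", "stance": "con"},
            {"text": "counterpoint B", "stance": "con"}]
feed = interleave(timeline, opposing)
# -> 12 items: an opposing-view recommendation after the 5th and 10th tweets
```

The design point is that the user's own timeline stays intact; diversity is added on top of it rather than replacing what they chose to follow.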

Groundbreaking research now shows how people are discriminated against online on the basis of their profiles. The research is carried out by Princeton University and the Belgian KU Leuven. They built a number of bots and assigned each of them a profile. Filter bubble: the results of this research could have far-reaching consequences. In itself there seems to be nothing wrong with this. On top of that, algorithms will increasingly be able to predict internet users’ behavior. Privacy violation: the research is led by professor Arvind Narayanan. Narayanan’s researchers have already looked at the cookie synchronization of the online advertising firm AppNexus.

The research is still at an early stage, and there are no firm conclusions yet.

Web measurement for fairness and transparency. [This is the first in a series of posts giving some examples of security-related research in the Princeton computer science department. We're actively recruiting top-notch students to enter our Ph.D. program, as well as postdocs and visiting scholars. We don't have enough bandwidth here on the blog to feature everything we do, so we'll be highlighting a few examples over the next couple of weeks.]

Everything we do on the web is tracked, profiled, and analyzed. But what do companies do with that information? To what extent do they use it in ways that benefit us, versus discriminatory ways? Let’s consider some examples. What all these and many more examples have in common is that they are ways of using personal information for differential or discriminatory treatment.
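The measurement idea behind the bots-with-profiles research mentioned above is to issue the same request under different synthetic profiles and diff what comes back. The sketch below fakes the responses so the comparison logic is runnable; the URL, profile names, and `fetch_page` are placeholders, and a real study (like the Princeton/KU Leuven one) drives instrumented browsers with trained cookie histories instead.

```python
def fetch_page(url, profile):
    # Placeholder: a real crawler would issue an HTTP request carrying the
    # profile's cookies and user agent. Here we simulate a site that quotes
    # different prices to different profiles.
    fake_prices = {"frugal": 100, "affluent": 120}
    return {"url": url, "price": fake_prices[profile]}

def measure_difference(url, profiles):
    """Fetch the same URL under each profile, for later diffing."""
    return {p: fetch_page(url, p) for p in profiles}

results = measure_difference("https://example.com/product",
                             ["frugal", "affluent"])
prices = {p: r["price"] for p, r in results.items()}
# Any spread between otherwise-identical profiles is evidence of
# differential treatment based on the profile alone.
```

The hard parts a real platform has to solve are building convincing profiles and controlling for noise (A/B tests, load balancing) so that any difference can actually be attributed to the profile.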

Some researchers have used manual or crowdsourcing techniques to look for such differences. What excites me about this project is that the measurement platform draws heavily from diverse areas of computer science.