
The Social Graph is Neither

I first came across the phrase social graph in 2007, in an essay by Brad Fitzpatrick, though I'd be curious to know if it goes back further. The idea of representing relationships between people as networks is old, but this was the first time I had thought about treating the connections between all living people as one big object that you could manipulate with a computer. At the time he wrote, Fitzpatrick had two points to make. He subsequently went to work for Google, and his Utopian vision of open standards and open data became subsumed in a rivalry between Google and Facebook. That rivalry has brought the phrase 'social graph' into wider use. I think this is a fascinating metaphor, but right now I would like to take issue with the underlying concept, which I think has two flaws: it is not a graph, and it is not social. The idea of the social graph is that each person is a dot in a kind of grand connect-the-dots game, the various relationships between us forming the lines.
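
To make the dots-and-lines picture concrete, here is a minimal sketch (my illustration, not anything from Fitzpatrick's essay) of a social graph held as an adjacency list, with a "friends of friends" query of the kind social sites run constantly; the names are invented:

    from collections import defaultdict

    # A tiny social graph: people are the dots (nodes),
    # relationships are the lines (undirected edges).
    graph = defaultdict(set)

    def connect(a, b):
        """Record a mutual relationship between two people."""
        graph[a].add(b)
        graph[b].add(a)

    connect("alice", "bob")
    connect("bob", "carol")

    # "Friends of friends": everyone two hops from alice,
    # excluding alice herself and her direct connections.
    fof = set()
    for friend in graph["alice"]:
        fof |= graph[friend]
    fof -= graph["alice"] | {"alice"}
    print(fof)  # {'carol'}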

Welcome to Zug: the sleepy Swiss town that became a global economic hub

Nestling beside a lake overlooked by snow-dusted mountains, Zug seems for all the world like just another cute, affluent Swiss town. You could wander its cobbled Altstadt, sample its culinary speciality, a liqueur-drenched Kirschtorte, even stay on to see one of Zug's renowned sunsets, without ever imagining you were at a cardinal point of the global economy, or in a town that, for years, was the hideout of the world's most wanted white-collar criminal. According to the government of the canton, or region, of which Zug is the capital, there are 27,000 companies on its commercial register: one for every man, woman and child in the town, leaving a few hundred to spare. A Zug-registered firm is building the strategically critical gas pipeline that will link Europe with Russia via the Baltic. About 3% of the world's petrol is traded, either as crude oil or refined product, through Zug and the neighbouring town of Baar. Zug also offered that most wanted white-collar criminal, the fugitive commodities trader Marc Rich, a much-needed bolthole after 1983.

Escalation in Digital Sleuthing Raises Quandary in Classrooms
By Marc Parry

The spread of technology designed to combat academic cheating has created a set of tricky challenges, and sometimes unexpected fallout, for faculty members determined to weed out plagiarism in their classrooms. In the latest development, the company that sells colleges access to Turnitin, a popular plagiarism-detection program that checks uploaded papers against various databases to pinpoint unoriginal content, now also caters directly to students with a newer tool called WriteCheck, which lets users scan papers for plagiarism before handing them in. Meanwhile, faculty members at some colleges are adopting a reverse image-search program called TinEye, which lets them investigate plagiarism in visual materials like photos and architectural designs. Cheating is nothing new. One expert on plagiarism, Rebecca Moore Howard, worries that the widespread adoption of antiplagiarism programs is putting professors in the role of police officers.
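
For readers curious about the mechanics: tools of this kind compare a submission against a corpus of previously seen text. The sketch below is a toy illustration of one common technique, word n-gram "shingling" with an overlap score; Turnitin's actual matching is proprietary, and nothing here describes it:

    def shingles(text, n=5):
        """Split text into overlapping n-word sequences ("shingles")."""
        words = text.lower().split()
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

    def overlap(submission, source, n=5):
        """Fraction of the submission's shingles that also appear in the source."""
        sub = shingles(submission, n)
        return len(sub & shingles(source, n)) / len(sub) if sub else 0.0

    # Invented example texts; a high score flags a passage for human
    # review, it does not by itself prove plagiarism.
    paper = "the quick brown fox jumps over the lazy dog near the river bank"
    source = "a quick brown fox jumps over the lazy dog every single morning"
    print(round(overlap(paper, source), 2))  # 0.44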

What Is Sony Now?

Sir Howard Stringer remembers when 2011 was going to be wonderful. "This was the first year of the payoff," he says, "and next year was going to be the second." As chairman, president, and chief executive officer of Sony (SNE), Stringer had spent six years trying to return the Japanese icon to its former glory and open a new era of growth. Sony expected an annual operating profit of at least $2 billion, its best in three years. A batch of new products was headed for store shelves, including its first tablets, a compact 24-megapixel camera, and a portable PlayStation player. Sony was also preparing to launch a global network that would connect the company's movies, music, and video games to all its televisions, tablets, PCs, and phones, an iTunes-like digital platform. The feeling of imminent triumph ended abruptly on March 11, when the Tōhoku earthquake and tsunami struck Japan. Stringer considered returning to Tokyo but decided against it. There's more to Sony's problems than acts of God and currency traders.

Ten years of Windows XP: how longevity became a curse

Windows XP's retail release was October 25, 2001, ten years ago today. Though no longer readily available to buy, it continues to cast a long shadow over the PC industry: even now, a slim majority of desktop users are still using the operating system. Windows XP didn't boast exciting new features or radical changes, but it was nonetheless a pivotal moment in Microsoft's history. It was Microsoft's first mass-market operating system in the Windows NT family. It was also Microsoft's first consumer operating system that offered true protected memory, preemptive multitasking, multiprocessor support, and multiuser security. The transition to pure 32-bit, modern operating systems was a slow and painful one. In the history of PC operating systems, Windows XP stands alone. The success was remarkable for an operating system whose reception was initially quite muted. It faced tough competition from Microsoft's other operating systems. In the end, none of the objections mattered.

Palantir, the War on Terror's Secret Weapon

In October, a foreign national named Mike Fikri purchased a one-way plane ticket from Cairo to Miami, where he rented a condo. Over the previous few weeks, he'd made a number of large withdrawals from a Russian bank account and placed repeated calls to a few people in Syria. More recently, he rented a truck, drove to Orlando, and visited Walt Disney World by himself. As numerous security videos indicate, he did not frolic at the happiest place on earth. He spent his day taking pictures of crowded plazas and gate areas. None of Fikri's individual actions would raise suspicions. The day Fikri drives to Orlando, he gets a speeding ticket, which triggers an alert in the CIA's Palantir system. As the CIA analyst starts poking around on Fikri's file inside of Palantir, a story emerges. Fikri isn't real; he's the John Doe example Palantir uses in product demonstrations that lay out such hypothetical examples. The company's roots lie in the fight against online payment fraud, where the antifraud tools of the time could not keep up with the crooks.
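
Stripped to its skeleton, the demo's logic is that no single record is alarming but a join across independent sources is. The following sketch is purely illustrative; the data, indicator names, and threshold are invented for this example and say nothing about how Palantir's software actually works:

    # Records from unrelated sources, keyed by the same identity.
    # All data here is hypothetical, echoing the Fikri demo.
    financial = {"fikri": ["large_withdrawal", "large_withdrawal"]}
    telecom = {"fikri": ["call_to_watchlist_region"] * 3}
    travel = {"fikri": ["one_way_ticket"]}

    def indicator_count(person):
        """Count distinct indicator types for a person across all sources."""
        seen = set()
        for source in (financial, telecom, travel):
            seen.update(source.get(person, []))
        return len(seen)

    # An external alert (the speeding ticket, in the demo) prompts the join;
    # three or more distinct indicator types escalates the file to an analyst.
    if indicator_count("fikri") >= 3:
        print("escalate to analyst")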

Empirical Software Engineering
As researchers investigate how software gets made, a new empire for empirical research opens up
Greg Wilson, Jorge Aranda

Software engineering has long considered itself one of the hard sciences. A growing number of researchers believe software engineering is now at a turning point comparable to the dawn of evidence-based medicine, when the health-care community began examining its practices and sorting out which interventions actually worked and which were just-so stories. The stakes couldn't be higher. Broadly speaking, people who study programming empirically come at the problem from one of two angles: one camp focuses on the "who," the programmers themselves; the other camp typically focuses on the "what" rather than the "who." The questions we and our colleagues seek to answer are as wide-ranging as those an anthropologist might ask during first contact with a previously unknown culture. Along the way, our field is grappling with the fundamental issues that define any new science. Like all negative results, this one is a bit disappointing.

How to See the Invisible

Everybody's amazed by touch-screen phones. They're so thin, so powerful, so beautiful! But this revolution is just getting under way. Then there are the augmented-reality apps; the term usually refers to a live-camera view with superimposed informational graphics. If you're color-blind like me, then apps like Say Color or Color ID represent a classic example of what augmented reality can do. Other apps change what you see. Some of the most promising AR apps are meant to help you when you're out and about. When you're in a big city, apps like Layar and Wikitude let you peer through the phone at the world around you. Several of these apps are not, ahem, paragons of software stability. As much fun as they are to use, AR apps mean walking through your environment with your eyes on your phone, held at arm's length, a posture with unfortunate implications for social interaction, serendipitous discovery and avoiding bus traffic.

Top 10 Pictures That Shocked The World

It has often been said throughout time that a picture is worth a thousand words. Any picture may be worth a thousand words, but only a few rare photos tell more than a thousand words. They tell a powerful story, a story poignant enough to change the world and galvanize each of us. From the iconic images of Omayra Sanchez's tragic death to the horrifying images of the Bhopal gas disaster in 1984, the power of photography is still alive and invincible. Here is my top 10 list of photos that shocked the world. Warning: be prepared for images of violence and death (in one case, the photograph of a dead child) if you scroll down.

10. Carol Guzy, the first woman to receive a Pulitzer Prize for spot news photography, received her most recent Pulitzer in 2000 for her touching photographs of Kosovo refugees. The above picture portrays Agim Shala, a two-year-old boy, who is passed through a barbed-wire fence to his family.

9. Bullet casings entirely cover a street in Monrovia.

Nadia Shira Cohen

All images courtesy of Nadia Shira Cohen; captions below. In her photo essay Flee, which is featured in Ten Years Later, Nadia Shira Cohen explores the border between Tunisia and Libya, where a sea of migrant workers and refugees had gathered, still wearing the hard hats adorned with the logos of the companies that once employed them in Libya. She talks to Granta's Artistic Director Michael Salu about finding and photographing moments of intimacy, even during times of war and chaos.

MS: Could you tell us about how you found yourself in Ras Ajdir, as opposed to, say, Tripoli?

NSC: Well, I have to admit that I was called for an assignment in Ras Ajdir to photograph the refugee situation because of the type of work I normally do, which is not typically the coverage of breaking news events.

MS: Your delicate use of the lens arrested me as soon as I came across your work.

NSC: When I started out it was because of a desire to explore cultures that were foreign to me and people with fascinating life stories.

The Resignation of Wadah Khanfar and the Future of Al Jazeera

The resignation last week of Wadah Khanfar as managing director of Al Jazeera has provoked speculation that scandal lurks beneath his departure. Many have pointed to a WikiLeaks cable stating that Khanfar had succumbed to pressure from the U.S. in 2005 and played down civilian casualties in some of the network's coverage of the Iraq War. Others have argued that larger political matters related to its coverage of the Arab Spring -- especially its unrestrained, albeit selective, endorsement of democratic reforms -- forced Khanfar's ouster. Both suggestions contain more fancy than substance: it is hard to believe that Doha did not already know about Khanfar's talking to the U.S. ambassador or that pro-democracy strands in Al Jazeera's programming would end his career. The more intriguing question is what comes next for Al Jazeera.

The Curious Science of Counting a Crowd

On June 4, a huge crowd gathered in Hong Kong for a vigil to commemorate the 22nd anniversary of the Tiananmen Square massacre in Beijing. But just how huge? In some stories 77,000 people showed up. Another story, though, listed the attendance as nearly double that: 150,000. There's a reason for the disparity. This story of competing head counts is not uncommon. "Almost everyone who has tried to make a crowd estimate has a vested interest in what the outcome of the estimate is," Charles Seife says.

Crowd-Counting 101
Herbert Jacobs, a journalism professor at the University of California, Berkeley, in the 1960s, is credited with modernizing crowd-counting techniques. Fifty years after Jacobs, the tools for counting crowds have improved, but the principle is the same: area times density. A simple area-times-density calculation has its limits, though. Knowing what to expect, Westergard chose his observation point and launched a tethered balloon at the height of the rally.
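
Jacobs' method comes down to one multiplication: estimate the occupied area, choose a density, multiply. Here is a minimal sketch; the square-feet-per-person figures are commonly cited rules of thumb attributed to Jacobs, not numbers taken from this article:

    # Herbert Jacobs' area-times-density method. Rules of thumb often
    # attributed to Jacobs: a loose crowd occupies roughly 10 sq ft per
    # person, a dense crowd about 4.5, a packed crowd about 2.5.
    SQ_FT_PER_PERSON = {"loose": 10.0, "dense": 4.5, "packed": 2.5}

    def estimate_crowd(area_sq_ft, density):
        """Estimate head count as occupied area divided by area per person."""
        return area_sq_ft / SQ_FT_PER_PERSON[density]

    # A hypothetical plaza of 100,000 sq ft judged "dense" from overhead:
    print(round(estimate_crowd(100_000, "dense")))  # ~22,222 people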

March of the Penguin: Ars looks back at 20 years of Linux

The Linux kernel was originally created by Linus Torvalds, a Finnish computer science student, and first announced to the world on August 25, 1991, exactly 20 years ago today. At the time, Torvalds described his work as a "hobby" and contended that it would not be "big and professional" like the GNU project. But the Linux kernel turned out to be one of the most significant pieces of open source software ever developed. Over the past two decades, it has grown from a humble hobby project into a global phenomenon that runs on everything from low-cost e-book readers to a majority of the world's supercomputers.

From Freax to Linux
While it's easy now to take the name "Linux" for granted, Torvalds modestly rejected the idea of naming the new kernel after himself, instead calling it Freax. The original 0.01 release of Linux could not actually run.

The UNIX landscape at the birth of Linux
MINIX also played a major role in early Linux history. As Torvalds put it in the 0.01 release notes: "Sadly, a kernel by itself gets you nowhere."
