
Privacy


Nothing to hide? Really?

With Big Data Comes Big Responsibility

“You should presume that someday, we will be able to make machines that can reason, think and do things better than we can,” Google co-founder Sergey Brin said in a conversation with Khosla Ventures founder Vinod Khosla.

To someone as smart as Brin, that comment is as normal as sipping on his super-green juice, but to someone who is not from this landmass we call Silicon Valley, or part of the tech set, that comment is about the futility of their future. And more often than not, the reality of the Silicon Valley giants, who are really the gatekeepers of the future, is increasingly in conflict with the reality of the real world. What heightens that conflict are the opaque and often tone-deaf responses from companies big and small.

Silicon Valley (both the idea and the landmass) means that we always try to live in the future. We imagine what the future looks like and then we try and build it.

That Uncanny Feeling

Which Company Has Your Back?

Reading the Privacy Policies You Encounter in a Year Would Take 76 Work Days - Alexis Madrigal - Technology

One simple answer to our privacy problems would be if everyone became maximally informed about how much data was being kept and sold about them.

Logically, to do so, you'd have to read all the privacy policies on the websites you visit. A few years ago, two researchers, both then at Carnegie Mellon, decided to calculate how much time it would take to actually read every privacy policy you should read. First, Lorrie Faith Cranor and Aleecia McDonald needed a solid estimate for the average length of a privacy policy. The median length of a privacy policy from the top 75 websites turned out to be 2,514 words.

A standard reading rate in the academic literature is about 250 words a minute, so each privacy policy costs roughly ten minutes to read. Next, they had to figure out how many websites, each with its own privacy policy, the average American visits in a year. Multiplied out, their estimate of the national cost of all that reading came out greater than the GDP of Florida, which has the fourth largest state economy in the US.
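To make that arithmetic concrete, here is a minimal sketch that multiplies the figures quoted in the excerpt (a 2,514-word median policy read at about 250 words per minute) by a count of distinct sites visited per year. The excerpt does not quote the researchers' sites-per-year estimate, so the sites_per_year value below is a hypothetical placeholder, not a number from the article, and the result will not necessarily match the 76-work-day headline, which rests on their own inputs.

```python
# Back-of-envelope estimate of the time cost of reading privacy policies,
# using only the figures quoted in the excerpt above. The sites-per-year
# value is a placeholder assumption, not a number taken from the article.

MEDIAN_POLICY_WORDS = 2514   # median policy length across the top 75 websites
READING_RATE_WPM = 250       # standard reading rate from the academic literature
WORK_DAY_MINUTES = 8 * 60    # assume an 8-hour work day


def reading_cost(sites_per_year):
    """Return (minutes per policy, work days per year) for a given site count."""
    minutes_per_policy = MEDIAN_POLICY_WORDS / READING_RATE_WPM  # roughly 10 minutes
    total_minutes = minutes_per_policy * sites_per_year
    return minutes_per_policy, total_minutes / WORK_DAY_MINUTES


if __name__ == "__main__":
    per_policy, work_days = reading_cost(sites_per_year=1500)  # hypothetical count
    print(f"~{per_policy:.0f} minutes per policy, ~{work_days:.0f} work days per year")
```

With the placeholder of 1,500 sites this works out to roughly 31 eight-hour work days; the article's 76-work-day total comes from the researchers' own estimate of how many policies people encounter, which is not quoted here.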

You Call That Evil?

There’s a nice little insider quarrel going on over Google’s just-announced privacy policy changes.

A number of sites and commentators have let their fingers jump up mechanically in accusatory fashion: Google, caught red-handed being evil! Here, I think, is a time when the word “bias” is actually warranted. Everyone wants so badly for Google to do something truly evil (instead of just questionable or inconvenient) that their perceptions of Google’s actions are actually being affected.

Casting events systematically in a non-objective light is the exhibition of bias, and the continual presentation of policies one disagrees with as evidence of “evil” seems to fall under that category. Google going evil has become the Godwin’s Law of tech commentary. What specifically is evil about this particular action?

The freedom to be who you want to be…

Posted by Alma Whitten, Director of Privacy, Product and Engineering

Peter Steiner’s iconic “on the Internet, nobody knows you’re a dog” cartoon may have been drawn in jest, but his point was deadly serious, as recent events in the Middle East and North Africa have shown.

In reality, as the web has developed, with users anywhere able to post a blog, share photos with friends and family or “broadcast” events they witness online, the issue of identity has become increasingly important. So we’ve been thinking about the different ways people choose to identify themselves (or not) when they’re using Google, in particular how identification can be helpful or even necessary for certain services, while optional or unnecessary for others. Attribution can be very important, but pseudonyms and anonymity are also an established part of many cultures, for good reason. When it comes to Google services, we support three types of use: unidentified, pseudonymous and identified.

“Privacy” as Censorship: Fleischer Dismantles the EU’s “Right to Forget”

Few people have experienced just how oppressive “privacy” regulation can be quite so directly as Peter Fleischer, Google’s Global Privacy Counsel.

Early last year, Peter was convicted by an Italian court because Italian teenagers used Google Video to host a video they shot of them bullying an autistic kid, even though he didn’t know about the video until after Google took it down. Of course, imposing criminal liability on corporate officers for failing to take down user-generated content is just a more extreme form of the more popular concept of holding online intermediaries liable for failing to take down content that is allegedly defamatory, bullying, invasive of a user’s privacy, etc. Both have the same consequence: given the incredible difficulty of evaluating such complaints, sites that host UGC will tend simply to take it down upon receiving complaints, thus being forced to censor their own users.

Judge Says It’s Reasonable For Any Photo Taken To Go Viral. A Dangerous Precedent? - Kashmir Hill

How FB is Redefining Privacy
