FTC looks at the future of news

Posted by Josh Cohen, Senior Business Product Manager

For the next two days, the Federal Trade Commission will explore a subject that's central to democracy: the future of news. I'll be representing Google at the event, which the commission is calling "From Town Criers to Bloggers: How Will Journalism Survive the Digital Age?" We're an optimistic company, so maybe it's no surprise that we believe journalism will not only survive, but thrive on the Internet.

Why does Google care about the future of news?

Traffic: Google makes it easy for people to find the news they're looking for and to discover new sources of information.

Audience engagement: Google offers news publishers free tools to engage more deeply with their audiences.

Revenue: Google provides a variety of advertising solutions to help publishers maximize their revenue.

Just as there's no single cause of the news industry's current struggles, there's no single solution.
How the New York Times and CNN try to keep up with the tech companies

"The New York Times is now as much a technology company as a journalism company," its executive editor Bill Keller said recently. A glance at the top ten breaking news sites online shows how seriously that statement must be taken: in 2009 that list was often led by a tech company rather than a traditional news organisation. AOL News, Yahoo News and MSNBC News attract more US readers than CNN or the New York Times. Being a big traditional news brand doesn't necessarily bring you success on the web.

The newly renovated courthouse captures the situation of news organisations perfectly: lots of nicely renovated rooms, but no windows onto what is happening outside. Today, getting the platform right is as important as the quality of the content. The two biggest US players in quality news, CNN and the New York Times, are dealing with this challenge in quite different ways.

R&D at the New York Times

The future of news consumption is at the core of the Times's technological approach.
Getting the CSS Internet 2.0 religion (don't m

Posted by Tom Foremski - October 17, 2006

The past few days I've been working on my CSS skills: the media technology that lies at the heart of this next phase of the Internet. I don't mind learning some Geek; in fact, I speak a little Geek, since I was a software engineer for a very short time, a long time ago. I'm of the opinion that these days I should be a "technology enabled" journalist, and I encourage my media colleagues to do the same. I don't need to be proficient in these computer languages, but I should know enough to do basic things with these tools, because there is an opportunity for journalists to become "media engineers."

When I worked as a mainstream journalist, we didn't have to learn an alphabet soup of new skills all the time. Our main requirement as journalists was to meet deadlines; we didn't need exotic skills such as typing or spelling.

Cascading Style Sheets (CSS) can deconstruct the media world on the fly...
Media Curation Is Now Consumer-Generated

Can 'Curation' Save Media?

Why Social Beats Search

That's a controversial post headline, and I don't mean that social will always beat search, but there's a rising chorus out there about "content farms" and search-optimized content creation that is worth touching on. Arrington started it when he posted about "the end of hand crafted content". Richard MacManus penned a similar post the same day, called "Content Farms: Why Media, Blogs, and Google Should Be Worried". When a web service like Google controls a huge amount of web traffic (more than 50% of it for many sites), it's going to get spammed up. What's worse, and what Mike and Richard are talking about, is the practice of search-engine-driven content creation.

I left this comment at the end of a very long comment thread on Arrington's post:

social tools will allow us to decide what is crap and what is not. our social graphs will help us. search engines won't. it's a lot harder to spam yourself into a social graph.

The Internet is a massive content creation machine.
Content farms

I've been writing a lot about so-called 'content farms' in recent months: companies like Demand Media and Answers.com, which create thousands of pieces of content per day and are making a big impact on the Web. Both companies are now firmly inside the top 20 Web properties in the U.S., on a par with the likes of Apple and AOL. Big media, blogs and Google are all beginning to take notice. Chris Ahearn, President of Media at Thomson Reuters, recently published an article on how journalism can survive in the Internet age.

I started my analysis of Demand Media in this August post. In November I explored how Demand Media produces 4,000 pieces of content a day, based on an interview I did with the founders in September.

Low Quality, High Impact

The bottom line is that the quality of content produced by these 'content farms' is dubious, which has an impact on both publishers and readers.

Can Quality Survive?

Google Needs to Wake Up and Smell the Coffee
Content farms v. curating farmers

Tweet: Content farms v curating farmers: deeper insights into Demand Media's model & finding opportunity in finding quality.

I spent an hour on the phone the other day with Steven Kydd, exec VP of Demand Studios, to understand their model: using algorithms to assign content creation based on search and advertising demand, so as to minimize cost and maximize revenue. I wanted to learn a deeper layer of lessons than I think we're hearing in the discussion of Demand's allegedly evil genius. The talk thus far misses their key insight and the opportunities they create.

Much of what I see online is fear that Demand Media (with the slightly rechristened "Aol." following fast behind) will cheapen content and flood the internet, which is to say search results, with crap that's just good enough to fool algorithms. Some also fear that, while putting content creators to work, they will put better content creators out of work: the dreaded deprofessionalization and deflation of media. They may be right.
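The model Kydd describes, algorithmically commissioning articles wherever projected ad revenue from search demand exceeds the cost of production, can be sketched roughly like this. To be clear, this is a hypothetical illustration of the general idea, not Demand Studios' actual algorithm; every function name, input and number below is invented.

```python
# Toy sketch of demand-driven content assignment: score candidate topics
# by projected ad revenue from search traffic, and commission only those
# whose margin over a flat production fee is positive.

def projected_revenue(monthly_searches, click_share, revenue_per_click):
    """Estimated monthly ad revenue if an article captures this topic's traffic."""
    return monthly_searches * click_share * revenue_per_click

def assign_topics(candidates, production_cost, payback_months=12):
    """Return topics whose revenue over the payback window beats the flat
    fee paid to a freelancer, sorted by margin (best first)."""
    scored = []
    for topic, searches, click_share, rpc in candidates:
        revenue = projected_revenue(searches, click_share, rpc) * payback_months
        margin = revenue - production_cost
        if margin > 0:
            scored.append((margin, topic))
    return [topic for margin, topic in sorted(scored, reverse=True)]

candidates = [
    # (topic, monthly searches, est. click share, revenue per click in $)
    ("how to tie a bow tie", 40_000, 0.05, 0.10),
    ("obscure poetry analysis", 300, 0.05, 0.08),
    ("best budget laptops", 90_000, 0.03, 0.25),
]

print(assign_topics(candidates, production_cost=15.0))
# The low-demand topic never gets commissioned, however good the writing
# might have been, which is exactly the deflationary dynamic critics fear.
```

Note that editorial merit appears nowhere in the scoring: the only inputs are demand and price, which is why the output skews toward high-volume how-to queries.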
World's Biggest Blogging Platform Adds Curation Feature

WordPress, the biggest blog software platform on the Web, has added a "reblogging" curation feature much like the one the smaller, innovative service Tumblr has offered for years. It's another chapter in the race to reduce the friction of sharing your favorite Web content with friends.

If the previous era of innovation on the Web was fundamentally characterized by the democratization of publishing and content creation, the next era may be based on finding ways to build value on top of all that newly published data. Much of that value capture will be performed by machines, but tools for humans could be a game changer as well. As we wrote yesterday, Google VP Marissa Mayer says the average person uploaded 15 times more data in 2009 than just three years earlier. The gap between the value made possible by all this data and the power of the tools available to consumers to capture it is so great that it simply must be filled.

Can Curation Catch On?

What do you think?
Why Content Curation Is Here to Stay

Steve Rosenbaum is the CEO of Magnify.net, a video curation and publishing platform. Rosenbaum is a blogger, video maker and documentarian. You can follow him on Twitter @magnify and read more about curation at CurationNation.org.

For website content publishers and content creators, there's a debate raging over the rights and wrongs of curation. The debate pits creators against curators, raising big questions about the rules and ethics of content aggregation. In trying to understand the issue and the new rules emerging, I reached out to some of the experts who are weighing in on how curation could help creators and web users have a better online experience.

The Issues at Hand

Content aggregation (the automated gathering of links) can be seen on sites like Google News. But all that changes with curation: the act of human editors adding their work to the machines that gather, organize and filter content. Who are curators?

Where We Stand Now
Yahoo and Google in high-tech news war

Google News ranks sixth among online news sites, with just under 15 million unique visitors, behind MSNBC, AOL News, CNN and the New York Times, according to Nielsen. Started in 2002, the service flipped on its head the traditional way people consumed news. Instead of delivering the day's stories from pre-selected sources, as a newspaper or even many online aggregators do, it offers headlines from a wide range of outlets and lets people choose which ones to read.

At the end of last month, the Mountain View company redesigned the site, allowing users to pick the subjects and sources they're most interested in, as well as local news and weather for areas they choose. Humans write the algorithms that pull in the news, but beyond that, the computers decide which headlines show up where. "It's a different business model," said Josh Cohen, senior product manager for Google News. It's an important distinction for publishers.