The End of the News as We Know It: How Facebook Swallowed Journalism. Good afternoon, everyone.
Thank you for coming. It is a great pleasure and honour to be in Cambridge this week as part of the Humanitas CRASSH visiting lecture programme. I must thank my former Observer colleague Professor John Naughton for proposing that I do this, and the Humanitas committee for bringing me here. I would also like to thank St. John’s College, and in particular the Master, Chris Dobson, for their wonderful hospitality. The first talk in this series has a rather apocalyptic title — ‘The End of News As We Know It: How Facebook Swallowed Journalism’.
Capturing Value: Data Journalism as a Revenue Supplement. By Jake Batsell The byproducts of journalism rarely have value to anyone besides the reporters who gather and assemble the information.
(Exhibit A: The troves of spiral notebooks, manila folders and microcassettes left over from my newspaper days, still gathering dust in my garage.) But more news organizations are discovering that cleaned-up, searchable databases have extra value beyond their journalistic utility — and, better yet, can generate revenue to support even more public-interest reporting. The inverted pyramid of data journalism. I’ve been working for some time on picking apart the many processes which make up what we call data journalism.
Indeed, if you have read the chapter on data journalism (blogged draft) in my Online Journalism Handbook, or seen me speak on the subject, you’ll have seen my previous diagram that tries to explain those processes. I’ve now revised it considerably, and what I’ve come up with bears some explanation. I’ve cheekily called it the inverted pyramid of data journalism, partly because it begins with a large amount of information which becomes increasingly focused as you drill down into it, until you reach the point of communicating the results. What’s more, I’ve also sketched out a second diagram that breaks down how data journalism stories are communicated – an area which I think has so far not been widely explored. But that’s for a future post. I’m hoping this will be helpful to those trying to get to grips with data, whether as journalists, developers or designers. In the age of big data, data journalism has profound importance for society.
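The narrowing described in the inverted pyramid can be sketched as a toy pipeline. The stage names below (compile, clean, context, combine, communicate) follow Bradshaw’s published model; the council-spending data is entirely invented for illustration.

```python
# A toy sketch of the inverted pyramid of data journalism: a wide pool of
# raw records is progressively narrowed until one finding is communicated.
raw_rows = [
    {"council": "Anytown", "spend": "1200"},
    {"council": "Anytown", "spend": "n/a"},      # dirty record
    {"council": "Othertown", "spend": "300"},
]

def compile_data(rows):
    """Gather everything that might be relevant."""
    return list(rows)

def clean(rows):
    """Drop records that cannot be parsed, and type the rest."""
    return [dict(r, spend=int(r["spend"])) for r in rows if r["spend"].isdigit()]

def contextualise(rows):
    """Add a comparison point: each record's share of total spend."""
    total = sum(r["spend"] for r in rows)
    return [dict(r, share=r["spend"] / total) for r in rows]

def combine(rows):
    """Narrow to the single most newsworthy record."""
    return max(rows, key=lambda r: r["share"])

def communicate(row):
    """The tip of the pyramid: a publishable line."""
    return f"{row['council']} accounts for {row['share']:.0%} of spending"

story = communicate(combine(contextualise(clean(compile_data(raw_rows)))))
print(story)  # → Anytown accounts for 80% of spending
```

Each stage takes in more than it passes on, which is the point of the pyramid metaphor.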
The promise of data journalism was a strong theme throughout the National Institute for Computer-Assisted Reporting’s (NICAR) 2012 conference.
In 2012, making sense of big data through narrative and context, particularly unstructured data, will be a central goal for data scientists around the world, whether they work in newsrooms, on Wall Street or in Silicon Valley. How to be literate in what’s changing journalism. Recalculating the newsroom: The rise of the journo-coder? This is an edited version of a chapter from Data Journalism: Mapping the Future, being launched tomorrow by Abramis academic publishing, republished with kind permission. Data Journalism: Mapping the Future (RRP £15.95) is available at a reduced rate of £12 for Journalism.co.uk readers. Contact email@example.com for further information.
"Why all your students must be programmers" was the provocative title of one of the liveliest panel discussions at the August 2013 Conference for the Association for Education in Journalism and Mass Communication in Washington DC. The panellists talked passionately about how their programming skills enabled them to take their journalism to a whole new level – interrogating data to find the stories nobody else could or turning static, text- based web pages into dynamic, interactive tools. There was less agreement about what level of "programming" knowledge is actually useful to a journalist. Ctrl + ← The Best Data Journalism Of 2014. Each week, we let you in on the most-read FiveThirtyEight articles and share our favorite data journalism from elsewhere on the Internet in Ctrl + ←.
Seeing as it’s Dec. 28 and the new FiveThirtyEight has now reached the ripe old age of 287 days, we thought it was worth rounding up all the roundups and taking a look at 2014’s biggest and best data journalism pieces, starting with our own. LGBT rights: Same-sex marriage made some major gains in the United States this year, but this interactive feature from The Guardian reminds us that being lesbian, gay, bisexual or transgender is still illegal in 79 countries around the world.
Feilding Cage, Tara Herman and Nathan Good explored the legal status of sex, marriage, civil partnerships, adoption, workplace discrimination and protection against hate crimes for the world’s LGBT population. What the New York Times's 'Snow Fall' Means to Online Journalism's Future. The New York Times debuted a new multimedia feature Thursday so beautiful it has a lot of people wondering — especially those inside the New York Times — if the mainstream media is about to forgo words and pictures for a whole lot more.
Unlike a standard words-on-page article that doesn't diverge too much from print in the design department, "Snow Fall," a multi-"chapter" series by features reporter John Branch, integrates video, photos, and graphics in a way that makes multimedia feel natural and useful, not just tacked on. This tale of last February's Tunnel Creek avalanche — which is also worth reading, not just looking at — opens with full-screen video on loop (we know it looks like a GIF, but it's not) of snow blowing off a mountain. Scrolling down, Branch's text gets peppered with videos and, even more striking, big images.
A fundamental way newspaper sites need to change. A blog entry titled 9 Ways for Newspapers to Improve Their Websites has been making the rounds lately.
I don’t write about the online news industry on this site as much as I used to, but this article inspired me to collect my current thinking on what newspaper sites need to do. Here, I present my opinion of one fundamental change that needs to happen. For background: I have a journalism degree, for what it’s worth, and I’ve worked for newspaper Web sites since 1998 (including the college paper and internships). Vox is our next. Early last year, Melissa Bell, Matt Yglesias and I began wrestling with a question that had bugged all of us for a long time: why hadn't the Internet made the news better at delivering crucial context alongside new information?
This year, we're founding a new publication at Vox Media in order to do something about it. New information is not always — and perhaps not even usually — the most important information for understanding a topic. The overriding focus on the new made sense when the dominant technology was newsprint: limited space forces hard choices. At Circa, it’s not about ‘chunkifying’ news but adding structure. You sometimes hear what we do at Circa described as “chunkifying” — taking the news and presenting it in mobile-friendly chunks.
And while on the surface this observation is correct, it misses the bigger picture. If There Was Already an Ocean of Data in 2007, How Much is there Now? I’ve been trying to figure out how to convey the scale of the ‘Big Data’ phenomenon — the recent worldwide explosion of the volume of data encoded in digital form. Inspiration came from Randall Munroe’s fantastic “What if?” comics, which provide “Serious Scientific Answers to Absurd Hypothetical Questions.” (Check out his 2014 TED talk and pre-order the “What if?” book.) So I decided to (poorly) imitate his methodology and try to seriously answer the question posed in the title of this post – if there was already an ocean of data in the world in 2007, how much more ‘datawater’ was there in 2013? I chose 2007 for no particularly good reason.
Everyblock. APIs: The new distribution. The Guardian just announced that it is releasing all its content through an API, as well as making available many different data sets through a data store, all of which can be mashed up into others’ sites and applications. They join other organizations – the BBC, National Public Radio, and The New York Times – in releasing APIs; it is the crème of news organizations that sees the wisdom in APIs. The Guardian’s API offers more than headlines: articles, video, galleries, everything. It also adds one more important element to its offering: a business model, creating an ad network for users of the API. Upendra Shardanand, my partner and the founder of Daylife, has been saying for a few years that APIs are the future of distribution. The Guardian says its API will put its content “into the fabric of the internet.” Institute: Get Smart About Your Readers.
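To make the API-as-distribution idea concrete, here is a minimal sketch of building a query against a news content API. The endpoint and parameter names follow the Guardian Content API’s public /search route, but treat the details as illustrative; a real call needs a registered api-key and a network request.

```python
from urllib.parse import urlencode

# Base URL of the Guardian Content API's search route (illustrative).
BASE = "https://content.guardianapis.com/search"

def search_url(query, api_key="test", page_size=5):
    """Build a search URL; any HTTP client could then fetch the JSON."""
    params = {"q": query, "api-key": api_key, "page-size": page_size}
    return f"{BASE}?{urlencode(params)}"

url = search_url("data journalism")
print(url)
```

The point of an API like this is that the same URL pattern works from anyone’s site or application, which is what puts the content “into the fabric of the internet.”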
Database journalism – a different definition of “news” and “reader”. Politifact is an innovative journalism project built by Matt Waite as a project of the St. Petersburg Times, inspired by Adrian Holovaty’s 2006 manifesto on “database journalism”. Waite and Holovaty both focus on the “shape” of the information presented by database journalism – stories that have a consistent set of data elements that can be gathered, presented, sliced, and re-used. This structure is foreign to traditional journalism, which thinks of its form as the story, with title, date, byline, lede, and body. The Politifact site started by fact-checking politicians’ statements during the 2008 political campaign. (Re)Structuring Journalism. So what is this Structured Journalism thing anyway? It’s not database-driven journalism, although there are elements of that. It’s not Wiki-driven journalism, although there could be elements of that, too.
And it’s not writing to a template, although there will be some of that. Using Data Visualization as a Reporting Tool Can Reveal Story’s Shape. Readers have come to rely on interactive presentations to understand complicated stories, using them to zoom in on periods of time and highlight areas of interest. Yet to investigate these stories, reporters often create what amounts to handcrafted investigative art: flow charts with circles and arrows, maps shaded with highlighters and stuck with pins. More and more, though, some reporters are using data visualization tools to find the story hidden in the data. Those tools help them discover patterns and focus their reporting on particular places and times. Many of the presentations, which can have rough interfaces or less-than-sleek design, are never published.
At the recent National Institute for Computer-Assisted Reporting (NICAR) conference, Sarah Cohen, database editor for The Washington Post‘s investigative team — and recently named professor of computational journalism at Duke University — showed how reporters can use interactive graphics for their exploratory reporting.
#ijf11: Lessons in data journalism from the New York Times.
The inverted pyramid of data journalism.
Voices: News organizations must become hubs of trusted data in a market seeking (and valuing) trust.
How to be a data journalist.
The growing importance of data journalism.
Les données pour comprendre le monde (Data to understand the world).
Michelle Minkoff: Bringing data journalism into curricula.
5 tips for getting started in data journalism.
Data journalist.
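The exploratory, never-published visualizations described above often start as something as crude as a tally. A minimal sketch, using invented incident dates, of how a rough terminal “chart” can reveal a story’s shape before any polished graphic is designed:

```python
from collections import Counter

# Invented incident dates, standing in for records a reporter has compiled.
incidents = [
    "2011-01-14", "2011-01-20", "2011-02-02",
    "2011-02-05", "2011-02-11", "2011-02-27", "2011-03-08",
]

# Tally by month (the "YYYY-MM" prefix of each ISO date).
by_month = Counter(date[:7] for date in incidents)

# A rough bar chart in the terminal is often enough to spot the spike
# worth focusing the reporting on.
for month, n in sorted(by_month.items()):
    print(f"{month} {'#' * n}")
```

Here the February spike jumps out immediately; that is the pattern a reporter would then go and explain, even though this chart itself would never be published.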