Get the Data: Open Data Q&A Forum

The 70 Online Databases that Define Our Planet Back in April, we looked at an ambitious European plan to simulate the entire planet. The idea is to exploit the huge amounts of data generated by financial markets, health records, social media and climate monitoring to model the planet’s climate, societies and economy. The vision is that a system like this can help to understand and predict crises before they occur so that governments can take appropriate measures in advance. There are numerous challenges here. Nobody yet has the computing power necessary for such a task, nor are there models that can accurately simulate even much smaller systems. Today, we get a grand tour of this challenge from Dirk Helbing and Stefano Balietti at the Swiss Federal Institute of Technology in Zurich. It turns out that there are already numerous sources of data that could provide the necessary fuel to power Helbing’s Earth Simulator. Wikipedia: Wikipedia is the most famous cooperatively edited encyclopedia. Where’s George?
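Wikipedia is one of the open databases on Helbing and Balietti's list, and it is also one of the easiest to query programmatically. Purely as a rough illustration (the page title, the use of the third-party requests package, and the choice of Wikipedia's public REST summary endpoint are my own assumptions, not something the article prescribes), a minimal Python sketch for pulling one record from such a source might look like this:

# Minimal sketch: fetching one record from an open data source
# (Wikipedia's public REST API). The page title is an arbitrary example.
import requests

def wikipedia_summary(title):
    # The summary endpoint returns the page title, a short extract and metadata as JSON.
    url = "https://en.wikipedia.org/api/rest_v1/page/summary/" + title
    response = requests.get(url, headers={"User-Agent": "open-data-demo/0.1"})
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    data = wikipedia_summary("Gross_domestic_product")
    print(data["title"])
    print(data["extract"][:300])

Feeding an Earth-scale simulation would of course require harvesting far more than single records, but many open data sources follow this same basic pattern: an HTTP request that returns structured data.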

Europe: PublicData.eu 5 Ways to find, mix and mash your data :: 10,000 Words One of the most popular trends in online journalism is taking publicly available data and translating it into visualizations or infographics that readers and viewers can quickly and easily understand. A large percentage of the visualizations you see on the web were built from scratch, which can take a considerable amount of time and effort. The following sites allow you to mash your data in record time. Swivel Swivel features more than 15,000 data sets for users to play with in various categories ranging from Economics to Health to Technology. Socrata Socrata is an online space for data lovers to browse datasets as well as create new visualizations to share with others. Widgenie Widgenie lets users upload data from a variety of sources such as Excel spreadsheets, CSV files or Google Spreadsheets and use a drag-and-drop interface to create custom charts and graphs. Verifiable Like the previously mentioned sites, Verifiable allows users to upload, mash and visualize data. DataMasher
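The services above are point-and-click, but the underlying workflow they automate, loading a spreadsheet or CSV and turning it into a chart, is also easy to reproduce in code. The sketch below is my own minimal example rather than anything taken from the tools listed; the file name and column names are hypothetical, and it assumes the pandas and matplotlib packages are installed.

# Minimal sketch of the upload-and-visualize workflow offered by sites like
# Swivel or Widgenie: load a CSV and chart one column against another.
# "health_spending.csv", "country" and "spending_per_capita" are made-up names.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("health_spending.csv")            # one row per country
df = df.sort_values("spending_per_capita")

plt.barh(df["country"], df["spending_per_capita"])
plt.xlabel("Health spending per capita (USD)")
plt.title("Example chart built from a public dataset")
plt.tight_layout()
plt.savefig("health_spending.png")

The hosted tools trade this flexibility for speed; writing the few lines yourself pays off once you need a chart style or a data-cleaning step the services don't offer.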

Masterclass 20: Getting started in data journalism If you are impatient to get started and just want to quickly do some data journalism, click here. If you aren't a subscriber, you'll need to sign up before you can access the rest of this masterclass. If you want to find out what data journalism is, and what it's for, before you get stuck in, then read on, or click on the video or audio files. Video: Are you confused about what data journalism is, how you do it, and what its purpose is? If so, join the club. There is a mystique surrounding data journalism; it’s almost like it’s a dark art and you have to be a wizard to practise it. A very few people are brilliant at it, a number have dabbled in it, loads of journalists think they probably ought to find out about it, but most fear they probably won’t be able to master it. All this throws up a smoke screen about the subject that I hope to dispel in this masterclass. What data journalism is: I aim to show what data journalism is, what it can do, and how to do it.

Gapminder: Unveiling the beauty of statistics for a fact based world view. - Gapminder.org The Palestine Papers UK Open Data Institute (Silicon Roundabout) Posted by Information Age on 28 November 2011 The government will announce a number of open data initiatives tomorrow, including a new Open Data Institute near 'Silicon Roundabout'. The Open Data Institute will "innovate, exploit and research open data opportunities with business and academia", chancellor George Osborne will announce tomorrow. The institute will be directed by leading open data academics Professor Nigel Shadbolt and web inventor Sir Tim Berners-Lee. The scheme recalls the Institute for Web Science, an academic research centre proposed by former prime minister Gordon Brown in March 2010, which was also due to be run by Shadbolt and Berners-Lee. "We want to build on the outstanding work Sir Tim and Nigel Shadbolt have put in to 'making public data public'," said Brown at the time. However, the current government scrapped the £30 million plan in May 2010, saying it was a "low priority".

Tools to help bring data to your journalism « Michelle Minkoff NOTE: This entry was modified on the evening of 11/9/10 to deal with typos and missing words, resulting from posting this too late the previous night. Sleep deprivation isn’t always a good thing — although it allows one to do things more fun than sleep. Like play with data. Note to self: Be more careful in the future. Many of the stories we do every day, across beats, could benefit from a data component. Luckily, a lot of great design and programming folks have created tools to make it easier to organize, clean and display data. So, here’s a round-up of some tools you can use to rapidly produce data pieces without programming knowledge. Prepping tables: Tableizer – Copy and paste cells from your Excel spreadsheet into this tool, and it’ll spit back a formatted HTML table that you can copy and paste into a CMS of your choice. The round-up also covers interactive visualizations with no programming, static visualizations, and using programming to make custom charts.
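Tableizer's job, turning pasted spreadsheet cells into an HTML table, is also handy to do in a few lines of code when you have many tables to publish. The sketch below is my own rough equivalent, not part of Minkoff's round-up; the input file name is hypothetical.

# Rough code equivalent of what Tableizer does by hand: turn tabular data
# (here a CSV file; "schools.csv" is a made-up name) into an HTML table
# that can be pasted into a CMS.
import csv
import html

def csv_to_html_table(path):
    with open(path, newline="", encoding="utf-8") as f:
        rows = list(csv.reader(f))
    header, body = rows[0], rows[1:]
    out = ["<table>"]
    out.append("  <tr>" + "".join("<th>" + html.escape(c) + "</th>" for c in header) + "</tr>")
    for row in body:
        out.append("  <tr>" + "".join("<td>" + html.escape(c) + "</td>" for c in row) + "</tr>")
    out.append("</table>")
    return "\n".join(out)

print(csv_to_html_table("schools.csv"))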

Coding for Journalists 101: A four-part series | Dan Nguyen Update, January 2012: Everything…yes, everything, is superseded by my free online book, The Bastards Book of Ruby, which is a much more complete walkthrough of basic programming principles with far more practical and up-to-date examples and projects than what you’ll find here. I’m only keeping this old walkthrough up as a historical reference. I’m sure the code is so ugly that I’m not going to even try re-reading it. So check it out: The Bastards Book of Ruby. -Dan Update, Dec. 30, 2010: I published a series of data collection and cleaning guides for ProPublica, to describe what I did for our Dollars for Docs project. So a little while ago, I set out to write some tutorials that would guide the non-coding-but-computer-savvy journalist through enough programming fundamentals so that he/she could write a web scraper to collect data from public websites. DISCLAIMER: The code, data files, and results are meant for reference and example only.
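The core skill the series teaches is writing a simple web scraper: request a public page, parse its HTML, and save the structured parts. Nguyen's tutorials and book use Ruby; purely as a hedged illustration (in Python rather than the book's Ruby), the basic pattern looks like this. The URL, the table selector and the output file name are placeholders, not taken from his tutorials, and the example assumes the requests and BeautifulSoup (bs4) packages are installed.

# Minimal sketch of the scraping pattern the tutorials teach: fetch a public
# page, parse its HTML, pull the rows out of a table and save them as CSV.
# "https://example.gov/inspections" and "inspections.csv" are placeholders.
import csv
import requests
from bs4 import BeautifulSoup

URL = "https://example.gov/inspections"

response = requests.get(URL, timeout=30)
response.raise_for_status()
soup = BeautifulSoup(response.text, "html.parser")

rows = []
for tr in soup.select("table tr")[1:]:                 # skip the header row
    cells = [td.get_text(strip=True) for td in tr.find_all("td")]
    if cells:
        rows.append(cells)

with open("inspections.csv", "w", newline="", encoding="utf-8") as f:
    csv.writer(f).writerows(rows)

print("Saved", len(rows), "rows to inspections.csv")

As the original disclaimer says, treat code like this as reference and example only: real sites require you to check their terms, throttle your requests and expect messy markup.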
