
Get the Data: Data Q&A Forum

22 free tools for data visualization and analysis (Review, April 20, 2011, Computerworld). You may not think you've got much in common with an investigative journalist or an academic medical researcher. But if you're trying to extract useful information from an ever-increasing inflow of data, you'll likely find visualization useful, whether it's to show patterns or trends with graphics instead of mountains of text, or to explain complex issues to a nontechnical audience. Want to see all the tools at once? For quick reference, check out our chart listing all the tools profiled here. There are many tools around to help turn data into graphics, but they can carry hefty price tags. Here's a rundown of some of the better-known options, many of which were demonstrated at the Computer-Assisted Reporting (CAR) conference last month. Data cleaning: before you can analyze and visualize data, it often needs to be "cleaned." With DataWrangler, click on a row or column and it will suggest changes.
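To make the "cleaning" step concrete, here is a minimal sketch in plain Python; the field names and rules are hypothetical examples, not DataWrangler's actual transformations.

```python
# Illustrative data cleaning: trim whitespace, normalize empty strings
# to None, and coerce numeric-looking strings to numbers.

def clean_row(row):
    """Clean one record represented as a dict of strings."""
    cleaned = {}
    for key, value in row.items():
        value = value.strip()
        if value == "":
            cleaned[key] = None  # empty cell becomes a proper missing value
        elif value.replace(".", "", 1).isdigit():
            cleaned[key] = float(value) if "." in value else int(value)
        else:
            cleaned[key] = value
    return cleaned

rows = [{"city": "  Boston ", "population": " 617594", "note": ""}]
cleaned = [clean_row(r) for r in rows]
```

Tools like DataWrangler automate exactly this kind of repetitive per-column transformation, suggesting rules instead of requiring you to script them by hand.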

What is open data? Some emblematic examples. ParisData has launched! As announced, the city of Paris put online last week a number of data sets accessible to everyone. Mapize and bluenove took the opportunity to publish the very first example of reuse of the data opened up by Paris. It lets you visualize, by arrondissement and by year, the number of births, deaths, and marriages. Other reuses should develop progressively, all the more so as the variety of data sets on offer grows. Having already explained the main principles, today I propose to present, beyond the Paris example, a few other concrete open data initiatives, and what these initiatives have made possible. Public data above all! The law, notably in France, encourages this massive presence of public data in the open data landscape. Who releases which types of data? What can be done with them?

The 70 Online Databases that Define Our Planet. Back in April, we looked at an ambitious European plan to simulate the entire planet. The idea is to exploit the huge amounts of data generated by financial markets, health records, social media and climate monitoring to model the planet's climate, societies and economy. The vision is that a system like this can help to understand and predict crises before they occur so that governments can take appropriate measures in advance. There are numerous challenges here. Nobody yet has the computing power necessary for such a task, nor are there models that can accurately capture even much smaller systems. Today, we get a grand tour of this challenge from Dirk Helbing and Stefano Balietti at the Swiss Federal Institute of Technology in Zurich. It turns out that there are already numerous sources of data that could provide the necessary fuel to power Helbing's Earth Simulator. Wikipedia is the most famous cooperatively edited encyclopedia. Where's George?

RFC 6749 - The OAuth 2.0 Authorization Framework. Proposed Standard, Internet Engineering Task Force (IETF), D. Hardt, Ed., October 2012. In the traditional client-server authentication model, the client requests an access-restricted resource (protected resource) on the server by authenticating with the server using the resource owner's credentials. Compromise of any third-party application then results in compromise of the end-user's password and all of the data protected by that password. OAuth defines four roles, among them the resource owner: an entity capable of granting access to a protected resource. In the final step of the flow, the client requests the protected resource from the resource server and authenticates by presenting the access token.
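The two messages mentioned in the excerpt can be sketched as plain data structures, without live HTTP calls. The parameter names come from RFC 6749 (section 4.1.3); the example values and function names are illustrative placeholders, and the Bearer header format follows the companion spec RFC 6750.

```python
# Sketch of the authorization-code grant: first the token request body,
# then the header used to present the resulting access token.

def token_request(code, redirect_uri, client_id):
    """Body of the access token request (RFC 6749, section 4.1.3)."""
    return {
        "grant_type": "authorization_code",  # fixed value for this grant type
        "code": code,                        # the authorization code received
        "redirect_uri": redirect_uri,        # must match the earlier request
        "client_id": client_id,
    }

def resource_request_headers(access_token):
    """Header presenting a bearer token to the resource server (RFC 6750)."""
    return {"Authorization": "Bearer " + access_token}

body = token_request("SplxlOBeZQQYbYS6WxSbIA",
                     "https://client.example.com/cb", "s6BhdRkqt3")
headers = resource_request_headers("2YotnFZFEjr1zCsicMWpAA")
```

In a real client these would be sent as an HTTPS POST to the token endpoint and as a request header to the resource server, respectively, with the client also authenticating itself to the token endpoint.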

Fusion Tables (Beta). Bust your data out of its silo! Get more from data with Fusion Tables, an experimental data visualization web application to gather, visualize, and share data tables. Visualize bigger table data online: filter and summarize across hundreds of thousands of rows. Two tables are better than one! Merge two or three tables to generate a single visualization that includes both sets of data. Make a map in minutes. Host data online and stay in control: viewers located anywhere can produce charts or maps from it. Import your own data: upload data tables from spreadsheets or CSV files, even KML. Visualize it instantly: see the data on a map or as a chart immediately. Publish your visualization on other web properties: now that you've got that nice map or chart of your data, you can embed it in a web page or blog post. See how journalists and nonprofits around the world use Fusion Tables.
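Fusion Tables' "merge" feature amounts to a join on a shared key column. A minimal sketch of that idea over plain lists of dicts; the column names and figures are made up for illustration, not taken from any real data set.

```python
# Inner-join two row lists on a shared key column, the way merging two
# uploaded tables combines their columns into one visualizable table.

def merge_tables(left, right, key):
    """Return rows combining columns from both tables where keys match."""
    index = {row[key]: row for row in right}  # index the right table by key
    merged = []
    for row in left:
        match = index.get(row[key])
        if match is not None:
            combined = dict(row)   # copy left-table columns
            combined.update(match) # add right-table columns
            merged.append(combined)
    return merged

births = [{"district": "11e", "births": 2400}]
deaths = [{"district": "11e", "deaths": 1100}]
table = merge_tables(births, deaths, "district")
```

The merged rows carry every column from both inputs, which is what lets a single map or chart draw on both data sets at once.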

The Emerging Field of Data Markets: our competitive landscape. One of the most common questions we've gotten since our international launch is "How is DataMarket different from X?", where X is pretty much any of the companies or solutions mentioned below. While the obvious answer to that question is "Try us out and see for yourself!", this post is an attempt to explain. What is DataMarket? DataMarket is a search engine for statistical data. Or, as we wrote in the first draft of the business plan back in early 2008: DataMarket's mission is to build an active marketplace for structured data and statistics. That said, here's a rundown of some of the other players in this emerging field. Closest matches: Timetric (@timetric). Status and background: Timetric is a UK-based startup. In their own words: "We design and build statistical data services to help people and businesses make better decisions." Team size: 6 (according to CrunchBase). Google Public Data. Team size: approx. 10 people.

The Palestine Papers

How I Made Porn 20x More Efficient with Python. Intro: Porn is a big industry. There aren't many sites on the Internet that can rival the traffic of its biggest players. And juggling this immense traffic is tough. To make things even harder, much of the content served from porn sites is made up of low-latency live streams rather than simple static video content. What's the problem? A few years ago, I was working for the 26th (at the time) most visited website in the world, and not just in the porn industry: in the world. At the time, the site served porn streaming requests with the Real-Time Messaging Protocol (RTMP): the user requests access to some live stream, and the server replies with an RTMP session playing the desired footage. For a couple of reasons, FMS (Adobe's Flash Media Server) wasn't a good choice for us, starting with its costs, which included purchasing both: Windows licenses for every machine on which we ran FMS, and ~$4k FMS-specific licenses, of which we had to purchase several hundred (and more every day) due to our scale. All of these fees began to rack up.
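The request flow described above starts with the client naming a stream by URL and the server routing that name to a session. A hedged sketch of just the routing step in Python; the URL format, host name, and function names are illustrative assumptions, not the article's actual code.

```python
# Split an rtmp:// URL into the pieces a streaming server would route on:
# the host, the application name, and the stream key.
from urllib.parse import urlparse

def parse_stream_request(url):
    """Break an RTMP stream URL into host, application, and stream key."""
    parts = urlparse(url)
    # path looks like "/<app>/<stream>"; partition on the first slash
    app, _, stream = parts.path.lstrip("/").partition("/")
    return {"host": parts.hostname, "app": app, "stream": stream}

req = parse_stream_request("rtmp://media.example.com/live/cam42")
```

A server would look up `req["stream"]` in its table of active live sources and reply with an RTMP session for that footage; the actual protocol handshake and media framing are far more involved than this routing step.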
