
Outils & Doc


France - 2015 regional election results. Le Monde pushes its editorial data. "Les données du Monde" went online only a few days ago.

Le Monde pushes its editorial data

A mix of data and editorial content devoted to the communes of France, "Les données du Monde" aims to commit Le Monde to an ambitious, fully assumed form of data journalism, robot journalists included. Luc Bronner, editorial director of Le Monde, explains. Interview. CB News: A few days ago, you quietly put "Les données du Monde" online. Luc Bronner: It is a project we have been working on for a year. Digital Agency Hei-Da. TextWrangler is an all-purpose text and code editor for Mac OS X, based on the same award-winning technology as BBEdit, our leading professional HTML and text editor.

TextWrangler

We will eventually be retiring TextWrangler from our product line, so we encourage anyone interested in TextWrangler to download and use BBEdit instead. We've put together a handy chart comparing BBEdit and TextWrangler to help you out. Should I upgrade to BBEdit? Introducing Autotune. Written by Kavya Sukumar and Ryan Mark, July 8, 2015. Today we're announcing a new project we've been working on at Vox Media: Autotune.

Introducing Autotune

We built this application to address the problem of reusability in our work. This project is open source and available to everyone. As any news hacker knows, one of the most challenging requests we get is for "more of those things." We'll make a neat chart, visualization or map, which sees some success: our readers or reporters like it, or maybe it helps tell a better story. One of the most difficult messages to communicate to our non-developer colleagues is how tricky "reusability" is. It may sound as if this is a problem, a lack of foresight or a rookie mistake, but it is not. Journalist's Resource: Research for Reporting, from the Harvard Shorenstein Center. The best rapper alive, as decided by computers.

Finally, data science has begun tackling rap.

The best rapper alive, as decided by computers

It makes sense, because rap is a good subject for algorithms to latch onto: lyrics are a dense data set, analysts have a lot of words to work with, and songs are heavy on allusions and references that make for fascinating connections. In 2014, Matt Daniels ranked rappers by the breadth of their vocabularies (Aesop Rock and GZA took first and second place, respectively). Algorithm That Counts Rap Rhymes and Scouts Mad Lines. "Men lie, women lie, numbers don't" - Jay Z. Among the many things rappers like to boast about, some are relatively easy to quantify, like money, whereas rhyming skill has been very difficult to measure - until now.

Algorithm That Counts Rap Rhymes and Scouts Mad Lines

In this post, I’ll present Raplyzer, a computer program which automatically detects rhymes from rap lyrics and which is used to rank popular rappers based on their average Rhyme factor. I’ll also present another program called BattleBot, which is a search engine for rhyming rap lines based on the algorithm used in Raplyzer. Automated Insights - High Quality Automated Content Services. New Articles on snopes.com. Emergent. Fact-checking U.S. politics. Semantria: Text Analytics and Sentiment Analysis for Everyone. Alpha: Computational Knowledge Engine. ClikView.
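The core idea of a "Rhyme factor" can be sketched with a toy, letters-only approximation: score consecutive line endings by the length of their longest matching vowel-sequence suffix, and average over all pairs. Everything below (the `rhyme_factor` helper, the vowels-as-letters shortcut, the sample couplet) is an illustrative assumption, not Raplyzer's actual implementation, which works on phonetic transcriptions rather than spelling.

```python
import re

def vowels(word: str) -> str:
    """Keep only the vowel letters of a word (a crude stand-in for vowel sounds)."""
    return "".join(re.findall(r"[aeiouy]", word.lower()))

def rhyme_length(a: str, b: str) -> int:
    """Length of the longest common vowel-sequence suffix of two words."""
    va, vb = vowels(a), vowels(b)
    n = 0
    while n < min(len(va), len(vb)) and va[-1 - n] == vb[-1 - n]:
        n += 1
    return n

def rhyme_factor(lines: list[str]) -> float:
    """Average rhyme length over consecutive pairs of line endings."""
    endings = [line.split()[-1] for line in lines if line.split()]
    pairs = list(zip(endings, endings[1:]))
    return sum(rhyme_length(a, b) for a, b in pairs) / len(pairs)

couplet = ["I got the money and the fame", "everybody know my name"]
print(rhyme_factor(couplet))  # "fame"/"name" share the vowel suffix "ae" -> 2.0
```

A real rhyme detector would also handle assonance across word boundaries and multi-syllable rhymes, which is where the phoneme-level approach earns its keep.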

Publish your data online. Collecting data from the web. You have tried everything, and you still have not managed to get your hands on the data you want.

Collecting data from the web

You found the data on the web, but alas - no download option is available and copy-and-paste shows its limits. Fear not, there is always a way to extract the data. You can, for example, try the following approaches. Data + Design. DocumentCloud. Visualization and Data Mining Software. OpenRefine (ex-Google Refine). How to Scrape Google Search Results with Google Sheets. Learn how to easily scrape Google search results pages and save the keyword ranking data inside Google Spreadsheets using the ImportXML formula.

How to Scrape Google Search Results with Google Sheets

This tutorial explains how you can easily scrape Google Search results and save the listings in a Google Spreadsheet. It can be useful for monitoring the organic search rankings of your website in Google for particular search keywords vis-à-vis other competing websites. Or you can export search results to a spreadsheet for deeper analysis. There are powerful command-line tools, curl and wget for example, that you can use to download Google search result pages. The HTML pages can then be parsed using Python's Beautiful Soup library or the Simple HTML DOM parser of PHP, but these methods are too technical and involve coding. If you ever need to extract results data from Google search, there's a free tool from Google itself that is perfect for the job.
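For readers who do want the coding route, the parsing step after a curl or wget download can even be done with Python's standard library alone. This is a minimal sketch: the `LinkExtractor` class and the toy HTML string are illustrative assumptions, and real search-result markup is far more complex and changes often.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect (href, anchor text) pairs from <a> tags in an HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

# A toy stand-in for a results page previously saved with curl or wget.
page = '<div><a href="https://example.com/">Example result</a></div>'
parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # [('https://example.com/', 'Example result')]
```

Beautiful Soup or PHP's Simple HTML DOM would shorten this considerably; the stdlib version is shown only to keep the sketch dependency-free.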

The idea is simple. Kimono: Turn websites into structured APIs from your browser in seconds. This Simple Data-Scraping Tool Could Change How Apps Are Made. The number of web pages on the internet is somewhere north of two billion, perhaps as many as double that.

This Simple Data-Scraping Tool Could Change How Apps Are Made

It’s a huge amount of raw information. By comparison, there are only roughly 10,000 web APIs–the virtual pipelines that let developers access, process, and repackage that data. In other words, to do anything new with the vast majority of the stuff on the web, you need to scrape it yourself. Even for the people who know how to do that, it’s tedious. Ryan Rowe and Pratap Ranade want to change that.

For the last five months, Rowe and Ranade have been building out Kimono, a web app that lets you slurp data from any website and turn it instantly into an API. Excitement's already bubbling around the potential. Announcing Portia, the open source visual web scraper! DataMiner. Temboo. Web Data Platform & Free Web Scraping Tool. @joelmatriche » Le blog de jo. When data is properly formatted as a table in a web page, it is easy to import it directly into an Excel sheet.

@joelmatriche » Le blog de jo

And to automate the updates. Examples: dynamically importing stock quotes into a spreadsheet, and monitoring, from Excel, any changes made to a web page. Example 1: monitoring a web page. Suppose that, as a journalist, I want to read the reports of the committees of the Belgian Chamber of Representatives as soon as they are published. There are of course monitoring tools that will notify me of any change made, on the Chamber's website, to the list of these committees. The first step, of course, is to open this page and copy its address. Publish your data responsively. ScraperWiki.
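The blog's approach relies on Excel's built-in web query. For readers who prefer a script, the same kind of HTML table can be flattened into rows with Python's standard library. This is a hedged equivalent, not code from the blog: the `TableParser` class and the sample stock-quote table are illustrative assumptions.

```python
from html.parser import HTMLParser

class TableParser(HTMLParser):
    """Flatten HTML table rows into a list of lists of cell strings."""
    def __init__(self):
        super().__init__()
        self.rows = []
        self._row = None
        self._cell = None

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._cell = []

    def handle_data(self, data):
        if self._cell is not None:
            self._cell.append(data)

    def handle_endtag(self, tag):
        if tag in ("td", "th") and self._cell is not None:
            self._row.append("".join(self._cell).strip())
            self._cell = None
        elif tag == "tr" and self._row is not None:
            self.rows.append(self._row)
            self._row = None

# A toy web-page table, standing in for a downloaded quotes page.
html = ("<table><tr><th>Ticker</th><th>Price</th></tr>"
        "<tr><td>ABC</td><td>12.5</td></tr></table>")
p = TableParser()
p.feed(html)
print(p.rows)  # [['Ticker', 'Price'], ['ABC', '12.5']]
```

Rerunning such a script on a schedule gives the same "automated refresh" effect that the Excel web query provides natively.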