
Current dev


Scrapy | A Fast and Powerful Scraping and Web Crawling Framework.

Charlesflynn/eve-docs.

Python REST API Framework — Eve 0.5.2-dev documentation.

Simplegeo/python-oauth2.

INCR. Increments the number stored at key by one. If the key does not exist, it is set to 0 before performing the operation. An error is returned if the key contains a value of the wrong type or contains a string that cannot be represented as an integer.

This operation is limited to 64-bit signed integers. Note: this is a string operation because Redis does not have a dedicated integer type. The string stored at the key is interpreted as a base-10 64-bit signed integer to execute the operation. Redis stores integers in their integer representation, so for string values that actually hold an integer, there is no overhead for storing the string representation of the integer.

Return value
Integer reply: the value of key after the increment.

Examples
redis> SET mykey "10"
OK
redis> INCR mykey
(integer) 11
redis> GET mykey
"11"

Pattern: Counter
The counter pattern is the most obvious thing you can do with Redis atomic increment operations. This simple pattern can be extended in many ways.

Pattern: Rate limiter.
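As a rough illustration of the two patterns named above, here is a minimal sketch using the redis-py client; the client library, key names and the ten-requests-per-minute limit are assumptions for illustration, not part of the excerpt.

import time
import redis

r = redis.Redis(host="localhost", port=6379, db=0)

# Pattern: Counter -- one atomic INCR per event, e.g. a page view.
def record_page_view(page_id):
    # INCR creates the key at 0 if it does not exist, then increments it.
    return r.incr("pageviews:%s" % page_id)

# Pattern: Rate limiter -- allow at most `limit` requests per client per minute.
def allow_request(client_ip, limit=10):
    key = "rate:%s:%d" % (client_ip, int(time.time()) // 60)
    current = r.incr(key)   # atomic, so safe under concurrent requests
    if current == 1:
        r.expire(key, 60)   # the per-minute bucket cleans itself up
    return current <= limit

record_page_view("home")        # returns 1, 2, 3, ... on successive calls
allow_request("203.0.113.7")    # True until the per-minute limit is hit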

Get Started with the License - Swedish API License.

Pingles.

Kafka is a high-throughput, persistent, distributed messaging system that was originally developed at LinkedIn. It forms the backbone of uSwitch.com’s new data analytics pipeline, and this post will cover a little about Kafka and how we’re using it.

Kafka is both performant and durable. To make it easier to achieve high throughput on a single node, it does away with much of what message brokers ordinarily provide, making it a simpler distributed messaging system.

Messaging
Over the past 2 years we’ve migrated from a monolithic environment based around Microsoft .NET and SQL Server to a mix of databases, applications and services.

These change over time: applications and servers will come and go. This diversity is great for productivity but has made data analytics as a whole more difficult. We use Kafka to make it easier for the assortment of micro-applications and services that together compose uSwitch.com to exchange and publish data.

Metadata and State
Performance
Batch Load into HDFS
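The excerpt stops at the post’s remaining section headings, but a minimal sketch of a service publishing an event for downstream consumers (such as a batch load into HDFS) might look like this; the kafka-python client, broker address, topic name and message shape are assumptions for illustration, not details from the uSwitch pipeline.

import json
from kafka import KafkaProducer

# Each service publishes its events to a topic; downstream consumers
# (for example a batch job loading into HDFS) read the topic at their own pace.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda event: json.dumps(event).encode("utf-8"),
)

producer.send("site-events", {"service": "energy-comparison", "action": "quote-requested"})
producer.flush()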

Amazon web services - How to handle fields enclosed within quotes (CSV) in importing data from S3 into DynamoDB using EMR/Hive.

Python tweepy package v1.8, tweepy.streaming module source code :: PyDoc.net.

Tweepy (twitter API) Status object structure.
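For the Status object structure linked above, here is a small sketch of reading a few common fields with the classic tweepy interface; the credential placeholders and the choice of fields are assumptions, and newer tweepy releases have changed parts of this API.

import tweepy

# Placeholders -- real values come from a Twitter developer account.
auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_SECRET")
api = tweepy.API(auth)

# A Status object exposes the tweet text plus nested user and metadata fields.
for status in api.home_timeline(count=5):
    print(status.id, status.created_at, status.user.screen_name, status.text)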