
ELK stack


Building Microservices - O'Reilly Media

Throughout the text, the author introduces many different topics that must be taken into account, such as testing and monitoring, among others. Each chapter focuses on a specific subject. Here the author describes the problems of the old single huge application as well as the benefits gained by moving towards microservices. He also covers the new challenges this architecture brings with it (nothing is free, after all). The whole thing is often coupled with real-life anecdotes from the author's experience. As stated in the introduction, the book does not dive into any particular platform or technology, which keeps it from becoming outdated in half a year.

Building Microservices is a pleasant read and, overall, a good one. As usual, you can find more reviews on my personal blog: books.lostinmalloc.com.

How we build microservices at Karma

“Microservices” and “Microservice Architecture” are hot buzzwords in the development community right now, but concrete examples of microservices in production are still scarce. I thought it might help to give a brief overview of how we’ve used microservices for our backend API at Karma over the past couple of years. It’s not exactly a “how-to,” more like a “why-to,” or a “wherefore-to,” but hopefully it will give you some insight as to whether microservices are appropriate for your application, and how you can go about using them.

Why we chose microservices

When we started building Karma, we decided to split the project into two main parts: the backend API and the frontend application. The backend is responsible for handling orders from the store, usage accounting, user management, device management, and so forth, while the frontend offers a dashboard for users which accesses this API. For example, we have users, devices, and a store.
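As a purely illustrative sketch (not Karma's actual code), the snippet below spins up one tiny HTTP service per resource, which is the shape of split described above; the resource names, ports, and responses are hypothetical.

# A minimal, hypothetical sketch of a resource-per-service split: one small
# HTTP service each for users, devices, and the store. Ports and payloads
# are made up for illustration; this is not Karma's implementation.
from http.server import BaseHTTPRequestHandler, HTTPServer
import json
import threading

SERVICES = {
    "users": 8001,
    "devices": 8002,
    "store": 8003,
}

def make_handler(resource):
    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Each service only knows about its own resource.
            body = json.dumps({"resource": resource, "items": []}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)

        def log_message(self, *args):
            pass  # keep the demo output quiet

    return Handler

if __name__ == "__main__":
    # The frontend dashboard would talk to each of these over HTTP instead of
    # calling into a single monolithic backend.
    for name, port in SERVICES.items():
        server = HTTPServer(("localhost", port), make_handler(name))
        threading.Thread(target=server.serve_forever, daemon=True).start()
        print(f"{name} service listening on http://localhost:{port}")
    input("Press Enter to stop the demo services...\n")

In a real deployment each service would typically own its own data and be released independently, which is where the benefits described above come from.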

Logstash - open source log management

The ELK Stack in a DevOps Environment

If you're working in a DevOps shop, a focus on business metrics is key. However, you want to derive those metrics, and the data that supports meeting those key performance indicators, using tools that enable effective collaboration with a minimum of misery.
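As a hedged example of deriving such a metric from log data already sitting in Elasticsearch (the index pattern, the type:order filter, and the local URL are assumptions, not something from the article), a daily count could be pulled with an aggregation like this:

# Derive a simple business metric (orders per day) from indexed log data.
# The "logstash-*" index pattern, the type:order filter, and the timestamp
# field are assumptions; "interval" matches the older Elasticsearch releases
# used elsewhere in this collection.
import json
import urllib.request

ES_URL = "http://localhost:9200/logstash-*/_search"  # assumed local node

query = {
    "size": 0,  # only the aggregation, not the raw hits
    "query": {"query_string": {"query": "type:order"}},
    "aggs": {
        "orders_per_day": {
            "date_histogram": {"field": "@timestamp", "interval": "day"}
        }
    },
}

req = urllib.request.Request(
    ES_URL,
    data=json.dumps(query).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    result = json.load(resp)

for bucket in result["aggregations"]["orders_per_day"]["buckets"]:
    print(bucket["key_as_string"], bucket["doc_count"])

Kibana's dashboards are built on top of the same aggregation API, so anything you can query this way you can also chart.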

Node.js - Is there a deployment tool similar to Fabric written in JavaScript?

Pstadler/flightplan

Mscdex/ssh2

SSD Cloud Server, VPS Server, Simple Cloud Hosting

ELK Stack for Logging Tutorial - DevOps Library

Hello and welcome to the DevOps Library, this is Samantha with Episode 9, the ELK stack. Before we get started, let me quickly explain what the ELK stack is. ELK is an amazing open source logging system, and we have yet to find a single company that would not benefit tremendously from setting it up. Well, now that we've angered the Splunk gods, and hopefully made you interested in ELK, let's get started. ELK stands for three separate pieces: Elasticsearch, Logstash, and Kibana. Today we're going to keep it simple and install all of these on a single fresh Ubuntu 14.04 server. Alright, let's begin!
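As a small aside that is not part of the episode: once the pieces are installed, a quick way to confirm each one is listening is a port check like the sketch below. The ports are common defaults plus an assumed Logstash TCP input, so adjust them to match your configuration.

# Sanity-check that the ELK components are reachable after installation.
# 9200 and 5601 are the usual Elasticsearch and Kibana defaults; 5000 is an
# assumed Logstash TCP input, not something Logstash opens by default.
import socket

COMPONENTS = {
    "Elasticsearch": 9200,
    "Logstash": 5000,
    "Kibana": 5601,
}

for name, port in COMPONENTS.items():
    try:
        socket.create_connection(("localhost", port), timeout=2).close()
        print(f"{name}: listening on port {port}")
    except OSError:
        print(f"{name}: nothing answering on port {port} yet")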

How to Setup Realtime Analytics over Logs with ELK Stack

Once we know something, we find it hard to imagine what it was like not to know it. - Chip & Dan Heath, authors of Made to Stick and Switch

Centralized logging with an ELK stack (Elasticsearch-Logstash-Kibana) on Ubuntu

Update 22/12/2015: I've reviewed the book Learning ELK Stack by Packt Publishing; it's available online for only $5. I've recently set up an ELK stack in order to centralize the logs of many services in my company, and it's just amazing! I've used the following versions of the software on Ubuntu 12.04 (it also works on Ubuntu 14.04): Elasticsearch 1.4.1, Kibana 3.1.2, Logstash 1.4.2, and Logstash-forwarder 0.3.1.
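The post ships log files with logstash-forwarder; as a hedged alternative illustration of what centralizing the logs of many services looks like from inside an application, the sketch below sends JSON events straight to an assumed Logstash tcp input with a JSON codec on port 5000 (that input is not part of the setup described above).

# A minimal logging handler that ships JSON lines over TCP to Logstash.
# Assumes a Logstash "tcp" input with a JSON codec listening on port 5000;
# the service name is hypothetical.
import json
import logging
import socket
from datetime import datetime, timezone

class LogstashTCPHandler(logging.Handler):
    def __init__(self, host="localhost", port=5000):
        super().__init__()
        self.sock = socket.create_connection((host, port))

    def emit(self, record):
        doc = {
            "@timestamp": datetime.now(timezone.utc).isoformat(),
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        }
        self.sock.sendall(json.dumps(doc).encode() + b"\n")

logger = logging.getLogger("billing-service")  # hypothetical service name
logger.setLevel(logging.INFO)
logger.addHandler(LogstashTCPHandler())        # fails fast if Logstash is down
logger.info("invoice generated")               # reaches Elasticsearch via Logstash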

How to setup an Elasticsearch cluster with Logstash on Ubuntu 12.04

Hey there! I've recently hit the limitations of a one-node Elasticsearch cluster in my ELK setup; see my previous blog post: Centralized logging with an ELK stack (Elasticsearch-Logstash-Kibana) on Ubuntu. After more research, I decided to upgrade the stack architecture, more precisely the Elasticsearch cluster and the Logstash integration with the cluster. I've been using the following software versions:

A bit on ElasticSearch + Logstash + Kibana (The ELK stack)

Big data in minutes with the ELK Stack

We’ve built a data analysis and dashboarding infrastructure for one of our clients over the past few weeks. They collect about 10 million data points a day. Yes, that’s big data. My highest priority was to allow them to browse the data they collect so that they can ensure that the data points are consistent and contain all the attributes required to generate the reports and dashboards they need.
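A hedged sketch of the kind of consistency check described there: count how many indexed documents are missing each required attribute. The index name, field names, and local URL are invented for illustration.

# Count documents that lack a required attribute, using the _count API.
# "datapoints" and the field names are a hypothetical schema, not the
# client's real one.
import json
import urllib.request

ES = "http://localhost:9200"        # assumed local Elasticsearch
INDEX = "datapoints"                # hypothetical index
REQUIRED_FIELDS = ["customer_id", "timestamp", "value"]  # hypothetical schema

def count_missing(field):
    query = {"query": {"query_string": {"query": f"NOT _exists_:{field}"}}}
    req = urllib.request.Request(
        f"{ES}/{INDEX}/_count",
        data=json.dumps(query).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["count"]

for field in REQUIRED_FIELDS:
    print(f"documents missing {field}: {count_missing(field)}")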

Lusis/chef-logstash

Highly Available ELK (Elasticsearch, Logstash and Kibana) Setup - Everything Should Be Virtual

In this post I will be going over how to set up a complete ELK (Elasticsearch, Logstash and Kibana) stack with clustered Elasticsearch and all ELK components load balanced using HAProxy. I will be setting up a total of six servers (2 HAProxy, 2 ELK frontends and 2 Elasticsearch master/data nodes) in this setup; however, you can scale the ELK stack by adding additional nodes identical to logstash-1/logstash-2 for Logstash processing and Kibana web interfaces, and adding the additional node info to the HAProxy configuration files to load balance. You can also scale the Elasticsearch master/data nodes by building out additional nodes, and they will join the cluster.
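As a hedged companion to that last point, a quick way to confirm that newly built nodes have actually joined is the cluster health API; the URL below assumes you are pointing at one of the load-balanced Elasticsearch endpoints in this setup.

# Check cluster status and node counts via the _cluster/health API.
# The host is an assumption; point it at your HAProxy-fronted endpoint.
import json
import urllib.request

ES = "http://localhost:9200"

with urllib.request.urlopen(f"{ES}/_cluster/health") as resp:
    health = json.load(resp)

print("cluster:", health["cluster_name"])
print("status:", health["status"])            # green / yellow / red
print("nodes:", health["number_of_nodes"])
print("data nodes:", health["number_of_data_nodes"])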