Building an ELK stack with docker-compose

Because I have a hard time searching logfiles during development (I like to run everything on DEBUG), I decided to build myself an ELK stack (elasticsearch, logstash and kibana) to throw all my logs into and have a nice UI for searching for a specific log message.

Fortunately there are official docker images for all three tools: elasticsearch, logstash and kibana.

So, everything is easily available, I just needed to figure out how to glue it together.

Because I’m not interested in storing the data over a long period of time, I don’t care about the setup of the elasticsearch engine. When I’m done developing or debugging things, I want to throw away everything and start with a clean environment. So I don’t store anything outside of the docker containers and I don’t want to write any Dockerfiles myself.

The way to go is a simple docker-compose.yml which I can start with a single command and have everything set up to accept log messages from my java applications. So, here we go:

```yaml
elasticsearch:
  image: elasticsearch
  ports:
    - 9200:9200

logstash:
  image: logstash:latest
  links:
    - elasticsearch:elasticsearch
  ports:
    - 12201:12201
  command: logstash agent --debug -e 'input { log4j { mode => "server" port => "12201" } } output { elasticsearch { hosts => ["elasticsearch"] } stdout {} }'

kibana:
  image: kibana
  links:
    - elasticsearch:elasticsearch
  ports:
    - 5601:5601
  environment:
    - ELASTICSEARCH_URL=http://elasticsearch:9200
```

As you can see, I only use images - no custom Dockerfile needed. I also put the configuration for logstash and kibana into the docker-compose.yml, so when I use another computer (and I do that often) I can just copy this one file, run docker-compose up, open http://localhost:5601 and get going.

I also exposed port 12201 on the logstash host to be able to send log messages from locally running applications. I modified my log4j configuration and added a new appender named logstash with the following configuration:
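A minimal sketch of such an appender, assuming log4j 1.x's SocketAppender (the exact appender and property names here are my reconstruction; the logstash log4j input expects the serialized LoggingEvents that SocketAppender sends):

```properties
# Hypothetical log4j.properties snippet: ship all DEBUG+ events to the
# logstash log4j input listening on localhost:12201.
log4j.rootLogger=DEBUG, logstash

log4j.appender.logstash=org.apache.log4j.net.SocketAppender
log4j.appender.logstash.remoteHost=localhost
log4j.appender.logstash.port=12201
# Keep retrying if the ELK stack isn't up yet (milliseconds)
log4j.appender.logstash.reconnectionDelay=10000
```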


The appender class ships with the default log4j installation, so no funky external dependencies are needed. Just add those lines when deploying locally and the application will log to my ELK stack.

I tried to use docker networks, but logstash acted up, and I think it does not particularly like underscores in the elasticsearch hostname. Unfortunately, docker-compose generates hostnames with underscores, so I had to stick with the old way and use links. If you have any insight into how to use a network here, please get in touch.
