The ELK Stack is a powerful open-source platform composed of Elasticsearch, Logstash, and Kibana that enables users to ingest, process, store, search, and visualize large volumes of structured and unstructured data in real time.
The ELK Stack excels at collecting time‑series data from distributed devices (such as the Raspberry Pi Pico W and the ESP32) and sensors, turning these into actionable insights. Its scalability allows efficient storage and rapid querying of vast IoT datasets, while its flexible ingestion pipeline (Logstash/Beats) and real‑time dashboards (Kibana) enable monitoring, anomaly detection, and predictive maintenance, all essential for managing complex IoT deployments across many devices.
We will follow this tutorial for setting up the ELK Stack: https://github.com/deviantony/docker-elk
Requirements: You will first need to install Docker (including the Docker Compose plugin).
This tutorial, based on the deviantony/docker-elk GitHub repo, guides you through running the ELK stack (Elasticsearch, Logstash, Kibana) on your local machine using Docker and Docker Compose.
Clone the repo
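From a terminal:

```sh
git clone https://github.com/deviantony/docker-elk.git
cd docker-elk
```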
Initial setup
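Run the one-off setup service:

```sh
docker compose up setup
```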
This sets up default users (elastic, logstash_internal, kibana_system) with passwords from .env.
Start the stack:
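```sh
docker compose up
```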
Or use -d to detach and run in the background:
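```sh
docker compose up -d
```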
Access Kibana. Wait ~1 minute for Kibana to initialize, then open http://localhost:5601 and log in with:
Username: elastic
Password: changeme (unless changed in .env).
Please change the password immediately after accessing Kibana. This can be done in the Kibana UI, or more simply via the .env file.
For increased security, reset passwords from the defaults: update .env with new passwords, then restart the relevant services, as shown below.
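Per the upstream README, the built-in passwords can be rotated with Elasticsearch's elasticsearch-reset-password tool, then copied into .env before restarting the dependent services:

```sh
docker compose exec elasticsearch bin/elasticsearch-reset-password --batch --user elastic
docker compose exec elasticsearch bin/elasticsearch-reset-password --batch --user logstash_internal
docker compose exec elasticsearch bin/elasticsearch-reset-password --batch --user kibana_system

# Copy the generated values into ELASTIC_PASSWORD, LOGSTASH_INTERNAL_PASSWORD
# and KIBANA_SYSTEM_PASSWORD in .env, then restart Logstash and Kibana:
docker compose up -d logstash kibana
```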
Netcat (TCP input):
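For example, piping a local log file into the TCP input on port 50000 (the file path is a placeholder):

```sh
cat /path/to/logfile.log | nc -q0 localhost 50000
```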
Or upload sample data via the Kibana UI.
Stop but keep data:
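```sh
docker compose down
```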
Remove everything (volumes included):
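```sh
docker compose down -v
```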
The main branch typically tracks Elastic 9.x. To use another version, modify .env, then rebuild and re-run:
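The stack version is pinned by the ELASTIC_VERSION variable in .env; the version number below is only an example:

```sh
# in .env
ELASTIC_VERSION=8.17.0

# rebuild the images and restart the stack
docker compose build
docker compose up -d
```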
Ports exposed (per the upstream compose file):
5044: Logstash Beats input
50000: Logstash TCP input
9600: Logstash monitoring API
9200: Elasticsearch HTTP
9300: Elasticsearch TCP transport
5601: Kibana
JVM tuning is done via environment variables; adjust the heap sizes based on available memory:
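The heap sizes are set through ES_JAVA_OPTS and LS_JAVA_OPTS in docker-compose.yml; the values below are the upstream defaults and only a starting point:

```yaml
# docker-compose.yml (excerpt)
elasticsearch:
  environment:
    ES_JAVA_OPTS: -Xms512m -Xmx512m
logstash:
  environment:
    LS_JAVA_OPTS: -Xms256m -Xmx256m
```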
Custom configuration:
elasticsearch/config/elasticsearch.yml
logstash/config/logstash.yml, plus pipelines in logstash/pipeline/
kibana/config/kibana.yml
To change the images themselves, edit the Dockerfile in elasticsearch/, logstash/, or kibana/ and rebuild. Optional extensions (e.g. TLS, Fleet) live in the extensions/ directory.
We’ll simulate sensor readings (random values) on the Pico W and forward them as HTTP POST requests to a Logstash HTTP input, which writes them into Elasticsearch.
Ensure your Pico W is running MicroPython and you have Thonny or a similar IDE set up. The urequests library is typically bundled with the firmware; if not, it can be installed as sketched below.
Here's a sample script that posts JSON to Logstash (matching the pipeline set up in your docker-elk logstash/pipeline/ directory):
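A minimal sketch; the Wi‑Fi credentials are placeholders, the sensor values are simulated, and the reading fields are illustrative:

```python
import network
import random
import time
import urequests

WIFI_SSID = "<YOUR_SSID>"          # placeholder
WIFI_PASSWORD = "<YOUR_PASSWORD>"  # placeholder
LOGSTASH_URL = "http://<YOUR_HOST_IP>:50000"  # Logstash input from http.conf

# Connect the Pico W to Wi-Fi
wlan = network.WLAN(network.STA_IF)
wlan.active(True)
wlan.connect(WIFI_SSID, WIFI_PASSWORD)
while not wlan.isconnected():
    time.sleep(0.5)
print("Connected:", wlan.ifconfig()[0])

# Post a simulated sensor reading every 5 seconds
while True:
    reading = {
        "device": "pico-w",
        "temperature": 20 + random.random() * 5,  # simulated value
        "humidity": 40 + random.random() * 10,    # simulated value
    }
    try:
        response = urequests.post(LOGSTASH_URL, json=reading)
        response.close()  # free the socket
    except OSError as exc:
        print("POST failed:", exc)
    time.sleep(5)
```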
📋 Important: Replace <YOUR_HOST_IP> with your machine’s IP address on the Pico's Wi‑Fi network.

In logstash/pipeline/http.conf:
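A minimal pipeline sketch consistent with the description below; the http input plugin and the pico- index name are assumptions based on the file name and the pico-* pattern used later:

```conf
input {
  http {
    port  => 50000
    codec => json
  }
}

output {
  elasticsearch {
    hosts    => "elasticsearch:9200"
    user     => "logstash_internal"
    password => "${LOGSTASH_INTERNAL_PASSWORD}"
    index    => "pico-%{+YYYY.MM.dd}"
  }
}
```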
This makes Logstash listen on port 50000 for JSON payloads, parses them, and indexes into Elasticsearch.
Finally, create a data view (index pattern) matching pico-* in Kibana to explore the incoming readings.