## BLOG 23: A Tiny Toy Application

We'll build a **tiny toy application**, a pocket-sized traffic booth, that:
* Accepts incoming “requests” and drops them into an in-memory **queue**
* Exposes a **metric** (`myapp_queue_length`) at `/metrics`
* Processes **1 request every 2 seconds** (like a sleepy clerk sipping tea between tasks)
# 🟦 **📦 Dummy App (Node.js)**
:::spoiler **app.js**
```js
const express = require("express");
const client = require("prom-client");

const app = express();
app.use(express.json());

// Create a registry
const register = new client.Registry();

// Custom metric – current queue length
const queueLengthGauge = new client.Gauge({
  name: "myapp_queue_length",
  help: "Current number of items in the processing queue"
});
register.registerMetric(queueLengthGauge);

// Our in-memory queue
let queue = [];

// Endpoint to add an item to the queue
app.post("/enqueue", (req, res) => {
  const item = req.body.item || `job-${Date.now()}`;
  queue.push(item);
  queueLengthGauge.set(queue.length);
  console.log(`Enqueued: ${item}, queue length is now ${queue.length}`);
  res.json({ message: "Item added", queue_length: queue.length });
});

// Worker that processes 1 item every 2 seconds
setInterval(() => {
  if (queue.length > 0) {
    const item = queue.shift();
    console.log(`Processed: ${item}`);
    queueLengthGauge.set(queue.length);
  }
}, 2000);

// Expose metrics to Prometheus
app.get("/metrics", async (req, res) => {
  res.set("Content-Type", register.contentType);
  res.end(await register.metrics());
});

// Simple health endpoint
app.get("/", (req, res) => {
  res.send("Queue processor is running");
});

const port = 8080;
app.listen(port, () => {
  console.log(`Dummy queue app running on port ${port}`);
});
```
:::
---
:::spoiler **package.json**
```json
{
  "name": "dummy-queue-app",
  "version": "1.0.0",
  "main": "app.js",
  "license": "MIT",
  "dependencies": {
    "express": "^4.18.2",
    "prom-client": "^14.1.0"
  }
}
```
:::
---
:::spoiler **Dockerfile**
```Dockerfile
FROM node:18-alpine
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY . .
EXPOSE 8080
CMD ["node", "app.js"]
```
:::
---
*(Optional)*
# 🧪 **Test the app locally**
Install the dependencies, then start the server:
```bash
npm install
node app.js
```
Add items to the queue:
```bash
curl -X POST http://localhost:8080/enqueue -H "Content-Type: application/json" -d '{"item": "A"}'
```
Check Prometheus metrics:
```bash
curl http://localhost:8080/metrics
```
After a few quick enqueues, you'll see something like:
```
# HELP myapp_queue_length Current number of items in the processing queue
# TYPE myapp_queue_length gauge
myapp_queue_length 3
```
The worker will eat one item every 2 seconds 🍽️⏲️
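To see *why* the queue backs up, here's a minimal back-of-envelope simulation (plain Node, no dependencies). The 500 ms arrival rate is an assumption for illustration; the 2-second processing rate matches our sleepy clerk. Arrivals outpace processing, so the queue grows steadily:

```javascript
// Toy simulation of the sleepy clerk, using discrete 100 ms ticks
// instead of real timers.
// Assumption (illustrative): a request arrives every 500 ms,
// while the worker only finishes one every 2000 ms.
const ARRIVAL_MS = 500;
const PROCESS_MS = 2000;
const TICK_MS = 100;

function simulate(durationMs) {
  let queue = 0;
  for (let t = TICK_MS; t <= durationMs; t += TICK_MS) {
    if (t % ARRIVAL_MS === 0) queue += 1;               // a new request lands
    if (t % PROCESS_MS === 0 && queue > 0) queue -= 1;  // clerk finishes one
  }
  return queue;
}

// After 10 seconds: 20 arrivals, 5 processed → 15 items waiting.
console.log(simulate(10_000)); // 15
```

That growing backlog is exactly the signal we want the HPA to react to.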
---
# 🌿 **This app is perfect for HPA testing**
Because:
* Incoming traffic → queue grows → the metric spikes
* Prometheus scrapes `myapp_queue_length`
* The HPA scales out when the queue length exceeds 10
* More pods = more queue consumers → the queue drains faster
It's like hiring extra staff when the line gets too long.
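As a rough sketch of what that HPA could look like, here's an `autoscaling/v2` manifest. This assumes you've already wired up the Prometheus Adapter so `myapp_queue_length` is available as an external metric; the resource names are illustrative, not part of the app above:

```yaml
# Sketch only: assumes the Prometheus Adapter exposes myapp_queue_length
# as an external metric. Deployment and HPA names are illustrative.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: dummy-queue-app
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: dummy-queue-app
  minReplicas: 1
  maxReplicas: 5
  metrics:
    - type: External
      external:
        metric:
          name: myapp_queue_length
        target:
          type: Value
          value: "10"   # scale out once the queue exceeds ~10 items
```

We'll fill in the Prometheus Adapter configuration that makes this metric visible in a later step.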
---