# LLM API User Guide
Internal LLM Chat API Documentation
> This internal API allows authorized users to interact with a local Large Language Model (LLM). The system uses Basic Authentication and provides endpoints to list available models and generate chat completions.
## Servers
```
http://X.X.X.X:8000
```
## Security Schemes
```
basicAuth
```
## Authentication
- Method: HTTP Basic Authentication
- Users: predefined in the `users` dictionary
- Access control:
  - Users in the `blocked` group are denied access
  - Only users in the `admin` or `user` groups can access `/v1/chat/completions`
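Clients authenticate by sending a standard `Authorization: Basic …` header with every request. A minimal sketch of building that header in Python (the credentials shown are placeholders, not real accounts):

```python
import base64

def basic_auth_header(username: str, password: str) -> dict:
    """Build the HTTP Basic Authentication header for API requests."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return {"Authorization": f"Basic {token}"}

# Placeholder credentials for illustration only:
headers = basic_auth_header("alice", "secret")
print(headers["Authorization"])  # Basic YWxpY2U6c2VjcmV0
```

Libraries such as `requests` can also take the credentials directly via their `auth=(username, password)` parameter, which produces the same header.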
## GET / — Health Check
- Description: Verify that the API is running.
- Auth Required: ❌ No
Response Example:
```
{ "message": "FastAPI is running" }
```
## GET /v1/models — List Available Models
- Description: Retrieve the list of LLM models registered with the local Ollama backend.
- Auth Required: ✅ Yes (Basic Auth)
Response Example:
```
{
  "object": "list",
  "data": [
    {
      "id": "gemma:2b",
      "object": "model",
      "owned_by": "ollama",
      "root": "sha256:abc123..."
    }
  ]
}
```
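A client will typically only need the model identifiers from this response. A small sketch of extracting them, using the response shape documented above (the helper name is ours, not part of the API):

```python
def model_ids(models_response: dict) -> list:
    """Extract the model identifiers from a /v1/models response body."""
    return [entry["id"] for entry in models_response.get("data", [])]

# The example response from above:
example = {
    "object": "list",
    "data": [
        {"id": "gemma:2b", "object": "model", "owned_by": "ollama"}
    ],
}
print(model_ids(example))  # ['gemma:2b']
```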
Errors:
- 401 Unauthorized: Invalid or missing credentials
- 500 Internal Server Error: Ollama backend unreachable
## POST /v1/chat/completions — Generate Chat Completion
- Description: Submit a prompt (text and/or image) to the LLM and receive a response.
- Auth Required: ✅ Yes (Basic Auth)
- Permissions: Only members of the `admin` or `user` groups
Request Body (application/json):
```
{
  "model": "gemma:2b",
  "messages": [
    {
      "role": "user",
      "content": "Tell me about the solar system."
    }
  ],
  "temperature": 0.7,
  "top_p": 0.9,
  "max_tokens": 1024,
  "stream": false
}
```
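The request body above can be assembled programmatically. A sketch of a helper that fills in the documented defaults (the function name and defaults are ours; only the field names come from the schema above):

```python
import json

def build_chat_request(model: str, prompt: str, *, temperature: float = 0.7,
                       top_p: float = 0.9, max_tokens: int = 1024,
                       stream: bool = False) -> dict:
    """Assemble a request body matching the documented schema."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
        "top_p": top_p,
        "max_tokens": max_tokens,
        "stream": stream,
    }

body = build_chat_request("gemma:2b", "Tell me about the solar system.")
payload = json.dumps(body)  # serialized form to send as the POST body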
Errors:
- 400 Bad Request: No valid user input provided
- 401 Unauthorized: Invalid credentials
- 403 Forbidden: Group not allowed
- 422 Unprocessable Entity: Model returned nothing
- 500 Internal Server Error: Ollama call failed
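On the client side, these status codes can be mapped back to their documented meanings when reporting failures; a minimal sketch (the helper name is ours, not part of the API):

```python
# Documented status codes for POST /v1/chat/completions.
ERROR_MEANINGS = {
    400: "No valid user input provided",
    401: "Invalid credentials",
    403: "Group not allowed",
    422: "Model returned nothing",
    500: "Ollama call failed",
}

def explain_status(code: int) -> str:
    """Translate an HTTP status code into the documented failure reason."""
    return ERROR_MEANINGS.get(code, f"Unexpected status {code}")

print(explain_status(403))  # Group not allowed
```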
## Chat Logging
All completed interactions are stored in daily `.jsonl` log files:
- Format: `chat_log_YYYY-MM-DD.jsonl`
- Logged fields:
  - Timestamp
  - Session ID
  - User
  - Input & response messages
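Each line in a `.jsonl` file is one self-contained JSON object. A sketch of what writing one entry might look like; the field list follows the bullets above, but the exact key spellings are assumptions:

```python
import json
from datetime import datetime, timezone

def format_log_entry(session_id: str, user: str,
                     user_input: str, response: str) -> str:
    """Serialize one chat interaction as a single JSON line."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),  # Timestamp
        "session_id": session_id,                             # Session ID
        "user": user,                                         # User
        "input": user_input,                                  # Input message
        "response": response,                                 # Response message
    }
    return json.dumps(entry)

line = format_log_entry("abc-123", "alice", "Hi", "Hello!")
record = json.loads(line)  # each .jsonl line round-trips as plain JSON
```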