# LLM Seed Configuration Feature

## Overview

This feature introduces a **configurable LLM Seed** value that allows users to control the deterministic behavior of LLM responses. By setting a specific integer seed, users (especially developers and prompt engineers) can reproduce consistent outputs for the same prompt and parameters.

---

## 1. UI: Add “LLM Seed” Field

### Description

A new optional numeric input field **“LLM Seed”** will be added to the **LLM Configuration** section in the UI. This allows users to specify an integer seed value that controls the reproducibility of model generations.

### Behavior

* Field label: **Seed**
* Input type: numeric (integer)
* Placeholder: `e.g. 42`
* Optional: if left empty, the backend will not include a seed in the configuration.
* Validation:
  * Must be an integer.
  * Acceptable range: `0` – `2,147,483,647` (32-bit signed integer).
  * Non-numeric values should show a validation warning.

### UI Layout

![image](https://hackmd.io/_uploads/ByzXQGR6xl.png)

---

## 2. Kalimera Backend: Expose Seed in Configuration

### Description

The Kalimera Backend must include the new **`seed`** field as part of the configuration object returned from the LLM settings API.

### Technical Details

* **New field:** `seed`
* **Type:** `integer | null`
* **Source:** value provided from the UI configuration.
* **Default:** `null` (if no seed was entered by the user).

![image](https://hackmd.io/_uploads/SywTXzCTxl.png)

### Example Payload

```json
"dialogNodes": [
  {
    "topP": 1,
    "nodeType": "Welcome",
    "maxTokens": 15000,
    "userPrompt": "",
    "temperature": 0.1,
    "seed": 42,
    ...
```

If no value is provided:

```json
"dialogNodes": [
  {
    "topP": 1,
    "nodeType": "Welcome",
    "maxTokens": 15000,
    "userPrompt": "",
    "temperature": 0.1,
    "seed": null,
    ...
```

---

## 3. VoiceAI Integration: Use Seed When Creating LLM Connection

### Description

The **VoiceAI team** should update the LLM connection logic to consume the new `seed` field from the configuration and include it when initializing or sending requests to the LLM API.

### Implementation Notes

* Retrieve `seed` from the Kalimera configuration.
* When creating the LLM connection or request payload, include:

  ```json
  "seed": <value>
  ```

* If `seed` is `null`, omit the field entirely (to allow random generation as usual).
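The omit-when-null rule above could be sketched as follows. This is a minimal illustration, not the actual VoiceAI code: the `NodeConfig` shape and `buildRequestPayload` name are hypothetical, and the snake_case keys assume an OpenAI-style request body.

```typescript
// Hypothetical subset of the Kalimera node configuration.
interface NodeConfig {
  temperature: number;
  topP: number;
  maxTokens: number;
  seed: number | null;
}

// Build the LLM request payload, including `seed` only when one was configured.
function buildRequestPayload(config: NodeConfig): Record<string, unknown> {
  const payload: Record<string, unknown> = {
    temperature: config.temperature,
    top_p: config.topP,
    max_tokens: config.maxTokens,
  };
  // Omitting the key entirely (rather than sending `"seed": null`)
  // leaves the provider free to sample non-deterministically, as usual.
  if (config.seed !== null) {
    payload.seed = config.seed;
  }
  return payload;
}
```

The key point is that `seed: null` never reaches the LLM API; absence of the key is what signals "no fixed seed".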
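The validation rules from the UI section (optional field; integer; range `0` – `2,147,483,647`) could be enforced client-side with a small helper like the one below. The function name `parseSeed` is illustrative, not an existing API.

```typescript
// Hypothetical validator for the "LLM Seed" input field.
// Returns the parsed seed, null for an empty input (optional field),
// or throws with a message suitable for a validation warning.
function parseSeed(input: string): number | null {
  const trimmed = input.trim();
  if (trimmed === "") return null; // empty: no seed is sent to the backend
  const value = Number(trimmed);
  if (!Number.isInteger(value)) {
    throw new Error("Seed must be an integer.");
  }
  if (value < 0 || value > 2_147_483_647) {
    throw new Error("Seed must be between 0 and 2,147,483,647.");
  }
  return value;
}
```

A `null` result maps directly to the `"seed": null` default shown in the Kalimera example payload.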