# DEC112 Chatbot Deployment
#### Content
* [Functionality](#Functionality)
* [System Requirements](#System-Requirements)
* [Tests & Monitoring](#Tests--Monitoring)
* [KPIs](#KPIs)
* [Roadmap](#Roadmap)
* [Findings](#Findings)
## Functionality
The chatbot registers as an available endpoint for training chats (initiated in the DEC112 app with the “Test Emergency” button). It can also store conversations (upon consent from the user) and share them later via Data Agreements with an emergency response provider, who can use them as additional training material for call takers. -> currently not included
### User Stories
* As a deaf person I want to be able to train emergency communication using chat functionality so that I can prepare for an emergency situation.
* As a user of an emergency chat training system, I want the call-taker simulation to be as realistic as possible so that I can gain practical experience in handling crisis situations.
* As an emergency response organisation, I want our call takers to receive specialised training in communicating with deaf individuals through emergency chat systems, so that they can more effectively address the unique needs of this community in crisis situations. A crucial component of this training is the inclusion of exemplary chat conversations.
* As the DEC112 organisation we want to share training chat data with emergency response organisations only through methods that adhere to the strictest privacy regulations so that we ensure full legal compliance.
### Design
**DEC112 Endpoint:** the endpoint registers at the terminating ESRP and receives incoming messages / publishes responses over a WebSocket connection
**Chatbot Service:** simulates call-taker messages through the OpenAI-provided ChatGPT API
```plantuml
@startuml
rectangle "Application" {
rectangle "DEC112 App" as decapp
}
rectangle "DID Repository" {
rectangle "Trust Registry" as semcon
}
rectangle "Chatbot" {
component "Semantic Container" {
component "DEC112 Endpoint" as endpoint
component "Chatbot Service" as chat_srv
database "storage" as db
}
endpoint -> chat_srv
chat_srv -> db
chat_srv <.right- db
}
rectangle "OpenAI" {
component "ChatGPT API" as chatgpt
}
rectangle "Emergency Service Provider" {
rectangle "Training" as org
}
decapp --> endpoint
chat_srv <.- chatgpt
chat_srv -down-> org
semcon .left-> decapp
semcon .-> org
@enduml
```
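A minimal sketch of this flow (Python, for illustration only; the actual service is part of the Semantic Container and is not implemented in Python): the endpoint receives a training message over the WebSocket connection, the Chatbot Service appends it to the conversation history, queries the ChatGPT API, and publishes the simulated call-taker reply back. The message format and field names are assumptions.
```python
# Illustrative sketch only – the DEC112 Endpoint / Chatbot Service are part of
# the Semantic Container and are not implemented in Python. Message format and
# field names ("text") are assumptions.
import asyncio
import json
import os

import openai        # pip install "openai<1.0"
import websockets    # pip install websockets

openai.api_key = os.environ["OAI_ACCESS_TOKEN"]

async def run() -> None:
    history = [{"role": "system", "content": "You simulate an emergency call taker."}]  # cf. OAI_SYSTEM
    # register at the terminating ESRP and wait for incoming training messages
    async with websockets.connect(os.environ["WS_ENDPOINT"]) as ws:
        async for raw in ws:
            msg = json.loads(raw)                                   # assumed: {"text": "..."}
            history.append({"role": "user", "content": msg["text"]})
            completion = openai.ChatCompletion.create(
                model=os.environ.get("OAI_MODEL", "gpt-3.5-turbo"),
                messages=history,
            )
            answer = completion.choices[0].message.content          # simulated call-taker reply
            history.append({"role": "assistant", "content": answer})
            await ws.send(json.dumps({"text": answer}))

if __name__ == "__main__":
    asyncio.run(run())
```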
## System Requirements
### General
* Chatbot is based on the [Semantic Container infrastructure](https://github.com/OwnYourData/semcon)
* source code publicly available on GitHub: [OwnYourData/dc-chatbot](https://github.com/OwnYourData/dc-chatbot)
* provided as a single Docker image: [`oydeu/dc-chatbot`](https://hub.docker.com/r/oydeu/dc-chatbot)
* an instance requires 256 MB RAM
* storage requires a Postgres instance (version >= 9.4 recommended for JSONB support)
### Deployment
#### Standalone
```bash=
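# replace the "xxx" placeholders with the actual OpenAI access token and WebSocket endpoint;
# the service listens on container port 3000 and is exposed on host port 3600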
docker run -d --name chatbot -p 3600:3000 \
-e OAI_ACCESS_TOKEN="xxx" \
-e WS_ENDPOINT="wss://xxx" \
oydeu/dc-chatbot
```
#### Kubernetes
sample configurations are available on GitHub:
* [deployment.yaml](https://github.com/OwnYourData/dc-chatbot/blob/main/kubernetes/deployment.yaml) -> edit environment variables
* [ingress.yaml](https://github.com/OwnYourData/dc-chatbot/blob/main/kubernetes/ingress.yaml) -> update hostname
* [cert.yaml](https://github.com/OwnYourData/dc-chatbot/blob/main/kubernetes/cert.yaml) -> update hostname
* [service.yaml](https://github.com/OwnYourData/dc-chatbot/blob/main/kubernetes/service.yaml) -> note: run as single instance (i.e., `type: NodePort`)
* [secrets.yaml](https://github.com/OwnYourData/dc-chatbot/blob/main/kubernetes/secrets.yaml) -> edit secrets (base64 encoded)
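The values in `secrets.yaml` have to be base64 encoded; a small helper to produce them (Python sketch; the key names are examples taken from the configuration options below):
```python
# Helper to produce the base64-encoded values expected in secrets.yaml
# (key names are examples taken from the configuration options below).
import base64

secrets = {
    "APP_SECRET": "xxx",
    "OAI_ACCESS_TOKEN": "xxx",
    "POSTGRES2_PASSWORD": "xxx",
}
for key, value in secrets.items():
    print(f"{key}: {base64.b64encode(value.encode()).decode()}")
```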
#### Configuration Options
| Option | Description |
| -------- | ----------- |
| `APP_KEY`* | client-id for chatbot to store conversation in container |
| `APP_SECRET`* | client-secret for chatbot to store conversation in container |
| `CHAT_LANG` | en, de (default: "de") |
| `DEFAULT_CALLTYPE` | ambulance, fire, police (default: "ambulance") |
| `OAI_ACCESS_TOKEN`* | OpenAI access token |
| `OAI_MODEL` | gpt-3.5-turbo, gpt-4 (default: "gpt-3.5-turbo") |
| `OAI_SYSTEM` | text file in configure/textblocks that provides configuration/context for chatbot (default: "OAI_system_default.txt") |
|`POSTGRES2_PASSWORD`* | password to access Postgres cluster |
| `RAILS_CONFIG_HOSTS` | host name where the service is operated, e.g. dec112-chatbot.data-container.net |
| `WS_ENDPOINT`* | websocket endpoint |
(options marked with * are required and do not have default values)
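For illustration, a startup check along these lines could enforce the required options (Python sketch only; the actual service is Ruby/Rails-based and handles its configuration internally):
```python
# Sketch of a startup check for the options above (Python for illustration only;
# the actual service is Ruby/Rails-based and handles configuration itself).
import os

REQUIRED = ["APP_KEY", "APP_SECRET", "OAI_ACCESS_TOKEN", "POSTGRES2_PASSWORD", "WS_ENDPOINT"]
DEFAULTS = {
    "CHAT_LANG": "de",
    "DEFAULT_CALLTYPE": "ambulance",
    "OAI_MODEL": "gpt-3.5-turbo",
    "OAI_SYSTEM": "OAI_system_default.txt",
}

missing = [name for name in REQUIRED if not os.environ.get(name)]
if missing:
    raise SystemExit("missing required configuration: " + ", ".join(missing))

config = {name: os.environ.get(name, default) for name, default in DEFAULTS.items()}
config.update({name: os.environ[name] for name in REQUIRED})
```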
## Tests & Monitoring
### Pytests
* test welcome message: check container is running
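A minimal sketch of such a check (Python/pytest; the base URL is a placeholder and the expected response is an assumption):
```python
# Sketch of the welcome-message check (base URL and expected status are
# assumptions; adapt to the deployed instance).
import requests

BASE_URL = "https://dec112-chatbot.data-container.net"  # hypothetical host, cf. RAILS_CONFIG_HOSTS

def test_welcome_message():
    response = requests.get(BASE_URL, timeout=10)
    assert response.status_code == 200  # container is up and responding
```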
### ng112-tester
* perform complete conversation (simple)
* https://github.com/DEC112-private/ng112-js-tester/commit/6c48f30ca1890fd5ca348e62159ef483bcdf57b7
### Monitoring
* report errors from pytests to Slack channel #automated-tests_dev
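A possible way to do this is a Slack incoming webhook (sketch only; the webhook URL is a placeholder and the actual reporting mechanism may differ):
```python
# Sketch: forward a pytest failure to the #automated-tests_dev channel via a
# Slack incoming webhook (the webhook URL is a placeholder).
import requests

SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"

def report_error(test_name: str, error: str) -> None:
    requests.post(
        SLACK_WEBHOOK_URL,
        json={"text": f":x: {test_name} failed: {error}"},
        timeout=10,
    )
```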
## KPIs
* number of conversations started
* number of conversations completed
* number of conversations with consent to data sharing
* average rating of completed conversations
* number of email addresses collected
* number of different users
* credits spent on OpenAI (balance on Oct 15: $19.73)
## Roadmap
- [ ] announce Staging Meeting on Slack
- requires available Staging infrastructure: Kubernetes Cluster from [nextlayer](https://www.nextlayer.at/)
- [ ] 2024-02-12 finish deployment and configuration of monitoring
- [ ] 2024-02-13 19:00 Staging Meeting
- [ ] 2024-02-29 finish 2 weeks of testing
- [ ] 2024-03-04 19:00 Production Meeting
## Findings
Document here any findings or changes to the system during the evaluation period on the Staging System:
* item