### Raspberry Pi IoT System Overview: A Chat Assistant Based on OpenAI GPT-3.5
#### Feature Overview
1. **Conversation history storage and management**: The assistant saves every exchange to a JSON file, recording both user inputs and assistant responses along with timestamps (a sample record is shown right after this list).
2. **Dynamic response generation**: Responses are generated through the OpenAI API, using the saved conversation history as context.
3. **Expired-conversation management**: Records older than a configurable time limit are removed automatically, keeping the conversation context current and relevant.
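For reference, here is a minimal sketch of the on-disk format. The field names match the code below; the message contents and values are invented for illustration.
```python
import json

# Illustrative only: the shape of the history file after one exchange.
# Field names match the ChatAssistant class below; the values are made up.
sample_history = [
    {"role": "user", "content": "Hello", "timestamp": "2024-01-01T12:00:00"},
    {"role": "assistant", "content": "Hi, how can I help?", "timestamp": "2024-01-01T12:00:02"},
]
print(json.dumps(sample_history, ensure_ascii=False, indent=2))
```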
#### Main Components
1. **Chat assistant initialization**:
- During initialization, the assistant checks whether the conversation-history file exists and creates it if it does not.
```python
def ensureFileExists(self):
    if not os.path.exists(self.filePath):
        with open(self.filePath, "w", encoding='utf-8') as file:
            json.dump([], file)
```
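A minimal construction sketch, assuming the `ChatAssistant` class from the source listing below; the file path and API key are placeholders.
```python
# Placeholders only: substitute a real path and key.
# Constructing the assistant creates the history file if it is missing.
assistant = ChatAssistant(filePath="conversation.json", apiKey="YOUR_OPENAI_API_KEY")
```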
2. **Handling user input**:
- Reads the conversation history, appends the new input, and generates a response with the OpenAI GPT-3.5 model.
```python
async def interact(self, userInput):
    conversationHistory = self.readConversationFromJson()
    conversationHistory = self.removeOldConversations(conversationHistory)
    conversationHistory.append({"role": "user", "content": userInput, "timestamp": self.getCurrentTimestamp()})
    # The API only accepts role/content pairs, so timestamps are stripped first.
    conversationHistoryForApi = [{"role": msg["role"], "content": msg["content"]} for msg in conversationHistory]
    # API call to OpenAI
    client = OpenAI(api_key=self.apiKey)
    response = client.chat.completions.create(
        model="gpt-3.5-turbo", messages=conversationHistoryForApi, max_tokens=50, stream=False
    )
    modelResponse = response.choices[0].message.content
    conversationHistory.append({"role": "assistant", "content": modelResponse, "timestamp": self.getCurrentTimestamp()})
    self.writeConversationToJson(conversationHistory)
    print(modelResponse)
```
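Because `interact` is declared `async`, it must run inside an event loop. A minimal call, assuming `assistant` was constructed as in the sketch above:
```python
import asyncio

# Runs one exchange: the response is printed and appended to the JSON history.
asyncio.run(assistant.interact("What can you do?"))
```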
3. **Managing and persisting the conversation log**:
- Conversation records are read from, updated in, and written back to the JSON file.
```python
def readConversationFromJson(self):
    with open(self.filePath, "r", encoding="utf-8") as file:
        return json.load(file)

def writeConversationToJson(self, conversationHistory):
    with open(self.filePath, "w", encoding='utf-8') as file:
        json.dump(conversationHistory, file, ensure_ascii=False, indent=2)
```
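A small round trip with these helpers, again assuming an `assistant` instance; the message content is illustrative only.
```python
# Read the current history, append an illustrative record, and persist it.
history = assistant.readConversationFromJson()
history.append({"role": "user", "content": "Ping", "timestamp": assistant.getCurrentTimestamp()})
assistant.writeConversationToJson(history)
```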
#### Usage Guide
1. **Configuration**: Set a valid OpenAI API key and specify the path where the conversation history will be stored.
2. **Running the assistant**: Execute the `main` function to start the chat assistant and begin accepting and responding to user input (a minimal sketch of such an entry point follows).
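The `main` function itself is not included in the listing below; here is a minimal sketch of what such an entry point might look like. The file path, API key, and input loop are assumptions, not part of the original source.
```python
import asyncio

# Hypothetical entry point; assumes the ChatAssistant class from the
# "Source Code" section below. Path and key are placeholders.
async def main():
    assistant = ChatAssistant(filePath="conversation.json", apiKey="YOUR_OPENAI_API_KEY")
    while True:
        userInput = input("You: ")
        if userInput.strip().lower() in ("quit", "exit"):
            break
        await assistant.interact(userInput)

if __name__ == "__main__":
    asyncio.run(main())
```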
This chat assistant is suited to applications that need automated customer service or dynamic conversational interaction, such as support bots, interactive game characters, or other automated dialogue systems.
#### Source Code
```python
import json
import datetime
import os

from openai import OpenAI


class ChatAssistant:
    def __init__(self, filePath, apiKey):
        self.filePath = filePath
        self.apiKey = apiKey
        self.ensureFileExists()

    def ensureFileExists(self):
        # Create an empty JSON history file on first run.
        if not os.path.exists(self.filePath):
            with open(self.filePath, "w", encoding='utf-8') as file:
                json.dump([], file)

    async def interact(self, userInput):
        try:
            # Load history, drop expired records, and append the new user message.
            conversationHistory = self.readConversationFromJson()
            conversationHistory = self.removeOldConversations(conversationHistory)
            conversationHistory.append({"role": "user", "content": userInput, "timestamp": self.getCurrentTimestamp()})
            # The API only accepts role/content pairs, so timestamps are stripped here.
            conversationHistoryForApi = [{"role": msg["role"], "content": msg["content"]} for msg in conversationHistory]
            client = OpenAI(api_key=self.apiKey)
            response = client.chat.completions.create(
                model="gpt-3.5-turbo", messages=conversationHistoryForApi, max_tokens=50, stream=False
            )
            modelResponse = response.choices[0].message.content
            conversationHistory.append({"role": "assistant", "content": modelResponse, "timestamp": self.getCurrentTimestamp()})
            self.writeConversationToJson(conversationHistory)
            print(modelResponse)
        except Exception as e:
            print(f"An error occurred: {e}")

    def getCurrentTimestamp(self):
        return datetime.datetime.now().strftime('%Y-%m-%dT%H:%M:%S')

    def removeOldConversations(self, conversationHistory, maxHistoryLength=5, timeLimitMinutes=60):
        # Keep only messages newer than the time limit, capped at maxHistoryLength entries.
        timeLimit = datetime.timedelta(minutes=timeLimitMinutes)
        timeThreshold = datetime.datetime.now() - timeLimit
        filteredMessages = [msg for msg in conversationHistory if "timestamp" in msg and datetime.datetime.fromisoformat(msg["timestamp"]) > timeThreshold]
        return filteredMessages[-maxHistoryLength:] if len(filteredMessages) > maxHistoryLength else filteredMessages

    def readConversationFromJson(self):
        with open(self.filePath, "r", encoding="utf-8") as file:
            return json.load(file)

    def writeConversationToJson(self, conversationHistory):
        with open(self.filePath, "w", encoding="utf-8") as file:
            json.dump(conversationHistory, file, ensure_ascii=False, indent=2)
```