## Batch Inference
If you have a large number of inference requests that do not depend on one another, you can use **akasha.helper.call_batch_model** to run them as a batch and speed up processing.
### call_batch_model
```text
def call_batch_model(model: LLM, prompt: List[str],
system_prompt: Union[List[str], str] = "") -> List[str]:
```
``` python
import akasha
model_obj = akasha.helper.handle_model("openai:gpt-3.5-turbo", False, 0.0)
# This system prompt asks the LLM to answer 'yes' or 'no': is the document segment relevant to the user's question?
SYSTEM_PROMPT = akasha.prompts.default_doc_grader_prompt()
documents = ["Doc1...", "Doc2...", "Doc3...", "Doc4..."]
question = "What is five-axis machining?"
prompts = ["document: " + doc + "\n\n" + "User Question: " + question for doc in documents]
response_list = akasha.helper.call_batch_model(model_obj, prompts, SYSTEM_PROMPT)
## ["yes", "no", "yes", "yes"]
```