# LangChain Agent (LCEL)
## DEMO
```python=
import os
from typing import Any, Union

from flytekit import workflow
from flytekitplugins.openai import ChatGPTTask
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts.prompt import PromptTemplate
from langchain_openai import ChatOpenAI

api_key = os.environ.get("OPENAI_API_KEY")

chatgpt_small_job = ChatGPTTask(
    name="chatgpt gpt-3.5-turbo",
    openai_organization="org-NayNG68kGnVXMJ8Ak4PMgQv7",
    chatgpt_config={
        "model": "gpt-3.5-turbo",
        "temperature": 0.7,
    },
)

model = ChatOpenAI(
    model="gpt-3.5-turbo",
    openai_api_key=api_key,
    openai_organization="org-NayNG68kGnVXMJ8Ak4PMgQv7",
)

prompt = PromptTemplate(
    input_variables=["question"],
    template="Question: {question}?",
)

output_parser = StrOutputParser()


@workflow
def wf(input: str) -> Union[str, Any]:
    # LangChain components are invoked like Flyte tasks inside the workflow.
    message = prompt(input=input)
    o0 = model(input=message)
    o1 = output_parser(input=o0)
    return o1
```
![image](https://hackmd.io/_uploads/S1TVWyPmC.png)
## Limitations
### 1. Only supports creating a class instance; calling functions (e.g. classmethods) within the class is not supported.
Concept: if the object is not callable, we can't support it.
Example:
#### Not supported
```python=
template = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful AI bot. Your name is {name}."),
    ("human", "Hello, how are you doing?"),
    ("ai", "I'm doing well, thanks!"),
    ("human", "{user_input}"),
])
```
#### Supported
```python=
prompt = PromptTemplate(
    input_variables=["question"],
    template="Question: {question}?",
)
```
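One plausible reading of this limitation (our sketch, not the plugin's actual mechanism): the agent records the class path and constructor kwargs so it can rebuild the object on the agent pod, and a classmethod call like `from_messages(...)` leaves no constructor kwargs to capture. Assuming that, the round trip could look like:

```python=
import importlib
from fractions import Fraction  # stand-in for a LangChain class


# Hypothetical sketch (not the plugin's real code): only direct construction
# can be captured as "class path + constructor kwargs" and replayed remotely.
def serialize(cls, kwargs):
    return {"module": cls.__module__, "name": cls.__name__, "kwargs": kwargs}


def deserialize(spec):
    # Rebuild the object on the agent pod by importing and calling the class.
    cls = getattr(importlib.import_module(spec["module"]), spec["name"])
    return cls(**spec["kwargs"])


spec = serialize(Fraction, {"numerator": 1, "denominator": 2})
rebuilt = deserialize(spec)
print(rebuilt)  # 1/2
```

A `from_messages(...)` result offers no such kwargs dict to record, which is consistent with the "not supported" example above.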
#### Good News
LangChain supports multiple input types for models and other components, which means that if we can find one input type that works, we can still run LangChain tasks on agent pods.
For example, we can use `"$ref":"#/definitions/StringPromptValue"` as the input even if `"$ref":"#/definitions/AIMessage"` doesn't work.
![image](https://hackmd.io/_uploads/Hkyn7JwmR.png)
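Because several input shapes are accepted, a hypothetical fallback loop (the names here are ours, not the plugin's) could try each known input shape in turn until one is accepted:

```python
# Hypothetical sketch: try each input adapter until the runnable accepts one,
# e.g. a plain string (StringPromptValue-style) after a message-style payload.
def invoke_with_fallback(runnable, raw_input, adapters):
    last_err = None
    for adapt in adapters:
        try:
            return runnable.invoke(adapt(raw_input))
        except TypeError as err:  # this input shape was rejected; try the next
            last_err = err
    raise last_err


class StringOnlyRunnable:
    """Toy runnable that only accepts str input."""

    def invoke(self, value):
        if not isinstance(value, str):
            raise TypeError("expected str")
        return value.upper()


result = invoke_with_fallback(
    StringOnlyRunnable(),
    "hello",
    adapters=[lambda x: {"content": x}, lambda x: x],  # dict fails, str works
)
print(result)  # HELLO
```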
More references on input/output binding:
1. [LangChain official documentation](https://python.langchain.com/v0.1/docs/expression_language/interface/)
2. [LangChain Heptabase Note](https://app.heptabase.com/w/06eecc1507fbd6998d7c0eb60619012cbf0858cc3079fc649099afc99e9db538)
### 2. Only supports classes that inherit from the `Runnable` interface
For example, OpenAI embeddings (`OpenAIEmbeddings`) do not inherit from the `Runnable` class.
![image](https://hackmd.io/_uploads/BkRAVJDmR.png)
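A quick duck-typed way to see the difference (a sketch; real code would presumably use `isinstance(obj, Runnable)` from `langchain_core.runnables`): `Runnable` subclasses expose `invoke`, `batch`, and `stream`, while embeddings classes expose `embed_query`/`embed_documents` instead.

```python
# Duck-typed sketch of the Runnable check; the actual plugin would likely use
# isinstance(obj, langchain_core.runnables.Runnable).
RUNNABLE_METHODS = ("invoke", "batch", "stream")


def looks_like_runnable(obj) -> bool:
    return all(callable(getattr(obj, m, None)) for m in RUNNABLE_METHODS)


class FakeChatModel:  # stand-in for a Runnable such as ChatOpenAI
    def invoke(self, x):
        return x

    def batch(self, xs):
        return xs

    def stream(self, x):
        yield x


class FakeEmbeddings:  # stand-in for OpenAIEmbeddings (not a Runnable)
    def embed_query(self, text):
        return [0.0]


print(looks_like_runnable(FakeChatModel()))   # True
print(looks_like_runnable(FakeEmbeddings()))  # False
```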
Reference:
https://python.langchain.com/v0.1/docs/expression_language/interface/
## Concerns
In the [official documentation](https://python.langchain.com/v0.1/docs/modules/), we can see many components worth supporting, for example Model I/O, Retrieval, and Composition.
![image](https://hackmd.io/_uploads/rJFw81w7C.png)
The current version only supports classes that inherit from the `Runnable` interface, that is, the LangChain Expression Language (LCEL). I can try to support more components, and can prioritize them if I know which ones matter most.
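For reference, LCEL support boils down to handling objects that compose with the `|` operator. A pure-Python sketch of that pipe idea, using stand-in classes rather than LangChain's implementation:

```python
# Toy model of LCEL composition: each Runnable-like step transforms the
# previous step's output, mirroring `prompt | model | parser`.
class Step:
    def __init__(self, fn):
        self.fn = fn

    def invoke(self, x):
        return self.fn(x)

    def __or__(self, other):
        # Chaining produces a new step that runs self, then other.
        return Step(lambda x: other.invoke(self.invoke(x)))


prompt = Step(lambda q: f"Question: {q}?")
model = Step(lambda p: p.upper())  # stand-in for an LLM call
parser = Step(lambda s: s.strip())

chain = prompt | model | parser
print(chain.invoke("what is flyte"))  # QUESTION: WHAT IS FLYTE?
```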