# LangChain's Top-Level Modules

- Model I/O: core modules for working with language models
- Data Connection
- Chains
- Memory
- Agents
- Callbacks

# Model I/O

Provides a direct interface to language models.

- Prompts: control the input
- Language models: call the model interface
- Output parsers: extract information from the output

## Prompts

### [Prompt templates](https://python.langchain.com/docs/modules/model_io/prompts/prompt_templates/): provide templates for the input

- Example:

```python=
from langchain import PromptTemplate

template = """\
You are a naming consultant for new companies.
What is a good name for a company that makes {product}?
"""

prompt = PromptTemplate.from_template(template)
prompt.format(product="colorful socks")
```

```python=
You are a naming consultant for new companies.
What is a good name for a company that makes colorful socks?
```

- MessagePromptTemplate: provides role-based message templates such as system / ai / human
- ChatPromptTemplate: builds a conversational prompt template from n message templates
- [Feature Store](https://www.tecton.ai/blog/what-is-a-feature-store/): a system for managing data features, providing a more efficient interface (transform / storage / serving) between models and data
- Feature Store alternatives: Tecton, Featureform
- Sample: build a prompt from Feature Store data + a template, then inject it into an LLMChain as input

```python=
from feast import FeatureStore
from langchain.prompts import PromptTemplate, StringPromptTemplate

# A Feast feature repository is assumed to exist at this path;
# `store` is used inside FeastPromptTemplate.format() below.
store = FeatureStore(repo_path=".")

template = """Given the driver's up to date stats, write them a note relaying those stats to them.
If they have a conversation rate above .5, give them a compliment.
Otherwise, make a silly joke about chickens at the end to make them feel better.

Here are the driver's stats:
Conversation rate: {conv_rate}
Acceptance rate: {acc_rate}
Average Daily Trips: {avg_daily_trips}

Your response:"""

prompt = PromptTemplate.from_template(template)


class FeastPromptTemplate(StringPromptTemplate):
    def format(self, **kwargs) -> str:
        driver_id = kwargs.pop("driver_id")
        feature_vector = store.get_online_features(
            features=[
                "driver_hourly_stats:conv_rate",
                "driver_hourly_stats:acc_rate",
                "driver_hourly_stats:avg_daily_trips",
            ],
            entity_rows=[{"driver_id": driver_id}],
        ).to_dict()
        kwargs["conv_rate"] = feature_vector["conv_rate"][0]
        kwargs["acc_rate"] = feature_vector["acc_rate"][0]
        kwargs["avg_daily_trips"] = feature_vector["avg_daily_trips"][0]
        return prompt.format(**kwargs)


prompt_template = FeastPromptTemplate(input_variables=["driver_id"])
```

```python=
from langchain.chat_models import ChatOpenAI
from langchain.chains import LLMChain

chain = LLMChain(llm=ChatOpenAI(), prompt=prompt_template)
chain.run(1001)
```

```python=
"Hi there! I wanted to update you on your current stats. Your acceptance rate is 0.055561766028404236 and your average daily trips are 936. While your conversation rate is currently 0.4745151400566101, I have no doubt that with a little extra effort, you'll be able to exceed that .5 mark! Keep up the great work! And remember, even chickens can't always cross the road, but they still give it their best shot."
```

- [Custom prompt template](https://python.langchain.com/docs/modules/model_io/prompts/prompt_templates/custom_prompt_template): define your own prompt template
- FewShotPromptTemplate: displays an example set in the prompt
- ExampleSelector: selects examples from the example set by semantic similarity and displays them

```python=
example_selector = SemanticSimilarityExampleSelector.from_examples(
    examples,  # This is the list of examples available to select from.
    OpenAIEmbeddings(),  # This is the embedding class used to produce embeddings which are used to measure semantic similarity.
    Chroma,  # This is the VectorStore class that is used to store the embeddings and do a similarity search over.
    k=1,  # This is the number of examples to produce.
)
```

- FewShotPromptTemplate + ExampleSelector: displays samples in the prompt based on the examples chosen by the example selector

```python=
prompt = FewShotPromptTemplate(
    example_selector=example_selector,
    example_prompt=example_prompt,
    suffix="Question: {input}",
    input_variables=["input"],
)
```

- Formatting: calling a prompt class's .format_prompt method lets you inspect how the current prompt settings render

```python=
chat_prompt.format_prompt(
    input_language="English", output_language="French", text="I love programming."
).to_messages()

[SystemMessage(content='You are a helpful assistant that translates English to French.', additional_kwargs={}),
 HumanMessage(content='I love programming.', additional_kwargs={})]
```

- Partial Prompting: define only some of the prompt's input variables up front
- Composition: combine multiple prompts into a pipeline

https://python.langchain.com/docs/modules/model_io/prompts/prompt_templates/prompt_serialization

### Example selectors
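To make the selection mechanics concrete, here is a minimal, self-contained sketch of what an example selector plus FewShotPromptTemplate does together. Word overlap stands in for the embedding-based semantic similarity that `SemanticSimilarityExampleSelector` computes, and the function names (`select_examples`, `few_shot_prompt`) are hypothetical, not LangChain's API:

```python
# Hypothetical sketch: word-overlap similarity stands in for the
# embedding similarity used by SemanticSimilarityExampleSelector.
def select_examples(examples, query, k=1):
    """Return the k examples whose inputs share the most words with the query."""
    def overlap(example):
        return len(set(example["input"].split()) & set(query.split()))
    return sorted(examples, key=overlap, reverse=True)[:k]


def few_shot_prompt(examples, query, k=1):
    """Render the selected examples followed by the final question,
    mirroring FewShotPromptTemplate's example_prompt + suffix structure."""
    parts = [
        f"Input: {ex['input']}\nOutput: {ex['output']}"
        for ex in select_examples(examples, query, k)
    ]
    parts.append(f"Question: {query}")
    return "\n\n".join(parts)


examples = [
    {"input": "happy", "output": "sad"},
    {"input": "tall building", "output": "short building"},
]
print(few_shot_prompt(examples, "tall tree"))
```

With `k=1` and the query "tall tree", the overlap score picks the "tall building" example, so the rendered prompt contains that single example followed by the final question, the same shape the `FewShotPromptTemplate` snippet above produces with `suffix="Question: {input}"`.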