W&B: Building LLM-Powered Apps
LLM Fundamentals
Predict the next token
The main steps are as follows (a minimal sketch is given after this list):
- Provide the input text:
Weights & Biases is
- Tokenize the input:
[1135,2338,3134,223,3432,2123]
- Pass the tokens to the LLM
- The LLM predicts the most likely next token, one step at a time
- Sample the next token from this distribution and output it
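A minimal sketch of these steps, assuming the Hugging Face `transformers` library with GPT-2 as a stand-in model (the token IDs shown above are illustrative and will differ between tokenizers):

```python
# Minimal next-token prediction sketch (assumption: GPT-2 via transformers).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

text = "Weights & Biases is"
input_ids = tokenizer(text, return_tensors="pt").input_ids   # tokenize the input

with torch.no_grad():
    logits = model(input_ids).logits          # shape: (1, seq_len, vocab_size)
next_token_logits = logits[0, -1]             # scores for the next token only
next_token_id = int(torch.argmax(next_token_logits))  # greedy pick of the most likely token
print(tokenizer.decode([next_token_id]))
```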
How to control the LLM's output
1. Temperature in LLM
Temperature: The LLM temperature is a hyperparameter that regulates the randomness, or creativity, of the AI’s responses.
The higher the temperature, the more varied and the harder to control the responses become.
The lower the temperature, the fewer possibilities there are and the more deterministic the responses become.
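A minimal sketch of how temperature reshapes the next-token distribution, using NumPy and made-up logits for illustration:

```python
# Temperature scaling sketch: divide the logits by T before the softmax.
import numpy as np

def softmax_with_temperature(logits, temperature):
    scaled = np.array(logits) / temperature   # lower T sharpens, higher T flattens
    scaled -= scaled.max()                    # numerical stability
    probs = np.exp(scaled)
    return probs / probs.sum()

logits = [2.0, 1.0, 0.5, 0.1]                 # toy next-token logits
print(softmax_with_temperature(logits, 0.2))  # low T: almost all mass on the top token
print(softmax_with_temperature(logits, 1.5))  # high T: probabilities flatten out
```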
2. Top P sampling
Explanation: Top-p sampling (or nucleus sampling) chooses from the smallest possible set of tokens whose cumulative probability exceeds the probability p.
Only this smallest set of tokens whose cumulative probability reaches p is kept when sampling the response.
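A minimal sketch of top-p sampling over a toy next-token distribution, using NumPy (real implementations operate on the full vocabulary):

```python
# Top-p (nucleus) sampling sketch over a toy distribution.
import numpy as np

def top_p_sample(probs, p, rng=np.random.default_rng()):
    probs = np.asarray(probs)
    order = np.argsort(probs)[::-1]               # sort tokens by probability, descending
    cumulative = np.cumsum(probs[order])
    cutoff = np.searchsorted(cumulative, p) + 1   # smallest set whose cumulative prob >= p
    keep = order[:cutoff]
    kept_probs = probs[keep] / probs[keep].sum()  # renormalise over the kept tokens
    return rng.choice(keep, p=kept_probs)

probs = [0.5, 0.2, 0.15, 0.1, 0.05]               # toy next-token distribution
print(top_p_sample(probs, p=0.8))                 # samples only from the top 3 tokens
```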
Prompt Engineering
1. Level 5 prompt
A complex directive that includes the following (an illustrative template is given after this list):
- Description of high-level goal
- A detailed bulleted list of sub-tasks
- An explicit statement asking LLM to explain its own output
- A guideline on how LLM output will be evaluated
- Few-shot examples
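An illustrative template that combines the elements above; the wording and the `support_doc` / `user_question` placeholders are assumptions, not the course's exact prompt:

```python
# Illustrative "level 5" prompt template (assumed wording and placeholders).
examples = """Q: How do I log a metric?
A: Call wandb.log({"accuracy": 0.9}) inside your training loop."""

prompt = f"""You are a support bot for the W&B documentation.

High-level goal:
- Answer the user's question using only the provided documentation.

Sub-tasks:
- Read the documentation excerpt below.
- Identify the part that answers the question.
- Write a concise answer.

Explain your output:
- After the answer, briefly explain which part of the excerpt you relied on.

Evaluation guideline:
- Answers are judged on factual accuracy against the excerpt, not on length.

Examples:
{examples}

Documentation excerpt:
{{support_doc}}

Question:
{{user_question}}
"""
print(prompt)
```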
2. Zero shot
3. Few Shot
Start with a set of real questions -> split the source material into chunks
Combine these chunks into the prompt:
In this way we can generate the example questions we need and use them to improve the model; along the way, make sure the generated questions actually meet our requirements.
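A minimal sketch of combining real example questions with document chunks into a question-generation prompt; the character-based chunking and the prompt wording are assumptions for illustration:

```python
# Few-shot question-generation sketch (assumed chunking and prompt wording).
def chunk_text(text, chunk_size=500):
    """Split a document into fixed-size character chunks."""
    return [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]

def build_few_shot_prompt(example_questions, chunk):
    """Combine real example questions and one chunk into a single prompt."""
    shots = "\n".join(f"- {q}" for q in example_questions)
    return (
        "Here are examples of real user questions:\n"
        f"{shots}\n\n"
        "Based on the document chunk below, generate similar questions.\n\n"
        f"Document chunk:\n{chunk}\n"
    )

example_questions = ["How do I resume a crashed run?", "How do I log images to W&B?"]
doc = "..."  # a real support document would go here
for chunk in chunk_text(doc):
    print(build_few_shot_prompt(example_questions, chunk))
```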