<style>
img {
    display: block;
    margin-left: auto;
    margin-right: auto;
}
</style>

> [Paper link](https://arxiv.org/abs/2308.09687) | [Note link](https://blog.csdn.net/qq_42801194/article/details/132644647) | [Code link](https://github.com/spcl/graph-of-thoughts) | AAAI 2024

:::success
**Thoughts**
1. Graph of Thoughts (GoT) enhances LLMs’ capabilities through networked reasoning.
2. It enables fine-grained control over individual thoughts.
3. The authors illustrate several use cases for GoT.
:::

## Abstract

This paper introduces Graph of Thoughts (GoT), which models the information generated by an LLM as an arbitrary graph. This abstraction enables combining arbitrary LLM thoughts into synergistic outcomes, distilling the essence of whole networks of thoughts, and enhancing thoughts with feedback loops.

## Background

Prompt engineering is a resource-efficient approach to solving different LLM tasks.

> Chain-of-Thought (CoT)

CoT is a prompting approach in which one includes the intermediate steps of reasoning (intermediate “thoughts”) within the prompt, besides the task input/output.

> Tree of Thoughts (ToT)

ToT models the LLM reasoning process as a tree. It offers novel capabilities such as backtracking from non-promising outcomes.

Below is a table comparing different prompting strategies.

![image](https://hackmd.io/_uploads/Hkh4mf5oR.png)

## Method

In GoT, an LLM thought is modeled as a vertex, while an edge is a dependency between such thoughts. Using GoT, one can aggregate arbitrary thoughts by constructing vertices that have more than one incoming edge. Overall, the graph abstraction harnessed by GoT seamlessly generalizes CoT and ToT to more complex thought patterns, without resorting to any model updates. (A minimal code sketch of this abstraction appears at the end of this note.)

![image](https://hackmd.io/_uploads/BktjEx9sA.png)

The figure below shows how GoT applies the aggregation and generation thought transformations.

![image](https://hackmd.io/_uploads/SJPh4x9s0.png)

The figure below shows the system architecture of GoT and the APIs of the respective modules. The user can straightforwardly extend the design towards new prompting schemes, experiment with novel thought transformations, and plug in different LLMs.

![image](https://hackmd.io/_uploads/SJ8lSxqsR.png)

![image](https://hackmd.io/_uploads/H17bHe9iR.png)

## Experiment

In the experiments, the authors describe several use cases of GoT.

### Sorting

![image](https://hackmd.io/_uploads/ryZVSe9oA.png)

### Set Operations

![image](https://hackmd.io/_uploads/HyufEMcjR.png)

### Keyword Counting

![image](https://hackmd.io/_uploads/HyK4VM9sA.png)

### Document Merging

![image](https://hackmd.io/_uploads/H1HBNzqiR.png)
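
## Appendix: A Minimal GoT Sketch

The Python sketch below illustrates the graph abstraction from the Method section: thoughts as vertices, dependencies as edges, and aggregation as a new vertex with more than one incoming edge. The names `Thought`, `GraphOfThoughts`, and `aggregate` are assumptions made for illustration only; they are not the API of the official [graph-of-thoughts](https://github.com/spcl/graph-of-thoughts) repository.

```python
from dataclasses import dataclass, field

# Illustrative sketch only (hypothetical names, not the official graph-of-thoughts API):
# a thought is a vertex; its incoming edges are the thoughts it depends on.

@dataclass
class Thought:
    content: str                                   # the LLM output this vertex holds
    parents: list = field(default_factory=list)    # incoming edges (dependencies)

class GraphOfThoughts:
    def __init__(self):
        self.vertices = []

    def add(self, content, parents=()):
        # Generation transformation: create a new thought, optionally
        # depending on previously generated thoughts.
        t = Thought(content, list(parents))
        self.vertices.append(t)
        return t

    def aggregate(self, thoughts, combine):
        # Aggregation transformation: a new vertex with several incoming
        # edges combines multiple thoughts into one outcome.
        merged = combine([t.content for t in thoughts])
        return self.add(merged, parents=thoughts)

# Toy usage mirroring the sorting use case: two independently produced
# partial results are aggregated into a single merged thought.
graph = GraphOfThoughts()
left = graph.add("sorted left half: [1, 4, 7]")
right = graph.add("sorted right half: [2, 3, 9]")
merged = graph.aggregate([left, right], combine=lambda parts: " + ".join(parts))
print(merged.content, "| incoming edges:", len(merged.parents))
```

Running the sketch yields a merged vertex with two incoming edges, which is how GoT represents combining partial results (e.g., sorted sublists or document fragments) into a single synergistic outcome.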