
Overview of the Tree of Thoughts (ToT) Framework


For complex tasks that require exploration or strategic lookahead, traditional prompting techniques fall short. The Tree of Thoughts (ToT) framework generalizes chain-of-thought prompting, guiding language models (LMs) to solve problems by exploring "thoughts" that serve as intermediate steps.

ToT maintains a tree of thoughts, where each thought is a coherent language sequence that acts as an intermediate step toward solving the problem. This lets the LM self-evaluate its intermediate progress through deliberate reasoning. By combining thought generation and evaluation with search algorithms such as breadth-first search (BFS) and depth-first search (DFS), the LM can systematically explore the tree, looking ahead and backtracking as needed.
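As a sketch, the BFS variant of this generate-evaluate-search loop might look like the following. Here `generate` and `evaluate` are hypothetical stand-ins for the LM's propose and value prompts; the toy demo at the bottom uses plain Python functions in their place.

```python
from typing import Callable, List

def tot_bfs(
    root: str,
    generate: Callable[[str], List[str]],  # stand-in for the LM's "propose" prompt
    evaluate: Callable[[str], float],      # stand-in for the LM's "value" prompt
    steps: int,
    beam_width: int,
) -> List[str]:
    """Breadth-first Tree-of-Thoughts search: at each step, expand every
    surviving state, score all candidates, and keep the top beam_width."""
    frontier = [root]
    for _ in range(steps):
        candidates = [c for state in frontier for c in generate(state)]
        candidates.sort(key=evaluate, reverse=True)
        frontier = candidates[:beam_width]
    return frontier

if __name__ == "__main__":
    # Toy demo: "thoughts" are digit strings; the evaluator prefers larger numbers.
    gen = lambda s: [s + d for d in "0123456789"]
    best = tot_bfs("", gen, lambda s: int(s or 0), steps=3, beam_width=5)
    print(best[0])  # prints 999
```

In a real ToT setup, `generate` would sample candidate continuations from the LM and `evaluate` would map the LM's verbal judgment of a state to a numeric score.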

Applying ToT requires defining, per task, the number of thought steps and the number of candidates kept at each step. For instance, the Game of 24 from the paper (a mathematical reasoning task) is structured as follows:

  • The problem is decomposed into 3 thought steps, each producing an intermediate equation;
  • The best 5 candidates are kept at each step;
  • Search proceeds breadth-first (BFS): the LM evaluates each candidate partial solution as "sure", "maybe", or "impossible" with respect to reaching 24;
  • The aim is to promote correct partial solutions that can be verified with a few lookahead trials, eliminate impossible ones via commonsense judgments (e.g., "too big" or "too small"), and keep the rest as "maybe". The evaluation is sampled 3 times per thought.
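The setup above can be sketched as follows. This is a minimal illustration, not the paper's implementation: a deterministic heuristic (`value`, `prod_gap`) stands in for the LM's sampled "sure/maybe/impossible" value prompt, and `expand` enumerates intermediate equations instead of prompting the LM; all three helper names are hypothetical.

```python
from itertools import combinations

def expand(nums):
    """One thought step: combine two numbers with +, -, *, / into a new
    state (one fewer number), recording the intermediate equation."""
    out = []
    for i, j in combinations(range(len(nums)), 2):
        a, b = nums[i], nums[j]
        rest = [nums[k] for k in range(len(nums)) if k not in (i, j)]
        results = [(a + b, f"{a}+{b}"), (a * b, f"{a}*{b}"),
                   (a - b, f"{a}-{b}"), (b - a, f"{b}-{a}")]
        if b: results.append((a / b, f"{a}/{b}"))
        if a: results.append((b / a, f"{b}/{a}"))
        for val, eq in results:
            out.append((rest + [val], eq))
    return out

def value(nums):
    """Stand-in for the LM value prompt (which the paper samples 3 times):
    classify a state as sure / maybe / impossible."""
    if len(nums) == 1:
        return "sure" if abs(nums[0] - 24) < 1e-6 else "impossible"
    return "maybe"

def prod_gap(nums):
    """Crude numeric ranking in place of aggregated LM judgments."""
    p = 1.0
    for n in nums:
        p *= n
    return abs(p - 24)

def solve24(numbers, beam=5):
    frontier = [(list(map(float, numbers)), [])]
    for _ in range(len(numbers) - 1):            # 3 thought steps for 4 numbers
        candidates = []
        for nums, eqs in frontier:
            for new_nums, eq in expand(nums):
                if value(new_nums) != "impossible":   # prune dead ends
                    candidates.append((new_nums, eqs + [eq]))
        candidates.sort(key=lambda c: prod_gap(c[0]))
        frontier = candidates[:beam]              # keep the top-5 candidates
    nums, eqs = frontier[0]
    return eqs, nums[0]
```

For example, `solve24([1, 2, 3, 4])` returns three intermediate equations and a final value of 24.0. Note that the product-based ranking happens to suit multiplication-heavy instances; the paper instead relies on the LM's own evaluations to rank candidates.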