Chain-of-Thought
Learn what Chain-of-Thought (CoT) prompting means in AI and machine learning, with examples and related concepts.
Definition
Chain-of-Thought (CoT) prompting is a technique where you instruct an LLM to show its reasoning step by step before giving a final answer.
Without CoT, models often jump straight to an answer and get complex problems wrong. With CoT, the model “thinks out loud,” which dramatically improves accuracy on math, logic, and multi-step reasoning tasks.
The simplest form: adding “Let’s think step by step” to your prompt. More advanced forms include structured reasoning frameworks and self-consistency checks.
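The two simplest forms described above reduce to plain prompt construction. A minimal sketch, assuming nothing beyond string formatting; the helper names are illustrative, not from any library:

```python
ZERO_SHOT_SUFFIX = "\n\nLet's think step by step."

def zero_shot_cot(question: str) -> str:
    """Zero-shot CoT: append the trigger phrase to the raw question."""
    return question + ZERO_SHOT_SUFFIX

def few_shot_cot(question: str, examples: list[tuple[str, str]]) -> str:
    """Few-shot CoT: prepend worked examples whose answers show the
    reasoning steps, then leave the final answer open for the model."""
    parts = [f"Q: {q}\nA: {a}" for q, a in examples]
    parts.append(f"Q: {question}\nA:")
    return "\n\n".join(parts)

prompt = zero_shot_cot(
    "If a store has 3 shelves with 8 books each, "
    "and 5 books are sold, how many remain?"
)
```

The few-shot variant matters because the model imitates the *shape* of the example answers: if your examples reason step by step, the model tends to as well.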
How It Works
Without CoT:

```text
Q: "If a store has 3 shelves with 8 books each, and 5 books are sold, how many remain?"
A: "19"   ← correct here, but direct answers often fail on harder problems
```

With CoT:

```text
Q: "... Let's think step by step."
A: "Step 1: 3 shelves × 8 books = 24 books total
    Step 2: 24 − 5 sold = 19 books remaining
    Answer: 19"
```
For harder problems, the step-by-step reasoning catches mistakes that a direct answer would miss.
Why It Matters
- Accuracy — on some math benchmarks, CoT prompting has lifted large-model accuracy from under 20% to well over 50%; exact gains vary by model and task
- Transparency — You can see where the model’s reasoning goes wrong
- Debugging — When the answer is wrong, the chain shows exactly which step failed
- Complex tasks — Essential for multi-step problems like planning, code debugging, and analysis
Example
```python
from anthropic import Anthropic

client = Anthropic()

# Without CoT — the model may rush to a wrong answer
response = client.messages.create(
    model="claude-sonnet-4-6",
    max_tokens=512,
    messages=[{
        "role": "user",
        "content": "A farmer has 15 sheep. All but 8 die. How many sheep are left?",
    }],
)
# Might answer "7" (15 − 8) — wrong!

# With CoT — the model reasons through the trick question
response = client.messages.create(
    model="claude-sonnet-4-6",
    max_tokens=512,
    messages=[{
        "role": "user",
        "content": (
            "A farmer has 15 sheep. All but 8 die. How many sheep are left?\n"
            "Think through this step by step before answering."
        ),
    }],
)
# "All but 8 die" means 8 survive. Answer: 8 ✓
```
CoT Variants
| Variant | How It Works |
|---|---|
| Zero-shot CoT | Just add “Let’s think step by step” |
| Few-shot CoT | Provide example reasoning chains |
| Self-consistency | Generate multiple chains, pick the most common answer |
| Tree of Thought | Explore multiple reasoning paths in parallel |
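The self-consistency row in the table reduces to a majority vote over final answers from independently sampled chains. A minimal sketch of just the voting step, assuming the sampling calls (e.g. several API requests at nonzero temperature) happen elsewhere:

```python
from collections import Counter

def self_consistency(answers: list[str]) -> str:
    """Majority vote: return the most common final answer across
    independently sampled reasoning chains."""
    return Counter(answers).most_common(1)[0][0]

# e.g. five sampled chains ended with these final answers:
self_consistency(["8", "7", "8", "8", "7"])  # → "8"
```

The intuition is that wrong chains tend to fail in different ways, while correct chains converge on the same answer, so the mode is more reliable than any single chain.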
Key Takeaways
- Adding “think step by step” is one of the highest-ROI prompt engineering tricks
- CoT is essential for math, logic, code debugging, and any multi-step reasoning
- Claude and GPT-4 class models benefit most — smaller models may produce lower quality reasoning chains
- For critical decisions, use self-consistency: generate 3-5 chains and take the majority answer
- Some models (o1, Claude with extended thinking) do CoT internally without being asked
Part of the DeepRaft Glossary — AI and ML terms explained for developers.