## What is LLM Council?
Instead of asking a single LLM for answers, LLM Council runs your question through a three-stage deliberation (sketched in the example after this list):
- Stage 1: Sends your question to multiple LLMs in parallel (GPT, Claude, Gemini, Grok, etc.)
- Stage 2: Each LLM reviews and ranks the other responses (anonymized to prevent bias)
- Stage 3: A Chairman LLM synthesizes all responses into a final, high-quality answer
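The three stages are easy to sketch directly against OpenRouter's OpenAI-compatible chat completions endpoint. The example below illustrates the shape of the flow only; it is not the package's internal implementation, and the model IDs, prompts, and chairman choice are placeholder assumptions.

```python
# Illustrative sketch of the three-stage council flow. NOT the package's
# internal implementation: model IDs, prompts, and the chairman choice are
# placeholder assumptions. Requires OPENROUTER_API_KEY in the environment.
import asyncio
import os

import httpx

API_URL = "https://openrouter.ai/api/v1/chat/completions"
HEADERS = {"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"}

COUNCIL = ["openai/gpt-4o", "anthropic/claude-3.5-sonnet", "google/gemini-pro-1.5"]
CHAIRMAN = "anthropic/claude-3.5-sonnet"


async def ask(client: httpx.AsyncClient, model: str, prompt: str) -> str:
    """Send a single-turn prompt to one model and return its reply text."""
    resp = await client.post(
        API_URL,
        headers=HEADERS,
        json={"model": model, "messages": [{"role": "user", "content": prompt}]},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]


async def council(question: str) -> str:
    async with httpx.AsyncClient() as client:
        # Stage 1: fan the question out to every council member in parallel.
        answers = await asyncio.gather(*(ask(client, m, question) for m in COUNCIL))

        # Stage 2: each member ranks the anonymized set of answers
        # (labelled "Response N" so no model knows who wrote what).
        anonymized = "\n\n".join(f"Response {i + 1}:\n{a}" for i, a in enumerate(answers))
        review_prompt = (
            f"Question: {question}\n\n{anonymized}\n\n"
            "Rank these responses from best to worst and explain briefly."
        )
        rankings = await asyncio.gather(*(ask(client, m, review_prompt) for m in COUNCIL))

        # Stage 3: the chairman synthesizes answers and rankings into a final reply.
        rankings_text = "\n\n".join(rankings)
        synthesis_prompt = (
            f"Question: {question}\n\n"
            f"Candidate answers:\n{anonymized}\n\n"
            f"Peer rankings:\n{rankings_text}\n\n"
            "Write the single best final answer to the question."
        )
        return await ask(client, CHAIRMAN, synthesis_prompt)


if __name__ == "__main__":
    print(asyncio.run(council("Explain the CAP theorem trade-offs for a payments system.")))
```

The package handles the anonymization, ranking aggregation, and error handling for you; the sketch is only meant to show the fan-out, peer-review, and synthesis steps.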
## Key Features
- Multi-model deliberation - Get answers validated across multiple AI models
- Peer review - Anonymous evaluation prevents model favoritism
- Flexible integration - Use as MCP server, HTTP API, or Python library
- Confidence tiers - Quick, balanced, high, and reasoning modes
- Jury mode - Binary verdicts for go/no-go decisions (see the conceptual sketch after this list)
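Jury mode is easiest to understand as a variation of the same deliberation: instead of a synthesized answer, each member returns a binary verdict and the verdicts are aggregated. The sketch below reuses `ask`, `COUNCIL`, `httpx`, and `asyncio` from the previous example and is conceptual only; it is not the library's API, and how the real package exposes jury mode and confidence tiers is covered in its own documentation.

```python
# Conceptual jury-mode sketch: a go/no-go vote instead of a synthesized answer.
# Reuses ask(), COUNCIL, httpx, and asyncio from the previous example; this is
# an illustration of the idea, not llm-council's actual API.
async def jury(question: str) -> bool:
    verdict_prompt = f"{question}\n\nAnswer with exactly one word: APPROVE or REJECT."
    async with httpx.AsyncClient() as client:
        votes = await asyncio.gather(*(ask(client, m, verdict_prompt) for m in COUNCIL))
    approvals = sum("APPROVE" in v.upper() for v in votes)
    # Simple majority decides; a chairman tie-breaker is one alternative design.
    return approvals > len(COUNCIL) / 2
```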
## Quick Start
```bash
# Install
pip install "llm-council-core[mcp]"

# Set API key
export OPENROUTER_API_KEY="sk-or-v1-..."

# Use with Claude Code
claude mcp add llm-council --scope user -- llm-council
```
Then in Claude Code:
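For example, you might prompt something like the line below; the exact wording is up to you, and Claude Code routes the request through the `llm-council` MCP server you just added:

```
Ask the llm-council whether this caching strategy is safe to ship, and summarize the verdict.
```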
## Use Cases
- Code review - Get multiple AI perspectives on code changes
- Architecture decisions - Validate design choices with AI jury
- Content validation - Check factual accuracy across models
- Complex problem solving - Leverage diverse AI reasoning
## Community
- Discord - Real-time chat and support
- GitHub Discussions - Q&A and ideas
- Contributing Guide - Help improve LLM Council
## Next Steps
- Installation - Detailed setup instructions
- Quick Start - Get up and running in 5 minutes
- Configuration - Customize your council