Tokyo-based Sakana AI details a new Monte Carlo Tree Search-based technique that lets multiple LLMs cooperate on a single task, outperforming individual LLMs
Japanese AI lab Sakana AI has introduced a new technique that allows multiple large language models (LLMs) to cooperate on a single task …
Sakana AI dropped AB-MCTS, an algorithm that lets competing AI models work together, building on each other's strengths and errors, to solve complex problems. Using ChatGPT, Gemini, and DeepSeek together, it solved 30% of ARC-AGI-2 puzzles vs. just 23% for the top solo models https://x.com/...