Anthropic updates tool calling to reduce token use; its tool search feature cuts token consumption by up to 80%, making larger tool sets practical.
With reported 3x speed gains and limited degradation in output quality, the method targets one of the biggest pain points in production AI systems: latency at scale.
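The article doesn't spell out the mechanics, but the core idea behind tool search can be sketched: rather than loading every tool definition into the model's context on each request, the client keeps a searchable registry and sends only the definitions that match the current task. Everything below (`Tool`, `ToolRegistry`, `search`) is a hypothetical illustration under that assumption, not Anthropic's actual API.

```python
# Hypothetical sketch of "tool search": instead of sending every tool
# definition with each request, the client keeps a searchable registry and
# loads only the matching definitions into the model's context.
# All names here (Tool, ToolRegistry, search) are illustrative assumptions,
# not Anthropic's actual API.
from dataclasses import dataclass, field


@dataclass
class Tool:
    name: str
    description: str
    schema: dict = field(default_factory=dict)  # JSON schema for parameters


class ToolRegistry:
    def __init__(self, tools: list[Tool]):
        self.tools = tools

    def search(self, query: str, limit: int = 3) -> list[Tool]:
        # Naive word-overlap scoring; a real system might use regex
        # matching or embeddings instead.
        terms = set(query.lower().split())
        scored = []
        for tool in self.tools:
            words = set(f"{tool.name} {tool.description}".lower().split())
            score = len(terms & words)
            if score:
                scored.append((score, tool))
        scored.sort(key=lambda pair: pair[0], reverse=True)
        return [tool for _, tool in scored[:limit]]


registry = ToolRegistry([
    Tool("get_weather", "fetch the current weather for a city"),
    Tool("send_email", "send an email to a recipient"),
    Tool("query_db", "run a read-only sql query against the warehouse"),
])

# Only the matching definitions enter the context window, not all three.
hits = registry.search("what is the weather in paris")
print(hits[0].name)  # → get_weather
```

With hundreds of registered tools, sending only the top few matching definitions shrinks the prompt roughly in proportion to the fraction of tools excluded, which is plausibly where token reductions of the reported magnitude come from.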