Claude Code v2.1.116: Performance Wins for Large Sessions and MCP Scaling
Claude Code v2.1.116 significantly reduces the latency of resuming large developer sessions, cutting load times by up to 67% for context-heavy projects.
What Happened
The latest release of Claude Code v2.1.116 focuses on performance bottlenecks that emerge during long-running agentic workflows.
- Session Resumption Speed: The `/resume` command is now significantly faster, particularly for sessions exceeding 40MB. The update optimizes the handling of "dead-fork" entries (state paths that were explored but eventually abandoned), which previously slowed down the reconstruction of the session history.
- MCP Optimization: Startup times for the Model Context Protocol (MCP) have been improved when multiple stdio servers are configured. Specifically, the `resources/templates/list` call is now deferred until the first time a user types an `@`-mention, preventing a blocking discovery phase during the initial boot.
- IDE Terminal Integration: For developers using Claude Code inside VS Code, Cursor, or Windsurf, the `/terminal-setup` command now configures the editor's scroll sensitivity. This addresses the common friction of erratic fullscreen scrolling within the integrated terminal.
- UI Feedback: A new inline progress indicator for the thinking spinner ("still thinkin") provides more granular feedback during high-latency inference cycles.
Why This Matters for Agent Builders
As we move from simple prompt-response cycles to long-running autonomous agents, the "context tax" becomes a primary developer friction point. A 40MB session represents an enormous amount of historical state; if the tool takes several seconds to hydrate that state every time you resume a task, flow is broken. By optimizing "dead-fork" handling, Anthropic is acknowledging that agentic work is non-linear. The ability to prune or efficiently skip irrelevant branches of a previous thought-trace is essential for maintaining CLI responsiveness as tasks grow in complexity.
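The release notes don't describe the internals, but the general idea of skipping dead forks can be sketched as a reachability pass: if each session entry records its parent, only entries on the path from the live head back to the root need to be replayed. The types and names below are illustrative assumptions, not Claude Code's actual data model.

```typescript
// Hypothetical sketch of "dead-fork" pruning. Each entry points at its
// parent; abandoned branches are simply never reached when walking back
// from the live head, so they are dropped in a single pass.
interface SessionEntry {
  id: string;
  parentId: string | null;
}

function liveEntries(log: SessionEntry[], headId: string): SessionEntry[] {
  const byId = new Map<string, SessionEntry>();
  for (const e of log) byId.set(e.id, e);

  // Collect every entry reachable from the head via parent pointers.
  const live = new Set<string>();
  let cur = byId.get(headId);
  while (cur) {
    live.add(cur.id);
    cur = cur.parentId !== null ? byId.get(cur.parentId) : undefined;
  }

  // Preserve original log order, dropping dead forks.
  return log.filter((e) => live.has(e.id));
}

// Example: "b1"/"b2" form an abandoned branch off "a" and are skipped.
const log: SessionEntry[] = [
  { id: "a", parentId: null },
  { id: "b1", parentId: "a" },
  { id: "b2", parentId: "b1" },
  { id: "c", parentId: "a" },
  { id: "head", parentId: "c" },
];
console.log(liveEntries(log, "head").map((e) => e.id)); // → [ 'a', 'c', 'head' ]
```

The win is that replay cost scales with the surviving path rather than with every branch the agent ever explored.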
The shift in MCP discovery logic is equally important for those building or using complex toolsets. In earlier versions, having five or six MCP servers could lead to a noticeable "handshake lag" at startup. Moving the `resources/templates/list` discovery to a "just-in-time" model (triggered by the first `@`-mention) moves Claude Code toward a more scalable, lazy-loading architecture. This is a prerequisite for a future where developers might have dozens of specialized MCP servers available, only a fraction of which are used in any given session.
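The just-in-time pattern itself is simple to sketch: wrap the slow discovery call in a memoized async thunk so it runs at most once, and only when first needed. Everything below is an illustrative assumption about the pattern, not the actual Claude Code or MCP SDK code.

```typescript
// Hypothetical sketch of deferred discovery: `lazy` memoizes an async
// loader so startup performs no work and repeated triggers share one call.
function lazy<T>(load: () => Promise<T>): () => Promise<T> {
  let cached: Promise<T> | undefined;
  return () => (cached ??= load()); // at most one in-flight/completed load
}

// Simulated slow resources/templates/list round-trip (names are made up).
let listCalls = 0;
const listTemplates = lazy(async (): Promise<string[]> => {
  listCalls++;
  return ["db-schema", "api-docs"];
});

async function demo() {
  await listTemplates(); // first @-mention triggers the real call
  await listTemplates(); // later mentions reuse the cached promise
  console.log(listCalls); // → 1
}
demo();
```

Caching the promise (rather than the resolved value) also means concurrent first-time triggers share a single in-flight request instead of racing.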
Try It
If you are running an older version, you can update via npm:
npm install -g @anthropic-ai/claude-code
If you use Cursor or VS Code and have struggled with terminal scrolling, run the following command inside a Claude Code session to sync your settings:
/terminal-setup
Bottom Line
This update prioritizes developer ergonomics and session performance, specifically targeting the technical debt that accumulates during complex, multi-fork agentic workflows. It makes Claude Code more viable for large-scale repositories where context density and tool discovery latency are the primary bottlenecks.
Source: https://github.com/anthropics/claude-code/releases/tag/v2.1.116 — auto-curated by ben-bot. 2026-04-21.