Claude Pro users woke up Tuesday to find Claude Code missing from the plan's feature list. Within twelve hours, the original PSA had crossed 2,300 upvotes and 669 comments.
Anthropic's response, posted by a staffer, framed the move as a test. "For clarity, we're running a small test on ~2% of new prosumer signups," the comment read. The crowd was not satisfied. A second thread, titled "An open letter to Anthropic," hit 1,042 upvotes from a Max-tier user laying out the trust problem.
The bigger story is the migration energy. r/LocalLLaMA's response thread, "Claude Code removed from Claude Pro plan — better time than ever to switch to Local Models," climbed past 980 upvotes. Top comment: "Time to switch to Kimi k2.6 guys if you haven't already." Local-model momentum has been building for weeks. This is the moment that turns it into a stampede.
What's actually happening
Anthropic's framing — a 2% A/B test — is technically true and politically tone-deaf. The Pro tier was the pricing sweet spot for indie builders and small agencies. Pulling Claude Code from it, even partially, signals where the company thinks the value is: in the higher tiers, with the higher-margin customers. That's a defensible business move. It is also the move that costs you the early adopters who built your reputation.
The competitive landscape makes it worse. Cursor still ships. OpenCode is gaining stars. Local Qwen3.6 setups now run useful coding agents on 8GB VRAM, and the Reddit threads documenting them are getting hundreds of upvotes. The substitutes are real.
What to watch
Three signals worth tracking this week. First, whether Anthropic rolls back the test or expands it — either move tells you everything. Second, GitHub stars on alternative coding agents. A spike on OpenCode or Aider this week is migration in action. Third, prosumer churn data, which Anthropic won't share but creators will leak through their own subscriber notes.
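The star-count signal is easy to automate. A minimal sketch using GitHub's public REST API: the repo slugs and the 25% spike threshold are placeholder assumptions, and the spike test is a naive comparison against the average of prior daily gains, not any published methodology:

```python
# Sketch of a star-spike tracker for alternative coding agents.
# Repo slugs and the threshold are illustrative placeholders.
import json
import urllib.request

REPOS = ["example-org/opencode", "example-org/aider"]  # hypothetical slugs

def fetch_stars(repo: str) -> int:
    """Read stargazers_count for one repo from the GitHub REST API."""
    with urllib.request.urlopen(f"https://api.github.com/repos/{repo}") as resp:
        return json.load(resp)["stargazers_count"]

def is_spike(daily_totals: list[int], threshold: float = 0.25) -> bool:
    """True when the latest day's gain beats the average prior gain by `threshold`."""
    gains = [b - a for a, b in zip(daily_totals, daily_totals[1:])]
    if len(gains) < 2:
        return False
    baseline = sum(gains[:-1]) / len(gains[:-1])
    return gains[-1] > baseline * (1 + threshold)

if __name__ == "__main__":
    # One fetch per day appended to a stored list would feed is_spike().
    for repo in REPOS:
        print(repo, fetch_stars(repo))
```

Run daily (unauthenticated requests are rate-limited to 60/hour), a week of totals is enough for is_spike() to separate steady organic growth from a migration-driven jump.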
The trust hit is real. The product is still excellent. Whether those two facts can coexist for another quarter is the open question.