Anthropic's 3.5GW TPU Deal: What It Means for Small Business

April 20, 2026 · Martin Bowling

Anthropic just locked in enough power for a small city

On April 7, Anthropic announced an expanded partnership with Google and Broadcom that will bring multiple gigawatts of new TPU capacity online starting in 2027. Broadcom’s SEC filing pegged the number at 3.5 gigawatts — roughly the electrical draw of three nuclear reactors, or enough to power about 2.6 million homes.

For the small business owner who uses Claude to write emails, or whose customer service chatbot runs on Claude under the hood, a chip deal announced for 2027 may sound like something to file under “not my problem.” It isn’t. Deals like this one are the reason the AI tools you pay for this year cost a fraction of what the same tools cost in 2023 — and why the 2027 version will be better and cheaper still.

What the deal actually involves

Anthropic’s compute strategy is three-legged: AWS Trainium, Google TPUs, and NVIDIA GPUs. This agreement deepens the Google leg. Broadcom designs the custom silicon that powers Google’s TPU systems, so the expanded deal ties all three companies together on manufacturing, hosting, and use.

Here are the details worth knowing:

  • 3.5 gigawatts of TPU capacity coming online starting in 2027, with the vast majority sited in the United States
  • Anthropic’s run-rate revenue has passed $30 billion, up from roughly $9 billion at the end of 2025
  • Over 1,000 enterprise customers now spend more than $1 million per year with Anthropic — double the 500-plus reported in February 2026
  • Extends Anthropic’s November 2025 pledge to invest $50 billion in U.S. AI infrastructure
  • Builds on Google Cloud TPU capacity that was announced in October 2025

Anthropic CFO Krishna Rao called the expansion “a continuation of our disciplined approach to scaling infrastructure.” Broadcom’s filing included a caveat worth noting: the full 3.5 GW is conditional on Anthropic’s continued commercial success, not guaranteed capacity.

Why a 2027 TPU deal matters to a 2026 business

AI tools run on chips. Chips run on electricity. The price you pay for a Claude-powered assistant, a Gemini-enhanced inbox, or a voice agent that answers your phone is ultimately a function of how expensive it is to serve your tokens. Every large deal like this one compresses that cost.

Token prices keep falling

Over the past two years, the cost to run frontier AI models has dropped sharply. Google cut Gemini’s serving costs by 78% in 2025. OpenAI has cut GPT-4-class pricing repeatedly. Claude’s per-token rates on Sonnet-class models are roughly a third of what they were when Claude 3 launched.

That trend is not coincidental. It is the direct result of more capacity hitting the market at the same time as better models squeeze more performance out of each watt. A 3.5 GW expansion in 2027 is another push in that direction. If you are paying a vendor today that wraps Claude, expect your margin room to widen — not shrink — over the next 24 months.
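To make the budgeting impact concrete, here is a small sketch of how a falling per-token rate changes a monthly bill. Every number in it is an illustrative assumption (token volume, today's rate, and the size of the drop), not actual vendor pricing:

```python
# Hypothetical forecast: how a falling per-token rate changes a monthly AI bill.
# All numbers are illustrative assumptions, not actual vendor pricing.

def monthly_cost(tokens_per_month: int, price_per_million: float) -> float:
    """Cost in dollars for a month of usage at a given per-million-token rate."""
    return tokens_per_month / 1_000_000 * price_per_million

TOKENS = 40_000_000   # assumed monthly token volume for a small business
PRICE_TODAY = 3.00    # assumed $/million tokens today
DROP = 0.40           # assume rates fall 40% by 2027 (midpoint of a 30-50% range)

today = monthly_cost(TOKENS, PRICE_TODAY)
forecast = monthly_cost(TOKENS, PRICE_TODAY * (1 - DROP))

print(f"Today:    ${today:,.2f}/month")
print(f"2027 est: ${forecast:,.2f}/month")
```

The point of running numbers like these is less the exact dollar figure and more the planning question they unlock: tasks that are marginal to automate at today's rate may be clearly worth it at the forecast rate.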

Diversified compute means fewer outages

Anthropic running on three chip families (Trainium, TPU, GPU) instead of one is a reliability story, not just a cost story. When one provider has an outage — and all of them eventually do — workloads can shift. For small businesses that depend on AI for customer-facing work, that matters more than most people realize. A restaurant with an AI phone agent taking reservations does not care which chip the model runs on. It cares that every call keeps getting answered.

The timing gap is the catch

The capacity comes online in 2027. The demand is here now. Anthropic, like every other frontier lab, is supply-constrained today and will likely stay that way until the new capacity arrives. That is why you occasionally see rate limits, slower responses during peak hours, or feature waitlists from AI providers. The deal does not fix that in the short term. It fixes it in 18 to 30 months.

What the critics are missing

Most of the coverage focuses on the gigawatt number and the revenue run-rate. The underreported angle is the shape of Anthropic’s customer base. Going from 500 to 1,000 enterprise accounts spending over $1 million annually in about two months is not a curiosity — it is a signal that the frontier AI market has stopped being a hobbyist experiment and started being a line item in corporate budgets.

For small businesses, that shift cuts two ways. Good news: the tooling, documentation, and integration partners around Claude and similar models will keep improving because enterprise money funds all of it. Uncomfortable news: the attention of the labs is on the million-dollar customers, not the $20-a-month ones. The gap between what a Fortune 500 deployment looks like and what a ten-person contractor shop can practically use is widening.

That gap is where Appalach.AI lives. Our AI Employees wrap frontier models inside workflows a small business can actually plug in on a Tuesday morning — not a six-month enterprise implementation.

What small businesses should actually do

You do not need to change anything today based on a 2027 deal. But a few habits will age well:

  1. Pick tools that are portable, not model-locked. If your chatbot vendor only supports one provider, ask what happens when pricing or capacity changes. The ones that can swap models behind the scenes will pass savings through to you.
  2. Budget for AI that gets cheaper, not more expensive. Rework your 2027 forecasting to assume token costs are 30 to 50% lower than today. That changes what is worth automating.
  3. Watch for bundle pricing. As compute comes online, expect providers to roll AI into existing SaaS plans rather than charge separately. Do not pay twice.
  4. Diversify your own critical workflows. If a single model going down would stop your business, treat that the same way you would treat a single point of failure in any other system.
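Points 1 and 4 above come down to the same design habit: put a thin layer between your business logic and any single model provider. The sketch below shows the idea; the provider names and "send" functions are placeholders, not real SDK calls:

```python
# Minimal sketch of a model-portable setup: all app code calls one function,
# and the provider behind it can be swapped via configuration.
# The provider names and fake "send" functions are placeholders, not real SDKs.

from typing import Callable, Dict

def claude_send(prompt: str) -> str:   # stand-in for a real Claude API call
    return f"[claude reply to: {prompt}]"

def backup_send(prompt: str) -> str:   # stand-in for an alternate provider
    return f"[backup reply to: {prompt}]"

PROVIDERS: Dict[str, Callable[[str], str]] = {
    "claude": claude_send,
    "backup": backup_send,
}

def ask(prompt: str, provider: str = "claude") -> str:
    """Every workflow goes through here; swapping providers is a config change."""
    return PROVIDERS[provider](prompt)

print(ask("Confirm my 7pm reservation"))             # primary provider today
print(ask("Confirm my 7pm reservation", "backup"))   # one-line fallback tomorrow
```

A vendor built this way can chase cheaper capacity as it comes online in 2027 and route around outages, which is exactly the question worth asking before you sign.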

The bottom line

Anthropic locking in 3.5 gigawatts of TPU capacity starting in 2027 is a story about AI getting cheaper and more reliable for every business that uses it, on a delay.

The tools you rent today are being subsidized by infrastructure investments that will not show up until 2027 and beyond. That is not a reason to wait. The businesses that start automating now will have two extra years of workflow refinement when the next cost cliff arrives.

Not sure where AI fits into your business? Get in touch — we help small businesses in Appalachia sort out what is worth adopting today from what is worth watching.

AI Tools · Industry News · Small Business