Fifteen cents. That was the price of one gigabyte of storage per month on Amazon S3 when it launched in 2006. Today the price is $0.023. Over 90 price cuts in 19 years. An 85% decline that traces the same cost curve as every infrastructure before it — generation technology improvements in electricity, fiber capacity explosions in telecom, Moore's Law in semiconductors. Jeff Bezos made the analogy explicit: enterprises should no more run their own data centers than run their own power plants. The comparison was strategic, not metaphorical. He was building Insull's model in software.
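The arithmetic behind that decline can be sketched in a few lines of Python. The two prices and the 19-year span come from the text above; the rest is straightforward compounding:

```python
# S3 standard storage, price per GB-month: $0.15 at launch (2006), $0.023 today.
launch_price = 0.15
current_price = 0.023
years = 19

total_decline = 1 - current_price / launch_price                     # ~0.847
annual_decline = 1 - (current_price / launch_price) ** (1 / years)   # ~0.094

print(f"total decline: {total_decline:.1%}")              # → total decline: 84.7%
print(f"compound annual decline: {annual_decline:.1%}")   # → compound annual decline: 9.4%
```

An 85% total decline works out to roughly a 9.4% compound annual price cut, sustained for nearly two decades.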
AWS as Insull
The economic logic is identical. Aggregate demand from thousands of customers onto shared infrastructure. Achieve unit costs no single enterprise could replicate internally. Pass some savings to customers. Attract more volume. Drive average costs down further. The virtuous cycle that Insull called the "gospel of consumption," Bezos called cloud computing.
Ninety-plus price cuts is not generosity. It is the natural behavior of an infrastructure provider riding a declining average cost curve: the more customers served, the lower the per-unit cost, and the rational response is to cut prices to attract more volume and drive costs lower still. This is the same dynamic that made Insull's electricity cheaper every year during the golden age. It is also the dynamic that, in electricity, eventually attracted regulatory attention. The golden age didn't end because the cost curve flattened; it ended because regulators decided the savings belonged to consumers, not shareholders, and put the utilities on the treadmill.
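The declining average cost curve has a simple form: a large fixed cost F spread over volume q, plus a small marginal cost m, gives an average cost of F/q + m, so every additional unit of volume lowers the per-unit cost for everyone. A minimal sketch with hypothetical numbers (none of these figures are AWS's actual costs):

```python
def avg_cost(fixed: float, marginal: float, volume: float) -> float:
    """Average cost per unit: fixed costs amortized over volume, plus marginal cost."""
    return fixed / volume + marginal

# Hypothetical infrastructure provider: $1B in fixed costs, $0.01 marginal cost per unit.
FIXED, MARGINAL = 1e9, 0.01

for volume in (1e10, 1e11, 1e12):
    print(f"volume {volume:.0e}: average cost ${avg_cost(FIXED, MARGINAL, volume):.3f}")
# → volume 1e+10: average cost $0.110
# → volume 1e+11: average cost $0.020
# → volume 1e+12: average cost $0.011
```

Average cost falls toward the marginal-cost floor as volume grows, which is why the rational response to new volume is a price cut: the cut attracts volume, and the volume funds the cut.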
The Commodity Layer
Basic cloud infrastructure — virtual machines, object storage, block storage, CDN — is genuinely commoditizing. On basic compute, AWS, Azure, and Google Cloud have converged to price parity within a few percentage points of one another. Workloads that aren't deeply embedded in a particular cloud's ecosystem can and do migrate based on price.
But the margins tell a more complicated story than the prices.
Read the gap. AWS's operating margin runs above 35%. Regulated utilities earn 9-10%. The gap represents the distance between a cloud provider in its early-maturity phase and a utility that has been on the treadmill for four decades. The pattern from prior eras says the gap closes. The evidence from this era says: not yet. And possibly not soon.
The Managed-Services Moat
Cloud providers are responding to commodity IaaS the way every infrastructure incumbent has responded to commodity pressure: by migrating up the value stack. AWS's Redshift, RDS, Lambda, SageMaker. Azure's Active Directory integration and Teams. Google's BigQuery and Vertex AI. Each is a proprietary service with switching costs far higher than those of basic compute. You can move a virtual machine between clouds in an afternoon. You cannot move a data warehouse built on Redshift with years of stored procedures, custom integrations, and organizational knowledge. The migration isn't a weekend project. It's a rewrite.
This is structurally identical to AT&T's strategy during the Bell System era — using monopoly revenues from the regulated local network to fund proprietary equipment and services in adjacent markets. Western Electric made all the phones. Bell Labs funded the research. The vertical stack captured value at every layer. Cloud providers are building the same stack, with managed services replacing manufactured equipment and machine learning replacing Bell Labs.
The critical question — the one the pattern from prior eras can inform but not answer — is whether the managed-services moat deepens faster than the commodity layer commoditizes. AWS's stable-to-expanding margins suggest it does. The IaaS floor keeps falling, but the revenue mix keeps shifting toward higher-margin services. The claim that "the treadmill is coming for cloud" is pattern-matching dressed as prediction. The honest version: cloud faces competitive compression only if commodity compute commoditizes faster than managed services lock customers in. Nineteen years of evidence runs the other direction.
That doesn't mean the treadmill won't arrive. It means it hasn't yet, and the mechanism — if it comes — will be competitive, not regulatory. This is the distinction that determines everything about the outcome. Regulatory compression is gradual and floor-bounded. Competitive compression is rapid and floor-less. Cloud will face one or the other. Which one it faces depends on whether the EU Data Act, FTC inquiries, and CISA concerns about hyperscaler concentration crystallize into rate-of-return-style regulation — which would actually protect cloud margins by guaranteeing a return — or remain as competitive scrutiny that leaves margins to the market.
The AI Layer
This is where the 140-year framework reaches its limit.
AI infrastructure is somewhere between Stage 1 and Stage 2 of the infrastructure cycle — after the competitive chaos of the early model-training era, during the consolidation phase where a small number of hyperscaler-backed labs with access to sufficient capital are pulling away. The cost structure fits the pattern: high fixed costs (training runs in the hundreds of millions of dollars), low marginal costs (inference per query is small relative to training investment), and ecosystem lock-in (developers building on one API accumulate switching costs in their workflows and institutional knowledge).
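That fixed-versus-marginal split can be made concrete. A sketch with hypothetical numbers — the nine-figure training cost matches the order of magnitude stated above, but the inference cost and query volumes are invented for illustration:

```python
def cost_per_query(training_cost: float, inference_cost: float, total_queries: float) -> float:
    """All-in cost per query: training run amortized over queries served, plus per-query inference."""
    return training_cost / total_queries + inference_cost

TRAINING = 300e6    # hypothetical $300M training run ("hundreds of millions")
INFERENCE = 0.002   # hypothetical marginal cost per query served

for queries in (1e9, 1e10, 1e11):
    print(f"{queries:.0e} queries: ${cost_per_query(TRAINING, INFERENCE, queries):.4f} per query")
# → 1e+09 queries: $0.3020 per query
# → 1e+10 queries: $0.0320 per query
# → 1e+11 queries: $0.0050 per query
```

Same shape as the utility curve: the fixed cost dominates at low volume, and the per-query cost collapses toward the marginal inference cost as volume grows, the classic precondition for consolidation.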
The cost structure fits. The market structure doesn't.
Every infrastructure in this series had a clear directionality of market power: one dominant provider (or a small oligopoly of providers) selling to many customers. Edison sold electricity to homes. AT&T sold phone service to subscribers. AWS sells compute to enterprises. The power flows downstream from provider to customer.
AI infrastructure has power flowing in multiple directions simultaneously.
| Layer | Entity | Market Power | Position |
|---|---|---|---|
| Chips | Nvidia | Monopolist (training GPUs) | Seller |
| Cloud compute | AWS / Azure / GCP | Oligopolists | Seller to AI labs, competitor to AI labs |
| Model training | OpenAI / Anthropic / Google | Oligopsonists (few large buyers of compute, talent) | Buyer of cloud, seller of API |
| API access | Same labs | Emerging oligopolists | Seller to developers |
Read the middle two rows together. The cloud providers are simultaneously the infrastructure suppliers to AI labs AND their competitors in the API market. Google sells Anthropic TPU capacity through a multi-gigawatt deal and competes with Anthropic through Gemini. Amazon hosts Anthropic on AWS and competes through its own models. Microsoft hosts OpenAI on Azure and competes through Copilot. There is no prior infrastructure era where the infrastructure provider was simultaneously the customer's landlord and competitor. The closest analog is the Bell System after the Carterfone decision — AT&T was forced to allow non-Bell devices on its network while competing against those same devices with Western Electric equipment. The structural tension eventually broke the system apart.
The Ownership Anomaly
There is one specific way in which AI infrastructure is genuinely novel.
Every infrastructure monopolist in this series owned physical assets. Insull owned generators and wires. AT&T owned copper loops and switches. The Baby Bells owned poles. AWS owns data centers. Physical assets can be rate-based — regulators can measure them, value them, and set a return on them. The entire regulatory architecture of American infrastructure, from the 1907 Wisconsin public utility commission to the FCC's Title II classifications, is built on the premise that the regulated entity owns the bottleneck infrastructure.
The dominant AI labs — Anthropic, OpenAI — own no data centers. No fiber. No spectrum. No physical infrastructure at all. Anthropic's market power, such as it is, derives entirely from intellectual property and model quality. You cannot rate-base a neural network. You cannot set a regulated return on a training run. The regulatory toolkit built for physical infrastructure does not apply to an entity whose monopoly power — if that's even the right word — lives in weights and parameters.
This matters for two reasons. First, this power is more fragile than a physical infrastructure monopoly's. Open-source models — Llama, Mistral, DeepSeek — provide competitive pressure that Insull's grid never faced. Nobody open-sourced a power plant. Nobody built a free alternative to the Bell copper loop. AI models can be replicated, distilled, fine-tuned. The moat is model quality and iteration speed, not physical exclusivity. That moat can erode.
Second, AI models are differentiable in a way that electricity and telephone service are not. A kilowatt-hour is a kilowatt-hour. A dial tone is a dial tone. Claude and GPT are not interchangeable — developers choose between them based on capabilities, cost, latency, and personality (a word that has no meaning in prior infrastructure). The utility framework assumes a fungible commodity. AI is not a fungible commodity. This is the deepest break with the pattern.
The cost structure says utility. The product structure says something else. The gap between those two claims is where the next decade of AI economics will be decided.
Fifteen Cents
Fifteen cents was the price when cloud was a product — when S3 was a feature that AWS offered and enterprises could take or leave. Two-point-three cents is the price now that cloud is a layer — the thing everything else sits on, the infrastructure you notice only when it's absent.
The 85% decline follows the cost curve of every infrastructure in this series. The margin premium — still above 35% for AWS, against 9-10% for regulated utilities — measures the distance between where cloud is and where the pattern says it's going. Whether that distance closes through regulation (gradual, floor-bounded, and actually protective of margins) or through competition (rapid, floor-less, and potentially devastating) is the open question for cloud.
For AI, the question is different and harder. The cost structure of AI infrastructure maps to the pattern. The market structure — bilateral oligopoly, no physical asset ownership, differentiable products, open-source competitive pressure — does not. The 140-year base rate says: consolidation, then compression. The structural anomalies say: possibly, but through mechanisms the framework doesn't predict, at a speed the framework can't estimate, and with outcomes the framework can't specify.
The clock is running. Fifteen cents to two-point-three cents took 19 years. The cycle is compressing. But knowing that the clock runs faster tells you the direction, not the destination. And for the first time in 140 years of American infrastructure economics, the destination is genuinely uncertain.
Part of the Infrastructure Economics series