10 Comments
Aaron Scher

> And, while most data centers need to be located near to their key users, to reduce latency, AI data centers are often located where energy and land are cheap

I think it's often overstated how important it is for data centers to be near users, including in this piece. Just look at the actual latency numbers, e.g., https://wondernetwork.com/pings/San%20Francisco. From SF to the East Coast is sub-100ms (round-trip); from SF to New Orleans is ~51ms. This is just not enough to matter for *most* LLM uses we see today; real-time things like translation or voice are the exception. Maybe uses will change, but I expect the vast majority of LLM use won't be latency-sensitive on the order of 1/20th of a second.

For reference on how much 50ms is: ChatGPT, as measured by Artificial Analysis (https://artificialanalysis.ai/models/gpt-5-chatgpt/providers), has ~530ms time to first token (latency), and many small models are also in the 0.3-1s range.
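
To make the comparison concrete, here's a rough back-of-the-envelope in Python, using the ~51ms and ~530ms figures cited above (illustrative numbers, not live measurements):

```python
# Compare cross-country network round-trip time to LLM time-to-first-token.
# Both figures are the illustrative ones cited above, not live measurements.
round_trip_sf_to_new_orleans_ms = 51   # WonderNetwork ping, round trip
time_to_first_token_ms = 530           # ChatGPT TTFT per Artificial Analysis

overhead = round_trip_sf_to_new_orleans_ms / time_to_first_token_ms
print(f"Cross-country round trip adds ~{overhead:.0%} on top of time to first token")
# -> Cross-country round trip adds ~10% on top of time to first token
```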

Shakeel Hashim

Interesting -- thanks for the analysis.

Patrick Mathieson

Some really good points in here. But I am puzzled as to why so many people, including the author, are prepared to dismiss the “$160B/320B per year over a decade” possibility.

If you add up all direct AI ARR (OpenAI/Anthropic/Cursor/etc.), that number will have gone from below $10B to more than $30B over the course of 2025. (This is before even adding things like Meta revenues, given that some of their massive advertising income is at least partially powered by GPU workloads. We can set that aside for the moment, even though it might make total AI revenues many multiples higher.)

It would not surprise me if these direct AI revenues went from $30B to $75B to $150B to $250B to $375B or something over the next five years, given the way the AI models are improving and spreading via increasingly strong business applications. Why then would we view the prospect of $1.6T-3.2T over a decade as outlandish? If anything it strikes me as quite likely that we clear that hurdle.

(EDIT: I recognize that we'd need a separate set of revenues to pay back 2026 capex, and then 2027 capex, and so on. That's a higher hurdle to clear. But I'm just addressing the take I see over and over, which is that $x trillion of revenue over the next decade is not going to materialize.)
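
A quick back-of-the-envelope on that trajectory (the figures after year five are my own assumption; I've simply held revenue flat, which is if anything conservative):

```python
# Back-of-the-envelope: does the revenue ramp sketched above clear $1.6T-3.2T
# over a decade? Years 6-10 are a made-up assumption (revenue merely plateaus).
trajectory_bn = [30, 75, 150, 250, 375]   # next five years, as guessed above ($B)
plateau_bn = [375] * 5                    # assume no further growth after year five

decade_total_bn = sum(trajectory_bn) + sum(plateau_bn)
print(f"Decade total: ~${decade_total_bn / 1000:.1f}T")
# -> Decade total: ~$2.8T
```

Even with that very conservative tail, the cumulative total lands inside the $1.6T-3.2T band, which is why the hurdle doesn't strike me as outlandish.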

Shakeel Hashim

Agreed on this — I personally think it's more plausible than not that the revenue keeps increasing as needed. But it's interesting to think about what might happen if it doesn't — especially as, as you say, the revenue growth might be contingent on consistent model improvement, which I consider very likely but which could easily turn out not to happen.

Patrick Mathieson

Totally. If the revenue doesn’t materialize then all of this is a disaster.

Still, this thread helped me distill a litmus test of sorts… how many years do we think it will take for AI revenues to 10x from where they are now? Somebody’s answer to that question should be predictive of whether they are bullish or bearish on the ecosystem….

Patrick Mathieson

This chat prompted me to sketch out some thoughts: https://thedownround.substack.com/p/how-many-years-until-ai-revenues

tl;dr I think we 10x in 3 years.
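
For anyone who wants the implied growth rates behind that litmus test, a quick sketch:

```python
# Implied compound annual growth rate for a 10x over N years: 10 ** (1/N) - 1.
for years in (3, 5, 7):
    cagr = 10 ** (1 / years) - 1
    print(f"10x in {years} years implies ~{cagr:.0%} annual growth")
# 10x in 3 years implies ~115% annual growth
# 10x in 5 years implies ~58% annual growth
# 10x in 7 years implies ~39% annual growth
```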

Patrick Mathieson

Also, just a side note, but I still don't understand why Kuppy's math presumes that 100% of AMZN/MSFT/META/GOOGL capex is going into AI data centers. Plenty of that $400B goes into other capital equipment unrelated to AI (AWS, for example, requires plenty of refreshed equipment to power its $120B/yr of plain old cloud workloads).
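
As a toy illustration of why the split matters (the 70% AI share below is a made-up placeholder, not a claim about the real mix):

```python
# Toy illustration: the capex you need AI revenue to pay back scales with whatever
# fraction of big-tech capex is actually AI-specific. The 70% share is made up.
total_capex_bn = 400    # combined AMZN/MSFT/META/GOOGL capex figure cited above
ai_share = 0.70         # hypothetical AI-specific fraction

ai_capex_bn = total_capex_bn * ai_share
print(f"AI-specific capex: ~${ai_capex_bn:.0f}B, vs. ${total_capex_bn}B if you assume 100%")
# -> AI-specific capex: ~$280B, vs. $400B if you assume 100%
```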

Frank D

But only if there isn't increased unemployment and downsizing? Who are the customers if the share of society with money to spend diminishes? Workforces are going to go through a shake-up over the next two years. AGI is a promise of what, exactly? More replacement? The talked-up new roles are not quite materialising today, because many people are still catching up with what AI can do for them. UBI is a pipe dream, theatre with no seriousness behind it. The people funding AI are the uber-wealthy or mega pension funds. Could China be the pin that pops the US bubble?

HansKu

But GPUs can be used for longer than 3-5 years. It's just that they're then outdated, and there's only pressure to replace them if there's an economic reason to do so. If there is less demand than anticipated, it's also not pressing to replace the GPUs. Also, how would such a bubble burst in the first place? Sure, some investors in speculative start-up stocks might get burned if the ROI of AI is lower than they expect, but broadly, most of the larger AI players making these huge investments a) have to in order to stay competitive and b) have very valuable business models aside from AI. In my opinion, there will be a soft landing once people realize this.
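
A small sketch of why the assumed useful life matters (the $100B fleet cost is an arbitrary placeholder, purely for illustration):

```python
# Straight-line depreciation of a GPU fleet under different useful-life assumptions.
# The $100B fleet cost is an arbitrary placeholder for illustration.
fleet_cost_bn = 100
for useful_life_years in (3, 5, 7):
    annual_depreciation_bn = fleet_cost_bn / useful_life_years
    print(f"{useful_life_years}-year life: ~${annual_depreciation_bn:.0f}B/year in depreciation")
# Running the same hardware for 7 years instead of 3 cuts the annual charge by more than half.
```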

Daniel Popescu / ⧉ Pluralisk

Perfect timing, this. The hype cycle is unsustainable.
