There would be a supply crunch, but a lot of dollars would be shuffled VERY fast to ramp up production.
If something even more drastic happens, China might even attempt unification, with reasoning like protecting Taiwan from the USA or other nations.
When people make these kinds of predictions, they're often driven by emotional reaction. The best way to switch from gut feeling to actually evaluating a hypothesis is to make the risks actually cost something.
I think the bigger problems with the AI bubble are energy, and the terrible reputation it's gaining for being the excuse for mass layoffs while suffocating the Internet with slop/brainrot content. All while depending on government funding to grow.
Maybe I’m missing something, but isn’t this just a standard American put option with a strike of $100 and expiry of Dec 31st?
He's answering the question "How should options be priced?"
Sure, it's possible for a big crash in Nvidia just due to volatility. But in that case, the market as a whole would likely be affected.
Whether Nvidia specifically takes a big dive depends much more on whether they continue to meet growth estimates than general volatility. If they miss earnings estimates in a meaningful way the market is going to take the stock behind the shed and shoot it. If they continue to exceed estimates the stock will probably go up or at least keep its present valuation.
Other way around: if NVidia sinks, it likely takes a bunch of dependent companies with it, because the likely causes of NVidia sinking all tell us that there was indeed an AI bubble and it is popping.
They are maintaining this astronomical growth through data center margins driven by the design of their chips, and all of that started from graphics for video games.
No? That’s why they have almost no competition. Hardware starting costs are astronomical
Are they already "too big to fail"? For better or worse, they are 'all in' on AI.
My 30k ft view is that the stock will inevitably slide as AI datacenter spending goes down. Right now Nvidia is flying high because datacenters are breaking ground everywhere but eventually that will come to an end as the supply of compute goes up.
The counterargument to this is that the "economic lifespan" of an Nvidia GPU is 1-3 years depending on where it's used, so there's a case to be made that Nvidia will always have customers coming back for the latest and greatest chips. The problem I have with this argument is that it's simply unsustainable to be spending that much every 2-3 years, and we're already seeing this as Google and others are extending their depreciation of GPUs to something like 5-7 years.
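To see why that depreciation stretch matters, here's a back-of-the-envelope sketch with made-up numbers (the $10B fleet cost and the schedules are purely illustrative, not anything Google or anyone else has disclosed), using simple straight-line depreciation:

    def annual_depreciation(purchase_price, useful_life_years):
        # Straight-line depreciation, no salvage value assumed.
        return purchase_price / useful_life_years

    fleet_cost = 10_000_000_000  # hypothetical $10B worth of GPUs
    for years in (3, 5, 7):
        print(f"{years}-year schedule: ${annual_depreciation(fleet_cost, years) / 1e9:.2f}B/year expense")

Stretching the schedule doesn't change the cash already spent; it just spreads the reported expense over more years, which is exactly why longer assumed lifespans make the current spend look more sustainable on paper.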
Which is absolutely the right move when your latest datacenter's power bill is literally measured in gigawatts. Power-efficient training/inference hardware simply does not look like a GPU at a hardware design level (though admittedly, it looks even less like an ordinary CPU). It's more like something that should run dog slow wrt. max design frequency but then more than make up for that with extreme throughput per watt and low energy expense per elementary operation.
The whole sector of "neuromorphic" hardware design has long shown the broad feasibility of this (and TPUs are already a partial step in that direction), so it looks like this should be an obvious response to current trends in power and cooling demands for big AI workloads.
However, it’s beyond my comprehension how anyone would think that we will see a decline in demand growth for compute.
AI will conquer the world like software or the smartphone did. It’ll get implemented everywhere, more people will use it. We’re super early in the penetration so far.
While thinking computers will replace human brains soon is rabid fanaticism, this statement...
> AI will conquer the world like software or the smartphone did.
Also displays a healthy amount of fanaticism.
As far as AI conquering the world. It needs a "killer app". I don't think we'll really see that until AR glasses that happen to include AI. If it can have context about your day, take action on your behalf, and have the same battery life as a smartphone...
But yes. Cisco's value dropped when there was no longer the same amount being spent on networking gear. Nvidia's value will drop when there is no longer the same amount of spend on their gear.
Other impacted players in an actual economic downturn could be Amazon with AWS and MS with Azure. And even more so those now betting on AI computing. At least general-purpose computing can run web servers.
If it had given me the right, easy-to-understand answer right away, I would have spent 2 minutes of both MY time and ITS time. My point is, if AI improves, we will need less of it to get our questions answered. Or, perhaps AI usage goes up if it improves its answers?
The data very strongly shows that the quality of AI answers is rapidly improving. If you want a good example, check out the Sixty Symbols video by Brady Haran, where they revisited getting AI to answer a quantum physics exam after trying the same thing 3 years ago. The improvement is IMMENSE and unavoidable.
Doesn't mean that crypto is not being used, of course. Plenty of people do use things like USDT, gamble on bitcoin, or try to scam people with new meme coins, but this is far from what crypto enthusiasts and NFT moguls promised us in their feverish posts back in the mid-2010s.
So imagine that AI is here to stay, but the absolutely unhinged hype train will slow down and we will settle into some kind of equilibrium of practical use.
AI is different and businesses are already using it a lot. Of course there is hype, it’s not doing all the things the talking heads said but it does not mean immense value is not being generated.
It's like if your taxi company bought taxis that were more fuel efficient every year.
You kind of have to.
Replacing cars every 3 years vs a couple % in efficiency is not an obvious trade-off. Especially if you can do it every 5 years instead of 3.
It can make sense at a certain scale, but it's a non-trivial amount of cost and effort for potentially marginal returns.
Isn't that precisely how leasing works? Also, don't companies prefer not to own hardware for tax purposes? I've worked for several places where they leased compute equipment with upgrades coming at the end of each lease.
That's where the analogy breaks. There are massive efficiency gains from new process nodes, which new GPUs use. Efficiency improvements for cars are glacial, aside from "breakthroughs" like hybrid/EV cars.
It's not like the CUDA advantage is going anywhere overnight, either.
Also, if Nvidia invests in its users and in the infrastructure layouts, it gets to see upside no matter what happens.
I have not seen hard data, so this could be an oft-repeated but false fact.
If this was anywhere close to a common failure mode, I'm pretty sure we'd know that already given how crypto mining GPUs were usually run to the max in makeshift settings with woefully inadequate cooling and environmental control. The overwhelming anecdotal evidence from people who have bought them is that even a "worn" crypto GPU is absolutely fine.
Another commonly forgotten issue is that many electrical components are rated by hours of operation. And cheaper boards tend to use components with smaller safety margins. That rated time is actually a graph, where hours decrease with higher temperature. There have been instances of batches of cards failing due to failing MOSFETs, for example.
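To illustrate the "rated hours is really a curve" point, here's a rough sketch using a common rule of thumb for electrolytic capacitors (rated life roughly halves for every 10 C above the rating temperature). The 5,000-hour / 105 C part and the board temperatures are illustrative numbers only, not a claim about any specific card:

    def estimated_life_hours(rated_hours, rated_temp_c, operating_temp_c):
        # Rule of thumb: rated life roughly doubles for every 10 C below the rated temperature.
        return rated_hours * 2 ** ((rated_temp_c - operating_temp_c) / 10.0)

    # A hypothetical 5,000-hour @ 105 C capacitor at different board temperatures:
    for temp_c in (65, 85, 105):
        print(f"{temp_c} C: ~{estimated_life_hours(5000, 105, temp_c):,.0f} hours")

Same component, wildly different expected life depending on how hot it's run, which is why a card that lived in a well-cooled rack and one that baked in a makeshift mining shed are not the same used product.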
This doesn't mean much for inference, but for training, it is going to be huge.
(1) We simply don't know what the useful life is going to be, given how new AI-focused GPUs used for training and inference are.
(2) Warranties and service. Most enterprise hardware has service contracts tied to purchases. I haven't seen anything publicly disclosed about what these contracts look like, but the speculation is that they are much more aggressive (3 years or less) than typical enterprise hardware contracts (Dell, HP, etc.). Once hardware is past those contracts, the extended support contracts typically get really pricey.
(3) Power efficiency. If new GPUs are more power efficient, the savings on energy alone could justify upgrades.
I worked at a few data centers on and off in my career. I got lots of hardware for free or on the cheap simply because the hardware was considered "EOL" after about 3 years, often when the support contracts with the vendor end.
There are a few things to consider.
Hardware that ages produces more errors, and those errors cost you, one way or another.
Rack space is limited. A perfectly fine machine that consumes 2x the power for half the output still has a cost. It's cheaper to upgrade a perfectly fine working system simply because the replacement performs better per watt in the same space (a rough sketch of the math follows below).
Lastly, there are tax implications in buying new hardware that can often favor replacement.
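To put rough numbers on the power/rack-space point, here's a sketch where every figure is invented for illustration (1 kW vs 0.5 kW draw, 50 vs 100 work units per hour, $0.10/kWh), showing what "half the output at twice the power" does to the energy cost per unit of work:

    def yearly_cost_per_unit(power_kw, work_units_per_hour, kwh_price=0.10):
        hours = 24 * 365
        energy_cost = power_kw * hours * kwh_price          # yearly electricity bill
        return energy_cost / (work_units_per_hour * hours)  # $ per unit of work

    old = yearly_cost_per_unit(power_kw=1.0, work_units_per_hour=50)
    new = yearly_cost_per_unit(power_kw=0.5, work_units_per_hour=100)
    print(f"old: ${old:.4f}/unit, new: ${new:.4f}/unit ({old / new:.0f}x the cost per unit)")

Even before counting rack rent and cooling, the old box comes out 4x more expensive per unit of work, which is the gap that makes replacing "perfectly fine" hardware rational.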
But no, there's none to be found; it is a 4-year-old, two-generations-old machine at this point, and you can't buy one used at a rate cheaper than new.
Their stock trajectory started with one boom (cryptocurrencies) and then seamlessly progressed to another (AI). You're basically looking at a decade of "number goes up". So yeah, it will probably come down eventually (or the inflation will catch up), but it's a poor argument for betting against them right now.
Meanwhile, the investors who were "wrong" anticipating a cryptocurrency revolution and who bought NVDA don't have much to complain about today.
I do wonder what people back then would have thought the reasoning was for them to increase in value this much; probably they would have just assumed it was still crypto related.
Technical analysis fails completely when there's an underlying shift that moves the line. You can't look at the past and say "nvidia is clearly overvalued at $10 because it was $3 for years earlier" when they suddenly and repeatedly 10x earnings over many quarters.
I couldn't get through to the idiots on reddit.com/r/stocks about this when there was non-stop negativity on Nvidia based on technical analysis and very narrow-scoped fundamental analysis. They showed a 12x gain in quarterly earnings at the time, but the PE (which looks at past quarters only) was 260x due to this sudden change in earnings, and pretty much all of reddit couldn't get past this.
I did well on this yet there were endless posts of "Nvidia is the easiest short ever" when it was ~$40 pre-split.
edit: 2025* not 2024
I don't know if that's non-rational, or if people can't be expected to read the second sentence of an announcement before panicking.
Once the money dries up, a new bubble will be invented to capture middle-class income, like NFTs and crypto before that, and commissionless stocks, etc etc
It’s not all pump-and-dump. Again, this is a pretty reductive take on market forces. I’m just saying I don’t think it’s quite as unsustainable as you might think.
All hypothetical, of course, but to me that's the most convincing bear case I've heard for NVIDIA.
Or, you know, when LLMs don't pay off.
Exactly, it is currently priced as though infinite GPUs are required indefinitely. Eventually most of the data centres and the gamers will have their GPUs, and demand will certainly decrease.
Before that, though, the data centres will likely fail to be built in full. Investors will eventually figure out that LLMs are still not profitable, no matter how many data centres you produce. People are only interested in the derivative products at a price lower than what it costs to run them. The math ain't mathin'.
The longer it takes to get them all built, the more exposed they all are. Even if it turns out to be profitable, taking three years to build a data centre rather than one year is significant, as profit for these high-tech components falls off over time. And how many AI data centres do we really need?
I would go further and say that these long and complex supply chains are quite brittle. In 2019, a 13-minute power cut caused a loss of 10 weeks of memory stock [1]. Normally, the shops and warehouses act as a capacitor and can absorb small supply chain ripples. But now that these components are being piped straight to data centres, they are far more sensitive to blips. What about a small issue in the silicon that means you damage large amounts of your stock trying to run it at full power, through something like electromigration [2]? Or a random war...?
> The counterargument to this is that the "economic lifespan" of an Nvidia GPU is 1-3 years depending on where it's used so there's a case to be made that Nvidia will always have customers coming back for the latest and greatest chips. The problem I have with this argument is that it's simply unsustainable to be spending that much every 2-3 years and we're already seeing this as Google and others are extending their depreciation of GPU's to something like 5-7 years.
Yep. Nothing about this adds up. Existing data centres with proper infrastructure are being forced to extend use for previously uneconomical hardware because new data centres currently building infrastructure have run the price up so high. If Google really thought this new hardware was going to be so profitable, they would have bought it all up.
[1] https://blocksandfiles.com/2019/06/28/power-cut-flash-chip-p...
[2] https://www.pcworld.com/article/2415697/intels-crashing-13th...
Isn’t this entirely dependent on the economic value of the AI workloads? It all depends on whether AI work is more valuable than that cost. I can easily see arguments why it won’t be that valuable, but if it is, then that cost will be sustainable.
The math looks bad regardless of which way the industry goes, too. A successful AI industry has a vested interest in bespoke hardware to build better models, faster. A stalled AI industry would want custom hardware to bring down costs and reduce external reliance on competitors. A failed AI industry needs no GPUs at all, and an inference-focused industry definitely wants custom hardware, not general-purpose GPUs.
So nVidia is capitalizing on a bubble, which you could argue is the right move under such market conditions. The problem is that they’re also alienating their core customer base (smaller datacenters, HPC, gaming market) in the present, which will impact future growth.

Their GPUs are scarce and overpriced relative to performance, which itself has remained a near-direct function of increased power input rather than efficiency or meaningful improvements. Their software solutions - DLSS frame-generation, ray reconstruction, etc - are locked to their cards, but competitors can and have made equivalent-performing solutions of their own with varying degrees of success. This means it’s no longer necessary to have an nVidia GPU to, say, crunch scientific workloads or render UHD game experiences, which in turn means we can utilize cheaper hardware for similar results. Rubbing salt in the wound, they’re making cards even more expensive by unbundling memory and clamping down on AIB designs.

Their competition - Intel and AMD primarily - are happily enjoying the scarcity of nVidia cards and reaping the fiscal rewards, however meager they are compared to AI at present. AMD in particular is sitting pretty, powering four of the five present-gen consoles, the Steam Deck (and copycats), and the Steam Machine, not to mention outfits like Framework; if you need a smol but capable boxen on the (relative) cheap, what used to be nVidia + ARM is now just AMD (and soon, Intel, if they can stick the landing with their new iGPUs).
The business fundamentals paint a picture of cannibalizing one’s evergreen customers in favor of repeated fads (crypto and AI), and years of doing so has left those customer markets devastated and bitter at nVidia’s antics. Short of a new series of GPUs with immense performance gains at lower price and power points, with availability to meet demand, my personal read is that this is merely Jensen Huang’s explosive send-off before handing the bag over to some new sap (and shareholders) once the party inevitably ends, one way or another.
How do you use fundamental analysis to assign a probability to Nvidia closing under $100 this year, and what probability do you assign to that outcome?
I'd love to hear your reasoning around specifics to get better at it.
Still, it's interesting the probability is so high while ignoring real-world factors. I'd expect it to be much higher due to:

- another adjacent company dipping
- some earnings target not being met
- China/Taiwan
- just the AI craze slowing down
To put it another way, to price an option I need a) the current price of the underlying, b) the time until option expiry, c) the strike price of the option, and d) the collective expectation of how much the underlying's price will vary over the period between now and expiry. This last piece is "volatility", and is the only piece that can't be empirically measured; instead, through price discovery on a sufficiently liquid contract, we can reparameterize the formula to empirically derive the volatility expectation which satisfies that current price (or "implied volatility"). Due to the efficient market hypothesis, we can generally treat this as a best-effort proxy for all public information about the underlying. None of this calculation requires any measurement or analysis of the underlying's past price action, patterns, etc. The options price will necessarily include TA traders' sentiments about the underlying based on their TA (or whatever else), just as it will include fundamentals traders' sentiments (and, if you're quick and savvy enough, insiders' advance knowledge!) The price fundamentally reflects market sentiment about the future, not some projection of trends from the past.
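As a concrete (and heavily simplified) sketch of that workflow: the snippet below prices a European put with Black-Scholes rather than the American contract actually discussed above, backs out implied volatility from a quoted price by bisection, and then reads off the risk-neutral probability of finishing below the strike as N(-d2). All inputs (spot, rate, time to expiry, quoted price) are made-up illustrative numbers, not real quotes.

    import math

    def norm_cdf(x):
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

    def bs_put(S, K, T, r, sigma):
        # Black-Scholes price of a *European* put (a simplification of the American put).
        d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
        d2 = d1 - sigma * math.sqrt(T)
        return K * math.exp(-r * T) * norm_cdf(-d2) - S * norm_cdf(-d1)

    def implied_vol(quote, S, K, T, r, lo=1e-4, hi=5.0):
        # Bisection works because the put price is monotonically increasing in sigma.
        for _ in range(200):
            mid = 0.5 * (lo + hi)
            if bs_put(S, K, T, r, mid) < quote:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    def prob_below_strike(S, K, T, r, sigma):
        # Risk-neutral probability that the underlying finishes below K: N(-d2).
        d2 = (math.log(S / K) + (r - 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
        return norm_cdf(-d2)

    # Made-up illustrative inputs: spot 180, strike 100, ~0.4 years to Dec 31st,
    # 4% risk-free rate, quoted put price of $0.60.
    S, K, T, r, quote = 180.0, 100.0, 0.4, 0.04, 0.60
    iv = implied_vol(quote, S, K, T, r)
    print(f"implied vol ~ {iv:.1%}")
    print(f"risk-neutral P(close < $100) ~ {prob_below_strike(S, K, T, r, iv):.1%}")

Under this simplified view, N(-d2) is the market-implied (risk-neutral) probability of the stock finishing below $100, which is not the same thing as a real-world probability but is a useful benchmark to compare against a fundamentals-based estimate like the one asked for above.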
Everything that can't go on forever will eventually stop. But when?
I do hope they crash so that I can buy as much as possible at a discount.
Nvidia stock crash will happen when the vendor financing bubble bursts.
They are engaged in a dangerous game of circular financing. So it is a case of when, not if, the chickens come home to roost.
It is simply not sustainable.
The only way the stock could remain at its current price or grow (which is why you'd hold it) is if demand just keeps going up (with the same lifecycle as current GPUs) and there is no competition, and the latter, to me, is just never going to be a thing.
Investors are convinced that Nvidia can maintain its lead because they have the "software" side, i.e. CUDA, which to me is so ridiculous; as if, with the kind of capital that's being deployed into these datacenters, you couldn't fit your models into other software stacks by hiring people...
assuming LLM coding agents are good, but if they aren't any good, then what is the value of the CUDA code?
My personal opinion, having witnessed first hand nearly 40 years of tech evolution, is that this AI revolution is different. We're at the very beginning of a true paradigm shift: the commoditization of intelligence. If that's not enough to make people think twice before betting against it, I don't know what is. And it's not just computing that is going to change. Everything is about to change, for better or worse.
Additionally, they mentioned that customers can cancel purchases with little notice and without penalty [2].
This is not unique to hardware companies, but think about what it means that all it takes is one of these customers pulling back to knock 12% (~$14B) off their sales.
To cut to the point, my guess is that Nvidia is not sustainable, and at some point one or more of these big customers won't be able to keep up with the big orders, which will cause Nvidia to miss earnings, and then it will burst. But maybe I'm wrong here.
[1] https://s201.q4cdn.com/141608511/files/doc_financials/2025/a..., page 155: > Sales to direct Customers A, B and C represented 12%, 11% and 11% of total revenue, respectively, for fiscal year 2025.
[2] same, page 116: > Because most of our sales are made on a purchase order basis, our customers can generally cancel, change, or delay product purchase commitments with little notice to us and without penalty.