(Apple is well known for shoving "lesser vendors" out of the way at TSMC)
Nvidia is the high-frequency trader hammering the newest node until the arb closes. Stability usually trades at a discount during a boom, but Wei knows the smartphone replacement cycle is the only predictable cash flow. Apple is smart. If the AI capex cycle flattens in late '27 as models hit diminishing returns, does Apple regain pricing power simply by being the only customer that can guarantee wafer commits five years out?
Nvidia's willingness to pay exorbitant prices for early 2nm wafers subsidizes the R&D and the brutal yield-learning curve for the entire node. But you can't run a sustainable gigafab solely on GPUs...the defect density math is too punishing. You need a high-volume, smaller-die customer (Apple) to come in 18 months later, soak up the remaining 90% of capacity and amortize that depreciation schedule over a decade.
Apple OTOH operates at consumer electronics price points. They need mature yields (>90%) to make the unit economics of an iPhone work. There's also the binning factor I am curious about. Nvidia can disable 10% of the cores on a defective GPU and sell it as a lower SKU. Does Apple have that same flexibility with a mobile SoC where the thermal or power envelope is so tightly coupled to the battery size?
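To put rough numbers on the defect-density point, here is a minimal sketch using the standard Poisson yield approximation. All figures (defect density, die areas) are hypothetical, chosen only to show why a large GPU die is punished so much harder than a small mobile SoC on the same node:

```python
import math

# Poisson yield model (a common first-order approximation):
#   yield ≈ exp(-D0 * A)
# where D0 is defect density (defects/cm^2) and A is die area (cm^2).
# D0 below is an assumed value for an immature node, not a real TSMC figure.

def die_yield(defect_density: float, die_area_cm2: float) -> float:
    """Fraction of dies expected to have zero defects."""
    return math.exp(-defect_density * die_area_cm2)

d0 = 0.2  # defects per cm^2 (assumed)

gpu_yield = die_yield(d0, 8.0)  # ~800 mm^2 datacenter GPU die
soc_yield = die_yield(d0, 1.0)  # ~100 mm^2 mobile SoC die

print(f"large GPU die: {gpu_yield:.1%}")  # 20.2%
print(f"small SoC die: {soc_yield:.1%}")  # 81.9%
```

The same defect density that leaves the small die above 80% yield kills four out of five of the big dies, which is exactly why the big-die customer needs binning and the small-die customer needs a mature node.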
For example the regular M4 can have 4 P-cores / 6 E-cores / 10 GPU cores, or 3/6/10 cores, or 4/4/8 cores, depending on the device.
They even do it on the smaller A-series chips - the A15 could be 2/4/5, 2/4/4, or 2/3/5.
For Apple, they have binning flexibility, with Pro/Max/Ultra, all the way down to iPads - and that’s after the node yields have been improved via the gazillion iPhone SoC dies.
NVIDIA's flexibility came from using some of those binned dies for GeForce cards, but the VRAM situation is clearly making that less important, as they're cutting some of those SKUs for being too VRAM-heavy relative to MSRP.
The Pro and Max chips are different dies, and the Ultra currently isn't even the same generation as the Max. And the iPads have never used any of those larger dies.
> NVIDIA's flexibility came from using some of those binned dies for GeForce cards
NVIDIA's datacenter chips don't even have display outputs, have little to no fixed-function graphics hardware (raster and ray-tracing units), and use entirely different memory PHYs (none of NVIDIA's consumer cards have ever used HBM).
Otherwise, yes, if a chip doesn't make M4 Max, it can make M4 Pro. If not, M4. If not, A18 Pro. If not that, A18.
And even all of the above-mentioned marketing names come in different core configurations. M4 Max can be 14 CPU cores / 32 GPU cores, or 16 CPU cores / 40 GPU cores.
So yeah, I'd agree that Apple has _extreme_ binning flexibility. It's likely also the reason why we got A19 / A19 Pro / M5 first, and we still don't have M5 Pro or M5 Max yet. Yields not high enough for M5 Max yet.
Unfortunately I don't think they bin down even lower (say, to S chips used in Apple Watches), but maybe in the future they will.
In retrospect, Apple ditching Intel was truly a gamechanging move. They didn't even have to troll everyone by putting an Intel i9 into a chassis that couldn't even cool an i7 to boost the comparison figures, but I guess they had to hedge their bet.
No, that's entirely wrong. All of those are different dies. The larger chips wouldn't even fit in phones, or most iPad motherboards, and I'm not sure an M4 Max or M4 Pro SoC package could even fit in a MacBook Air.
As a general rule, if you think a company might ever be selling a piece of silicon with more than half of it disabled, you're probably wrong and need to re-check your facts and assumptions.
There are two tiers of Max chip, but think of a Max as two Pros on one die (this is a simplification; you can likewise think of a Pro as two base chips tied together), so a bad Max can't be binned into a Pro. But a high-spec Max can be binned into a low-spec Max.
Not binning an M4 Max for an iPhone, but an M4 Pro with a few GPU or CPU cores disabled is clearly a thing.
Same for NVIDIA. The 4080 is a 4090 die with some SMs disabled.
The desktop 4090 uses the AD102 die, the laptop 4090 and desktop 4080 use the AD103 die, and the laptop 4080 uses the AD104 die. I'm not at all denying that binning is a thing, but you and other commenters are exaggerating the extent of it and underestimating how many separate dies are designed to span a wide product line like GPUs or Apple's computers/tablets/phones.
As you cut SMs from a die you move from the 3090 down the stack, for instance. That’s yield management right there.
If you want binning in action, the RTX cards other than the top ones are it. Look at the A30 too; I was surprised there was no successor. Either they had better yields on Hopper or they didn't get enough out of the A30...
> They need mature yields (>90%) to make the unit economics of an iPhone work.
Can you share how you know this information? >90% seems very specific. Sauce on the number?
iPhones are luxury goods with margins nowhere near typical for consumer electronics. Apple can easily stomach some short term price hikes / yield drops.
If Nvidia pays more, Apple has to match.
Not a system that necessarily works all that well if one player has a short-term ability to vastly outspend all the rest.
You can't let all your other customers die just because Nvidia is flush with cash this quarter...
Is the argument that Apple will go out of business? AAPL?
Wait,
> one player has a short-term ability to vastly outspend all the rest.
I assure you, Apple has the long-term and short-term ability to spend like a drunken sailor all day and all night, indefinitely, and still not go out of business. Of course they’d prefer not to. But there is no ‘ability to pay’ gap here between these multi-trillion-dollar companies.
Apple will be forced to match or beat the offer coming from whoever is paying more. It will cost them a little bit of their hilariously-high margins. If they don’t, they’ll have to build less advanced chips or something. But their survival is not in doubt and TSMC knows that.
Well yeah, people were identifying that back when Apple bought out the entirety of the 5nm node for iPhones and e-tchotchkes. It was sorta implicitly assumed that any business that builds better hardware than Apple would boot them out overnight.
It's not "build better hardware" though, it's "continue to ship said hardware for X number of years". If someone buys out the entire fab capacity and then goes under the next year, TSMC is left holding the bag.
It really is about making better hardware. Apple would be out-bidding Nvidia right now, but only if the iPhone had equivalent value-add to Nvidia hardware. Alas, iPhones are overpriced and underpowered, most people will agree.
I'd argue this from almost the opposite direction - there is no value-add for Apple because high-end smartphones exceeded the performance requirements of their user-base generations ago.
Nvidia has a pretty much infinite performance sink here (at least as long as training new LLMs remains a priority for the industry as a whole). On the smartphone side, there just isn't the demand for drastic performance increases - and in practice, many of us would like power and cost reduction to be prioritised instead.
TSMC isn't running a charity, it sells capacity to the highest bidder.
Of course customers as big as Apple will have a relationship and insane volumes such that they will be guaranteed substantial quotas regardless.
If it takes 4 years to build a new fab and Apple is willing to commit to paying the price of an entire fab for chips to be delivered in 4 years' time - why not take the order and build the capacity?
But Nvidia has also spent billions/year in TSMC for more than a decade and this just keeps increasing.
The flat line prediction is now 2 years old...
Sure maybe they do better in some benchmarks, but to me the experience of using LLMs is and has been limited by their tendency to be confidently incorrect which betrays their illusion of intelligence as well as their usefulness. And I don't really see any clear path to getting past this hurdle, I think this may just be about as good as they're gonna get in that regard. Would be great if they prove me wrong.
New and better things are coming. They will just take time to implement, and I doubt they cancel current training runs. So I guess it will take up to a year for the new things to come out
Can the bubble burst in this time, because people lose patience? Of course. But we are far from the end.
I consider the start of this wave of AI to be approximately the 2017 Google transformer paper, and yet transformers didn't really have enough datapoints to look exponential until ChatGPT in 2022.
The following is purely speculation for fun and sparking light-hearted conversation:
My gut feeling is that this generation of models transitioned out of the part of the sigmoid that looks roughly exponential after the introduction of reasoning models.
My prediction is that transformer-based models will start to enter the phase that asymptotes to flatline in 1-2 years.
I leave open the possibility for a different form of model to emerge that is exponential but I don't believe transformers to be right now.
However, everyone knows that good-faith reciprocity at that scale is not rewarded. Apple is ruthless. There are probably thousands of untold stories of how hard Apple has hammered its suppliers over the years.
While Apple has good consumer brand loyalty, they arguably treat their suppliers relatively poorly compared to a gold standard like Costco.
It definitely implies it though, I’m hopeful that competition is back.
They are always balls deep. Even if it takes them 2 years to bring a node up to yield, with as much demand as exists for high-end fabs, they could already easily get financing to build even more capacity.
Now they have literally the US government as an investor.
One would be naive to believe that they wouldn't get at least a few hundred billion dollars to scale it up given the so many risks involved in most of US tech sector being dependent on Taiwan.
I'm not saying you're wrong, but your previous paragraph sounded like you were wondering whether it was the case, whereas here you're saying it's known. Is this all true? Do they have a reputation for hammering their suppliers?
You can buy modern CPUs made in Iowa - at about $60,000 each. You can buy one from an Intel fab (I'm not sure where they are) for under $1,000 that is likely better. The Iowa-made CPU would be a one-off made under license from Intel. The companies that do this make just enough to prove they can, in case Intel's fabs are bombed. (I assume this means you can't actually buy such a CPU if you tried, but they do make them, and that is about what they would have to charge to break even.)
Apple has responded and has started moving a lot of manufacturing out of China. It just makes sense for risk management.
> China will remain the country of origin for the vast majority of total products sold outside the US, he added.
And international sales are a solid majority of Apple's revenue.
> Meanwhile, Vietnam will be the chief manufacturing hub "for almost all iPad, Mac, Apple Watch and AirPods product sold in the US".
> "We do expect the majority of iPhones sold in the US will have India as their country of origin," Mr Cook said.
Still not made in the US and no plan to change that. They will be selling products made in India/Vietnam domestically and products made in China internationally.
The tariffs are not bringing these jobs home.
The penalties for not delivering on timelines and production goals, and the scale being requested can mean substantial changes to your business. I remember a friend whose company was in talks with Apple telling me that there was some sense of relief when the deal fell through, just because of how much stress and risk and change the deal would entail.
However, a missing component could put tens of billions of dollars of revenue on the line for Apple. It is easier to say that any supplier Apple picks has to then quickly grow to the scale and process needed - and failing to do that successfully could very well be a fatal slip for the supplier.
Even in the iPod days, Apple often would invest in building out the additional capacity (factories) to meet their projected demand, and have a period of exclusivity as well. This meant that as MP3 player demand scaled up, they also wound up locking up production of the micro HDDs and flash memory that competitors would need.
Your underlying statement implies that whoever is replacing Apple is a better buyer, which I don't think is necessarily true.
That means deprioritizing your largest customer.
Also, there's the devil you know and the devil you don't know.
That's also a lie, it's only antagonistic when one of the sides is controlled by a psychopathic asshole, and it being antagonistic is a serious drag for the gains of both sides.
> According to Han, Nvidia has been difficult to work with for some time now. Like all other GPU board partners, EVGA is only told the price of new products when they're revealed to everyone on stage, making planning difficult when launches occur soon after. Nvidia also has tight control over the pricing of GPUs, limiting what partners can do to differentiate themselves in a competitive market.
https://www.gamespot.com/articles/evga-terminates-relationsh...
https://www.semiaccurate.com/2010/07/11/investigation-confir...
The only complete package integrator that manages to make a relationship work with Nvidia is Nintendo.
I shot a video at CNET in probably 2011 of a single touchscreen device (the APX 2500 prototype, IIRC) and it had precisely the dimensions of the Switch 1.
Nintendo was reluctantly a hardware company... they're a game company who can make hardware, but they know they're best when they own the stack.
And that's probably because Nintendo isn't adding any pressure to either TSMC or Nvidia capacity-wise; IIRC Nintendo uses something like Maxwell or Pascal on really mature processes for the Switch chips/SoCs.
I thought that was mainly due to bad thermals. I always got the impression that (like Intel) Nvidia only cared about performance, and damn the power consumption.
https://blog.greggant.com/posts/2021/10/13/apple-vs-nvidia-w...
Refusing to acknowledge anything was wrong was the real problem. But that's just a reminder that companies don't care about you. Brand loyalty is a quagmire.
Changing fabs is non-trivial. If they pushed Apple to a point where they had to find an alternative (which is another story) and Apple did switch, they would have to work extra hard to get them back in the future. Apple wouldn't want to invest twice in changing back and forth.
On the other hand, TSMC knows that changing fabs is not really an option and Apple doesn't want to do it anyway, so they have leverage to squeeze.
At this level, everyone knows it's just business and it comes down to optimizing long-term risk/reward for each party.
68K -> PowerPC, practically seamless
Mac OS 9 -> BSD / OS X with excellent backward compatibility
PowerPC -> x86
x86 -> ARM
Each major transition, biting off orders of magnitude more complexity of integration. Looking at this continuum, the next logical vertical integration step for Apple is fabrication. The only question in my mind, does Tim have the guts to take that risk.
The new Apple–ARM work would eventually evolve into the ARM6, first released in early 1992. Apple used the ARM6-based ARM610 as the basis for their Apple Newton PDA.
There are rumours that Intel might have won some business from them in 2 years. I could totally see Apple turning to Intel for the Mac chips, since they're much lower volume. I know it sounds crazy, we just got rid of Intel, but I'm talking using Intel as a fab, not going back to x86. Those are done.
They did have the expertise building it, after all. What would happen if TSMC now built an M1 clone? I doubt this is a path anyone wants to take, but it seems like an implied threat to me that is calculated in.
> "I will spend my last dying breath if I need to, and I will spend every penny of Apple’s $40 billion in the bank, to right this wrong. I’m going to destroy Android, because it’s a stolen product. I’m willing to go thermonuclear war on this."
The falling out with Samsung was related, but more about the physical look of the phone
This is funny coming from Jobs.
- steve jobs
So, Schmidt had inside knowledge of the iPhone before following Apple into the smartphone category? That makes the vengeful fury less unhinged.
Sounds like those $40b did not end up running out.
That's the theory/assumption. Android started as an OS for blackberry-style phones with physical keyboard, non touch screens.
Almost as soon as the iPhone launched, Schmidt left the board, Android pivoted to a multi-touch interface almost immediately, and a year later the HTC Dream came out.
I don't think anyone has any real proof of wrongdoing, but the timing is certainly suspicious.
Samsung still makes the displays and the cameras for most iPhones. They continued to do business even while engaged in legal action. That they are still competitors won't stop them doing business when it suits them. Business doesn't care about pride or loyalty; only money.
TSMC already makes them in their labs. They could tweak a few things, claim it is novel and just sell to the competition. (Apple would fight back of course with all they have and TSMC reputation would take damage)
China already has plenty of engineers who can make a chip, and experience with making CPUs. ARM licenses a lot of useful things for making a CPU (I don't know what exactly). They would be better off in the long run making the chips they already understand better - which is something they are doing. It takes longer and costs more, but because they understand it, they can also customize the next chip for something they think is good - and if they are right, they can be ahead of everyone else.
What China is lacking is the fabs to make a CPU. They have made good progress in building them, but there is a lot of technology that isn't in the chip that is needed to make a chip.
What do you mean by cloning? An exact copy of Apple SOC? What would that be useful for?
There are already other ARM SOCs that are as performant as Apple's, according to benchmarks.
This is false. Samsung competes with Apple on smartphones. Apple even filed a lawsuit against Samsung over smartphones.
Apple moved to TSMC because how can you trust someone to make chips for you containing your phone's core IP?
>I could totally see Apple turning to Intel for the Mac chips
I could totally see Apple being wary of turning their core IP over to Intel.
In the long run, competition (whether via Intel, Samsung or geopolitical diversification) is the only path that benefits anyone other than TSMC.
Fabless players' IPs are their entire business.
It'll be hard to trust Intel given Intel's past behavior, especially against AMD.
Anyone making a claim that trust will be 0% based on a single thing is obviously oversimplifying the situation. Trust is built on behavior, reputation, time, repeatability, etc.
Trust is subjective and relative. If Alice doesn't trust Eve, that doesn't automatically mean that Bob doesn't trust Eve. That usually requires both Alice and Bob to have similar experiences, or Bob must have a trust relationship with Alice.
There are other factors than trust as well - the US government really wants Intel's fabs to take off, and they may be applying pressure that we are not aware of. It could well be that Apple is willing to risk Intel because the US government will buy a lot of Macs/iPhones, but only if the CPU is made in the US. (This would be a smart thing for the US to do for geopolitical reasons.)
Why do they keep using Samsung for their customized screens despite LG and Chinese competitors being competitive?
yes
> What's that got to do with what we're talking about regarding iPhone's core IP
The iPhone's core IP is iOS.
Collaboration on display and camera development leaks major future milestones. Far more consumers care about cameras and displays than the CPU. Just like the camera and display, the CPU IP is also protected by patents.
Common manufacturer Samsung[2]
https://en.wikipedia.org/wiki/TSMC
Apple A6, which is fabricated with Samsung's 32 nm HKMG (high-k dielectric, metal gate) CMOS process
Is there anyone who can match TSMC at this point for the top of the line M or A chips? Even if Intel was ready and Apple wanted to would they be able to supply even 10% of what Apple needs for the yearly iPhone supply?
> Not all of Apple‘s chips need to be fabbed at the smallest size, those could certainly go elsewhere.
When I saw that TSMC continues to run old fabs, I immediately thought about this idea. I am sure when Apple is designing various chips for their products, they design for a specific node based on available capacity. Not all chips need to be the smallest node size.

Another thing: I am seeing a bunch of comments here alluding to Apple changing fabs. While I am not an expert, it is surely much harder than people understand. The precise process of how transistors are made is different in each fab. I highly doubt it is trivial to change fabs.
I guess you’d be doing that anyway with a brand new chip. But still probably easier to work with the tools/fab you know well.
I suppose you’d have to do it just switching nodes at TSMC. Which is why the A13 (or whatever) probably never moves to smaller nodes.
Sometimes Apple updates the chip in a product that doesn’t seem to need it, like the AppleTV. I wonder if it’s because the old node is going away and it’s easier to just use a newer chip that was designed for the newer node.
And anyway consumers don't really need beefy devices nowadays. Running local LLM on a smartphone is a terrible idea due to battery life and no graphics card; AI is going to be running on servers for quite some time if not forever.
It's almost as if there is a constant war to suppress engineer wages... That's the only variable being affected here which could benefit from increased competition.
If tech sector is so anti-competitive, the government should just seize it and nationalize it. It's not capitalism when these megacorps put all this superficial pressure but end up making deals all the time. We need more competition, no deals! If they don't have competition, might as well have communism.
Government jobs should only be an option if there are enough social benefits.
I've met many software engineers who call themselves communists. I can kind of understand. This kind of communist-like bureaucracy doesn't work well in a capitalist environment.
It's painful to work in tech. It's like our hands are tied and are forced to do things in a way we know is inefficient. Companies use 'security' as an excuse to restrict options (tools and platforms), treat engineers as replaceable cogs as an alternative to trusting them to do their job properly... And the companies harvest what they sow. They get reliable cogs, well versed in compliance and groupthink and also coincidentally full-blown communists; they're the only engineers remaining who actually enjoy the insane bureaucracy and the social climbing opportunities it represents given the lack of talent.
I'm going through a computer engineering degree at the moment, but I am thinking about pursuing Law later on.
Looking at other paths: Medicine requires expensive schooling and isn't really an option after a certain age and law, on the other hand, opened its doors too widely and now has a large underclass of people with third-tier law degrees.
Perhaps you can try to accept the realities of the system while trying to live the best life that you can?
Psyching yourself all the way, trying to find some sort of escape towards a good life with freedom later on...
The only way you don't need to be versed in compliance or groupthink at a US firm as an employee is to either:
1) be independently wealthy, so your job is a hobby you can walk away from, or
2) have some leverage on a currently in-demand skill - but the second that leverage evaporates, they will demand the compliance.
Also I realized I undersold it, they aren’t just run as dictatorships/oligarchies, they are usually run as command economies as well.
The whole capitalist competition style behavior only happens with inter firm interactions, not internal ones
I spent most of my career working in companies with <50 employees, and only hit a couple of unpleasant founders. The few large companies that I worked in were always bureaucratic nightmares by comparison.
Small companies won't pay FAANG salaries, but they also won't make you feel like a meaningless cog in a vast unsympathetic, unproductive, machine.
I’ve worked for 3 companies like that. It was really great if your views aligned with the founder. If they didn’t, you got fucked.
I really enjoyed when a bunch of juniors were fired the day before Christmas because the founder heard them discussing the latest movies they watched and decided that they had bad opinions and shouldn’t work at his company since he’d be embarrassed if his peers heard their tastes. Not hyperbole, direct statements. We referred to it as the Red Christmas at the time.
I believe you got lucky, I don’t find your advice actionable.
I'm sorry you don't find it actionable. Please continue doing whatever you're doing now that is working for you.
Lol.
It doesn't work out because I don't have leverage, and tried to stand up for what I believe in. I also don't believe it would work for you unless you had views that aligned with the current oligarchical leadership that the entire US industry is operating under.
If you only have a good time when you found the "right" founder, because they will and are capable of harming your career or income when you disagree with them, and the law does effectively nothing to protect you from their ego driven tantrums, then you are a serf at best.
I'd agree with you if it was relatively common that employees who had differences of opinions with founders of companies, weren't forced out, but that is not my experience.
I do not find contentment out of accepting that some assholes are my Betters because they have more money than me.
Labor is the next option above slavery and indenture, and now that slavery and indenture are frowned upon, labor has absorbed that space as well.
If you want to have some control of your environment and destiny, you must be an independent agent, a contractor, entrepreneur, or consultant. A tradesman. You have special skills and expertise, your own tools, and a portfolio of masterpieces at the least.
There is nothing new in this space of human endeavour, it is as it has been, and I suspect will continue to be, for better or for worse. Sacrificing your agency for subservience is going to make you feel at the mercy of your “betters”. If you don’t want that, don’t do that. Labor law and other conventions have made it a little better, but the fundamental relationship is still master and servant.
If we go down this path, what can I say that doesn't get my account banned and my speech suppressed for what I would suggest doing to people with your opinion?
It’s not the way I think it -should be- but it is the way that it is. The incentive alignment keeps it at that local minima, and every attempt to move it to a new one so far has introduced so many perverse incentives that it ultimately causes the regression or even complete failure of the economies it is implemented in.
I don’t know what the answer is that maximises human happiness and minimises human misery, but I suspect it lies well outside of the paradigm of conventional market economics.
Within the dominant paradigm, It’s all a matter of risk management. With employment, you are paying your employer with your surplus value to handle the risks that you feel powerless to manage. Market risks, capital risks.
In exchange, you accept risks that your opinions and comfort won’t be prioritised, and in some cases even your physical well being.
In effect, you are betting against yourself being able to balance those risks against the risks posed by pursuing profitability.
The ability to manage risks is intersectional with your ability to manage discomfort and privation. When you run out of money, the house wins by default.
That’s why the foundational step for anyone should be to do whatever they must to obtain a safe fallback position. A place to be. A safety net. This is what enables risk accommodation. Without taking risk, there will be no advancement. If you don’t have a fallback plan, a safe spawn point, do everything in your power to create one, at least for your children.
Just out of curiosity, was it something despicable like them liking Marvel movies? Or more akin to disagreeing whether Eyes Wide Shut could be considered a good Kubrick movie?
If you want to see weird sexual pictures, might as well go all the way with "120 days of Sodom".
Or just go and see one of those documentaries of serial killers from the 70's, like Ted Bundy.
Trump is using his DOJ to probe Jerome Powell with a bogus lawsuit because the Fed won't lower rates on demand.
An independent Fed is the most important body for the USA. Lowering rates should be based on facts, not dictated by some bankrupt casino CEO. And now you want our government to nationalize the tech sector?
Are you talking about TSMC? Because that is a single, albeit primary, node in a supply chain, and that's also what you'd have to replicate. ASML is another vital node.
So many people with "it's just a factory, how hard can it be". The answer is "VERY", as a few endeavours have found out already - and they will probably find out even at TSMC Arizona.
I shall illustrate with Adrian Thompson's 1996 FPGA experiment at the University of Sussex.
Thompson used a genetic algorithm to evolve a circuit on an FPGA. The task was simple: get it to distinguish between a 1kHz tone and a 10kHz tone using only 100 logic gates and no system clock.
After about 4,000 generations of evolution, the chip could reliably do it but the final program did not work reliably when it was loaded onto other FPGAs of the same type.
When Thompson looked inside at what evolved, he found something baffling:
The plucky chip was utilizing only thirty-seven of its one hundred logic gates, and most of them were arranged in a curious collection of feedback loops. Five individual logic cells were functionally disconnected from the rest - with no pathways that would allow them to influence the output - yet when he disabled any one of them the chip lost its ability to discriminate the tones.
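The spirit of Thompson's setup can be sketched with a toy genetic algorithm. This has nothing to do with real FPGA synthesis - the "circuit" here is just a bit pattern and the fitness target is made up - but it shows the same evolve-mutate-select loop that produced his inscrutable result:

```python
import random

random.seed(0)

# Stand-in for "desired circuit behaviour": a fixed 32-bit target pattern.
TARGET = [1, 0] * 16

def fitness(genome):
    """Number of bits matching the target (higher is better)."""
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.05):
    """Flip each bit independently with probability `rate`."""
    return [1 - g if random.random() < rate else g for g in genome]

# Random initial population of 20 candidate "circuits".
population = [[random.randint(0, 1) for _ in TARGET] for _ in range(20)]

for generation in range(200):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == len(TARGET):
        break
    # Elitism: keep the top half, refill with mutated copies of survivors.
    survivors = population[:10]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(10)]

best = max(population, key=fitness)
print(fitness(best), "/", len(TARGET))
```

The point of the anecdote is that this kind of search optimizes for "whatever works on this exact hardware", including analog quirks no human designer would ever rely on - which is precisely why the evolved bitstream failed on other chips of the same model.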
Welcome to building semiconductors.
There was a great video recently on the company + techniques used for cutting-edge lithography.
I was expecting an Asianometry video from your link
https://www.youtube.com/@Asianometry
Pure Silicon Crystals for the wafer is another very specialist supplier you can't just decide to become - your local gravity will probably have an effect you need to tune into
Also, how is nationalizing something pro-competition? Nationalized companies have a history of using their government connections to squash competition.
They're Apple. If TSMC fucks around too much, they might just start working towards building their own fab.
In the old days the leverage was that without Apple, no one was willing to pay for leading-edge foundry development - at least not enough money to make it happen. Now it is different. The demand for AI means plenty of money to go around, and Nvidia is the one to beat, not Apple any more. The good thing for Apple is that as long as Nvidia continues to grow, the orders can be split between them. No more relying on a single vendor.
Private companies can be nice to their suppliers. Owners can choose to stay loyal to suppliers they went to high school with, even if it isn't the most cost-efficient.
That said, they did that for a sapphire glass supplier for the Apple Watch and when their machines had QC problems they dropped them like a rock and went back to Corning.
But is that really any different from any other supplier? And who tf do you think they’re going to drop TSMC for right now? They are the cock of the walk.
Don't look now: https://www.macrumors.com/2025/11/28/intel-rumored-to-supply...
The modern Cortex and Neoverse designs are so pathetic that RISC-V might mature by the time ARM is the industry standard. And the smaller ARM IP hasn't been profitable since China mass-produced the clones. Courting Intel into buying an architectural license with a free IP bonus is a legitimately smart move for ARM's longevity, from Apple's POV.
Companies have to be fairly large to be Costco suppliers. What suppliers lose in margin they more than make up for in scale. It's better to sell 10 million at 5% margin than 1 million at 10% margin.
And they don't require a % of supplier's business revenue as that would be illegal in the U.S. Most of the products found at Costco are generally found at other retailers, just in smaller packages or as different SKUs.
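The scale-over-margin arithmetic above, as a quick sketch (the $10 unit price is a hypothetical, just to make the numbers concrete):

```python
unit_price = 10.00  # hypothetical selling price, USD

# 10 million units at a 5% margin vs 1 million units at a 10% margin
profit_high_volume = 10_000_000 * unit_price * 0.05  # $5,000,000
profit_boutique = 1_000_000 * unit_price * 0.10      # $1,000,000

print(profit_high_volume / profit_boutique)  # 5.0 -- the volume channel earns 5x
```

Halving the margin while growing volume 10x still multiplies absolute profit fivefold, which is the whole Costco pitch to suppliers.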
Costco is legendarily permissive with returns, to the extent of accepting bare, stick-like Christmas trees back after Christmas and giving a full refund. But ultimately this is to their advantage in encouraging mindless consumerism (which is also the general American model - no-questions, no-fault returns are generally an American thing, not a worldwide one).
Now, a liberal return policy may work out for Costco, and Costco is obviously a high volume hence desirable customer for a supplier, but if Costco is pushing much of the cost of returns back to the supplier, that does change the picture a bit!
That's the take I would pursue if I were Apple.
A quiet threat of "We buy wafers on consumer demand curves. You’re selling them on venture capital and hype"
From TSMC's perspective, Apple is the one that needs financial assistance. If they wanted the wafers more than Nvidia, they'd be paying more. But they don't.
This is the "venture capital and hype" being referred to, not Nvidia themselves.
That line is purified cope.
The reality is that TSMC has no competition capable of shipping an equivalent product. If AI fizzles out completely, the only way Apple can choose to not use TSMC is if they decide to ship an inferior product.
A world where TSMC drains all the venture capital out of all the AI startups, using NVidia as an intermediary, and then all the bubble pops and they all go under is a perfectly happy place for TSMC. In these market conditions they are asking cash upfront. The worst that can happen is that they overbuild capacity using other people's money that they don't have to pay back, leaving them in an even more dominant position in the crash that follows.
Business is a little more nuanced than this audience thinks, and it’s silly to think Apple has no leverage.
But also; Apple is one of the very few companies at their size that seems to have the political environment to make, and more importantly succeed, at decade investments. The iPhone wasn't an obvious success for 5 or 6 years. They started designing their own iPhone chips ~the iPhone 4 iirc, and pundits remarked: this isn't a good idea; today, the M5 in the iPad Pro outperforms every chip made by EVERYONE else in the world, by 25%, at a tenth the power draw and no active cooling (e.g. 9950X3D). Apple Maps (enough said). We're seeing similar investments today, things we could call "failures" that in 10 years we'll think were obviously going to be successful (cough vision pro).
No, it does not. The core inside the M5 is faster than every other core design in single-threaded burst performance. That is common for small machines with a low core count and no hyperthreading.
The chip itself does not outperform every other chip in the world, nor is it 10x more efficient than the 9950X3D. That's not even napkin math at that point, you're making up numbers with no relation to relevant magnitude.
The comparison point was for single core performance, which certainly makes the TDP comparison unfair if interpreted together. The numbers are ballpark-correct.
No one else is remotely close to Apple. Apple could stop developing chips for four years, and it’s very likely they would still ship the most efficient core architecture, and sit in the top five in performance. If you’re quibbling over the semantics of this particular comparison, you are not mentally ready for what M5 Ultra is going to do to these comparisons in a few months.
I doubt it, particularly not four years.
The numbers do not exist in isolation. They are "interpreted together" because statistics are more than just advertisement lines. The TDP comparison is mind-bogglingly stupid and you should really feel ashamed for defending it if you care about statistical integrity.
> you are not mentally ready for what M5 Ultra is going to do to these comparisons in a few months
I hope so. The past Ultra chips have been losing to Nvidia laptops in raster and compute efficiency.
Definitely! But I'd reckon they would want to bootstrap that part of their supply chain as soon as possible. Say China does invade Taiwan: suddenly their main supplier is gone, and Intel's capacity mostly goes to military and other high-margin segments. If they instead own Intel, they not only control the narrative but also capitalize on the increase in Intel's value.
Then again, Microsoft should have bought Intel: MS has roughly $102 billion in cash (+ short-term investments). Intel’s market value is approximately $176 billion. Considering Azure, Microsoft has heaps of incentive to buy Intel.
I would guess Google are more likely to greenfield develop their own foundry rather than try and buy Intel.
Antitrust would certainly block Apple specifically for this reason. Apple is not a credible supplier of DoD hardware and acquiring IFS would complicate their status as a Trusted Foundry.
If Apple had more time to reform their image and invest in MIL-STD processes then maybe it would work. As-is, I'd be shocked if the US let Intel become the victim of a hostile takeover. Even for a company as important as Apple.
I can certainly see Apple taking a large stake and board position in fabricators, but I can't see them being able to justify the ongoing investment in a closed fab.
They really, absolutely, are not.
It's not about "will there be a new hardware", it's about "is their order quantity predictable"
Like smartphones, AI chips also have a replacement cycle. AI chips depreciate quickly -- not because the old ones go bad, but because the new ones are so much better in performance and efficiency than the previous generation. While smartphones aren't making huge leaps every year like they used to, AI chips still are -- meaning there's a stronger incentive to upgrade every cycle for these chips than smartphone processors.
I've heard that it's exactly that, reports of them burning out every 2-3 years. Haven't seen any hard numbers though.
people are holding onto their phones for longer: https://www.cnbc.com/2025/11/23/how-device-hoarding-by-ameri...
Heck, if Apple wanted to be super cheeky, they could probably still pivot on the reserved capacity to do something useful (e.g. a revised older design on whatever node they reserved, where they can get more chips per wafer for cheaper models).
NVDA, on the other hand, is burning a lot of goodwill in their consumer space, and if a competitor is somehow able to outdo them it could be catastrophic.
Graphical fidelity is at the point that unless some new technology comes out to take advantage of GPUs, I don’t see myself ever upgrading the part. Only replacing it whenever it dies.
And that 1080 ti isn’t dead either, I passed the rig onto someone who wanted to get into PC gaming and it’s still going strong. I mostly upgraded because I needed more ram and my motherboards empty slots were full of drywall dust.
The phone I’m more liable to upgrade solely due to battery life degradation.
The 5090 I replaced it with has not been entirely worth it. Expensive GPUs for gaming have had more diminishing returns on improving the gaming experience than ever before, at least in my lifetime.
https://www.manufacturingdive.com/news/intel-layoffs-25-perc...
Apple can and should do it again!
It will likely be a naval plus air blockade to force a political solution to avoid the messiness of an invasion, but time is on China's side there.
Long term: demographics are worsening for China relative to now or 5 years ago.
Short term: China doesn’t yet have viable homegrown replacements for ASML, TSMC, etc.
Really short term: China blockading Taiwan and suffering the economic fallout would be much more painful than US blockading Cuba/Venezuela/etc.
A decisive kinetic action or a very soft political action, rather than a blockade seems more viable in the current state.
It’s very possible that they will be able to dominate South China Sea and their zone of the Pacific, even now, given the proximity advantages and ship/missile production; and I think that would be satisfactory to them.
20 years from now, China’s sphere and America’s sphere are separate, with China having a lead in competing for Africa, and Europe in a very weird place socially, economically, demographically, and WRT Russia/US competition.
I'm not like, rooting for this, I'm just trying to be realistic.
The US has an embargo that doesn't impact other countries that want to trade with Cuba. China is going to put an actual cordon around Taiwan.
Also, the US has no historical reason for claiming Cuba and has no real domestic pressure to do so (nobody in either party is asking for it). China has been very clear they see Taiwan as a part of China and will reunite with it not for economic or strategic reasons, but for nationalistic ones.
If or when China’s economic and/or demographics issues become problematic is exactly when the CCP likely would want to strike. At least seems to me like it’d be a good time to foment national pride.
Of course hopefully I’m wrong and you’re right.
Many of these larger geopolitical things are decades in the making. Even Trump’s Venezuela action has been a long time brewing. So much so that “US troops in Venezuela” has become a trope in military sci-fi. The primary change with Trump is how he presents and/or justifies it, or rather doesn’t.
What options do you suppose the military might be working on? Training to surround, and blockade? (Check) Information warfare? (Check) Building high numbers of landing craft? (Check) Building high numbers of modular weapon systems that can rapidly increase the number of offensive ships? (Check) Building numerous high volume drone warfare ships and airborne launchers? (Check)
Keep in mind that there are public language cues that preceded invasion such as declarations of the invalidity of the other country’s sovereignty, declarations that the other country is already part of the invading country. Have you seen any signs of that?
Your persistent doubts require ignorance of strong evidence.
They would benefit in what way?
Because their government seems to benefit a lot from Taiwan existing and being an enemy.
The US can't even remotely come close to stopping China in its own backyard today, in another 5-10 years they'll just have that much larger of a Navy. The US knows that's the situation. The US can supply a large one week bombing campaign against China and that's it, based on inventory levels. The US will exhaust its cruise missile supply instantly and the US has almost no meaningful drone-bomb supply. China can build cheap missiles by the tens of thousands perpetually, train them to the coast, and flatten Taiwan and any opponents as necessary. China is the only country that can sustain a multi-year WW2 style bombing campaign today, thanks to its manufacturing capabilities. Imagine them on a full war footing.
USA has been strategically re-homing TSMC to the US mainland for a long time now. 30% of all 2nm and better technologies are slated to be produced in Arizona by 2030.
The real loser in all of this will be the EU which will be completely without the ability to produce or acquire chips. They'll just end up buying from China and USA, which will only further deepen their dependence on those countries.
Compare to TSMC's Arizona project, which will supply 30% of TSMC's 2nm-and-smaller process output. Just one of the six planned TSMC fabs in Arizona is already pumping out ~30k wafer starts per month (WSPM) at 5nm or smaller.
And that doesn't even get into CoWoS packaging, which is essential for all the highest-performance and highest-margin parts.
The fact is: In semiconductors, Europe is getting left in the dust. Sure they can fab some mature node chips for industrial uses--and that's not nothing--but Smartphone SoCs, "AI" accelerators, DRAM, even boring CPUs simply cannot be made any more in Europe, and to the limited extent that they can, they will be horrendously uncompetitive on the market and outclassed in every performance metric by Chinese and American chips.
EU is on a big sovereignty kick right now, which makes sense given that their foreign dependencies keep blowing up in their faces. So it's strange that EU is so complacent about their foreign dependency on advanced node semiconductors.
It’s too old, too complacent, and too broke. Even compared to the US and our level of discord, there’s no unity across divisions.
The US absurdly threatens Greenland, but Denmark/EU's response isn't "sanction US tech" or "kick US military bases out of Europe" - nothing that rattles a saber back and shows some credible backbone.
They sent warships to Greenland. What level of saber rattling do you expect?
Is it supposed to work independently of other technology at some point?
Then again: multilateral cooperation is at the heart of scientific progress anyway. It's fitting that ASML is in a country culturally shaped by its history of seafaring and trade. We'll see how the brain drain - people not wanting to live their lives in a society that doesn't share values like these - influences the whole technological arms race.
Some people in Japan are coming up with a successor to EUV as far as I remember, what was their name again?
[1] https://spectrum.ieee.org/nanoimprint-lithography [2] https://www.rapidus.inc/en/
China absorbing Taiwan (especially to Americans) just doesn’t seem like a radical, terrifying concept.
A Hong Kong style negotiated transfer might be best for the world - Taiwanese that want to leave can, the US can build up a parallel source of semiconductors, China gets Taiwan without firing a shot.
My conspiracy theory is that there is some kind of "gentleman's agreement" on this topic between the US and China.
As soon as Taiwan is not needed anymore by the US for chip fabrication, the US will at the very least lose its grip on it.
Note to commenters: that's my theory, does not mean I endorse it in any way.
So report the facts but sentences like "What Wei probably didn’t tell Cook is that Apple may no longer be his largest client" make it personal, they make you take sides, feel sorry for somebody, feel schadenfreude... (as you can observe in the comments)
How do you think it got in the LLM training set in the first place?
Okay, but this isn't a news article, it's an opinion piece on some guy's substack.
However this post and the comments really debunk that - here we have a clear example of the author turning these people into characters, archetypes of reality tv, and inviting the reader to have an emotional response to what is potentially interesting, but actually just the mundane business matter of dealing with demand spikes.
A normal conversation might take a step back, above the emotional baiting, and instead lament how TSMC wasn't able to develop sufficient supply capacity in time to maximize yield across not just these clients but the many others who are looking to get on the AI hype train. Instead we're seeing something quite different, and quite uninformed. It reads like a gossip post from an Instagram thread.
I notice that HN is actually more vulnerable to these types of conversations. Maybe it's because HN likely weights towards an ASD audience, which has less experience in handling socially driven narratives. I do definitely see here more of the "one-sided" conversation that is typical of ASD.
Intel has even struggled with it since they traditionally didn’t sell capacity to other buyers. It worked for Intel because they traditionally had a near-monopoly over the laptop, desktop, and server chip market.
Apple certainly has the money to spin up their own chip fabricator, but there’s no guarantee it would be as good as TSMC, it would cost billions, and they would have less of an ability to sell capacity to other customers.
At the end of that effort they could be left with a chip fab that produces chips that still cost the same or more than what TSMC manufactures them for. It might just be cheaper to try and outbid Nvidia for priority.
https://appleinsider.com/articles/25/08/22/apple-chips-to-be...
Apple's investing heavily in the TSMC fab in Arizona, due to open in 2027, to have 3nm capabilities for its flagship chips, but it's unlikely that would ever cover a majority of that chipmaking.
https://www.aztechcouncil.org/tucson-chipmaker-tsmc-arizona-...
https://wccftech.com/tsmc-plans-to-bring-3nm-production-to-t...
That would ruin TSMC and others' independence.
Nvidia already did buy Intel shares so it is a thing.
Nvidia did discuss with TSMC for more capacity many times. It's not about financing or minimum purchasing agreements. TSMC played along during COVID and got hit.
As far as I know there was never a demand dip at any point in there.
Which barely impacts TSMC. Most of their revenue and focus is on the advanced nodes - not the mature ones.
> As far as I know there was never a demand dip at any point in there.
When did I imply there was a demand dip? I said they built out too much capacity.
AFAIK only Apple has been committing to wafers up to 3 years in the future. It would be a crazy bet for Nvidia to do the same, as they don't know how big the business will be.
I know about the existence of the initiative but I don't know how it is progressing / what is actually going on on that front.
There's ~a dozen in the works or under construction
TSMC plans to have 2-3nm fabs operational in the next 2-3 years
So we're 2-3 years behind the standard (currently 2nm), and further behind on the bleeding edge sub-2nm fabs
Are the majority of the staff still shipped in from Asia?
Then the essential skilled personnel can’t come train people because the visa process was created by and is operated by the equivalent of four year olds with learning disabilities. Sometimes companies say fuck it we’re doing it anyway and then ice raids their facility and shuts it down.
I’d post the news articles about the above, but your googling thumbs work as well as mine.
I really don't care about most new phone features and for my laptop the M1 Max is still a really decent chip.
I do want to run local LLM agents though and I think a Mac Studio with an M5 Ultra (when it comes out) is probably how I'm going to do that. I need more RAM.
I bet I'm not the only one looking at that kind of setup now who was previously happy with what they had.
The only reason that Thunderbolt exists is to expose DMA over an artificial PCI channel. I'd hope they've made progress on it, Thunderbolt has only been around for fourteen years after all.
Data is saying demand >>>>> supply.
I’m not saying this is bad or anything, it’s just another iteration of the centralized vs decentralized pendulum swing that has been happening in tech since the beginning (mainframes with dumb terminals, desktops, the cloud, mobile) etc.
Apple might experience a slowdown in hardware sales because of it. Nvidia might experience a sales boom because of it. The future could very well bring a swing back. Imagine you could run a stack of Mac minis that replaced your monthly Claude code bill. Might pay for itself in 6mo (this doesn’t exist yet but it theoretically could happen)
You don't have to imagine. You can, today, with a few (major) caveats: you'll only match Claude from roughly ~6 months ago (open-weight models roughly lag behind the frontier by ~half a year), and you'd need to buy a couple of RTX 6000 Pros (each one is ~$10k).
Technically you could also do this with Macs (due to their unified RAM), but the speed won't be great so it'd be unusable.
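Rough payback arithmetic on the hardware route, using the ~$10k-per-GPU figure from the comment above; the monthly API bill and power price are my assumptions, purely for illustration:

```python
gpu_capex = 2 * 10_000                 # USD: two workstation GPUs at ~$10k each
monthly_api_bill = 200                 # USD/month: hypothetical Claude spend
monthly_power = 0.6 * 24 * 30 * 0.15   # ~600W average draw at $0.15/kWh ≈ $65/month

months_to_break_even = gpu_capex / (monthly_api_bill - monthly_power)
print(f"{months_to_break_even:.0f} months")  # ~148 months at these assumptions
```

At a $200/month bill the hardware takes over a decade to pay for itself; a 6-month payback would require an API spend of roughly $3,400/month. So the "pays for itself in 6 months" scenario only works for very heavy users.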
I wish I were in that situation, but I find myself able to use lots more compute than I have. And it seems like many others feel the same.
As would almost innumerable others.
You're setting yourself up for making a huge part of your future revenue stream being set aside for ongoing chipfab capex and research engineering. And that's a huge gamble, since getting this all setup is not guaranteed to succeed.
I wonder what will happen when we get closer to the physical "wall". Will it allow other fabs to catch up, or will the opposite happen, with even small improvements being valued by customers?
Or they could buy out Intel and sell off their cpu design division
At this point it would be corporate suicide if they were not outlining a strategy to own their own fab(s).
Apple has less cash available than TSMC plans to burn this year. TSMC is not spending 50 billion dollars just because it's fun to do so. This is how much it takes just to keep the wheels on the already existing bus. Starting from zero is a non-starter. It just cannot happen anymore. So, no one in their right mind would sell Apple their leading edge foundry at a discount either.
There was a time when companies like Apple could have done this. That time was 15+ years ago. It's way too late now.
[0]: https://www.wsj.com/business/earnings/tsmc-ends-2025-with-a-...
Also Nvidia's margins are higher which means that they will be willing to pay a higher unit price.
This seems like an open and closed case from TSMC's side.
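The margin point can be made concrete with a toy die-cost sketch. Every number below is hypothetical, and the exponential yield curve is just the textbook Poisson defect model, not TSMC's actual pricing or yield data:

```python
import math

def good_dies(wafer_diameter_mm, die_area_mm2, defect_density_per_cm2):
    """Rough gross-die count times a Poisson yield model; all inputs illustrative."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    gross = wafer_area / die_area_mm2 * 0.85  # ~15% edge loss, rough
    yield_frac = math.exp(-defect_density_per_cm2 * die_area_mm2 / 100)
    return gross * yield_frac

WAFER_PRICE = 30_000  # hypothetical early-node 2nm wafer price, USD

# Hypothetical dies: an ~800mm2 AI GPU vs a ~100mm2 phone SoC,
# at an early-node defect density of 0.2 defects/cm2.
gpu_cost_per_die = WAFER_PRICE / good_dies(300, 800, 0.2)  # ≈ $2,000
soc_cost_per_die = WAFER_PRICE / good_dies(300, 100, 0.2)  # ≈ $60

print(f"GPU die cost ${gpu_cost_per_die:,.0f} vs a ~$30,000 ASP")
print(f"SoC die cost ${soc_cost_per_die:,.0f} vs a ~$100 bill-of-materials slot")
```

At these made-up numbers the wafer cost is under 10% of the GPU's selling price but most of the phone SoC's budget, which is why Nvidia can shrug off early-node wafer premiums that would wreck Apple's unit economics - and why Apple waits for mature yields.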
More likely they will not use the leading-edge fab process, which TBH is fine for the vast majority.
I don’t know the hedge to position against this but I’m pretty sure China will make good on its promise.
Buy in-demand fab output today, even at a premium price and even if you can't install or power it all, expecting shortages tomorrow. Which is pretty much the way the tech economy is already working.
So no, no hedge. NVIDIA's customers already beat you to it.
The most advanced ASML machines also cost something like $300-400M each and I am willing to bet if configured wrong can heavily damage themselves and the building they are in.
What's worse, China having a monopoly on production, or the entire world being set back by X years?
In this scenario X is a two-digit number, right?
On top of that, TSMC has fabs and personnel with expertise in other places. After all, this threat isn't new.
The 2027 date was a guideline for their military to be "ready", which they may not be either. That is a far cry from the decision to actually make a move. They will only do that if they're certain it will work out for them, and as things stand, it is very risky for Xi.
> Apple-TSMC: The Partnership That Built Modern Semiconductors
In 2013, TSMC made a $10 billion bet on a single customer. Morris Chang committed to building 20nm capacity with uncertain economics on the promise that Apple would fill those fabs. “I bet the company, but I didn’t think I would lose,” Chang later said. He was right. Apple’s A8 chip launched in 2014, and TSMC never looked back.
https://newsletter.semianalysis.com/p/apple-tsmc-the-partner...
I get why the numbers are presented the way they are, but it always gets weird when talking about companies of Apple’s size - percent increases that underwhelm Wall Street correspond to raw numbers that most companies would sacrifice their CEO to a volcano to attain, and sales flops in Apple’s portfolio mean they only sold enough product to supply double-digit percentages of the US population.
The giant conglomerates in Asia seem more able to do it.
Google has somewhat tried but then famously kills most everything even things that could be successful if smaller businesses.
Every time a CEO or company board says "focus," an interesting product line loses its wings.
I think Asian companies are much less dependent on public markets and have as strong private control (chaebols in South Korea for example - Samsung, LG, Hyundai etc).
If you look at US companies that are under "family control" you might see a similar sprawl, like Cargill, Koch, I'd even put Berkshire in this class even though it's not "family controlled" in the literal sense, it's still associated with two men and not a professional CEO.
* Search/ads
* YouTube
* Android/Play, Chrome, Maps
* Google Cloud, Workspace
* Pixel, Nest, Fitbit
* Waymo, DeepMind
* Google fiber
They're not a conglomerate like Alibaba but they're far from a one-trick pony, either :)
And ironically, Apple started acting like a small contender the moment it felt some heat, after a decade of seemingly easy wins everywhere.
So finally there is a company that gives Apple some much needed heat.
That’s why I in absolute terms side with NVIDIA, the small contender in this case.
PS: I had one key moment in my career when I was at Google and a speaker mentioned the unit “NBU”. It stands for next billion units.
This is ten years ago and started my mental journey into large scale manufacturing and production including all the processes included.
The fascination never left. It was a mind bender for me and totally get why people miss everything that large.
At Google it was just a milestone expected to be hit - not one time but as the word next indicates multiple times.
Mind blowing and eye opening to me ever since. Fantastic inspiration thinking about software, development and marketing.
Technically, there are billions of transistors in every tensor chip manufactured by Google
Apple hit 3 billion iphones in mid 2025.
Worryingly for Nvidia, Apple is producing products people want and that are provably useful, so the vast majority of its value is solid - which means the revenue stream for the fabs Apple uses is solid too.
Nvidia, on the other hand, produces tangible things of value - GPUs - but they are now largely used in unproven technologies (when stacked against the lofty claims) that few seem to actually want, so Nvidia's revenue stream seems flimsy at best in the AI boom.
The only proven revenue stream Nvidia has (had?) is GPUs for display and visualization: gaming, graphics, and non-AI, non-crypto compute.
For a statistical word salad generator that is _generally_ coherent, sure it's proven.
But for other claims, such as replacing all customer service roles[1], to the lament of customers[2], and now that a number of companies are re-hiring staff they sacked because 'AI would make them redundant'[3] still make me strongly assert that Generative AI isn't the trillion dollar industry it is trying to market itself as.
Sure, it has a few tricks and helps in a number of cases, so it's useful there, but it isn't the "earth-shattering, mass-human-redundancy" technology that colossally stupid amounts of circular investment are being poured into - which, I argue, puts fabs that dedicate themselves mostly, if not solely, to AI in a precarious position when the AI bubble collapses.
[1] https://www.cxtoday.com/contact-center/openai-ceo-sam-altman...
[2] https://www.thestreet.com/technology/salesforce-ai-faces-bac...
[3] https://finance.yahoo.com/news/companies-quietly-rehiring-wo...
It may still be profitable for TSMC to use Nvidia to funnel all the juicy VC money to themselves, but the statement about proven vs. unproven revenue streams is true. It'll be gone with the hype unless something truly market-changing comes along quickly - not the incremental change so far. People are not ready to pay the full cost of AI; it's that simple right now.
Shares are a short-term speculative gamble; you buy them in the hope that the price will rise and then you can sell them for a profit. Sometimes the gap between these two events is measured in milliseconds.
So the only thing that matters to Wall St is growth. If the company is growing then its price will probably rise. If it's not, it won't. Current size is unimportant. Current earnings are unimportant (unless they are used to fund growth). Nvidia is sexy, Apple is not, despite all the things you say (which are true).
So now Apple, Nvidia, AMD (possibly), and most car manufacturers will be up a creek without a paddle when China invades in 1-2 years. That is unless China's Xi is bluffing to mollify domestic war hawks and reunification zealots by going through the motions of building an army of war machines without intent to use them, but I don't think that's probable. It's possible that Trump already made agreements with Xi to cede "Oceania" if they allow the US to take Greenland and South America for empire-building neocolonialism.
Any other time and place? The power to run it, plus the power to cool it.
What kind of experiments are you doing? Did you try out exo with a dgx doing prefill and the mac doing decode?
I'm also totally interested in hearing what you have learned working with all this gear. Did you buy all this stuff out of pocket to work with?
That you are writing AI agents for a living is fascinating to hear. We aren't even really looking at how to use agents internally yet. I think local agents are incredibly off the radar at my org despite some really good additions as supplement resources for internal apps.
What's deployment look like for your agents? You're clearly exploring a lot of different approaches . . .
Just look at what people are actually using. Don't rely on a few people who tested a few short prompts with short completions.
I mean this is pretty fantastic.
How is owning a larger share of a company - with proportionally less cash and a higher price per share than what you could have sold it for before - bad?
Have you looked at precious metal charts as of late? Do 1/x and that's the value of the cash these companies are trading for a valuable business.
We really need many more smaller, more independent manufacturers. All the big guns, from NVIDIA, Apple, Intel, AMD, etc... have massively disappointed about 99.9% of us here now.
NVidia gets the capacity because they're willing to pay more. If Apple wants to, they can pay more to get it back.
mirroring, come to think of it, the un-democratization of modern governments...
(I would be happier if the news behind Nvidia's strength was sales of good, reasonably priced consumer GPU cards... but it's clearly not. I can walk down the street and buy anything from Tim Cook, but 9 times out of 10, I cannot buy a 5080/5090 FE card from Jensen Huang.)
And possibly other types of hardware also had their prices bumped, or used outdated chips, because Apple had to build iPhone/Mac n+1.
That's why you see some folks actually mocking Apple about the situation. They were already affected.
If anything this might force a market-wide fix in the medium term.
Between the $99/year sideloading, Liquid Glass and fighting fruitlessly against CUDA, I think Apple needs a break to reflect on why their software strategy is so unpopular with everyone. The hardware advances are doing them more harm than good at this point.
Intel seems to be very competitive again when it comes to laptop battery life. If MacBooks again get a reputation for being sluggish and overheating, that's not great for sales.
But the point here is that a few companies are outbidding everyone else, hoarding shittons of compute and putting it into their data centers, to rent to people. This is effectively taking compute ownership away from consumers and centralizing compute i.e. un-democratising.
Apple outcompeting other companies to put their products into the hands of regular people is vastly different.
But perhaps Apple needs them to power translucent UIs and Siri.
https://newsroom.intel.com/client-computing/ces-2026-intel-c...
So it's not available yet then?
It means that Apple doesn't have to be the sole investor in latest-node development, which is harder to justify, especially in a year when the smartphone upgrade cycle is slowing down. Having NVIDIA (and the AI boom) in the picture should help Apple reduce CAPEX on its semiconductor investment.
There are many influencing factors that foreigners may not necessarily be aware of. In fact, this has little to do with TSMC. Rather, it is that China’s domestic public opinion environment has undergone major changes.
Over the past several decades, domestic public opinion was generally pro-American and pro-Western, and it deliberately emphasized the positive side of Taiwan, while providing Taiwan with substantial economic support. But in recent years the situation has changed dramatically. One reason is that Taiwanese public opinion has spread widely through platforms like Xiaohongshu, VPNs, and other channels (like the Japanese, they pray for the Three Gorges Dam to collapse and drown large numbers of Chinese people, and they celebrate when natural disasters happen in China). People have gradually realized that Taiwan is not what we once expected it to be; many people there are pro-Japanese, and economic support from mainland China would only have the opposite effect.
This has actually happened many times in history. There is an old saying: some people only fear force and do not respect virtue. In addition, drastic changes in the international situation, and especially Trump coming to power, have profoundly changed the perceptions of the Chinese people. One can say he was the most critical factor. From that point on, the pro-American camp within China has had very little room to speak. He tore off the so-called fig leaf of democracy and helped the Chinese people establish confidence in their own system.
Regarding the Taiwan issue, mainstream public opinion almost universally supports resolving it through force. Hong Kong has already demonstrated the drawbacks of resolving issues through non-violent means. In many matters, it is actually the Chinese people who are pushing the Communist Party forward, while the Party instead needs to restrain public sentiment and act rationally. Everyone wants to fight and to resolve the issue completely through swift action. If TSMC is destroyed, it does not matter; we cannot use it anyway, and high-end chips have long been embargoed against us. The ones affected will not be us, but others.
Of course, based on my frequent experience using the PTT forum, Taiwanese young people themselves are also deeply divided. Many people have seen China’s progress and are not that hostile, but many others are still trapped in indoctrination. The most ironic thing about democracy is that, in many cases, it controls people’s thinking more severely than so-called non-democratic countries, especially in small states. But none of this is important, because the overall trend is set and unstoppable.
(It seems that no one is paying attention to the fact that China is imposing its most severe embargo on Japan, because the United States has just invaded Venezuela and is threatening Greenland. The United Kingdom and France have just bombed a certain country. You see, when Western countries are doing bad things themselves, they feel embarrassed to accuse others)
GPU            Memory bus   Bandwidth
Blackwell B100 8192-bit     8 TB/s
Blackwell B200 8192-bit     8 TB/s
Hopper H100    5120-bit     3.35 TB/s
Hopper H200    6144-bit     4.8 TB/s
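As a rough sanity check on those figures, you can back out the per-pin data rate each pair implies (assuming, simplistically, that total bandwidth is just bus width times per-pin rate with no overhead; the function name is mine, not from any datasheet):

```python
def per_pin_gbps(bus_width_bits: int, bandwidth_tb_s: float) -> float:
    """Per-pin data rate (Gb/s) implied by a bus width and total bandwidth."""
    # total bits per second, divided across the pins, expressed in Gb/s
    return bandwidth_tb_s * 1e12 * 8 / bus_width_bits / 1e9

print(per_pin_gbps(5120, 3.35))  # H100: ~5.23 Gb/s
print(per_pin_gbps(6144, 4.8))   # H200: 6.25 Gb/s
print(per_pin_gbps(8192, 8.0))   # Blackwell: ~7.81 Gb/s
```

The implied per-pin rates line up with successive HBM generations, which is why the bus width and the bandwidth grow together rather than independently.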
Hopper precedes the Blackwell architecture, whose implementations include the B40 and B100 accelerators.
Semiconductor fabrication progresses by half-node and transistor-gate refinements, e.g. identifying within a mature 90nm process the clear paths toward a 10nm node.
6 million Blackwell GPUs.. have left NVIDIA's warehouses.. 15.6 GW of power is required to turn on the last four quarters of NVIDIA GPUs sold.

AFAIK there is a law in Taiwan that says overseas plants cannot be on par with local plants in respect of process nodes: one or two generations behind.
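Taking the two quoted numbers at face value (an assumption on my part: that the 15.6 GW and the ~6 million units cover the same four-quarter window), the implied average draw per GPU is easy to back out:

```python
total_power_w = 15.6e9   # 15.6 GW, quoted for the last four quarters of sales
units_sold = 6_000_000   # ~6 million Blackwell GPUs shipped
per_gpu_w = total_power_w / units_sold
print(per_gpu_w)  # 2600.0 -> ~2.6 kW per GPU
```

~2.6 kW per unit is well above a single GPU's board power, so the 15.6 GW figure presumably includes each GPU's share of system-level power (CPUs, networking, cooling) rather than the chip alone.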