Personally, I see little reason to upgrade from my AM4 platform. It's never been easier to hold on to aging hardware with the advent of DLSS stretching older cards further, diminishing returns on the newer gen GPUs, and the 'realism' of video games plateauing.
Last year I said I should have upgraded my 1060 the year before.
I bought it second hand 7 years ago and it is still the same price.
I don't do much gaming, and it runs Immich / etc light inference just fine. One thing I don't regret is getting 32GB of DDR4 when I built the system around the time of the GPU upgrade.
7 years ago it was the same price, but then again, the last 7 years have involved accelerated inflation. So, the same price is actually a lower price.
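The "same nominal price is a lower real price" point is just deflating the sticker price; a minimal sketch, where both the $200 price and the 25% cumulative inflation are assumed illustrative figures, not numbers from the comment above:

```python
# Illustrative only: both the $200 price and the 25% cumulative
# inflation are assumed figures, not from the comment above.
nominal_then = 200.0            # price paid 7 years ago
nominal_now = 200.0             # same sticker price today
cumulative_inflation = 1.25     # assumed CPI growth over the period

# What the old price is worth in today's dollars:
then_in_todays_dollars = nominal_then * cumulative_inflation  # $250

# An unchanged sticker price is a real discount:
real_change = nominal_now / then_in_todays_dollars - 1
print(f"real price change: {real_change:.0%}")  # prints "real price change: -20%"
```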
If you're looking for a card in the sane $300 area, the Intel ARC B580 (12GB) or the RX 9060XT (8GB) are reasonable value. If you want 12GB+ from Nvidia or AMD, the used market in previous generations is a good place to look: maybe something like an RTX 3060 (12GB) or RX 6800XT (16GB).
I personally don't think the GPU market is incredibly miserable. Maybe I'm just used to the pain or something? Nvidia has a bit of a brand tax, but something like the RX 9070XT is basically the 3rd-fastest gaming GPU money can buy, and it's around $700. (I'm not sure why the 5070 Ti costs $200 more, even given Nvidia's software advantages; it performs almost identically, so the purchase doesn't make sense.)
If you're getting 8GB then I'd say there's not much reason to go back to previous generations.
2017 GTX 1070 and 32GB RAM. I don’t run games at 4K and still haven’t had any problems running reasonably pretty recent stuff.
It may sound like pseudo-Buddhist claptrap, but it's also true. Or, I suppose, Fight Club claptrap. It's still true.
The choice is "do you want to participate in society, its benefits and drawbacks". You can't have only one side of that.
> attachment causes suffering
it also makes the economy go round (unfortunately)

I used to think the plateau was here when the Xbox 360 and PS3 came out.
I don't mind that graphics have plateaued, because they aren't the important bit. If anything, I would rather that devs stop trying to chase graphics and make more games with shorter dev cycles.
Partially this is because there was usually an overlap in sales for early PS4 and late PS3, etc. If devs have to support both console generations, they won’t truly be able to take advantage of the newer gen stuff.
I've kept playing games and upgrading my GPU every other generation, and they're still fully utilized, but I can't really see where the additional compute and money is going. My biggest visual upgrade during that time was actually going from LED to HDR OLED which is something that requires virtually no additional processing power.
But in a way I do agree with you: I doubt it is as organized as you imply. Yes, companies and governments do not want anyone on a general computing device at all. They want to see exactly what content you are viewing and responding to.
Microsoft and Apple have been slowly adding various forms of spyware and locking down what applications you can use. And cell phones? Those are the Holy Grail of what Microsoft and Apple want to move your laptop/PC to.
Right now Linux and BSD are the only games in town for non-spyware systems. But the new age verification laws seem to be a first attempt to lock down even Linux :( Since the Linux Foundation is owned by large corporations, I feel that will succeed. As for the BSDs? Right now it seems they are flying under the radar.
Why, when emails from discovery in the labor disputes involving Google and Apple in the 2010s revealed they engage in exactly the sort of manipulation you disbelieve?
The mega-rich are 100% decoupled from physical reality. May as well treat them more like tribal shaman, priests, preachers, and rabbis.
Just parroting memes the likewise idiot politicians believe are the magic chants that keep gravity itself pulling together the Earth.
"Omg he said the thing! Cut his taxes! Give him welfare!"
Our generation of leaders was raised in a pre-science, pre-information-age world. They rely entirely on cult of personality, as their meat suit never sees itself engage in the labor it relies on to live. It's intuitively well aware of how fucked it is. Must continue to stand in the pulpit!
Why associate them with roles that have a degree of positive association and human connection?
Treating them as faeries, vampires, or demons seems more accurate.
It still is, but nobody gives a shit anymore, we are in the financialization and rent-seeker world now.
Now we are just playing with fire.
> we are in the financialization and rent-seeker world now
sometimes i wonder if this is what happens when organic growth stops/slows; when (for lack of a better word) desperate people just start looking for any alternative to keep the growth train running...

All of them come from the 'make money from other people's labour' background, and they will, of course, use that political power to optimize that wealth extraction.
You can see the divide everywhere. People with lots of money think supply and demand, congestion pricing, etc. are great tools because it doesn't impact them at all compared to people on the bottom. Those are only good solutions if you're not the one falling off the bottom rung of the ladder.
Is it really shocking that people are upset to see the supply of resources being cornered and hoarded by the ultra rich with the most likely outcome being the only way to get access to those goods will be to pay forever?
The possibility of AI becoming a must-have knowledge repository or memory assistant is scary if you couple it with the idea of never being able to own it. How much is your memory worth? What if you can't compete in terms of productivity without having access to AI? What about the people that can't afford the "first month of rent"?
People come in and make angry posts like the GP because they know they're getting disenfranchised and don't have the power to do anything to change it.
I get what you’re saying, and there definitely are people who are angry about the US slipping, and standards of living reverting to the mean a bit, and looking to blame someone. The True Believer came out in the aftermath of WW2 and tried to analyze why it happened, and laid out that the most dangerous group of people aren’t the ones who’ve been poor for a long time, but those who were recently poor, who remembered a more prosperous time. Those people get tremendously angry about it, and represent fertile ground for politicians and motivated groups to plant the seeds of hate.
People need to have some perspective. You’re not permanently locked out of useful AI models, it’s within reach of most who can save a bit to go get a pair of used 3090s on eBay and run some pretty useful models.
Is it just people trying to sow division when you're potentially describing an entire upcoming generation?
> People need to have some perspective. You’re not permanently locked out of useful AI models, it’s within reach of most who can save a bit to go get a pair of used 3090s on eBay and run some pretty useful models.
I don’t agree. The current generation of young people can’t afford housing and education without taking on decades of debt. Buying a pair of 3090s for local AI isn’t even on the radar. Even if they could, it’s unlikely they’d be able to make productive use of them. The big AI companies haven’t even scratched the surface when it comes to memory, specialized knowledge, etc.
I see people downvoted my comment and I’m not sure why. I’m not trying to pile on to create drama. I’m trying to explain there’s a growing cohort of people that have a right to be angry because they’re watching global productivity increase as their standard of living is decreasing. Who wouldn’t be upset?
The dangerous part is that people angry about it are easy to sway with propaganda. It’s not the billionaire families colluding to fix food prices, which happened with bread in Canada, it’s the “insert another marginalized group here” that’s causing the problem.
I didn't mean that most people are going to go out and drop $1,000 and run their own models locally, I meant that it's pretty good evidence that they're not permanently locked out of owning access to AI, if that's a priority to them.
I agree with most of the rest, I'm a strong proponent of all sorts of safety nets, and higher top tax rates/cap gains tax rates. But it's also important to maintain perspective. A lot of what's happened is that citizens of very rich countries are maybe seeing their standards of living decrease somewhat while many more people globally are seeing their standards of living skyrocket. Visiting family in China every 5 years, the difference is astounding every time.
Upvoted that comment, fwiw, you answered in good faith, not sure why it's downvoted.
You wave off systemic issues as no big deal and discuss the potential of a 3090 graphics card. Tell us you're a privileged first worlder without telling us...
By refusing to discuss solutions to political problems impacting a lot of people, who in our society are off the hook for you too, you're deciding to take the risk that your own life doesn't vanish.
You're not relevant to others. Americans' lack of political action to ensure a safety net exists for everyone just leaves everyone indifferent if you, too, end up giving blow jobs behind a Burger King for a portion of a kids' meal someone threw out a car window, should it come to that for you.
So go ahead and pretend reality doesn't exist outside your own experience, little Dark Triad. But if you end up penniless in the gutter, you'll only have yourself to blame.
The point about 3090s was that reasonably good local AI costs on the order of $1,000, so Americans aren't structurally locked out of owning the means to run their own models like the person I was responding to seemed to be claiming. If you can afford a desktop, local AI is in reach if owning it is a priority for you. I don't recommend that route, but it's possible.
From your other comments, it sounds like you're also a "privileged first worlder" who got to go to college and attend Burning Man, so let's not fling stones. I'm extremely lucky, and I'm extremely aware of it. A visit to some of the actual poorest parts of the world, where people wash themselves and their clothes in rivers that stink so badly of sewage that it's hard to breathe without gagging, made me very aware of how lucky even the poorest Americans are, despite how bad it can feel to be in close proximity to some of the richest people in the world when you're not.
And if you're not an account who's part of an "AI-fueled agitprop campaign", I'm sorry for whatever's happened to you that's given you so much rage that you're feeling the need to come here and dump on nearly everyone you've interacted with. I hope things go better for you in the future, I really do.
It’s bad, but it’s not “literally own nothing”.
people will own an increasing number of dumb terminals connected to rented services.
does that reduce the number of computers? well, no..
so, imo : the trick isn't to reduce physical ownership of devices, the trick is to make it so that you need Big Iron in order to do anything.
One way that might be achieved is by forming social and cultural dependence on models so large that no one individual could possibly run them...
Computers were incredibly more expensive when I was growing up. People bought them anyway.
Is a computer that lasts 5-8 really productive years (and is still serviceable for another 5-7) and costs $1500 really a deal-breaker just because it was $1000 and on sale for $850 a year ago? Even if it doubled again, it still doesn’t price normal people out, IMO.
That has never before happened in the history of computing, and it violates long-held, fundamental assumptions.
If the cheapest useful computer ends up costing $3K, it will still be purchased and will still be worth it at around $1/day of useful life.
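The $/day framing above works out as a quick sketch, assuming the $3K figure and an 8-year productive life:

```python
# The $/day framing: assumed $3,000 machine over 8 productive years.
price = 3000.0
years_of_use = 8
cost_per_day = price / (years_of_use * 365)
print(f"${cost_per_day:.2f}/day")  # prints "$1.03/day"
```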
Starbucks' revenue was almost $10B in the last quarter. Most people can clearly afford $1/day for something as useful as a computer.
It's extremely obvious from your flippant attitude that you are doing quite well financially and are completely out of touch with the financial realities of the vast majority of people. Congratulations on your financial success, but maybe lay off on thinking that everyone else can afford the luxuries that you can.
Now, per-unit costs are rising faster than inflation. The WD HDD I bought in 2017 for $65 real ($49 nominal) is now $95 real: 50% more expensive after inflation.
Trust me when I say my income has not increased by 50% post-inflation since then! (Also … I really should not have checked that number. Needless to say, it's not positive.)
What you're showing me is that you are completely out of touch with the financial realities the vast majority of people face.
There is a reason that the Macbook Neo has been a smashing success.
If the cheapest useful computer ends up costing $3k, then most people will simply no longer own a computer whenever their current computer dies unless their livelihood depends upon it, which for most people it does not.
But from what I can see a lot of people aren't really interested in PCs. Most of the non-techie, non-gamer people I know do everything on mobile.
Maybe it's different in the US. In Canada, the median income for 25-54 years old was just under $60k / year in 2024. When you're talking about a $3k USD computer, it's pushing 10% or more of the median after tax income. My gut reaction to that is that most people don't even end up with that much disposable income in total, let alone for a single purchase.
HN is skewed with people way at the top end of income earners, especially on a global scale. Imagine getting $30k / year to spend on everything you need and then consider how much $3k on a computer is.
My dad had to take a loan to buy our first computer. Who wants that? It's dumbfounding to see the number of people cheering on backwards progress where we end up where we were 3+ decades ago.
If it lasts for 10 years, it's more like 1% of the after tax income of a median individual earner over that period.
I think a computer is clearly valuable enough that people will entirely rationally spend 1% of their income on it if that's what it costs. (I'm not "cheering it on"; I'm just observing and predicting that lots of normal people will still buy computers.)
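The 1% figure checks out under the numbers discussed upthread ($3k computer, 10-year life, roughly $30k of after-tax income); a sketch:

```python
# The 1%-of-income framing, using the figures discussed above:
# $3k computer, 10-year life, ~$30k/year after-tax income.
computer_price = 3000.0
lifetime_years = 10
after_tax_income = 30_000.0

share = (computer_price / lifetime_years) / after_tax_income
print(f"{share:.1%} of annual income")  # prints "1.0% of annual income"
```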
Maybe we'll get a chinese hardware black market.
It’s brutal. I’ve just built a workstation with DDR4 and a two-generation-old CPU. I paid more for the DDR4 than it originally cost four years ago. The same amount of RAM for the latest motherboard would have been 10x ($10,000). So used DDR4 has gone through the roof, which impacts hobbyists who used to rely on “hand-me-downs”.
My high-end HEDT would now be +$2300 to build mostly due to memory and SSD pricing. 96GB of memory going from $430 -> $1800 is wild. One community member literally wouldn’t be able to buy their Mac Mini configuration anymore, plus the self-upgrade SSD would be price hiked.
Where I blanch most is my storage server running TrueNAS. Built it 3.5 years ago with future-proofing in mind: strong SSD cache layer, plus two HDDs as spares. It wasn’t cheap then, but I think between disks, SSDs, ECC memory, etc. it’s +$7000 now to rebuild, +$9000-$10000 on last-generation hardware.
They might be 20% of the price (because they don't have to invest that much in training), but are probably not 20% of the resources (ie. inference), considering they take more tokens to do the same task, and have slower inference speeds.
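That distinction (cheap per token, but more tokens per task) can be made concrete. The 1.8x token multiplier below is an assumed illustrative figure, not a measurement from the comment:

```python
# Hypothetical illustration: a model priced at 20% of the incumbent's
# per-token rate, but needing 1.8x the tokens for the same task.
price_ratio = 0.20   # assumed: 20% of the incumbent's per-token price
token_ratio = 1.8    # assumed: 80% more tokens for the same task

effective_cost_ratio = price_ratio * token_ratio
print(f"effective cost vs incumbent: {effective_cost_ratio:.0%}")  # prints "36%"
```

So a "5x cheaper" sticker price can shrink to under 3x cheaper per task, and that's before accounting for the slower inference.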
No one really resists or pushes back. When I resist I hear "that's what consumers want", "it's for security", or that I'm the problem. There is no one to complain to even, except to low paid kiddos in customer service.
Technofascism
It's really wrong that the common people have access to things like PCs. It leaves a lot of money on the table the corporations can extract, and makes control much harder. PCs should cost at least as much as a car, so only the right people can afford them.
Own nothing and be happy.
Did you intend to say "because reasons"? There should be a long chain of reasoning connecting "LLMs will never be able to strictly follow instructions written in natural language (as agreed by 90% consensus of experts or some such, because you can't formally verify adherence to informal natural language instructions)" and "physics doesn't allow that." And I can't find it anywhere. Neither in your comment history, nor in the literature.
But the fact is that there's plenty of literature out there on hallucination and unreliability of LLMs already. If you know otherwise, let us inform Dario before next funding round.
Those who earn their living from their labor, and those whose income is derived simply by owning things they (often) didn't create themselves and charge for access.
If any of these people don't work, or don't work enough, they are undeserving immoral moochers and should be miserable and in pain.
> and those whose income is derived simply by owning things they (often) didn't create themselves and charge for access.
It's totally fine if these people never lift a finger in their lives. In fact, they deserve it. NEVER question that. N-E-V-E-R! It's great! Capitalism is great! Capitalism is fair!
That confident "will" in that prognosis may ultimately stimulate a consensus "why?" response in the population to explore alternative outcomes ..
I spent the last half a century making sure they have no leverage and I am not interested in being coerced.
It's called security.
if you are living mobile, you probably need gas or batteries for warmth or cooling. if your climate is currently comfortable, temperatures can be raised.
or maybe you are a nomad hunting and gathering your own food? the wilderness can be pillaged and sold and "secured" until there's nothing left to eat.
there is no perfect security.
No lairs necessary. You can read up on people who do FIRE.
So what are you ranting against?!
> Own nothing and be happy.
Ah, here it is. Only governments can confiscate our property and force us into that. Governments and politicians that keep telling us how evil corporations are…
... Do you want corporations to have that power too or something? What are you saying here?
It isn’t only conspiracy theorists who should be disturbed by whatever politico-corporate freemasonry that goes on in Davos.
So, no, I am not too worried about Amazon removing my $9.99 book.
Market forces will probably bring the price of hardware down in the next decade. Whether it is in a form that is useful for regular people/hobbyists is another question. If not, then hopefully the "cloud" starts to look a lot different.
Of course in 20 years we'll be using more compute than today (99% likely).
EDIT: Of course cryptocurrencies provide a floor on compute pricing.
even if volume and hype decreases from the general pop there doesn't seem to be much of a cap on model requirements -- so at least one sector will be pushed into purchases one way or another.
A Rolex Daytona today is known as a very fancy and even hard-to-get watch. In the 70s they were practically giving them away with other watch purchases because electronic watches were taking their lunch.
The bigger takeaway, I think, is that the destruction and folding eventually led to the Swatch Group. People forget Rolex, Omega, et al. were tool watches that were expensive but fairly attainable. Even into the 90s you could walk into a Rolex store and walk out with the watch you wanted. Nowadays you basically have to buy a watch to prove you're good enough to get the one you want.
I foresee a similar thing happening with computing hardware. There will be a small high-end side industry for non-datacenter customers.
The digital watch user will be renting time for a thin client via a datacenter provider. The wealthy or high status user will be able to purchase the expensive boutique home computing hardware they want.
The only reason you have those watch brands to mention is because they are non-functional status symbols. People that want a watch buy something else.
The same way, people that want a computer will buy from whoever is actually selling them. Manufacturers that want to sell only to datacenters won't last for long.
> replace them with terminals that stream everything to the cloud
they've been trying for a looong time on that one. i still remember those junky "net appliance"s from the early 2000s [0] and oracle and sun making big statements about them...

[0] https://www.ecommercetimes.com/story/sonys-evilla-joins-audr...
The napkin math came out to renting being around 27 times cheaper than owning (not including power). I think we're really screwed when it comes to having owned access to AI, unless Intel comes out swinging with a C-series card that has 128GB VRAM so we can run them in a 4x128GB configuration, but that seems unlikely since Nvidia has a large share in them.
This was calculated expecting around 30tok/s, of course you can get 2-5tok/s much much cheaper, but it's unusable for my workflow.
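For anyone wanting to redo that napkin math, here's a hedged sketch. Every input below (the $15k hardware cost, 3-year life, and $0.20/M API price) is a placeholder assumption, not the parent's actual figures; substitute your own quote and rate:

```python
# Hedged rent-vs-own sketch for LLM inference. All inputs are
# placeholder assumptions; plug in your own hardware quote and API price.
hardware_cost = 15_000.0   # assumed: rig capable of ~30 tok/s on the model
lifetime_years = 3         # assumed useful life, running flat out 24/7
tok_per_s = 30
api_price_per_mtok = 0.20  # assumed blended API price, $/M tokens

lifetime_mtok = tok_per_s * 86_400 * 365 * lifetime_years / 1e6
own_price_per_mtok = hardware_cost / lifetime_mtok  # power excluded

ratio = own_price_per_mtok / api_price_per_mtok
print(f"owning costs ~{ratio:.0f}x the API rate")  # ~26x under these inputs
```

The ratio is dominated by utilization: an owned rig serving one user sits idle most of the day, while API providers amortize hardware across many tenants.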
Everyone else charges a ridiculous amount, but Deepseek's API is $0.003625 / M tok.
I'm surprised no one talks about this, because of how significant it is. GPT 5.5, for example, costs a ridiculous $0.50 / M tok cached. Deepseek is literally almost 140 times cheaper, which matters a lot for tool calls.
> The deepseek-v4-pro model is currently offered at a 75% discount, extended until 2026/05/31 15:59 UTC.
However, even when the discount ends it's still very cheap. It will go back to $0.0145 / M cache hit. That's still 34x cheaper than GPT 5.5.
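A quick check of the quoted multiples against the listed prices:

```python
# Checking the quoted multiples against the listed prices
# ($/M tokens, as given in the comments above).
gpt_cached = 0.50
deepseek_discounted = 0.003625
deepseek_regular = 0.0145

print(round(gpt_cached / deepseek_discounted))  # ~138, i.e. "almost 140x"
print(round(gpt_cached / deepseek_regular, 1))  # ~34.5, i.e. the "34x" figure
```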
If you serve a single user you'll never get your electricity price back, nevermind hardware costs.
I think we will see an abandonment of consumer-grade PC components, with individuals either pushed towards closed hardware like PlayStation, MacBooks, and Android devices, or pushed towards server-grade components. I already have a home server rack, and would recommend it for other people.
ok....
> PC in general is becoming more open than it's been in a long time as heavy MacOS/Android/iOS competition is creating a focus on open standards ...
I'm so confused by what you're trying to say here.
I'm interested to know, WHY is PC so open? what led to that?
https://en.wikipedia.org/wiki/Columbia_Data_Products
That, and I’m pretty sure the DOJ had ended the antitrust suit (which was about bundling) by the time the PC was released.
At a high level, the IBM PC platform was very well documented and sold well, to the effect of producing tons of software and peripheral add-ons ("PC compatibles"). This led some other computer companies to reverse engineer the proprietary IBM BIOS, allowing them to run the same software and use the same peripherals. Because these were clean room reimplementations, IBM didn't have a legal case to prevent their sale.
Fast forward a bit: IBM's attempt at a new, closed platform, PS/2, flopped. People wanted the more open hardware. Windows became dominant enough that all the demand was for x86-based hardware that could run Windows. Microsoft was happy to work with many vendors.
The PC is very open today, but Apple survived. Atari ST and Amiga probably survived longer than you think as well.
In the whole history of computing, the PC is the only platform where buying a computer means a crazy number of options and configuration mixes to choose from, with the expectation that it will work! And warranty would support it too! You can run any OS of your choice on it, and that's also a reasonable expectation.
On any other platform (Sun, Be, Amiga, NeXT, Apple), it was always buying from one company, only from its list of products. And even running a different version of the OS means warranty doesn't cover it.
I guess it really is just the PC.
I just want to warn people who haven't heard server-grade hardware in person before: this is only for people who can put a server rack somewhere unpopulated like a garage or basement. Servers will make you think "wow, leafblowers sure are quiet". They are not suitable for apartment dwellers such as myself. When I was setting up my 1U before shipping it off to a colo, I wrote scripts and had detailed plans of the things I needed to run so I could minimize the time it was making my ears bleed.
1Us have the most compromised ventilation and compensate with loud fans running at high speeds.
This is all built to be put in a place where noise is not an issue
I have a 4U NAS with a supermicro board and an i3 chip with 6 WD Red NAS drives and it’s very quiet. The chassis came without fans so I installed the brand I like.
It was a learning experience, and I think everyone should experience that kind of industrial noise at least once to appreciate how quiet consumer hardware is.
(And yes, my workstation has a clear case and LED RAM. Yes, I'm an idiot. Whenever Windows applies an update late at night, I wake up if it turns back on. I don't know what I was thinking when I built that thing, but never again.)
A bigger issue for enterprise hardware is that it's optimized for performance per watt under load, not idle power consumption. Running a mostly-idle rack server 24/7 can result in a pretty sizable electric bill. This also depends heavily on the model. Some will idle at ~50 watts, others at ~300, but both of these are significantly higher than a Raspberry Pi or an old laptop which for personal use will generally do the job.
Business class desktops are also a good alternative here. Many models have pretty reasonable idle power consumption (check this for yourself, I've seen 6W but also 60W) and then you get a couple of drive bays and PCIe slots and expandable RAM which you don't get from a Raspberry Pi.
It's really not worth it to run old hardware 24/7 unless it's making money. Buying a new machine of equivalent capability is (normally) pretty cheap, and it doesn't take very long for the power savings to pay for themselves.
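A ballpark of the idle-power cost behind that payback claim, assuming an electricity rate of $0.15/kWh (rates vary a lot by region):

```python
# Idle power cost per year, assuming $0.15/kWh (rates vary by region).
kwh_rate = 0.15           # assumed electricity price, $/kWh
hours_per_year = 24 * 365

def idle_cost_per_year(watts):
    """Annual cost of a constant idle draw at the assumed rate."""
    return watts / 1000 * hours_per_year * kwh_rate

for watts in (5, 50, 300):  # Pi-class, frugal server, hungry rack server
    print(f"{watts:>4} W idle -> ${idle_cost_per_year(watts):.0f}/yr")
```

At these rates, shaving 250W of idle draw saves roughly $330/year, so a few-hundred-dollar replacement machine can indeed pay for itself quickly.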
They can be had with fairly respectable specs too. Certainly enough to play around with small local models.
Some of the newer hardware is actually worse because the idle power consumption of PCs since around 2010 is determined in significant part by the low-load efficiency of the power supply. Brand new machines with the wrong power supply can use several times as much power at idle as ten year old machines with the right power supply. Annoyingly, power supply efficiency at idle is rarely documented so the only thing to do is measure it.
Even though my server rack is in the garage I try to keep it quiet. A couple of them are fanless Atom-based and others have fans but they are built to be quiet. If you need hardware that generates a lot of heat, go with 4U for large fans that spin slow, thus low noise.
The "wow, leafblowers sure are quiet" happens when you stuff a lot of heat generation into a 1U chassis that then requires lots of tiny fans running at full speed. Those you don't want at home! But it is easy to avoid. Data centers do this to maximize density, but that's unlikely to matter at home.
Not exactly enterprise-grade servers then?
I had to mod the chassis slightly (with just pliers, tape and random inserts) to fit these fans in there, and add fans in front to push the air in. The PSU that came with it was obnoxiously loud, but thankfully, Supermicro has a quiet version that I can't even hear. Even if SM didn't have this PSU, I could have easily modified the PSU and fit some noctuas in there without any issue or safety concerns - like I did with my enterprise grade Mikrotik switch that also had obnoxious fans by default.
I even have an enterprise grade UPS that is dead silent when it's not running on battery power (I swapped the fans there too).
I essentially try to buy enterprise gear whenever possible. Not only is it usually much better than the consumer alternative, it is frequently much cheaper too because of the second-hand market. Before AI sucked the soul out of the hardware market in general, you could have bought enterprise SSDs with life expectancy (TBW) measured in petabytes, and MTBF of practically never, for half the price of top consumer SSDs with TBW measured in tens of TB and an MTBF of yesterday.
And the entire rack is just slightly louder than the PC I was using.
The only consumer grade computer at my home is my MacBook and my phone.
Every house I’ve lived in has had machinery for water pumping and heating and we just put our server along with them.
A 1U case runs the gamut in noise from vacuum to jet-engine.
meanwhile..i see axiomtek industrial computers that dont even have a power button sold with 7yr warranties..
I had a 2U Xeon beast I kept water-cooled. Before I installed the water cooling, a bit noisy and 60C. Afterwards, total silence and 30C.
At least, that's what I hope happens. What will probably happen is people will continue to migrate away from the PC platform and towards closed platforms for the convenience, if history is any indication.
Is this for people trying to start the next netflix out of their garage before they have any money to put the servers in a colo?
Mostly because the base Mini has Thunderbolt 4 which maxes out at 40Gbps. Anything with a PCIe 4.0 x16 slot will take a 100Gbps NIC. 100Gbps is around 10GBps (8 bits per byte plus encapsulation overhead). Desktop CPUs can do AES-GCM at 2.5GBps+ per core and have up to 16 cores and around 50GBps of memory bandwidth (dual channel DDR4-3200), so the NIC still seems like the bottleneck.
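The arithmetic in that comparison can be laid out explicitly (the ~7% encapsulation overhead is an assumption; the other figures are as quoted in the comment):

```python
# Which link is the bottleneck? Figures as quoted in the comment;
# the ~7% encapsulation overhead is an assumption.
link_gbps = 100
link_gBps = link_gbps / 8 * 0.93   # ~11.6 GB/s of usable payload
aes_gBps = 2.5 * 16                # 2.5 GB/s per core x 16 cores = 40 GB/s
mem_gBps = 50                      # dual-channel DDR4-3200, approx.

bottleneck = min(link_gBps, aes_gBps, mem_gBps)
print(f"bottleneck: {bottleneck:.1f} GB/s (the NIC itself)")
```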
Degrowthing is dumb, people will find use if they have more.
Realistically a Mac Mini will probably blow a lot of things out of the water on price / performance. Even an older one.
An actual rack with noisy 1U or 2U servers may be a bit overkill but on the plus side there's a guaranteed endless supply of such used servers.
Now there's a happy middle ground: used workstations with ECC memory, that you then use as servers.
People would be really wise not to underestimate what a 12-year-old dual-Xeon, 14 cores each, 56 threads in total, can do, for example. And such a complete workstation can basically be found for less than what it takes to fill my car's gas tank (granted, it's got a big tank and it's a fancy car whose manufacturer recommends using only 98+ octane).
A single-Xeon workstation with a shitload of memory in a tower form factor is basically silent. Mine is: dead quiet, next to the vacuum cleaner and the cat's food in a tiny room. I use it as a headless server.
And that's with the default PSU and fans. There are, of course, people modding these with adapters for regular consumer PSUs and then putting ultra-quiet PSUs in those. Same with Noctua fans etc.
And as for the usual complaint, "but a server that is on 24/7 consumes too much electricity": I only turn on my servers at home when I begin to work; I don't need them to be on 24/7.

So yeah: "server CPU + ECC" doesn't imply noise. And "server CPU + ECC" doesn't imply it has to be on 24/7 either.
I like my Dell Precision T7910 (dual-socket Xeon FTW) a lot.
What are you using?
Most home users need a small amount of compute, and are sensitive to noise and power use.
The SoC design with unified memory is generally well suited for residential use because it's quite energy-efficient, quiet and small (compared to traditional GPU-powered gaming rigs). Great performance-per-annoyance, so to say.
Why did we listen to the Worldcoin guy again?
I would also say that most consumers, who are almost exclusively buying gaming-oriented boards, do not need anything high end. They can pretty much buy the cheapest board available.
I am shopping around for a mini ITX board and the difference between something at $180 and something at $400 is basically one to two faster USB ports, which are pretty much irrelevant on desktop computers, and a few minor conveniences that I imagine most people can do without.
The higher-end chipsets add no discernible advantage and there are no CPUs that are unsupported by the lower end chipsets (on the AMD side, at least).
The high end stuff is just available for people with a lot of money.
This is obnoxiously difficult to shop for in the desktop/workstation space.
Like, what are you actually connecting your desktop to?
The only reason laptops depend on Thunderbolt is because they have limited internal expansion and need high performance external I/O.
If you need more things than gaming boards offer then obviously you have very advanced needs and can go pay for a workstation board, something like an sTR5 socket Threadripper board.
I don't mind having to work within those physical limits, but I do want to be able to search for boards that support N components, e.g. 1x PCIe 4.0 x8, 2x 3.0 x8, and 4x 5.0 x4. But the best you can search for is the physical size of the PCIe slots, and then you have to dive into a spec sheet for each board, only to find that the six x16 slots have 1.0 x1 of bandwidth each.
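That missing search is easy to express in code once someone has the data; here's a minimal Python sketch, where the board names and their slot/lane allocations are made-up examples, not real specs:

```python
# Sketch of the PCIe-slot search that parts-comparison sites lack.
# Each slot is (pcie_generation, electrical_lanes). The boards and
# their lane allocations below are hypothetical, not real products.
BOARDS = {
    "Board A": [(4, 16), (3, 8), (3, 4)],
    "Board B": [(5, 16), (4, 8), (4, 8), (3, 4)],
    "Board C": [(3, 1)] * 6,  # six x16-sized slots, each wired as x1
}

def satisfies(slots, required):
    """True if every required (gen, lanes) pair can be covered by a
    distinct slot of at least that generation and electrical width."""
    remaining = list(slots)
    # Greedy best-fit: place the most demanding requirements first,
    # each into the least capable slot that can still carry it.
    for gen, lanes in sorted(required, reverse=True):
        candidates = [i for i, (sgen, slanes) in enumerate(remaining)
                      if sgen >= gen and slanes >= lanes]
        if not candidates:
            return False
        del remaining[min(candidates, key=lambda i: remaining[i])]
    return True

need = [(4, 8), (3, 8), (3, 8)]  # 1x PCIe 4.0 x8 + 2x 3.0 x8
matches = [name for name, slots in BOARDS.items() if satisfies(slots, need)]
print(matches)  # only boards whose electrical wiring covers the list
```

Note this keys on electrical lane width, not the physical slot size, which is exactly the distinction the spec-sheet diving is for.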
Unfortunately, as far as finding something cheaper than that, you're looking for a product that appeals to a very small to non-existent market demographic.
Most of the buyers who want workstation boards (companies) do not want a computer that requires assembly.
The demographic that builds their own PC is almost exclusively doing so to play games.
Everyone else who wants to use a computer wants a portable laptop.
The good news is that all the complaints you have about gaming boards are mostly cosmetic. There's nothing unreliable about gaming boards. They all support the specs they claim to support. You don't have to use any overclocking features (I don't). They are off by default.
If you want low idle power consumption, what you actually want is a system with soldered RAM (LPDDR), which essentially goes against the other parameters you asked for. You don't want a modular desktop PC at all if that's your parameter. What you really want is a mini PC or a Mac mini.
You're asking for a workstation board with low idle power consumption, but nobody who wants that is optimizing for low idle power consumption.
The best system for you is probably an HP, Dell, or Lenovo workstation PC. The good news for you is that these are all over eBay as corporations tend to sell them in bulk when they're done with them in just a few short years. They're reliable, quiet, and have low idle power consumption. Or, you can go with the big workstations that support ECC RAM.
We can talk all day long about how my mom waited tables for $2/hour and her first car cost $5,000 brand new; it's not relevant anymore.
Windows 10 LTSC + Firefox + uBlock Origin on an i5-9400 feels faster than my M4 Pro MBP. Probably same or better on Linux.
There's even Win 11 LTSC now
I don't remember Win10 being particularly lean (although I'm sure 11 is worse). And the M4 is definitely a much more powerful CPU. Can you not run Firefox and uBO on that? Or have they really weighed things down that much with the OS somehow?
> Probably same or better on Linux.
Even with the Cinnamon desktop environment I can vouch it uses considerably less RAM for just the desktop (ordinary applications are probably about the same) and offers much faster filesystem access by default. I'm sure this is at least partly due to not being weighed down by built-in anti-malware (that would do basically nothing for people who are comfortable using Linux in the first place).
Entry level motherboards are still $100.
$300+ is a very high end motherboard.
The existence of very high end products is confusing because it can give the impression that you have to buy a $300 motherboard because it exists. If you compare features side by side you're rarely missing anything important for the entry level motherboards.
Some people really want the best of the best and feel the need to buy motherboards with Thunderbolt 4 and other future-proofing measures just in case they might need them, but it's premium and luxury territory.
It's smarter to buy a cheap motherboard that meets your needs now. If in the future you find the need for USB4 or some other feature, upgrade the motherboard.
More often than not, builders will try to future proof for eventualities that don't arrive before it's time to upgrade to the next CPU socket anyway. There are a lot of people with expensive, outdated "futureproofed" builds who would have been better off saving the money on the original purchase so they could upgrade sooner instead.
Wanna guess how many times I've used that USB-C port? Maybe once or twice in the 9 years I've owned it. Never needed it. I also couldn't tell you what X370 is getting me that B350 wouldn't have gotten me.
"$1600 is too much for a video card" - me a few years ago on not buying an RTX4090 from nvidia's website.
"I only need 32GB of RAM. If I want more later, I'll just upgrade" - Me a year ago.
Both mistakes, with hindsight. I will always future proof from here on out.
"$100 is a reasonable amount for a video card, I know this is on the budget side but at least I have a card this way" — me 12 years ago.
"I guess it's worth it to spring for 8GB of RAM..." — me 12 years ago.
Still using the same machine, with no regrets (just the occasional bit of envy).
Different people have different expectations and requirements.
Note that this was before the RAM shortage, but I bet you could still do this now and save a little versus mid-tier current gen gear.
Then you get a new board designed for the new features instead of something several years old and you come out $100 on top.
Futureproofing is nonsense. PCs just don't work that way, and haven't for decades.
Right, but the problem is that by now your $100 new motherboard requires a new CPU and new RAM. Which is very much not $100.
In the past we got away with PCI cards to add features without changing the motherboard, but we still ended up changing everything every 2 years anyway…
Entry-level motherboards were $50 (meaning $40 on sale).
13th/14th-gen Intel Core CPUs are still more than enough for your average home gamer, Zen 5 shows only marginal improvement over Zen 4 except for a very narrow range of workloads, getting wider than a 128-bit memory bus is prohibitively expensive while relatively cheap consumer boxes like the Mac Mini run circles around dual-channel DDR5 setups, so on, so forth.
Sure, presenting this as a consequence of AI boom is convenient for a news outlet, but even before the craze both Intel and AMD were dragging their feet.
I'm not buying it. Both the premise and the new motherboard, that is.
  # dmidecode 3.7
  # SMBIOS entry point at 0xba970000
  Found SMBIOS entry point in EFI, reading table from /dev/mem.
  SMBIOS 3.3.0 present.

  Handle 0x0023, DMI type 16, 23 bytes
  Physical Memory Array
      Location: System Board Or Motherboard
      Use: System Memory
      Error Correction Type: Multi-bit ECC
      Maximum Capacity: 1 TiB
      Error Information Handle: 0x0022
      Number Of Devices: 8

  Handle 0x0027, DMI type 17, 92 bytes
  Memory Device
      Array Handle: 0x0023
      Error Information Handle: 0x0026
      Total Width: 72 bits
      Data Width: 64 bits
      Size: 32 GiB
      Form Factor: DIMM
      Set: None
      Locator: DIMM5
      Bank Locator: BANK4
      Type: DDR4
      Type Detail: Synchronous Registered (Buffered)
      Speed: 3200 MT/s
      Manufacturer: Unknown
      Serial Number: 05A23401
      Asset Tag: Not Specified
      Part Number: RRD25600D4C8K256
      Rank: 2
      Configured Memory Speed: 3200 MT/s
      Minimum Voltage: 1.2 V
      Maximum Voltage: 1.2 V
      Configured Voltage: 1.2 V
      Memory Technology: DRAM
      Memory Operating Mode Capability: Volatile memory
      Firmware Version: Unknown
      Module Manufacturer ID: Bank 1, Hex 0x80
      Module Product ID: Unknown
      Memory Subsystem Controller Manufacturer ID: Unknown
      Memory Subsystem Controller Product ID: Unknown
      Non-Volatile Size: None
      Volatile Size: 32 GiB
      Cache Size: None
      Logical Size: None
      ...
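The giveaway in output like that is the width pair: a Total Width of 72 bits against a Data Width of 64 bits means 8 extra ECC check bits per rank. A minimal Python sketch of that check against dmidecode text (the `has_ecc_dimms` helper is mine, not a standard tool):

```python
import re

# Spot the ECC giveaway in `dmidecode -t memory` output: a module whose
# "Total Width" exceeds its "Data Width" carries extra check bits
# (72 vs 64 bits = 8 ECC bits on DDR4 RDIMMs like the one above).
def has_ecc_dimms(dmi_text):
    totals = [int(t) for t in re.findall(r"Total Width: (\d+) bits", dmi_text)]
    datas = [int(d) for d in re.findall(r"Data Width: (\d+) bits", dmi_text)]
    return bool(totals) and all(t > d for t, d in zip(totals, datas))

sample = "Total Width: 72 bits\nData Width: 64 bits\n"
print(has_ecc_dimms(sample))  # True
```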
On modern systems (all 64-bit AMD, and Intel Core "i" onwards, so quite old now) the memory controller is integrated into the CPU, so what the CPU supports is what you get, and the latest CPUs are DDR5-only. Intel did have a transitional phase of CPUs that can do either DDR4 or DDR5 depending on the motherboard, but for AMD it's AM4 = DDR4, AM5 = DDR5.
What's that? Egg prices are back down after suppliers cranked up their output? Surely nothing like that is possible with hardware... Personal computing is dead forever...
https://www.reuters.com/world/asia-pacific/sk-hynix-invest-a...
> SK Hynix has reportedly broken ground on a new advanced memory packaging facility in West Lafayette, Indiana, that should boost the supply of US-made high-bandwidth memory (HBM)
https://www.theregister.com/on-prem/2026/04/22/sk-hynix-brea...
> Samsung to advance mega-fab expansion by 6 months to get ahead in capacity race; SK Hynix follows suit
https://www.kedglobal.com/korean-chipmakers/newsView/ked2026...
If you really want to see a radical shakeup that would have some very exciting effects, could I interest you in a little Total Atomic Annihilation?
But RAM prices went to the moon, so I instead opted to repair the desktop. (It's only ~15 years old.) It's alive, again, and performs well enough.
The HDD in it is pretty old (not as old as the rest of it, it's on its second drive; 15 years would be quite impressive!), and still works for now, but there too, prices are silly and well above inflation. (I looked it up again: the same HDD is 50% more expensive today than when I bought it, in real, accounting-for-inflation dollars.)
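For anyone wanting to redo that real-dollars arithmetic, the comparison is just a CPI ratio; a minimal sketch with placeholder index values (illustrative assumptions, not official CPI figures):

```python
# Deflate an old price into today's dollars via a CPI ratio to compare
# prices in real terms. The index values here are rough placeholders
# for illustration, not official CPI data.
CPI = {2018: 251.1, 2025: 322.0}

def real_change_pct(old_price, new_price, old_year, new_year):
    """Percent change in real terms between two nominal prices."""
    old_in_new_dollars = old_price * CPI[new_year] / CPI[old_year]
    return 100 * (new_price - old_in_new_dollars) / old_in_new_dollars

# A part still selling at its original $200 seven years later has
# gotten cheaper in real terms (negative change):
print(round(real_change_pct(200, 200, 2018, 2025), 1))  # -22.0
```

The same function with a higher new price shows how an HDD can be "50% more expensive in real dollars": the nominal increase has to beat the CPI ratio first.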
Since this mess started, I've bought dozens of unused and like-new systems for clients. All with modern hardware - in the $250-$600 range.
----------------
1. Within a few months, these manufacturers will likely raise desktop mobo and CPU prices with the justification that "volumes are too low".
2. If you're upgrading from an older machine, it likely has a format of RAM that's not compatible with newer boards. Upgrading the cheap parts now and waiting for the expensive bits to come down is simply not an option. It's all or nothing.
Game and application developers should be paying close attention to this. You're used to the average user's system spec going up every year. That's stopped for now. The average memory in new systems may actually retreat!
Instead, I hit up eBay, got six used gen3 processors, found a "good deal" on a couple TB of new RAM (still insanely expensive), and came out with the same overall horsepower for a total of $20k instead of $110k.
I know this is about consumer desktop, but seeing the comments about upgrading old hardware caused me to chime in. This is happening in the production/enterprise level in some segments.