This feels like a "no one needs more than 640k of RAM"^ kind of comment.
Only triple? In 24 years? Maybe I'm misunderstanding the transition from fossil fuels to renewables, where we're replacing one source with another rather than increasing total power use, but it does feel like the demand for power, especially with data centres in the current news cycle, would take us 10x in "the shortest amount of time it's possible to 10x power generation".
> Fusion: Once the technology is fully commercialized within the next decade
I don't really want to say it, but isn't the joke that fusion has been a decade away for 50 years?
^ I know this is not quite what was said; I'm just using it for reference.
So you have two opposing forces in action: rapidly increasing demand for electricity-consuming services, and rapidly increasing efficiency of those services. It also helps that a lot of that additional demand is only possible because of the increased efficiency. Imagine if every phone was as power-inefficient as an old Pentium 4. They would last about 30 minutes and burn your hands in the process.
Even with datacentres and AI, there is huge economic pressure to increase the efficiency of the devices involved, and there's been no slowdown in year-on-year increases in compute/W, even if the growth in total compute per chip isn't as rapid as it used to be.
You may argue that Jevons paradox might not apply to home power use. I mean, how many lights and how many refrigerators could one house possibly have? But AI use and its associated power consumption is VERY susceptible to Jevons paradox.
> The report finds that data centers consumed about 4.4% of total U.S. electricity in 2023 and are expected to consume approximately 6.7 to 12% of total U.S. electricity by 2028.
Even if data centers went 10x, that would only increase our electricity use by a bit over a third.
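A quick sanity check of that arithmetic (a throwaway sketch; the 4.4% share is from the quoted report, the rest is just the sum):

    # If data centers are 4.4% of US electricity today and grow 10x,
    # the other 95.6% is unchanged, so the new total relative to today is:
    dc_share = 0.044                              # 2023 share, from the report
    new_total = (1.0 - dc_share) + dc_share * 10
    print(new_total)                              # ~1.396, i.e. a bit over a third more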
But for something more fun:
This[1] says global energy use is 186,000 TWh/year, or an average of about 21.2 TW.
The surface area of the earth[2] is 510 million km^2, or 510 trillion m^2.
Which works out to global energy use being an average of about 42 mW/m^2.
Per Wikipedia[3], the IPCC puts human-caused radiative forcing at 2.72 W/m^2.
Which is "only" about 65 times global energy use.
Which means if we did start using double-digit multiples of our current energy use, it starts to matter whether we're adding that energy to the environment (fission/fusion, fossil, probably geothermal) or just redirecting it (hydro, wind, solar). With the caveat for solar that the panels probably have lower albedo than what they're on top of.
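For anyone who wants to check the arithmetic, here it is as a quick Python sketch (the inputs are just the figures cited above; nothing else assumed):

    # Global energy use as a flux over Earth's surface vs. greenhouse forcing.
    HOURS_PER_YEAR = 365.25 * 24                     # ~8766 h

    energy_twh_per_year = 186_000                    # from [1]
    avg_power_tw = energy_twh_per_year / HOURS_PER_YEAR   # ~21.2 TW

    earth_area_m2 = 510e12                           # 510 million km^2, from [2]
    flux_w_m2 = avg_power_tw * 1e12 / earth_area_m2  # ~0.042 W/m^2

    ghg_forcing_w_m2 = 2.72                          # IPCC figure, via Wikipedia [3]

    print(f"{avg_power_tw:.1f} TW")                  # 21.2 TW
    print(f"{flux_w_m2 * 1000:.0f} mW/m^2")          # 42 mW/m^2
    print(f"{ghg_forcing_w_m2 / flux_w_m2:.0f}x")    # 65x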
[1] https://ourworldindata.org/energy-production-consumption
[2] https://www.universetoday.com/articles/surface-area-of-the-e...
I understand that in the '50s we needed reactors to create plutonium to fend off the Russians.
I understand that in the '80s solar panels were expensive.
But now, when panels are cheap, lithium batteries are cost-competitive, and sodium batteries are being actively developed (and already put into cars), there is simply no excuse.
Why then?
Australia is the same. More sunshine than we know what to do with. Vast amounts of land that is essentially unpopulated and no good for much else (arguably, the beauty of nature etc.), and yet we have two main political parties: one is passionately anti-renewables and essentially drill-baby-drill whilst the other is milquetoast on renewables.
... all the while Australia is dependent upon importing the fuel that powers critical infrastructure and logistics. Makes no fucking sense whatsoever, unless the status quo is making massive profits and can't face the possibility of any alternative.
Bring on the funerals.
a) expensive (meaning much more money can be extracted from the taxpayers), or
b) we do need to keep the nuclear physicists employed (but why cancel the SSC, then?)