- Three conductors vs two, but they can be the next gauge up since the current flows on three conductors
- no significant skin effect at 400Hz -> use speaker wire, lol.
- large voltage/current DC breakers are... gnarly, and expensive. DC does not like to stop flowing
- The 400Hz distribution industry is massive; the entire aerospace industry runs on it. No need for niche or custom parts.
- 3 phase @ 400Hz gives 6 pulses = 2.4kHz. Six diodes will rectify it with almost no relevant amount of ripple (Vmin is ~87% of Vmax) and very small caps will smooth it.
As an aside, with three (or more) phases you can use multi-tap transformers and get an arbitrary number of pulses. 7 phases at 400Hz -> 5.6kHz. Your PSU is now 14 diodes and a ceramic cap.
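The pulse-count arithmetic above can be sanity-checked in a few lines (a sketch: the cos(pi/pulses) valley assumes ideal diodes and symmetric phases):

```python
import math

def rectifier_ripple(phases: int) -> float:
    """Vmin/Vmax for an ideal full-wave rectifier fed by `phases`
    symmetric AC phases (pulse number = 2 * phases).  The output
    rides the envelope of the phase peaks, so the valley between
    adjacent pulses sits at cos(pi / pulses) of the peak."""
    pulses = 2 * phases
    return math.cos(math.pi / pulses)

# 3 phases -> 6 pulses at 400Hz = 2.4kHz ripple, valley ~86.6% of peak
print(f"3-phase: Vmin/Vmax = {rectifier_ripple(3):.3f}")
# 7 phases -> 14 pulses = 5.6kHz ripple, valley ~97.5% of peak
print(f"7-phase: Vmin/Vmax = {rectifier_ripple(7):.3f}")
```

The "arbitrary number of pulses" point falls out directly: each extra phase pushes the valley closer to the peak, which is why the filter cap shrinks.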
- you still get to use step up/down transformers, but at 400Hz they're very small.
- merging power sources is a lot easier (but for the phase angle)
- DC-DC converters are great, but you're not going to beat a transformer in efficiency or reliability
What are you talking about? There's a very significant skin effect at 400Hz. Skin effect goes up with frequency. These datacenters use copper busbars, not cable, so skin effect is an important consideration.
You obviously need at least a dozen strands in parallel!!
Clearly skin effect scales with frequency, but 400 Hz is still low: skin depth goes as the inverse square root of frequency, so it's only ~2.6x shallower than at line frequency, about 3mm. 3mm on each side makes for a pretty hefty rectangular cross-section.
I'm pretty sure you have my delivery address from when I bought sorted Lego from you about 10 years back.
Let me know when to expect the 100,000Amp test equipment!
I shall make sure I wear better PPE than just my reading glasses.
:-)
Ah, that Lego project... that was one I always wondered if I should have industrialized, but sourcing enough Lego was a real problem.
That's low voltage lightning :)
now run that unshielded wire 50 meters past racks of GPUs and enjoy your EMI
> The 400Hz distribution industry is massive; the entire aerospace industry runs on it
nothing in that catalog is rated for 100kW–1MW rack loads at 800Vrms
> 3 phase @ 400Hz is x6 = 2.4kHz... Your PSU is now 14 diodes and a ceramic cap
you still need an inverter-based UPS upstream, which is the exact conversion stage DC eliminates
> large voltage/current DC breakers are.. gnarly, and expensive. DC does not like to stop flowing
SiC solid-state DC breakers are shipping today from every major vendor
> DC-DC converters are great, but you're not going to beat a transformer in efficiency or reliability
wide-bandgap converters are at 95%+ with no moving parts
Multipole expansion scales faster than r^2.
Also, I'm not in the field (clearly), but GPUs can't handle 2.4 kHz? The quarter wavelength is 30 km.
"nothing in that catalog is rated for 100kW–1MW rack loads at 800Vrms"
Current-wise, the catalog covers this range just fine. As to the voltages, well, that's the whole point of AC! The voltage you need is but a few loops of wire away.
"you still need an inverter-based UPS upstream, which is the exact conversion stage DC eliminates"
So keep it? To clarify, this is the "we're too good for plebeian power, so we'll transform it AC->DC->AC", right?
"SiC solid-state DC breakers are shipping today from every major vendor"
Of course they are. They're also pricey and have limited current capability (both of which are capital costs, and therefore irrelevant when the industry is awash with GCC money), plus higher conduction losses, and therefore more heat.
They're really nice though.
"wide-bandgap converters are at 95%+ with no moving parts"
Transformers have no moving parts either. Loaded, they can do 97%+ efficiency; that 2% difference is 2MW of heat eliminated on a 100MW center.
The skin depth, by the way, is sqrt(2 × 1.7e-8 Ω·m / (2π × 400 Hz × μ0)) ≈ 3mm for copper. OK for a single rack, but it starts to be significant for the type of bus bars that an aisle of racks might want.
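For reference, the same skin-depth formula plugged in for copper at line frequency and at 400Hz (a quick sketch using the resistivity value quoted above):

```python
import math

MU0 = 4 * math.pi * 1e-7   # vacuum permeability, H/m
RHO_CU = 1.7e-8            # copper resistivity, ohm*m (as quoted above)

def skin_depth(freq_hz: float, resistivity: float = RHO_CU) -> float:
    """delta = sqrt(2 * rho / (omega * mu0)), in metres."""
    omega = 2 * math.pi * freq_hz
    return math.sqrt(2 * resistivity / (omega * MU0))

print(f"50 Hz:  {skin_depth(50) * 1000:.1f} mm")   # ~9.3 mm
print(f"400 Hz: {skin_depth(400) * 1000:.1f} mm")  # ~3.3 mm
```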
As for efficiency, both 400Hz transformers AND fancy DC-DC converters are around 95% efficient, except that AC then requires electronics to rectify it to DC, losing another few percent; so the slight advantage actually goes to DC.
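A back-of-the-envelope for the cascaded stages (the 95% figures come from the comment above; the 98%-efficient rectifier stage and the 100MW facility size are assumptions for illustration):

```python
# Illustrative only: the 95% stage figures come from the comment above,
# the 98%-efficient rectifier stage is an assumption.
P_IN_MW = 100.0

def chain(*etas: float) -> float:
    """End-to-end efficiency of conversion stages in series."""
    out = 1.0
    for eta in etas:
        out *= eta
    return out

ac_path = chain(0.95, 0.98)  # 400Hz transformer, then rectification
dc_path = chain(0.95)        # single DC-DC stage

print(f"AC path: {ac_path:.3f} -> {P_IN_MW * (1 - ac_path):.1f} MW of heat")
print(f"DC path: {dc_path:.3f} -> {P_IN_MW * (1 - dc_path):.1f} MW of heat")
```

Cascading is multiplicative, so every extra stage eats its few percent of the whole load.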
As for merging power, remember that DC DC converter uses an internal AC stage, so it's the same---you can have multiple primary windings, just like for plain AC.
I am a recovering audiophool.
I do own a pair of 2m long Monster Cable speaker cables (with locking gold plated banana plugs). I am fairly certain I've used welders with smaller cables.
(In my defence, I bought those as a teenager in the late 80s. I am not so easily marketed to with snake oil these days. I hope.)
(On the other hand, I really like the idea of a reliably stable plus and minus 70V or maybe 100V DC power supply to my house. That'd make audio power amplifiers much easier and lighter...)
See e.g. https://www.dell.com/support/kbdoc/en-us/000221234/wiring-in...
Honestly, that was pretty surprising to me when I had to work with some telco equipment a couple of decades ago. To this day, I don't think I've encountered anything else that requires negative voltage relative to ground.
[1] https://www.analogisnotdead.com/article26/what-is-going-on-w...
So the grid was always charging up the lead acid batteries, and the phone lines were always draining them? Or was there some kind of power switching going on where when the grid was available the batteries would just get "topped off" occasionally and were only drained when the power went out?
Actually, there was one. Even earlier phones had their own power: a dry-cell battery in each phone, and every 6 months the phone company would come around with a cart and replace everyone's battery. Central battery was found to be more convenient, since phone company employees didn't have to go around to everyone's site, and central offices could achieve economies of scale with actual generators feeding rechargeable batteries.
I was wiring in a phone extension for my grandma once as a boy and grabbed the live cable instead of the extension and stripped the wire with my teeth (as you do). I've been electrocuted a great number of times by the mains AC, but getting hit by that juicy DC was the best one yet. Jumped me 6ft across the room :D
Yes, of course both of those things are true, and yes, some data centers do engage in those processes for their unique advantages. The issue is that aside from specialty kit designed for that use (like the AWS Outposts with their DC conversion), the rank-and-file kit is still predominantly AC-driven, and that doesn't seem to be changing just yet.
While I'd love to see more DC-flavored kit accessible to the mainstream, it's a chicken-and-egg problem that neither the power vendors (APC, Eaton, etc) nor the kit makers (Dell, Cisco, HP, Supermicro, etc) seem to want to take the plunge on first. Until then, this remains a niche-feature for niche-users deal, I wager.
https://www.nokia.com/bell-labs/publications-and-media/publi...
Every single DC I’ve worked in, from two racks to hundreds, has been AC-driven. It’s just cheaper to go after inefficiencies in consumption first with standard kit than to optimize for AC-DC conversion loss. I’m not saying DC isn’t the future so much as I’ve been hearing it’s the future for about as long as Elmo’s promised FSD is coming “next year”.
Looking at the manual for the first server line that came to mind, you can buy a Dell PowerEdge R730 today with a first-party supported DC power supply.
If there had been anything like a high-power transistor back then, he would have used that. High-power transistors robust enough to handle the grid were designed only recently, over 100 years after the Tesla/Edison AC/DC argument.
the podcaster Sebastian Major from "Our Fake History" did a looonnngg patreon episode on tesla and debunked most of the weird myths around tesla. Sebastian doesn't have a vendetta or anything, it's just amazing how much of the Tesla stuff is just nonsense or is viewed through a very weird bias nowadays. Major also briefly touches on the weird Edison stuff and how the internet has twisted Edison into a villain.
IMHO, the vision he had about universal free electricity (transmitted wirelessly) was the dumbest. It was a novel idea, and he invested a lot (his time and other people's money) in it. The problem with his idea is that there was no way to monetize it (and profit from it). (There were also the technical issues of the power loss over distance (1/R^2), the harm to the environment, and the interference with radio communications.)
Edison was quite a villain. He stole many of his "inventions", and orchestrated a PR campaign against Tesla touting the "evils" of AC power. AFAIK, the electric chair was either invented or inspired by him.
I know these things because I've read many books on various topics related to Tesla, and all of this knowledge predates the Internet.
This!
The sooner people realize these facts the better. Pervasive high-rise buildings did not happen before the invention of modern cranes.
Exactly twenty years ago I was doing novel research on GaN characterization, and my supervisors made a lot of money with consultations around the world, and successfully founded a government-funded start-up company around the technology. Together with SiC, these are the two game-changing power devices built on wide-bandgap semiconductor technology that is only maturing recently.
Heck, even the Nobel prize-winning blue LED discovery was only made feasible by GaN. Watch the excellent video made by Veritasium for this back story [1].
[1] Why It Was Almost Impossible to Make the Blue LED:
I only found Edison in the headline, I didn't find it anywhere in the body, nor did I find Tesla. Glancing through the article it almost seems like someone tried to make a catchy headline to get clicks.
I always thought AC’s primary benefit was its transmission efficiency??
Would love to learn if anyone knows more about this
To expand on this, a given power line can only take a set maximum current and voltage before it becomes a problem. DC can stay at this maximum voltage constantly, while AC spends time going to zero voltage and back, so it's delivering less power on the same line.
The transmission efficiency of AC comes from the fact that you can pretty trivially make a 1 megavolt AC line. The higher the voltage, the lower the current has to be to provide the same amount of power. And lower current means less power in line loss due to how electricity be.
But that really is the only advantage of AC. DC at the same voltage as AC will ultimately be more efficient, especially if it's humid or the line is underwater. Due to how electricity be, a change in the current of a line will induce a current into conductive materials. A portion of AC power is being drained simply by the fact that the current on the line is constantly alternating. DC doesn't alternate, so it doesn't ever lose power from that alternation.
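The I^2·R argument in numbers (a sketch: the 100 MW load and 10 ohm line resistance are made-up illustrative values, not from the comment):

```python
def line_loss_fraction(power_w: float, volts: float, r_ohm: float) -> float:
    """Fraction of transmitted power lost as I^2 * R heat in the line."""
    current = power_w / volts
    return current * current * r_ohm / power_w

# Hypothetical numbers: 100 MW over a line with 10 ohm round-trip resistance.
for kv in (100, 1000):
    frac = line_loss_fraction(100e6, kv * 1e3, 10.0)
    print(f"{kv:4d} kV: {frac * 100:.2f}% lost in the line")
```

Ten times the voltage means a tenth the current and a hundredth the line loss, which is the whole "pretty trivially make a 1 megavolt AC line" point.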
Another key benefit of DC is can work to bridge grids. The thing causing a problem with grids being interconnected is entirely due to the nature of AC power. AC has a frequency and a phase. If two grids don't share a frequency (happens in the EU) or a phase (happens everywhere, particularly the grids in the US) they cannot be connected. Otherwise the power generators end up fighting each other rather than providing power to a load.
In short, AC won because it was cheap and easy to make high voltage AC. DC is coming back because it's only somewhat recently become affordable to make similar transformations on DC, from high to low and low to high voltages. DC carries further benefits that AC does not.
BTW, megavolt DC-DC converters are a sight to behold: https://en.wikipedia.org/wiki/File:Pole_2_Thyristor_Valve.jp...
There are many factors involved, and "efficiency" is only one. Cost is the real driver, as with everything.
AC is effective when you need to step down frequently. Think transformers on poles everywhere. Stepping down AC using transformers means you can use smaller, cheaper conductors to get from high-voltage transmission to lower-voltage distribution and, finally, to low-voltage consumers. Without this, you need massive conductors and/or high voltages and all the costs that go with them.
AC is less effective, for instance, when transmitting high power over long, uninterrupted distances or feeding high density DC loads. Here, the reactive[1] power penalty of AC begins to dominate. This is a far less common problem, and so "Tesla won" is the widely held mental shortcut. Physics doesn't care, however; the DC case remains and is applied when necessary to reduce cost.
IEEE 802.3bt can deliver up to 71W at the destination: just pull Cat 5/6 everywhere.
* https://en.wikipedia.org/wiki/Power_over_Ethernet#Standard_i...
(Am I just showing my age here? How many of you have ever bought incandescent globes for house lighting? I vaguely recall it may be illegal to sell them here in .au these days. I really like quartz halogen globes, and use them in 4 or 5 desk lamps I have, but these days I need to get globes for em out of China instead of being able to pick them up from the supermarket like I could 10 or 20 years ago.)
However, higher DC voltage is riskier, and it's not at all standard for electrical and building code reasons. In particular, breaking DC circuits is more difficult because there's no zero-crossing point to naturally extinguish an arc, and 170V (US/120VAC) or 340V (Europe/240VAC) is enough to start a substantial arc under the right circumstances.
Unfortunately for your lighting, it's also both simple and efficient to stack enough LEDs in series that their total forward voltage drop is approximately the rectified peak (i.e. targeting that 170/340V peak). That means the bulb needs only one serial string of LEDs without parallel balancing, making the rest of the circuitry (including voltage regulation, which would still be necessary in a DC world) simpler.
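A quick sketch of the series-string arithmetic (the 3V forward drop is an assumed typical figure; real white LEDs vary roughly 2.7-3.3V):

```python
LED_VF = 3.0  # assumed typical white-LED forward voltage, volts

def string_length(peak_v: float, vf: float = LED_VF) -> int:
    """How many series LEDs fit under a given rectified peak voltage."""
    return int(peak_v // vf)

print(string_length(170))  # rectified 120 VAC peak -> 56 LEDs
print(string_length(340))  # rectified 240 VAC peak -> 113 LEDs
```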
The gain from DC-DC converters is small, and DC devices are a small part of usage compared to appliances. There is no way it will pay back the costs of replacing all the appliances.
The part that would genuinely be cheaper is avoiding problematic flicker. It takes a reasonably high quality LED driver to avoid 120Hz flicker, but a DC-supplied driver could be simpler and cheaper.
(My stand mixer is the lone sad exception)
I spent a few years getting flown out around the world to service gear at different datacenters. I learned to pack an IEC 60320 C14 to NEMA 5-15R adapter cable and a dumb, un-protected* NEMA 5-15R power strip. While on-site at the datacenters, an empty PDU receptacle was often easy to find. At hotels, I'd bring home a native cable borrowed from or given to me by the native datacenter staff or I'd ask the hotel front desk to borrow a "computer power cable," (more often, I'd just show them a photo) and they generally were able to lend me one. It worked great. I never found a power supply that wasn't content with 208 or 240V.
Example adapters: https://www.amazon.com/dp/B0FD7PHB7Y or https://www.amazon.com/dp/B01IBIC1XG
*: Some fancier power strips with surge suppression have a MOV over-voltage varistor that may burn up if given 200V+, rendering the power strip useless. Hence, unprotected strips are necessary.
My understanding is that DC breakers are somewhat prone to fires for this reason, too.
The electricians I was working with also told me stories about how with the really big breakers, you don't stand in front of it when you throw it, because sometimes it can turn into a cloud of molten metal vapor. And that's just using them as intended.
Allegedly
While on "work experience" from high school I was put on washing power lines coming straight out of the local power station near the ocean - lots of salt buildups to clear.
Same deal, flashover suits and occasional arcs .. and much laughter from the ground operators who drifted the work bucket close.
It would have self-extinguished if you waited long enough for the probe to vaporize.
Thinking about the failure modes gave me the heebie jeebies, but the gas had been disconnected ages prior.
Electromagnets don't work for DC, so your breaker will never trip. For thermal protection, you need current, so that checks out, and it would make sense for it to be rated under 50V as that's considered the highest voltage that's not life-threatening on touch.
PV batteries in general have a very high current (100s of A) at ~50V-ish, so I don't think there's a major use case for using household breakers for them.
I'm still not getting your point, BTW: switches and breakers are two separate things with different workings, and household (and datacenter) DC would, I think, be around 400-ish V, which is a bit higher than the peak voltage of AC but still within the arc limits of household wiring (at least in 230V countries).
The advantage of DC is that you use your wiring more efficiently, as the mean and peak values are the same at all times. Going with 48V would mean high resistive losses.
AC arcs are easier to extinguish than DC arcs, but DC will creep much easier than AC and so on.
From a personal point of view: I've worked enough with both up to about 1kV at appreciable power levels and much higher than that at reduced power. Up to 50V or so I'd rather work with DC than AC, but they're not much different. From there up to 400V or so I'd much rather have AC, and above 400V the answer is 'neither', because you're in some kind of gray zone where creep is still low so you won't know something is amiss until it is too late. And above 1kV in normal settings (say, picture tubes in old small b&w TVs, and higher when they're color and larger) it will throw you right across the room, but you'll likely live because the currents are low.
HF HV... now that's a different matter and I'm very respectful of anything in that domain, and still have a burn from a Tronser trimmer more than 45 years after it happened. Note to self: keep eye on SWR meter/Spectrum analyzer and finger position while trimming large end stages.
Can you say more about "creep"? Is the resistance changing? Or is material actually migrating?
Also curious why it's worse using DC.
If your house gets 800V DC you're still gonna need "bricks" to convert that to the 5VDC or 12VDC (or maybe 19VDC) that most of the things that currently have "bricks" need.
And if your house gets lower voltage DC, you're gonna have the problem of worth-stealing sized wiring to run your stove, water heater, or car charger.
I reckon it'd be nice to have USB C PD ports everywhere I have a 220VAC power point, but 5 years ago that'd have been a USB type A port - and even now those'd be getting close to useless. We use a Type I (AS/NZS 2112) power point plug here - and that hasn't needed to change in probably a century. I doubt there's ever been a low voltage DC plug/socket standard that's lasted in use for anything like that long - probably the old "car cigarette lighter" 12DC thing? I'm glad I don't have a house full of those.
Once you get into higher power (laptops and up), switching and distribution get harder, so the advantages fade.
For bigger appliances (fridge, etc), AC is fine + practical.
The irony...
However, there's also PoE (24 or 48V!), so maybe that's the right approach. It's not like each outlet is going to run a heater anyway.
Unless you mean running AC and installing inverters in the wall? What is this even for? All my electronics are DC but critically they all require different voltages. The only thing I might benefit from would be higher voltage service because there are times that 15 A at 120 V doesn't cut it.
Other people, of course, have other definitions of high voltage:
"This resonant tower is known as a Tesla coil. This particular one is just over 17 feet tall and it can generate about a million volts at 60,000 cycles per second."
and:
"This pulse forming network can deliver a shaped pulse of over 50,000 amps with a total energy of about 1,057 times the tower primary energy"
It is silly to have AC-to-DC converters in all of my wall-connected electronics (LED bulbs, home controller, computer equipment, etc.)
You could wire your house for 12, 24 or 48V DC tomorrow and some off-grid dwellers have done just that. But since inverters have become cheap enough such installations are becoming more and more rare. The only place where you still see that is in cars, trucks and vessels.
And if you thought boiling water in a camper on an inverter is tricky, wait until you start running things like washing machines and other large appliances off low-voltage DC. You'll be using massive cables, the cost of which will outweigh any savings.
In all likelihood not worth the trouble. When I moved to Canada I gave away most of my power tools for that reason, and when I moved back I had to do that all over again.
If you ever have to do it again, you can probably get a transformer rated high enough for power-tools for cheaper than replacing all of your power tools.
How expensive would a proper AC->DC->AC brick for that power level be?
A pure sine wave inverter for that kind of power is maybe 600 to 1000 bucks or so, then you'd still need the other side and maybe a smallish battery in the middle to stabilize the whole thing. Or you could use one of those single-phase inverters they use for motors.
I think the answer to your question is that it mostly doesn't matter for personal mug size quantities of hot water and if it does matter to you there are readily available competing options such as dedicated taps for your kitchen sink.
Perhaps the biggest reason is that a traditional kettle on any half decent electric range will match if not exceed the power output of any imported electric kettle. Many even go well beyond that with one burner marked "quick boil" or similar.
I end up converting stuff anyhow, because all my loads run at different voltages: even though I had my lights, vent fan, and heater fans running on 12V, I still ended up having to change voltages for most of the loads I wanted to run, or generate AC to charge my computer and run a rice cooker.
Not to mention that running anything that draws any real power quickly needs a much thicker wire at 12V. So you're either needing to run higher voltage DC than all your loads for distribution and then lowering the voltage when it gets to the device, or you simply can't draw much power.
Not that you can't have higher voltage DC; with my newer system the line from my solar panels to my charger controller is around 350VDC and I can use 10awg for that... but none of the loads I own that draw much power (saws, instapot, rice cooker, hammond organ, tube guitar amp) take DC :D
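The copper-sizing tradeoff above can be sketched numerically (the run length, loss budget, and 1 kW load are all assumed illustrative values):

```python
RHO_CU = 1.7e-8  # copper resistivity, ohm*m

def min_area_mm2(power_w: float, volts: float,
                 run_m: float = 10.0, max_loss: float = 0.03) -> float:
    """Copper cross-section (mm^2) needed to keep resistive loss under
    `max_loss` of `power_w` over a `run_m` one-way run (2x conductor)."""
    current = power_w / volts
    r_max = max_loss * power_w / (current * current)  # so I^2 * R <= budget
    return RHO_CU * (2 * run_m) / r_max * 1e6

# 1 kW load, 10 m away, 3% loss budget -- all assumed for illustration
for v in (12, 350):
    print(f"{v:3d} V: {min_area_mm2(1000, v):6.2f} mm^2 of copper")
```

The required copper scales with 1/V^2, which is why a 12V run for anything beyond a few hundred watts gets absurd fast while 350V barely needs a wire at all.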
Many datacenters I'd been to at that point were already DC.
Didn't think this was that new of a trend in 2026, but I also acknowledge I haven't visited more than a handful of datacenters since 2007.
It just seemed like an undeniably logical thing to do.
GE has a paper about the power conversion design, but it doesn't mention the unit to rack electrical and mechanical interface. Liteon is working on that, but the animation is rather vague.[2] They hint at hot plugging but hand-wave how the disconnects work. Delta offers a few more hints.[3] There's a complex hot-plugging control unit to avoid inrush currents on plug-in and arcing on disconnect. This requires active management of the switching silicon carbide MOSFETs.
There ought to be a mechanical disconnect behind this, so that when someone pulls out a rackmount unit, a shutter drops behind it to protect people from 800V. All these papers are kind of hand-wavey about how the electrical safety works.
Plus, all this is liquid-cooled, and that has to hot-plug, too.
[1] https://library.grid.gevernova.com/white-papers-case-studies...
[2] https://www.youtube.com/watch?v=CQOreYMhe-M&
[3] https://filecenter.deltaww.com/Products/download/2510/202510...
> When it is detected that the PDB starts to detach from the interface, the hot-swap controller quickly turns off the MOSFET to block the discharge path from Cin to the system. After the main power path is completely disconnected, the interface is physically detached, and no current flows at this time
> For insertion, long pins (typically for ground and control signals) make contact first to establish a stable reference and enable pre-insertion checks, while short pins (for power or sensitive signals) connect later once conditions are safe; during removal, the sequence is reversed, with short pins disconnecting first to minimize interference.
Somehow this seems the wrong approach to AI.
For 800V DC, a simple UPS could interface with the main supply using just a pair of (large) diodes, and a more complex and more efficient one could use some fancy solid state switches, but there’s no need for anything as complex as a line-interactive AC UPS.