8GB RAM for AI on a Pi sounds underwhelming even from the headline
Are they seeing a worthwhile niche for the tinkerers (or businesses?) who want to run local LLMs with middling performance but still need a full set of GPIOs in a small package? Maybe. But maybe this is just Raspberry Pi jumping on the bandwagon.
I don't blame them for looking to expand into new segments; the business needs to survive. But these efforts just look a bit aimless to me. I "blame" them for not having another "Raspberry Pi moment".
P.S. I can maybe see Frigate and similar solutions driving adoption of these, like they boosted Coral TPU sales. Not sure that's enough of a push to make it successful. The hat just doesn't have the unique value proposition that kickstarted the Raspberry Pi wave.
Of course, Raspberry Pi just like everyone else has their custom patches, but at least to my knowledge you can use a straight Linux kernel and still have a running system.
But now, if I want a low-power Linux PC replacement with display output, for the price of the latest RPi 5 I can buy on the used market a ~2018 laptop with a 15W quad-core CPU, 8GB RAM, a 256GB NVMe SSD and a 1080p IPS display that's orders of magnitude more capable. And if I want a battery-powered embedded ARM device for GPIO over WiFi, I can get an ESP32 clone that's orders of magnitude cheaper.
Now the RPi at sticker price is only good for commercial users, since it's still cheaper than dedicated industrial embedded boards, which I think is the new market the RPi company caters to. I haven't seen an embedded product company that hasn't incorporated RPis into the products they ship, or at least into their lab/dev/testing stage. So if you can sell your entire production stock to industrial users who will pay top dollar, why bother making less money selling to consumers? Just thank them for all the fish. Jensen Huang would approve.
And the thin clients, when they are for sale, tend to have their SSDs ripped out by IT for data security, so then it's a hassle to go out and buy an extra SSD, compared to just buying a used laptop that already comes with a display, keyboard, etc.
But it won't be as reliable; laptop motherboards mostly don't last long.
The ticking-timebomb lemons with reliability or design issues will just die in the first 2-4 years like clockwork, but if a unit has already survived 6+ years without any faults, it will most likely stay reliable from then on as well.
Ok, let us say they'll last 4 more years, so 10 years total lifespan.
A PI would last a lot longer.
One might get lucky with such a laptop, but I won't count on it.
Regarding higher-quality components: I think for the use case of a Raspberry Pi (I mean the kinds of things it is supposed to be used for), reliability is more important.
This also matches with my experience.
That you think reliability is more important for a Raspberry Pi use case than for a laptop doesn't somehow magically make it a fact that its components are of higher quality than your average laptop's. You only speculate and then speculate further on the basis of your original speculation. That's not how you arrive at a basis for a factual claim or an estimate.
And if you buy a 5-year-old corporate laptop in very good condition with minimal visible wear on the keyboard and touchpad, it was likely only used as a desktop replacement connected to a dock, so unlikely to have suffered abuse not apparent from visual inspection alone.
If you're planning to use it as an actual laptop, price out a replacement battery before purchase: battery capacity degrades over time even if the laptop is used exclusively on AC, so it will always be something of a crapshoot.
Otherwise, I'd expect the rate of component failure to be no higher than for any other lightly-used laptop of similar vintage, which is low.
Why not 50 more years, if we're just making up numbers? I still have an IBM ThinkPad from 2006 in my possession with everything working. I also see people with MacBooks from the era with the light-up Apple logo in the wild and at DJ gigs.
>A PI would last a lot longer.
Because you say so? OK, sure.
>I can buy on the used market a ~2018 laptop with a 15W quad-core CPU, 8GB RAM, a 256GB NVMe SSD and a 1080p IPS display that's orders of magnitude more capable...
I understand what you're saying but saying it isn't enough. There's nothing to support your claim.
3-5 years of office use for a Pi. [1]
Sure, there are other numbers to be found as well, but I'd suggest the two are pretty comparable in how they handle their environments. If one would fail, so would the other.
[0] https://pcpatching.com/2025/11/extend-your-pcs-life-how-long...
[1] https://raspberrypicase.com/how-long-does-a-raspberry-pi-las...
I'm also currently building a small device with a 5" touchscreen that can control a MIDI FX pedal of mine. It's just so easy to find images, code and documentation on how to use the GPIO pins.
Might be niche, but that is just what the Pi excels at. It's a board for tinkerers, and it works.
It's 10 bucks more. ¯\_(ツ)_/¯ Still half the price I see Intel NUCs going for, which of course are way more capable. But still, I don't mind the price that much.
I could go with a cheaper alternative, but then AFAIK you might have to fiddle with images, kernel and documentation. For me that is worth 10 bucks.
I don't really care how it compares to past models or inflation to justify its price tag. I was just comparing it to what you can buy on the used market today for the same price, and it gets absolutely dunked on in the value proposition by notebooks, since the modern full-spec RPi is designed to be more of an ARM PC than a cheap embedded board.
60 euros for the 2GB and 100 for the 8GB model is kind of a rip-off if you don't really need it for a specific niche use case.
I think an updated Pi Zero with 2GB RAM and a better CPU, stripped of other bells and whistles, for 30 euros max would be amazing value, and more back to the original roots of the cheap and simple server/embedded board that made the first Pi sell well.
Mobile and embedded x86 chips have closed the gap a lot in power consumption since the Pi first launched.
Now you can even get laptops with broken screens for free and just use their motherboards as a home-server alternative to a Pi. Power consumption will be a bit higher, but not by enough to offset the money you just saved anytime soon.
https://www.insidemylaptop.com/wp-content/uploads/2018/03/Le...
The big pain with using something like this would mostly be the IO and odd form factor.
Which is basically just cutting out the middlemen in a transaction that might cost $100 on eBay.
Used corporate laptops are particularly cost-effective if you're interested in running Windows, as unlike Intel NUCs and most SBC products, they typically include hardware-locked Windows 10 Pro licenses which can be upgraded to Windows 11 Pro for free.
Things like used PCs and forgotten closet laptops have been running circles around brand-new Raspberry Pi systems in performance per dollar for as long as we've had new Raspberry Pis to make that comparison with.
Those first Pis didn't even have wifi, and they were as picky about power supplies and stuff back then as a Pi 5 is today.
The main things that are new are that the feature set of new models continues to improve, and that the price of a bare board has increased by an inflation-adjusted ~$10.
(Meanwhile: A bare Pi 3B still costs $35 right now -- same as in 2016. When adjusted for inflation, it has become cheaper. $35 in 2016 is worth about $48 today.)
What prices are you using for the 3b and 5 to get this percentage? The lowest percentage I got from available data is a 57% increase ($35 -> $55)
40 EUR from 2016 is now ~52 EUR.
Compared to 62 EUR for the current model.
1. Testing images to be deployed on customer Pis.
2. Testing software on ARM64 Linux. Pis are still cheaper than used Apple Silicon Macs, and require less fiddling to run Linux. I currently have a free Oracle Cloud instance that would work just as well for this, but it could go away at any time and it's a PITA to reprovision.
3. Running Mathematica, because it's free on Pi, I only use it a few times a year, and a fully-loaded Pi 5 is cheaper than a single-year personal license to run it on any other platform.
4. Silly stuff like one Pi 3 I have set up to emulate a vintage IBM mainframe.
How much cheaper than 50 bucks can a tablet get? With the Pi I can quickly, in a hacky way, connect rotary encoders with female-female DuPont cables and use a Python GPIO library made for the Raspberry Pi.
https://media.discordapp.net/attachments/1461079634354639132...
I can also use it for Zynthian. And if I'm done with it, I can build a new printer :P
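For reference, reading one of those encoders is only a few lines of Python with the gpiozero library - a rough sketch, assuming BCM pins 17/18 for the encoder and 27 for its push switch (use whatever pins you actually wired):

    # Sketch: read a rotary encoder on a Pi with gpiozero.
    # Pin numbers are placeholders for wherever you plugged the DuPont cables.
    from gpiozero import RotaryEncoder, Button
    from signal import pause

    encoder = RotaryEncoder(a=17, b=18, max_steps=0)  # max_steps=0: don't clamp .steps
    knob_switch = Button(27)

    def rotated():
        # .steps counts up/down as you turn the knob; map it to a MIDI CC
        # value, menu position, volume, etc.
        print("position:", encoder.steps)

    encoder.when_rotated = rotated
    knob_switch.when_pressed = lambda: print("knob pressed")

    pause()

From there it's one more library (e.g. mido or python-rtmidi) to turn those events into MIDI messages for the pedal.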
Buy one with a PSU and you're at 100 to 150.
See my other comment: the Pi 5 2GB is about ~10% more expensive than when the 3/3B was released, when factoring in inflation. ~60 EUR including 25% VAT.
With PSU it's 77 EUR including 25% VAT right now.
4 GB version + case + 64GB SD + PSU = 135 EUR, but I don't need that much ram, disk space or the case. When I put it into a 3d printer I also don't need the PSU.
Unless you want your printer to power up on demand, then you need a separate PSU and an SSR (and you still need a buck converter because printers don't supply 5V at required amperage).
However, there is no way it's 10% more expensive than the original Pi with inflation. It's easily double, if not more.
You can buy low end tablets significantly cheaper now. A tablet with a screen, more ram, storage, a battery.
A Pi is a rip off, for what it costs to build.
It's not like they make the OS or kernel either. They basically copy and paste Debian and add some fluff mods, most of it OSS they didn't write.
https://www.amazon.ca/Expansion-Octa-core-Processor-Touchscr...
https://www.amazon.ca/Raspberry-Pi-8GB-2023-Processor/dp/B0C...
RPi 3: 40 EUR from 2016 is now ~52 EUR adjusted for inflation.
Compared to 62 EUR for the current 2GB model.
The cheapest tablet with more than 1GB RAM locally is around ~85 EUR. All prices include 25% VAT.
I'm in the market to replace my aging Intel NUCs, but RPi is still cheaper.
There are dozens and dozens of NUC-style/form-factor machines available these days, especially cheap ones from China. Not sure what you mean by a gaping hole post-2023. I'm running three of them with N97 and N150 CPUs, all bought within the last 18 months.
Cheap Chinese mini PCs just aren't well documented and don't have predictable supply.
What moving parts do competitors have to be less mechanically reliable?
In fact, a NUC or used laptop would be even more reliable, since you can replace NVMe storage and RAM sticks. If your RPi's RAM goes bad, you're shit out of luck.
>RPi will still have lower power consumption and is far more compact.
Not that big of an issue in most home use cases, as a home server, emulator or PC replacement. For industrial users where space, power usage and heat are limited, definitely.
>I'm in the market to replace my aging Intel NUCs, but RPi is still cheaper.
Cheaper if you ignore the much lower performance and versatility vs an x86-64 NUC as a home server.
> Not that big of an issue in most home use cases as a home server
I don't know what "most home users" want, but I can understand wanting something more compact and efficient (also easier to keep cool in tighter or closed spaces), even at home.
> Cheaper if you ignore the much lower performance and versatility vs an x86-64 NUC as a home server.
Or maybe they noticed they don't need all the performance and versatility. Been there. It's plenty versatile and can run everything I need.
Unfortunately, it's close to dying. The heat from the CPU disintegrated the plastic of the SATA cable header on the motherboard. I fixed it for now with a bit of glue, but it's not going to hold indefinitely. And NUCs were pretty pricey.
RPi with a SATA/M.2 disk and a PoE hat is not that much cheaper than Intel, but it uses much less power. They also tend to not have cables that are kept under mechanical strain. I have a single-purpose RPi that's been running since 2014, and it's doing just fine.
I don't think I could get an RPi as cheaply once parts, power supply, etc. are taken into account.
The RPi Zero 2W costs $15 and runs HA just fine. One can splurge on a pricey case, microSD, and high-amp GaN charger, and still be under 50% of your spend. You don't have to buy the flagship RPi.
If it's an option, I would always go with an SSD for HA. It makes a big difference in usability. Writing often and a lot to SD cards, like HA does, kills them way too fast.
Strong disagree: my experience has been great; my HA has been running on a Zero 2W for more than 2 years! I have several HACS plugins enabled - just not any of the video or AI stuff. The same Pi concurrently runs Pi-hole. For a while, it also acted as a git mirror via iSCSI-backed Gitea, but I had to migrate Gitea off of it since it was memory-hungry. You can do a lot with 500MB in headless mode.
Go price out a used 1l form factor PC.
After you buy a case and a real disk for the Pi, the cost savings are gone.
Meanwhile you can pick up a used 8th-gen Intel 1L form factor machine for about 100 bucks. You can pick up one that will take a PCIe card, with remote management, for 150-ish bucks.
An 8th-gen or better Intel chip has all sorts of extra features that may make it worthwhile (transcoding/video support).
* https://tweakers.net/nieuws/80350/verkoop-goedkoop-arm-syste...
https://www.raspberrypi.com/products/raspberry-pi-500-plus/
I can't justify it though as I've no use for it.
However, I think it is way closer to their original vision than anything else; i.e., it is a lot like the 1980s computers whose magic they were trying to capture.
For 100€ that would be something I'd buy for every niece and nephew to play with. For 200€ it's not even for me, I'd rather buy something like the uConsole RPI-CM4: https://www.clockworkpi.com/product-page/uconsole-kit-rpi-cm...
I noticed I can do 90% of the stuff I'd use an Arduino for with an RPi, except I had the full power of an internet-connected Linux machine available. The Arduinos are still collecting dust somewhere =)
But now we have the ESP32 filling the same niche along with the Pi Zero W, so I don't really understand the purpose of the RPi 4 and 5. They're not cheap for what you get, nor very powerful by any measure.
You don't even need a full laptop, any Chinese miniPC will blow the RPi5 out of the water AND some of them have expandable storage+RAM, while also having 5-20x more CPU/GPU oomph. They do consume a few watts more power, so there _might_ be a niche for the Raspberry Pi, but it's not a big one.
They are good for commercial installations like smart displays in stores (think big screens with menus behind fast-food counters) and information kiosks. The extra HDMI port lets you drive two screens with one Pi, and the extra processing power keeps the UI smooth at high resolution. They also have hardware-accelerated video decoding for shops wishing to play hi-res promo videos and for hobbyists building media terminals.
Cost is not a major concern here because the installation volume is low and there are far bigger expenses anyway; just take a look at how much commercial displays cost. The Pi company's future supply guarantee is also nice, because you know that within a given number of years, if something breaks or you need another screen, you can just buy another identical Pi and be done with it. Good luck sourcing a Chinese mini PC with a compatible footprint, port orientation, etc. five years down the road.
I do get the business case of a stable repeatable platform, but I'm just tinkering with crap, not spinning up a hardware startup =)
Awful how? An SBC can take advantage of much of the software written since the dawn of x86.
RISC-V is going through this exact same problem right now. All of the current implementations have terrible documentation, and tailoring Linux for each of these is proving to be difficult. All of these vendors include on-board devices that have terrible doc and software support.
RISC-V would do well to adopt and promote a similar spec.
Almost nothing useful runs in 8 GB.
This is the problem with this generation of "external AI boards" floating around: 8, 16, even 24 GB is not really enough to run much that's useful, and even then (i.e., offloading to disk) they're impractically slow.
Forget running a serious foundation model, or any kind of realtime thing.
The blunt reality is that the fast, high-memory GPU systems you actually need for self-hosting are really, really expensive.
These devices are more optics and dreams ("it'd be great if…") than practical hacker toys.
I wouldn't dare suggest that. The RPi was never for everyone yet it turned out it was for many. It was small but powerful for the size, it was low power, it was extremely flexible, it had great software support, and last but not least, it was dirt cheap. There was nothing like that on the market.
They need to target a "minimum viable audience" with a unique value proposition otherwise they'll just Rube-Goldberg themselves into irrelevance. This hat is a convoluted way to change the parameters of an existing compromise and turn it into a different but equally difficult compromise. Worse performance, better efficiency, adds cost, and it doesn't differentiate itself from the competing Hailo-10H-based products that work with any system not just RPi (e.g. ASUS UGen300 USB AI Accelerator).
> the idea of miniaturising
If you aren't ditching the laptop you aren't miniaturizing, just splitting into discrete specialized components.
OTOH, with RAM prices being where they are and no sign of them coming back down in the foreseeable future, a second-hand Pi 4 may be a very wise choice.
Not true, you're thinking about earlier models.
The Picos are great for the smaller stuff, new Pis are great for bigger stuff, and old Pis and Zeros are still available. They've innovated around their segment.
The AI stuff is just an expression of that. People are doing AI on Pi5s and this is just a way to make that better.
- I can boot it w/o having to learn about custom U-Boot implementations
- I, as a consumer or small business, can buy
- Can not only buy today but also still buy in 2 years
- Doesn't cost a small fortune
- Can be tucked away behind TVs and in other small niches
> [...] if I want a low-power Linux PC replacement with display output, for the price of the latest RPi 5 I can buy on the used market a ~2018 laptop
I guess. I don't care about the AI hat at all.
There are some absolutely useful things you can do with TTS/STT/Diarization/etc on even really minimal specs.
Some of those will run fine on RPis even without this new hat.
The extra RAM probably opens the door to a large number of vision/image models, which typically want a minimum of 16GB but do better with 24/32.
There are just a HUGE number of case-specific models that do just fine on hardware at the RPi level, assuming you have the RAM to load them.
https://www.gmktec.com/products/nucbox-g3-plus-enhanced-perf...
If ARM is a requirement, then RPi is your only option that I know of.
https://teampandory.com/2024/09/24/gmktec-g5-mini-pc-review-...
Something like this ($155) costs less than the Pi + case + power supply + NVMe add-on board + SSD, and it also runs Windows and any other x86 OS.
The RPi I can depend upon to be shitty as well, but in the exact same way. So it stays fit for purpose.
I don't think you will find anything on the market enabling you to create your own audiophile quality AMP, DAC, or AMP+DAC for a pretty attractive price except a Pi 3/4/5 with a HifiBerry (https://www.hifiberry.com/) HAT.
That said, more options at the (relatively speaking) low end of the AI hardware market probably isn't a bad thing. I'm not particularly an AI enthusiast generally, but if it is going to infest everything anyway, then at least I would like a decent ecosystem for running local models.
As someone else mentioned: if the hat could efficiently be leveraged with the YOLO models on Frigate for a low volume camera setup that could be a nice niche use case for it.
Either way I hope the RPi org keeps dropping things like this and letting the users sort out the use cases with their dollars.
https://www.raspberrypi.com/products/ai-hat-plus-2/
It's no more "made by a third party" than any other electronics device made by a contract manufacturer.
In regard to their niche, their niche is a ridiculously well-documented ecosystem for SBCs. Want to do something with your RPi? You can find it on Google, and the LLM of your choice is probably trained to give you the answer on how to do it. If you're just tinkering or getting a POC ready, that's a big help.
Of course, if you're in the business of hardware prototyping, and have a set of libraries and programs you know you're going to work with, you don't need to care as much.
Chromebooks did what RPI should have done.
It was very helpful in my learning Linux. The only alternatives I had at the time were a few old Pentium 4 machines, which were very noisy and which my parents didn't like me leaving turned on for long.
The things you can do locally with AI now are amazing. For several years there's been multiple open source products that can do both audio and visual processing locally using AI models. Local-only Home Assistant is almost equivalent to Siri. The more things you throw at it, the more computing power it needs (especially for low latency), and that's where the dedicated GPUs/NPUs (previously ASICs) are needed. And consider the expanded use cases; drones and robots can now navigate the world autonomously using a $150 SoC and some software.
Although the OP is not wrong, maybe their decisions are data-driven and will pay off?
My rpi3 (that's been running since 2019) died last year and I was able to buy another and just plug in the SD card.
Case closed. And that's extremely slow to begin with; the Pi 5 only gets what, a 32-bit bus? Laughable performance for a purpose-built ASIC that costs more than the Pi itself.
> In my testing, Hailo's hailo-rpi5-examples were not yet updated for this new HAT, and even if I specified the Hailo 10H manually, model files would not load
Laughable levels of support too.
As another data point, I've recently managed to get the 8L working natively on Ubuntu 24 with ROS, but only after significant shenanigans involving recompiling the kernel module and building their library for Python 3.12, which Hailo for some reason does not provide beyond 3.11. They only support Pi OS (like anyone would use that in prod), and even that is very spotty. Like, why would you not target the most popular robotics distro for an AI accelerator? Who else is gonna buy these things exactly?
I was able to run speech-to-text on my old Pixel 4, but it's a bit flaky (the background process occasionally loses the audio device). I just want it to take a wake word, send everything to a remote LLM, and then get back text that I do TTS on.
I was only using it for local Home Assistant tasks, didn't try anything further like retrieving sports scores, managing TODO lists, or anything like that.
TinyML is a book that goes through the process of building a wake word model for such constrained environments.
I fail to see the use-case on a Pi. For learning you can have access to much better hardware for cheaper. Perhaps you can use it as a slow and expensive embedding machine, but why?
Tiny LLMs are pretty much useless as general purpose workhorses, but where they shine is when you finetune them for a very specific application.
(In general this is applicable across the board, where if you have a single, specific usecase and can prepare appropriate training data, then you can often fine-tune a smaller model to match the performance of a general purpose model that is 10x its size.)
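For anyone who hasn't tried it, the mechanics are pretty mundane - something like the sketch below with Hugging Face transformers, where the model name and the two toy records are placeholders for a real sub-1B model and a few thousand task-specific examples:

    # Hypothetical sketch: fine-tune a small causal LM on one narrow task.
    # Model name and the example records are placeholders.
    from datasets import Dataset
    from transformers import (AutoModelForCausalLM, AutoTokenizer,
                              DataCollatorForLanguageModeling, Trainer,
                              TrainingArguments)

    model_name = "EleutherAI/pythia-160m"   # any small model works the same way
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    tokenizer.pad_token = tokenizer.eos_token
    model = AutoModelForCausalLM.from_pretrained(model_name)

    examples = [
        "Email: Invoice #123 overdue -> Label: billing",
        "Email: Can't log into my account -> Label: support",
    ]
    dataset = Dataset.from_dict({"text": examples})

    def tokenize(batch):
        return tokenizer(batch["text"], truncation=True, max_length=128)

    tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir="tiny-triage", num_train_epochs=3,
                               per_device_train_batch_size=2),
        train_dataset=tokenized,
        data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
    )
    trainer.train()

The hard part isn't the training loop, it's preparing enough good task-specific data.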
The vision processing boost is notable, but not enough to justify the price over existing HATs. The lack of reliable mixed-mode functionality and sparse software support are significant red flags.
(This reply generated by an LLM smaller than 8GB, for ants, using the article and comment as context).
... why though? CV in software is good enough for this application and we've already been doing it forever (see also: Everseen). Now we're just wasting silicon.
1. Can I run a local LLM that allows me to control Home Assistant with natural language? Some basic stuff like timers, to do/shopping lists etc would be nice etc.
2. Can I run object/person detection on local video streams?
I want some AI stuff, but I want it local.
Looks like the answer for this one is: Meh. It can do point 2, but it's not the best option.
2. Has been possible in real time since the first Pi camera was released, and has most likely improved since. I did this years ago on the Pi Zero and it was surprisingly good.
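For anyone wanting to try the zero-extra-hardware version, OpenCV's built-in HOG person detector is roughly this much code (camera index 0 is an assumption - use whatever your Pi camera or webcam enumerates as):

    # Sketch: CPU-only person detection on a camera stream using OpenCV's
    # built-in HOG+SVM people detector. Camera index 0 is a placeholder.
    import cv2

    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frame = cv2.resize(frame, (640, 360))  # smaller frames = faster on a Pi
        boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
        for (x, y, w, h) in boxes:
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imshow("people", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()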
No. Get the larger Pi recommended by the article.
Quote from the article:
> So power holds it back, but the 8 gigs of RAM holds back the LLM use case (vs just running on the Pi's CPU) the most. The Pi 5 can be bought in up to a 16 GB configuration. That's as much as you get in decent consumer graphics cards.
> Because of that, many quantized medium-size models target 10-12 GB of RAM usage (leaving space for context, which eats up another 2+ GB of RAM).
…
> 8 GB of RAM is useful, but it's not quite enough to give this HAT an advantage over just paying for the bigger 16GB Pi with more RAM, which will be more flexible and run models faster.
The model specs shown for this device in the article are small, and not fit for purpose even for the relatively trivial use case you mentioned.
I mean, look, lots of people have lots of opinions about this (many of them wrong); it's cheap, you can buy one and try… But the OP really gave it a shot, and the results were kind of shit. The article is pretty clear.
Don’t bother.
You want a device with more memory to mess around with for what you want to do.
I buy a Raspberry Pi because I need a small workhorse. I understand adding RAM for local LLMs, but it would be like a Raspberry Pi with a GPU: why do I need it when a normal mini machine will have more RAM, more compute capacity and better specs for cheaper?
I daresay they could charge more than a comparably specced computer (if they don't already) and they would still be a viable purchase.
Unless I'm missing something - which is where I'm like, why not just buy a NUC with similar RAM for far less?
[1] https://www.raspberrypi.com/news/introducing-raspberry-pi-ha...
They seem very fast and I certainly want to use that kind of thing in my house and garden - spotting when foxes and cats arrive and dig up my compost pit, or if people come over when I'm away to water the plants etc.
[edit: I've just seen the updated version on Pimoroni, and it does claim usefulness not only for LLMs but also for VLMs; I suspect this is the best way to use it].
That said, perhaps there is a niche for slow LLM inference for non-interactive use.
For example, if you use LLMs to triage your emails in the background, you don't care about latency. You just need the throughput to be high enough to handle the load.
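A minimal sketch of that pattern with llama-cpp-python - the model path, label set and mail-fetching stub are all placeholders; the point is just that a slow Pi only has to keep up with your inbox, not with a conversation:

    # Hypothetical background triage loop: latency doesn't matter, only that
    # throughput keeps up with incoming mail. Paths and labels are placeholders.
    import time
    from llama_cpp import Llama

    llm = Llama(model_path="models/small-instruct-q4_k_m.gguf",
                n_ctx=2048, verbose=False)

    LABELS = ["urgent", "invoice", "newsletter", "spam", "other"]

    def classify(subject: str, body: str) -> str:
        prompt = (f"Classify this email as one of {LABELS}.\n"
                  f"Subject: {subject}\nBody: {body[:500]}\nLabel:")
        out = llm(prompt, max_tokens=5, temperature=0)
        return out["choices"][0]["text"].strip()

    def fetch_unprocessed_emails():
        """Placeholder: pull new mail from IMAP, a maildir, etc."""
        return []

    while True:
        for subject, body in fetch_unprocessed_emails():
            print(subject, "->", classify(subject, body))
        time.sleep(60)  # a minute of extra latency is irrelevant here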
Yes, but that is normal I guess:
I once tried to run a segmentation model based on a vision transformer on a PC; that model used somewhere around 1 GB for the parameters and several gigabytes for the KV cache, and it was almost entirely compute-bound. You couldn't run that type of model on previous AI accelerators because they only supported model sizes in the megabytes range.
That's also limited to 8GB RAM, so again you might be better off with a larger 16GB Pi and using the CPU, but at least the space is heating up.
With a lot of this stuff it seems to come down to how good the software support is. Raspberry Pis generally beat everything else for that.
An NPU that adds to the price but underperforms the Pi's CPU?
You can get SBCs with 32GB RAM…
Never mind the whole mini PC ecosystem, which will crush this.
YOLO for example.
If it could run whisper, it'd be a solid addition to a pi based home assistant setup.
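Whisper's smallest checkpoints already run on the Pi's CPU (slowly). With the reference openai-whisper package it's only a few lines - "command.wav" is a placeholder clip, and faster-whisper with int8 quantization is the more practical route on a Pi:

    # Sketch: offline speech-to-text with the smallest Whisper model.
    # Expect slower-than-realtime transcription on a Pi CPU, which is fine
    # for short voice-command clips. "command.wav" is a placeholder file.
    import whisper

    model = whisper.load_model("tiny.en")                  # small English-only checkpoint
    result = model.transcribe("command.wav", fp16=False)   # no GPU/fp16 on a Pi
    print(result["text"])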
My impression so far was that the resulting models are unusably stupid, but maybe there are some specific tasks where they still perform acceptably?
Don't need more than 8GB. It'll be enough power. It can do audio-to-audio.
Hitching their wagon to the AI train comes with different expectations, leading to a mixed bag of reviews like this.
The price point is still a little high for most tasks but I’m sure that will come down.
I have a NAS/home server that I could put it on but I don't want my home automation going down when I tinker with it.