Compatibility with Raspberry Pi
Hello,
I work for Uptime Ind. We have a Raspberry Pi Blade Cluster that we would like to run Exo on.
We attempted to get exo running, but it reported 0 TFLOPS.
Questions:
Thank you for your assistance.
Same issue here. I tried to run it with CPU only, but it reported 0 TFLOPS.
I have the same problem. I have multiple Raspberry Pi 4B and 5 boards around and would like to try Llama 3.1 on them. @AlexCheema Is it realistic to implement a CPU-only mode, or would it require a lot of effort?
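As a side note, before trying CPU-only inference on a Pi it's worth checking two things: total RAM (Llama 3.1 8B at 4-bit quantization needs very roughly 5-6 GB, which rules out 4 GB models) and NEON/ASIMD support, which most ARM inference kernels rely on. A quick sketch using standard Linux tools (not exo-specific):

```shell
# Show total and available memory in human-readable units.
free -h

# Check for the ASIMD (NEON) feature flag on ARM CPUs.
# On a Pi 4B/5 this prints "asimd"; on other architectures it prints nothing.
grep -o 'asimd' /proc/cpuinfo | sort -u
```

The memory number matters more than raw compute here: if the model doesn't fit in RAM, swapping on an SD card makes inference effectively unusable regardless of TFLOPS.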
@bowenjw please let us know if you have any luck connecting a GPU via PCIe on a Raspberry Pi in the future.
Any luck getting things up and running with your Pi cluster? If yes, any info regarding the HAILO or Coral boards? While I have a mixed-architecture CPU cluster for k3s (M1, Intel, RPi 4), I've not been able to get the RPi 4s (running the latest Raspbian) to work with exo-explore. Been thinking I'll have to configure them to run Ubuntu 24...
Part of my interest in the HAILO and Coral boards is that several of my Intel-based nodes have WiFi boards that could be replaced with either HAILO or Coral boards since WiFi's not terribly useful when it comes to exo-explore clusters.
At this juncture my per node power budget is <30 watts and my non-Apple node cost is <$200. Should the new Jetson Nano Super kits ever start shipping again I'd be willing to allow for a few extra watts and dollars so that I could add one to the mix.
Thanks!
It's too bad they are stuck at 8 GB. And if you try for an alternative 16 GB Jetson option (or even 32 GB), they are absurdly priced.
@erikkassebaum I got moved to another project so I am no longer working on this. Still want to get it working tho.
@Nurb4000 the CM5 will have a model with 16GB (https://liliputing.com/raspberry-pi-cm5-is-now-available-for-45-and-up-bcm2712-and-up-to-16gb-of-ram-eventually/)
Won't be like a Jetson though. RK3588s are better hardware than the RPi, have "fair" NPUs on board (with shared RAM), and are available everywhere. I have a couple of those with 32 GB. I am still hoping for some RISC-V boards with high-end NPUs and tons of RAM (but I won't hold my breath for that anytime soon, as the Milk-V Oasis got put on hold...).
Just wanted to mention I was able to get distributed-llama going on some Pis, though that project differs a bit in execution from this one. But as a demonstration of "LLM on multiple machines" it's handy.
(Also just wanted to point out a duplicate issue for cross-reference: https://github.com/exo-explore/exo/issues/290).