
Compatibility with Raspberry Pi

Open bowenjw opened this issue 1 year ago • 7 comments

Hello,

I work for Uptime Ind. We have a Raspberry Pi Blade Cluster that we would like to run Exo on.

We attempted to get Exo running, but it returned 0 TFLOPS.

Questions:

  1. Is it possible to run Exo with CPU only?
  2. Does Exo support accelerators like HAILO or Coral?

Thank you for your assistance.
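(Editor's note: the 0 TFLOPS figure may indicate that the node's hardware simply wasn't recognized by a device-capability lookup rather than being an actual measurement. As a point of comparison, and purely as an illustration rather than exo's own code, a rough matmul benchmark can estimate what a board's real CPU throughput looks like:)

```python
import time
import numpy as np

def measure_cpu_tflops(n: int = 1024, repeats: int = 5) -> float:
    """Rough CPU throughput estimate via a dense float32 matmul.

    A matmul of two n x n matrices costs about 2 * n**3 floating-point
    operations, so timing it gives an approximate FLOPS figure.
    """
    a = np.random.rand(n, n).astype(np.float32)
    b = np.random.rand(n, n).astype(np.float32)
    a @ b  # warm-up run so timings exclude first-call overhead

    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        a @ b
        best = min(best, time.perf_counter() - start)

    flops = 2 * n**3 / best
    return flops / 1e12  # convert to TFLOPS

if __name__ == "__main__":
    print(f"Approximate CPU throughput: {measure_cpu_tflops():.3f} TFLOPS")
```

Even a Raspberry Pi should report a small nonzero number here, which is why a hard 0 usually points at detection rather than capability.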

bowenjw avatar Aug 01 '24 16:08 bowenjw

Same issue. I tried to run it with CPU only, but it returned 0 TFLOPS.

mct2611 avatar Aug 13 '24 03:08 mct2611

I have the same problem. I have multiple Raspberry Pi 4B and 5 boards around and would like to try Llama 3.1 on them. @AlexCheema Is it realistic to implement a CPU-only mode, or does it require a lot of effort?

brushknight avatar Aug 21 '24 12:08 brushknight

@bowenjw please let us know if you have any luck with connecting a GPU via PCIe on a Raspberry Pi in the future.

FFAMax avatar Oct 06 '24 03:10 FFAMax

Any luck getting things up and running with your Pi cluster? If yes, any info regarding the HAILO or Coral boards? While I have a mixed-architecture CPU cluster for k3s (M1, Intel, RP4), I've not been able to get the RP4s (running the latest Raspbian) to work with exo-explore. Been thinking I'll have to configure them to run Ubuntu 24...

Part of my interest in the HAILO and Coral boards is that several of my Intel-based nodes have WiFi boards that could be replaced with either HAILO or Coral boards since WiFi's not terribly useful when it comes to exo-explore clusters.

At this juncture my per node power budget is <30 watts and my non-Apple node cost is <$200. Should the new Jetson Nano Super kits ever start shipping again I'd be willing to allow for a few extra watts and dollars so that I could add one to the mix.

Thanks!

erikkassebaum avatar Feb 21 '25 22:02 erikkassebaum

> At this juncture my per node power budget is <30 watts and my non-Apple node cost is <$200. Should the new Jetson Nano Super kits ever start shipping again I'd be willing to allow for a few extra watts and dollars so that I could add one to the mix.

It's too bad they are stuck at 8 GB. And if you try for an alternative 16 GB Jetson option (or even 32 GB), they are stupidly priced.

Nurb4000 avatar Feb 22 '25 12:02 Nurb4000

@erikkassebaum I got moved to another project, so I am no longer working on this. Still want to get it working tho.

@Nurb4000 the CM5 will have a model with 16GB (https://liliputing.com/raspberry-pi-cm5-is-now-available-for-45-and-up-bcm2712-and-up-to-16gb-of-ram-eventually/)

bowenjw avatar Feb 22 '25 13:02 bowenjw

> @Nurb4000 the CM5 will have a model with 16GB (https://liliputing.com/raspberry-pi-cm5-is-now-available-for-45-and-up-bcm2712-and-up-to-16gb-of-ram-eventually/)

Won't be like a Jetson tho. RK3588s are better hardware than the RPi, have 'fair' NPUs on board (with shared RAM), and are available everywhere. I have a couple of those with 32 GB. I am still hoping for some RISC-V with high-end NPUs and tons of RAM (but I won't hold my breath for that happening soon, as the Milk-V Oasis got put on hold...).

Nurb4000 avatar Feb 22 '25 15:02 Nurb4000

Just wanted to mention I was able to get distributed-llama going on some Pis, though that project differs a bit in execution from this one. But as a demonstration of 'LLM on multiple machines' it's handy.

geerlingguy avatar Jun 05 '25 20:06 geerlingguy

(Also just wanted to point out a duplicate issue for cross-reference: https://github.com/exo-explore/exo/issues/290).

geerlingguy avatar Jun 05 '25 20:06 geerlingguy