I’m looking to hear from anyone who has actually run the Axelera Metis M.2 AI inference card on a Raspberry Pi 5. From what I’ve seen, the card can be made to work on the Pi 5 and other SBCs, but there are reports of challenges such as power delivery limits (sometimes needing external 5V/12V injection or a powered adapter), mechanical fit issues with M.2 carriers, and quirks in driver/firmware setup.

On the Pi 5 specifically, I’m curious about two things: first, how stable it is to power the card directly from the Pi versus using an external PSU, and second, how much the single-lane PCIe Gen 2/3 link limits inference performance compared to running the card on a multi-lane host. Did anyone find the bandwidth bottleneck so severe that the official Raspberry Pi AI HAT+ was actually faster or more practical for real workloads like YOLOv8 or image classification?

If you’ve tested this, could you share which adapter you used, how you powered the card, which Pi OS/kernel build you ran, and any measured latency/throughput numbers? Even brief notes like “needed external power” or “PCIe saturated so no speedup over HAT+” would be really helpful.
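For context on the bandwidth question, here is a rough back-of-envelope sketch I ran. It assumes the only PCIe traffic is raw uint8 input frames (model weights already resident on the card, small output tensors ignored), and uses the approximate usable per-lane rates after encoding overhead; real driver/protocol overhead will eat into these numbers.

```python
# Back-of-envelope: can a single PCIe lane feed YOLOv8-sized inputs?
# Approximate usable per-lane bandwidth after encoding overhead:
#   Gen 2 x1: 5 GT/s with 8b/10b encoding   -> ~500 MB/s
#   Gen 3 x1: 8 GT/s with 128b/130b encoding -> ~985 MB/s
GEN2_X1_BYTES_PER_S = 500e6
GEN3_X1_BYTES_PER_S = 985e6

# One 640x640 RGB uint8 frame (a typical YOLOv8 input), in bytes.
frame_bytes = 640 * 640 * 3  # = 1_228_800 bytes, ~1.2 MB

for name, bw in [("Gen 2 x1", GEN2_X1_BYTES_PER_S),
                 ("Gen 3 x1", GEN3_X1_BYTES_PER_S)]:
    fps = bw / frame_bytes
    print(f"{name}: ~{fps:.0f} frames/s of raw input transfer")
# -> Gen 2 x1: ~407 frames/s, Gen 3 x1: ~802 frames/s
```

If that arithmetic holds, raw input bandwidth alone shouldn’t cap a single-stream workload at these resolutions, so I’d guess any slowdown people saw came from latency, interrupt handling, or driver overhead rather than link saturation, but that’s exactly the kind of thing I’m hoping real measurements can confirm or refute.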
Statistics: Posted by Fuzzy Logic — Tue Sep 30, 2025 9:34 pm — Replies 0 — Views 47