
Michał Pogoda-Rosikoń
8 min read
Jul 11, 2025
Developers building AI systems often face a gap between software models and physical interaction: training vision or speech models is straightforward, but testing them in real-world, embodied scenarios requires accessible hardware. Reachy Mini, a collaboration between Hugging Face and Pollen Robotics announced on July 9, 2025, aims to bridge this gap with an open-source robot platform for prototyping human-robot interactions. The goal is to democratize robotics for AI integration, letting anyone from individual hackers to teams in education or business experiment with multimodal applications without high costs or proprietary barriers.
Key Use Cases
Reachy Mini targets scenarios where AI models interact with the physical world:
Prototyping Conversational Agents: Combine speech recognition models with the robot's microphones and speaker for natural dialogues, adding expressive gestures via head movements and antennas to enhance engagement.
Vision-Based Interactions: Use the wide-angle camera for object detection or facial recognition, enabling applications like interactive demos where the robot responds to visual cues in real time (sketched after this list).
Educational Tools: As an assembly kit with beginner-friendly SDKs, it supports teaching AI concepts: students can code behaviors in Python or Scratch and simulate the robot before the hardware arrives.
Research and Business Experimentation: Validate multimodal pipelines, such as integrating Hugging Face models for voice-to-motion responses, in low-stakes environments. Businesses can prototype customer-facing bots or internal tools for AI-assisted workflows, leveraging the open ecosystem to avoid vendor lock-in.
These use cases build on the platform's emphasis on community-driven development: behaviors are shared via the Hugging Face Hub, accelerating iteration and collaboration.
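To make the vision use case concrete, the sketch below tracks a face with the camera and nudges the head toward it. The OpenCV calls are standard; the `reachy_mini` package, the `ReachyMini` class, and its `camera.read()` and `head.look_at()` methods are illustrative assumptions, not the confirmed SDK API.

```python
# Hypothetical face-tracking sketch: the ReachyMini class and its methods
# are assumed for illustration; only the OpenCV calls are standard.
import cv2
from reachy_mini import ReachyMini  # assumed package/class name

# Standard Haar cascade for frontal faces, bundled with opencv-python.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

robot = ReachyMini()  # assumed constructor; connects to the robot

while True:
    frame = robot.camera.read()  # assumed: returns a BGR numpy array
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) > 0:
        x, y, w, h = faces[0]
        # Normalize the face center to [-1, 1] and steer the gaze toward it.
        cx = (x + w / 2) / frame.shape[1] * 2 - 1
        cy = (y + h / 2) / frame.shape[0] * 2 - 1
        robot.head.look_at(cx, cy)  # assumed gaze-control helper
```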
Hardware Overview
The robot stands at 28 cm tall and weighs 1.5 kg, designed for desktop use with expressive hardware to mimic lifelike interactions. Core components include a head with 6 degrees of freedom, full base rotation, two animated antennas, a 5W speaker, and a wide-angle camera. Microphone count varies by model (2 or 4), with an accelerometer in the advanced variant.
Two configurations are available:
Feature          Reachy Mini Lite           Reachy Mini
Compute          Host (Mac/Linux)           Raspberry Pi 5
Connectivity     Wired                      WiFi + Battery
Microphones      2                          4
Accelerometer    No                         Yes
Price            $299 (+ taxes/shipping)    $449 (+ taxes/shipping)
Shipments begin in late summer 2025 for the Lite and extend into 2026 for phased deliveries of the full model. As an early-stage project, units ship without a warranty, and the team plans to fold early-user feedback into future iterations.
Software Integration
Control starts with an open-source Python SDK for hardware access and AI model chaining. Extensions for JavaScript and Scratch are planned, alongside a simulation SDK for virtual testing. Over 15 pre-built behaviors on the Hugging Face Hub provide starting points, with users encouraged to contribute via uploads, Discord discussions, and GitHub repositories.
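To give a feel for what SDK-level control might look like, here is a minimal hello-world sketch. The package name and every method shown are assumptions inferred from the capabilities described above, not the published API:

```python
# Hypothetical hello-world: all names below are assumed from the hardware
# described in this post (6-DoF head, antennas, speaker), not the real SDK.
from reachy_mini import ReachyMini  # assumed package/class name

robot = ReachyMini()

robot.head.rotate(yaw=30, pitch=-10)          # assumed: angles in degrees
robot.antennas.wiggle(times=3)                # assumed expressive gesture
robot.speaker.say("Hello from Reachy Mini!")  # assumed TTS convenience call
```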
Compatibility with tools like LeRobot enables advanced motion control, while direct ties to Hugging Face's model hub support deploying open-source vision, speech, or language models seamlessly.
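For instance, a voice-to-motion pipeline could pair an open-source Whisper model from the Hub with a simple reaction. The `transformers` pipeline call is real; the robot-side calls reuse the hypothetical API from the earlier sketches:

```python
# Hypothetical voice-to-motion chain: a Hub-hosted Whisper model transcribes
# microphone audio and the robot reacts. The transformers pipeline is real;
# the ReachyMini class and its methods are assumed for illustration.
from transformers import pipeline
from reachy_mini import ReachyMini  # assumed, as above

asr = pipeline("automatic-speech-recognition", model="openai/whisper-tiny.en")
robot = ReachyMini()

# Assumed recording helper returning a float numpy array at 16 kHz.
audio = robot.microphones.record(seconds=3, sampling_rate=16_000)
text = asr({"raw": audio, "sampling_rate": 16_000})["text"].lower()

if "hello" in text:
    robot.antennas.wiggle(times=2)  # assumed greeting gesture
    robot.speaker.say("Hi there!")  # assumed TTS helper
```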
Pre-orders are open, positioning Reachy Mini as a practical entry into embodied AI development.
Looking to integrate AI into your product or project?
Get a free consultation with our AI experts.