MIT Hardware x AI

Over the course of 48 hours, our team of five engineers and designers developed Maren, an embodied AI rover that learns through sensing—seeing, listening, and interpreting the world as a dynamic, moving entity. As Maren interacts with her environment, she begins to develop a distinct personality shaped by her experiences. She even requests new sensors from us, forming a relationship that invites us to reconsider what it means for AI to exist alongside us in a physical form.


At the core of Maren is the K2 Think V2 LLM, which reasons over incoming sensor data to decide how to interface with the sensors and how each new experience should shape her personality. We integrated whisper.cpp for speech-to-text, enabling Maren to listen to her surroundings. The system runs on a Qualcomm Arduino Uno Q mounted on the rover, hosting a fleet of Podman containers. The VIAM software platform let us seamlessly control the cameras and motors, while hot-swappable Seeed Studio sensors connected via the Arduino pins gave Maren a flexible, extensible physical body.
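To make the sensing-to-personality loop concrete, here is a minimal sketch of one way readings could be framed as an LLM prompt and folded back into a personality state. All names here (`Personality`, `build_prompt`, `update_personality`, the curiosity/caution traits, and the toy update rule) are illustrative assumptions, not Maren's actual implementation.

```python
from dataclasses import dataclass

# Hypothetical sketch: fold a sensor reading into a prompt for the LLM,
# and nudge a simple personality state based on what was sensed.

@dataclass
class Personality:
    curiosity: float = 0.5
    caution: float = 0.5

def build_prompt(personality: Personality, sensor: str, value: float) -> str:
    """Frame the latest reading so the LLM can reason about it in character."""
    return (
        f"You are Maren, a rover (curiosity={personality.curiosity:.2f}, "
        f"caution={personality.caution:.2f}). "
        f"Your {sensor} sensor reads {value}. "
        "Reply with how this makes you feel and any new sensor you want next."
    )

def update_personality(personality: Personality, value: float,
                       threshold: float = 50.0) -> Personality:
    """Toy rule: a surprising (above-threshold) reading raises curiosity."""
    delta = 0.1 if value > threshold else -0.05
    personality.curiosity = min(1.0, max(0.0, personality.curiosity + delta))
    return personality

p = Personality()
prompt = build_prompt(p, "light", 72.0)  # prompt sent to the LLM
update_personality(p, 72.0)              # bright reading bumps curiosity
```

In the real system the prompt would go to the K2 Think V2 model and the reply would drive motors and sensor requests; this sketch only shows the shape of the loop.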


Read more on our Devpost!

Association

Hackathon

Date

03 2026