Welcome to Cerberus, an AI-powered surveillance system built on the Unitree Go1 robot dog. The project performs localization and mapping using ultrasonic sensors, and leverages the fisheye camera for object detection and person activity recognition.
- Introduction
- Robot Architecture
- System Overview
- Software Flow
- Setup Instructions
- Running the Patrol Service
- Acknowledgements
The goal of this project is to develop an intelligent, autonomous surveillance robot dog capable of patrolling, detecting people and their activities, and reporting to a connected monitoring system.
The Unitree Go1 EDU system has the following architecture:
- Main Control Board: MCU (192.168.123.10)
- Motion Control Board: Raspberry Pi 4B (192.168.123.161)
- Sensing Motherboards:
  - Nano (head) → 192.168.123.13
  - Nano (body) → 192.168.123.14
  - Nano or NX (body) → 192.168.123.15
For this project, I used:
- The Motion Control Board for processing and robot commands
- The Sensing Motherboard (Head) for camera-based perception
You can connect to the Motion Control Board via Wi-Fi or Ethernet. Once inside (via SSH), you can access all other internal nodes.
- Camera Feed: Fisheye camera data is received via UDP from the head board.
- Ultrasonic Sensors: A Python script on the Motion Control Board listens for LCM messages and republishes them over MQTT (see the sketch after this list).
- Robot Control: MQTT topics are used to publish control commands to the Go1.
- Mapping and Navigation: The system uses an occupancy grid map to autonomously patrol and avoid obstacles.
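A minimal sketch of the LCM-to-MQTT relay step is shown below. It is illustrative only: the LCM channel name, the MQTT broker address, and the raw-payload forwarding are assumptions, and the actual `us_mqtt.py` may decode the LCM message type before publishing.

```python
# Minimal LCM -> MQTT relay sketch (illustrative; not the actual us_mqtt.py).
# Assumptions: an MQTT broker is reachable at BROKER, and ultrasonic readings
# arrive on an LCM channel named LCM_CHANNEL.
import lcm
from paho.mqtt import publish

LCM_CHANNEL = "ultrasonic"      # assumed channel name
MQTT_TOPIC = "robot/ultrasonic"
BROKER = "192.168.12.1"         # assumed broker address; adjust to your setup

def forward(channel, data):
    # Forward the raw LCM payload; the real script may decode it first.
    publish.single(MQTT_TOPIC, payload=data, hostname=BROKER)

lc = lcm.LCM()
lc.subscribe(LCM_CHANNEL, forward)
while True:
    lc.handle()   # blocks until the next LCM message arrives
```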
┌────────────────────────────┐
│ External Control System │
│ (your computer, IP .100) │
└────────────┬───────────────┘
│ Wi-Fi (192.168.123.0/24)
┌────────────▼───────────────┐
│ Motion Control Board (Pi) │
│ - us_mqtt.py (MQTT relay) │
│ - patrol_service.py │
└────────────┬───────────────┘
│
┌────────────▼───────────────┐
│ Sensing Board (Head) │
│ - UDP camera streaming │
└────────────────────────────┘
- `patrol_service.py` subscribes to MQTT topics like `robot/ultrasonic` and launches the `object_detector` module.
- `object_detector` receives the UDP video stream, processes the frames with OpenCV (GStreamer enabled), and displays the stream (see the pipeline sketch below).
- The robot's movement is coordinated using local sensor data and a live occupancy grid map.
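Below is a minimal sketch of how such a UDP stream can be read with OpenCV's GStreamer backend. The port number and the RTP/H.264 caps are assumptions and must match the head board's transmit configuration; the actual `object_detector` module may construct its pipeline differently.

```python
# Sketch: read the Go1 head-camera UDP stream via OpenCV + GStreamer.
# Assumptions: the head board sends H.264 over RTP/UDP to this machine on
# port 9201; adjust the port and caps to match your transmit configuration.
import cv2

PIPELINE = (
    "udpsrc port=9201 "
    "! application/x-rtp,media=video,encoding-name=H264 "
    "! rtph264depay ! h264parse ! avdec_h264 "
    "! videoconvert ! appsink drop=1"
)

cap = cv2.VideoCapture(PIPELINE, cv2.CAP_GSTREAMER)
if not cap.isOpened():
    raise RuntimeError("Pipeline failed to open; is OpenCV built with GStreamer?")

while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("go1 fisheye", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```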
Clone the project repositories:

git clone https://github.com/AgnelFernando/Cerberus.git
git clone https://github.com/unitreerobotics/UnitreecameraSDK.git

Connect to the Go1's Wi-Fi hotspot:

- Wi-Fi Name: GoXXXXXXXX
- Password: 0000000
Copy the camera SDK and helper files to the Motion Control Board:

scp -r UnitreecameraSDK pi@192.168.12.1:/home/pi/
scp -r Cerberus/us_lcm pi@192.168.12.1:/home/pi/
scp Cerberus/scripts/kill_cam_process.sh pi@192.168.12.1:/home/pi/
# password: 123
ssh pi@192.168.12.1
From the Pi, copy the camera SDK and script to the head sensing board:

scp -r UnitreecameraSDK unitree@192.168.123.13:/home/unitree/
scp kill_cam_process.sh unitree@192.168.123.13:/home/unitree/
# password: 123
ssh unitree@192.168.123.13
chmod +x kill_cam_process.sh
./kill_cam_process.sh
cd UnitreecameraSDK
vim trans_rect_config.yaml

Modify the IpLastSegment field:

data: [ 15. ]   # Change this
# to:
data: [ 100. ]  # So that it streams to your computer at .100

Then build and launch the camera stream:
mkdir build
cd build
cmake ..
make
cd ..
./bin/example_putImagetrans

Leave this terminal open to maintain the stream.
In a new terminal, start the ultrasonic relay on the Motion Control Board:

ssh pi@192.168.12.1
cd us_lcm
python us_mqtt.py
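Optionally, verify that ultrasonic readings are arriving over MQTT before continuing. This one-off check assumes the broker is reachable at 192.168.12.1; adjust the hostname to wherever your MQTT broker actually runs.

```python
# One-shot check that something is being published on robot/ultrasonic.
# Assumption: the MQTT broker is reachable at 192.168.12.1; adjust as needed.
from paho.mqtt import subscribe

msg = subscribe.simple("robot/ultrasonic", hostname="192.168.12.1")
print(msg.topic, msg.payload)
```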
Then set up the Cerberus Python environment:

cd Cerberus
conda create -n cerberus python=3.10
conda activate cerberus
pip install -r requirements.txt

Build OpenCV with GStreamer support:

git clone https://github.com/opencv/opencv.git
cd opencv
mkdir build && cd build
cmake -D CMAKE_BUILD_TYPE=Release \
-D CMAKE_INSTALL_PREFIX=$CONDA_PREFIX \
-D PYTHON_EXECUTABLE=$(which python) \
-D WITH_GSTREAMER=ON \
-D BUILD_opencv_python3=ON \
-D OPENCV_GENERATE_PKGCONFIG=ON \
..
make -j$(nproc)
make install
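Before running the patrol service, it can help to confirm that the OpenCV your environment imports is the GStreamer-enabled build (a quick sanity check, not part of the original steps):

```python
# Print OpenCV's version and its GStreamer line from the build summary.
import cv2

info = cv2.getBuildInformation()
gst = next((l.strip() for l in info.splitlines() if "GStreamer" in l),
           "GStreamer: not listed")
print(cv2.__version__, "|", gst)   # expect something like "GStreamer: YES"
```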
Ensure the Go1 is in walk mode, then run:

python patrol_service.py

This will:
- Start receiving ultrasonic data from MQTT
- Launch the object detection module
- Begin autonomous patrol based on a grid map (an illustrative occupancy-grid sketch follows below)
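To make the grid-map idea concrete, here is a tiny, illustrative occupancy-grid update from a single ultrasonic range reading. The cell size, grid extent, log-odds increments, and sensor geometry are assumptions for this example and are not taken from patrol_service.py.

```python
# Illustrative only: update a small occupancy grid (log-odds) from one
# forward-facing ultrasonic range reading. All parameters are assumptions.
import numpy as np

CELL_M = 0.05                     # 5 cm cells (assumed)
grid = np.zeros((200, 200))       # 10 m x 10 m map, robot at the centre

def update_forward_ray(grid, robot_rc, heading_rad, range_m, max_range_m=3.0):
    """Mark cells along the ray as free and the hit cell as occupied."""
    r0, c0 = robot_rc
    steps = int(min(range_m, max_range_m) / CELL_M)
    for i in range(steps):
        r = int(r0 - i * np.cos(heading_rad))
        c = int(c0 + i * np.sin(heading_rad))
        grid[r, c] -= 0.4          # free space: decrease log-odds
    if range_m < max_range_m:      # an obstacle was actually detected
        r = int(r0 - steps * np.cos(heading_rad))
        c = int(c0 + steps * np.sin(heading_rad))
        grid[r, c] += 0.9          # occupied: increase log-odds
    return grid

grid = update_forward_ray(grid, robot_rc=(100, 100), heading_rad=0.0, range_m=1.2)
print("occupied cells:", int((grid > 0.5).sum()))
```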
Contributions are welcome!
If you’d like to improve Cerberus, feel free to:
- Fork the repo
- Create a feature branch
- Submit a pull request
