
Commit 148bf74

Authored by Juliaj, maciejmajek, and sachinkum0009
feat: RAI Semantic Map Memory (rai_semap) (#727)
Co-authored-by: Maciej Majek <46171033+maciejmajek@users.noreply.github.com>
Co-authored-by: Maciej Majek <maciej.majek@robotec.ai>
Co-authored-by: Sachin Kumar <sachinkum123567@gmail.com>
1 parent: 1567ab1 · commit: 148bf74

39 files changed: +6,276 −114 lines

poetry.lock

Lines changed: 131 additions & 114 deletions (generated file; diff not rendered)

pyproject.toml

Lines changed: 7 additions & 0 deletions

```diff
@@ -40,6 +40,13 @@ optional = true
 [tool.poetry.group.s2s.dependencies]
 rai_s2s = {path = "src/rai_s2s", develop = true, extras = ["all"]}

+[tool.poetry.group.semap]
+optional = true
+
+[tool.poetry.group.semap.dependencies]
+rai_semap = {path = "src/rai_semap", develop = true}
+
 [tool.poetry.group.simbench]
 optional = true
```

src/rai_semap/README.md

Lines changed: 139 additions & 0 deletions
# RAI Semantic Map Memory

⚠️ **Experimental Module**: This module is in active development. Features may change, and some functionality is still in progress.
## Overview

Imagine your robot exploring a new warehouse or office building. Using SLAM (Simultaneous Localization and Mapping), it builds a geometric map showing walls and open areas, but it does not remember the objects it saw, such as a tool cart or equipment in the storage area.

RAI Semantic Map Memory solves this by adding a memory layer. As the robot explores, it remembers not only where the walls are, but also which objects it detected and where they were located. Later, you can ask questions like "where did I see a pallet?" or "what objects are near the loading dock?", and the robot can answer from its stored memory.

This module provides persistent storage of semantic annotations, linking object identities (such as "shelf", "cart", or "pallet") to their 3D locations in the map. It enables spatial-semantic queries that combine "what" and "where" information.
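A semantic annotation can be pictured as a small record that pairs "what" with "where". The following is only an illustrative sketch with hypothetical field names; the module's actual data model may differ:

```python
from dataclasses import dataclass


@dataclass
class Annotation:
    """Illustrative record pairing an object identity with a 3D map location."""

    object_class: str  # "what": e.g. "shelf", "cart", "pallet"
    x: float           # "where": position in the map frame (meters)
    y: float
    z: float
    confidence: float  # detector confidence in [0, 1]


# Example: a pallet detected near the loading dock
ann = Annotation("pallet", x=4.2, y=1.5, z=0.0, confidence=0.87)
print(ann.object_class, ann.x, ann.y)
```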
## Usage Examples

- Store object detections with their locations as the robot explores
- Query objects by location: "what's near point (x, y)?"
- Visualize stored annotations overlaid on the SLAM map

For detailed design and architecture, see [design.md](../design.md).
## Quick Start

The following examples use the ROSBot XL demo to illustrate how to use rai_semap.
### Prerequisites

- A configured ROS 2 environment
- rai_semap installed: `poetry install --with semap`
- ROSBot XL demo configured (see [ROSBot XL demo](../../docs/demos/rosbot_xl.md))
### Step 0: Launch the ROSBot XL Demo

Follow the instructions in the [ROSBot XL demo](../../docs/demos/rosbot_xl.md).
### Step 1: Launch the Semantic Map Node

Start the semantic map node to begin collecting and storing detections:

```bash
ros2 launch src/rai_semap/rai_semap/scripts/semap.launch.py
```
This uses the default configuration files from `rai_semap/ros2/config/`. The defaults assume the depth topic `/camera/depth/image_rect_raw` and the camera info topic `/camera/color/camera_info`. If your topics use different names, create custom config files or override the parameters.

To use custom configs:

```bash
ros2 launch src/rai_semap/rai_semap/scripts/semap.launch.py \
    node_config:=/path/to/node.yaml \
    detection_publisher_config:=/path/to/detection_publisher.yaml \
    perception_utils_config:=/path/to/perception_utils.yaml
```
### Step 2: Collect Detections

In a separate terminal, run the navigation script to move the robot through waypoints and collect detections:

```bash
python -m rai_semap.scripts.navigate_collect \
    --waypoints 2.0 0.0 4.0 0.0 2.0 2.0 \
    --collect-duration 10.0 \
    --use-sim-time
```

The script navigates to each waypoint and waits there so that detections can be collected and stored in the semantic map.
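The `--waypoints` flag takes a flat list of coordinates; assuming it encodes (x, y) pairs, the command above visits (2.0, 0.0), (4.0, 0.0), and (2.0, 2.0). The chunking can be sketched as:

```python
def parse_waypoints(values):
    """Group a flat list of floats into (x, y) waypoint pairs."""
    if len(values) % 2 != 0:
        raise ValueError("expected an even number of coordinates")
    return [(values[i], values[i + 1]) for i in range(0, len(values), 2)]


print(parse_waypoints([2.0, 0.0, 4.0, 0.0, 2.0, 2.0]))
# → [(2.0, 0.0), (4.0, 0.0), (2.0, 2.0)]
```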
### Step 3: Validate Stored Data

After navigation completes, verify what was stored:

```bash
python -m rai_semap.scripts.validate_semap \
    --database-path semantic_map.db \
    --location-id default_location
```

The validation script shows the total annotation count, annotations grouped by object class, confidence scores, and the spatial distribution.
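The per-class summary the validation script prints can be reproduced with a few lines of SQL. This sketch assumes a hypothetical `annotations` table with `object_class` and `confidence` columns; the real schema used by rai_semap may differ:

```python
import sqlite3

# Build a toy in-memory database with the assumed schema
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE annotations (object_class TEXT, confidence REAL)")
conn.executemany(
    "INSERT INTO annotations VALUES (?, ?)",
    [("shelf", 0.9), ("shelf", 0.8), ("cart", 0.7)],
)

# Count annotations and average confidence per object class
rows = conn.execute(
    "SELECT object_class, COUNT(*), AVG(confidence) "
    "FROM annotations GROUP BY object_class ORDER BY object_class"
).fetchall()
for object_class, count, avg_conf in rows:
    print(f"{object_class}: {count} annotations, mean confidence {avg_conf:.2f}")
```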
## Configuration

Configuration parameters (`database_path`, `location_id`, topic names, etc.) are set in YAML config files. If no config files are provided, the defaults in `rai_semap/ros2/config/` are used.
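A config file might look roughly like the following ROS 2 parameter YAML. The node name and parameter keys here are hypothetical; consult the actual files in `rai_semap/ros2/config/` for the real ones:

```yaml
# Hypothetical node.yaml sketch; see rai_semap/ros2/config/ for the real keys
semantic_map_node:
  ros__parameters:
    database_path: "semantic_map.db"
    location_id: "default_location"
    depth_topic: "/camera/depth/image_rect_raw"
    camera_info_topic: "/camera/color/camera_info"
```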
## Visualization

View your semantic map annotations overlaid on the SLAM map in RViz2.
### Start the Visualizer

```bash
python -m rai_semap.ros2.visualizer \
    --ros-args \
    -p database_path:=semantic_map.db \
    -p location_id:=default_location \
    -p update_rate:=1.0 \
    -p marker_scale:=0.3 \
    -p show_text_labels:=true
```
### Setup RViz2

Launch RViz2 with the provided config file:

```bash
rviz2 -d src/rai_semap/rai_semap/scripts/semantic_map.rviz
```

The config file includes:

- a Map display subscribed to the `/map` topic
- a MarkerArray display subscribed to the `/semantic_map_markers` topic
- the Fixed Frame set to `map`
The visualizer shows color-coded markers by object class (bed=blue, chair=green, door=orange, shelf=purple, table=violet). Marker transparency scales with the confidence score, and optional text labels show object class names.
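The color and transparency scheme described above can be pictured as a simple lookup, with the marker's alpha taken from the detection confidence. This is an illustration of the idea, not the visualizer's actual code, and the RGB values are assumptions:

```python
# Illustrative class→RGB mapping matching the color names described above
CLASS_COLORS = {
    "bed": (0.0, 0.0, 1.0),    # blue
    "chair": (0.0, 1.0, 0.0),  # green
    "door": (1.0, 0.5, 0.0),   # orange
    "shelf": (0.5, 0.0, 0.5),  # purple
    "table": (0.9, 0.5, 0.9),  # violet
}


def marker_color(object_class, confidence):
    """Return (r, g, b, a); alpha scales with detector confidence."""
    r, g, b = CLASS_COLORS.get(object_class, (0.5, 0.5, 0.5))  # gray fallback
    a = max(0.0, min(1.0, confidence))  # clamp to a valid alpha
    return (r, g, b, a)


print(marker_color("chair", 0.75))
# → (0.0, 1.0, 0.0, 0.75)
```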
## Querying the Semantic Map

Query stored annotations programmatically using the Python API:
```python
from geometry_msgs.msg import Point
from rai_semap.core.backend.sqlite_backend import SQLiteBackend
from rai_semap.core.semantic_map_memory import SemanticMapMemory

# Initialize memory
backend = SQLiteBackend("semantic_map.db")
memory = SemanticMapMemory(
    backend=backend,
    location_id="default_location",
    map_frame_id="map",
    resolution=0.05,
)

# Query annotations near a location
center = Point(x=2.0, y=0.0, z=0.0)
annotations = memory.query_by_location(center, radius=2.0)

for ann in annotations:
    print(f"Found {ann.object_class} at ({ann.pose.position.x}, {ann.pose.position.y})")
```
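Conceptually, `query_by_location` is a radius filter over stored annotation positions. A self-contained sketch of that idea in plain Python, independent of the actual backend:

```python
import math

# (object_class, x, y) triples standing in for stored annotations
annotations = [
    ("shelf", 2.5, 0.5),
    ("cart", 1.8, -0.4),
    ("pallet", 6.0, 3.0),
]


def query_by_location(center, radius):
    """Return annotations whose 2D distance from center is within radius."""
    cx, cy = center
    return [a for a in annotations if math.hypot(a[1] - cx, a[2] - cy) <= radius]


print(query_by_location((2.0, 0.0), radius=2.0))
# → [('shelf', 2.5, 0.5), ('cart', 1.8, -0.4)]
```

The real implementation stores annotations in SQLite rather than a Python list, but the query it answers is the same "what is within r meters of this point?" question.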
