If you did not install CUDA, you can run instead (note that the tracking performance is significantly better with the GPU version):
```bash
roslaunch dbot_example launch_example_cpu.launch
```
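If you are unsure whether the CUDA toolkit is available on your machine, a small shell check can pick the launch file automatically. This is a minimal sketch: it assumes the GPU variant is named `launch_example_gpu.launch` (by analogy with the CPU launch file above), and it uses the presence of `nvcc` as a proxy for a working CUDA installation.

```bash
# Choose the GPU example when the CUDA compiler is on the PATH, else fall back
# to the CPU example. The GPU launch file name here is an assumption.
if command -v nvcc >/dev/null 2>&1; then
  LAUNCH_FILE=launch_example_gpu.launch
else
  LAUNCH_FILE=launch_example_cpu.launch
fi
echo "roslaunch dbot_example $LAUNCH_FILE"
```

Replace the final `echo` with the actual `roslaunch` call once you have confirmed the launch file names in your checkout.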
As soon as you launch the example, rviz should start and an interactive marker should appear (in the form of an impact wrench). This marker is used to initialize the tracker; you can move it to align it
If CUDA is not being used, you can start the CPU-based setup instead (note that the tracking performance is significantly better with the GPU version):
```bash
roslaunch dbrt_example launch_example_cpu.launch
```
This will start the data playback, the visualization and the robot tracker.
You should see a point cloud in white, the robot model using only joint
encoders in red, and the corrected robot model (fusing joint encoders and depth images) in blue. It should be visible
that the blue robot model is significantly better aligned with the point cloud than