
Commit e11dd5b

Update README.md
1 parent 35275ae commit e11dd5b

1 file changed: 10 additions, 12 deletions


README.md

@@ -64,13 +64,13 @@ catkin_make -DCMAKE_BUILD_TYPE=Release -DDBOT_BUILD_GPU=Off
 ```

 ### Install and run the example
-The getting started repository contains a ROS bagfile (a depth image sequence of an object being moved),
+The getting started repository contains a ROS bagfile (a depth image sequence of an object being moved)
 and mesh models of some objects. Additionally it contains launch files, which allow you
 to run the code easily.

-To install, follow these steps:
+To install, follow these steps (note that cloning may take a while because the bagfile is large):
 ```bash
-cd projects/tracking/src
+cd $HOME/projects/tracking/src
 git clone https://git-amd.tuebingen.mpg.de/open-source/dbot_getting_started.git
 cd ..
 catkin_make -DCMAKE_BUILD_TYPE=Release -DDBOT_BUILD_GPU=On
@@ -86,11 +86,11 @@ Now you can run the example:
 roslaunch dbot_example launch_example_gpu.launch
 ```

-If you did not install CUDA, you can run instead:
+If you did not install CUDA, you can run instead (note that the tracking performance is significantly better with the GPU version):
 ```bash
 roslaunch dbot_example launch_example_cpu.launch
 ```
-Note that the tracking performance is significantly better with the GPU version.
+


 As soon as you launch the example, rviz should start, and an interactive marker should show up (in the form of an impact wrench). This marker is for initialization of the tracker, you can move it to align it
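The hunk above moves the GPU-vs-CPU note inline. The choice between the two launch files can also be scripted; a minimal sketch, where the `nvidia-smi` probe and the `launchfile` variable are illustrative assumptions and not part of the README:

```shell
# Rough sketch: pick the GPU launch file when an NVIDIA driver is visible,
# otherwise fall back to the CPU variant. nvidia-smi is only a heuristic
# probe for a usable CUDA setup, not a definitive check.
if command -v nvidia-smi >/dev/null 2>&1; then
  launchfile="launch_example_gpu.launch"
else
  launchfile="launch_example_cpu.launch"
fi
echo "would run: roslaunch dbot_example $launchfile"
```

The last line only prints the command; replace the `echo` with a real `roslaunch` call once the workspace is built and sourced.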
@@ -125,8 +125,7 @@ first the workspace setup of the object tracking above. Then continue
 with the instructions below:

 ```bash
-cd $HOME
-cd projects/tracking/src
+cd $HOME/projects/tracking/src
 git clone git@github.com:bayesian-object-tracking/dbrt.git
 cd ..
 catkin_make -DCMAKE_BUILD_TYPE=Release -DDBOT_BUILD_GPU=On
@@ -138,10 +137,10 @@ catkin_make -DCMAKE_BUILD_TYPE=Release -DDBOT_BUILD_GPU=Off

 ### Install and run the example

-Add the following example project to the workspace
+Add the following example project to the workspace (note that cloning may take a while due to the size of the data)

 ```bash
-cd src
+cd $HOME/projects/tracking/src
 git clone https://git-amd.tuebingen.mpg.de/open-source/dbrt_getting_started.git
 cd ..
 catkin_make -DCMAKE_BUILD_TYPE=Release -DDBOT_BUILD_GPU=On
@@ -154,15 +153,14 @@ recorded sensory data:
 roslaunch dbrt_example launch_example_gpu.launch
 ```

-If CUDA is not being used, you can start the CPU based setup instead:
+If CUDA is not being used, you can start the CPU based setup instead (note that the tracking performance is significantly better with the GPU version):
 ```bash
 roslaunch dbrt_example launch_example_cpu.launch
 ```
-Note that the tracking performance is significantly better with the GPU version.

 This will start the data playback, the visualization and the robot tracker.
 You should see a point cloud in white, the robot model using only joint
-encoders in red, and the corrected robot model in blue. It should be visible
+encoders in red, and the corrected robot model (fusing joint encoders and depth images) in blue. It should be visible
 that the blue robot model is significantly better aligned with the point cloud than
 the red one.

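Across the hunks in this commit, every clone now starts from the same workspace path. The affected commands, collected into one reviewable sequence; the `WS` variable is hypothetical, its default matches the README's `$HOME/projects/tracking` workspace, and the plan is printed rather than executed so the large clones can be reviewed before copy-pasting:

```shell
# The clone-and-build steps touched by this diff, gathered into one plan.
# WS is a hypothetical variable; its default matches the README's
# $HOME/projects/tracking catkin workspace. Printed, not executed.
WS="${WS:-$HOME/projects/tracking}"

plan="cd $WS/src
git clone https://git-amd.tuebingen.mpg.de/open-source/dbot_getting_started.git
git clone git@github.com:bayesian-object-tracking/dbrt.git
git clone https://git-amd.tuebingen.mpg.de/open-source/dbrt_getting_started.git
cd $WS
catkin_make -DCMAKE_BUILD_TYPE=Release -DDBOT_BUILD_GPU=On"

printf '%s\n' "$plan"
```

Pass `-DDBOT_BUILD_GPU=Off` instead, as in the README's earlier setup, when building without CUDA.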
