Commit 35275ae

Update README.md
1 parent 1e5a72c commit 35275ae

File tree

1 file changed (+10, -11 lines)


README.md

Lines changed: 10 additions & 11 deletions
@@ -21,8 +21,8 @@ The Bayesian Object Tracking organization on github is a collection of packages
 3D tracking of rigid objects (using depth images), as well as robot arm tracking (using depth images and joint encoders).
 For more details about the research which underlies these packages, please have a look at https://am.is.tuebingen.mpg.de/research_projects/probabilistic-object-tracking-using-a-depth-camera.

-The core libraries for object tracking are ROS independent. However,
-the tracker integration with sensors is based on the ROS eco-system.
+The core library for object tracking (dbot) is ROS independent. However,
+the integration with sensors (dbot_ros, dbrt) is based on the ROS eco-system.

 Here, we give instructions on how to install the code and a getting started
 repository. This repository contains a complete example, including the
@@ -43,7 +43,7 @@ to your needs.
 * [Eigen](http://eigen.tuxfamily.org/) 3.2.1 or later

 ## Object Tracking
-The object tracking can be used without the robot tracking package (dbrt).
+The object tracking (dbot, dbot_ros) can be used without the robot tracking package (dbrt).

 ### Workspace setup and compilation
 ```bash
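
The hunks around this point touch the "Workspace setup and compilation" section, and the next hunk header quotes the build command `catkin_make -DCMAKE_BUILD_TYPE=Release -DDBOT_BUILD_GPU=Off`. As a hedged illustration of where that command fits, here is a minimal catkin workspace sketch; the workspace path is an assumption, and the README's own step list (not shown in this diff) remains the authoritative set of repositories to clone.

```bash
# Sketch only: workspace path is an assumption, not part of this diff.
mkdir -p ~/tracking_ws/src
cd ~/tracking_ws/src
# ... clone dbot, dbot_ros, and the getting-started packages here (see the README) ...
cd ~/tracking_ws
# CPU-only build, exactly as quoted in the hunk header below:
catkin_make -DCMAKE_BUILD_TYPE=Release -DDBOT_BUILD_GPU=Off
# Make the newly built packages visible to ROS tools in this shell:
source devel/setup.bash
```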
@@ -64,9 +64,8 @@ catkin_make -DCMAKE_BUILD_TYPE=Release -DDBOT_BUILD_GPU=Off
 ```

 ### Install and run the example
-
 The getting started repository contains a ROS bagfile (a depth image sequence of an object being moved),
-and mesh models of some objects. Additionally it contains launch files, which allow
+and mesh models of some objects. Additionally it contains launch files, which allow you
 to run the code easily.

 To install, follow these steps:
@@ -86,20 +85,20 @@ Now you can run the example:
 ```bash
 roslaunch dbot_example launch_example_gpu.launch
 ```
+
 If you did not install CUDA, you can run instead:
 ```bash
 roslaunch dbot_example launch_example_cpu.launch
 ```
 Note that the tracking performance is significantly better with the GPU version.


-As soon as you launch the example, an interactive marker should show up in
-rviz. This is for initialization of the tracker, you can move it to align it
-with the point cloud, but it should already be approximately aligned. Once you
-are done, you can click on the object marker and the tracker should start. You should
-do so before the object is being moved in the playback of the bagfile.
+As soon as you launch the example, rviz should start, and an interactive marker should show up (in the form of an impact wrench). This marker is for initialization of the tracker; you can move it to align it
+with the point cloud. In this example, it should already be approximately aligned. Once you
+are done moving the marker, you can click on it and the tracker should start (note that in the recorded sequence the object starts moving at some point; make sure you initialize before that). You should see a green object
+model following the actual object visible in the white point cloud.

-### Addition documentation
+### Additional documentation

 For additional details about the object tracking, please checkout the
 [dbot_ros](https://github.com/bayesian-object-tracking/dbot_ros/blob/master/README.md) package.
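
The last hunk documents how to launch the example and initialize the tracker from the interactive marker in rviz. As a hedged aside, standard ROS introspection tools can confirm that the tracker is up and publishing after you click the marker; the grep pattern below is only a guess at the topic naming, not something defined by the packages.

```bash
# After roslaunch dbot_example launch_example_gpu.launch (or the CPU variant)
# and clicking the interactive marker in rviz:
rosnode list                    # the example's nodes should be listed here
rostopic list | grep -i track   # naming pattern is a guess; adjust as needed
# Print one message from a topic of interest (replace <topic> accordingly):
rostopic echo -n 1 <topic>
```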
