README.md: 10 additions & 11 deletions
@@ -21,8 +21,8 @@ The Bayesian Object Tracking organization on github is a collection of packages
3D tracking of rigid objects (using depth images), as well as robot arm tracking (using depth images and joint encoders).
For more details about the research which underlies these packages, please have a look at https://am.is.tuebingen.mpg.de/research_projects/probabilistic-object-tracking-using-a-depth-camera.

-The core libraries for object tracking are ROS independent. However,
-the tracker integration with sensors is based on the ROS eco-system.
+The core library for object tracking (dbot) is ROS independent. However,
+the integration with sensors (dbot_ros, dbrt) is based on the ROS eco-system.

Here, we give instructions on how to install the code and a getting started
repository. This repository contains a complete example, including the
@@ -43,7 +43,7 @@ to your needs.
* [Eigen](http://eigen.tuxfamily.org/) 3.2.1 or later

## Object Tracking
-The object tracking can be used without the robot tracking package (dbrt).
+The object tracking (dbot, dbot_ros) can be used without the robot tracking package (dbrt).
The getting started repository contains a ROS bagfile (a depth image sequence of an object being moved),
-and mesh models of some objects. Additionally it contains launch files, which allow
+and mesh models of some objects. Additionally, it contains launch files, which allow you
to run the code easily.

To install, follow these steps:
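The concrete install steps fall outside the hunks shown in this diff. As a rough sketch only — the repository URL, workspace layout, and build flags below are assumptions, not taken from this README — a typical catkin setup would look like:

```bash
# Hedged sketch, not the verbatim steps from the README: the dbot and
# dbot_ros packages must also end up in the workspace (the getting
# started repository documents how to fetch them).
mkdir -p ~/catkin_ws/src && cd ~/catkin_ws/src
git clone https://github.com/bayesian-object-tracking/getting_started.git
cd ~/catkin_ws
catkin_make -DCMAKE_BUILD_TYPE=Release
source devel/setup.bash
```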
@@ -86,20 +85,20 @@ Now you can run the example:
```bash
roslaunch dbot_example launch_example_gpu.launch
```
+
If you did not install CUDA, you can run instead:
```bash
roslaunch dbot_example launch_example_cpu.launch
```
Note that the tracking performance is significantly better with the GPU version.
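If you are unsure whether a usable CUDA setup is present, a quick check from the shell (not part of this diff; it assumes the NVIDIA driver and CUDA toolkit are installed):

```bash
# Sanity checks for a CUDA-capable machine (assumes NVIDIA tooling is installed).
nvidia-smi       # shows the driver version and the GPUs it can see
nvcc --version   # shows the installed CUDA toolkit version
```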

-As soon as you launch the example, an interactive marker should show up in
-rviz. This is for initialization of the tracker, you can move it to align it
-with the point cloud, but it should already be approximately aligned. Once you
-are done, you can click on the object marker and the tracker should start. You should
-do so before the object is being moved in the playback of the bagfile.
+As soon as you launch the example, rviz should start, and an interactive marker should show up (in the form of an impact wrench). This marker is for initializing the tracker; you can move it to align it
+with the point cloud. In this example, it should already be approximately aligned. Once you
+are done moving the marker, you can click on it and the tracker should start (note that in the recorded sequence the object starts moving at some point; make sure you initialize before that). You should see a green object
+model following the actual object visible in the white point cloud.

-### Addition documentation
+### Additional documentation

For additional details about the object tracking, please check out the