

SARosPerceptionKitti

ROS package for the Perception (Sensor Processing, Detection, Tracking and Evaluation) of the KITTI Vision Benchmark Suite

Demo

Setup

Sticking to this folder structure is highly recommended:

    ~                                        # Home directory
    ├── catkin_ws                            # Catkin workspace
    │   ├── src                              # Source folder
    │       └── SARosPerceptionKitti         # Repo
    ├── kitti_data                           # Dataset
    │   ├── 0012                             # Demo scenario 0012
    │   │   └── synchronized_data.bag        # Synchronized ROSbag file

1) Install ROS and create a catkin workspace in your home directory:

mkdir -p ~/catkin_ws/src
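
For reference, a minimal sketch of installing ROS itself, assuming Ubuntu 16.04 with ROS Kinetic (the distribution is an assumption; substitute the one matching your Ubuntu release):

# Assumption: Ubuntu 16.04 / ROS Kinetic; adapt the package name to your distribution
sudo apt-get update
sudo apt-get install ros-kinetic-desktop-full
# Set up rosdep and source the ROS environment in every new shell
sudo rosdep init
rosdep update
echo "source /opt/ros/kinetic/setup.bash" >> ~/.bashrc
source ~/.bashrc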

2) Clone this repository into the catkin workspace's source folder (src) and build it:

cd ~/catkin_ws/src
git clone https://github.com/appinho/SARosPerceptionKitti.git
cd ~/catkin_ws
catkin_make
source devel/setup.bash
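
Optionally, to avoid re-sourcing the workspace in every new terminal, the setup file can be added to your shell startup script (a common convenience, not required by this package):

# Source the catkin workspace automatically in new shells (optional)
echo "source ~/catkin_ws/devel/setup.bash" >> ~/.bashrc
source ~/.bashrc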

3) Download a preprocessed scenario and unzip it into a separate kitti_data directory, also stored under your home directory:

mkdir ~/kitti_data && cd ~/kitti_data/
mv ~/Downloads/0012.zip .
unzip 0012.zip
rm 0012.zip
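
As an optional sanity check, the downloaded bag file can be inspected with rosbag; the exact topic names and message counts depend on how the scenario was preprocessed:

# Print duration, topics and message counts of the synchronized bag
rosbag info ~/kitti_data/0012/synchronized_data.bag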

Usage

1) Launch one of the following ROS nodes to perform and visualize the pipeline (Sensor Processing -> Object Detection -> Object Tracking) step-by-step:

source devel/setup.bash
roslaunch sensor_processing sensor_processing.launch home_dir:=/home/YOUR_USERNAME
roslaunch detection detection.launch home_dir:=/home/YOUR_USERNAME
roslaunch tracking tracking.launch home_dir:=/home/YOUR_USERNAME
  • Default parameters:
    * scenario:=0012
    * speed:=0.2
    * delay:=3

Without assigning any of the above parameters, the demo scenario 0012 is replayed at 20% of its original speed with a 3 second delay, so that RViz has enough time to boot up.
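
For example, replaying the same scenario at half speed with a longer start-up delay could look like this (the parameter values are only illustrative):

# Replay scenario 0012 at 50% of its speed with a 5 second delay before playback
roslaunch sensor_processing sensor_processing.launch home_dir:=/home/YOUR_USERNAME scenario:=0012 speed:=0.5 delay:=5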

2) Write the results to file and evaluate them:

roslaunch evaluation evaluation.launch home_dir:=/home/YOUR_USERNAME
cd ~/catkin_ws/src/SARosPerceptionKitti/benchmark/python
python evaluate_tracking.py
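
The evaluation step writes the tracking results to file before the Python script scores them. To confirm that output was produced, one plausible place to look is the benchmark folder of the repository (an assumption; adjust the path if your launch configuration writes elsewhere):

# List the benchmark folder to check for written result files (location is an assumption)
ls ~/catkin_ws/src/SARosPerceptionKitti/benchmark/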

Results for demo scenario 0012

| Class      | MOTA     | MOTP     | MOTAL    | MODA     | MODP     |
| ---------- |:--------:|:--------:|:--------:|:--------:|:--------:|
| Car        | 0.881119 | 0.633595 | 0.881119 | 0.881119 | 0.642273 |
| Pedestrian | 0.546875 | 0.677919 | 0.546875 | 0.546875 | 0.836921 |
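
For orientation, MOTA (multi-object tracking accuracy) and MOTP (multi-object tracking precision) follow the CLEAR MOT definitions; the KITTI devkit's exact implementation may differ in details, but in the standard form they read

MOTA = 1 - (FN + FP + IDSW) / GT
MOTP = (sum of overlaps of all matched object-hypothesis pairs) / (number of matches)

where FN, FP and IDSW are the accumulated misses, false positives and identity switches, and GT is the total number of ground-truth objects.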

Contact

If you have any questions, features you would love to see added, or ideas on how to realize the points in the Area of Improvements, send me an email at [email protected]! I am more than happy to collaborate and to hear any kind of feedback.
