chili-epfl / attention-tracker

⚠️ Attention: This library is currently not maintained. Please use the gazr fork instead. ⚠️

Attention Tracker

Face tracking for head pose estimation

Head pose estimation

This library (`libhead_pose_estimation.so`) performs 3D head pose estimation based on the fantastic dlib face detector and a bit of OpenCV's `solvePnP` magic (it uses adult male anthropometric data to match a real 3D head to the projected image).

The library returns a 4x4 transformation matrix.

It supports detection and tracking of multiple faces at the same time, and runs on-line, but it does not feature face identification.

Installation

Note: The library has only been tested on Linux. We can only provide limited support for other operating systems!

Pre-requisites

Dlib: You need to download and extract Dlib somewhere. This application has been tested with `dlib-18.16`.

OpenCV: You need to install OpenCV. If you're using Ubuntu, you could run:

```
sudo apt-get install libopencv-dev
```

Installation

The library uses a standard CMake workflow:

```
$ mkdir build && cd build
$ cmake -DDLIB_PATH=<path/to/dlib> ..
$ make
```

Note that the first time you compile the project, `dlib` is compiled as well. This takes a few minutes, but only happens once.

To test the library, run:

```
./head_pose_test ../share/shape_predictor_68_face_landmarks.dat
```

You should get something very similar to the screenshot in the repository README.

Finally, to install the library:

```
$ make install
```

ROS support

The ROS wrapper provides a convenient node that exposes each detected face as a TF frame.

Enable the compilation of the ROS wrapper with:

```
cmake -DWITH_ROS=ON
```
