Dab and T-Pose Controlled Lights

Control your lights by dabbing and t-pose'ing, duh


Check out the full blog post here.

Getting the ZWave Controller working on the TX2

By default, I couldn't write to the device that the ZWave USB controller came up on with my NVIDIA Tegra TX2.

You'll need to do something like the following to get permissions:

$ sudoedit /etc/udev/rules.d/50-myusb.rules

Followed by inserting the following lines to grant permissions to the device:

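The original rule contents aren't reproduced here; as a sketch, assuming the ZWave stick enumerates as a USB serial device (`/dev/ttyUSB*` or `/dev/ttyACM*`), a rule like this opens it up:

```
# /etc/udev/rules.d/50-myusb.rules
# Hypothetical rule: match USB serial devices and make them world-read/writable.
# Tighten the match with ATTRS{idVendor}/ATTRS{idProduct} from `lsusb` for your stick.
KERNEL=="ttyUSB[0-9]*", MODE="0666"
KERNEL=="ttyACM[0-9]*", MODE="0666"
```

Reload the rules afterwards with `sudo udevadm control --reload-rules && sudo udevadm trigger`, or replug the stick.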

Getting an OpenPose Model running on the TX2

There are two ways. I started out by using the original OpenPose repo to get a build and proof of concept running.

I got ~~probably half a frame~~ 1.4(ish) frames per second out of that. You can see how I initialize and use the built-in devboard camera in the file in this repo; that comes directly out of the included Python examples.
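The TX2's onboard camera isn't a plain V4L2 device, so OpenCV has to open it through a GStreamer pipeline string. A minimal sketch, assuming OpenCV was built with GStreamer support and an L4T release that ships `nvcamerasrc` (newer JetPacks use `nvarguscamerasrc` instead):

```python
def tx2_onboard_pipeline(width=1280, height=720, fps=30):
    """Build a GStreamer pipeline string for the TX2 devboard camera.

    Assumes an L4T release with `nvcamerasrc`; swap in `nvarguscamerasrc`
    on newer JetPacks. The NVMM caps keep capture in GPU-accessible memory
    until `nvvidconv` hands BGR frames to OpenCV.
    """
    return (
        "nvcamerasrc ! "
        f"video/x-raw(memory:NVMM), width={width}, height={height}, "
        f"format=I420, framerate={fps}/1 ! "
        "nvvidconv ! video/x-raw, format=BGRx ! "
        "videoconvert ! video/x-raw, format=BGR ! appsink"
    )
```

You'd then open it with `cv2.VideoCapture(tx2_onboard_pipeline(), cv2.CAP_GSTREAMER)` and read frames as usual.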

Because I wanted better response time, I ended up searching for a better model. I found tf-pose-estimation.

It requires TensorFlow, so be sure to grab the latest JetPack release from NVIDIA here when installing.

After that, it should run with the included script; just be sure to run it with the right model. mobilenet_v2_large was the bare minimum for an acceptable detection for me:

$ python3 --model=mobilenet_v2_large --resize=432x368

With this model, I get 4(ish) frames per second on the TX2, much better for detection latency. I may try seeing if I can optimize further after getting a full proof of concept running.

Exploring OpenPose Data and Training a New Classifier

We'll use some saved examples of T-Poses and Dabs to train our classifier. You can see the Jupyter notebook here with examples of labeling and converting our raw NumPy exports to CSVs and Pandas datasets, along with cleanup and training.
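As a sketch of that conversion step, assuming each saved example is an (18, 2) array of COCO keypoints (18 joints, x/y each) and class ids are hypothetical (the real labeling lives in the notebook), a helper like this flattens each pose into one row of a Pandas DataFrame:

```python
import numpy as np
import pandas as pd

# Hypothetical class ids; the notebook defines the real labeling.
LABELS = {0: "neutral", 1: "dab", 2: "tpose"}

def poses_to_dataframe(poses, label):
    """Flatten (N, 18, 2) keypoint arrays into one labeled row per pose."""
    rows = [np.asarray(p).reshape(-1) for p in poses]
    cols = [f"{axis}{j}" for j in range(18) for axis in ("x", "y")]
    df = pd.DataFrame(rows, columns=cols)
    df["label"] = label
    return df

# Example: two fake dab poses and one fake t-pose, merged into one dataset.
dabs = poses_to_dataframe(np.random.rand(2, 18, 2), label=1)
tposes = poses_to_dataframe(np.random.rand(1, 18, 2), label=2)
dataset = pd.concat([dabs, tposes], ignore_index=True)
# dataset.to_csv("poses.csv", index=False)  # export for the notebook
```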

The current (working) architecture looks like this:

*(image: Dab and T-Pose Neural Network Architecture)*
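The exact layer sizes are in the notebook; as a minimal NumPy sketch of this kind of model, assuming the 36 flattened keypoint coordinates feed one hidden layer into three classes (neutral/dab/t-pose — the layer width here is illustrative):

```python
import numpy as np

def forward(x, w1, b1, w2, b2):
    """One forward pass of a small dense classifier over flattened keypoints.

    x: (batch, 36) keypoint coordinates; returns (batch, 3) class probabilities.
    """
    h = np.maximum(x @ w1 + b1, 0.0)            # hidden layer, ReLU
    logits = h @ w2 + b2
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)     # softmax over 3 classes

rng = np.random.default_rng(0)
w1, b1 = rng.normal(size=(36, 64)), np.zeros(64)
w2, b2 = rng.normal(size=(64, 3)), np.zeros(3)
probs = forward(rng.normal(size=(5, 36)), w1, b1, w2, b2)
```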

Running the Project Itself

*(image: Dab and T-Pose Architecture)*

You'll need to get OpenPose up and running, along with the Python libraries for OpenCV and ZWave. After that, you can use the included program; just run it under the


If you want to grab more example poses for retraining, just replace the with the one included in this repo's
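Tying it together, a hypothetical sketch of the glue loop — none of these function names come from the repo; `estimate_pose`, `classify`, and `set_lights` stand in for tf-pose-estimation inference, the trained classifier, and python-openzwave calls:

```python
# Hypothetical pose -> action mapping; the real program defines its own.
POSE_TO_ACTION = {"dab": "lights_off", "tpose": "lights_on"}

def run_loop(frames, estimate_pose, classify, set_lights):
    """For each camera frame: keypoints -> pose class -> light command."""
    for frame in frames:
        keypoints = estimate_pose(frame)      # (18, 2) joints, or None
        if keypoints is None:
            continue
        pose = classify(keypoints)            # "neutral" | "dab" | "tpose"
        action = POSE_TO_ACTION.get(pose)
        if action == "lights_on":
            set_lights(True)
        elif action == "lights_off":
            set_lights(False)

# Stubbed example run: every frame classifies as a t-pose, so lights go on.
state = {"on": False}
run_loop(
    frames=["frame0", "frame1"],
    estimate_pose=lambda f: [(0.5, 0.5)] * 18,
    classify=lambda k: "tpose",
    set_lights=lambda on: state.update(on=on),
)
```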

Known bugs

For some reason, the model test I run on my original dataset doesn't seem to work. I think I messed the data up somewhere along the way in the Jupyter Notebook. If you figure out where that happens, please open a PR. :)
