dancenet

by jsn5

DanceNet - Dance generator using Variational Autoencoder, LSTM and Mixture Density Network. (Keras)

License: MIT

Main components:

  • Variational autoencoder
  • LSTM + Mixture Density Layer
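
At generation time, the Mixture Density Layer outputs mixture weights, means, and variances for the next latent vector, and a sample is drawn from that Gaussian mixture. A minimal NumPy sketch of that sampling step (the component count, latent size, and spherical variances below are illustrative assumptions, not the repository's actual settings):

```python
import numpy as np

def sample_mdn(pi, mu, sigma, rng=np.random.default_rng(0)):
    """Sample one latent vector from a Gaussian mixture.

    pi    : (K,)   mixture weights, summing to 1
    mu    : (K, D) component means
    sigma : (K,)   per-component std deviations (spherical, for simplicity)
    """
    k = rng.choice(len(pi), p=pi)  # pick a mixture component
    return mu[k] + sigma[k] * rng.standard_normal(mu.shape[1])

# Toy example: 3 components over a 2-D latent space
pi = np.array([0.2, 0.5, 0.3])
mu = np.array([[0.0, 0.0], [1.0, 1.0], [-1.0, 2.0]])
sigma = np.array([0.1, 0.1, 0.1])
z = sample_mdn(pi, mu, sigma)
print(z.shape)  # (2,)
```

Sampling (rather than taking the most likely mean) is what lets the model produce varied dance sequences instead of collapsing to one average motion.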

Requirements:

  • Python 3.5.2

### Packages

  • keras==2.2.0
  • sklearn==0.19.1
  • numpy==1.14.3
  • opencv-python==3.4.1
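
The listed versions can be installed in one step with pip; note that on PyPI the `sklearn` package is published under the name `scikit-learn`:

```shell
pip install keras==2.2.0 scikit-learn==0.19.1 numpy==1.14.3 opencv-python==3.4.1
```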

Dataset

The model was trained on this video: https://www.youtube.com/watch?v=NdSqAAT28v0

How to run locally

  • Download the trained weights from here and extract them to the dancenet directory.
  • Run dancegen.ipynb

How to run in your browser

Run on FloydHub

  • Click the button above to open this code in a FloydHub workspace (the trained-weights dataset is attached to the environment automatically)
  • Run dancegen.ipynb

Training from scratch

  • Fill the `imgs/` folder with dance-sequence frames labeled `1.jpg`, `2.jpg`, ...
  • Run `model.py`
  • Run `gen_lv.py` to encode the images
  • Run `video_from_lv.py` to test the decoded video
  • Run the Jupyter notebook `dancegen.ipynb` to train dancenet and generate a new video
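
The generation step in the notebook is autoregressive: the LSTM + MDN predicts the next latent vector from the previous ones, and the decoder turns each latent into a frame. A minimal NumPy sketch of that rollout loop, with a noise-perturbation stand-in where the trained model would go (the latent size and all names here are illustrative assumptions):

```python
import numpy as np

LATENT_DIM = 128  # assumed latent size, not necessarily the repo's value

def predict_next(history):
    """Stand-in for the trained LSTM + MDN: here it just perturbs the
    last latent vector with a little Gaussian noise."""
    rng = np.random.default_rng(len(history))
    return history[-1] + 0.01 * rng.standard_normal(LATENT_DIM)

def generate_sequence(seed, n_frames):
    """Autoregressively roll out n_frames latent vectors from a seed."""
    history = [seed]
    for _ in range(n_frames):
        history.append(predict_next(history))
    return np.stack(history[1:])  # shape (n_frames, LATENT_DIM)

seq = generate_sequence(np.zeros(LATENT_DIM), n_frames=16)
print(seq.shape)  # (16, 128)
```

In the real pipeline, each row of `seq` would be passed through the VAE decoder and the resulting frames written out as a video.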

