philkr / magic_init

Data-dependent initialization of convolutional neural networks

Created by Philipp Krähenbühl.


This code implements the initialization presented in our arXiv tech report, which is under submission at ICLR 2016.

This is a reimplementation and currently work in progress. Use at your own risk.


This code is released under the BSD License (refer to the LICENSE file for details).


If you find our initialization useful in your research, please consider citing:

@article{krahenbuhl2015data,
  title={Data-dependent Initializations of Convolutional Neural Networks},
  author={Kr{\"a}henb{\"u}hl, Philipp and Doersch, Carl and Donahue, Jeff and Darrell, Trevor},
  journal={arXiv preprint arXiv:1511.06856},
  year={2015}
}


Check out the project and create a symlink to caffe in the project directory:

ln -s path/to/caffe/python/caffe caffe


Here is a quick example of how to initialize AlexNet:

python magic_init.py path/to/alexnet/deploy.prototxt path/to/output.caffemodel -d "path/to/some/images/*.png" -q -nit 10 -cs
The `-d` flag allows you to initialize the network using your own images. Feel free to use ImageNet, PASCAL, COCO or whatever you have at hand; it shouldn't make a big difference. The `-q` (quiet) flag suppresses all the caffe logging, and `-nit` controls the number of batches used (a separate flag controls the batch size). Finally, `-cs` rescales the gradients across layers. This rescaling currently works best for feed-forward networks and might not work too well for DAG-structured networks (we are working on that).
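For intuition, the data-dependent step behind this initialization can be sketched as: push a batch of real data through a layer, then rescale its weights and shift its biases so the pre-activations come out zero-mean and unit-variance. The following minimal numpy sketch illustrates the idea on a single dense layer; the names and the toy setup are illustrative, not the repo's actual code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for one fully connected layer (the same idea applies
# per convolution channel). Shapes and names here are assumptions.
X = rng.normal(size=(256, 50)) * 5.0 + 3.0   # a batch of input activations
W = rng.normal(size=(50, 10))                # randomly initialized weights
b = np.zeros(10)

# Data-dependent step: run the batch through the layer, then rescale each
# output unit so its pre-activations have zero mean and unit variance.
z = X @ W + b
mu, sigma = z.mean(axis=0), z.std(axis=0)
W /= sigma            # scale weights per output unit
b = (b - mu) / sigma  # shift biases to cancel the mean

z = X @ W + b
print(np.allclose(z.mean(axis=0), 0, atol=1e-8), np.allclose(z.std(axis=0), 1))
# prints: True True
```

In the actual tool this normalization is applied layer by layer over the batches selected with `-nit`, which is why the choice of images matters so little.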

To run the k-means initialization use:

python magic_init.py path/to/alexnet/deploy.prototxt path/to/output.caffemodel -d "path/to/some/images/*.png" -q -nit 10 -cs -t kmeans
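Roughly, a k-means initialization replaces random filters with centroids of input patches, so each filter starts out matched to structure that actually occurs in the data. A self-contained numpy sketch of that idea (the clustering details and names below are mine, not the repo's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in data: flattened 3x3 patches with 3 channels (27 values each).
patches = rng.normal(size=(500, 27))
k = 8  # number of filters to initialize

# A few Lloyd iterations of plain k-means on the patches.
centroids = patches[rng.choice(len(patches), k, replace=False)]
for _ in range(10):
    d = ((patches[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
    labels = d.argmin(1)
    for j in range(k):
        if (labels == j).any():
            centroids[j] = patches[labels == j].mean(0)

# Use the unit-normalized centroids as initial filters.
filters = centroids / np.linalg.norm(centroids, axis=1, keepdims=True)
print(filters.shape)
# prints: (8, 27)
```

In practice the data-dependent rescaling described above would still be applied afterwards, so the centroid filters end up with well-conditioned outputs too.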


`python magic_init.py -h` should provide you with more help.

Pro tips

If your numpy implementation is based on OpenBLAS, try disabling threading (e.g. by setting the `OPENBLAS_NUM_THREADS=1` environment variable); it can improve the runtime performance a bit.
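One way to apply this tip for a single invocation, assuming your numpy is linked against OpenBLAS (`OPENBLAS_NUM_THREADS` is OpenBLAS's own documented setting, not something specific to this repo):

```shell
# Restrict OpenBLAS to a single thread for one command only.
# (Assumption: the numpy in use is linked against OpenBLAS.)
OPENBLAS_NUM_THREADS=1 python -c "import numpy as np; print(int(np.ones((64, 64)).dot(np.ones((64, 64)))[0, 0]))"
# prints: 64
```

Exporting the variable in your shell profile instead makes the setting apply to every run.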
