

Lasagne is a lightweight library to build and train neural networks in Theano. Its main features are:

  • Supports feed-forward networks such as Convolutional Neural Networks (CNNs), recurrent networks including Long Short-Term Memory (LSTM), and any combination thereof
  • Allows architectures of multiple inputs and multiple outputs, including auxiliary classifiers
  • Many optimization methods including Nesterov momentum, RMSprop and ADAM
  • Freely definable cost function and no need to derive gradients due to Theano's symbolic differentiation
  • Transparent support of CPUs and GPUs due to Theano's expression compiler
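
Among the optimization methods listed above, Nesterov momentum is perhaps the simplest to state. As a hedged sketch in plain Python (following the update rule documented for ``lasagne.updates.nesterov_momentum``, applied here to a toy quadratic rather than a real network):

.. code-block:: python

    def nesterov_momentum_step(param, grad, velocity,
                               learning_rate=0.01, momentum=0.9):
        # Update rule as documented for lasagne.updates.nesterov_momentum:
        #   v_t = momentum * v_{t-1} - learning_rate * grad
        #   p_t = p_{t-1} + momentum * v_t - learning_rate * grad
        velocity = momentum * velocity - learning_rate * grad
        param = param + momentum * velocity - learning_rate * grad
        return param, velocity

    # toy example (not from the Lasagne docs): minimize f(x) = x**2,
    # whose gradient at x is 2 * x
    x, v = 5.0, 0.0
    for _ in range(100):
        x, v = nesterov_momentum_step(x, 2 * x, v)

After 100 steps the iterate has moved close to the minimum at 0. In Lasagne itself this rule is applied symbolically to all trainable parameters at once, as shown in the full example further below.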

Its design is governed by six principles:

  • Simplicity: Be easy to use, easy to understand and easy to extend, to facilitate use in research
  • Transparency: Do not hide Theano behind abstractions, directly process and return Theano expressions or Python / numpy data types
  • Modularity: Allow all parts (layers, regularizers, optimizers, ...) to be used independently of Lasagne
  • Pragmatism: Make common use cases easy, do not overrate uncommon cases
  • Restraint: Do not obstruct users with features they decide not to use
  • Focus: "Do one thing and do it well"


In short, you can install a known compatible version of Theano and the latest Lasagne development version via:

.. code-block:: bash

    pip install -r https://raw.githubusercontent.com/Lasagne/Lasagne/master/requirements.txt
    pip install https://github.com/Lasagne/Lasagne/archive/master.zip

For more details and alternatives, please see the Installation instructions.


Documentation is available online. For support, please refer to the lasagne-users mailing list.


.. code-block:: python

    import lasagne
    import theano
    import theano.tensor as T

    # create Theano variables for input and target minibatch
    input_var = T.tensor4('X')
    target_var = T.ivector('y')

    # create a small convolutional neural network
    from lasagne.nonlinearities import leaky_rectify, softmax
    network = lasagne.layers.InputLayer((None, 3, 32, 32), input_var)
    network = lasagne.layers.Conv2DLayer(network, 64, (3, 3),
                                         nonlinearity=leaky_rectify)
    network = lasagne.layers.Conv2DLayer(network, 32, (3, 3),
                                         nonlinearity=leaky_rectify)
    network = lasagne.layers.Pool2DLayer(network, (3, 3), stride=2, mode='max')
    network = lasagne.layers.DenseLayer(lasagne.layers.dropout(network, 0.5),
                                        128, nonlinearity=leaky_rectify,
                                        W=lasagne.init.Orthogonal())
    network = lasagne.layers.DenseLayer(lasagne.layers.dropout(network, 0.5),
                                        10, nonlinearity=softmax)

    # create loss function
    prediction = lasagne.layers.get_output(network)
    loss = lasagne.objectives.categorical_crossentropy(prediction, target_var)
    loss = loss.mean() + 1e-4 * lasagne.regularization.regularize_network_params(
            network, lasagne.regularization.l2)

    # create parameter update expressions
    params = lasagne.layers.get_all_params(network, trainable=True)
    updates = lasagne.updates.nesterov_momentum(loss, params,
                                                learning_rate=0.01,
                                                momentum=0.9)

    # compile training function that updates parameters and returns training loss
    train_fn = theano.function([input_var, target_var], loss, updates=updates)

    # train network (assuming you've got some training data in numpy arrays)
    for epoch in range(100):
        loss = 0
        for input_batch, target_batch in training_data:
            loss += train_fn(input_batch, target_batch)
        print("Epoch %d: Loss %g" % (epoch + 1, loss / len(training_data)))

    # use trained network for predictions
    test_prediction = lasagne.layers.get_output(network, deterministic=True)
    predict_fn = theano.function([input_var], T.argmax(test_prediction, axis=1))
    print("Predicted class for first test input: %r" % predict_fn(test_data[0]))
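
The training objective combines categorical cross-entropy with a small L2 weight penalty. As a hedged sketch of what those two terms compute, here they are in plain numpy on made-up toy values (Lasagne itself builds the equivalent symbolic Theano expressions):

.. code-block:: python

    import numpy as np

    def categorical_crossentropy(predictions, targets):
        # mean negative log-probability of the correct class, corresponding to
        # lasagne.objectives.categorical_crossentropy(...).mean() for
        # already-softmaxed predictions
        return -np.mean(np.log(predictions[np.arange(len(targets)), targets]))

    def l2_penalty(params):
        # sum of squared weights over all parameter arrays, corresponding to
        # regularize_network_params(network, lasagne.regularization.l2)
        return sum(np.sum(p ** 2) for p in params)

    # toy values (illustrative only, not taken from the network above)
    predictions = np.array([[0.7, 0.2, 0.1],
                            [0.1, 0.8, 0.1]])
    targets = np.array([0, 1])
    weights = [np.array([[0.5, -0.5]])]

    loss = categorical_crossentropy(predictions, targets) + 1e-4 * l2_penalty(weights)

The small coefficient (``1e-4``) keeps the penalty from dominating the data term; it only nudges weights toward zero.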

For a fully-functional example, see the examples shipped with Lasagne, and check the documentation for in-depth explanations of the same. More examples, code snippets and reproductions of recent research papers are maintained in the separate Lasagne Recipes repository.


If you find Lasagne useful for your scientific work, please consider citing it in resulting publications. We provide a ready-to-use BibTeX entry for citing Lasagne.


Lasagne is a work in progress; input is welcome.

Please see the Contribution instructions for details on how you can contribute!
