This repository contains the code for the AllenNLP demo.

We're actively refactoring parts of the codebase, so you can expect better documentation to land in the near future.

For more information see the AllenNLP project.


You'll need Docker installed on your local machine.

Running a Local Environment

At a high level, the AllenNLP demo is composed of two components:

  1. A JavaScript application that renders the user interface.
  2. A series of Python applications, each providing a small HTTP API endpoint for doing interesting things with a single model. The code for these lives in the api directory.
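To make the second component concrete, here is a minimal, hypothetical sketch of a single-model endpoint built on only the Python standard library. It is not the demo's actual code (the real services in the api directory differ in detail); it only illustrates the shape of a small HTTP API that accepts a JSON payload and returns a prediction:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


class PredictHandler(BaseHTTPRequestHandler):
    """Hypothetical single-model endpoint: POST a JSON payload, get
    back a JSON prediction. A real endpoint would run an AllenNLP
    model here; this sketch returns a stub instead."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        body = json.dumps({"input": payload, "prediction": "stub"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        # Keep the sketch quiet; a real service would log requests.
        pass


def serve(port: int = 8000) -> HTTPServer:
    """Bind a server on localhost; call serve_forever() to run it."""
    return HTTPServer(("127.0.0.1", port), PredictHandler)
```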

There are three ways to run things locally:

  1. If you're working on a single model endpoint, consult the README in the api directory for more specific instructions.

  2. If you're only working on the user-interface, you can start things up by running:

    docker-compose -f docker-compose.ui-only.yaml up --build

    or use the wrapper script:

    ./bin/ui up
Once that's complete, you'll be able to access your local version by opening http://localhost:8080 in a browser. Changes to the code should be automatically applied.

Note: To clean up the Docker containers, be sure to run:

   docker-compose -f docker-compose.ui-only.yaml down

or, equivalently:

   ./bin/ui down
  3. If you'd like to run an end-to-end environment that includes the user interface and a model endpoint, you can do so by running:

    MODEL=bidaf_elmo docker-compose up --build


The MODEL environment variable specifies which model to run locally. The name should match the name of the model's directory in allennlp_demo. If the model has a custom Dockerfile, set the MODEL_DOCKERFILE environment variable to the path to that file:

   MODEL=masked_lm MODEL_DOCKERFILE=allennlp_demo/masked_lm/Dockerfile docker-compose up --build
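The way these two variables combine can be sketched with a small helper. Note that compose_command is not part of the repository; it only illustrates the convention described above:

```python
from typing import Optional


def compose_command(model: str, dockerfile: Optional[str] = None) -> str:
    """Build the shell command described above: MODEL selects the model
    directory, and MODEL_DOCKERFILE is added only when the model has a
    custom Dockerfile."""
    parts = [f"MODEL={model}"]
    if dockerfile is not None:
        parts.append(f"MODEL_DOCKERFILE={dockerfile}")
    parts.append("docker-compose up --build")
    return " ".join(parts)
```

For example, compose_command("bidaf_elmo") reproduces the first command shown above, and passing the masked_lm Dockerfile path reproduces the second.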

Once everything's started, open http://localhost:8080 in the browser of your choice.

Code changes will be applied automatically, while changes to backend or frontend dependencies require rerunning docker-compose up --build.

