Code for the AllenNLP demo.
We're actively refactoring parts of the codebase, so you can expect better documentation to land in the near future.
For more information see the AllenNLP project.
You'll need Docker installed on your local machine.
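You can verify the required tooling up front. Here's a minimal sketch (assuming a POSIX shell; the tool list is inferred from the commands used below) that checks whether `docker` and `docker-compose` are on your `PATH`:

```shell
# Sketch: report whether the tools this README relies on are installed.
for tool in docker docker-compose; do
  if command -v "$tool" > /dev/null 2>&1; then
    echo "$tool found"
  else
    echo "$tool missing"
  fi
done
```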
At a high level, the AllenNLP demo is composed of two components: a user interface and one or more model endpoints served from the `api/` directory.
There are three ways to run things locally:
If you're working on a single model endpoint, consult the README in the `api/` directory for more specific instructions.
If you're only working on the user interface, you can start things up by running:
docker-compose -f docker-compose.ui-only.yaml up --build
or use the script:
Once that's complete, you can access your local version by opening http://localhost:8080 in a browser. Code changes should be applied automatically.
Note: to clean up the Docker containers, be sure to run:
docker-compose -f docker-compose.ui-only.yaml down
If you'd like to run an end-to-end environment that includes both the user interface and a model endpoint, you can do so by running:
MODEL=bidaf_elmo docker-compose up --build
The `MODEL` environment variable specifies which model in `api/` to run locally. The name should match the name of a directory in `api/allennlp_demo`. If the model has a custom `Dockerfile`, set the `MODEL_DOCKERFILE` environment variable to the path to that file:
MODEL=masked_lm MODEL_DOCKERFILE=allennlp_demo/masked_lm/Dockerfile docker-compose up --build
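To make the relationship between these variables concrete, here's a sketch of how a model name can map to a Dockerfile path; the fallback-path logic is an assumption shown for illustration, not the repo's actual compose configuration:

```shell
# Sketch (illustrative): if MODEL_DOCKERFILE is unset, fall back to
# the model's own directory under allennlp_demo.
MODEL=masked_lm
MODEL_DOCKERFILE="${MODEL_DOCKERFILE:-allennlp_demo/$MODEL/Dockerfile}"
echo "$MODEL_DOCKERFILE"   # allennlp_demo/masked_lm/Dockerfile (when unset before)
```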
Once everything's started, open http://localhost:8080 in the browser of your choice.
Code changes will be applied automatically, while changes to backend or frontend dependencies require rerunning `docker-compose up --build`.