
USAspending API

Code style: black Build Status Test Coverage Code Climate

This API is utilized by USAspending.gov to obtain all federal spending data, which is open source and provided to the public as part of the DATA Act.

USAspending Landing Page

Creating a Development Environment

Ensure the following dependencies are installed and working prior to continuing. Using Docker is recommended since it provides a clean environment; setting up your own local environment instead requires some technical ability and experience with modern software tools.

If not using Docker, you will need:

  • A command line package manager
    • Windows' WSL bash uses apt-get
    • MacOS users will use Homebrew
    • Linux users already know their package manager (yum, apt, pacman, etc.)
  • PostgreSQL version 10.x (with a dedicated data_store_api database)
  • Elasticsearch version 7.1
  • A Python 3.7 environment
    • A virtual environment is highly recommended; there are various tools and associated instructions depending on preference
    • See Required Python Libraries below for an example using pyenv and venv
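
If you are setting up without Docker, a quick sanity check of the local toolchain might look like this (the versions in the comments are the targets listed above):

$ psql --version       # expect PostgreSQL 10.x client tools
$ python3 --version    # expect Python 3.7.x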

Cloning the Repository

Now, navigate to the base directory where you will store the USAspending repositories:

$ mkdir -p usaspending && cd usaspending
$ git clone https://github.com/fedspendingtransparency/usaspending-api.git
$ cd usaspending-api

Database Setup

There are three documented options for setting up a local database in order to run the API:

  1. Local Empty DB. Use your own local PostgreSQL database for the API.
  2. Containerized Empty DB. Create an empty directory on your localhost where all the database files will persist and use the docker-compose file to bring up a containerized postgres database.
  3. Local Populated DB. Download either the whole database or a database subset from the USAspending website.

Option 1: Using a Locally Hosted Postgres Database

Create a local PostgreSQL database called 'data_store_api' and either create a new username and password for the database or use all the defaults. For help, consult the Postgres Setup Help guide.

Make sure to grant the user you created for the data_store_api database superuser permissions, or some scripts will not work:
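
A minimal sketch, assuming a role named usaspending (matching the example credentials in the Environment Variables section below); substitute your own database user:

$ createdb data_store_api                                              # create the database
$ psql -d data_store_api -c "ALTER ROLE usaspending WITH SUPERUSER;"   # grant superuser to your role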


Option 2: Using the Docker Compose Postgres Database

See below for basic setup instructions. For help, consult the Docker Compose documentation.

Database Setup and Initialization with Docker Compose
  • None of these commands will rebuild a Docker image! Use docker-compose build if you make changes to the code or want to rebuild the image before running the commands below.
  • If you run a local database, set POSTGRES_HOST accordingly; POSTGRES_PORT should be changed if it isn't 5432.
  • docker-compose up usaspending-db
    will create and run a Postgres database.
  • docker-compose run --rm usaspending-manage python3 -u manage.py migrate
    will run Django migrations.
  • docker-compose run --rm usaspending-manage python3 -u manage.py load_reference_data
    will load essential reference data (agencies, program activity codes, CFDA program data, country codes, and others).
  • docker-compose run --rm usaspending-manage python3 -u manage.py matview_runner --dependencies
    will provision the materialized views which are required by certain API endpoints.
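
Once the migrations and loaders finish, a quick schema check is possible with psql (a sketch, using the example credentials from the Environment Variables section below):

$ psql postgres://usaspending:usaspender@localhost:5432/data_store_api -c '\dt'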
Manual Database Setup
  • docker-compose.yaml contains the shell commands necessary to set up the database manually, if you prefer to have a more custom environment.

Option 3: Downloading the database or a subset of the database and loading it into PostgreSQL

For further instructions on how to download, use, and set up the database using a subset of our data, please go to:

USAspending Database Download

Elasticsearch Setup

Some of the API endpoints reach into Elasticsearch for data.

  • docker-compose up usaspending-es
    will create and start a single-node Elasticsearch cluster, using the cluster directory specified in the configuration file. We recommend using a folder outside of the usaspending-api project directory so it does not get copied to other containers.
  • The cluster should be reachable at http://localhost:9200 ("You Know, for Search").

  • Optionally, to see log output, use docker-compose logs usaspending-es (these logs are stored by Docker even if you don't view them).
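
A quick way to confirm the cluster is healthy (a standard Elasticsearch endpoint, not specific to this project):

$ curl -s "http://localhost:9200/_cluster/health?pretty"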

Running the API

docker-compose up usaspending-api
  • You can update environment variables in the .env file (buckets, elasticsearch, local paths) and they will be mounted and used when you run this.

The application will now be available at http://localhost:8000.
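
A quick smoke test from another terminal (the status code returned for the root path may vary by configuration):

$ curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8000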


Note: if the code was run outside of Docker, compiled Python files can trip up the Docker environment. A useful command for clearing them out on your host is:

find . | grep -E "(__pycache__|\.pyc|\.pyo$)" | xargs rm -rf

Using the API

In your local development environment, available API endpoints may be found at http://localhost:8000/docs/endpoints.

Deployed production API endpoints and docs are found by following links here: https://api.usaspending.gov
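
For example, one small public endpoint (current as of this writing; consult the docs above for the authoritative list):

$ curl -s "https://api.usaspending.gov/api/v2/references/toptier_agencies/" | head -c 300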

Loading Data

Note: it is possible to run ad-hoc commands out of a Docker container once you get the hang of it; see the comments in the Dockerfile.

For details on loading reference data, DATA Act Broker submissions, and current USAspending data into the API, see the data loading documentation in this repository.

For details on how our data loaders modify incoming data, see the data reformatting documentation in this repository.

Running Tests

Test Setup

To run all tests in the docker services, run:

docker-compose run --rm usaspending-test

To run tests locally and not in the docker services, you need:

  1. Postgres A running PostgreSQL database server (See Database Setup above)
  2. Elasticsearch A running Elasticsearch cluster (See Elasticsearch Setup above)
  3. Required Python Libraries Python package dependencies downloaded and discoverable (See below)
  4. Environment Variables Tell Python where to connect to the various data stores (See below)

Once these are satisfied, run:

(usaspending-api) $ pytest
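
Standard pytest selection also works; for example, to stop at the first failure or run a single directory (the path here is illustrative):

(usaspending-api) $ pytest -x usaspending_api/common/tests/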

Required Python Libraries

Create and activate the virtual environment using venv, and ensure the right version of Python 3.7.x is being used (3.7.2 as of this writing):
$ pyenv install 3.7.2
$ pyenv local 3.7.2
$ python -m venv .venv/usaspending-api
$ source .venv/usaspending-api/bin/activate

Your prompt should then look as below to show you are in the virtual environment named usaspending-api (to exit the virtual environment, simply type deactivate at the prompt):

(usaspending-api) $


Install the application dependencies:
(usaspending-api) $ pip install -r requirements/requirements.txt

Environment Variables

Create a .envrc file in the repo root, which will be ignored by git. Change credentials and ports as needed for your local dev environment.
export DATABASE_URL=postgres://usaspending:[email protected]:5432/data_store_api
export ES_HOSTNAME=http://localhost:9200
export DATA_BROKER_DATABASE_URL=postgres://admin:[email protected]:5435/data_broker


If direnv does not pick this up after saving the file, type:
$ direnv allow

Alternatively, you could skip using direnv and just export these variables in your shell environment.
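
For example, loading the same file manually in a new shell works because it only contains export statements:

$ source .envrc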

Including Broker Integration Tests

Some automated integration tests run against a Broker database. If the dependencies to run such integration tests are not satisfied, those tests will bail out and be marked as Skipped. (You can see messages about those skipped tests by adding the -rs flag to pytest, like: pytest -rs)

To satisfy these dependencies and include execution of these tests, do the following:

  1. Ensure you have Docker installed and running on your machine
  2. Ensure the Broker source code is checked out alongside this repo at ../data-act-broker-backend
  3. Ensure you have the DATA_BROKER_DATABASE_URL environment variable set, and pointing to a live PostgreSQL server (no database required)
  4. Ensure you have built the Broker backend Docker image by running:

    (usaspending-api) $ docker build -t dataact-broker-backend ../data-act-broker-backend

NOTE: Broker source code should be re-fetched and the image rebuilt to ensure the latest integration is tested.
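
To confirm the image is present before re-running the suite:

(usaspending-api) $ docker images dataact-broker-backend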

Re-running the test suite using pytest -rs with these dependencies satisfied should yield no more skips of the broker integration tests.


Contributing

To submit fixes or enhancements, or to suggest changes, see CONTRIBUTING.md.
