A RESTful API for U.S. federal spending data.
This API is used by USAspending.gov to obtain all federal spending data, which is open source and provided to the public as part of the DATA Act.
Ensure the following dependencies are installed and working prior to continuing:
- `Docker`, which will handle the other application dependencies
- `Bash` or another Unix shell equivalent
- `Git`
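To confirm these prerequisites are available, a quick check from your shell might look like:

```
$ docker --version
$ bash --version
$ git --version
```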
Using Docker is recommended since it provides a clean environment. Setting up your own local environment requires some technical experience with modern software tools. To set up your own environment instead, you will need:
- A command-line package manager, such as `apt-get` or `Homebrew`
- `PostgreSQL` version 10.x (with a dedicated `data_store_api` database)
- `Elasticsearch` version 7.1
- `pyenv`
Now, navigate to the base file directory where you will store the USAspending repositories:
```
$ mkdir -p usaspending && cd usaspending
$ git clone https://github.com/fedspendingtransparency/usaspending-api.git
$ cd usaspending-api
```
There are three documented options for setting up a local database in order to run the API:
Create a local PostgreSQL database called `data_store_api`, and either create a new username and password for the database or use the defaults. For help, consult: Postgres Setup Help
Make sure to grant the user you created for the `data_store_api` database superuser permissions, or some scripts will not work:
```
postgres=# ALTER ROLE <username> WITH SUPERUSER;
```
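As one illustrative way to do the whole setup from the `psql` prompt (the role name and password here are examples only; use whatever credentials you plan to put in your environment configuration):

```
postgres=# CREATE ROLE usaspending WITH LOGIN PASSWORD 'usaspender';
postgres=# CREATE DATABASE data_store_api OWNER usaspending;
postgres=# ALTER ROLE usaspending WITH SUPERUSER;
```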
See below for basic setup instructions. For help with Docker Compose: Docker Compose
None of these commands will rebuild a Docker image! Use `--build` if you make changes to the code or want to rebuild the image before running the `up` steps.
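For example, to rebuild the image before bringing up the API service described below:

```
$ docker-compose up --build usaspending-api
```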
If you run a local database, set `POSTGRES_HOST` in `.env` to `host.docker.internal`. `POSTGRES_PORT` should be changed if it isn't 5432.
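With a local database, the relevant `.env` entries might look like this (values shown are illustrative):

```
POSTGRES_HOST=host.docker.internal
POSTGRES_PORT=5432
```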
- `docker-compose up usaspending-db` will create and run a Postgres database.
- `docker-compose run --rm usaspending-manage python3 -u manage.py migrate` will run Django migrations: https://docs.djangoproject.com/en/2.2/topics/migrations/.
- `docker-compose run --rm usaspending-manage python3 -u manage.py load_reference_data` will load essential reference data (agencies, program activity codes, CFDA program data, country codes, and others).
- `docker-compose run --rm usaspending-manage python3 -u manage.py matview_runner --dependencies` will provision the materialized views which are required by certain API endpoints.
`docker-compose.yaml` contains the shell commands necessary to set up the database manually, if you prefer to have a more custom environment.
For further instructions on how to download, use, and set up the database using a subset of our data, please see the database download documentation.
Some of the API endpoints reach into Elasticsearch for data.
`docker-compose up usaspending-es` will create and start a single-node Elasticsearch cluster, using the `ES_CLUSTER_DIR` specified in the `.env` configuration file. We recommend using a folder outside of the usaspending-api project directory so it does not get copied to other containers.
The cluster should be reachable at http://localhost:9200 ("You Know, for Search").
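To verify the cluster is up, query its root endpoint; the JSON response includes the tagline quoted above:

```
$ curl http://localhost:9200
```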
Optionally, to see log output, use `docker-compose logs usaspending-es` (these logs are stored by Docker even if you don't view them).
To run the API:

```
docker-compose up usaspending-api
```

You can update environment variables in `settings.py` (buckets, elasticsearch, local paths) and they will be mounted and used when you run this.
The application will now be available at http://localhost:8000.
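A quick reachability check (any HTTP client works; `curl` is shown here):

```
$ curl -I http://localhost:8000
```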
Note: if the code was run outside of Docker, compiled Python files can trip up the Docker environment. A useful command for clearing out those files on your host is:
```
find . | grep -E "(__pycache__|\.pyc|\.pyo$)" | xargs rm -rf
```
In your local development environment, available API endpoints may be found at http://localhost:8000/docs/endpoints.
Deployed production API endpoints and docs are found by following links here: https://api.usaspending.gov
Note: it is possible to run ad-hoc commands out of a Docker container once you get the hang of it; see the comments in the Dockerfile.
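For example, a one-off Django management command can be run through the same service used above (`check` is just an illustrative choice):

```
$ docker-compose run --rm usaspending-manage python3 -u manage.py check
```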
For details on loading reference data, DATA Act Broker submissions, and current USAspending data into the API, see loading_data.md.
For details on how our data loaders modify incoming data, see data_reformatting.md.
To run all tests in the docker services, run:

```
docker-compose run --rm usaspending-test
```
To run tests locally and not in the docker services, you need the local environment described in the setup steps below: a Python virtual environment with the application dependencies installed, plus environment variables pointing at running PostgreSQL and Elasticsearch instances.
Once these are satisfied, run:

```
(usaspending-api) $ pytest
```
Create and activate the virtual environment using `venv`, and ensure the right version of Python 3.7.x is being used (3.7.2 as of this writing):
```
$ pyenv install 3.7.2
$ pyenv local 3.7.2
$ python -m venv .venv/usaspending-api
$ source .venv/usaspending-api/bin/activate
```
Your prompt should then look as below to show you are in the virtual environment named `usaspending-api` (to exit that virtual environment, simply type `deactivate` at the prompt):

```
(usaspending-api) $
```
`pip install` application dependencies:

```
(usaspending-api) $ pip install -r requirements/requirements.txt
```
Create a `.envrc` file in the repo root, which will be ignored by git. Change credentials and ports as needed for your local dev environment:

```
export DATABASE_URL=postgres://usaspending:<password>@<host>:5432/data_store_api
export ES_HOSTNAME=http://localhost:9200
export DATA_BROKER_DATABASE_URL=postgres://admin:<password>@<host>:5435/data_broker
```
If `direnv` does not pick this up after saving the file, type:

```
$ direnv allow
```
Alternatively, you could skip using `direnv` and just export these variables in your shell environment.
Some automated integration tests run against a Broker database. If the dependencies to run such integration tests are not satisfied, those tests will bail out and be marked as Skipped. (You can see messages about those skipped tests by adding the `-rs` flag to pytest, like: `pytest -rs`.)
To satisfy these dependencies and include execution of these tests, do the following:

1. Ensure you have `Docker` installed and running on your machine
2. Ensure the `Broker` source code is checked out alongside this repo at `../data-act-broker-backend`
3. Ensure you have the `DATA_BROKER_DATABASE_URL` environment variable set, and pointing to a live PostgreSQL server (no database required)
4. Ensure you have built the `Broker` backend Docker image by running:

```
(usaspending-api) $ docker build -t dataact-broker-backend ../data-act-broker-backend
```
NOTE: the Broker source code should be re-fetched and the image rebuilt to ensure the latest integration is tested.
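That refresh might look like the following, assuming the Broker checkout location from step 2 above:

```
$ git -C ../data-act-broker-backend pull
$ docker build -t dataact-broker-backend ../data-act-broker-backend
```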
Re-running the test suite using `pytest -rs` with these dependencies satisfied should yield no more skips of the broker integration tests.
To submit fixes or enhancements, or to suggest changes, see CONTRIBUTING.md.