
# OpenAQ Platform API

**This codebase is no longer in use.** Version 1 of the OpenAQ API is still available, but it has been reimplemented in the same repository as version 2.


This is the main API for the OpenAQ project: a web-accessible API that provides endpoints to query the air quality measurements. openaq-fetch takes care of fetching new data and inserting it into the database. The data format is explained in openaq-data-format.
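To make the shape of the data concrete, here is a hypothetical single measurement record. The field names are taken from the Athena schema later in this README and the values are illustrative only; see openaq-data-format for the authoritative specification.

```javascript
// Hypothetical measurement record, based on the fields in the Athena
// schema in this README. Values are made up for illustration.
const measurement = {
  date: { utc: "2019-06-01T00:00:00.000Z", local: "2019-05-31T20:00:00-04:00" },
  parameter: "pm25",
  location: "Example Station",
  value: 54.0,
  unit: "µg/m³",
  city: "Delhi",
  country: "IN",
  coordinates: { latitude: 28.6, longitude: 77.2 },
  sourceName: "example-source",
  sourceType: "government",
  mobile: "false",
  attribution: [{ name: "Example Agency", url: "https://example.org/" }],
  averagingPeriod: { value: 1, unit: "hours" },
};

console.log(measurement.parameter, measurement.value, measurement.unit);
```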

## Getting started

Install prerequisites:

Clone this repository locally (see these instructions) and activate the required Node.js version with:

```sh
nvm install
```

This step can be skipped if your local Node.js version already matches the one defined in .nvmrc.

Install module dependencies:

```sh
npm install
```


Initialize the development database:

```sh
npm run init-dev-db
```

This task starts a PostgreSQL container as a daemon, runs migrations, and seeds data. Each of these steps can also be run independently; refer to the scripts in package.json for the options.

After initialization is finished, start the development server:

```sh
npm run dev
```

The API is then available at http://localhost:3004.
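Once the server is up, you can query the versioned endpoints. The endpoint and parameter names below (`/v1/measurements` with `country`, `parameter`, and `limit` filters) are assumptions based on the public v1 API, so verify them against the API documentation:

```javascript
// Build a query URL for the local dev server. The /v1/measurements route
// and its parameter names are assumptions based on the public v1 API.
const base = "http://localhost:3004";

function measurementsUrl(params) {
  const qs = new URLSearchParams(params).toString();
  return `${base}/v1/measurements?${qs}`;
}

const url = measurementsUrl({ country: "IN", parameter: "pm25", limit: "5" });
console.log(url);
// On Node 18+ the request itself could be made with:
//   const data = await fetch(url).then((r) => r.json());
```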

Stop the database container when finished:

```sh
npm run stop-dev-db
```


Initialize the test database:

```sh
npm run init-test-db
```

This task also starts a PostgreSQL container as a daemon, runs migrations, and seeds data. After initialization is finished, run the tests:

```sh
npm run test
```

Stop the database container when finished:

```sh
npm run stop-test-db
```

## Deploying to production

The server needs to fetch data about locations and cities in the measurement history using AWS Athena. This service must be configured via the following environment variables:

- an AWS Access Key ID that has permissions to create Athena queries and store their results in S3;
- the corresponding secret access key;
- the S3 location where the results of the Athena queries should be stored before caching them;
- the name of the table registered in AWS Athena.

Automatic Athena synchronization is disabled by default; it can be enabled by setting the USE_ATHENA environment variable. The sync interval can be set via an environment variable, in milliseconds; the default interval is set in config/default.json.

If needed, the synchronization can be triggered manually. First, the WEBHOOK_KEY variable must be set to allow access to the webhooks endpoint. Sending a POST request to /v1/webhooks with the parameters `key` and `action` will start a sync run. An example with curl:

```sh
curl --data "key=123&action=ATHENA_SYNC" https://localhost:3004/v1/webhooks
```
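The same request can be sketched from Node; the `key` value and the `ATHENA_SYNC` action mirror the curl example above:

```javascript
// Sketch of triggering a manual Athena sync from Node. The "123" key is
// the placeholder from the curl example; replace it with your WEBHOOK_KEY.
const body = new URLSearchParams({ key: "123", action: "ATHENA_SYNC" });

console.log(body.toString()); // form-encoded payload: key=123&action=ATHENA_SYNC

// On Node 18+ this could be sent with the built-in fetch:
//   await fetch("http://localhost:3004/v1/webhooks", { method: "POST", body });
```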

Other environment variables available:

| Name | Description | Default |
|---|---|---|
| API_URL | Base API URL after deployment | http://:3004 |
| NEW_RELIC_LICENSE_KEY | New Relic API key for system monitoring | not set |
| WEBHOOK_KEY | Secret key to interact with openaq-api | `'123'` |
| USE_REDIS | Use Redis for caching? | not set (so not used) |
| USE_ATHENA | Use AWS Athena for aggregations? | not set (so not used) |
| REDIS_URL | Redis instance URL | `redis://localhost:6379` |
| DO_NOT_UPDATE_CACHE | Ignore updating the cache, but still use older cached results | not set |
| AGGREGATION_REFRESH_PERIOD | How long to wait before refreshing cached aggregations (in ms) | 45 minutes |
| REQUEST_LIMIT | Max number of items that can be requested at one time | 10000 |
| UPLOADS_ENCRYPTION_KEY | Key used to encrypt the upload token for /upload in the database | `'notsecure'` |
| S3_UPLOAD_BUCKET | The bucket to upload external files to for /upload | not set |
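As a minimal sketch of how these variables could be consumed (an illustrative helper, not the project's actual config loader; the project reads its defaults from config/default.json):

```javascript
// Illustrative config loader: read the environment variables from the
// table above, falling back to the documented defaults. This is not the
// project's real implementation.
function loadConfig(env = process.env) {
  return {
    apiUrl: env.API_URL || "http://localhost:3004", // assumed local default
    webhookKey: env.WEBHOOK_KEY || "123",
    useRedis: Boolean(env.USE_REDIS),
    useAthena: Boolean(env.USE_ATHENA),
    redisUrl: env.REDIS_URL || "redis://localhost:6379",
    aggregationRefreshPeriod: Number(env.AGGREGATION_REFRESH_PERIOD) || 45 * 60 * 1000,
    requestLimit: Number(env.REQUEST_LIMIT) || 10000,
  };
}

const config = loadConfig({});
console.log(config.requestLimit, config.aggregationRefreshPeriod);
```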

## AWS Athena for aggregations

The Athena table is `fetches.fetches_realtime`, which represents the fetches stored by openaq-fetch and has the following schema (the struct fields follow the measurement fields in openaq-data-format):

```sql
CREATE EXTERNAL TABLE fetches.fetches_realtime (
  date struct<utc:string,local:string>,
  parameter string,
  location string,
  value float,
  unit string,
  city string,
  attribution array<struct<name:string,url:string>>,
  averagingPeriod struct<value:float,unit:string>,
  coordinates struct<latitude:float,longitude:float>,
  country string,
  sourceName string,
  sourceType string,
  mobile string
)
```

## Uploads & Generating S3 presigned URLs

Via an undocumented /upload endpoint, the API can generate presigned S3 PUT URLs so that external clients can authenticate using tokens stored in the database and upload data to be ingested. There is a small utility script that can be used like:

```sh
UPLOADS_ENCRYPTION_KEY=foo node index.js your_token_here
```

to generate encrypted tokens to be stored manually in the database.


## Dockerfile

There is a Dockerfile included that will turn the project into a Docker container. The container is currently used mostly for deployment to AWS ECS. If someone wanted to make it better for local development, that'd be a great PR!


## Contributing

There are many ways to contribute to this project; more details can be found in the contributing guide.

## Projects using the API

- openaq-browser site | code - A simple browser providing a graphical interface to the data.
- openaq code - An isomorphic JavaScript wrapper for the API.
- py-openaq code - A Python wrapper for the API.
- ropenaq code - An R package for the API.

For more projects that use the OpenAQ API, check out the Community page.
