OpenAQ Platform API


NO LONGER IN USE

This codebase is no longer in use, please see https://github.com/openaq/openaq-api-v2. Version 1 of the OpenAQ API is still available via api.openaq.org/v1 but has been reimplemented in the same repository as version 2.

Overview

This is the main API for the OpenAQ project.

Starting with `index.js`, there is a web-accessible API that provides endpoints to query the air quality measurements. Documentation can be found at https://docs.openaq.org/.

openaq-fetch takes care of fetching new data and inserting it into the database. The data format is explained in openaq-data-format.

Getting started

Install prerequisites:

Clone this repository locally (see these instructions) and activate the required Node.js version with:

nvm install

This step can be skipped if your local Node.js version already matches the one defined in `.nvmrc`.

Install module dependencies:

npm install

Development

Initialize development database:

npm run init-dev-db

This task starts a PostgreSQL container as a daemon, runs migrations, and seeds data. Each of these tasks can also be run independently; refer to `package.json` for the available scripts.

After initialization is finished, start the development server:

npm run dev

Access http://localhost:3004.
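With the dev server running, you can query it from Node. A minimal sketch — the endpoint and parameter names here are assumptions based on the public docs at docs.openaq.org, not taken from this repository:

```javascript
// Build a query URL for the local dev API.
// The /v1/latest path and its parameters are assumptions from docs.openaq.org.
const BASE = 'http://localhost:3004';

function buildQuery(path, params) {
  const url = new URL(path, BASE);
  for (const [key, value] of Object.entries(params)) {
    url.searchParams.set(key, value);
  }
  return url.toString();
}

const url = buildQuery('/v1/latest', { city: 'Delhi', parameter: 'pm25' });
console.log(url); // http://localhost:3004/v1/latest?city=Delhi&parameter=pm25

// With the dev server up, the URL can then be fetched, e.g.:
// fetch(url).then((res) => res.json()).then((body) => console.log(body.results));
```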

Stop database container after finishing:

npm run stop-dev-db

Testing

Initialize test database:

npm run init-test-db

This task starts a PostgreSQL container as a daemon, runs migrations, and seeds data. After initialization is finished, run the tests:

npm run test

Stop database container after finishing:

npm run stop-test-db

Deploying to production

The server uses AWS Athena to fetch data about locations and cities in the measurement history. Athena access must be configured via the following environment variables:

  • ATHENA_ACCESS_KEY_ID
    : an AWS Access Key that has permissions to create Athena Queries and store them in S3;
  • ATHENA_SECRET_ACCESS_KEY
    : the corresponding secret;
  • ATHENA_OUTPUT_BUCKET
    : S3 location (in the form of
    s3://bucket/folder
    ) where the results of the Athena queries should be stored before caching them;
  • ATHENA_FETCHES_TABLE
    : the name of the table registered in AWS Athena.
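As a sketch, these might be exported in the service environment (all values below are placeholders — substitute real credentials, bucket, and table names):

```shell
# Placeholder values for the Athena configuration.
export ATHENA_ACCESS_KEY_ID="AKIA..."
export ATHENA_SECRET_ACCESS_KEY="..."
export ATHENA_OUTPUT_BUCKET="s3://my-bucket/athena-results"
export ATHENA_FETCHES_TABLE="fetches_realtime"
```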

Automatic Athena synchronization is disabled by default. It can be enabled by setting the environment variable `ATHENA_SYNC_ENABLED` to `true`. The sync interval can be set via the `ATHENA_SYNC_INTERVAL` variable, in milliseconds. The default interval is set in the file `config/default.json`.

If needed, the synchronization can be fired manually. First, the `WEBHOOK_KEY` variable must be set to allow access to the webhooks endpoint. Sending a POST request to `/v1/webhooks` with the parameters `key=` and `action=ATHENA_SYNC` will start a sync run. An example with curl:

curl --data "key=123&action=ATHENA_SYNC" https://localhost:3004/v1/webhooks

Other environment variables available:

| Name | Description | Default |
|---|---|---|
| API_URL | Base API URL after deployment | http://:3004 |
| NEW_RELIC_LICENSE_KEY | New Relic API key for system monitoring | not set |
| WEBHOOK_KEY | Secret key to interact with openaq-api | '123' |
| USE_REDIS | Use Redis for caching? | not set (so not used) |
| USE_ATHENA | Use AWS Athena for aggregations? | not set (so not used) |
| REDIS_URL | Redis instance URL | redis://localhost:6379 |
| DO_NOT_UPDATE_CACHE | Ignore updating cache, but still use older cached results. | not set |
| AGGREGATION_REFRESH_PERIOD | How long to wait before refreshing cached aggregations? (in ms) | 45 minutes |
| REQUEST_LIMIT | Max number of items that can be requested at one time. | 10000 |
| UPLOADS_ENCRYPTION_KEY | Key used to encrypt upload token for /upload in database. | 'not_secure' |
| S3_UPLOAD_BUCKET | The bucket to upload external files to for /upload. | not set |

AWS Athena for aggregations

The Athena table is `fetches_realtime`, which represents the fetches from openaq-data and has the following schema (struct fields follow openaq-data-format):

CREATE EXTERNAL TABLE fetches.fetches_realtime (
  date struct<utc:string,local:string>,
  parameter string,
  location string,
  value float,
  unit string,
  city string,
  attribution array<struct<name:string,url:string>>,
  averagingPeriod struct<value:float,unit:string>,
  coordinates struct<latitude:float,longitude:float>,
  country string,
  sourceName string,
  sourceType string,
  mobile string
)
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
LOCATION 's3://EXAMPLE_BUCKET'

Uploads & Generating S3 presigned URLs

Via an undocumented `/upload` endpoint, it is possible to generate presigned S3 PUT URLs so that external clients can authenticate using tokens stored in the database and upload data to be ingested by openaq-fetch. There is a small utility file called `encrypt.js` that you can use like `UPLOADS_ENCRYPTION_KEY=foo node encrypt.js your_token_here` to generate encrypted tokens to be manually stored in the database.

Dockerfile

There is a Dockerfile included that will turn the project into a Docker container. The container can be found here and is currently mostly used for deployment purposes for AWS ECS. If someone wanted to make it better for local development, that'd be a great PR!
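Building and running the container locally might look like this (the image name is arbitrary, and the port mapping assumes the API listens on 3004 as in development):

```shell
# Build the image from the included Dockerfile and run it locally.
docker build -t openaq-api .
docker run --rm -p 3004:3004 openaq-api
```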

Contributing

There are many ways to contribute to this project; more details can be found in the contributing guide.

Projects using the API

  • openaq-browser site | code - A simple browser to provide a graphical interface to the data.
  • openaq code - An isomorphic JavaScript wrapper for the API
  • py-openaq code - A Python wrapper for the API
  • ropenaq code - An R package for the API

For more projects that are using the OpenAQ API, check out the OpenAQ.org Community page.
