📰 Brazilian government gazettes, accessible to everyone.
Diário Oficial is the Brazilian government gazette, one of the best places to follow the latest actions of the public administration, with distinct publications at the federal, state, and municipal levels.
Even with recurring efforts to enforce Freedom of Information legislation across the country, official communication in most of the territory remains locked in PDFs.
The goal of this project is to upgrade Diário Oficial to the digital age, centralizing information currently only available through separate sources.
When this project was initially released, it had two distinct goals: creating crawlers for government gazettes and parsing bidding exemptions from them. Going forward, it is limited to the first objective.
If you want to understand how Diário Oficial works, you'll want to get the source, build it, and run it locally.
After cloning the repository, run the following from the source folder:

$ make setup
$ docker-compose up
The gazette spiders are written with the Scrapy framework and are executed with the crawl command (scrapy crawl). However, it is recommended to run them inside the processing container (docker-compose run --rm processing). For example, the following command runs the gazette crawler for Florianópolis/SC:
$ docker-compose run --rm processing bash -c "cd data_collection && scrapy crawl sc_florianopolis"
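To inspect what a spider collects, Scrapy's standard -o option can be appended to export the scraped items to a file; a sketch reusing the same container invocation (the output filename is just an example):

```shell
$ docker-compose run --rm processing bash -c "cd data_collection && scrapy crawl sc_florianopolis -o gazettes.json"
```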
There is a make target that lets you run the scrapy shell inside the container used by the crawler:
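The scrapy shell, once started, is useful for testing selectors interactively against a page before writing spider code; a hypothetical session (the URL is a placeholder, not a real gazette endpoint) might look like:

```shell
$ scrapy shell "https://example.gov.br/gazette"
>>> response.css("a::attr(href)").getall()
>>> response.xpath("//table//td/text()").getall()
```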
There is another make target that lets you access the PostgreSQL database:
You need the password to access the database. You can find it in the .env file.
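If you prefer to connect manually, the credentials from the .env file can be passed to psql; a sketch, assuming the database service is named postgres in docker-compose.yml (the service, user, and database names here are assumptions, not taken from the repository):

```shell
$ docker-compose exec postgres psql -U postgres diario_oficial
```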
You can also run a spider with fewer keystrokes. The following make target runs the spider by invoking the same docker-compose command described above:
$ SPIDER=sc_florianopolis make run_spider
This problem most likely occurs due to a mismatch between your system's user ID and the container's user ID, combined with a volume connecting both file systems (the default setup here).
Run this command in your system's terminal to get your user's id:
$ id -u
Copy the output, replace the value of the LOCAL_USER_ID environment variable in the generated .env file with the copied value, and execute docker-compose build. With the image rebuilt, you are ready to go.
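The edit can also be scripted instead of done by hand; a minimal sketch, assuming the generated .env already contains a LOCAL_USER_ID= line (the stand-in file creation below is only there to make the sketch self-contained):

```shell
# Create a stand-in .env for illustration if one is not present yet
[ -f .env ] || printf 'LOCAL_USER_ID=1000\n' > .env
# Point LOCAL_USER_ID at the current user's id (GNU sed syntax)
sed -i "s/^LOCAL_USER_ID=.*/LOCAL_USER_ID=$(id -u)/" .env
# Show the updated value
grep LOCAL_USER_ID .env
```

Follow it with docker-compose build, as described above.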
To save yourself this effort in the future, you can export the LOCAL_USER_ID environment variable in your shell; the .env file will then already be generated with the correct value when make setup is executed.
If you are interested in fixing issues and contributing directly to the code base, please see the document CONTRIBUTING.md.
This project is maintained by Open Knowledge Foundation Brasil, thanks to the support of Digital Ocean and hundreds of other supporters.