Search engine for the Interplanetary Filesystem.


ipfs-search sniffs the DHT gossip and indexes file and directory hashes.

Metadata and contents are extracted using ipfs-tika; searching is done with Elasticsearch 7 and queueing with RabbitMQ. The crawler is implemented in Go; the API and frontend are built with Node.js.

The ipfs-search command consists of two components: the crawler and the sniffer. The sniffer extracts hashes from the gossip between nodes. The crawler extracts data from the hashes and indexes them.


Preliminary, minimal documentation can be found in the docs folder.


Please find us on our Freenode/Riot/Matrix channel #ipfssearch.


ipfs-search provides daily snapshots of all indexed data using Elasticsearch snapshots. To learn more about downloading and restoring snapshots, read the docs.

Related repos

Contributors wanted

Building a search engine like this takes a considerable amount of resources (money and TLC). If you are able to help out with either of them, mail us at [email protected] or find us at #ipfssearch on Freenode (or on Matrix).

Please read the contributing guidelines before contributing.


For discussing and suggesting features, look at the project planning.


Requirements

  • Go 1.13
  • Elasticsearch 7.x
  • RabbitMQ / AMQP server
  • NodeJS 9.x
  • IPFS 0.7


Configuration can be done using a YAML configuration file or by specifying environment variables.


A default configuration can be generated with:

ipfs-search -c config.yml config generate

(Replace config.yml with the configuration file you'd like to use.)
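As an illustration only, a minimal configuration file might look like the following. The key names and values here are assumptions for the sake of example; generate the actual defaults with the command above.

```yaml
# Hypothetical sketch — real keys come from `ipfs-search config generate`.
ipfs:
  api_url: http://localhost:5001
elasticsearch:
  url: http://localhost:9200
amqp:
  url: amqp://guest:guest@localhost:5672/
```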

To use a configuration file, it is necessary to specify the -c option, as in:

ipfs-search -c config.yml crawl

The configuration can be (rudimentarily) checked with:

ipfs-search -c config.yml config check


$ go get ./...
$ make



The most convenient way to run the crawler is through Docker. Simply run:

docker-compose up

This will start the crawler, the sniffer and all their dependencies. Hashes can also be queued for crawling manually by running

ipfs-search add <hash>

from within the running container. For example:

docker-compose exec ipfs-crawler ipfs-search add QmS4ustL54uo8FzR9455qaxZwuMiUhyvMcX9Ba8nUH4uVv

Local setup

Local installation is done using vagrant:

git clone https://github.com/ipfs-search/ipfs-search.git
cd ipfs-search
vagrant up

This starts up the API on port 9615, Elasticsearch on 9200 and RabbitMQ on 15672.

Vagrant setup does not currently start up the frontend.

Ansible deployment

Automated deployment can be done on any (virtual) Ubuntu 16.04 machine. The full production stack is automated and can be found here.


This project exists thanks to all the people who contribute.


Thank you to all our backers! 🙏 [Become a backer]


ipfs-search is supported by NLNet through the EU's Next Generation Internet (NGI0) programme.

Support this project by becoming a sponsor. Your logo will show up here with a link to your website. [Become a sponsor]
