Search engine for the Interplanetary Filesystem.
Sniffs the DHT gossip and indexes file and directory hashes.
Metadata and contents are extracted using ipfs-tika, searching is done using Elasticsearch 5, and queueing is done using RabbitMQ. The crawler is implemented in Go; the API and frontend are built using Node.js.
The ipfs-search command consists of two components: the crawler and the sniffer. The sniffer extracts hashes from the gossip between nodes. The crawler extracts data from the hashes and indexes them.
Preliminary, minimal documentation can be found in the docs folder.
Please find us on our Freenode/Riot/Matrix channel #ipfssearch.
Building a search engine like this takes a considerable amount of resources (money and TLC). If you are able to help out with either of them, mail us at [email protected] or find us at #ipfssearch on Freenode (or #ipfs-search:chat.weho.st on Matrix).
Please read the Contributing.md file before contributing.
For discussing and suggesting features, look at the project planning.
Configuration can be done using a YAML configuration file or by specifying environment variables.
A default configuration can be generated with:
```bash
ipfs-search -c config.yml config generate
```

(Substitute `config.yml` with the configuration file you'd like to use.)
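For a rough idea of the shape of such a file, a sketch is shown below. The key names here are hypothetical, made up for illustration only; the generated default configuration is authoritative.

```yaml
# Hypothetical example; the key names are invented for illustration.
# Generate an authoritative default with `config generate` as shown above.
ipfs:
  api_url: http://localhost:5001
elasticsearch:
  url: http://localhost:9200
amqp:
  url: amqp://guest:guest@localhost:5672/
```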
To use a configuration file, it is necessary to specify the `-c` option, as in:
```bash
ipfs-search -c config.yml crawl
```
The configuration can be (rudimentarily) checked with:
```bash
ipfs-search -c config.yml config check
```
```bash
$ go get ./...
$ make
```
The most convenient way to run the crawler is through Docker. Simply run:

```bash
docker-compose up
```

This will start the crawler, the sniffer and all their dependencies. Hashes can also be queued for crawling manually by running `ipfs-search add` from within the running container. For example:

```bash
docker-compose exec ipfs-crawler ipfs-search add QmS4ustL54uo8FzR9455qaxZwuMiUhyvMcX9Ba8nUH4uVv
```
Local installation is done using Vagrant:

```bash
git clone https://github.com/ipfs-search/ipfs-search.git ipfs-search
cd ipfs-search
vagrant up
```
This starts up the API on port 9615, Elasticsearch on 9200 and RabbitMQ on 15672.
The Vagrant setup does not currently start up the frontend.
Automated deployment can be done on any (virtual) Ubuntu 16.04 machine. The full production stack is automated and can be found here.
Thank you to all our backers! 🙏 [Become a backer]
Support this project by becoming a sponsor. Your logo will show up here with a link to your website. [Become a sponsor]