
dcrawl is a simple, but smart, multi-threaded web crawler for randomly gathering huge lists of unique domain names.



How does it work?

dcrawl takes one site URL as input and detects all links in that site's body. Each found link is put into a queue. Each queued link is then crawled in the same way, branching out to more URLs found in the links on each site's body.
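The queue-and-visited-set traversal described above can be sketched in Go. This is a simplified, single-threaded illustration (dcrawl itself is multi-threaded); the function names and the regex-based link extraction are illustrative, not dcrawl's actual code, and `fetch` is injected so the traversal works without a network:

```go
package main

import (
	"fmt"
	"regexp"
)

// linkRe is a crude href extractor, good enough for a sketch.
var linkRe = regexp.MustCompile(`href="(https?://[^"]+)"`)

// extractLinks returns all absolute links found in an HTML body.
func extractLinks(body string) []string {
	var links []string
	for _, m := range linkRe.FindAllStringSubmatch(body, -1) {
		links = append(links, m[1])
	}
	return links
}

// crawl walks the queue breadth-first, visiting each URL at most once,
// stopping after `limit` pages.
func crawl(start string, fetch func(string) string, limit int) []string {
	queue := []string{start}
	seen := map[string]bool{start: true}
	var visited []string
	for len(queue) > 0 && len(visited) < limit {
		url := queue[0]
		queue = queue[1:]
		visited = append(visited, url)
		for _, link := range extractLinks(fetch(url)) {
			if !seen[link] {
				seen[link] = true
				queue = append(queue, link)
			}
		}
	}
	return visited
}

func main() {
	// Tiny in-memory "web" standing in for real HTTP responses.
	pages := map[string]string{
		"http://a.example": `<a href="http://b.example">b</a> <a href="http://c.example">c</a>`,
		"http://b.example": `<a href="http://a.example">a</a>`,
	}
	fetch := func(u string) string { return pages[u] }
	fmt.Println(crawl("http://a.example", fetch, 10))
	// → [http://a.example http://b.example http://c.example]
}
```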

How smart crawling works:

* Branches out only to a predefined number of links found per hostname.
* Limits the number of allowed different hostnames per domain (avoids subdomain crawling hell, e.g. *.blogspot.com).
* Can be restarted with the same list of domains: the last saved domains are added back to the URL queue.
* Crawls only sites that return a text/html Content-Type in the HEAD response.
* Retrieves site bodies of at most 1 MB.
* Does not save inaccessible domains.

How to run?

go build dcrawl.go
./dcrawl -url http://wired.com -out ~/domain_lists/domains1.txt -t 8


     ___                          __
  __| _/________________ __  _  _|  |
 / __ |/ ___\_  __ \__  \\ \/ \/ /  |
/ /_/ \  \___|  | \// __ \\     /|  |__
\____ |\___  >__|  (____  /\/\_/ |____/
     \/    \/           \/       v.1.0

usage: dcrawl -url URL -out OUTPUT_FILE -t THREADS

  -ms int     maximum different subdomains for one domain (default 10)
  -mu int     maximum number of links to spider per hostname (default 5)
  -out string output file to save hostnames to
  -t int      number of concurrent threads (default 8)
  -url string URL to start scraping from
  -v bool     verbose (default false)


dcrawl was made by Kuba Gretzky from breakdev.org and released under the MIT license.
