1 parent 1880e0d, commit 32de40d
README.md
```diff
@@ -10,6 +10,7 @@ A concurrent web crawler to crawl the web.
 - Filter Duplicates.
 - Filter URLs that fail a HEAD request.
 - User specifiable max timeout between two successive URL requests.
+- Max Number of Links to be crawled.
 
 
 ### Sample Implementation Snippet
```
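The four features listed in the diff (duplicate filtering, HEAD-request filtering, a timeout between successive requests, and a cap on the number of links crawled) could be sketched roughly as below. This is a minimal illustration, not the repository's actual code: the `Crawler` class, `head_check`, `fetch_links`, and the toy link graph are all hypothetical names invented for this sketch.

```python
import time
from collections import deque
from urllib.parse import urldefrag

class Crawler:
    """Illustrative sketch of the crawler options listed in the README diff."""

    def __init__(self, head_check, delay=0.0, max_links=100):
        self.head_check = head_check  # callable url -> bool: does a HEAD request succeed?
        self.delay = delay            # user-specified pause between successive requests
        self.max_links = max_links    # max number of links to be crawled
        self.seen = set()             # duplicate filter

    def crawl(self, seeds, fetch_links):
        queue = deque(seeds)
        visited = []
        while queue and len(visited) < self.max_links:
            url, _ = urldefrag(queue.popleft())  # normalize away #fragments
            if url in self.seen:                 # filter duplicates
                continue
            self.seen.add(url)
            if not self.head_check(url):         # filter URLs that fail a HEAD request
                continue
            visited.append(url)
            time.sleep(self.delay)               # timeout between two successive requests
            queue.extend(fetch_links(url))
        return visited

# Demo on a fake link graph; head_check rejects "b", max_links stops after 2.
links = {"a": ["b", "c", "a"], "b": ["c"], "c": []}
crawler = Crawler(head_check=lambda u: u != "b", delay=0.0, max_links=2)
result = crawler.crawl(["a"], lambda u: links.get(u, []))
print(result)  # → ['a', 'c']
```

In a real crawler, `head_check` would issue an actual HTTP HEAD request and `fetch_links` would download and parse a page; injecting them as callables keeps the filtering and limiting logic testable without network access.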