While I was developing SpiderSuite, ProjectDiscovery released a new crawling tool, Katana, and I was very impressed by its capabilities, its wide range of configurable options and, above all, its efficiency and speed (since it is written in Go, this comes as no surprise).
I have been using Katana since it came out, all while developing my own advanced crawling tool. But because Katana is a command-line tool, analyzing the contents of individual crawled pages can be a hassle, so I decided to add an Import feature to SpiderSuite that lets you import crawl results from Katana and visualize them.
Here is a brief guide on how you can do so:
- Download and Installation
You can download Katana from https://github.com/projectdiscovery/katana/releases
- Crawling with Katana
The crawl target for this article is https://crawler-test.com, a crawler testing site.
1. Crawl the target using Katana and save the results to a JSON file using the command:
katana -u https://crawler-test.com -json > crawl_results.json
2. You can also crawl the target using Katana and store the HTTP requests/responses in a custom directory using the command:
katana -u https://crawler-test.com -store-response-dir dirname
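With the first command, each line of crawl_results.json is a standalone JSON object describing one crawled endpoint. If you want a quick sanity check on the file before importing it, you can parse a line yourself. The sketch below is a minimal illustration; the field names (request, endpoint, response, status_code) are assumptions about the general shape of the output and may differ between Katana versions, so inspect a line of your own file first:

```python
import json

# One line in roughly the shape Katana's JSON output takes.
# Field names here are illustrative assumptions, not an exact schema.
sample = ('{"request": {"method": "GET", '
          '"endpoint": "https://crawler-test.com/links/page1"}, '
          '"response": {"status_code": 200}}')

record = json.loads(sample)

# Pull out the crawled URL and the HTTP status for a quick overview.
endpoint = record["request"]["endpoint"]
status = record["response"]["status_code"]
print(endpoint, status)
```

In practice you would loop over the lines of crawl_results.json with the same json.loads call to, say, count endpoints per status code before deciding what to import.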
- Loading Katana’s JSON crawl results into SpiderSuite
Simply go to Application > Import from > Katana > Json
Choose the JSON file.
Accept to load its contents into SpiderSuite.
- Loading Katana’s crawl results from a custom directory into SpiderSuite
Simply go to Application > Import from > Katana > Index
Choose the index file (located inside the custom directory containing the crawl results).
Accept and load its contents into SpiderSuite.
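The index file is what ties each stored request/response file back to the URL it came from, which is how SpiderSuite knows what to display. If you ever want to inspect it by hand, each line maps a stored file to a URL. The sketch below assumes a line format of "<stored-file-path> <url> (<status>)"; this format is an illustration only and may vary between Katana versions, so check a line of your own index file first:

```python
# Parse one index line of the assumed form "<path> <url> (<status>)".
# The concrete line below is a made-up example for illustration.
line = "dirname/crawler-test.com/3f2a1b.txt https://crawler-test.com/ (200 OK)"

# Split into at most three fields: stored file path, URL, and status.
path, url, status = line.split(" ", 2)
print(path)    # where the raw request/response was stored
print(url)     # the crawled URL
print(status)  # the HTTP status, e.g. "(200 OK)"
```

Iterating over such lines gives you a quick map of which on-disk file holds the traffic for which URL, independent of the GUI.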
- Navigating results using SpiderSuite
Now that you have loaded your Katana results into SpiderSuite, you can navigate and analyze the contents of every single page.
You can also save the results to a SpiderSuite project file simply by clicking Save in the menu.
Here is the overview of the results:
Thank you for taking the time to read this post. Please do check out both crawling tools, as they can be valuable additions to your arsenal of security tools.