Having completed my incursion into Metasploitable 2, I’m beginning my foray into Mutillidae II.
Before starting the manual, hands-on stuff I thought I’d throw some automated scanners at the web app for fun and see what results they generate for me.
Mutillidae version 2.6.5 is hosted on my Windows 7 system using XAMPP, and I’m scanning from Kali Linux.
Google’s Skipfish is the seventh most popular web application vulnerability scanner:
Skipfish is an active web application security reconnaissance tool. It prepares an interactive sitemap for the targeted site by carrying out a recursive crawl and dictionary-based probes. The resulting map is then annotated with the output from a number of active (but hopefully non-disruptive) security checks. The final report generated by the tool is meant to serve as a foundation for professional web application security assessments.
As I’ve been having trouble with how long some of these scans take (this morning w3af, having run overnight, told me it would take over thirteen days to complete a “fast scan”), I opted to disable the dictionary-based probes and brute-force attempts:
# skipfish -o /root/Desktop/resuts -L -Y -W- http://192.168.1.96/mutillidae/
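For reference, here is my reading of the flags in that command, sketched out as comments. This is based on the skipfish 2.05b option set as I understand it, so verify against `skipfish -h` on your own install:

```shell
# Same scan, with each flag spelled out (my understanding of skipfish 2.05b):
#   -o <dir>  write the interactive HTML report into this directory
#   -L        don't auto-learn new keywords from the site's page content
#   -Y        don't fuzz file extensions during directory brute-force
#   -W-       use an empty wordlist, disabling the dictionary-based probes
skipfish -o /root/Desktop/resuts -L -Y -W- http://192.168.1.96/mutillidae/
```

Together, `-L`, `-Y`, and `-W-` are what keep the scan down to a crawl plus active checks, which is why this run finished in minutes rather than days.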
skipfish version 2.05b by <lcamtuf@google.com>
- 192.168.1.96 -
Scan statistics:
Scan time : 0:04:22.272
HTTP requests : 1363 (5.2/s), 5804 kB in, 314 kB out (23.3 kB/s)
Compression : 0 kB in, 0 kB out (0.0% gain)
HTTP faults : 401 net errors, 0 proto errors, 0 retried, 641 drops
TCP handshakes : 411 total (3.3 req/conn)
TCP faults : 0 failures, 135 timeouts, 0 purged
External links : 345 skipped
Reqs pending : 0
Database statistics:
Pivots : 324 total, 286 done (88.27%)
In progress : 0 pending, 0 init, 38 attacks, 0 dict
Missing nodes : 8 spotted
Node types : 1 serv, 34 dir, 46 file, 8 pinfo, 81 unkn, 154 par, 0 val
Issues found : 35 info, 338 warn, 6 low, 3 medium, 0 high impact
Dict size : 0 words (0 new), 0 extensions, 0 candidates
[+] Copying static resources...
[+] Sorting and annotating crawl nodes: 324
[+] Looking for duplicate entries: 324
[+] Counting unique nodes: 86
[+] Saving pivot data for third-party tools...
[+] Writing scan description...
[+] Writing crawl tree: 324
[+] Generating summary views...
[+] Report saved to '/root/Desktop/resuts/index.html' [0x193bdbb6].
[+] This was a great day for science!
Here is the link to the reported findings in HTML.
I’m not attempting to exploit any of this information at this point.