The recent DeleGate robots.txt User-Agent String Handling Remote Overflow Vulnerability is a perfect example of the type of vulnerability I hope the Evil Website Testing Suite will eventually be able to expose. This particular vulnerability would not be detected by the current version of ewts, and writing a robots.txt fuzzer isn't at the top of my todo list, but it is on the list. I only just saw the advisory, and was happy to see that these types of vulnerabilities do get some exposure.
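To give a feel for what such a fuzzer might look like, here is a minimal sketch of the idea: an "evil" web server that answers requests for /robots.txt with malformed payloads, such as the oversized User-agent values behind the DeleGate overflow. This is not part of ewts; all names and payloads here are my own hypothetical choices.

```python
# Minimal sketch of a robots.txt fuzzing server (hypothetical, not ewts code).
# Point the crawler or proxy under test at http://host:8080/ and watch for crashes.
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical malformed payloads; oversized field values are the kind of
# input that triggered the DeleGate overflow.
PAYLOADS = [
    b"User-agent: " + b"A" * 65536 + b"\nDisallow: /\n",
    b"User-agent: *\nDisallow: " + b"/" * 65536 + b"\n",
]

class RobotsFuzzHandler(BaseHTTPRequestHandler):
    request_count = 0

    def do_GET(self):
        if self.path == "/robots.txt":
            # Rotate through the payloads so repeated fetches exercise
            # a different malformed input each time.
            body = PAYLOADS[RobotsFuzzHandler.request_count % len(PAYLOADS)]
            RobotsFuzzHandler.request_count += 1
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

if __name__ == "__main__":
    HTTPServer(("", 8080), RobotsFuzzHandler).serve_forever()
```

A real fuzzer would of course generate payloads (truncated lines, missing newlines, binary garbage) rather than cycle through a fixed list, but the serving side would look much like this.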
I have added a few more tests to ewts and come to realize that I really need to write a framework for it. As it currently stands, ewts relies on directory indexes to list the tests so they will be crawled, yet about half of the files are supposed to receive hits only under certain conditions; a crawler that finds them in the index will fetch them anyway, breaking the test case.
The initial framework will focus on allowing tests to be enabled and disabled, and on exposing only the initial page(s) each test needs to have displayed. I don't intend to introduce any metrics yet. This will let me "standardize" the test layout and make it easier for others to contribute.
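As a rough sketch of what I mean by a standardized layout, each test could declare its entry page(s) separately from the pages that should only be hit conditionally, so the index generator never exposes the latter to a crawler. The field names and example tests below are hypothetical, not the actual ewts layout.

```python
# Hypothetical test manifest sketch: only enabled tests' entry pages
# end up in the crawlable index; conditionally-hit pages stay out of it.
from dataclasses import dataclass, field

@dataclass
class Test:
    name: str
    entry_pages: list[str]                                  # safe to list in the index
    hidden_pages: list[str] = field(default_factory=list)   # hit only under certain conditions
    enabled: bool = True

TESTS = [
    Test("redirect-loop", ["redirect-loop/start.html"]),
    Test("robots-trap", ["robots-trap/index.html"],
         hidden_pages=["robots-trap/forbidden.html"], enabled=False),
]

def index_pages():
    """Return only the pages that should appear in the crawled index."""
    return [page for t in TESTS if t.enabled for page in t.entry_pages]

print("\n".join(index_pages()))
```

Keeping the hidden pages out of the generated index is what fixes the problem described above: a crawler only ever sees the entry pages, so a hit on a hidden page actually means the condition under test was triggered.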
If you do wish to contribute, shoot me a message through the SourceForge project page and we'll take it from there.