Saw an exciting blog post from Michael Sutton at SPI Dynamics. In it he demonstrates a small application he put together to search through websites with the Google API looking for sites that appear vulnerable to SQL injection attacks. I originally heard Caleb Sima (founder of SPI) talk about this at the TRISC conference in Houston, Texas earlier this year.
First of all, the techniques demonstrated by the tool unfortunately sit in a grey area: they are trivially easy to implement and test, but of questionable legality. How many folks remember what happened to Daniel Cuthbert when he did some poking around of his own on a website he was using? Granted, that was in the UK, but it would be tough to convince me that the US government these days is immune to that sort of overreaction.
Second, it is tough to attach much weight to the statistics that came out of this exercise. They don't tell you what percentage of websites are vulnerable to SQL injection attacks – they tell you what percentage of sites using "id=10" somewhere in their URLs appear to be vulnerable to SQL injection attacks. That's not a terribly meaningful number standing on its own, because the initial sample is fairly arbitrary.
Finally, what this does demonstrate (and this is really the point of the post) is that SQL injection is prevalent and easy to find in an automated fashion.
SQL injection is almost completely preventable:
- automated testing tools (both dynamic and static analysis) can usually find it
- simple coding standards can usually defeat it (via stored procedures and parameterized queries) unless developers try really hard to introduce the vulnerability
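To illustrate the parameterized-query point, here is a minimal sketch using Python's built-in sqlite3 module. The table, column names, and attacker input are hypothetical; the same pattern applies to any database driver that supports bind parameters.

```python
import sqlite3

# Hypothetical products table with two rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER, name TEXT)")
conn.execute("INSERT INTO products VALUES (10, 'widget')")
conn.execute("INSERT INTO products VALUES (11, 'gadget')")

# Attacker-controlled value, e.g. from a URL like ?id=10
user_input = "10 OR 1=1"

# Vulnerable: string concatenation lets the input rewrite the query,
# so the WHERE clause becomes "id = 10 OR 1=1" and matches every row.
vulnerable = conn.execute(
    "SELECT name FROM products WHERE id = " + user_input
).fetchall()

# Parameterized: the driver binds the whole input as a single value,
# so it is compared against id as data, never parsed as SQL.
safe = conn.execute(
    "SELECT name FROM products WHERE id = ?", (user_input,)
).fetchall()

print(len(vulnerable))  # 2 - the injection leaked every row
print(len(safe))        # 0 - the bound value matches no id
```

The fix costs nothing at development time; the query text stays static and only the data varies, which is exactly what makes this class of bug so preventable.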
If your organization doesn't have even base-level coding standards and security testing in place, then you are begging to be on the receiving end of these sorts of attacks. The Internet is a dangerous place.
dan _at_ denimgroup.com