A denial of service attack works by flooding a server with so much traffic that it becomes hopelessly busy, locks up and turns useless. Normally an attacker would have to marshal some serious horsepower to overwhelm a big website, but this latest vulnerability allows them to do it with far less effort. Frankly, it's the last thing the web needs in this summer of me-too hacktivism.
The vulnerability exploits a feature of web servers that lets you pause and resume your downloads. These days, if you have to stop downloading something partway through, you can generally pick up where you left off rather than starting again from scratch.
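Under the hood, resuming works through two standard HTTP/1.1 headers: the client sends a `Range` header asking for bytes from a given offset, and the server replies with a `Content-Range` header describing which slice it is sending. Here is a minimal sketch of that exchange at the header level; the byte offsets are made up for illustration.

```python
# Sketch of how a resumed download looks at the HTTP header level.
# Range and Content-Range are standard HTTP/1.1 headers; the
# offsets below are illustrative, not from any real transfer.

def resume_range_header(bytes_already_downloaded: int) -> str:
    """Ask the server for everything from the given offset onwards."""
    return f"bytes={bytes_already_downloaded}-"

def parse_content_range(header: str) -> tuple[int, int, int]:
    """Parse a 206 response's Content-Range, e.g. 'bytes 500-999/1000'."""
    unit, _, spec = header.partition(" ")
    if unit != "bytes":
        raise ValueError("unexpected range unit")
    byte_range, _, total = spec.partition("/")
    start, _, end = byte_range.partition("-")
    return int(start), int(end), int(total)

# Resuming after 500 bytes of a 1000-byte file:
print(resume_range_header(500))                    # bytes=500-
print(parse_content_range("bytes 500-999/1000"))   # (500, 999, 1000)
```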
This useful feature is possible because a web server can be told to send you only the part of a file you need. In fact, it's possible to ask for more than one part of a file in the same request, and that's the problem. It seems you can legitimately ask for hundreds of very large, overlapping parts of a file in a single request: enough parts that a relatively modest number of requests can tie a server's CPU and memory in knots.
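The numbers make the asymmetry clear. Because the requested parts may overlap, one small request can ask the server to prepare far more data than the file actually contains. A rough sketch, with a made-up file size and part count:

```python
# Illustrative sketch of why overlapping ranges are dangerous.
# The file size and part count are assumptions for illustration;
# the header shape follows HTTP/1.1 Range syntax.

def overlapping_range_header(file_size: int, parts: int) -> str:
    """Many ranges, each covering almost the whole file."""
    ranges = ",".join(f"{i}-{file_size - 1}" for i in range(parts))
    return "bytes=" + ranges

def bytes_requested(file_size: int, parts: int) -> int:
    """Total bytes the server is asked to prepare for one request."""
    return sum(file_size - i for i in range(parts))

# One request against a 1 MB file, asking for 300 overlapping parts:
header = overlapping_range_header(1_000_000, 300)
work = bytes_requested(1_000_000, 300)
print(work)  # roughly 300 MB of work triggered by one tiny request
```

The request itself is only a few kilobytes of header text, yet the server is asked to assemble hundreds of megabytes, which is why a handful of clients can exhaust a machine's memory and CPU.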
It seems that Apache handles these kinds of requests particularly inefficiently, but the exploit is at least partly rooted in a weakness in the HTTP protocol itself, the set of rules that determines how any web server should behave. Because all web servers follow the same rules, it's possible that all of them are vulnerable to some degree.
According to the advisory, an attack tool is already in the wild and in use. Although no patch is available at the time of writing, one is expected within hours. In the meantime, diligent webmasters will want to consider the mitigation strategies outlined in the advisory.
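The mitigations amount to refusing to honour `Range` headers that ask for an implausible number of parts. A minimal sketch of that idea, not the advisory's actual configuration: the threshold of five parts and the function name are assumptions chosen for illustration.

```python
# Sketch of a stop-gap filter: drop Range headers that ask for
# suspiciously many parts. The MAX_RANGES threshold of 5 is an
# assumption for illustration, not a value from the advisory.

MAX_RANGES = 5

def should_drop_range_header(range_header: str) -> bool:
    """True if the Range header asks for an implausible number of parts."""
    if not range_header.startswith("bytes="):
        return True  # malformed; safer to ignore than to honour
    parts = range_header[len("bytes="):].split(",")
    return len(parts) > MAX_RANGES

# A normal resume request passes; a 300-part request is dropped:
print(should_drop_range_header("bytes=0-499"))   # False
attack = "bytes=" + ",".join(f"{i}-" for i in range(300))
print(should_drop_range_header(attack))          # True
```

A server that drops the header simply sends the whole file instead, so legitimate clients still get their data; only the amplification trick stops working.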
More generally this raises some interesting questions about the wisdom of having so much of the web reliant on one piece of software, no matter how good it is.