We have a lot of people who spider some of the public sites we use. People ignore robots.txt and the other limiters we put on the thousands of public websites we host for corporations. Some of them are hitting us 2-3 times a second, and you multiply that by a couple dozen and it's a lot of web traffic that is essentially wasted.
Thank goodness we use Akamai, who is our savior. They are setting up an agent-based limiter that will limit the "freshness" of the content based on the type of user.
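The idea, as I understand it, is roughly this (a minimal sketch, not Akamai's actual implementation -- the user-agent patterns and TTL values here are hypothetical): requests whose User-Agent looks like a spider get a much longer cache TTL, so they're served stale cached content instead of hammering the origin.

```python
import re

# Hypothetical freshness policy: known spiders are served day-old
# cached content, while real users get near-fresh pages.
SPIDER_PATTERNS = [r"bot", r"crawler", r"spider", r"Slurp"]
SPIDER_TTL = 24 * 3600   # spiders can live with day-old content
DEFAULT_TTL = 60         # ordinary users get near-fresh content

def cache_ttl_for(user_agent: str) -> int:
    """Pick a cache TTL (seconds) based on the request's User-Agent."""
    for pattern in SPIDER_PATTERNS:
        if re.search(pattern, user_agent, re.IGNORECASE):
            return SPIDER_TTL
    return DEFAULT_TTL
```

So `cache_ttl_for("Mozilla/5.0 (compatible; Googlebot/2.1)")` comes back with the long TTL, while a normal browser string gets the short one.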
What's going to be next, these spiders posing as IE?
I wouldn't think we are too far off.