Any script has the potential to be a resource hog when malicious spiders come crawling along.
Well-behaved crawlers such as Google's and Inktomi's deliberately limit their crawling rate so they don't hit any one server too hard, even for static pages. Not all spiders are so polite: a single malicious spider can easily make a hundred requests a minute, which is enough to bring many servers to their knees.
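One common defense is to track how often each client asks for pages and refuse requests once it exceeds a threshold. The following is a minimal sketch of a sliding-window rate limiter; the class name, threshold, and window size are illustrative assumptions (the hundred-requests-a-minute figure echoes the one above), not code from this book.

```python
import time
from collections import defaultdict, deque

class RateLimiter:
    """Per-client sliding-window rate limiter (illustrative sketch)."""

    def __init__(self, max_requests=100, window_seconds=60):
        self.max_requests = max_requests      # e.g. 100 requests...
        self.window = window_seconds          # ...per 60 seconds
        self.hits = defaultdict(deque)        # client IP -> request timestamps

    def allow(self, client_ip, now=None):
        """Return True if this request is within the client's rate limit."""
        now = time.time() if now is None else now
        q = self.hits[client_ip]
        # Discard timestamps that have aged out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.max_requests:
            return False                      # over the limit: deny
        q.append(now)
        return True
```

A script would call `allow()` at the top of each request handler and return an error (for example, HTTP 429 or 503) when it comes back `False`, so a runaway spider gets cheap refusals instead of expensive page builds.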