Any script can become a resource hog when malicious spiders come crawling.
Well-behaved crawlers such as Google's and Inktomi's deliberately limit their crawl rate to avoid hitting a server too hard, even for static pages. Not all spiders are so polite: a single malicious spider can easily fire a hundred requests a minute, which can quickly bring a server to its knees.
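One common defense against this kind of abuse is per-client rate limiting. The sketch below is illustrative, not part of any UBB.threads code: it tracks recent request timestamps per IP in a sliding window and rejects clients that exceed a cap (the 100-requests-per-minute threshold mirrors the figure above but is an assumption you would tune for your own site).

```python
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60    # look-back window (illustrative)
MAX_REQUESTS = 100     # hypothetical cap: ~100 requests/minute per client

# per-IP deque of recent request timestamps
_hits = defaultdict(deque)

def allow_request(ip, now=None):
    """Return True if this IP is under the rate limit, False to throttle it."""
    now = time.monotonic() if now is None else now
    q = _hits[ip]
    # discard timestamps that have aged out of the window
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()
    if len(q) >= MAX_REQUESTS:
        return False  # over the limit: serve a 429 or drop the request
    q.append(now)
    return True
```

In practice you would call `allow_request()` at the top of the request handler and return an HTTP 429 when it comes back `False`; a real deployment would also expire idle IPs to bound memory.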