ADWOFF - He is talking about programs that are usually used to harvest emails, though they do have other uses. Back in the day, they were mostly link hoppers and text crunchers, following links and looking for *@*.com or .org or whatever. The new breed traverses cgi programs as well, and a few of the really smart ones are written specifically for the major boards.
The robot finds ubb.cgi and hits it, gets a list of all the forums, goes to each forum, gets a list of every subject, goes to every subject, and crunches the html to collect email addresses.
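Just to give an idea of what the "crunching" step amounts to, here is a rough sketch in Python (the pattern and the fetch are only illustrative, not any particular harvester's code):

    import re
    import urllib.request

    # Naive *@*.com / .org / .net pattern of the kind the old text crunchers used
    EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.(?:com|org|net)", re.IGNORECASE)

    def harvest(url):
        # Fetch one page and pull out anything that looks like an email address
        html = urllib.request.urlopen(url).read().decode("utf-8", "ignore")
        return set(EMAIL_RE.findall(html))

Point the loop at every forum and every topic link and you have the whole board's addresses, which is why the cgi-driven boards get hit so hard.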
On primitive boards it is just bandwidth that gets eaten, and that's not all that bad because it isn't really all that much. But on the new boards every link is .cgi driven, which takes more CPU than just dishing out a static .html page.
bb77 - Possible solutions
Lightweight solution: you could read up on robots.txt files and use one. This will stop some of the legitimate robots, spiders, and harvesters, but not the hard-core spammers; see the example below.
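A minimal robots.txt at the top of your site that asks crawlers to stay out of the board scripts might look like this (the path is just an example, match it to wherever your cgi lives):

    User-agent: *
    Disallow: /cgi-bin/

Again, only the polite robots honor it; the spammers' harvesters just ignore the file.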
You could put the whole thing behind a .htaccess file and give your members the password. Generally speaking, this will stop the commercially sold harvesters, which are mostly automated; it is not worth their time to get the username and password from you, especially if you change it regularly.
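If you are on Apache, a bare-bones .htaccess for this would be something like the following, assuming you have already made an .htpasswd file for your members (the file path here is made up, use your own):

    AuthType Basic
    AuthName "Members Only"
    AuthUserFile /home/yoursite/.htpasswd
    Require valid-user

Rotate the shared password every so often so a leaked one doesn't stay useful.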
You could also close the board itself to public viewing. That way the harvesters would have to get a username and password just to read it.
UBB solution: add a routine to block read floods the same way you can block post floods. If someone reads, let's say, 10 pages less than a minute apart, it locks out their IP address and gives them a screen where they can complain to the webmaster, so s/he can adjust the settings to accommodate his or her membership's reading style.
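A rough sketch of that read-flood check, just to show the logic (UBB itself is Perl; this is Python for illustration, and the numbers and names here are made up, not anything actually in UBB):

    import time

    READ_LIMIT = 10        # pages allowed...
    WINDOW_SECONDS = 60    # ...within this many seconds
    _reads = {}            # ip -> list of recent read timestamps

    def allow_read(ip):
        # Return True if this IP may view another page, False if it should
        # get the "contact the webmaster" lockout screen instead.
        now = time.time()
        recent = [t for t in _reads.get(ip, []) if now - t < WINDOW_SECONDS]
        recent.append(now)
        _reads[ip] = recent
        return len(recent) <= READ_LIMIT

The limit and window would be the settings the webmaster tunes: a small chatty board might want a tighter limit, a busy board with fast readers a looser one.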