This is something I've been researching. AFAIK (and it's only from what I've read) it's a bit of a subjective measure: factors such as server specs can mean that what would be considered a high load on a lesser machine is perfectly acceptable on a higher-spec machine.
What I'd like to know, though, is what it's actually a measure of. Is it measured from 0-100%, or 0-10? What are the parameters?
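For reference, I'm assuming the figure in question is the Linux load average (the three numbers shown by uptime and top, also readable from /proc/loadavg). A rough sketch of how it might be read and scaled by core count, which is how I understand the "lesser machine vs higher-spec machine" point, assuming that interpretation is right:

```python
import os

# Minimal sketch (Linux only): read the 1-, 5- and 15-minute load averages
# from /proc/loadavg -- the same figures that `uptime` and `top` report.
with open("/proc/loadavg") as f:
    one_min, five_min, fifteen_min = map(float, f.read().split()[:3])

# Load is commonly judged relative to the number of CPU cores,
# which is presumably why the same number can mean different things
# on different hardware.
cores = os.cpu_count()

print(f"load averages: {one_min} (1m), {five_min} (5m), {fifteen_min} (15m)")
print(f"1-minute load per core: {one_min / cores:.2f}")
```

If that's not the figure being discussed, please correct me.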