Rate limiting could help when an automated process is scanning arbitrary, generated URLs, inevitably producing a flood of 404 errors, something your rate-limiting logic can easily check for (depending on the server/proxy software, of course). Normal users, and even normal bots, won't generate excessive 404s in a short time frame, so that's potentially a pretty simple metric by which to apply a rate limit. Just an idea though; I haven't done that myself...
Specifically, I use fail2ban to count the 404s and temporarily ban the IP when a certain threshold is exceeded within a given time frame. Every time I check the fail2ban stats, it has hundreds of IPs blocked.
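For anyone wanting to try this, here's a rough sketch of the kind of fail2ban setup described above. The filter name, log path, and thresholds are all examples you'd adapt to your own setup; the `<HOST>` token is fail2ban's placeholder for the client IP. This assumes an nginx/Apache-style combined access log.

```
# /etc/fail2ban/filter.d/404-scan.conf (example filter name)
[Definition]
# Match access-log lines where the response status is 404
failregex = ^<HOST> .* "(GET|POST|HEAD)[^"]*" 404

# Add to /etc/fail2ban/jail.local
[404-scan]
enabled  = true
port     = http,https
filter   = 404-scan
logpath  = /var/log/nginx/access.log
maxretry = 20      ; ban after 20 matching 404s...
findtime = 60      ; ...within 60 seconds
bantime  = 3600    ; ban for an hour
```

You can sanity-check the filter against your real log with `fail2ban-regex /var/log/nginx/access.log /etc/fail2ban/filter.d/404-scan.conf` before enabling the jail, and `fail2ban-client status 404-scan` shows the currently banned IPs.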