If what's hitting you really is a search engine robot, then you can probably use robots.txt to limit or prevent crawling of certain areas of your site.
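For example, a minimal robots.txt might look like the snippet below. The /private/ path and the crawl delay are placeholders, and keep in mind this only works for well-behaved robots that actually honor the file (Crawl-delay in particular is ignored by some major crawlers):

```
User-agent: *
Disallow: /private/
Crawl-delay: 10
```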
Assuming that the pages are open for anyone to see (as in not password protected), the quick answer is that you can't stop someone from doing that. All you can really do is make it more cumbersome to accomplish what they are trying to do. I've not needed to do this myself, so I'm not sure what is really effective, but I'd suggest reading up on "throttling" (also called rate limiting). Since you supplied PHP links, here's a start:
One thing I immediately thought of was to track requests by IP address, and if there are more than XXX requests in YY seconds, make further requests from that IP address take longer, perhaps using sleep(). For that, you'd need some way to track the number of requests over a period of time, perhaps with a database. Setting a session value will likely not do any good, as the robot is probably making individual requests and not keeping any session state on its end.
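A rough sketch of that idea might look like the following. I'm assuming a SQLite database via PDO just to keep it self-contained; the table name, file path, request limit, window length, and sleep duration are all made up for illustration:

```php
<?php
// Throttling sketch: count requests per IP over a sliding window and
// slow the response down once an IP exceeds the limit.

$limit  = 100;  // max requests allowed per window (placeholder value)
$window = 60;   // window length in seconds (placeholder value)

$db = new PDO('sqlite:' . __DIR__ . '/throttle.sqlite');
$db->exec('CREATE TABLE IF NOT EXISTS hits (ip TEXT, ts INTEGER)');

$ip  = $_SERVER['REMOTE_ADDR'];
$now = time();

// Record this request and discard entries older than the window.
$db->prepare('INSERT INTO hits (ip, ts) VALUES (?, ?)')->execute([$ip, $now]);
$db->prepare('DELETE FROM hits WHERE ts < ?')->execute([$now - $window]);

// Count how many requests this IP has made within the window.
$stmt = $db->prepare('SELECT COUNT(*) FROM hits WHERE ip = ? AND ts >= ?');
$stmt->execute([$ip, $now - $window]);
$count = (int) $stmt->fetchColumn();

// Over the limit: make this request (and every one after it) take longer.
if ($count > $limit) {
    sleep(5);
}
```

If traffic is heavy you'd want an index on (ip, ts), and instead of sleeping you could send a 429 response and stop, but the general shape would be the same.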
Maybe someone else has some good suggestions.