
I think I saw an article recently where someone used HTTP to serve gzipped content that was specially crafted to explode in size on the receiver side. This could be a good deterrent against crawlers, as they typically don't have much memory or disk dedicated to each instance.
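A minimal sketch of the trick, assuming a Python standard-library server and a highly compressible payload of zeros; the sizes, port, and handler names here are illustrative, not taken from the article:

    import gzip
    import io
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Illustrative sizes: 1 GiB of zero bytes compresses to roughly 1 MB with gzip,
    # so the client pays ~1000x the transferred bytes in memory when it decompresses.
    UNCOMPRESSED_SIZE = 1024 * 1024 * 1024  # 1 GiB (hypothetical choice)

    def build_gzip_bomb(size: int) -> bytes:
        """Compress `size` zero bytes in chunks, so the server never holds them all."""
        out = io.BytesIO()
        chunk = b"\x00" * (1024 * 1024)  # 1 MiB of zeros per write
        with gzip.GzipFile(fileobj=out, mode="wb", compresslevel=9) as gz:
            remaining = size
            while remaining > 0:
                n = min(len(chunk), remaining)
                gz.write(chunk[:n])
                remaining -= n
        return out.getvalue()

    BOMB = build_gzip_bomb(UNCOMPRESSED_SIZE)

    class BombHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Declare the body as gzip-encoded: only the small compressed blob
            # crosses the wire, but a client that transparently decompresses
            # tries to materialise the full uncompressed size.
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.send_header("Content-Encoding", "gzip")
            self.send_header("Content-Length", str(len(BOMB)))
            self.end_headers()
            self.wfile.write(BOMB)

    if __name__ == "__main__":
        HTTPServer(("0.0.0.0", 8080), BombHandler).serve_forever()

The server only ever stores and sends the small compressed blob; the cost is pushed entirely onto whoever decompresses it.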





I like the general idea of having zero tolerance for bad behaviour.

> bad behaviour

It’s worth noting that the “zip bomb” was at a resource location specified in the Disallow section of robots.txt, meaning the server specifically told the bot not to go there and it did anyway.

(Not that the parent commenter seems confused, just that it hadn’t been noted.)
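For context, a compliant crawler checks robots.txt before fetching anything, so only bots that ignore the Disallow rule ever reach the trap. A small sketch of that check using Python's urllib.robotparser; the /trap/ path and URLs are hypothetical:

    from urllib import robotparser

    # Hypothetical robots.txt for the site; /trap/ is where the "zip bomb" lives.
    ROBOTS_TXT = """\
    User-agent: *
    Disallow: /trap/
    """

    rp = robotparser.RobotFileParser()
    rp.parse(ROBOTS_TXT.splitlines())

    trap_url = "https://example.com/trap/index.html"  # hypothetical trap URL

    # A polite bot calls can_fetch() first and never requests the trap;
    # a bot that skips this check walks straight into it.
    print(rp.can_fetch("MyCrawler/1.0", trap_url))  # -> False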



