
Because when you actually read it, you'll download about one article every 30 minutes. If you batch-download all of their content now, you create a much higher peak load and, hence, higher costs for them.


You can use --limit-rate=500k if you want to limit the bandwidth you're consuming.
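
For example, a full mirror run with that rate cap could look something like this (example.com is just a placeholder for whatever site you're grabbing):

    wget --mirror --convert-links --limit-rate=500k https://example.com/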

And load does not translate to cost for everybody. If you saturate the connection to my VPS, I don't pay more; it just gets slower for everyone in contention. I could spin up mirrors, but if I'm offering a free resource like this, I'd be more likely to limit the bandwidth per client IP or just let it run slow. They could even limit the bandwidth to that subdirectory with...

    # in the http block, define a shared zone keyed by client IP
    limit_conn_zone $binary_remote_addr zone=addr:10m;

    location /download/ {
        limit_conn addr 1;   # at most one concurrent connection per client IP
        limit_rate 50k;      # cap each connection at 50 KB/s
    }


On the other hand, if you download it all at once you aren't constantly reloading a page each time you come back to it, and you aren't limited to reading it only while you're online. I do see the point you're making, though I think it depends on the author's perspective.


So read the wget man page and add settings to your liking:

    --wait=5000 --random-wait

Done...
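
Put together, that would be something like the following, with example.com standing in for the actual site. Note that --wait takes seconds, so 5000 is roughly 83 minutes between requests, close to the human reading pace mentioned above, and --random-wait varies that interval between 0.5x and 1.5x so the requests aren't perfectly regular.

    wget --mirror --wait=5000 --random-wait https://example.com/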


So he should seed a torrent instead of posting a wget command?



