Webserver gives a few kB and stalls

Today I wanted to download a few standards documents, about 15 PDF and ZIP files. But this stupid webserver would only send between 100 kB and 700 kB of each file, then completely stop sending data. Very annoying, since I wanted to read those documents.

Don’t panic!

wget has a nice --continue option, so I could hit CTRL+C and restart, repeating this until I had the whole file. But that is far too cumbersome when downloading many and/or big files; I wanted a completely automated way to do it. And there is one, because wget has many more options: keep retrying forever, and abort a connection that sends no data for 3 seconds, so --continue can resume right where the transfer stalled:

wget --continue --tries=0 --read-timeout=3 URLS...

Tadaa! It downloads everything, problem solved :)
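
Bonus tip: if the list of URLs gets long, you don't even have to paste them all on the command line. wget can read them from a file via its --input-file option; the file name urls.txt below is just an example, put one URL per line in it:

wget --continue --tries=0 --read-timeout=3 --input-file=urls.txt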
