I have to do this something like 12 times a week.
If you're running Mac OS X, or any variant of *nix, you may or may not be aware of wget and how cool it is.
Case in point:
Vendor X just posted a file on their FTP site. In this case it's Dialogic, a major telecom vendor for carrier-grade stuff.
[pserwe@host ~]$ wget -c --ftp-user=<user> --ftp-password=<password> ftp://ftp.dialogic.com/posted_file.zip
-c, my friends, is short for --continue, so if the transfer gets stalled, or there's some sort of transient issue, I can restart it with the exact same command line and pick up where I left off.
What's not to like? It's approximately 15 times faster than firing up some obnoxious GUI and sucking a file down, and I get a really nice status line the entire time it's running:
--12:50:45-- ftp://ftp.dialogic.com/posted_file.zip
=> `posted_file.zip'
Resolving ftp.dialogic.com... 192.219.17.67
Connecting to ftp.dialogic.com|192.219.17.67|:21... connected.
Logging in as user ... Logged in!
==> SYST ... done. ==> PWD ... done.
==> TYPE I ... done. ==> CWD not needed.
==> SIZE posted_file.zip ... 55734149
==> PASV ... done. ==> REST 9646080 ... done.
==> RETR posted_file.zip ... done.
Length: 55734149 (53M), 46088069 (44M) remaining
100%[++++++++++++++++============================================================================>] 55,734,149 413K/s in 1m 50s
12:52:36 (408 KB/s) - `posted_file.zip' saved [55734149]
[pserwe@host ~]$
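If that transfer had died partway through, picking it back up is just a matter of rerunning the identical command. A rough sketch, with the same placeholder credentials; the REST line in the output above is wget asking the server to skip the bytes it already has on disk:

# Same command as before: wget notices the partial posted_file.zip in the current
# directory and resumes from the end of it instead of starting over.
[pserwe@host ~]$ wget -c --ftp-user=<user> --ftp-password=<password> ftp://ftp.dialogic.com/posted_file.zip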
If there is no authentication required, as with a lot of HTTP or anonymous FTP sites, mirrors.kernel.org for instance, you can omit the --ftp-user= and --ftp-password= options entirely.
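For example, grabbing something from a mirror with no credentials at all looks like this; the path here is made up purely for illustration:

# No --ftp-user/--ftp-password needed for anonymous HTTP/FTP; -c still lets you
# resume if the connection drops. (Hypothetical path, just to show the shape.)
[pserwe@host ~]$ wget -c http://mirrors.kernel.org/some/mirror/path/some-big-file.tar.gz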
An even better example:
I have a vendor that makes a super expensive (a major fraction of $1M USD) test and measurement system for telephone calls. These guys can sniff and break down a protocol like nobody else. Every so often they have to do software updates that involve dumping a bit over a GB of files onto my system.
They typically do it by pushing from their office over a 3Mb bonded T1 or a fractional DS3 (not sure which, probably bonded T1s). It takes quite a bit of time at that speed.
I tipped them off to wget, and now they run it from my system and suck the files down over a 1Gb link instead. It moves quite a bit faster! The actual source of the files is a server in colo with serious bandwidth, so pulling from there I get around 800 KB/s to 1 MB/s.
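Their side of it ends up being a sketch like this, run from a shell on my box; the host and path are made up, but the flags are standard wget:

# -r pulls everything under the release directory, -np keeps wget from wandering
# up the tree, -nH drops the hostname from the local directory layout, and -c
# means a dropped connection partway through a GB of files doesn't mean starting over.
[pserwe@host ~]$ wget -c -r -np -nH ftp://updates.example.com/releases/current/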
Good times, and time saved, just by using wget.