April 20th, 2004, 01:33 AM
Anybody know a good "website grabbing" prog?
I need to download a medium-sized website, and all the website grabbers I've found on Google have this crap where they only download 100 or so files and then make you register. Since it's a pretty big site, I don't really feel like staying up until 3 in the morning doing Ctrl+S on every page.
April 20th, 2004, 09:45 AM
What about wget? (Just type that into Google; the first hit is the *nix version, the second the Windows version.)
April 20th, 2004, 03:22 PM
There is also a tool called Offline Commander, which lets you grab entire sites for offline viewing.
April 20th, 2004, 09:35 PM
sounds good, thnx guys
April 21st, 2004, 08:20 AM
HTTrack is also a very useful program with lots of options, and it works on both Windows and Unix. However, if you're downloading someone else's site, please be considerate: mirroring can use a lot of bandwidth and other server resources (particularly if you're making several requests per second), which tends to annoy webmasters and server admins, and you may end up with your IP blocked by the site.
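For what it's worth, HTTrack can be run from the command line too. A rough sketch from memory (the target URL is just a placeholder, and the exact flags may differ between versions, so check `httrack --help` before trusting these):

```shell
# Mirror a site into ./mirror while going easy on the server.
#   -O ./mirror       output directory for the local copy
#   -c4               limit to 4 simultaneous connections
#   --max-rate 50000  cap transfer at roughly 50 KB/s
httrack "http://example.com/" -O ./mirror -c4 --max-rate 50000
```

Throttling like this is slower, but it keeps you off the admin's radar.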
April 21st, 2004, 08:25 AM
There's also Teleport. I think it's a trial version for a while... or maybe not; look it up, it's been a while since I used it.
May 24th, 2004, 11:25 PM
Yup, I'm with MsMittens on this one..
None of those GUI apps are needed to download a website..
wget -m http://domain.com/site
is all you'll need
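And if you want to be polite about it (per the bandwidth warning above), wget has flags for that too. A sketch, with the same placeholder URL:

```shell
# Mirror the site, fix up links for offline browsing, and throttle:
#   -m                mirror mode (recursive, with timestamping)
#   -k                convert links so the local copy browses offline
#   -p                also fetch page requisites (images, CSS)
#   --wait=2          pause 2 seconds between requests
#   --limit-rate=50k  cap bandwidth at 50 KB/s
wget -m -k -p --wait=2 --limit-rate=50k http://domain.com/site
```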
May 27th, 2004, 02:37 AM
BlackWidow, Coast WebMaster