Short:        Auto download or link-check entire Web sites easily! (v1.02)
Author:       Chris S Handley (NOspam@chris.s.handleyATbtinternet.com)
Uploader:     Chris S Handley (NOspam chris s handleyATbtinternet com)
Type:         comm/tcp
Version:      v1.02
Requires:     HTTPResume v1.3+, Rexxsupport.library, ARexx
Architecture: m68k-amigaos

Introduction
------------
Have you ever visited a cool web site & wanted to keep a copy of some/all of it,
but it would take ages to find & download all the relevant pages/files?

This is the answer!

You supply this ARexx script with the start page URL, and a destination
directory (which should be empty), and maybe a few other options - and off it
goes!  Note that it needs HTTPResume v1.3+ to work (available from Aminet).
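
For example, a typical Shell invocation might look something like this - the URL
and drawer below are made up, and the exact argument & option syntax is described
in the script's own documentation, but the order (start page first, destination
directory second) is as described above:

  rx GetAllHTML "http://www.somesite.com/index.html" "Work:Downloads/SomeSite/"

The first argument is the start page URL, and the second is the (preferably
empty) drawer that the downloaded pages & files will be saved into.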

Latest News
-----------
This is a minor update to my "final" v1.01 release.  I have fixed my new
AmigaDOS script (which makes using GetAllHTML easier) so that it works properly.


If you cannot get ARexx to work, please read my warning below about text
editors.

History
-------
v1.02  (28-06-03) - Fixed script.  Script can now download single pages.
                    Handling of URLs with "?" in them was a bad idea, so the
                    original behaviour restored.
v1.01  (10-02-03) - Added a script to make it easier to use, and explanation
                    of how to run GetAllHTML from an IBrowse button.
v1.00  (22-08-02) - Should now handle URLs with "?" in them.  Now considers
                    SWF (flash) files as pictures.  Final version?
                    Added anti-spam stuff to email address :(
<snip>