I'm looking for a software package that can take a snapshot of a website for archival purposes and go 'x' levels deep. I want it to be able to snag stuff such as PDF documents.
GNU wget:

wget -m -np http://www.compaq.com/some/path/to/interesting/stuff/

-m turns on mirroring (recursive retrieval with timestamping) and -np keeps it from ascending to the parent directory, so it only grabs what's under the starting path. Documentation: http://www.gnu.org/software/wget/wget.html
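
For the 'x levels deep' and PDF parts specifically, something along these lines should do it (the URL is just the placeholder from above; adjust -l to whatever depth you want):

# recurse at most 3 levels, don't climb above the start directory,
# keep only PDF and HTML files, and rewrite links for local browsing
wget -r -l 3 -np -k -A pdf,html http://www.compaq.com/some/path/to/interesting/stuff/

One caveat: with -A, wget still downloads HTML pages so it can extract links from them, but deletes any that don't match the accept list afterwards. Listing html alongside pdf keeps the pages in the archive rather than just the documents they point to.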