More specifically, this: wget -m -l 2 -k -E news.bbc.co.uk

-m: causes it to mirror the site.
-l 2: follows links to a depth of 2. Note that the higher this is, the longer the download will take. A link depth of 1 is recommended, otherwise you will be downloading a lot.
-k and -E: -k converts the links so the files are locally browsable, and -E renames the downloaded pages to have .html extensions.

Then create an alias for wget -m -l 2 -k -E, such as mirror, in your bash profile and you are ready to go. You can even write a simple bash script that downloads several web sites every day so they are readable offline; a sketch of both is below.
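For example, here is roughly what that could look like. The alias name mirror comes from the tip above; the script name and any site other than news.bbc.co.uk are just placeholders. In your ~/.bash_profile (or ~/.bashrc):

    alias mirror='wget -m -l 2 -k -E'

After reloading your profile, "mirror news.bbc.co.uk" works from the shell. A simple daily script, say mirror-sites.sh, might look like this; note that aliases from your profile are not expanded inside scripts, so it calls wget directly with the same options:

    #!/bin/bash
    # Mirror a few sites for offline reading; edit the list to taste.
    # Depth 1 is used here, per the recommendation above.
    for site in news.bbc.co.uk example.com; do
        wget -m -l 1 -k -E "$site"
    done

You could then run it by hand or from cron once a day.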