Ryan Chandler

How to download a website for offline viewing with wget

Tips & Tricks

I've been on my fair share of flights recently and, as we all know, the Wi-Fi is a little hit or miss. It's getting better thanks to Starlink, but it still isn't 100% reliable.

I wanted to get some things done on a plane, and one of my most visited websites is laravel.com. Despite being a Laravel developer and an engineer at Laravel, there are still things that I have to look up sometimes: validation rules, collection method names, and more.

So I decided to download the entire Laravel website to my machine so that I could serve it using a local web server. It turns out this is super easy to do with wget.

If you don't already have wget installed, you can install it with Homebrew on macOS.

brew install wget

Once you've got it installed, you can use the following command to download a copy of a website for offline viewing, including frontend assets and child pages.

wget --mirror --convert-links --adjust-extension --page-requisites --no-parent <url>
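For reference, here's what each of those flags does, per the wget manual. The breakdown below stores them in a bash array so the comments can sit next to each flag; the echo at the end just prints the assembled command so nothing is downloaded until you drop it:

```shell
# Each flag, as described in the wget manual:
MIRROR_FLAGS=(
  --mirror            # shorthand for -r -N -l inf --no-remove-listing:
                      # recurse with no depth limit, re-fetching only changed files
  --convert-links     # rewrite links in the saved pages so they work offline
  --adjust-extension  # save pages with an .html extension when the URL lacks one
  --page-requisites   # also download the CSS, JS, images, and fonts each page needs
  --no-parent         # never ascend above the starting URL's path
)

# Print the assembled command; remove the echo to actually run it.
echo wget "${MIRROR_FLAGS[@]}" https://laravel.com/docs/12.x
```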

I only wanted the latest Laravel docs so I ran the following command:

wget --mirror --convert-links --adjust-extension --page-requisites --no-parent https://laravel.com/docs/12.x

This created a laravel.com directory inside the directory where the command was run. You can then serve it with a local web server, for example by running php -S localhost:8000 from inside the laravel.com directory.
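If you do this often, the serving step can be wrapped in a small function too. This is just a sketch using PHP's built-in server; the serve_mirror name and the default directory are my own, not part of the original post:

```shell
# serve_mirror: serve a mirrored site with PHP's built-in web server.
# wget names the output directory after the host, e.g. laravel.com.
serve_mirror() {
    local dir="${1:-laravel.com}"
    if [ ! -d "$dir" ]; then
        echo "serve_mirror: directory '$dir' not found - run wget first" >&2
        return 1
    fi
    php -S localhost:8000 -t "$dir"
}
```

Run serve_mirror from the directory where you ran wget, then open http://localhost:8000 in your browser.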

If you want to wrap this up into a nice little shell command / helper, you can add the following to your .zshrc (or equivalent):

offline-copy() {
    wget --mirror --convert-links --adjust-extension --page-requisites --no-parent "$@"
}
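Because the function forwards "$@", any extra arguments you pass go straight through to wget, so you can layer on options like --wait to be gentler on the server. Here's a stand-in that echoes the command instead of running it, just to show the pass-through (the offline_copy_dry name is mine, purely for illustration):

```shell
# Dry-run stand-in: prints the wget command instead of executing it.
offline_copy_dry() {
    echo wget --mirror --convert-links --adjust-extension --page-requisites --no-parent "$@"
}

offline_copy_dry --wait=1 https://laravel.com/docs/12.x
```

The printed command shows --wait=1 slotted in before the URL, exactly where the real helper would pass it to wget.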