How I Set Up a Local WordPress Dev Workflow Using a Raspberry Pi and Docker

For a while, I was doing something I knew was a bad idea — editing my live WordPress blog directly on the server. Tweaking plugins, testing themes, writing posts — all on production. One wrong move and the site goes down. Not ideal.

My blog runs at reuben.findingcities.in on a VPS, inside a Docker container, proxied through Apache. It works great. But the lack of a local development environment was always a nagging problem I kept pushing off.

So I finally fixed it. Here’s how.

The Goal

I wanted a simple workflow:

  1. Run a local copy of WordPress on my home network
  2. Make all changes locally — write posts, install plugins, tweak themes
  3. Push everything to the live server when ready with a single command

No paid tools. No complicated CI/CD pipelines. Just a Raspberry Pi sitting on my desk and a couple of bash scripts.

Why the Raspberry Pi?

I already had a Raspberry Pi running CasaOS with Portainer for managing Docker containers. It’s always on, uses almost no power, and is accessible from anywhere via Tailscale. It was the obvious choice for a local dev server — much better than running WordPress on my laptop and worrying about it going to sleep mid-transfer.

The Setup

The local WordPress instance runs inside Docker on the Pi, just like the live version runs on the VPS. Same WordPress image, same database structure. This means there are no surprises when pushing to live — what you see locally is exactly what you get on the server.

I deployed it as a stack in Portainer — just paste the docker-compose, hit deploy, and WordPress is running locally within a minute.
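For reference, a minimal compose file for a stack like this might look as follows. The service names, port, and passwords here are placeholders, not the exact values from my Portainer stack, and I've used MariaDB since it runs well on the Pi's ARM hardware:

```yaml
# Minimal sketch of a local WordPress stack -- names and credentials
# are illustrative; change the password before deploying.
services:
  wordpress:
    image: wordpress:latest
    ports:
      - "8080:80"          # local dev URL becomes http://<pi-address>:8080
    environment:
      WORDPRESS_DB_HOST: db
      WORDPRESS_DB_USER: wordpress
      WORDPRESS_DB_PASSWORD: changeme
      WORDPRESS_DB_NAME: wordpress
    volumes:
      - wp_content:/var/www/html/wp-content
    depends_on:
      - db
  db:
    image: mariadb:10.11   # ARM-friendly MySQL-compatible database
    environment:
      MYSQL_DATABASE: wordpress
      MYSQL_USER: wordpress
      MYSQL_PASSWORD: changeme
      MYSQL_ROOT_PASSWORD: changeme
    volumes:
      - db_data:/var/lib/mysql
volumes:
  wp_content:
  db_data:
```

Named volumes keep the database and wp-content persistent across container restarts, which matters since wp-content is exactly what gets synced to the live server.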

Both machines are connected via Tailscale, so the Pi can talk to the VPS securely without any open ports or firewall changes.

The Migration Approach

The two things that need to sync between local and live are:

The database — this is where all your posts, settings, menus, and plugin configurations live. The tricky part is that the database contains your site’s URL baked into it. So when pushing from local to live, you need to swap the local URL for the live one before importing.

wp-content — this folder contains everything else: your themes, plugins, and all uploaded media. It’s just files, so it gets packaged into a tarball and transferred across.

WordPress core files don’t need migrating at all — both containers run the same Docker image so they’re always identical.
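The URL swap itself is just a text replacement over the SQL dump. A minimal sketch, using a made-up local dev URL (your own addresses will differ), with a fake one-line dump standing in for the real mysqldump output:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Hypothetical URLs -- substitute your own local and live addresses.
LOCAL_URL="http://raspberrypi.local:8080"
LIVE_URL="https://reuben.findingcities.in"

# Fake one line of a dump to demonstrate the swap; the real input is
# the full mysqldump output.
echo "INSERT INTO wp_options VALUES ('siteurl','$LOCAL_URL');" > local.sql

# Replace every occurrence of the local URL with the live one.
# Using '|' as the sed delimiter avoids having to escape the
# slashes that appear in URLs.
sed -i "s|$LOCAL_URL|$LIVE_URL|g" local.sql

cat local.sql
```

The `|` delimiter trick is worth remembering: with the default `/` delimiter, every slash in the URLs would need escaping.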

The Script

Rather than doing this manually every time, I wrote a bash script called push-to-live.sh that handles the entire process in one command.

Here’s what it does, step by step:

  1. Backs up the live site first — before touching anything on the server, it creates a timestamped backup of both the live database and wp-content. So you can always roll back.
  2. Exports the local database — dumps the MySQL database from the local Pi container.
  3. Swaps the URLs — replaces the local dev URL with the live production URL in the SQL file using sed.
  4. Uploads the database — transfers the SQL file to the VPS over SSH.
  5. Packages and uploads wp-content — tarballs the entire wp-content folder and transfers it across.
  6. Imports on the live server — imports the database and extracts wp-content into the live WordPress container.
  7. Fixes permissions — makes sure Apache can read all the files.
  8. Cleans up — removes temp files from the server.
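The steps above can be sketched roughly like this. Every hostname, container name, path, and password below is an assumption standing in for my real values, and the sketch defaults to a dry run that prints each command instead of executing it (set DRY_RUN=0 to actually push):

```shell
#!/usr/bin/env bash
# Rough sketch of push-to-live.sh -- all names, paths, and credentials
# here are placeholders, not the real script's values.
set -euo pipefail

VPS="vps"                                  # assumed Tailscale hostname of the VPS
LOCAL_URL="http://raspberrypi.local:8080"  # assumed local dev URL
LIVE_URL="https://reuben.findingcities.in"
STAMP="$(date +%Y%m%d-%H%M%S)"
DRY_RUN="${DRY_RUN:-1}"

run() {  # print the command in dry-run mode, execute it otherwise
  if [[ "$DRY_RUN" == "1" ]]; then echo "+ $*"; else "$@"; fi
}

# 1. Back up the live database and wp-content first, timestamped.
run ssh "$VPS" "docker exec wp-db mysqldump -u root -pchangeme wordpress > backup-$STAMP.sql && tar czf wp-content-$STAMP.tar.gz -C /srv/wordpress wp-content"

# 2. Export the local database from the Pi's container.
run sh -c "docker exec wp-db mysqldump -u root -pchangeme wordpress > local.sql"

# 3. Swap the local URL for the live one inside the dump.
run sed -i "s|$LOCAL_URL|$LIVE_URL|g" local.sql

# 4.-5. Package wp-content, then upload it and the dump over SSH.
run sh -c "tar czf wp-content.tar.gz -C /srv/wordpress wp-content"
run scp local.sql wp-content.tar.gz "$VPS:/tmp/"

# 6.-7. Import the database, unpack wp-content, fix ownership for Apache.
run ssh "$VPS" "docker exec -i wp-db mysql -u root -pchangeme wordpress < /tmp/local.sql && tar xzf /tmp/wp-content.tar.gz -C /srv/wordpress && chown -R www-data:www-data /srv/wordpress/wp-content"

# 8. Clean up the temp files on the server.
run ssh "$VPS" "rm /tmp/local.sql /tmp/wp-content.tar.gz"
```

The dry-run wrapper is a cheap safety net: you can eyeball exactly what would run against the live server before committing to it.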

The whole thing runs in a couple of minutes depending on how much media you have.

A Few Things I Learned Along the Way

Always back up before pushing. The script does this automatically, but I learned this the hard way before the script existed. Always have a timestamped backup you can restore from.

The database URL swap is the most important step. WordPress stores the site URL in the database, not just in config files. If you forget to swap the local URL to the live URL before importing, your live site will try to load from your local address and break completely. One caveat worth knowing: some plugins store URLs inside serialized PHP arrays, where a plain sed replacement can corrupt the recorded string lengths. If that bites you, WP-CLI's wp search-replace handles serialized data correctly.

wp-content contains everything you care about. Plugins, themes, and all your uploaded images live in wp-content. WordPress core files are the same everywhere since they come from the Docker image — you only need to sync wp-content and the database.

Tailscale makes this incredibly easy. Without Tailscale I’d have to deal with open SSH ports, firewall rules, or a VPN setup. With Tailscale both machines are just on the same private network. The script uses SSH and SCP as if they were on the same LAN.

The Result

The workflow is now exactly what I wanted:

Write and test locally → Run push-to-live.sh → Live site updated

I can install a new plugin locally, make sure it doesn’t break anything, then push. I can write a blog post in the local wp-admin without any risk to the live site. And every push creates a backup automatically, so I have a full history of the site I can restore from if anything goes wrong.

The script and setup instructions are on GitHub if you want to use this for your own setup:
👉 https://github.com/reubenology/WordPress-Docker-Migration-Scripts

What’s Next

The next step is integrating this with my n8n automation workflows — specifically pushing blog posts written in Google Docs directly to WordPress via the REST API. But that’s a story for another post.

If you’re running a self-hosted WordPress setup and want a simple local dev workflow without paying for tools like WP Engine or Local by Flywheel, I hope this helps. Feel free to reach out via reuben.findingcities.in if you have questions.


Reuben Noronha is the Managing Director of Proximite Group, a UAE-based digital marketing and business consultancy. He writes about self-hosted tech, marketing automation, and open-source tools.
