The problem is the other solutions are just not quite there yet for me. I really tried, but I'm probably just too accustomed to Notion and its features.
One of the issues with Notion described in that post was the performance - or the lack of it. It feels like in the last few days the team behind Notion shipped some backend improvements (including a database upgrade). It definitely feels snappier now. Searching is faster, databases load quicker and you can actually use templates now without waiting forever for them to be set up in the page you're trying to create.
But this post is not really about this, so let's not get into too much detail and focus on our topic.
In Notion's settings there is an Export all workspace content button. For backup purposes I used it regularly, but depending on the size of your workspace it can take a while, plus it's a pretty manual process that you have to constantly remind yourself about.
And I'm the type of guy that tries to automate everything away so I can focus on the important things in my life (plus it's fun to do for an engineer, let's be honest).
For this reason I have written a small open source command line application in Go that can do just that: back up your Notion workspace or just specific parts of it - automatically. And thanks to Go it's platform independent, so you can even run it on macOS or Windows (there are pre-built binaries in the repository, if you don't want to build it yourself).
You can find it on GitHub.
The easiest way to run it is by downloading a pre-built binary, but since it's open source you can of course build it yourself.
Then you have to set at least 2 environment variables and you're good to go.
This is the Notion page ID you want to set up for the automatic backup
On every page you can hit those 3 dots in the corner and click Copy link
Alternatively, these 3 dots are also in the sidebar for every page if you hover with your mouse over them
The page ID can look something like this
You can either set this env variable to the whole URL or to just the ID part at the end of it (the part after the last dash).
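As a quick sketch (assuming the usual Notion URL shape, where the page ID is the 32-character part after the last dash), you can extract the ID in the shell like this - the URL below is a made-up example:

```shell
# Made-up example URL; real Notion links end in a 32-char hex page ID.
URL="https://www.notion.so/My-Page-0123456789abcdef0123456789abcdef"

# Strip everything up to and including the last dash.
PAGE_ID="${URL##*-}"
echo "$PAGE_ID"
```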
On Linux-based systems you can put them into your .*rc files (.bashrc, .zshrc, ...), so they are always available.
After that, source your .*rc file and you're already set up.
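For example, the relevant lines in your .*rc file might look like this - note that the variable names and values here are placeholders, so check the repository's README for the real ones:

```shell
# Placeholder variable names and values -- use the ones the tool's
# README actually documents.
export NOTION_TOKEN="your-token-here"
export NOTION_PAGE_ID="0123456789abcdef0123456789abcdef"
```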
On Windows (in cmd.exe or PowerShell), you can use the setx command to permanently set environment variables.
After that, you need to close that cmd.exe or PowerShell window and open another one for these changes to take effect.
There are 2 other environment variables you can set for this application, but they have sensible defaults and don't necessarily need to be set (check the GitHub repository if you want to set them anyway).
Now you can execute the binary by calling it from your terminal (assuming you have already cd'd into that folder).
That's pretty much it. Once the application has finished downloading your backup (it's gonna be a .zip file) you can do whatever you want with it, e.g. integrate it into your backup solution and forget about it.
You could also automatically unzip it after every export so your backup solution can deduplicate the files better. There are endless possibilities. The important thing is just to have a backup in case something happens.
I keep it pretty simple.
I have set up a cron job for once per week that does a backup of my workspace and then pushes the file into my encrypted Google Drive folder with a command-line sync tool. The contents of my encrypted Google Drive folder are also replicated into other backup systems, but that's irrelevant for this post.
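A minimal sketch of such a crontab entry - the paths, binary name, and remote name are all assumptions here, and I'm using rclone purely as an example of a sync tool that supports encrypted Google Drive remotes:

```shell
# m h dom mon dow  command  (hypothetical paths and names)
# Every Sunday at 03:00: run the backup, then copy the result
# to an encrypted rclone remote.
0 3 * * 0 cd /home/me/notion-backup && ./notion-backup && rclone copy . gdrive-crypt:notion-backups
```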
I hope you found this blog post and the tool useful. If there's only one thing to take away from this post, it's to always have backups for the worst-case scenario.