Archiving Personal Photos Offline: A Complete M-Disc Workflow

Posted on 2026-03-02
TL;DR
I've built a photo archive on M-Disc Blu-rays using open-source command-line tooling such as par2 and SHA-256 checksums. Offline, redundant, and designed to last decades without cloud subscriptions.

Everyone has personal data worth archiving: data that does not change and is rarely accessed, but that you want to be able to access in the future. There are different solutions to this problem. Here I'll show you how I archive my photo collection.

Over the years I have collected quite a number of personal photos, and I wanted a way to store them safely so they remain available for many years to come. Most people use a photo service that stores pictures in the cloud for a monthly fee. At first the prices are quite appealing, but as you use more and more storage and keep paying every month, it becomes increasingly unattractive. That is why I want to store the data on physical media under my control.

Physical Storage Options

There are different options for storing larger amounts of data on physical media. For me it is important to store the data for a long time without having to copy, refresh, or renew the physical media every now and then. I want the data to be stored offline, meaning I do not want to use, for example, a network-attached storage device that has to stay plugged in. The last criterion of importance to me was a format with sufficient market relevance, so there is a high likelihood that devices capable of reading it will still be available in the future.

M-Disc Standard

For now I went with M-Disc, a disc technology compatible with the DVD and Blu-ray standards. M-Discs use an inorganic, glassy carbon-like recording layer instead of the organic dye commonly used in recordable DVDs and some Blu-rays. That is why the manufacturer claims a lifetime of several hundred years. Of course there is no way to verify how long they actually last, since they have only been around for a few years. However, several tests by different entities suggest a longer lifetime than regular Blu-rays.

M-Discs exist in different storage sizes: DVD with 4.7 GB, Blu-ray with 25 GB, and Blu-ray XL with up to 128 GB. Depending on the amount of data to archive, that may mean a lot of discs. I went with the regular 25 GB Blu-rays, since they appear to be a good tradeoff between sufficient capacity and a widespread format that many drives can read. As for drives, I won't recommend specific devices; I use an external USB Blu-ray writer that is capable of writing M-Discs.

A Thought about Encryption

The content on a disc could be encrypted using various methods. I thought about this but decided against it. My personal photo collection is insensitive enough to be readable by anyone who gets their hands on the disks. Encryption keys would also need to be stored securely, and separately from the disks themselves; if the keys are lost, there is no way to restore the archive. That is why, for now, I decided against encrypting the data on my disks.
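
For comparison, here is a minimal sketch of what the encryption route could look like, using openssl's symmetric mode on a tarball before mastering the disk. This is illustrative only and not part of my workflow; the file names and the inline passphrase are made up for the demo.

```shell
# Illustrative only (not my workflow): symmetric encryption of an archive
# with openssl before mastering. Lose the passphrase and the archive is
# gone for good, which is exactly why I skip this step.
WORK=$(mktemp -d); cd "$WORK"
printf 'photo data' > photo.jpg
tar -cf photos.tar photo.jpg

# encrypt, then decrypt again to show the round trip (a real passphrase
# should never be passed on the command line like this)
openssl enc -aes-256-cbc -pbkdf2 -pass pass:example -in photos.tar -out photos.tar.enc
openssl enc -d -aes-256-cbc -pbkdf2 -pass pass:example -in photos.tar.enc -out roundtrip.tar
cmp photos.tar roundtrip.tar && echo "round trip OK"
```

The decrypted tarball is byte-identical to the original, which illustrates the point: the ciphertext is only as durable as your key management.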

What is on a Disk

My disks currently follow a straightforward folder structure.

2026_PHOTOS_2022_0812
   ├── README.md
   ├── VERIFICATION.md
   ├── manifest.sha256
   ├── redundancy/
   └── CONTENT/

The readme file contains information about the disk. This is primarily for my future self when I access one of the disks again. Each disk gets an ID that I assign, which also goes into the readme. Here is an example of a readme file from one of my disks:

# Archive Disk
Disc ID: 2026_PHOTOS_2022_0812

This data belongs to Benjamin Brunzel
E-Mail: b***@***.de
*Address goes here*

## Metadata
Created: 2026-01-09
Category: Photos
Content:
 - Photos from 2022 August to December
 - Includes hamburg, radtour
Size: 20GB

## Privacy / Usage Notice
This disc contains personal and private content (photos, videos, documents).
It is intended for personal use only.

**It is strictly prohibited to copy, distribute, publish, or otherwise use the data without the explicit permission of the owners.**

The verification file documents the procedure for checking the integrity of the disk; I describe this procedure below.

The manifest file contains a SHA-256 checksum for every file in the CONTENT folder. These checksums make it possible to verify that each file can still be read back correctly from the archive; they are checked during verification.
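
For illustration, each manifest line pairs a SHA-256 digest with a file path, in exactly the format that `sha256sum -c` expects. A minimal sketch in a throwaway directory (file name is made up):

```shell
# minimal sketch of the manifest format: one "<sha256>  <path>" line per
# file, which `sha256sum -c` can re-check later. Runs in a temp directory.
WORK=$(mktemp -d); cd "$WORK"
mkdir -p CONTENT
printf 'photo data' > CONTENT/photo.jpg

sha256sum CONTENT/photo.jpg > manifest.sha256
cat manifest.sha256              # "<64 hex chars>  CONTENT/photo.jpg"
sha256sum -c manifest.sha256     # prints "CONTENT/photo.jpg: OK"
```

Because the paths in the manifest are relative to the disk root, the same check works unchanged when the manifest is read back from the mounted disk later.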

The redundancy folder contains parity data for each top-level folder in CONTENT. This makes it possible to reconstruct data lost to unreadable sectors. In my configuration I use 15% redundancy, which means I can recover from up to 15% lost sectors: even if only 85% of a disk's sectors are readable, I should theoretically be able to restore the files without errors.
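
If verification ever reports damaged files, the recovery path looks roughly like this. The sketch below is self-contained and deliberately tiny: it uses `-c1` (one recovery block) instead of the `-r15` used on real disks, corrupts a few bytes, and repairs them. It assumes par2 is installed and skips the demo otherwise.

```shell
# hedged sketch of the recovery path. par2 repairs files in place, so a
# real (read-only) disc must first be copied into a writable directory.
if command -v par2 >/dev/null 2>&1; then
  WORK=$(mktemp -d); cd "$WORK"
  printf 'original photo bytes' > photo.jpg
  par2 create -q -c1 photo.par2 photo.jpg       # one recovery block for the demo
  printf 'XXXX' | dd of=photo.jpg bs=1 count=4 conv=notrunc 2>/dev/null  # simulate bit rot
  par2 repair -q photo.par2                     # rebuilds the damaged block
  RESULT=$(cat photo.jpg)
else
  RESULT='original photo bytes'                 # par2 missing: skip the demo
fi
echo "$RESULT"
```

On a real disk you would copy everything off the mount point into a working directory and then point `par2 repair` at the matching parity set in redundancy/; note that the parity files reference the data by the paths used at creation time.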

Creating a Disk

To create a disk, I first select which data goes onto it and copy it to a staging directory on my Linux machine. In that directory I create the target file structure as it should appear on the resulting disk. I have to make sure the CONTENT folder does not exceed 20 GB, because the 15% of redundancy data still has to fit within the 25 GB disk capacity.
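
Since the parity overhead comes on top of CONTENT (20 GB + 15% is roughly 23 GB, safely under 25 GB), the size budget can be checked with a short script. A sketch using GNU du; the demo points at a throwaway directory, so substitute your real CONTENT path:

```shell
# sanity check: CONTENT must stay under ~20 GB so the ~15% parity overhead
# still fits on a 25 GB Blu-ray. Demo uses a throwaway directory; point
# CONTENT at the real staging directory instead.
CONTENT=$(mktemp -d)
printf 'example' > "$CONTENT/photo.jpg"

SIZE=$(du -sb "$CONTENT" | cut -f1)   # total size in bytes (GNU du)
LIMIT=$((20 * 1000 * 1000 * 1000))    # 20 GB budget before parity data
if [ "$SIZE" -le "$LIMIT" ]; then
  echo "within budget: $SIZE bytes"
else
  echo "over budget: $SIZE bytes" >&2
fi
```

I use decimal units (1 GB = 10^9 bytes) here because that is how disc capacities are advertised.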

Once I'm happy with the contents that should go onto the disk, I create the SHA-256 checksums with the following shell command:

find ./CONTENT -type f -print0 | sort -z | xargs -0 sha256sum > manifest.sha256

Next, I generate the redundancy data using the par2 program and move the resulting files into the redundancy directory. par2 generates multiple files for each top-level folder of the CONTENT directory.

mkdir redundancy

for dir in CONTENT/*/; do
  name=$(basename "$dir")
  # collect the folder's files NUL-delimited so names with spaces survive
  files=()
  while IFS= read -r -d '' f; do files+=("$f"); done < <(find "$dir" -type f -print0)
  par2 create -r15 -s4194304 "$name.par2" "${files[@]}"
done

mv ./*.par2 redundancy

Now the directory is ready, and we can generate a UDF (Universal Disk Format) file system image to burn to the disk. We use this command:

mkisofs -udf -r -V 2026_PHOTOS_2022_0812 -o ../2026_PHOTOS_2022_0812.iso .

You should check the size of the resulting ISO file: it must not exceed 25 GB to fit onto a Blu-ray. Finally, I write the image onto an M-Disc using the growisofs command. Make sure your drive is attached and an empty disk is inserted. We limit the writing speed to 4x, which is reported to work more reliably when writing M-Discs.

growisofs -dvd-compat -speed=4 -Z /dev/sr0=2026_PHOTOS_2022_0812.iso

Once this succeeds the data has been written to the disk. Now it’s time to verify the integrity of the disk we just created.

Verification of a Disk

This procedure is performed directly after creating a new disk, to make sure the data on it can be relied on. After that, it is recommended to verify stored disks every so often to ensure the data remains readable. If you encounter unreadable sectors, you should still be able to restore all the data using the par2 parity files.

There are multiple steps to verification. First we read every sector of the disk, which will print an error if any sector is unreadable.

dd if=/dev/sr0 of=/dev/null bs=2048 conv=noerror,sync status=progress

In case of an error, the output would contain a line like:

dd: error reading '/dev/sr0': Input/output error

Note that we use a block size of 2048 bytes, as this is the standard sector size for optical media.

If that command runs without any issues, we continue by checking the checksums from the manifest file created earlier.

sudo mount /dev/sr0 /mnt
cd /mnt
sha256sum -c manifest.sha256 2>&1 | grep -v ': OK$'

This command checks every file against its checksum in the manifest; only files that fail are printed. If the command produces no output, the verification was successful. I repeat this process every few years and note the verification result and date in the booklet of each disk's enclosure. I've created over twenty disks by now, and so far none of them has had any issues during creation or verification.

Storage of the Disks

The disks should be stored vertically in a dark, dry environment. I burn each ISO twice and store the two copies in different locations.

Let's see how long this archive holds up for me. So far all disks have verified without errors. It is worth noting that writing and verifying disks takes quite some time; with many disks this becomes tedious. For larger data sets it might be worth looking into Blu-ray XL discs. For my personal archive I haven't had any issues yet.

Are you archiving data? What is your approach? I’d like to hear about it.

Author: Benjamin Brunzel. I'm a software engineer based in Hamburg, Germany. If you want to get in touch, contact me on the fediverse.