Migrate from OSS to Pro
Step-by-step guide for migrating from OSS Password Pusher to Self-Hosted Pro
Overview
This guide walks you through migrating your data from the open source Password Pusher to Self-Hosted Pro. The migration process exports all your users, pushes, audit logs, and file attachment references to a JSON file, which is then imported into your Pro instance.
Sensitive data in the export file (push payloads, notes, passphrases) remains encrypted with your OSS instance’s key. During import, Pro uses the OSS master key you provide to decrypt that data and re-encrypt it with Pro’s key. That is why the OSS master key is required for the import step.
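The re-key step can be pictured with a toy `openssl` sketch. This is an illustration only, not Pro's actual mechanism (Pro uses Lockbox internally), and the keys and paths here are placeholders:

```shell
# Toy re-key demo: decrypt with the old (OSS) key, re-encrypt with the new (Pro) key.
echo "my secret payload" > /tmp/plain.txt
openssl enc -aes-256-cbc -pbkdf2 -k old-key -in /tmp/plain.txt -out /tmp/oss.enc
openssl enc -d -aes-256-cbc -pbkdf2 -k old-key -in /tmp/oss.enc \
  | openssl enc -aes-256-cbc -pbkdf2 -k new-key -out /tmp/pro.enc
# After re-keying, only the new key can recover the payload:
openssl enc -d -aes-256-cbc -pbkdf2 -k new-key -in /tmp/pro.enc
# prints "my secret payload"
```

This is why the import cannot proceed without the OSS key: without it, the ciphertexts in the export file cannot be decrypted for re-encryption.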
What gets migrated:
- Users — All user accounts with login credentials preserved
- Pushes — All push records (text, file, URL, QR) with encrypted payloads
- Audit Logs — Complete audit trail history
- File Attachments — Active Storage references (actual files stay in place)
What changes during migration:
- Push expiration changes from `expire_after_days` to Pro’s duration-based system
- Encrypted payloads are re-encrypted with your Pro instance’s master key
What is not migrated:
- API tokens — Not migrated; users must regenerate API tokens in Pro after migration (Account → API Tokens).
Important: Always create a complete backup before starting the migration. The import runs in a database transaction and is rolled back automatically if it fails, but backups are still essential.
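For example, if your OSS instance uses Postgres, a backup might look like the sketch below. The `db`, `pwpush_user`, and `pwpush_db` names are placeholders for your deployment; SQLite deployments can instead copy the database file while the app is stopped.

```shell
# Dump the OSS database to a timestamped SQL file before migrating.
# -T disables the pseudo-TTY so the dump is not corrupted by terminal output.
docker compose exec -T db pg_dump -U pwpush_user pwpush_db \
  > "pwpush_backup_$(date +%Y%m%d_%H%M%S).sql"
```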
Prerequisites
Before beginning, ensure you have:
- OSS Password Pusher running version 1.68.2 or later
- Pro Self-Hosted instance set up and running (fresh installation)
- OSS encryption key — Printed in the export output (Step 1). Required at import so Pro can decrypt sensitive data in the export file and re-encrypt it with Pro’s key.
- Storage for file attachments:
- Starter (standard) container: Only local storage is supported. If OSS used cloud storage (S3, GCS, Azure, etc.), the import will skip file attachments and list the affected pushes. Those pushes will not have files in Pro—you can recreate them as needed, or import into Advanced or Enterprise and configure storage to migrate the files.
- Advanced/Enterprise: Configure Pro’s storage (S3, GCS, or Azure) to use the same bucket and credentials as OSS.
- Local disk: Same storage volume (or copied files) accessible to Pro at its storage path.
Migration Steps
Step 1: Export Data from OSS
On your OSS Password Pusher instance, run the export task:
# Docker
docker exec -it pwpush bin/rails pwpush:export
# Docker Compose
docker compose exec pwpush bin/rails pwpush:export
This creates a JSON export file inside the OSS container at /opt/PasswordPusher/tmp/:
============================================================
Password Pusher OSS -> Pro Migration Export
============================================================
Gathering data...
Exporting 150 users...
Exporting 5000 pushes...
Exporting 25000 audit logs...
Exporting 200 active storage blobs...
Exporting 200 active storage attachments...
============================================================
Export Complete!
============================================================
Export file: /opt/PasswordPusher/tmp/pwpush_export_20260221_120000.json
Storage backend: amazon
Record counts:
- users: 150
- pushes: 5000
- audit_logs: 25000
- active_storage_blobs: 200
- active_storage_attachments: 200
NEXT STEPS:
------------------------------------------------------------
1. Copy this JSON file to your Pro instance
2. Configure Pro storage to access the same files (see Step 3). For cloud storage, use Advanced or Enterprise and set Pro’s S3/GCS/Azure service to the same bucket and credentials.
3. Run the import task in Pro (prefer passing the key via `PWPUSH_OSS_MASTER_KEY` to avoid shell history):
PWPUSH_OSS_MASTER_KEY=your_key bin/rails 'pwpush:import[/path/to/export.json]'
or: bin/rails 'pwpush:import[/path/to/export.json,YOUR-OSS-MASTER-KEY]'
Your OSS master (encryption) key for this instance:
749b1022e1cb83fb04f3022eacaf3bfef60c6d47f83e6fb41f534a05fc69929f
Use this value as YOUR-OSS-MASTER-KEY (decrypts and re-encrypts push payloads).
============================================================
Step 2: Copy Export File to Pro Instance
The export file was created inside the OSS container at /opt/PasswordPusher/tmp/. You need to copy it out of the OSS container and then into your Pro container.
Copy the file out of the OSS container
First, copy the export file from inside the OSS container to your local machine:
# Docker
docker cp pwpush-oss:/opt/PasswordPusher/tmp/pwpush_export_20260221_120000.json ./
# Docker Compose
docker compose cp pwpush:/opt/PasswordPusher/tmp/pwpush_export_20260221_120000.json ./
Replace pwpush_export_20260221_120000.json with the actual filename shown in the export output, and adjust the container or service name (pwpush, pwpush-oss) to match your deployment.
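Optionally, confirm the copy is complete, well-formed JSON before moving it further. A self-contained illustration (assumes `python3` is available on your machine; run the same `python3 -m json.tool` check against your real export file):

```shell
# A truncated or corrupted copy fails JSON parsing; a complete copy passes.
printf '{"users": [], "pushes": []}' > /tmp/complete.json
python3 -m json.tool /tmp/complete.json > /dev/null && echo "valid JSON"
# prints "valid JSON"
```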
Copy the file into the Pro container
Next, copy the export file into your Pro container:
# Docker
docker cp ./pwpush_export_20260221_120000.json pwpush-pro:/opt/PasswordPusher/tmp/
# Docker Compose
docker compose cp ./pwpush_export_20260221_120000.json pwpush-pro:/opt/PasswordPusher/tmp/
If your OSS and Pro containers are on different servers, transfer the file between servers first using scp or similar:
# Copy from local machine to remote server running Pro
scp ./pwpush_export_20260221_120000.json user@pro-server:~/
Then on the Pro server, copy the file into the container.
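When transferring between servers, it is worth verifying the file arrived intact: run `sha256sum` on the export file on both machines and compare the digests. A self-contained illustration of the check (file names here are stand-ins):

```shell
# Identical files produce identical SHA-256 digests; any corruption changes the digest.
echo "example export contents" > /tmp/export_copy_a.json
cp /tmp/export_copy_a.json /tmp/export_copy_b.json
a=$(sha256sum /tmp/export_copy_a.json | cut -d' ' -f1)
b=$(sha256sum /tmp/export_copy_b.json | cut -d' ' -f1)
[ "$a" = "$b" ] && echo "checksums match"
# prints "checksums match"
```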
Step 3: Configure Storage (If Using File Attachments)
Starter container: If your OSS export used non-local storage (e.g. S3, B2, MinIO), the import will skip file attachments and report which pushes were affected. To migrate those files, use the Advanced or Enterprise container and configure storage as below.
Local Disk Storage
OSS and Pro use different subpaths. Either mount OSS storage at Pro’s path or copy the files:
| Edition | File Storage Path |
|---|---|
| OSS | /opt/PasswordPusher/storage/ |
| Pro | /opt/PasswordPusher/storage/files/ |
Option A — Mount OSS volume at Pro’s path
# docker-compose.yml for Pro
volumes:
- pwpush-oss-storage:/opt/PasswordPusher/storage/files
Option B — Copy files into Pro
docker cp pwpush-oss:/opt/PasswordPusher/storage/. ./oss-files/
docker cp ./oss-files/. pwpush-pro:/opt/PasswordPusher/storage/files/
docker compose exec pwpush-pro chown -R 1000:1000 /opt/PasswordPusher/storage/files
Cloud Storage (Advanced/Enterprise)
In Pro, storage is configured in Administration Center → Settings → File Storage.
- S3-compatible (Amazon S3, Backblaze B2, MinIO, R2, etc.): Configure Pro’s S3 service with the same bucket, endpoint, and credentials as OSS.
- Google Cloud / Azure: Configure Pro’s GCS or Azure service with the same bucket/container and credentials.
Files stay in your existing bucket; Pro only needs the same credentials and bucket so imported blob references resolve.
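As a quick sanity check before importing, you can confirm the credentials you gave Pro can actually see the OSS files, for example with the AWS CLI for S3-compatible storage (the bucket name is a placeholder; use the equivalent `gcloud` or `az` command for GCS or Azure):

```shell
# Listing succeeds only if the credentials and bucket match what OSS used.
aws s3 ls s3://your-oss-bucket/ | head -5
```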
Step 4: Import Data to Pro
On your Pro instance, run the import task with the export file path. Pass the OSS master key via the PWPUSH_OSS_MASTER_KEY environment variable (recommended, avoids shell history) or as the second argument:
# Docker — use env var (recommended)
docker exec -it -e PWPUSH_OSS_MASTER_KEY=your_oss_key pwpush-pro bin/rails 'pwpush:import[/opt/PasswordPusher/tmp/pwpush_export_20260221_120000.json]'
# Docker Compose — use env var (recommended)
docker compose exec -e PWPUSH_OSS_MASTER_KEY=your_oss_key pwpush-pro bin/rails 'pwpush:import[/opt/PasswordPusher/tmp/pwpush_export_20260221_120000.json]'
# Or pass key as second argument
docker compose exec pwpush-pro bin/rails 'pwpush:import[/opt/PasswordPusher/tmp/pwpush_export_20260221_120000.json,YOUR-OSS-MASTER-KEY]'
You’ll see progress output:
============================================================
Loading export file...
============================================================
Import from OSS Password Pusher
============================================================
Export metadata:
Version: 1.68.2
Exported at: 2026-02-21T12:00:00Z
Schema version: 1
Storage backend: amazon
Record counts:
- users: 150
- pushes: 5000
- audit_logs: 25000
- active_storage_blobs: 200
- active_storage_attachments: 200
IMPORTANT: Ensure your Pro instance is configured with storage backend:
amazon
============================================================
Starting import...
Importing users...
Users: 150/150 (100.0%)
-> 140 users created, 10 matched to existing accounts
Importing pushes...
Pushes: 5000/5000 (100.0%)
-> 5000 pushes imported, 0 skipped (already exist)
Importing audit logs...
Audit logs: 25000/25000 (100.0%)
-> 25000 audit logs imported
Importing Active Storage records...
Blobs: 200/200 (100.0%)
Attachments: 200/200 (100.0%)
-> 200 blobs, 200 attachments imported
============================================================
Import completed successfully!
============================================================
Summary:
- Users created: 140
- Users matched to existing: 10
- Pushes imported: 5000
- Pushes skipped (already exist): 0
- Audit logs imported: 25000
- Active Storage blobs imported: 200
- Active Storage attachments imported: 200
If you use the Starter container and OSS used cloud storage, the import skips file attachments and prints a notice listing the affected pushes. Those pushes will not have files in Pro; you can recreate them as needed, or import into Advanced or Enterprise and configure storage to migrate the files.
Step 5: Verify the Migration
After import completes:
- Test user login — Log in with an existing user’s credentials
- Reverify admin users — In Administration Center (open the app menu and choose Administration Center, or go to `/admin`), go to Users. Confirm which users should have administrator access and add or remove admin rights as needed. Admin status is imported from OSS, but you should confirm it matches your expectations.
- Check existing pushes — Navigate to the dashboard and verify pushes appear
- Test push retrieval — Access an existing secret URL and verify it works
- Test file downloads — If you have file pushes, verify files can be downloaded
- Create new push — Create a new push to verify the system works end-to-end
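The push-retrieval check can be scripted as a quick smoke test from the command line. The URL below is a placeholder; substitute your Pro hostname and one of your migrated secret URLs, and expect HTTP 200:

```shell
# Prints only the HTTP status code for a migrated push URL.
curl -s -o /dev/null -w '%{http_code}\n' https://pro.example.com/p/example-token
```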
Troubleshooting
Error: “Export file not found”
Ensure the file path is correct and accessible from within the container:
# List files in tmp directory
docker compose exec pwpush-pro ls -la /opt/PasswordPusher/tmp/
# Verify file exists
docker compose exec pwpush-pro cat /opt/PasswordPusher/tmp/pwpush_export_*.json | head -20
Error: “Unsupported schema version”
The export file was created with a different version of the export task. Ensure both your OSS and Pro instances are running compatible versions.
Note: “Users already exist with matching email addresses”
This is informational, not an error. When the import finds users in Pro whose email addresses match users in the OSS export, it will:
- Match the OSS user to the existing Pro user
- Import their pushes, audit logs, and file attachments to the existing user’s account
- Skip creating duplicate user records
This allows you to import data even if some users already exist in Pro.
Note: “Pushes have URL tokens that already exist” or “skipped (invalid url_token)”
Informational, not errors. The import skips pushes whose URL token already exists in Pro (e.g. from a previous import) or whose URL token is blank/invalid. It continues with all other pushes.
Error: “Failed to re-key ciphertext”
The OSS master key provided doesn’t match the key used to encrypt the data.
Solutions:
- Use the key from the export output (Step 1). If you no longer have it, retrieve it on the OSS instance: `docker compose exec pwpush bin/rails runner "puts Lockbox.master_key"` (or `docker exec -it pwpush ...` for Docker).
- Check if OSS was using the default key or if `PWPUSH_MASTER_KEY_PREVIOUS` was configured (key rotation).
File Attachments Not Working
Starter container and OSS used cloud storage: File attachments are not imported. The import output lists the affected pushes. Those pushes will not have files in Pro—recreate them as needed, or import into the Advanced or Enterprise container (configure storage to match OSS, e.g. Pro’s S3 service pointing at your OSS bucket) and re-run the import to migrate the files.
Files in storage but not accessible after migration:
- Advanced/Enterprise: In Administration Center → File Storage, ensure the service (S3, GCS, or Azure) uses the same bucket and credentials as OSS.
- For local storage, ensure the volume is mounted at Pro’s storage path or files were copied there. Check that files have the same uid/gid as the Pro container (default 1000:1000). If they don’t, fix ownership from the host or from inside the container, e.g. `docker compose exec pwpush-pro chown -R 1000:1000 /opt/PasswordPusher/storage/files`.
- Test access: `docker compose exec pwpush-pro bin/rails runner "puts ActiveStorage::Blob.first.url"`
Import Fails Partway Through
The import runs in a database transaction. If it fails, all changes are rolled back.
To retry:
- Check the error message for the specific issue
- Fix the underlying problem
- Run the import again — it’s safe to re-run on a clean database
Post-Migration Checklist
After confirming everything works:
- Verify all user accounts can log in
- Test creating and viewing pushes
- Test file uploads and downloads (if using file pushes)
- Verify audit logs appear in push details
- Test the admin dashboard
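If you want to double-check the import summary against the database, a sketch (the `User` and `Push` model names are assumptions about Pro's schema; adjust if your instance differs):

```shell
# Compare these counts with the import summary printed in Step 4.
docker compose exec pwpush-pro bin/rails runner \
  "puts({users: User.count, pushes: Push.count}.inspect)"
```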
See Also
- Pro Self-Hosted — Overview of Pro Self-Hosted features