python - Django backup strategy with dumpdata and migrations
As in this question, I've set up a dumpdata-based backup system for my database. The setup is akin to running a cron script that calls dumpdata and moves the backup to a remote server, with the aim of simply using loaddata to recover the database. However, I'm not sure how this plays with migrations. loaddata now has an --ignorenonexistent switch to deal with deleted models/fields, but it is not able to resolve cases where columns were added with one-off defaults or where RunPython code was applied.
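For concreteness, such a setup might look like the following cron entry and script; the paths, schedule, and remote host here are illustrative assumptions, not details from the question:

# /etc/cron.d entry (assumed): dump nightly at 03:00
0 3 * * * someuser /path/to/backup.sh

# backup.sh (assumed name)
#!/bin/bash
cd /path/to/project
./manage.py dumpdata --indent 2 > /tmp/db-$(date +%F).json
scp /tmp/db-$(date +%F).json backup@remote.example.com:/backups/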
The way I see it, there are two sub-problems to address:

- Tag each dumpdata output file with the current version of each app
- Splice the fixtures into the migration path

I'm stumped as to how to tackle the first problem without introducing a ton of overhead. Would it be enough to save a file per backup containing an {app_name: migration_number} mapping?
The second problem, I think, is easier once the first one is solved, since the process is roughly the following (sketched after the list):

- Create a new database
- Run the migrations forward to the appropriate point for each app
- Call loaddata with the given fixture file
- Run the rest of the migrations
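Assuming the state.csv mapping from the sketch above is stored next to the fixture (both file names are mine), that process could look like:

#!/bin/bash
# replay each app's migrations up to the point recorded at dump time
createdb $database
while IFS=, read -r app migration; do
    ./manage.py migrate "$app" "$migration"
done < path/to/backup-dir/state.csv
# load the fixture, then run the rest of the migrations
./manage.py loaddata path/to/backup-dir/db.json
./manage.py migrate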
There's some code in this question (linked from a bug report) that I think could be adapted for this purpose.
Since these are regular/large snapshots of the database, I don't want to keep them as data migrations cluttering the migrations directory.
I am taking the following steps to backup, restore or transfer my PostgreSQL database between any instance of my project:
The idea is to keep the least possible number of migrations, as if manage.py makemigrations was run for the first time on an empty database.
Let's assume that we have a working database in our development environment. This database is a current copy of the production database, which should not be open to any changes. We have added models, altered attributes etc., and those actions have generated additional migrations.

Now the database is ready to be migrated to production, which -as stated before- is not open to the public and is not altered in any way. In order to achieve this:
- I perform the normal procedure in the development environment.
- I copy the project to the production environment.
- I perform the normal procedure in the production environment.

However, this means that while I make changes in our development environment, no changes should happen to the production database, because they would be overridden.
Normal procedure
Before anything else, I have a backup of the project directory (which includes a requirements.txt file), a backup of the database and -of course- git is a friend of mine.
1. I take a dumpdata backup in case I need it:

./manage.py dumpdata --exclude auth.permission --exclude contenttypes --exclude admin.logentry --exclude sessions --indent 2 > db.json

2. I take a pg_dump backup to use:

pg_dump -U $user -Fc $database --exclude-table=django_migrations > path/to/backup-dir/db.dump

3. I delete the migrations from every application.
In case the migrations folder is a symlink, I use the following script:

#!/bin/bash
for dir in $(find -L . -name "migrations" -type d)
do
    rm -rf $dir/*
done
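For the common case where migrations/ is a plain directory, the following commands (my addition, not part of the original answer) do the same job while keeping the __init__.py package marker:

find . -path "*/migrations/*.py" -not -name "__init__.py" -delete
find . -path "*/migrations/*.pyc" -delete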
4. I delete and recreate the database. For example, a bash script can include the following commands:
su -l postgres -c "PGPASSWORD=$password psql -c 'DROP DATABASE $database;'"
su -l postgres -c "createdb --owner $username $database"
su -l postgres -c "PGPASSWORD=$password psql $database -U $username -c 'CREATE EXTENSION $extension;'"

5. I restore the database from the dump:
pg_restore -Fc -U $username -d $database path/to/backup-dir/db.dump

6. I create the migrations for each application in the following way:
./manage.py makemigrations <app1> <app2> ... <appN>

...or by using the following script:
#!/bin/bash
apps=()
for app in $(find ./ -maxdepth 1 -type d ! -path "./<project-folder>" ! -path "./.*" ! -path "./")
do
    apps+=(${app#??})
done
all_apps=$(printf "%s " "${apps[@]}")
./manage.py makemigrations $all_apps

7. I migrate using a fake migration:
./manage.py migrate --fake
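The --fake flag matters here because pg_restore has already recreated the schema while the django_migrations table was excluded from the dump, so migrate --fake records the freshly generated migrations as applied without executing any SQL. As a sanity check (my own suggestion, not part of the original steps):

./manage.py showmigrations
# every migration should now be marked [X] (applied)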
In case anything has gone wrong and *** (this can happen, indeed), I can use the backups to revert to the previous working state. If I use the db.json file from step one, it goes like this:
When pg_dump or pg_restore fails

I perform the steps:
- 3 (delete the migrations)
- 4 (delete and recreate the database)
- 6 (makemigrations)
and then:
Apply the migrations:

./manage.py migrate

Load the data from db.json:

./manage.py loaddata path/to/db.json
Then I try to find out why the previous effort was not successful.
When the steps are performed successfully, I copy the project to the server and perform the same ones on that box.
This way, I keep the least number of migrations and I am able to use pg_dump and pg_restore between any boxes that share the same project.