I have a question about alembic multidb template or using alembic for multiple database in general.
Our project has your usual three levels of database (development, testing, production). Depending on which level of database and whether it runs via Airflow or not, the URL for each database will be different. I'd like to use a .env file to store the URLs and credentials and run the migration using a command like:
alembic -x db=development -x airflow=False upgrade head
What should I do to achieve this? Thank you in advance for any assistance!
Related
Due to some quirks in our database we need to reassign table owners post creation. Currently we leverage Alembic - has anyone got a simple way to create some sort of post hook that runs certain SQL commands after a migration?
It sounds like what you're asking for can be resolved by editing Alembic's env.py file. From the docs (emphasis mine):
env.py - The env.py script is part of the generated environment so that the way
migrations run is entirely customizable. The exact specifics of how to
connect are here, as well as the specifics of how the migration
environment are invoked. The script can be modified so that multiple
engines can be operated upon, custom arguments can be passed into the
migration environment, application-specific libraries and models can
be loaded in and made available.
I bet you could get the behavior you want by adding a call in run_migrations_online().
(This assumes you want to run the post hook after EACH migration run. If it is migration-specific, you can update the upgrade() function in the generated migration file.)
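For the -x question above, here is a minimal sketch of what run_migrations_online() in env.py could look like. It assumes python-dotenv and .env keys such as DEVELOPMENT_URL and DEVELOPMENT_AIRFLOW_URL; those names, and the ownership SQL mentioned in the comment, are illustrative, not a definitive implementation:

# env.py (sketch) -- read -x arguments and build the URL from a .env file.
# Assumes python-dotenv; key names like DEVELOPMENT_URL are illustrative.
import os

from alembic import context
from dotenv import load_dotenv
from sqlalchemy import create_engine

load_dotenv()  # makes the .env values available via os.environ

target_metadata = None  # or your models' MetaData, if you use autogenerate


def run_migrations_online():
    # -x arguments arrive as a dict, e.g.
    # alembic -x db=development -x airflow=False upgrade head
    # gives {"db": "development", "airflow": "False"}
    x_args = context.get_x_argument(as_dictionary=True)
    db = x_args.get("db", "development")
    airflow = x_args.get("airflow", "False").lower() == "true"

    env_key = f"{db.upper()}_AIRFLOW_URL" if airflow else f"{db.upper()}_URL"
    connectable = create_engine(os.environ[env_key])

    with connectable.connect() as connection:
        context.configure(connection=connection, target_metadata=target_metadata)
        with context.begin_transaction():
            context.run_migrations()
        # For the related question: a post hook could run extra SQL here,
        # e.g. reassigning table owners after every migration run.


run_migrations_online()

With this, the .env file holds the URLs and credentials, and the command line only selects which one to use.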
I have a development Django project using MySQL, and it is deployed at PythonAnywhere. I am able to push my code updates via Git, and the Django migrations take care of the database STRUCTURE, but my question is about data.
During development I may add a new capability that relies on master data which I enter in the DEV database as I develop and test. When deploying I'd like to copy over the master data to the new database rather than re-enter it all.
Is exporting and importing files the best way or is there a more professional way?
I think the simplest way to do this is by using the dumpdata management command.
The output of this command can then be loaded into the new database with the loaddata management command.
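For example, something along these lines could work; the app label and file name are made up:

# On the development machine (app label and file name are illustrative):
python manage.py dumpdata myapp --indent 2 > master_data.json

# On the deployed site, after pushing the code and running migrations:
python manage.py loaddata master_data.json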
I want to update a field in my users table in my Django project hosted on Heroku.
Is there a way I can run a script (if so, from where, and using what?) that allows me to update a field in the database? I could do this manually in the Django admin, but it would take way too long as there are a large number of users.
Any advice is appreciated.
I suggest you update the data locally, then make a fixture, commit and push it to your Heroku app, and then load the data using the terminal:
update data (locally)
make a fixture (manage.py dumpdata)
commit and push to heroku
login via terminal (heroku login)
load the data (heroku run python manage.py loaddata <fixture_name>.json)
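Spelled out as commands, the workflow might look roughly like this; the model, fixture name, and git branch are assumptions:

python manage.py dumpdata auth.User --indent 2 > users.json   # after updating the data locally
git add users.json
git commit -m "Add updated users fixture"
git push heroku main          # or master, depending on your remote's default branch
heroku login
heroku run python manage.py loaddata users.json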
I can create tables using the command alembic revision -m 'table_name', then define the version script and migrate using alembic upgrade head.
Also, I could create tables in a database by defining a class in models.py (SQLAlchemy).
What is the difference between the two? I'm very confused. Have I messed up the concept?
Also, when I migrate the database using Alembic, why doesn't it form a new class in my models.py? I know the tables have been created because I checked them using a SQLite browser.
I have done all the configuration already. Alembic's target database and SQLALCHEMY_DATABASE_URI in config.py point to the same .db file.
Yes, you are thinking about it in the wrong way.
Let's say you don't use Alembic or any other migration framework. In that case you create a new database for your application with the following steps:
Write your model classes
Create and configure a brand new database
Run db.create_all(), which looks at your models and creates the corresponding tables in your database.
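As a rough sketch of that no-migrations workflow, assuming Flask-SQLAlchemy (the app and model names are illustrative):

from flask import Flask
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config["SQLALCHEMY_DATABASE_URI"] = "sqlite:///app.db"
db = SQLAlchemy(app)

class User(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(80), nullable=False)

with app.app_context():
    db.create_all()  # creates any missing tables; never alters existing ones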
So now consider the case of an upgrade. For example, let's say you release version 1.0 of your application and now start working on version 2.0, which requires some changes to your database. How can you achieve that? The limitation here is that db.create_all() does not modify tables, it can only create them from scratch. So it goes like this:
Make the necessary changes to your model classes
Now you have two options to transfer those changes to the database:
Option 1: Destroy the database so that you can run db.create_all() again to get the updated tables, maybe backing up and restoring the data so that you don't lose it. Unfortunately SQLAlchemy does not help with the data, you'll have to use database tools for that.
Option 2: Apply the changes manually, directly to the database. This is error prone, and it would be tedious if the change set is large.
Now consider that you have development and production databases; that means the work needs to be done twice. Also think about how tedious it would be when you have several releases of your application, each with a different database schema, and you need to investigate a bug in one of the older releases, for which you need to recreate the database as it was in that release.
See what the problem is when you don't have a migration framework?
Using Alembic, you have a little bit of extra work when you start, but it pays off because it simplifies your workflow for your upgrades. The creation phase goes like this:
Write your model classes
Create and configure a brand new database
Generate an initial Alembic migration, either manually or automatically. If you go with automatic migrations, Alembic looks at your models and generates the code that applies those to the database.
Run the upgrade command, which runs the migration script, effectively creating the tables in your database.
Then when you reach the point of doing an upgrade, you do the following:
Make the necessary changes to your model classes
Generate another Alembic migration. If you let Alembic generate this for you, then it compares your model classes against the current schema in your database, and generates the code necessary to make the database match the models.
Run the upgrade command. This applies the changes to the database, without the need to destroy any tables or back up data. You can run this upgrade on all your databases (production, development, etc.).
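As a hedged example, with plain Alembic the upgrade cycle boils down to commands like these (the message is made up; with Flask-Migrate the equivalents are flask db migrate and flask db upgrade):

alembic revision --autogenerate -m "add bio column to user"   # generate, then review the script
alembic upgrade head                                          # apply it to the current database
alembic downgrade -1                                          # roll back the last migration if needed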
Important things to consider when using Alembic:
The migration scripts become part of your source code, so they need to be committed to source control along with your own files.
If you use the automatic migration generation, you always have to review the generated migrations. Alembic is not always able to determine the exact changes, so it is possible that the generated script needs some manual fine tuning.
Migration scripts have upgrade and downgrade functions. That means that they not only simplify upgrades, but also downgrades. If you need to sync the database to an old release, the downgrade command does it for you without any additional work on your part!
I can add that for Django there are two commands:
makemigrations (which creates migration files)
migrate (which translates the migrations into SQL and executes them on the database)
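In practice that is just:

python manage.py makemigrations   # writes migration files from your model changes
python manage.py migrate          # generates the SQL and applies it to the database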
I found it's great for understanding to switch between a batteries-included framework (Django) and other frameworks like Flask/Falcon.
So using Alembic migrations is very convenient, and it makes it easy to track database changes.
Is there an SQLAlchemy auto-migration tool like South for Django?
I looked at sqlalchemy-migrate, but it doesn't seem to generate SQL update scripts automatically or upgrade/downgrade the DB.
Looks like with sqlalchemy-migrate you need to:
a) manually copy your old model to a new file
b) create the new model in the application and copy it to a new file
c) manually write the create/drop/alter table operations in the extended Python SQLAlchemy dialect
d) generate the SQL ALTER script
e) run a command to execute the ALTER script
As for me, this doesn't solve the problem and only adds overhead, since I can simply write d) by hand, and that is much faster than doing a), b), and c) manually just to get d), which I could do in one step.
Are there any auto-migration libraries for SQLAlchemy, like South for Django or the many RoR-style migration tools?
What I need is to change the SQLAlchemy model in my Python app, run a tool that compares the current DB schema to the new schema the model requires, and have it create ALTER scripts that I can adjust manually and execute.
Is there any solution like this in Python?
You can perform automatic migrations with Alembic. I use it in two large-scale projects I am currently working on. The automatic migration generator can recognize:
Table additions and removals
Column additions and removals
Change of nullable status on columns
Basic changes in indexes, explicitly-named unique constraints, and foreign keys
(see also: https://alembic.sqlalchemy.org/en/latest/autogenerate.html)
Install alembic
pip install alembic
or (depending on the version of Python you are using):
pip3 install alembic
Configure alembic
Execute the following command in your project:
alembic init alembic
This will set up alembic for your project, in a folder called alembic.
You will then need to edit the generated alembic.ini configuration file.
In the file env.py, tell Alembic where to find SQLAlchemy's metadata object in your project.
(see also: https://alembic.sqlalchemy.org/en/latest/tutorial.html#editing-the-ini-file)
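For illustration, the relevant part of env.py might look like this, assuming your models live in a module such as myapp.models with a declarative Base (both names are made up):

# alembic/env.py -- point autogenerate at your models' metadata
from myapp.models import Base   # illustrative import path

target_metadata = Base.metadata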
Generate the migration
Simply execute the following command line:
alembic revision --autogenerate -m "Message for this migration"
Or even (not recommended):
alembic revision --autogenerate
Upgrade the database
After this, I upgrade the database with this simple command from the folder containing the alembic.ini configuration file:
alembic upgrade head
(see also: http://rodic.fr/blog/automatic-migrations-sqlalchemy-alembic/)
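To give an idea of the result, an autogenerated migration script for adding a column looks roughly like this; the revision IDs, table, and column are made up:

"""add bio column to user"""
from alembic import op
import sqlalchemy as sa

# revision identifiers, used by Alembic (values are illustrative)
revision = "a1b2c3d4e5f6"
down_revision = "f6e5d4c3b2a1"
branch_labels = None
depends_on = None


def upgrade():
    op.add_column("user", sa.Column("bio", sa.Text(), nullable=True))


def downgrade():
    op.drop_column("user", "bio")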
There is Alembic, which looks very promising, but the problem is (for now) that its support for SQLite databases is very limited.
No, there is none at the moment. Alembic requires you to write the code for adding/deleting/altering table structures or creating/dropping tables yourself, so nothing like Django South exists for SQLAlchemy.
Have you looked into using sqlalchemy-migrate?
http://shane.caraveo.com/2010/09/13/database-migrations-for-sqlalchemy-part-duex/