I am doing some prototyping for a new desktop app I am writing in Python, and I want to use SQLite and an ORM to store data.
My question is, are there any ORM libraries that support auto-generating/updating the database schema and work with SQLite?
SQLAlchemy is a great choice in the Python ORM space that supports SQLite.
SQLAlchemy, when paired with the sqlalchemy-migrate library, can also handle schema updates/migrations.
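For example, SQLAlchemy can generate the schema for you from the model definitions. A minimal sketch (the Note model and database file name below are hypothetical, using the declarative style of the 1.x series):

# Minimal sketch of SQLAlchemy's schema generation against SQLite.
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class Note(Base):
    __tablename__ = "notes"
    id = Column(Integer, primary_key=True)
    text = Column(String)

engine = create_engine("sqlite:///app.db")
# Creates any missing tables; it does not alter existing ones,
# which is where sqlalchemy-migrate (or alembic) comes in.
Base.metadata.create_all(engine)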
PyQt has some nice features like QSqlTableModel and QSqlRelationalTableModel; however, it doesn't provide ORM functionality, and I hate using raw SQL statements in my program. What is the best way to integrate an SQL ORM library like SQLAlchemy with PyQt's SQL facilities? Currently the only solution I can think of is to build my object models in SQLAlchemy and execute the compiled SQL statements with QSqlDatabase manually. Is there a better way to do this? Is there any way, say, to build a custom backend/adapter for SQLAlchemy that uses QSqlDatabase? Other ORM libraries like Peewee are fine, too.
https://docs.sqlalchemy.org/en/13/core/engines.html?highlight=create_engine#sqlalchemy.create_engine
Create a fake engine using the 'mock' strategy that redirects all compiled SQL statements to the QSqlDatabase to execute.
There seems to be a catch: functionality that requires interaction with a real database will not work, such as checking whether a table exists before creating it.
Also, this doesn't seem to work with the Session API; any call to commit() or flush() raises a NotImplementedError.
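A minimal sketch of the mock-engine approach. It assumes SQLAlchemy 1.3 (the strategy='mock' option was removed in 2.0) and PyQt5; the database file name and table are hypothetical:

# Forward SQLAlchemy-compiled statements to QSqlDatabase.
from PyQt5.QtSql import QSqlDatabase, QSqlQuery
from sqlalchemy import (Column, Integer, MetaData, String, Table,
                        create_engine)

db = QSqlDatabase.addDatabase("QSQLITE")
db.setDatabaseName("app.db")  # hypothetical file name
db.open()

def qt_executor(sql, *multiparams, **params):
    # Compile for the engine's dialect and run through Qt
    # instead of a DBAPI connection.
    statement = str(sql.compile(dialect=engine.dialect))
    QSqlQuery(db).exec_(statement)

engine = create_engine("sqlite://", strategy="mock", executor=qt_executor)

metadata = MetaData()
users = Table("users", metadata,
              Column("id", Integer, primary_key=True),
              Column("name", String))
# checkfirst must be False: a mock engine cannot inspect the real database.
metadata.create_all(engine, checkfirst=False)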
I have my data in a database, and I want to build an API using Python to query the database. What do I use, or where do I even start?
I can connect to my database and pull the data, but I do not know how to build an API for other users to query the database.
NOT a trivial solution, but worth looking into, depending on how big a project this is and what else you may want to do with Python and MySQL.
Django is a framework for Python that can handle a lot of things but, most importantly, provides a very powerful interface to MySQL and other databases.
One of the many add-ons for Django is Django REST Framework which is specifically designed for setting up Restful APIs using Django & Python.
I have used Django and Django REST Framework extensively. It can take a while to get up to speed, but I think it is still a lot less work than taking a basic connector between Python and MySQL (coding SQL statements manually) and building the API interface at a lower level. Django REST Framework isn't perfect, but it takes care of a lot of the details.
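As a rough illustration, a minimal Django REST Framework setup might look like the following; the Measurement model, its fields, and the route name are all hypothetical:

# models.py
from django.db import models

class Measurement(models.Model):
    name = models.CharField(max_length=100)
    value = models.FloatField()

# serializers.py
from rest_framework import serializers

class MeasurementSerializer(serializers.ModelSerializer):
    class Meta:
        model = Measurement
        fields = ["id", "name", "value"]

# views.py
from rest_framework import viewsets

class MeasurementViewSet(viewsets.ModelViewSet):
    queryset = Measurement.objects.all()
    serializer_class = MeasurementSerializer

# urls.py -- the router generates list/detail endpoints automatically.
from rest_framework import routers

router = routers.DefaultRouter()
router.register(r"measurements", MeasurementViewSet)
urlpatterns = router.urls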
I'm updating from an ancient language to Django, and I want to carry the data from the old project over to the new one.
But the old project is MySQL, and I'm currently using SQLite3 in dev mode. I've read that PostgreSQL is the most capable, so my first question is: is it better to set up PostgreSQL while in development, or is the transition from SQLite3 to PostgreSQL easy?
As for the data in the old project: I am reworking the table structure from the old MySQL structure, since it has many related tables, and these relations are handled internally with ForeignKey and ManyToMany in SQLite3 (the same in PostgreSQL, I guess).
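For concreteness, a hypothetical Django sketch of such relations (the models are made up; the declarations are identical whether the backend is SQLite3 or PostgreSQL):

from django.db import models

class Tag(models.Model):
    name = models.CharField(max_length=50)

class Author(models.Model):
    name = models.CharField(max_length=100)

class Post(models.Model):
    title = models.CharField(max_length=200)
    author = models.ForeignKey(Author, on_delete=models.CASCADE)
    tags = models.ManyToManyField(Tag)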
So I'm thinking about how to transfer the data. It's not really much data, maybe 3,000-5,000 rows.
The problem is that I don't want to keep the same table structure, so a direct import would be a terrible idea. I want the sweet functionality provided by SQLite3/PostgreSQL.
One idea I had was to join all the data and create a nested JSON document for each post, and then define which table each part goes into so the relations are kept.
But this is just my guessing, so I'm asking you: is there a proper way to do this?
Thanks!
Better to create the PostgreSQL database first, then write a Python script that takes the data from the MySQL database and imports it into the PostgreSQL database.
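A minimal sketch of such a one-off transfer script, assuming the pymysql and psycopg2 packages; the connection parameters, table names, and columns are hypothetical:

import pymysql
import psycopg2

mysql_conn = pymysql.connect(host="localhost", user="olduser",
                             password="secret", database="old_db")
pg_conn = psycopg2.connect(host="localhost", user="newuser",
                           password="secret", dbname="new_db")

with mysql_conn.cursor() as src, pg_conn.cursor() as dst:
    # Read from the old structure, reshape as needed, write to the new one.
    src.execute("SELECT id, title, body FROM old_posts")
    for row in src.fetchall():
        dst.execute(
            "INSERT INTO blog_post (id, title, body) VALUES (%s, %s, %s)",
            row,
        )
pg_conn.commit()

mysql_conn.close()
pg_conn.close()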
I have seen sqlalchemy-migrate and alembic, but I do not want to use those frameworks. How can I write the migration script myself? Most migrations, as I understand it, revolve around altering/dropping existing tables. Additionally, I use SQLAlchemy mostly at the ORM level rather than the schema/core/engine level.
The reason I wish to do it myself is mostly for learning and to understand how the Django ORM automatically generates a migration script.
You should just use alembic to execute raw SQL to start. Then, if you decide to try more of alembic's features, you'll be all set.
For example, after creating a new revision named "drop nick", you can execute raw SQL:
op.execute('ALTER TABLE users DROP COLUMN nickname')
This way alembic can handle the version numbers, but you can, or rather have to, do all the SQL manipulations manually.
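For context, the generated revision file might look roughly like this; the revision identifiers below are placeholders that alembic fills in for you:

"""drop nick"""
from alembic import op

revision = "abc123"      # placeholder; alembic generates the real ID
down_revision = None     # placeholder; points at the previous revision

def upgrade():
    op.execute("ALTER TABLE users DROP COLUMN nickname")

def downgrade():
    # Assumes nickname was a plain VARCHAR; adjust to your schema.
    op.execute("ALTER TABLE users ADD COLUMN nickname VARCHAR")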
I have a database in PostgreSQL, and I have to connect it to a web server. I'm familiar with Python programming but not web programming. I have some information about the DAL (Database Abstraction Layer) for writing my queries from within Python. I have to generalize my queries into functions. Is it a good idea to do this using the DAL and subsequently use web2py to connect it to a web server?
The web2py DAL is a great choice if being database-agnostic is a requirement for you. The DAL dynamically generates the SQL in real time using the specified dialect for the database back end, so you do not have to write SQL code or learn different SQL dialects, and your application will be portable among different types of databases. Since the SQL is generated dynamically by the DAL, all inserted data is properly escaped, which protects against injection flaws such as SQL injection.
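For illustration, a minimal DAL sketch using the standalone pydal package (the same DAL that ships inside web2py); the connection string, table, and field names are hypothetical:

from pydal import DAL, Field

db = DAL("postgres://user:password@localhost/mydb")
db.define_table("person", Field("name"), Field("email"))

# Inserted values are escaped by the DAL, so no SQL is written by hand.
db.person.insert(name="Alice", email="alice@example.com")
rows = db(db.person.name == "Alice").select()
for row in rows:
    print(row.name, row.email)
db.commit()  # required when using pydal outside of web2py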
Additionally, although I have not found anything in the web2py book about automatically generating a model from an existing database, there is a way to quickly and easily create a database model from an existing PostgreSQL database, using a script available on GitHub created by Mariano Reingart, who based it on a script to "generate schemas from dbs" (MySQL) created by Alexandre Andrade. The script creates a web2py db model from a PostgreSQL database.
Since I was using an MSSQL database, I needed a similar script, but I couldn't find anything, so I made minor edits to the script, specifically around the SQL and datatypes, to bring it more in line with MSSQL, and it worked like a charm. I had a database model of about 150 tables created in seconds, and it saved me a lot of time.
https://github.com/phektus/cvstash/blob/master/scripts/extract_pgsql_models.py
I hope this helps others out there who are looking for the same.
The web2py DAL supports PostgreSQL, and you can use it within web2py, or you can take just dal.py and use it with your favorite project/framework.
For existing databases, I recommend reading Chapter 6 of http://web2py.com/book.
I have multiple apps running with Postgres, and it works very nicely!