MySQLDB in batch mode - python

I'm using the Python 2 MySQLdb bridge with a MySQL database. I'm now writing an application that generates an overview of the current database state. For this I query the database sequentially in a for loop and store the reply to each query.
Can this be done more efficiently by generating all queries first, executing them, and getting all results in one go?
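One common way to cut down the round trips is to collapse the per-item queries into a single query. Below is a minimal sketch with MySQLdb, assuming the queries only differ in a key value; the connection details and the table and column names (my_table, id, state) are placeholders, not anything from the original post.

    import MySQLdb

    # Hypothetical connection details and schema, for illustration only.
    conn = MySQLdb.connect(host="localhost", user="user", passwd="secret", db="mydb")
    cur = conn.cursor()

    ids = [1, 2, 3, 4]

    # Instead of one round trip per id:
    #   for item_id in ids:
    #       cur.execute("SELECT state FROM my_table WHERE id = %s", (item_id,))
    #       results[item_id] = cur.fetchone()
    # fetch everything in a single query:
    placeholders = ", ".join(["%s"] * len(ids))
    cur.execute("SELECT id, state FROM my_table WHERE id IN (%s)" % placeholders, ids)
    results = dict(cur.fetchall())  # {id: state}

    cur.close()
    conn.close()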

Related

Retrieve Status of Rows Processed for long query using PYODBC / SQLAlchemy in Python

I have several SQL queries that I run regularly but that can take a while to complete. I've noticed that different applications which query SQL Server may show a status update (e.g. 1200 rows received).
I was wondering if, within SQLAlchemy or pyodbc, there is a way to retrieve this, perhaps by treating the query as a threaded worker and polling it as it completes.
Thanks,
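As far as I know, neither pyodbc nor SQLAlchemy exposes a server-side progress counter for a running statement, but you can get a similar effect client-side by fetching in chunks from a worker thread and polling a shared counter. A minimal sketch, assuming pyodbc; the connection string, query, and chunk size are placeholders.

    import threading
    import pyodbc

    progress = {"rows": 0, "done": False}

    def run_query(conn_str, sql):
        conn = pyodbc.connect(conn_str)
        cur = conn.cursor()
        cur.execute(sql)
        while True:
            chunk = cur.fetchmany(1000)       # pull rows in batches
            if not chunk:
                break
            progress["rows"] += len(chunk)    # visible to the polling thread
            # ... process / store the chunk here ...
        progress["done"] = True
        conn.close()

    worker = threading.Thread(target=run_query,
                              args=("DSN=mydsn", "SELECT * FROM big_table"))
    worker.start()
    while not progress["done"]:
        worker.join(timeout=2.0)              # poll every couple of seconds
        print("%d rows received so far" % progress["rows"])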

SQLAlchemy, Postgres: Run SQL without transaction

I am using SQLAlchemy and pg8000 to connect to a Postgres database.
I have checked the pg_stat_activity table, which shows me many SELECT queries sitting in the 'idle in transaction' state. But the application does far more reads than writes; inserts are few and far between.
I suspect that a new transaction is created for each query, even for simple select statements.
Is it possible to run a read-only query without the need for a transaction, so that it does not need to be committed or rolled back?
Currently the app runs its queries with the method sqlalchemy.engine.Engine.execute for CRUD operations and with cursors for calling stored procedures. How should I update these method calls to indicate that I don't want some of them to start a transaction?
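A minimal sketch of one way to do this, assuming SQLAlchemy 1.x-style Engine.execute and a driver that supports SQLAlchemy's AUTOCOMMIT isolation level (psycopg2 does; recent pg8000 versions should as well). The connection URL and table name are placeholders.

    from sqlalchemy import create_engine

    engine = create_engine("postgresql+pg8000://user:secret@localhost/mydb")

    # Engine-wide: every statement runs in autocommit, so plain SELECTs no longer
    # leave a connection sitting in 'idle in transaction'.
    autocommit_engine = engine.execution_options(isolation_level="AUTOCOMMIT")
    rows = autocommit_engine.execute("SELECT * FROM my_table").fetchall()

    # Or per connection, keeping the default engine for writes:
    with engine.connect() as conn:
        result = conn.execution_options(isolation_level="AUTOCOMMIT").execute("SELECT 1")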

Pulling from database and updating database at the same time

Is it possible to run a Python Flask app that pulls from a SQL database, while also running a Python script that updates the same database every few seconds?
Yes, databases are designed to handle this type of concurrent access. If the database is in the middle of an update, it will wait until the update is complete before handling the Flask app's query, and it will complete the query before starting the next incoming update.
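For illustration only, a minimal sketch of the two pieces, using SQLite purely to keep the example self-contained (the same idea applies to any SQL database); the file, table, and column names are made up.

    import sqlite3
    import time
    from flask import Flask, jsonify

    app = Flask(__name__)

    @app.route("/readings")
    def readings():
        conn = sqlite3.connect("app.db")
        rows = conn.execute("SELECT id, value FROM readings").fetchall()
        conn.close()
        return jsonify(readings=[{"id": r[0], "value": r[1]} for r in rows])

    # Run this in a separate script/process; the database serializes its writes
    # against the Flask app's reads.
    def updater():
        while True:
            conn = sqlite3.connect("app.db")
            conn.execute("INSERT INTO readings (value) VALUES (?)", (42,))
            conn.commit()
            conn.close()
            time.sleep(5)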

Remote Postgres to Postgres data

I am working on a project where I need to load daily data from one Postgres database into another (both databases are on separate remote machines).
The Postgres version I'm using is 9.5 and, due to our infrastructure, I am currently doing this with Python scripts, which works fine for now, although I was wondering:
Is it possible to do this with psql commands that I can easily schedule? Or is Python a flexible enough approach for future developments?
EDIT:
The main database sits behind a backend connected directly to a website; the other one feeds an analytics system which basically only needs to read the main database's data and store future transformations of it.
Latency is not very important; what matters is reliability and simplicity.
Sure, you can use psql over an SSH connection if you want.
This approach (or using pg_dump) can be useful as a way to reduce the effects of latency.
However, note that the SQL INSERT ... VALUES command can insert several rows in a single statement. When I use Python scripts to migrate data I build INSERT commands that insert up to 1000 rows each, reducing the impact of latency by a factor of 1000 (see the sketch below).
Another approach worth considering is dblink, which allows Postgres to query a remote Postgres instance directly, so you could do a SELECT against the remote database and insert the result into a local table.
postgres_fdw may be worth a look too.
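A minimal sketch of the batched multi-row INSERT idea above, assuming psycopg2 on both ends (the answer describes hand-built INSERT statements; execute_values is just a convenience that produces the same multi-row form). Connection strings, table, and column names are placeholders.

    import psycopg2
    from psycopg2.extras import execute_values

    src = psycopg2.connect("host=source-host dbname=main user=reader")
    dst = psycopg2.connect("host=analytics-host dbname=analytics user=writer")

    with src.cursor() as read_cur, dst.cursor() as write_cur:
        read_cur.execute(
            "SELECT id, payload, created_at FROM daily_data "
            "WHERE created_at::date = current_date")
        while True:
            rows = read_cur.fetchmany(1000)   # 1000 rows per round trip from the source
            if not rows:
                break
            # one multi-row INSERT per batch instead of 1000 single-row INSERTs
            execute_values(write_cur,
                           "INSERT INTO daily_data (id, payload, created_at) VALUES %s",
                           rows, page_size=1000)
        dst.commit()

    src.close()
    dst.close()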

flask-sqlalchemy write huge file to database

I am trying to load a huge file, roughly 5900 lines of SQL CREATE, INSERT and ALTER TABLE statements, into a MySQL database with Flask-SQLAlchemy.
I parse the file and separate the individual commands by splitting on ;.
This works as expected.
Here is what I have so far.
For the SQL query execution I am using the Engine API of SQLAlchemy.
When I execute the queries it seems that the database stops doing its work after about 5400 lines of the file, but the application logs the full execution up to line 5900 without any error.
When I run the CREATEs and INSERTs separately it also works. So is there a way to split the batch execution, or to use pooling or something like that, so that the database does not get stuck?
Thank you!
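For what it's worth, one way to split the batch execution is to commit in chunks so that a single huge transaction never builds up. A minimal sketch, assuming the Engine API mentioned above; the database URL, dump filename, and batch size are placeholders, not the asker's actual code.

    from sqlalchemy import create_engine, text

    engine = create_engine("mysql://user:secret@localhost/mydb")

    with open("dump.sql") as f:
        statements = [s.strip() for s in f.read().split(";") if s.strip()]

    BATCH = 500  # commit every 500 statements instead of all ~5900 at once
    for i in range(0, len(statements), BATCH):
        with engine.begin() as conn:              # one transaction per batch
            for stmt in statements[i:i + BATCH]:
                conn.execute(text(stmt))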
