Python and MySQL insert query

I have a question about insert queries and the Python MySQL connection. As far as I understand, I need to commit after every insert query.
Is there a different way to do that? I mean a fast way, like in PHP.
Second, I guess the same applies to update queries?
Another problem: it seems that once you commit your query, the connection is closed. Suppose I have several different insert queries, and every time I prepare one I need to insert it into the table. How can I achieve that with Python? I am using the MySQLdb library.
Thanks for your answers.

You don't need to commit after each insert. You can perform many operations and commit on completion.
The executemany method of the DB-API allows you to perform many inserts/updates in a single round trip.
There is no link between committing a transaction and disconnecting from the database. See the Connection object's methods for the details of commit and close.
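A minimal sketch of that pattern with MySQLdb (connection parameters and table/column names are placeholders, not from the question): several rows are inserted with executemany, committed once, and the connection stays open for further queries.

import MySQLdb

# Placeholder connection parameters
conn = MySQLdb.connect(host="localhost", user="user", passwd="secret", db="mydb")
cursor = conn.cursor()

rows = [("alice", 30), ("bob", 25)]

# executemany sends all parameter sets for one statement in a single call
cursor.executemany("INSERT INTO people (name, age) VALUES (%s, %s)", rows)

# Commit once after all the inserts; the connection remains open afterwards
conn.commit()

# The same connection can keep serving further queries
cursor.execute("SELECT COUNT(*) FROM people")
print(cursor.fetchone())

conn.close()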

Why does my query work but the data is not saved? [duplicate]

I have a small Python script with two functions: the first sends data into a table, the second reads the table.
When I call the function that triggers my insert query, the data is not saved in the database.
When I run an identical query directly in SQL Server, it works fine.
So my script is good and my query is good too. Firewall systems are properly configured.
So why is the data not saved?
The primary key of my table is an IDENTITY column. When I call my insert function, the IDENTITY column still auto-increments even though no data is saved.
Here is my script:
Here is my SQL Server:
The SQL query for creating my table:
I have tried my best to find a solution; I need your help to understand my problem.
As a quick fix, add a commit after execute:
...
cursor.execute(sql)
cursor.connection.commit()
But I would also advise you to keep as few connections as possible. In your current code you create a new connection for each operation.
After inserting with your cursor, you need to commit your inserts with connection.commit().
As qaziqarta pointed out, it is best practice not to open a new connection every time you insert into or read from the database.
You should initialize your connection once at the beginning and close it after you are done reading/writing.
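A minimal sketch of that pattern, assuming pyodbc and SQL Server (the connection string and table/column names are placeholders, since the question doesn't show the actual driver or schema): one connection is opened up front, every insert is followed by a commit, and the connection is closed when you are done.

import pyodbc

# Placeholder connection string; adjust driver, server and credentials
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver;DATABASE=mydb;UID=user;PWD=secret"
)
cursor = conn.cursor()

def insert_value(value):
    # Parameterized insert; nothing is persisted until the commit
    cursor.execute("INSERT INTO my_table (col) VALUES (?)", value)
    conn.commit()

def read_values():
    cursor.execute("SELECT col FROM my_table")
    return cursor.fetchall()

insert_value("hello")
print(read_values())

conn.close()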

Update SQL database registers based on JSON

I have a table with 30k clients, with ClientID as the primary key.
I'm getting data from API calls and inserting it into the table using Python.
I'd like to find a way to insert rows for new clients and, if the ClientID that comes with the API call already exists in the table, update the existing record with the updated information for this client.
Thanks!!
A snippet of code would be nice to show us what exactly you are doing right now. I presume you are using an ORM like SQLAlchemy? If so, then you are looking at doing an UPSERT type of operation.
That is already answered HERE.
Alternatively, if you are executing raw queries without an ORM, you could write a custom procedure and pass the required parameters. HERE is a good write-up on how that is done in MSSQL under high concurrency. You could use it as a starting point for understanding and then rewrite it for PostgreSQL.
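If the table happens to live in PostgreSQL, a minimal raw-SQL sketch of such an upsert with psycopg2 might look like the following (the connection parameters and the Name/Email columns are assumptions; only ClientID comes from the question):

import psycopg2

# Placeholder connection parameters
conn = psycopg2.connect(host="localhost", dbname="mydb", user="user", password="secret")
cursor = conn.cursor()

client = {"ClientID": 123, "Name": "Acme", "Email": "info@acme.example"}

# Insert the client, or update the existing row when ClientID already exists
cursor.execute(
    """
    INSERT INTO clients (ClientID, Name, Email)
    VALUES (%(ClientID)s, %(Name)s, %(Email)s)
    ON CONFLICT (ClientID)
    DO UPDATE SET Name = EXCLUDED.Name, Email = EXCLUDED.Email
    """,
    client,
)

conn.commit()
conn.close()

MySQL offers the equivalent INSERT ... ON DUPLICATE KEY UPDATE, and SQL Server uses MERGE, as discussed in the write-up linked above.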

Inserting into SQL with pyodbc from remote computer

Hello, I am using my second computer to gather some data and insert it into an SQL database. I have set everything up for reading and writing the database remotely, and I can insert new rows just by using plain SQL.
With pyodbc I can read tables, but when I insert new data, nothing happens. No error message, but also no new rows in the table.
I wonder if anyone has faced this issue before and knows what the solution is.
The cursor.execute() method runs the SQL statement, but pyodbc leaves autocommit off by default, so the change stays in an open transaction. Since this is an INSERT statement, you must call the cursor.commit() method for the records to actually populate your table. Likewise, for a DELETE statement you need to commit as well.
Without more perspective here, I can only assume that you are not committing the insert.
Notice, similarly, that when you run cursor.execute("""select * from yourTable"""), you need to call cursor.fetchall() or another fetch method to actually retrieve and view the results of your query.
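A short sketch of that flow (the connection string and table name are placeholders); alternatively, passing autocommit=True to pyodbc.connect() makes every statement commit immediately:

import pyodbc

# Placeholder connection string and table name
conn = pyodbc.connect("DSN=remote_sql_server;UID=user;PWD=secret")
cursor = conn.cursor()

cursor.execute("INSERT INTO yourTable (col) VALUES (?)", "some value")
cursor.commit()  # without this, the new row never shows up in the table

cursor.execute("SELECT * FROM yourTable")
print(cursor.fetchall())  # fetch to actually see the query results

conn.close()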

Python and SQLAlchemy: How to detect external changes on database

Some devices are asynchronously storing values on a common remote MySQL database server.
I would like to write a supervisor app in Python (and possibly SQLAlchemy) to detect external INSERT events on the database and act on the data in the latest rows. This is to avoid a long manual check to see whether every table is being updated regularly or a logger has crashed.
Can somebody tell me where to look online for this kind of information and, even better, point me to an example?
EDIT
I already read all tables periodically using a datetime primary key ({date_time}), loading the last row of each table and comparing it to the previous values:
SELECT * FROM table ORDER BY date_time DESC LIMIT 1
but it seems very cumbersome and doesn't guarantee that I won't miss rows between successive database checks.
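In code, the periodic check looks roughly like this (a sketch; the engine URL and table names are placeholders, while the date_time column is the one from the query above):

from sqlalchemy import create_engine, text

# Placeholder connection URL and logger table names
engine = create_engine("mysql://user:secret@dbserver/loggerdb")
tables = ["logger_a", "logger_b"]

last_seen = {}  # latest date_time observed per table

def check_tables():
    with engine.connect() as conn:
        for table in tables:
            # Table names come from our own trusted list, so formatting them in is safe here
            row = conn.execute(
                text("SELECT * FROM {} ORDER BY date_time DESC LIMIT 1".format(table))
            ).fetchone()
            if row is not None and last_seen.get(table) != row.date_time:
                last_seen[table] = row.date_time
                print(table, "has new data at", row.date_time)

check_tables()  # called periodically from a sleeping loop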
The engine is an old version of InnoDB that I cannot upgrade: I cannot use the UPDATE field in the schema because it simply doesn't work.
To reword my question:
How can I listen for any database event with a daemon-like Python application (a sleeping thread) and wake up only when something happens?
I also want to avoid SQL triggers, because they would be far too heavy to manage: there are hundreds of tables, and they are added/removed very often depending on the active loggers.
I had a look at SQLAlchemy, but all the references I could find, if I haven't misunderstood them, are decorators that act on INSERTs made by SQLAlchemy itself. I didn't find anything about external changes to the database.
About the example request: I am not interested in a copy-and-paste, because first I want to understand how things work. I prefer (even incomplete) examples because the SQLAlchemy documentation is far too deep for my level of knowledge and I simply cannot put the pieces together.

Select Data from Table and Insert into a different DB

I'm using Python and psycopg2 to remotely query some PostgreSQL databases, and I'm trying to figure out the best way to select the data I need from a remote table and insert it into a table on a separate DB (the local application server).
Most of the stuff I've read has directed me to avoid executemany and look toward COPY operations, but I'm unsure how to implement this on a specific select statement as opposed to the entire table. Should I be headed this way or am I completely off?
but I'm unsure how to implement this on a specific select statement as opposed to the entire table
COPY isn't limited to tables; you can use a query as the source as well. Check out the examples in the manual, which show how to use COPY to create a text file based on a query:
http://www.postgresql.org/docs/current/static/sql-copy.html#AEN59055
(3rd example)
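A minimal sketch of that idea with psycopg2's copy_expert, streaming the result of a specific SELECT from the remote database into a local table (the connection parameters, the query, and the table/column names are placeholders):

import io
import psycopg2

# Placeholder connection parameters for the remote and local databases
remote = psycopg2.connect(host="remote-db", dbname="source", user="user", password="secret")
local = psycopg2.connect(host="localhost", dbname="appdb", user="user", password="secret")

buf = io.StringIO()

# Export only the rows returned by the SELECT, not the entire table
with remote.cursor() as rcur:
    rcur.copy_expert(
        "COPY (SELECT id, name FROM clients WHERE active) TO STDOUT WITH CSV",
        buf,
    )

buf.seek(0)

# Load the exported rows into the table on the local server
with local.cursor() as lcur:
    lcur.copy_expert("COPY clients_copy (id, name) FROM STDIN WITH CSV", buf)
local.commit()

remote.close()
local.close()

For large result sets, a temporary file can stand in for the in-memory buffer.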
Take a look at http://ryrobes.com/featured-articles/using-a-simple-python-script-for-end-to-end-data-transformation-and-etl-part-1/
Granted, this is pulling from Oracle and inserting into SQL Server, but the concepts should be the same.
