Why does my query work but the data is not saved? [duplicate] - python

This question already has an answer here:
python script is not saving into database
(1 answer)
Closed 6 months ago.
I have a small script written in Python which has two functions: the first sends data into a table, the second reads the table.
When I call the function that triggers my INSERT query, the data is not saved in the database.
When I run an identical query directly in SQL Server, it works fine.
So my script is good and my query is good too. The firewall is properly configured.
So why is the data not saved?
The primary key of my table is an IDENTITY column. When I call my insert function, the IDENTITY column still auto-increments even though no data is saved.
Here is my script:
Here is my SQL Server:
The SQL query for creating my table:
I tried my best to find a solution; I need your help to understand my problem.

As a quick fix, add a commit after the execute:
...
cursor.execute(sql)
cursor.connection.commit()
This also explains the IDENTITY behaviour: SQL Server consumes identity values even when a transaction is rolled back, so the counter advances although no row is kept. But I would also advise you to keep as few connections as possible; in your current code you create a new connection for each operation.
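For context, here is a minimal sketch of the whole fix. Since the question is about SQL Server, this assumes pyodbc; the connection string and table name are hypothetical, so adjust them to your setup:

import pyodbc

# Hypothetical connection string and table; adjust to your server and schema.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;UID=user;PWD=password"
)
cursor = conn.cursor()

cursor.execute("INSERT INTO my_table (name) VALUES (?)", "example")
conn.commit()  # without this, the INSERT is rolled back when the connection closes

cursor.execute("SELECT id, name FROM my_table")
print(cursor.fetchall())

conn.close()

Alternatively, pyodbc.connect() accepts autocommit=True, which commits every statement automatically at the cost of losing multi-statement transactions.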

After inserting with your cursor, you need to commit your inserts with connection.commit().
As qaziqarta pointed out, it is best practice not to open a new connection every time you are trying to insert or read something from the database.
You should initialize your connection once at the beginning and close it after you are done reading/writing.
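A minimal sketch of that structure, again assuming pyodbc with hypothetical credentials and table: one connection opened at the start, one commit for the whole batch, and a close in finally so it runs even on error.

import pyodbc

rows = [("alice",), ("bob",)]  # sample data for illustration

# Hypothetical connection string; one connection for the whole run.
conn = pyodbc.connect("DSN=mydb;UID=user;PWD=password")
try:
    cursor = conn.cursor()
    for (name,) in rows:
        cursor.execute("INSERT INTO my_table (name) VALUES (?)", name)
    conn.commit()  # a single commit covers all the inserts above
    cursor.execute("SELECT COUNT(*) FROM my_table")
    print(cursor.fetchone()[0])
finally:
    conn.close()  # always release the connection, even if an insert fails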

Related

Update SQL database registers based on JSON

I have a table with 30k clients, with the ClientID as primary key.
I'm getting data from API calls and inserting it into the table using Python.
I'd like to find a way to insert rows for new clients and, if the ClientID that comes with the API call already exists in the table, update the existing record with the client's updated information.
Thanks!!
A snippet of code would be nice, to show us what exactly you are doing right now. I presume you are using an ORM like SQLAlchemy? If so, then you are looking at doing an UPSERT type of operation.
That is already answered HERE
Alternatively, if you are executing raw queries without an ORM, you could write a custom stored procedure and pass the required parameters. HERE is a good write-up on how that is done in MSSQL under high concurrency. You could use it as a starting point for understanding and then re-write it for PostgreSQL.
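If the target database turns out to be PostgreSQL, a minimal raw-SQL upsert sketch could look like this; the clients(client_id, name, email) schema and the connection details are hypothetical:

import psycopg2

# Hypothetical connection and schema: clients(client_id PRIMARY KEY, name, email).
conn = psycopg2.connect("dbname=mydb user=user password=secret host=localhost")
cur = conn.cursor()

client = {"client_id": 42, "name": "Acme", "email": "info@acme.example"}

# ON CONFLICT turns the INSERT into an UPDATE when the primary key already exists.
cur.execute(
    """
    INSERT INTO clients (client_id, name, email)
    VALUES (%(client_id)s, %(name)s, %(email)s)
    ON CONFLICT (client_id)
    DO UPDATE SET name = EXCLUDED.name, email = EXCLUDED.email
    """,
    client,
)
conn.commit()
conn.close()

On SQL Server the equivalent would be a MERGE statement or the stored-procedure pattern from the linked write-up.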

Inserting into SQL with pyodbc from remote computer

Hello, I am using my second computer to gather some data and insert it into the SQL database. I have set everything up for reading and writing to the database remotely, and I can insert new rows just by using normal SQL.
With pyodbc I can read tables, but when I insert new data, nothing happens. No error message, but also no new rows in the table.
I wonder if anyone has faced this issue before and knows what the solution is.
The cursor.execute() method runs the SQL statement, but the change is not made permanent yet. Since this is an INSERT statement, you must call the cursor.commit() method for the records to actually populate your table. Likewise, for a DELETE statement you need to commit as well.
Without more perspective here, I can only assume that you are not committing the insert.
Notice, similarly, that when you run cursor.execute("""select * from yourTable"""), you need to call cursor.fetchall() (or another fetch method) to actually retrieve and view your query results.
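A minimal sketch of the pattern, with a hypothetical connection string and table:

import pyodbc

# Hypothetical remote SQL Server connection and table readings(sensor, value).
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=remote-host;DATABASE=mydb;UID=user;PWD=password"
)
cursor = conn.cursor()

cursor.execute("INSERT INTO readings (sensor, value) VALUES (?, ?)", "temp", 21.5)
cursor.commit()  # pyodbc shorthand for committing on the cursor's connection

cursor.execute("SELECT sensor, value FROM readings")
for row in cursor.fetchall():  # fetch to actually see the result set
    print(row)

conn.close()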

Python and MySQL Insert query

I have a question regarding INSERT queries and the Python MySQL connection. I guess that I need to commit after every insert query I make.
Is there a different way to do that? I mean a fast way, like the one in PHP.
Second, I guess this is also the same for UPDATE queries?
Another problem here is that once you commit your query, the connection is closed. Suppose that I have several different insert queries and, every time I prepare one, I need to insert it into the table. How can I achieve that with Python? I am using the MySQLdb library.
Thanks for your answers.
You don't need to commit after each insert. You can perform many operations and commit on completion.
The executemany method of the DB-API allows you to perform many inserts/updates in a single round trip.
There is no link between committing a transaction and disconnecting from the database. See the Connection object's methods for the details of the commit and close methods.
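A minimal MySQLdb sketch of both points, with hypothetical credentials and table: a batch insert via executemany followed by a single commit, with the connection staying open afterwards.

import MySQLdb

# Hypothetical credentials and table orders(order_id, status).
conn = MySQLdb.connect(host="localhost", user="user", passwd="secret", db="shop")
cursor = conn.cursor()

orders = [(1, "pending"), (2, "shipped"), (3, "pending")]

# One round trip for the whole batch; %s is MySQLdb's placeholder style.
cursor.executemany(
    "INSERT INTO orders (order_id, status) VALUES (%s, %s)",
    orders,
)
conn.commit()  # one commit for all three rows; the connection remains open

cursor.execute("SELECT COUNT(*) FROM orders")
print(cursor.fetchone()[0])
conn.close()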

Python | MySQLdb -> I cant insert

I have an odd problem.
There are two databases. One has all products in it for an online shop. The other one has the online-shop software on it and only the active products.
I want to take the values from db1, change some values in Python, and insert them into db2.
db1.execute("SELECT * FROM table1")
for product in db1.fetchall():
    # ... change some values here ...
    db2.execute("INSERT INTO table2 ...")
    print "Check: " + str(db2.rowcount)
So I can get the values via SELECT; even selecting from db2 is no problem. My check always gives me 1, BUT there are no new rows in table2. The auto-increment value grows, but there is no data.
And I don't even get an error like "couldn't insert".
So does anybody have an idea what could be wrong? (If I do an insert manually via phpMyAdmin it works, and if I just take the SQL from my script and run it manually, the statements work as well.)
EDIT: Found the answer here How do I check if an insert was successful with MySQLdb in Python?
Is there a way to run these executes without committing every time?
I have a wrapper around MySQLdb that works perfectly for me; changing the behaviour to commit would require some big changes.
EDIT 2: OK, I found out that db1 is MyISAM (which works without committing) and db2 is InnoDB (which actually only seems to work with committing). I guess I have to change db2 to MyISAM as well.
Try adding a commit() on the db2 connection after the inserts if you're using InnoDB.
Starting with 1.2.0, MySQLdb disables autocommit by default, as required by the DB-API standard (PEP-249). If you are using InnoDB tables or some other type of transactional table type, you'll need to do connection.commit() before closing the connection, or else none of your changes will be written to the database.
http://mysql-python.sourceforge.net/FAQ.html#my-data-disappeared-or-won-t-go-away
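Putting it together, a sketch of the two-database copy with an explicit commit; the connection details and column names are hypothetical stand-ins for the question's setup:

import MySQLdb

# Hypothetical connections for the two databases from the question.
conn1 = MySQLdb.connect(host="localhost", user="user", passwd="secret", db="products")
conn2 = MySQLdb.connect(host="localhost", user="user", passwd="secret", db="shop")
db1 = conn1.cursor()
db2 = conn2.cursor()

db1.execute("SELECT id, name FROM table1")
for product_id, name in db1.fetchall():
    # ... change some values here ...
    db2.execute("INSERT INTO table2 (id, name) VALUES (%s, %s)", (product_id, name))

conn2.commit()  # InnoDB is transactional: without this the rows vanish on close
conn1.close()
conn2.close()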

Select Data from Table and Insert into a different DB

I'm using Python and psycopg2 to remotely query some PostgreSQL databases, and I'm trying to figure out the best way to select the data I need from the remote table and insert it into a table on a separate DB (the local application server).
Most of what I've read has directed me to avoid executemany and look toward COPY operations, but I'm unsure how to implement this for a specific SELECT statement, as opposed to the entire table. Should I be headed this way, or am I completely off?
but I'm unsure how to implement this on a specific select statement as opposed to the entire table
COPY isn't limited to tables; you can use a query as the source as well. Check out the examples in the manual, which show how to use COPY to create a text file based on a query:
http://www.postgresql.org/docs/current/static/sql-copy.html#AEN59055
(3rd example)
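For example, here is a sketch of streaming a filtered SELECT from the remote database into the local one with psycopg2's copy_expert; the connection strings, query, and table names are hypothetical:

import io
import psycopg2

# Hypothetical connections: src is the remote DB, dst the local application server.
src = psycopg2.connect("host=remote-host dbname=srcdb user=user password=secret")
dst = psycopg2.connect("host=localhost dbname=appdb user=user password=secret")

buf = io.StringIO()

# COPY accepts a parenthesized query, so only the selected rows are exported.
with src.cursor() as cur:
    cur.copy_expert(
        "COPY (SELECT id, name FROM events WHERE active) TO STDOUT WITH CSV",
        buf,
    )

buf.seek(0)  # rewind the buffer before loading it into the target table

with dst.cursor() as cur:
    cur.copy_expert("COPY local_events (id, name) FROM STDIN WITH CSV", buf)
dst.commit()

src.close()
dst.close()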
Take a look at http://ryrobes.com/featured-articles/using-a-simple-python-script-for-end-to-end-data-transformation-and-etl-part-1/
Granted, this is pulling from Oracle and inserting into SQL Server, but the concepts should be the same.
