My Python code writes to a temporary database - Python

I wrote a Python program that writes/reads data to/from a MySQL DB. The problem is that the table is still empty even after I write to it, and when I close the program I lose all the data.
This is how I created the table:
cursor.execute("CREATE TABLE IF NOT EXISTS employees (id INT UNSIGNED NOT NULL AUTO_INCREMENT,Name VARCHAR(20) NOT NULL,LastName VARCHAR(20) NOT NULL,Post VARCHAR(20) NOT NULL,RasID SMALLINT UNSIGNED NOT NULL,PRIMARY KEY (id)) ENGINE=INNODB;")
This is how I insert data into the table:
cursor.execute("INSERT INTO employees VALUES (NULL, %s, %s, %s, %s);",(UserName, UserLastName, UserPost, int(data['RasID'])))
And this is how I select data from the table:
cursor.execute("SELECT * FROM employees WHERE Name= %s;",(Jmsg['face'],))
This is what I get after inserting data into the table, while the program is still running:
mysql> select * from employees;
Empty set (0.00 sec)
NB: I can select the data from within the program after inserting it, while the program is still running, but as I mentioned, the table looks empty from the mysql client. So, is the code writing to a temporary table or what?

Try
connection.commit()
MySQL Connector/Python, which you're probably using here, does not autocommit, which means that you have to commit manually to "push" changes to the database.
You may want to commit after every execute, but you can also batch several statements into one commit to save round trips (at the risk of losing all uncommitted changes if something goes wrong).
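A minimal sketch of the write path with an explicit commit, assuming a mysql.connector connection and the employees table from the question (the connection parameters are placeholders):
import mysql.connector

# Placeholder credentials -- replace with your own.
conn = mysql.connector.connect(host="localhost", user="me", password="secret", database="mydb")
cursor = conn.cursor()

cursor.execute(
    "INSERT INTO employees VALUES (NULL, %s, %s, %s, %s);",
    ("John", "Smith", "Manager", 42),
)
conn.commit()  # without this, the row only lives in an uncommitted transaction

cursor.execute("SELECT * FROM employees WHERE Name = %s;", ("John",))
print(cursor.fetchall())

cursor.close()
conn.close()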

A transaction in a database generally takes no effect until it is committed.
I don't know about MySQL in Python, but I do know that sqlite3's Connection instances have a commit method that will write the transaction to the database.
In addition, when working with sqlite3, using the connection as a context manager (a with block) commits the current transaction on success; simply calling Connection.close() does not commit pending changes.
But anyway, it's bad practice to leave an object that was opened unclosed.
And by "bad practice", I mean "dangerous and prone to bugs".

Related

(Multiple Questions) Python script compared to MySQL Workbench: Error Code: 2013. Lost connection to MySQL server during query; and a unique index

A little background: I am building up a database of stock prices for analysis (if this is not the best method, let me know). I am using MySQL and running Python scripts through mysql.connector. I have a table Prices that looks a bit like this:
CREATE TABLE `Prices` (
`id` int PRIMARY KEY AUTO_INCREMENT,
`date_id` int,
`ticker_id` int,
`Open` decimal(6,2),
`Close` decimal(6,2),
`High` decimal(6,2),
`Low` decimal(6,2),
`Volume` int,
`Adj_Open` decimal(6,2),
`Adj_Close` decimal(6,2),
`Adj_High` decimal(6,2),
`Adj_Low` decimal(6,2),
`Adj_Volume` decimal(6,2)
);
So far I have only stored the prices for Apple in it, and it has a little over 10,000 entries. I tried to select/show the table from MySQL Workbench but kept getting Error Code: 2013. Lost connection to MySQL server during query. The same thing happens when I try to truncate the table. I even increased the timeout from 30s to 600s, as suggested in one StackOverflow post, but I still get the same error.
When I run the following in Python:
def print_table(name):
    mycursor = mydb.cursor()
    text = "SELECT * FROM " + name
    mycursor.execute(text)
    myresult = mycursor.fetchall()
    for i in myresult:
        print(i)
    mycursor.close()

# print table prices
print_table("prices")
It takes less than 2 seconds and I get a printout of the entire table's contents. Are there other settings I have to change to be able to run these queries in Workbench without getting errors?
Second question:
I would like to have a unique key based on two columns of the table, date_id and ticker_id. How can I go about setting this up, so that the following code works without entering the same date and ticker twice when I update the table in the future?
sql = "INSERT IGNORE INTO prices (date_id,ticker_id,open,close,high,low,volume) VALUES (%s,%s,%s,%s,%s,%s,%s)"
mycursor.execute(sql,(date_id,ticker_id,p_open, p_close,p_high,p_low,p_volume))
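One way to get that behaviour is a composite unique key over date_id and ticker_id, so INSERT IGNORE skips rows that would duplicate an existing pair. A sketch, assuming the mydb connection and the prices table from the question (the constraint name is arbitrary):
# Add a composite unique key so a (date_id, ticker_id) pair can only appear once.
mycursor = mydb.cursor()
mycursor.execute(
    "ALTER TABLE prices ADD CONSTRAINT uq_date_ticker UNIQUE (date_id, ticker_id)"
)
mydb.commit()

# With the unique key in place, INSERT IGNORE silently skips duplicate pairs.
sql = ("INSERT IGNORE INTO prices "
       "(date_id, ticker_id, open, close, high, low, volume) "
       "VALUES (%s, %s, %s, %s, %s, %s, %s)")
mycursor.execute(sql, (date_id, ticker_id, p_open, p_close, p_high, p_low, p_volume))
mydb.commit()
mycursor.close()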

Postgres in Python (psycopg2): INSERT INTO statement hangs/freezes without inserting or crashing

I'm attempting to insert data into my Persons table. I've tried both by using a cursor in Python (cursor.execute(sql)) and by connecting to the database from the terminal and inserting there. However, the program simply stops at the point of execution. I commit at the end. The table is empty and looks like this:
CREATE TABLE Persons(
AKey INT PRIMARY KEY, -- Person ID primary key
Name VARCHAR(128) UNIQUE NOT NULL, -- Person name
Website VARCHAR(256), -- URL for persons website
IKey INT REFERENCES Institutions -- Institution affiliation
);
And an example insert looks like this:
INSERT INTO persons (Akey, Name, Website, IKey) VALUES(1, 'John Smith', 'www.foo.bar', 1);
The insert is not made, nor does it produce an error. The terminal or Python just stops at the insert statement, apparently not doing anything. Inserting into other tables works without any problems.
EDIT:
I should mention that I am the only one doing transactions on this database, which only contains empty tables.
It sounds like you are blocked on a lock. See Lock Monitoring. Some other session has done something which conflicts with yours, such as inserting a person with the same akey or name as you are trying to insert, or deleting the row from institutions which you are trying to reference, and that session has not committed. Now your session is waiting to see if that other one commits or rolls back.
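If you want to confirm that the session really is waiting on a lock, a diagnostic sketch along these lines may help (pg_blocking_pids() exists in PostgreSQL 9.6 and later; the connection parameters are placeholders):
import psycopg2

conn = psycopg2.connect(database="mydb", user="me", password="secret")
cur = conn.cursor()

# List sessions that are blocked, together with the PIDs blocking them.
cur.execute("""
    SELECT pid, pg_blocking_pids(pid) AS blocked_by, state, query
    FROM pg_stat_activity
    WHERE cardinality(pg_blocking_pids(pid)) > 0;
""")
for row in cur.fetchall():
    print(row)

cur.close()
conn.close()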
I think you forgot to commit your query.
A full example looks like this:
import contextlib
import psycopg2

db_config = dict(database="...", user="...", password="...")

with contextlib.closing(psycopg2.connect(**db_config)) as connection:
    try:
        with contextlib.closing(connection.cursor()) as cursor:
            sql = "INSERT INTO persons (Akey, Name, Website, IKey) VALUES(1, 'John Smith', 'www.foo.bar', 1);"
            cursor.execute(sql)
        connection.commit()
    except:
        connection.rollback()
        raise

Why does a MySQL Python query not insert a new entry?

So I am trying to add a new entry to my MySQL database.
The problem is that it increases the id but does not add the entry.
After a little bit of googling I found that a failed INSERT query also increases the AUTO_INCREMENT value (id in my case).
The MySQL table is created using
CREATE TABLE IF NOT EXISTS TS3_STAMM_1 (id INT(6) UNSIGNED AUTO_INCREMENT PRIMARY KEY, name VARCHAR(64) NOT NULL, ts3_uid VARCHAR(64) NOT NULL, points INT(8) UNSIGNED NOT NULL); which is executed with qServer.execute(querystring) using Python's MySQLdb module.
Then I use qString = "INSERT INTO TS3_STAMM_1 (name, ts3_uid, points) VALUES ('{}', '{}', {})".format(name, uid, pnts) (the data types are correct, I quadruple-checked at least) and call qServer.execute(qString) to insert a new entry into the database.
It is incrementing the ID, but it's not adding an entry. So my guess is it's a failed query, but why? How does it happen? How do I fix it?
Simple SELECT queries work fine the same way, and adding data manually also works fine. Only the Python query fails.
Note: qServer is the connection to the server, and it's defined with:
try:
    qConn = MySQLdb.connect(host="...", user="...", passwd="...", db="...")
    qServer = qConn.cursor()
except OperationalError:
    print("Cannot connect to mySQL Database! Aborting...")
    exit(1)
Use commit Luke.
>>> cursor.execute("INSERT INTO employees (first_name) VALUES (%s)", ('Jane', ))
>>> qConn.commit()
Using str.format to build an SQL query is a bad idea.
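A sketch of the same insert with a parameterised query instead of str.format, reusing qConn and qServer from the question; the driver escapes the values, which avoids quoting bugs and SQL injection:
# qServer is the MySQLdb cursor and qConn the connection from the snippet above.
qString = "INSERT INTO TS3_STAMM_1 (name, ts3_uid, points) VALUES (%s, %s, %s)"
qServer.execute(qString, (name, uid, pnts))
qConn.commit()  # without the commit the new row never becomes visible to other sessions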

Table won't alter when using psycopg

I'm having some trouble altering tables in my Postgres database. I'm using psycopg2, working from Python. I tried to add a serial primary key. It took a long time (it's a large table) and threw no error, so it did something, but when I went to check, the new column wasn't there.
I'm hoping this is something silly that I've missed, but right now I'm at a total loss.
import psycopg2
username = *****
password = *****
conn = psycopg2.connect(database='mydb',user=username,password=password)
query = "ALTER TABLE mytable ADD COLUMN sid serial PRIMARY KEY"
cur = conn.cursor()
cur.execute(query)
conn.close()
Other things I've tried while debugging:
It doesn't work when I remove PRIMARY KEY.
It doesn't work when I try a different data type.
You need to add a commit statement in order for your changes to be reflected in the table. Add this before you close the connection:
conn.commit()
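Put together, a sketch of the script from the question with the missing commit added (the credentials are placeholders, as in the original):
import psycopg2

conn = psycopg2.connect(database='mydb', user='myuser', password='mypassword')
cur = conn.cursor()
cur.execute("ALTER TABLE mytable ADD COLUMN sid serial PRIMARY KEY")
conn.commit()  # without this, closing the connection rolls the ALTER TABLE back
conn.close()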

How can I insert this into my table?

Creating my table:
cursor.execute("""
CREATE TABLE if not exists intraday_quote (
id INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT,
symbol VARCHAR(10) NOT NULL,
date DATE,
time DATE,
open FLOAT,
high FLOAT,
low FLOAT,
close FLOAT,
volume INTEGER);
""")
And I'm trying to insert this:
conn = sqlite3.connect('intraday_quote.db')
cursor = conn.cursor()
# Prepare SQL query to INSERT a record into the database.
sql = """INSERT INTO intraday_quote(symbol) VALUES ('Mac123432')"""
cursor.execute(sql)
No insertion happened in the database. What am I missing?
You need to commit your changes so they take effect in the database.
Commit all DB operations like UPDATE and INSERT with
conn.commit()
after your execute has succeeded. If something goes wrong, use rollback() instead (exercise for you :) ) so you won't end up with wrong data in the database.
You need to do conn.commit() to see the changes in the database. Quoting the documentation
This method commits the current transaction. If you don’t call this method, anything you did since the last call to commit() is not visible from other database connections. If you wonder why you don’t see the data you’ve written to the database, please check you didn’t forget to call this method.
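For completeness, a sketch of the insert from the question with the commit added; note that in sqlite3, commit() lives on the connection, not the cursor:
import sqlite3

conn = sqlite3.connect('intraday_quote.db')
cursor = conn.cursor()
cursor.execute("INSERT INTO intraday_quote(symbol) VALUES ('Mac123432')")
conn.commit()  # persist the row before closing
conn.close()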
