I'm running this from PyDev in Eclipse...
import pymysql
conn = pymysql.connect(host='localhost', port=3306, user='userid', passwd='password', db='fan')
cur = conn.cursor()
print "writing to db"
cur.execute("INSERT INTO cbs_transactions(leagueID) VALUES ('test val')")
print "wrote to db"
The result is, at the top of the Console it says C:...test.py, and in the Console:
writing to db
wrote to db
So the script isn't dying before the execute call; it runs all the way through. But when I look at the table in MySQL, it's empty. A record did not get inserted.
First, why isn't it writing the record? Second, how can I see a log or error message to find out what happened? Usually there's some kind of error in red if the code fails.
Did you commit it? conn.commit()
PyMySQL disables autocommit by default; you can add autocommit=True to connect():
conn = pymysql.connect(
    host='localhost',
    user='user',
    passwd='passwd',
    db='db',
    autocommit=True
)
or call conn.commit() after the insert.
You can either do
conn.commit() before calling close
or
enable autocommit via conn.autocommit(True) right after creating the connection object.
Both approaches have been suggested by various people on a duplicate of this question, which can be found here: Database does not update automatically with MySQL and Python
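Putting that together, here is a minimal sketch of the original snippet with an explicit commit added (host, credentials, table and column names carried over from the question):

import pymysql

conn = pymysql.connect(host='localhost', port=3306,
                       user='userid', passwd='password', db='fan')
cur = conn.cursor()
print("writing to db")
cur.execute("INSERT INTO cbs_transactions(leagueID) VALUES ('test val')")
conn.commit()   # without this, the INSERT is discarded when the connection closes
print("wrote to db")
cur.close()
conn.close()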
Related
connection.py File
import psycopg2

def create_connection():
    connection = psycopg2.connect("dbname=suppliers user=postgres password=postgres")
    return connection

def create_cursor():
    connection = create_connection()
    cursor = connection.cursor()
    return cursor
The example above creates two connections when both create_connection() and create_cursor() are called.
Query File
def common_query():
    sql_query = "INSERT INTO supply VALUES (1, 'test', 'test')"
    conn = create_connection()
    cursor = create_cursor()
    with conn:
        with cursor as cur:
            cur.execute(sql_query)
    conn.close()
The example above calls both create_connection() and create_cursor(), but as you can see, create_connection() already establishes a connection, and create_cursor() calls create_connection() again, which creates a second connection.
So when the query executes, it doesn't show any error, but it also doesn't insert my data into the database.
Can someone explain what is happening here?
You create two connections for each call to common_query. One is explicitly closed; the other is closed at some point because it went out of scope. (Python is a garbage-collected language.)
You don't commit on either one, so whatever work you did gets rolled back automatically. This is unrelated to the first point. The same thing would happen if you had only created one connection (and also didn't commit on it).
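Here is a minimal sketch of one way to restructure it (connection string carried over from the question): create a single connection and take the cursor from that same connection, letting psycopg2's with-connection block commit on success.

import psycopg2

def common_query():
    sql_query = "INSERT INTO supply VALUES (1, 'test', 'test')"
    conn = psycopg2.connect("dbname=suppliers user=postgres password=postgres")
    try:
        with conn:                      # commits on success, rolls back on error
            with conn.cursor() as cur:  # cursor belongs to the same connection
                cur.execute(sql_query)
    finally:
        conn.close()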
I am not able to connect to the MySQL server using Python. It gives an error which says:
MySQLdb._exceptions.OperationalError: (1130, "Host 'LAPTOP-0HDEGFV9' is not allowed to connect to this MySQL server")
The code I'm using:
import MySQLdb

db = MySQLdb.connect(host="LAPTOP-0HDEGFV9",  # your host, usually localhost
                     user="root",             # your username
                     passwd="abcd13de",
                     db="testing")            # name of the database
cur = db.cursor()
cur.execute("SELECT * FROM Employee")
for row in cur.fetchall():
    print(row[0])
db.close()
This is an authorization problem, not a connectivity problem. Is the DB running locally? If not, confirm with the admin where it is hosted. If so, try changing the host parameter to 127.0.0.1.
As described here, the admin can get the hostname by running:
select @@hostname;
show variables where Variable_name like '%host%';
If the connection were timing out, you could try setting the connect_timeout kwarg, but that's already None by default.
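If the server is in fact running locally, the suggested change is just a different host value; a sketch reusing the credentials from the question:

import MySQLdb

db = MySQLdb.connect(host="127.0.0.1",   # connect via the loopback address instead of the machine name
                     user="root",
                     passwd="abcd13de",
                     db="testing")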
All the tutorials told me that a database connection is a precious resource: we must close it after doing some operations on it, and reopen it when we want to do something else. But the only thing I can find is a property (open) that indicates the connection status.
Does that mean I need to create a connection object for every query/update/delete?
If I don't create a connection for every operation (code like below), how do I safely destroy the connection?
import pymysql

connection = pymysql.connect(host='localhost',
                             user='root',
                             password='password',
                             db='blog',
                             charset='utf8',
                             cursorclass=pymysql.cursors.DictCursor)

with connection.cursor() as cursor:
    # read some records
    sql = "SELECT * from categories"
    cursor.execute(sql)
    result = cursor.fetchall()
    print(result)

connection.close()

# do other things ..............................
# maybe return here and never execute the code below

# the error occurs below
with connection.cursor() as cursor:
    # read some records
    sql = "SELECT * from categories"
    cursor.execute(sql)
    result = cursor.fetchall()
    print(result)
Normally your program opens a database connection and keeps it open until the program finishes. Creating/opening a connection is an expensive operation, so you normally want to keep it open for the duration of your program. During the creation of a connection the database has to allocate resources for that connection. If you open and close a connection for every operation, this will negatively affect overall database performance, since the database needs exclusive access to those memory structures, and that causes waits.
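A minimal sketch of that pattern with PyMySQL (connection arguments carried over from the question): open one connection for the lifetime of the script, open a short-lived cursor per operation, and close the connection exactly once at the end.

import pymysql

connection = pymysql.connect(host='localhost', user='root', password='password',
                             db='blog', charset='utf8',
                             cursorclass=pymysql.cursors.DictCursor)
try:
    # first operation: a short-lived cursor on the long-lived connection
    with connection.cursor() as cursor:
        cursor.execute("SELECT * from categories")
        print(cursor.fetchall())

    # ... do other things ...

    # later operations reuse the same connection
    with connection.cursor() as cursor:
        cursor.execute("SELECT * from categories")
        print(cursor.fetchall())
finally:
    connection.close()   # destroy the connection once, when the program is done

For long-running programs, PyMySQL also provides connection.ping(reconnect=True), which can be called before an operation to revive a connection the server has dropped.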
I'm using the MySQLdb module for Python to make some simple queries. When I do a certain UPDATE, it hangs for a while and finally gives this error:
OperationalError (1205, 'Lock wait timeout exceeded; try restarting transaction')
The code I'm using is the following:
def unselectAll():
try:
db = MySQLdb.connect(host='localhost', user='user', passwd='', db='mydatabase')
cursor = db.cursor()
cursor.execute('UPDATE MYTABLE SET Selected=0')
except MySQLdb.Error, e:
print 'ERROR ' + e.args[0] + ': ' + e.args[1]
If I run that query in the console, it works perfectly. Also, connecting without the db parameter and using mydatabase.MYTABLE in the query doesn't work either.
Any help?
This could be because the UPDATE isn't getting committed. Have you tried autocommit=True for the connection? As in
db = MySQLdb.connect(host='localhost', user='user', passwd='', db='mydatabase', autocommit=True)
or maybe even
db.autocommit(True)
after you've created the connection.
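An explicit commit (plus a rollback on error) works as well; a sketch of that variant, reusing the connection parameters from the question:

import MySQLdb

def unselectAll():
    db = MySQLdb.connect(host='localhost', user='user', passwd='', db='mydatabase')
    try:
        cursor = db.cursor()
        cursor.execute('UPDATE MYTABLE SET Selected=0')
        db.commit()        # releases the row locks held by this transaction
    except MySQLdb.Error as e:
        db.rollback()      # give up the locks instead of holding them until another UPDATE times out
        print('ERROR %d: %s' % (e.args[0], e.args[1]))
    finally:
        db.close()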
I have a python script that makes about ten INSERTs into a MySQL database. This is its current structure:
conn = MySQLdb.connect(host=DB_HOST,
                       port=DB_PORT,
                       user=DB_USER,
                       passwd=DB_PASSWORD,
                       db=DB_NAME)
cursor = conn.cursor()

cursor.execute("INSERT INTO...")
# do some stuff, then another INSERT
cursor.execute("INSERT INTO...")
# do some other stuff, then another INSERT
cursor.execute("INSERT INTO...")
# etc...

conn.commit()
cursor.close()
conn.close()
Is the above the correct way to do multiple inserts, or should I be closing the cursor after each INSERT? Should I be doing a commit after each INSERT?
should I be closing the cursor after each INSERT?
It doesn't much matter. Cursors are reused cleverly.
You can close them to be super careful of your resources.
Do this.
from contextlib import closing

with closing(conn.cursor()) as cursor:
    cursor.execute("INSERT INTO...")
This assures that the cursor is closed no matter what kind of exceptions happen.
Should I be doing a commit after each INSERT?
That depends on what your application is expected to do.
If it's an "all or nothing" proposition, then you do one commit. All the inserts are good or none of them are.
If partial results are acceptable, then you can commit after each insert.
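A sketch of the all-or-nothing variant, reusing conn from the question's snippet and keeping its placeholder INSERT statements:

from contextlib import closing

try:
    with closing(conn.cursor()) as cursor:
        cursor.execute("INSERT INTO ...")   # placeholder statements
        cursor.execute("INSERT INTO ...")
        cursor.execute("INSERT INTO ...")
    conn.commit()       # one commit: either all the inserts land or none do
except MySQLdb.Error:
    conn.rollback()     # any failure undoes the whole batch
    raise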