Error checking with MySQLdb - python

I'm having trouble finding any information on how to do error checking with MySQLdb. I have been trying to run a simple UPDATE command against a MySQL database and it simply is not working, no matter how I change the terms or the type of the variables I submit to it.
Here are some of my (commented out) attempts:
timeid = twitseek['max_id']
#timeup = "UPDATE `timeid` set `timestamp`='" + str(timeid) + "';"
#print timeup
#c.execute(timeup)
#timeup = "UPDATE timeid SET timestamp=\"" + str(timeid) + "\"";
#timeup = "UPDATE timeid set timestamp = '500';"
timeup = 500
c.execute("""UPDATE timeid SET timestamp = %d;""", timeup)
#c.execute(timeup)
All I want to do is write the value of timeid into the timestamp column (the first row, or any row) of the table timeid.
Nothing I do seems to work and I've been sitting here for literally hours trying countless iterations.

You seem to be missing an obligatory call to .commit() on your connection object to commit your change.
# Your cursor is c
# We don't see your connection object, but assuming it is conn...
c.execute("""UPDATE timeid SET timestamp = %d;""", timeup)
conn.commit()
The query above also uses a %d marker; MySQLdb's parameter substitution only understands %s, and building the SQL string yourself would lose the security benefit of parameterized queries. The proper method is to use %s and pass in a tuple of parameters:
c.execute("UPDATE timeid SET timestamp = %s;", (timeup,))
conn.commit()
From the MySQLdb FAQ:
Starting with 1.2.0, MySQLdb disables autocommit by default, as
required by the DB-API standard (PEP-249). If you are using InnoDB
tables or some other type of transactional table type, you'll need to
do connection.commit() before closing the connection, or else none of
your changes will be written to the database.
Conversely, you can also use connection.rollback() to throw away any
changes you've made since the last commit.
As far as error checking goes, a failed connection or a syntactically invalid query will raise an exception, so you would want to wrap the calls in a try/except block, as is common in Python.
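For example, a minimal sketch (the connection parameters are placeholders; MySQLdb.Error is the DB-API base class the driver's exceptions derive from, and timeup comes from the question):
import MySQLdb

try:
    conn = MySQLdb.connect(host="localhost", user="user",
                           passwd="password", db="mydb")
except MySQLdb.Error as e:
    print("Could not connect: %s" % e)
    raise

c = conn.cursor()
try:
    c.execute("UPDATE timeid SET timestamp = %s;", (timeup,))
    conn.commit()
except MySQLdb.Error as e:
    # e.args is typically (error_code, error_message)
    print("Query failed: %s" % e)
    conn.rollback()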


Updating an SQL table while searching another SQL table

I have an SQL database "Garage.db" that has 3 tables:
Customer, Car and MOT
I want to update the field BookedMOT in the MOT table when someone has entered a Registration that is in the Car table. Can someone help me with the SQL query to do this? Thank you.
I am coding this in Python 3.6 using tkinter. Here is my attempt:
def Booked(self):
    date = Date.get()
    month = Month.get()
    year = Year.get()
    BookedMOT = (date + '/' + month + '/' + year)
    Registration = self.RegistrationEnt.get()
    with sqlite3.connect('Garage.db') as db:
        cursor = db.cursor()
        add_date = ('UPDATE MOT SET MOT.BookedMOT = ? FROM Car WHERE Car.Registration = ?')
        cursor.execute(add_date, [(BookedMOT), (Registration)])
        db.commit()
(this addresses some Python problems I noticed before even realising that the SQL didn't look right, which should probably be fixed first)
Try this:
with sqlite3.connect('Garage.db') as db:
    db.execute(add_date, (BookedMOT, Registration))
In general, when you say with ... as x:, you should use x inside the with block. When the block finishes it performs an automatic commit, so you don't have to call db.commit() any more, and trying to use db or cursor after the block is probably incorrect.
sqlite3 connection objects (db in this case) have an execute method:
This is a nonstandard shortcut that creates a cursor object by calling the cursor() method, calls the cursor’s execute() method with the parameters given, and returns the cursor.
Finally, you had some redundant parentheses that could be removed.
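Putting those pieces together, a possible rewrite might look like the sketch below. Note this is only a sketch: it assumes the MOT table has its own Registration column linking it to Car, which may not match your real schema, and the SQL should still be checked against it.
def Booked(self):
    BookedMOT = Date.get() + '/' + Month.get() + '/' + Year.get()
    Registration = self.RegistrationEnt.get()
    # Only update rows whose registration actually exists in Car.
    add_date = ('UPDATE MOT SET BookedMOT = ? '
                'WHERE Registration = ? '
                'AND EXISTS (SELECT 1 FROM Car WHERE Car.Registration = ?)')
    with sqlite3.connect('Garage.db') as db:
        db.execute(add_date, (BookedMOT, Registration, Registration))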

MySQL read_sql Python query with variable @

I am aware that queries in Python can be parameterized using either ? or %s in an execute query (see here or here).
However, I have a long query that uses a constant variable defined at the beginning of the query:
Set @my_const = 'xyz';
select @my_const;
-- Query that uses @my_const 40 times
select ... coalesce(field1, @my_const), case(.. then @my_const)...
I would like to make as few modifications to the MySQL query as possible, so that instead of rewriting it as
pd.read_sql(select ... coalesce(field1, %s), case(.. then %s)... , [my_const, my_const, my_const, ..]
I could write something along the lines of the initial query. Upon trying the following, however, I am getting a TypeError: 'NoneType' object is not iterable:
query_str = "Set @null_val = \'\'; "\
            " select @null_val"
erpur_df = pd.read_sql(query_str, con = db)
Any idea how to use the original variable defined in the MySQL query?
The reason
query_str = "Set @null_val = \'\'; "\
            " select @null_val"
erpur_df = pd.read_sql(query_str, con = db)
throws that exception is that all you are doing is setting @null_val to '' and then selecting that '' - what exactly would you have expected that to give you? EDIT: read_sql only seems to execute one query at a time, and because the first query returns no rows, it results in that exception.
If you split them into two calls to read_sql, the second call will in fact return the value of your @null_val. Given this behaviour, read_sql is clearly not a good way to do this; I strongly suggest you use one of my suggestions below.
Why do you want to set the variable in the SQL using '@' anyway?
You could try using the .format style of string formatting.
Like so:
query_str = "select ... coalesce(field1, {c}), case(.. then {c})...".format(c=my_const)
pd.read_sql(query_str)
Just remember that if you do it this way and your my_const is a user input then you will need to sanitize it manually to prevent SQL injection.
Another possibility is using a dict of params like so:
query_str = "select ... coalesce(field1, %(my_const)s, case(.. then %(my_const)s)..."
pd.read_sql(query_str, params={'my_const': const_value})
However this is dependent on which database driver you use.
From the pandas.read_sql docs:
Check your database driver documentation for which of the five syntax
styles, described in PEP 249’s paramstyle, is supported. Eg. for
psycopg2, uses %(name)s so use params={'name' : 'value'}
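For example, with a pymysql connection (whose paramstyle supports the %(name)s pyformat style), the same named parameter can be reused any number of times in the query while appearing only once in params. A sketch with illustrative table and column names:
import pandas as pd
import pymysql

db = pymysql.connect(host='localhost', user='user',
                     password='secret', db='mydb')

query_str = """
    SELECT COALESCE(field1, %(my_const)s) AS field1
    FROM my_table
    WHERE field2 <> %(my_const)s
"""
erpur_df = pd.read_sql(query_str, con=db, params={'my_const': 'xyz'})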

Error in Sqlite while Updating the table using Python

I am using Python to update entries in an SQLite table.
The command I am using is:
handle.execute("UPDATE RECORD set NAME=%s DEVICE=%s PROJECT=%s IP=%s COMMENT=%s where ID = %s"%(arg[2],arg[3],arg[4],arg[5],arg[6],arg[1]))
To this I get am getting an error as:
sqlite3.OperationalError: near "DEVICE": syntax error
I cannot understand what is specifically wrong with DEVICE. I have also checked that the variables are as expected. The database has a column named device, and it can be opened, accessed, and edited from this Python file.
There are commas missing between the SET items.
In addition, instead of string formatting, pass parameters to prevent SQL injection. Note that sqlite3 uses ? placeholders rather than %s:
handle.execute(
    """UPDATE RECORD
       SET NAME=?, DEVICE=?, PROJECT=?, IP=?, COMMENT=?
       WHERE ID = ?""",
    (arg[2], arg[3], arg[4], arg[5], arg[6], arg[1]))
UPDATE
If you insist on using string formatting, you should at least quote the placeholders: '%s'.
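With this many columns, sqlite3's named placeholders can also be easier to read; a sketch using the same arg values as above:
handle.execute(
    """UPDATE RECORD
       SET NAME=:name, DEVICE=:device, PROJECT=:project,
           IP=:ip, COMMENT=:comment
       WHERE ID = :id""",
    {"name": arg[2], "device": arg[3], "project": arg[4],
     "ip": arg[5], "comment": arg[6], "id": arg[1]})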

PyMySQL throws 'BrokenPipeError' after making frequent reads

I have written a script to help me work with a database. Specifically, I am trying to work with files on disk and add the result of this work to my database. I have copied the code below, but removed most of the logic which isn't related to my database to try to keep this question broad and helpful.
I used the code to operate on the files and add the result to the database, overwriting any files with the same identifier as the one I was working on. Later, I modified the script to ignore documents which have already been added to the database, and now whenever I run it I get an error:
pymysql.err.OperationalError: (2006, "MySQL server has gone away (BrokenPipeError(32, 'Broken pipe'))")
It seems like the server is rejecting the requests, possibly because I have written my code poorly? I have noticed that the error always occurs at the same place in the list of files, which doesn't change. If I re-run the code, replacing the file list with a list containing only the file on which the program crashes, it works fine. This makes me think that after making a certain number of requests, the database just bottoms out.
I'm using Python 3 and MySQL Community Edition Version 14.14 on OS X.
Code (stripped of stuff that doesn't have to do with the database):
import pymysql

# Stars for user-specific stuff
connection = pymysql.connect(host='localhost',
                             user='root',
                             password='*******',
                             db='*******',
                             use_unicode=True,
                             charset="utf8mb4",
                             )
cursor = connection.cursor()

f_arr = # An array of all of my data objects

def convertF(file_):
    # General layout: Try to work with the input and add the result to the DB. The work can raise an exception
    # If the record already exists in the DB, ignore it
    # Elif the work was already done and the result is on disk, put it on the database
    # Else do the work and put it on the database - this can raise exceptions
    # Except: Try another way to do the work, and put the result in the database. This can raise an error
    # Second (nested) except: Add the record to the database with an indicator that the work failed
    # This worked before I added the initial check on whether or not the record already exists in the database.
    # Now, for some reason, I get the error:
    # pymysql.err.OperationalError: (2006, "MySQL server has gone away (BrokenPipeError(32, 'Broken pipe'))")
    # I'm pretty sure that I have written code that works poorly with the database. I had hoped to finish this task quickly instead of efficiently.
    try:
        # Find the record in the DB; if text exists, just ignore the record
        rc = cursor.execute("SELECT LENGTH(text) FROM table WHERE name = '{0}'".format(file_["name"]))
        length = cursor.fetchall()[0][0]  # Gets the length
        if length != None and length > 4:
            pass
        elif ( "work already finished on disk" ):
            # get "result_text" from disk
            cmd = "UPDATE table SET text = %s, hascontent = 1 WHERE name = %s"
            cursor.execute(cmd, ( pymysql.escape_string(result_text), file_["name"] ))
            connection.commit()
        else:
            # do work to get result_text
            cmd = "UPDATE table SET text = %s, hascontent = 1 WHERE name = %s"
            cursor.execute(cmd, ( pymysql.escape_string(result_text), file_["name"] ))
            connection.commit()
    except:
        try:
            # Alternate method of work to get result_text
            cmd = "UPDATE table SET text = %s, hascontent = 1 WHERE name = %s"
            cursor.execute(cmd, ( pymysql.escape_string(result_text), file_["name"] ))
            connection.commit()
        except:
            # Since the job can't be done, tell the database
            cmd = "UPDATE table SET text = %s, hascontent = 0 WHERE name = %s"
            cursor.execute(cmd, ( "NO CONTENT", file_["name"]) )
            connection.commit()

for file in f_arr:
    convertF(file)
MySQL Server Has Gone Away
This problem is described extensively at http://dev.mysql.com/doc/refman/5.7/en/gone-away.html. The usual cause is that the server has disconnected for whatever reason, and the usual remedy is to retry the query, or to reconnect and retry.
But the reason this breaks your code is the way you have written it. See below.
Possibly because I have written my code poorly?
Since you asked.
rc = cursor.execute("SELECT LENGTH(text) FROM table WHERE name = '{0}'".format(file_["name"]))
This is a bad habit. The manual explicitly warns you against doing this to avoid SQL injections. The correct way is
rc = cursor.execute("SELECT LENGTH(text) FROM table WHERE name = %s", (file_["name"],))
The second problem with the above code is that you don't need to check whether a value exists before you try to update it. You can delete that line and its associated if/else and jump straight to the update. Besides, your elif and else branches seem to do exactly the same thing. So your code can just be
try:
    cmd = "UPDATE table SET text = %s, hascontent = 1 WHERE name = %s"
    cursor.execute(cmd, ( pymysql.escape_string(result_text), file_["name"] ))
    connection.commit()
except: # <-- next problem.
And we come to the next problem. Never catch generic exceptions like this; you should always catch specific exceptions such as TypeError, AttributeError, etc. When catching generic exceptions is unavoidable, you should at least log them.
For example, here you could catch connection errors and attempt to reconnect to the database. Then the code would not stop executing when your "server has gone away" problem happens.
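A rough sketch of that idea, using pymysql's ping(reconnect=True) to re-establish a dropped connection and retry once (the helper name is mine; cmd and the parameters follow the question's code):
def run_update(connection, cursor, cmd, params):
    try:
        cursor.execute(cmd, params)
        connection.commit()
    except pymysql.err.OperationalError:
        # The server may have gone away: reconnect and retry once.
        connection.ping(reconnect=True)
        cursor.execute(cmd, params)
        connection.commit()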
I solved the same error in a case where I was doing bulk inserts, by reducing the number of rows inserted in a single command. Even though the maximum allowed number of rows for a bulk insert was much higher, I still got this kind of error.

Correct Postgresql syntax

I'm a postgres newbie and am having some issues querying a text field in postgresql using Python. What is the correct syntax that will allow me to search the content of column "body" from table "jivemessage" out of database "postgres"?
import psycopg2

try:
    conn = psycopg2.connect("dbname='postgres' user='postgres' host='localhost' password='<password>'")
except:
    print "cannot connect"

i = 'test'
cur = conn.cursor()
cur.execute('SELECT * from jivemessage WHERE body LIKE "%'+i+'%"')
Keep getting the following error:
ProgrammingError: column "%test%" does not exist
Thanks for any help.
You are not quoting the query properly. In PostgreSQL, double quotes denote identifiers, so "%test%" is parsed as a column name, hence the error. Don't use string concatenation here; use SQL parameters instead:
cur.execute('SELECT * from jivemessage WHERE body LIKE %s', ("%{}%".format(i),))
Here, the %s placeholder signals to the database driver that the first value from the parameter tuple should be placed there when the query is executed.
This leaves the interpolation up to the database driver, giving the database the opportunity to prepare and optimize the query once, even if you reuse it.
It also prevents SQL injection attacks better than you could yourself, and most of all, guarantees that the correct quoting rules are followed.
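A short usage sketch building on that (table and column names taken from the question):
pattern = "%{}%".format(i)
cur.execute('SELECT * FROM jivemessage WHERE body LIKE %s', (pattern,))
for row in cur.fetchall():
    print(row)  # each row is a tuple of column values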
