prepared statements in psycopg3 - python

Are prepared statements written differently in psycopg3 than psycopg2?
I am upgrading from psycopg2 to 3, and in psycopg2 this works, but it doesn't work in psycopg3.
Also, the variables name and age will be populated with data.
sql is a variable holding the cursor connected to the database we use, so my problem is within the query itself and not the Python syntax.
def test(name, age=None, alive=False):
    sql.execute("""PREPARE test AS
        UPDATE table AS x
        SET var = $1
        FROM table1 AS y
        WHERE y.name = x.name AND y.age = $2""")
    sql.execute("""EXECUTE test (%s, %s)""", (name, age))
This throws an error saying could not determine the datatype of parameter $1. Looking at the docs and the GitHub repository of psycopg, there's no info regarding this matter.
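A common cause of this error is that Postgres cannot infer the parameter types in a hand-written PREPARE. A possible fix, shown as a sketch only (the ::text and ::int casts are guesses at the question's column types), is to cast the parameters explicitly:

```python
# Sketch: explicit casts let Postgres infer the types of $1 and $2.
# The ::text and ::int casts are assumptions about the question's schema.
prepare_sql = """PREPARE test AS
UPDATE table AS x
SET var = $1::text
FROM table1 AS y
WHERE y.name = x.name AND y.age = $2::int"""

# Alternatively, psycopg 3 can prepare statements itself: passing prepare=True
# to execute() (or exceeding the connection's prepare_threshold) makes the
# driver use a server-side prepared statement without any manual PREPARE:
#   sql.execute("UPDATE table AS x SET var = %s FROM table1 AS y "
#               "WHERE y.name = x.name AND y.age = %s", (name, age), prepare=True)
```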

Related

What is wrong with this row insertion?

I am working with MySQL in Python, with the PyMySQL module handling the connection. dbCursor is the cursor object.
I have the following statement:
statement = f"INSERT INTO Machine_status (timestamp, num_of_char, STATUS, emergency_status) VALUES ({timestamp},{num_of_char},{status},{emergency_status})"
dbCursor.execute(statement)
However, whenever I try to execute this statement, the program hangs. This is perplexing, because a similar statement:
statement = f"INSERT INTO Speed (timestamp, meters_per_second) VALUES ({timestamp}, {meters})"
dbCursor.execute(statement)
does not give any issue. What is wrong with the first statement? I'm truly at a loss as to what it could be. I checked and made sure that the variable names match, I checked the structure in the database to make sure the variables matched what the columns could accept, and I made sure there were no extra parentheses and that the syntax was correct, but it continues to hang when trying to execute the statement. The frustrating part is that running the statement directly in the database (using phpMyAdmin) gives no issues. What's going on here?
EDIT1: Added more relevant code.
EDIT2: I've already tried reformatting it to using placeholders instead (i.e. VALUES (%d, %d, %s, %d)). It doesn't resolve the issue.
EDIT3: CREATE TABLE statement
"CREATE TABLE Machine_status (id INT AUTO_INCREMENT PRIMARY KEY, timestamp int, num_of_char int, status varchar(255), emergency_status int)"
Complete insert string
payload = item[0].split("{")[1].split("}")[0].split(" ") #returns a list containing everything between brackets of statement
timestamp = int(payload[0])
num_of_char = int(payload[1])
status = payload[2]
emergency_status = int(payload[3])
statement = "INSERT INTO Machine_STATUS (timestamp, num_of_char, STATUS, emergency_status) VALUES (%d, %d, %s, %d)"
try:
    dbCursor.execute(statement, (timestamp, num_of_char, status, emergency_status))
    print("Complete.")
except Exception as e:  # avoid a bare except: it hides the real error
    print("something went wrong:", e)
EDIT4: I figured out what the issue was. pymysql was giving me issues with trying to extract the error code, so I switched to mysql.connector. It finally gave me an error of "table not found". Apparently the table name I was using in the code didn't match exactly what I was using in the database, and therefore it was causing the table to not be found. I changed it, and now the issue is gone.
As Klaus already wrote, don't use string interpolation; use placeholders.
Like this:
statement = "INSERT INTO Machine_status (timestamp, num_of_char, STATUS, emergency_status) VALUES (%s, %s, %s, %s)"
dbCursor.execute(statement, (timestamp, num_of_char, status, emergency_status))
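The bare except in the question's code is what hid the "table not found" error for so long. A small illustration with the stdlib sqlite3 driver (the misspelled table name here is made up; the same idea applies to pymysql and mysql.connector, which raise their own exception classes):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE Machine_status (timestamp INT, num_of_char INT)")

try:
    # Deliberately misspelled table name, mirroring the bug found in EDIT4.
    cur.execute("INSERT INTO Machine_status_typo VALUES (?, ?)", (1, 2))
except sqlite3.OperationalError as e:
    # Catching the driver's exception class (not a bare except) surfaces
    # the real cause immediately.
    error_message = str(e)
    print(error_message)  # no such table: Machine_status_typo
```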

Python and Prepared SQL Statements using Variables

I'm fairly new to Python but have a project I am working on, so please excuse any naivety on my part.
I am writing some SQL statements in Python 2.7 (the libraries have not been upgraded to 3 yet), but I am getting stuck on best-practice procedure for them. We are using Sybase. Initially I was using:
query = "UPDATE DB..TABLE SET version = '{}' WHERE name = '{}'".format(app.version, app.name)
cursor.execute(query)
But realised this after further reading that it is open to injection. So I then looked at doing the following:
query = "UPDATE DB..TABLE SET version = '%s' WHERE name = '%s'" % (app.version, app.name)
cursor.execute(query)
But it got me thinking: is this not the same thing?
The parameters are also variables set by argparse, which means I have to use '' around %s, otherwise it throws an invalid column name error. This is frustrating, because I also want to be able to pass NULL (None in Python) by default if any additional flags aren't set in other queries; otherwise it obviously inserts "NULL" as a string.
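The quoting problem described above can be reproduced with plain string formatting, no database required:

```python
# Why quoting the placeholder turns None into the string 'None': string
# interpolation just pastes in Python's text form of the value.
version = None
bad = "UPDATE DB..TABLE SET version = '%s'" % version
print(bad)  # UPDATE DB..TABLE SET version = 'None'

# With real parameter binding, cursor.execute(query, params), the driver
# sends NULL for None and quotes strings itself, so no '' is needed in the SQL.
```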
For this particular example the 2 variables are set from a file being read by ConfigParser but I think it's still the same for argparse variables. e.g.
[SECTION]
application=name
version=1.0
I'm not quite sure how best to tackle this issue, and yes, I know "PYTHON 3 IS BETTER, UPGRADE TO IT"; as I said at the start, the libraries are in the process of being ported.
If you need any additional info then please advise and I will give you the best I can.
UPDATE:
Using the following param-style string I found in some Sybase docs, it can work, but it does not pass None for NULL and throws errors. I am starting to think this is a limitation of the Sybase module in Python.
cursor.execute("SELECT * FROM DB..table where app_name=#app_name", {"#app_name": app_name})
or
params = {"#appname": app.name, "#appversion": app.version}
sql = "INSERT INTO DB..table (app_name, app_version) VALUES (#appname, #appversion)"
cursor.execute(sql, params)
There is an issue, though: if you have a global dict of params and feed it to a query, and any of them are None, it gives you a lot of errors about None again, EVEN if those specific params aren't used in the query. I think I may be stuck doing if statements for the various options here for multiple inserts, to bypass this None/NULL issue.
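One possible workaround for the unused-None problem, instead of per-query if statements (a sketch; the helper params_for is hypothetical, and the #name parameter style is the one from the Sybase docs above):

```python
def params_for(sql, all_params):
    # Keep only the #name parameters that actually appear in the statement,
    # so unused None entries never reach the driver.
    return {name: value for name, value in all_params.items() if name in sql}

all_params = {"#appname": "myapp", "#appver": None, "#other": None}
sql = "INSERT INTO DB..table (app_name) VALUES (#appname)"
print(params_for(sql, all_params))  # {'#appname': 'myapp'}
```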
OK, I have resolved this issue now.
I had to update the Sybase module in order to get it to work with None > NULL.
As posted in the updated question, the below is how I was running the queries:
cursor.execute("SELECT * FROM DB..table where app_name=#app_name", {"#app_name": app_name})
or
params = {"#appname": app.name, "#appversion": app.version}
sql = "INSERT INTO DB..table (app_name, app_version) VALUES (#appname, #appversion)"
cursor.execute(sql, params)
But it got me thinking: is this not the same thing?
Yes, those are effectively the same. They are both wide open to an injection attack.
Instead, do it this way:
query = "UPDATE DB..TABLE SET version = %s WHERE name = %s"
cursor.execute(query, [app.version, app.name])

dynamic table mysqldb python string/int issue

I am receiving an error when trying to write data to a database table when using a variable for the table name, which I do not get when using a static name. For some reason, on the line where I insert, the code runs and the table is filled if I insert integers as the column values; however, if I try to use a string, I get a SQL syntax error.
cursor = db.cursor()
cursor.execute('DROP TABLE IF EXISTS %s' % data[1])
sql = """CREATE TABLE %s (IP TEXT, AVAILIBILITY INT)""" % data[1]
cursor.execute(sql)
for key in data[0]:
    cur_ip = key.split(".")[3]
    cursor.execute("""INSERT INTO %s VALUES (%s,%s)""" % (data[1], key, data[0][key]))
db.commit()
The problem is where I have % (data[1], key, data[0][key]). Any ideas?
It's a little hard to analyse your problem when you don't post the actual error, and we have to guess what your data actually is. But here are some general points as advice:
Using a dynamic table name is often not the way DB systems want to be used. Consider whether the problem could be solved with a static table name and an additional key column in your table; into that field you can put what you now use as a dynamic table name. This way the DB may be able to better optimize your queries, and your queries are less likely to produce errors (there is no need to create extra tables on the fly, which is not a cheap thing to do, and you would not need dynamic DROP TABLE queries, which can be a security risk).
So my advice to solve your problem would be to actually work around it by trying to get rid of dynamic table names altogether.
Another problem is that you are using Python string formatting and not parameters to the query itself. That is a security problem in itself (SQL injection), but it is also the cause of your syntax error. When you use numbers, your expression evaluates to
INSERT INTO table_name VALUES (100, 200)
Which is valid SQL. But with strings you get
INSERT INTO table_name VALUES (Some Text, some more text)
which is not valid, since there are no quotes ' around the strings.
To get rid of your syntax problem and of the sql-injection-problem, don't add the values to the string, pass them as a list to execute():
cursor.execute("INSERT INTO table_name VALUES (%s,%s)", (key, data[0][key]))
If you must have a dynamic table name, put that into your query string first (e.g. with % formatting), and give the actual values for your query as parameters, as above (execute() will not accept the table name as a parameter).
To put it in some simple sample code. Right now you are trying to do it like this:
# don't do this, this won't even work!
table_name = 'some_table'
user_name = 'Peter Smith'
user_age = 47
query = "INSERT INTO %s VALUES (%s, %s)" % (table_name, user_name, user_age)
cursor.execute(query)
That creates query
INSERT INTO some_table VALUES (Peter Smith, 47)
Which cannot work, because of the unquoted string. So you needed to do:
# DON'T DO THIS, it's bad!
query = "INSERT INTO %s VALUES ('%s', %s)" % (table_name, user_name, user_age)
That's not a good idea, because you need to know where to put quotes and where not (which you will mess up at some point). Even worse, imagine a user named Connor O'Neal. You would get a syntax error:
INSERT INTO some_table VALUES ('Connor O'Neal', 47)
(This is also the way SQL injections are used to crash your system / steal your data.) So you would also need to take care of escaping the values that are strings, which gets more complicated.
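The quote problem is easy to demonstrate with plain strings (no database needed):

```python
user_name = "Connor O'Neal"
query = "INSERT INTO some_table VALUES ('%s', 47)" % user_name
print(query)  # INSERT INTO some_table VALUES ('Connor O'Neal', 47)

# Three single quotes in total: the apostrophe in the name closes the SQL
# string early, which is both the syntax error and the injection hole.
assert query.count("'") == 3
```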
Leave those problems to Python and MySQL by passing the data (not the table name) as arguments to execute()!
table_name = 'some_table'
user_name = 'Peter Smith'
user_age = 47
query = "INSERT INTO " + table_name + " VALUES (%s, %s)"
cursor.execute(query, (user_name, user_age))
This way you can even pass datetime objects directly. There are other ways to bind the data than %s; take a look at the examples at http://dev.mysql.com/doc/connector-python/en/connector-python-api-mysqlcursor-execute.html (Python 3 is used there; I don't know which version you use, but apart from the print statements it should work with Python 2 as well, I think).
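If a dynamic table name really is unavoidable, one defensive pattern (a sketch; the table names here are made up) is to validate it against a hard-coded allowlist before formatting it into the query, while the values still go through parameter binding:

```python
ALLOWED_TABLES = {"some_table", "other_table"}  # hypothetical table names

def build_insert(table_name):
    # Only a known table name is ever formatted into the SQL text.
    if table_name not in ALLOWED_TABLES:
        raise ValueError("unexpected table name: %r" % table_name)
    return "INSERT INTO " + table_name + " VALUES (%s, %s)"

query = build_insert("some_table")
print(query)  # INSERT INTO some_table VALUES (%s, %s)
# cursor.execute(query, (user_name, user_age))  # values still bound safely
```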

question about postgresql bind variables

I was looking at the question and decided to try using the bind variables. I use
sql = 'insert into abc2 (intfield, textfield) values (%s, %s)'
a = time.time()
for i in range(10000):
    # just a wrapper around cursor.execute
    db.executeUpdateCommand(sql, (i, 'test'))
db.commit()
and
sql = 'insert into abc2 (intfield, textfield) values (%(x)s, %(y)s)'
for i in range(10000):
    db.executeUpdateCommand(sql, {'x': i, 'y': 'test'})
db.commit()
Looking at the time taken for the two runs above, it seems there isn't much difference; in fact, the second one takes longer. Can someone correct me if I've made a mistake somewhere? I am using psycopg2 here.
The queries are equivalent in Postgresql.
Bind is Oracle lingo. When you use it, the query plan is saved, so the next execution will be a little faster. PREPARE does the same thing in Postgres.
http://www.postgresql.org/docs/current/static/sql-prepare.html
psycopg2 supports an internal 'bind' (not PREPARE) via cursor.executemany() and cursor.execute().
(But don't call it bind to pg people. Call it prepare, or they may not know what you mean.)
IMPORTANT UPDATE:
I've looked into the source of all the Python libraries for connecting to PostgreSQL in the FreeBSD ports, and I can say that only py-postgresql does real prepared statements! But it is Python 3+ only.
Also, py-pg_queue is a fun lib implementing the official DB protocol (Python 2.4+).
You've missed the point of the answer to that question: use prepared statements as much as possible. 'Bound variables' are the better form of this; let's see:
sql_q = 'insert into abc (intfield, textfield) values (?, ?)'      # common form
sql_b = 'insert into abc2 (intfield, textfield) values (:x, :y)'   # needs driver and db support
so your test should be this:
sql = 'insert into abc2 (intfield, textfield) values (:x, :y)'
for i in range(10000):
    cur.execute(sql, x=i, y='test')
or this:
def _data(n):
    for i in range(n):
        yield (i, 'test')

sql = 'insert into abc2 (intfield, textfield) values (?, ?)'
cur.executemany(sql, _data(10000))
and so on.
UPDATE:
I've just found an interesting recipe for transparently replacing SQL queries with prepared ones, with usage of %(name)s.
As far as I know, psycopg2 has never supported server-side parameter binding ("bind variables" in Oracle parlance). Current versions of PostgreSQL do support it at the protocol level using prepared statements, but only a few connector libraries make use of it. The Postgres wiki notes this here. Here are some connectors that you might want to try: (I haven't used these myself.)
pg8000
python-pgsql
py-postgresql
As long as you're using DB-API calls, you probably ought to consider cursor.executemany() instead of repeatedly calling cursor.execute().
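As an illustration, here is the same 10000-row insert from the question done with executemany(), using the stdlib sqlite3 module so it is runnable as-is (sqlite3 uses ? placeholders; with psycopg2 the query would use %s instead):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE abc2 (intfield INTEGER, textfield TEXT)")

# One executemany() call instead of 10000 execute() calls; the driver can
# reuse the parsed statement for every row.
cur.executemany("INSERT INTO abc2 (intfield, textfield) VALUES (?, ?)",
                ((i, 'test') for i in range(10000)))
conn.commit()

cur.execute("SELECT count(*) FROM abc2")
row_count = cur.fetchone()[0]
print(row_count)  # 10000
```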
Also, binding parameters to their query in the server (instead of in the connector) is not always going to be faster in PostgreSQL. Note this FAQ entry.

Parameterized queries with psycopg2 / Python DB-API and PostgreSQL

What's the best way to make psycopg2 pass parameterized queries to PostgreSQL? I don't want to write my own escaping mechanisms or adapters, and the psycopg2 source code and examples are difficult to read in a web browser.
If I need to switch to something like PyGreSQL or another python pg adapter, that's fine with me. I just want simple parameterization.
psycopg2 follows the rules of DB-API 2.0 (set down in PEP 249). That means you can call the execute method on your cursor object, use the pyformat binding style, and it will do the escaping for you. For example, the following should be safe (and work):
cursor.execute("SELECT * FROM student WHERE last_name = %(lname)s",
               {"lname": "Robert'); DROP TABLE students;--"})
From the psycopg documentation
(http://initd.org/psycopg/docs/usage.html)
Warning Never, never, NEVER use Python string concatenation (+) or string parameters interpolation (%) to pass variables to a SQL query string. Not even at gunpoint.
The correct way to pass variables in a SQL command is using the second argument of the execute() method:
SQL = "INSERT INTO authors (name) VALUES (%s);" # Note: no quotes
data = ("O'Reilly", )
cur.execute(SQL, data) # Note: no % operator
Here are a few examples you might find helpful
cursor.execute('SELECT * FROM table WHERE id = %(some_id)s', {'some_id': 1234})  # psycopg2 only accepts the s format code, never d
Or you can dynamically build your query from a dict of field names and values:
fields = ', '.join(my_dict.keys())
placeholders = ', '.join(['%s'] * len(my_dict))
query = 'INSERT INTO some_table (%s) VALUES (%s)' % (fields, placeholders)
cursor.execute(query, list(my_dict.values()))
Note: the fields must be defined in your code, not user input, otherwise you will be susceptible to SQL injection.
I love the official docs about this:
https://www.psycopg.org/psycopg3/docs/basic/params.html
