Nested curly brackets in Psycopg2 SQL Composition query - python

I'm trying to create a query using Psycopg2's SQL string composition, in which I need to use curly brackets inside the query to update a key's value in a jsonb column. Something like this:
update myschema.users set data = jsonb_set(data, '{someId}', '100')
This is how I'm trying to write this query using Sql Composition string in Python:
statement = SQL(
    "UPDATE {schema}.{table} set data = jsonb_set(data, '{{key}}', '{value}') {where};"
).format(
    schema=Identifier(schema_var),
    table=Identifier(table_var),
    key=SQL(id_key),
    value=SQL(id_value),
    where=SQL(where),
)
But when I run this, a new key literally named key is added to the jsonb value. And if I try to run it with just one pair of curly brackets, like this:
statement = SQL(
    "UPDATE {schema}.{table} set data = jsonb_set(data, '{key}' ...." # The rest is the same
I get this error:
Array value must start with "{" or dimension information
How can I fix this?

To solve this issue, I needed to use three nested curly brackets, like this:
statement = SQL(
    "UPDATE {schema}.{table} set data = jsonb_set(data, '{{{key}}}' ...." # The rest is the same
This way, the someId key actually gets updated in the database.
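For completeness, a sketch of the full working statement with the triple braces, filling in example values for the variables from the question ('{{' and '}}' render as literal braces, while {key} remains a composition placeholder; note that SQL() does not escape its argument, so id_key and id_value must be trusted strings):
from psycopg2.sql import SQL, Identifier

# Example values standing in for the question's variables.
schema_var, table_var = "myschema", "users"
id_key, id_value = "someId", "100"
where = "WHERE id = 1"

# '{{' and '}}' render as literal braces around the composed {key},
# so the second argument becomes the jsonb path '{someId}'.
statement = SQL(
    "UPDATE {schema}.{table} set data = jsonb_set(data, '{{{key}}}', '{value}') {where};"
).format(
    schema=Identifier(schema_var),
    table=Identifier(table_var),
    key=SQL(id_key),
    value=SQL(id_value),
    where=SQL(where),
)
# cur.execute(statement)  # cur is an open psycopg2 cursor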

You are overthinking this.
Load data into table:
json_import
Table "public.json_import"
Column | Type | Collation | Nullable | Default
-----------+---------+-----------+----------+-----------------------------------------
id | integer | | not null | nextval('json_import_id_seq'::regclass)
jsonb_fld | jsonb | | |
insert into json_import values (1, '{"test": "dog"}'::jsonb);
select * from json_import;
id | jsonb_fld
----+-----------------
1 | {"test": "dog"}
import psycopg2
from psycopg2 import sql
con = psycopg2.connect("dbname=test user=postgres host=localhost port=5432")
cur = con.cursor()
sql_str = sql.SQL(
    'update {table} set jsonb_fld = jsonb_set(jsonb_fld, %(key)s, %(val)s) '
    'where id = 1'
).format(table=sql.Identifier('json_import'))
cur.execute(sql_str, {'key': '{test}', 'val': '"cat"'})
con.commit()
select * from json_import;
id | jsonb_fld
----+-----------------
1 | {"test": "cat"}
The values for jsonb_set() should be passed in as parameters, not as part of the composition process.
UPDATE
Using the same sql_str and assigning the values to variables:
key_val = '{test}'
fld_val = '"cat"'
cur.execute(sql_str, {'key': key_val, 'val': fld_val})
con.commit()
cur.execute("select * from json_import")
cur.fetchone()
(1, {'test': 'cat'})
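Applied back to the original question, the same approach can handle the dynamic schema and table names. A sketch (not from the original answer), assuming the question's myschema.users table, an example someId key, and a fixed WHERE clause for brevity:
import json
import psycopg2
from psycopg2 import sql

con = psycopg2.connect("dbname=test user=postgres host=localhost port=5432")
cur = con.cursor()

schema_var, table_var = "myschema", "users"   # identifiers from the question
id_key, id_value = "someId", 100              # example key and new value

# Identifiers go through sql composition; the jsonb_set() path and value are
# ordinary query parameters (path as a text[] literal, value as JSON text).
stmt = sql.SQL(
    "UPDATE {schema}.{table} SET data = jsonb_set(data, %(key)s, %(val)s) WHERE id = 1"
).format(schema=sql.Identifier(schema_var), table=sql.Identifier(table_var))

cur.execute(stmt, {"key": "{" + id_key + "}", "val": json.dumps(id_value)})
con.commit()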

Related

mysql.connector does not give the last database state in Python

I use the mysql.connector library in Python to send queries to a database. But when the database is changed after initialization, mysql.connector answers as if the database had never changed.
As an example, let's imagine I have a minimal table students with just two columns, id and name.
+----+------+
| id | name |
+----+------+
| 0 | foo |
+----+------+
In the following code, the query asks for the student with id 0. But in the middle of the process, something outside the Python script alters the database.
import mysql.connector
maindb = mysql.connector.connect(
    host = "<host>",
    user = "<user>",
    password = "<password>",
    db = "<database name>"
)
cursor = maindb.cursor()
# Here, I will send outside the python script a MySQL query to modify the name of the student from “foo” to “bar” like this:
# `UPDATE `students` SET `name` = 'bar' WHERE `students`.`id` = 0;`
cursor.execute("SELECT `id`, `name` FROM `students` WHERE `id` = 0")
result = cursor.fetchall()
print(result)
Then I get this answer: [(0, 'foo')]. As you can see, Python is not aware that the database has changed since maindb.cursor() was called, so I get foo in the name field instead of bar as expected.
So how do I tell mysql.connector to pick up the latest updates from the database when I send a query?
You will need to use a socket, or, if the changes occur frequently, have your code re-run every x minutes.
I just need to call .connect() on the maindb object before each new query and .close() it afterwards:
maindb.connect()
cursor.execute("SELECT `id`, `name` FROM `students` WHERE `id` = 0")
result = cursor.fetchall()
print(result)
maindb.close()
The database maintains data integrity by preventing in-progress transactions from seeing changes made by other transactions (see transaction isolation levels).
You can commit your connection to allow it to see new changes:
cursor = maindb.cursor()
# Here, I will send outside the python script a MySQL query to modify the name of the student from “foo” to “bar” like this:
# `UPDATE `students` SET `name` = 'bar' WHERE `students`.`id` = 0;`
# Doesn't show the update
cursor.execute("SELECT `id`, `name` FROM `students` WHERE `id` = 0")
result = cursor.fetchall()
print(result)
# Shows the update because we have committed.
maindb.commit()
cursor.execute("SELECT `id`, `name` FROM `students` WHERE `id` = 0")
result = cursor.fetchall()
print(result)

Unexpected data returned by PyMySQL when using GROUP_CONCAT

I generated the following SQL query for my database:
SELECT match_id, Group_concat(name SEPARATOR ', ') AS 'winners'
FROM players
WHERE match_id = 4
AND rank = 1
GROUP BY match_id;
(for structure and example data, see sql fiddle)
which results in MySQL:
+----------+------------+
| match_id | winners |
+----------+------------+
| 4 | P106, P107 |
+----------+------------+
So I tried to run this SQL query from Python using the pymysql module:
conn = pymysql.connect(**DB_CONFIG, cursorclass=pymysql.cursors.DictCursor)
cur = conn.cursor()
sql = """SELECT match_id, Group_concat(name SEPARATOR ', ') AS 'winners'
FROM players
WHERE match_id = %s
AND rank = 1
GROUP BY match_id """
cur.execute(sql, (4, ))
result = cur.fetchall()
But strangely, the value of result was:
[{'match_id': 4, 'winners': 'P106'}]
So, why is the result of pymysql different from the result of MySQL Client? How can I fix it?

Python: How to include table prefix in mysql result header

I need to execute the following SQL query in python.
SELECT table_a.company, table_b.company
FROM table_a
LEFT JOIN table_b ON table_a.id = table_b.id
but the problem is, when I run my code (I tried both pymysql (cursor.execute) and pandas (pd.read_sql)), the output shows
| company | company |
-----------------------------------
| <some company> | <some company> |
Whereas, my expectation, the output should be like this:
| table_a.company | table_b.company |
---------------------------------------------------
| <some company> | <some company> |
(the header contains table prefix)
How do I achieve this?
My conditions:
I am avoiding aliases, since the real query contains the wildcard table_a.*
I want to try mysql.connector, but I heard it requires 64-bit Python.
In my case, I need to stay on 32-bit Python, so I can't use it.
Below is my code:
Pandas
engine = sqlalchemy.create_engine(connection_string)
for df in pd.read_sql(text(query), engine, chunksize=chunksize):
    print(df)
Pymysql
engine = pymysql.connect(host=host, user=user, password=passw, port=port, db=database)
cursor = engine.cursor()
cursor.execute(query)
result = cursor.fetchmany(chunksize)
num_fields = len(cursor.description)
field_names = [i[0] for i in cursor.description]
df = pd.DataFrame(columns=field_names, data=result)
print(df)
Any help would be appreciated
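One possible approach (not from the thread above, so treat it as a sketch): expand the wildcard yourself by generating table.column AS 'table.column' aliases from information_schema, which keeps the prefixes in the result headers. This assumes the same pymysql connection variables and the table_a/table_b names from the question:
import pymysql

# host, user, passw, port, database as in the question's code above.
conn = pymysql.connect(host=host, user=user, password=passw, port=port, db=database)
cur = conn.cursor()

def prefixed_columns(cursor, table):
    # Build a select list like `table.col AS 'table.col'` for every column of `table`.
    cursor.execute(
        "SELECT column_name FROM information_schema.columns "
        "WHERE table_schema = DATABASE() AND table_name = %s "
        "ORDER BY ordinal_position",
        (table,),
    )
    return ", ".join(f"{table}.{col} AS '{table}.{col}'" for (col,) in cursor.fetchall())

select_list = prefixed_columns(cur, "table_a") + ", " + prefixed_columns(cur, "table_b")
cur.execute(
    f"SELECT {select_list} FROM table_a LEFT JOIN table_b ON table_a.id = table_b.id"
)
print([col[0] for col in cur.description])  # headers now carry the table prefix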

MySQL connector Python variables not registering

I'm currently using the mysql-connector-python package to execute database actions in Flask. It was working well until suddenly the variables didn't seem to work correctly anymore. My code is here:
@bp.route('/addcart', methods=('OPTIONS', 'POST'))
def addcart():
    ...
    userID = session.get("user_id")
    reqDict = request.get_json()
    itemCode = str(reqDict['itemCode'])
    itemAmt = reqDict['itemAmt']
    if userID is not None:
        db = get_db()
        cursor = db.cursor()
        query = ('SELECT %s FROM cartdata WHERE id = %s')
        cursor.execute(query, (itemCode, userID))
        currentNum = cursor.fetchone()[0]
        if currentNum is None:
            stmt = ('UPDATE cartdata SET %s = 1 WHERE id = %s')
            cursor.execute(stmt, (itemCode, userID))
        else:
            currentNum = int(currentNum) + int(itemAmt)
            stmt = ('UPDATE cartdata SET %s = %s WHERE id = %s')
            cursor.execute(stmt, (itemCode, currentNum, userID))
    ....
For some reason, I seem to be having trouble with the itemCode variable. When I use it "properly", as a placeholder in query or stmt above, it doesn't work. Typically I get an error saying
" You have an error in your SQL syntax; check the manual that
corresponds to your MySQL server version for the right syntax to use
near ''p1' = 1 WHERE id = 21'".
However, if I do this:
query = ('SELECT ' + itemCode + ' FROM cartdata WHERE id = %s')
...
stmt = ('UPDATE cartdata SET '+ itemCode +' = 1 WHERE id = %s')
...
It works properly as intended.
EDIT: I've checked my backend, and apparently the UPDATE statement does not actually update anything. So now I'm at a complete loss.
I don't understand why the connector suddenly breaks for these variables. I've checked the variables and their types, and they were the expected types. Any insight would be helpful.
My table schema for 'cartdata' looks something like this:
+-------+---------+------+-----+---------+-------+
| Field | Type | Null | Key | Default | Extra |
+-------+---------+------+-----+---------+-------+
| id | int(11) | NO | PRI | NULL | |
| p1 | int(8) | YES | | NULL | |
| p2 | int(8) | YES | | NULL | |
| p3 | int(8) | YES | | NULL | |
| p4 | int(8) | YES | | NULL | |
| p5 | int(8) | YES | | NULL | |
+-------+---------+------+-----+---------+-------+
That's because when MySQL connector injects your variables into the SQL statement, it formats them according to their type.
You can actually see it in the error message that you get:
"p1' = 1 WHERE id = 21'"
^
So probably, your SQL query looks like this:
SELECT 'p1' FROM cartdata WHERE id = someId
Which is syntactically invalid SQL...
Your second option, however, seems okay. By the way, it seems weird to adapt the column you want to select depending on the user's input... I'd highly recommend validating this value (see the sketch at the end of this answer)...
Details
You cannot use %s for column names, since it injects a string value into your SQL query, and that results in invalid SQL syntax (column names are not string values).
As above:
SET %s = ...
Generates:
SET 'colName' = ...
which is not valid, because you are attempting to assign a value to another value...
That would be the same as trying to do the following in python:
'foo' = 'bar'
or
'foo' = 4
You can use %s when setting values (using SET colName = %s) or filtering values (using WHERE colName = %s) because the type of the values in the column colName is actually a string.
As above:
WHERE colName = %s
Generates:
WHERE colName = 'fooBar'
which is valid because you filter on the values that are equal to the string fooBar.
By the way, you might want to check what
SELECT %s FROM cartdata WHERE id = %s
gives you as a result. That could cause problems... MySQL won't tell you anything is wrong, but your result will probably be exactly the value of itemCode (SELECT 'hello' is valid SQL; it just returns 'hello').
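A minimal sketch of the whitelist validation this answer recommends (the column names come from the cartdata schema shown in the question; the helper name build_update is made up for illustration):
# Columns that may legally be touched, taken from the cartdata schema above.
ALLOWED_COLUMNS = {"p1", "p2", "p3", "p4", "p5"}

def build_update(item_code):
    # Validate the identifier against the whitelist, then interpolate it;
    # the values still go through %s placeholders at execute() time.
    if item_code not in ALLOWED_COLUMNS:
        raise ValueError(f"invalid column name: {item_code!r}")
    return f"UPDATE cartdata SET {item_code} = %s WHERE id = %s"

# Usage, with cursor being an open mysql-connector cursor:
# cursor.execute(build_update(itemCode), (currentNum, userID))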

python: psycopg2: psql: Delete row where condition

I'm trying to delete rows from a psql table on a condition.
I want all rows to be deleted if column "TagNaam" equals a variable var_tagnaam.
I've tried the following code and some variants but I can't get it to work.
There aren't any errors though.
cur.execute("DELETE FROM opc_actuelewaardentags WHERE 'TagNaam' = %s", (var_tagnaam,))
Is there something wrong with the syntax?
Edit:
Maybe it is clearer with additional code; the error might be in the other code?
for i in range(len(taginhoud)):
    (var_tagnaam, var_tagwaarde, var_tagkwaliteit, var_tagtime) = taginhoud[i]
    print(var_tagnaam)
    cur.execute("DELETE FROM opc_actuelewaardentags WHERE 'TagNaam' = %s", (var_tagnaam,))
    conn.commit()
    cur.execute('INSERT INTO opc_actuelewaardentags ("TagNaam", "TagWaarde", "TagKwaliteit", create_date, write_date) VALUES (%s,%s,%s,now(),now())',
                (var_tagnaam, var_tagwaarde, var_tagkwaliteit))
    conn.commit()
So what I try to do here is:
Retrieve "var_tagnaam" from list "taginhoud".
Then in table opc_actuelewaardentags find all rows where column "Tagnaam" equals the value in "var_tagnaam". (Should be a string)
Then delete those rows where "Tagnaam" = "var_tagnaam". This part doesn't work.
Then insert new rows with data. This part works.
Could this code be wrong to do what I want?
I have tried many things already to solve the upper/lower case problem.
Edit 2: The query in pgAdmin worked; trying to do the same thing in Python:
I ran this query in pgadmin and it deleted the rows:
delete FROM opc_actuelewaardentags where "TagNaam" = 'Bakkerij.Device1.DB100INT8';
My attempt to make it as similar as possible in Python:
var_tagnaam2 = "'"+var_tagnaam+"'"
cur.execute("DELETE FROM opc_actuelewaardentags WHERE \"TagNaam\" = %s", (var_tagnaam2,))
conn.commit()
I tried to escape the double quotes in an attempt to make it the same as in pgAdmin.
'TagNaam' is not a valid column name identifier in SQL. You must not use single or double quotes when writing a database, table or column name, but you can use backticks (`).
Invalid:
DELETE FROM opc_actuelewaardentags WHERE 'TagNaam' = 'test';
DELETE FROM opc_actuelewaardentags WHERE "TagNaam" = 'test';
Valid:
DELETE FROM opc_actuelewaardentags WHERE TagNaam = 'test';
DELETE FROM opc_actuelewaardentags WHERE `TagNaam` = 'test';
Update: According to the PostgreSQL docs, double quotes are valid for quoting table and column names. They are especially needed when a key word or a case-sensitive name is used as a table or column name. So the following is valid:
DELETE FROM opc_actuelewaardentags WHERE "TagNaam" = 'test';
More details are in the PostgreSQL documentation on identifiers and key words.
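From Python, rather than escaping the double quotes by hand as in Edit 2, psycopg2 can compose the quoted identifier for you. A minimal sketch, assuming the conn, cur and var_tagnaam from the question:
from psycopg2 import sql

# sql.Identifier renders as a properly double-quoted name: "TagNaam".
stmt = sql.SQL("DELETE FROM opc_actuelewaardentags WHERE {col} = %s").format(
    col=sql.Identifier("TagNaam")
)
cur.execute(stmt, (var_tagnaam,))  # var_tagnaam is passed as a plain string, no extra quotes
conn.commit()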
I don't have a PostgreSQL server, only a MySQL server.
For MySQL:
mysql> select * from user where '1' = '1';
+------+
| id |
+------+
| 2 |
| 1 |
+------+
2 rows in set (0.05 sec)
mysql> select * from user;
+------+
| id |
+------+
| 2 |
| 1 |
+------+
2 rows in set (0.00 sec)
mysql> select * from user where '1' = "1";
+------+
| id |
+------+
| 2 |
| 1 |
+------+
2 rows in set (0.00 sec)
mysql> select * from user where 'id' = "1";
Empty set (0.00 sec)
mysql> select * from user where 'id' = 1;
Empty set, 1 warning (0.02 sec)
mysql> select * from user where id = 1;
+------+
| id |
+------+
| 1 |
+------+
1 row in set (0.02 sec)
mysql> select * from user where 'id' = "id";
+------+
| id |
+------+
| 2 |
| 1 |
+------+
2 rows in set (0.00 sec)
The SQL grammar should be similar. Therefore,
cur.execute("DELETE FROM opc_actuelewaardentags WHERE 'TagNaam' = %s", (var_tagnaam,))
should be
cur.execute("DELETE FROM opc_actuelewaardentags WHERE TagNaam = %s", (var_tagnaam,))
or
cur.execute("DELETE FROM opc_actuelewaardentags WHERE `TagNaam` = %s", (var_tagnaam,))
The above analysis is wrong.
The question "Simple Postgresql Statement - column name does not exists" gives the answer.
RobbeM wrote: "Edit 2: The query in pgAdmin worked; trying to do the same thing in Python"
I had the same symptoms: I could delete table rows using pgAdmin or the SQL console, but the Python code wouldn't work. The problem was that I was accidentally creating the cursor before establishing the connection with the PostgreSQL server:
c = db_conn.cursor()
db_conn = psycopg2.connect(conn_string)
So, the solution for me was to create the cursor after establishing the connection with the database:
db_conn = psycopg2.connect(conn_string)
c = db_conn.cursor()
