Can Postgres resolve character escapes passed down from REST API to JSONB? - python

I have a Python API that inserts data into a Postgres table using a stored procedure that takes in a jsonb value and performs the insert. Example:
create function insert_new_employee(i jsonb) returns json as
$$
begin
    insert into newhire(name, payload) values (i ->> 'name', i);
    return i::json;
end
$$ language plpgsql;
When I do a test insert using a SQL client such as DataGrip, it inserts successfully:
select insert_new_employee('{"name":"alfred"}')
However, when I use the Python API, the payload gets transformed into:
"{\"name\":\"alfred\"}"
Because of the escaping, the name doesn't get parsed and stored into the name column; the jsonb payload column, however, accepts it.
Is there a way for Postgres to clean up and deal with the character escapes when requests are passed through the API?
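The escaped form "{\"name\":\"alfred\"}" is a JSON string containing JSON, which usually means the payload was serialized twice on the Python side; Postgres is storing exactly what it received, so the fix belongs in the API rather than in the function. A minimal sketch, assuming psycopg2 (the question doesn't name the driver):

import json
import psycopg2

conn = psycopg2.connect("dbname=test")  # hypothetical connection string
cur = conn.cursor()

employee = {"name": "alfred"}

# Wrong: dumping an already-serialized string yields "{\"name\":\"alfred\"}",
# a JSON string rather than a JSON object, so i ->> 'name' returns NULL.
# payload = json.dumps(json.dumps(employee))

# Right: serialize the dict exactly once.
payload = json.dumps(employee)
cur.execute("select insert_new_employee(%s::jsonb)", (payload,))
conn.commit()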

Related

SQL Server adds b'...' to the jwt tokens I store

I am storing JWT tokens used in authentication for a web service in a SQL Server database. I've declared the column that holds the tokens as type nvarchar(max) and am passing them into the database as simple strings. When I insert the token, SQL Server adds b'...' to it. For example, say I add the token ABC123, then it stores b'ABC123'.
This is causing problems down the line, as when I compare the token stored in the database to the token being passed through the API, the comparison fails due to the additional characters (the database seems to have literally added a lowercase b and two single quotes to the stored string).
I know I can just strip the b and quotes off my string once I pull it from the database, but I'd like to know why SQL Server is doing this and how I can prevent it from doing it.
Anyway, any insight would be much appreciated.
EDIT: is it possible pyodbc is doing this? Here is my code to insert:
cnxn = pyodbc.connect("<connection string>")
crsr = cnxn.cursor()
crsr.execute("insert into jwt values (?,?)", [userId, str(auth_token)])
Perhaps it is confused between Python 3 and Python 2 strings? (I'm using Python 3.6.) In that case, how would I fix it?
Turns out it was Python's str conversion. jwt.encode returns a bytes object, which str() turns into its repr, b'...'. The solution was to decode the bytes explicitly instead of using str. So my new insert method is:
crsr.execute("insert into jwt values (?,?)", [userId, auth_token.decode("utf-8")])
Really a simple solution but one that is very obscure. Hopefully my efforts with this help someone else!
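To see why, compare str() on a bytes object with an explicit decode (jwt.encode returned bytes in PyJWT's 1.x releases; 2.x returns str directly):

token = b"ABC123"            # what jwt.encode() returned: a bytes object

str(token)                   # "b'ABC123'" -- the repr, b and quotes included
token.decode("utf-8")        # "ABC123"   -- the actual text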

plpython function to update a row is not working

I am using PostgreSQL 9.4.7 and Python 2.7.6. I am writing a plpython function to update a row in the user table, and my code is as below:
CREATE FUNCTION update_user(myid int, mymail text, myname text) RETURNS text AS $$
from plpy import spiexceptions
plan=plpy.prepare("UPDATE auth_user SET email=$2, username=$3 WHERE id = $1",
["int"] ["text"]["text"])
rv=plpy.execute(plan, [myid,myemail,myusername])
return rv
$$ LANGUAGE plpythonu;
I am able to create this function successfully in the Postgres DB, but when I try to execute it via the command below in the Postgres shell:
select update_user(1,"xyz#xyz.com#sifymail.com","updatedname");
I am getting the following error:
ERROR: column "xyz#xyz.com#sifymail.com" does not exist
LINE 1: select update_user(1,"xyz#xyz.com#sifymail.com","updatedname");
Can someone point out where I am making a mistake?
First of all, string literals in PostgreSQL must be surrounded by single quotes, not double ones:
select update_user(1,'xyz#xyz.com#sifymail.com','updatedname');
Double quotes are used to refer to identifiers such as column names.
Second, the argument list in your prepare call isn't a correct Python list. It should look like:
plpy.prepare("UPDATE auth_user SET email=$2, username=$3 WHERE id = $1",
             ["int", "text", "text"])

Issues with data returned by raw SQL query in Django

I am fetching data from the DB using raw SQL queries; I am following these Django docs.
As you can see in the Django docs, there is a function dictfetchall which returns rows as dicts of field names and values. I am converting this dict to JSON as per my requirements, but as you can see in the output, the cursor returns Python-typed values like:
> 54360982L
> for date - datetime.date(2015, 8, 3)
> for decimal - Decimal('0.63')
Thus the dict cannot be converted into JSON, as it will raise the error Decimal('0.63') is not JSON serializable. So how can I get the raw SQL to return simple values from the DB, or do I need to edit the dictfetchall function? If yes, then where?
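The simplest fix is usually on the serialization side rather than in SQL: hand json.dumps an encoder that knows about these types. A minimal sketch using Django's built-in encoder (assuming the dictfetchall helper from the Django docs):

import json
from django.core.serializers.json import DjangoJSONEncoder

rows = dictfetchall(cursor)  # list of dicts, as in the Django docs helper

# DjangoJSONEncoder serializes the Decimal, datetime.date and
# datetime.datetime values that the stock JSONEncoder rejects; Python 2
# longs such as 54360982L are handled by the default encoder already.
payload = json.dumps(rows, cls=DjangoJSONEncoder)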

Creating Insert Statement for MySQL in Python

I am trying to construct an insert statement that is built from the results of a query. I run a query that retrieves results from one database, then create an insert statement from those results and insert it into a different database.
The server that is initially queried only returns the fields that are populated in the reply, and this can differ from record to record. The destination database table has all of the possible fields available. This is why I need to construct the insert statement on the fly for each record that is retrieved, and why I cannot use a fixed list of fields: I have no control over which ones will be populated in the response.
Here is a sample of the code. I send off a request for the T&C for an ISIN, and the response is a name and a value.
fields = []
data = []
getTCQ = ("MDH:T&C|"+isin+"|NAME|VALUE")
mdh.execute(getTCQ)
TC = mdh.fetchall()
for values in TC:
    fields.append(values[0])
    data.append(values[1])
insertQ = ("INSERT INTO sp_fields ("+fields+") VALUES ('"+data+"')")
The problem is with the fields part; MySQL is expecting the following:
INSERT INTO sp_fields (ACCRUAL_COUNT,AMOUNT_OUTSTANDING_CALC_DATE) VALUES ('030/360','2014-11-10')
But I am getting the following for insertQ:
INSERT INTO sp_fields ('ACCRUAL_COUNT','AMOUNT_OUTSTANDING_CALC_DATE') VALUES ('030/360','2014-11-10')
and MySQL does not like the quotes around the field names.
How do I get rid of these, so that it looks like the first insertQ statement, which works?
Many thanks in advance.
You could use ','.join(fields) to create the desired string (without quotes around each field).
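The difference is easy to see in the REPL (field names taken from the question):

fields = ['ACCRUAL_COUNT', 'AMOUNT_OUTSTANDING_CALC_DATE']

str(fields)       # "['ACCRUAL_COUNT', 'AMOUNT_OUTSTANDING_CALC_DATE']"
','.join(fields)  # 'ACCRUAL_COUNT,AMOUNT_OUTSTANDING_CALC_DATE' -- no quotes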
Then use parametrized SQL and pass the values as the second argument to cursor.execute:
insertQ = "INSERT INTO sp_fields ({}) VALUES ({})".format(
    ','.join(fields), ','.join(['%s'] * len(data)))
cursor.execute(insertQ, data)
Note that the correct placeholder to use, e.g. %s, depends on the DB adapter you are using. MySQLdb uses %s, but oursql uses ?, for instance. Also note that only the values can be parametrized: the column names are still interpolated into the SQL string, so make sure they come from a trusted source.

python sqlite3: double quotes causing query failure

I have a SQLite3 table:
CREATE TABLE paste (
    paste_id INTEGER,
    paste_content TEXT
);
I have to do an update statement, where the text can contain single quotes (') as well as double quotes (").
In Python I wrote:
UPDATE_Statement = "Update paste set paste_content = '%s' where paste_id=id" %(content)
But since the content can contain single or double quotes, my execute query is not working properly.
How can I escape this properly?
Do not use string interpolation. Use SQL parameters instead:
UPDATE_Statement = "UPDATE paste SET paste_content = ? WHERE paste_id = ?"
cursor.execute(UPDATE_Statement, (content, paste_id))
and leave escaping (and proper quoting) up to the database adapter instead. This:
Simplifies your code
Quotes different data types correctly
Lets the database reuse query plans for varying data
Prevents SQL injection attacks
See Passing parameters into raw() in the Django SQL documentation.
If you are using a different database connector (not the one provided by Django), check its documentation for the specific placeholder style. The sqlite3 adapter, for example, uses ? as the placeholder syntax, which is what the code above uses.
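For completeness, a self-contained sketch against an in-memory SQLite database (the table and data are hypothetical) showing that parameter binding copes with both quote characters:

import sqlite3

conn = sqlite3.connect(":memory:")
cursor = conn.cursor()
cursor.execute("CREATE TABLE paste (paste_id INTEGER, paste_content TEXT)")
cursor.execute("INSERT INTO paste VALUES (?, ?)", (1, "placeholder"))

# Content with both quote styles; no manual escaping anywhere.
content = """single ' and double " quotes, handled by the adapter"""
cursor.execute("UPDATE paste SET paste_content = ? WHERE paste_id = ?",
               (content, 1))
conn.commit()

cursor.execute("SELECT paste_content FROM paste WHERE paste_id = ?", (1,))
print(cursor.fetchone()[0])  # comes back byte-for-byte intact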
