Python: adding strings to a query returns weird characters

This is the part of my source code that does not return what I expect:
query = ""
for i in range(len(values)):
    if type(values.values()[i]) is str:
        query += "'" + str(values.values()[i]) + "', "
    else:
        query += str(values.values()[i]) + ", "
When I use
values = {'Date': '2014-08-09 07:12:40', 'Ip': '127.0.0.1', 'MembershipID': 1}
the resulting query is
"\\\'2014-08-09 07:12:40\\\', \\\'127.0.0.1\\\', 1, "
instead of
"'2014-08-09 07:12:40', '127.0.0.1', 1, "
How can I fix this?

Do not try to quote SQL parameters yourself. Leave that to the database adapter: it quotes values correctly in every case, protects you from SQL injection, and can make query parsing more efficient.
For MySQL you can use named parameters of the form %(name)s and pass your dictionary as the second argument to cursor.execute():
query = '''\
SELECT * FROM foo
WHERE
    date < %(Date)s AND
    ip_address = %(Ip)s AND
    membership = %(MembershipID)s
'''
cursor.execute(query, values)
for row in cursor:
    # ...
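If, as the loop in the question suggests, the goal is to build an INSERT from the dictionary, the same named placeholders work there too. A minimal sketch, assuming a hypothetical log table whose column names match the dictionary keys (the keys become identifiers here, so they must come from a trusted source):
# Build the column list and matching %(name)s placeholders from the keys;
# only identifiers go into the string, the values are passed to execute(),
# so the adapter quotes them for us.
columns = ', '.join(values)
placeholders = ', '.join('%({})s'.format(k) for k in values)
query = 'INSERT INTO log ({}) VALUES ({})'.format(columns, placeholders)
cursor.execute(query, values)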

Related

Dynamic SQL Queries with Python and MySQL

I have multiple tables that are updated after a value is changed in a grid. These tables don't always have the same keys or columns so I cannot explicitly name the columns or formats. The only thing that is ever the same, is the column where the keys reside. I know the way I am currently doing this is not correct and leaves me open to injection attacks.
I also ran into an issue where some of the values contain quotes that break the SQL statement. For example, updating with WHERE email = t'est#email.com.
I am not really sure of the proper way to write these statements. I did some research and see multiple methods for different purposes but am not sure which is proper. I am looking to do this as dynamically as possible. Can anyone point me in the right direction?
To connect:
import mysql.connector as sql
import MySQLdb

# Connect
self.db_name = 'database'
self.server = 'server'
self.user_id = 'user'
self.pw = 'password'
try:
    self.db_con = MySQLdb.connect(user=self.user_id, password=self.pw,
                                  database=self.db_name, host=self.server,
                                  charset='utf8', autocommit=True)
    self.cursor = self.db_con.cursor()
except:
    print("Error connecting")
SQL Statements:
key_id = str("'") + self.GetCellValue(event.GetRow(), 1) + str("'")
target_col = self.GetColLabelValue(event.GetCol())
key_col = self.GetColLabelValue(1)
nVal = str("'") + self.GetCellValue(event.GetRow(), event.GetCol()) + str("'")
# SQL STATEMENTS
sql_update = "UPDATE " + tbl + " SET " + target_col + " = " + nVal + " WHERE " + key_col + " = " + key_id
# INSERT
sql_update = ("INSERT INTO " + str(self.tbl) + " (" + self.key_col + ") VALUES (" + str("'") + str(val) + str("'") + ")")
# DELETE
sql_update = "DELETE FROM " + tbl + " WHERE " + self.key_col + " = " + self.key_id
# SELECT
sql_query = "SELECT * FROM " + self.tbl
# Execute
try:
    self.cursor.execute(sql_update)
except:
    print('Error')
    self.db_con.rollback()
Databases have different notations for quoting identifiers (table and column names, etc.) and values (data).
MySQL uses backticks to quote identifiers. For values, it's best to use the parameter substitution mechanism provided by the connector package: it handles tricky cases like embedded quotes correctly and reduces the risk of SQL injection.
Here's an example for inserts; the same techniques apply to the other types of query.
key_id = self.GetCellValue(event.GetRow(), 1)     # no manual quoting needed
target_col = self.GetColLabelValue(event.GetCol())
key_col = self.GetColLabelValue(1)
nVal = self.GetCellValue(event.GetRow(), event.GetCol())
# INSERT (using f-strings for brevity)
sql_update = f"INSERT INTO `{self.tbl}` (`{self.key_col}`) VALUES (%s)"
# Pass the statement and values to cursor.execute.
# The values are expected as a sequence, so a single value must be
# wrapped in a tuple or list.
self.cursor.execute(sql_update, (nVal,))
If you have more than one column/value pair you could do something like this:
cols = ['A', 'B', 'C']
vals = ['a', 'b', 'c']
col_names = ', '.join([f'`{c}`' for c in cols])
values_placeholder = ', '.join(['%s'] * len(cols))
sql_update = f"INSERT INTO `{self.tbl}` ({col_names}) VALUES ({values_placeholder})"
self.cursor.execute(sql_update, vals)
Values are not only data for insertion, but also data used for comparison, for example in WHERE clauses. So an update statement with a filter might be created like this:
sql_update = f"UPDATE `{tbl}` SET `{target_col}` = %s WHERE `{key_col}` = %s"
self.cursor.execute(sql_update, (nVal, key_id))
However, sometimes the target of a SET or WHERE clause is itself a column, for example when we want to update based on other values in the row. This statement will set target_col to the value of other_col for all rows where key_col is equal to other_key_col:
sql_update = f"UPDATE `{tbl}` SET `{target_col}` = `{other_col}` WHERE `{key_col}` = `{other_key_col}`"
self.cursor.execute(sql_update)
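Because tbl, target_col and key_col are still formatted into the statement directly, it is worth validating them before use. A minimal sketch, assuming the grid labels are expected to match columns that already exist in the table (the helper name is hypothetical, and the table name itself must be trusted):
def assert_valid_column(cursor, table, column):
    # Fetch the table's real column names and require an exact match, so
    # only identifiers that already exist can reach the query string.
    cursor.execute(f"SHOW COLUMNS FROM `{table}`")
    valid = {row[0] for row in cursor.fetchall()}
    if column not in valid:
        raise ValueError(f"unknown column: {column!r}")

assert_valid_column(self.cursor, tbl, target_col)
assert_valid_column(self.cursor, tbl, key_col)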

Constructing safe SQL statements with multiple AND/OR conditions

I have the following problem: I have a really large SQL statement inside Python code in string form:
sql = f"""
*many statements here*
"""
Part of that SQL statement is:
where 1 = 1
and selector in ('YES', 'NO')
AND parameter1 = value1
AND parameter2 = value2.1 OR parameter2 = value2.2
AND ...
where those AND/OR conditions are given by a Python dictionary of the form
{ parameter1: [value1], parameter2: [value2.1, value2.2], ...}
I've written a function which takes that dictionary and unfolds it into a string of the form
AND (parameter1 = value1) AND ((parameter2 = value2.1) OR (parameter2 = value2.2)) AND ...
and inserts that string into the large SQL statement:
where 1 = 1
and selector in ('YES', 'NO')
{form_sql_statement_from_dictionary(dictionary)}
but it seems that this approach is vulnerable to SQL injection attacks. The safe way would be to parametrise the large SQL statement, but since I don't know how many parameters and values the dictionary will contain, I don't know how to build such a parametrisation. Also, I can't change the large SQL statement itself; somehow I have to form and insert that AND/OR clause into the existing string in a safe way. Is there any way of doing that, other than trying to police the dictionary values themselves?
The full Python script looks like this:
async def query_for_data(
    connection: "PgService", dictionary: Dict[str, Any]
) -> pd.DataFrame:
    sql = f"""
    *multiple SQL statements*
    where 1 = 1
    and selector in ('YES', 'NO')
    {form_sql_statement_from_dictionary(dictionary)}
    """
    res = await connection.fetch(sql)
    data = pd.DataFrame(res, columns=[k for k in res[0].keys()])
    return data
The function looks like this:
def form_sql_statement_from_dictionary(
        dictionary: Dict[str, Any]) -> str:
    hashvalue = list(dictionary.values())
    scope = hashvalue[0]["scope"]
    dictionary_element_names = list(scope.keys())
    statement_elements = []
    for element_name in dictionary_element_names:
        dictionary_element_values = scope[element_name]
        if len(dictionary_element_values) == 1:
            dictionary_element_value = dictionary_element_values[0]
            statement_element = (
                f"( {prefix}{element_name} = '{dictionary_element_value}' )"
            )
            statement_elements.append(statement_element)
        else:
            statement_or_elements = []
            for dictionary_element_value in dictionary_element_values:
                statement_element = (
                    f"{prefix}{element_name} = '{dictionary_element_value}'"
                )
                statement_or_elements.append(statement_element)
            final_or_statement = "( " + " OR ".join(statement_or_elements) + " )"
            statement_elements.append(final_or_statement)
    final_statement = " AND " + " AND ".join(statement_elements)
    return final_statement
Details found here:
https://www.psycopg.org/docs/usage.html
First, build the SQL where clause with positional arguments, such as...
(x.col1 = %s) AND (x.col2 = %s OR x.col3 = %s)
At the same time create a list of those parameters
['foo', 'foo', 'bar']
Then use a parameterised query...
cursor = connection.cursor()
cursor.execute(sql, parameters)
data = await cursor.fetchall()
The parameterisation will quote and escape all the parameters for you, so no SQL Injection attacks from those.
BUT the column names are still being substituted into the query directly. There is no built-in way to protect you from that. If a user has direct control over those strings, they can still hack you that way.
As such it is vital that you police, validate, whatever, those column names yourself, through whatever means are appropriate to your use case.
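One simple way to police them, as a sketch (the pattern below is an assumption; tighten it to match your actual naming scheme):
import re

def validate_identifier(name):
    # Permit only plain unquoted-identifier characters; quotes, spaces,
    # semicolons and comment markers are all rejected outright.
    if not re.fullmatch(r"[A-Za-z_][A-Za-z0-9_$]*", name):
        raise ValueError(f"unsafe identifier: {name!r}")
    return name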
All in all, the revised Python would look something like...
def form_sql_statement_from_dictionary(dictionary):
    hashvalue = list(dictionary.values())
    scope = hashvalue[0]["scope"]
    dictionary_element_names = list(scope.keys())
    prefix = 'ummmmmmmmmm.'
    statement_elements = []
    statement_params = []
    for element_name in dictionary_element_names:
        statement_elements.append(
            ' OR '.join(
                f"{prefix}{element_name} = %s"
                for item in scope[element_name]
            )
        )
        statement_params += scope[element_name]
    return '(' + ') AND ('.join(statement_elements) + ')', statement_params
async def query_for_data(
    connection: "PgService", dictionary: Dict[str, Any]
) -> pd.DataFrame:
    sql_where_clause, sql_params = form_sql_statement_from_dictionary(dictionary)
    sql = f"""
    *multiple SQL statements*
    where 1 = 1
    and selector in ('YES', 'NO')
    and {sql_where_clause}
    """
    cursor = connection.cursor()
    cursor.execute(sql, sql_params)
    res = await cursor.fetchall()
    data = pd.DataFrame(res, columns=[k for k in res[0].keys()])
    return data
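For illustration, here is what the helper produces for an input shaped like the one in the question (the outer key name is made up; only the nested "scope" mapping matters):
dictionary = {
    "request": {
        "scope": {
            "parameter1": ["value1"],
            "parameter2": ["value2.1", "value2.2"],
        }
    }
}
clause, params = form_sql_statement_from_dictionary(dictionary)
# clause: (ummmmmmmmmm.parameter1 = %s) AND
#         (ummmmmmmmmm.parameter2 = %s OR ummmmmmmmmm.parameter2 = %s)
# params: ['value1', 'value2.1', 'value2.2']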

psycopg2 dictionary dump to json

Sorry if this is a noob question, but I am trying to dump a psycopg2 dictionary directly into a json string. I do get a return value in the browser, but it isn't formatted like most of the other json examples I see. The idea being to dump the result of a select statement into a json string and unbundle it on the other end to add into a database on the client side. The code is below and a sample of the return. Is there a better way to do this operation with json and psycopg2?
# initializing variables
location_connection = location_cursor = 0
sql_string = coordinate_return = data = ""
# opening connection and setting cursor
location_connection = psycopg2.connect("dbname='' user='' password=''")
location_cursor = location_connection.cursor(cursor_factory=psycopg2.extras.RealDictCursor)
# setting sql string and executing query
sql_string = "select * from " + tablename + " where left(alphacoordinate," + str(len(coordinate)) + ") = '" + coordinate + "' order by alphacoordinate;"
location_cursor.execute(sql_string)
data = json.dumps(location_cursor.fetchall())
# closing database connection
location_connection.close()
# returning coordinate string
return data
sample return
"[{\"alphacoordinate\": \"nmaan-001-01\", \"xcoordinate\":
3072951151886, \"planetarydiameter\": 288499, \"planetarymass\":
2.020936938e+27, \"planetarydescription\": \"PCCGQAAA\", \"planetarydescriptionsecondary\": 0, \"moons\": 1"\"}]"
You could create the JSON string directly in Postgres using row_to_json:
# setting sql string and executing query
sql_string = "select row_to_json(" + tablename + ") from " + tablename + " where left(alphacoordinate," + str(len(coordinate)) + ") = '" + coordinate + "' order by alphacoordinate;"
location_cursor.execute(sql_string)
data = location_cursor.fetchall()
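As an aside, the interpolated coordinate value still carries the injection risk discussed in the answers above. A sketch of the same query with the values parameterised (the table name must still come from a trusted source, and a default tuple cursor is assumed):
sql_string = (
    "select row_to_json({0}) from {0} "
    "where left(alphacoordinate, %s) = %s "
    "order by alphacoordinate;".format(tablename)
)
location_cursor.execute(sql_string, (len(coordinate), coordinate))
# each row is a one-element tuple holding the JSON for that record
data = json.dumps([row[0] for row in location_cursor.fetchall()])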

Why does pymysql not insert records into the table?

I am pretty new to Python development. I have a long Python script that "clones" a database and adds additional stored functions and procedures. Cloning means copying only the schema of the DB. These steps work fine.
My question is about pymysql insert execution:
I have to copy some table contents into the new DB. I don't get any SQL error. If I debug or print the created INSERT INTO command, it is correct (I've tested it in an SQL editor). The insert execution seems correct because the result contains the exact row count... but all rows are missing from the destination table in the destination DB.
(Of course the DB_* variables have been defined!)
import pymysql

liveDbConn = pymysql.connect(DB_HOST, DB_USER, DB_PWD, LIVE_DB_NAME)
testDbConn = pymysql.connect(DB_HOST, DB_USER, DB_PWD, TEST_DB_NAME)
tablesForCopy = ['role', 'permission']
for table in tablesForCopy:
    with liveDbConn.cursor() as liveCursor:
        # Get name of columns
        liveCursor.execute("DESCRIBE `%s`;" % (table))
        columns = ''
        for column in liveCursor.fetchall():
            columns += '`' + column[0] + '`,'
        columns = columns.strip(',')
        # Get and convert values
        values = ''
        liveCursor.execute("SELECT * FROM `%s`;" % (table))
        for result in liveCursor.fetchall():
            data = []
            for item in result:
                if type(item) == type(None):
                    data.append('NULL')
                elif type(item) == type('str'):
                    data.append("'" + item + "'")
                elif type(item) == type(datetime.datetime.now()):
                    data.append("'" + str(item) + "'")
                else:  # for numeric values
                    data.append(str(item))
            v = '(' + ', '.join(data) + ')'
            values += v + ', '
        values = values.strip(', ')

    print("### table: %s" % (table))
    testDbCursor = testDbConn.cursor()
    testDbCursor.execute("INSERT INTO `" + TEST_DB_NAME + "`.`" + table +
                         "` (" + columns + ") VALUES " + values + ";")
    print("Result: {}".format(testDbCursor._result.message))

liveDbConn.close()
testDbConn.close()
Result is:
### table: role
Result: b"'Records: 16 Duplicates: 0 Warnings: 0"
### table: permission
Result: b'(Records: 222 Duplicates: 0 Warnings: 0'
What am I doing wrong? Thanks!
You have 2 main issues here:
You don't use conn.commit() (which would be either liveDbConn.commit() or testDbConn.commit() here). Changes to the database will not be reflected without committing those changes. Note that all changes need committing, but a SELECT, for example, does not.
Your query is open to SQL injection. This is a serious problem.
Table names cannot be parameterized, so there's not much we can do about that, but you'll want to parameterize your values. I've made multiple corrections to the code in relation to type checking as well as parameterization.
for table in tablesForCopy:
    with liveDbConn.cursor() as liveCursor:
        liveCursor.execute("SELECT * FROM `%s`;" % (table))
        name_of_columns = [item[0] for item in liveCursor.description]
        insert_list = []
        for result in liveCursor.fetchall():
            data = []
            for item in result:
                if item is None:  # test identity against the None singleton
                    data.append(None)  # the driver renders None as NULL
                elif isinstance(item, str):  # use isinstance to check types
                    data.append(item)
                elif isinstance(item, datetime.datetime):
                    data.append(item.strftime('%Y-%m-%d %H:%M:%S'))
                else:  # for numeric values
                    data.append(str(item))
            insert_list.append(data)

    testDbCursor = testDbConn.cursor()
    columns = ', '.join('`{}`'.format(c) for c in name_of_columns)
    placeholders = ', '.join(['%s'] * len(name_of_columns))
    testDbCursor.executemany(
        "INSERT INTO `{}`.`{}` ({}) VALUES ({})".format(
            TEST_DB_NAME, table, columns, placeholders),
        insert_list)
    testDbConn.commit()
From this GitHub thread, I notice that executemany does not work as expected in psycopg2; it instead sends each entry as a single query. You'll need to use execute_batch:
from psycopg2.extras import execute_batch

execute_batch(testDbCursor,
              "INSERT INTO {}.{} ({}) VALUES ({})".format(
                  TEST_DB_NAME, table, columns, placeholders),
              insert_list)
testDbConn.commit()

How to insert data into a table using Python pymysql

Find my solution below:
import pymysql
import datetime

# Create a connection object
dbServerName = "127.0.0.1"
port = 8889
dbUser = "root"
dbPassword = ""
dbName = "blog_flask"
# charSet = "utf8mb4"
conn = pymysql.connect(host=dbServerName, user=dbUser, password=dbPassword,
                       db=dbName, port=port)
try:
    # Create a cursor object
    cursor = conn.cursor()
    # Insert a row into the MySQL table
    now = datetime.datetime.utcnow()
    my_datetime = now.strftime('%Y-%m-%d %H:%M:%S')
    cursor.execute('INSERT INTO posts (post_id, post_title, post_content, '
                   'filename, post_time) VALUES (%s,%s,%s,%s,%s)',
                   (5, 'title2', 'description2', 'filename2', my_datetime))
    conn.commit()
except Exception as e:
    print("Exception occurred: {}".format(e))
finally:
    conn.close()
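If several rows need inserting, the same parameterised statement works with executemany; a small sketch against the same hypothetical posts table:
rows = [
    (6, 'title3', 'description3', 'filename3', my_datetime),
    (7, 'title4', 'description4', 'filename4', my_datetime),
]
# executemany reuses the statement for every tuple in the sequence,
# so the values are still escaped by the driver.
cursor.executemany('INSERT INTO posts (post_id, post_title, post_content, '
                   'filename, post_time) VALUES (%s,%s,%s,%s,%s)', rows)
conn.commit()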
