parameterised postgresql select statement using python

sql="select %s,tablename from pg_table_def where tablename like (%s)"
data=("schemaname","abc",)
cur.execute(sql,data)
If I pass the values as described above, the select treats the first one as a string literal rather than a column name, which is not the intention.
If I try
data=(schemaname,"abc",)
then it fails with the error: global name 'schemaname' is not defined.

You cannot parameterize an object name (in this case, a column name) that way. You can instead resort to string manipulation:
column = "schemaname"
sql = "select {}, tablename from pg_table_def where tablename like (%s)".format(column)
data= ("abc",)
cur.execute(sql,data)
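If you are using psycopg2 (version 2.7 or later), its sql module can compose identifiers more safely than plain string formatting. A minimal sketch, assuming a psycopg2 connection and the same cursor as above:
from psycopg2 import sql

column = "schemaname"
query = sql.SQL("select {}, tablename from pg_table_def where tablename like %s").format(
    sql.Identifier(column)  # rendered as a quoted identifier, not a value
)
cur.execute(query, ("abc",))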

Related

Pass Argument Through Sql Queries Pandas

I want to convert a lot of database tables into dataframes, so I tried this step manually first.
sql_query = pd.read_sql_query('''
    SELECT *
    FROM attendance
    ''', test_db_engine)
test_db_attendance_df = pd.DataFrame(sql_query)
Here test_db_engine is the database connection. This method works and I can create a dataframe for the attendance table.
Now I want to put this into a function so I can do it with any table, not just one. So I tried this:
def sql_to_df(table_name):
    sql_query = pd.read_sql_query('''
        SELECT *
        FROM table_name
        ''', test_db_engine)
    test_db_df = pd.DataFrame(sql_query)
    return test_db_df

sql_to_df(attendance)
It threw an error:
name 'attendance' is not defined
Can anyone tell me how to pass a function argument into the SQL query so I can convert any number of database tables into pandas dataframes? I need attendance to replace table_name inside the SQL query.
Python thinks that attendance is a variable, but you need to pass a string to the function and then use string replacement:
def sql_to_df(table_name):
    sql_query = pd.read_sql_query('''
        SELECT *
        FROM %s
        ''' % (table_name), test_db_engine)
    test_db_df = pd.DataFrame(sql_query)
    return test_db_df

sql_to_df('attendance')
Use f-strings to format your query, and pass attendance as a string (your error occurred because no variable named attendance was defined). Also note that read_sql_query already returns a dataframe, so the extra pd.DataFrame call is not needed:
def sql_to_df(table_name):
    return pd.read_sql_query(f'''
        SELECT *
        FROM {table_name}
        ''', test_db_engine)

sql_to_df("attendance")

use string as columns definition for DataFrame(cursor.fetchall(),columns

I would like to use a string as the column names for a pandas DataFrame.
The problem is that pandas interprets the string variable as a single column instead of multiple ones, and thus the error:
ValueError: 1 columns passed, passed data had 11 columns
The first part of my code is intended to get the column names from the MySQL database I am about to query:
cursor1.execute("SELECT GROUP_CONCAT(COLUMN_NAME) AS cols FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_SCHEMA = 'or_red' AND TABLE_NAME = 'nomen_prefix'")
for colsTableMysql in cursor1.fetchall():
    colsTable = colsTableMysql[0]
    colsTable = "'" + colsTable.replace(",", "','") + "'"
The second part uses the created variable colsTable:
cursor = connection.cursor()
cursor.execute("SELECT * FROM or_red.nomen_prefix WHERE C_emp IN ("+emplazamientos+")")
tabla = pd.DataFrame(cursor.fetchall(),columns=[colsTable])
#tabla = exec("pd.DataFrame(cursor.fetchall(),columns=["+colsTable+"])")
#tabla = pd.DataFrame(cursor.fetchall())
I have tried other approaches like the use of exec(). In that case there is no error, but no information comes back either, and the result of print(tabla) is None.
Is there any direct way of passing the columns dynamically as a string to a pandas DataFrame?
Thanks in advance.
I am going to answer my own question since I've already found the way.
The first part of my code is intended to get the column names from the MySQL database table I am about to query:
cursor1.execute("SELECT GROUP_CONCAT(COLUMN_NAME) AS cols FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_SCHEMA = 'or_red' AND TABLE_NAME = 'nomen_prefix'")
for colsTableMysql in cursor1.fetchall():
    colsTable = colsTableMysql[0]
    colsTable = "'" + colsTable.replace(",", "','") + "'"
The second part uses the created variable colsTable as input in the statement to define the columns:
cursor = connection.cursor()
cursor.execute("SELECT * FROM or_red.nomen_prefix WHERE C_emp IN ("+emplazamientos+")")
tabla = eval("pd.DataFrame(cursor.fetchall(),columns=["+colsTable+"])")
Using eval, the string is parsed and evaluated as a Python expression.
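An alternative sketch that avoids eval entirely: split the comma-separated string into a plain Python list and pass that as columns (assuming the same cursors and variables as above):
cursor1.execute(
    "SELECT GROUP_CONCAT(COLUMN_NAME) AS cols "
    "FROM INFORMATION_SCHEMA.COLUMNS "
    "WHERE TABLE_SCHEMA = 'or_red' AND TABLE_NAME = 'nomen_prefix'"
)
cols = cursor1.fetchone()[0].split(",")  # plain list of column names, no quoting needed

cursor.execute("SELECT * FROM or_red.nomen_prefix WHERE C_emp IN (" + emplazamientos + ")")
tabla = pd.DataFrame(cursor.fetchall(), columns=cols)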

Parameterized Python SQLite3 query is returning the first parameter

I'm trying to make a query to a SQLite database from a Python script. However, whenever I use parameterization it just returns the first parameter, which is column2. The desired result is for it to return the value held in column2 in the row where column1 is equal to row1.
conn = sqlite3.connect('path/to/database')
c = conn.cursor()
c.execute('SELECT ? from table WHERE column1 = ? ;', ("column2","row1"))
result = c.fetchone()[0]
print(result)
It prints
>>column2
Whenever I run this using concatenated strings, it works fine.
conn = sqlite3.connect('path/to/database')
c = conn.cursor()
c.execute('SELECT ' + column2 + ' from table WHERE column1 = ' + row1 + ';')
result = c.fetchone()[0]
print(result)
And it prints:
>>desired data
Any idea why this is happening?
This behaves as designed.
The mechanism that parameterized queries provide is meant to pass literal values to the query, not meta information such as column names.
One thing to keep in mind is that the database must be able to parse the parameterized query string without having the parameter at hand: obviously, a column name cannot be used as a parameter under that constraint.
For your use case, the only possible solution is to concatenate the column name into the query string, as shown in your second example. If the parameter comes from outside your code, be sure to properly validate it before that (for example, by checking it against a fixed list of values).
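A minimal sketch of that approach, validating the column name against a fixed list and keeping the row value as a parameter (my_table is a hypothetical table name, since table itself is a reserved word in SQLite):
import sqlite3

ALLOWED_COLUMNS = {"column1", "column2"}  # fixed list of valid column names

conn = sqlite3.connect('path/to/database')
c = conn.cursor()

column = "column2"
if column not in ALLOWED_COLUMNS:
    raise ValueError("invalid column name")

# the column name is concatenated only after validation; the row value stays a parameter
c.execute('SELECT ' + column + ' FROM my_table WHERE column1 = ?;', ("row1",))
print(c.fetchone()[0])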

passing string arguments to filter database rows in python

I have written the function below to filter a column in a SQL query. The function takes a string argument which will be inserted in the WHERE clause:
def summaryTable(machineid):
    df = pd.read_sql(""" SELECT fld_ATM FROM [003_tbl_ATM_Tables]
        WHERE (LINK <> 1) AND (fld_ATM =('machineid')) ;
        """, connection)
    connection.close()
    return df
The function returns an empty DataFrame. I know the query itself is correct because I get the expected data when I hardcode the machine id.
Use params to pass a tuple of parameters including machineid to read_sql. pyodbc replaces the ? character in your query with parameters from the tuple, in order. Their values will be safely substituted at runtime. This avoids dangerous string formatting issues which may result in SQL injection.
df = pd.read_sql(""" SELECT fld_ATM FROM [003_tbl_ATM_Tables]
WHERE (LINK <> 1) AND (fld_ATM = ?) ;
""", connection, params=(machineid,))
You need to add machineid to the query using params:
# ? is the placeholder style used by pyodbc. Some use %s, for example.
query = """ SELECT fld_ATM FROM [003_tbl_ATM_Tables]
WHERE (LINK <> 1) AND (fld_ATM = ?) ;
"""
data_df = pd.read_sql_query(query, engine, params=(machineid, ))
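Putting the parameterised query back into the original function, a sketch that assumes the same connection object stays open (the machine id in the usage line is hypothetical):
def summaryTable(machineid):
    query = """SELECT fld_ATM FROM [003_tbl_ATM_Tables]
               WHERE (LINK <> 1) AND (fld_ATM = ?);"""
    # the ? placeholder is filled safely from params at execution time
    return pd.read_sql(query, connection, params=(machineid,))

df = summaryTable("ATM0001")  # hypothetical machine id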

use only variables to model an sqlite table with python

I'm practicing with SQLite and Python. I'm trying to build a TABLE using only user input for the database object names. After some extensive searching (the official documentation says nothing about this kind of syntax; please correct me!) I found this method:
new_table = raw_input('Enter a table name: ')
column = raw_input('Enter column name: ')
cur.execute(''' CREATE TABLE IF NOT EXISTS {tn} ({col})'''\
.format(tn = new_table, col = column))
It works very nicely and I find it intuitive. My problem is with the INSERT INTO syntax. While the following code works ok:
cur.execute("INSERT INTO {tn} ({col}) VALUES (?)", ('goodmorning')\
.format(tn=new_table, col=column))
The code below won't work:
insdata = raw_input('Insert data for column: ')
cur.execute("INSERT INTO {tn} ({col}) VALUES (?)", (insdata,)\
.format(tn=new_table, col=column))
and fails with the error: 'tuple' object has no attribute 'format'.
The question is: what is the proper syntax to pass the insdata value to the SQLite VALUES clause?
If you write this in a slightly clearer fashion, you'll see what's going on:
cur.execute(
    "INSERT INTO {tn} ({col}) VALUES (?)",
    (insdata,).format(tn=new_table, col=column)
)
You're not formatting the string, you're formatting the tuple of arguments. Instead, you want:
cur.execute(
    "INSERT INTO {tn} ({col}) VALUES (?)".format(tn=new_table, col=column),
    (insdata,)
)
or perhaps a little more clearly:
sql = "INSERT INTO {tn} ({col}) VALUES (?)".format(tn=new_table, col=column)
cur.execute(sql, (insdata,))
In this case your line continuation character is not needed at all (since you're inside a function call), but if it were needed, it would make more sense to place it between arguments rather than between an object and the method invocation on that object.
I think you are invoking the format method of the tuple (which does not have one) instead of the string with the SQL query:
cur.execute("INSERT INTO {tn} ({col}) VALUES ('{val}')".format(tn=new_table, col=column, val='goodmorning'))
