sqlite3 parameters of unsupported type - python

I am trying to make a simple web budget app with a SQL database but I keep getting ValueError: parameters are of an unsupported type
from flask import Flask, render_template, request

app = Flask(__name__)

@app.route("/")
def index():
    return render_template('index.html')

@app.route("/start", methods=['GET', 'POST'])
def start():
    import sqlite3
    import pandas as pd
    if request.method == 'POST':
        connection = sqlite3.connect('transactions.db')
        cursor = connection.cursor()
        amount = request.form.get('amount')
        description = request.form.get('description')
        category = request.form.get('category')
        print(amount, description, category)
        cursor.execute(f"""INSERT INTO transactions (amount, description, category)
                       VALUES ({amount}, '{description}', '{category}')""", connection)
        cursor.close()
        connection.commit()
        table = pd.read_sql("""SELECT * FROM transactions """, connection)
        return (render_template('start.html'), value1 == table)
    else:
        return render_template('start.html')
Here is the SQL database:
-- SQLite
CREATE TABLE transactions (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    amount REAL,
    description TEXT,
    category TEXT
);
The inputs I used were
10.50, 'Dog Food', and 'Misc.'

Quick solution: don't pass the connection, and use SQL placeholder binding (with ?) rather than string formatting.
cursor.execute(f"""
INSERT INTO transactions (amount, description, category)
VALUES (?, ?, ?)
""", (amount, description, category))
Longer explanation with your original code:
cursor.execute(f"""INSERT INTO transactions (amount, description, category)
VALUES ({amount}, '{description}', '{category}')""", connection)
Your original code passes the connection as the second argument to cursor.execute, and a sqlite3 Connection instance is an unsupported type there: the only supported second argument is a sequence or dict of parameters to substitute into your query, which is why you get the ValueError.
Also see the actual definition of that argument, and why my solution uses ? placeholders instead of string substitution:
https://docs.python.org/3/library/sqlite3.html#sqlite3.Cursor.execute
execute(sql, parameters=(), /)
Execute SQL statement sql. Bind values to the statement
using placeholders that map to the sequence or dict parameters.
https://docs.python.org/3/library/sqlite3.html#sqlite3-placeholders
SQL operations usually need to use values from Python variables. However, beware of using Python’s string operations to assemble queries, as they are vulnerable to SQL injection attacks (see the xkcd webcomic for a humorous example of what can go wrong):
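Putting it all together, here is a rough sketch of what the whole /start route could look like with placeholder binding. It assumes the same transactions.db file, templates and form fields as in your question, and passes the query result to the template as a hypothetical value1 variable (how start.html renders it is up to you):

import sqlite3
import pandas as pd
from flask import Flask, render_template, request

app = Flask(__name__)

@app.route("/start", methods=['GET', 'POST'])
def start():
    # Open the connection per request and make sure it always gets closed.
    connection = sqlite3.connect('transactions.db')
    try:
        if request.method == 'POST':
            amount = request.form.get('amount')
            description = request.form.get('description')
            category = request.form.get('category')
            # Placeholder binding: sqlite3 quotes and escapes the values for us.
            connection.execute(
                "INSERT INTO transactions (amount, description, category) VALUES (?, ?, ?)",
                (amount, description, category),
            )
            connection.commit()
            table = pd.read_sql("SELECT * FROM transactions", connection)
            return render_template('start.html', value1=table.to_html())
        return render_template('start.html')
    finally:
        connection.close()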

Related

Raw Query with SQL function in SQLAlchemy/encode/databases

I'm a complete beginner at Python and FastAPI.
I'm using FastAPI and I have a table where the requirement is to encrypt the personal information of the user using pgcrypto module of PostgreSQL.
The raw query would be something like this which can be executed into any database client and it executes without any error
insert into customers (email, gender) values (pgm_sym_encrypt('hello#gmail.com', 'some_secret_key'), 'male')
How to execute this query using SQLAlchemy core or encode/databases?
I've tried this
from sqlalchemy import func
query = f"""
insert into customers (email, gender) values
(:email, :gender)
"""
await database.execute(query=query, values={'email': func.pgm_sys_encrypt('hello#gmail.com', 'secret_key'), 'gender': 'male'})
It didn't work.
I also tried
query = f"""
insert into customers (email, gender) values
(pgm_sys_encrypt('hello#gmail.com', 'secret_key'), :gender)
"""
await database.execute(query=query, values={'gender': 'male'})
This didn't work either. I have no idea how to execute a function in the raw query; I've tried a lot but I'm still clueless on this one. Thank you in advance for your help.
As it's a raw query you should be able to specify it as you would in raw SQL, so this should work:
from sqlalchemy.sql import text
query = """
insert into customers (email, gender) values
(pgm_sys_encrypt(:email, :secret_key), :gender)
"""
await database.execute(query=text(query), values={'email': 'hello#gmail.com', 'secret_key': 's3cr37', 'gender': 'male'})
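If it helps, here is a minimal sketch of how that might sit inside a FastAPI endpoint. The DSN, the customers table and the secret key are assumptions, and pgm_sys_encrypt is used exactly as written in your question; the query is passed as a plain string so that encode/databases binds the :named parameters itself:

from databases import Database
from fastapi import FastAPI

app = FastAPI()
database = Database("postgresql://user:password@localhost/mydb")  # assumed DSN

# The function call stays in the SQL text; email, key and gender are all
# bound as :named parameters instead of being interpolated.
INSERT_CUSTOMER = """
    insert into customers (email, gender) values
    (pgm_sys_encrypt(:email, :secret_key), :gender)
"""

@app.on_event("startup")
async def startup():
    await database.connect()

@app.post("/customers")
async def create_customer(email: str, gender: str):
    await database.execute(
        query=INSERT_CUSTOMER,
        values={"email": email, "secret_key": "s3cr37", "gender": gender},
    )
    return {"status": "created"}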

Passing arguments to MySQL using Python

I am trying to write a program in Python that accepts the user inputs and queries from the MySQL database.
My database has the following tables:
Departments(dept_no (primary key), dept_name)
Employees(emp_no(primary key), f_name, l_name, dob, hire_date)
Salaries(emp_no(primary key), salary, from_date(primary key), to_date)
When I give the following input: Display the employees with salary greater than 20000.
Then the program should perform the following action:
from sqlite3 import Error
from MySQLdb import connect

def mysql_code():
    try:
        with connect(host="localhost", user="root", password="root", database="employees") as connection:
            with connection.cursor() as cursor:
                cursor.execute("SELECT e.emp_no, e.first_name, e.last_name, s.salary from employees e inner join "
                               "salaries s on e.emp_no = s.emp_no where s.salary > '20000'")
                records = cursor.fetchall()
                print(records)
                print("Total results found = ", cursor.rowcount)
    except Error as e:
        print(e)
and display the results.
Is it possible to do so or do I have to write code for each possible query?
I previously used:
cursor.execute("SELECT * FROM {} WHERE {} like %s".format(table, column), (text,))
Previously I defined the queries myself and gave the user options to choose from, writing a query for each possible input (display all records, search records by first name, and so on). When the user chose an option, the result was displayed.
Now I want the user to give inputs such as
"Display employees with salaries greater than 20000 working in dept_no d002." or similar queries.
The program should accept queries in the form of a string from the user.
The code should join the tables and display the result by joining the emp_no, first_name, last_name, salary, dept_no from the tables employees, salaries and departments respectively.
You have an error in your code: the like after the on comparison is wrong.
from sqlite3 import Error
from MySQLdb import connect

def mysql_code():
    try:
        with connect(host="localhost", user="root", password="root", database="employees") as connection:
            with connection.cursor() as cursor:
                cursor.execute("SELECT e.emp_no, e.first_name, e.last_name, s.salary from employees e inner join "
                               "salaries s on e.emp_no = s.emp_no where s.salary > '20000'")
                records = cursor.fetchall()
                print(records)
                print("Total results found = ", cursor.rowcount)
    except Error as e:
        print(e)
If I'm understanding your question correctly, you want to perform customizable MySQL queries in Python by writing one generic function and passing it the pieces of the query.
For this I would use f-strings to build the query string:
cursor.execute(f"SELECT {values_to_select} \
from {table_name} \
where {filters} \
{more_stuff/extra_queries}")
This way you can pass the values to the function and it will perform the query with the values specified.
Note: as a test this is fine, but if this code is going to be used in production, giving the end user direct control over queries can be a huge security risk; check out SQL injection to see how these kinds of attacks happen. I would recommend sanitizing the given strings first so they can't create custom SQL queries.
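For the specific example in the question ("salary greater than 20000"), one way to keep user input out of the SQL text is to whitelist which columns and operators may appear in the query and bind the actual value with %s. A rough sketch, assuming the employees/salaries tables from the question; find_employees and the whitelists are made-up names:

from MySQLdb import connect

# Hypothetical whitelists: only these fragments may appear in the SQL text.
ALLOWED_FILTER_COLUMNS = {"salary": "s.salary", "first_name": "e.first_name"}
ALLOWED_OPERATORS = {">", "<", "="}

def find_employees(column, operator, value):
    if column not in ALLOWED_FILTER_COLUMNS or operator not in ALLOWED_OPERATORS:
        raise ValueError("unsupported filter")
    sql = (
        "SELECT e.emp_no, e.first_name, e.last_name, s.salary "
        "FROM employees e INNER JOIN salaries s ON e.emp_no = s.emp_no "
        f"WHERE {ALLOWED_FILTER_COLUMNS[column]} {operator} %s"
    )
    connection = connect(host="localhost", user="root", password="root",
                         database="employees")
    try:
        cursor = connection.cursor()
        # The user-supplied value is bound by the driver, not interpolated.
        cursor.execute(sql, (value,))
        return cursor.fetchall()
    finally:
        connection.close()

# e.g. find_employees("salary", ">", 20000)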

Python cx_Oracle module: unable to format query in code

I am using the cx_Oracle module to connect to an Oracle database. In the script I use two variables, schema_name and table_name. The below query works fine:
cur1.execute("select owner,table_name from dba_tables where owner ='schema_name'")
But I need to query the number of rows of a table, where I need to qualify the table_name with the schema_name, so the query should be
SELECT count(*) FROM "schema_name"."table_name"
This does not work when used in the code; I have tried putting it in triple quotes, single quotes and other options, but it does not format the query as expected and hence errors out with "table does not exist".
Any guidance is appreciated.
Since bind variables cannot be used for identifiers such as schema or table names, you can build the statement with string formatting of the form "...{}.{}".format(sc, tb):
sc='myschema'
tb='mytable'
cur1.execute("SELECT COUNT(*) FROM {}.{}".format(sc,tb))
print(cur1.fetchone()[0])
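Because the schema and table names are interpolated rather than bound, it is worth validating them first if they can come from user input. A minimal sketch, assuming cur1 is the open cx_Oracle cursor from the question and that unquoted identifiers are acceptable:

import re

# Oracle unquoted identifiers start with a letter and may contain letters,
# digits, _, $ and #; reject anything else before formatting it in.
IDENTIFIER_RE = re.compile(r"^[A-Za-z][A-Za-z0-9_$#]*$")

def count_rows(cur, schema, table):
    if not (IDENTIFIER_RE.match(schema) and IDENTIFIER_RE.match(table)):
        raise ValueError("invalid schema or table name")
    cur.execute("SELECT COUNT(*) FROM {}.{}".format(schema, table))
    return cur.fetchone()[0]

print(count_rows(cur1, "myschema", "mytable"))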
In this particular case, you could also try setting Connection.current_schema; see the cx_Oracle API doc.
For example, if you create a table in your own schema:
SQL> show user
USER is "CJ"
SQL> create table ffff (mycol number);
Table created.
SQL> insert into ffff values (1);
1 row created.
SQL> commit;
Commit complete.
Then run Python code that connects as a different user:
import cx_Oracle
import os
import sys

if sys.platform.startswith("darwin"):
    cx_Oracle.init_oracle_client(lib_dir=os.environ.get("HOME")+"/Downloads/instantclient_19_8")

username = "system"
password = "oracle"
connect_string = "localhost/orclpdb1"

connection = cx_Oracle.connect(username, password, connect_string)
connection.current_schema = 'CJ'

with connection.cursor() as cursor:
    sql = """select * from ffff"""
    for r in cursor.execute(sql):
        print(r)
    sql = """select sys_context('USERENV','CURRENT_USER') from dual"""
    for r in cursor.execute(sql):
        print(r)
The output will be:
(1,)
('SYSTEM',)
The last query shows that it is not the user that is being changed; only the unqualified table name in the first query is automatically resolved as 'CJ.ffff' instead of 'ffff'.
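Tying this back to the original question, here is a sketch of the row-count query with current_schema set, reusing the connection object from above; the table name still cannot be a bind variable, so it is assumed to have been validated (for example against a whitelist) before being formatted in:

# current_schema already qualifies unprefixed table names, so only the
# (validated) table name has to be formatted into the statement.
table_name = 'ffff'  # hypothetical, assumed validated elsewhere

with connection.cursor() as cursor:
    cursor.execute("SELECT COUNT(*) FROM {}".format(table_name))
    print(cursor.fetchone()[0])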

psycopg2 - Using SQL object with execute_values

I'm inserting data using execute_values, which takes a sql query. The query is constructed using psycopg2.sql.SQL as recommended in the documentation, but execute_values won't take that object.
Here's the code I have:
import psycopg2 as pg
from psycopg2 import extras
from psycopg2 import sql
config = {
'host' : 'localhost',
'user' : 'username',
'password' : 'password',
'dbname' : 'myDatabase'
}
connection = pg.connect(**config)
cursor = connection.cursor()
tableName = 'myTable'
dataset = [[1,2],[3,4],[5,6]]
queryText = "INSERT INTO {table} (uid,value) VALUES %s"
query = sql.SQL(queryText).format(table=sql.Identifier(tableName))
extras.execute_values(cursor,query,dataset)
The last line gives the following error:
AttributeError: 'Composed' object has no attribute 'encode'
If the query is specified directly as a string, as below, then the execution runs.
query = """INSERT INTO "myTable" (uid,value) VALUES %s"""
It's possible to insert the table name into the query using string format, but apparently that shouldn't be done, even at gunpoint. How can I safely insert a variable table name into the query and use execute_values? I can't find a built-in way to convert the SQL object to a string.
The parameter sql in execute_values(cur, sql, argslist, template=None, page_size=100) is supposed to be a string:
sql – the query to execute. It must contain a single %s placeholder, which will be replaced by a VALUES list. Example: "INSERT INTO mytable (id, f1, f2) VALUES %s".
Use the as_string(context) method:
extras.execute_values(cursor, query.as_string(cursor), dataset)
connection.commit()
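The same approach extends to column names. A sketch that reuses cursor, connection, tableName and dataset from the question and composes the column list with sql.Identifier as well (the runtime column list is a made-up example):

from psycopg2 import extras, sql

columns = ['uid', 'value']  # hypothetical: column names decided at runtime

query = sql.SQL("INSERT INTO {table} ({fields}) VALUES %s").format(
    table=sql.Identifier(tableName),
    fields=sql.SQL(', ').join(sql.Identifier(c) for c in columns),
)

# as_string() accepts either a connection or a cursor as its context.
extras.execute_values(cursor, query.as_string(connection), dataset)
connection.commit()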
As execute_values() expects the SQL statement to be a plain string, you can also build it directly; note that the Identifier still has to be rendered with as_string(), otherwise its repr ends up in the query:
queryText = "INSERT INTO {table} (uid,value) VALUES %s".format(table=sql.Identifier(tableName).as_string(cursor))

PostgreSQL Schema "www" does not exist?

According to the PostgreSQL documentation, if you do an INSERT without specifying a schema it should go into the public schema.
conn = psycopg2.connect(dbname = 'orion',
                        host = 'localhost',
                        port = 5432,
                        user = 'earthling',
                        password = 'mysupersecretpassword')
sql = conn.cursor()

def INSERT(table, info, text):
    date = datetime.date.today()
    query = "INSERT INTO %s (info, text, date) " \
            "VALUES (%s, %s, %s)" %(table, info, text, date)
    sql.execute(query)

INSERT("main", "www.capecod.edu", "test")
For some reason I'm seeing the following error?
psycopg2.ProgrammingError: schema "www" does not exist
You are using string interpolation to create the query. This is what psycopg2 executes:
INSERT INTO main (info, text, date)
VALUES (www.capecod.edu, test, 2015-09-12)
If it's not obvious what's wrong here, it's that none of the values are quoted. Here is the properly quoted version:
INSERT INTO main (info, text, date)
VALUES ('www.capecod.edu', 'test', '2015-09-12')
The error is caused by the unquoted www.capecod.edu. Due to the dots, it's being interpreted as schema.table.column.
The "right" way to do this is with a parameterized query.
query = "INSERT INTO main (info, text, date) VALUES (%s, %s, %s)"
params = (info, text, date)
sql.execute(query, params)
psycopg2 will figure out what should be quoted and how. This is a safer option than simply interpolating the string yourself, which often leaves you open to SQL injection attack.
http://initd.org/psycopg/articles/2012/10/01/prepared-statements-psycopg/
Unfortunately, you can't just toss identifiers such as the table name in as a parameter, because then they are quoted as string values, which is bad SQL syntax. I found an answer (python adds "E" to string) that points to psycopg2.extensions.AsIs as a way to pass identifiers such as table names safely as parameters. I wasn't able to make this work in my testing, though.
If you go the AsIs route, you should be careful to check that the table names are valid if they somehow come from user input. Something like:
valid_tables = ["main", "foo", "bar", "baz"]
if table not in valid_tables:
    return False
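As an alternative to AsIs, newer psycopg2 releases (2.7 and later) ship psycopg2.sql for composing identifiers safely. A sketch reusing the conn object from the question; the cursor is created locally so it does not shadow the sql module, and insert_row is just an illustrative name:

import datetime
from psycopg2 import sql

def insert_row(conn, table, info, text_value):
    # Identifier() quotes the table name properly, while the values are
    # still bound as ordinary %s parameters by psycopg2.
    query = sql.SQL("INSERT INTO {} (info, text, date) VALUES (%s, %s, %s)").format(
        sql.Identifier(table)
    )
    with conn.cursor() as cursor:
        cursor.execute(query, (info, text_value, datetime.date.today()))
    conn.commit()

insert_row(conn, "main", "www.capecod.edu", "test")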
