I am having trouble getting parameters passed in SQLAlchemy to act as column or table names.
For example, I'd like to select whatever column I pass as a parameter:
from sqlalchemy import text

result = connection.execute(
    text("SELECT :selected_column FROM example_table"),
    {'selected_column': 'col1'}
).fetchall()
But this ends up returning just the string version of the column name. How can I dynamically pass columns to the select statement?
Bound parameters are treated as literal values; they are not string substitution into the SQL text.
If you want a dynamic query that selects a column named by a Python variable, you have to expand that variable into the query string before the string is passed to the database connector.
Example:
mycol = 'col1'
result = connection.execute(
    f"""
    SELECT `{mycol}` FROM example_table
    """
).fetchall()
It's a good idea to delimit the column name in back-ticks, just in case the name contains special characters or is an SQL reserved keyword. Also it's up to you to make sure the name doesn't contain a back-tick character, or else you'll end up with imbalanced back-ticks.
These same rules apply if you combine Python variables with the query for any part that isn't normally a string literal or a numeric literal. E.g. table identifiers, keywords, expressions, etc.
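If the identifier can ever come from user input, it is safer to validate it before interpolating. A minimal sketch, assuming a fixed set of selectable columns (the allowlist and the helper function are illustrative, not part of SQLAlchemy):

# Hypothetical allowlist of columns callers may select from example_table.
ALLOWED_COLUMNS = {'col1', 'col2', 'col3'}

def select_column(connection, column):
    if column not in ALLOWED_COLUMNS:
        raise ValueError('unexpected column name: {!r}'.format(column))
    # The name is one of our own identifiers and contains no back-ticks,
    # so interpolating it into the SQL text is safe here.
    return connection.execute(
        'SELECT `{}` FROM example_table'.format(column)
    ).fetchall()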
Related
I am ingesting data into Snowflake tables using Python and SQLAlchemy. The tables I have created all require quotes to query both the table name and the column names. For example, select * from "database"."schema"."table" where "column" = 2; will run, while select * from database.schema.table where column = 2; will not. The only difference is the quotes.
I understand that if a table is created in Snowflake with quotes then quotes will be required to query it. However, I only put an Excel file in a Pandas data frame, then used SQLAlchemy and pd.to_sql to create the table. An example of my code:
engine = create_engine(URL(
    account='my_account',
    user='my_username',
    password='my_password',
    database='My_Database',
    schema='My_Schema',
    warehouse='My_Wh',
    role='My Role',
))
connection = engine.connect()
df.to_sql('My_Table', con=engine, if_exists='replace', index=False, index_label=None, chunksize=16384)
Does SQLAlchemy automatically create the tables with quotes? Is this a problem with the schema? I did not set that up. Is there a way around this?
From the Snowflake SQLAlchemy GitHub documentation:
Object Name Case Handling
Snowflake stores all case-insensitive object names in uppercase text. In contrast, SQLAlchemy considers all lowercase object names to be case-insensitive. Snowflake SQLAlchemy converts the object name case during schema-level communication, i.e. during table and index reflection. If you use uppercase object names, SQLAlchemy assumes they are case-sensitive and encloses the names with quotes. This behavior will cause mismatches against data dictionary data received from Snowflake, so unless identifier names have been truly created as case-sensitive using quotes, e.g., "TestDb", all lowercase names should be used on the SQLAlchemy side.
What I think this is trying to say is that SQLAlchemy treats any name containing capital letters as case-sensitive and automatically encloses it in quotes; conversely, any all-lowercase name is left unquoted. It doesn't look like this behaviour is configurable.
You probably don't have any control over the database name, and possibly not the schema name either, but when creating your table, stick to lowercase naming if you want consistent behaviour whether quoted or unquoted. You should then find that the table name works whether you use "my_table" or my_table.
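For example, reusing the engine from the question (a minimal sketch; my_table is an illustrative name):

df.to_sql('my_table', con=engine, if_exists='replace', index=False)

# The all-lowercase name is emitted unquoted, so Snowflake resolves it
# case-insensitively:
pd.read_sql('select * from my_table', con=engine)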
Using both pandas.read_sql and pandas.read_sql_table, I keep getting the entire table back with all the column names in lowercase.
Is there any way around this?
I wanted to do some transformations on the data and then replace the table in the DB, but it's a pain if doing so changes all the column names to lowercase.
# both of these produce the same lowercase columns
sql = 'SELECT * FROM "DB"."SCHEMA"."tablename";'
df = pd.read_sql(
    sql,
    con=engine
)
df = pd.read_sql_table(
    "tablename",
    con=engine
)
Snowflake stores all case-insensitive object names in uppercase text. In contrast, SQLAlchemy considers all lowercase object names to be case-insensitive. Snowflake SQLAlchemy converts the object name case during schema-level communication, i.e. during table and index reflection. If you use uppercase object names, SQLAlchemy assumes they are case-sensitive and encloses the names with quotes. This behavior will cause mismatches against data dictionary data received from Snowflake, so unless identifier names have been truly created as case-sensitive using quotes, e.g., "TestDb", all lowercase names should be used on the SQLAlchemy side.
https://github.com/snowflakedb/snowflake-sqlalchemy
Lowercase is the default behavior in SQLAlchemy. You should not use uppercase in SQLAlchemy or pandas unless it is necessary. You can either use quoted names ("...") or quoted_name in SQLAlchemy to mark an identifier as case-sensitive.
If you insist on getting uppercase columns everywhere, this post on using an event listener could be helpful: https://stackoverflow.com/a/34322171/12032355
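If you just need uppercase labels on the pandas side, a simple workaround (a plain pandas rename, nothing Snowflake-specific) is:

df = pd.read_sql_table("tablename", con=engine)
# Rename the lowercase labels to uppercase on the DataFrame itself.
df.columns = [c.upper() for c in df.columns]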
Are you sure it's an issue? https://docs.snowflake.com/en/sql-reference/identifiers-syntax.html says:
Unquoted object identifiers are stored and resolved as uppercase
characters (e.g. id is stored and resolved as ID).
def search(title="", author="", year="", isbn=""):
    con = mysql.connector.connect(host="localhost", user="root",
                                  passwd="junai2104", database="book")
    cur = con.cursor()
    sql_statement = "SELECT * FROM book WHERE title={} or author={} or year={} or isbn={}".format(title, author, year, isbn)
    cur.execute(sql_statement)
    rows = cur.fetchall()
    con.close()
    return rows

print(search(title='test2'))
How can I search for a value in MySQL using a Python argument?
How do I get the values from the arguments into the query?
You have a couple of issues with your code:
In your SQL SELECT statement you are looking for values in text columns (TEXT, VARCHAR etc.). To do so you must add single quotes around your search criteria, since you want to indicate a text literal. So WHERE title={} should be WHERE title='{}' (the same goes for the other parameters).
When one or more of your arguments are empty, you will search for rows where the respective column is an empty text. So in your example, search(title='test2') will trigger a search for an entry where the title column has the value 'test2' or any of the other three columns (author, year and isbn) has an empty text. If you intend to look for the title 'test2', this will only work if none of the other columns ever contains an empty text. And even then, because of the three OR operators in your query, performance will be poor. What you should do instead is evaluate each parameter individually and construct the query only from the parameters that are not empty (see the sketch after this list).
By constructing your query by formatting a string, you create a massive security issue if the values of your search parameters come from user input. Your code is wide open for SQL injection, which is one of the simplest and most effective attacks on your system. You should always parametrize your queries to prevent this attack. As a general principle, never create SQL queries by formatting or concatenating strings with their parameters. Note that with parametrized queries you do not need to add single quotes to your query as described in point 1.
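Putting points 2 and 3 together, here is a minimal sketch of a parametrized version of the function (mysql.connector uses %s placeholders; the column names are fixed in the code, never taken from user input):

import mysql.connector

def search(title=None, author=None, year=None, isbn=None):
    con = mysql.connector.connect(host="localhost", user="root",
                                  passwd="junai2104", database="book")
    cur = con.cursor()
    conditions, params = [], []
    for column, value in (("title", title), ("author", author),
                          ("year", year), ("isbn", isbn)):
        if value:
            conditions.append(column + " = %s")
            params.append(value)
    sql = "SELECT * FROM book"
    if conditions:
        sql += " WHERE " + " OR ".join(conditions)
    cur.execute(sql, params)  # values are bound by the driver, not formatted in
    rows = cur.fetchall()
    con.close()
    return rows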
I have a dictionary of column name/value pairs to insert into a table, and a function that generates the INSERT statement. I'm stuck because the function always puts quotes around the values, and some are integers.
e.g. if column 1 is of type integer then the statement should be INSERT INTO myTable (col1) VALUES 5; vs
INSERT INTO myTable (col1) VALUES '5'; the second one causes an error saying column 5 does not exist.
EDIT: I found the problem (I think). The value was in double quotes, not single, so it was "5".
In Python, given a table and column name, how can I test if the INSERT statement needs to have '' around the VALUES?
This question was tagged with "psycopg2" -- you can write the statement with %s placeholders and have psycopg2 adapt the Python types for you in many cases.
cur.execute('INSERT INTO myTable (col1, col2) VALUES (%s, %s);', (5, 'abc'))
psycopg2 will deal with it for you, because Python knows that 5 is an integer and 'abc' is a string.
http://initd.org/psycopg/docs/usage.html#passing-parameters-to-sql-queries
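If you want to inspect what psycopg2 actually sends, cursor.mogrify returns the query after argument binding (handy for debugging, not for building queries):

print(cur.mogrify('INSERT INTO myTable (col1, col2) VALUES (%s, %s);',
                  (5, 'abc')))
# b"INSERT INTO myTable (col1, col2) VALUES (5, 'abc');"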
You certainly want to use a library function to decide whether or not to quote values you insert. If you are inserting anything input by a user, writing your own quoting function can lead to SQL Injection attacks.
It appears from your tags that you're using psycopg2 - I've found another response that may be able to answer your question, since I'm not familiar with that library. The main gist seems to be that you should use
cursor.execute("query with params %s %s", ("param1", "pa'ram2"))
Which will automatically handle any quoting needed for param1 and param2.
Although I personally don't like the idea, you can use single quotes around integers when you insert in Postgres.
Perhaps your problem is the lack of parentheses:
INSERT INTO myTable(col1)
VALUES('5');
Here is a SQL Fiddle illustrating this code.
As you note in the comments, double quotes do not work in Postgres.
You can always use single quotes (be careful: if the value contains a quote you must double it): insert into example (value_t) values ('O''Hara');
You can decide by checking the value you want to insert, regardless of the type of the destination column.
Or you can decide by checking the type of the target field.
As you can see in http://sqlfiddle.com/#!15/8bfbd/3 there is no problem inserting integers into a text field, or strings that represent integers into a numeric field.
To check the field type you can use the information_schema:
select data_type from information_schema.columns
where table_schema='public'
and table_name='example'
and column_name='value_i';
http://sqlfiddle.com/#!15/8bfbd/7
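The same lookup from Python, as a minimal psycopg2 sketch (the connection string is a placeholder):

import psycopg2

conn = psycopg2.connect("dbname=mydb")  # placeholder connection string
cur = conn.cursor()
cur.execute(
    "SELECT data_type FROM information_schema.columns "
    "WHERE table_schema = %s AND table_name = %s AND column_name = %s",
    ("public", "example", "value_i"),
)
print(cur.fetchone()[0])  # e.g. 'integer'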
I'm having a tricky little issue with Python and MySQL. To keep it simple, the following code returns whatever is in the variable 'field', which is a string, such as 'username' or 'password'.
options = [field, userID]
entries = cursor.execute('select (?) from users where id=(?)', options).fetchall()
print(entries)
This code works correctly if I remove the first (?) and just use the actually name (like 'username') instead. Can anyone provide some input?
Your query is actually formed as:
select 'field' from users where id='value'
which returns you the string 'field' instead of the actual column value.
You cannot parameterize column and table names (docs):
Parameter placeholders can only be used to insert column values. They
can not be used for other parts of SQL, such as table names,
statements, etc.
Use string formatting for that part:
options = [userID]
query = 'select {field} from users where id=(?)'.format(field=field)
cursor.execute(query, options).fetchall()
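Since field is interpolated into the SQL text, it must never come from untrusted input; a quick allowlist check before formatting (the column names here are illustrative) keeps the query safe:

ALLOWED_FIELDS = {'username', 'password'}
if field not in ALLOWED_FIELDS:
    raise ValueError('unexpected field name: {!r}'.format(field))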
Related threads with some more explanations:
pysqlite: Placeholder substitution for column or table names?
Python MySQLdb: Query parameters as a named dictionary