I want to use the sqlbuilder library (https://sqlbuilder.readthedocs.io/en/latest/) for building native queries for sqlite. Here is my code for inserting data:
import sqlite3
from sqlbuilder.smartsql import Q, T
from sqlbuilder.smartsql.dialects.sqlite import compile
if __name__ == '__main__':
    connection = sqlite3.connect(':memory:')
    with connection:
        connection.execute('CREATE TABLE temp (t TEXT, i INTEGER)')
        insert = compile(Q(T.temp).insert({T.temp.t: 'text', T.temp.i: 1}))
        sql, params = insert
        connection.execute(
            sql, params
        )
    connection.close()
This code does not work, because compile produces incorrect sql and params for sqlite:
('(?, (?, ?))', ['INSERT INTO "temp" ("i", "t") VALUES (%s, %s)', 1, 'text']), and I get the error: sqlite3.OperationalError: near "(": syntax error.
Interestingly, there are no problems with compiling and executing select statements.
UPDATE:
Code for select statements, which works:
import sqlite3
from sqlbuilder.smartsql import Q, T
from sqlbuilder.smartsql.dialects.sqlite import compile
if __name__ == '__main__':
    connection = sqlite3.connect(':memory:')
    with connection:
        connection.execute('CREATE TABLE temp (t TEXT, i INTEGER)')
        select = compile(Q(T.temp).fields('*'))
        print(select)  # ('SELECT * FROM `temp`', [])
        sql, params = select
        connection.execute(
            sql, params
        )
    connection.close()
Answer (amended)
From the Python docs for the sqlite3 API:
Usually your SQL operations will need to use values from Python
variables. You shouldn’t assemble your query using Python’s string
operations because doing so is insecure; it makes your program
vulnerable to an SQL injection attack (see https://xkcd.com/327/ for a
humorous example of what can go wrong).
Instead, use the DB-API’s parameter substitution. Put ? as a
placeholder wherever you want to use a value, and then provide a tuple
of values as the second argument to the cursor’s execute() method.
(Other database modules may use a different placeholder, such as %s or
:1.) For example:
# Never do this -- insecure!
symbol = 'RHAT'
c.execute("SELECT * FROM stocks WHERE symbol = '%s'" % symbol)

# Do this instead
t = ('RHAT',)
c.execute('SELECT * FROM stocks WHERE symbol=?', t)
The returned value of insert, ('(?, (?, ?))', ['INSERT INTO "temp" ("i", "t") VALUES (%s, %s)', 1, 'text']), indicates sqlbuilder is trying to take this advice. What remains is how to do the string interpolation to put it into valid sqlite syntax. It turns out the result argument to the Q constructor will do just that.
insert = Q(T.temp, result=Result(compile=compile)).insert({T.temp.t: 'text', T.temp.i: 1}) will return a tuple that is "SQL ready", i.e. ('INSERT INTO `temp` (`i`, `t`) VALUES (?, ?)', [1, 'text']). Now you see the '%s' placeholders have been replaced by '?'. Don't forget to import Result.
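Putting it together, a minimal sketch of the corrected insert script (assuming Result is importable from sqlbuilder.smartsql; adjust the import to wherever your version exposes it):
import sqlite3
from sqlbuilder.smartsql import Q, T, Result
from sqlbuilder.smartsql.dialects.sqlite import compile

if __name__ == '__main__':
    connection = sqlite3.connect(':memory:')
    with connection:
        connection.execute('CREATE TABLE temp (t TEXT, i INTEGER)')
        # result=Result(compile=compile) makes the query compile directly
        # to sqlite syntax with '?' placeholders
        sql, params = Q(T.temp, result=Result(compile=compile)).insert(
            {T.temp.t: 'text', T.temp.i: 1})
        connection.execute(sql, params)
    connection.close()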
In my Python code I insert a value into a table.
In the table, there is a sequence which automatically assigns an ID.
After the insert, I want to get this ID back into my Python application:
import cx_Oracle, sys

with cx_Oracle.connect(user=ORA_USER, password=ORA_PWD, dsn=ORA_DSN) as conn:
    with conn.cursor() as cur:
        cur.execute("Insert into my_table columns(data) values ('Hello')")
        conn.commit()

with cx_Oracle.connect(user=ORA_USER, password=ORA_PWD, dsn=ORA_DSN) as conn:
    with conn.cursor() as cur:
        r = cur.execute("select id from my_table where data = 'Hello'")
        print(r)
        if r is None:
            print("Cannot retrieve ID")
            sys.exit()
Unfortunately, the result set r is always "None" even though the value has been inserted properly (checked via sqldeveloper).
What am I doing wrong?
I even open a new connection to be sure to grab the value...
After calling execute() for a SELECT statement you need to call fetchone(), fetchmany() or fetchall() as shown in the cx_Oracle documentation SQL Queries.
Or you can use an iterator:
with connection.cursor() as cursor:
    try:
        sql = """select systimestamp from dual"""
        for r in cursor.execute(sql):
            print(r)
        sql = """select 123 from dual"""
        (c_id,) = cursor.execute(sql).fetchone()
        print(c_id)
    except cx_Oracle.Error as e:
        error, = e.args
        print(sql)
        print('*'.rjust(error.offset + 1, ' '))
        print(error.message)
However to get an automatically generated ID returned without the overhead of an additional SELECT, you can change the INSERT statement to use a RETURNING INTO clause. There is an example in the cx_Oracle documentation DML RETURNING Bind Variables that shows an UPDATE. You can use similar syntax with INSERT.
With the table:
CREATE TABLE mytable
(myid NUMBER(11) GENERATED BY DEFAULT ON NULL AS IDENTITY (START WITH 1),
mydata VARCHAR2(20));
You can insert and get the generated key like:
myidvar = cursor.var(int)
sql = "INSERT INTO mytable (mydata) VALUES ('abc') RETURNING myid INTO :bv"
cursor.execute(sql, bv=myidvar)
i, = myidvar.getvalue()
print(i)
If you just want a unique identifier, you can get the ROWID of an inserted row without needing a bind variable. Simply access cursor.lastrowid after executing an INSERT.
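A minimal sketch of that, assuming the mytable definition above and an open cursor:
cursor.execute("INSERT INTO mytable (mydata) VALUES ('abc')")
# lastrowid holds the ROWID (a unique row identifier) of the inserted row
print(cursor.lastrowid)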
I want to insert a new row in my table by using the python-mariadb connector. For that I prefer to use the SET clause.
For some reason it does work if I only want to save ints (e.g. y=2), but when I use a string, the following error occurs:
Unknown column 'myString' in 'field list'
It seems it thinks the content of the string is a column name? Any idea how to fix that? (I can do it with INSERT INTO ... VALUES ..., but I want to use the SET clause here.) From my understanding, it should save both an int and a str without throwing an error.
Thank you.
See the code example below
def myfunction():
    x = 1
    y = 'myString'
    db = connect_db()
    cur = db.cursor()
    sql = "INSERT INTO Table SET col1={}, col2={}"
    cur.execute(sql.format(x, y))
    db.commit()
    db.close()
    return
Here is the MariaDB connector setup, but this should be fine as it works for other db functions.
import mariadb
def connect_db():
    db = mariadb.connect(
        user="user",
        password="123",
        host="localhost",
        port=3306,
        database="DB"
    )
    db.autocommit = False
    return db
You are not using the right syntax for INSERT:
sql = "INSERT INTO Table (col1, col2) VALUES ({}, {})"
But if you want to update an existing row, use UPDATE instead, and you will probably need a WHERE clause:
sql = "UPDATE Table SET col1={}, col2={} WHERE id = {}"
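As a hedged sketch under the question's assumptions (Table, col1, col2 and id are the hypothetical names from the question), the same two statements with bound parameters instead of str.format:
cur = db.cursor()

# INSERT with bound parameters instead of string formatting
cur.execute("INSERT INTO Table (col1, col2) VALUES (?, ?)", (x, y))

# UPDATE an existing row, restricted by a WHERE clause
# (row_id is a hypothetical variable holding the target row's id)
cur.execute("UPDATE Table SET col1=?, col2=? WHERE id=?", (x, y, row_id))

db.commit()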
The code in question produces the SQL statement:
INSERT INTO Table SET col1=1, col2=myString;
This fails because myString is unquoted and is interpreted as a column name; string literals must be in single quotes:
INSERT INTO Table (col1, col2) VALUES (1, 'myString');
def myfunction():
    x = 1
    y = 'myString'
    db = connect_db()
    cur = db.cursor()
    sql = "INSERT INTO Table (col1, col2) VALUES ({}, '{}')"
    cur.execute(sql.format(x, y))
    db.commit()
    db.close()
    return
But the above is fragile. Don't use string-building methods to create SQL statements; it is much better to use parameter binding.
def myfunction():
    x = 1
    y = 'myString'
    db = connect_db()
    cur = db.cursor()
    sql = "INSERT INTO Table (col1, col2) VALUES (?, ?)"
    cur.execute(sql, (x, y))
    db.commit()
    db.close()
    return
The MariaDB connector documentation explains these things.
Retrieving Data
Once you have the initial code in place you can start working with the data. The first thing you should do is try to
retrieve information from the database. Here is code for a query
against the employees database:
cur.execute(
    "SELECT first_name, last_name FROM employees WHERE first_name=?",
    (some_name,))
MariaDB Connector/Python uses prepared statements, sanitizing and inserting the values from the tuple into the position
of the question marks (?). This is safer than inserting through
f-strings or format specifiers when working with user provided
information.
The query results are stored in a list in the cursor object. To view
the results, you can loop over the cursor.
Adding Data
Using the same execute() method with an INSERT statement, you can add rows to the table.
cursor.execute(
    "INSERT INTO employees (first_name, last_name) VALUES (?, ?)",
    (first_name, last_name))
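To illustrate the "loop over the cursor" point above, here is a minimal sketch continuing the employees SELECT from the quoted docs (some_name is whatever value you are searching for):
cur.execute(
    "SELECT first_name, last_name FROM employees WHERE first_name=?",
    (some_name,))
# Each row comes back as a tuple; iterating the cursor walks the result set
for (first_name, last_name) in cur:
    print(first_name, last_name)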
I am trying to insert info from a pandas DataFrame into a database table by using a function that I wrote:
def insert(table_name="", name="", genere="", year=1, impd_rating=float(1)):
    conn = psycopg2.connect("dbname='database1' user='postgres' password='postgres333' host='localhost' port=5433 ")
    cur = conn.cursor()
    cur.execute("INSERT INTO %s VALUES %s,%s,%s,%s" % (table_name, name, genere, year, impd_rating))
    conn.commit()
    conn.close()
When I try to use this function like this:
b = 0
for row in DF['id']:
    insert(impd_rating=float(DF['idbm_rating'][b]),
           year=int(DF['year'][b]),
           name=str(DF['name'][b]),
           genere=str(DF['genere'][b]),
           table_name='test_movies')
    b = b + 1
I get the following syntax error:
SyntaxError: invalid syntax
PS D:\tito\scripts\database training> python .\postgres_script.py
Traceback (most recent call last):
File ".\postgres_script.py", line 56, in <module>insert (impd_rating=float(DF['idbm_rating'][b]),year=int(DF['year'][b]),name=str(DF['name'][b]),genere=str(DF['genere'][b]),table_name='test_movies')
File ".\postgres_script.py", line 15, in insert
cur.execute("INSERT INTO %s VALUES %s,%s,%s,%s" % (table_name ,name ,genere , year,impd_rating))
psycopg2.ProgrammingError: syntax error at or near "Avatar"
LINE 1: INSERT INTO test_movies VALUES Avatar,action,2009,7.9
I also tried to change the str replacement method from %s to .format()
but I had the same error.
The error message is explicit: this SQL command is wrong at Avatar: INSERT INTO test_movies VALUES Avatar,action,2009,7.9. Values must be enclosed in parentheses, and character strings must be quoted, so the correct SQL is:
INSERT INTO test_movies VALUES ('Avatar','action',2009,7.9)
But building a full SQL command by concatenating parameters is bad practice (*); only the table name should be directly inserted into the command, because it is not a SQL parameter. The correct way is to use a parameterized query:
cur.execute("INSERT INTO %s VALUES (?,?,?,?)" % (table_name,) ,(name ,genere , year,impd_rating)))
(*) It has been the cause of numerous SQL injection flaws, because if one of the parameters contains a semicolon (;), what comes after it could be interpreted as a new command.
Pandas has a DataFrame method for this, to_sql:
# Only needs to be executed once.
conn=psycopg2.connect("dbname='database1' user='postgres' password='postgres333' host='localhost' port=5433 ")
df.to_sql('test_movies', con=conn, if_exists='append', index=False)
This should hopefully get you going in the right direction.
In your original query
INSERT INTO %s VALUES %s,%s,%s,%s
there is a SQL problem: you need parentheses around the values, i.e. it should be VALUES (%s, %s, %s, %s). On top of that, the table name cannot be merged as a parameter, or it would be escaped as a string, which is not what you want.
You can use the psycopg 2.7 sql module to merge the table name to the query, with placeholders for the values:
from psycopg2 import sql
query = sql.SQL("INSERT INTO {} VALUES (%s, %s, %s, %s)").format(
    sql.Identifier('test_movies'))
cur.execute(query, ('Avatar', 'action', 2009, 7.9))
This will securely merge both the table name and the arguments into the query.
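Applied to the insert() helper from the question, a minimal sketch might look like this (the connection string and column order are the ones assumed in the question):
import psycopg2
from psycopg2 import sql

def insert(table_name, name, genere, year, impd_rating):
    conn = psycopg2.connect("dbname='database1' user='postgres' password='postgres333' host='localhost' port=5433")
    cur = conn.cursor()
    # The table name is merged as a quoted identifier, the values as bound parameters
    query = sql.SQL("INSERT INTO {} VALUES (%s, %s, %s, %s)").format(
        sql.Identifier(table_name))
    cur.execute(query, (name, genere, year, impd_rating))
    conn.commit()
    conn.close()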
Hello mohamed mahrous,
First install the psycopg2 package to access the PostgreSQL database.
Try the code below:
import psycopg2

conn = psycopg2.connect("dbname='database1' user='postgres' password='postgres333' host='localhost' port=5433 ")
cur = conn.cursor()

def insert(table_name, name, genere, year, impd_rating):
    query = "INSERT INTO " + table_name + " (name, genere, year, impd_rating) VALUES (%s, %s, %s, %s)"
    try:
        print(query)
        cur.execute(query, (name, genere, year, impd_rating))
    except Exception as e:
        print("Not execute...")
    conn.commit()

b = 0
for row in DF['id']:
    insert(impd_rating=float(DF['idbm_rating'][b]), year=int(DF['year'][b]), name=str(DF['name'][b]), genere=str(DF['genere'][b]), table_name='test_movies')
    b = b + 1

conn.close()
Example,
import psycopg2

conn = psycopg2.connect("dbname='database1' user='postgres' password='postgres333' host='localhost' port=5433 ")
cur = conn.cursor()

def insert(table_name, name, genere, year, impd_rating):
    query = "INSERT INTO " + table_name + " (name, genere, year, impd_rating) VALUES (%s, %s, %s, %s)"
    try:
        print(query)
        cur.execute(query, (name, genere, year, impd_rating))
    except Exception as e:
        print("Not execute")
    conn.commit()

b = 0
for row in DF['id']:
    insert(impd_rating="7.0", year="2017", name="Er Ceo Vora Mayur", genere="etc", table_name="test_movies")
    b = b + 1

conn.close()
I hope my answer is helpful. If you have any questions, please comment.
I found a solution for my issue by using SQLAlchemy and the pandas to_sql method.
Thanks for the help, everyone.
import sqlalchemy
from sqlalchemy import *
import pandas as pd

def connect(user, password, db, host='localhost', port=5433):
    '''Returns a connection and a metadata object'''
    # We connect with the help of the PostgreSQL URL
    # postgresql://federer:grandestslam@localhost:5432/tennis
    url = 'postgresql://{}:{}@{}:{}/{}'
    url = url.format(user, password, host, port, db)
    # The return value of create_engine() is our connection object
    con = sqlalchemy.create_engine(url, client_encoding='utf8')
    # We then bind the connection to MetaData()
    meta = sqlalchemy.MetaData(bind=con, reflect=True)
    return con, meta

con, meta = connect('postgres', 'postgres333', 'database1')

movies = Table('test', meta,
               Column('id', Integer, primary_key=True),
               Column('name', String),
               Column('genere', String),
               Column('year', Integer),
               Column('idbm_rating', REAL))

meta.create_all(con)

DF = pd.read_csv('new_movies.txt', sep=' ', engine='python')
DF.columns = ('id', 'name', 'genere', 'year', 'idbm_rating')
DF.to_sql('movies', con=con, if_exists='append', index=False)
I can't insert data into an sqlite table.
Here is my code:
import sqlite3
connection = sqlite3.connect('mydata.db')
cursor = connection.cursor()
a=raw_input('name')
a=str(a)
b=raw_input('theme')
b=str(b)
c=raw_input('language')
c=str(c)
sql="INSERT INTO Website (Website, Theme, Language) VALUES (%, %, %)",(a,b,c)
cursor.execute(sql)
connection.commit()
connection.close()
For some reason it doesn't work.
The extended call syntax is f(*args), so you could unpack your (sql, params) tuple:
cursor.execute(*sql)
But sqlite uses the '?' placeholder:
conn.execute('insert into sometable values (?,?,?)', (a, b, c))
Also, raw_input() already returns a string, so a = str(a) is unnecessary.
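Putting those points together, a minimal sketch of the corrected script (keeping the Python 2 raw_input from the question):
import sqlite3

connection = sqlite3.connect('mydata.db')
cursor = connection.cursor()

a = raw_input('name')
b = raw_input('theme')
c = raw_input('language')

# Use '?' placeholders and pass the values as a separate tuple
sql = "INSERT INTO Website (Website, Theme, Language) VALUES (?, ?, ?)"
cursor.execute(sql, (a, b, c))

connection.commit()
connection.close()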
With Python's DB-API spec you can pass a sequence of parameters to the execute() method. Part of my statement is a WHERE ... IN clause, and I've been using a tuple to populate the IN. For example:
params = ((3, 2, 1), )
stmt = "SELECT * FROM table WHERE id IN %s"
db.execute(stmt, params)
But when I run into a situation where the parameter tuple is only a tuple of 1 item, the execute fails.
ProgrammingError: ERROR: syntax error at or near ")"
LINE 13: WHERE id IN (3,)
How can I get the tuple to work with the IN clause properly?
Edit: If you think this answer circumvents the built-in protections against SQL-injection attacks, you're mistaken; look more closely.
Testing with pg8000 (a DB-API 2.0 compatible Pure-Python interface to the PostgreSQL database engine):
This is the recommended way to pass multiple parameters to an "IN" clause.
params = [3,2,1]
stmt = 'SELECT * FROM table WHERE id IN (%s)' % ','.join('%s' for i in params)
cursor.execute(stmt, params)
Full example:
>>> from pg8000 import DBAPI
>>> conn = DBAPI.connect(user="a", database="d", host="localhost", password="p")
>>> c = conn.cursor()
>>> prms = [1,2,3]
>>> stmt = 'SELECT * FROM table WHERE id IN (%s)' % ','.join('%s' for i in prms)
>>> c.execute(stmt,prms)
>>> c.fetchall()
((1, u'myitem1'), (2, u'myitem2'), (3, u'myitem3'))
The error is coming from the comma after the 3. Just leave it off for the single values and you're set.
params = ((3), ... )
stmt = "SELECT * FROM table WHERE id IN %s"
db.execute(stmt, params)
This may not be an answer to exactly the question you asked, but I think it may solve the problem you have.
Python's DB-API doesn't seem to give you a way to pass tuples as safely substituted parameters. The accepted answer from bernie is using the Python % operator for substitution, which is unsafe.
However, you may not have to pass tuples as parameters, particularly when the tuple you want is the result of another SQL query (as you indicated to Daniel). Instead, you can use SQL subqueries.
If the set of IDs you want in your IN clause is the result of SELECT id FROM other_table WHERE use=true, for example:
stmt = "SELECT * FROM table WHERE id IN (SELECT id FROM other_table WHERE use=true)"
db.execute(stmt)
And this can be parameterized (the safe way), too. If the IDs you want to select are the ones with a given parent_id:
stmt = "SELECT * FROM table WHERE id IN (SELECT id FROM other_table WHERE parent_id=%s)"
params = (parent_id,)
db.execute(stmt, params)
A solution with an f-string:
params = [...]
stmt = f"SELECT * FROM table WHERE id IN ({','.join(['%s']*len(params ),)})"
db.execute(stmt, params)
If there is another parameter placeholder, it will look like this:
age = 18
params = [...]
stmt = f"SELECT * FROM table WHERE age>%s AND id IN ({','.join(['%s']*len(params ),)})"
db.execute(stmt, tuple([age] + params))
As the question said, the following will fail:
params = ((3, 2, 1), )
stmt = "SELECT * FROM table WHERE id IN %s"
db.execute(stmt, params)
Following the pg8000 docs, the IN can be replaced with ANY() to give the same result:
params = ((3, 2, 1), )
stmt = "SELECT * FROM table WHERE id = ANY(%s)"
db.execute(stmt, params)
This sends the query and parameters separately to the server, avoiding SQL injection attacks.
The accepted answer risks SQL injection; you should never ever pass user input directly to the database. Instead, generate a query with the correct number of placeholders, then let pg8000 do the escaping:
params = [3,2,1]
# SELECT * from table where id in (%s,%s,%s)
stmt = 'SELECT * FROM table WHERE id IN ({})'.format(','.join(['%s']*len(params)))
cursor.execute(stmt, tuple(params))