I have dictionaries like this:
{'id': 8, 'name': 'xyzzy', 'done': False}
the table is already created with the correct column names (keys of the dictionary). How can I insert the values in the respective columns? I want to create a new row for each dictionary.
Note that the column for 'done' is defined as INTEGER, since SQLite does not offer a boolean type.
cur = connection().cursor()
query = "insert .... tablename"
In Python, database cursors accept two parameters:
an SQL statement as a string: the statement may contain placeholders instead of some values to handle cases where the values are not known until runtime.
a collection of values to be inserted into the SQL statement. These values replace the placeholders in the SQL statement when it is executed.
Placeholders may be positional or named:
# Positional placeholders: the order of values should match the order of
# placeholders in the statement. Values must be contained within
# a tuple or list, even if there is only one.
cur.execute("""SELECT * FROM tbl WHERE name = ? AND age = ?""", ('Alice', 42))
# Named placeholders: values and placeholders are matched by name, order
# is irrelevant. Values must be contained within a mapping (dict) of
# placeholders to values.
cur.execute(
"""SELECT * FROM tbl WHERE name = :name AND age = :age""",
{'age': 42, 'name': 'Alice'}
)
You can pass a dictionary to cursor.execute() and it will do the right thing, as long as the value placeholders in the SQL statement use the :named format (that is, the dict key prefixed with a colon ":").
conn = sqlite3.connect('mydb.db')
cur = conn.cursor()
stmt = """INSERT INTO mytable (id, name, done) VALUES (:id, :name, :done)"""
cur.execute(stmt, {'id': 8, 'name': 'xyzzy', 'done': False})
# Call commit() on the connection to "save" the data.
conn.commit()
This method ensures that values are correctly quoted before being inserted into the database and protects against SQL injection attacks.
See also the docs
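Since the question asks for one row per dictionary, a list of such dictionaries can be inserted with the same named-placeholder statement via executemany(). A minimal sketch using an in-memory database (the table definition and the second row here are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE mytable (id INTEGER, name TEXT, done INTEGER)")

rows = [
    {'id': 8, 'name': 'xyzzy', 'done': False},
    {'id': 9, 'name': 'plugh', 'done': True},  # a second, made-up row
]

# executemany() runs the statement once per mapping, one row per dict;
# sqlite3 stores the booleans as 0/1 in the INTEGER column.
cur.executemany(
    "INSERT INTO mytable (id, name, done) VALUES (:id, :name, :done)",
    rows)
conn.commit()
```

Each dictionary in the list becomes one row, and False/True arrive in the database as 0/1.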
You could use the .format() method to splice the values into the query string, but this approach is much more straightforward (and safer):
dic = {'id': 8, 'name': 'xyzzy', 'done': False}
cur.execute(
    "INSERT INTO tablename VALUES (:id, :name, :done)",
    {"id": dic["id"], "name": dic["name"], "done": dic["done"]}
)
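Since the dict keys already match the placeholder names, the dictionary can also be passed to execute() as-is, without repacking. A quick sketch with an in-memory table (the CREATE TABLE statement here is assumed):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE tablename (id INTEGER, name TEXT, done INTEGER)")

dic = {'id': 8, 'name': 'xyzzy', 'done': False}
# The mapping is consumed by placeholder name, so the dict itself suffices.
cur.execute("INSERT INTO tablename VALUES (:id, :name, :done)", dic)
conn.commit()
```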
Related
Problem
I'm using PyMySQL to query a database using the following SQL translator function.
def retrieve_column(lotkey, column="active", print=False):
    result = None
    try:
        connection = sql_connect()
        with connection.cursor() as cursor:
            # Create a new record
            sql = "SELECT %s FROM table_n"
            val = (column)
            os_print(sql + '\r\n...', end='', style='dim', flush=True)
            cursor.execute(sql, val)
            result = cursor.fetchall()
        connection.close()
    except Exception as e:
        os_print(e, style='error')
        os_print("ERROR: Can't connect to database!", style='error', tts='error')
    return result
Which I call and print using the following lines. Note: The 'active' column is boolean.
active_col = retrieve_column(key)
print(active_col)
Which prints the following bizarre result. It seems to be a dictionary with no values present therein.
...[{'active': 'active'}, {'active': 'active'}, {'active': 'active'}, {'active': 'active'}, {'active': 'active'}, {'active': 'active'}, {'active': 'active'}]
Attempted Solutions
My first step was to run the same query in MySQL workbench which produced the following result.
Workbench Query
Which is roughly what I am trying to replicate in Python (Getting a dictionary with each row's boolean value).
Next, I used the python debugger and found that indeed the returned values from cursor.fetchall() are empty dictionaries with nothing but a single key and no values.
Has anyone encountered something similar before?
Actually, with these three instructions:
sql = "SELECT %s FROM table_n"
val = (column)
cursor.execute(sql, val)
You will get the following query executed:
SELECT 'active' FROM table_n
The result is a list of rows whose only column is the literal string 'active', because the parameters of the cursor.execute() method are not treated as identifiers to splice into the query, but as parameter values (in this case, the string value 'active').
If you are trying to select the column by name, you need to format the column name into the SQL query string itself, not pass it as a parameter (note that .format() returns a new string, it does not modify sql in place):
sql = "SELECT {colname} FROM table_n"
sql = sql.format(colname=column)
cursor.execute(sql)
Column names cannot be passed to the cursor the way argument values can; for that you do actually need to format the query string.
sql = "SELECT {} FROM table_n".format(column)
cursor.execute(sql)
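Because this splices input into the SQL text, it is safest to check the requested name against a fixed whitelist first. A small sketch (the allowed-column set here is hypothetical; adjust it to the real schema of table_n):

```python
# Hypothetical set of real column names in table_n; adjust to the schema.
ALLOWED_COLUMNS = {"active", "id", "name"}

def build_select(column):
    # Reject anything that is not a known column, so user input can
    # never smuggle extra SQL into the formatted string.
    if column not in ALLOWED_COLUMNS:
        raise ValueError("unknown column: {}".format(column))
    return "SELECT {} FROM table_n".format(column)
```

build_select("active") yields a safe query, while something like build_select("active; DROP TABLE table_n") raises ValueError instead of ever reaching the database.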
I'm trying to dynamically bind the variable value to be inserted into database table column.
Example variable value in json:
document= {'zipCode': '99999',
'name': 'tester',
'company': 'xxxx'}
And my database table column as:
table name: table1
column: id,zip_code,name,company
My code in python:
with connection.cursor() as cursor:
sql = "INSERT INTO table1(zip_code, name, company) VALUES (%s,%s,%s)"
cursor.execute(sql,(document['zipCode'],
document['name'],
document['company']))
connection.commit()
However, if one of the keys is absent from document, the INSERT will certainly fail (e.g. when only document['name'] exists in the document variable). Any thoughts on how to handle this efficiently?
This is something that, generally, ORMs like SQLAlchemy or Peewee solve pretty easily for you.
But, if I were to implement it myself, I would probably do something "dynamic" based on the available keys:
QUERY = "INSERT INTO table1({columns}) VALUES ({values})"
def get_query(document):
columns = list(document.keys())
return QUERY.format(columns=", ".join(columns),
values=", ".join('%({})s'.format(column) for column in columns))
Sample usage:
In [12]: get_query({'zipCode': '99999', 'name': 'tester', 'company': 'xxxx'})
Out[12]: 'INSERT INTO table1(company, zipCode, name) VALUES (%(company)s, %(zipCode)s, %(name)s)'
In [13]: get_query({'name': 'tester'})
Out[13]: 'INSERT INTO table1(name) VALUES (%(name)s)'
Then, you would just parameterize the query with the document dictionary as we've created named placeholders in the query:
cursor.execute(get_query(document), document)
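The same two-step pattern carries over to sqlite3 with :name placeholders. A self-contained sketch (the table definition here is a stand-in, with all columns nullable so partial documents insert cleanly; note this relies on dict keys matching column names exactly, so zipCode vs. zip_code would need mapping first):

```python
import sqlite3

QUERY = "INSERT INTO table1({columns}) VALUES ({values})"

def get_query(document):
    # Build the column list and matching named placeholders from
    # whichever keys are present in this particular document.
    columns = list(document.keys())
    return QUERY.format(
        columns=", ".join(columns),
        values=", ".join(":{}".format(c) for c in columns))

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE table1 (zip_code TEXT, name TEXT, company TEXT)")

document = {'name': 'tester'}  # most keys absent
cur.execute(get_query(document), document)
conn.commit()
```

Columns not mentioned in the generated statement are simply left NULL.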
What would be the most elegant way to save multiple dictionaries - most of them following the same structure, but some having more/less keys - to the same SQL database table?
The steps I can think of are the following:
Determine which dictionary has the most keys and then create a table which follows the dictionary's keys order.
Sort every dictionary to match this column order.
Insert each dictionary's values into the table. Do not insert anything (possible?) if for a particular table column no key exists in the dictionary.
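The steps above can be sketched directly with sqlite3. This is a minimal illustration (not the asker's code) that takes the union of all keys rather than just the longest dict, and leaves missing values as NULL; column types are omitted, which SQLite permits:

```python
import sqlite3

def union_columns(dicts):
    # Collect every key appearing in any dictionary, preserving
    # first-seen order, so the table covers the "longest" dict too.
    cols = []
    for d in dicts:
        for k in d:
            if k not in cols:
                cols.append(k)
    return cols

def insert_dicts(conn, table, dicts):
    cols = union_columns(dicts)
    conn.execute("CREATE TABLE IF NOT EXISTS {} ({})".format(
        table, ", ".join(cols)))
    stmt = "INSERT INTO {} ({}) VALUES ({})".format(
        table, ", ".join(cols), ", ".join(":" + c for c in cols))
    for d in dicts:
        # Missing keys default to None, which is stored as NULL.
        conn.execute(stmt, {c: d.get(c) for c in cols})
```

For example, insert_dicts(conn, "People", [man1dict, man2dict]) creates the People table from the union of both dicts' keys and inserts one row per dict.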
Some draft code I have:
man1dict = {
    'name': 'bartek',
    'surname': 'wroblewski',
    'age': 32,
}
man2dict = {
    'name': 'bartek',
    'surname': 'wroblewski',
    'city': 'wroclaw',
    'age': 32,
}
with sqlite3.connect('man.db') as conn:
    cursor = conn.cursor()
    # create table - how do I create it automatically from the man2dict (the longer) dictionary, also assigning the data types?
    cursor.execute('CREATE TABLE IF NOT EXISTS People(name TEXT, surname TEXT, city TEXT, age INT)')
    # show table
    cursor.execute('SELECT * FROM People')
    print(cursor.fetchall())
    # insert into table - this will give a 'no such table' error if the dict does not follow the table column order
    cursor.execute('INSERT INTO People VALUES(' + str(man1dict.values()) + ')', conn)
Use a NoSQL database such as MongoDB for this purpose; it handles schemaless documents natively. Using a relational database for data that is not relational is an anti-pattern: it will make your code brittle, degrade your application's scalability, and make it more cumbersome to change the table structure later.
It might be easiest to save the dict as a pickle and then unpickle it later, i.e.:
import pickle, sqlite3
# SAVING
my_pickle = pickle.dumps({"name": "Bob", "age": 24})
conn = sqlite3.connect("test.db")
c = conn.cursor()
c.execute("CREATE TABLE test (dict BLOB)")
conn.commit()
c.execute("insert into test values (?)", (my_pickle,))
conn.commit()
# RETRIEVING
b = [n[0] for n in c.execute("select dict from test")]
dicts = []
for d in b:
    dicts.append(pickle.loads(d))
print(dicts)
This outputs
[{'name': 'Bob', 'age': 24}]
I'm new to programming. I have a dictionary called record that receives various inputs like 'Color', 'Type', 'quantity', etc. I then tried to add a Date column and insert the record into a sqlite table inside the 'if' block with the code below. But I get an "OperationalError near 2017", i.e. near the date.
Can anyone help please? Thanks in advance
Date = str(datetime.datetime.fromtimestamp(time.time()).strftime('%Y-%m-%d'))
record['Date'] = Date
column = [record['Color'], Date]
values = [record['quantity'], record['Date']]
column = ','.join(column)
if record['Type'] == 'T-Shirts' and record['Style'] == 'Soft':
    stment = ("INSERT INTO xtrasmall (%s) values(?)" % column)
    c.execute(stment, values)
    conn.commit()
Updated
You can simplify the code as follows:
from datetime import datetime
date = datetime.now().date()
sql = "INSERT INTO xtrasmall (%s, Date) values (?, ?)" % record['Color']
c.execute(sql, (record['quantity'], date))
This substitutes the value of the selected color directly into the column names in the query string. Then the query is executed passing the quantity and date string as arguments. The date should automatically be converted to a string, but you could convert with str() if desired.
This does assume that the other colour columns have a default value (presumably 0), or permit null values.
Original answer
Because you are constructing the query with string interpolation (i.e. substituting %s for a string) your statement becomes something like this:
INSERT INTO xtrasmall (Red,2017-10-06) values(?)
which is not valid because 2017-10-06 is not a valid column name. Print out stment before executing it to see.
If you know what the column names are just specify them in the query:
values = ['Red', 2, Date]
c.execute("INSERT INTO xtrasmall (color, quantity, date) values (?, ?, ?)", values)
conn.commit()
You need to use a ? for each column that you are inserting.
It looks like you want to insert the dictionary using its keys and values. This can be done like this:
record = {'date':'2017-10-06', 'color': 'Red', 'quantity': 2}
columns = ','.join(record.keys())
placeholders = ','.join('?' * len(record.values()))
sql = 'INSERT INTO xtrasmall ({}) VALUES ({})'.format(columns, placeholders)
c.execute(sql, list(record.values()))
This code will generate the parameterised SQL statement:
INSERT INTO xtrasmall (date,color,quantity) VALUES (?,?,?)
and then execute it using the dictionary's values as the parameters.
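Run against an in-memory database, the same pattern can be verified end to end (a sketch; the three-column table here is a stand-in for the real xtrasmall schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
c = conn.cursor()
c.execute("CREATE TABLE xtrasmall (date TEXT, color TEXT, quantity INTEGER)")

record = {'date': '2017-10-06', 'color': 'Red', 'quantity': 2}
columns = ','.join(record.keys())
placeholders = ','.join('?' * len(record))
sql = 'INSERT INTO xtrasmall ({}) VALUES ({})'.format(columns, placeholders)
# sqlite3 needs a real sequence, so materialise the values view as a list.
c.execute(sql, list(record.values()))
conn.commit()
```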
In Python 2.7, consider a dictionary with features' IDs as keys.
There are thousands of features.
Each feature has a single value, but this value is a tuple containing 6 parameters for the feature (for example: size, color, etc.).
On the other hand, I have a PostgreSQL table in a database where these feature parameters must be saved.
The features' IDs are already set in the table (as well as other information about these features).
The IDs are unique (they are random, thus not serial, but unique numbers).
There are 6 empty columns with names: "param1", "param2", "param3", ..., "param6".
I already have a tuple containing these names:
columns = ("param1", "param2", "param3", ..., "param6")
The code I have doesn't work for saving these parameters in their respective columns for each feature:
# "view" is the dictionary with features's ID as keys()
# and their 6 params stored in values().
values = [view[i] for i in view.keys()]
columns = ("param1","param2","param3","param4","param5","param6")
conn = psycopg2.connect("dbname=mydb user=username password=password")
curs = conn.cursor()
curs.execute("DROP TABLE IF EXISTS mytable;")
curs.execute("CREATE TABLE IF NOT EXISTS mytable (LIKE originaltable including defaults including constraints including indexes);")
curs.execute("INSERT INTO mytable SELECT * from originaltable;")
insertstatmnt = 'INSERT INTO mytable (%s) values %s'
alterstatement = (
    'ALTER TABLE mytable ' +
    'ADD COLUMN param1 text,' +
    'ADD COLUMN param2 text,' +
    'ADD COLUMN param3 real,' +
    'ADD COLUMN param4 text,' +
    'ADD COLUMN param5 text,' +
    'ADD COLUMN param6 text;'
)
curs.execute(alterstatement) # It's working up to this point.
curs.execute(insertstatmnt, (psycopg2.extensions.AsIs(','.join(columns)), tuple(values))) # The problem seems to be here.
conn.commit() # Making change to DB !
curs.close()
conn.close()
Here's the error I have:
curs.execute(insert_statement, (psycopg2.extensions.AsIs(','.join(columns)), tuple(values)))
ProgrammingError: INSERT has more expressions than target columns
I must miss something.
How to do that properly?
When using '%s' to build the statement the way I think you want, you just need to change a couple of things.
Ignoring c.execute(), your statement is by no means wrong, but it does not produce what you are looking for. I also ignored psycopg2.extensions.AsIs(), because it is just an adapter conforming to the ISQLQuote protocol, useful for objects whose string representation is already valid as an SQL representation. Using my own version, this is what I got with that statement:
>>> values = [i for i in range(0, 5)]  # Since I don't know the keys, I just made up values.
>>> insertstatmnt, (','.join(columns), tuple(values))
('INSERT INTO mytable (%s) values %s', ('param1,param2,param3,param4,param5,param6', (0, 1, 2, 3, 4)))
As you can see, what you entered returns a tuple with the values.
>>> insertstatmnt % (','.join(columns), tuple(values))
'INSERT INTO mytable (param1,param2,param3,param4,param5,param6) values (0, 1, 2, 3, 4)'
Whereas this returns a string that is much closer to valid SQL. The values obviously do not match the specified ones. I believe the problem lies in how you build your string.
Reference for pycopg2: http://initd.org/psycopg/docs/extensions.html
As I took the syntax of the psycopg2 command from this thread:
Insert Python Dictionary using Psycopg2
and as my values dictionary doesn't follow exactly the same structure as the example there (I also have one key as ID, like in that example, but mine has only one corresponding value: a tuple containing my 6 parameters, thus nested one level deeper, instead of 6 values corresponding directly to the keys), I need to loop through all features and execute one SQL statement per feature:
[curs.execute(insertstatmnt, (psycopg2.extensions.AsIs(', '.join(columns)), i)) for i in tuple(values)]
This is working.
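For reference, the per-feature statement can also be written with one explicit %s placeholder per column, which avoids the "INSERT has more expressions than target columns" error entirely. A minimal sketch that only builds the SQL string; psycopg2 would then receive each feature's 6-tuple separately and handle the escaping (the table name follows the question):

```python
# Columns as given in the question.
columns = ("param1", "param2", "param3", "param4", "param5", "param6")

def build_insert(table, columns):
    # Only trusted identifiers are formatted into the string;
    # every value stays a %s placeholder for the driver to escape.
    placeholders = ", ".join(["%s"] * len(columns))
    return "INSERT INTO {} ({}) VALUES ({})".format(
        table, ", ".join(columns), placeholders)

stmt = build_insert("mytable", columns)
```

Each feature's 6-tuple can then be passed directly, e.g. curs.execute(stmt, view[feature_id]), so the number of placeholders always matches the number of values.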