psycopg2 complains when I insert values containing multiple words, empty strings, or empty arrays:
name = "Meal Rounds"
description = ""
sizes = []
cur.execute(""" INSERT INTO items (name, description, sizes) VALUES (%s, %s, %s)""" % (name, description, sizes))
Errors:
# Multi word error
psycopg2.ProgrammingError: syntax error at or near "Rounds"
LINE 1: ... (name, description, sizes) VALUES (Meal Rounds, , ...
^
# Empty string error
psycopg2.ProgrammingError: syntax error at or near ","
LINE 1: ...scription, sizes) VALUES ("Meal Rounds", , [], Fals...
^
# Empty array error
psycopg2.ProgrammingError: syntax error at or near "["
LINE 1: ...n, sizes) VALUES ("Meal Rounds", "None", [], False)...
^
I can get around the multi word error by escaping:
""" INSERT INTO items (name, description, sizes) VALUES (\"%s\", \"%s\", %s)"""
But for tables with 15+ columns, escaping each one is a pain. Doesn't psycopg2 handle this in an easier fashion? Escaping also doesn't fix the empty-string errors.
How do I insert multi-word strings more cleanly, and how do I insert empty strings and empty arrays?
Here is what psql prints out on my columns:
name | character varying(255) |
description | character varying(255) |
sizes | integer[] |
Your call to execute builds the query with Python string substitution (the % operator), and the result is not valid SQL. You should be using the parameter substitution provided by the Python DB API:
https://www.python.org/dev/peps/pep-0249/#id15
To call execute with parameter substitution, you pass it two arguments. The first is the query, with parameter placeholders whose style is database dependent; psycopg2 uses the "pyformat" paramstyle, so your query works as written. The second is the sequence of values you want substituted into the query. The database driver handles all the quoting and escaping for you. So your call to execute should be
cur.execute("""INSERT INTO items (name, description, sizes) VALUES (%s, %s, %s)""", (name, description, sizes))
Related
Trying to pick up some python. I'm quite new to it at the moment.
I created the code below, but it returns an error.
I can get it to work when creating a second column and writing multiple values to the db, but a single value doesn't seem to work. It's probably a list/tuple thing, but I can't figure out what exactly.
Error:
Traceback (most recent call last):
File "test.py", line 15, in <module>
cursor.executemany("INSERT INTO combination VALUES (?)", combination)
sqlite3.ProgrammingError: Incorrect number of bindings supplied. The current statement uses 1, and there are 2 supplied.
Code:
import sqlite3
conn = sqlite3.connect("combinations.db")
cursor = conn.cursor()
cursor.execute(r"create table if not exists combination (string text)")
combination = []
chars = "abcd"
for char1 in chars:
    for char2 in chars:
        combination.append((char1+char2))
cursor.executemany("INSERT INTO combination VALUES (?)", combination)
conn.commit()
You missed making the string into a tuple when adding it to the list. The argument to executemany expects a sequence of sequences, so if you pass it a bare string 'ab' in the list, it is treated as a 2-item sequence of 'a' and 'b' - hence the error about two bindings.
You need to make the string 'ab' into a 1-item tuple like ('ab',). You do this by adding a trailing comma to the expression you're appending:
combination.append((char1+char2,))
Full code:
import sqlite3
conn = sqlite3.connect("combinations.db")
cursor = conn.cursor()
cursor.execute(r"create table if not exists combination (string text)")
combination = []
chars = "abcd"
for char1 in chars:
    for char2 in chars:
        combination.append((char1+char2,)) # ('ab',) etc.
cursor.executemany("INSERT INTO combination VALUES (?)", combination)
conn.commit()
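Equivalently, you could build the list of 1-tuples with a comprehension and check the result afterwards; a small sketch against the same table:

combination = [(c1 + c2,) for c1 in chars for c2 in chars]  # list of 1-tuples
cursor.executemany("INSERT INTO combination VALUES (?)", combination)

cursor.execute("SELECT COUNT(*) FROM combination")
print(cursor.fetchone()[0])  # 16 rows for "abcd"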
def quantity():
    i = 0
    x = 1
    file = open("john.txt", "r")
    while i < 5000:
        for line in file:
            c.execute("INSERT INTO test (playerNAME, playerID) VALUES ("+line+", "+str(x)+")")
            conn.commit()
            x = random.randint(100,10000000000000000)
            i += 1
I'm trying to iterate through the john.txt file and insert each value into a table. The first word in the txt file is "abc123". When I run this code I get an error: sqlite3.OperationalError: no such column: abc123
I can get the code to enter the random numbers into playerID but I can't get the txt file query to work...
You need single quotes around the string.
c.execute("INSERT INTO test (playerNAME, playerID) VALUES ('"+line+"', "+str(x)+")")
Otherwise it tries to interpret it as a sql expression and looks for the named column.
More generally, you should use parameters or sanitize the incoming data from the file for safety against SQL injection, even if you trust this particular file. It's a good habit.
c.execute("INSERT INTO test (playerName, playerID) VALUES (?, ?)", (line, x))
Details are here and here is why it's important.
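Put together, the whole thing might look roughly like this (just a sketch: it assumes one player name per line in john.txt, and the database file name is a placeholder):

import random
import sqlite3

conn = sqlite3.connect("players.db")  # placeholder database name
c = conn.cursor()

with open("john.txt") as f:
    for line in f:
        name = line.strip()  # drop the trailing newline
        player_id = random.randint(100, 10000000000000000)
        c.execute("INSERT INTO test (playerNAME, playerID) VALUES (?, ?)",
                  (name, player_id))

conn.commit()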
Formatting sql queries via string concatenation is very bad practice.
Variable binding should always be used:
c.execute("INSERT INTO test (playerNAME, playerID) VALUES (?, ?)", [line, x])
In your case, the line probably contains spaces or punctuation marks.
sqlite's error message is misleading, though.
I'm wondering if you can help me. I'm trying to change the value in each column if the text matches a corresponding keyword. This is the loop:
for i in range(0, 20, 1):
    cur.execute("UPDATE table SET %s = 1 WHERE text rlike %s") %(column_names[i], search_terms[i])
The MySQL command works fine on its own, but not when I put it in the loop. It's giving an error at the first %s
Does anyone have any insights?
This is the error:
_mysql_exceptions.ProgrammingError: (1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '%s = 1 WHERE text rlike %s' at line 1")
Column names looks like
column_names = ["col1","col2","col3"...]
Search terms look like
search_terms = ["'(^| |.|-)word1[;:,. ?-]'","'(^| |.|-)word2[;:,. ?-]'",...]
The right way to do this is to hand the values to execute, which will quote them correctly.
adapted from voyager's post:
for i in range(0, 20, 1):
    cur.execute("UPDATE table SET {} = 1 WHERE text rlike %s".format(column_names[i]),
                (search_terms[i],),
                )
In this case it's confusing because the column_name isn't a value, it's part of the table structure, so it's inserted using good old string formatting. The search_term is a value, so is passed to cursor.execute() for correct, safe quoting.
(Don't use string manipulation to add the quotes -- you're exposing yourself to SQL injection.)
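If the column names themselves ever come from somewhere untrusted, a simple whitelist check before formatting them in is a cheap safeguard. A sketch (ALLOWED_COLUMNS is hypothetical; column_names and search_terms are the lists from the question):

ALLOWED_COLUMNS = {"col1", "col2", "col3"}  # hypothetical set of known-good names

for col, term in zip(column_names, search_terms):
    if col not in ALLOWED_COLUMNS:
        raise ValueError("unexpected column name: %r" % col)
    # The column name is checked, so it is safe to format in; the search term
    # still goes through the driver as a parameter.
    cur.execute("UPDATE table SET {} = 1 WHERE text rlike %s".format(col),
                (term,))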
Missing quotes and wrong parenthesis placement...
for i in range(0, 20, 1):
    cur.execute("UPDATE table SET %s = 1 WHERE text rlike '%s'" %(column_names[i], search_terms[i]))
    # note: single quotes added around the second %s, and the closing
    # parenthesis now wraps the whole % expression inside the execute() call
Please note this is not the right way to do it if your string may itself contain quotes...
What about this instead:
for i in range(0, 20, 1):
    cur.execute("UPDATE table SET %s = 1 WHERE text rlike %%s" % (column_names[i],),
                (search_terms[i],))
This uses the % operator to set the column name (the placeholder is written as %%s so it survives the formatting; MySQLdb expects %s-style parameters), but binds the search term as an execute parameter, letting the DB driver escape all characters that need it.
I am trying to write a Python script that loads the tables I created in Python using SQL and populates them automatically with data coming from a text file. I am stuck on basic coding. I have a general idea, but I get errors when I try this approach. I have created 2 tables and read the file. The file is a comma-separated text file with no headers.
first 3 lines of the file looks like this.
+ ---- + ----- + -------------------- + -------- + - + --- + ----- +
| John | Smith | 111 N. Wabash Avenue | plumber | 5 | 1.0 | 200 |
| John | Smith | 111 N. Wabash Avenue | bouncer | 5 | 1.0 | 200 |
| Jane | Doe | 243 S. Wabash Avenue | waitress | 1 | 5.0 | 10000 |
+ ---- + ----- + -------------------- + -------- + - + --- + ----- +
import sqlite3
conn= sqlite3.connect('csc455.db')
c = conn.cursor()
#Reading the data file
fd = open ('C:/Users/nasia/Documents/data_hw2.txt','r')
data = fd.readlines()
#Creating Tables
>>> L = """create table L
... (first text, last text, address text, job text, LNum integer,
... constraint L_pk
... primary key(first, last, address, job),
... constraint L_fk
... foreign key (LNum) references LN(LNum)
... );"""
>>> c.execute(L)
LN = """create table LN
... (
... LNum integer, Interest float, Amount, Integer,
... constraint LN_pk
... primary key (LNum)
... );"""
c.execute(LN)
#Inserting into database
for elt in data:
... currentRow = elt.split(", ")[:-1]
... insert = """(insert into LN values (%s, %s, %s);, %(currentRow[4], currentRow[5], currentRow[6]))"""
... c.execute(insert)
There is some syntax error here. The code stops working. I cannot figure out what I am doing wrong.
The error is
Traceback (most recent call last):
File "", line 4, in
OperationalError: near "(": syntax error
I cannot figure out what I am doing wrong.
You haven't explained what format the data are in, or what your table structure is, or how you want to map them, which makes this difficult to answer. But I'll make up my own, and answer that, and hopefully it will help:
infile.txt:
CommonName,Species,Location,Color
Black-headed spider monkey,Ateles fusciceps,Ecuador,black
Central American squirrel monkey,Saimiri oerstedii,Costa Rica,orange
Vervet,Chlorocebus pygerythrus,South Africa,white
script.py
import csv
import sqlite3
db = sqlite3.connect('outfile.db')
cursor = db.cursor()
cursor.execute('CREATE TABLE Monkeys ("Common Name", Color, Species)')
cursor.execute('''CREATE TABLE MonkeyLocations (Species, Location,
                  FOREIGN KEY(Species) REFERENCES Monkeys(Species))''')
with open('infile.txt') as f:
    for row in csv.DictReader(f):
        cursor.execute('''INSERT INTO Monkeys
                          VALUES (:CommonName, :Color, :Species)''', row)
        cursor.execute('''INSERT INTO MonkeyLocations
                          VALUES (:Species, :Location)''', row)
db.commit()
db.close()
Of course if your real data are in some other format than CSV, you'll use different code to parse the input file.
I've also made things slightly more complex than your real data might have to deal with—the CSV columns don't have quite the same names as the SQL columns.
In other ways, your data might be more complex—e.g., if your schema has foreign keys that reference an auto-incremented row ID instead of a text field, you'll need to get the rowid after the first insert.
But this should be enough to give you the idea.
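For the auto-incremented row ID case just mentioned, a sketch with made-up tables (Parent and Child are hypothetical):

# Parent gets an INTEGER PRIMARY KEY, which SQLite fills in automatically.
cursor.execute('CREATE TABLE Parent (id INTEGER PRIMARY KEY, name)')
cursor.execute('''CREATE TABLE Child (parent_id, detail,
                  FOREIGN KEY(parent_id) REFERENCES Parent(id))''')

cursor.execute('INSERT INTO Parent (name) VALUES (?)', ('Vervet',))
parent_id = cursor.lastrowid  # the id assigned by the insert above
cursor.execute('INSERT INTO Child VALUES (?, ?)', (parent_id, 'South Africa'))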
Now that you've shown more details… you were on the right track (although it's wasteful to call readlines instead of just iterating over fd directly, and you should close your db and file, ideally with a with statement, …), but you've got a simple mistake right near the end that prevents you from getting any farther:
insert = """(insert into LN values (%s, %s, %s);, %(currentRow[4], currentRow[5], currentRow[6]))"""
c.execute(insert)
You've put the formatting % expression directly into the string, instead of using the operator on the string. I think what you were trying to do is:
insert = """insert into LN values (%s, %s, %s);""" % (currentRow[4], currentRow[5], currentRow[6])
c.execute(insert)
However, you shouldn't do that. Instead, do this:
insert = """insert into LN values (?, ?, ?);"""
c.execute(insert, (currentRow[4], currentRow[5], currentRow[6]))
What's the difference?
Well, the first one just inserts the values into the statement as Python strings. That means you have to take care of converting to the proper format, quoting, escaping, etc. yourself, instead of letting the database engine decide how to deal with each value. Besides being a source of frustrating bugs when you try to save a boolean value or forget to quote a string, this also leaves you open to SQL injection attacks unless you're very careful.
There are other problems besides that one. For example, most databases will try to cache repeated statements, and it's trivial to tell that 3000 instances of insert into LN values (?, ?, ?) are all the same statement, but less so to tell that insert into LN values (5, 1.0, 200) and insert into LN values (1, 5.0, 5000) are the same statement.
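Tying that back to your loop, here's a sketch of a parameterized version for the LN table (the indices follow your snippet; adjust the split for your real delimiter):

params = []
for elt in data:
    currentRow = elt.rstrip("\n").split(", ")
    params.append((currentRow[4], currentRow[5], currentRow[6]))

# One statement, many parameter sets: easy for the database to cache.
c.executemany("insert into LN values (?, ?, ?)", params)
conn.commit()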
If you can use the standard sqlite3 command-line utility, you can do it much more easily:
sqlite3 -init mydata.sql mydatabase.db ""
Simply call this line from your Python script, and you're done.
This will read any text file that contains valid SQL statements and will create mydatabase.db if it does not already exist. More importantly, it supports statements spanning more than one line, and it properly ignores SQL comments in both the --comment syntax and the C/C++-style /*comment*/ syntax.
Typically your mydata.sql content should look like this:
BEGIN TRANSACTION;
CREATE TABLE IF NOT EXISTS table1 (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    name VARCHAR(32)
);
INSERT INTO table1 (name) VALUES
('John'),
('Jack'),
('Jill');
-- more statements ...
COMMIT;
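From Python, one way to run that command is via subprocess; a minimal sketch (it assumes the sqlite3 binary is on your PATH):

import subprocess

# Runs mydata.sql against mydatabase.db, creating the database if needed.
subprocess.run(["sqlite3", "-init", "mydata.sql", "mydatabase.db", ""],
               check=True)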
I've been trying to parse a text file (opened with encoding='utf8') and insert the extracted values into an .mdb (Access) database using the pyodbc module.
I have tried the code below:
for line in fp:
    tokens = line.split('\t')
    tokens[4] = tokens[4][:len(tokens[4])-1] #to avoid the \n
    tokens[1] = tokens[1][1:] #to remove the 'u' from the beginning of utf8 characters like u'\u0622'
    content = conn.execute("INSERT INTO Entries (PForm, WForm, Code, Freq, Pattern) VALUES ("+tokens[0]+","+tokens[1]+","+tokens[2]+","+tokens[3]+","+tokens[4]+")")
    conn.commit()
and received the following error:
Error: ('07002', '[07002] [Microsoft][ODBC Microsoft Access Driver] Too few parameters. Expected 4. (-3010) (SQLExecDirectW)')
P.S. the first line of my file is: آ 'A Ab 1 S
And the other lines are of the same format.
Your comments will be appreciated :)
You don't put quotes around the strings you want to insert. Assuming the "Freq" column is of type INTEGER:
stmt = """
INSERT INTO Entries (PForm, WForm, Code, Freq, Pattern)
VALUES ('%s', '%s', '%s', %s, '%s')
"""
params = tuple(t for t in tokens)
conn.execute(stmt % params)
But anyway, you shouldn't be formatting an INSERT statement like this. Doesn't the library you're using provide a facility to parameterize statements? Something like this:
conn.execute("INSERT INTO Foo VALUES (?, ?, ?)", (foo, bar, baz))