IndexError out of range on Python

I have Python code to insert data into my database. Here is the relevant line:
query = 'insert into 1_recipe values({0})'
I used {0} to pass all the data from my CSV file. It worked perfectly before I used sys.argv in my code. Here is the new code:
import sys
nomor = sys.argv[1]
.....
query = "insert into {idnumber}_recipe values ({0})".format(idnumber = nomor)
query = query.format(','.join(['%s'] * len(data)))
.....
When I run this code, it always comes back with this error:
'query = "insert into {idnumber}_recipe values ({0})".format(idnumber = nomor)
IndexError: Replacement index 0 out of range for positional args tuple'
How do I fix it? Thanks.
Update:
I already found the answer. Thank you

You are only passing one argument to the format() function:
.format(idnumber = nomor)
The format function doesn't have a value to give to the ({0}) part of the format string.
Either pass another positional value, or change the placeholder so it uses idnumber as well.
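For example, to keep the question's two-step formatting, one option (a minimal sketch assuming nomor and data are defined as in the question) is to escape the inner placeholder with doubled braces so the first format() call leaves it alone:
# First pass fills in only the table prefix; {{0}} survives as a literal {0}.
query = "insert into {idnumber}_recipe values ({{0}})".format(idnumber=nomor)
# Second pass fills in the %s placeholders for the database driver.
query = query.format(','.join(['%s'] * len(data)))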

You can look at query development for formatting here.
e.g.:
insert_stmt = (
"INSERT INTO employees (emp_no, first_name, last_name, hire_date) "
"VALUES (%s, %s, %s, %s)"
)
data = (2, 'Jane', 'Doe', datetime.date(2012, 3, 23))
cursor.execute(insert_stmt, data)
Your code can be rewritten to something like this, using better formatting:
import sys
nomor = sys.argv[1]
data_str_value = ','.join(['%s'] * len(data))
.....
query = "insert into {idnumber}_recipe values ({values})".format(idnumber = nomor, values = data_str_value)
.....
Note: this code only demonstrates better formatting, as per the example given; the query itself may or may not run as expected due to incorrect syntax.

Related

Error: can only concatenate str (not "list") to str

I am trying to import a txt file into SQL, but I have an error:
TypeError: can only concatenate str (not "list") to str
My code:
import psycopg2

con = psycopg2.connect(
    host = "",
    database = "",
    user = "",
    password = "")
cursor = con.cursor()

with open("pom1.txt") as infile:
    for line in infile:
        data = line.split()
        print(data)
        query = ("INSERT INTO Pomiary_Obwod_90(Znacznik, Pomiar_x, Pomiar_y, Pomiar_z) VALUES"
                 "(" + data + ");")
        cursor.execute(query, *data)
con.commit()
Does anyone have an idea how I can solve it? :)
You don't put the actual values into the parameterized query; you put whatever placeholders are appropriate for your library.
data = line.split()
place_holders = ', '.join("%s" for _ in data) # Assuming %s is correct
query = ("INSERT INTO Pomiary_Obwod_90(Znacznik, Pomiar_x, Pomiar_y, Pomiar_z) VALUES"
"(" + place_holders + ");")
cursor.execute(query, *data)
cursor.execute takes care of inserting each value where a placeholder occurs, ensuring things are properly quoted/escaped, etc.
There are several problems here. First, as the error says, you are trying to concatenate a List (which is data) directly to a string.
Second, you should not use + to concatenate your values and your query.
The doc says:
Warning: Never, never, NEVER use Python string concatenation (+) or string parameters interpolation (%) to pass variables to a SQL query string. Not even at gunpoint.
You should only pass the values to the query via %s.
I'm not sure about the use of * in front of data in cursor.execute(query, *data).
Here is code that should work, though I have nothing at hand to test it with right now:
import psycopg2

con = psycopg2.connect(
    host = "",
    database = "",
    user = "",
    password = "")
cursor = con.cursor()

with open("pom1.txt") as infile:
    for line in infile:
        data = line.split()
        print(data)
        query = ("INSERT INTO Pomiary_Obwod_90(Znacznik, Pomiar_x, Pomiar_y, Pomiar_z) VALUES (%s, %s, %s, %s);")
        cursor.execute(query, data)
con.commit()
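If the file has many rows, you can also read them all first and let the driver loop for you with executemany(); a sketch under the same assumptions as the code above:
with open("pom1.txt") as infile:
    rows = [line.split() for line in infile]

query = ("INSERT INTO Pomiary_Obwod_90(Znacznik, Pomiar_x, Pomiar_y, Pomiar_z) "
         "VALUES (%s, %s, %s, %s);")
cursor.executemany(query, rows)
con.commit()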

Parameterized Table Population

I am trying to populate a table (whose name is parameterized). The program runs fine up until the point where the command gets executed.
Here is the code:
table_name = input("Enter table name: ")
value_name = input("Enter name: ")
sql = "INSERT INTO %s (name) VALUES (%s)" % db.escape_string(table_name), (value_name)
cursor.execute(sql)
I get the following error:
TypeError: not enough arguments for format string
Thanks to anyone who takes the time to help. Have a great rest of the day :)
As an alternative, you could go with the newer formatting style:
sql = "INSERT INTO {tab} (name) VALUES ({val})".format(tab=db.escape_string(table_name),
                                                       val=value_name)
or
sql = f"INSERT INTO {db.escape_string(table_name)} (name) VALUES ({value_name})"
Just wrap the SQL formatting arguments in a tuple, like below, and try again:
sql = "INSERT INTO %s (name) VALUES (%s)" % (db.escape_string(table_name), value_name)

How to Import Big JSON file to MYSQL [duplicate]

I am having a hard time using the MySQLdb module to insert information into my database. I need to insert 6 variables into the table.
cursor.execute ("""
INSERT INTO Songs (SongName, SongArtist, SongAlbum, SongGenre, SongLength, SongLocation)
VALUES
(var1, var2, var3, var4, var5, var6)
""")
Can someone help me with the syntax here?
Beware of using string interpolation for SQL queries, since it won't escape the input parameters correctly and will leave your application open to SQL injection vulnerabilities. The difference might seem trivial, but in reality it's huge.
Incorrect (with security issues)
c.execute("SELECT * FROM foo WHERE bar = %s AND baz = %s" % (param1, param2))
Correct (with escaping)
c.execute("SELECT * FROM foo WHERE bar = %s AND baz = %s", (param1, param2))
It adds to the confusion that the modifiers used to bind parameters in a SQL statement vary between different DB API implementations, and that the mysql client library uses printf-style syntax instead of the more commonly accepted '?' marker (used by e.g. python-sqlite).
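For comparison, a small runnable sketch with the standard-library sqlite3 module, which uses the qmark paramstyle instead of the format style used by MySQLdb and psycopg2:
import sqlite3

conn = sqlite3.connect(":memory:")
c = conn.cursor()
c.execute("CREATE TABLE foo (bar TEXT, baz TEXT)")
# sqlite3 expects '?' placeholders...
c.execute("SELECT * FROM foo WHERE bar = ? AND baz = ?", ("a", "b"))
# ...while MySQLdb/psycopg2 expect '%s' placeholders:
# c.execute("SELECT * FROM foo WHERE bar = %s AND baz = %s", ("a", "b"))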
You have a few options available. You'll want to get comfortable with Python's string interpolation, which is a term you might have more success searching for in the future when you want to know stuff like this.
Better for queries:
some_dictionary_with_the_data = {
    'name': 'awesome song',
    'artist': 'some band',
    # etc...
}
cursor.execute ("""
INSERT INTO Songs (SongName, SongArtist, SongAlbum, SongGenre, SongLength, SongLocation)
VALUES
(%(name)s, %(artist)s, %(album)s, %(genre)s, %(length)s, %(location)s)
""", some_dictionary_with_the_data)
Considering you probably have all of your data in an object or dictionary already, the second format will suit you better. Also it sucks to have to count "%s" appearances in a string when you have to come back and update this method in a year :)
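For completeness, a hypothetical dictionary with every key the query above expects might look like this (the values are invented purely for illustration):
some_dictionary_with_the_data = {
    'name': 'awesome song',
    'artist': 'some band',
    'album': 'some album',
    'genre': 'rock',
    'length': 210,
    'location': '/tmp/awesome_song.mp3',
}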
The linked docs give the following example:
cursor.execute ("""
UPDATE animal SET name = %s
WHERE name = %s
""", ("snake", "turtle"))
print "Number of rows updated: %d" % cursor.rowcount
So you just need to adapt this to your own code - example:
cursor.execute ("""
INSERT INTO Songs (SongName, SongArtist, SongAlbum, SongGenre, SongLength, SongLocation)
VALUES
(%s, %s, %s, %s, %s, %s)
""", (var1, var2, var3, var4, var5, var6))
(If SongLength is numeric, you may need to use %d instead of %s).
Actually, even if your variable (SongLength) is numeric, you will still have to format it with %s in order to bind the parameter correctly. If you try to use %d, you will get an error. Here's a small excerpt from this link http://mysql-python.sourceforge.net/MySQLdb.html:
To perform a query, you first need a cursor, and then you can execute queries on it:
c=db.cursor()
max_price=5
c.execute("""SELECT spam, eggs, sausage FROM breakfast
WHERE price < %s""", (max_price,))
In this example, max_price=5. Why, then, use %s in the string? Because MySQLdb will convert it to a SQL literal value, which is the string '5'. When it's finished, the query will actually say, "...WHERE price < 5".
As an alternative to the chosen answer, and with the same safe semantics as Marcel's, here is a compact way of using a Python dictionary to specify the values. It has the benefit of being easy to modify as you add or remove columns to insert:
meta_cols = ('SongName','SongArtist','SongAlbum','SongGenre')
insert = 'insert into Songs ({0}) values ({1})'.format(
','.join(meta_cols), ','.join( ['%s']*len(meta_cols)))
args = [ meta[i] for i in meta_cols ]
cursor = db.cursor()
cursor.execute(insert,args)
db.commit()
Where meta is the dictionary holding the values to insert. Update can be done in the same way:
meta_cols = ('SongName','SongArtist','SongAlbum','SongGenre')
update = 'update Songs set {0} where id=%s'.format(
    ','.join(['{0}=%s'.format(c) for c in meta_cols]))
args = [ meta[i] for i in meta_cols ]
args.append(songid)
cursor=db.cursor()
cursor.execute(update,args)
db.commit()
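For example, a hypothetical meta dictionary driving the insert above could be:
meta = {
    'SongName': 'Some Song',      # hypothetical sample values
    'SongArtist': 'Some Band',
    'SongAlbum': 'Some Album',
    'SongGenre': 'Rock',
}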
The first solution works well. I want to add one small detail here: make sure the variable you are trying to replace/update is of type str. My MySQL column type is decimal, but I had to pass the parameter variable as a str to be able to execute the query.
temp = "100"
myCursor.execute("UPDATE testDB.UPS SET netAmount = %s WHERE auditSysNum = '42452'", (temp,))
Here is another way to do it. It's documented on the MySQL official website.
https://dev.mysql.com/doc/connector-python/en/connector-python-api-mysqlcursor-execute.html
In spirit, it uses the same mechanism as Trey Stout's answer. However, I find this one prettier and more readable.
insert_stmt = (
"INSERT INTO employees (emp_no, first_name, last_name, hire_date) "
"VALUES (%s, %s, %s, %s)"
)
data = (2, 'Jane', 'Doe', datetime.date(2012, 3, 23))
cursor.execute(insert_stmt, data)
And to better illustrate the need for variables:
NB: note the escaping being done.
employee_id = 2
first_name = "Jane"
last_name = "Doe"
insert_stmt = (
"INSERT INTO employees (emp_no, first_name, last_name, hire_date) "
"VALUES (%s, %s, %s, %s)"
)
data = (employee_id, conn.escape_string(first_name), conn.escape_string(last_name), datetime.date(2012, 3, 23))
cursor.execute(insert_stmt, data)

WHERE IN Clause in python list [duplicate]

This question already has answers here:
imploding a list for use in a python MySQLDB IN clause
(8 answers)
I need to pass a batch of parameters to MySQL in Python. Here is my code:
sql = """ SELECT * from my_table WHERE name IN (%s) AND id=%(Id)s AND puid=%(Puid)s"""
params = {'Id':id,'Puid' : pid}
in_p=', '.join(list(map(lambda x: '%s', names)))
sql = sql %in_p
cursor.execute(sql, names) #todo: add params to sql clause
The problem is that I want to pass the name list to the SQL IN clause, while also passing id and puid as parameters to the query. How do I implement this in Python?
Think about the arguments to cursor.execute that you want. You want to ultimately execute
cursor.execute("SELECT * FROM my_table WHERE name IN (%s, %s, %s) AND id = %s AND puid = %s;", ["name1", "name2", "name3", id, pid])
How do you get there? The tricky part is getting the variable number of %s placeholders right in the IN clause. The solution, as you probably saw from this answer, is to dynamically build that part and %-format it into the string.
in_p = ', '.join(list(map(lambda x: '%s', names)))
sql = "SELECT * FROM my_table WHERE name IN (%s) AND id = %s AND puid = %s;" % in_p
But this doesn't work. You get:
TypeError: not enough arguments for format string
It looks like Python is confused about the second two %ss, which you don't want to replace. The solution is to tell Python to treat those %ss differently by escaping the %:
sql = "SELECT * FROM my_table WHERE name IN (%s) AND id = %%s AND puid = %%s;" % in_p
Finally, to build the arguments and execute the query:
args = names + [id, pid]
cursor.execute(sql, args)
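Putting the pieces together, a minimal end-to-end sketch (assuming names, id, and pid are defined as in the question):
in_p = ', '.join(['%s'] * len(names))
# %%s survives the first interpolation as a literal %s for the driver.
sql = ("SELECT * FROM my_table WHERE name IN (%s) "
       "AND id = %%s AND puid = %%s;" % in_p)
args = list(names) + [id, pid]
cursor.execute(sql, args)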
sql = """ SELECT * from my_table WHERE name IN (%s) AND id=%(Id)s AND puid=%(Puid)s""".replace("%s", "%(Clause)s")
print sql%{'Id':"x", 'Puid': "x", 'Clause': "x"}
This can help you.

Inserting JSON into MySQL using Python

I have a JSON object in Python. I am using the Python DB-API and SimpleJson. I am trying to insert the JSON into a MySQL table.
At the moment I am getting errors, and I believe it is due to the single quotes '' in the JSON objects.
How can I insert my JSON Object into MySQL using Python?
Here is the error message I get:
error: uncaptured python exception, closing channel
<twitstream.twitasync.TwitterStreamPOST connected at
0x7ff68f91d7e8> (<class '_mysql_exceptions.ProgrammingError'>:
(1064, "You have an error in your SQL syntax; check the
manual that corresponds to your MySQL server version for
the right syntax to use near ''favorited': '0',
'in_reply_to_user_id': '52063869', 'contributors':
'NULL', 'tr' at line 1")
[/usr/lib/python2.5/asyncore.py|read|68]
[/usr/lib/python2.5/asyncore.py|handle_read_event|390]
[/usr/lib/python2.5/asynchat.py|handle_read|137]
[/usr/lib/python2.5/site-packages/twitstream-0.1-py2.5.egg/
twitstream/twitasync.py|found_terminator|55] [twitter.py|callback|26]
[build/bdist.linux-x86_64/egg/MySQLdb/cursors.py|execute|166]
[build/bdist.linux-x86_64/egg/MySQLdb/connections.py|defaulterrorhandler|35])
Another error for reference
error: uncaptured python exception, closing channel
<twitstream.twitasync.TwitterStreamPOST connected at
0x7feb9d52b7e8> (<class '_mysql_exceptions.ProgrammingError'>:
(1064, "You have an error in your SQL syntax; check the manual
that corresponds to your MySQL server version for the right
syntax to use near 'RT #tweetmeme The Best BlackBerry Pearl
Cell Phone Covers http://bit.ly/9WtwUO''' at line 1")
[/usr/lib/python2.5/asyncore.py|read|68]
[/usr/lib/python2.5/asyncore.py|handle_read_event|390]
[/usr/lib/python2.5/asynchat.py|handle_read|137]
[/usr/lib/python2.5/site-packages/twitstream-0.1-
py2.5.egg/twitstream/twitasync.py|found_terminator|55]
[twitter.py|callback|28] [build/bdist.linux-
x86_64/egg/MySQLdb/cursors.py|execute|166] [build/bdist.linux-
x86_64/egg/MySQLdb/connections.py|defaulterrorhandler|35])
Here is a link to the code that I am using http://pastebin.com/q5QSfYLa
#!/usr/bin/env python
try:
    import json as simplejson
except ImportError:
    import simplejson
import twitstream
import MySQLdb

USER = ''
PASS = ''
USAGE = """%prog"""

conn = MySQLdb.connect(host = "",
                       user = "",
                       passwd = "",
                       db = "")

# Define a function/callable to be called on every status:
def callback(status):
    twitdb = conn.cursor()
    twitdb.execute("INSERT INTO tweets_unprocessed (text, created_at, twitter_id, user_id, user_screen_name, json) VALUES (%s,%s,%s,%s,%s,%s)", (status.get('text'), status.get('created_at'), status.get('id'), status.get('user', {}).get('id'), status.get('user', {}).get('screen_name'), status))
    # print status
    # print "%s:\t%s\n" % (status.get('user', {}).get('screen_name'), status.get('text'))

if __name__ == '__main__':
    # Call a specific API method from the twitstream module:
    # stream = twitstream.spritzer(USER, PASS, callback)
    twitstream.parser.usage = USAGE
    (options, args) = twitstream.parser.parse_args()
    if len(args) < 1:
        args = ['Blackberry']
    stream = twitstream.track(USER, PASS, callback, args, options.debug, engine=options.engine)
    # Loop forever on the streaming call:
    stream.run()
Use json.dumps(json_value) to convert your JSON object (Python object) into a JSON string that you can insert into a text field in MySQL:
http://docs.python.org/library/json.html
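A minimal sketch of that approach, reusing the question's cursor (twitdb) and table (the sample record is hypothetical, and the other columns are omitted for brevity):
import json

record = {"favorited": 0, "text": "some 'tricky' text"}   # hypothetical payload
twitdb.execute(
    "INSERT INTO tweets_unprocessed (json) VALUES (%s)",
    (json.dumps(record),),
)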
To expand on the other answers:
Basically, you need to make sure of two things:
That the field you are trying to place the data in has room for the full amount of data you want to insert. Different database field types can fit different amounts of data.
See: MySQL String Datatypes. You probably want the "TEXT" or "BLOB" types.
That you are safely passing the data to the database. Some ways of passing data can cause the database to "look" at the data, and it will get confused if the data looks like SQL. It's also a security risk. See: SQL Injection
The solution for #1 is to check that the database is designed with correct field type.
The solution for #2 is to use parameterized (bound) queries. For instance, instead of:
# Simple, but naive, method.
# Notice that you are passing in 1 large argument to db.execute()
db.execute("INSERT INTO json_col VALUES (" + json_value + ")")
Better, use:
# Correct method. Uses parameter/bind variables.
# Notice that you are passing in 2 arguments to db.execute()
db.execute("INSERT INTO json_col VALUES %s", json_value)
Hope this helps. If so, let me know. :-)
If you are still having a problem, then we will need to examine your syntax more closely.
The most straightforward way to insert a Python map into a MySQL JSON field (assuming json has been imported)...
python_map = { "foo": "bar", "baz": [ "biz" ] }
sql = "INSERT INTO your_table (json_column_name) VALUES (%s)"
cursor.execute( sql, (json.dumps(python_map),) )
You should be able to insert into a text or blob column easily:
db.execute("INSERT INTO json_col VALUES (%s)", (json_value,))
You need to get a look at the actual SQL string; try something like this:
sqlstr = "INSERT INTO tweets_unprocessed (text, created_at, twitter_id, user_id, user_screen_name, json) VALUES (%s,%s,%s,%s,%s,%s)"
params = (status.get('text'), status.get('created_at'), status.get('id'), status.get('user', {}).get('id'), status.get('user', {}).get('screen_name'), status)
print "about to execute(%s) with %s" % (sqlstr, params)
twitdb.execute(sqlstr, params)
I imagine you are going to find some stray quotes, brackets or parentheses in there.
# Assumes bottle-style imports (route, request, response, abort), datetime.datetime, and bson's dumps/json_util (not shown here).
@route('/shoes', method='POST')
def createorder():
    cursor = db.cursor()
    data = request.json
    p_id = request.json['product_id']
    p_desc = request.json['product_desc']
    color = request.json['color']
    price = request.json['price']
    p_name = request.json['product_name']
    q = request.json['quantity']
    createDate = datetime.now().isoformat()
    print(createDate)
    response.content_type = 'application/json'
    print(data)
    if not data:
        abort(400, 'No data received')
    sql = "insert into productshoes (product_id, product_desc, color, price, product_name, quantity, createDate) values ('%s', '%s','%s','%d','%s','%d', '%s')" % (p_id, p_desc, color, price, p_name, q, createDate)
    print(sql)
    try:
        # Execute dml and commit changes
        cursor.execute(sql)
        db.commit()
        cursor.close()
    except:
        # Rollback changes
        db.rollback()
    return dumps(("OK"), default=json_util.default)
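As the earlier answers point out, formatting the values directly into the SQL string like this is open to SQL injection; a parameterized sketch of the same INSERT (keeping the answer's column and variable names) would be:
sql = ("insert into productshoes (product_id, product_desc, color, price, "
       "product_name, quantity, createDate) values (%s, %s, %s, %s, %s, %s, %s)")
cursor.execute(sql, (p_id, p_desc, color, price, p_name, q, createDate))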
Here is one example of how to add a JSON file into MySQL using Python. This means converting the JSON file into an SQL INSERT; if there are several JSON objects, it is better to issue a single INSERT than multiple calls, i.e. one INSERT INTO for all the objects rather than one per object.
# import Python's JSON lib
import json
# use JSON loads to create a list of records
test_json = json.loads('''
[
{
"COL_ID": "id1",
"COL_INT_VAULE": 7,
"COL_BOOL_VALUE": true,
"COL_FLOAT_VALUE": 3.14159,
"COL_STRING_VAULE": "stackoverflow answer"
},
{
"COL_ID": "id2",
"COL_INT_VAULE": 10,
"COL_BOOL_VALUE": false,
"COL_FLOAT_VALUE": 2.71828,
"COL_STRING_VAULE": "http://stackoverflow.com/"
},
{
"COL_ID": "id3",
"COL_INT_VAULE": 2020,
"COL_BOOL_VALUE": true,
"COL_FLOAT_VALUE": 1.41421,
"COL_STRING_VAULE": "GIRL: Do you drink? PROGRAMMER: No. GIRL: Have Girlfriend? PROGRAMMER: No. GIRL: Then how do you enjoy life? PROGRAMMER: I am Programmer"
}
]
''')
# create a nested list of the records' values
values = [list(x.values()) for x in test_json]
# print(values)
# get the column names
columns = [list(x.keys()) for x in test_json][0]
# value string for the SQL string
values_str = ""
# enumerate over the records' values
for i, record in enumerate(values):
    # declare empty list for values
    val_list = []
    # append each value to a new list of values
    for v, val in enumerate(record):
        if type(val) == str:
            val = "'{}'".format(val.replace("'", "''"))
        val_list += [ str(val) ]
    # put parenthesis around each record string
    values_str += "(" + ', '.join( val_list ) + "),\n"

# remove the last comma and end SQL with a semicolon
values_str = values_str[:-2] + ";"
# concatenate the SQL string
table_name = "json_data"
sql_string = "INSERT INTO %s (%s)\nVALUES\n%s" % (
table_name,
', '.join(columns),
values_str
)
print("\nSQL string:\n\n")
print(sql_string)
output:
SQL string:
INSERT INTO json_data (COL_ID, COL_INT_VAULE, COL_BOOL_VALUE, COL_FLOAT_VALUE, COL_STRING_VAULE)
VALUES
('id1', 7, True, 3.14159, 'stackoverflow answer'),
('id2', 10, False, 2.71828, 'http://stackoverflow.com/'),
('id3', 2020, True, 1.41421, 'GIRL: Do you drink? PROGRAMMER: No. GIRL: Have Girlfriend? PROGRAMMER: No. GIRL: Then how do you enjoy life? PROGRAMMER: I am Programmer.');
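Since this builds the statement by concatenating quoted values into the string itself, an alternative sketch that reuses the columns and values lists from above but lets a DB-API driver do the quoting (assuming a cursor object named cursor) would be:
placeholders = ', '.join(['%s'] * len(columns))
insert_sql = "INSERT INTO {} ({}) VALUES ({})".format(
    table_name, ', '.join(columns), placeholders)
cursor.executemany(insert_sql, values)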
The error may be due to an overflow of the size of the field in which you are trying to insert your JSON. Without any code, it is hard to help you.
Have you considered a NoSQL database system such as CouchDB, which is a document-oriented database relying on the JSON format?
Here's a quick tip if you want to write some inline code, say for a small JSON value, without importing json.
You can escape quotes in SQL by doubling them, i.e. use '' or "" to enter ' or ".
Sample Python code (not tested):
q = 'INSERT INTO `table`(`db_col`) VALUES ("{k:""some data"";}")'
db_connector.execute(q)
