The code is the following (I am new to Python/MySQL):
import mysql.connector
conn = mysql.connector.connect(host='localhost',user='user1',password='puser1',db='mm')
cursor = conn.cursor()
string1 = 'test1'
insert_query = """INSERT INTO items_basic_info (item_name) VALUES (%s)""", (string1)
cursor.execute(insert_query)
conn.commit()
When I run this code I get this error:
Traceback (most recent call last):
File "test3.py", line 9, in <module>
cursor.execute(insert_query)
File "C:\Users\Emanuele-PC\AppData\Local\Programs\Python\Python36\lib\site-packages\mysql\connector\cursor.py", line 492, in execute
stmt = operation.encode(self._connection.python_charset)
AttributeError: 'tuple' object has no attribute 'encode'
I have seen different answers to this problem, but those cases were quite different from mine and I couldn't really understand where I am making a mistake. Can anyone help me?
To avoid SQL injection, the Django documentation strongly recommends using placeholders, like this:
import mysql.connector
conn = mysql.connector.connect(host='localhost',user='user1',password='puser1',db='mm')
cursor = conn.cursor()
string1 = 'test1'
insert_query = """INSERT INTO items_basic_info (item_name) VALUES (%s)"""
cursor.execute(insert_query, (string1,))
conn.commit()
You have to pass the parameters as a tuple or list in the second argument of the execute method, and everything should work.
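For instance, with more than one placeholder the values just line up positionally in that same tuple. A minimal sketch (item_price is a hypothetical extra column, not part of the original table):
# item_price is a hypothetical column, used only to illustrate several placeholders
cursor.execute(
    "INSERT INTO items_basic_info (item_name, item_price) VALUES (%s, %s)",
    (string1, 9.99))
conn.commit()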
Not exactly the OP's problem, but I got stuck for a while writing multiple variables to MySQL.
Following on from Jefferson Houp's answer: if you are inserting multiple strings, you must specify the argument 'multi=True' in the 'cursor.execute' function.
import mysql.connector
conn = mysql.connector.connect(host='localhost',user='user1',password='puser1',db='mm')
cursor = conn.cursor()
string1 = 'test1'
string2 = 'test2'
insert_query = """INSERT INTO items_basic_info (item_name) VALUES (%s, %s)"""
cursor.execute(insert_query, (string1, string2), multi=True)
conn.commit()
I am having trouble inserting CSV data into a MySQL table with mysql.connector.
The code I use looks like this:
import mysql.connector
import csv

andreport = 'testowa.csv'

cnx = mysql.connector.connect(
    user='xxxxx',
    password='xxxxx',
    host='xxxxxx',
    database='xxxxx')
cursor = cnx.cursor()

with open(andreport, 'r') as csv_data:
    for row in csv_data:
        cursor.execute(
            "INSERT INTO flex(date, Store, Vendor, Shelf)"
            "VALUES({},{},{},{})", row)

cnx.commit()
cursor.close()
cnx.close()
print("Done")
The error I get:
C:\Users\Iw4n\PycharmProjects\Learning\venv\Scripts\python.exe C:/Users/Iw4n/PycharmProjects/Learning/Orange_android_MySQL_insertion.py
Traceback (most recent call last):
File "C:/Users/Iw4n/PycharmProjects/Learning/Orange_android_MySQL_insertion.py", line 15, in <module>
cursor.execute(
File "C:\Users\Iw4n\PycharmProjects\Learning\venv\lib\site-packages\mysql\connector\cursor.py", line 551, in execute
self._handle_result(self._connection.cmd_query(stmt))
File "C:\Users\Iw4n\PycharmProjects\Learning\venv\lib\site-packages\mysql\connector\connection.py", line 490, in cmd_query
result = self._handle_result(self._send_cmd(ServerCmd.QUERY, query))
File "C:\Users\Iw4n\PycharmProjects\Learning\venv\lib\site-packages\mysql\connector\connection.py", line 395, in _handle_result
raise errors.get_exception(packet)
mysql.connector.errors.ProgrammingError: 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '},{},{},{})' at line 1
When I wrapped the {} in quotes, as many rows as there were in the CSV got inserted into the database, but as literal {},{} values.
The same goes for %s: if I use it I get the same error as above, and when it's wrapped in quotes, a literal %s is inserted into the database.
I also found information suggesting I add an f in front of the "INSERT..." string, but it did not help.
Can anyone give me a suggestion on how to overcome this and correctly insert the data into MySQL?
Final code that works as intended:
import mysql.connector
import csv

andreport = 'testowa.csv'

cnx = mysql.connector.connect(
    user='xxxxx',
    password='xxxxx',
    host='xxxxx',
    database='xxxxx')
cursor = cnx.cursor()

with open(andreport, mode='r') as csv_data:
    reader = csv.reader(csv_data, delimiter=';')
    csv_data_list = list(reader)
    for row in csv_data_list:
        cursor.execute("""
            INSERT INTO flex(
                date, Agency, MediaSource, Campaign)
            VALUES(%s,%s,%s,%s)""",
            (row[0], row[1], row[2], row[3]))

cnx.commit()
cursor.close()
cnx.close()
print("Done")
I'm guessing the problem is that you passed one argument (row) instead of four. So try this:
cursor.execute("""
INSERT INTO flex(date, Store, Vendor, Shelf)
VALUES(%s,%s,%s,%s)""",(row[0], row[1], row[2], row[3], ))
Looking at the documentation for the MySQLCursor.execute() method, it seems like using %s placeholders as the parameters in your insert statement might fix this.
import mysql.connector
import csv

andreport = 'testowa.csv'

cnx = mysql.connector.connect(
    user='xxxxx',
    password='xxxxx',
    host='xxxxxx',
    database='xxxxx')
cursor = cnx.cursor()

insert_statement = (
    "INSERT INTO flex(date, Store, Vendor, Shelf) "
    "VALUES (%s, %s, %s, %s)"
)

with open(andreport, mode='r') as csv_data:
    reader = csv.reader(csv_data, delimiter=';')
    csv_data_list = list(reader)
    for row in csv_data_list:
        cursor.execute(insert_statement, row)

cnx.commit()
cursor.close()
cnx.close()
print("Done")
Let me know if this gets you anywhere, or if you see a new error!
Edit: updated CSV reading to convert to a list.
I'm attempting to pass a list into a postgres table using psycopg2. I keep running into an exception:
File "c:/Python27/Projects/Newsletter/newsletter.py", line 148, in <module>
insert_pg(listString)
File "c:\Python27\Projects\Newsletter\pg.py", line 23, in insert_pg
print('pggggg', error)
IOError: [Errno 0] Error
The data is pretty messy (forgive me), but here's a snippet of the code. I'm running it from newsletter.py:
if __name__ == '__main__':
    dataList = [today, str(int(round(float(str(spxprice.replace(',', '')))))), str(int(round(float(spxchg)))), str(int(round(float(spxpchg)))), str(int(round(float(str(dowprice.replace(',', '')))))), dowpchg, str(int(round(float(dowchg)))), str(int(round(float(str(ndxprice.replace(',', '')))))), ndxpchg, str(int(round(float(ndxchg)))), ''.join(oilPrice[4]), ''.join(getOilChg), ''.join(getOilPct), dayName]
    listString = ', '.join(dataList)
    insert_pg(listString)
This is pg.py, where I'm importing insert_pg from:
import psycopg2
from config import config
import sys

def insert_pg(thedata):
    sql = ("""insert into prices values (%s);""" % thedata)
    conn = None
    try:
        # read database configuration
        params = config()
        # connect to the PostgreSQL database
        conn = psycopg2.connect(**params)
        # create a new cursor
        cur = conn.cursor()
        # execute the INSERT statement
        cur.execute(sql)
        conn.commit()
        cur.close()
        print 'Success.'
    except (Exception, psycopg2.DatabaseError) as error:
        print('pggggg', error)
    finally:
        if conn is not None:
            conn.close()
The output of sql when I print:
insert into prices values (02/14/2018, 2675, 12, 0, 24698, 0.23, 58, 7074, 0.86, 60, 59.09, -0.06, -0.10%, Wednesday);
Not sure where I'm going wrong here. The database is connecting fine. Any ideas?
First off, you're not using bound variables, which is bad practice because it can lead to SQL injection. What you should be doing is this:
cur.execute('INSERT INTO PRICES(col1, col2, ...) VALUES(%(val1)s, %(val2)s, ...)', kwargs)
where kwargs is a dictionary of key/value pairs corresponding to the column names and values. This is the correct way to do it.
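As a minimal sketch of what that could look like here (the column names price_date, spx_price and day_name are assumptions for illustration, not the real schema of prices):
# Hypothetical column names; adjust to the actual schema of the prices table.
row = {'price_date': '02/14/2018', 'spx_price': 2675, 'day_name': 'Wednesday'}
cur.execute(
    'INSERT INTO prices (price_date, spx_price, day_name) '
    'VALUES (%(price_date)s, %(spx_price)s, %(day_name)s)',
    row)
conn.commit()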
The problem might also be related to your attempt at printing the error by itself.
Try replacing print('pggggg', error) with a bare raise.
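That is, roughly this change inside insert_pg's try/except, so the original traceback surfaces instead of being swallowed (a sketch of the existing function, not new logic):
    except (Exception, psycopg2.DatabaseError):
        # re-raise the original exception so the real traceback is visible
        raise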
I wish to import my .csv file to a table 'testcsv' in MySQL using Python, but I'm unable to do so because I keep getting the following error:
Traceback (most recent call last):
File "<pyshell#9>", line 2, in <module>
cursor.execute(query,r)
File "C:\Python27\lib\site-packages\MySQLdb\cursors.py", line 184, in execute
query = query % db.literal(args)
TypeError: not enough arguments for format string
My code looks like this:
import csv
import MySQLdb

mydb = MySQLdb.connect(host='127.0.0.1',
                       port=3306,
                       user='root',
                       passwd='tulip',
                       db='frompython')
cursor = mydb.cursor()
csv_data = csv.reader(file('C:\Users\Tulip\Desktop\New_Sheet.csv'))
row_count = sum(1 for row in csv_data)
query = """INSERT INTO testcsv (number, name, marks) VALUES (%s, %s, %s)"""
for r in range(1, row_count):
    cursor.execute(query, r)
I've tried every possible answer given to the related questions here, but none of them worked for me. Please help!
for r in range(1, row_count):
just iterates over numbers, i.e. in the first iteration r = 1 (and the row_count line has already exhausted the CSV reader anyway). Remove the line defining row_count and iterate over the actual rows instead:
for r in csv_data:
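A minimal sketch of the corrected loop, assuming each CSV row holds exactly the three values number, name, marks:
query = """INSERT INTO testcsv (number, name, marks) VALUES (%s, %s, %s)"""

# each r from csv.reader is already a list of column values, so it can be
# passed straight through as the parameter sequence
for r in csv_data:
    cursor.execute(query, r)

mydb.commit()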
I was trying to run this code (it's something I am doing to improve myself):
import sqlite3 as sq

def add_user(username, passwd):
    database = sq.connect("database.db")
    cursor = database.cursor()
    cursor.execute("""insert into maintable values (?, ?, ?)""",
                   (username, passwd, ""))
    database.commit()

def add_lesson(username, lesson):
    database = sq.connect("database.db")
    cursor = database.cursor()
    cursor.execute("""select * from maintable where kullad=?""", (username,))
    if cursor.fetchone()[2] != "":
        lessons_dict = {lesson: 0}
        cursor.execute(
            """update maintable set lessons=? where kullad=?""", (lessons_dict, username,))
        database.commit()
    else:
        cursor.fetchone()[2][lesson] = 0
        cursor.execute(
            """update maintable set lessons=? where kullad=?""", (lessons_dict, username,))
        database.commit()

add_lesson("user", "lesson1")
When I try to run this I get this error:
Traceback (most recent call last):
File "main.py", line 29, in <module>
add_lesson("user", "lesson1")
File "main.py", line 24, in add_lesson
cursor.fetchone()[2][lesson] = 0
TypeError: 'NoneType' object has no attribute '__getitem__'
I think your SQL may need to be like this:
cursor.execute("""select * from maintable where kullad="?" """, (username,))
Strings in SQL should be surrounded by quotes.
I am trying to execute the below code using Python 2.5.2. The script establishes the connection and creates the table, but then it fails with the below error.
The script
import pymssql

conn = pymssql.connect(host='10.103.8.75', user='mo', password='the_password', database='SR_WF_MODEL')
cur = conn.cursor()
cur.execute('CREATE TABLE persons(id INT, name VARCHAR(100))')
cur.executemany("INSERT INTO persons VALUES(%d, %s)",
                [(1, 'John Doe'), (2, 'Jane Doe')])
conn.commit()

cur.execute("SELECT * FROM persons WHERE salesrep='%s'", 'John Doe')
row = cur.fetchone()
while row:
    print "ID=%d, Name=%s" % (row[0], row[1])
    row = cur.fetchone()

cur.execute("SELECT * FROM persons WHERE salesrep LIKE 'J%'")
conn.close()
The error
Traceback (most recent call last):
File "connect_to_mssql.py", line 9, in <module>
cur.execute("SELECT * FROM persons WHERE salesrep='%s'", 'John Doe')
File "/var/lib/python-support/python2.5/pymssql.py", line 126, in execute
self.executemany(operation, (params,))
File "/var/lib/python-support/python2.5/pymssql.py", line 152, in executemany
raise DatabaseError, "internal error: %s" % self.__source.errmsg()
pymssql.DatabaseError: internal error: None
Any suggestions? Also, how do you read the traceback? Can anyone help me understand the error message? Do you read it bottom up?
I think you are assuming the regular Python string interpolation behavior, i.e.:
>>> a = "we should never do '%s' when working with dbs"
>>> a % 'this'
"we should never do 'this' when working with dbs"
The % operator within the execute method looks like the normal string formatting operator but that is more of a convenience or mnemonic; your code should read:
cur.execute("SELECT * FROM persons WHERE salesrep=%s", 'John Doe')
without the quotes, and this will work with names like O'Reilly and help prevent SQL injection, per the database adapter design. This is really what the database adapter is there for: converting the Python objects into SQL; it knows how to quote a string and properly escape punctuation, etc. It would work if you did:
>>> THING_ONE_SHOULD_NEVER_DO = "select * from table where cond = '%s'"
>>> query = THING_ONE_SHOULD_NEVER_DO % 'john doe'
>>> query
"select * from table where cond = 'john doe'"
>>> cur.execute(query)
but this is bad practice.
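As a rough sketch of the parameterized form: the standard DB-API style is to pass the parameters as a sequence, which pymssql also accepts (the traceback above shows it wraps a lone parameter into a tuple itself):
# the adapter substitutes and escapes 'John Doe' itself; no manual quoting needed
cur.execute("SELECT * FROM persons WHERE salesrep=%s", ('John Doe',))
row = cur.fetchone()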