I'm new to Python and I want to update every record that has count 0 in the database. I have tried a lot but can't find anything that helps.
for row in cur.fetchall():
    if row[3] == 0:
        cur.execute("UPDATE tble SET count = 1 WHERE name = %s" % row[1])
Assuming your table has this structure:
CREATE TABLE `test` (
  `sno` int(11) NOT NULL,
  `name` varchar(50) NOT NULL,
  `count` int(11) NOT NULL,
  `dtCreated` datetime NOT NULL DEFAULT CURRENT_TIMESTAMP
);
Here is a simple version of the code:
import pymysql

conn = pymysql.connect(host='localhost', unix_socket='', user='USER', passwd='PASSWORD', db='DATABASENAME')
cur = conn.cursor()
cur.execute("SELECT * FROM test")
for r in cur:
    curr = conn.cursor()
    # Bind the name as a query parameter instead of formatting it into the string
    sql = """UPDATE test SET count = 1 WHERE name = %s"""
    try:
        # Execute the SQL command
        curr.execute(sql, (r[1],))
        # Commit your changes in the database
        conn.commit()
    except Exception:
        # Rollback in case there is any error
        conn.rollback()
    curr.close()
cur.close()
conn.close()
Also, since you mentioned that you are new to Python: remember to commit every time you run an INSERT, UPDATE, or DELETE query.
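As a side note (a minimal sketch, assuming the same test table and conn connection as above, and assuming you simply want to flip every 0 to 1): you don't need to loop in Python at all; a single UPDATE with a WHERE clause does the same work in one statement:
cur = conn.cursor()
# Update all matching rows in one round trip instead of row by row
cur.execute("UPDATE test SET count = 1 WHERE count = 0")
conn.commit()
cur.close()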
Hope it helps.
Related
Error Message
You have an error in your SQL syntax; check the manual that
corresponds to your MariaDB server version for the right syntax to use
near '%s' at line 1
MySQL Database Table
CREATE TABLE `tblorders` (
  `order_id` int(11) NOT NULL,
  `order_date` date NOT NULL,
  `order_number` varchar(50) NOT NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4;

ALTER TABLE `tblorders`
  ADD PRIMARY KEY (`order_id`),
  ADD UNIQUE KEY `order_number` (`order_number`);

ALTER TABLE `tblorders`
  MODIFY `order_id` int(11) NOT NULL AUTO_INCREMENT, AUTO_INCREMENT=4;
Code
import mysql.connector

mydb = mysql.connector.connect(host = "localhost", user = "root", password = "", database = "mydb")
mycursor = mydb.cursor()
sql = "Select order_id from tblorders where order_number=%s"
val = ("1221212")
mycursor.execute(sql, val)
Am I missing anything?
You must pass a list or a tuple as the arguments, but a single value wrapped in parentheses is not a tuple; it is just that value. The trailing comma is what makes it a tuple.
Here are some workarounds to ensure that val is interpreted as a tuple or a list:
sql = "Select order_id from tblorders where order_number=%s"
val = ("1221212",)
mycursor.execute(sql, val)
sql = "Select order_id from tblorders where order_number=%s"
val = ["1221212"]
mycursor.execute(sql, val)
This is a thing about Python that I always find weird, but it makes a kind of sense.
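A quick interpreter session (mine, not from the original post) makes the difference visible:
>>> type(("1221212"))   # parentheses alone do not create a tuple
<class 'str'>
>>> type(("1221212",))  # the trailing comma does
<class 'tuple'>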
In case you want to insert data, you have to modify your SQL. Use INSERT instead of SELECT, like this:
INSERT INTO tblorders (order_number) VALUES ("122121");
That statement will add a new record to the table. Also note that if you use the mariadb connector module (rather than mysql.connector), its placeholder style is ? instead of the %s used above.
sql = "INSERT INTO tblorders (order_number) VALUES (?);"
val = "1231231"
mycursor.execute(sql, [val])
I tried to update multiple rows (approx. 350000) with a single query by implementing the following function:
def update_items(rows_to_update):
    sql_query = """UPDATE contact as t SET
                       name = e.name
                   FROM (VALUES %s) AS e(id, name)
                   WHERE e.id = t.id;"""
    conn = get_db_connection()
    cur = conn.cursor()
    psycopg2.extras.execute_values(
        cur, sql_query, rows_to_update, template=None, page_size=100
    )
While trying to run the function above, only 31 records were updated. Then, I tried to update row by row with the following function:
def update_items_row_by_row(rows_to_update):
    sql_query = """UPDATE contact SET name = %s WHERE id = %s"""
    conn = get_db_connection()
    with tqdm(total=len(rows_to_update)) as pbar:
        for id, name in rows_to_update:
            cur = conn.cursor()
            # execute the UPDATE statement
            cur.execute(sql_query, (name, id))
            # Commit the changes to the database
            conn.commit()
            cur.close()
            pbar.update(1)
The latter has updated all the records so far but is very slow (estimated to end in 9 hours).
Does anyone know an efficient way to update multiple records?
By splitting the list into chunks of size equal to page_size, it worked well:
def update_items(rows_to_update):
    sql_query = """UPDATE contact as t SET
                       name = data.name
                   FROM (VALUES %s) AS data (id, name)
                   WHERE t.id = data.id"""
    conn = get_db_connection()
    cur = conn.cursor()
    n = 100
    with tqdm(total=len(rows_to_update)) as pbar:
        for i in range(0, len(rows_to_update), n):
            psycopg2.extras.execute_values(
                cur, sql_query, rows_to_update[i:i + n], template=None, page_size=n
            )
            conn.commit()
            pbar.update(cur.rowcount)
    cur.close()
    conn.close()
The problem with your original function appears to be that you forgot to commit. When you execute an INSERT/UPDATE query with psycopg2, a transaction is opened but not finalized until commit is called. See my edits to your function (towards the bottom).
def update_items(rows_to_update):
    sql_query = """UPDATE contact as t SET
                       name = e.name
                   FROM (VALUES %s) AS e(id, name)
                   WHERE e.id = t.id;"""
    conn = get_db_connection()
    cur = conn.cursor()
    psycopg2.extras.execute_values(cur, sql_query, rows_to_update)

    ## solution below ##
    conn.commit()  # <- We MUST commit to reflect the inserted data
    cur.close()
    conn.close()

    return "success :)"
If you don't want to call conn.commit() each time, you can turn on autocommit for the session:
conn = get_db_connection()
conn.set_session(autocommit=True)
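Another option (a minimal sketch, not from the original answer) is to let psycopg2 manage the transaction with a with block, which commits on success and rolls back if an exception is raised:
conn = get_db_connection()
with conn:
    with conn.cursor() as cur:
        # committed automatically when the outer `with conn:` block
        # exits without an exception
        psycopg2.extras.execute_values(cur, sql_query, rows_to_update)
Note that with conn: only ends the transaction; it does not close the connection, so call conn.close() yourself when you are done.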
I am trying to insert a new row into a PostgreSQL database using psycopg2 and Flask.
Here is the code:
import psycopg2

con = psycopg2.connect("host=localhost dbname=crm_whatsapp user=user_name password=user_password")
cur = con.cursor()
create = cur.execute("INSERT INTO crm_user_chat_data (number) VALUES (%s) returning id", (user_number,))  # Here it returns None
con.commit()
But I am getting None instead of the id.
How can I solve this?
You need to fetch the returned row from the cursor to get the id; cur.execute() itself always returns None, which is why create is None:
cur = con.cursor()
cur.execute("INSERT INTO crm_user_chat_data (number) VALUES (%s) RETURNING id", (user_number,))
# fetchone() returns the RETURNING row as a tuple, e.g. (42,)
insert_id = cur.fetchone()[0]
con.commit()
When trying to insert rows into a table with a unique index, it appears to simply silently not insert.
I've captured the behaviour in the following program: on the second call to test_insert I should get an integrity violation on the unique key, but nothing happens. However, if I take the c.execute(query, [id_to_test]) line and duplicate it immediately below, I do receive the expected integrity constraint error. What's happening here?
import sqlite3

def test_insert(id_to_test):
    conn = sqlite3.connect('test.db')
    c = conn.cursor()
    query = '''INSERT INTO test(unique_id)
               VALUES(?)'''
    c.execute(query, [id_to_test])

def setup_table():
    conn = sqlite3.connect('test.db')
    c = conn.cursor()
    c.execute('''DROP TABLE IF EXISTS test''')
    c.execute('''CREATE TABLE test (unique_id text)''')
    c.execute('''CREATE UNIQUE INDEX test_unique_id ON test (unique_id)''')

if __name__ == '__main__':
    setup_table()
    test_insert('test_id')
    test_insert('test_id')
    test_insert('test_id')
At the end of database operations, commit the changes to the database:
conn.commit()
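In your example that means committing inside test_insert (a minimal sketch of the fix); without it the row from the first call is never persisted, so the later calls, which each open a fresh connection, see an empty table and nothing conflicts:
def test_insert(id_to_test):
    conn = sqlite3.connect('test.db')
    c = conn.cursor()
    query = '''INSERT INTO test(unique_id)
               VALUES(?)'''
    c.execute(query, [id_to_test])
    conn.commit()  # persist the row so subsequent connections can see it
    conn.close()
sqlite3 connections can also be used as context managers (with conn:), which commit automatically when the block exits without an exception.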
I am having trouble inserting a record into a MySQL database from Python. This is what I am doing.
def testMain2():
    conn = MySQLdb.connect(charset='utf8', host="localhost", user="root", passwd="root", db="epf")
    cursor = conn.cursor()

    tableName = "test_table"
    columnsDef = "(export_date BIGINT, storefront_id INT, genre_id INT, album_id INT, album_rank INT)"
    exStr = """CREATE TABLE %s %s""" % (tableName, columnsDef)
    cursor.execute(exStr)

    # Escape the record
    values = ["1305104402172", "12", "34", "56", "78"]
    values = [conn.literal(aField) for aField in values]
    stringList = "(%s)" % (", ".join(values))
    columns = "(export_date, storefront_id, genre_id, album_id, album_rank)"
    insertStmt = """INSERT INTO %s %s VALUES %s""" % (tableName, columns, stringList)
    cursor.execute(insertStmt)

    cursor.close()
    conn.close()
The table is created, however nothing is in it. I can run the INSERT statement successfully in the terminal with the same credentials.
Any suggestions on what I may be doing wrong?
You haven't committed the transaction.
conn.commit()
(The MySQLdb library sets autocommit to False when connecting to MySQL. This means that you need to manually call commit or your changes will never make it into the database.)
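In your function that would look roughly like this (only the commit line is new):
    cursor.execute(insertStmt)
    conn.commit()  # without this the INSERT is discarded when the connection closes
    cursor.close()
    conn.close()
Alternatively, MySQLdb connections expose conn.autocommit(True), which you can call right after connecting if you prefer every statement to be committed immediately.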