EDIT: Yup I'm dumb. Missed the typo.
I'm following along with a video in a Udacity course and getting an error when trying to run a SQL command via psycopg2. The code is identical to the instructor's, but mine returns an error and hers doesn't.
import psycopg2
# establish connection to db
connection = psycopg2.connect('dbname=example')
# cursor is essentially an interface that allows you to start
# cuing up work and transactions
cursor = connection.cursor()
# defines SQL transaction
cursor.execute('''
    CREATE TABLE table2 (
        id INTEGER PRIMARY KEY,
        completed BOOLEAN NOT NULL DEFUALT False
    );
''')
cursor.execute('INSERT INTO table2 (id, completed) VALUES (1, true);')
# commits the transaction
connection.commit()
# must manually close your session each time one is opened
connection.close()
cursor.close()
Error:
$ python3 demo.py
Traceback (most recent call last):
File "demo.py", line 11, in <module>
cursor.execute("""
psycopg2.errors.SyntaxError: syntax error at or near "DEFUALT"
LINE 4: completed BOOLEAN NOT NULL DEFUALT False
You seem to have made a typo: instead of DEFAULT you have written DEFUALT.
cursor.execute('''
    CREATE TABLE table2 (
        id INTEGER PRIMARY KEY,
        completed BOOLEAN NOT NULL DEFAULT False
    );
''')
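For reference, a minimal corrected version of the whole script (same logic as in the question, with only the typo fixed; as an aside, the cursor is closed before the connection here, which is the more usual order):
import psycopg2

# connect and open a cursor
connection = psycopg2.connect('dbname=example')
cursor = connection.cursor()

# same DDL as above, with DEFAULT spelled correctly
cursor.execute('''
    CREATE TABLE table2 (
        id INTEGER PRIMARY KEY,
        completed BOOLEAN NOT NULL DEFAULT False
    );
''')
cursor.execute('INSERT INTO table2 (id, completed) VALUES (1, true);')

connection.commit()
cursor.close()
connection.close()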
Trying to automate working process with the tables in MySQL using for-loop
from mysql.connector import connect, Error

def main():
    try:
        with connect(host="host", user="user", password="password") as connection:
            connection.autocommit = True
            no_pk_tables_query = """
                select tab.table_schema as database_name,
                       tab.table_name
                from information_schema.tables tab
                left join information_schema.table_constraints tco
                    on tab.table_schema = tco.table_schema
                    and tab.table_name = tco.table_name
                    and tco.constraint_type = 'PRIMARY KEY'
                where tco.constraint_type is null
                    and tab.table_schema not in ('mysql', 'information_schema',
                                                 'performance_schema', 'sys')
                    and tab.table_type = 'BASE TABLE'
                order by tab.table_schema,
                         tab.table_name;
                """
            tables_to_cure = []
            with connection.cursor() as cursor:
                cursor.execute(no_pk_tables_query)
                for table in cursor:
                    tables_to_cure.append(table[1])
                    print(table[1])
                for s_table in tables_to_cure:
                    cure = """
                        USE mission_impossible;
                        ALTER TABLE `{}` MODIFY `ID` int(18) NOT NULL auto_increment PRIMARY KEY;
                        """.format(s_table)
                    cursor.execute(cure)
                    print("Cured {}".format(s_table))
    except Error as e:
        print(e)
    finally:
        print("End")

main()
And I get:
2014 (HY000): Commands out of sync; you can't run this command now
If I add connection.commit() inside the for-loop after cursor.execute(), I get:
_mysql_connector.MySQLInterfaceError: Commands out of sync; you can't run this command now
Does this mean that I'll have to use new connections inside the loop instead of the cursor?
I've looked it up and found methods like fetchall() and nextset(), but they seem to do things other than simply refreshing the current cursor data.
Using connection.autocommit = True doesn't seem to work either; the same error occurs.
Using something like sleep() also doesn't help.
What am I doing wrong here?
Edit
Getting rid of try/except didn't help:
File "/usr/local/lib/python3.8/dist-packages/mysql/connector/connection_cext.py", line 523, in cmd_query
self._cmysql.query(query,
_mysql_connector.MySQLInterfaceError: Commands out of sync; you can't run this command now
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "db.py", line 40, in <module>
main()
File "db.py", line 36, in main
cursor.execute(cure)
File "/usr/local/lib/python3.8/dist-packages/mysql/connector/cursor_cext.py", line 269, in execute
result = self._cnx.cmd_query(stmt, raw=self._raw,
File "/usr/local/lib/python3.8/dist-packages/mysql/connector/connection_cext.py", line 528, in cmd_query
raise errors.get_mysql_exception(exc.errno, msg=exc.msg,
mysql.connector.errors.DatabaseError: 2014 (HY000): Commands out of sync; you can't run this command now
Fixed:
Seems like I've finally figured it out: you need to get the results from the cursor with fetchall() instead of iterating over the cursor directly.
with connection.cursor() as cursor:
    cursor.execute(no_pk_tables_query)
    rows = cursor.fetchall()
with connection.cursor() as cursor:
    for table in rows:
        try:
            print(table[1])
            cure = """
                ALTER TABLE `{}` MODIFY `ID` int(18) NOT NULL auto_increment PRIMARY KEY;
                """.format(table[1])
            cursor.execute(cure)
            res = cursor.fetchall()
            print(res)
        except Error as e:
            print(e)
Thx everybody
Here's some sample code that shows how the "Commands out of sync" error can occur:
from mysql.connector import connect, Error

# replace asterisks in the CONFIG dictionary with your data
CONFIG = {
    'user': '*',
    'password': '*',
    'host': '*',
    'database': '*',
    'autocommit': False
}

try:
    with connect(**CONFIG) as conn:
        try:
            with conn.cursor() as cursor:
                cursor.execute('select * from ips')
                # cursor.fetchall()
        finally:
            conn.commit()
except Error as e:
    print(e)
Explanation:
The code selects all rows from a table called "ips", the contents of which are irrelevant here.
Now, note that we do not attempt to get a rowset (fetchall is commented out). We then try to commit the transaction (even though no changes were made to the table).
This will induce the "Commands out of sync" error.
However, if we uncomment that line and fetch the rowset (fetchall), the problem does not arise.
Explicitly fetching the rowset is equivalent to iterating over the cursor.
If we change the autocommit parameter to True and remove the explicit commit(), we get another error: "Unread result found".
In other words, it seems that MySQL requires you to get the rowset (or iterate over the cursor) whenever you select anything!
Note that even if autocommit is enabled (True), explicit calls to commit() are permitted.
Solutions:
Either ensure that the client application iterates over the entire cursor after a SELECT, or add 'consume_results': True to the CONFIG dictionary.
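For the second option, here is a minimal sketch of the connection config; consume_results is a documented mysql.connector connect option, and the placeholder values are the same as above:
from mysql.connector import connect, Error

# Same placeholder CONFIG as above, with consume_results enabled so the
# connector discards any unread rows before the next command or commit.
CONFIG = {
    'user': '*',
    'password': '*',
    'host': '*',
    'database': '*',
    'autocommit': False,
    'consume_results': True
}

try:
    with connect(**CONFIG) as conn:
        with conn.cursor() as cursor:
            cursor.execute('select * from ips')
            # no fetchall() needed; unread results are consumed automatically
        conn.commit()
except Error as e:
    print(e)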
I am trying to insert values into SQL Server using Python.
I wrote my Python program as below.
import pyodbc
import subprocess
cnx = pyodbc.connect("DSN=myDSN;UID=myUID;PWD=myPassword;port=1433")
runcmd1 = subprocess.check_output(["usbrh", "-t"])[0:5]
runcmd2 = subprocess.check_output(["usbrh", "-h"])[0:5]
cursor = cnx.cursor()
cursor.execute("SELECT * FROM T_TABLE-A;")
cursor.execute('''
INSERT INTO T_TABLE-A (TEMP,RH,DATE,COMPNAME)
VALUES
(runcmd1,runcmd2,GETDATE(),'TEST_Py')
''')
cnx.commit()
Then I get an error like below.
# python inserttest.py
Traceback (most recent call last):
File "inserttest.py", line 13, in <module>
''')
pyodbc.ProgrammingError: ('42S22', "[42S22] [FreeTDS][SQL Server]Invalid column name 'runcmd1'. (207) (SQLExecDirectW)")
If I write it like below, the insert works fine.
import pyodbc
cnx = pyodbc.connect("DSN=myDSN;UID=myUID;PWD=myPassword;port=1433")
cursor = cnx.cursor()
cursor.execute("SELECT * FROM T_TABLE-A;")
cursor.execute('''
INSERT INTO T_TABLE-A (TEMP,RH,DATE,COMPNAME)
VALUES
(20.54,56.20,GETDATE(),'TEST_P')
''')
cnx.commit()
The command USBRH -t gets the temperature and USBRH -h gets the humidity. They work fine in a separate Python program.
Does anyone have an idea how to solve this error?
Thanks a lot in advance.
Check the data types returned by these two lines:
runcmd1 = subprocess.check_output(["usbrh", "-t"])[0:5]
runcmd2 = subprocess.check_output(["usbrh", "-h"])[0:5]
runcmd1 and runcmd2 should be of a 'double' (numeric) data type, since the column accepts 20.54.
cursor.execute('''
INSERT INTO T_TABLE-A (TEMP,RH,DATE,COMPNAME)
VALUES
(runcmd1,runcmd2,GETDATE(),'TEST_Py')
''')
won't work because you are embedding the names of the Python variables, not their values. You need to do
sql = """\
INSERT INTO T_TABLE-A (TEMP,RH,DATE,COMPNAME)
VALUES
(?, ?, GETDATE(),'TEST_Py')
"""
cursor.execute(sql, runcmd1, runcmd2)
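Putting both answers together, a minimal sketch (an assumption on my part: usbrh prints a plain decimal number, and check_output returns bytes in Python 3, so the output is decoded and converted before binding):
import subprocess
import pyodbc

# decode the bytes returned by check_output and convert them to floats
temp = float(subprocess.check_output(["usbrh", "-t"])[0:5].decode().strip())
rh = float(subprocess.check_output(["usbrh", "-h"])[0:5].decode().strip())

cnx = pyodbc.connect("DSN=myDSN;UID=myUID;PWD=myPassword;port=1433")
cursor = cnx.cursor()
sql = """\
INSERT INTO T_TABLE-A (TEMP,RH,DATE,COMPNAME)
VALUES
(?, ?, GETDATE(), 'TEST_Py')
"""
cursor.execute(sql, temp, rh)
cnx.commit()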
I’m trying to INSERT INTO / ON DUPLICATE KEY UPDATE taking the values from one table and inserting into another. I have the following Python code.
try:
    cursor.execute("SELECT LocationId, ProviderId FROM CQCLocationDetailsUpdates")
    rows = cursor.fetchall()
    for row in rows:
        maria_cnxn.execute('INSERT INTO CQCLocationDetailsUpdates2 (LocationId, ProviderId) VALUES (%s,%s) ON DUPLICATE KEY UPDATE ProviderId = VALUES(%s)', row)
    mariadb_connection.commit()
except TypeError as error:
    print(error)
    mariadb_connection.rollback()
If I change this script to just an INSERT INTO it works fine; the problem seems to be when I add the ON DUPLICATE KEY UPDATE. What have I got wrong? LocationId is the PRIMARY KEY.
I get this error.
Traceback (most recent call last):
File "C:/Users/waynes/PycharmProjects/DRS_Dev/CQC_Locations_Update_MariaDB.py", line 228, in <module>
maria_cnxn.execute('INSERT INTO CQCLocationDetailsUpdates2 (LocationId, ProviderId) VALUES (%s,%s) ON DUPLICATE KEY UPDATE ProviderId = VALUES(%s)', row)
File "C:\Users\waynes\PycharmProjects\DRS_Dev\venv\lib\site-packages\mysql\connector\cursor.py", line 548, in execute
stmt = RE_PY_PARAM.sub(psub, stmt)
File "C:\Users\waynes\PycharmProjects\DRS_Dev\venv\lib\site-packages\mysql\connector\cursor.py", line 79, in __call__
"Not enough parameters for the SQL statement")
mysql.connector.errors.ProgrammingError: Not enough parameters for the SQL statement
Your error is because row is a two-element tuple while your SQL statement requires three %s placeholders.
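If you do want to keep the loop, one way (a sketch, not tested against your schema) is to reference the inserted value with VALUES(ProviderId), so only two placeholders are needed:
# two placeholders only; VALUES(ProviderId) refers to the value that would
# have been inserted, so no third parameter is required
for row in rows:
    maria_cnxn.execute(
        'INSERT INTO CQCLocationDetailsUpdates2 (LocationId, ProviderId) '
        'VALUES (%s, %s) '
        'ON DUPLICATE KEY UPDATE ProviderId = VALUES(ProviderId)',
        row
    )
mariadb_connection.commit()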
It is however possible to use an INSERT .. SELECT .. ON DUPLICATE KEY like:
maria_cnxn.execute('''INSERT INTO CQCLocationDetailsUpdates2 (LocationId, ProviderId)
                      SELECT LocationId, ProviderId
                      FROM CQCLocationDetailsUpdates orig
                      ON DUPLICATE KEY UPDATE CQCLocationDetailsUpdates2.ProviderID = orig.ProviderID''')
Whenever you end up doing a loop around a SQL statement you should look to see if there is a SQL way of doing this.
I am streaming tweets to a Postgres database with a Python script (using psycopg2). I would like to be able to schedule this script in the Windows Task Scheduler. The only issue I have to overcome is being able to rename the table in Postgres. Is that possible?
import datetime
import psycopg2

x = datetime.date.today() - datetime.timedelta(days=1)
con = psycopg2.connect("dbname='test' user='postgres'")
cur = con.cursor()
cur.execute("DROP TABLE IF EXISTS schemaname.%s", (x,))
UPDATE
That answer does get me further; now it just complains about the numbers.
Traceback (most recent call last):
File "Z:/deso-gis/scripts/test123.py", line 26, in <module>
cur.execute("DROP TABLE IF EXISTS tweets_days.%s" % x)
psycopg2.ProgrammingError: syntax error at or near ".2016"
LINE 1: DROP TABLE IF EXISTS tweets_days.2016-02-29
I believe you are getting an error at the line
cur.execute("DROP TABLE IF EXISTS schemaname.%s", (x,))
because psycopg2 adapts the parameter as a quoted string literal, not as an identifier, so it generates something like:
DROP TABLE IF EXISTS schemaname.'table_name'
Try using:
cur.execute("DROP TABLE IF EXISTS schemaname.%s" % x)
This is not as secure as it could be, but now the table name is treated as an identifier rather than a quoted SQL string.
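A safer alternative (assuming psycopg2 2.7 or later, which ships the psycopg2.sql module) is to compose the statement with sql.Identifier, so the table name is quoted as an identifier instead of being interpolated into the string:
import datetime
import psycopg2
from psycopg2 import sql

x = datetime.date.today() - datetime.timedelta(days=1)
con = psycopg2.connect("dbname='test' user='postgres'")
cur = con.cursor()

# sql.Identifier double-quotes the name (e.g. "2016-02-29"), which is a valid
# quoted table name even though it starts with digits and contains dashes
cur.execute(
    sql.SQL("DROP TABLE IF EXISTS {}.{}").format(
        sql.Identifier("schemaname"),  # placeholder schema name from the question
        sql.Identifier(str(x)),
    )
)
con.commit()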
After searching until madness, I decided to post a question here.
I'm trying to create a sqlite3 database where I'd like to make use of the secure variable substitution of the cursor.execute(sql, params) function. My function goes like this:
#!/usr/bin/env python3
# -*- coding: utf-8 -*-

import sqlite3

def create():
    values = ("data",)
    sql = "CREATE TABLE IF NOT EXISTS ? ( name TEXT, street TEXT, time REAL, age INTEGER )"
    con = sqlite3.connect("database.db")
    c = con.cursor()
    c.execute(sql, values)
    con.commit()
    c.close()
    con.close()

if __name__ == "__main__":
    create()
I know that the first argument should be the sql command in form of a string and the second argument must be a tuple of the values which are supposed to be substituted where the ? is in the sql string.
However, when I run the file it returns the following error:
$ ./test.py
Traceback (most recent call last):
File "./test.py", line 21, in <module>
create()
File "./test.py", line 14, in create
c.execute(sql, values)
sqlite3.OperationalError: near "?": syntax error
This also happens when paramstyle is set to named (e.g. the :table form).
I can't spot a syntax error here, so I think the problem must be caused somewhere in the system. I tested it on Arch Linux and Debian installs; both give me the same error.
Now it is up to you, as I have no idea anymore where to look for the cause.
SQL parameters can only be used for data values, not table names. That means parameters are not even parsed for DDL statements.
For that you'll have to use string formatting:
sql = "CREATE TABLE IF NOT EXISTS {} ( name TEXT, street TEXT, time REAL, age INTEGER )".format(*values)
As I understand, your parameter is the table name?
So your command would be:
tbl = 'my_table'
sql = "CREATE TABLE IF NOT EXISTS '%s' ( name TEXT, street TEXT, time REAL, age INTEGER )" % tbl