Python/SQLite Command skipped with trigger

I've got some problems with my SQLite database. I have an SQL script like this:
CREATE TABLE Workers(
    Id_worker INT NOT NULL PRIMARY KEY,
    Name VARCHAR(20) NOT NULL,
    Surname VARCHAR(30) NOT NULL,
    Id_job INT NOT NULL, -- REFERENCES Job(Id_job),
    Adress VARCHAR(30) NOT NULL,
    Start_date SMALLDATETIME NOT NULL
);
CREATE TABLE OldWorkers(
    Id_arch INT NOT NULL PRIMARY KEY,
    Name VARCHAR(20) NOT NULL,
    Surname VARCHAR(30) NOT NULL,
    Id_job INT NOT NULL, -- REFERENCES Job(Id_job),
    Adress VARCHAR(30) NOT NULL,
    Start_date SMALLDATETIME NOT NULL,
    Delete_date SMALLDATETIME NOT NULL
);
CREATE TRIGGER OldWorkersTrigger
AFTER DELETE ON Workers
FOR EACH ROW
BEGIN
    INSERT INTO OldWorkers (Id_arch, Name, Surname, Id_job, Adress, Start_date) VALUES (old.Id_arch, old.Name, old.Surname, old.Id_job, old.Adress, old.Start_date, datatime('now'));
END;
I try to run it in Python 2.7.4 with sqlite3 like this:
conn = sqlite3.connect('Company.db')
c = conn.cursor()

fd = open('MyScript.sql', 'r')
sqlFile = fd.read()
fd.close()

# all SQL commands (split on ';')
sqlCommands = sqlFile.split(';')

i = 1
# Execute every command from the input file
for command in sqlCommands:
    # This will skip and report errors
    # For example, if the tables do not yet exist, this will skip over
    # the DROP TABLE commands
    print i, " : ", command
    i = i + 1
    try:
        c.execute(command)
    except OperationalError, msg:
        print "Command skipped: ", msg
But for the command with my trigger it returns: Command skipped: near ")": syntax error.
And after END; it returns: Command skipped: cannot commit - no transaction is active

You are splitting the file at every semicolon, but the CREATE TRIGGER statement has an embedded semicolon.
To check whether a statement is complete, try the sqlite3.complete_statement function.
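For example, a minimal sketch (untested, keeping the question's Python 2 style) that accumulates lines from MyScript.sql and only executes once sqlite3.complete_statement reports a full statement, so the semicolon inside the trigger body no longer splits it; this assumes each statement in the file ends at a line break:

import sqlite3

conn = sqlite3.connect('Company.db')
c = conn.cursor()

buffer = ''
with open('MyScript.sql', 'r') as fd:
    for line in fd:
        buffer += line
        # complete_statement understands CREATE TRIGGER ... BEGIN ... END,
        # so it stays False until the trigger's END; has been read
        if sqlite3.complete_statement(buffer):
            try:
                c.execute(buffer)
            except sqlite3.OperationalError, msg:
                print "Command skipped: ", msg
            buffer = ''

conn.commit()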

The sqlite3 module can also execute a whole script of SQL statements in one go via Cursor.executescript, so you don't have to split the file yourself.
Why not consider the following (untested):
conn = sqlite3.connect('Company.db')
c = conn.cursor()

with open('MyScript.sql', 'r') as fd:
    try:
        c.executescript(fd.read())
    except sqlite3.OperationalError, msg:
        print "Command skipped: ", msg

Related

How to add an action to a mysql, python written database

So I'm trying to make a project for school where a database stores the check_in and check_out time from an RFID card, using an RFID reader.
create table attendance_check(
    id INT UNSIGNED NOT NULL AUTO_INCREMENT UNIQUE,
    date_id DATE NOT NULL DEFAULT CURRENT_DATE,
    user_id INT UNSIGNED NOT NULL,
    name VARCHAR(255) NOT NULL,
    clock_id TIME NOT NULL DEFAULT CURRENT_TIME,
    Action VARCHAR(255) NOT NULL,
    PRIMARY KEY ( id )
);
The database looks like this, and for "Action" I want to add "in" and "out". I managed to add the "in", but I can't figure out how to do a look-up and add an "out".
This is the code so far; I've tried 10 different variations already. I also have a database that stores the users.
while True:
    lcd.clear()
    lcd.message('Place Card to\nrecord attendance')
    id, text = reader.read()
    cursor.execute("Select id, name FROM users WHERE rfid_uid="+str(id))
    result = cursor.fetchone()
    lcd.clear()
    name = result[1]
    number = result[0]
    action1 = "in"
    action2 = "out"
    if cursor.rowcount >= 1:
        lcd.message("Welcome " + name)
        add = ("INSERT INTO attendance_check (user_id, name, Action) VALUES (%s, %s, %s)")
        date = (number, name, action1)
        cursor.execute(add, date)
        db.commit()
    else:
        lcd.message("User does not exist.")
    time.sleep(2)
I've tried using if statements inside that check whether an action is already there and, if it's "in", add "out", but it never worked. A sketch of that idea is below.
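A hedged sketch (untested) of that look-up, using the attendance_check table and cursor from the question; next_action is a hypothetical helper that fetches the user's most recent Action and returns the opposite value:

def next_action(cursor, user_id):
    # look at this user's most recent attendance row
    cursor.execute(
        "SELECT Action FROM attendance_check "
        "WHERE user_id = %s ORDER BY id DESC LIMIT 1",
        (user_id,))
    last = cursor.fetchone()
    # no previous row, or the last action was "out" -> this scan is an "in"
    if last is None or last[0] == "out":
        return "in"
    return "out"

# inside the while loop, after the user has been found:
#   action = next_action(cursor, number)
#   cursor.execute("INSERT INTO attendance_check (user_id, name, Action) VALUES (%s, %s, %s)",
#                  (number, name, action))
#   db.commit()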

Python, Oracle_cx, using query result as parameter list into looping insert statement

In the past I wrote a PL/SQL script that takes a table name and a column name (that indicates source) as arguments and then profiles all the columns in the table, giving useful counts.
I'm currently teaching myself Python and am re-writing that PL/SQL script so it can be executed against other SQL databases, not just Oracle, so I am new to Python. I'm going through Automate the Boring Stuff on Udemy. At the moment I'm not concerned with SQL injection because I'm just learning the language. I have left out the CREATE TABLE statements to reduce the amount of code I'm pasting.
The script inserts the correct records on the first pass of the loop, but it never starts the second pass. Here is the IDLE output, followed by the code.
================================================ RESTART: C:\Users\nathan\Documents\_work\_data_profiling_script\profiling_python_tester.py ================================================
('ETL_INS_DTM',)
insert into PROFILING_NWS6_PRT
select 'PROFILING_NWS6', 'ETL_INS_DTM', SRCRECNO, count(*), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null
from PROFILING_NWS6
group by SRCRECNO
order by 1,2,3
executed
committed
Traceback (most recent call last):
  File "C:\Users\nathan\Documents\_work\_data_profiling_script\profiling_python_tester.py", line 39, in <module>
    for row in cursor:
cx_Oracle.InterfaceError: not a query
import cx_Oracle

conn = cx_Oracle.connect("system", "XXXX", "localhost/xe")
cursor = conn.cursor()

## parameter declaration
##########################################################################
# These 2 parameters populated by user
v_st = 'PROFILING_NWS6'   # Source Table - table in which we are profiling the data
v_srcno = 'SRCRECNO'      # Source Number - numeric column in v_st that identifies the source system

# These 3 parameters automatically populated
v_prt = v_st + '_PRT'     # Profile Report Table - table name we want our report created as
v_log = v_st + '_LOG'     # Log Table - script logging goes here, used for monitoring and debugging
v_top = v_st + '_TOP'     # Top Table - temporary table to hold top 5 counts

# write script that populates Profile Report Table with rows for each source/column combination from source table
# these are required to join to when updating analysis fields
##########################################################################
sql = "Select column_name from user_tab_columns where table_name = '" + v_st + "' and column_name <> '" + v_srcno + "'"
cursor.execute(sql)

for row in cursor:
    print(row)
    sql = """insert into {x_prt}
    select '{x_st}', '{x_row}', {x_srcno}, count(*), null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null
    from {x_st}
    group by {x_srcno}
    order by 1,2,3""".format(x_prt = v_prt, x_srcno = v_srcno, x_st = v_st, x_row = row[0])
    print(sql)
    cursor.execute(sql)
    print('executed')
    cursor.execute('commit')
    print('committed')

# close connections
##########################################################################
cursor.close()
conn.close()
The cursor in for row in cursor: is still in use until the loop completes. When you call cursor.execute(sql) inside the loop, you replace that cursor's pending result set, so on the second iteration the cursor you are iterating over is the one left by the commit inside the loop, and a commit is not a query. The solution is to create and use a different cursor object for the statements inside the loop.
cursor = conn.cursor()         # original cursor, as above
insert_cursor = conn.cursor()  # new one for the inserts

sql = "Select column_name from user_tab_columns where table_name "  # etc
for row in cursor.execute(sql):
    print(row)
    sql = """second sql""".format(...)
    print(sql)
    insert_cursor.execute(sql)
    print('executed')
    insert_cursor.execute('commit')
    print('committed')

cursor.close()
insert_cursor.close()
conn.close()
Also, for row in cursor: could be written as for row in cursor.fetchall():, or just for row in cursor.execute(sql):.
This coding architecture will do a lot of 'round-trips' between Python & the DB, so it will be far from optimal. Small improvements include using connection.autocommit instead of a full SQL commit (or connection.commit() call). Then you could look at using executemany() instead of multiple execute() calls. Overall, for Oracle, just use a PL/SQL call since this will take just one round-trip.
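A hedged sketch (untested) of those small improvements, reusing the variable names from above; the bind-variable names and the trimmed INSERT ... SELECT are illustrative only:

conn.autocommit = True             # each successful execute is committed without a separate round-trip
# ...or leave autocommit off and call conn.commit() once after the loop

select_cursor = conn.cursor()
insert_cursor = conn.cursor()

select_cursor.execute(
    "select column_name from user_tab_columns "
    "where table_name = :tab and column_name <> :col",
    tab=v_st, col=v_srcno)         # bind variables instead of string concatenation

for (column_name,) in select_cursor:
    # same INSERT ... SELECT as in the question, trimmed here for brevity
    insert_cursor.execute(
        "insert into {prt} "
        "select '{st}', '{col}', {src}, count(*) "
        "from {st} group by {src}".format(
            prt=v_prt, st=v_st, col=column_name, src=v_srcno))

# conn.commit()                    # only needed if autocommit stayed off

select_cursor.close()
insert_cursor.close()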

Cannot insert record into table: Incorrect number of bindings supplied

I cannot insert a record into the ShopifyMonitor table (it has 2 fields: id, name).
Full traceback of the error:
File "D:\Related To Python (Tutorials)\Python-Test\Working With Database\goo.py", line 174, in <module>
    c.execute(make_shopify_name, (shopify_name))
sqlite3.ProgrammingError: Incorrect number of bindings supplied. The current statement uses 1, and there are 10 supplied.
get_site = str(input('Enter site here: '))
url = fix_url(get_site)
shopify_name = fix_url(get_site, True)

basepath = os.path.dirname(__file__)
db_name = '{}/shopify.sqlite3'.format(basepath)

sql_create_projects_table = """ CREATE TABLE IF NOT EXISTS ShopifyMonitor (
    id integer PRIMARY KEY AUTOINCREMENT,
    name text UNIQUE NOT NULL
);"""

sql_create_tasks_table = """ CREATE TABLE IF NOT EXISTS Product (
    id integer PRIMARY KEY AUTOINCREMENT,
    product_id text NOT NULL,
    updated_at text NOT NULL,
    title text NOT NULL,
    link_to_product text UNIQUE NOT NULL,
    vendor text NOT NULL,
    sku text NOT NULL,
    quantity text NOT NULL,
    options text,
    price text NOT NULL,
    collection_id text,
    collection_updated text,
    shopify_name text NOT NULL,
    FOREIGN KEY(shopify_name) REFERENCES ShopifyMonitor(name)
);"""

make_shopify_name = '''INSERT INTO ShopifyMonitor(name) VALUES (?) '''

conn = create_connection(db_name)
if conn is not None:
    # create projects table
    create_table(conn, sql_create_projects_table)
    # create tasks table
    create_table(conn, sql_create_tasks_table)
else:
    print("Error! cannot create the database connection.")

c = conn.cursor()
c.execute(make_shopify_name, (shopify_name))
conn.commit()
The issue is subtle:
c.execute(make_shopify_name, (shopify_name))
Should be:
c.execute(make_shopify_name, (shopify_name,)) # note comma after shopify_name
The second parameter passed into execute should be a tuple of parameters for the query - even if there's only one parameter, it still has to be a tuple.
At the moment all you have is parentheses around a variable name - the parentheses will basically be ignored by Python as they don't mean anything.
It's a common misconception that it's the parentheses that make a tuple - it's not, it's the comma:
x = (1) # x is 1
x = 1, # x is a tuple containing a single value, the integer 1
x = (1,) # as above - but the parentheses aren't actually required syntactically here

How to leave out the unnecessary quotation marks when I convert a CSV into MySQL with Python?

I've got a CSV (Excel) file that I want to import into MySQL.
It works fine, but every cell in the MySQL table ends up with unnecessary quotation marks, like this: "".
When the cell is empty I see: ""
When the cell is not empty I see: "somethingdata"
I can't understand why these quotation marks are added when there are none in the CSV file.
Here is my code; I think it is correct too.
connection = MySQLdb.connect(host='localhost',
user='root',
passwd='1234',
db='database')
cursor = connection.cursor()
query = """
CREATE TABLE `test` (
`Megnevezes` varchar(100) DEFAULT NULL,
`2015` varchar(100) DEFAULT NULL,
`2014` varchar(100) DEFAULT NULL,
`2013` varchar(100) DEFAULT NULL,
`2012` varchar(100) DEFAULT NULL,
`2011` varchar(100) DEFAULT NULL,
`ID` int(10) NOT NULL AUTO_INCREMENT,
PRIMARY KEY (`ID`)
) ENGINE=InnoDB AUTO_INCREMENT=32 DEFAULT CHARSET=utf8"""
cursor.execute(query)
connection.commit()
cursor.close()
connection = MySQLdb.connect(host='localhost',
user='root',
passwd='1234',
db='database')
cursor = connection.cursor()
query = """ load data local infile 'C:/Python27/output.csv'
into table test
character set latin1
fields terminated by ';'
lines terminated by '\n'
ignore 1 lines;
"""
cursor.execute(query)
connection.commit()
cursor.close()
Any ideas how I can fix this issue?
https://dev.mysql.com/doc/refman/5.7/en/load-data.html
Use the ENCLOSED BY 'char' clause, where char is the double quote character (").
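For example (untested), assuming the fields in output.csv really are wrapped in double quotes, the LOAD DATA query from above with the extra clause would strip them on import:

query = """ load data local infile 'C:/Python27/output.csv'
    into table test
    character set latin1
    fields terminated by ';' enclosed by '"'
    lines terminated by '\n'
    ignore 1 lines;
"""
cursor.execute(query)
connection.commit()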

How to get columns from a query in python?

I have this query in a Python program, and I need to create a multidimensional array (if that is possible), or four arrays from this query, one for each column.
Can you suggest an elegant way to solve it?
conn = # connection to the server
cursor = conn.cursor()
query = ("select id, name, phone, city from guest")
cursor.execute(query)
results = cursor.fetchall()
for i in results:
    print i
cursor.close()
conn.close()
Not elegant, but it may help to unravel the mysterious Python Connector Cursor Class. It transfers the list of tuples (see Copperfield's comment) returned by the query into a list (phoneList) of dictionaries (entries) describing each row in the database, which might be easier to work with in your Python script:
# ref: https://dev.mysql.com/doc/connector-python/en/connector-python-api-mysqlcursor.html
import mysql.connector

db = 'test'
table = 'phonebook'
phoneList = []

drop_table = ("DROP TABLE IF EXISTS {};").format(table)

# By default, the starting value for AUTO_INCREMENT is 1, and it will increment by 1 for each new record.
# To let the AUTO_INCREMENT sequence start with another value, use the following SQL statement:
# ALTER TABLE phonebook AUTO_INCREMENT=100;
create_table = ("CREATE TABLE {} ("
                "id int NOT NULL AUTO_INCREMENT,"
                "name varchar(30) NOT NULL,"
                "phone varchar(30) NOT NULL,"
                "city varchar(30) NOT NULL,"
                "PRIMARY KEY (id))"
                " ENGINE=InnoDB DEFAULT CHARSET=latin1;").format(table)

Names = {'Bill':  {'phone': '55123123', 'city': 'Melbourne'},
         'Mary':  {'phone': '77111123', 'city': 'Sydney'},
         'Sue':   {'phone': '55888123', 'city': 'Melbourne'},
         'Harry': {'phone': '77777123', 'city': 'Sydney'},
         'Fred':  {'phone': '88123444', 'city': 'Yongala'},
         'Peter': {'phone': '55999123', 'city': 'Melbourne'}}

cnx = mysql.connector.connect(user='mysqluser', password='xxxx', host='127.0.0.1', database=db)
cursor = cnx.cursor(dictionary=True)  # key to using **row format

cursor.execute(drop_table)
cursor.execute(create_table)

# populate db
for name, detail in dict.items(Names):
    sql = ("INSERT INTO {} (name,phone,city) VALUES ('{}','{}','{}')".format(table, name, detail['phone'], detail['city']))
    cursor.execute(sql)

sql = ("SELECT id,name,phone,city FROM {}".format(table))
cursor.execute(sql)

for row in cursor:
    print("{id} {name} {phone} {city}".format(**row))
    phoneList.append(row)

print phoneList[0]['name'], phoneList[0]['city']
print phoneList[3]['name'], phoneList[3]['phone']

for entries in phoneList:  # list of dictionaries
    print entries['name'], entries

for entries in phoneList:
    for k, v in dict.items(entries):
        print k, v
    print "\n"

cnx.close()
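And if the goal of the original question really is four separate arrays (one per column), a short sketch (untested) that uses the question's own query with a plain, non-dictionary cursor:

cursor.execute("select id, name, phone, city from guest")
rows = cursor.fetchall()           # list of (id, name, phone, city) tuples
if rows:
    ids, names, phones, cities = (list(col) for col in zip(*rows))
else:
    ids, names, phones, cities = [], [], [], []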
