I am trying to insert Arduino data into a database through Python, but the insert never happens. I read data in from the serial port assigned to my Arduino and store the first value in the variable arduinoData. In my INSERT statement I am trying to use a %s placeholder to put arduinoData into the table. Here is the code:
import mysql.connector
from mysql.connector import errorcode
from time import sleep
import serial
# Obtain connection string information from the portal
config = {
    'host': 'oursystem.mysql.database.azure.com',
    'user': 'project',
    'password': '',
    'database': 'projectdb'
}
# Construct connection string
try:
    conn = mysql.connector.connect(**config)
    print("Connection established")
except mysql.connector.Error as err:
    if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
        print("Something is wrong with the user name or password")
    elif err.errno == errorcode.ER_BAD_DB_ERROR:
        print("Database does not exist")
    else:
        print(err)
else:
    cursor = conn.cursor()
    ser = serial.Serial('/dev/ttyACM0', 9600)  # Establish the connection on a specific port
    arduinoData = ser.read().strip()
    print(arduinoData)
    # Drop previous table of same name if one exists
    cursor.execute("DROP TABLE IF EXISTS ArduinoData;")
    print("Finished dropping table (if existed).")
    # Create table
    cursor.execute("CREATE TABLE ArduinoData (value VARCHAR(20));")
    print("Finished creating table.")
    # Insert some data into table
    cursor.execute("INSERT INTO ArduinoData (value) VALUES (%s);", (arduinoData))
    print("Inserted", cursor.rowcount, "row(s) of data.")
    # Cleanup
    conn.commit()
    cursor.close()
    conn.close()
    print("Done.")
If I put the %s in single quotes, like '%s', it just inserts that literal text instead of my arduinoData. Can anyone see what is wrong here? Thanks.
I just lost two hours on this: if you're trying to observe what's happening to your database with phpMyAdmin, note that none of your INSERT statements will be visible until you commit them:
connection.commit()
Simply pass a tuple, (arduinoData,) (a trailing comma inside the parentheses makes it a one-element tuple), or a list, [arduinoData], and not a bare value, (arduinoData), in your parameterization:
cursor.execute("INSERT INTO ArduinoData (`value`) VALUES (%s);",(arduinoData,))
However, if arduinoData is a list of multiple values, use executemany and pass one parameter tuple per row. Also, escape value, which is a MySQL keyword:
cursor.executemany("INSERT INTO ArduinoData (`value`) VALUES (%s);", [(item,) for item in arduinoData])
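For example, a minimal sketch of both forms (the readings below are made-up values, not from the question):
single_reading = "23.5"
cursor.execute("INSERT INTO ArduinoData (`value`) VALUES (%s);", (single_reading,))   # one row: note the trailing comma
several_readings = ["23.5", "24.1", "22.9"]
cursor.executemany("INSERT INTO ArduinoData (`value`) VALUES (%s);",
                   [(reading,) for reading in several_readings])   # one 1-tuple per row
conn.commit()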
I may have interpreted this wrong, but shouldn't there be something like certain_value = '%s'? Otherwise it doesn't know what it is looking for.
I have just figured out what was wrong. Using Parfait's suggestion of passing a tuple, I changed my insert statement from cursor.execute("INSERT INTO ArduinoData (value) VALUES (%s);", (arduinoData)) to cursor.execute("INSERT INTO ArduinoData (value) VALUES (%s);", (arduinoData,)). Thanks to everyone who answered, you were all a great help! :D
I am inserting JSON data into a MySQL database. I am parsing the JSON and then inserting it into the MySQL db using the Python connector. Through trial and error, I can see the error is associated with this piece of code:
for steps in result['routes'][0]['legs'][0]['steps']:
    query = ('SELECT leg_no FROM leg_data WHERE travel_mode = %s AND Orig_lat = %s AND Orig_lng = %s AND Dest_lat = %s AND Dest_lng = %s AND time_stamp = %s')
    if steps['travel_mode'] == "pub_tran":
        travel_mode = steps['travel_mode']
        Orig_lat = steps['var_1']['dep']['lat']
        Orig_lng = steps['var_1']['dep']['lng']
        Dest_lat = steps['var_1']['arr']['lat']
        Dest_lng = steps['var_1']['arr']['lng']
        time_stamp = leg['_sent_time_stamp']
    if steps['travel_mode'] == "a_pied":
        query = ('SELECT leg_no FROM leg_data WHERE travel_mode = %s AND Orig_lat = %s AND Orig_lng = %s AND Dest_lat = %s AND Dest_lng = %s AND time_stamp = %s')
        travel_mode = steps['travel_mode']
        Orig_lat = steps['var_2']['lat']
        Orig_lng = steps['var_2']['lng']
        Dest_lat = steps['var_2']['lat']
        Dest_lng = steps['var_2']['lng']
        time_stamp = leg['_sent_time_stamp']
    cursor.execute(query, (travel_mode, Orig_lat, Orig_lng, Dest_lat, Dest_lng, time_stamp))
    leg_no = cursor.fetchone()[0]
    print(leg_no)
I have inserted the higher-level details and am now searching the database to associate this lower-level information with its parent. The only way to find this unique value is to search via the origin and destination coordinates together with the time_stamp. I believe the logic is sound, and by printing leg_no immediately after this section I can see values which, at first inspection, appear to be correct.
However, when this is added to the rest of the code, it causes subsequent sections, where more data is inserted using the cursor, to fail with this error -
raise errors.InternalError("Unread result found.")
mysql.connector.errors.InternalError: Unread result found.
The issue seems similar to MySQL Unread Result with Python
Is the query too complex and in need of splitting, or is there another issue?
If the query is indeed too complex, can anyone advise how best to split this?
EDIT As per @Gord's help, I've tried to dump any unread results:
cursor.execute(query, (leg_travel_mode, leg_Orig_lat, leg_Orig_lng, leg_Dest_lat, leg_Dest_lng))
leg_no = cursor.fetchone()[0]
try:
    cursor.fetchall()
except mysql.connector.errors.InterfaceError as ie:
    if ie.msg == 'No result set to fetch from.':
        pass
    else:
        raise
cursor.execute(query, (leg_travel_mode, leg_Orig_lat, leg_Orig_lng, leg_Dest_lat, leg_Dest_lng, time_stamp))
But, I still get
raise errors.InternalError("Unread result found.")
mysql.connector.errors.InternalError: Unread result found.
[Finished in 3.3s with exit code 1]
scratches head
EDIT 2 - when I print ie.msg, I get -
No result set to fetch from
All that was required was for buffered to be set to true!
cursor = cnx.cursor(buffered=True)
The reason is that without a buffered cursor the results are loaded "lazily", meaning that fetchone only fetches one row from the full result set of the query. When you use the same cursor again, it complains that you still have n-1 results (where n is the size of the result set) waiting to be fetched. With a buffered cursor, however, the connector fetches ALL rows behind the scenes, you just take one from that buffer, and the MySQL server won't complain.
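For example, a minimal sketch using the question's leg_data table (assuming it contains at least one row):
# With a buffered cursor the connector pulls the whole result set up front,
# so the same cursor and connection can run the next query straight away.
cursor = cnx.cursor(buffered=True)
cursor.execute("SELECT leg_no FROM leg_data")
leg_no = cursor.fetchone()[0]                      # take just the first row
cursor.execute("SELECT COUNT(*) FROM leg_data")    # no "Unread result found" here
count = cursor.fetchone()[0]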
I was able to recreate your issue. MySQL Connector/Python apparently doesn't like it if you retrieve multiple rows and don't fetch them all before closing the cursor or using it to retrieve some other stuff. For example
import mysql.connector

cnxn = mysql.connector.connect(
    host='127.0.0.1',
    user='root',
    password='whatever',
    database='mydb')
crsr = cnxn.cursor()
crsr.execute("DROP TABLE IF EXISTS pytest")
crsr.execute("""
    CREATE TABLE pytest (
        id INT(11) NOT NULL AUTO_INCREMENT,
        firstname VARCHAR(20),
        PRIMARY KEY (id)
    )
    """)
crsr.execute("INSERT INTO pytest (firstname) VALUES ('Gord')")
crsr.execute("INSERT INTO pytest (firstname) VALUES ('Anne')")
cnxn.commit()
crsr.execute("SELECT firstname FROM pytest")
fname = crsr.fetchone()[0]
print(fname)
crsr.execute("SELECT firstname FROM pytest")  # InternalError: Unread result found.
If you only expect (or care about) one row then you can put a LIMIT on your query
crsr.execute("SELECT firstname FROM pytest LIMIT 0, 1")
fname = crsr.fetchone()[0]
print(fname)
crsr.execute("SELECT firstname FROM pytest") # OK now
or you can use fetchall() to get rid of any unread results after you have finished working with the rows you retrieved.
crsr.execute("SELECT firstname FROM pytest")
fname = crsr.fetchone()[0]
print(fname)
try:
    crsr.fetchall()  # fetch (and discard) remaining rows
except mysql.connector.errors.InterfaceError as ie:
    if ie.msg == 'No result set to fetch from.':
        # no problem, we were just at the end of the result set
        pass
    else:
        raise
crsr.execute("SELECT firstname FROM pytest")  # OK now
cursor.reset() is really what you want.
fetchall() is not good because you may end up moving unnecessary data from the database to your client.
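A minimal sketch, reusing the pytest table from the answer above:
crsr.execute("SELECT firstname FROM pytest")
fname = crsr.fetchone()[0]
print(fname)
crsr.reset()                                   # discard whatever is still unread on this cursor
crsr.execute("SELECT firstname FROM pytest")   # OK now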
The problem is the buffer: perhaps you disconnected from the previous MySQL connection and now it cannot execute the next statement. There are two ways to give a buffer to the cursor. First, only for a particular cursor, using the following command:
import mysql.connector
cnx = mysql.connector.connect()
# Only this particular cursor will buffer results
cursor = cnx.cursor(buffered=True)
Alternatively, you can enable buffering for every cursor you create from the connection:
import mysql.connector

# All cursors created from cnx2 will be buffered by default
cnx2 = mysql.connector.connect(buffered=True)
cursor = cnx2.cursor()
If you were disconnected from MySQL, the latter approach is the one that works for you. Enjoy coding!
If you want only one result from a query, and you want to reuse the same connection for other queries afterwards, limit your SQL SELECT to a single row by adding "LIMIT 1" at the end of the statement.
ex: "SELECT field FROM table WHERE x = 1 LIMIT 1;"
This method is faster than using "buffered=True".
Set the consume_results argument on the connect() method to True.
cnx = mysql.connector.connect(
    host="localhost",
    user="user",
    password="password",
    database="database",
    consume_results=True
)
Now instead of throwing an exception, it basically does fetchall().
Unfortunately, this still makes it slow if you have a lot of unread rows.
There is also the possibility that your connection to MySQL Workbench has been dropped. Establish the connection again, then call
cursor.reset()
and then create your tables and load your entries. This solved the problem for me.
Would creating the cursor within the for loop, executing it, and then closing it again inside the loop help? Like this:
for steps in result['routes'][0]['legs'][0]['steps']:
    cursor = cnx.cursor()
    ....
    leg_no = cursor.fetchone()[0]
    cursor.close()
    print(leg_no)
For our semester project we need to insert data from a stream that is updated every 3.5 s into two separate SQL tables, and this has to be done with a Python script.
The idea is to have two rooms (Office, Server) that each have a combined temperature and humidity sensor, while the server room also has a smoke detector.
The data is sent from an Arduino over USB, and the data stream looks like this:
Server:61.20,22.70,221.00Office:64.00,23.00
The Python script I've managed to cobble together looks like this:
import serial
import time
import mysql.connector

mydb = mysql.connector.connect(
    host="127.0.0.1",
    user="root",
    password="",
    database="messwerte"
)
mycursor = mydb.cursor()

device = 'COM3'
try:
    arduino = serial.Serial(device, 9600)
except:
    print("Error: could not open {}".format(device))

while True:
    try:
        time.sleep(2)
        data = arduino.readline()
        print(data)
        pieces = data.split(" ")
        try:
            mycursor.execute("INSERT INTO dht11serial (humidity,temperature,CO2) VALUES (%s,%s,%s)", (pieces[0], pieces[1], pieces[2]))
            mydb.commit()
            mycursor.close()
        except mysql.connector.IntegrityError as err:
            print("Error: {}".format(err))
    except Exception as err:
        print("Error: {}".format(err))
Now I need to insert the values for each room into the SQL table that corresponds to that room, but how can I manage that?
Please keep in mind that I have actually no idea what I'm doing. It's my very first time doing anything with Python or SQL.
I created this script based on the data stream you posted. I used the same configuration for the Arduino connection. The script will generate the database file and the two tables if they do not exist, so there is no need to worry about losing data each time you run the script.
It should work right away (I hope so); if not, you only need to adjust the data stream. I put some comments in the code to help you out. Basically, you just need to create two tables inside the same database and save the data twice.
I changed the SQL engine to sqlite3, since the data will be stored locally.
Put the code inside a .py file and run it! Tell me if it works.
import serial
import time
import sqlite3

#Database file is stored in the same folder where the script is located.
#The file will create itself if it does not exist
conn = sqlite3.connect("./data.sqlite3")
cursor = conn.cursor()

#Server table inside data.sqlite3
server = """CREATE TABLE IF NOT EXISTS server (
    ID INTEGER PRIMARY KEY AUTOINCREMENT,
    humidity VARCHAR(255),
    temperature VARCHAR(255),
    CO2 VARCHAR(255))"""

#Office table inside data.sqlite3
office = """CREATE TABLE IF NOT EXISTS office (
    ID INTEGER PRIMARY KEY AUTOINCREMENT,
    humidity VARCHAR(255),
    temperature VARCHAR(255),
    CO2 VARCHAR(255))"""

#Creates tables inside database, nothing will happen if the tables already exist
#You can comment out these lines, use them only if you are creating a new file
cursor.execute(server)
cursor.execute(office)
conn.commit()

#Arduino connection
#The app makes no sense without this, so allow it to crash if there is no connection.
arduino = serial.Serial('COM3', 9600)

#Forever
while True:
    #Catching errors inside a try-except block keeps the loop going forever,
    #even if the Arduino connection gets lost
    #Rest 3.5 s each loop
    time.sleep(3.5)
    try:
        #Server:61.20,22.70,221.00Office:64.00,23.00,220.00
        #readline() returns bytes, so decode it to a string first
        data = arduino.readline().decode().strip()
        #Remove Server: and Office: from the data string
        #You need to leave a comma where Office: is located to separate both values
        cleaned_data = data.replace("Server:", "").replace("Office:", ",")
        #Cleaned data looks like this: 61.20,22.70,221.00,64.00,23.00,220.00
        #We know the first 3 values are from the server, the last 3 are from the office
        #Split the string into elements using commas
        pieces = cleaned_data.split(",")
        #[ 61.20, 22.70, 221.00, 64.00, 23.00, 220.00 ]
        print(pieces)
        #Send data to the server table
        cursor.execute("INSERT INTO server (humidity,temperature,CO2) VALUES (?,?,?)", (pieces[0], pieces[1], pieces[2]))
        #Send data to the office table
        cursor.execute("INSERT INTO office (humidity,temperature,CO2) VALUES (?,?,?)", (pieces[3], pieces[4], pieces[5]))
        #Write to database confirmation
        conn.commit()
        #There is no need to close the connection to the database
    #Prints out what caused the error
    except Exception as e:
        print(e)
I'm having problems returning auto-incremented ID columns from a MySQL database using the MySQLdb Python library.
I have something like:
sql = """INSERT INTO %s (%s) VALUES (\"%s\")""" %(tbl, colsf, valsf)
try:
cursor.execute(sql)
id = cursor.lastrowid
db.close()
except:
print "Failed to add to MySQL database: \n%s" %sql
print sys.exc_info()
db.close()
exit()
However, lastrowid seems to be returning incorrect values. For instance, I've tried printing out various id columns from the MySQL command line, which shows them to be empty, but the lastrowid value keeps increasing by 1 every time the Python script is run. Any ideas?
It turned out that the values weren't being committed to the MySQL database; adding a "db.commit()" call solves the problem.
sql = """INSERT INTO %s (%s) VALUES (\"%s\")""" %(tbl, colsf, valsf)
try:
cursor.execute(sql)
id = cursor.lastrowid
cursor.close()
db.commit()
db.close()
except:
print "Failed to add to MySQL database: \n%s" %sql
print sys.exc_info()
db.close()
exit()
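As an aside, a sketch of the same pattern with a parameterized query (the table and column names below are just placeholders) leaves the quoting to the driver and still gives you the auto-increment ID:
sql = "INSERT INTO sometable (somecol) VALUES (%s)"   # hypothetical table/column
cursor.execute(sql, (valsf,))                         # MySQLdb also uses %s placeholders
db.commit()                                           # commit so the row actually persists
new_id = cursor.lastrowid                             # auto-increment ID from this INSERT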
Is there an exception I can catch around my pymysql insert statement below so that I can skip the value and keep looping if it already exists?
My code looks like:
try:
    with db.cursor() as cursor:
        sqltld = "INSERT INTO `table`(`colname`) VALUES (%s)"
        cursor.execute(sqltld, (self.url))
        db.commit()
except:
    print("error INSERTing url")
    db.rollback()
MySQL is already enforcing a UNIQUE constraint on the column colname.
Thanks
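One possible way to handle this (a sketch using the names from your snippet, not from the original thread): a duplicate on a UNIQUE column makes pymysql raise pymysql.err.IntegrityError, so you can catch that specifically and keep looping. MySQL's INSERT IGNORE would be an alternative that skips duplicates without raising at all.
import pymysql

try:
    with db.cursor() as cursor:
        sqltld = "INSERT INTO `table` (`colname`) VALUES (%s)"
        cursor.execute(sqltld, (self.url,))
        db.commit()
except pymysql.err.IntegrityError:
    # colname is UNIQUE, so a duplicate value ends up here; skip it and move on
    db.rollback()
except Exception as e:
    print("error INSERTing url:", e)
    db.rollback()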
In the mysql client, the following query works perfectly:
INSERT INTO VOLUMES (VOLUMEID, NAME, TYPE, SERVERID,SIZE, DEVICENAME, CREATIONDATE) VALUES ('vol-b67d73b7', 'TBC','gp2', 'i-7d445a89', '8', '/dev/sda1', '2014-11-24T07:40:37.921Z');
However when I use the same query in my python script I get the following error:
1265: Data truncated for column 'CREATIONDATE' at row 1
Here is the python code:
try:
    db = mysql.connector.connect(**config)
    cursor = db.cursor()
    for volume in volumes:
        ins_stmt = ("INSERT INTO VOLUMES (VOLUMEID, NAME, TYPE, SERVERID,"
                    "SIZE, DEVICENAME, CREATIONDATE) VALUES ('{0}', 'TBC',"
                    "'{1}', '{2}', '{3}', '{4}', '{5}');")
        to_execute = ins_stmt.format(volume.id, volume.type,
                                     volume.attach_data.instance_id, volume.size,
                                     volume.attach_data.device, volume.create_time)
        cursor.execute(to_execute)
except mysql.connector.Error as err:
    if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
        print("Something is wrong with your user name or password")
    elif err.errno == errorcode.ER_BAD_DB_ERROR:
        print("Database does not exist")
    else:
        print(err)
else:
    db.close()
I don't understand what could be wrong. Can anyone help?
You need to use SQL parameters and have the database adapter handle quoting for you:
ins_stmt = ("INSERT INTO VOLUMES (VOLUMEID, NAME, TYPE, SERVERID,"
            "SIZE, DEVICENAME, CREATIONDATE) VALUES (%s, 'TBC',"
            "%s, %s, %s, %s, %s);")
params = (volume.id, volume.type,
          volume.attach_data.instance_id, volume.size,
          volume.attach_data.device, volume.create_time)
cursor.execute(ins_stmt, params)
Not only does this ensure that you don't run into problems with pre-existing quoting characters in your strings, it also avoids SQL injection attacks (where a user crafts data that breaks out of the quoting and issues additional SQL statements you did not intend to allow). Don't let Little Bobby Tables get you!
Next, make sure your timestamps are in a format that MySQL supports. Your timestamp has a 3-digit fraction and a timezone specifier, while MySQL 5.6.4 and up appears to expect at most a 6-digit fraction, a space instead of the T, and no timezone, so perhaps you should strip that portion from your volume.create_time string:
params = (volume.id, volume.type,
          volume.attach_data.instance_id, volume.size,
          volume.attach_data.device,
          volume.create_time.replace('T', ' ').partition('.')[0])
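Alternatively, a sketch that parses the string into a datetime object first (assuming the timestamps always look like the example above); Connector/Python will then format the value for MySQL itself:
from datetime import datetime

created = datetime.strptime(volume.create_time, '%Y-%m-%dT%H:%M:%S.%fZ')
params = (volume.id, volume.type,
          volume.attach_data.instance_id, volume.size,
          volume.attach_data.device, created)
cursor.execute(ins_stmt, params)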
The message you are seeing tells you that MySQL won't accept the date as formatted. Your MySQL client is either only issuing a warning (instead of an error), or it is configured to allow invalid dates (ALLOW_INVALID_DATES is set). If a warning was issued, you need to ask MySQL for it with SHOW WARNINGS in the client.
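You can also surface those warnings from Python instead of the client (a sketch; get_warnings and fetchwarnings() are standard Connector/Python features):
cnx = mysql.connector.connect(**config)
cnx.get_warnings = True          # ask the connector to fetch warnings after each statement
cursor = cnx.cursor()
cursor.execute(ins_stmt, params)
print(cursor.fetchwarnings())    # a list of (level, code, message) tuples, or None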