Python/MySQL, No errors but data not inserting

I have the code shown below, which reads 4 lines of data over a serial connection. Each line is assigned to a variable, and an attempt is then made to insert the values into a local database. However, once the code has run, there is no new data in the database.
I have added print statements to check that the data is definitely being received, and it is. I have also successfully inserted data into the database from the terminal, but those were static values such as 10.0, 10.0, 0, 10.
import MySQLdb
import serial
import time

ser = serial.Serial('/dev/ttyACM0', 115200)

conn = MySQLdb.connect(host="localhost", user="JP", passwd="password", db="serialdb")
cursor = conn.cursor()

while 1:
    print "waiting for data"
    print ""
    xs = ser.readline()
    print xs
    time.sleep(1)
    ys = ser.readline()
    print ys
    time.sleep(1)
    zs = ser.readline()
    print zs
    time.sleep(1)
    vs = ser.readline()
    print vs
    time.sleep(1)

    try:
        x = float(xs)
    except ValueError:
        pass
    try:
        y = float(xs)
    except ValueError:
        pass
    try:
        z = float(xs)
    except ValueError:
        pass
    v = int(vs)

    print "inserting into database"
    print ""
    time.sleep(1)

    sql = "INSERT INTO Arduino_Data(Temperature, Humidity, RPM, Distance) VALUES (%f, %f, %f, %d)" % (x, y, z, v)
    cursor.execute(sql)
    conn.commit
    break

commit is a method, and you are not calling it :)
conn.commit()
That should do it
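For reference, a minimal sketch of the corrected tail of the loop once commit() is actually called, also switching to a parameterized query so the driver handles the value formatting (the table and column names come from the question; the rest is illustrative):

sql = ("INSERT INTO Arduino_Data (Temperature, Humidity, RPM, Distance) "
       "VALUES (%s, %s, %s, %s)")
cursor.execute(sql, (x, y, z, v))
conn.commit()  # commit() is a method call; without the parentheses nothing is written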

Related

pyodbc: print datatype in python

I just started learning Python and got into pyodbc. I would like to print the datatypes of the columns just once, above the actual data. But right now it either prints the datatypes on every pass through the loop or the program shuts down. I'm not sure exactly what my mistake is.
def selectPlace(dbc):
    cursor = dbc.cursor()
    try:
        cursor.execute('SELECT Nr, address from Place')
    except:
        print ('Error')
        cursor.close()
        return
    print ('\nPlaces:')
    i = 0
    for row in cursor:
        if i == 0:
            print('Datatypes: ', type(row[0]), type(row[1]))
            i = 1
        print (row[0], row[1])
        # print type(row[0])
    cursor.close()
    x = input('Input : ')
    return x
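One way to print the column metadata exactly once, regardless of how many rows come back, is to read it from cursor.description after execute() rather than from the first row. A sketch, assuming the same Place table and pyodbc's standard DB-API cursor attributes:

def selectPlace(dbc):
    cursor = dbc.cursor()
    try:
        cursor.execute('SELECT Nr, address FROM Place')
    except Exception as e:
        print('Error:', e)
        cursor.close()
        return
    # description holds one (name, type_code, ...) tuple per column,
    # so the types are available once per query instead of once per row
    print('Datatypes:', [col[1] for col in cursor.description])
    print('\nPlaces:')
    for row in cursor:
        print(row[0], row[1])
    cursor.close()
    return input('Input : ')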

sqlite3.OperationalError: database is locked - How to avoid this?

I'm using an open source piece of Python code that basically pulls in the location of an entity and saves those details to a DB in real time; let's call it the scanner program. The DB file it saves to is an SQLite file: db.sqlite.
While this is happening, my code in question searches the DB file every 45 seconds, performing a SELECT statement to find a certain value. This works a couple of times, but after running concurrently with the scanner program for a couple of minutes they run into a DB lock error:
sqlite3.OperationalError: database is locked
So what can I do in my code to ensure this lock does not happen? I cannot change how the scanner program accesses the DB, only my program.
Any help here would be great. I've seen timeouts mentioned, along with threading, but I'm not sure about either.
from datetime import datetime
import sqlite3
import time
import json
import tweepy

def get_api(cfg):
    auth = tweepy.OAuthHandler(cfg['consumer_key'], cfg['consumer_secret'])
    auth.set_access_token(cfg['access_token'], cfg['access_token_secret'])
    return tweepy.API(auth)

# Fill in the values noted in previous step here
cfg = {
    "consumer_key" : "X",
    "consumer_secret" : "X",
    "access_token" : "X",
    "access_token_secret" : "X"
}

with open('locales/pokemon.en.json') as f:
    pokemon_names = json.load(f)

currentid = 1
pokemonid = 96  # test

while 1 == 1:
    conn = sqlite3.connect('db.sqlite')
    print "Opened database successfully";
    print "Scanning DB....";
    time.sleep(1)
    cur = conn.execute("SELECT * FROM sightings WHERE pokemon_id = ? and id > ?", (pokemonid, currentid))
    row = cur.fetchone()
    if row is None:
        print "No Pokemon Found \n "
        time.sleep(1)
    while row is not None:
        #get pokemon name
        name = pokemon_names[str(pokemonid)]
        #create expiry time
        datestr = datetime.fromtimestamp(row[3])
        dateoutput = datestr.strftime("%H:%M:%S")
        #create location
        location = "https://www.google.com/maps/place/%s,%s" % (row[5], row[6])
        #inform user
        print "%s found! - Building tweet! \n" % (name)
        time.sleep(1)
        #create tweet
        buildtweet = "a wild %s spawned in #Dublin - It will expire at %s. %s #PokemonGo \n " % (name, dateoutput, location)
        #print tweet
        #log
        print buildtweet
        currentid = row[0]
        time.sleep(1)
        #send tweet
        api = get_api(cfg)
        tweet = buildtweet
        try:
            status = api.update_status(status=tweet)
            print "sent!"
        except:
            pass
            print "this tweet failed \n"
        time.sleep(30)
        row = cur.fetchone()
    cur.close()
    conn.close()
    print "Waiting..... \n "
    time.sleep(45)
conn.close()
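The timeout that gets mentioned is the second argument to sqlite3.connect(): it tells the reading side how long to keep retrying while the writer holds the lock instead of failing immediately. A minimal sketch of the reading side, reusing the query from the code above (the 30-second value is just an example), and closing the connection promptly so the reader never holds its own lock while sleeping:

import sqlite3

conn = sqlite3.connect('db.sqlite', timeout=30)  # retry for up to 30s if the DB is locked
try:
    cur = conn.execute(
        "SELECT * FROM sightings WHERE pokemon_id = ? AND id > ?",
        (pokemonid, currentid))
    rows = cur.fetchall()  # read everything, then release the connection
finally:
    conn.close()
# build and send the tweets from `rows` only after the connection is closed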

Python Multi-threaded socket listener error with threads not releasing

I have 500+ units around the world that connect to my server and dump their data. Up to now I have been using a PHP script as a socket listener, but I need to go multi-threaded as the load is increasing and PHP cannot keep up. I am quite new to Python and decided to use it for my new platform. Over the past few days I have struggled and tried many examples to no avail. Through my search I came across some questions touching on this problem, but none answered it. I will attach my code.
The problem: as the units connect and the server accepts them, the server creates a new thread to handle each connection. The problem comes when a unit drops the connection: the thread stays open and active, the total thread count grows, and this runs into the system limit on the number of open files. I can increase this limit, but that only makes it a time bomb; it does not solve the problem.
Please help.
#! /usr/bin/python
import multiprocessing
import socket
import sys
import pprint
import datetime
import MySQLdb
import time
import datetime
import re
import select
import resource
import threading

max_connections = 1024
max_connections_set = max_connections
resource.setrlimit(resource.RLIMIT_NOFILE, (max_connections_set, max_connections_set))

#the incomming port
the_global_port = xxxx #(any port)
#display id
the_global_id = "UNIT TYPE"

class ConnectionObject(object):
    the_id = None
    the_socket = None
    the_socket_address = None
    the_socket_address_ip = None
    the_socket_address_port = None
    the_server = None
    the_process = None
    the_process_id = None
    the_process_name = None
    the_imei = None
    identifier = ""

    # The class "constructor" - It's actually an initializer
    def __init__(self, in_process_nr, in_process, in_socket, in_socket_address, in_server):
        self.the_id = in_process_nr
        self.the_process = in_process
        self.the_process_id = self.the_process.exitcode
        self.the_process_name = self.the_process.name
        self.the_socket = in_socket
        self.the_socket_address = in_socket_address
        self.the_socket_address_ip = self.the_socket_address[0]
        self.the_socket_address_port = self.the_socket_address[1]
        self.the_server = in_server
        self.identifier = str(self.the_id) + " " + str(self.the_process_name) + " " + str(self.the_socket_address_ip) + " " + str(self.the_socket_address_port) + " "
    #end def init
#end def class

def processData(the_connection_object, the_data):
    def mysql_open_connection_inside():
        try:
            the_conn = MySQLdb.connect(host="127.0.0.1",
                                       user="user",
                                       passwd="password",
                                       db="mydb")
        except MySQLdb.Error, e:
            print "Error %d: %s" % (e.args[0], e.args[1])
            time.sleep(30)
            try:
                the_conn = MySQLdb.connect(host="127.0.0.1",
                                           user="user",
                                           passwd="password",
                                           db="mydb")
            except MySQLdb.Error, e:
                print "Error %d: %s" % (e.args[0], e.args[1])
                print "Unexpected error:", sys.exc_info()[0]
                raise
                sys.exit(0)
            #end time 2
        #end try except
        return the_conn
    #end def mysql_open_connection

    conn = mysql_open_connection_inside()
    x = conn.cursor()
    add_rawdata = ("INSERT INTO RawData"
                   "(ID,RawData,Type) "
                   "VALUES (%s, %s, %s)")
    data_raw = ('123456', 'ABCD', '')
    records_inserted = 0
    the_connection_object.the_imei = ""
    #global clients
    imei = ""
    try:
        thedata = ""
        thedata = " ".join("{:02x}".format(ord(c)) for c in the_data)
        record_to_save = ' '.join(thedata)
        seqtoreply = ""
        seqtoreply = "OK"
        #reply part
        if (seqtoreply != ""): #reply to unit
            try:
                the_connection_object.the_socket.sendall(seqtoreply)
                #echoout(the_connection_object.identifier+"We Replyed With : " + seqtoreply)
            except:
                echoout(the_connection_object.identifier+"Send Reply Error : " + str(sys.exc_info()[1]))
            #end except
        #end of if
        the_id = "some generated id"
        data_raw = (the_id, werk_data, 'UNIT')
        try:
            x.execute(add_rawdata, data_raw)
            conn.commit()
            echoout(the_connection_object.identifier+"Raw Data Saved.")
        except:
            conn.rollback()
            echoout(the_connection_object.identifier+" Raw Data NOT Saved : " + str(sys.exc_info()[1]))
        #end of data save insert
        #echoout("=============================")
        endme = 1
        echoout("")
        conn.close()
    #end try
    except:
        conn.close()
        echoout(the_connection_object.identifier+"Error : " + str(sys.exc_info()[1]))
    #end try except
#end def handel function

def handle_p(processnr, server, connection, address):
    this_connection = ConnectionObject(processnr, multiprocessing.current_process(), connection, address, server)
    thisprocess = multiprocessing.current_process()
    this_connection.the_id = ""
    the_address = this_connection.the_socket_address_ip
    the_port = this_connection.the_socket_address_port
    try:
        echoout("New connection from : "+str(the_address)+" on port "+str(the_port))
        close_the_socket = False
        while True:
            #--------------------- recive part -------------------------------------------------
            data = connection.recv(512)
            thedata = ""
            thedata = " ".join("{:02x}".format(ord(c)) for c in data)
            if ((thedata == "") or (thedata == " ") or (data == False)):
                echoout("Socket Closed Remotely : No Data")
                close_the_socket = True
                break
            #end - if data blank
            else:
                processData(this_connection, data)
            #end there is data
            echoout("=============================")
        #end if while true
    #end try
    except:
        print "handling request, Error : " + str(sys.exc_info()[1])
        close_the_socket = True
        connection.close()
    finally:
        close_the_socket = True
        echoout("Closing socket")
        connection.close()
    #end try finally
#end def handel function

def mysql_update(update_statement, update_values):
    conn_update = MySQLdb.connect(host="127.0.0.1",
                                  user="user",
                                  passwd="password",
                                  db="mydb")
    x_update = conn_update.cursor(MySQLdb.cursors.DictCursor)
    rawdata_data = (update_statement)
    data_rawdata = (update_values)
    allupdateok = False
    #end if there is more
    try:
        x_update.execute(rawdata_data, data_rawdata)
        conn_update.commit()
        allupdateok = True
        conn_update.close()
    except:
        conn_update.rollback()
        allupdateok = False
        conn_update.close()
        print "Unexpected error:", sys.exc_info()[0]
        raise
    #end of data save insert
    if (allupdateok == False):
        echoout("Update Raw Data Table Error")
    #end if update
    return allupdateok
#end def mysqlupdate

def echoout(thestring):
    datestring = datetime.datetime.now().strftime("%Y-%m-%d %H:%M:%S")
    if (thestring != ""):
        outstring = datestring + " " + thestring
        print outstring
    else:
        outstring = thestring
        print outstring
#end - def echoout

class Server(object):
    threads = []
    all_threads = []
    high_proc = ""

    def __init__(self, hostname, port):
        self.hostname = hostname
        self.port = port

    def start(self):
        echoout("Listening for conncetions")
        self.socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        self.socket.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        self.socket.bind((self.hostname, self.port))
        self.socket.listen(10)
        process_number = 1
        inputs = [self.socket]
        while True:
            inready, outready, excready = select.select(inputs, [], [], 30);
            for s in inready:
                if s == self.socket:
                    conn, address = self.socket.accept()
                    high_processname = ""
                    echoout("Got a connection...")
                    process = threading.Thread(target=handle_p, args=(process_number, self, conn, address))
                    high_processname = process.name
                    self.high_proc = high_processname
                    process.daemon = True
                    process.start()
                    self.all_threads.append((process, conn))
                    ts = time.time()
                    st = datetime.datetime.fromtimestamp(ts).strftime('%Y-%m-%d %H:%M:%S')
                    self.threads.append((process_number, conn, st, 0, process))
                    process_number = process_number + 1
                    print "ACTIVE Threads = " + str(threading.activeCount())
                    the_total_count = 0
                    dead_count = 0
                    alive_count = 0
                    for the_thread in self.all_threads:
                        if (the_thread[0].is_alive() == True):
                            alive_count = alive_count + 1
                        else:
                            dead_count = dead_count + 1
                            the_thread[1].close()
                            the_thread[0].join(0.3)
                            self.all_threads.pop(the_total_count)
                        #end if alive else
                        the_total_count = the_total_count + 1
                    #end for threads
                    print "LIVE Threads = " + str(alive_count)
                    print "DEAD Threads = " + str(dead_count)
                    print "TOTAL Threads = " + str(the_total_count)
                    print ""
                #end if s = socke, new connection
            #end for loop
        #end while truw
        self.socket.close()
    #end def start

#main process part
if __name__ == "__main__":
    start_ip = "0.0.0.0"
    start_port = the_global_port
    #start server
    server = Server(start_ip, start_port)
    try:
        print "Listening on ", start_port
        server.start()
    except:
        print "unexpected, Error : " + str(sys.exc_info()[1])
    finally:
        print "shutting down"
        active_clients = 0
        for process in multiprocessing.active_children():
            try:
                active_clients = active_clients + 1
                process.terminate()
                #process.join()
            except:
                print "Process not killed = " + str(sys.exc_info()[1])
            #end try except
        #close mysql connection
        print "Active clients = " + str(active_clients)
    #end try finally
    server.socket.close()
    server.threads = []
    server = None
    print "All done."
#end def main
First of all, it is silly to use threads when you can have 500+ connected clients; you should go asynchronous. Look at gevent, for example, which is a very good library, or at least use select (see the Python documentation).
Then, your code to close the socket in handle_p looks good: when the recv() call comes back with an empty string, it means the remote end has disconnected, so you break out of the while loop. Fine.
However, it looks like the remote end sometimes closes the connection without this being detected on your side (recv() never returns). The best approach would then be to have a kind of heartbeat so you know when you can close the connection.
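A sketch of the server-side half of such a heartbeat, assuming the units can be made to send something at least every HEARTBEAT_SECONDS (the constant and its value are illustrative, not from the original code): set a timeout on each accepted socket so recv() raises instead of blocking forever, and close the socket when the timeout fires so the thread can exit.

import socket

HEARTBEAT_SECONDS = 120  # hypothetical: units are expected to report at least this often

def handle_with_heartbeat(connection, address):
    connection.settimeout(HEARTBEAT_SECONDS)  # recv() now raises socket.timeout when idle
    try:
        while True:
            data = connection.recv(512)
            if not data:              # clean remote close
                break
            process_data_stub(data)
    except socket.timeout:
        print "No heartbeat from %s for %s seconds, closing" % (address, HEARTBEAT_SECONDS)
    finally:
        connection.close()            # the handler returns, so the thread can finish and be joined

def process_data_stub(data):
    # stand-in for the processData() function from the question
    pass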

Raspberry Pi loses MySQL connection?

I am trying to insert data into a MySQL database with a keypad connected to a Raspberry Pi, using Python.
The code:
#!/usr/bin/python
import os
import MySQLdb
import RPi.GPIO as GPIO
import time

#Open db conn
db = MySQLdb.connect("remote_server.com", "user", "password", "database")
# prep cursor
cursor = db.cursor()

GPIO.setmode(GPIO.BOARD)

MATRIX = [ [1,2,3],
           [4,5,6],
           [7,8,9],
           ['*',0,'#'] ]

ROW = [7,11,13,15]
COL = [23,21,19]

for j in range(3):
    GPIO.setup(COL[j], GPIO.OUT)
    GPIO.output(COL[j], 1)

for i in range(4):
    GPIO.setup(ROW[i], GPIO.IN, pull_up_down=GPIO.PUD_UP)

try:
    while(True):
        for j in range(3):
            GPIO.output(COL[j], 0)
            for i in range(4):
                if GPIO.input(ROW[i]) == 0:
                    mysql_code = MATRIX[i][j]
                    print mysql_code
                    try:
                        cursor.execute('''Insert into Rasperry_Codes (Code, insertTS) VALUES (%s, NOW())''', (mysql_code))
                    except MySQLdb.Error, e:
                        try:
                            print "MySQL Error [%d]: %s" % (e.args[0], e.args[1])
                        except IndexError:
                            print "MySQL Error: %s" % str(e)
                    db.commit()
                    time.sleep(0.2)
                    while(GPIO.input(ROW[i]) == 0):
                        pass
            GPIO.output(COL[j], 1)
except KeyboardInterrupt:
    GPIO.cleanup()
Sometimes the data is inserted into the database, sometimes not.
No error is raised via MySQLdb.Error.
print mysql_code always prints the correct pressed number.
Does anybody see a problem that could cause this intermittent malfunction?
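One thing worth ruling out, given the title, is a stale connection: the script opens a single connection to the remote server at startup and can sit idle between key presses, so the server may silently drop it. A hedged sketch (an assumption, not a confirmed diagnosis) of checking the link before each insert using MySQLdb's ping():

def ensure_connection(db):
    # ping(True) asks the client library to reconnect automatically
    # if the server has dropped the idle connection
    db.ping(True)

# inside the key-press handler, before the insert:
# ensure_connection(db)
# cursor = db.cursor()  # recreate the cursor after a possible reconnect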

Return SQL result from function as array

Since the query returns more than one result, how can the Get_results class return data_out as an array so that I can iterate over the results of the query?
import psycopg2
import sys

class Get_results():
    def db_call(self, query, dbHost, dbName, dbUser, dbPass):
        try:
            con = None
            con = psycopg2.connect(host=dbHost, database=dbName,
                                   user=dbUser, password=dbPass)
            cur = con.cursor()
            cur.execute(query)
            data = cur.fetchall()
            for data_out in data:
                return data_out
        except psycopg2.DatabaseError, e:
            print 'Error %s' % e
            sys.exit(1)
        finally:
            if con:
                con.close()

sql = " some sql "
w = Get_results()
for i in w.db_call(sql, dbHost, dbName, dbUser, dbPass):
    print "The result is : " + i
For additional info, if I add print data right after data = cur.fetchall(), I get this result:
[('The_Galaxy', 'The_Galaxy:star'),
('The_Galaxy', 'The_Galaxy:planet')]
The immediate answer is to change:
for data_out in data:
    return data_out
to:
for data_out in data:
    yield data_out
But you should look at using a with statement (if the DB API supports it) and at simplifying the code; this could just be done with a generator function (a class is overkill for this).
import psycopg2
import sys

class Get_results():
    def db_call(self, query, dbHost, dbName, dbUser, dbPass):
        try:
            con = None
            con = psycopg2.connect(host=dbHost, database=dbName,
                                   user=dbUser, password=dbPass)
            cur = con.cursor()
            cur.execute(query)
            data = cur.fetchall()
            resultList = []
            for data_out in data:
                resultList.append(data_out[1])
            return resultList
        except psycopg2.DatabaseError, e:
            print 'Error %s' % e
            sys.exit(1)
        finally:
            if con:
                con.close()

sql = " some sql "
w = Get_results()
for i in w.db_call(sql, dbHost, dbName, dbUser, dbPass):
    print "The result is : " + i
