Python script to insert data into multiple SQL tables

For our semester project we need to insert data from a stream that's updated every 3.5 s into two separate SQL tables, and this has to be done with a Python script.
The idea is to have two rooms (Office, Server) that each have a combined temperature and humidity sensor, while the server room also has a smoke detector.
The data is sent from an Arduino over USB, and the data stream looks like this:
Server:61.20,22.70,221.00Office:64.00,23.00
The Python script I've managed to cobble together looks like this:
import serial
import time
import mysql.connector
mydb = mysql.connector.connect(
    host="127.0.0.1",
    user="root",
    password="",
    database="messwerte"
)
mycursor = mydb.cursor()
device = 'COM3'
try:
    arduino = serial.Serial(device, 9600)
except serial.SerialException as err:
    print("Error opening {}: {}".format(device, err))
while True:
    try:
        time.sleep(2)
        data = arduino.readline().decode()
        print(data)
        pieces = data.split(" ")
        try:
            mycursor.execute("INSERT INTO dht11serial (humidity,temperature,CO2) VALUES (%s,%s,%s)", (pieces[0], pieces[1], pieces[2]))
            mydb.commit()
        except mysql.connector.IntegrityError as err:
            print("Error: {}".format(err))
    except Exception as err:
        print("Error: {}".format(err))
Now I would need to insert the values for each room into the SQL table that corresponds to that room, but how can I manage that?
Please keep in mind that I have actually no idea what I'm doing. It's my very first time doing anything with Python or SQL.

I created this script based on the data stream you posted and kept the same configuration for the Arduino connection. The script will generate the database file and the two tables if they do not exist, so there is no need to worry about losing data each time you run the script.
It should work right away (I hope); if not, you only need to adjust the data-stream parsing. I put some comments in the code to help you out. Basically, you just need to create two tables inside the same database and save the data twice.
I changed the SQL engine to sqlite3, since the data will be stored locally.
Put the code inside a .py file and run it! Tell me if it works.
import serial
import time
import sqlite3

# Database file is stored in the same folder where the script is located.
# The file will be created automatically if it does not exist.
conn = sqlite3.connect("./data.sqlite3")
cursor = conn.cursor()

# Server table inside data.sqlite3
server = """CREATE TABLE IF NOT EXISTS server (
    ID INTEGER PRIMARY KEY AUTOINCREMENT,
    humidity VARCHAR(255),
    temperature VARCHAR(255),
    CO2 VARCHAR(255))"""

# Office table inside data.sqlite3
office = """CREATE TABLE IF NOT EXISTS office (
    ID INTEGER PRIMARY KEY AUTOINCREMENT,
    humidity VARCHAR(255),
    temperature VARCHAR(255),
    CO2 VARCHAR(255))"""

# Create the tables inside the database; nothing happens if they already exist.
# You can comment out these lines; they are only needed when creating a new file.
cursor.execute(server)
cursor.execute(office)
conn.commit()

# Arduino connection.
# The app makes no sense without this, so let it crash if there is no connection.
arduino = serial.Serial('COM3', 9600)

# Loop forever.
while True:
    # Catching errors inside the try-except keeps the loop going forever,
    # even if the Arduino connection gets lost.
    # Rest 3.5 s each loop.
    time.sleep(3.5)
    try:
        # Server:61.20,22.70,221.00Office:64.00,23.00,220.00
        # readline() returns bytes, so decode to a string first.
        data = arduino.readline().decode("ascii", errors="ignore")
        # Remove "Server:" and replace "Office:" with a comma,
        # which separates the two groups of values.
        cleaned_data = data.replace("Server:", "").replace("Office:", ",")
        # Cleaned data looks like this: 61.20,22.70,221.00,64.00,23.00,220.00
        # The first 3 values are from the server, the last 3 from the office.
        # Split the string into elements on the commas.
        pieces = cleaned_data.split(",")
        # [ 61.20, 22.70, 221.00, 64.00, 23.00, 220.00 ]
        print(pieces)
        # Send data to the server table.
        cursor.execute("INSERT INTO server (humidity,temperature,CO2) VALUES (?,?,?)", (pieces[0], pieces[1], pieces[2]))
        # Send data to the office table.
        cursor.execute("INSERT INTO office (humidity,temperature,CO2) VALUES (?,?,?)", (pieces[3], pieces[4], pieces[5]))
        # Commit the writes to the database.
        conn.commit()
        # There is no need to close the connection to the database.
    except Exception as e:
        # Print whatever caused the error.
        print(e)
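One wrinkle worth noting: the stream in the question gives the office only two values (no CO2), while the script above assumes three per room. A small parsing sketch that tolerates a different number of values per room; the parse_stream helper and its regex are my own illustration, not part of the original script:

```python
import re

def parse_stream(line):
    """Split a line like 'Server:61.20,22.70,221.00Office:64.00,23.00'
    into a dict mapping room name -> list of float readings.
    (Hypothetical helper for illustration.)"""
    rooms = {}
    # Each match is a room name, a colon, then its comma-separated numbers,
    # stopping lazily at the next room name or the end of the line.
    for name, values in re.findall(r"([A-Za-z]+):([\d.,]+?)(?=[A-Za-z]+:|$)", line):
        rooms[name] = [float(v) for v in values.split(",") if v]
    return rooms

readings = parse_stream("Server:61.20,22.70,221.00Office:64.00,23.00")
print(readings)  # → {'Server': [61.2, 22.7, 221.0], 'Office': [64.0, 23.0]}
```

With the values grouped per room, you can insert each room's list into its own table and skip columns the room does not report.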

Related

Where does the connection to the database need to be made?

I am creating a Twitter bot that searches Twitter for ticker symbols and then sends them to a database. The problem is that the list being sent never changes.
Maybe I have to keep reconnecting to the database, but I cannot figure out where my problem is. Can anyone figure out how to make the list of tickers different every time?
def searchTwit():
    tweets = api.search("#stocks", count=100)
    return tweets

print("connecting to database")
# connecting to the database
conn = pyodbc.connect(
    "Driver={SQL Server};"
    "Server=..............;"
    "Database=master;"
    "Trusted_Connection=yes;")
cursor = conn.cursor()

tickList = []

def getTicker(tweets):
    for tweet in tweets:
        if "$" in tweet.text:
            x = tweet.text.split()
            for i in x:
                if i.startswith("$") and i[1].isalpha():
                    i.strip(".")
                    i.upper()
                    tickList.append(i)
                    # print(var_string)

def retrieveTickers():
    for i in tickList:
        cursor.execute('INSERT INTO master.dbo.TickerTable (TickerName) VALUES (?);', (i))
        conn.commit()

# thing to run
print("about to do while ")
while True:
    sleep(60 - time() % 60)
    print("searchtwit")
    searchTwit()
    theTweets = searchTwit()
    getTicker(theTweets)
    print("getting Tickers")
    retrieveTickers()
    print("sending tickers")
    print(tickList)
    tickList = []
    print(tickList)
You can connect to a database on a remote server or on your local machine. Define which database you want to use: if your database server is 127.0.0.1:PORT, the database is on your own machine (the port depends on which DBMS you are using).
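A separate Python detail the question's code runs into, not covered by the answer above: str.strip() and str.upper() return new strings rather than modifying the string in place, so their results must be assigned back or the cleanup silently does nothing. A minimal sketch (clean_ticker is a hypothetical helper):

```python
def clean_ticker(raw):
    # str methods return new strings; reassign to keep each result.
    ticker = raw.strip(".")
    ticker = ticker.upper()
    return ticker

print(clean_ticker("$tsla."))  # → $TSLA
```

In the bot, appending clean_ticker(i) instead of i would store the normalized symbol.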

Why does any row I insert into my AWS Postgres database (with psycopg2) immediately become a dead tuple? [duplicate]

This question already has an answer here:
Why is my postgresql function insert rolling back?
(1 answer)
Closed 3 years ago.
Having a problem with connecting psycopg2 to an AWS Postgres server and inserting a row.
Below is a test script that attempts to connect to the server and insert one row. The test query works when I use it in pgAdmin. That is, it runs successfully and the row can be selected.
When I run the python script, the server shows that a connection is made. No exceptions are thrown. I can even try to insert like a hundred rows and there's a big spike in traffic. And yet nothing can be found in the table.
import psycopg2
from getpass import getpass

# connect to database
try:
    connection = psycopg2.connect(
        dbname="postgres",
        user="username",
        password=getpass(),
        host="blahblah.us-east-1.rds.amazonaws.com",
        port='5432'
    )
    print("connected!")
except:
    print("oops")

# cursor object
cursor_boi = connection.cursor()

# simple test query
test_query = """INSERT INTO reviews (review_id, username, movie_id, review_date, review_text, review_title, user_rating, helpful_num, helpful_denom)
VALUES (1, 'myname', 12345678, '2016-06-23', 'I love this movie!', 'Me happy', 5, 6, 12)"""

# execute query
try:
    cursor_boi.execute(test_query)
    print(test_query)
except:
    print("oopsie!")

# close connection
if connection:
    cursor_boi.close()
    connection.close()
The database statistics report the following for my "reviews" table:
Tuples inserted: 257
Tuples deleted: 1
Dead Tuples: 8
Last autovacuum: 2019-12-13 15:49:20.369715+00
And the Dead Tuples field increments every time I run the Python script. So it seems that every record I insert immediately becomes a dead tuple. Why is this, and how can I stop it? I imagine the records are being overwritten, but if so, they're not being replaced with anything.
Solved: I forgot to commit the transaction with connection.commit(). Thanks @roganjosh.
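The effect of the missing commit() can be seen without a Postgres server at all. This sketch uses the standard library's sqlite3 (file path arbitrary) purely to illustrate the behavior: rows inserted before commit() are invisible to any other connection, which is why the pgAdmin session never saw them:

```python
import os
import sqlite3
import tempfile

# A throwaway database file so two independent connections can share it.
path = os.path.join(tempfile.mkdtemp(), "demo.sqlite3")

writer = sqlite3.connect(path)
writer.execute("CREATE TABLE reviews (review_id INTEGER)")
writer.commit()

# Insert WITHOUT committing: the row exists only inside this transaction.
writer.execute("INSERT INTO reviews VALUES (1)")

reader = sqlite3.connect(path)
before = reader.execute("SELECT COUNT(*) FROM reviews").fetchone()[0]

# After committing, a second connection can see the row.
writer.commit()
after = reader.execute("SELECT COUNT(*) FROM reviews").fetchone()[0]

print(before, after)  # → 0 1
```

psycopg2 behaves the same way: every connection starts a transaction implicitly, and without connection.commit() the server discards the inserts when the connection closes, which is what shows up as dead tuples.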

How do I get Data to 'commit' from Python to SQL Server?

I have a localhost SQL Server running and can connect to it successfully. However, data is not transferring over from the temp CSV files. I'm using Python's pyodbc for the server connection.
I've tried pymssql but had worse results, so I've stuck with pyodbc. I've also tried closing the cursor each time, or just at the end, but without luck.
Here is the piece of code I am using. Towards the bottom are two different CSV writes. One is a temp file used to fill the SQL Server table. The other is for my personal use, to make sure I am actually gathering information at the moment; in the long term it will be removed so that only the temp CSV is used.
# _retry(max_retry=1, timeout=1)
def blocked_outbound_utm_scada():
    # OTHER CODE EXISTS HERE!!!
    # GET search results, add them to a temp CSV file, then send to MS SQL Server
    service_search_results_str = '/services/search/jobs/%s/results?output_mode=csv&count=0' % sid
    search_results = (_service.request(_host + service_search_results_str, 'GET',
                                       headers={'Authorization': 'Splunk %s' % session_key},
                                       body={})[1]).decode('utf-8')
    with tempfile.NamedTemporaryFile(mode='w+t', suffix='.csv', delete=False) as temp_csv:
        temp_csv.writelines(search_results)
        temp_csv.close()
        try:
            cursor.execute("BULK INSERT Blocked_Outbound_UTM_Scada FROM '%s' WITH ("
                           "FIELDTERMINATOR='\t', ROWTERMINATOR='\n', FirstRow = 2);" % temp_csv.name)
            conn.commit()
        except pyodbc.ProgrammingError:
            cursor.execute("CREATE TABLE Blocked_Outbound_UTM_Scada ("
                           "Date_Time varchar(25),"
                           "Src_IP varchar(225),"
                           "Desktop_IP varchar(225));")
            conn.commit()
        finally:
            cursor.execute("BULK INSERT Blocked_Outbound_UTM_Scada FROM '%s' WITH ("
                           "FIELDTERMINATOR='\t', ROWTERMINATOR='\n', FirstRow = 2);" % temp_csv.name)
            conn.commit()
            os.remove(temp_csv.name)
    with open(_global_path + '/blocked_outbound_utm_scada.csv', 'a', newline='') as w:
        w.write(search_results)
        w.close()
I'm just trying to get the information into SQL Server but the code seems to be ignoring cursor.commit(). Any help is appreciated in figuring out what is wrong.
Thanks in Advance!
Try it without the conn.commit().
I do not understand why or how it works, but it also seems to me that pyodbc ignores the commit call in this case.
Try changing the autocommit parameter in pymssql.connect():
conn = pymssql.connect(host=my_host, user=my_user, password=my_password, database=my_database, autocommit=True)
conn = pymssql.connect(host=my_host, user=my_user, password=my_password, database=my_database, autocommit=False)

Print Data from MySQL Database to Console from Python

I'm using Visual Studio 2017 with a Python Console environment. I have a MySQL database set up which I can connect to successfully. I can also Insert data into the DB. Now I'm trying to display/fetch data from it.
I connect fine, and it seems I'm fetching data from my database, but nothing is actually printing to the console. I want to be able to fetch and display data, but nothing is displaying at all.
How do I actually display the data I select?
#importing module Like Namespace in .Net
import pypyodbc
#creating connection Object which will contain SQL Server Connection
connection = pypyodbc.connect('Driver={SQL Server};Server=DESKTOP-NJR6F8V\SQLEXPRESS;Data Source=DESKTOP-NJR6F8V\SQLEXPRESS;Integrated Security=True;Connect Timeout=30;Encrypt=False;TrustServerCertificate=True;ApplicationIntent=ReadWrite;MultiSubnetFailover=False')
cursor = connection.cursor()
SQLCommand = ("SELECT ID FROM MyAI_DB.dbo.WordDefinitions WHERE ID > 117000")
#Processing Query
cursor.execute(SQLCommand)
#Commiting any pending transaction to the database.
connection.commit()
#closing connection
#connection.close()
I figured it out. I had failed to include the right print statement, which was:
print(cursor.fetchone())
I also had the connection.commit() statement in the wrong place (the insert had run before the print statement was ever executed). The final code that worked was this:
#importing module Like Namespace in .Net
import pypyodbc
#creating connection Object which will contain SQL Server Connection
connection = pypyodbc.connect('Driver={SQL Server};Server=DESKTOP-NJR6F8V\SQLEXPRESS;Data Source=DESKTOP-NJR6F8V\SQLEXPRESS;Integrated Security=True;Connect Timeout=30;Encrypt=False;TrustServerCertificate=True;ApplicationIntent=ReadWrite;MultiSubnetFailover=False')
cursor = connection.cursor()
SQLCommand = ("SELECT * FROM MyAI_DB.dbo.WordDefinitions")
#Processing Query
cursor.execute(SQLCommand)
#Commiting any pending transaction to the database.
print(cursor.fetchone())
connection.commit()
#closing connection
#connection.close()
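Note that fetchone() only returns the first row of the result set. To display everything the SELECT matched, call fetchall() or iterate over the cursor. A self-contained sketch using sqlite3 in place of pypyodbc (the table and rows are made up for illustration):

```python
import sqlite3

connection = sqlite3.connect(":memory:")
cursor = connection.cursor()
cursor.execute("CREATE TABLE WordDefinitions (ID INTEGER, Word TEXT)")
cursor.executemany("INSERT INTO WordDefinitions VALUES (?, ?)",
                   [(1, "apple"), (2, "banana"), (3, "cherry")])
connection.commit()

cursor.execute("SELECT * FROM WordDefinitions")
rows = cursor.fetchall()  # every matching row, as a list of tuples
for row in rows:
    print(row)
```

The same pattern works with any DB-API driver: execute the SELECT once, then fetch; also note that plain SELECT statements do not need a commit() at all.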

Python SQL insert statement not working

I am trying to insert Arduino data into a database through Python; however, it will not do it. Basically, I read data in from the serial port assigned to my Arduino and store the first value in the variable arduinoData. In my insert statement I am trying to use a string placeholder to put arduinoData into the table. Here is the code:
import mysql.connector
from mysql.connector import errorcode
from time import sleep
import serial

# Obtain connection string information from the portal
config = {
    'host': 'oursystem.mysql.database.azure.com',
    'user': 'project',
    'password': '',
    'database': 'projectdb'
}

# Construct connection string
try:
    conn = mysql.connector.connect(**config)
    print("Connection established")
except mysql.connector.Error as err:
    if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
        print("Something is wrong with the user name or password")
    elif err.errno == errorcode.ER_BAD_DB_ERROR:
        print("Database does not exist")
    else:
        print(err)
else:
    cursor = conn.cursor()
    ser = serial.Serial('/dev/ttyACM0', 9600)  # Establish the connection on a specific port
    arduinoData = ser.read().strip()
    print(arduinoData)
    # Drop previous table of same name if one exists
    cursor.execute("DROP TABLE IF EXISTS ArduinoData;")
    print("Finished dropping table (if existed).")
    # Create table
    cursor.execute("CREATE TABLE ArduinoData (value VARCHAR(20));")
    print("Finished creating table.")
    # Insert some data into table
    cursor.execute("INSERT INTO ArduinoData (value) VALUES (%s);", (arduinoData))
    print("Inserted", cursor.rowcount, "row(s) of data.")
    # Cleanup
    conn.commit()
    cursor.close()
    conn.close()
    print("Done.")
If I put the %s in single quotes like '%s' it just prints that instead of my arduinoData. Can anyone see what is wrong here, thanks.
I just lost two hours on this:
If you're trying to observe what's happening to your database with phpMyAdmin, please note that none of your insert commands will be visible until you commit them:
connection.commit()
Simply pass a tuple (arduinoData,), that is, a comma inside the parentheses for a single value, or a list [arduinoData], and not a bare value (arduinoData), in your parameterization:
cursor.execute("INSERT INTO ArduinoData (`value`) VALUES (%s);", (arduinoData,))
However, if arduinoData is a list of multiple values, use executemany, still passing a list. Also, escape value, which is a MySQL reserved word:
cursor.executemany("INSERT INTO ArduinoData (`value`) VALUES (%s);",[arduinoData])
I may have interpreted this wrong, but shouldn't there be something like certain_value = '%s'? Otherwise it doesn't know what it is looking for.
I have just figured out what was wrong. Using Parfait's suggestion of passing a tuple, I changed my insert statement from cursor.execute("INSERT INTO ArduinoData (value) VALUES (%s);",(arduinoData)) to cursor.execute("INSERT INTO ArduinoData (value) VALUES (%s);",(arduinoData,)). Thanks to everyone who answered, you were all a great help! :D
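The distinction the accepted fix relies on is a plain Python rule: parentheses alone do not make a tuple, the trailing comma does. A quick sketch:

```python
value = "23.5"

# Parentheses around a single value are just grouping: still a str.
not_a_tuple = (value)

# A trailing comma is what actually creates the one-element tuple.
a_tuple = (value,)

print(type(not_a_tuple).__name__)  # → str
print(type(a_tuple).__name__)      # → tuple
print(len(a_tuple))                # → 1
```

That is why cursor.execute(sql, (arduinoData)) passes a bare string, which the driver tries to treat as a sequence of parameters, while (arduinoData,) passes the single value the one %s placeholder expects.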
