Real-time Python app that works on a database - python

I am using PostgreSQL with Python, and rows are added to the database regularly. At present, the Python program does not know when new data has been added (I used psycopg2 to read rows, but it reads to the end of the existing rows and stops). How can I make my Python program continuously check whether new data has been added? Or can I have PostgreSQL notify Python when a new row is added?
This is what I have currently:
import psycopg2

def get_data():
    try:
        connect = psycopg2.connect(database="yardqueue", user="postgres", password="abcd", host="localhost", port="5432")
    except:
        print "Could not open database"
    cur = connect.cursor()
    cur.execute("SELECT id,position FROM container")
    rows = cur.fetchall()
    for row in rows:
        print "ID = ", row[0]
        print "Position = ", row[1]
As you can see, when I run this, it stops once the variable row reaches the last row.
EDIT: Is there a way to keep my Python code running for a specified amount of time? If so, I can have it poll the database until I kill it.

If you want to check for new records, you can poll the table (assuming there are no deletions in the container table):
from time import sleep

import psycopg2

IDLE_INTERVAL_IN_SECONDS = 2

def get_data():
    try:
        connect = psycopg2.connect(database="yardqueue", user="postgres",
                                   password="abcd", host="localhost",
                                   port="5432")
    except:
        print "Could not open database"
        # TODO: maybe we should raise new exception?
        # or leave default exception?
        return
    cur = connect.cursor()
    previous_rows_count = 0
    while True:
        cur.execute("SELECT id, position FROM container")
        rows_count = cur.rowcount
        if rows_count > previous_rows_count:
            rows = cur.fetchall()
            for row in rows:
                print "ID = ", row[0]
                print "Position = ", row[1]
            previous_rows_count = rows_count
        sleep(IDLE_INTERVAL_IN_SECONDS)
If we want to process only new records, we can add ordering by id and an offset, like:
from time import sleep

import psycopg2

IDLE_INTERVAL_IN_SECONDS = 2

def get_data():
    try:
        connect = psycopg2.connect(database="yardqueue", user="postgres",
                                   password="abcd", host="localhost",
                                   port="5432")
    except:
        # TODO: maybe we should raise new exception?
        # or leave default exception?
        print "Could not open database"
        return
    cur = connect.cursor()
    rows_count = 0
    while True:
        cur.execute("SELECT id, position FROM container "
                    # sorting records by id to get new records data
                    # assuming that "id" column values are increasing for new records
                    "ORDER BY id "
                    # skipping records that we have already processed
                    "OFFSET {offset}"
                    .format(offset=rows_count))
        new_rows_count = cur.rowcount
        if new_rows_count > 0:
            rows = cur.fetchall()
            for row in rows:
                print "ID = ", row[0]
                print "Position = ", row[1]
            # keep a running total, so OFFSET always skips everything
            # that has already been processed
            rows_count += new_rows_count
        sleep(IDLE_INTERVAL_IN_SECONDS)

Unfortunately, a database has no notion of insertion order, so you as the designer must provide an explicit order. If you do not, the order of the rows you fetch (using a new cursor) may change at any time.
One possible way is to use a serial field in your table. PostgreSQL implements a serial field through a sequence, which guarantees that each newly inserted row gets a serial number greater than all currently existing ones. But:
there can be holes if a transaction requests a serial number and is then aborted
if multiple concurrent transactions insert rows, the order of the serial values follows the order of the INSERT commands, not the order of the COMMIT commands, so race conditions can result in a wrong order. It is fine, though, if you have only one writer in the database (see the sketch after this list)
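A minimal sketch of the serial-id approach, assuming the container table from the question has a serial id column (connection parameters are taken from the question; the poll interval is an arbitrary choice):

import time

import psycopg2

def poll_new_rows(poll_seconds=2):
    connect = psycopg2.connect(database="yardqueue", user="postgres",
                               password="abcd", host="localhost", port="5432")
    connect.autocommit = True  # avoid holding one long transaction open while polling
    cur = connect.cursor()
    last_seen_id = 0  # highest id processed so far
    while True:
        # Fetch only rows newer than the last one we processed.
        cur.execute("SELECT id, position FROM container "
                    "WHERE id > %s ORDER BY id", (last_seen_id,))
        for row_id, position in cur.fetchall():
            print("ID =", row_id)
            print("Position =", position)
            last_seen_id = row_id
        time.sleep(poll_seconds)

Unlike the OFFSET variant above, tracking the highest id seen keeps working even if old rows are deleted.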
An alternative is to use an insertion-date field: the inserting application has to manage it explicitly, or you can use a trigger to set it transparently. PostgreSQL timestamps have microsecond precision, which means many rows can share the same insertion-date value if they are inserted at the same time. Your Python script should read the time before opening a cursor and fetch all rows with an insertion time greater than its last run time. But here again you must watch out for race conditions...
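A sketch of the insertion-date variant, assuming a hypothetical inserted_at timestamp column (the column name is an illustration, not from the question) filled by a DEFAULT now() or by a trigger:

import time

import psycopg2

def poll_by_timestamp(poll_seconds=2):
    connect = psycopg2.connect(database="yardqueue", user="postgres",
                               password="abcd", host="localhost", port="5432")
    connect.autocommit = True  # so now() advances between polls
    cur = connect.cursor()
    # Hypothetical column, e.g. created with:
    #   ALTER TABLE container ADD COLUMN inserted_at timestamp DEFAULT now();
    last_run = '-infinity'  # sorts before every real timestamp
    while True:
        # Read the clock *before* querying, so rows committed while this
        # batch is being processed are picked up on the next pass.
        cur.execute("SELECT now()")
        this_run = cur.fetchone()[0]
        cur.execute("SELECT id, position FROM container "
                    "WHERE inserted_at > %s ORDER BY inserted_at", (last_run,))
        for row_id, position in cur.fetchall():
            print("ID =", row_id)
            print("Position =", position)
        last_run = this_run
        time.sleep(poll_seconds)

As the caveat above says, a transaction that commits late can still slip through, so this is only fully safe with a single writer or additional care.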

Related

python-mysql-connector: I need to speed up the time it takes to update multiple items in a MySQL table

I currently have a list of approximately 10,000 IDs. I need to update all rows in the MySQL table whose id is in the inactive_ids list you see below, changing their Active column (a column in the MySQL table) to 'No'.
I am using the mysql.connector Python library.
When I run the code below, it takes about 0.7 seconds to execute each iteration of the for loop. That's about a 2-hour run time to change all 10,000 IDs. Is there a more optimal/quicker way to do this?
# inactive_ids are unique strings something like shown below
# inactive_ids = ['a9okeoko', 'sdfhreaa', 'xsdfasy', ..., 'asdfad']

# initialize connection
mydb = mysql.connector.connect(
    user="REMOVED",
    password="REMOVED",
    host="REMOVED",
    database="REMOVED"
)

# initialize cursor
mycursor = mydb.cursor(buffered=True)

# Function to execute multiple lines
def alter(state, msg, count):
    result = mycursor.execute(state, multi=True)
    result.send(None)
    print(str(count), ': ', msg, result)
    count += 1
    return count

# Try to execute, throw exception if fails
try:
    count = 0
    for Id in inactive_ids:
        # SAVE THE QUERY AS STRING
        sql_update = "UPDATE test_table SET Active = 'No' WHERE NoticeId = '" + Id + "'"
        # ALTER
        count = alter(sql_update, "done", count)
    # commits all changes to the database
    mydb.commit()
except Exception as e:
    mydb.rollback()
    raise e
Do it with a single query that uses IN (...) instead of multiple queries.
placeholders = ','.join(['%s'] * len(inactive_ids))
sql_update = f"""
UPDATE test_table
SET Active = 'No'
WHERE NoticeId IN ({placeholders})
"""
mycursor.execute(sql_update, inactive_ids)
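If the single statement ever grows too large (10,000 short IDs is normally fine, but MySQL's max_allowed_packet caps statement size), a chunked variant keeps each statement bounded. A sketch reusing mycursor, mydb and inactive_ids from above; the chunk size is an arbitrary choice:

CHUNK_SIZE = 1000  # tune as needed; keeps each statement well under packet limits

for start in range(0, len(inactive_ids), CHUNK_SIZE):
    chunk = inactive_ids[start:start + CHUNK_SIZE]
    placeholders = ','.join(['%s'] * len(chunk))
    mycursor.execute(
        f"UPDATE test_table SET Active = 'No' WHERE NoticeId IN ({placeholders})",
        chunk,
    )
mydb.commit()  # one commit for the whole batch

Either way, the speedup comes from replacing 10,000 round trips with a handful.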

sqlite3 error: "Unable to resolve table" even though I already remade the table

I have searched extensively before asking this seemingly simple question. I have a Python project with an SQLite DB and some code to insert into it and work with it. All was good until I pulled the DB functions out of the main file, pulled out the DB file, and put both into a folder called db. So now both the function file and the DB are in the same folder, one level deep. The error starts immediately, but the code still runs, albeit without actually doing anything. Searching the internet, all I see are people saying to delete the DB file and make it again in place, which usually solves the issue; I did that twice, but no luck. What am I missing here? The code runs without an error but does not actually work, and the error I am reporting here is from the PyCharm hover box.
def add_symbols_to_list(symbols_to_add) -> None:
    """This will add symbols to the current symbols list, but leave the previous entries.
    :param: a list of user provided symbols as comma separated strings."""
    conn = sqlite3.connect('database.db')
    c = conn.cursor()
    time_now = datetime.datetime.now()  # get current time for the int conversion below
    this_month_int = time_now.month  # get the current month and set it to an int
    # gets the current number of rows so new additions have the correct rowid
    # c.execute("SELECT * FROM currentMonthStocks")
    # current_row_number = c.execute("SELECT COUNT(*) FROM currentMonthStocks")
    # rows = int(current_row_number)
    # # https://www.sqlitetutorial.net/sqlite-count-function/
    # db_row_id = rows + 1 # set the first row number
    extra_symbols = symbols_to_add
    for i in range(len(extra_symbols)):
        c.execute("""INSERT INTO currentMonthStocks
                     (symbol, month)
                     VALUES (?, ?)""", (extra_symbols[i], this_month_int))
        # db_row_id += 1
        print("Added a symbol")
    print("Symbols successfully populated into currentMonthStocks table in database.db")

new_symbols = ['tsla', 'dis', 'pltr']
add_symbols_to_list(new_symbols)
def get_symbols_at_month_start() -> None:
    """Function inserts a list of symbols to trade every month into the currentMonthStocks table in database.db.
    This is called once at the start of the month, deletes the current symbols and adds the new ones.
    :return: None."""
    # edited out the url info for brevity
    response = requests.request("POST", url, headers=headers, data=payload)
    symbols = response.json()['content']['allInstrumentRows']
    this_months_symbols = []
    for symbol in symbols:
        this_months_symbols.append(symbol['Symbol'])
    # print(this_months_symbols)
    # file = "database.db"
    try:
        conn = sqlite3.connect('database.db')  # setup database connection
        c = conn.cursor()
        print("Database Connected")
        # c.execute("""CREATE TABLE currentMonthStocks (
        #                  id INT PRIMARY KEY,
        #                  symbol TEXT,
        #                  month INT)""")
        # print("table created successfully")
        # checks to see if there is at least 1 row in the db, if yes it deletes all rows.
        if c.execute("SELECT EXISTS(SELECT 1 FROM currentMonthStocks WHERE id=1 LIMIT 2);"):
            # for i in range(len(this_months_symbols)):
            c.execute("DELETE FROM currentMonthStocks")
            print("Delete all rows successful")
        time_now = datetime.datetime.now()  # get current time for the int conversion below
        this_month_int = time_now.month  # get the current month and set it to an int
        db_row_id = 1  # set the first row number
        for i in range(len(this_months_symbols)):
            c.execute("""INSERT INTO currentMonthStocks
                         (id, symbol, month)
                         VALUES (?, ?, ?)""", (db_row_id, this_months_symbols[i], this_month_int))
            db_row_id += 1
            # print("one more entry")
        print("Symbols successfully populated into currentMonthStocks table in database.db")
        conn.commit()  # commits the current transaction.
        print("Entries committed to database.db")
        # c.close()  # closes the connection to the db.
        conn.close()
    except sqlite3.Error as e:
        print("sqlite3 error", e)
    finally:
        if conn:
            conn.close()
            print("Database.db Closed")
It turned out there was no problem, or at least no solution needed: PyCharm still does not recognize the table, but I wrote 5 CRUD functions and they all work. So the answer is: don't worry about the IDE warning; just check whether the DB is updating correctly.
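Two things worth double-checking in situations like this (observations, not from the original thread): sqlite3.connect('database.db') resolves the filename relative to the current working directory, so after moving files into the db folder the code may silently create and use a brand-new, empty database.db; and add_symbols_to_list above never calls conn.commit(), so its inserts are discarded when the connection goes away. A minimal sketch that pins the path to the source file's folder and commits:

import sqlite3
from pathlib import Path

# Resolve database.db next to this source file, regardless of the
# directory the program was launched from.
DB_PATH = Path(__file__).resolve().parent / 'database.db'

def add_symbols(symbols, month):
    conn = sqlite3.connect(DB_PATH)
    try:
        conn.executemany(
            "INSERT INTO currentMonthStocks (symbol, month) VALUES (?, ?)",
            [(s, month) for s in symbols],
        )
        conn.commit()  # without this, the inserts are rolled back on close
    finally:
        conn.close()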

Store MySQL column names in an array using the Python MySQL connector

I'm quite new to MySQL in terms of manipulating the database itself. I have succeeded in storing new lines in a table, but my next endeavor is a little more complex.
I'd like to fetch the column names from an existing MySQL database and save them into an array in Python. I'm using the official MySQL connector.
I'm thinking I can achieve this through the information_schema.columns table, but I have no idea how to build the query and store the result in an array. There will be around 100-200 columns, so performance might become an issue, and I don't think it's wise to issue a separate query for each column.
The base code I use to insert data into MySQL with the connector is:
def insert(data):
    query = "INSERT INTO templog(data) " \
            "VALUES(%s,%s,%s,%s,%s)"
    args = (data)
    try:
        db_config = read_db_config()
        conn = MySQLConnection(db_config)
        cursor = conn.cursor()
        cursor.execute(query, args)
        #if cursor.lastrowid:
        #    print('last insert id', cursor.lastrowid)
        #else:
        #    print('last insert id not found')
        conn.commit()
        cursor.close()
        conn.close()
    except Error as error:
        print(error)
As said, the above code needs to be modified in order to get data from the SQL server. Thanks in advance!
Thanks for the help!
Got this as working code:
def GetNames(web_data, counter):
    # get all names from the database
    connection = create_engine('mysql+pymysql://user:pwd@server:3306/db').connect()
    result = connection.execute('select * from price_usd')
    a = 0
    sql_matrix = [0 for x in range(counter + 1)]
    for v in result:
        while a == 0:
            for column, value in v.items():
                a = a + 1
                if a > 1:
                    sql_matrix[a] = str(('{0}'.format(column)))
This will get all column names from the existing SQL database.
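For reference, the same information is available through the plain mysql.connector API the question started with: after any SELECT, the DB-API cursor.description attribute holds one entry per column, with the column name first. A sketch (connection parameters are placeholders):

import mysql.connector

conn = mysql.connector.connect(user='user', password='pwd',
                               host='server', database='db')
cursor = conn.cursor()

# LIMIT 0 returns no rows but still populates the column metadata.
cursor.execute('SELECT * FROM price_usd LIMIT 0')
column_names = [desc[0] for desc in cursor.description]
cursor.fetchall()  # drain the (empty) result set before reusing the cursor

print(column_names)
cursor.close()
conn.close()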

Why am I not able to update data in an SQLite database from Python?

I am trying to update (or rather modify) existing data in a particular cell of an SQLite database from a Flask (Python) app, but the update does not happen, and I don't receive any error either. At the same time, I am able to insert new data as a new row of the table.
Server side code:
#app.route("/")
#app.route("/<state>")
def get_post_javascript_data(state=None):
connection = sqlite3.connect('/home/pi/toggle7.db')
cursor = connection.cursor()
load = 'light5'
status = 'ON'
if state == 'Load1on':
sock.send('Load1ON')
print ("sent: Load1ON")
try:
cursor.execute("UPDATE user1 SET load = 'light5' WHERE id='1'")
connection.commit()
message = "success"
except:
connection.rollback()
message = "failure"
finally:
connection.close()
print (message)
UPDATE user1 SET load = 'light5' WHERE id='1'
This command updates all rows that have the value '1' in the id column.
If nothing happens, then this implies that there is no such row.
If the id column contains numbers, then you must search for a number:
... WHERE id=1
And you should always use parameters to avoid formatting problems like this (and SQL injection attacks):
cursor.execute('UPDATE user1 SET load = ? WHERE id = ?', ['light5', 1])
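If you want to confirm whether the UPDATE matched anything at all, cursor.rowcount holds the number of affected rows after an UPDATE in sqlite3. A small sketch, reusing the connection from the question:

cursor.execute('UPDATE user1 SET load = ? WHERE id = ?', ['light5', 1])
if cursor.rowcount == 0:
    print('no row with id = 1, nothing was updated')
connection.commit()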

Last value from MySQL in Python (query)

Trying to get the last value from MySQL on a Raspberry Pi. I have no idea why my simple code won't work; it gives an "execute() first" error at row = cursor.fetchone().
Here is my code:
# External module imports
import time
import os
import datetime
import MySQLdb

# Connect to MySQL
db = MySQLdb.connect("localhost", "zikmir", "gforce", "temp_database")
# Prepare a cursor
cursor = db.cursor()
# Select three columns, id, time and temp, from table time_temp
cursor.execute = ("SELECT id, time, temp FROM time_temp")
# id is an auto-incremented value, time is TIME and temp is FLOAT
row = cursor.fetchone()
# Trying to store the last result in variable row
# Close cursor and database
cursor.close()
db.close()
Watch the = in cursor.execute = ("SELECT id, time, temp FROM time_temp"). It should read cursor.execute("SELECT ..."). As written, the line assigns a string to the cursor's execute attribute instead of calling the method, so no query is ever run, and the subsequent fetchone() fails with "execute() first".
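Also, since the goal is the last value: a SELECT without ORDER BY returns rows in an unspecified order, so fetchone() would give an arbitrary row. Sorting by the auto-incremented id in descending order and limiting the result to one row fetches the latest entry; a sketch against the question's table:

cursor.execute("SELECT id, time, temp FROM time_temp ORDER BY id DESC LIMIT 1")
row = cursor.fetchone()  # the most recently inserted row, or None if the table is empty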
