Iterating over table names and updating queries - python

I'm using PyMySQL to update data by iterating through table names, but the problem is that I was only able to update the data from the first table; the loop stops working after the first table.
I've tried using fetchall() to get the table names and looping over that, but it didn't work.
def update():
    global table_out
    global i
    cursor.execute("USE company;")
    cursor.execute("SHOW TABLES;")
    lst=[]
    for table_name in cursor:
        lst.append(table_name)
    emp_list=lst[0][0]
    print(emp_list)
    i=0
    while i<=len(lst)-1:
        state="""SELECT `employee_name` from `%s` WHERE attended=0 """%(employees)
        out=cursor.execute(state)
        result=cursor.fetchall()
        i+=1
        for records in result:
            table_out=''.join(records)
            print(table_out)
            db.commit()
            try:
                sql="""UPDATE `%s` SET `attended` = True WHERE `employee_name` = '%s' ;"""%(emp_list,table_out)
                cursor.execute(sql)
I expect to iterate over all the tables in that database when this function is called

I'm not sure that your approach is quite optimal.
[In your middle block, employees is undefined - should that be emp_list?]
Your SELECT statement appears to reduce down to
SELECT employee_name FROM $TABLE WHERE attended=0;
which you then iterate over, table by table, to change the value. Have you considered using the UPDATE verb instead? https://dev.mysql.com/doc/refman/8.0/en/update.html
UPDATE $table SET attended=True WHERE attended=0;
If that works for your desired outcome, it will save you quite a few table scans and some double handling.
So perhaps you could refactor your code along these lines:
def update_to_True():
    # assume that cursor and db are globals
    cursor.execute("USE company;")
    cursor.execute("SHOW TABLES;")
    tables = cursor.fetchall()  # execute() returns a row count in PyMySQL, so fetch separately
    for (table_name,) in tables:  # each row from SHOW TABLES is a 1-tuple
        stmt = "UPDATE `{0}` SET attended=True WHERE attended=0;".format(table_name)
        res = cursor.execute(stmt)
        # for debugging....
        print(res)
    db.commit()  # persist the updates
that's it!
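The pattern can be exercised standalone with sqlite3 (a sketch only: sqlite_master plays the role of MySQL's SHOW TABLES, and the table and column names here are invented for illustration):

```python
import sqlite3

db = sqlite3.connect(":memory:")
cursor = db.cursor()

# Set up two throwaway tables with the same shape.
for name in ("team_a", "team_b"):
    cursor.execute("CREATE TABLE %s (employee_name TEXT, attended INTEGER)" % name)
    cursor.executemany("INSERT INTO %s VALUES (?, ?)" % name,
                       [("alice", 0), ("bob", 1)])

# Enumerate the tables, then run one UPDATE per table.
cursor.execute("SELECT name FROM sqlite_master WHERE type='table'")
for (table_name,) in cursor.fetchall():
    cursor.execute("UPDATE %s SET attended=1 WHERE attended=0" % table_name)
db.commit()

# Every table has now been visited, not just the first one.
print(cursor.execute("SELECT COUNT(*) FROM team_a WHERE attended=0").fetchone()[0])  # 0
```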

Related

Update the last entered value from a selection of values in a database with python, mysql

Okay, so I have a table with a student ID column, and the student ID is used as the identifier to edit a row. But what if the same student lends a book twice? Then every row with that student ID will be edited, which I don't want. I want only the last entered row for that student ID to be edited, and using a Sl.No is not a solution here because it's practically complicated. I am using the Python connector. Please help :) Thanks in advance
Code I use right now:
con = mysql.connect(host='localhost', user='root',
                    password='monkey123', database='BOOK')
c = con.cursor()
c.execute(
    f"UPDATE library set `status`='Returned',`date returned`='{str(cal.selection_get())}' WHERE `STUDENT ID`='{e_sch.get()}';")
c.execute('commit')
con.close()
messagebox.showinfo(
    'Success', 'Book has been returned successfully')
If I followed you correctly, you want to update just one record that matches the where condition. For this to be done in a reliable manner, you need a column to define the ordering of the records. It could be a date, an incrementing id, or else. I assume that such column exists in your table and is called ordering_column.
A simple option is to use ORDER BY and LIMIT in the UPDATE statement, like so:
sql = """
    UPDATE library
    SET `status` = 'Returned', `date returned` = %s
    WHERE `STUDENT ID` = %s
    ORDER BY ordering_column DESC
    LIMIT 1
"""
c = con.cursor()
c.execute(sql, (str(cal.selection_get()), e_sch.get()))
con.commit()
Note that I modified your code so input values are given as parameters rather than concatenated into the query string. This is an important change that makes your code safer and more efficient.
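To make the point concrete, here is a minimal standalone sketch using sqlite3 (the placeholder style is ? instead of the MySQL connector's %s; the table and values are invented for illustration):

```python
import sqlite3

con = sqlite3.connect(":memory:")
c = con.cursor()
c.execute("CREATE TABLE library (student_id TEXT, status TEXT)")
c.execute("INSERT INTO library VALUES ('S1', 'Borrowed')")

# Parameters travel separately from the SQL text, so a malicious value like
# "S1'; DROP TABLE library; --" is treated as plain data, not as SQL.
student_id = "S1"
c.execute("UPDATE library SET status = ? WHERE student_id = ?",
          ("Returned", student_id))
con.commit()

c.execute("SELECT status FROM library WHERE student_id = ?", (student_id,))
print(c.fetchone()[0])  # Returned
```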

How to escape a #/# (for example 6/8) in the name of a table from a database

I am currently trying to get a list of values from a table inside an SQL database. The problem is appending the values, due to the table's name, which I can't change. The table's name is something like Value123/123.
I tried making a variable with the name like
x = 'Value123/123'
then doing
row.append(x)
but that just prints Value123/123 and not the values from the database
cursor = conn.cursor()
cursor.execute("select Test, Value123/123 from db")
Test = []
Value = []
Compiled_Dict = {}
for row in cursor:
    Test.append(row.Test)
    Value.append(row.Value123/123)
Compiled_Dict = {'Date&Time': Test}
Compiled_Dict['Value'] = Value
conn.close()
df = pd.DataFrame(Compiled_Dict)
The problem occurs in this line
Value.append(row.Value123/123)
When I run it, I get an error that the database doesn't have a table named 'Value123', since I think it's trying to divide 123 by 123. Unfortunately, the table in the database is named like this and I cannot change it, so how do I pull the values from this table?
Edit:
cursor.execute("select Test, Value123/123 as newValue from db")
I tried this and it worked, thanks for the solutions. Suggested by Yu Jiaao.
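For reference, a minimal sqlite3 sketch of the identifier-quoting alternative: if a column really is named Value123/123, quoting the identifier makes the database treat it as a name rather than the expression 123/123 (quoting syntax varies by engine: double quotes in standard SQL and SQLite, backticks in MySQL, square brackets in SQL Server; the table and values here are invented):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
# A column whose name contains a slash must be quoted when created and selected.
cur.execute('CREATE TABLE db (Test TEXT, "Value123/123" REAL)')
cur.execute("INSERT INTO db VALUES (?, ?)", ("2021-01-01 00:00", 42.0))

# Unquoted, Value123/123 would be parsed as 123 divided by 123.
cur.execute('SELECT Test, "Value123/123" FROM db')
for test, value in cur.fetchall():
    print(test, value)  # 2021-01-01 00:00 42.0
```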

Python SQLite Data inserted into table disappearing? [duplicate]

This question already has answers here:
Sqlite insert query not working with python?
(2 answers)
Closed 4 years ago.
So this is confusing the hell out of me, and the title doesn't really explain it properly. I have a webpage which, when a user selects from a dropdown list, makes an AJAX call to a URL which is handled by Flask. It first issues a call to /teamSelected, where my Python scrapes some information about the football match between those two teams, puts it in an accordingly named table and returns.
After this, the second AJAX call is made to /requestCommentary, where in Python I then retrieve this data from the table and return it.
Problem:
When I first make the call to /teamSelected, I have code which drops the table if it exists. After that I check if the table exists (this seems strange, but dropping the table every time it's called is just me making sure the if part of the program is entered so I can test that everything is working in there). If the table doesn't exist, it enters the if portion, where it creates the table, scrapes the data and stores it in the table. If I then try to print the contents of the table, it spits them out perfectly, despite telling me there exists only 1 row?
However, if I remove the code at the start which drops the table, then try to print the contents, it prints nothing.
To me this makes very little sense. If the table doesn't exist, I make it, fill it with data and I can access it. However, if I comment out the drop table statement, on the next call the table exists and isn't dropped, so I should be able to access it, but there's no longer data in there. If I were to guess, it's almost as if the table is made only for that call and destroyed afterwards, so subsequent calls can't access it?
/teamSelected - Scrapes data, adds to database
@app.route("/teamSelected", methods=['POST', 'GET'])
def new():
    try:
        connection = sqlite3.connect("commentary.db")
        cursor = connection.cursor()
    except:
        print("COULD NOT CONNECT TO DATABASE")
    data = request.get_data()  # gets data passed via AJAX call
    splitData = data.decode().replace("\"", "").split("__")  # data contains different elements split up by "__"
    homeTeam = splitData[0]
    awayTeam = splitData[1]
    tableName = homeTeam + awayTeam + splitData[2]  # unique table name
    cursor.execute("DROP TABLE if exists "+tableName)  # drops table to ensure enters if statement
    cursor.execute("SELECT count(*) FROM sqlite_master WHERE type='table' AND name='"+tableName+"';")
    result = cursor.fetchone()
    number_of_rows = result[0]
    print("R O W S " + str(number_of_rows))  # Always prints 0
    if(number_of_rows == 0):
        create_table_string = "create table if not exists '"+ tableName + "' (id INTEGER PRIMARY KEY, commentary TEXT, time TEXT)"
        cursor.execute(create_table_string)
        def scrapeInfo():
            ...
            # scraping stuff
            ...
            maxUpdate = 5
            updateNumber = 0
            while updateNumber < maxUpdate:
                cursor.execute("INSERT INTO "+tableName+"(commentary, time) VALUES(?,?)", (commentaryUpdates[updateNumber], times[updateNumber]))
                # inserts scraped data into table
                updateNumber += 1
            cursor.execute("select * from " + tableName)
            rows = cursor.fetchall()
            for row in rows:
                print(row)
            # THIS ^ works
            cursor.execute("SELECT count(*) FROM sqlite_master WHERE type='table' AND name='" + tableName + "';")
            result = cursor.fetchone()
            number_of_rows = result[0]
            print(number_of_rows)
            # This ^ prints 1 despite the above for loop printing 5 rows
            return jsonify("CLEAN")
        return scrapeInfo()
    # This is only hit if the table exists, meaning it doesn't enter
    # the if statement, so this section is only hit when the drop table
    # statement above is commented out. Here, this prints nothing, no idea why.
    cursor.execute("select * from " + tableName)
    rows = cursor.fetchall()
    for row in rows:
        print(row)
/RequestCommentary - retrieves data from table
@app.route("/requestCommentary", methods=['POST', 'GET'])
def getCommentary():
    data = request.get_data()
    splitData = data.decode().replace("\"", "").split("__")
    homeTeam = splitData[0]
    awayTeam = splitData[1]
    tableName = homeTeam + awayTeam + splitData[2]
    # Here I'm trying to retrieve the data from the table, but nothing is printed
    cursor.execute("select * from " + tableName)
    rows = cursor.fetchall()
    for row in rows:
        print(row)
    return jsonify("CLEAN")
To recap, unexpected behaviour:
Data is only retrieved from table if it is first dropped, then has data added
If table already exists from previous call where data was added, new call retrieves no data
Number_of_rows after insertion of the data prints 1 (could be relevant as should be printing 5)
Separate route /RequestCommentary cannot access table regardless
No exceptions being thrown
I could really use a hand on this, as I am completely stumped on what the issue is and have been at this for hours.
After some more testing, I'm certain it's something to do with the scope of the tables being created. I'm not sure how or why, but I can only seem to access data I add to the table in that same call; any data added from previous calls is non-existent, which makes me think the table's data is somehow local to the call accessing it rather than global.
Just managed to figure it out. I knew it had something to do with local changes not being seen globally. After looking around, I realised this is exactly the problem I'd be having if I wasn't using connection.commit() to save the changes being made. I've added it now, and the changes can be seen by all calls and everything is working properly.
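The commit behaviour can be reproduced in isolation (a minimal sketch with a throwaway SQLite file; the table and values are invented): without connection.commit(), the inserts live in an open transaction that other connections never see.

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "commentary.db")

# First "call": create the table, insert a row, then commit.
conn = sqlite3.connect(path)
cur = conn.cursor()
cur.execute("CREATE TABLE IF NOT EXISTS commentary (id INTEGER PRIMARY KEY, text TEXT)")
cur.execute("INSERT INTO commentary (text) VALUES (?)", ("Kick off",))
conn.commit()  # without this line, the row is rolled back when the connection closes
conn.close()

# Second "call": a fresh connection sees the committed row.
conn2 = sqlite3.connect(path)
rows = conn2.cursor().execute("SELECT text FROM commentary").fetchall()
print(rows)  # [('Kick off',)]
conn2.close()
```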

cleaning a Postgres table of bad rows

I have inherited a Postgres database and am currently in the process of cleaning it. I have created an algorithm to find the rows where the data is bad. The algorithm is encoded in a function called checkProblem(). Using it, I am able to select the bad rows, as shown below ...
import psycopg2
import pandas as pd
from tqdm import tqdm

schema = findTables(dbName)
conn = psycopg2.connect("dbname='%s' user='postgres' host='localhost'"%dbName)
cur = conn.cursor()
results = []
for t in tqdm(sorted(schema.keys())):
    n = 0
    cur.execute('select * from %s'%t)
    for i, cs in enumerate(tqdm(cur)):
        if checkProblem(cs):
            n += 1
    results.append({
        'tableName': t,
        'totalRows': i+1,
        'badRows' : n,
    })
cur.close()
conn.close()
print(pd.DataFrame(results)[['tableName', 'badRows', 'totalRows']])
Now, I need to delete the rows that are bad. I can think of two different ways of doing it. First, I can write the clean rows into a temporary table and rename that table; I think this option is too memory-intensive. It would be much better if I were able to just delete the specific record at the cursor. Is this even an option?
Otherwise, what is the best way of deleting a record under such circumstances? I am guessing that this should be a relatively common thing that database administrators do ...
Deleting the specific record at the cursor is indeed better. You can do something like the following (using a second cursor for the deletes, since re-executing on the cursor you are iterating over would reset it):
del_cur = conn.cursor()
for i, cs in enumerate(tqdm(cur)):
    if checkProblem(cs):
        # cs is a tuple with cs[0] being the record id
        del_cur.execute('delete from %s where id=%d'%(t, cs[0]))
Or you can store the ids of the bad records and then do something like
DELETE FROM table WHERE id IN (id1,id2,id3,id4)
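A sketch of that second option, shown with sqlite3 so it runs standalone (psycopg2 works the same way with %s placeholders; the id/val schema and the stand-in checkProblem logic are invented for illustration):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)")
cur.executemany("INSERT INTO t (id, val) VALUES (?, ?)",
                [(1, "ok"), (2, "bad"), (3, "ok"), (4, "bad")])

# Stand-in for checkProblem(): here a row is "bad" when val == "bad".
bad_ids = [row[0] for row in cur.execute("SELECT id, val FROM t")
           if row[1] == "bad"]

# Build one DELETE with a placeholder per id; the ids travel as parameters.
placeholders = ",".join("?" for _ in bad_ids)
cur.execute("DELETE FROM t WHERE id IN (%s)" % placeholders, bad_ids)
con.commit()

print(cur.execute("SELECT id FROM t ORDER BY id").fetchall())  # [(1,), (3,)]
```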

MySQL - Match two tables containing HUGE DATA and find the similar data

I have two tables in my SQL database.
Table 1 contains a lot of data, but Table 2 contains a huge amount.
Here's the code I implemented using Python:
import MySQLdb

db = MySQLdb.connect(host="localhost", user="root", passwd="", db="fak")
cursor = db.cursor()
# Execute SQL statement:
cursor.execute("SELECT invention_title FROM auip_wipo_sample WHERE invention_title IN (SELECT invention_title FROM us_pat_2005_to_2012)")
# Get the result set as a tuple:
result = cursor.fetchall()
# Iterate through results and print:
for record in result:
    print(record)
print("Finish.")
# Finish dealing with the database and close it
db.commit()
db.close()
However, it takes so long. I have been running the Python script for an hour, and it still hasn't given me any results.
Please help me.
Do you have an index on invention_title in both tables? If not, then create one:
ALTER TABLE auip_wipo_sample ADD KEY (`invention_title`);
ALTER TABLE us_pat_2005_to_2012 ADD KEY (`invention_title`);
Then combine your query into one that doesn't use a subquery:
SELECT invention_title FROM auip_wipo_sample
INNER JOIN us_pat_2005_to_2012 ON auip_wipo_sample.invention_title = us_pat_2005_to_2012.invention_title
And let me know about your results.
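For illustration, the join can be exercised end-to-end with sqlite3 (CREATE INDEX plays the role of ADD KEY here; the sample titles are invented):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE auip_wipo_sample (invention_title TEXT)")
cur.execute("CREATE TABLE us_pat_2005_to_2012 (invention_title TEXT)")
cur.executemany("INSERT INTO auip_wipo_sample VALUES (?)",
                [("Widget",), ("Gadget",), ("Sprocket",)])
cur.executemany("INSERT INTO us_pat_2005_to_2012 VALUES (?)",
                [("Gadget",), ("Sprocket",), ("Doohickey",)])

# Indexes on the join column let the engine match rows without full scans.
cur.execute("CREATE INDEX idx_a ON auip_wipo_sample (invention_title)")
cur.execute("CREATE INDEX idx_b ON us_pat_2005_to_2012 (invention_title)")

cur.execute("""SELECT auip_wipo_sample.invention_title
               FROM auip_wipo_sample
               INNER JOIN us_pat_2005_to_2012
               ON auip_wipo_sample.invention_title = us_pat_2005_to_2012.invention_title""")
matches = sorted(cur.fetchall())
print(matches)  # [('Gadget',), ('Sprocket',)]
```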
