Converting data from sqlite3 database into an array format - python

I am trying to access a specific field from the row that self.results receives from my sqlite3 login database, but I can't, because the fetched data does not seem to be in an array format, so the system is unable to use that field. I stripped out all the ', ( and ) characters, but I don't know what to do next to convert this text into an array so that a single field can be fetched and printed.
Could you help me?
while True:
    username = self.usernameEntry.get()
    password = self.passwordEntry.get()
    conn = sqlite3.connect("database.db")
    cursor = conn.cursor()
    findUser = "SELECT * FROM students WHERE CardNumberID = ? AND Password = ?"
    cursor.execute(findUser, [username, password])
    self.results = cursor.fetchone()
    fetchedResults = str(self.results)
    fetchedResults = fetchedResults.replace('(', '')
    fetchedResults = fetchedResults.replace(')', '')
    fetchedResults = fetchedResults.replace("'", '')
    fetchedResults.split(',')
    print(fetchedResults[2])
    print(self.results)
Here are the results that I get:

The results already are in an "array" format (a tuple), but you then explicitly convert the whole thing to a string with str(). Don't do that: index self.results directly.
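To illustrate, here is a minimal, runnable sketch using an in-memory database (the table layout is invented for illustration); fetchone() already returns a tuple you can index:

```python
import sqlite3

# Hypothetical stand-in for the login database
conn = sqlite3.connect(":memory:")
cursor = conn.cursor()
cursor.execute("CREATE TABLE students (CardNumberID TEXT, Password TEXT, Name TEXT)")
cursor.execute("INSERT INTO students VALUES ('1234', 'secret', 'Alice')")
conn.commit()

findUser = "SELECT * FROM students WHERE CardNumberID = ? AND Password = ?"
cursor.execute(findUser, ("1234", "secret"))
results = cursor.fetchone()   # a plain tuple, e.g. ('1234', 'secret', 'Alice')

print(results[2])             # index a field directly, no string munging needed
```

Note that fetchone() returns None when no row matches, so check for that before indexing.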


Python: TypeError: Unsupported format string passed to NoneType.__format__

I have a program that reads from a database. Users choose a specific record they want to display, and the SQL command fetches that record. The table currently displays records that contain no NULLs or empty strings; when a record does contain a NULL or an empty string, I get an error and the program does not display the records. I figure that is where the NoneType error comes from, but I'm not sure how to fix it. How can I make it handle the NULL or empty strings as well? Hopefully that will fix the error.
If you test the DB yourself, tables like Customers fail to display because they contain NULL values.
Here is the Traceback error:
line '..', in read_display(record)
line = format_.format(*rec)
TypeError: unsupported format string passed to NoneType.__format__
This is what my code looks like:
import sqlite3

def read_display(record):
    database = 'data.db'
    connection = sqlite3.connect(database)
    c = connection.cursor()
    sql = "SELECT * FROM {0}".format(record)
    c.execute(sql)
    results = c.fetchall()
    header = tuple(i[0] for i in c.description)
    data = [header] + results
    config = [{'width': 0} for _ in range(len(data[0]))]
    for col, _ in enumerate(data[0]):
        for rec in data:
            config[col]['width'] = max(config[col]['width'], len(str(rec[col])))
    format_ = []
    for f in config:
        format_.append('{:<' + str(f['width']) + '}')
    format_ = ' | '.join(format_)
    for rec in data:
        line = format_.format(*rec)
        print(line)
The error occurs in
line = format_.format(*rec)
and you can reproduce it with
'{:s}'.format(None)
so it seems rec contains None.
You would have to convert it to a string, as str(None) does.
You can apply str() to every element of rec to be safe:
rec = [str(x) for x in rec]
So the code should be:
for rec in data:
    rec = [str(x) for x in rec]
    line = format_.format(*rec)
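As a runnable sketch (with made-up sample rows), applying str() to each element lets the format string handle rows that contain None:

```python
data = [("Alice", 30), ("Bob", None)]   # hypothetical rows; the second has a NULL
format_ = "{:<6} | {:<4}"

lines = []
for rec in data:
    rec = [str(x) for x in rec]         # None becomes the string 'None'
    lines.append(format_.format(*rec))

print("\n".join(lines))
```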

Data type conversion via sql query

I have a problem with data type conversion.
Using Django and the pypyodbc library, I'm trying to receive data from an external Oracle DB and save it into the local app DB.
import pypyodbc

def get_data(request):
    conn = pypyodbc.connect("DSN=...")
    cursor = conn.cursor()
    cursor.execute("SELECT value FROM table")
    data = cursor.fetchall()
    for row in data:
        d = External_Data(first_val=row[0])
        d.save()
The value column contains "0,2", and I received this error message:
could not convert string to float: b',02'
When I changed the SQL statement to:
SELECT cast(value as numeric(10,2)) FROM table
I received error message:
[<class 'decimal.ConversionSyntax'>]
How can I convert the data to a float and save it? I use DecimalField(max_digits=10, decimal_places=2) as the model field.
I think this problem comes from an implicit type change.
When you get data from your get_data function, row[0] in the for loop appears to be a bytes value.
So first of all, I recommend checking row[0]'s type with print(type(row[0])).
If the result is bytes, you can do this:
import pypyodbc

def get_data(request):
    conn = pypyodbc.connect("DSN=...")
    cursor = conn.cursor()
    cursor.execute("SELECT value FROM table")
    rows = cursor.fetchall()
    for row in rows:
        # decode bytes, then swap the comma decimal separator for a dot
        raw = row[0].decode() if isinstance(row[0], bytes) else str(row[0])
        d = External_Data(first_val=float(raw.replace(',', '.')))
        d.save()
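Since the model field is a DecimalField, building a decimal.Decimal is safer than going through float. A sketch, assuming (as above) that the driver returns bytes using a comma as the decimal separator:

```python
from decimal import Decimal

raw = b"0,2"  # hypothetical raw value as the driver might return it

# Decode bytes first: str(b"0,2") would yield "b'0,2'", which is unparseable
text = raw.decode() if isinstance(raw, bytes) else str(raw)
value = Decimal(text.replace(",", "."))

print(value)
```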

Error with string formatting?

Having trouble here. It worked when I had multiple placeholders, (%s, %s), with data like (user, pass).
However, with the following code I keep getting this error:
query = query % tuple([db.literal(item) for item in args])
TypeError: not all arguments converted during string formatting
Why does this keep happening? It only occurs when there is a single argument.
This code is from my Flask application:
username = request.values['username']
update_stmt = (
    "UPDATE ACCOUNTS SET IN_USE = 1 WHERE USER = '(%s)'"
)
data = (username)
cursor.execute(update_stmt, data)
For a single-valued tuple to be recognized as a tuple, you need a trailing comma:
data = (username,)
And, unrelated: you don't really need the quotes in your query:
"UPDATE ACCOUNTS SET IN_USE = 1 WHERE USER = (%s)"
Your full code should be:
username = request.values['username']
update_stmt = (
    "UPDATE ACCOUNTS SET IN_USE = 1 WHERE USER = (%s)"
)
data = (username,)
cursor.execute(update_stmt, data)
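The distinction is easy to check: parentheses alone do not make a tuple, the trailing comma does (the value here is illustrative):

```python
username = "alice"

data = (username)    # no comma: this is just the string 'alice'
assert isinstance(data, str)

data = (username,)   # trailing comma: a one-element tuple
assert isinstance(data, tuple) and len(data) == 1
```

This matters because execute() expects a sequence of parameters; with many DB-API drivers, passing a bare string makes each character look like a separate parameter, hence the "not all arguments converted" error.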

Converting from Tuple to List to modify in Python

I am querying an Oracle database and need special handling around one column that is a CLOB. I can read the CLOB with .read(). I'd like to write the actual value back into my array. It's a tuple, so I must convert it to a list, write the value, then convert back to a tuple. I am getting the error message: TypeError: 'tuple' object does not support item assignment.
My Code:
import cx_Oracle

# USE THIS CONNECTION STRING FOR PRODUCTION
production_username = 'username'
production_password = 'password'
con_string = '%s/%s@hostname:port/orcl' % (production_username, production_password)
con = cx_Oracle.connect(con_string)
cursor = con.cursor()
querystring = "Select ID, Description from Table"
cursor.execute(querystring)
data = cursor.fetchall()
for currentrow in range(1, len(data)):
    description = data[currentrow][1].read()
    data = list(data)
    data[currentrow][1] = description
    data = tuple(data)
con.close()
print data
Try this way:
for currentrow in data:
    description = currentrow[1].read()
    tupled_data = tuple([currentrow[0], description])
    print tupled_data
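The same idea as a self-contained sketch, with a tiny stand-in class in place of a real cx_Oracle LOB (FakeLob is invented for illustration): build new tuples instead of assigning into the ones fetchall() returned.

```python
class FakeLob:
    """Stand-in for a cx_Oracle LOB: exposes a read() method."""
    def __init__(self, text):
        self._text = text

    def read(self):
        return self._text

# Shaped like cursor.fetchall() output: (ID, Description-as-LOB)
data = [(1, FakeLob("first clob")), (2, FakeLob("second clob"))]

# Build a new list of tuples rather than mutating the fetched ones
converted = [(row_id, lob.read()) for row_id, lob in data]

print(converted)
```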

How do I read cx_Oracle.LOB data in Python?

I have this code:
dsn = cx_Oracle.makedsn(hostname, port, sid)
orcl = cx_Oracle.connect(username + '/' + password + '@' + dsn)
curs = orcl.cursor()
sql = "select TEMPLATE from my_table where id ='6'"
curs.execute(sql)
rows = curs.fetchall()
print rows
template = rows[0][0]
orcl.close()
print template.read()
When I do print rows, I get this:
[(<cx_Oracle.LOB object at 0x0000000001D49990>,)]
However, when I do print template.read(), I get this error:
cx_Oracle.DatabaseError: Invalid handle!
So how do I get and read this data? Thanks.
I've found out that this happens when the connection to Oracle is closed before the cx_Oracle.LOB.read() method is used.
orcl = cx_Oracle.connect(usrpass + '@' + dbase)
c = orcl.cursor()
c.execute(sq)
dane = c.fetchall()
orcl.close() # before reading LOB to str
wkt = dane[0][0].read()
And I get: DatabaseError: Invalid handle!
But the following code works:
orcl = cx_Oracle.connect(usrpass + '@' + dbase)
c = orcl.cursor()
c.execute(sq)
dane = c.fetchall()
wkt = dane[0][0].read()
orcl.close() # after reading LOB to str
Figured it out. I have to do something like this:
curs.execute(sql)
for row in curs:
    print row[0].read()
You basically have to loop over the rows returned by fetchall():
dsn = cx_Oracle.makedsn(hostname, port, sid)
orcl = cx_Oracle.connect(username + '/' + password + '@' + dsn)
curs = orcl.cursor()
sql = "select TEMPLATE from my_table where id ='6'"
curs.execute(sql)
rows = curs.fetchall()
for x in rows:
    list_ = list(x)
    print(list_)
There should be an extra comma in the for loop; in the code below, I have added a comma after x:
dsn = cx_Oracle.makedsn(hostname, port, sid)
orcl = cx_Oracle.connect(username + '/' + password + '@' + dsn)
curs = orcl.cursor()
sql = "select TEMPLATE from my_table where id ='6'"
curs.execute(sql)
rows = curs.fetchall()
for x, in rows:
    print(x)
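The comma works because each row from fetchall() is a one-element tuple, and the comma unpacks it in the loop header. A runnable sketch with made-up rows:

```python
# Shaped like fetchall() output for a single-column SELECT
rows = [("TEMPLATE-A",), ("TEMPLATE-B",)]

values = []
for x, in rows:        # 'x,' unpacks each one-element tuple
    values.append(x)

print(values)
```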
I had the same problem in a slightly different context. I needed to query a 27000+ row table, and it turns out that cx_Oracle cuts the connection to the DB after a while.
While a connection to the DB is open, you can use the read() method of the cx_Oracle.LOB object to transform it into a string. But if the query returns a table that is too big, it won't work, because the connection will stop at some point, and when you try to read the results you'll get an error on the cx_Oracle objects.
I tried many things, like setting connection.callTimeout = 0 (according to the documentation, this means it will wait indefinitely), or using fetchall() and then putting the results into a DataFrame or NumPy array, but I could never read the cx_Oracle.LOB objects.
If I run the query with pandas.read_sql(query, connection), the DataFrame contains cx_Oracle.LOB objects whose connection is already closed, making them useless. (Again, this only happens if the table is very big.)
In the end I found a way around this by querying and creating a CSV file immediately after, even though I know it's not ideal.
import cx_Oracle
import pandas as pd

def csv_from_sql(sql: str, path: str = "dataframe.csv") -> bool:
    try:
        with cx_Oracle.connect(config.username, config.password, config.database,
                               encoding=config.encoding) as connection:
            connection.callTimeout = 0
            data = pd.read_sql(sql, con=connection)
            data.to_csv(path)
            print("FILE CREATED")
            return True  # returned from try, not finally, so failures stay False
    except cx_Oracle.Error as error:
        print(error)
        return False
    finally:
        print("PROCESS ENDED\n")

def make_query(sql: str, path: str = "dataframe.csv") -> pd.DataFrame:
    if csv_from_sql(sql, path):
        return pd.read_csv(path)
    return pd.DataFrame()
This took a long time (about 4 to 5 minutes) to bring back my 27000+ row table, but it worked when everything else didn't.
If anyone knows a better way, it would be helpful for me too.
