Using MySQLdb to return values - Python

I have a table called coords and it is defined as:
mysql> describe coords;
+-----------+--------------+------+-----+------------------+----------------+
| Field     | Type         | Null | Key | Default          | Extra          |
+-----------+--------------+------+-----+------------------+----------------+
| id        | int(11)      | NO   | PRI | NULL             | auto_increment |
| location  | varchar(150) | NO   | UNI | NULL             |                |
| latitude  | float(20,14) | YES  |     | 0.00000000000000 |                |
| longitude | float(20,14) | YES  |     | 0.00000000000000 |                |
+-----------+--------------+------+-----+------------------+----------------+
I am using the MySQLdb module in my Python script. The purpose of this table is to store location coordinates, but only for locations whose coordinates I do not already have.
I will be querying this table in my Python program to see if I already have coordinates for a requested location; this avoids repeated calls through the geopy package to Google's geolocation service.
How do I store the returned floats that correspond to a location? So far I have the following:
myVar = cur.execute("SELECT latitude, longitude FROM coords WHERE location ='" + jobLocation + "';")
if myVar == 1:
    print(cur.fetchone())
else:
    try:
        _place, (_lat, _lon) = geos.geocode(jobLocation, region='GB', exactly_one=False)
        print("%s: %.5f, %.5f" % (_place, _lat, _lon))
    except ValueError as err:
        print(err)
The code works (well, not really...) but I have no idea how to get the returned coordinates into separate float variables.
Can you help?

When you do cur.fetchone(), you need to store the result somewhere:
row = cur.fetchone()
print(row[0], row[1])
Now row[0] will contain the latitude, and row[1] the longitude.
If you do this when connecting:
cur = con.cursor(mdb.cursors.DictCursor)
you can then use a dictionary to refer to the columns by name:
row = cur.fetchone()
print row["latitude"], row["longitude"]

Related

Execute a multi-select SQL query using pyodbc

I'm trying to execute a multi-select SQL query using pyodbc, but I keep getting errors about either "no results" or undeclared scalar variables. The query declares SQL variables that are used at several points later on; how can I get it to run through pyodbc?
Would this be feasible if I converted the SQL into a stored procedure?
In practice that is unlikely to be an option, because I do not have write access to the database.
Is there any way to run this type of query from Python, or does it need to be modified in some way?
| ID | LNAME    | FNAME | EMAIL   |
| -- | -------- | ----- | ------- |
| 1  | Smith    | Bob   | s@a.com |
| 2  | Davidson | Mike  | d@a.com |
| 1  | Campbell | Brian | c@a.com |
This is what I have tried so far, but I keep running into errors.
q = """
set ANSI_WARNINGS OFF;
declare #html varchar(MAX)
decalre #dedupedemails varchar(MAX)
decalre #esc_seq int
set #esc_seq = 5;
if object_id('tempdb.dbo.##dedupemail', 'U') is not null
drop table ##dedupemail;
with sub1 as (
select p.ID, p.LNAME, p.FNAME, p.EMAIL
from dbo.person p
),
sub2 as (
select
s1.*,
case when s1.ID = 1
then 'Yes'
else 'No'
end as IS_ADMIN
from sub1 s1
)
select distinct s2.* into ##dedupemail
from sub2 s2
where s2.IS_ADMIN = 'Yes'
set #html = 'abc';
select #dedupedemails = ltrim(stuff((
select '; ' + d.email
from ##dedupemail d
for xml path('')), 1,1,''));
select #dedupedemails as EMAIL_LIST, #html as EMAIL_BODY"""
try:
    cnxn = pyodbc.connect(cnxn_str)
    cursor = cnxn.cursor()
    cursor.execute(q)
    result = cursor.fetchall()
    del cnxn
except pyodbc.Error as e:
    print("Error caught: ", e)
The error:
Error caught: No results. Previous SQL was not a query.
The other error I get complains about an undeclared scalar variable; either way, the query always fails.
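For reference, a common workaround (a sketch, not verified against this exact batch) is to prepend SET NOCOUNT ON so the intermediate statements stop emitting row-count messages that pyodbc treats as result sets, and to skip forward with cursor.nextset() until a statement that actually returned rows is reached:
import pyodbc

cnxn = pyodbc.connect(cnxn_str)  # cnxn_str as in the question
cursor = cnxn.cursor()

# "set nocount on" suppresses the rows-affected messages that would
# otherwise show up as empty result sets before the final SELECT.
cursor.execute("set nocount on;\n" + q)

# Walk past any statements that produced no result set.
while cursor.description is None and cursor.nextset():
    pass

result = cursor.fetchall()
cnxn.close()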

Python 3 - How do I extract data from SQL database and process the data and append to pandas dataframe row by row?

I have a MySQL database, its columns are:
+--------------+--------------+------+-----+---------+----------------+
| Field        | Type         | Null | Key | Default | Extra          |
+--------------+--------------+------+-----+---------+----------------+
| id           | int unsigned | NO   | PRI | NULL    | auto_increment |
| artist       | text         | YES  |     | NULL    |                |
| title        | text         | YES  |     | NULL    |                |
| album        | text         | YES  |     | NULL    |                |
| duration     | text         | YES  |     | NULL    |                |
| artistlink   | text         | YES  |     | NULL    |                |
| songlink     | text         | YES  |     | NULL    |                |
| albumlink    | text         | YES  |     | NULL    |                |
| instrumental | tinyint(1)   | NO   |     | 0       |                |
| downloaded   | tinyint(1)   | NO   |     | 0       |                |
| filepath     | text         | YES  |     | NULL    |                |
| language     | json         | YES  |     | NULL    |                |
| genre        | json         | YES  |     | NULL    |                |
| style        | json         | YES  |     | NULL    |                |
| artistgender | text         | YES  |     | NULL    |                |
+--------------+--------------+------+-----+---------+----------------+
I need to extract the data, process it, and add it to a pandas DataFrame.
I know how to extract data from the SQL database, and I have already implemented a way to pass the data to a DataFrame, but it is extremely slow (about 30 seconds), whereas building a flat list of namedtuples is tremendously faster (under 3 seconds).
Specifically, filepath defaults to NULL until a file is downloaded (currently none of the songs are downloaded), so when Python reads filepath it gets None, and I need that value to become ''.
Because MySQL stores BOOLEAN as TINYINT(1), the instrumental and downloaded values arrive as ints and need to be cast to bool.
The language, genre and style fields are tags stored as JSON lists (currently all NULL); when Python reads them they are strings that I need to turn into lists with json.loads, and when they are None I need to append empty lists instead.
This is my inefficient solution to the problem:
import json
import mysql.connector
from pandas import *

fields = {
    "artist": str(),
    "album": str(),
    "title": str(),
    "id": int(),
    "duration": str(),
    "instrumental": bool(),
    "downloaded": bool(),
    "filepath": str(),
    "language": list(),
    "genre": list(),
    "style": list(),
    "artistgender": str(),
    "artistlink": str(),
    "albumlink": str(),
    "songlink": str(),
}

conn = mysql.connector.connect(
    user="Estranger", password=PWD, host="127.0.0.1", port=3306, database="Music"
)
cursor = conn.cursor()

def proper(x):
    return x[0].upper() + x[1:]

def fetchdata():
    cursor.execute("select {} from songs".format(', '.join(list(fields))))
    data = cursor.fetchall()
    dataframes = list()
    for item in data:
        entry = list(map(proper, item[0:3]))
        entry += [item[3]]
        for j in range(4, 7):
            cell = item[j]
            if isinstance(cell, int):
                entry.append(bool(cell))
            elif isinstance(cell, str):
                entry.append(cell)
        if item[7] is not None:
            entry.append(item[7])
        else:
            entry.append('')
        for j in range(8, 11):
            entry.append(json.loads(item[j])) if item[j] is not None else entry.append([])
        entry.append(item[11])
        entry += item[12:15]
        df = DataFrame(fields, index=[])
        row = Series(entry, index=df.columns)
        df = df.append(row, ignore_index=True)
        dataframes.append(df)
    songs = concat(dataframes, axis=0, ignore_index=True)
    songs.sort_values(['artist', 'album', 'title'], inplace=True)
    return songs
Currently there are 4464 songs in the database and the code takes about 30 seconds to finish.
The SQL table is sorted by artist and title, but for the QTreeWidget I need the entries re-sorted by artist, album and title; MySQL collates data differently from Python, and I prefer Python's sorting.
In my testing, df.loc and df = df.append() are slow and pd.concat is fast, but I don't know how to create single-row DataFrames from flat lists instead of a dictionary, whether there is a faster way than pd.concat, or whether the operations in the for loop can be vectorized.
How can my code be improved?
I figured out how to create a DataFrame with a list of lists and specify column names, and it is tremendously faster, but I still don't know how to also specify the data types elegantly without the code throwing errors...
def fetchdata():
    cursor.execute("select {} from songs".format(', '.join(list(fields))))
    data = cursor.fetchall()
    for i, item in enumerate(data):
        entry = list(map(proper, item[0:3]))
        entry += [item[3]]
        for j in range(4, 7):
            cell = item[j]
            if isinstance(cell, int):
                entry.append(bool(cell))
            elif isinstance(cell, str):
                entry.append(cell)
        if item[7] is not None:
            entry.append(item[7])
        else:
            entry.append('')
        for j in range(8, 11):
            entry.append(json.loads(item[j])) if item[j] is not None else entry.append([])
        entry.append(item[11])
        entry += item[12:15]
        data[i] = entry
    songs = DataFrame(data, columns=list(fields), index=range(len(data)))
    songs.sort_values(['artist', 'album', 'title'], inplace=True)
    return songs
I still need the type conversions; they are already pretty fast, but they don't look elegant.
You could make a list of conversion functions for each column:
funcs = [
    str.capitalize,
    str.capitalize,
    str.capitalize,
    int,
    str,
    bool,
    bool,
    lambda v: v if v is not None else '',
    lambda v: json.loads(v) if v is not None else [],
    lambda v: json.loads(v) if v is not None else [],
    lambda v: json.loads(v) if v is not None else [],
    str,
    str,
    str,
    str,
]
Now you can apply the corresponding conversion function to each field of every row:
for i, item in enumerate(data):
    row = [func(field) for field, func in zip(item, funcs)]
    data[i] = row
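Building on that, a sketch (assuming data, funcs and fields from the code above, with from pandas import * so DataFrame is in scope): convert every row first, then build a single DataFrame in one call, which avoids the per-row append entirely.
# Convert all rows, then build one DataFrame in a single call.
rows = [[func(value) for value, func in zip(item, funcs)] for item in data]
songs = DataFrame(rows, columns=list(fields))
songs.sort_values(['artist', 'album', 'title'], inplace=True)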
For the first part of the question, here is a generic example against a table called history:
import pymysql

# open database (keyword arguments; newer pymysql versions require them)
connection = pymysql.connect(host="localhost", user="root", password="123456", database="blue")
# prepare a cursor object using the cursor() method
cursor = connection.cursor()
# prepare the SQL command
sql = "SELECT * FROM history"
try:
    cursor.execute(sql)
    data = cursor.fetchall()
    print("Last row uploaded", list(data[-1]))
except pymysql.Error:
    print("Error: unable to fetch data")
# disconnect from the server
connection.close()
You can simply fetch the data from the table and create a DataFrame with pandas.
import pymysql
import pandas as pd
from pymysql import Error

# connection details left blank as in the original; port 3306 is the MySQL default
conn = pymysql.connect(host="", user="", connect_timeout=10, password="", database="", port=3306)
if conn:
    cursor = conn.cursor()
    sql = """SELECT * FROM schema.table_name;"""
    cursor.execute(sql)
    data = pd.DataFrame(cursor.fetchall())
    conn.close()
    # You can go ahead and create a csv from this DataFrame
    data.to_csv("output.csv", index=False)  # "output.csv" is a placeholder filename
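Alternatively, pandas can build the DataFrame (with column names) straight from the query. A sketch with placeholder credentials; note that recent pandas versions warn that they prefer an SQLAlchemy connectable over a raw DBAPI connection, but the call still works:
import pandas as pd
import pymysql

# Placeholder connection details -- replace with your own.
conn = pymysql.connect(host="localhost", user="user", password="secret", database="Music")

# read_sql runs the query and names the DataFrame columns after the result set.
songs = pd.read_sql("SELECT * FROM songs", conn)
conn.close()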

MySQL connector Python variables not registering

I'm currently using the mysql-connector-python package to execute database actions on Flask. It had been working well until suddenly the variables no longer seem to be substituted correctly. My code is here:
@bp.route('/addcart', methods=('OPTIONS', 'POST'))
def addcart():
    ...
    userID = session.get("user_id")
    reqDict = request.get_json()
    itemCode = str(reqDict['itemCode'])
    itemAmt = reqDict['itemAmt']
    if userID is not None:
        db = get_db()
        cursor = db.cursor()
        query = ('SELECT %s FROM cartdata WHERE id = %s')
        cursor.execute(query, (itemCode, userID))
        currentNum = cursor.fetchone()[0]
        if currentNum is None:
            stmt = ('UPDATE cartdata SET %s = 1 WHERE id = %s')
            cursor.execute(stmt, (itemCode, userID))
        else:
            currentNum = int(currentNum) + int(itemAmt)
            stmt = ('UPDATE cartdata SET %s = %s WHERE id = %s')
            cursor.execute(stmt, (itemCode, currentNum, userID))
    ....
For some reason, I seem to be having trouble with the itemCode variable. When I use it as a parameter, as in the execution of 'query' or 'stmt', it doesn't work. Typically I will get an error saying:
"You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ''p1' = 1 WHERE id = 21'".
However, if I do this:
query = ('SELECT ' + itemCode + ' FROM cartdata WHERE id = %s')
...
stmt = ('UPDATE cartdata SET '+ itemCode +' = 1 WHERE id = %s')
...
It works properly as intended.
EDIT: I've checked my backend, and apparently the UPDATE statement does not actually update anything. So now I'm at a complete loss.
I don't understand why the connector suddenly breaks for variables. I've checked these variables and their types, and they were the expected types. Any insight would be helpful.
My table schema for 'cartdata' looks something like this:
+-------+---------+------+-----+---------+-------+
| Field | Type    | Null | Key | Default | Extra |
+-------+---------+------+-----+---------+-------+
| id    | int(11) | NO   | PRI | NULL    |       |
| p1    | int(8)  | YES  |     | NULL    |       |
| p2    | int(8)  | YES  |     | NULL    |       |
| p3    | int(8)  | YES  |     | NULL    |       |
| p4    | int(8)  | YES  |     | NULL    |       |
| p5    | int(8)  | YES  |     | NULL    |       |
+-------+---------+------+-----+---------+-------+
That's because when the MySQL connector substitutes your variables into the SQL statement, it formats them according to their Python type, so string values come out wrapped in quotes.
You can actually see it in the error message that you get:
"p1' = 1 WHERE id = 21'"
^
So probably, your SQL query looks like this:
SELECT 'p1' FROM cartdata WHERE id = someId
Which is syntactically invalid SQL...
Your second option, however, looks okay. That said, it is risky to choose the column you select based on user input, so I'd highly recommend validating this value against a whitelist of allowed column names.
Details
You cannot use %s for column names, because the placeholder injects a quoted string value into your SQL query, and that results in invalid SQL syntax (column names are identifiers, not string values).
As above:
SET %s = ...
Generates:
SET 'colName' = ...
which is not valid because you are attempting to assign a value to another value...
That would be the same as trying to do the following in python:
'foo' = 'bar'
or
'foo' = 4
You can use %s when setting values (SET colName = %s) or filtering values (WHERE colName = %s), because in those positions the placeholder stands for a data value, such as a string.
As above:
WHERE colName = %s
Generates:
WHERE colName = 'fooBar'
which is valid because you filter on the values that are equal to the string fooBar.
By the way, you might want to check what
SELECT %s FROM cartdata WHERE id = %s
gives you as a result. That could cause problems: MySQL won't complain, but the result will simply be the literal value of itemCode (SELECT 'hello' is valid SQL; it just returns 'hello').
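A minimal sketch of that whitelist idea (the ALLOWED_COLUMNS set and helper name are hypothetical, based on the cartdata schema shown above): validate the column name in Python, interpolate it only after validation, and keep passing the values through %s placeholders.
# Hypothetical whitelist derived from the cartdata schema.
ALLOWED_COLUMNS = {"p1", "p2", "p3", "p4", "p5"}

def fetch_item_count(cursor, itemCode, userID):
    if itemCode not in ALLOWED_COLUMNS:
        raise ValueError("unexpected column name: %r" % itemCode)
    # Column name interpolated only after validation; the id value
    # still goes through a %s placeholder.
    cursor.execute("SELECT " + itemCode + " FROM cartdata WHERE id = %s", (userID,))
    row = cursor.fetchone()
    return row[0] if row else None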

How to pass id from flask app to mysql database

I have a flask application that is connected to the MySQL database.
NOTE
database name = evaluation
table name = evaluation
columns = eval_id, eval_name, date
I have an evaluation table with fields eval_id, eval_name and date in it.
mysql> use evaluation;
mysql> describe evaluation;
+-----------+-------------+------+-----+---------+-------+
| Field     | Type        | Null | Key | Default | Extra |
+-----------+-------------+------+-----+---------+-------+
| eval_id   | int(11)     | NO   | PRI | NULL    |       |
| eval_name | varchar(20) | NO   |     | NULL    |       |
| date      | datetime(6) | NO   |     | NULL    |       |
+-----------+-------------+------+-----+---------+-------+
How can I write an API to get a particular evaluation by its id?
I tried the below, but it doesn't work.
@app.route('/getEval/<int:eval_id>', methods=['GET'])
def getEvalByID(eval_id):
    cur.execute('''select * from evaluation.evaluation where eval_id=eval_id''')
    res = cur.fetchall()
    return jsonify({'test': str(res)})
How can I correct this so that I get only the evaluation matching the eval_id given in the app.route?
You need to insert the value of eval_id into the query, not the literal text eval_id:
@app.route('/getEval/<int:eval_id>', methods=['GET'])
def getEvalByID(eval_id):
    cur.execute('select * from evaluation.evaluation where eval_id=' + str(eval_id))
    res = cur.fetchall()
    return jsonify({'test': str(res)})
try with cur.execute('select * from evaluation.evaluation where eval_id={}'.format(eval_id))
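Both of these work here because the <int:eval_id> converter guarantees an integer, but as a sketch of the more general habit, you can let the driver do the quoting with a parameterized query (the %s placeholder style assumes a MySQL driver such as MySQLdb or mysql-connector):
@app.route('/getEval/<int:eval_id>', methods=['GET'])
def getEvalByID(eval_id):
    # The driver escapes and substitutes the value for the %s placeholder.
    cur.execute('select * from evaluation.evaluation where eval_id = %s', (eval_id,))
    res = cur.fetchall()
    return jsonify({'test': str(res)})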

MySQL and Python Select Statement

I'm currently trying to select a specific value from a cell in my table, but I can't get MySQL to obey my command. I'm sure I'm doing something wrong, even though this feels right to me. I've checked the documentation and Google but wasn't able to come up with an answer myself, which is why I'm turning to you.
This is the layout of the MySQL table:
mysql> describe configuration;
+---------+--------------+------+-----+---------+-------+
| Field   | Type         | Null | Key | Default | Extra |
+---------+--------------+------+-----+---------+-------+
| option  | varchar(255) | NO   | PRI | NULL    |       |
| setting | varchar(255) | NO   |     | NULL    |       |
+---------+--------------+------+-----+---------+-------+
In PHPMyAdmin it looks like this:
+---------+---------+
| option  | setting |
+---------+---------+
| version | 0.7.48  |
+---------+---------+
I'm trying this:
#!/usr/bin/env python
import MySQLdb
db = MySQLdb.connect(host="XXX", user="XXX", passwd="XXX", db="XX")
cur = db.cursor()
cur.execute("SELECT setting FROM configuration WHERE option = version")
current_version = cur.fetchone()
db.close()
cur.close()
How come this isn't working? I'm at a loss.
e4c5 and acw1668 are right:
cur.execute("SELECT setting FROM configuration WHERE `option` = 'version'")
does the trick: option is a reserved word in MySQL, so the column name needs backticks, and version is a string literal, so it needs quotes.
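Equivalently, a small sketch reusing the cur cursor from the question: let the driver quote the value with a parameterized query and read the setting back as a single value.
cur.execute("SELECT setting FROM configuration WHERE `option` = %s", ("version",))
row = cur.fetchone()
current_version = row[0] if row else None  # e.g. '0.7.48'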
