When I try to return the value at a specific position from my database and store it in a text file, I get the following error:
Argument must be a string or a number, not 'ResultProxy'.
int(expire) and str(expire) won't convert a 'ResultProxy'.
def expire():
    today = datetime.date.today()
    day = today.strftime('%d %b %y')
    conn = engine.connect()
    sql = text('select account.expire from account where account.user_name = "%s"' % ('Bob'))
    expire = conn.execute(sql)
    filename = 'mysite/expiry.txt'
    read = open(filename, 'r')
    target = open(filename, 'w')
    target.truncate()
    target.write(str(expire))
    target.write("\n")
    target.close()
    read = open(filename, 'r')
    daysleft = read
    return render_template('expire.html', daysleft=daysleft)
How do I convert the ResultProxy into a string?
Executing a query returns a collection of rows, a ResultProxy in SQLAlchemy's case. You are trying to write this result object to the file, rather than the actual result. Since it looks like you only expect one result, fetch a single row and make sure there's one result to write.
results = conn.execute(sql)
row = results.fetchone()
if row is not None:
    expire = row[0]
    # write it to the file
Or if you expect multiple results, loop over them.
results = conn.execute(sql)
for row in results:
    expire = row[0]  # each iteration yields a row; index it for the column value
    # write it to the file
Here is my suggestion on how you can do it in Flask-SQLAlchemy.
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:////tmp/test.db'
db = SQLAlchemy(app)
Create the model that SQLAlchemy uses. I am assuming your table has 3 fields: a primary key, a user_name, and the expire field, which I assume is an integer from your use of it as daysleft.
class Account(db.Model):
    __tablename__ = 'account'  # Sets the actual name of the table in the db

    user_id = db.Column(db.String, primary_key=True)
    user_name = db.Column(db.String)
    expire = db.Column(db.Integer)
Here is your function that will use the model.
def expire():
    today = datetime.date.today()
    day = today.strftime('%d %b %y')
    username = 'Bob'
Query the Account model (which is connected to the db via SQLAlchemy), filtering on the user_name field, and asking only for the first record returned, if any.
    account = db.session.query(Account).filter_by(user_name=username).first()
    filename = 'mysite/expiry.txt'
    target = open(filename, 'w')  # mode 'w' already truncates the file
If the previous query didn't return any rows, account will be None. Otherwise, it will be an Account model instance with each mapped column as an attribute. You can access the value of the expire field using account.expire.
    if account is not None:
        target.write(str(account.expire))
        target.write("\n")
    target.close()

    read = open(filename, 'r')
    daysleft = read.read()
    return render_template('expire.html', daysleft=daysleft)
Related
I have a dataframe and a 5 million row local Postgres database. For each row of the dataframe, I want to add a column that is the result of a query against the Postgres database.
This is what I have right now:
for index, row in df_tf.iterrows():
    row = dict(row)
    company_number = row['National ID']
    q = 'select name from companies where company_number=%s'
    cursor.execute(q, [company_number])
    results = cursor.fetchall()
    if len(results):
        row['name'] = results[0][0]
    else:
        row['name'] = ''
    writer.writerow(row)
So I'm iterating over the rows and writing the results to a local CSV.
Is there a way I could do this more neatly, and keep the results in a local dataframe?
I know I could load the Postgres data into pandas and join directly, but it's rather large and slow, so I would prefer to use a Postgres query.
One way to do it is with SQLAlchemy's declarative_base.
Rough code:
from sqlalchemy import create_engine, Column, String, Integer  # noqa
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import scoped_session
from sqlalchemy.orm import sessionmaker

base = declarative_base()
engine = create_engine("...")  # your connection string here
session = scoped_session(sessionmaker(bind=engine))()

class Companies(base):
    __tablename__ = 'companies'

    # declarative models need a primary key; assuming company_number is it
    company_number = Column(Integer, primary_key=True)
    name = Column(String)
    # other columns
    ...

    @classmethod
    def get_by_company_number(cls, company_number):
        query = session.query(cls).filter(cls.company_number == company_number)
        if query.count() == 0:
            return ''
        else:
            return query.first().name

df_tf['name'] = df_tf['National ID'].apply(Companies.get_by_company_number)
df_tf.to_csv('filename.csv')
I think my first look would be something like (untested):
import pandas
import psycopg2
import csv
import contextlib

def get_company_name(cursor, company_number):
    query = 'SELECT name FROM companies WHERE company_number=%s;'
    cursor.execute(query, [company_number])
    results = cursor.fetchone()
    return results[0] if results else ''

df_tf = pandas.DataFrame("...")

with contextlib.ExitStack() as ctx:
    connection = ctx.enter_context(psycopg2.connect("..."))
    cursor = ctx.enter_context(connection.cursor())
    file_out = ctx.enter_context(open("results.csv", "w"))

    writer = csv.DictWriter(file_out, fieldnames=["National ID", "Name"])
    writer.writeheader()
    for _, row in df_tf.iterrows():
        row = dict(row)
        row['Name'] = get_company_name(cursor, row['National ID'])
        writer.writerow(row)
Depending on the data in the dataframe, it might be worth it to cache results from get_company_name(). I imagine there are better answers, but this is what I would try out of the gate.
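For instance, here is a minimal sketch of that caching idea using functools.lru_cache, assuming the get_company_name() and cursor from the snippet above are in scope:
import functools

@functools.lru_cache(maxsize=None)
def get_company_name_cached(company_number):
    # only distinct company numbers hit the database;
    # repeated IDs are served from the in-memory cache
    return get_company_name(cursor, company_number)
Inside the loop you would then call get_company_name_cached(row['National ID']) instead.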
I have a table of addresses, and I need my Python script to use my Google API geolocate function to return lat/long coordinates for each address and add them to a new field in the same row. The geolocate function works fine; I just can't get the script to iterate through each row of the table, pass the address to the function, and then copy the output lat/long to the field in the same row. Here's what I have:
import urllib, json, time, arcpy

arcpy.env.workspace = "D:/GIS/addr.dbf"

# sets variables, adds new field to hold lat/long coordinates
fc = 'addr.dbf'
field1 = 'address'
field2 = 'loc'
arcpy.AddField_management(fc, field2, "TEXT")

# function that uses Google API to geolocate - this part works consistently
def geolocate(address, api="key_here", delay=4):
    base = r"https://maps.googleapis.com/maps/api/geocode/json?"
    addP = "address=" + address.replace(" ", "+")
    gUrl = base + addP + "&key=" + api
    response = urllib.urlopen(gUrl)
    jres = response.read()
    jData = json.loads(jres)
    if jData['status'] == 'OK':
        resu = jData['results'][0]
        finList = [resu['formatted_address'],
                   resu['geometry']['location']['lat'],
                   resu['geometry']['location']['lng']]
    else:
        finList = [None, None, None]
    time.sleep(delay)
    return finList

# adds address field as text to geolocate in function, adds output lat/long
# (indexed locations 0 and 1 from the finList output)
## this is the part that doesn't work!
geo = geolocate(address = field1)
cursor = arcpy.UpdateCursor(fc, [field1, field2])
for row in cursor:
    field2 = geo[0], geo[1]
    cursor.updateRow(row)
You're calling the geolocate function with the string 'address':
field1 = 'address'
geo = geolocate(address = field1)
You want to call geolocate with the actual address, which means you need to do it within your loop that is iterating through the cursor. So it should be something like:
fields = [field1, field2]
with arcpy.da.UpdateCursor(fc, fields) as cursor:
    for row in cursor:
        address = row[0]
        geo = geolocate(address=address)
        row[1] = "{0}, {1}".format(geo[1], geo[2])  # lat and lng are at indices 1 and 2 of finList
        cursor.updateRow(row)
Note: I used the cursor from the data access module, which was introduced in ArcGIS 10.1. I also used the 'with' syntax for the cursor so that it automatically handles deleting the cursor.
I have a problem where I am trying to read a specific field from the data returned to the self.results variable from the sqlite3 login database. I believe the fetched data is not in an array format, so the system is unable to use that field. I got rid of all the " ' ", "(", and ")" characters, but I do not know what to do now to convert this text into an array so that a field can be fetched and printed.
Could you help me?
while True:
    username = self.usernameEntry.get()
    password = self.passwordEntry.get()

    conn = sqlite3.connect("database.db")
    cursor = conn.cursor()
    findUser = "SELECT * FROM students WHERE CardNumberID = ? AND Password = ?"
    cursor.execute(findUser, [username, password])
    self.results = cursor.fetchone()

    fetchedResults = str(self.results)
    fetchedResults = fetchedResults.replace('(', '')
    fetchedResults = fetchedResults.replace(')', '')
    fetchedResults = fetchedResults.replace("'", '')
    fetchedResults.split(',')
    print(fetchedResults[2])
    print(self.results)
The results already come back in an "array" format (fetchone() returns a tuple), but you then explicitly convert the whole thing to a string. Don't do that.
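For example, keep the tuple and index it directly; fetchone() returns a plain tuple, or None when no row matches (a minimal sketch based on your variables):
self.results = cursor.fetchone()
if self.results is not None:
    # self.results is already a tuple, so no string manipulation is needed
    print(self.results[2])  # third column of the matched row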
I created a CRUD endpoint with Flask, but when I try to GET data, I receive a 404 error. I tried to access this endpoint with 'http://127.0.0.1:5002/albums/beck//' and 'http://127.0.0.1:5002/albums/beck' but still get a 404. Since I supplied 'beck' as the artist name, I thought the get method would run fine. I think I added the resource incorrectly.
class Artistdetails(Resource):
    def get(self, artist_name):
        conn = db_connect.connect()
        # Protect against SQL injection
        restricted_char = "!=<>*0&|/\\"
        for char in restricted_char:
            artist_name = artist_name.replace(char, "")
        query_db = conn.execute("SELECT DISTINCT album FROM album WHERE artist='{0}'".format(artist_name.title()))
        result = jsonify({'artistAlbumList': [i[0] for i in query_db.cursor.fetchall()]})
        return result

    def put(self, artist_name, album_name, album_name_new):
        conn = db_connect.connect()
        # Protect against SQL injection
        restricted_char = "!=<>*0&|/\\"
        for char in restricted_char:
            artist_name = artist_name.replace(char, "")
        query_db = conn.execute("UPDATE album SET album='{0}' WHERE artist='{1}' AND"
                                " album='{2}'".format(artist_name.title(), album_name.title(), album_name_new.title()))
        result = jsonify({'putAlbumId': [i[0] for i in query_db.cursor.fetchall()]})
        return result, 201

    def post(self, artist_name, album_name):
        conn = db_connect.connect()
        # Protect against SQL injection
        restricted_char = "!=<>*0&|/\\"
        for char in restricted_char:
            artist_name = artist_name.replace(char, "")
        query_db = conn.execute("INSERT INTO album (album, artist) VALUES"
                                " ({0},{1})".format(artist_name.title(), album_name.title()))
        result = jsonify({'postAlbumId': [i[0] for i in query_db.cursor.fetchall()]})
        return result, 201

    def delete(self, artist_name, album_name):
        conn = db_connect.connect()
        # Protect against SQL injection
        restricted_char = "!=<>*0&|/\\"
        for char in restricted_char:
            artist_id = artist_name.replace(char, "")
            album_id = album_name.replace(char, "")
        query_db = conn.execute("DELETE FROM album WHERE"
                                " artist_id='{0}' AND album_id='{1}'".format(artist_name, album_name))
        result = jsonify({'deleteAlbumId': [i[0] for i in query_db.cursor.fetchall()]})
        return result, 204
# Create API routes
api.add_resource(Api, '/')
api.add_resource(Albums, '/albums')
api.add_resource(Artistdetails, '/albums/<string:artist_name>/<string:album_name>/<string:album_name_new>')
api.add_resource(Genreyear, '/albums/yr')
api.add_resource(Genrenum, '/albums/genre')
api.add_resource(Artists, '/artists')
This line:
api.add_resource(Artistdetails,
'/albums/<string:artist_name>/<string:album_name>/<string:album_name_new>')
It adds a path to the Flask router that makes it expect /albums/<artist_name>/<album_name>/<album_name_new>, whereas you're trying to request /albums/<artist_name>, which doesn't match anything.
A quick fix for you would be:
api.add_resource(Artistdetails, '/albums/<string:artist_name>')
However, you might instead want to support query string parameters for your search interface so that requests look more like this:
/albums?artist=<string>&album_name=<string>
To do that, the documentation for Flask-RESTful's reqparse would be useful.
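A minimal sketch of that approach, assuming Flask-RESTful's reqparse (the AlbumSearch resource name is hypothetical here):
from flask_restful import Resource, reqparse

parser = reqparse.RequestParser()
parser.add_argument('artist', type=str, location='args')
parser.add_argument('album_name', type=str, location='args')

class AlbumSearch(Resource):
    def get(self):
        # e.g. GET /albums?artist=beck&album_name=odelay
        args = parser.parse_args()
        artist = args['artist']          # None when the parameter is absent
        album_name = args['album_name']
        # build and run the query from whichever filters were supplied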
I have a table called 'authors' in database 'menagerie', which has two columns, id and photo. id is INT and photo is BLOB. When I try to store an image into the MySQL database table, I get the following error:
TypeError: not enough arguments for format string
My full code is:
import MySQLdb

conn = MySQLdb.connect("localhost", "root", "rset", "menagerie")
cursor = conn.cursor()

def read_file(filename):
    with open(filename, 'rb') as f:
        photo = f.read()
    return photo

def update_blob(author_id, filename):
    # read file
    data = read_file(filename)
    # prepare update query and data
    query = "UPDATE authors SET photo = %s WHERE id = %s"
    # query blob data from the authors table
    cursor.execute(query, (author_id,))
    photo = cursor.fetchone()[0]
    cursor.close()
    conn.close()

def main():
    update_blob(1, "d:/Emmanu/project-data/bc1.jpg")

if __name__ == '__main__':
    main()
Try with the following changes:
query = "UPDATE authors SET photo = %s WHERE id = %s"
cursor.execute(query, (data, author_id))
%s is a placeholder that the driver converts into a properly escaped SQL literal (which also avoids SQL injection).
Not sure what you were trying to do with .format(data, author_id), and then providing author_id as a one-item tuple in your original execute call, but see if the proposed changes work for you.
With these changes you would end up with:
import MySQLdb

conn = MySQLdb.connect("localhost", "root", "rset", "menagerie")
cursor = conn.cursor()

def read_file(filename):
    with open(filename, 'rb') as f:
        photo = f.read()
    return photo

def update_blob(author_id, filename):
    # read file
    data = read_file(filename)
    # prepare update query and data
    query = "UPDATE authors SET photo = %s WHERE id = %s"
    # update the blob column in the authors table
    cursor.execute(query, (data, author_id))
    conn.commit()  # MySQLdb does not autocommit, so persist the update
    cursor.close()
    conn.close()

def main():
    update_blob(1, "d:/Emmanu/project-data/bc1.jpg")

if __name__ == '__main__':
    main()