Importing module required by module - python

I'm writing a web app for a college assignment using Python/Flask and, to keep my app.py file neat, I have a function that queries a DB stored in another file. This function uses the pymysql and json modules, and I can't manage to load these in a way that makes it work - I keep getting an attribute error saying pymysql is not defined.
I've tried putting import statements in my module file (DBjson.py), within the function contained in my module, and within app.py. This is my module code:
def fetchFromDB(host, port, dbname, user, password, query, jsonString=False):
    import pymysql  # these import statements are in the function in this example - one of several places I've tried them!
    import json
    conn = pymysql.connect(host=host, user=user, port=port, passwd=password, db=dbname)
    cursorObject = conn.cursor(pymysql.cursors.DictCursor)
    with cursorObject as cursor:
        cursor.execute(query)
        result = cursor.fetchall()
    conn.close()
    if jsonString == True:
        try:
            for i in range(len(result)):
                result[i]['dateTime'] = result[i]['dateTime'].strftime('%Y-%m-%d %H:%M')
        except:
            pass
        result = json.dumps(result)
    return result
And the route from my app.py:
import pymysql
import json

@app.route('/')
def index():
    wds = DBjson.fetchFromDB(host, port, dbname, user, password, weatherQuery)
    bds = DBjson.fetchFromDB(host, port, dbname, user, password, bikesQuery)
    return render_template('weatherDateTime.html', wds=wds, bds=bds)
Any help on how to make this work?
Thanks!
edit - I wrote a test script from which I can load my module and run my function no problem - I have my import statements at the start of the DBjson.py module file and outside of the function. Is this some quirk of Flask/scoping that I don't know about?
PS - Thanks for all the replies so far
import DBjson
query = "SELECT * FROM dublinBikesInfo WHERE dateTime LIKE (SELECT MAX(datetime) FROM dublinBikesInfo);"
#login details for AWS RDS DB
host="xyza"
port=3306
dbname="xyza"
user="xyza"
password="xyza"
a = DBjson.fetchFromDB(host,port,dbname,user,password,query)
print(a)

Hi, in your code there is an indentation error: all the statements have to be inside the function/method that you have created, e.g.
def method():
    # code here
It is also good practice to import libraries at the beginning of the file, before defining any function/method.
In your scenario, please put all of the function/method's statements inside the function/method.

Python is very unforgiving about indentation. Your module code isn't indented correctly.
The proper way to do this would be:
def function():
    # your code indented in here
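One more thing worth tightening once the indentation and imports are sorted out: the bare try/except pass around the dateTime formatting in fetchFromDB silently swallows every error. A minimal sketch of an explicit version (the 'dateTime' column name comes from the question; rows are assumed to be the dicts a DictCursor returns):

```python
import json
from datetime import datetime

def rows_to_json(rows):
    """Serialize DictCursor rows, formatting datetime values explicitly
    instead of hiding failures behind a bare except."""
    out = []
    for row in rows:
        row = dict(row)
        # only reformat the field when it really is a datetime
        if isinstance(row.get('dateTime'), datetime):
            row['dateTime'] = row['dateTime'].strftime('%Y-%m-%d %H:%M')
        out.append(row)
    return json.dumps(out)
```

fetchFromDB could call this in its jsonString branch instead of mutating result in place.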

Related

API Calls bombs in loop when object not found

My code errors out when the key "#odata.nextLink" is not found in the JSON. I thought the while loop was supposed to account for this? I apologize if this is rudimentary, but this is my first Python project, so I don't know the ropes yet.
Also, for what it's worth, the API results are quite limited; there is no "total pages" value I can extract.
# Import Python ODBC module
import pyodbc
import requests
import json
import sys

cnxn = pyodbc.connect(driver="{ODBC Driver 17 for SQL Server}", server="theplacewherethedatais", database="oneofthosedbthings", uid="u", pwd="pw")
cursor = cnxn.cursor()
storedProc = "exec GetGroupsAndCompanies"
for irow in cursor.execute(storedProc):
    strObj = str(irow[0])
    strGrp = str(irow[1])
    print(strObj+" "+strGrp)
    response = requests.get(irow[2], timeout=300000, auth=('u', 'pw')).json()
    data = response["value"]
    while response["#odata.nextLink"]:
        response = requests.get(response["#odata.nextLink"], timeout=300000, auth=('u', 'pw')).json()
        data.extend(response["value"])
cnxn.commit()
cnxn.close()
You can use the in keyword to test if a key is present:
while "#odata.nextLink" in response:
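Putting that in test into the loop, here is a minimal sketch of the paging logic with the HTTP call factored out into a fetch callable, so the loop ends cleanly when the key is absent; in the original script fetch would be something like lambda url: requests.get(url, timeout=300000, auth=('u', 'pw')).json():

```python
def fetch_all_pages(fetch, first_url):
    """Collect the 'value' items across OData pages; stops when the
    '#odata.nextLink' key is missing instead of raising KeyError."""
    response = fetch(first_url)
    data = list(response["value"])
    while "#odata.nextLink" in response:
        response = fetch(response["#odata.nextLink"])
        data.extend(response["value"])
    return data
```

Factoring the request out also makes the paging logic testable without a live API.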

Airflow: Running PostgresOperator within a PythonOperator with Taskflow API

My goal is to loop over a list that I get from another PythonOperator and, within this loop, save the JSON to the Postgres DB. I am using the Taskflow API from Airflow 2.0.
The code works well if I write the SQL statement directly into the sql parameter of the PostgresOperator. But when I write the SQL to a file and put the file path into the sql parameter, this error is thrown:
psycopg2.errors.SyntaxError: syntax error at or near "sql"
LINE 1: sql/insert_deal_into_deals_table.sql
This is the code of the task:
@task()
def write_all_deals_to_db(all_deals):
    for deal in all_deals:
        deal_json = json.dumps(deal)
        pg = PostgresOperator(
            task_id='insert_deal',
            postgres_conn_id='my_db',
            sql='sql/insert_deal_into_deals_table.sql',
            params={'deal_json': deal_json}
        )
        pg.execute(dict())
The weird thing is, that the code works if I use it as a standalone Operator (outside of a PythonOperator). Like this:
create_deals_table = PostgresOperator(
    task_id='create_deals_table',
    postgres_conn_id='my_db',
    sql='sql/create_deals_table.sql'
)
I tried around a lot, and my guess is that it has to do with the Jinja templating. Somehow, within a PythonOperator, the PostgresOperator can make use of neither the params nor the .sql file parsing.
Any tip or reference is greatly appreciated!
EDIT:
This code works, but is rather a quick fix. The actual problem I am still having, is that the Jinja template is not working for the PostgresOperator when I am using it inside a PythonOperator.
@task()
def write_all_deals_to_db(all_deals):
    sql_path = 'sql/insert_deal_into_deals_table.sql'
    for deal in all_deals:
        deal_json = _transform_json(deal)
        sql_query = open(path.join(ROOT_DIRECTORY, sql_path)).read()
        sql_query = sql_query.format(deal_json)
        pg = PostgresOperator(
            task_id='insert_deal',
            postgres_conn_id='my_db',
            sql=sql_query
        )
        pg.execute(dict())

Sqlite3.DatabaseError only when I deploy

Introduction
I'm developing a Python webapp running on Flask. One of the modules I developed uses sqlite3 to access a database file in one of my project directories. Locally it works like a charm, but I'm having trouble making it run properly on PythonAnywhere.
Code
Here's an excerpt of my module_database.py (both SQL queries are only SELECTs):
import sqlite3
import os

PATH_DB = os.path.join(os.path.dirname(__file__), 'res/database.db')
db = sqlite3.connect(PATH_DB)
cursor = db.cursor()

def init():
    cursor.execute(my_sql_query)
    val = cursor.fetchone()

def process():
    cursor.execute(another_sql_query)
    another_val = cursor.fetchone()
I don't know if that's important but my module is imported like this:
from importlib import import_module
module = import_module(absolute_path_to_module)
module.init() # module init
And afterwards my webapp will regularly call:
module.process()
So, I have one access to the DB in my init() and one in my process(). Both work when I run it locally.
Problem
I pulled my code via GitHub on PythonAnywhere and restarted the app, and I can see in the log file that the DB access in init() worked (I print a value, and it comes out fine).
But then, when my app calls the process() method I got a:
2017-11-06 16:27:55,551: File "/home/account-name/project-name/project_modules/module_database.py", line 71, in my_method
2017-11-06 16:27:55,551: cursor.execute(sql)
2017-11-06 16:27:55,552: sqlite3.DatabaseError: database disk image is malformed
I tried via the console to run an integrity check:
PRAGMA integrity_check;
and it prints OK
I'd be glad to hear if you have any idea where this could come from.
A small thing, and it may not fix your specific problem, but you should always call os.path.abspath on __file__ before calling os.path.dirname; otherwise you can get unpredictable results depending on how your code is imported/loaded/run:
PATH_DB = os.path.join(
    os.path.dirname(os.path.abspath(__file__)),
    'res/database.db'
)

comparing a given variable to data in a database and checking to see if it exists

I have a sqlite DB of API keys, and I want to make something that checks whether a given key is in the database. I'm generating the API keys using another Python script named apikeygen.py. I'm using Python 2.7 and pattern 2.6. This is going to be a data scraping/mining/filtering application that I'm doing just for fun, and it may have a future use in malware analysis.
I need help getting the main piece of code that we will call API.py to check and see if the given API key is in the database.
This is the code for the API.py file so far.
import os, sys; sys.path.insert(0, os.path.join(os.path.dirname(__file__), "..", ".."))
import sqlite3 as lite
from pattern.server import App
from pattern.server import MINUTE, HOUR, DAY

app = App("api")

def search_db(key=''):
    con = lite.connect('apikeys.db')
    with con:
        cur = con.cursor()
        cur.execute("SELECT * FROM keys")
        while True:
            row = cur.fetchone()
            if row == None:
                break
            print row[2]
I'm still not really clear what you are asking. Why don't you explicitly query for the key, rather than iterating over your whole table?
cur.execute("SELECT * FROM keys WHERE key = ?", (key,))

instance has no attribute (python)

I have a weird issue, which is probably easy to resolve.
I have a class Database with an __init__ and an executeDictMore method (among others).
class Database():
    def __init__(self, database, server, login, password):
        self.database = database
        my_conv = { FIELD_TYPE.LONG: int }
        self.conn = MySQLdb.Connection(user=login, passwd=password, db=self.database, host=server, conv=my_conv)
        self.cursor = self.conn.cursor()

    def executeDictMore(self, query):
        self.cursor.execute(query)
        data = self.cursor.fetchall()
        if data == None:
            return None
        result = []
        for d in data:
            desc = self.cursor.description
            dict = {}
            for (name, value) in zip(desc, d):
                dict[name[0]] = value
            result.append(dict)
        return result
Then I instantiate this class in a file db_functions.py :
from Database import Database
db = Database()
And I call the executeDictMore method from a function of db_functions :
def test(id):
    query = "SELECT * FROM table WHERE table_id=%s;" % (id)
    return db.executeDictMore(query)
Now comes the weird part.
If I import db_functions and call db_functions.test(id) from a python console:
import db_functions
t = db_functions.test(12)
it works just fine.
But if I do the same thing from another python file I get the following error :
AttributeError: Database instance has no attribute 'executeDictMore'
I really don't understand what is going on here. I don't think I have another Database class interfering. And I append the folder where the modules are in sys.path, so it should call the right module anyway.
If someone has an idea, it's very welcome.
You have another Database module or package in your path somewhere, and it is getting imported instead.
To diagnose where that other module is living, add:
import Database
print Database.__file__
before the from Database import Database line; it'll print the filename of the module. You'll have to rename one or the other module to not conflict.
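On Python 3, the same check can be done without even importing the suspect module, via importlib (this is a sketch of the diagnosis idea; print Database.__file__ above is the Python 2 equivalent):

```python
import importlib.util

def module_location(name):
    # Path of the file that `import name` would actually load; handy for
    # spotting a duplicate or shadowing module earlier on sys.path
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None

print(module_location("json"))
```

Running it with "Database" as the argument would show which Database.py wins the import race.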
You could at least try to avoid SQL injection. Python provides such neat ways to do so:
def executeDictMore(self, query, data=None):
    self.cursor.execute(query, data)
and
def test(id):
    query = "SELECT * FROM table WHERE table_id=%s"
    return db.executeDictMore(query, (id,))
are the ways to do so.
Sorry, this should rather be a comment, but an answer allows for better formatting. I am aware that it doesn't answer your question...
You should insert (not append) into your sys.path if you want it first in the search path:
sys.path.insert(0, '/path/to/your/Database/class')
I'm not too sure what is wrong, but you could try passing the database object to the function as an argument, like db_functions.test(db, 12), with db being your Database instance.
