I am using Flask and flask_restplus to build a REST API in Python.
The directory structure I followed is the same as described in the link below, under the heading
Multiple APIs with reusable namespaces
https://flask-restplus.readthedocs.io/en/stable/scaling.html
Now I am trying to add a MySQL database connection, and for that I used the following code in app.py:
from flask import Flask
from flaskext.mysql import MySQL

app = Flask(__name__)
mysql = MySQL()

app.config['MYSQL_DATABASE_USER'] = 'name'
app.config['MYSQL_DATABASE_PASSWORD'] = 'test'
app.config['MYSQL_DATABASE_DB'] = 'textclassifier'
app.config['MYSQL_DATABASE_HOST'] = 'localhost'

mysql.init_app(app)
I tried using the mysql object through
from app import mysql
which is wrong (it leads to a circular import). I read the whole document but I could not find a way to add the MySQL connection without changing the structure.
Does anyone have an idea how I can solve this?
I am new to Flask and REST APIs in Python.
Related
I have an API that I built using Python FastAPI that pulls from a SQLite database using SQLAlchemy. It works fine in dev, but after I deployed it I'm getting:
sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) attempt to write a readonly database
The code right now is:
from sqlalchemy import create_engine

SQLALCHEMY_DATABASE_URL = 'sqlite:///./etfs.db'
engine = create_engine(SQLALCHEMY_DATABASE_URL, connect_args={'check_same_thread': False})
How do I connect to the database so that it isn't read-only? Or how do I create the database in sqlite so that it isn't read-only to begin with?
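One direction to look, as a sketch rather than a verified fix: the relative ./etfs.db path often resolves to a read-only location once deployed, so pointing the URL at a directory the deployed app can actually write to may be enough. The DATA_DIR environment variable and /tmp fallback below are hypothetical placeholders.

import os
from sqlalchemy import create_engine

# Hypothetical writable location; point this at wherever the deployed app can write.
db_path = os.path.join(os.environ.get("DATA_DIR", "/tmp"), "etfs.db")
SQLALCHEMY_DATABASE_URL = f"sqlite:///{db_path}"

engine = create_engine(SQLALCHEMY_DATABASE_URL, connect_args={'check_same_thread': False})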
I am trying to connect a Python Flask app running in Azure App Service Web App to an Azure SQL Database.
This works just fine when I use SQL authentication with a username and password.
Now I want to move to using the Web Apps managed identity.
I have activated the system-assigned managed identity, created a user for it in SQL and added it to the db_datareader role.
I am connecting with SQLAlchemy using a connection string like this:
params = urllib.parse.quote_plus(os.environ['SQL_CONNECTION_STRING'])
conn_str = 'mssql+pyodbc:///?odbc_connect={}'.format(params)
engine_azure = db.create_engine(conn_str,echo=True)
The connection string is stored as an application setting, and its value is
"Driver={ODBC Driver 17 for SQL Server};Server=tcp:<server>.database.windows.net,1433;Database=<database>;Authentication=ActiveDirectoryMsi;"
I expected this to be all I need to do, but now my app is not starting.
The logs report a timeout when connecting to the database.
How can I fix this?
I know this is quite an old post, but it may help people like me who are looking for a solution.
You can modify the connection string by adding an "Authentication" parameter set to "ActiveDirectoryMsi"; there is no need to use the identity endpoint and headers.
(Works with Azure SQL; for other databases like Postgres you may need to use the struct token.)
import pyodbc

conn = pyodbc.connect(
    "Driver=" + driver
    + ";Server=" + server
    + ";PORT=1433;Database=" + database
    + ";Authentication=ActiveDirectoryMsi")
I wrote a quick article for those who are interested in Azure MSI:
https://hedihargam.medium.com/python-sql-database-access-with-managed-identity-from-azure-web-app-functions-14566e5a0f1a
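If you want to stay on SQLAlchemy as in the question, the same connection string should work when passed through odbc_connect. A rough sketch, with the <server> and <database> placeholders left for you to fill in:

import urllib.parse
from sqlalchemy import create_engine

# Same ActiveDirectoryMsi connection string, passed through to pyodbc by SQLAlchemy.
odbc_str = (
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:<server>.database.windows.net,1433;"
    "Database=<database>;"
    "Authentication=ActiveDirectoryMsi;"
)
params = urllib.parse.quote_plus(odbc_str)
engine_azure = create_engine('mssql+pyodbc:///?odbc_connect={}'.format(params), echo=True)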
If you want to connect to an Azure SQL database with Azure MSI in a Python application, you can use pyodbc to implement it.
For example:
Enable the system-assigned identity for your Azure App Service
Add the MSI as a contained database user in your database
a. Connect to your SQL database as the Azure AD admin (I use SSMS to do it)
b. Run the following script in your database:
CREATE USER <your app service name> FROM EXTERNAL PROVIDER;
ALTER ROLE db_datareader ADD MEMBER <your app service name>
ALTER ROLE db_datawriter ADD MEMBER <your app service name>
ALTER ROLE db_ddladmin ADD MEMBER <your app service name>
Code
import os
import pyodbc
import requests
import struct

# get an access token for Azure SQL from the App Service managed identity endpoint
identity_endpoint = os.environ["IDENTITY_ENDPOINT"]
identity_header = os.environ["IDENTITY_HEADER"]
resource_uri = "https://database.windows.net/"
token_auth_uri = f"{identity_endpoint}?resource={resource_uri}&api-version=2019-08-01"
head_msi = {'X-IDENTITY-HEADER': identity_header}

resp = requests.get(token_auth_uri, headers=head_msi)
access_token = resp.json()['access_token']

# expand the token into the length-prefixed UTF-16-LE structure the ODBC driver expects
accessToken = bytes(access_token, 'utf-8')
exptoken = b""
for i in accessToken:
    exptoken += bytes([i])
    exptoken += bytes(1)
tokenstruct = struct.pack("=i", len(exptoken)) + exptoken

# 1256 is SQL_COPT_SS_ACCESS_TOKEN
conn = pyodbc.connect("Driver={ODBC Driver 17 for SQL Server};Server=tcp:andyserver.database.windows.net,1433;Database=database2", attrs_before={1256: bytearray(tokenstruct)})
cursor = conn.cursor()
cursor.execute("select @@version")
row = cursor.fetchall()
For more details, please refer to:
https://github.com/AzureAD/azure-activedirectory-library-for-python/wiki/Connect-to-Azure-SQL-Database
https://learn.microsoft.com/en-us/azure/app-service/overview-managed-identity
https://learn.microsoft.com/en-us/azure/sql-database/sql-database-aad-authentication-configure
I have Python 3.6, pymysql 0.7.11 and sqlalchemy 1.2.4
I am having an issue creating an engine with sqlalchemy.
When I try (credentials changed for privacy, except for ":3306" as the port and the encd variable):
import pymysql
import sqlalchemy

login = 'username'
passwd = 'password'
server = '1.1.1.1:3306'
db = 'db_name'
encd = 'charset=utf8'
engine_str = 'mysql+pymysql://{}:{}@{}/{}?{}'.format(login, passwd, server, db, encd)
engine = sqlalchemy.create_engine(engine_str)
I get:
AttributeError: module 'sqlalchemy.sql.sqltypes' has no attribute 'NativeForEmulated'
Note that I also tried without the port :3306 and had same error.
The error occurs at this point, not when connecting the engine or using the connection.
When I create a connection with pymysql using the same credentials, it works fine:
engine = pymysql.connect(user=login, password=passwd,
                         host='1.1.1.1',
                         database=db, port=3306)
but I need sqlalchemy for this project.
I haven't found anything searching for this error message. I tried running the exact same code on a different computer and it worked fine. Does anyone have any insight?
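Not an answer, but since the same code runs on another machine, one quick sanity check is to confirm which SQLAlchemy and pymysql builds the failing interpreter is actually importing; a missing-attribute error like this often points at a stale or partially upgraded install rather than the connection string:

import sqlalchemy
import pymysql

# Print the versions and install locations actually in use on the failing machine.
print(sqlalchemy.__version__, sqlalchemy.__file__)
print(pymysql.__version__, pymysql.__file__)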
I'm running a flask-rest-jsonapi application on top of Flask, sqlalchemy, and cx_Oracle. One requirement of this project is that the connection property client_identifier, made available via cx_Oracle (relevant documentation), be modifiable based on a value sent in a JWT with each client request. We need to be able to write to this property because our internal auditing tables make use of it to track changes made by individual users.
In PHP, setting this value is straightforward using oci8, and has worked great for us in the past.
However, I have been unable to figure out how to set the same property using this new application structure. In cx_Oracle, the client_identifier property is a 'write-only' property, so it's difficult to verify that the value is set correctly without going to the backend and examining the db session properties. You access this property via the sqlalchemy raw_connection object.
Beyond being difficult to read, setting the value has no effect. We get the desired client identifier value from the JWT passed in with each request and attempt to set it on the raw connection object. While the action of setting the value throws no error, the value does not show up on the backend for the relevant session, i.e. the client_identifier property is null when viewing sessions on the db side.
from flask import Flask, jsonify, request
from flask_sqlalchemy import SQLAlchemy
app = Flask(__name__)
db = SQLAlchemy(app)
# the client_identifier property is available at db.engine.raw_connection()
# attempt to set the client_identifier property
raw_conn = db.engine.raw_connection()
raw_conn.client_identifier = 'USER_210'
# execute some sql using raw_conn.cursor()...
# the client identifier value on the db side for this request is null
Is the approach shown above the correct way to set the client_identifier? If so, why isn't USER_210 listed in the client_identifier column when querying the backend session table using the v$session view?
In pure cx_Oracle, this works for me:
import cx_Oracle
db = cx_Oracle.connect("system", "oracle", "localhost/orclpdb")
db.client_identifier = 'this-is-me'
cursor = db.cursor()
cursor.execute("select username, client_identifier from v$session where username = 'SYSTEM'")
v = cursor.fetchone()
print(v)
The result is:
$ python3 q1.py
('SYSTEM', 'this-is-me')
I don't have the setup to test your exact scenario.
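One thing worth checking, offered as a sketch rather than a verified fix: with a connection pool, db.engine.raw_connection() may not be the same DBAPI connection your request's queries later run on. A SQLAlchemy "checkout" pool listener sets client_identifier on whichever connection is actually handed out; the USER_210 value below stands in for whatever you pull from the JWT, and the listener is registered wherever you currently have access to db.engine.

from sqlalchemy import event

@event.listens_for(db.engine, "checkout")
def set_client_identifier(dbapi_conn, connection_record, connection_proxy):
    # dbapi_conn is the underlying cx_Oracle connection being checked out
    dbapi_conn.client_identifier = 'USER_210'  # e.g. the value taken from the request's JWT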
So I have a main.py file where the Flask app object is created and configured, and MySQL is initialized. Then I want to register some blueprints.
from flask import Flask
from flaskext.mysql import MySQL
app = Flask(__name__)
mysql = MySQL()
app.config['MYSQL_DATABASE_USER'] = 'root'
app.config['MYSQL_DATABASE_PASSWORD'] = 'root'
app.config['MYSQL_DATABASE_DB'] = 'EmpData'
app.config['MYSQL_DATABASE_HOST'] = 'localhost'
mysql.init_app(app)
from views1 import views1_blprnt
app.register_blueprint(views1_blprnt)
But in views1.py I need the MySQL object to get connection and cursor objects to execute queries. And, of course, when I try to import it I get an ImportError. I've read some similar questions and workarounds, but all of them were using SQLAlchemy. Does anyone have ideas on how to resolve this?
Thank you for your help, and sorry for my English.
A fairly typical pattern here is to use a third module where you initialize mysql and any other resources you want to share among your blueprints.
I've noticed that people tend to call it extensions.py. So you might do something like this in extensions.py:
from flaskext.mysql import MySQL
mysql = MySQL()
And in your main.py, something like this:
from flask import Flask
from extensions import mysql

app = Flask(__name__)

app.config['MYSQL_DATABASE_USER'] = 'root'
app.config['MYSQL_DATABASE_PASSWORD'] = 'root'
app.config['MYSQL_DATABASE_DB'] = 'EmpData'
app.config['MYSQL_DATABASE_HOST'] = 'localhost'

mysql.init_app(app)
You'd access the initialized mysql instance from all your blueprints like so:
from extensions import mysql
mysql....
There are other solutions, but this is what I see done most often. It's what I usually do as well.
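For completeness, here is a rough sketch of what a blueprint module could look like with this pattern. The /employees route and Employee table are made up for illustration, and the connect()/cursor() calls assume the Flask-MySQL extension used above:

# views1.py (hypothetical blueprint using the shared mysql instance)
from flask import Blueprint, jsonify
from extensions import mysql

views1_blprnt = Blueprint('views1', __name__)

@views1_blprnt.route('/employees')
def list_employees():
    conn = mysql.connect()      # connection from the shared Flask-MySQL instance
    cursor = conn.cursor()
    cursor.execute("SELECT * FROM Employee")
    rows = cursor.fetchall()
    conn.close()
    return jsonify(list(rows))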