Is it required to close a Python SQL connection in a Flask app?

I have code like the example below. Is it necessary to close the MySQL connection, given that a new SQL connection is created every time my home page is requested?
I randomly get a connection-limit error, but I'm not sure whether the DB connections are the cause.
@app.route("/Home", methods=["GET"])
def get_home_page():
    db = mysql.connect(host, user, password, db_name, charset='utf8', use_unicode=True)
    ...

It is good practice to close the connection. You can put your code inside a try...finally block so the connection is closed even when an exception is raised.
@app.route("/Home", methods=["GET"])
def get_home_page():
    db = mysql.connect(host, user, password, db_name, charset='utf8', use_unicode=True)
    try:
        ...  # do something
    finally:
        db.close()
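A common alternative in Flask (a sketch, not from the original answer; it assumes a PyMySQL-style mysql.connect like the one above) is to open the connection lazily per request, store it on flask.g, and close it in a teardown handler so every code path releases it:

from flask import Flask, g
import pymysql  # assumed driver; any DB-API module with connect()/close() works

app = Flask(__name__)

def get_db():
    # Open one connection per request and cache it on flask.g
    if "db" not in g:
        g.db = pymysql.connect(host=host, user=user, password=password,
                               db=db_name, charset="utf8")
    return g.db

@app.teardown_appcontext
def close_db(exc):
    # Runs after every request, even when an exception occurred
    db = g.pop("db", None)
    if db is not None:
        db.close()

@app.route("/Home", methods=["GET"])
def get_home_page():
    db = get_db()
    ...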

In my experience with Flask, closing the session after use made a significant difference in API response time.
from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy()

try:
    result = db.session.query(AnyModel).all()  # query any model here
finally:
    db.session.close()  # or sqlalchemy.orm.close_all_sessions() to close every open session

Related

In Python Flask with MySQL, can I use the same connection throughout the app's life cycle?

Here is my Python Flask code:
from flask import *
import mysql.connector

app = Flask(__name__)

conn = mysql.connector.connect(
    host="<my db host>",
    user="<my db user>",
    password="<my db password>",
    database="<my db>"
)

@app.route('/posts/<int:post_id>')
def get_post(post_id):
    with conn.cursor(dictionary=True) as cur:
        cur.execute('select * from posts where ID=%s', (post_id,))
        result = cur.fetchone()
    ans = result['post_content']
    return ans

app.run(debug=False, threaded=True, host='0.0.0.0', port=80)
Note how I don't create a new connection for each request. Instead, I use the same connection for all requests.
My question is: are there any potential problems with this approach?
You should not reuse a single global connection: it stays open for the entire life of the process, it is not safe to share across threads, and it can time out after a long period of inactivity. Only open an SQL connection when you need to use it, and close it afterward. Not doing so can lead to a lot of hard-to-diagnose errors.
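If the concern is the cost of reconnecting on every request, a middle ground (a sketch, not from the original answer; pool_size is an arbitrary example value) is mysql-connector-python's built-in connection pool: each request borrows a connection and returns it to the pool when closed:

from flask import Flask
import mysql.connector.pooling

app = Flask(__name__)

# A small pool created once at startup
pool = mysql.connector.pooling.MySQLConnectionPool(
    pool_name="flask_pool",
    pool_size=5,
    host="<my db host>",
    user="<my db user>",
    password="<my db password>",
    database="<my db>",
)

@app.route('/posts/<int:post_id>')
def get_post(post_id):
    conn = pool.get_connection()  # borrow a connection from the pool
    try:
        cur = conn.cursor(dictionary=True)
        cur.execute('select * from posts where ID=%s', (post_id,))
        result = cur.fetchone()
        cur.close()
        return result['post_content']
    finally:
        conn.close()  # returns the connection to the pool rather than closing it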

How to yield a DB connection in a Python SQLAlchemy function, similar to how it is done in FastAPI?

In FastAPI I had the following function that I used to open and close a DB session:
def get_db():
    try:
        db = SessionLocal()
        yield db
    finally:
        db.close()
And within the routes of my API I would do something like this:
@router.get("/")
async def read_all_events(user: dict = Depends(get_current_user), db: Session = Depends(get_db)):
    logger.info("API read_all_events")
    if user is None:
        raise http_user_credentials_not_valid_exception()
    return db.query(models.Events).all()
You can see that I am injecting the session into the API call.
Now I want to do something similar within a plain Python function:
def do_something():
    # get person data from database
    # play with person data
    # save new person data in database
    # get cars data from database
    ...
So I am wondering whether I should use the same approach as in FastAPI (I do not know how), or whether I should just open and close the connection manually, like this:
def do_something():
    try:
        db = SessionLocal()
        yield db
        # get person data from database
        # play with person data
        # save new person data in database
        # get cars data from database
    finally:
        db.close()
Thanks
The yield here is what lets Depends(get_db) hand the DB session instance to the FastAPI route; as soon as the route returns its response, the finally clause (db.close()) is executed. This is good because every request uses a separate DB session, and the connection is closed after every response.
If you want to use the db session normally in a function, just get the db instance using db = SessionLocal(), and proceed to use the db instance in the function.
Example:
def do_something():
    db = SessionLocal()
    event = db.query(models.Events).first()
    db.delete(event)
    db.commit()
    db.close()
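If you prefer the same open/close guarantee as the FastAPI dependency, a small sketch (assuming the same SessionLocal factory and models from the question) is to wrap the generator in contextlib.contextmanager and use it in a with block:

from contextlib import contextmanager

@contextmanager
def get_db_session():
    # Same pattern as the FastAPI dependency, reusable in plain functions
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()

def do_something():
    with get_db_session() as db:
        event = db.query(models.Events).first()
        db.delete(event)
        db.commit()
    # the session is closed here even if an exception was raised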

cx_Oracle persistent connection on Flask + Apache + mod_wsgi

I have deployed my Flask application on Apache + mod_wsgi.
I'm using WSGI daemon mode and have this config in Apache's httpd.conf:
WSGIDaemonProcess flask_test user=apache group=apache threads=20
For simplicity, let's say that for each request I need to execute a query to insert data into an Oracle database.
So in my Flask application I have done something like this:
# DB.py
import cx_Oracle

class DB:
    def __init__(self, connection_string):
        self.conn = cx_Oracle.connect(connection_string, threaded=True)

    def insert(self, query):
        cur = self.conn.cursor()
        cur.execute(query)
        cur.close()
        self.conn.commit()
# flask_app.py
from flask import Flask, request, jsonify
from DB import DB

app = Flask(__name__)
db = DB(connection_string)

@app.route("/foo", methods=["POST"])
def foo():
    post_data = request.get_json()
    # parse above data
    # create insert query with parsed data values
    db.insert(insert_processed_data_QUERY)
    # generate response
    return jsonify(response)
When I start the Apache + mod_wsgi server, the DB object is created and the DB connection is established.
All incoming requests use this same DB object to execute their insert query.
So far this works fine for me. However, my concern is that if there are no requests for a long period of time, the DB connection might time out, and then my app will fail when a new request finally comes in.
I've been monitoring my application and have observed that the DB connection persists for hours and hours, but I'm fairly sure it would time out if there were no requests for two or three days.
What would be the correct way to ensure that the DB connection stays open forever (i.e. as long as the Apache server is running)?
Use a pool instead of a standalone connection. When you acquire a connection from the pool it will check to see if the connection is no longer valid and automatically dispense a new one. So you need something like this:
pool = cx_Oracle.SessionPool(user=user, password=password, dsn=dsn,
                             min=1, max=2, increment=1)
Then in your code you need to do the following:
with pool.acquire() as connection:
    # do what you need to do with the connection
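Putting that together with the DB class from the question (a sketch, not from the original answer; the pool sizes are example values), each request acquires a connection from the pool and releases it when the with block ends:

# DB.py (pooled variant)
import cx_Oracle

class DB:
    def __init__(self, user, password, dsn):
        # The pool validates connections on acquire and replaces dead ones
        self.pool = cx_Oracle.SessionPool(user=user, password=password, dsn=dsn,
                                          min=1, max=4, increment=1, threaded=True)

    def insert(self, query):
        with self.pool.acquire() as conn:  # released back to the pool on exit
            cur = conn.cursor()
            cur.execute(query)
            cur.close()
            conn.commit()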

My insert statement is executing twice and I'm not sure why (sqlite3)

I have an insert statement.
conn = sqlite3.connect('WO1.db')
with conn:
    cur1 = conn.cursor()
    cur1.execute("insert into workorder (Title, Link, Status) values (?,?,?)",
                 ('iijiji', 'ijjijijj', '22jhhuhij'))
if conn:
    conn.close()
The Title and Link columns had UNIQUE constraints on them, and I was getting the following error, which terminated my program:
sqlite3.IntegrityError: UNIQUE constraint failed:
Even so, one new record was inserted into the database, which is what I wanted.
I then created a new table where the Title and Link columns didn't have a UNIQUE constraint.
I ran the program again and this time received no error; however, the record was inserted into the table twice, which explains the earlier error when Title and Link had UNIQUE constraints.
Is there any logical explanation as to why this insert statement is executing twice?
Note: this is the only place in the program where a connection is established, a query is executed, and the connection is then closed. There is no other interaction with this database in the program beyond the normal configuration.
I haven't had any other sessions open with this database either other than within this application.
I'm running this query in the Python file the program is run from.
app = Flask(__name__)
app.config.from_object(Config)
db = SQLAlchemy(app)

conn = sqlite3.connect('WO1.db')
with conn:
    cur1 = conn.cursor()
    cur1.execute("insert into workorder (Title, Link, Status) values (?,?,?)",
                 ('en24433', 'www.reddit.com', 'Not Completed'))
if conn:
    conn.close()

migrate = Migrate(app, db)

@app.route('/')
def index():
    return render_template('index.html')

if __name__ == '__main__':
    app.run(host='localhost', port=8080, debug=True)
You need to rework your database access code.
First, note that using the connection as a context manager (with conn:) wraps your statements in a transaction that is committed only when the block exits successfully; calling close() on its own never commits anything.
When a database is accessed by multiple connections and one of them modifies the database, the SQLite database is locked until that transaction is committed. I think that this is the reason for your error.
After you make the insert you need to commit the changes. From the sqlite3 documentation on commit():
This method commits the current transaction. If you don't call this method, anything you did since the last call to commit() is not visible from other database connections. If you wonder why you don't see the data you've written to the database, please check that you didn't forget to call this method.
Be aware that close() does not automatically commit:
This closes the database connection. Note that this does not automatically call commit(). If you just close your database connection without calling commit() first, your changes will be lost!
Take a look at the sqlite3 API docs.
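As a small sketch of the pattern the documentation describes (not from the original answer; it reuses the table and values from the question), the with block handles the commit and a separate close() releases the connection:

import sqlite3

conn = sqlite3.connect('WO1.db')
try:
    with conn:  # commits on success, rolls back if an exception is raised
        conn.execute("insert into workorder (Title, Link, Status) values (?,?,?)",
                     ('iijiji', 'ijjijijj', '22jhhuhij'))
finally:
    conn.close()  # close() alone does not commit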
It worked when I put the database connection and insert statements into my index route rather than above the routes. This is most likely because running with debug=True makes the Werkzeug reloader import the module a second time, so any module-level code, including the insert, executes twice.
app = Flask(__name__)
app.config.from_object(Config)
db = SQLAlchemy(app)
migrate = Migrate(app, db)

@app.route('/')
def index():
    conn = sqlite3.connect('WO1.db')
    with conn:
        cur1 = conn.cursor()
        cur1.execute("insert into work_order (Title, Link, Status) values (?,?,?)",
                     ('iikii', 'ijkoijj', '66hhuhij'))
    conn.close()
    return render_template('index.html')
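If module-level code really must run exactly once per process, one option (a sketch; note that you lose automatic code reloading) is to keep the debugger but disable the reloader:

if __name__ == '__main__':
    # The reloader re-imports the module in a child process, which is what
    # makes module-level statements run twice; use_reloader=False avoids that.
    app.run(host='localhost', port=8080, debug=True, use_reloader=False)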

CherryPy and MySQL can't connect to the database

I have a CherryPy "site" set up under Apache with mod_wsgi. It works fine and I can return hello-world messages, no problem. The problem is when I try to connect to my MySQL database. Here is the code I'm using:
import sys
sys.stdout = sys.stderr

import atexit
import threading
import cherrypy
import MySQLdb

cherrypy.config.update({'environment': 'embedded'})

if cherrypy.__version__.startswith('3.0') and cherrypy.engine.state == 0:
    cherrypy.engine.start(blocking=False)
    atexit.register(cherrypy.engine.stop)

def initServer():
    global db
    db = MySQLdb.connect(host="localhost", user="root", passwd="pass", db="Penguin")

class Login(object):
    def index(self):
        return 'Login Page'
    index.exposed = True

class Root(object):
    login = Login()

    def index(self):
        # Sample page that displays the number of records in "table"
        # Open a cursor, using the DB connection for the current thread
        c = db.cursor()
        c.execute('SELECT count(*) FROM Users')
        result = cursor.fetchall()
        cursor.close()
        return 'Help' + result
    index.exposed = True

application = cherrypy.Application(Root(), script_name=None, config=None)
Most of this was copied from the CherryPy site's instructions on setting up mod_wsgi; I just added the database parts, which I pieced together from various internet sources.
When I try to view the root page I get a 500 Internal Server Error. I can still get to the login page fine, so I'm pretty sure I'm messing up the database connection somehow.
You have a bunch of errors, not related to CherryPy really.
def initServer():
    global db
db is not defined in the global scope. Try:
db = None

def initServer():
    global db
In addition, initServer() is never called to create the DB connection.
Another:
c = db.cursor()
c.execute('SELECT count(*) FROM Users')
result = cursor.fetchall()
cursor.close()
cursor is not defined. I think you mean c:
c = db.cursor()
c.execute('SELECT count(*) FROM Users')
result = c.fetchall()
c.close()
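A minimal corrected version of the module-level setup (a sketch combining the fixes above; it still uses a single shared MySQLdb connection, which is not thread-safe, so a per-request or pooled connection would be safer in production):

import cherrypy
import MySQLdb

db = None  # defined at module scope so initServer() can assign it

def initServer():
    global db
    db = MySQLdb.connect(host="localhost", user="root", passwd="pass", db="Penguin")

initServer()  # actually call it so the connection exists before requests arrive

class Root(object):
    def index(self):
        c = db.cursor()
        c.execute('SELECT count(*) FROM Users')
        result = c.fetchall()
        c.close()
        return 'Help ' + str(result)
    index.exposed = True

application = cherrypy.Application(Root(), script_name=None, config=None)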
