Do peewee models automatically close the connection?

I am using peewee to access a SQLite DB.
I have made a model.py like:
from peewee import *

db = SqliteDatabase('people.db')

class Person(Model):
    name = CharField()
    birthday = DateField()
    is_relative = BooleanField()

    class Meta:
        database = db
In another Python file (with import model) I then manipulate the DB with calls like Person.create() or Person.select(name=='Joe').delete_instance().
The Quickstart says at the end to call db.close() to close the connection. Does this apply to my case as well? Am I supposed to call something like model.db.close()?

According to Charles Leifer, the author of peewee, it is the programmer's job to terminate connections. The documentation on connection pools says that all connections are thread-local, so as long as the model is in use the connection stays open, and it dies when the thread containing the transaction joins the main thread.
Charles explicitly answers a question about the connection pool. The answer is somewhat generalized, but I suppose it applies to all connections equally: About connection pool
Implicit answers on the topic:
Error 2006: MySQL server has gone away
Excerpt from the docs Quickstart Page:
Although it’s not necessary to open the connection explicitly, it is good practice since it will reveal any errors with your database connection immediately, as opposed to some arbitrary time later when the first query is executed. It is also good to close the connection when you are done – for instance, a web app might open a connection when it receives a request, and close the connection when it sends the response.
Based on this information, the final answer to your question is: no.

You open and close the connection manually:
In your case (with db = SqliteDatabase('people.db'))
you establish a connection to the database with:
db.connect()
then you do whatever you want with the database, and finally you close the connection with:
db.close()
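Put together, a minimal sketch of that lifecycle, assuming the model.py from the question is importable as model:

from datetime import date

import model  # the model.py from the question

model.db.connect()  # fails fast if the database is unreachable
model.Person.create(name='Joe', birthday=date(1990, 1, 1), is_relative=False)
model.db.close()    # closing again is the programmer's job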

Related

SQLalchemy: Engine/Connection disconnects after some hours of inactivity

Situation
I have a plotly-dash application running in a docker container (based on python3.7-slim).
The app is accessing a postgres database and visualizes the queried data.
However, if the app has not been used for some time (I would estimate around 24-48 hours; we first noticed this issue on Mondays after nobody had used the app during the weekend), i.e. if no data has been queried from the database, the app freezes and the logs show some errors related to the database.
I cannot fully access the logs, but they contain this error:
AttributeError: 'Connection' object has no attribute '_Connection__connection'
and in the following, all the pieces of code which tried to query data from the database are stated (but not what exactly went wrong).
The problem was always solved by restarting the app (and thus establishing a new connection to the database).
Assumption
As stated above, this always occurred after a period of inactivity, so my assumption is that the engine disconnects after some idle time.
Code Sample
For accessing the database, I have a DatabaseConnection class. The relevant part of the code contains something like this:
from sqlalchemy import create_engine
...
engine = create_engine(f"postgresql+psycopg2://{user}:{passw}#{url}:{port}/{db_name}")
self.engine = engine.connect()
...
Question
What is the best solution for overcoming the issue of the disconnect after some inactivity?
How could I possibly check whether the database connection is still active and if not, reconnect it somehow?
Is there a better way to access the database than through an engine object?
Is there something wrong with my approach in general?
Please let me know if you require further information. Thanks in advance.
There is an error in my code. It should be
self.engine = create_engine(f"postgresql+psycopg2://{user}:{passw}@{url}:{port}/{db_name}")
and the second line should be omitted. I misunderstood what engine.connect() is doing: It returns a Connection object (not an engine, as the attribute name suggests).
Then, for each query I execute, I use the context manager like this:
with self.engine.connect() as conn:
    table1 = pd.read_sql_table("table1", con=conn)
That way, the Connection object is closed after it has been used. But the engine object may open new connections whenever necessary.
In my previous solution, the Connection was killed after some idle time.
(Based on this GitHub Discussion)
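As an additional sketch beyond that fix (not from the original answer): SQLAlchemy's pool_pre_ping option makes the pool test each connection with a lightweight ping before handing it out and transparently replace stale ones, which guards against exactly this kind of idle disconnect. The names (user, passw, etc.) are the placeholders from the question:

import pandas as pd
from sqlalchemy import create_engine

class DatabaseConnection:
    def __init__(self, user, passw, url, port, db_name):
        # one engine per application; pool_pre_ping checks each pooled
        # connection before use and replaces it if the server dropped it
        self.engine = create_engine(
            f"postgresql+psycopg2://{user}:{passw}@{url}:{port}/{db_name}",
            pool_pre_ping=True,
        )

    def read_table(self, table_name):
        # a short-lived Connection per query, returned to the pool on exit
        with self.engine.connect() as conn:
            return pd.read_sql_table(table_name, con=conn)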

Connect to sqlite3.Connection using sqlalchemy

I am using a library that creates an SQLite database in-memory by calling sqlite3.connect(':memory:'). I would like to connect to this database using sqlalchemy to use some ORM and other nice bells and whistles. Is there, in the depths of SQLAlchemy's API, a way to pass the resulting sqlite3.Connection object through so that I can re-use it?
I cannot just re-connect with connection = sqlalchemy.create_engine('sqlite:///:memory:').connect() – as the SQLite documentation states: “The database ceases to exist as soon as the database connection is closed. Every :memory: database is distinct from every other. So, opening two database connections each with the filename ":memory:" will create two independent in-memory databases.” (Which makes sense. I also tried it, and the behaviour is as expected.)
I have tried to follow SQLAlchemy's source code to find the low level location where the database connection is established and SQLite is actually called, but so far I found nothing. It looks like SQLAlchemy uses far too much obscure alchemy to do that for me to understand when and where it happens.
Here's a way to do that:
import sqlite3
from sqlalchemy import create_engine

# some connection is created - by you or someone else
conn = sqlite3.connect(':memory:')
...

def get_connection():
    # just a debug print to verify that it's indeed getting called:
    print("returning the connection")
    return conn

# create a SQLAlchemy engine that reuses the same in-memory sqlite connection
engine = create_engine('sqlite://', creator=get_connection)
From this point on, just use the engine as you wish.
Here's a link to the documentation of this feature.
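For instance, a quick check (with a hypothetical people table) that the raw connection and the engine really share one in-memory database:

import sqlite3
from sqlalchemy import create_engine, text

conn = sqlite3.connect(':memory:')
conn.execute("CREATE TABLE people (name TEXT)")
conn.execute("INSERT INTO people VALUES ('Joe')")

engine = create_engine('sqlite://', creator=lambda: conn)

with engine.connect() as sa_conn:
    # the engine sees the table created on the raw sqlite3 connection
    print(sa_conn.execute(text("SELECT name FROM people")).fetchall())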

Python solving _mysql_exceptions.OperationalError: (2006, 'MySQL server has gone away') when using db models

So our Python program is running into these errors: _mysql_exceptions.OperationalError: (2006, 'MySQL server has gone away'). The problem is that the program accesses the DB, then does a lot of crawling, and only comes back with the results after the MySQL connection has timed out ... and then it's too late.
Logically there are two solutions:
increase the MySQL connection timeout, but this is not an option
have Python check for an open connection, and re-open it if it has been closed
Some solutions that sound like possible fixes have been found, e.g. about closing and re-opening cursors.
However, we are using models from Django, and I don't know where to implement the logic that checks for a live connection and reconnects when the connection has been lost.
Question: where and how can I implement the described logic to re-connect to a lost DB connection when using models? (Is there some kind of INIT or CONNECT event to hook into?)
Sample code
from django.db import models

class Domain(models.Model):
    name = models.CharField(max_length=100)
    domain = models.CharField(max_length=100, blank=True, null=True)
This is a bug that won't be fixed, as you can see here in the Django docs.
For the workaround, you just have to close the connection if your program is going to be idle for a long time:
from django.db import connection
# DO SOMETHING TO THE DATABASE
connection.close()
If you close your connection, django will automatically re-open it if it needs to query the database.
So if you know that the problem comes up in some function, whether a view or not, you can close the connection right before it or at the beginning of that function:
from django.db import connection
...
connection.close()
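If you would rather not scatter connection.close() calls around, django.db.close_old_connections() does the same thing more selectively: it closes only connections that have become unusable or have exceeded CONN_MAX_AGE (Django itself runs it before and after every request). A sketch for a long-running crawl loop; crawl() and the import path are hypothetical:

from django.db import close_old_connections

from crawler.models import Domain  # hypothetical app path for the model above

def crawl_and_store(urls):
    for url in urls:
        result = crawl(url)      # hypothetical long-running work
        close_old_connections()  # drop dead/expired connections; Django
                                 # reconnects automatically on the next query
        Domain.objects.create(name=result, domain=url)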

Is having a global database connection allowed in WSGI applications?

I need to create a simple project in Flask, and I don't want to use SQLAlchemy. In the code snippet below, every client that connects to the server uses the same connection object, but a new cursor object is created for each request. I am asking because I have never used the Python DB API this way before. Is it correct? Should I create a new connection object for each request, use the same connection and cursor object for every request, or use the method below? Which one is correct?
import mysql.connector
from flask import Flask, request

app = Flask(__name__)

try:
    con = mysql.connector.connect(user='root', password='', host='localhost', database='pywork')
except mysql.connector.Error as err:
    print("Something went wrong")

@app.route('/')
def home():
    cursor = con.cursor()
    cursor.execute("INSERT INTO table_name VALUES(NULL,'test record')")
    con.commit()
    cursor.close()
    return ""
WSGI applications may be served by several worker processes and threads, so you might end up with multiple threads using the same connection. You therefore need to find out whether your library's implementation of the connection is thread-safe: look up the documentation and see whether it claims to provide level 2 thread safety (in DB API terms, threads may share connections).
Then you should reflect on whether or not you need transactions during your requests. If you find you need transactions (e.g., requests issue multiple database commands with an inconsistent state in between, or there are possible race conditions), you should use different connections, because transactions are always connection-wide. Note that some database systems or configurations don't support transactions or don't isolate separate connections from each other.
So if you share a connection, you should assume that you work with autocommit turned on (or better: actually do that).
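A sketch of the per-request pattern with mysql.connector, following the application-context idiom from the Flask docs (pywork and table_name are taken from the question):

import mysql.connector
from flask import Flask, g

app = Flask(__name__)

def get_db():
    # one connection per request, cached on the application context
    if 'db' not in g:
        g.db = mysql.connector.connect(
            user='root', password='', host='localhost', database='pywork')
    return g.db

@app.teardown_appcontext
def close_db(exc):
    # runs after every request, whether it succeeded or raised
    db = g.pop('db', None)
    if db is not None:
        db.close()

@app.route('/')
def home():
    con = get_db()
    cursor = con.cursor()
    cursor.execute("INSERT INTO table_name VALUES (NULL, 'test record')")
    con.commit()
    cursor.close()
    return ""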

In Django, how can I set db connection timeout?

OK, I know it's not that simple. I have two db connections defined in my settings.py: default and cache. I'm using DatabaseCache backend from django.core.cache. I have database router defined so I can use separate database/schema/table for my models and for cache. Perfect!
Now sometimes my cache DB is not available and there are two cases:
Connection to the database was already established when the DB crashed - this is easy - I can use this recipe: http://code.activestate.com/recipes/576780-timeout-for-nearly-any-callable/ and wrap my query like this:
try:
    timelimited(TIMEOUT, self._meta.cache.get, cache_key)
except TimeLimitExpired:
    pass  # live without cache
Connection to the database wasn't yet established - so I need to wrap in timelimited the portion of code that actually establishes the database connection. But I don't know where such code lives or how to wrap it selectively (i.e. wrap only the cache connection, and leave the default connection without a timeout).
Do you know how to do point 2?
Please note, this answer https://stackoverflow.com/a/1084571/940208 is not correct:
grep -R "connect_timeout" /usr/local/lib/python2.7/dist-packages/django/db
gives no results, and the cx_Oracle driver doesn't support this parameter as far as I know.
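For completeness, a sketch of where a connection timeout would live when the driver does support one: Django passes the OPTIONS dict straight through to the DB driver, so with psycopg2 or mysqlclient you could set connect_timeout per alias, here only on the cache alias, leaving default untouched. As noted, this won't help with cx_Oracle:

# settings.py (sketch; engine and names are placeholders)
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'main',
    },
    'cache': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'cachedb',
        'OPTIONS': {'connect_timeout': 5},  # seconds; passed to the driver
    },
}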
