Why is pool.connection().cursor().execute() a dangerous operation? - python

I'm new to Python and learning DB operations with DBUtils. Why would pool.connection().cursor().execute() release the connection too early for reuse? The documentation says:
If you don't need it any more, you should immediately return it to the
pool with db.close(). You can get another connection in the same way.
Warning: In a threaded environment, never do the following:
pool.connection().cursor().execute(...)
This would release the connection too early for reuse which may be
fatal if the connections are not thread-safe. Make sure that the
connection object stays alive as long as you are using it, like that:
db = pool.connection()
cur = db.cursor()
cur.execute(...)
res = cur.fetchone()
cur.close() # or del cur
db.close() # or del db
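In other words, here is a rough sketch of what the warning describes (the variable names and the "SELECT 1" query are just illustrative): the chained call creates the pooled connection as an unnamed temporary, so nothing keeps it alive once the statement finishes.
# Roughly what pool.connection().cursor().execute("SELECT 1") amounts to:
tmp_db = pool.connection()   # temporary pooled connection, nothing holds a reference to it
cur = tmp_db.cursor()
cur.execute("SELECT 1")      # "SELECT 1" stands in for your real query
del tmp_db                   # the temporary is discarded as soon as the statement ends,
                             # which returns the underlying connection to the pool while
                             # cur is still alive; another thread can now be handed the
                             # very same connection and use it concurrently.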

Related

What happens if we use multiple PostgreSQL connections without closing them?

connection.py File
import psycopg2

def create_connection():
    connection = psycopg2.connect("dbname=suppliers user=postgres password=postgres")
    return connection

def create_cursor():
    connection = create_connection()
    cursor = connection.cursor()
    return cursor
The above example creates a connection twice when both the create_connection() and create_cursor() methods are called.
Query File
def common_query():
    sql_query = "INSERT INTO supply VALUES (1, 'test', 'test')"
    conn = create_connection()
    cursor = create_cursor()
    with conn:
        with cursor as cur:
            cur.execute(sql_query)
    conn.close()
The above example calls create_connection() and create_cursor(), but as you can see, create_connection() establishes one connection, and create_cursor() then calls create_connection() again, so a second connection is created.
When the query is executed it doesn't show any error, but it also doesn't insert my data into the database.
Can you explain what is happening here?
You create two connections for each call to common_query(). One is explicitly closed; the other is closed at some point after it goes out of scope (Python is a garbage-collected language).
You never commit on the connection that actually ran the INSERT (the one created inside create_cursor()), so its work gets rolled back. The with conn: block only commits conn, which did no work. The same thing would happen if you had created only one connection and also didn't commit on it.
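A minimal fix, sketched here by reusing the question's create_connection() helper, is to open a single connection, take the cursor from that same connection, and let the with conn: block commit it:
def common_query():
    sql_query = "INSERT INTO supply VALUES (%s, %s, %s)"
    conn = create_connection()
    try:
        with conn:                      # commits on success, rolls back on an exception
            with conn.cursor() as cur:  # cursor taken from the same connection
                cur.execute(sql_query, (1, 'test', 'test'))
    finally:
        conn.close()                    # the with block does not close the connection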

In Python's pymysql, can I keep a connection object for later use?

All the tutorials told me that a database connection is a precious resource: we must close it after doing some operations on it and reopen it when we want to do something else. But I only found a property (open) that indicates the connection status.
Does that mean I need to create a connection object for every query, update, or delete?
If I don't want to create a connection for every operation (code like below), how do I safely destroy the connection?
import pymysql

connection = pymysql.connect(host='localhost',
                             user='root',
                             password='password',
                             db='blog',
                             charset='utf8',
                             cursorclass=pymysql.cursors.DictCursor)
with connection.cursor() as cursor:
    # Read a single record
    sql = "SELECT * from categories"
    cursor.execute(sql)
    result = cursor.fetchall()
    print(result)
connection.close()
# do other things ..............................
# maybe return here and do not execute the code below
# the code below raises an error
with connection.cursor() as cursor:
    # Read a single record
    sql = "SELECT * from categories"
    cursor.execute(sql)
    result = cursor.fetchall()
    print(result)
Normally your program opens a database connection and keeps it open until the program finishes. Creating/opening a connection is an expensive operation, so normally you just want to keep it open for the duration of your program. During the creation of a connection the database has to allocate resources for that connection. If you open and close a connection for every operation, this will negatively affect overall database performance, because the database needs exclusive access to those memory structures and that causes waits.
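A minimal sketch of that pattern, using the connection parameters from the question: open the connection once, reuse it for as many short-lived cursors as you need, and close it only when the program is done with the database.
import pymysql

connection = pymysql.connect(host='localhost', user='root', password='password',
                             db='blog', charset='utf8',
                             cursorclass=pymysql.cursors.DictCursor)
try:
    # first operation: a short-lived cursor on the long-lived connection
    with connection.cursor() as cursor:
        cursor.execute("SELECT * FROM categories")
        print(cursor.fetchall())

    # ... other work ...

    # second operation: a new cursor, same connection
    with connection.cursor() as cursor:
        cursor.execute("SELECT * FROM categories")
        print(cursor.fetchall())
finally:
    connection.close()  # close once, when the program no longer needs the database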

pymongo MongoClient end_request() will not terminate cursor

I have a question about the pymongo connection pool (MongoClient).
How is it possible that the cursor ("results" in the following example) still retrieves the documents even after the connection was returned to the connection pool by the end_request() call?
from pymongo import MongoClient

mongo_connection_pool = MongoClient(host="127.0.0.1", port=27017)
db_connection = mongo_connection_pool["db_name"]
collection = db_connection["collection"]
results = collection.find()
db_connection.end_request()
for result in results:
    print result
Is there something that I'm missing?
Cheers
In PyMongo 2.x MongoClient.start_request is used to pin a socket from the connection pool to an application thread. MongoClient.end_request removes that mapping (if it exists).
This has no impact on iterating a cursor. For each OP_GET_MORE operation the driver has to execute it will get a socket out of the pool. If you are in a "request" it will use the request socket for the current thread. If not, it will use any available socket. You can read more about requests here. Note that "requests" no longer exist in PyMongo 3.0.
If you want to "terminate" a cursor, you can del the cursor object or call cursor.close().
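For example, a small sketch (the database and collection names are taken from the question) that reads only what it needs and then frees the server-side cursor explicitly:
from pymongo import MongoClient

client = MongoClient(host="127.0.0.1", port=27017)
collection = client["db_name"]["collection"]

cursor = collection.find()
first_doc = next(cursor, None)  # pull only the first document, if any
cursor.close()                  # explicitly terminate the server-side cursor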

Pymysql Insert Into not working

I'm running this from PyDev in Eclipse...
import pymysql
conn = pymysql.connect(host='localhost', port=3306, user='userid', passwd='password', db='fan')
cur = conn.cursor()
print "writing to db"
cur.execute("INSERT INTO cbs_transactions(leagueID) VALUES ('test val')")
print "wrote to db"
The result is, at the top of the Console it says C:...test.py, and in the Console:
writing to db
wrote to db
So it's not terminating until after the execute command. But when I look in the table in MySQL it's empty. A record did not get inserted.
First off, why isn't it writing the record? Second, how can I see a log or an error to find out what happened? Usually there should be some kind of error in red if the code fails.
Did you commit it? conn.commit()
PyMySQL disables autocommit by default; you can add autocommit=True to connect():
conn = pymysql.connect(
    host='localhost',
    user='user',
    passwd='passwd',
    db='db',
    autocommit=True
)
or call conn.commit() after the insert.
You can either call conn.commit() before calling close, or enable autocommit via conn.autocommit(True) right after creating the connection object.
Both ways have been suggested by various people at a duplicate of this question, which can be found here: Database does not update automatically with MySQL and Python
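Putting it together, a minimal sketch of the insert with an explicit commit (the connection parameters and sample value are the ones from the question, with the value passed as a query parameter):
import pymysql

conn = pymysql.connect(host='localhost', port=3306,
                       user='userid', passwd='password', db='fan')
try:
    with conn.cursor() as cur:
        # parameterized insert; 'test val' is the sample value from the question
        cur.execute("INSERT INTO cbs_transactions(leagueID) VALUES (%s)", ('test val',))
    conn.commit()  # persist the insert; otherwise PyMySQL rolls it back on close
finally:
    conn.close()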

Unable to delete sqlite database

If I execute a script containing the following and then try to delete mydb on the file system, I am unable to do so until I shut down Python IDLE. What is the issue here?
import sqlite3

with sqlite3.connect(r'./mydb') as connection:
    cursor = connection.cursor()
    cursor.executemany('...')
    connection.commit()
The sqlite3 connection context manager manages transactions, not the connection. The __exit__ handler commits or rolls back; it does not close the connection. See Using the connection as a context manager:
Connection objects can be used as context managers that automatically commit or rollback transactions. In the event of an exception, the transaction is rolled back; otherwise, the transaction is committed.
You'll have to explicitly close the connection yourself, or use the contextlib.closing context manager:
from contextlib import closing
with closing(sqlite3.connect(r'./mydb')) as connection:
    with connection:
        cursor = connection.cursor()
        cursor.executemany('...')
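Alternatively, a sketch that closes the connection explicitly (the CREATE TABLE is just an illustrative statement standing in for the question's elided executemany() call):
import sqlite3

connection = sqlite3.connect(r'./mydb')
try:
    with connection:  # the with block only manages the transaction
        cursor = connection.cursor()
        cursor.execute('CREATE TABLE IF NOT EXISTS t (x)')  # illustrative statement
finally:
    connection.close()  # releases the file handle, so mydb can now be deleted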
