Create cx_Oracle Python timeout on connect

I'm trying to get the Oracle inbuilt timeout attribute to work in Python.
Some relevant docs are here:
https://cx-oracle.readthedocs.io/en/latest/connection.html
import cx_Oracle
connection = cx_Oracle.connect("user/pass@thedb")
connection.callTimeout = 2000
cursor = connection.cursor()
The trouble is that the initial connection command is the one that is taking an excessive period to timeout (several minutes).
My question is: is there a way to apply callTimeout before the connection is made, or is there another way to do what I want?
I'm aware of this help:
Set database connection timeout in Python
It seems excessive to use threads for this.
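For reference, here is a minimal sketch of bounding the connect phase itself, assuming 19c or later Oracle Client libraries where Easy Connect accepts connect_timeout and transport_connect_timeout (both in seconds); the host and service names are placeholders:
import cx_Oracle

# Hypothetical DSN; connect_timeout/transport_connect_timeout limit how long
# the connect attempt itself may take before failing.
dsn = "dbhost.example.com:1521/orclpdb1?connect_timeout=5&transport_connect_timeout=3"
try:
    connection = cx_Oracle.connect("user", "password", dsn)
    connection.callTimeout = 2000  # milliseconds, applies to round-trips after connect
except cx_Oracle.DatabaseError as exc:
    print("Could not connect within the timeout:", exc)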

Related

Python long idle connection in cx_Oracle getting: DPI-1080: connection was closed by ORA-3113

I have a long-running Python executable. It opens an Oracle connection using cx_Oracle at startup.
After the connection has been idle for more than 45-60 minutes, it gets this error.
Any idea, or is there special setup required in cx_Oracle?
Instead of leaving a connection unused in your application, consider closing it when it isn't needed, and then reopening when it is needed. Using a connection pool would be recommended, since pools can handle some underlying failures such as yours and will give you a usable connection.
At application initialization start the pool once:
# pw holds the database password
pool = cx_Oracle.SessionPool("username", pw,
                             "localhost/orclpdb1", min=0, max=4, increment=1)
Then later get the connection and hold it only when you need it:
with pool.acquire() as connection:
    cursor = connection.cursor()
    for result in cursor.execute(
            """select sys_context('userenv','sid') from dual"""):
        print(result)
The end of the with block will release the connection back to the pool. It
won't be closed. The next time acquire() is called the pool can check the
connection is still usable. If it isn't, it will give you a new one. Because of these checks, the pool is useful even if you only have one connection.
See my blog post Always Use Connection Pools — and How, most of which applies to cx_Oracle.
But if you don't want to change your code, then try setting the Oracle Network parameter EXPIRE_TIME as shown in the cx_Oracle documentation. This can be set in various places (a connect-descriptor sketch follows after this list). In C-based Oracle clients like cx_Oracle:
With 18c client libraries it can be added as (EXPIRE_TIME=n) to the DESCRIPTION section of a connect descriptor
With 19c client libraries it can additionally be used via Easy Connect: host/service?expire_time=n.
With 21c client libraries it can additionally be used in a client-side sqlnet.ora file
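As an illustration of the first option, a connect descriptor with EXPIRE_TIME in its DESCRIPTION section might look like the sketch below; the host and service names are placeholders, not from the answer:
import cx_Oracle

# EXPIRE_TIME=2 makes the client send a keep-alive probe roughly every
# 2 minutes, so idle connections are less likely to be silently dropped.
dsn = """(DESCRIPTION=(EXPIRE_TIME=2)
    (ADDRESS=(PROTOCOL=TCP)(HOST=dbhost.example.com)(PORT=1521))
    (CONNECT_DATA=(SERVICE_NAME=orclpdb1)))"""

connection = cx_Oracle.connect("username", "password", dsn)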
This may not always help, depending on what is closing the connection.
Fundamentally, you should fix the root cause if you can; it could be a firewall timeout, or a DBA-imposed user resource limit or database idle-time limit.

How to verify if a mysql database is reachable quickly in python?

I have a MySQL server running on my local network that isn't reachable from outside the network, and it needs to stay that way.
When I am on a different network, the following code hangs for about 5-10 seconds; my guess is that it's retrying the connection a number of times:
import mysql.connector
conn = mysql.connector.connect(
    host="Address",
    user="user",
    password="password",
    database="database"
)
Is there a way to "ping" the MySQL server before this code runs to verify that it is reachable, or to limit the number of retries?
At the moment I am having to use a try-except clause to catch the case where the server is not reachable.
Instead of trying to implement specific behavior before connecting, adjust the connect timeout so that you don't have to wait; for your needs, the server is effectively down if you can't connect within a short timeframe anyway.
You can use connection_timeout to adjust the socket timeout used when connecting to the server.
If you set it to a low value (seems like it's in seconds - so 1 should work fine) you'll get the behavior you're looking for (and it will also help you catch any issues with the user/password/database values).
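A minimal sketch of that suggestion, reusing the placeholder credentials from the question and mysql.connector's connection_timeout parameter:
import mysql.connector
from mysql.connector import Error

try:
    # connection_timeout is in seconds; fail fast if the host is unreachable
    conn = mysql.connector.connect(
        host="Address",
        user="user",
        password="password",
        database="database",
        connection_timeout=1,
    )
except Error as exc:
    print("MySQL server is not reachable:", exc)
else:
    conn.close()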

Is there a limit on the number of connections opened from a client's side to a SQL Server database?

I'm working on a Python application with an SQL Server database using pyodbc, and I need to open multiple connections from the application's side to the database.
I learnt that the max number of connections allowed on an instance of the SQL Server database is 32,767. My understanding is this is the max that the DB instance "can handle", i.e. all simultaneous users combined.
Is there a limit on how many connections one client can open towards the same database instance, is it also 32,767? If yes, where / how is this limit configured?
Taking an educated guess here: there is no connection-count limit on the client side towards the same DB instance, there is a limit of 32,767 on the server side, and the client would most likely run out of other resources long before it gets anywhere near that figure.
I was using one connection, one cursor, and threading to insert multiple records, but kept getting a "connection is busy" error. This was resolved by adding MARS_Connection=yes to the pyodbc database connection string, thanks to this MS documentation.
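A sketch of what that connection string could look like; the server, database, and driver names below are placeholders:
import pyodbc

# MARS_Connection=yes enables Multiple Active Result Sets, so several
# statements can be active on the same connection at once.
conn_str = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.example.com;"
    "DATABASE=mydb;"
    "UID=user;PWD=password;"
    "MARS_Connection=yes;"
)
conn = pyodbc.connect(conn_str)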
Related:
How costly is opening and closing of a DB connection?
Can I use multiple cursors on one connection with pyodbc and MS SQL Server?

How to close mysql connection automatically after specified time in python?

I want to close the MySQL database connection automatically after 50 seconds if a query is taking more than 50 seconds. Is there any option in Python when making the connection, or any other solution to do that?
Reference site for python database connection
Look at the Connection section on that site; it explains the timeout option for queries. You can pass an integer, which is in seconds.
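The answer doesn't name the exact option, so as one hedged alternative (a server-side approach rather than a connection parameter): MySQL 5.7.8+ supports a per-session MAX_EXECUTION_TIME limit, in milliseconds, that aborts SELECT statements which run too long. A sketch with mysql.connector and a hypothetical table name:
import mysql.connector

conn = mysql.connector.connect(host="localhost", user="user",
                               password="password", database="database")
cursor = conn.cursor()

# Limit SELECT statements on this session to 50 seconds (value is in ms);
# requires MySQL 5.7.8+ and applies to SELECT statements only.
cursor.execute("SET SESSION MAX_EXECUTION_TIME = 50000")

# Any subsequent SELECT that runs longer than 50s is aborted with an error.
cursor.execute("SELECT * FROM some_large_table")
rows = cursor.fetchall()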

How many connections will open in mongodb in python?

I am working with MongoDB and Python; to access the database I am doing this:
#pymongo connection settings
from pymongo import Connection
connection = Connection('localhost', 27017)
db = connection['MyDB']
I am inserting documents like this:
db_data = db.mycollection.insert({'name': 'Mulagala', 'age': 24})
and finding like this:
db_data = db.mycollection.find()
When I am creating multiple users or fetching mycollection details multiple times, how many connections will be open in MongoDB? Do I need to close any open connection before returning the result?
No matter how many db.coll.find({...}) and db.coll.insert({...}) calls you make, you will still have only one connection. You do not need to close the open connection (at the end of the script it will be closed automatically).
P.S. MongoClient is the preferred way to connect to MongoDB (Connection is deprecated).
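A sketch of the same code using MongoClient instead of the deprecated Connection class, with the collection and document copied from the question and insert_one as the modern insert method:
from pymongo import MongoClient

client = MongoClient('localhost', 27017)
db = client['MyDB']

# Insert a document and read it back; MongoClient manages a connection
# pool internally, so these calls reuse connections automatically.
db.mycollection.insert_one({'name': 'Mulagala', 'age': 24})
for doc in db.mycollection.find():
    print(doc)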
