connection.py File
import psycopg2

def create_connection():
    connection = psycopg2.connect("dbname=suppliers user=postgres password=postgres")
    return connection

def create_cursor():
    connection = create_connection()
    cursor = connection.cursor()
    return cursor
The example above creates a connection twice when both create_connection() and create_cursor() are called.
Query File
def common_query():
    sql_query = "INSERT INTO supply VALUES (1, 'test', 'test')"
    conn = create_connection()
    cursor = create_cursor()
    with conn:
        with cursor as cur:
            cur.execute(sql_query)
    conn.close()
The example above calls both create_connection() and create_cursor(). But as you can see, create_connection() already establishes a connection, and create_cursor() then calls create_connection() again, creating a second connection.
When the query executes it doesn't raise any error, but it also doesn't insert my data into the database.
Can someone explain what is happening here?
You create two connections on each call to common_query(). One is explicitly closed; the other is closed at some point when it goes out of scope (Python is a garbage-collected language).
You don't commit on either one, so whatever work you did gets rolled back automatically. This is unrelated to the first point: the same thing would happen if you had created only one connection (and also didn't commit on it).
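A minimal corrected sketch: build one connection, derive the cursor from that same connection, and commit before closing. The pattern is plain DB-API 2.0; sqlite3 (stdlib) stands in for psycopg2 here only so the snippet runs without a PostgreSQL server, and the table layout is assumed from the question.

```python
import sqlite3  # stdlib DB-API 2.0 driver; swap in psycopg2.connect(...) for PostgreSQL

def create_connection():
    # One connection for the whole operation -- never call this twice per query.
    return sqlite3.connect(":memory:")

def common_query():
    conn = create_connection()
    conn.execute("CREATE TABLE supply (id INTEGER, a TEXT, b TEXT)")
    with conn:  # commits on success, rolls back on exception
        cur = conn.cursor()  # cursor derived from the SAME connection
        cur.execute("INSERT INTO supply VALUES (1, 'test', 'test')")
    rows = conn.execute("SELECT * FROM supply").fetchall()
    conn.close()
    return rows

print(common_query())  # [(1, 'test', 'test')]
```

With psycopg2 the `with conn:` block behaves the same way: it commits the transaction on normal exit (it does not close the connection, so conn.close() is still needed).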
Related
I am a beginner in python and mysql. I have a small application written in Python that connects to remote mysql server. There is no issues to connect and fetch data. It works fine then the code is outside a function. As I want to close and open connections, execute different queries from several functions inside my application, I would like to be able to call a function to establish a connection or run a query as needed. It seems that when I create an connection, that connection can not be used outside the function. I would like to implement something like this:
mydbConnection():
    ....

mydbQuery():
    ....

connected = mydbConnection()
mysql = 'SELECT *.......'
result = mydbQuery(mysql)

And so on...
Thanks for any direction on this.
import mysql.connector
from mysql.connector import Error
def mydbConnection(host_name, user_name, user_password):
    connection = None
    try:
        connection = mysql.connector.connect(
            host=host_name,
            user=user_name,
            passwd=user_password
        )
        print("Connection to MySQL DB successful")
    except Error as e:
        print(f"The error '{e}' occurred")
    return connection
connection = mydbConnection("localhost", "root", "")
In the above script, you define a function mydbConnection() that accepts three parameters:
host_name
user_name
user_password
The mysql.connector Python SQL module provides a .connect() method that you use here to connect to a MySQL database server. Once the connection is established, the connection object is returned to the caller. Finally, you call mydbConnection() with the host name, username, and password.
Now, to use this connection object, here is a function:
def mydbQuery(connection, query):
    cursor = connection.cursor()
    try:
        cursor.execute(query)
        print("Database created successfully")
    except Error as e:
        print(f"The error '{e}' occurred")
To execute queries, you use the cursor object. The query to be executed is passed to cursor.execute() in string format.
Create a database named db for your social media app in the MySQL database server:
create_database_query = "CREATE DATABASE db"
mydbQuery(connection, create_database_query)
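The same split between a connect function and a query function can be exercised end to end. In this sketch sqlite3 stands in for mysql.connector (so it runs without a server), the table and data are hypothetical, and a commit is added for DML, since .execute() alone does not persist inserts:

```python
import sqlite3
from sqlite3 import Error  # analogous to mysql.connector's Error class

def mydbConnection(db_path):
    connection = None
    try:
        connection = sqlite3.connect(db_path)
        print("Connection to DB successful")
    except Error as e:
        print(f"The error '{e}' occurred")
    return connection

def mydbQuery(connection, query, params=()):
    cursor = connection.cursor()
    try:
        cursor.execute(query, params)
        connection.commit()  # persist DML -- mysql.connector needs this too
        print("Query executed successfully")
    except Error as e:
        print(f"The error '{e}' occurred")

connection = mydbConnection(":memory:")
mydbQuery(connection, "CREATE TABLE users (id INTEGER, name TEXT)")
mydbQuery(connection, "INSERT INTO users VALUES (?, ?)", (1, "alice"))
```

Note the parameter placeholder: sqlite3 uses `?` where mysql.connector uses `%s`, but the structure of the two functions is otherwise unchanged.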
I'm new to Python and learning DB operations with DBUtils. Why would pool.connection().cursor().execute() release the connection too early for reuse?
If you don't need it any more, you should immediately return it to the
pool with db.close(). You can get another connection in the same way.
Warning: In a threaded environment, never do the following:
pool.connection().cursor().execute(...)
This would release the connection too early for reuse which may be
fatal if the connections are not thread-safe. Make sure that the
connection object stays alive as long as you are using it, like that:
db = pool.connection()
cur = db.cursor()
cur.execute(...)
res = cur.fetchone()
cur.close() # or del cur
db.close() # or del db
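The same rule can be written defensively: bind both objects to names and release them in reverse order, so the connection cannot be recycled while the cursor is still in use. contextlib.closing makes the cleanup exception-safe; sqlite3 stands in here for a pooled DBUtils connection so the snippet runs without a pool.

```python
import sqlite3
from contextlib import closing

db_handle = sqlite3.connect(":memory:")  # stands in for pool.connection()

# The connection stays alive for the whole block; the cursor is closed
# first, then the connection -- never the chained one-liner.
with closing(db_handle) as db:
    with closing(db.cursor()) as cur:
        cur.execute("SELECT 1")
        res = cur.fetchone()
print(res)  # (1,)
```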
All the tutorials tell me a database connection is a precious resource: we must close it after doing some operations on it, and reopen it when we want to do something else. But I can only find one property (open) that indicates the connection status.
Does that mean I need to create a connection object for every query, update, or delete?
If I don't want to create a connection for every operation (code like below), how do I safely destroy the connection?
import pymysql

connection = pymysql.connect(host='localhost',
                             user='root',
                             password='password',
                             db='blog',
                             charset='utf8',
                             cursorclass=pymysql.cursors.DictCursor)

with connection.cursor() as cursor:
    # Read a single record
    sql = "SELECT * from categories"
    cursor.execute(sql)
    result = cursor.fetchall()
    print(result)

connection.close()

# do other things ..............................
# maybe return here and not execute what follows

# the error occurs below
with connection.cursor() as cursor:
    # Read a single record
    sql = "SELECT * from categories"
    cursor.execute(sql)
    result = cursor.fetchall()
    print(result)
Normally your program opens a database connection and keeps it open until the program finishes. Creating or opening a connection is an expensive operation, so normally you just want to keep it open for the duration of your program. During the creation of a connection the database has to allocate resources for that connection. If you open and close a connection for every operation, this negatively affects overall database performance, since the database needs exclusive access to those memory structures, and that causes waits.
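A minimal sketch of that long-lived-connection pattern: open the connection once at startup, let every function reuse it (creating only a short-lived cursor per call), and close it once at shutdown. sqlite3 stands in for pymysql here, and the table and data are hypothetical.

```python
import sqlite3

# Opened once at program start and reused by every function.
_conn = sqlite3.connect(":memory:")
_conn.execute("CREATE TABLE categories (id INTEGER, name TEXT)")
_conn.execute("INSERT INTO categories VALUES (1, 'python')")
_conn.commit()

def fetch_categories():
    # Only the cursor is per-call; the connection is shared.
    cur = _conn.cursor()
    try:
        cur.execute("SELECT * FROM categories")
        return cur.fetchall()
    finally:
        cur.close()

rows = fetch_categories()
print(rows)  # [(1, 'python')]

_conn.close()  # once, at program shutdown
```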
I'm running this from PyDev in Eclipse...
import pymysql
conn = pymysql.connect(host='localhost', port=3306, user='userid', passwd='password', db='fan')
cur = conn.cursor()
print "writing to db"
cur.execute("INSERT INTO cbs_transactions(leagueID) VALUES ('test val')")
print "wrote to db"
The result is, at the top of the Console it says C:...test.py, and in the Console:
writing to db
wrote to db
So it's not terminating until after the execute command. But when I look at the table in MySQL, it's empty: a record did not get inserted.
First off, why isn't it writing the record? Second, how can I see a log or error to find out what happened? Usually there is some kind of error in red if the code fails.
Did you commit it? conn.commit()
PyMySQL disables autocommit by default; you can add autocommit=True to connect():
conn = pymysql.connect(
    host='localhost',
    user='user',
    passwd='passwd',
    db='db',
    autocommit=True
)
or call conn.commit() after the insert.
You can either do
conn.commit() before calling close
or
enable autocommit via conn.autocommit(True) right after creating the connection object.
Both ways have been suggested by various people at a duplicate of this question, which can be found here: Database does not update automatically with MySQL and Python
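The two options can be shown side by side in runnable form. sqlite3 is used here because it needs no server: isolation_level=None plays the role of pymysql's autocommit=True, and the temp-file path and table are just for the demo.

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "demo.db")

# Option 1: autocommit -- every statement is flushed immediately.
auto = sqlite3.connect(path, isolation_level=None)
auto.execute("CREATE TABLE t (v TEXT)")
auto.execute("INSERT INTO t VALUES ('kept')")
auto.close()  # no commit needed

# Option 2: explicit transaction -- commit before close, or the work is lost.
manual = sqlite3.connect(path)
manual.execute("INSERT INTO t VALUES ('also kept')")
manual.commit()
manual.close()

# A fresh connection sees both rows, proving both writes persisted.
check = sqlite3.connect(path)
rows = check.execute("SELECT v FROM t").fetchall()
print(rows)  # [('kept',), ('also kept',)]
check.close()
```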
I have a python script that makes about ten INSERTs into a MySQL database. This is its current structure:
conn = MySQLdb.connect(host = DB_HOST,
                       port = DB_PORT,
                       user = DB_USER,
                       passwd = DB_PASSWORD,
                       db = DB_NAME)
cursor = conn.cursor()

cursor.execute("INSERT INTO...")
# do some stuff, then another INSERT
cursor.execute("INSERT INTO...")
# do some other stuff, then another INSERT
cursor.execute("INSERT INTO...")
# etc...

conn.commit()
cursor.close()
conn.close()
Is the above the correct way to do multiple inserts, or should I be closing the cursor after each INSERT? Should I be doing a commit after each INSERT?
should I be closing the cursor after each INSERT?
It doesn't much matter. Cursors are reused cleverly.
You can close them to be super careful of your resources.
Do this.
from contextlib import closing

with closing(conn.cursor()) as cursor:
    cursor.execute("INSERT INTO...")
This assures that the cursor is closed no matter what kind of exceptions happen.
Should I be doing a commit after each INSERT?
That depends on what your application is expected to do.
If it's an "all or nothing" proposition, then you do one commit. All the inserts are good or none of them are.
If partial results are acceptable, then you can commit after each insert.
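The "all or nothing" case can be sketched directly: run every insert inside one transaction and roll back on any failure, so either all rows land or none do. sqlite3 stands in for MySQLdb; the orders table and the deliberate duplicate key are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY)")
conn.commit()

cursor = conn.cursor()
try:
    cursor.execute("INSERT INTO orders VALUES (1)")
    cursor.execute("INSERT INTO orders VALUES (2)")
    cursor.execute("INSERT INTO orders VALUES (1)")  # duplicate key -> fails
    conn.commit()  # reached only if every insert succeeded
except sqlite3.IntegrityError:
    conn.rollback()  # none of the three rows survive
finally:
    cursor.close()

count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(count)  # 0
conn.close()
```

For the partial-results variant, you would instead move conn.commit() inside the loop body, one commit per successful insert.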