I'm making a connection to SQL Server to execute a stored procedure. What is the correct way to 'poll' the server to determine whether the stored procedure finished running successfully or returned an error if the SP takes longer than 60 seconds / 3600 seconds, etc?
import pyodbc
cnxn = pyodbc.connect('DRIVER={SQL Server}; SERVER=ServerName; PORT=1433;DATABASE=dbname;UID=%s;PWD=%s' % (username, password))
cnxn.execute("EXECUTE msdb.dbo.sp_start_job 'TestSP'")
<pyodbc.Cursor object at 0x0000000002D6DDB0>
How can I determine the status of the SP?
Consider wrapping the execute call in a try/except block to catch exceptions (pyodbc errors are raised as exceptions). If nothing is raised, the execute is assumed to have run correctly. Also set the connection's timeout attribute (in seconds); the driver should raise OperationalError if the query times out.
cnxn = pyodbc.connect('DRIVER={SQL Server}; SERVER=ServerName; PORT=1433; \
                       DATABASE=dbname;UID={0};PWD={1}'.format(username, password))
cnxn.timeout = 60
cursor = cnxn.cursor()
try:
    cnxn.execute("EXECUTE msdb.dbo.sp_start_job 'TestSP'")
except Exception as e:
    print(e)
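One caveat worth noting: sp_start_job only queues the SQL Server Agent job and returns immediately, so a timeout on the EXECUTE call cannot tell you whether the job itself succeeded. A polling sketch (assuming the login is allowed to run msdb.dbo.sp_help_job; the status codes and column name are taken from the SQL Server Agent documentation, and wait_for_job/job_finished are hypothetical helper names):

```python
import time

# current_execution_status codes reported by msdb.dbo.sp_help_job
# (1 = executing, 4 = idle, per the SQL Server Agent docs)
EXECUTING = 1
IDLE = 4

def job_finished(status):
    """A job that has returned to the idle state has stopped running."""
    return status == IDLE

def wait_for_job(cnxn, job_name, poll_every=5, timeout=3600):
    """Poll msdb until the named Agent job stops executing or timeout expires."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        row = cnxn.execute(
            "EXEC msdb.dbo.sp_help_job @job_name = ?", job_name
        ).fetchone()
        if job_finished(row.current_execution_status):
            return row  # inspect last_run_outcome on this row for success/failure
        time.sleep(poll_every)
    raise TimeoutError(f"{job_name} still running after {timeout}s")
```

The returned row also carries a last_run_outcome column, which is what distinguishes a job that finished successfully from one that failed.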
It looks like you've skipped making a cursor, so you need to do that, then fetch the results. Try this:
import pyodbc
connection = pyodbc.connect('DRIVER={SQL Server}; SERVER=ServerName; PORT=1433;DATABASE=dbname;UID=%s;PWD=%s' % (username, password))
cursor = connection.cursor()
cursor.execute("EXECUTE msdb.dbo.sp_start_job 'TestSP'")
rows = cursor.fetchall()
for row in rows:
    # Do stuff
    print(row)
I have a word_set with ~170k elements and want to save it to a MySQL database. My code works with a small amount (for example 5 or 10 records) but returns an error with a larger amount:
2055: Lost connection to MySQL server at 'localhost:3306', system
error: 10054 An existing connection was forcibly closed by the remote host
Code:
from mysql.connector import connect, Error
try:
    with connect(
        host='localhost',
        user='root',
        password='root',
        port=3306,
        database='scrabble',
        # connection_timeout=20,
    ) as connection:
        insert_vocabulary = """
            INSERT INTO vocabulary
            (word)
            VALUES ( %s )
        """
        vocabulary_records = []
        for word in word_set:
            record = (word,)
            vocabulary_records.append(record)
        with connection.cursor() as cursor:
            cursor.executemany(insert_vocabulary,
                               vocabulary_records)
        connection.commit()
except Error as e:
    print(e)
I tried specifying connection_timeout=20 but it doesn't change anything; the error occurs within a second, definitely not after 20 seconds.
What am I doing wrong?
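One likely cause, offered as a guess: a single executemany over ~170k rows can build a statement larger than the server's max_allowed_packet, and MySQL drops the connection immediately when that limit is exceeded, which matches the sub-second 10054 error. Splitting the records into batches keeps each statement small; batches is a hypothetical helper and 1000 is an arbitrary batch size:

```python
def batches(records, size=1000):
    """Yield successive slices of at most `size` records."""
    for start in range(0, len(records), size):
        yield records[start:start + size]

# Usage inside the question's `with connection.cursor() as cursor:` block:
# for batch in batches(vocabulary_records, 1000):
#     cursor.executemany(insert_vocabulary, batch)
# connection.commit()
```

Raising max_allowed_packet on the server is the alternative, but batching also keeps memory use on both sides bounded.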
Good Afternoon,
I wrote a query in MySQL, and I want to execute the same query in Python.
The code I wrote is as follows.
1.
import mysql.connector
from mysql.connector import Error
try:
    connection = mysql.connector.connect(host='localhost',
                                         database='AdventureWorks2012',
                                         user='root',
                                         password='r##*****')
    sql_select_Query = "select * from Person.person"
    cursor = connection.cursor()
    cursor.execute(sql_select_Query)
    records = cursor.fetchall()
However, I'm getting the following error message while running part two:
''File "", line 5
password='r##*****')
^
SyntaxError: unexpected EOF while parsing''
Any suggestions on how to overcome this problem?
You are missing the except clause:
try:
    connection = mysql.connector.connect(host='localhost',
                                         database='AdventureWorks2012',
                                         user='root',
                                         password='r##*****')
except Exception as e:
    print(e)
You should check the documentation for more information.
I have a continuous Python script that parses certain websites or XML feeds every 30 seconds and adds records to a database if there is something new.
At first I was just connecting to the database every time, which I knew wasn't the ideal way to do it. I had something like this:
def job():
    try:
        cnx = mysql.connector.connect(user=DB_USER, password=DB_PASSWORD, host='XYZ', database=DB_NAME)
        cursor = cnx.cursor()
        # CALLS OF PARSERS:
        run_parser1(cnx, cursor)
        run_parser2(cnx, cursor)
        # etc...
    except Exception as e:
        cursor.close()
        cnx.close()

schedule.every(30).seconds.do(job)

while 1:
    schedule.run_pending()
    time.sleep(1)
Now I have edited my code so the connection stays open until there is an exception, either while connecting to the database or while parsing:
try:
    cnx = mysql.connector.connect(user=DB_USER, password=DB_PASSWORD, host='XYZ', database=DB_NAME)
    cursor = cnx.cursor()
except Exception as e:
    cursor.close()
    cnx.close()

def job():
    try:
        alert_list = CAPParser(getxml()).as_dict()
        # CALL OF PARSERS:
        run_parser1(cnx, cursor)
        run_parser2(cnx, cursor)
        # etc...
    except Exception as e:
        cursor.close()
        cnx.close()

schedule.every(30).seconds.do(job)

while 1:
    schedule.run_pending()
    time.sleep(1)
However, there is a problem. While I know this is a safe way to do it, it means I need to restart the script several times per day. There are a lot of exceptions, either from lost database connections or from unavailable URLs of the parsed sites or files.
Any advice on a better solution?
You could try to create a function which verifies whether cnx is open and, if it is not, recreates it. Something like:
def job():
    global cnx
    global cursor
    if not cnx.is_connected():  # Uses ping to verify the connection
        try:
            cnx = mysql.connector.connect(user=DB_USER, password=DB_PASSWORD, host='XYZ', database=DB_NAME)
            cursor = cnx.cursor()
        except Exception as e:
            cursor.close()
            cnx.close()
            # You can put this try in a while loop with a delay to keep retrying
    ...  # job here
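The reconnect attempt itself can also fail transiently, so it may help to wrap it in a small retry helper instead of closing everything on the first failure. A sketch (with_retries is a hypothetical name; the attempts and delay defaults are arbitrary):

```python
import time

def with_retries(func, attempts=3, delay=0):
    """Call func(); on exception, retry until `attempts` calls have been made."""
    for attempt in range(1, attempts + 1):
        try:
            return func()
        except Exception:
            if attempt == attempts:
                raise  # give up after the last attempt
            time.sleep(delay)

# Usage inside job(), instead of the bare try/except:
# cnx = with_retries(lambda: mysql.connector.connect(
#     user=DB_USER, password=DB_PASSWORD, host='XYZ', database=DB_NAME))
```

This keeps the scheduler loop alive through brief outages instead of requiring a manual restart.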
I am receiving JSON data (from another Python script) to put into a MySQL database. The code works fine the first time, but the second time I get this error:
raise errors.OperationalError("MySQL Connection not available.")
mysql.connector.errors.OperationalError: MySQL Connection not available.
For troubleshooting I am always sending the same data, but it still raises the error the second time.
Based on information found on forums, I also tried placing cur = mydb.cursor() in different places, but I have never been able to get this code to work the second time.
Here is my code:
import mysql.connector
import json
mydb = mysql.connector.connect(
    host="localhost",
    user="***",
    passwd="***",
    database="***"
)

def DATA_REPARTITION(Topic, jsonData):
    if Topic == "test":
        #print ("Start")
        INSERT_DEBIT(jsonData)

def INSERT_DEBIT(jsonData):
    cur = mydb.cursor()
    # Read json from MQTT
    print("Start read data to insert")
    json_Dict = json.loads(jsonData)
    debit = json_Dict['debit']
    print("I send")
    print(debit)
    # Insert into DB Table
    sql = ("INSERT INTO debit (data_debit) VALUES (%s)")
    val = debit,
    cur.execute(sql, val)
    mydb.commit()
    print(cur.rowcount, "record inserted.")
    cur.close()
    mydb.close()
Thanks for your help!
You only open your database connection once, at the start of the script, and you close that connection after making the first insert. Hence, second and subsequent inserts are failing. You should create a helper function which returns a database connection, and then call it each time you want to do DML:
def getConnection():
    mydb = mysql.connector.connect(
        host="localhost",
        user="***",
        passwd="***",
        database="***")
    return mydb

def INSERT_DEBIT(jsonData):
    mydb = getConnection()
    cur = mydb.cursor()
    # Read json from MQTT
    # rest of your code here...
    cur.close()
    mydb.close()
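To make the open-use-close pattern harder to get wrong, the helper can also be wrapped in a context manager, so the connection is closed even when an insert raises. A sketch (db_connection is a hypothetical name; factory stands in for the getConnection helper above):

```python
from contextlib import contextmanager

@contextmanager
def db_connection(factory):
    """Open a connection from `factory`, and always close it afterwards."""
    conn = factory()
    try:
        yield conn
    finally:
        conn.close()  # runs even if the body raised

# Usage, assuming getConnection from the answer above:
# with db_connection(getConnection) as mydb:
#     cur = mydb.cursor()
#     cur.execute(sql, val)
#     mydb.commit()
#     cur.close()
```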
I have to connect to a MySQL server and grab some data forever, so I have two ways:
1) connect to MySQL once, then grab data in a while loop
conn = mysql.connector.connect(user='root', password='password', host='localhost', database='db', charset='utf8', autocommit=True)
cursor = conn.cursor(buffered=True)

while True:
    cursor.execute("statements")
    sqlData = cursor.fetchone()
    print(sqlData)
    sleep(0.5)
This works well, but if the script crashes due to a MySQL connection problem, the script goes down.
2) connect to MySQL inside the while loop
while True:
    try:
        conn = mysql.connector.connect(user='root', password='password', host='localhost', database='db', charset='utf8', autocommit=True)
        cursor = conn.cursor(buffered=True)
        cursor.execute("statements")
        sqlData = cursor.fetchone()
        print(sqlData)
        cursor.close()
        conn.close()
        sleep(0.5)
    except:
        print("recoverable error..")
Both versions work, but my question is: which is better?
Of these two, the better way is to use a single connection but create a new cursor for each statement, because creating a new connection takes time while creating a new cursor is fast. You may update the code as:
conn = mysql.connector.connect(user='root', password='password', host='localhost', database='db', charset='utf8', autocommit=True)

while True:
    try:
        cursor = conn.cursor(buffered=True)
        cursor.execute("statements")
        sqlData = cursor.fetchone()
        print(sqlData)
    except Exception:  # Catch the exception raised on connection loss
        conn = mysql.connector.connect(user='root', password='password', host='localhost', database='db', charset='utf8', autocommit=True)
        cursor = conn.cursor(buffered=True)
    finally:
        cursor.close()  # Close only the cursor here; keep the connection open

conn.close()  # Close the connection once the loop ends
Also read Defining Clean-up Actions regarding the usage of try:finally block.
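The cursor-per-statement idea can also be expressed with contextlib.closing, which guarantees the cursor is closed after every statement without touching the connection. A sketch (run_statement is a hypothetical name; the SQL text is a placeholder):

```python
from contextlib import closing

def run_statement(conn, sql):
    """Run one statement on a fresh cursor; the cursor is always closed,
    while the shared connection stays open for the next iteration."""
    with closing(conn.cursor()) as cursor:
        cursor.execute(sql)
        return cursor.fetchone()

# Usage in the loop above:
# while True:
#     print(run_statement(conn, "statements"))
#     sleep(0.5)
```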