I'm using Python 3.4 to interact with Oracle 11g / SQL Developer.
Is it true that cx_Oracle cannot deal with SQL*Plus statements? This page seems to say so: https://sourceforge.net/p/cx-oracle/mailman/message/2932119/
So how can we execute the 'spool' command from Python?
The code:
import cx_Oracle
db_conn = cx_Oracle.connect(...)
cursor = db_conn.cursor()
cursor.execute(r'spool C:\Users\Administrator\Desktop\mycsv.csv')
...
The error: cx_Oracle.DatabaseError: ORA-00900: invalid SQL statement
The "spool" command is very specific to SQL*Plus and is not available in cx_Oracle or any other application that uses the OCI (Oracle Call Interface). You can do something similar, however, without too much trouble.
You can create your own Connection class subclassed from cx_Oracle.Connection and your own Cursor class subclassed from cx_Oracle.Cursor that would perform any logging and have a special command "spool" that would turn it on and off at will. Something like this:
class Connection(cx_Oracle.Connection):

    def __init__(self, *args, **kwargs):
        self.spoolFile = None
        return super(Connection, self).__init__(*args, **kwargs)

    def cursor(self):
        return Cursor(self)

    def spool(self, fileName):
        self.spoolFile = open(fileName, "w")


class Cursor(cx_Oracle.Cursor):

    def execute(self, statement, args=None):
        result = super(Cursor, self).execute(statement, args)
        if self.connection.spoolFile is not None:
            # placeholders: build the real header line from self.description
            self.connection.spoolFile.write("Headers for query\n")
            self.connection.spoolFile.write("use cursor.description\n")
        return result

    def fetchall(self):
        rows = super(Cursor, self).fetchall()
        if self.connection.spoolFile is not None:
            for row in rows:
                # placeholder: format and write the actual row here
                self.connection.spoolFile.write("row details\n")
        return rows
That should give you some idea on where to go with this.
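For example, a minimal usage sketch with the classes above (the connect string, query and spool path are placeholders, and the header/row writes are still the stubs from the class):
import cx_Oracle

db_conn = Connection("user/password@host/service")
db_conn.spool(r"C:\Users\Administrator\Desktop\mycsv.csv")   # start "spooling"

cursor = db_conn.cursor()
cursor.execute("select * from some_table")   # writes the (stub) header lines
rows = cursor.fetchall()                     # writes the (stub) row details

db_conn.spoolFile.close()                    # the equivalent of "spool off"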
I'd like to speed up my integration tests a bit by executing raw SQL equivalent to create_all().
My idea is to run create_all() just once when the test session starts (in order to capture its SQL equivalent) and then use that SQL between the tests to migrate the tables.
Do you have any idea how this can be done?
Thanks in advance!
You can accomplish this by hooking into SQLAlchemy's after_cursor_execute event:
https://docs.sqlalchemy.org/en/13/core/events.html#sqlalchemy.events.ConnectionEvents.after_cursor_execute
import io

import sqlalchemy


class QueryLogger:
    """Log query duration and SQL as a context manager."""

    def __init__(self, engine: sqlalchemy.engine.Engine, f: io.StringIO):
        """
        Initialize for an engine and file.

        engine: the SQLAlchemy engine for which events should be logged.
                You can pass the class `sqlalchemy.engine.Engine` to capture all engines.
        f: the file you want to write the output to.
        """
        self.engine = engine
        self.file = f

    def _after_cursor_execute(self, conn, cursor, statement, parameters, context, executemany):
        """Listen for the 'after_cursor_execute' event and log the SQL statement."""
        # Only record DDL statements; create_all() also executes a bunch of SELECTs.
        if context.isddl:
            s = statement % parameters
            self.file.write(f"{s};")

    def __enter__(self, *args, **kwargs):
        """Register the event listener when entering the context."""
        if isinstance(self.engine, sqlalchemy.engine.Engine):
            sqlalchemy.event.listen(self.engine, "after_cursor_execute", self._after_cursor_execute)
        return self

    def __exit__(self, *args, **kwargs) -> None:
        """Remove the event listener when leaving the context."""
        if isinstance(self.engine, sqlalchemy.engine.Engine):
            sqlalchemy.event.remove(self.engine, "after_cursor_execute", self._after_cursor_execute)
Then you can use the context manager to log the queries to a file (an io.StringIO would work the same way) and end up with the SQL script on disk:
with open("x.sql", "w") as f:
    with QueryLogger(db.engine, f):
        db.create_all()
A major part of the code is inspired by https://stackoverflow.com/a/67298123/3358570
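If you then want to replay the captured DDL between tests instead of calling create_all() again, something along these lines should work (a sketch; the helper name is mine, and splitting on ";" assumes none of your DDL contains embedded semicolons):
import sqlalchemy

def recreate_tables_from_file(engine, path="x.sql"):
    """Re-run the spooled DDL statements against the database."""
    with open(path) as f:
        statements = [s.strip() for s in f.read().split(";") if s.strip()]
    with engine.begin() as conn:
        for statement in statements:
            conn.execute(sqlalchemy.text(statement))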
So because of [reasons], I'm looking at overriding the tzinfo classes that are set by psycopg2. I thought this would be a simple case of overriding tzinfo_factory on the cursor class. However, this doesn't seem to work.
import psycopg2
from psycopg2.extensions import cursor
from psycopg2.tz import FixedOffsetTimezone


class MyFixedOffsetTimezone(FixedOffsetTimezone):
    pass


class MyCursorFactory(cursor):
    tzinfo_factory = MyFixedOffsetTimezone

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)


conn = psycopg2.connect('', cursor_factory=MyCursorFactory)
cursor = conn.cursor()
cursor.execute("select now()")
results = cursor.fetchall()
print(results[0][0].tzinfo)
print(results[0][0].tzinfo.__class__)
Will still give you
$ python3 example.py
psycopg2.tz.FixedOffsetTimezone(offset=60, name=None)
<class 'psycopg2.tz.FixedOffsetTimezone'>
Is this a result of my fundamental misunderstanding of how the C implementation's members and the higher-level Python code interact? (Or am I being a complete pleb?) Versions: Python 3.5.2, tested with psycopg2 2.6.2 and 2.7.1.
I've trawled through the code, and it does seem to reference tzinfo_factory on the cursor (psycopg/typecast_datetime.c, typecast_PYDATETIME_cast, line 140 in 2.7.1):
tzinfo_factory = ((cursorObject *)curs)->tzinfo_factory;
You have to pass cursor_factory=... when creating the cursor and assign MyFixedOffsetTimezone to tzinfo_factory on the instance, inside __init__. The C code reads tzinfo_factory directly from the cursor struct, so a class-level attribute on your Python subclass never reaches it, while an instance assignment goes through the C member and does:
class MyFixedOffsetTimezone(FixedOffsetTimezone):
    pass


class MyCursorFactory(cursor):

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.tzinfo_factory = MyFixedOffsetTimezone


conn = psycopg2.connect('...')
cursor = conn.cursor(cursor_factory=MyCursorFactory)
cursor.execute("select now()")
results = cursor.fetchall()
print(results[0][0].tzinfo)
print(results[0][0].tzinfo.__class__)
returns:
psycopg2.tz.FixedOffsetTimezone(offset=120, name=None)
<class '__main__.MyFixedOffsetTimezone'>
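The connection-wide factory from the question should also work, as long as the assignment happens on the instance in __init__ rather than as a class attribute (a sketch, untested; the DSN is a placeholder):
conn = psycopg2.connect('...', cursor_factory=MyCursorFactory)
cursor = conn.cursor()                          # uses MyCursorFactory by default
cursor.execute("select now()")
print(cursor.fetchone()[0].tzinfo.__class__)    # expected: <class '__main__.MyFixedOffsetTimezone'>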
From the PostgreSQL docs on BEGIN:
Issuing BEGIN when already inside a transaction block will provoke a
warning message. The state of the transaction is not affected.
How can I make psycopg2 raise an exception on any such warning?
I am very far from being a psycopg2 or Postgres expert, and I am sure there is a better way to increase the warning level, but here is something that worked for me: a custom cursor that looks into the connection's notices and raises an exception if there is a warning among them. The implementation itself is mostly for educational purposes; I am sure it needs to be adjusted to work in your use case:
import psycopg2
# this "cursor" class needs to be used as a base for custom cursor classes
from psycopg2.extensions import cursor


class ErrorThrowingCursor(cursor):

    def execute(self, query, vars=None):
        result = super(ErrorThrowingCursor, self).execute(query, vars)
        # psycopg2 collects server messages (NOTICE, WARNING, ...) in connection.notices
        for notice in self.connection.notices:
            level, message = notice.split(":", 1)
            if level == "WARNING":
                raise psycopg2.Warning(message.strip())
        return result
Usage sample:
conn = psycopg2.connect(user="user", password="secret")
cursor = conn.cursor(cursor_factory=ErrorThrowingCursor)
This would raise an exception (of type psycopg2.Warning) if a warning was issued after a query execution. Sample:
psycopg2.Warning: there is already a transaction in progress
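To see it trigger, you can issue BEGIN twice on a connection whose transactions you manage manually (a rough sketch; autocommit is enabled so psycopg2 does not inject its own BEGIN first):
conn = psycopg2.connect(user="user", password="secret")
conn.autocommit = True          # manage transactions by hand for this demo
cur = conn.cursor(cursor_factory=ErrorThrowingCursor)

cur.execute("BEGIN")
cur.execute("BEGIN")            # server warns "there is already a transaction in progress"
                                # and ErrorThrowingCursor raises psycopg2.Warning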
I am working on a simple convenience class to use with the with statement so that I can establish exclusive access to a sqlite3 database for an extended writing session in a concurrent system.
Here is the class:
import sqlite3 as sql

class ExclusiveSqlConnection(object):
    """meant to be used with a with statement to ensure proper closure"""

    def __enter__(self, path):
        self.con = sql.connect(path, isolation_level="EXCLUSIVE")
        self.con.execute("BEGIN EXCLUSIVE")
        self.cur = self.con.cursor()
        return self

    def __exit__(self):
        self.con.commit()
        self.con.close()
However, when I run this I get the error:
with sql_lib.ExclusiveSqlConnection(self.output) as c:
TypeError: object.__new__() takes no parameters
The constructor (__init__ method) for your ExclusiveSqlConnection needs to take a path parameter.
On the other hand, __enter__ takes no parameter other than self.
import sqlite3 as sql

class ExclusiveSqlConnection(object):
    """meant to be used with a with statement to ensure proper closure"""

    def __init__(self, path):
        self.path = path

    def __enter__(self):
        self.con = sql.connect(self.path, isolation_level="EXCLUSIVE")
        self.con.execute("BEGIN EXCLUSIVE")
        self.cur = self.con.cursor()
        return self

    def __exit__(self, exception_type, exception_val, trace):
        self.con.commit()
        self.con.close()
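With that change the class works as intended, for example (a small usage sketch; the database path and the items table are placeholders, and the table is assumed to exist already):
with ExclusiveSqlConnection("example.db") as c:
    c.cur.execute("INSERT INTO items (name) VALUES (?)", ("widget",))
    c.cur.execute("SELECT count(*) FROM items")
    print(c.cur.fetchone()[0])
# __exit__ commits and closes the connection here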
I'm trying to come up with a SQLiteDB object, and the following is the open/close code for it.
Does this work without problems? Am I missing something important?
For close(), I use con.close() and cursor.close(), but I'm wondering if cursor.close() is necessary.
class SQLiteDB(object):

    def __init__(self, dbFile, connect=True):
        self.dbFile = dbFile
        self.con = None
        self.cursor = None
        if connect:
            self.open()

    def __del__(self):
        if self.con:
            self.close()

    def open(self):
        self.con = sqlite3.connect(self.dbFile)
        self.cursor = self.connector.cursor()
        return self.con, self.cursor

    def close(self):
        self.con.close()
        self.cursor.close()
        self.cursor = None
        self.con = None
What happens on Cursor.close() depends on the underlying database implementation. For SQLite it might currently work without closing, but for other implementations or a future SQLite version it might not, so I would recommend closing the Cursor object. You can find further information on Cursor.close() in PEP 249.
Also, there seems to be a typo in your code:
self.cursor = self.connector.cursor()
should probably be
self.cursor = self.con.cursor()
Otherwise your code looks fine to me. Happy coding :)
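Putting both points together, a cleaned-up version might look like this (a sketch; the cursor is closed before the connection, which is the usual order):
import sqlite3

class SQLiteDB(object):

    def __init__(self, dbFile, connect=True):
        self.dbFile = dbFile
        self.con = None
        self.cursor = None
        if connect:
            self.open()

    def __del__(self):
        if self.con:
            self.close()

    def open(self):
        self.con = sqlite3.connect(self.dbFile)
        self.cursor = self.con.cursor()      # typo fixed: use self.con
        return self.con, self.cursor

    def close(self):
        if self.cursor is not None:
            self.cursor.close()              # close the cursor explicitly (PEP 249)
            self.cursor = None
        if self.con is not None:
            self.con.close()
            self.con = None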