I have the following line of code in a Python script:
sql_create_table(total_stores, "total_stores")
This is a function I created to upload tables to an Oracle database. I want to do something like this in order to record what tables were not created because the line failed to run:
try:
    sql_create_table(total_stores, "total_stores")
except:
    # write to log.txt: "table x could not be created in the database"
Any suggestions?
Thanks in advance!
There is the Python logging module, which has a good tutorial that even covers logging to a file.
Very basic example:
import logging
logging.basicConfig(filename="program.log", level=logging.INFO)
…
try:
    sql_create_table(total_stores, "total_stores")
except Exception:
    logging.warning("table x could not be created in the database")
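As a side note, inside an except block logging.exception() logs at ERROR level and automatically appends the current traceback, which makes it much easier to see why sql_create_table actually failed:

try:
    sql_create_table(total_stores, "total_stores")
except Exception:
    # ERROR-level message plus the full traceback of the failure
    logging.exception("table total_stores could not be created in the database")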
You can write the log to a txt file by doing the following:
try:
    sql_create_table(total_stores, "total_stores")
except Exception:
    with open('log.txt', 'a') as log:
        log.write("table x could not be created in the database\n")
Note that by using 'a' we are appending to the txt file, so old logs won't be overwritten.
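To avoid hard-coding "table x" in the message, you can wrap the pattern in a small helper that logs the actual table name; a minimal sketch, reusing the sql_create_table function from the question:

def create_table_logged(df, table_name):
    try:
        sql_create_table(df, table_name)
    except Exception:
        with open('log.txt', 'a') as log:
            log.write("table %s could not be created in the database\n" % table_name)

create_table_logged(total_stores, "total_stores")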
I am building a Python app using FBS, but part of it relies on an SQLite3 database. I have code to create this database if it isn't found, using a try/except block.
When I run the app after compiling, it not only cannot find the preexisting SQLite3 file, it also won't create one, and it displays no error messages.
I have tried creating the file if it doesn't exist using this code:
try:
    self.connection = sqlite3.connect(path)
    self.cursor = self.connection.cursor()
except:
    if not os.path.exists(path):
        # Try and make the .config directory
        try:
            os.makedirs(".config")
        except OSError as e:
            if e.errno != errno.EEXIST:
                raise
        # Create the datastore, and close it
        f = open(path, "w+")
        f.close()
        # And try connect to database again
        return self.__connect(path)
    else:
        print(f"No database exists, and could not create one.\nPlease create file in app directory called: {path}\nThen restart application.")
        raise
The code works fine in dev, but as soon as I compile it to a Mac app, it refuses to find or create the database.
Fixed. If anyone has a similar issue, please use the built-in appctxt.get_resource(file_path) method.
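For anyone else hitting this, a minimal sketch of what that looks like, assuming a PyQt5-based fbs project with a database.db file shipped as a resource (the file name is illustrative):

import sqlite3
from fbs_runtime.application_context.PyQt5 import ApplicationContext

appctxt = ApplicationContext()
# get_resource resolves the path both in dev and inside the frozen app bundle
db_path = appctxt.get_resource('database.db')  # hypothetical resource name
connection = sqlite3.connect(db_path)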
I have multiple unstructured txt files in a directory, and I want to insert all of them into MySQL; basically, the entire content of each text file should be placed into a row. In MySQL I have two columns: ID (auto increment) and LastName (nvarchar(45)). I used Python to connect to MySQL and LOAD DATA LOCAL INFILE to insert the whole content, but when I run the code I see warning messages in the Python console, and when I check MySQL I see nothing but a bunch of empty rows with IDs being automatically generated.
Here is the code:
import MySQLdb
import sys
import os

result = os.listdir("C:\\Users\\msalimi\\Google Drive\\s\\Discharge_Summary")
for x in result:
    db = MySQLdb.connect("localhost", "root", "Pass", "myblog")
    cursor = db.cursor()
    file1 = os.path.join(r'C:\\Discharge_Summary\\' + x)
    cursor.execute("LOAD DATA LOCAL INFILE '%s' INTO TABLE clamp_test" % (file1,))
    db.commit()
    db.close()
Can someone please tell me what is wrong with the code? What is the right way to achieve my goal?
I edited my code with:
cursor.execute("LOAD DATA LOCAL INFILE '%s' INTO TABLE clamp_test LINES TERMINATED BY '\r' (Lastname) SET id = NULL" % (file1,))
and it worked :)
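An alternative sketch that sidesteps LOAD DATA LOCAL INFILE entirely: read each file in Python and use a parameterized INSERT, so the driver handles all quoting and the file path never gets interpolated into the SQL string (table and column names are taken from the question):

import os
import MySQLdb

directory = r"C:\Users\msalimi\Google Drive\s\Discharge_Summary"

db = MySQLdb.connect("localhost", "root", "Pass", "myblog")
cursor = db.cursor()
for name in os.listdir(directory):
    with open(os.path.join(directory, name)) as f:
        contents = f.read()
    # Parameterized query: the whole file body becomes one row
    cursor.execute("INSERT INTO clamp_test (LastName) VALUES (%s)", (contents,))
db.commit()
db.close()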
I'm trying to export a table, contained within an Oracle 12c database, to CSV format using Python 2.7. The code I have written is shown below:
import os
import cx_Oracle
import csv
SQL = 'SELECT * FROM ORACLE_TABLE'
filename = r'C:\Temp\Python\Output.csv'
file = open(filename, 'w')
output = csv.writer(file, dialect='excel')

connection = cx_Oracle.connect('username/password@connection_name')
cursor = connection.cursor()
cursor.execute(SQL)

for i in cursor:
    output.writerow(i)

cursor.close()
connection.close()
file.close()
This code yields an error in the line where I define 'connection':
ORA-12557: TNS:protocol adapter not loadable
How can I remedy this? Any help would be appreciated.
Please note: I have already seen StackOverflow responses to very similar problems. However, they often suggest changing the path within the environment variables, which I cannot do since I don't have the appropriate administrator privileges. Thanks again for your assistance.
ORA-12557 is caused by problems with %ORACLE_HOME% on Windows, which is why the usual suggestion is to change the PATH setting.
"I cannot do this since I don't have appropriate administer privileges."
In which case you don't have too many options. Perhaps you could navigate to the ORACLE_HOME directory and run your script from there. Otherwise look to see what other tools you have available: Oracle SQL Developer? TOAD? SQL*Plus?
We found that the problem can be solved by navigating to config -> Oracle and editing the file 'tnsnames.ora'. The tnsnames file appears as follows:
connection_name =
  (DESCRIPTION =
    (ADDRESS_LIST =
      (ADDRESS = ...)
    )
    (CONNECT_DATA =
      (SERVICE_NAME = ...)
    )
  )
After changing the first instance of connection_name to connection_name.WORLD and typing
set ORACLE_HOME=
into the command line before executing the Python script, the script now runs with no error.
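If you prefer to keep that workaround inside the script itself, the same effect can in principle be had by clearing the variable from os.environ before cx_Oracle loads the client libraries; an untested sketch of the idea:

import os
os.environ['ORACLE_HOME'] = ''  # mirrors "set ORACLE_HOME=" on the command line
import cx_Oracle  # imported after the environment change so the client picks it up
connection = cx_Oracle.connect('username/password@connection_name.WORLD')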
I use an ini file to store the DB connection parameters. Hope it helps.
self.mydsn = cx_Oracle.makedsn(self.parser.get('oracle', 'db'),
                               self.parser.get('oracle', 'port'),
                               self.parser.get('oracle', 'service_name'))

try:
    self.connpool = cx_Oracle.SessionPool(user=self.parser.get('oracle', 'username'),
                                          password=self.parser.get('oracle', 'userpass'),
                                          dsn=self.mydsn,
                                          min=1, max=5, increment=1)
except Exception as e:
    print e
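For context, a minimal sketch of the parser setup this snippet assumes, with an [oracle] section whose key names match the get() calls above (the ini file name is illustrative):

import ConfigParser  # Python 2, matching the print statement above
import cx_Oracle

parser = ConfigParser.ConfigParser()
parser.read('db.ini')  # expects an [oracle] section with db, port,
                       # service_name, username and userpass keys
dsn = cx_Oracle.makedsn(parser.get('oracle', 'db'),
                        parser.get('oracle', 'port'),
                        parser.get('oracle', 'service_name'))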
You can use this Python script for Oracle CSV export:
https://github.com/teopost/csv_exp
I am using the Python logger mechanism to keep a record of my logs. I have two types of logs: a rotating log (log1, log2, log3, ...) and a non-rotating log called json.log (which, as the name suggests, contains JSON logs).
The log files are created when the server starts and closed when the app is shut down.
What I am trying to do, in general, is: when I press the import button on my page, have all the JSON logs saved to the SQLite db.
The problem I am facing is:
When I try to rename the json.log file like this:
source_file = "./logs/json.log"
snapshot_file = "./logs/json.snapshot.log"
try:
    os.rename(source_file, snapshot_file)
I get "WindowsError: [Error 32] The process cannot access the file because it is being used by another process",
and this is because the file is held open by the logger continuously. Therefore, I need to "close" the file somehow so I can do my I/O operation successfully.
The problem is that this is not desirable, because logs might be lost in the window where the file is closed, renamed, and then "re-created".
I was wondering if anyone has come across this scenario and found a practical solution.
I have tried something which works, but it does not seem convenient, and I am not sure it is safe enough that no logs are lost.
My code is this:
source_file = "./logs/json.log"
snapshot_file = "./logs/json.snapshot.log"
try:
    logger = get_logger()
    # some hackish way to remove the handler for json.log
    if len(logger.handlers) > 2:
        logger.removeHandler(logger.handlers[2])
    if not os.path.exists(snapshot_file):
        os.rename(source_file, snapshot_file)
    try:
        if type(logger.handlers[2]) == RequestLoggerHandler:
            del logger.handlers[2]
    except IndexError:
        pass
    # re-adding the log file handler so it continues writing the logs
    json_file_name = configuration["brew.log_file_dir"] + os.sep + "json.log"
    json_log_level = logging.DEBUG
    json_file_handler = logging.FileHandler(json_file_name)
    json_file_handler.setLevel(json_log_level)
    json_file_handler.addFilter(JSONLoggerFiltering())
    json_file_handler.setFormatter(JSONFormatter())
    logger.addHandler(json_file_handler)
The code then continues by writing the logs to the db and deleting the json.snapshot.log file, until the next time the import button is pressed; at that point the snapshot is created again, used only for writing the logs to the db.
Also, for reference, my log file has this format:
{'status': 200, 'actual_user': 1, 'resource_name': '/core/logs/process', 'log_level': 'INFO', 'request_body': None, ... }
Thanks in advance :)
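One way to make the swap less fragile than indexing logger.handlers[2] is to locate the json.log handler by its baseFilename, close it before the rename so Windows releases the file, and then re-attach a fresh handler. A minimal sketch follows; records emitted between the close and the re-add can still be dropped, so this is a mitigation rather than a guarantee:

import logging
import os

def snapshot_log(logger, source_file, snapshot_file):
    # Detach and close whichever FileHandler owns json.log
    for handler in list(logger.handlers):
        if isinstance(handler, logging.FileHandler) and \
                os.path.abspath(handler.baseFilename) == os.path.abspath(source_file):
            logger.removeHandler(handler)
            handler.close()
    os.rename(source_file, snapshot_file)
    # Re-attach a fresh handler pointing at the original path
    logger.addHandler(logging.FileHandler(source_file))

In the real code the re-attached handler would also need the JSONLoggerFiltering filter and JSONFormatter from the snippet above.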
I have written a loop that uploads some data to a MySQL database. When the wifi is not working, it is supposed to write the data to a txt file instead. The problem is that it does not write the txt file at all. I have initialised the database (calling it "database") and the cursor (calling it "cursor"). The txt file is called "test".
EDIT: by experimenting with plugging the ethernet cable in and out, I realised that when I replug the cable, a bunch of data is automatically sent with the same timestamp (maybe saved in RAM or some cache; this happens whenever I restart the programme as well, but on a smaller scale). Do you think there might be another way to make a backup? Maybe by writing everything to a txt file and erasing it after every 1 GB of data (so that the SD card won't get full; it's on a Raspberry Pi 2)?
try:
    try:
        cursor.execute("""INSERT INTO table (column1) VALUES(%s)""", (temperature))
        database.commit
    except:
        text = open("test.txt", "a")
        test.write(temperature + "\n")
        test.close()
except:
    print "FAIL"
Since the write() function expects a character buffer object, you might need to cast temperature to a string when passing it as an argument to write():
test.write(str(temperature) + "\n")
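Beyond the type issue, note that the snippet as posted opens the file under the name text but then calls test.write and test.close, and database.commit is missing its parentheses, so the commit never runs. A corrected inner block might look like this:

try:
    cursor.execute("""INSERT INTO table (column1) VALUES (%s)""", (temperature,))
    database.commit()  # commit is a method, so it must be called
except Exception:
    # one consistent name for the file handle, and str() for the write
    with open("test.txt", "a") as backup:
        backup.write(str(temperature) + "\n")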
This is my approach:
import urllib, time, MySQLdb

time.sleep(1)  # Protects Overlap from execfile
url = "https://www.google.co.uk/"
try:
    checkcon = urllib.urlopen(url)
except Exception as e:
    """
    If the program gets a socket error
    it goes to sleep, after that it restarts
    and closes this script
    """
    time.sleep(5)  # retry timer
    execfile('/path/to/your/file.py')
    raise IOError(e)

# Now that you know the connection is working
try:
    cursor.execute("""INSERT INTO table (column1) VALUES (%s)""", (temperature,))
    database.commit()
except MySQLdb.IntegrityError as e:
    # Database error: fall back to the text file
    with open('test.txt', 'a') as f:
        f.write(str(temperature) + "\n")
    raise IOError(e)