I'm using Python 3.8 on Ubuntu 20.04, and I can't get the DEBUG statements to log to a separate file. My code:
import logging
logging.basicConfig(level=logging.DEBUG, filename = 'logDEBUG.txt')
def say(x):
    print(x)
phrase = 'hi'
logging.debug(say(phrase))
You need to return the value, not print it. As written, say() prints to the console and returns None, so logging.debug(say(phrase)) only logs None.
import logging
logging.basicConfig(level=logging.DEBUG, filename = 'logDEBUG.txt')
def say(x):
    return x
phrase = 'hi'
logging.debug(say(phrase))
Result in logDEBUG.txt:
DEBUG:root:hi
Do you have the problem that the file is not created at all?
If so, check your working directory: with a relative filename, the log file is created relative to the directory you run the script from, which is not necessarily the folder the script lives in.
Search for the file on your computer and see where it ends up.
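For example, something like this makes the location explicit (a minimal sketch; it assumes you are running a script, where __file__ is defined):
import logging
import os

# The log file is created relative to the current working directory,
# which is not necessarily the folder the script lives in.
print(os.getcwd())

# Or build an absolute path next to the script itself:
logfile = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'logDEBUG.txt')
logging.basicConfig(level=logging.DEBUG, filename=logfile)
logging.debug('test entry')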
I am trying to create a Python function to unzip a few .gz files within a folder. I know that the main script works if it is not inside the function. The script I have created is similar to others I have done, but this one gives me the error:
File "unzip.py", line 24, in
decompressed_files(input_folder)
NameError: name 'input_folder' is not defined
I copied the script below so someone can help me see where the error is. I haven't done any bioinformatics for the last couple of years and I am a bit rusty.
import glob
import sys
import os
import argparse
import subprocess
import gzip
def decompressed_files(input_folder):
    print ('starting decompressed_files')
    output_folder = input_folder + '/fasta_files'
    if os.path.exists(output_folder):
        print ('folder already exists')
    else:
        os.makedirs(output_folder)
    for f in input_folder:
        fastqs = glob.glob(input_folder + '/*.fastq.gz')
        cmd = [gunzip, -k, fastqs, output_folder]
        my_file = subprocess.Popen(cmd)
        my_file.wait
    print ('The programme has finished doing its job')

decompressed_files(input_folder)
This is written for Python 2.7; I know that is old, but it is the version installed on my work server.
That's the problem: when you call decompressed_files(input_folder) on the last line, input_folder has not been defined anywhere. You should define it first, like this:
input_folder = 'C:/Some Address/'
decompressed_files(input_folder)
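As a side note, the gunzip call inside the loop would also fail as written (gunzip and -k are bare names, and fastqs is a list, not a single path). Since the script already imports gzip, here is a minimal sketch of the same loop done with the gzip module instead of subprocess; the function name decompress_folder, the use of shutil, and the example path are my additions, not part of the original script:
import glob
import gzip
import os
import shutil

def decompress_folder(input_folder):
    output_folder = os.path.join(input_folder, 'fasta_files')
    if not os.path.exists(output_folder):
        os.makedirs(output_folder)
    for fastq in glob.glob(os.path.join(input_folder, '*.fastq.gz')):
        # drop the trailing '.gz' for the output file name
        out_name = os.path.join(output_folder, os.path.basename(fastq)[:-3])
        with gzip.open(fastq, 'rb') as f_in:
            with open(out_name, 'wb') as f_out:
                shutil.copyfileobj(f_in, f_out)
    print ('The programme has finished doing its job')

decompress_folder('/path/to/fastq/folder')  # example path, replace with yours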
How can I know which file is calling which file on the filesystem? For example, file1.exe calls file2.exe, so file2.exe gets modified and file1.exe is entered in a log file.
This is on Windows.
I have searched the internet but was not able to find any samples.
In order to know which file is calling which file you can use the trace module.
For example, if you have 2 files:
***file1.py***
import file2

def call1():
    file2.call2()
***file2.py***
def call2():
    print "---------"
You can run it from the console:
$ python -m trace --trackcalls path/to/file1.py
or within a program using a Trace object
***tracefile.py***
import trace, sys
from file1 import call1

# specify what to trace here
tracer = trace.Trace(ignoredirs=[sys.prefix, sys.exec_prefix], trace=0, count=1)
tracer.runfunc(call1)  # call the function call1 in file1
results = tracer.results()
results.write_results(summary=True, coverdir='.')
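Note that --trackcalls on the command line corresponds to countcallers=True in the Trace constructor; the snippet above counts executed lines instead. If it is specifically the caller/callee relationships you want, a variant might look like this (a sketch, assuming the same file1.py as above):
# tracecalls.py - record which function calls which, mirroring
# `python -m trace --trackcalls`
import sys
import trace
from file1 import call1

tracer = trace.Trace(ignoredirs=[sys.prefix, sys.exec_prefix],
                     trace=0, count=0, countcallers=1)
tracer.runfunc(call1)  # run call1 from file1 under the tracer
results = tracer.results()
results.write_results(summary=True, coverdir='.')  # prints the calling relationships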
I have been stuck on this for the past hour. I had plugged logging into my tkinter GUI but could not get it to work, so I started removing parts until I got to the bare-bones example from the Python docs, and it still will not work. At this point I have nothing else to remove.
The code is as follows:
import logging
LOG_FILENAME = r'logging_example.out'
logging.basicConfig(filename=LOG_FILENAME ,level=logging.DEBUG)
logging.debug('This message should go to the log file')
logging.info('So should this')
logging.warning('And this, too')
f = open(LOG_FILENAME, 'rt')
try:
    body = f.read()
finally:
    f.close()

print('FILE:')
print(body)
The warning is printed to stdout, but the file is not generated.
I am running Python 3.4, x64, on Windows 7. It is an Anaconda distribution, so this is running in IPython inside Spyder.
I guess this should be working.
As Jonas Byström noted, this does work outside IPython. It seems that IPython configures a logging handler before I get the chance to do so, and basicConfig does nothing if a handler is already present. So, in order to have it working in IPython, one must do one of three things: 1) add a new handler, OR 2) reload logging, OR 3) remove the existing handlers. I did number 2 below.
import logging
from imp import reload
reload(logging)
LOG_FILENAME = r'logging_example.out'
logging.basicConfig(filename=LOG_FILENAME ,level=logging.DEBUG)
logging.debug('This message should go to the log file')
logging.info('So should this')
logging.warning('And this, too')
f = open(LOG_FILENAME, 'rt')
try:
    body = f.read()
finally:
    f.close()

print('FILE:')
print(body)
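For completeness, option 3 (removing the handlers that IPython already installed) looks roughly like this; it touches the root logger directly, so treat it as a sketch:
import logging

# strip whatever handlers IPython (or anything else) already attached
for handler in list(logging.root.handlers):
    logging.root.removeHandler(handler)

LOG_FILENAME = r'logging_example.out'
logging.basicConfig(filename=LOG_FILENAME, level=logging.DEBUG)
logging.debug('This message should go to the log file')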
See these for more information:
Logging in ipython;
More on the same
I have a Python file (html2text.py) which gives the desired result when I pass a command-line argument to it, i.e. in the following way:
python html2text.py file.txt
where file.txt contains the source code of a website and the result is displayed on the console.
I want to use it in another file (let's say a.py) and store the result (which was getting printed to the console) in a string.
For this I need to first import the file (html2text.py) into my file (a.py). Can anyone tell me how to proceed further?
A good way is to create a small API in your html2text.py. For example:
# html2text.py
def parse(filename):
    f = open(filename)
    # do the stuff
    return output_string

def main():
    import sys
    print parse(sys.argv[1])

if __name__ == '__main__':
    main()
Then you will be able to use it in your a.py:
import html2text # main() will not run
import sys
output = html2text.parse(sys.argv[1])
I think the best way is to reorganize your html2text.py file a little. Append lines like these to your file:
import sys

def main():
    message = sys.stdin.readlines()
    a = your_def(message)

if __name__ == '__main__': main()
Now you can be sure that when the file is invoked from the command line, everything will work. Moreover, if you keep everything in functions and classes, you can now, in your a.py, do
import html2text
and work with it directly in a.py, as sketched below.
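A minimal sketch of what that a.py could look like, assuming html2text.py keeps its work in a function (called your_def here, as in the snippet above) that takes the lines of file.txt and returns the converted text:
# a.py
import html2text  # main() does not run on import

with open('file.txt') as f:
    lines = f.readlines()

output = html2text.your_def(lines)  # the result is kept in a string instead of printed
print output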
I want to parse a file every time a new file is created in a certain directory. For this, I'm trying to use pyinotify to set up a directory watch for IN_CREATE kernel events and fire the parse() method.
Here is the module:
from pyinotify import WatchManager, ThreadedNotifier, ProcessEvent, IN_CREATE
class Watcher(ProcessEvent):
    watchdir = '/tmp/watch'

    def __init__(self):
        ProcessEvent.__init__(self)
        wm = WatchManager()
        self.notifier = ThreadedNotifier(wm, self)
        wdd = wm.add_watch(self.watchdir, IN_CREATE)
        self.notifier.start()

    def process_IN_CREATE(self, event):
        pfile = self._parse(event.pathname)
        print(pfile)

    def _parse(self, filename):
        f = open(filename)
        file = [line.strip() for line in f.readlines()]
        f.close()
        return file

if __name__ == '__main__':
    Watcher()
The problem is that the list returned by _parse is empty when triggered by a new file creation event, like so (the file is created in another window while watcher.py is running):
$ python watcher.py
[]
...but strangely enough, it works from an interpreter session when called directly.
>>> import watcher
>>> w = watcher.Watcher()
>>> w._parse('/tmp/watch/sample')
['This is a sample file', 'Another line', 'And another...']
Why is this happening? The farthest I've come debugging this thing is to know that something is making pyinotify not read the file correctly. But... why?
Maybe you want to wait till the file is closed?
Here's some code that works for me, with a 2.6.18 kernel, Python 2.4.3, and pyinotify 0.7.1 -- you may be using different versions of some of these, but it's important to make sure we're talking about the same versions, I think...:
#!/usr/bin/python2.4
import os.path
from pyinotify import pyinotify
class Watcher(pyinotify.ProcessEvent):
    watchdir = '/tmp/watch'

    def __init__(self):
        pyinotify.ProcessEvent.__init__(self)
        wm = pyinotify.WatchManager()
        self.notifier = pyinotify.ThreadedNotifier(wm, self)
        wdd = wm.add_watch(self.watchdir, pyinotify.EventsCodes.IN_CREATE)
        print "Watching", self.watchdir
        self.notifier.start()

    def process_IN_CREATE(self, event):
        print "Seen:", event
        pathname = os.path.join(event.path, event.name)
        pfile = self._parse(pathname)
        print(pfile)

    def _parse(self, filename):
        f = open(filename)
        file = [line.strip() for line in f]
        f.close()
        return file

if __name__ == '__main__':
    Watcher()
when this is running in a terminal window, and in another terminal window I do
echo "ciao" >/tmp/watch/c3
this program's output is:
Watching /tmp/watch
Seen: event_name: IN_CREATE is_dir: False mask: 256 name: c3 path: /tmp/watch wd: 1
['ciao']
as expected. So can you please try this script (fixing the Python version in the hashbang if needed, of course) and tell us the exact releases of Linux kernel, pyinotify, and Python that you are using, and what you observe in these exact circumstances? Quite possibly with more detailed info we can identify exactly which bug or anomaly is giving you problems. Thanks!
As #SilentGhost mentioned, you may be reading the file before any content has been added to file (i.e. you are getting notified of the file creation not file writes).
Update: The loop.py example with pynotify tarball will dump the sequence of inotify events to the screen. To determine which event you need to trigger on, launch loop.py to monitor /tmp and then perform the file manipulation you want to track.
I think I solved the problem by using the IN_CLOSE_WRITE event instead. I'm not sure what was happening before that made it not work.
#Alex: Thanks, I tried your script, but I'm using newer versions: Python 2.6.1, pyinotify 0.8.6 and Linux 2.6.28, so it didn't work for me.
It was definitely a matter of trying to parse the file before it was written, so kudos to SilentGhost and DanM for figuring it out.
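For reference, here is a minimal sketch of that fix: the same Watcher, but registered for IN_CLOSE_WRITE (which fires once the writer has closed the file) instead of IN_CREATE. The imports follow the 0.8.x style used in the question.
from pyinotify import WatchManager, ThreadedNotifier, ProcessEvent, IN_CLOSE_WRITE

class Watcher(ProcessEvent):
    watchdir = '/tmp/watch'

    def __init__(self):
        ProcessEvent.__init__(self)
        wm = WatchManager()
        self.notifier = ThreadedNotifier(wm, self)
        wm.add_watch(self.watchdir, IN_CLOSE_WRITE)
        self.notifier.start()

    def process_IN_CLOSE_WRITE(self, event):
        # by the time this fires, the file has been written and closed
        print(self._parse(event.pathname))

    def _parse(self, filename):
        f = open(filename)
        lines = [line.strip() for line in f]
        f.close()
        return lines

if __name__ == '__main__':
    Watcher()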