I'm writing a class named Reloader that receives a main_function (or any callable) and a FileWatcher instance.
FileWatcher is just a class that watches files and invokes a callback function if any of the files are modified.
When the run() method of a Reloader instance is invoked, it starts a Thread whose target is main_function, and it also runs the FileWatcher.
What I want to achieve is that whenever any of the watched files is modified, the main_function is reloaded.
(This is similar to Django's autoreloader.)
Here's the Reloader class:
from threading import Thread

class Reloader:
    def __init__(self, watcher: FileWatcher, func: callable, *args, **kwargs) -> None:
        self.func = func
        self.args = args
        self.kwargs = kwargs
        self.watcher = watcher
        self.watcher.set_callback(self._callback)
        self.thread = Thread(target=self._main)
        self.thread.daemon = True

    def _callback(self, file):
        print(f'{file} was modified.')
        self.thread.join()
        self.thread = Thread(target=self._main)
        self.thread.daemon = True
        self.thread.start()

    def _main(self):
        self.func(*self.args, **self.kwargs)

    def run(self):
        self.thread.start()
        self.watcher.run()
The whole thing works, except that it doesn't reload the main function; it just reruns it.
If I change the file that contains the main_function, the changes are not picked up!
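Restarting the thread only re-invokes the function object that was captured at import time; to see source changes you have to reload the defining module and re-fetch the function from it. A minimal, self-contained sketch of that mechanism (the hot.py module and its value() function are made up for the demo):

```python
import importlib
import pathlib
import sys
import tempfile

# Create a throwaway module on disk to stand in for the watched file.
tmp = tempfile.mkdtemp()
sys.path.insert(0, tmp)
mod_path = pathlib.Path(tmp) / "hot.py"
mod_path.write_text("def value():\n    return 1\n")

import hot  # the module object is now cached in sys.modules

assert hot.value() == 1

# Edit the source on disk, as the FileWatcher would observe.
mod_path.write_text("def value():\n    return 2  # changed\n")

assert hot.value() == 1        # calling again still runs the old code
importlib.reload(hot)          # re-executes the module's current source
assert hot.value() == 2        # now the edit is visible
```

In the Reloader, _callback could call importlib.reload on the module that defines self.func and then re-fetch the callable with getattr before starting the new thread.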
I'm aware of this structure:

class MyThread(QThread):
    def __init__(self):
        super().__init__()

    def run(self):
        # do stuff
        pass

t = MyThread()
t.start()
With regular threading.Thread you can do something like this:

def stuff():
    # do stuff
    pass

t = threading.Thread(target=stuff)
t.start()
Is there any way to do this in PyQt5 with QThread? Something like this:

t = QThread(target=stuff)
t.start()
I tried that but I got this error:
TypeError: 'target' is an unknown keyword argument
You can accept the function as a custom argument in __init__, store a reference to it in an instance attribute, and then call it in run().
class MyThread(QThread):
    def __init__(self, target=None):
        super().__init__()
        self.target = target

    def run(self):
        if self.target:
            self.target()

def stuff():
    # do something
    pass

t = MyThread(target=stuff)
t.start()
Be aware that access to UI elements is not allowed from external threads, so don't use the threaded function to do anything UI-related: reading values and properties is unreliable, and writing to them can cause your program to crash.
As an example I have a Django custom management command which periodically (APScheduler + CronTrigger) sends tasks to Dramatiq.
Why is the following code, with separate functions:
import argparse
import signal

from apscheduler.schedulers.base import BaseScheduler
from apscheduler.schedulers.blocking import BlockingScheduler
from apscheduler.triggers.cron import CronTrigger
from django.conf import settings
from django.core.exceptions import ImproperlyConfigured
from django.core.management.base import BaseCommand

from myapp import tasks  # project-specific module defining the Dramatiq actor

def get_crontab(options):
    """Returns the crontab, either from options or from settings"""
    crontab = options.get("crontab")
    if crontab is None:
        if not hasattr(settings, "REMOVE_TOO_OLD_CRONTAB"):
            raise ImproperlyConfigured("Either set settings.REMOVE_TOO_OLD_CRONTAB or use the --crontab argument")
        crontab = settings.REMOVE_TOO_OLD_CRONTAB
    return crontab

def add_cron_job(scheduler: BaseScheduler, actor, crontab):
    """Adds a cron job which triggers the Dramatiq actor"""
    module_path = actor.fn.__module__
    actor_name = actor.fn.__name__
    trigger = CronTrigger.from_crontab(crontab)
    job_path = f"{module_path}:{actor_name}.send"
    job_name = f"{module_path}.{actor_name}"
    scheduler.add_job(job_path, trigger=trigger, name=job_name)

def run_scheduler(scheduler):
    """Runs the scheduler in a blocking way"""
    def shutdown(signum, frame):
        scheduler.shutdown()

    signal.signal(signal.SIGINT, shutdown)
    signal.signal(signal.SIGTERM, shutdown)
    scheduler.start()

class Command(BaseCommand):
    help = "Periodically removes too old publications from the RSS feed"

    def add_arguments(self, parser: argparse.ArgumentParser):
        parser.add_argument("--crontab", type=str)

    def handle(self, *args, **options):
        scheduler = BlockingScheduler()
        add_cron_job(scheduler, tasks.remove_too_old_publications, get_crontab(options))
        run_scheduler(scheduler)
better than the same code written as methods?
class Command(BaseCommand):
    help = "Periodically removes too old publications from the RSS feed"

    def add_arguments(self, parser: argparse.ArgumentParser):
        parser.add_argument("--crontab", type=str)

    def get_crontab(self, options):
        """Returns the crontab, either from options or from settings"""
        crontab = options.get("crontab")
        if crontab is None:
            if not hasattr(settings, "REMOVE_TOO_OLD_CRONTAB"):
                raise ImproperlyConfigured(
                    "Either set settings.REMOVE_TOO_OLD_CRONTAB or use the --crontab argument"
                )
            crontab = settings.REMOVE_TOO_OLD_CRONTAB
        return crontab

    def handle(self, *args, **options):
        scheduler = BlockingScheduler()
        self.add_cron_job(scheduler, tasks.remove_too_old_publications, self.get_crontab(options))
        self.run_scheduler(scheduler)

    def add_cron_job(self, scheduler: BaseScheduler, actor, crontab):
        """Adds a cron job which triggers the Dramatiq actor"""
        module_path = actor.fn.__module__
        actor_name = actor.fn.__name__
        trigger = CronTrigger.from_crontab(crontab)
        job_path = f"{module_path}:{actor_name}.send"
        job_name = f"{module_path}.{actor_name}"
        scheduler.add_job(job_path, trigger=trigger, name=job_name)

    def run_scheduler(self, scheduler):
        """Runs the scheduler in a blocking way"""
        def shutdown(signum, frame):
            scheduler.shutdown()

        signal.signal(signal.SIGINT, shutdown)
        signal.signal(signal.SIGTERM, shutdown)
        scheduler.start()
This code is used in one single place and will not be reused.
StackOverflow requires more details, so:
The second snippet is the version I originally wrote. After that, I ran Prospector with Pylint and, among other useful messages, got pylint: no-self-use / Method could be a function (col 4). To fix this I rewrote my code as in the first example, but I still don't understand why it is better this way.
At least in this case, it is not better. Pylint is notifying you about "self" being unused, just as it would notify you about an unused variable or an unused import.
A couple of other options for fixing the Pylint messages would be to actually use "self" in the functions, or to add the staticmethod (or classmethod) decorator. Examples of both are after the horizontal line; see the docs for staticmethod and the difference between staticmethod and classmethod.
Since this is a Django command, and you likely won't have multiple instances of the class, other classes inheriting Command (that would e.g. overload the functions), or anything else that would benefit from the functions living inside the class, pick the option you find most readable and easiest to change.
And just for completeness: Code Review on StackExchange could have further insight on which is best, if any.
Example that uses self; the main difference is that the scheduler is created in __init__ instead of being passed as an argument to the functions that use it:
class Command(BaseCommand):
    help = "Periodically removes too old publications from the RSS feed"

    def __init__(self):
        super().__init__()
        self.scheduler = BlockingScheduler()

    def add_arguments(self, parser: argparse.ArgumentParser):
        parser.add_argument("--crontab", type=str)

    def handle(self, *args, **options):
        self.add_cron_job(tasks.remove_too_old_publications, self.get_crontab(options))
        self.run_scheduler()

    # ...

    def run_scheduler(self):
        """Runs the scheduler in a blocking way"""
        def shutdown(signum, frame):
            self.scheduler.shutdown()

        signal.signal(signal.SIGINT, shutdown)
        signal.signal(signal.SIGTERM, shutdown)
        self.scheduler.start()
Example that uses staticmethod, where the only difference is the staticmethod decorator, and the decorated functions don't have a self argument:
class Command(BaseCommand):
    help = "Periodically removes too old publications from the RSS feed"

    def add_arguments(self, parser: argparse.ArgumentParser):
        parser.add_argument("--crontab", type=str)

    def handle(self, *args, **options):
        scheduler = BlockingScheduler()
        self.add_cron_job(scheduler, tasks.remove_too_old_publications, self.get_crontab(options))
        self.run_scheduler(scheduler)

    # ...

    @staticmethod
    def run_scheduler(scheduler):
        """Runs the scheduler in a blocking way"""
        def shutdown(signum, frame):
            scheduler.shutdown()

        signal.signal(signal.SIGINT, shutdown)
        signal.signal(signal.SIGTERM, shutdown)
        scheduler.start()
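To illustrate why the staticmethod variant still reads naturally at the call site, here is a stripped-down, runnable sketch (the class body and the default crontab value are made up for the demo): a staticmethod can be called through self or through the class exactly like a normal method; it just never receives the instance.

```python
class Command:
    """Minimal stand-in for the Django command; illustration only."""

    @staticmethod
    def get_crontab(options):
        # Falls back to a made-up default when --crontab was not given.
        return options.get("crontab") or "0 3 * * *"

    def handle(self, **options):
        # Called through self, even though get_crontab never sees the instance.
        return self.get_crontab(options)

cmd = Command()
assert cmd.handle() == "0 3 * * *"
assert cmd.handle(crontab="*/5 * * * *") == "*/5 * * * *"
assert Command.get_crontab({}) == "0 3 * * *"   # also callable on the class
```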
Using Python 3.7 and PySide2, I created a worker object on a dedicated QThread to execute a long-running function. This is illustrated in the code below.
import threading
from time import sleep

from PySide2.QtCore import QObject, QThread, Signal, Slot
from PySide2.QtWidgets import QApplication

class Main(QObject):
    signal_for_function = Signal()

    def __init__(self):
        print('The main thread is "%s"' % threading.current_thread().name)
        super().__init__()
        self.thread = QThread(self)
        self.worker = Worker()
        self.worker.moveToThread(self.thread)
        self.thread.start()
        self.signal_for_function.connect(self.worker.some_function)

def some_decorator(func):
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

class Worker(QObject):
    # @some_decorator
    def some_function(self):
        print('some_function is running on thread "%s"' % threading.current_thread().name)

app = QApplication()
m = Main()
m.signal_for_function.emit()
sleep(0.100)
m.thread.quit()
m.thread.wait()
If I use some_function without the decorator, I get this, as expected:

The main thread is "MainThread"
some_function is running on thread "Dummy-1"

However, if I apply the decorator (i.e. uncomment @some_decorator), I get:

The main thread is "MainThread"
some_function is running on thread "MainThread"

Why does this happen, and how do I make the decorated function run on the worker thread as I intended?
Solution:
You must use @functools.wraps:

import functools

# ...

def some_decorator(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper
Output:
The main thread is "MainThread"
some_function is running on thread "Dummy-1"
Explanation:
To analyze the difference between using @functools.wraps or not, use the following instrumented code:

def some_decorator(func):
    print(func.__name__, func.__module__, func.__doc__, func.__dict__)
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    print(wrapper.__name__, wrapper.__module__, wrapper.__doc__, wrapper.__dict__)
    return wrapper
If you remove @functools.wraps, you get the following:
some_function __main__ None {}
wrapper __main__ None {}
If you keep @functools.wraps, you get the following:
some_function __main__ None {}
some_function __main__ None {'__wrapped__': <function Worker.some_function at 0x7f610d926a60>}
The main difference is in __name__: @functools.wraps makes the wrapper function take the same name as "func". What difference does that make? It is used to identify whether the function belongs to the Worker class. When the Worker class is created, a dictionary is built that stores its methods, attributes, and so on. Without @functools.wraps, when the signal invokes some_function it gets back a wrapper named "wrapper", which is not in Worker's dictionary, so the slot cannot be associated with the Worker instance. With @functools.wraps, the wrapper is named "some_function", so the Worker object is recognized as the receiver and invokes it on the worker thread.
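The metadata copying itself is easy to observe outside of Qt. A small self-contained sketch (the decorator and function names here are made up for the demo):

```python
import functools

def plain(func):
    # No metadata copying: the wrapper keeps its own identity.
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

def wrapped(func):
    # @functools.wraps copies __name__, __doc__, __module__ and __dict__
    # from func, and records the original function in __wrapped__.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

def greet():
    """Say hello."""
    return "hello"

assert plain(greet).__name__ == "wrapper"
assert plain(greet).__doc__ is None

assert wrapped(greet).__name__ == "greet"
assert wrapped(greet).__doc__ == "Say hello."
assert wrapped(greet).__wrapped__ is greet
assert wrapped(greet)() == "hello"
```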
I am running two Python scripts, say main.py and test.py.
In main.py I am executing the get_details function "x" number of times, every 30 seconds.
NOTE: I want to execute funcA, funcB, and funcC in sequence. The issue I am facing is that when I run test.py, it runs funcC() first, even though I am calling funcA() first.
test.py

def funcA():
    # do something
    funcB()

def funcB():
    # do something
    funcC()

def funcC():
    # here I want to execute the script main.py
    # My attempt 1:
    import subprocess
    import sys
    theproc = subprocess.Popen([sys.executable, "main.py"])
    theproc.communicate()
    # ------OR-----------
    # My attempt 2:
    execfile("main.py")
main.py

import threading

def get_details(a, b, c):
    # do something ...
    pass

class RepeatEvery(threading.Thread):
    def __init__(self, interval, func, *args, **kwargs):
        threading.Thread.__init__(self)
        self.interval = interval  # seconds between calls
        self.func = func          # function to call
        self.args = args          # optional positional argument(s) for call
        self.kwargs = kwargs      # optional keyword argument(s) for call
        self.runable = True

    def run(self):
        while self.runable:
            self.func(*self.args, **self.kwargs)
            time.sleep(self.interval)

    def stop(self):
        self.runable = False

thread = RepeatEvery(30, get_details, "arg1", "arg2", "arg3")
print "starting"
thread.start()
thread.join(21)  # allow thread to execute a while...
I want to execute the script main.py only after all functions (funcA, funcB) have executed properly. But in my case, main.py is executed first, and only then does control go back to test.py to execute funcA() and funcB().
What am I missing here?
Okay. I rewrote your code so it would work as you said it should.
main.py...

# Good design for small classes: keep global functions separate for people who want
# to explore the type, but not everything that comes along with it.
# I moved the global functions and code execution from the top and bottom into test.py.

import threading
import time  # You forgot to import time.

class RepeatEvery(threading.Thread):
    def __init__(self, interval, func, *args, **kwargs):
        threading.Thread.__init__(self)
        self.interval = interval  # seconds between calls
        self.func = func          # function to call
        self.args = args          # optional positional argument(s) for call
        self.kwargs = kwargs      # optional keyword argument(s) for call
        self.runable = True

    def run(self):
        while self.runable:
            self.func(*self.args, **self.kwargs)
            time.sleep(self.interval)

    def stop(self):
        self.runable = False

    """ We couuuld have done this, but why bother? It is hard to work with.
    def get_details(self, a, b, c):
        # do something else as a function of the class...
    """
test.py...

import main  # File where the class lives.

def funcA():
    # do something
    print("In A...")  # Helps us observe scope.
    funcB()

def funcB():
    # do something
    print("In B...")  # scope
    funcC()

def funcC():
    # here I want to execute script main.py
    print("In C...")  # scope
    run_main()  # Reached C, let's run main now...

# This is one way to allow a function to be accessible to your class.
def get_details(a, b, c):
    # do something else as a function of test.py operating on
    # a RepeatEvery object...
    pass

def run_main():  # Named so it doesn't shadow the imported main module. It houses our opening code.
    thread = main.RepeatEvery(30, get_details, "arg1", "arg2", "arg3")
    print("starting")
    thread.start()
    thread.join(21)  # allow thread to execute a while...

funcA()
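The call-order fix can be verified without waiting on the 30-second interval. Here is a compressed, self-contained version of the same flow (interval and sleep times are shrunk, and get_details is replaced by a list append, purely for the demo):

```python
import threading
import time

class RepeatEvery(threading.Thread):
    """Same pattern as in main.py: calls func every interval seconds."""
    def __init__(self, interval, func, *args, **kwargs):
        threading.Thread.__init__(self)
        self.interval = interval
        self.func = func
        self.args = args
        self.kwargs = kwargs
        self.runable = True

    def run(self):
        while self.runable:
            self.func(*self.args, **self.kwargs)
            time.sleep(self.interval)

    def stop(self):
        self.runable = False

order = []

def funcA():
    order.append("A")
    funcB()

def funcB():
    order.append("B")
    funcC()

def funcC():
    order.append("C")
    run_main()  # the scheduler only starts once A and B have run

def run_main():
    t = RepeatEvery(0.01, order.append, "tick")
    t.start()
    time.sleep(0.05)  # let it fire a few times...
    t.stop()
    t.join()

funcA()
assert order[:3] == ["A", "B", "C"]   # sequence preserved
assert "tick" in order[3:]            # scheduler ran only afterwards
```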
I'm trying to use the StoppableThread class presented as an answer to another question:

import threading

# Technique for creating a thread that can be stopped safely
# Posted by Bluebird75 on StackOverflow
class StoppableThread(threading.Thread):
    """Thread class with a stop() method. The thread itself has to check
    regularly for the stopped() condition."""

    def __init__(self):
        super(StoppableThread, self).__init__()
        self._stop = threading.Event()

    def stop(self):
        self._stop.set()

    def stopped(self):
        return self._stop.isSet()
However, if I run something like:
st = StoppableThread(target=func)
I get:
TypeError: __init__() got an unexpected keyword argument 'target'
This is probably an oversight on my part in how the class should be used.
The StoppableThread class does not take or pass along any additional arguments to threading.Thread in its constructor. You need to do something like this instead:

class StoppableThread(threading.Thread):
    """Thread class with a stop() method. The thread itself has to check
    regularly for the stopped() condition."""

    def __init__(self, *args, **kwargs):
        super(StoppableThread, self).__init__(*args, **kwargs)
        self._stop = threading.Event()

This will pass both positional and keyword arguments on to the base class. (Note that the first argument to super() must be StoppableThread, not threading.Thread; super(threading.Thread, self) would skip Thread.__init__ entirely.)
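With that fix, the class accepts the usual Thread keyword arguments. A runnable sketch of the fixed class in use (the event is renamed to _stop_event here, an extra precaution for Python 3, where an instance attribute named _stop shadows an internal Thread method; the Ticker subclass is made up for the demo):

```python
import threading
import time

class StoppableThread(threading.Thread):
    """Thread class with a stop() method. The thread itself has to check
    regularly for the stopped() condition."""

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        # Named _stop_event: Python 3's Thread uses _stop internally.
        self._stop_event = threading.Event()

    def stop(self):
        self._stop_event.set()

    def stopped(self):
        return self._stop_event.is_set()

class Ticker(StoppableThread):
    """Counts until asked to stop."""
    def __init__(self):
        super().__init__()
        self.ticks = 0

    def run(self):
        while not self.stopped():
            self.ticks += 1
            time.sleep(0.01)

t = Ticker()
t.start()
time.sleep(0.05)
t.stop()
t.join()
assert t.ticks > 0          # the loop ran...
assert t.stopped()          # ...and was asked to stop
assert not t.is_alive()     # join() returned after the loop exited
```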
You are overriding __init__, and your __init__ doesn't take any arguments. You should add a "target" argument and pass it through to your base-class constructor with super(), or, even better, allow arbitrary arguments via *args and **kwargs.
I.e.

def __init__(self, *args, **kwargs):
    super(StoppableThread, self).__init__(*args, **kwargs)
    self._stop = threading.Event()