Daemon process structure - python

I am looking to build a daemon process that performs some tasks when given input. 99% of the time it lies silent in the background doing nothing, and the tasks are short and few in number. How would I build the interface between the two applications, the one that constructs the task and the daemon that executes it?
I was thinking that the daemon might have a folder which it periodically checks. If there are files in there, it reads them and follows the instructions they contain.
Would that work well or is there a better way?
EDIT: added example daemon code.
#!/usr/bin/python
import os
import time

from daemon import runner

class Daemon():

    def __init__(self):
        self.stdin_path = '/dev/null'
        self.stdout_path = '/dev/tty'
        self.stderr_path = '/dev/tty'
        self.pidfile_path = '/tmp/foo.pid'
        self.pidfile_timeout = 5
        self.task_dir = os.path.expanduser("~/.todo/daemon_tasks/")

    def run(self):
        while not time.sleep(1):  # sleep() returns None, so this loops forever
            if len(os.listdir(self.task_dir)) != 0:  # only scan when tasks exist
                for task in os.listdir(self.task_dir):
                    self.process_task(task)

    def process_task(self, task):
        # input: filename
        # output: void
        # takes task and executes it according to instructions in the file
        pass

if __name__ == '__main__':
    app = Daemon()
    daemon_runner = runner.DaemonRunner(app)
    daemon_runner.do_action()

I would look into Unix sockets or FIFOs as an option. This eliminates the need to poll a directory. Here is an SO link for help: How to create special files of type socket?
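For illustration, here is a minimal sketch of the FIFO approach; the path, the one-task-per-line convention, and the process_task stub are assumptions for the example, not part of the question's code:
import os

FIFO_PATH = "/tmp/task_daemon.fifo"  # hypothetical path for this example

def process_task(instruction):
    # placeholder: execute the instruction, as in the daemon's process_task()
    print("running task:", instruction)

# Create the FIFO once. Opening it for reading blocks until a writer
# connects, so the daemon sleeps in the kernel instead of polling a folder.
if not os.path.exists(FIFO_PATH):
    os.mkfifo(FIFO_PATH)

while True:
    with open(FIFO_PATH) as fifo:      # blocks until a producer opens it for writing
        for line in fifo:              # convention for this example: one task per line
            process_task(line.strip())
The producing application then just writes a line into the pipe, e.g. echo "sometask" > /tmp/task_daemon.fifo, and the loop wakes up.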

Related

python design pattern queue with workers

I'm currently working on a project that involves three components:
an observer that checks for changes in a directory, a worker, and a command line interface.
What I want to achieve is:
The observer, when a change happens, sends a string to the worker (adds a job to the worker's queue).
The worker has a queue of jobs and works on its queue forever.
Now I want the possibility to run a python script to check the status of the worker (number of active jobs, errors and so on).
I don't know how to achieve this with python in terms of which components to use and how to link the three of them.
I thought of a singleton worker where the observer adds a job to a queue, but 1) I was not able to write working code and 2) how can I fit the checker in?
Another solution that I thought of may be multiple child processes spawned from a parent that holds the queue, but I'm a bit lost...
Thanks for any advice
I'd use some kind of observer pattern or publish-subscribe pattern. For the former you can use, for example, the Python version of ReactiveX. But for a more basic example let's stay with the Python core: parts of your program can subscribe to the worker and receive updates from the process via queues, for example.
import itertools as it
from queue import Queue
from threading import Thread
import time

class Observable(Thread):

    def __init__(self):
        super().__init__()
        self._observers = []

    def notify(self, msg):
        for obs in self._observers:
            obs.put(msg)

    def subscribe(self, obs):
        self._observers.append(obs)

class Observer(Thread):

    def __init__(self):
        super().__init__()
        self.updates = Queue()

class Watcher(Observable):

    def run(self):
        for i in it.count():
            self.notify(i)
            time.sleep(1)

class Worker(Observable, Observer):

    def run(self):
        while True:
            task = self.updates.get()
            self.notify((str(task), 'start'))
            time.sleep(1)
            self.notify((str(task), 'stop'))

class Supervisor(Observer):

    def __init__(self):
        super().__init__()
        self._statuses = {}

    def run(self):
        while True:
            status = self.updates.get()
            print(status)
            self._statuses[status[0]] = status[1]
            # Do something based on status updates.
            if status[1] == 'stop':
                del self._statuses[status[0]]

watcher = Watcher()
worker = Worker()
supervisor = Supervisor()

watcher.subscribe(worker.updates)
worker.subscribe(supervisor.updates)

supervisor.start()
worker.start()
watcher.start()
However, many variations are possible; check the various patterns and pick the one that suits you best.

How & where to best retrieve sudo password via a native GUI on a macOS Python-based app - (while maintaining an interactive output stream (stdout))

Ok, so the situation is this: I am building a macOS GUI app using Python and wx (wxPhoenix). The user can use the GUI (say: script1) to launch a file-deletion process (contained in script2). In order to run successfully, script2 needs to run with sudo rights.
script2 will iterate over a long list of files and delete them. But I need it to communicate with the GUI contained in script1 after each round so that script1 can update the progress bar.
In its absolute most basic form my current working setup looks like this:
Script1:
import io
from threading import Thread
import subprocess

import wx

# a whole lot of wx GUI stuff

def get_password():
    """Retrieve user password via a GUI"""
    # A wx solution using wx.PasswordEntryDialog()
    # Store password in a variable
    return variable

class run_script_with_sudo(Thread):
    """Launch a script with administrator privileges"""

    def __init__(self, path_to_script, wx_pubsub_sendmessage):
        """Set variables to self"""
        self.path = path_to_script
        self.sender = wx_pubsub_sendmessage
        self.password = get_password()
        Thread.__init__(self)
        self.start()

    def run(self):
        """Run thread"""
        prepare_script = subprocess.Popen(["echo", self.password], stdout=subprocess.PIPE)
        prepare_script.wait()
        launch_script = subprocess.Popen(['sudo', '-S', '/usr/local/bin/python3.6', '-u', self.path], stdin=prepare_script.stdout, stdout=subprocess.PIPE)
        for line in io.TextIOWrapper(launch_script.stdout, encoding="utf-8"):
            print("Received line: ", line.rstrip())
            # Tell progressbar to add another step:
            wx.CallAfter(self.sender, "update", msg="")
Script2:
import time

# This is a test setup, just a very simple loop that produces an output.
for i in range(25):
    time.sleep(1)
    print(i)
The above setup works: script1 receives the output of script2 in real time and acts on it. (So in the given example: after each second script1 adds another step to the progress bar until it reaches 25 steps.)
What I want to achieve = not storing the password in a variable, and using macOS's native GUI to retrieve the password.
However when I change:
prepare_script = subprocess.Popen(["echo", self.password], stdout=subprocess.PIPE)
prepare_script.wait()
launch_script = subprocess.Popen(['sudo', '-S', '/usr/local/bin/python3.6', '-u', self.path], stdin=prepare_script.stdout, stdout=subprocess.PIPE)
for line in io.TextIOWrapper(launch_script.stdout, encoding="utf-8"):
    print("Received line: ", line.rstrip())
    # Tell progressbar to add another step:
    wx.CallAfter(self.sender, "update", msg="")
Into:
command = r"""/usr/bin/osascript -e 'do shell script "/usr/local/bin/python3.6 -u """ + self.path + r"""" with prompt "Start Deletion Process" with administrator privileges'"""
command_list = shlex.split(command)
launch_script = subprocess.Popen(command_list, stdout=subprocess.PIPE)
for line in io.TextIOWrapper(launch_script.stdout, encoding="utf-8"):
    print("Received line: ", line.rstrip())
    # Tell progressbar to add another step:
    wx.CallAfter(self.sender, "update", msg="")
It stops working because osascript apparently runs in a non-interactive shell. This means script2 doesn't send any output until it is fully finished, causing the progress bar in script1 to stall.
My question thus becomes: how can I make sure to use macOS's native GUI to ask for the sudo password, thus avoiding having to store it in a variable, while still maintaining the possibility to catch the stdout from the privileged script in an interactive / real-time stream?
Hope that makes sense.
Would appreciate any insights!
My question thus becomes: how can I make sure to use macOS's native GUI to ask for the sudo password, thus avoiding having to store it in a variable, while still maintaining the possibility to catch the stdout from the privileged script in an interactive / real-time stream?
I have found a solution myself, using a named pipe (os.mkfifo()).
That way, you can have two python scripts communicate with each other while one of them is launched with privileged rights via osascript (meaning: you get a native GUI window that asks for the user's sudo password).
Working solution:
mainscript.py
import os
import shlex
import subprocess
from threading import Thread
import time

class LaunchDeletionProcess(Thread):

    def __init__(self):
        Thread.__init__(self)

    def run(self):
        launch_command = r"""/usr/bin/osascript -e 'do shell script "/usr/local/bin/python3.6 -u /path/to/priviliged_script.py" with prompt "Start Deletion Process" with administrator privileges'"""
        split_command = shlex.split(launch_command)
        print("Thread 1 started")
        testprogram = subprocess.Popen(split_command)
        testprogram.wait()
        print("Thread 1 finished")

class ReadStatus(Thread):

    def __init__(self):
        Thread.__init__(self)

    def run(self):
        self.wfPath = os.path.expanduser("~/p1")
        while not os.path.exists(self.wfPath):
            time.sleep(0.1)
        print("Thread 2 started")
        self.try_pipe()

    def try_pipe(self):
        rp = open(self.wfPath, 'r')
        response = rp.read()
        print("Receiving response: ", response)
        rp.close()
        if response == "end":  # the privileged script writes "end" when done
            print("Got to end")
            os.remove(self.wfPath)
        else:
            time.sleep(1)
            self.try_pipe()

if __name__ == "__main__":
    thread1 = LaunchDeletionProcess()
    thread2 = ReadStatus()
    thread1.start()
    thread2.start()
priviliged_script.py
import os
import random
import time

wfPath = os.path.expanduser("~/p1")

try:
    os.mkfifo(wfPath)
except OSError:
    print("error")

result = 10
nr = 0

while nr < result:
    random_nr = random.random()
    wp = open(wfPath, 'w')
    print("writing new number: ", random_nr)
    wp.write("Number: " + str(random_nr))
    wp.close()
    time.sleep(1)
    nr += 1

wp = open(wfPath, 'w')
wp.write("end")
wp.close()

Passing data between separately running Python scripts

If I have a python script running (with full Tkinter GUI and everything) and I want to pass the live data it is gathering (stored internally in arrays and such) to another python script, what would be the best way of doing that?
I cannot simply import script A into script B as it will create a new instance of script A, rather than accessing any variables in the already running script A.
The only way I can think of doing it is by having script A write to a file and script B read the data from that file. This is less than ideal, however, as something bad might happen if script B tries to read the file while script A is still writing to it. I am also looking for much faster communication between the two programs.
EDIT:
Here are the examples as requested. I am aware why this doesn't work, but it is the basic premise of what needs to be achieved. My source code is very long and unfortunately confidential, so it is not going to help here. In summary, script A is running Tkinter and gathering data, while script B is views.py as part of Django, but I'm hoping this can be achieved in plain Python.
Script A
import time

i = 0

def return_data():
    return i

if __name__ == "__main__":
    while True:
        i = i + 1
        print(i)
        time.sleep(.01)
Script B
import time

from scriptA import return_data

if __name__ == '__main__':
    while True:
        print(return_data())  # from script A
        time.sleep(1)
You can use the multiprocessing module to implement a Pipe between the two modules. Then you can start one of the modules as a Process and use the Pipe to communicate with it. The best part about using pipes is that you can also pass python objects like dicts and lists through them.
Ex:
mp2.py:
from multiprocessing import Process, Pipe

from mp1 import f

if __name__ == '__main__':
    parent_conn, child_conn = Pipe()
    p = Process(target=f, args=(child_conn,))
    p.start()
    print(parent_conn.recv())  # prints "Hello"
mp1.py:
def f(child_conn):
    msg = "Hello"
    child_conn.send(msg)
    child_conn.close()
If you want to read and modify shared data between two scripts which run separately, a good solution is to take advantage of the python multiprocessing module and use a Pipe() or a Queue() (see differences here). This way you get to sync the scripts and avoid problems regarding concurrency and global variables (like what happens if both scripts want to modify a variable at the same time).
As Akshay Apte said in his answer, the best part about using pipes/queues is that you can pass python objects through them.
Also, there are methods to avoid waiting for data if none has been passed yet (queue.empty() and pipeConn.poll()).
See an example using Queue() below:
# main.py
from multiprocessing import Process, Queue

from stage1 import Stage1
from stage2 import Stage2

s1 = Stage1()
s2 = Stage2()

# S1 to S2 communication
queueS1 = Queue()  # s1.stage1() writes to queueS1
# S2 to S1 communication
queueS2 = Queue()  # s2.stage2() writes to queueS2

# start s2 as another process
s2 = Process(target=s2.stage2, args=(queueS1, queueS2))
s2.daemon = True
s2.start()  # Launch the stage2 process

s1.stage1(queueS1, queueS2)  # start sending stuff from s1 to s2
s2.join()  # wait till s2 daemon finishes
# stage1.py
import random
import time

class Stage1:

    def stage1(self, queueS1, queueS2):
        print("stage1")
        lala = []
        lis = [1, 2, 3, 4, 5]
        for i in range(len(lis)):
            # to avoid unnecessary waiting
            if not queueS2.empty():
                msg = queueS2.get()  # get msg from s2
                print("! ! ! stage1 RECEIVED from s2:", msg)
                lala = [6, 7, 8]  # now that a msg was received, further msgs will be different
            time.sleep(1)  # work
            random.shuffle(lis)
            queueS1.put(lis + lala)
        queueS1.put('s1 is DONE')
# stage2.py
import time

class Stage2:

    def stage2(self, queueS1, queueS2):
        print("stage2")
        while True:
            msg = queueS1.get()  # wait till there is a msg from s1
            print("- - - stage2 RECEIVED from s1:", msg)
            if msg == 's1 is DONE':  # must match the string sent by stage1 exactly
                break  # ends loop
            time.sleep(1)  # work
            queueS2.put("update lists")
EDIT: I just found that you can use queue.get(False) to avoid blocking when receiving data. This way there's no need to check first whether the queue is empty. This is not possible if you use pipes.
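For example, a non-blocking get looks like this (queueS2 is the queue from the example above; note that get(False) raises queue.Empty when there is nothing to read, so it needs a try/except):
import queue  # multiprocessing.Queue raises queue.Empty as well

try:
    msg = queueS2.get(False)  # equivalent to queueS2.get_nowait()
except queue.Empty:
    msg = None  # nothing arrived yet; continue without blocking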
You could use the pickle module to pass data between two python programs.
import pickle

def storeData():
    # initializing data to be stored in db
    employee1 = {'key': 'Engineer', 'name': 'Harrison',
                 'age': 21, 'pay': 40000}
    employee2 = {'key': 'LeadDeveloper', 'name': 'Jack',
                 'age': 50, 'pay': 50000}

    # database
    db = {}
    db['employee1'] = employee1
    db['employee2'] = employee2

    # It's important to use binary mode
    dbfile = open('examplePickle', 'ab')

    # source, destination
    pickle.dump(db, dbfile)
    dbfile.close()

def loadData():
    # for reading, binary mode is also important
    dbfile = open('examplePickle', 'rb')
    db = pickle.load(dbfile)
    for keys in db:
        print(keys, '=>', db[keys])
    dbfile.close()
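A short usage sketch, assuming one script calls storeData() and the other later calls loadData():
storeData()  # in the writing script
loadData()   # in the reading script, once the file exists
Note that because the file is opened in append mode ('ab'), repeated calls to storeData() stack several pickles in the file and pickle.load() will only return the first one; open with 'wb' instead if each run should overwrite the previous data.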
This will pass data between two running scripts over a TCP socket, using ZeroMQ: https://zeromq.org/languages/python/. The required module is installed with pip install pyzmq.
This is called client-server communication. The server waits for the client to send a request, and the client will not run if the server is not running. This communication also lets you send a request from one device (the client) to another (the server), as long as both are on the same network. Change the client's localhost to the actual IP of the server device (on the server side the listen address is marked with *); you can find it in your device's network settings, and note that it may differ from the IP Google reports for you. QUESTION to OP: do you have to have script b always running, or can script b be imported as a module into script a? If so, look up how to make python modules.
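A minimal sketch of that client-server pattern with pyzmq; the port number and message contents here are arbitrary choices for the example:
# server.py - run this first
import zmq

context = zmq.Context()
socket = context.socket(zmq.REP)
socket.bind("tcp://*:5555")          # '*' listens on all interfaces

while True:
    request = socket.recv_string()   # blocks until a client sends a request
    socket.send_string("ack: " + request)
# client.py - replace localhost with the server's IP when on another device
import zmq

context = zmq.Context()
socket = context.socket(zmq.REQ)
socket.connect("tcp://localhost:5555")

socket.send_string("status?")
print(socket.recv_string())          # waits for the server's reply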
I solved the same problem using the lib Shared Memory Dict, a very simple dict implementation built on top of multiprocessing.shared_memory.
Source1.py
from shared_memory_dict import SharedMemoryDict
from time import sleep

smd_config = SharedMemoryDict(name='config', size=1024)

if __name__ == "__main__":
    smd_config["status"] = True
    while True:
        smd_config["status"] = not smd_config["status"]
        sleep(1)
Source2.py
from shared_memory_dict import SharedMemoryDict
from time import sleep

smd_config = SharedMemoryDict(name='config', size=1024)

if __name__ == "__main__":
    while True:
        print(smd_config["status"])
        sleep(1)

Listening for a threading Event in python

First-time SO user, please excuse any etiquette errors. I'm trying to implement a multithreaded program in python and am having trouble. This is no doubt due to a lack of understanding of how threading is implemented, but hopefully you can help me figure it out.
I have a basic program that continually listens for messages on a serial port and can then print/save/process/etc them, which works fine. It basically looks like this:
import serial

def main():
    usb = serial.Serial('/dev/cu.usbserial-A603UBRB', 57600)  # open serial with baud rate
    while True:
        line = usb.readline()
        print(line)
However what I want to do is continually listen for the messages on a serial port, but not necessarily do anything with them. This should run in the background, and meanwhile in the foreground I want to have some kind of interface where the user can command the program to read/use/save these data for a while and then stop again.
So I created the following code:
import time
import serial
import threading

# this runs in the background constantly, reading the serial bus input
class serial_listener(threading.Thread):

    def __init__(self, line, event):
        super(serial_listener, self).__init__()
        self.event = threading.Event()
        self.line = ''
        self.usb = serial.Serial('/dev/cu.usbserial-A603UBRB', 57600)

    def run(self):
        while True:
            self.line = self.usb.readline()
            self.event.set()
            self.event.clear()
            time.sleep(0.01)

# this lets the user command the software to record several values from serial
class record_data(threading.Thread):

    def __init__(self):
        super(record_data, self).__init__()
        self.line = ''
        self.event = threading.Event()
        self.ser = serial_listener(self.line, self.event)
        self.ser.start()  # run thread

    def run(self):
        while True:
            user_input = raw_input('Record data: ')
            if user_input == 'r':
                event_counter = 0
                while event_counter < 16:
                    self.event.wait()
                    print(self.line)
                    event_counter += 1

# this is going to be the mother function
def main():
    dat = record_data()
    dat.start()

# this makes the code behave like C code.
if __name__ == '__main__':
    main()
It compiles and runs, but when I order the program to record by typing r into the CLI, nothing happens. It doesn't seem to be receiving any events.
Any clues how to make this work? Workarounds are also fine; the only thing is that I can't constantly open and close the serial interface. It has to remain open the whole time, or else the device stops working until it is unplugged and replugged.
Instead of using multiple threads, I would suggest using multiple processes. When you use threads, you have to think about the global interpreter lock: you either listen for events or do something in your main thread, but both at the same time will not work.
When using multiple processes, I would use a queue to forward the events from your watchdog that you would like to handle. Or you could code your own event handler. Here you can find an example of multiprocess event handlers.
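A rough sketch of that idea in Python 3, with the serial reading moved into a child process; the device path is taken from the question, but the rest of the structure is an assumption for the example:
import multiprocessing as mp

import serial

def listen(q):
    # Child process: keep the port open the whole time and forward
    # every line to the queue.
    usb = serial.Serial('/dev/cu.usbserial-A603UBRB', 57600)
    while True:
        q.put(usb.readline())

if __name__ == '__main__':
    q = mp.Queue()
    mp.Process(target=listen, args=(q,), daemon=True).start()
    while True:
        if input('Record data: ') == 'r':
            for _ in range(16):
                print(q.get())  # blocks until the listener forwards a line
Because the queue hands each line from the child to the parent, the main process can block on user input without missing data, and the serial port is never reopened.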

DaemonRunner: Detecting if a daemon is already running

I have a script using DaemonRunner to create a daemon process with a pid file. The problem is that if someone tries to start it without stopping the currently running process, it fails silently. What's the best way to detect an existing process and alert the user to stop it first? Is it as easy as checking the pidfile?
My code is similar to this example:
#!/usr/bin/python
import time

from daemon import runner

class App():

    def __init__(self):
        self.stdin_path = '/dev/null'
        self.stdout_path = '/dev/tty'
        self.stderr_path = '/dev/tty'
        self.pidfile_path = '/tmp/foo.pid'
        self.pidfile_timeout = 5

    def run(self):
        while True:
            print("Howdy! Gig'em! Whoop!")
            time.sleep(10)

app = App()
daemon_runner = runner.DaemonRunner(app)
daemon_runner.do_action()
To see my actual code, look at investor.py in:
https://github.com/jgillick/LendingClubAutoInvestor
Since DaemonRunner handles its own lockfile, it is wiser to refer to that one, to be sure you can't mess things up. Maybe this block can help you with that:
Add
from lockfile import LockTimeout
to the beginning of the script and surround daemon_runner.do_action() like this:
try:
    daemon_runner.do_action()
except LockTimeout:
    print("Error: couldn't acquire lock")
    # you can exit here or try something else
This is the solution that I decided to use:
lockfile = runner.make_pidlockfile('/tmp/myapp.pid', 1)

if lockfile.is_locked():
    print('It looks like a daemon is already running!')
    exit()

app = App()
daemon_runner = runner.DaemonRunner(app)
daemon_runner.do_action()
Is this a best practice or is there a better way?
