create subprocess object from already running process - python

I would like to create a subprocess.Popen object from an already running process... Is that possible somehow?
Another idea would be to serialize (pickle) the subprocess object and write it to a database, so that if the main process restarts it could get the subprocess.Popen objects back from the database. I'm not sure whether that works.

create a subprocess.Popen object from an already running process
Do you mean from an already running sub-process? The only way I know of to pass objects between processes is to pickle them and write them out to either a file or a database, as you suggested.
Typically, sub-processes cannot be spawned from already running sub-processes, but you can keep a reference to the new process you want to create and spawn it from the main process. This could get really ugly, and I strongly suggest against it. Why, specifically, do you need to grow your process tree past two levels deep? That info might lead to a better answer.

Assuming you want to communicate with the "subprocess" and must do so using its standard I/O streams, you could create a wrapper around the executable that maps its stdin/stdout/stderr to a socket or named pipe.
The program that intends to control the "subprocess" can then start and stop communications at any time. You may also have to provide a locking mechanism.
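For instance, here is a minimal sketch of such a wrapper, assuming a Unix domain socket at a made-up path (real use would still need the locking and reconnect handling mentioned above):

    # wrapper.py -- rough sketch: expose a program's stdio on a Unix socket.
    # The socket path is a hypothetical placeholder.
    import os
    import socket
    import sys

    SOCK_PATH = "/tmp/wrapped_proc.sock"

    try:
        os.unlink(SOCK_PATH)            # remove a stale socket from a previous run
    except FileNotFoundError:
        pass

    srv = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    srv.bind(SOCK_PATH)
    srv.listen(1)
    conn, _ = srv.accept()              # wait for the controlling program

    # Point the child's stdin/stdout/stderr at the connection, then exec
    # the real executable, e.g.: python wrapper.py my_program --flags
    fd = conn.fileno()
    os.dup2(fd, 0)
    os.dup2(fd, 1)
    os.dup2(fd, 2)
    os.execvp(sys.argv[1], sys.argv[1:])

The controlling program then connects to /tmp/wrapped_proc.sock to reach the wrapped process's stdio.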
Then, assuming you're on Linux, you can access the stdin/out/err of a running process through /proc/<pid>/fd/<0,1,2>. You won't connect these to a subprocess.Popen object, but open('/proc/<pid>/fd/1', 'rb') will behave like Popen().stdout.
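A quick sketch of that, with a made-up PID, and with the caveat that it only makes sense if the target's descriptors are pipes rather than a terminal:

    pid = 12345  # hypothetical PID of the already running process

    # Reading taps whatever the process's stdout descriptor points to; if
    # it is a pipe, you become a second reader competing with the real one.
    proc_stdout = open(f"/proc/{pid}/fd/1", "rb")
    proc_stdin = open(f"/proc/{pid}/fd/0", "wb")

    proc_stdin.write(b"hello\n")
    proc_stdin.flush()
    print(proc_stdout.readline())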

Related

How to detect any new process starting or stopping in the OS?

I want to perform some operations in my application whenever any listed process starts or stops in Windows.
I read through the subprocess module, but it didn't give me a clue.
Does anyone know which method is called when a new process is started, or which os module method I could override to achieve my goal?
Please help me resolve this.
The best way to do this is to take a snapshot of all running processes when your program starts and store it in some sort of array. Then have a thread that constantly re-reads the process list and compares it against the stored snapshot; whenever it detects a change, it can send "an interrupt" (a notification) to the main process, as sketched below.
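A minimal polling sketch of that idea, using the third-party psutil package for cross-platform process enumeration (an assumption; the answer itself doesn't name a library):

    import time
    import psutil

    known = set(psutil.pids())          # snapshot of PIDs at startup

    while True:                         # run this loop in a background thread
        current = set(psutil.pids())
        for pid in current - known:
            print("process started:", pid)
        for pid in known - current:
            print("process stopped:", pid)
        known = current
        time.sleep(1)                   # poll interval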

Passing argument to constantly running Python application

I have Python code running constantly on Linux; every so often, outside data needs to be fed into this code so it can alter a file.
How do I structure the Python code so that it receives these arguments for further processing?
I found some material on outgoing arguments (Running external program using pipes and passing arguments in python), but I'm looking for incoming arguments.
I'm flexible about how the arguments get passed in.
You need some kind of inter-process communication (IPC).
For example, you can feed the program's standard input. You can read it from sys.stdin, but this requires the program that started your process to hand a handle on that stream to whichever process wants to send data.
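Reading from standard input is just a loop over sys.stdin; for example:

    import sys

    # Each line arriving on stdin is treated as one incoming argument.
    for line in sys.stdin:
        arg = line.strip()
        print("received:", arg)         # ...alter the file based on `arg` here...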
Another way is to create a socket of some kind. That's far more scalable: it allows connecting to the program even when it's running on another machine, and it lets non-Python processes communicate with yours easily.
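A bare-bones sketch of the socket approach; the port number is an arbitrary choice:

    import socket

    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 9000))       # arbitrary local port
    srv.listen(1)

    while True:
        conn, _ = srv.accept()          # one short-lived connection per argument
        data = conn.recv(4096).decode()
        print("received:", data)        # ...process the incoming argument here...
        conn.close()

Anything that can open a TCP connection -- another Python script, nc, a remote machine -- can then feed arguments in.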

User Input Python Script Executing Daemon

I am working on a web service that requires user-input Python code to be executed on my server (we have checks for code injection). I have to import a rather large module, so I would like to make sure that I am not starting up Python and importing the module from scratch each time something runs (that takes about 4-6 s).
To do this I was planning to create a Python (3.2) daemon that imports the user-input code as a module, executes it, and then deletes/garbage-collects that module. I need to make sure that the module is completely gone from RAM, since this process will keep running until the server is restarted. I have read a number of things saying this is very difficult to do in Python.
What is the best way to do this? Would it be better to use exec to define a function from the user's code (for variable scoping), execute that function, and then somehow remove the function? Or is there a better approach that I have missed?
You could perhaps consider creating a pool of Python daemon processes?
Their purpose would be to serve one request each and then die.
You would have to write a pool manager that ensures there are always X daemon processes waiting for an incoming request (X being the number of idle workers, chosen to match the expected workload). The pool manager would observe the pool and start new instances every time a process finishes, as in the sketch below.
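A rough sketch of that design with multiprocessing; the expensive import and the pool size are hypothetical placeholders:

    import multiprocessing
    import time

    POOL_SIZE = 4  # "X": how many idle workers to keep waiting

    def worker(request_queue):
        # import big_module  # hypothetical: pay the slow import before any request
        user_code = request_queue.get()   # block until handed exactly one job
        exec(user_code, {})               # run the user's code in a throwaway namespace
        # The process exits here, so everything the user code imported or
        # defined is reclaimed wholesale by the OS -- no manual cleanup.

    def spawn(request_queue):
        p = multiprocessing.Process(target=worker, args=(request_queue,))
        p.start()
        return p

    if __name__ == "__main__":
        requests = multiprocessing.Queue()
        pool = [spawn(requests) for _ in range(POOL_SIZE)]
        while True:                       # the pool-manager loop
            pool = [p for p in pool if p.is_alive()]
            while len(pool) < POOL_SIZE:  # replace workers that have died
                pool.append(spawn(requests))
            time.sleep(0.5)
            # requests.put(some_user_code) would dispatch a job here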

SSL error after python/django fork

I've got a Python/Django app where part of it parses a large file. This takes forever, so I put in a fork to handle the processing, allowing the user to continue to browse the site. Within the forked code there are a number of calls to our Postgres database, hosted on Amazon.
I'm getting the following error:
SSL error: decryption failed or bad record mac
Here's the code:
    pid = os.fork()
    if pid == 0:
        lengthy_code_here(long)
        database_queries(my_database)
        os._exit(0)
None of my database calls are working, although they were working just fine before I inserted the fork. After looking around a little, it seems like it might be a stale database connection, but I'm not sure how to fix it. Does anyone have any ideas?
Forking while holding a socket open (such as a database connection) is generally not safe, as both processes will end up trying to use the same socket at once.
You will need, at a minimum, to close and reopen the database connection after forking.
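Building on the question's snippet (with its placeholder names), a minimal version of that fix might look like this: django.db.connection.close() discards the connection inherited from the parent, and Django transparently opens a fresh one on the next query.

    import os
    from django.db import connection

    pid = os.fork()
    if pid == 0:
        connection.close()              # discard the inherited connection;
                                        # Django reconnects on the next query
        lengthy_code_here(long)         # the question's placeholder work
        database_queries(my_database)
        os._exit(0)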
Ideally, though, this is probably better suited for a task queueing system like Celery.
Django in production typically has a dispatcher process handing requests off to a pool of long-running Django/Python processes, i.e. processes that do NOT terminate after handling one request. Rather, they handle a request, then another, and so on. This means that any change which is not restored/cleaned up at the end of servicing a request will affect future requests.
When you fork a process, the child inherits various things from the parent, including all open descriptors (files, queues, directories). Even if you do nothing with the descriptors, there is still a problem, because when a process dies all its open descriptors are cleaned up.
So when you fork from a long-running process, you are setting yourself up to have all the open descriptors (such as the SSL connection) closed when the child process dies after it finishes its work. There are ways to prevent this from happening in a fork, but they can be difficult to get right.
A better design is not to fork, but instead to hand off to another process that is either already running or started in a safer manner. For example:
at(1) can be used to queue up jobs for later (or immediate) execution
message queues can be used to pass messages to other daemons
standard IPC constructs such as pipes can be used to communicate to other daemons
Update: if you want to use at(1), you will have to create a standalone script. You can use a serializer to pass the data from Django to the script.
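For example, a sketch of handing the work to at(1), with made-up script and pickle paths (at reads the command to run from its standard input):

    import pickle
    import subprocess

    # Serialize the job data where the standalone script can find it.
    with open("/tmp/job.pickle", "wb") as f:
        pickle.dump({"upload": "big_file.csv"}, f)

    # Queue the standalone script for immediate execution outside Django.
    subprocess.run(
        ["at", "now"],
        input=b"python /path/to/process_upload.py /tmp/job.pickle\n",
    )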

How do I execute two programs from python at the same time?

This post explains how to launch a single external program from Python.
How shall I launch multiple programs (or threads) at the same time?
My intended application is a video slide show: I want to launch an image sequence player and a music player at the same time.
Thanks in advance.
subprocess.Popen doesn't block unless you explicitly ask it to by calling communicate on the returned object, so you can call it more than once to start more than one process.
If you do need to communicate with both sub-processes simultaneously (read their STDOUT, for instance), then invoke subprocess.Popen in separate threads. Each thread can manage a sub-process and communicate with it. Naturally, this leaves you to do all the synchronization but that highly depends on your specific application.
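So the slide show boils down to two Popen calls; the player commands here are hypothetical:

    import subprocess

    # Neither call blocks; both players run concurrently.
    images = subprocess.Popen(["image_player", "slides/"])
    music = subprocess.Popen(["music_player", "soundtrack.mp3"])

    # Optionally wait for both to finish before exiting.
    images.wait()
    music.wait()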
