Python: interacting with STDIN/OUT of a running process in *nix - python

Is there any way of attaching a console's STDIN/STDOUT to an already running process?
Use Case:
I have a python script which runs another python script on the command line using popen.
Let's say foo.py runs popen to run python bar.py.
Then bar.py blocks on input. I can get the PID of python bar.py. Is there any way to attach a new console to the running python instance in order to be able to work interactively with it? This is specifically useful because I want to run pdb inside of bar.py.

No way. But you can modify the way you start bar.py so that you are prepared to take over its stdin and stdout later.
A simple method would be to create named pipes and supply these as stdin/stdout in the Popen call. You may then connect to these pipes from your shell (exec <pipe1 >pipe2) and interact. This has the disadvantage that you must be connected to the pipes to see what the process is doing. While you can work around that by using tee on a log file, depending on the terminal capability demands of bar.py, this kind of interaction may not be the greatest pleasure.
A better way could be to incorporate a terminal multiplexer like GNU screen or tmux into your process tree. These tools create a virtual terminal in which the application is run. You may then at any time attach and detach any other terminal to this terminal buffer. In your specific case, foo.py would run screen or tmux, which will run python bar.py in a complete (VT100) terminal emulation. Maybe this will solve your problem.
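A minimal sketch of that named-pipe setup, with illustrative paths under /tmp:
import os
import subprocess

# Hypothetical FIFO paths; bar.py will read its input from bar_in and write to bar_out.
os.mkfifo("/tmp/bar_in")
os.mkfifo("/tmp/bar_out")
# Opening a FIFO blocks until the other end is opened, so connect from another
# terminal (e.g. cat /tmp/bar_out &  then  cat > /tmp/bar_in) around this point.
with open("/tmp/bar_in", "r") as fin, open("/tmp/bar_out", "w") as fout:
    subprocess.run(["python", "bar.py"], stdin=fin, stdout=fout)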
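For the multiplexer route, a minimal sketch of what foo.py could run, assuming tmux is installed and using an illustrative session name:
import subprocess

# Start bar.py inside a detached tmux session named "bar".
subprocess.run(["tmux", "new-session", "-d", "-s", "bar", "python", "bar.py"])
# Later, from any terminal, attach interactively with:  tmux attach -t bar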

You cannot redirect stdin or stdout for a running process. You can, however, add code to your caller -- foo.py -- that will read from foo.py's stdin and forward it to bar.py's stdin, and forward bar.py's stdout to foo.py's stdout.
In this model, foo.py would connect bar.py's stdin and stdout to pipes and would be responsible for shuttling data between those pipes and the real stdin/stdout.
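A minimal sketch of that shuttling idea, assuming foo.py starts bar.py roughly like this:
import subprocess
import sys
import threading

bar = subprocess.Popen([sys.executable, "bar.py"],
                       stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True)

def pump(src, dst):
    # Copy lines from one stream to the other until the source closes.
    for line in src:
        dst.write(line)
        dst.flush()

# Relay our own stdin to bar.py, and bar.py's stdout back to our stdout.
threading.Thread(target=pump, args=(sys.stdin, bar.stdin), daemon=True).start()
pump(bar.stdout, sys.stdout)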

Related

Python execute code in parent shell upon exit

I have a search program that helps users find files on their system. I would like it to perform tasks such as opening the file in an editor, or changing the parent shell's working directory to the file's parent folder when my Python program exits.
Right now I achieve this by running a bash wrapper that executes the commands the Python program writes to stdout. I was wondering if there was a way to do this without the wrapper.
Note:
subprocess and os commands create a subshell and do not alter the parent shell. This is an acceptable answer for opening a file in the editor, but not for moving the current working directory of the parent shell to the desired location on exit.
An acceptable alternative might be to open a subshell in a desired directory.
Example:
#this opens a bash shell, but I can't send it to the right directory
subprocess.run("bash")
This, if doable, would require quite a hack, because the working directory is passed from the shell to the subprocess (in this case, the Python process) as a variable owned by the subprocess, and changing it won't modify what is in the parent program.
On Unix, it may be achievable by spawning a detached sub-process that pipes keystrokes into the TTY after the main program exits - I find this more likely to succeed than anything else.
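A rough sketch of that keystroke-injection hack, assuming a Linux kernel that still permits the TIOCSTI ioctl on the target terminal (newer kernels restrict or disable it); the path and injected command are illustrative:
import fcntl
import termios

def type_into_tty(text, tty_path="/dev/tty"):
    # Push each character into the terminal's input queue as if it had been typed.
    with open(tty_path, "wb", buffering=0) as tty:
        for byte in text.encode():
            fcntl.ioctl(tty, termios.TIOCSTI, bytes([byte]))

# Run from a detached helper after the main program has exited:
type_into_tty("cd /desired/directory\n")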

Python controlling Tcl shell

Is there a way in Python to launch a Tcl shell or a Unix shell and take control of it to perform operations? I understand that the subprocess and Tkinter modules can be used to execute shell/Tcl scripts and get the output.
Specifically, this will be a wrapper around the Tcl shell/tool that controls its stdin/stdout. I have tried establishing a client-server connection, but in order for Python to control the Tcl shell the connection needs to be established manually. What should the approach be here?
With Tkinter, you have a Tcl interpreter inside the same process as Python. That's fine for doing conventional stuff (or Tk) but if you've got some sort of custom Tcl interpreter to control, you'll probably want to launch that as a subprocess connected by pipes to the parent Python process.
The subprocess module in Python has most of what you'll need.
import subprocess
# "tclimplementation" stands for the actual Tcl interpreter executable (e.g. tclsh);
# text=True makes the pipes text streams so we can exchange str rather than bytes
p = subprocess.Popen(["tclimplementation"], stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True)
# Turn off buffering on the Tcl side
print("fconfigure stdout -buffering none", file=p.stdin, flush=True)
# Also should read from p.stdout; here's how
line = p.stdout.readline().strip()
Telling the Tcl side to fconfigure stdout -buffering none is important, as Tcl (in common with most programs) defaults to buffering multiple kilobytes when writing to anything other than a terminal (great for performance, not so great for interactive use).
Now that you can send messages back and forth, you need to think more about how to actually control the other side. That's rather application specific.
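For instance, a small request/response helper on top of the pipes above might look like this (assuming the Popen call above used text mode, and that each command prints exactly one line via puts):
def ask_tcl(command):
    # Send one Tcl command and read back a single line of output.
    print(command, file=p.stdin, flush=True)
    return p.stdout.readline().strip()

print(ask_tcl("puts [expr {6 * 7}]"))   # prints 42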

Prohibit reading from /dev/tty

Tools like sudo read from /dev/tty to read a password.
I would like to avoid this.
The subprocess should not be able to read /dev/tty. The subprocess should fail immediately instead of waiting for input forever.
I am using the subprocess module of Python. The subprocess should fail if it tries to read from /dev/tty.
Remember: the tool sudo is just an example. A fancy command line argument to sudo does not solve my problem. This should work for all Linux command line tools.
Question: How to make any tool fail as soon as it wants to read from /dev/tty (called via the subprocess module of Python)?
Background: This is a normal Linux user process, not root.
Since Python 3.2, Popen takes a start_new_session argument, which causes the executed process to be started detached from the current controlling terminal by calling setsid() prior to executing the child process.
So all you should need is to start the process with start_new_session=True.
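A minimal sketch, assuming Python 3.7+ on a POSIX system (sudo here is only an illustrative command):
import subprocess

# setsid() detaches the child from the controlling terminal, so an attempt
# to open /dev/tty fails instead of blocking on a password prompt.
proc = subprocess.run(
    ["sudo", "ls"],
    start_new_session=True,
    stdin=subprocess.DEVNULL,
    capture_output=True,
    text=True,
)
print(proc.returncode, proc.stderr)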

How do I grab the console output (and only the output) of another program's application?

I am writing Python code that will work as a daemon on a Raspberry Pi. However, the person I am writing this for wants to see the raw output it produces while it is running, not just my log files.
My first idea was to use a bash script with the Screen program, but that has some features I CANNOT have, mainly the ability to kill the program through Screen.
Is there a way I can write a program (preferably Python) or bash script that is able to read the output of another running program, but doesn't send anything to it?
Thanks.
The easiest is just the bash command
tail -f my_logfile.txt
This assumes you are already logging to a file.
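A rough Python equivalent of that tail -f idea, in case it needs to happen inside a script (the file name is illustrative):
import time

def follow(path):
    # Yield new lines as they are appended to the file.
    with open(path) as f:
        f.seek(0, 2)            # jump to the current end of the file
        while True:
            line = f.readline()
            if line:
                yield line
            else:
                time.sleep(0.5)

for line in follow("my_logfile.txt"):
    print(line, end="")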
If you wish to snoop on another process's stdout stream, you can use strace (also from bash):
export PID=$(pgrep my_program)
strace -p$PID -s9999 -e write
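If that has to happen from Python rather than the shell, a rough sketch wrapping the same strace call (Linux only, and ptrace must be permitted for the target PID):
import subprocess

pid = "12345"  # illustrative PID of the process to snoop on
snoop = subprocess.Popen(
    ["strace", "-p", pid, "-s", "9999", "-e", "trace=write"],
    stderr=subprocess.PIPE, text=True)
# strace prints its trace on stderr; echo it as it arrives.
for line in snoop.stderr:
    print(line, end="")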
You may get the stdout stream by calling
cat /proc/<pid>/fd/1

Interacting with another command line program in Python

I need to write a Python script that can run another command line program and interact with its stdin and stdout streams. Essentially, the Python script will read from the target command line program, intelligently respond by writing to its stdin, and then read the results from the program again. (It would do this repeatedly.)
I've looked through the subprocess module, and I can't seem to get it to do this read/write/read/write thing that I'm looking for. Is there something else I should be trying?
To perform such detailed interaction (when, outside of your control, the other program may be buffering its output unless it thinks it's talking to a terminal) needs something like pexpect -- which in turn requires pty, a Python standard library module that (on operating systems that allow it, such as Linux and Mac OS X) implements "pseudo-terminals".
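A minimal pexpect sketch, where the program name and prompt text are only placeholders:
import pexpect

child = pexpect.spawn("other_program")
child.expect("> ")               # wait for the program's prompt
child.sendline("some command")   # write a response to its stdin
child.expect("> ")               # wait for the next prompt
print(child.before.decode())     # output produced before that prompt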
Life is harder on Windows, but maybe this zipfile can help -- it's supposed to be a port of pexpect to Windows (sorry, I have no Windows machine to check it on). The project in question, called wexpect, lives here.
See the question
wxPython: how to create a bash shell window?
There I have given a full-fledged interaction with a bash shell,
reading stdout and stderr and communicating via stdin.
The main part is an extension of this code:
from subprocess import Popen, PIPE
# text=True so we can write/read str instead of bytes
bp = Popen('bash', shell=False, stdout=PIPE, stdin=PIPE, stderr=PIPE, text=True)
bp.stdin.write("ls\n")
bp.stdin.flush()   # make sure the command actually reaches bash
bp.stdout.readline()
If we try to read all the data at once it will block, so the script I have linked does the reading in a thread. That is a complete wxPython app partially mimicking a bash shell.
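A minimal, self-contained sketch of that threaded-reader idea (the command sent is only an example):
from subprocess import Popen, PIPE
from threading import Thread

bp = Popen(["bash"], stdin=PIPE, stdout=PIPE, stderr=PIPE, text=True)

def drain(stream):
    # Keep reading so the pipe never fills up and blocks bash.
    for line in stream:
        print(line, end="")

Thread(target=drain, args=(bp.stdout,), daemon=True).start()
Thread(target=drain, args=(bp.stderr,), daemon=True).start()

bp.stdin.write("ls\n")
bp.stdin.flush()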
