Python: Use multiple Python windows for the same program - python

Is there a way for a Python program to open and manage multiple terminal windows? For example, could my script open two windows so that when I write something in the first one, like with
d = input()
it gets printed in the second one? I'd like to avoid using sockets if possible, and also avoid Python GUI libraries like Tkinter... But if there's no other way that's okay, just avoid it if possible.
Thanks.

Yes you can!!
1) You can save your input/output data to a file and read it from another Python script in real time (see the sketch after this list).
2) You can use the multiprocessing module to run and coordinate multiple processes; read more at: https://pymotw.com/2/multiprocessing/basics.html
3) You can use the threading module to run multiple threads; read more at: https://www.tutorialspoint.com/python/python_multithreading.htm
4) You can use the sys and subprocess modules to communicate between processes over pipes.
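For option 1, a minimal sketch of the file-based idea, assuming a shared file named shared_io.txt (an arbitrary name I picked) and that you start the two scripts yourself in two separate terminal windows:

# writer.py - run in the first terminal window; appends each typed line to the shared file
while True:
    d = input()
    with open("shared_io.txt", "a") as f:
        f.write(d + "\n")

# reader.py - run in the second terminal window; polls the shared file and prints any new lines
import os
import time

position = 0
while True:
    if os.path.exists("shared_io.txt"):
        with open("shared_io.txt") as f:
            f.seek(position)          # continue from where we stopped last time
            for line in f:
                print(line, end="")
            position = f.tell()
    time.sleep(0.5)

This is only a sketch of the polling idea; for anything more than simple text lines, the multiprocessing or subprocess approaches above are more robust.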

Related

How can I run a Python file from another file, then have the new file restart the first file?

So far I don't think this is actually possible, but basically what I am trying to do is have one python program call another and run it, like how you would use import.
But then I need to be able to go from the second file back to the beginning of the first.
Doing this with import doesn't work because the first program never closed and is still running, so running it again only returns to where it left off when it ran the second file.
Without understanding a bit more about what you want to do, I would suggest looking into the threading or multiprocessing libraries. These should allow you to create multiple instances of a program or function.
This is vague and I'm not quite sure what you're trying to do, but you can also explore the Subprocess module for Python. It will allow you to spawn new processes similarly to if you were starting them from the command-line, and your processes will also be able to talk to the child processes via stdin and stdout.
If you don't want to import any modules:
exec(open("file.py").read())
Otherwise:
import os
os.system('python file.py')
Or:
import subprocess
subprocess.call(['python', 'file.py'])
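To cover the "restart the first file" part of the question, one sketch (assuming the two scripts are named first.py and second.py, which are placeholder names) is to have the second script finish its work and then replace its own process with a fresh run of the first using os.execv:

# second.py - sketch: do this script's work, then restart first.py from the top
import os
import sys

# ... work that second.py needs to do goes here ...

# Replace the current process with a brand-new interpreter running first.py,
# so first.py starts again from the beginning instead of resuming.
os.execv(sys.executable, [sys.executable, "first.py"])

On Windows the os.exec* family behaves somewhat differently from POSIX, so launching first.py with subprocess.Popen and then calling sys.exit() is a safer cross-platform alternative.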

Import Python library in terminal

I need to run a Python script in a terminal several times. This script requires me to import some libraries, so every time I call the script in the terminal the libraries are loaded again, which wastes time. Is there any way I can import the libraries once and for all at the beginning?
(If I try the "naive" way, first calling a script just to import the libraries and then running my code, it doesn't work.)
EDIT: I need to run the script in a terminal because it actually serves another program developed in Java. The Java code calls the Python script in the terminal, reads its result and processes it, then calls it again.
One solution is to leave the Python script running permanently and use a pipe to communicate between processes, like the code below, taken from this answer.
import os, time
pipe_path = "/tmp/mypipe"
if not os.path.exists(pipe_path):
    os.mkfifo(pipe_path)
# Open the fifo. We need to open in non-blocking mode or it will stall until
# someone opens it for writing.
pipe_fd = os.open(pipe_path, os.O_RDONLY | os.O_NONBLOCK)
with os.fdopen(pipe_fd) as pipe:
    while True:
        message = pipe.read()
        if message:
            print("Received: '%s'" % message)
        print("Doing other stuff")
        time.sleep(0.5)
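For completeness, a sketch of the sending side, assuming the same /tmp/mypipe path: once the script above has the FIFO open for reading, any other process can open it for writing and drop a message in, for example:

import os

# Open the same FIFO for writing and send one message to the long-running reader.
# Opening for writing blocks until a reader has the FIFO open, which the script above already does.
with open("/tmp/mypipe", "w") as pipe:
    pipe.write("hello from another process")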
The libraries will be unloaded once the script finishes, so the best way to handle this is to write the script so it can iterate however many times you want, rather than running the whole script multiple times. I would likely use input() (or raw_input() if you're running Python 2) to read in however many times you want to iterate, or use a library like click to create a command-line argument for it.
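A minimal sketch of that long-running approach, assuming the Java side writes one request per line to the Python process's stdin and reads one result line back; heavy_library and process() below are placeholders for your real imports and logic:

import sys
# import heavy_library   # placeholder: the expensive imports happen only once, at startup

def process(request):
    # placeholder for the real work done on each request
    return request.upper()

for line in sys.stdin:       # the Java program writes one request per line
    print(process(line.strip()))
    sys.stdout.flush()       # flush so Java sees each result immediately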

How to run a Python program from C++

I am trying to make a program in C++, but I can't finish it because in one part of the code I need to run a Python program from C++ and I don't know how to do it. I've tried many ways of doing it but none of them worked. The code should look something like this: somethingtoruntheprogram("pytestx.py"); or something close to that. I'd prefer doing it without Python.h. I just need to execute the program, because I have redirected the output and input of the Python program with sys.stdout and sys.stdin to text files, and then I need to take data from those text files and compare them. I am using Windows.
You have two ways of doing that:
Use system/fork and exec*/...
Embed a Python interpreter in your program (cf. the Python 2.6 docs or Boost.Python)
Using an embedded interpreter is (IMHO) the best way to do it, because it gives you more control over the execution of the script, it's not OS-dependent, and it does not rely on your target having a Python interpreter (configured as you require).
There's POSIX popen and on Windows _popen, which is halfway between exec and system. It offers the required control over stdin and stdout, which system does not. But on the other hand, it's not as complicated as the exec family of functions.

Terminating an application through a Python script

How do I close (terminate) Windows applications using a Python script? When I switch on my PC, I find many applications running, like MSOSYNC.exe and ONENOTEM.exe along with many others, which are not very useful. I want to close them. I tried the "subprocess" module and some others, but they are not working. Which method should I use?
You're using the Popen class to construct a new object in your example already. It has methods to deal with it. Read the documentation:
import subprocess
proc = subprocess.Popen(['c:\\windows\\system32\\notepad.exe','C:\\file1.txt'])
proc.terminate()
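If the goal is to close applications that are already running when the PC starts (like MSOSYNC.exe), rather than processes you launched yourself with Popen, one sketch is to call the Windows taskkill command through subprocess; the image names below are the ones from the question and /F forces termination:

import subprocess

# Force-terminate each application by its image name.
for name in ["MSOSYNC.exe", "ONENOTEM.exe"]:
    subprocess.call(["taskkill", "/F", "/IM", name])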

Feasibility of using pipe for ruby-python communication

Currently, I have two programs, one running in Ruby and the other in Python. I need to read a file in Ruby, but first I need a library written in Python to parse the file. Currently, I use XMLRPC to have the two programs communicate. Porting the Python library to Ruby is out of the question. However, I have read that using XMLRPC has some performance overhead. Recently, I read that another solution for the Ruby-Python conundrum is the use of pipes. So I tried to experiment with that. For example, I wrote this master script in Ruby:
(0..2).each do
  slave = IO.popen(['python','slave.py'],mode='r+')
  slave.write "master"
  slave.close_write
  line = slave.readline
  while line do
    sleep 1
    p eval line
    break if slave.eof
    line = slave.readline
  end
end
The following is the Python slave:
import sys
cmd = sys.stdin.read()
while cmd:
    x = cmd
    for i in range(0,5):
        print "{'%i'=>'%s'}" % (i, x)
        sys.stdout.flush()
    cmd = sys.stdin.read()
Everything seems to work fine:
~$ ruby master.rb
{"0"=>"master"}
{"1"=>"master"}
{"2"=>"master"}
{"3"=>"master"}
{"4"=>"master"}
{"0"=>"master"}
{"1"=>"master"}
{"2"=>"master"}
{"3"=>"master"}
{"4"=>"master"}
{"0"=>"master"}
{"1"=>"master"}
{"2"=>"master"}
{"3"=>"master"}
{"4"=>"master"}
My question is, is it really feasible to implement the use of pipes for working with objects between Ruby and Python? One consideration is that there may be multiple instances of master.rb running. Will concurrency be an issue? Can pipes handle extensive operations and objects to be passed in between? If so, would it be a better alternative for RPC?
Yes, it is feasible. No, concurrency should not be an issue. If you implement it properly, pipes can handle it. Whether it's a better alternative to RPC depends on what your application needs.
Basically, if all you need is simple data passing, pipes are fine; if you need to be constantly calling functions on objects in your remote process, then you'll probably be better off using some form of existing RPC instead of reinventing the wheel. Whether that should be XMLRPC or something else is another matter.
Note that RPC will have to use some underlying IPC mechanism, which could well be pipes, but might also be sockets, message queues, shared memory, whatever.
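As a small hardening of the experiment above (a sketch, not a drop-in replacement): instead of having the Ruby master eval strings from the slave, the Python side could emit one JSON object per line and the Ruby side would parse it with JSON.parse. Only the Python half is shown here; it mirrors the slave above and runs under both Python 2 and 3:

import sys
import json

cmd = sys.stdin.read()
while cmd:
    for i in range(5):
        # One JSON object per line instead of a Ruby-eval'able hash literal
        sys.stdout.write(json.dumps({str(i): cmd}) + "\n")
        sys.stdout.flush()
    cmd = sys.stdin.read()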
