Import Python library in terminal

I need to run a Python script in a terminal, several times. This script requires me to import some libraries. So every time I call the script in the terminal, the libraries are loaded again, which results in a loss of time. Is there any way I can import the libraries once and for all at the beginning?
(If I try the "naive" way, first calling a script just to import the libraries and then running my code, it doesn't work.)
EDIT: I need to run the script in a terminal because it is actually meant to serve another program written in Java. The Java code calls the Python script in the terminal, reads its result and processes it, then calls it again.

One solution is to leave the Python script running permanently and use a named pipe (FIFO) to communicate between the processes, as in the code below, taken from this answer:
import os, time

pipe_path = "/tmp/mypipe"
if not os.path.exists(pipe_path):
    os.mkfifo(pipe_path)

# Open the fifo. We need to open in non-blocking mode or it will stall until
# someone opens it for writing.
pipe_fd = os.open(pipe_path, os.O_RDONLY | os.O_NONBLOCK)
with os.fdopen(pipe_fd) as pipe:
    while True:
        message = pipe.read()
        if message:
            print("Received: '%s'" % message)
        print("Doing other stuff")
        time.sleep(0.5)
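The other end of the pipe can be exercised from a second terminal or script; here is a minimal writer sketch, assuming the same /tmp/mypipe path (the Java side would write its requests into the FIFO the same way). The message text is purely illustrative:

import os

# Hypothetical writer side: open the same FIFO and send one request.
# open() for writing succeeds because the reader above already has the
# FIFO open on its end.
pipe_path = "/tmp/mypipe"
with open(pipe_path, "w") as pipe:
    pipe.write("process this request")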

The libraries will be unloaded once the script finishes, so the best way to handle this is to write the script so it can iterate as many times as you need, rather than running the whole script multiple times. I would likely use input() (or raw_input() if you're running Python 2) to read in how many times you want to iterate, or use a library like click to turn it into a command-line argument.
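A related sketch, assuming Python 3 and that the script instead loops over requests arriving one per line on stdin (which fits the Java-calls-Python setup above); numpy stands in for whatever expensive import the asker has:

import sys
import numpy as np  # illustrative expensive import; loaded only once, at startup

for line in sys.stdin:
    path = line.strip()
    if not path:
        continue
    data = np.loadtxt(path)        # hypothetical per-request work
    print(data.sum(), flush=True)  # flush so the caller sees the result immediately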

Related

how can I run python file from another file, then have the new file restart the first file?

So far I don't think this is actually possible, but basically what I am trying to do is have one python program call another and run it, like how you would use import.
But then I need to be able to go from the second file back to the beginning of the first.
Doing this with import doesn't work because the first program never closed and will still be running, so running it again will only return to where it left off when it ran the second file.
Without knowing a bit more about what you want to do, I would suggest looking into the threading or multiprocessing libraries. These should allow you to create multiple instances of a program or function; see the sketch just below.
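A minimal sketch of that suggestion, where first_file is a hypothetical module (the asker's "first" program) assumed to expose a main() function; each Process starts it fresh, from the top:

from multiprocessing import Process
import first_file  # hypothetical module: the "first" program, with a main()

if __name__ == '__main__':
    p = Process(target=first_file.main)  # a fresh process, so it restarts cleanly
    p.start()
    p.join()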
This is vague and I'm not quite sure what you're trying to do, but you can also explore the subprocess module for Python. It will allow you to spawn new processes much as if you were starting them from the command line, and your process will also be able to talk to the child processes via stdin and stdout.
If you don't want to import any modules:
exec(open("file.py").read())
Otherwise:
import os
os.system("python file.py")
Or:
import subprocess
subprocess.call(["python", "file.py"])

How can I pass a python variable to a process in the command prompt

I have a Python loop that creates a new image at a new path on each iteration. I want to send that path to an already-running process that is waiting for a path at the command prompt.
In more detail:
I am running a self-driving simulator in Python, and every 5 frames I want to test the current frame (which is saved to a RAM disk) with an object detection algorithm I have trained (it takes around 9 ms to detect my object BUT 2 seconds to open the process). Currently I execute the trained algorithm using the subprocess module, but the problem is that once that process is opened (I open it just once, when I run the main script) it waits for an image path. I believe that with your tips I am close to the answer, but I can't figure out how to pass that image path to the subprocess that is waiting for it every 5 frames.
PS: I am on Windows, Python 3.5.4.
Do you know what I can do?
If you're on a *nix environment, and I'm understanding what you want correctly, pipes provide what you want quite elegantly. I don't know Windows, but maybe the same concept can be used there.
Here's a simple example that illustrates this:
Pipe1a.py:
#!/usr/bin/env python
import time
import sys

for i in range(10):
    time.sleep(1)  # represent some processing delay
    print("filename{}.jpg".format(i))
    sys.stdout.flush()
Pipe1b.py:
#!/usr/bin/env python
import sys

while True:
    line = sys.stdin.readline()
    if len(line) == 0:
        break
    print("Processing image '{}'".format(line.strip()))
If you made both of these scripts executable, then you could chain them together via a pipe at the command prompt:
> Pipe1a.py | Pipe1b.py
Resulting output:
Processing image 'filename0.jpg'
Processing image 'filename1.jpg'
Processing image 'filename2.jpg'
Processing image 'filename3.jpg'
Processing image 'filename4.jpg'
Processing image 'filename5.jpg'
Processing image 'filename6.jpg'
Processing image 'filename7.jpg'
Processing image 'filename8.jpg'
Processing image 'filename9.jpg'
The concept here is that one process writes data to its stdout and a second process reads that data from its stdin.
If you don't want to use the command prompt to string these together, but rather want to run a single script, you can do the same thing, with pretty much the same code, using the Python subprocess module. For this example, you could have the Pipe1b.py program run the Pipe1a.py program via subprocess.Popen and then process its output in the same way.
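A minimal sketch of that single-script variant, assuming both scripts sit in the current directory and python is on the PATH:

import subprocess

# Spawn Pipe1a.py and read filenames from its stdout, replacing the
# shell pipe with subprocess plumbing.
proc = subprocess.Popen(["python", "Pipe1a.py"],
                        stdout=subprocess.PIPE,
                        universal_newlines=True)
for line in proc.stdout:
    print("Processing image '{}'".format(line.strip()))
proc.wait()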
Thank you for all your comments and for your help. I have finally achieved it.
I used the 'pexpect' library, which lets you launch a program inside a Python script with process = popen_spawn.PopenSpawn(...). A function from that library called send then lets you pass an argument to that running process (process.send(arg)).
In my case, I launch the program (.exe) at the beginning of the script, keeping that instance as a global variable. Then I just call the send function on each iteration.
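A minimal sketch of that setup; the executable name detector.exe and the line-based message format are illustrative assumptions, not the asker's actual values:

from pexpect import popen_spawn

# Spawned once, at startup, so the 2-second load cost is paid only once.
process = popen_spawn.PopenSpawn("detector.exe")  # hypothetical executable

def send_image_path(path):
    process.sendline(path)  # sendline() appends a newline; send() does not

# Called every 5 frames with the path of the freshly saved frame, e.g.:
# send_image_path("R:/frames/frame_00123.png")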

Control executed program with Python

I want to execute a test run via bash and abort it if the test takes too much time. So far, I have found some good solutions here, but since the kill command does not work properly (when I use it as documented, it says it is not used correctly), I decided to solve this problem with Python. This is the execution call I want to monitor:
EXE="C:/program.exe"
FILE="file.tpt"
HOME_DIR="C:/Home"
"$EXE" -vm-Xmx4096M --run build "$HOME_DIR/test/$FILE" "Auslieferung (ML) Execute"
(The opened *.exe starts a test run which includes some Simulink simulation runs; sometimes there are Simulink errors, in which case the tests take too long and I want to restart the entire process.)
First, I came up with the idea of calling a shell script containing these lines within a subprocess from Python:
import subprocess
import time
process = subprocess.Popen('subprocess.sh', shell = True)
time.sleep(10)
process.terminate()
But when I use this, *.terminate() or *.kill() does not close the program I started with the subprocess call.
That's why I am now trying to implement the entire call in Python. This is what I have so far:
import subprocess
file = "somePath/file.tpt"
p = subprocess.Popen(["C:/program.exe", file])
Now I need to know how to pass the second argument, "Auslieferung (ML) Execute", from the bash call. That argument starts an internal test run named "Auslieferung (ML) Execute". Any ideas? Or is it better to choose one of the other ways? Or can I get the bash "kill" option working somewhere, somehow?
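A hedged sketch of both pieces, assuming Python 3 (where wait() accepts a timeout) and that "Auslieferung (ML) Execute" is simply another positional argument in the argument list. Launching program.exe directly, rather than via shell=True, likely also explains the earlier terminate() problem: with shell=True, terminate() targets the intermediate shell rather than program.exe itself. The 600-second limit is illustrative:

import subprocess

file = "somePath/file.tpt"
p = subprocess.Popen(["C:/program.exe", "-vm-Xmx4096M", "--run", "build",
                      file, "Auslieferung (ML) Execute"])
try:
    p.wait(timeout=600)  # give the test run 10 minutes (illustrative limit)
except subprocess.TimeoutExpired:
    p.kill()             # no intermediate shell, so this kills program.exe itself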

Python multiprocessing/threading blocking main thread

I'm trying to write a program in Python. What I want is a script that immediately returns a friendly message to the user, but spawns a long subprocess in the background that takes several different files and writes them to a granddaddy file. I've done several tutorials on threading and processing, but no matter what I try, the program waits until the subprocess is done before it displays the aforementioned friendly message to the user. Here's what I've tried:
Threading example:
#!/usr/local/bin/python
import cgi, cgitb
import time
import threading

class TestThread(threading.Thread):
    def __init__(self):
        super(TestThread, self).__init__()

    def run(self):
        time.sleep(5)
        fileHand = open('../Documents/writable/output.txt', 'w')
        fileHand.write('Big String Goes Here.')
        fileHand.close()

print 'Starting Program'
thread1 = TestThread()
#thread1.daemon = True
thread1.start()
I’ve read these SO posts on multithreading
How to use threading in Python?
running multiple threads in python, simultaneously - is it possible?
How do threads work in Python, and what are common Python-threading specific pitfalls?
The last of these says that running threads concurrently in Python is actually not possible. Fair enough. Most of those posts also mention the multiprocessing module, so I've read up on that, and it seems fairly straightforward. Here are some of the resources I've found:
How to run two functions simultaneously
Python Multiprocessing Documentation Example
https://docs.python.org/2/library/multiprocessing.html
So here’s the same example translated to multiprocessing:
#!/usr/local/bin/python
import time
from multiprocessing import Process, Pipe

def f():
    time.sleep(5)
    fileHand = open('../Documents/writable/output.txt', 'w')
    fileHand.write('Big String Goes Here.')
    fileHand.close()

if __name__ == '__main__':
    print 'Starting Program'
    p = Process(target=f)
    p.start()
What I want is for these programs to immediately print 'Starting Program' (in the web browser) and then, a few seconds later, have a text file show up in a directory to which I've given write privileges. However, what actually happens is that they're both unresponsive for 5 seconds, then print 'Starting Program' and create the text file at the same time. I know my goal is possible because I've done it in PHP, using this trick:
//PHP
exec("php child_script.php > /dev/null &");
And I figured it would be possible in Python. Please let me know if I’m missing something obvious or if I’m thinking about this in the completely wrong way. Thanks for your time!
(System information: Python 2.7.6, Mac OSX Mavericks. Python installed with homebrew. My Python scripts are running as CGI executables in Apache 2.2.26)
OK, I think I found the answer. Part of it was my own misunderstanding. A Python script can't simply return a message to a client-side (AJAX) program while still executing a big process. The very act of responding to the client means that the program has finished, threads and all. The solution, then, is to use the Python version of this PHP trick:
//PHP
exec("php child_script.php > /dev/null &");
And in Python:
#Python
subprocess.call("python worker.py > /dev/null &", shell=True)
It starts an entirely new process outside the current one, which continues after the current one has ended. I'm going to stick with Python because at least we're using a civilized API function to start the worker script, instead of the exec function, which always made me uncomfortable.

How to open an application in Python, and then let the rest of the script run?

I have been trying to create a script which reloads a web browser called Midori if the internet flickers. But it seems to work only if I open Midori through the CLI; otherwise, the program crashes after I reload it. I have therefore decided that the best approach is to have the script open Midori through the subprocess module. So, I put this as one of the first statements in my code:
import subprocess as sub
sub.call(["midori"])
The browser opens, but the rest of the program freezes until I quit Midori. I have tried to use threading, but it doesn't seem to work.
Is there any way to open an application through Python, and then let the rest of the script continue to run once said application has been opened?
From the docs:
Run the command described by args. Wait for command to complete, then return the returncode attribute.
(Emphasis added)
You can see this is the behaviour we should expect. To get around this, use subprocess.Popen instead. This will not block in the same way:
from subprocess import Popen
midori_process = Popen(["midori"])
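If the script later needs to check on, or wait for, the browser, the Popen handle supports that; a small illustrative continuation of the snippet above:

# poll() returns None while Midori is still running, its exit code otherwise.
if midori_process.poll() is None:
    print("Midori is still open")

# ... the rest of the script runs here, without blocking ...

# Optionally block at the very end until the browser is closed.
midori_process.wait()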
