import a python program which contains "if __name__ == '__main__':" in python

I have two python scripts: one of which has to write a JSON file (file1.py), and the other (file2.py) has to import file1.
My script file1.py executes successfully on its own, but when I try to import it in file2.py nothing runs, because the code sits under if __name__ == '__main__':
file1.py:
def demo(opt, streamPixel):
    # some functions
    pass

if __name__ == '__main__':
    streamPixel = "{\"type\":\"FeatureCollection\",\"features\":["
    # parser = argparse.ArgumentParser()
    Transformation = 'TPS'
    FeatureExtraction = 'ResNet'
    SequenceModeling = 'BiLSTM'
    Prediction = 'Attn'
    num_fiducial = 20
    input_channel = 1
    output_channel = 512
    imgH = 72
    imgW = 320
    hidden_size = 256
    rgb = False
    batch_max_length = 25
    character = '01'
    sensitive = True
    # PAD = True
    image_folder = 'D:/work/source_code/VIC_OCR/cropped'
    workers = 4
    batch_size = 192
    num_class = 4
    saved_model = 'D:\\SARIGHA\\source_code\\saved_models\\TPS-ResNet-BiLSTM-Attn-Seed323\\best_accuracy.pth'
    opt = None
    """ vocab / character number configuration """
    if sensitive:
        character = string.printable[:-6]  # same with ASTER setting (use 94 chars)
    cudnn.benchmark = True
    cudnn.deterministic = True
    num_gpu = torch.cuda.device_count()
    demo(opt, streamPixel)
file2.py:
import file1
from file1 import demo
If I run file2.py, it simply produces this:
(victoria) D:\work\source_code\textReg\imageOrientation>python file2.py
(victoria) D:\work\source_code\textReg\imageOrientation>
Is there any way to import file1.py in file2.py so that its code actually runs?

You could instead create a class, put it in file1.py and import it like this:
from file1 import pixelModel
model = pixelModel(sensitive=True)
The class in file1.py would look like:
class pixelModel():
    # holds all the variables you set up in the main block
    def __init__(self, sensitive):
        self.sensitive = sensitive
        if sensitive:
            self.character = string.printable[:-6]
        cudnn.benchmark = True
        cudnn.deterministic = True
        self.num_gpu = torch.cuda.device_count()
        demo(opt, streamPixel)  # opt and streamPixel would need to be set up here too

What do you mean by "it doesn't work"? What exactly happens? If file2.py is purely just those two imports, of course nothing will run, because you're not calling anything. if __name__ == '__main__': means that that block only runs when the file is executed directly, rather than imported.
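You can see this directly with a minimal sketch (the module name here is made up for illustration):
# whoami.py -- prints the value of __name__ when the module loads
print("__name__ is:", __name__)

# Running it directly:
#   $ python whoami.py   ->  __name__ is: __main__
# Importing it instead:
#   >>> import whoami    ->  __name__ is: whoami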

When you write stuff under if __name__ == '__main__':, it gets executed only when your script is run directly from the command line. If you import your script in another Python script, that part won't get executed (see this well detailed explanation to understand why).
One way to be able to import the code in another script is to put it in a function, like this:
file1.py:
def myfunction():
    # Do what you want here
    print('This is my function in file1')

if __name__ == '__main__':
    myfunction()
file2.py:
from file1 import myfunction

if __name__ == '__main__':
    myfunction()

Related

How do I execute a python script within another and have it return a list object?

I have a script test.py and I want it to execute another script this_other_script.py which will return a list object. test.py looks like:
if __name__ == '__main__':
    someValue = this_other_script
    print(len(someValue))
this_other_script.py looks like:
if __name__ == '__main__':
    data = [a,b,c,d]
    return(data)
When I run test.py I receive an error of SyntaxError: 'return' outside function.
If this is due to program scope, I would have thought it would be OK for the calling program to be given a return value from a program it is calling. I wouldn't expect this_other_script to access a variable that test.py didn't give it, so I'm not sure why this error is displayed.
in test.py:
if __name__ == '__main__':
    import this_other_script
    someValue = this_other_script.get_data()
    print(len(someValue))
in this_other_script.py:
def get_data():
    data = [1,2,3,4]
    return(data)
alternative answer:
in test.py
if __name__ == '__main__':
    import this_other_script
    someValue = this_other_script.get_data()
    print(len(someValue))
in this_other_script.py:
def get_data():
    data = [1,2,3,4]
    return(data)

if __name__ == '__main__':
    get_data()

Multiprocessing gets stuck after .get()

I'm trying to understand how multiprocessing works in python and having some issues.
This is the example:
import multiprocessing

def func():
    return 1

p = multiprocessing.Pool()
result = p.apply_async(func).get()
When the .get() function is called, the code just gets stuck. What am I doing wrong?
You need to put those two lines inside if __name__ == "__main__":, so your code should look like this:
import multiprocessing

def func():
    return 1

if __name__ == "__main__":
    p = multiprocessing.Pool()
    result = p.apply_async(func).get()
If this module gets imported, which is what each worker process does on platforms that spawn a fresh interpreter (such as Windows), unguarded Pool creation would cause an infinite sequence of new processes. Putting those lines inside the if block works because the block is not executed during import.
I do not have enough details to know exactly what the problem is,
but I have a strong guess that putting those lines:
p = multiprocessing.Pool()
result = p.apply_async(func).get()
inside a function will fix your problem.
Try this:
import multiprocessing

def func():
    return 1

def main():
    p = multiprocessing.Pool()
    result = p.apply_async(func).get()
    print(result)

if __name__ == '__main__':
    main()
Tell me if it worked :)

How to make python scripts pipe-able both in bash and within python

Summary: I'd like to write python scripts that act like bash scripts on the command line, but then I'd also like to pipe them together easily in python. Where I'm having trouble is the glue to make the latter happen.
So imagine I wrote two scripts, script1.py and script2.py and I can pipe them together like so:
echo input_string | ./script1.py -a -b | ./script2.py -c -d
How do I get this behavior from within another python file?
Here's the way I know, but I don't like:
arg_string_1 = convert_to_args(param_1, param_2)
arg_string_2 = convert_to_args(param_3, param_4)
output_string = subprocess.check_output("echo " + input_string + " | ./script1.py " + arg_string_1 + " | ./script2.py " + arg_string_2, shell=True)
If I didn't want to take advantage of multithreading, I could do something like this (?):
input1 = StringIO(input_string)
output1 = StringIO()
script1.main(param_1, param_2, input1, output1)
input2 = StringIO(output1.getvalue())
output2 = StringIO()
script2.main(param_3, param_4, input2, output2)
Here's the approach I was trying, but I got stuck at writing the glue. I'd appreciate either learning how to finish my approach below, or suggestions for a better design/approach!
My approach: I wrote script1.py and script2.py to look like:
#!/usr/bin/python3
...  # import sys and define "parse_args"

def main(param_1, param_2, input, output):
    for line in input:
        ...
        print(stuff, file=output)

if __name__ == "__main__":
    parameter_1, parameter_2 = parse_args(sys.argv)
    main(parameter_1, parameter_2, sys.stdin, sys.stdout)
Then I wanted to write something like this, but don't know how to finish:
pipe_out, pipe_in = ????
output = StringIO()
thread_1 = Thread(target=script1.main, args=(param_1, param_2, StringIO(input_string), pipe_out))
thread_2 = Thread(target=script2.main, args=(param_3, param_4, pipe_in, output))
thread_1.start()
thread_2.start()
thread_1.join()
thread_2.join()
output_str = output.getvalue()
For the "pipe in", uses sys.stdin with the readlines() method. (Using method read() would read one character at a time.)
For passing information from one thread to another, you can use a Queue. You must define one way to signal the end of data. In my example, since all the data passed between threads are str, I simply use a None object to signal the end of data (since it cannot appear in the transmitted data).
One could also use more threads, or use different functions in threads.
I did not include sys.argv in my example to keep it simple. Modifying it to get parameters (parameter1, ...) should be easy.
import sys
from threading import Thread
from Queue import Queue  # Python 2; in Python 3 use "from queue import Queue"

def stdin_to_queue(output_queue):
    for inp_line in sys.stdin.readlines():  # iterate over the input lines
        output_queue.put(inp_line, True, None)  # blocking, no timeout
    output_queue.put(None, True, None)  # signal the end of data

def main1(input_queue, output_queue, arg1, arg2):
    do_loop = True
    while do_loop:
        inp_data = input_queue.get(True)
        if inp_data is None:
            do_loop = False
            output_queue.put(None, True, None)  # signal end of data
        else:
            out_data = arg1 + inp_data.strip('\r\n').upper() + arg2  # or whatever transformation...
            output_queue.put(out_data, True, None)

def queue_to_stdout(input_queue):
    do_loop = True
    while do_loop:
        inp_data = input_queue.get(True)
        if inp_data is None:
            do_loop = False
        else:
            sys.stdout.write(inp_data)

def main():
    q12 = Queue()
    q23 = Queue()
    q34 = Queue()
    t1 = Thread(target=stdin_to_queue, args=(q12,))
    t2 = Thread(target=main1, args=(q12, q23, '(', ')'))
    t3 = Thread(target=main1, args=(q23, q34, '[', ']'))
    t4 = Thread(target=queue_to_stdout, args=(q34,))
    t1.start()
    t2.start()
    t3.start()
    t4.start()

main()
Finally, I tested this program (Python 2) with a text file:
head sometextfile.txt | python script.py
Redirect the return value to stdout depending on whether the script is being run from the command line:
#!/usr/bin/python3
import sys

# Example function
def main(input):
    # Do something with input producing stuff
    ...
    return multipipe(stuff)

if __name__ == '__main__':
    def multipipe(data):
        print(data)
    input = parse_args(sys.argv)
    main(input)
else:
    def multipipe(data):
        return data
Each other script will have the same two definitions of multipipe. Now, use multipipe for output.
If you call all the scripts together from the command line, $ ./script1.py | ./script2.py, each will have __name__ == '__main__' and so multipipe will print it all to stdout to be read as an argument by the next script (and return None, so each function returns None, but you're not looking at the return values anyway in this case).
If you call them within some other python script, each function will return whatever you passed to multipipe.
Effectively, you can use your existing functions, just replace print(stuff, file=output) with return multipipe(stuff). Nice and simple.
To use it with multithreading or multiprocessing, set the functions up so that each function returns a single thing, and plug them into a simple function that adds data to a multithreading queue. For an example of such a queueing system, see the sample at the bottom of the Queue docs. With that example, just make sure that each step in the pipeline puts None (or other sentinel value of your choice - I like ... for that since it's extremely rare that you'd pass the Ellipsis object for any reason other than as a marker for its singleton-ness) in the queue to the next one to signify done-ness.
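For instance, here is a minimal sketch of that sentinel-based pipeline (the stage functions are made up for illustration; None is the done-marker):
from queue import Queue
from threading import Thread

DONE = None  # sentinel marking the end of the stream

def stage(func, in_q, out_q):
    # Pull items until the sentinel arrives, transform them, pass them on.
    while True:
        item = in_q.get()
        if item is DONE:
            out_q.put(DONE)  # propagate done-ness to the next stage
            break
        out_q.put(func(item))

q1, q2, q3 = Queue(), Queue(), Queue()
Thread(target=stage, args=(str.upper, q1, q2)).start()
Thread(target=stage, args=(lambda s: '<' + s + '>', q2, q3)).start()

for word in ['a', 'b', 'c']:
    q1.put(word)
q1.put(DONE)

item = q3.get()
while item is not DONE:
    print(item)  # prints <A>, <B>, <C>
    item = q3.get()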
There is a very simple solution using the standard Popen class.
Here's an example:
#this is the master python program
import subprocess
import sys
import os
#note the use of stdin and stdout arguments here
process1 = subprocess.Popen(['./script1.py'], stdin=sys.stdin, stdout=subprocess.PIPE)
process2 = subprocess.Popen(['./script2.py'], stdin=process1.stdout)
process1.wait()
process2.wait()
the two scripts are:
#!/usr/bin/env python
#script1.py
import sys
for line in sys.stdin:
    print(line.strip().upper())
Here's the second one
#!/usr/bin/env python
#script2.py
import sys
for line in sys.stdin:
    print("<{}>".format(line.strip()))

Import module containing input/output

I have a python module file (func.py) and a unit test file (f_test.py):
# func.py
def f(x):
    return x + 1

x = input("Enter x: ")
print("f(x): " + str(f(x)))
and
# f_test.py
import unittest
from func import f
class MyTest(unittest.TestCase):
    def test(self):
        self.assertEqual(f(1), 2)
When I run f_test.py I expect the test suite to be executed (successfully).
Instead, however, I see the following output:
Finding files... done.
Importing test modules ... Enter x:
If I comment out the input/output lines from func.py then I get the expected behaviour.
How do I achieve it without modifying func.py?
When you import func, all the code inside is run. Running the definition def f(x)... is what creates the function f.
You can distinguish between importing and running a file by using if __name__=='__main__'.
For instance:
# func.py
def f(x):
    return x + 1

if __name__ == '__main__':
    x = input("Enter x: ")
    print("f(x): " + str(f(x)))
When you run func.py, __name__=='__main__' will be true. When you import it, it will be false (__name__ will be 'func' instead).
Of course, the correct answer is to modify func.py. If you absolutely, positively, won't modify func.py, then you could redirect standard input:
# f_test.py
import sys
import StringIO  # Python 2; in Python 3 use io.StringIO
import unittest

oldstdin, sys.stdin = sys.stdin, StringIO.StringIO('7')
try:
    from func import f
finally:
    sys.stdin = oldstdin

class MyTest(unittest.TestCase):
    def test(self):
        self.assertEqual(f(1), 2)
You need to add to the bottom of f_test.py:
if __name__ == '__main__':
    unittest.main()
This way the tests will be executed when the file is run (I forget this more times than I would like to admit).

How to pass arguments to an imported script in Python

I have a script (script1.py) of the following form:
#!/bin/python
import sys
def main():
    print("number of command line options: {numberOfOptions}".format(numberOfOptions = len(sys.argv)))
    print("list object of all command line options: {listOfOptions}".format(listOfOptions = sys.argv))
    for i in range(0, len(sys.argv)):
        print("option {i}: {option}".format(i = i, option = sys.argv[i]))

if __name__ == '__main__':
    main()
I want to import this script in another script (script2.py) and pass to it some arguments. The script script2.py could look something like this:
import script1
listOfOptions = ['option1', 'option2']
#script1.main(listOfOptions) insert magic here
How could I pass the arguments defined in script2.py to the main function of script1.py as though they were command line options?
So, for example, would it be Pythonic to do something such as the following?:
import script1
import sys
sys.argv = ['option1', 'option2']
script1.main()
Separate command line parsing and called function
For reusability of your code, it is practical to keep the acting function separate from the command line parsing:
scrmodule.py
def fun(a, b):
    # possibly do something here
    return a + b

def main():
    # process command line arguments
    a = 1  # will be read from command line
    b = 2  # will be read from command line
    # call fun()
    res = fun(a, b)
    print("a", a)
    print("b", b)
    print("result is", res)

if __name__ == "__main__":
    main()
Reusing it from another place:
from scrmodule import fun
print("1 + 2 =", fun(1, 2))
#!/bin/python
# script1.py
import sys

# main() expects an argument from the test_passing_arg_to_module.py code;
# the default lets the script also run directly without arguments
def main(my_passing_arg=""):
    print("number of command line options: {numberOfOptions}".format(numberOfOptions = len(sys.argv)))
    print("list object of all command line options: {listOfOptions}".format(listOfOptions = my_passing_arg))
    print(my_passing_arg)

if __name__ == '__main__':
    main()
# test_passing_arg_to_module.py
import script1

my_passing_arg = "Hello world"
# call the main() function from script1.py,
# passing the my_passing_arg variable to main(my_passing_arg) in script1.py
script1.main(my_passing_arg)
##################
# Execute script:
# $ python3.7 test_passing_arg_to_module.py
# Results:
# number of command line options: 1
# list object of all command line options: Hello world
# Hello world
