Essentially I have a script in one file which I would like to import and run as a function in another file. Here is the catch: the contents of the first file CANNOT be written as a function definition; it just needs to be a plain old script (I'm writing a simulator for my robotics kit, so user experience is important). I have no idea how to go about this.
Adam
Anything can be written as a function.
If you additionally need the ability to call your script directly, you just use the __name__ == '__main__' trick:
def my_function():
    ... code goes here ...

if __name__ == '__main__':
    my_function()
Now you can import my_function from the rest of your code, but still execute the file directly since the block at the end will call the function.
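For example, assuming the file above is saved as my_script.py (the file names here are just for illustration), another file can then do:

# another_file.py
from my_script import my_function

my_function()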
Assuming that the code in the file you need to import is a well bounded script, then you can read it in as text and execute it with the "execfile" function (Python 2 only; in Python 3, exec(open(...).read()) is the equivalent).
By well bounded I mean that you understand all the data it needs and you are able to provide all of it from your program.
An alternative would be to use the "system" call or the subprocess module to call the script as if it were an external program (depending on whether you need the script's output).
A final approach is to use exec to create a function - see approach 3 below.
The approach you choose determines what your other script needs to do.
Examples:
hello.py (your file you want to run, but can't change):
# Silly example to illustrate a script which does something.
fp = open("hello.txt", "a")
fp.write("Hello World !!!\n")
fp.close()
Three approaches to use hello.py without importing hello.py
import os

print "approach 1 - using system"
os.system("python hello.py")

print "approach 2 - using execfile"
# note: execfile exists only in Python 2; in Python 3 the equivalent is
# exec(open("hello.py").read(), globals(), locals())
execfile("hello.py", globals(), locals())

print "approach 3 - exec to create a function"
# read the script into a list of lines and indent them
with open("hello.py", "r") as hfp:
    hsrc = ["    " + line for line in hfp]
# insert def line
hsrc.insert(0, "def func_hello():")
# execute our function definition
# (Python 2 statement form; in Python 3: exec("\n".join(hsrc), globals(), locals()))
exec "\n".join(hsrc) in globals(), locals()
# you now have a function called func_hello, which you can call just like a normal function
func_hello()
func_hello()

print "My original script is still running"
I'm working on cloning a Virtual Machine (VM) in a vCenter environment using this code. It takes command line arguments for the name of the VM, template, datastore, etc. (e.g. $ clone_vm.py -s <host_name> -p <password> -nossl ....)
I have another Python file where I've been able to list the Datastore volumes in descending order of free_storage. I have stored the datastore with maximum available storage in a variable ds_max. (Let's call this ds_info.py)
I would like to use the ds_max variable from ds_info.py as the value of the datastore command line argument to clone_vm.py.
I tried importing the os module in ds_info.py and running os.system(python clone_vm.py ....arguments...) but it did not take the ds_max variable as an argument.
I'm new to coding and am not confident to change the clone_vm.py to take in the Datastore with maximum free storage.
Thank you for taking the time to read through this.
I suspect there is something wrong in your os.system call, but you don't provide it, so I can't check.
Generally it is a good idea to use the current paradigm, and the received wisdom (TM) is that we use subprocess. See the docs, but the basic pattern is:
from subprocess import run
cmd = ["mycmd", "--arg1", "--arg2", "val_for_arg2"]
run(cmd)
Since this is just a list, you can easily drop arguments into it:
var = "hello"
cmd = ["echo", var]
run(cmd)
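In your case, a sketch of what that could look like inside ds_info.py once ds_max has been computed (the -ds flag name is a guess modeled on the example invocation in your question; use whatever flag clone_vm.py actually expects):

# inside ds_info.py, after ds_max has been determined
from subprocess import run

cmd = ["python", "clone_vm.py",
       "-s", "<host_name>", "-p", "<password>", "-nossl",
       "-ds", ds_max]  # "-ds" is a hypothetical name for the datastore flag
run(cmd)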
However, if your other command is in fact a python script it is more normal to refactor your script so that the main functionality is wrapped in a function, called main by convention:
# script 2
...
def main(arg1, arg2, arg3):
    do_the_work

if __name__ == "__main__":
    args = get_sys_args()  # dummy fn
    main(*args)
Then you can simply import script2 from script1 and run the code directly:
# script 1
from script2 import main
args = get_args() # dummy fn
main(*args)
This is 'better' as it doesn't involve spawning a whole new python process just to run python code, and it generally results in neater code. But nothing stops you calling a python script the same way you'd call anything else.
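If you do call a Python script as a subprocess, a small sketch (the script name and argument are just examples; sys.executable makes the child run under the same interpreter as the parent):

import sys
from subprocess import run

# run another Python script as a child process, passing one argument
result = run([sys.executable, "script2.py", "some_value"])
print(result.returncode)  # 0 on success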
I have a Python program I'm building that can be run in either of 2 ways: the first is to call python main.py which prompts the user for input in a friendly manner and then runs the user input through the program. The other way is to call python batch.py -file- which will pass over all the friendly input gathering and run an entire file's worth of input through the program in a single go.
The problem is that when I run batch.py, it imports some variables/methods/etc from main.py, and when it runs this code:
import main
at the first line of the program, it immediately errors because it tries to run the code in main.py.
How can I stop Python from running the code contained in the main module which I'm importing?
This is just how Python works: keywords such as class and def are not declarations. Instead, they are real, live statements which are executed. If they were not executed, your module would be empty.
The idiomatic approach is:
# stuff to run always here such as class/def
def main():
    pass

if __name__ == "__main__":
    # stuff only to run when not called via 'import' here
    main()
It does require source control over the module being imported, however.
Due to the way Python works, it is necessary for it to run your modules when it imports them.
To prevent code in the module from being executed when imported, but only when run directly, you can guard it with this if:
if __name__ == "__main__":
    # this won't be run when imported
You may want to put this code in a main() method, so that you can either execute the file directly, or import the module and call the main(). For example, assume this is in the file foo.py.
def main():
    print("Hello World")

if __name__ == "__main__":
    main()
This program can be run either by going python foo.py, or from another Python script:
import foo
...
foo.main()
Use the if __name__ == '__main__' idiom -- __name__ is a special variable whose value is '__main__' if the module is being run as a script, and the module name if it's imported. So you'd do something like
# imports
# class/function definitions
if __name__ == '__main__':
    # code here will only run when you invoke 'python main.py'
Unfortunately, you don't. That is part of how the import syntax works, and it is important that it does so -- remember, def is actually something that is executed; if Python did not execute the import, you'd be, well, stuck without functions.
Since you probably have access to the file, though, you might be able to look and see what causes the error. It might be possible to modify your environment to prevent the error from happening.
Put the code inside a function and it won't run until you call the function. You should have a main function in your main.py, with the statement:
if __name__ == '__main__':
    main()
Then, if you call python main.py the main() function will run. If you import main.py, it will not. Also, you should probably rename main.py to something else for clarity's sake.
There was a Python enhancement proposal PEP 299 which aimed to replace the if __name__ == '__main__': idiom with def __main__:, but it was rejected. It's still a good read to know what to keep in mind when using if __name__ == '__main__':.
You may write your "main.py" like this:
#!/usr/bin/env python

__all__ = ["somevar", "do_something"]

somevar = ""

def do_something():
    pass  # blahblah

if __name__ == "__main__":
    do_something()
I did a simple test:
# test.py
x = 1
print("1, has it been executed?")

def t1():
    print("hello")
    print("2, has it been executed?")

def t2():
    print("world")
    print("3, has it been executed?")

def main():
    print("Hello World")
    print("4, has it been executed?")

print("5, has it been executed?")
print(x)

# while True:
#     t2()

if x == 1:
    print("6, has it been executed?")
#test2.py
import test
When running test2.py, the output is:
1, has it been executed?
5, has it been executed?
1
6, has it been executed?
Conclusion: when the imported module does not use an if __name__ == "__main__": guard, importing it runs its module-level code; the code that is not inside a function is executed sequentially, and code inside functions is not executed unless it is called.
In addition:
def main():
    # Put all the code you need to execute when this script is run directly.
    pass

if __name__ == '__main__':
    main()
else:
    # Put anything you want to run only when this file is imported.
    pass
A minor error that can happen (at least it happened to me), especially when distributing Python scripts/functions that carry out a complete analysis, is to call the function directly at the end of the .py file that defines it.
The only things a user needed to modify were the input files and parameters.
Doing so means that when you import the file, the function runs immediately. For proper behavior, you simply need to remove the call from inside the module and reserve it for the real calling file/function/portion of code, as in the sketch below.
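A minimal sketch of the pattern (file and function names are made up for illustration):

# analysis.py
def run_analysis(input_file):
    # ... read the input file, set parameters, do the work ...
    print("analysing", input_file)

# BAD: this runs as soon as anyone does `import analysis`
# run_analysis("data.csv")

# BETTER: only run when the file is executed directly
if __name__ == "__main__":
    run_analysis("data.csv")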
Another option is to use a binary environment variable, e.g. let's call it 'run_code'. If run_code = 0 (False), structure main.py to bypass the code (but the temporarily bypassed function will still be importable as part of the module). Later, when you are ready to use the imported function, set the environment variable run_code = 1 (True). Use os.environ to set and retrieve the variable, but be sure to convert it to an integer when retrieving (or restructure the if statement to read a string value).
in main.py:
import os

# set environment variable to 0 (False):
os.environ['run_code'] = '0'

def binary_module():
    # retrieve environment variable, convert to integer
    run_code_val = int(os.environ['run_code'])
    if run_code_val == 0:
        print('nope. not doing it.')
    if run_code_val == 1:
        print('executing code...')
        # [do something]
...in whatever script is loading main.py:
import os, main

main.binary_module()
# OUTPUT: nope. not doing it.

# now flip the on switch!
os.environ['run_code'] = '1'
main.binary_module()
# OUTPUT: executing code...
*Note: The above code presumes main.py and whatever script imports it exist in the same directory.
Although you cannot use import without running the code, there is quite a swift way to pass your variables across: use numpy.savez, which stores variables as numpy arrays in a .npz file. Afterwards you can load the variables using numpy.load.
See a full description in the scipy documentation
Please note this is only the case for variables and arrays of variables, and not for methods, etc.
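A minimal sketch of the idea, using main.py/batch.py from the question (the variable names are just examples):

import numpy as np

# in main.py: save the variables you want to share
np.savez("shared_vars.npz", some_value=42, some_array=np.arange(5))

# in batch.py: load them back without importing main.py
data = np.load("shared_vars.npz")
print(data["some_value"].item(), data["some_array"])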
Try just importing the functions needed from main.py? So,
from main import SomeFunction
It could be that you've named a function in batch.py the same as one in main.py, and when you import main.py the program runs the main.py function instead of the batch.py function; doing the above should fix that. I hope.
I got a Python script (test1.py) I need to run with a bat file (render.bat).
Question 1:
First, I had a definition in my test1.py and it always failed to run, nothing happened. Can someone kindly explain why?
import os

def test01 :
    os.system('explorer')
and in the bat file:
python c:/test01.py
but as soon as I removed the def it worked. I just want to learn why this happened.
Question 2:
How can I take the "render" string from render.bat as a string input for my Python script, so I can run something like:
import os

def test1(input):
    os.system("explorer " + input)
So the "input" is taken from the .BAT filename?
Functions don't actually do anything unless you call them. Try putting test01() at the end of the script.
%0 will give you the full name of the batch file called, including the .bat. Stripping it will probably be easier in Python than in the batch file.
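A small sketch of that idea (assuming render.bat passes its own name to the script as an argument):

# render.bat would contain:  python c:/test01.py %0
import os
import sys

def test1(name):
    os.system("explorer " + name)

if __name__ == "__main__":
    # strip the directory and the .bat extension in Python
    bat_name = os.path.splitext(os.path.basename(sys.argv[1]))[0]  # "render.bat" -> "render"
    test1(bat_name)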
Question 1: The keyword def in Python defines a function. However, to use a function you have to explicitly call it, i.e.
import os

def test01():  # do not forget ()
    os.system('explorer')

test01()  # call the function
1) You have to actually call the functions to achieve your task.
2) %0 refers to the running batch script. Therefore create a test.bat file like:
@echo off
echo %0
Output: test.bat
You can strip the .bat extension from the output.
I have a Python script and I was wondering how I can make it executable; in other words how can I run it by using a shell like bash.
I know the first thing is to put #!/usr/bin/env python on the first line, but then do I need, for example, the functions to be in a specific order (i.e., the main one at the top or the bottom)? What's more, do I need to keep the .py extension for my Python file (can I just call the file Dosomething?)?
To be short, could you provide a simple guide, the important points someone has to take into account to make a Python file executable?
This is how I make an executable script. It doesn't take eggs or anything like that into account. It's just a simple script that I want to be able to execute. I'm assuming you are using linux.
#! /usr/bin/env python

import sys

def main():
    #
    # Do something ... Whatever processing you need to do, make it happen here.
    # Don't shove everything into main, break it up into testable functions!
    #
    # Whatever this function returns is what the exit code of the interpreter,
    # i.e. your script, will be. Because main is called by sys.exit(), it will
    # behave differently depending on what you return.
    #
    # So, if you return None, 0 is returned. If you return an integer, that
    # return code is used. Anything else is printed to the console and 1 (error)
    # is returned.
    #
    # (an_error_occurred is a placeholder for your own error check)
    if an_error_occurred:
        return 'I\'m returning a string, it will be printed and 1 returned'

    # Otherwise 0, success, is returned.
    return 0

# This is true if the script is run by the interpreter, not imported by another
# module.
if __name__ == '__main__':
    # main should return 0 for success, something else (usually 1) for error.
    sys.exit(main())
Now, if your permissions are set correctly, you can execute this script.
One thing to realize is that as your script is processed, each line is executed by the interpreter. This is true regardless of how the interpreter "gets it". That is, importing a file as a module and executing it as a script essentially work the same way, in that both execute each line of the module.
Once you realize your script is simply executing as it runs, you realize that the order of the functions doesn't matter. A function definition is a function definition. It's when you call the function that matters.
So, in general, the layout of your script looks like this
import sys

def func1():
    pass

def func2():
    pass

def main():
    return 0

if __name__ == '__main__':
    sys.exit(main())
You create the functions you want to use first, then you use them. Hope it helps.
Delete the first space. That is,
#!/usr/bin/env python
Should be the very first line of your file. Then, make sure you set the permissions for the file to executable with:
chmod u+x your_script.py
Python scripts execute in sequential order. If you have a file filled with functions, common practice is to have something that looks like this at the very end of your file:
if __name__ == '__main__':
    main()
Where main() starts execution of your script. The reason we do it this way, instead of a bare call to main(), is that this allows your script to act like a module without any modification; if you just had a single line that called main(), your module would execute the script's main when imported. The if statement just checks whether your script is running in the __main__ namespace, i.e., it is running as a script.
The only thing (like you said) is to include:
#!/usr/bin/env python
on the first line. And it is not even mandatory, but it is recommended.
After that, you can run it by writing:
python [filename].py
in a terminal or in a bash file.
You'll also have to give it execution rights:
chmod u+x yourfile.py
Your code should follow the template
# any functions I want to define, and will be accessible when imported as module
# or run from command line
...

if __name__ == '__main__':
    # things I want to do only when I run it from the command line
    ...
If you want to be able to run it without having to use python fileName.py but rather just ./fileName.py then you will want to make the first line of your file
#!/usr/bin/env python
And make the file executable by the user at least
chmod u+x fileName.py
If you do not add a .py extension to your file then it will still be runnable from the command line ... but not importable by other modules.
Place #!/usr/bin/python in the first line
You can name your python script anything, like: myPythonScript (No, you do not need to keep .py extension)
chmod +x myPythonScript
Run it: ./myPythonScript
Example: myPythonScript
#!/usr/bin/python
print "hello, world!"
You need to add a shebang as you described, e.g.
#!/usr/bin/python
or
#!/usr/bin/env python
as the first line in your file and you need to make it executable by running
chmod +x Dosomething
You do not need to do anything else; in particular, the file name may be anything, including Dosomething. Your PATH probably doesn't include the directory where the file resides, so you should run it like this (assuming your current working directory is where the file is):
./Dosomething
Imagine a Python script that will take a long time to run. What will happen if I modify it while it's running? Will the result be different?
Nothing, because Python compiles your script to bytecode up front and runs that in-memory copy (imported modules are additionally cached as .pyc files).
However, if some kind of exception occurs, you may get a slightly misleading traceback, because line X may contain different code than it did before you started the script.
When you run a Python program and the interpreter is started up, the first thing that happens is the following:
the modules sys and builtins are initialized
the __main__ module is initialized, which is the file you gave as an argument to the interpreter; this causes your code to execute
When a module is initialized, its code is run, defining classes, variables, and functions in the process. The first step of your module (i.e. the main file) will probably be to import other modules, which will again be initialized in just the same way; their resulting namespaces are then made available for your module to use. The result of an importing process is in part a module (Python) object in memory. This object does have fields that point to the .py and .pyc content, but these are not evaluated anymore: module objects are cached and their source is never run twice. Hence, modifying the module afterwards on disk has no effect on the execution. It can have an effect when the source is read for introspective purposes, such as when exceptions are thrown, or via the module inspect.
This is why the check if __name__ == "__main__" is necessary when adding code that is not intended to run when the module is imported. Running the file as main is equivalent to that file being imported, with the exception of __name__ having a different value.
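A small illustration of that caching, using a standard library module as the example (importlib.reload is the explicit way to re-execute a module's source from disk):

import importlib
import sys

import json                   # first import: the module's code runs and the
print("json" in sys.modules)  # resulting module object is cached -> True

import json                   # second import: nothing runs, the cached
                              # object from sys.modules is reused

importlib.reload(json)        # only an explicit reload re-reads and re-executes
                              # the module's source file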
Sources:
What happens when a module is imported: The import system
What happens when the interpreter starts: Top Level Components
What's the __main__ module: __main__- Top-level code environment
This is a fun question. The answer is that "it depends".
Consider the following code:
"Example script showing bad things you can do with python."
import os
print('this is a good script')
with open(__file__, 'w') as fdesc:
    fdesc.write('print("this is a bad script")')
import bad
Try saving the above as "/tmp/bad.py" then do "cd /tmp" and finally "python3 bad.py" and see what happens.
On my ubuntu 20 system I see the output:
this is a good script
this is a bad script
So again, the answer to your question is "it depends". If you don't do anything funky then the script is in memory and you are fine. But python is a pretty dynamic language so there are a variety of ways to modify your "script" and have it affect the output.
If you aren't trying to do anything funky, then probably one of the things to watch out for are imports inside functions.
Below is another example which illustrates the idea (save as "/tmp/modify.py" and do "cd /tmp" and then "python3 modify.py" to run). The fiddle function defined below simulates you modifying the script while it is running (if desired, you could remove the fiddle function, put in a time.sleep(300) at the second to last line, and modify the file yourself).
The point is that since the show function is doing an import inside the function instead of at the top of the module, the import won't happen until the function is called. If you have modified the script before you call show, then your modified version of the script will be used.
If you are seeing surprising or unexpected behavior from modifying a running script, I would suggest looking for import statements inside functions. There are sometimes good reasons to do that sort of thing so you will see it in people's code as well as some libraries from time to time.
Below is the demonstration of how an import inside a function can cause strange effects. You can try this as is vs commenting out the call to the fiddle function to see the effect of modifying a script while it is running.
"Example showing import in a function"
import time
def yell(msg):
"Yell a msg"
return f'#{msg}#'
def show(msg):
"Print a message nicely"
import modify
print(modify.yell(msg))
def fiddle():
orig = open(__file__).read()
with open(__file__, 'w') as fdesc:
modified = orig.replace('{' + 'msg' + '}', '{msg.upper()}')
fdesc.write(modified)
fiddle()
show('What do you think?')
No, the result will not reflect changes saved while the script is running. When running regular Python files, you will have to save your changes and re-run the program.
If you run the following script:
from time import sleep
print("Printing hello world in: ")
for i in range(10, 0, -1):
    print(f"{i}...")
    sleep(1)
print("Hello World!")
Then change "Hello World!" to "Hello StackOverflow!" while it's counting down, it will still output "Hello World".
Nothing, as this answer says. Besides, I did an experiment with multiprocessing involved. Save the script below as x.py:
import multiprocessing
import time
def f(x):
    print(x)
    time.sleep(10)

if __name__ == '__main__':
    with multiprocessing.Pool(2) as pool:
        for _ in pool.imap(f, ['hello'] * 5):
            pass
After running python3 x.py, and after the first two 'hello's were printed out, I modified ['hello'] to ['world'] and observed what happened. Nothing interesting happened. The result was still:
hello
hello
hello
hello
hello
Nothing happens. Once the script is loaded in memory and running, it will keep running the loaded version.
An "auto-reloading" feature can be implemented anyway in your code, like Flask and other frameworks do.
This is slightly different from what you describe in your question, but it works:
my_string = "Hello World!"
line = input(">>> ")
exec(line)
print(my_string)
Test run:
>>> print("Hey")
Hey
Hello World!
>>> my_string = "Goodbye, World"
Goodbye, World
See, you can change the behavior of your "loaded" code dynamically.
It depends. If a Python script imports another file that has been modified, it will of course load the newer version when the import happens. But if the source doesn't point to any other file, it will just keep running the script from the in-memory copy for as long as it runs; changes will be visible the next time you run it...
And as for auto-applying changes when they're made: yes, #pcbacterio was correct. It's possible to do that, but the script which does it just remembers the last action it was doing and checks when the file is modified in order to rerun it (so it's almost invisible).
=]