Script 'top.py' makes use of another script 'myScript.py' by passing an argument to it. Script 'myScript.py' contains:
if __name__ == "__main__":
    ...
    sys.argv[1] = myObject
    sys.argv[1].functionSpecificToMyObject()  # works as expected
The use is achieved in 'top.py' by simply calling 'myScript.py' with an argument 'myHelperArgument'. However, in top.py I cannot make use of 'myHelperArgument'. Is this possible? If not, any suggestion for a possible solution (given the intention above) is welcome. Of course, I would prefer not to write to a file and read it back later.
I would recommend import. This is designed to use objects defined in one Python source file (module) in another.
Create two files in the same directory:
First file my_lib.py:
def my_func():
    return 42
Second file my_script.py:
import my_lib
print(my_lib.my_func())
Now, running python my_script.py in that directory prints 42.
I use argv to pass unique arguments to Python scripts from the terminal. This is very useful when running the same program on multiple unique files (the program parses an XML file for certain things). I have built three different programs that serve unique purposes.
I am aggregating my programs into one .py file so that I can use 'import' within a running instance of Python and then go one by one through the files:
>>> import xml
>>> xml.a()
>>> xml.b()
>>> xml.c()
How can I pass arguments to these programs on the fly? I'm getting a syntax error when I place the arguments after calling the program in this way.
>>> xml.a() file1.xml file1.csv
            ^
SyntaxError: invalid syntax
You pass arguments to functions in Python by placing them between the parentheses (the values you pass are called "arguments"; see the documentation). So you'll need to modify your functions so that they take arguments rather than reading from sys.argv, and then call them like this:
my_library.py
def function1(filename):
    print filename
Interpreter
>>> import my_library
>>> my_library.function1("file1.xml")
file1.xml
If you want your function to be able to accept an indefinite number of arguments (as you can with sys.argv), you can use the * syntax at the end of your parameter list to collect the remaining arguments into a tuple (see documentation). For example:
my_library.py
def function1(*filenames):
    for filename in filenames:
        print filename
Interpreter
>>> import my_library
>>> my_library.function1("file1.xml", "file2.csv", "file3.xml")
file1.xml
file2.csv
file3.xml
I need to interpret a few files (scripts) with an embedded Python interpreter concurrently (to be more specific: one script executes another script via Popen, and my app intercepts that and executes the second script itself). I've found that this is called a sub-interpreter, and I'm going to use one. But I've read that a sub-interpreter does not have sys.argv:
The new environment has no sys.argv variable
I need to pass argv anyway, so how can I do it?
You might find it easier to modify each of the scripts to follow this pattern:
def run(*posargs, **argdict):
    """
    This does the work and can be called with:
        import scriptname
        scriptname.run(someargs)
    """
    # Code goes here and uses posargs[n] where it would use sys.argv[n+1]

if __name__ == "__main__":
    import sys
    run(*sys.argv[1:])  # unpack the arguments so each becomes a posarg
Then your main script can call each of the sub-scripts in turn by simply calling its run function.
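For example, the calling script might look something like this (a minimal sketch; the module and file names are hypothetical):
# main script (sketch): call each sub-script's run() directly,
# passing what would otherwise have gone through sys.argv
import script_a
import script_b

script_a.run("input1.xml", "output1.csv")
script_b.run("input2.xml")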
You can use environment variables. Have the parent set them by updating the os.environ dict if it's in Python, or with setenv() if it's in C or C++ etc. The children can then read os.environ to get whatever strings they need.
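A minimal sketch of that idea (the variable name MY_ARG and the script name child.py are placeholders):
# parent.py: set an environment variable, then launch the child,
# which inherits the parent's environment
import os
import subprocess

os.environ["MY_ARG"] = "file1.xml"
subprocess.call(["python", "child.py"])

# child.py would read it back with:
#     import os
#     print(os.environ.get("MY_ARG"))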
Essentially I have a script in one file which I would like to import and run as a function in another file. Here is the catch: the contents of the first file CANNOT be written as a function definition; it just needs to be a plain old script (I'm writing a simulator for my robotics kit, so user experience is important). I have no idea how to go about this.
Adam
Anything can be written as a function.
If you additionally need the ability to call your script directly, you just use the __name__ == '__main__' trick:
def my_function():
    ...  # code goes here

if __name__ == '__main__':
    my_function()
Now you can import my_function from the rest of your code, but still execute the file directly since the block at the end will call the function.
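For example, assuming the file above is saved as myscript.py (the name is just for illustration):
# another file (sketch): reuse the function instead of re-running the script
from myscript import my_function

my_function()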
Assuming that the code in the file you need to import is a well-bounded script, you can execute it in place with the execfile function.
By well-bounded I mean that you understand all the data it needs and you are able to provide all of it from your program.
An alternative would be to use the system call, or the subprocess module, to run the script as if it were an external program (depending on whether you need the script's output).
A final approach is to use exec to wrap the script in a function; see approach 3 below.
The approach you use determines what you need your other script to do.
Examples:
hello.py (your file you want to run, but can't change):
# Silly example to illustrate a script which does something.
fp = open("hello.txt", "a")
fp.write("Hello World !!!\n")
fp.close()
Three approaches to using hello.py without importing it:
import os

print "approach 1 - using system"
os.system("python hello.py")

print "approach 2 - using execfile"
execfile("hello.py", globals(), locals())

print "approach 3 - exec to create a function"
# read the script into a list of lines, indenting each one
with open("hello.py", "r") as hfp:
    hsrc = ["    " + line for line in hfp]
# insert the def line (the lines read from the file keep their newlines)
hsrc.insert(0, "def func_hello():\n")
# execute our function definition
exec "".join(hsrc) in globals(), locals()
# you now have a function called func_hello, which you can call
# just like a normal function - here, twice
func_hello()
func_hello()

print "My original script is still running"
Research is at the bottom, read before -1'ing... Thanks.
I have to write a Python script that runs SQL queries. I made a main class called SQLQuery. Each SQLQuery instance represents a query. The script must be structured like this:
class SQLQuery(object):
    def __init__(self, string_myQuery)...

instance1 = SQLQuery(SQLQuery1)...
instance2 = SQLQuery(SQLQuery2)...
As a user requirement, the instances must be in the same file as the class (so I can't just make each instance a main and execute that file separately), and each instance must be executed with Linux console commands. I can execute the entire script with a simple python SQLQuery.py but I need to execute each instance separately. The queries will be executed every day, automatically, so I don't need a terminal UI tree. It should be executed with a command similar to this:
python SQLQuery.py -inst1
will execute instance1.
python SQLQuery.py -inst2
will execute instance2.
I have researched how to execute Python scripts with Linux commands, and most of the articles are about calling commands from a Python script. However, I found this article in the Python documentation. It suggests adding -m, so:
python SQLQuery.py -m inst1
This would let me set my main with a console command, but it doesn't work, since the instances aren't modules. And since the instances must be in the same file as the class, I can't just import them as modules when I execute SQLQuery.py with a console command.
Ignoring all the irrelevancies, it sounds like your problem is that you have a bunch of global objects named instance1, instance2, instance3, etc., and you want to call some method on one of them based on a command-line parameter whose value will be similar to, but not identical to, the instance names.
That's probably not a good idea… but it's not that hard:
import sys

if __name__ == '__main__':
    inst = sys.argv[1]      # will be '-inst1', '-inst13', etc.
    inst_number = inst[5:]  # so '1', '13', etc.
    inst_name = 'instance' + inst_number
    instance = globals()[inst_name]
    instance.execute()
A much better way to do the same thing is to put the instance globals into a list or dict that you can index.
For example, let's say instead of instance1, instance2, etc., you've got an instances dict, with instances['1'], instances['2'], etc. Now instead of this:
inst_name = 'instance' + inst_number
instance = globals()[inst_name]
instance.execute()
… you just do this:
instances[inst_number].execute()
Also, instead of coming up with a command-line parameter that has extra stuff in it that you have to parse and throw away, and that has no more meaning for a human reader than for your code, why not just take a number?
python myscript.py 12
Or, alternatively, use argparse to create an argument that can be used in all of the obvious ways:
python myscript.py --instance=12
python myscript.py --instance 12
python myscript.py -i12
python myscript.py -i 12
Either way, your code gets the string '12', which it can then use to look up the instance, as above.
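A minimal argparse sketch of that idea (assuming the instances dict described above):
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('-i', '--instance', required=True,
                    help="number of the instance to execute, e.g. 12")
args = parser.parse_args()

# look up the instance by its string key and run it
instances[args.instance].execute()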
You have the wrong syntax for the -m option. Suppose you have the following file named foo.py:
import sys
print 'First arg is:', sys.argv[1]
Then you would call it like this:
$ python -m foo bar
First arg is: bar
Note that the ".py" extension is omitted. You can then use the command line argument to decide which object to use or use the argparse or optparse module to handle the argument.
Maybe the title is not very clear; let me elaborate.
I have a Python script that opens a PPM file, applies a chosen filter (rotations, ...) and creates a new picture. Up to here everything works fine.
But I want to do the same thing through a Linux console, like:
ppmfilter.py ROTD /path/imageIn.ppm /path/imageOut.ppm
Here ROTD is the name of the function that applies a rotation.
I don't know how to do this; I'm looking for a library that'll allow me to do it.
Looking forward to your help.
P.S.: I'm using Python 2.7.
There is a relatively easy way:
You can look up the global names (functions, variables, etc.) with globals(). This gives you a dictionary of all global symbols. You then just need to check that the symbol named by the first argument is callable, and if it is, call it with the rest of sys.argv:
import sys

def ROTD(infile, outfile):
    # do something
    pass

if __name__ == '__main__':
    symbol = globals().get(sys.argv[1])
    if hasattr(symbol, '__call__'):
        symbol(*sys.argv[2:])
This will pass the program argument (excluding the filename and the command name) to the function.
EDIT: Please, don't forget the error handling. I omitted it for reasons of clarity.
Use a main() function:
def main():
    # call your function here
    pass

if __name__ == "__main__":
    main()
A nice way to do it would be to define a big dictionary {alias: function} inside your module. For instance:
actions = {
    'ROTD': ROTD,
    'REFL': reflect_image,
    'INVT': invIm,
}
You get the idea. Then take the first command-line argument and interpret it as a key into this dictionary, applying actions[key] to the rest of the arguments, as sketched below.
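A short sketch of that dispatch (using the actions dictionary above):
import sys

if __name__ == '__main__':
    action = actions[sys.argv[1]]  # e.g. 'ROTD'
    action(*sys.argv[2:])          # pass the remaining arguments through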
You can define a main section in your ppmfilter.py that does this:
if __name__ == "__main__":
    import sys
    ROTD(sys.argv[1], sys.argv[2])  # change according to the signature of the function
and call it: python ppmfilter.py file1 file2
You can also run python -c in the directory that contains your *.py file:
python -c "import ppmfilter; ppmfilter.ROTD('/path/to/file1', '/path/to/file2')"