Command-line options and importing functions - Python

I'm importing a function from a Python module (using from python_file import function), then making use of that function in my system.
My problem is that I want the user to specify the Python file and function on the command line, using the argparse module, but I am not sure how to do this. Can someone please explain how?

The use of argparse is straightforward.
The trick is to import a module and one of its functions, both being provided as strings.
import argparse

parser = argparse.ArgumentParser(description='Import stuff')
parser.add_argument('--module')
parser.add_argument('--function')
args = parser.parse_args()

# __import__ takes the module name as a string; getattr then looks up
# the function by name on the imported module object.
module = __import__(args.module)
function = getattr(module, args.function)
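As a side note, the documented wrapper importlib.import_module is usually preferred over calling __import__ directly. A minimal runnable sketch of the same idea, using the standard math module and sqrt as stand-ins for the user-supplied names:

```python
import argparse
import importlib

parser = argparse.ArgumentParser(description='Import stuff')
parser.add_argument('--module')
parser.add_argument('--function')

# Parsing an explicit list here so the sketch runs without a real command
# line; in a script you would call parser.parse_args() with no arguments.
args = parser.parse_args(['--module', 'math', '--function', 'sqrt'])

module = importlib.import_module(args.module)  # also handles dotted names
function = getattr(module, args.function)
print(function(9))  # 3.0
```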

What can modules provide us that functions can't in python?

For example, I have a module named my_mod.py whose code is:
def get_sum(int1, int2):
    a = int(int1) + int(int2)
    print(a)
And I have the same function in a plain Python file, my_python_file.py:
def get_sum(int1, int2):
    a = int(int1) + int(int2)
    print(a)

get_sum(12, 43)
So what is the main difference between modules and functions (setting aside my example code)?
Main question:
Can anyone give me an example of something modules can do that functions can't?
Thank you!
Modules are libraries of functions.
A module is a file:
my_mod.py
A function is a named block of code:
my_func()
You import a module.
import my_mod
From a module you can import a function.
from my_mod import my_func
You can pass arguments to a function; you do not pass arguments to a module.
from my_mod import my_func
my_func(myvar)
You can install a module (if it is published on PyPI):
pip install my_mod
You cannot install a function; a function is defined inside your module.
# my_mod.py
def my_func(myvar):
    return myvar
When you use the same code multiple times in one script, you turn it into a function to reduce redundancy. Similarly, when you need the same code in multiple scripts, instead of rewriting the function in each one, you put it in a module and import the function wherever it is needed. This is a simplified explanation based on how you have used modules so far.
In general, the idea behind modules is that someone writes a collection of related code for other people to use as a tool.
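The point that a module is just a file can be demonstrated end to end. This sketch writes my_mod.py into a temporary directory, puts that directory on sys.path, and imports the function from it (the temporary directory stands in for wherever your modules normally live):

```python
import os
import sys
import tempfile

# A module really is just a file: create my_mod.py on disk, make its
# directory importable, then import the function from it.
tmpdir = tempfile.mkdtemp()
with open(os.path.join(tmpdir, 'my_mod.py'), 'w') as f:
    f.write('def get_sum(int1, int2):\n'
            '    return int(int1) + int(int2)\n')

sys.path.insert(0, tmpdir)
from my_mod import get_sum

print(get_sum(12, 43))  # 55
```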

Accessing a Module Imported from Another Module's Function

The Issue:
I want to use a module that was imported inside a function of a different module.
What I want to achieve:
main.py
import differentFile
print (differentFile.functionName.os.getcwd())
differentFile.py
def functionName():
    import os
What I have tried:
Pretty much the above, but it doesn't work because functionName has no attribute os.
I have managed to achieve the above without using the function, but I need to use the function.
There is no reason to do this; you can simply import os in main.
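That said, if the function really must own the import, one hedged option is to have it return the module object; this hypothetical sketch stands in for differentFile.py:

```python
# A module imported inside a function is a local name there; returning
# it makes the module object available to the caller.
def functionName():
    import os
    return os

print(functionName().getcwd())
```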

Passing arguments to python module

I want to pass arguments to the python module to help me decide whether or not to execute some part of the module initialisation code.
Suppose I have a python module named my_module
import sys

flag = sys.argv[1]  # note: command-line arguments are always strings
if flag:
    pass  # Do something
else:
    pass  # Do something else

def module_hello():
    print("hello")
However, I don't want a user script to interfere with the arguments. The behaviour has to be based purely on the parameters passed when the script is spawned. In the environment where this will be used, I control how the script is spawned, but the script itself is provided by the user of the module.
Say a user writes script which imports this module
sys.argv[1] = "Change to something unpleasant"
import my_module
I don't want user to have control over sys.argv. The CLI arguments passed to the script should go to the module unharmed.
Is there a way to achieve this?
If you want to set some global values for a module, you should probably consider encapsulating them in a class, or setting them by calling an initialisation function, so you can pass the parameters that way.
main.py:
import my_module
my_module.init('some values')
my_module.py:
VALUES = None

def init(values):
    global VALUES
    VALUES = values
But why not simply declare some variables in the module and just set the value when you load it?
main.py:
import my_module
my_module.values = 'some values'
my_module.py:
values = None
Or if you just want to read the arguments, it's like any other script:
main.py:
import my_module
my_module.py:
import sys

values = sys.argv[1]
Of course, you can get as fancy as you like; see https://docs.python.org/3/library/argparse.html
So, try to read the arguments in your module.
my_module.py
import sys

# Assign my module properties
is_debug = sys.argv[1]

Python: How to Call Module from Other Path With __name__ == '__main__'

As described in this answer about how to import a module, one can import a module located in another path this way:
import sys
sys.path.append('PathToModule')
import models.user
My question is:
How can I execute this other module (and also pass parameters to it), if this other module is setup this way:
if __name__ == '__main__':
    do_something()
and do_something() uses argparse.ArgumentParser to work with the parameters supplied?
I ADDED THE FOLLOWING AFTER THE FIRST QUESTIONS/COMMENTS CAME UP
I am able to pass the parameters via
sys.argv[1:] = [
"--param1", "123",
"--param2", "456",
"--param3", "111"
]
so this topic is already covered.
Why do I want to call another module with parameters?
I would like to be able to do a kind of a small regression test for another project. I would like to get this other project via a git clone and have different versions locally available, that I can debug, too, if needed.
But I do not want to be involved too much in that other project (so that forking does not make sense).
AND SO MY REMAINING QUESTION IS
How can I tweak the contents of __name__ when calling the other module?
There are multiple ways to approach this problem.
If the module you want to import is well-written, it should have separate functions for parsing the command line arguments and for actually doing work. It should look something like this:
def main(arg1, arg2):
    pass # do something

def parse_args():
    parser = argparse.ArgumentParser()
    ... # lots of code
    return vars(parser.parse_args())

if __name__ == '__main__':
    args = parse_args()
    main(**args)
In this case, you would simply import the module and then call its main function with the correct arguments:
import yourModule
yourModule.main('foo', 'bar')
This is the optimal solution.
If the module doesn't define such a main function, you can manually set sys.argv and use runpy.run_module to execute the module:
import runpy
import sys
sys.argv[1:] = ['foo', 'bar']
runpy.run_module('yourModule', run_name='__main__', alter_sys=True)
Note that this only executes the module; it doesn't import it. (I.e. the module won't be added to sys.modules and you don't get a module object that you can interact with.)
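To make the second approach concrete, here is a hedged end-to-end demo. It creates a throwaway module (your_module.py, a hypothetical stand-in for the project under test) whose argparse code only runs under __name__ == '__main__', then executes it with runpy.run_module after setting sys.argv:

```python
import os
import runpy
import sys
import tempfile

# Create a throwaway module that parses --param1 only when run as a script.
tmpdir = tempfile.mkdtemp()
with open(os.path.join(tmpdir, 'your_module.py'), 'w') as f:
    f.write('import argparse\n'
            'def parse_args():\n'
            '    parser = argparse.ArgumentParser()\n'
            "    parser.add_argument('--param1')\n"
            '    return parser.parse_args()\n'
            "if __name__ == '__main__':\n"
            '    print(parse_args().param1)\n')

sys.path.insert(0, tmpdir)
sys.argv[1:] = ['--param1', '123']

# run_name='__main__' triggers the guarded block; alter_sys=True makes
# sys.argv[0] point at the module file while it runs.
runpy.run_module('your_module', run_name='__main__', alter_sys=True)  # prints 123
```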

dynamically import submodules in python

Suppose I provide a module path on the command line and want to import it using the "imp" module:
$ foo.py mod.a.b.c
What is the proper way of doing this?
Should I split "mod.a.b.c" and import each part? The behaviour of "imp" does not seem to parallel that of "import".
Given a module path as a string (modulename), you can import it with
module = __import__(modulename,fromlist='.')
Note that __import__('mod.a.b.c') returns the package mod, while __import__('mod.a.b.c',fromlist='.') returns the module mod.a.b.c.
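The fromlist trick is unnecessary with importlib.import_module, which returns the leaf module for a dotted name directly. A small sketch, using the standard dotted module os.path as a stand-in for mod.a.b.c:

```python
import importlib

# import_module resolves the full dotted path and returns the leaf
# module, not the top-level package.
module = importlib.import_module('os.path')
print(module.join('a', 'b'))
```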
