I have several different modules, and I need to import one of them depending on different situations, for example:
if check_situation() == 1:
    import helper_1 as helper
elif check_situation() == 2:
    import helper_2 as helper
elif ...
    ...
else:
    import helper_0 as helper
These helpers contain the same dictionaries (dict01, dict02, dict03, ...) but with different values to be used in different situations.
But this has some problems:
import statements are conventionally written at the top of a file, but check_situation() needs prerequisites, so this code ends up far from the top.
more than one file needs this helper module, so it's awkward and ugly to repeat this kind of conditional import everywhere.
So, how should these helpers be rearranged?
Firstly, there is no strict requirement that import statements be at the top of a file; it is more of a style-guide recommendation.
Now, importlib and a dict can be used to replace your if/elif chain:
import importlib
d = {1: 'helper_1', 2: 'helper_2'}
helper = importlib.import_module(d.get(check_situation(), 'helper_0'))
But it's just syntactic sugar really, I suspect you have bigger fish to fry. It sounds like you need to reconsider your data structures, and redesign code.
Anytime you have variables named like dict01, dict02, dict03 it is a sure sign that you need to gear up a level, and have some container of dicts e.g. a list of them. Same goes for your 'helper' module names ending with digits.
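For instance (a sketch with made-up values and a hypothetical consolidated helpers.py, not your real data), the numbered helper modules could collapse into a single container keyed by situation:

```python
# helpers.py (hypothetical consolidated module)
# All situation-specific values live in one container instead of
# being spread across helper_0.py, helper_1.py, ...
HELPERS = {
    0: {"dict01": {"a": 1}, "dict02": {"b": 2}},
    1: {"dict01": {"a": 10}, "dict02": {"b": 20}},
    2: {"dict01": {"a": 100}, "dict02": {"b": 200}},
}

def get_helper(situation):
    # Unknown situations fall back to the default (situation 0),
    # mirroring the `else: import helper_0` branch above.
    return HELPERS.get(situation, HELPERS[0])
```

Then every caller does helper = get_helper(check_situation()) and reads helper["dict01"], with no conditional imports anywhere.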
You can use __import__(), it accepts a string and returns that module:
helper = __import__("helper_{0}".format(check_situation()))
Example:
In [10]: mod=__import__("{0}math".format(raw_input("enter 'c' or '': ")))
enter 'c' or '': c #imports cmath
In [11]: mod.__file__
Out[11]: '/usr/local/lib/python2.7/lib-dynload/cmath.so'
In [12]: mod=__import__("{0}math".format(raw_input("enter 'c' or '': ")))
enter 'c' or '':
In [13]: mod.__file__
Out[13]: '/usr/local/lib/python2.7/lib-dynload/math.so'
As pointed out by @wim, and from the Python 3 docs on __import__():
Import a module. Because this function is meant for use by the Python
interpreter and not for general use it is better to use
importlib.import_module() to programmatically import a module.
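Following that recommendation, the cmath/math example above can be written with importlib.import_module instead (a sketch of the equivalent call, wrapped in a hypothetical helper function):

```python
import importlib

def load_math_variant(prefix):
    # importlib.import_module returns the module object directly,
    # unlike __import__, which is meant for the interpreter's use.
    return importlib.import_module(prefix + "math")

cmath_mod = load_math_variant("c")  # imports cmath
math_mod = load_math_variant("")    # imports plain math
print(cmath_mod.sqrt(-1))           # 1j
print(math_mod.sqrt(4))             # 2.0
```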
Solved it myself, referring to @Michael Scott Cuthbert:
# re_direct.py
import this_module
import that_module
wanted = None

# caller.py
import re_direct
'''
many prerequisites
'''
def imp_now(case):
    import re_direct
    if case == 1:
        re_direct.wanted = re_direct.this_module
    elif case == 2:
        re_direct.wanted = re_direct.that_module
Then, once I call imp_now in the caller, wanted (whether used in the caller file or in any other file that reads it) is redirected to this_module or that_module.
Also, because re_direct is imported only inside a function, you will not see that module anywhere else; you only see wanted.
I agree that the approaches in the other answers are closer to the question posed in your title, but if the overhead of importing the modules is low (as importing a couple of dictionaries likely is) and importing has no side effects, you may be better off importing them all and selecting the proper one afterwards:
import helper_0
import helper_1
...
helperList = [helper_0, helper_1, helper_2...]
...
helper = helperList[check_situation()]
Related
I want to neatly import all variables in a file that have '_COLUMN' in their name. I do this because there are 20+ variables and the normal import statement would be huge. I also want the variables to be directly accessible e.g. TIME_COLUMN and not earthquakes.tools.TIME_COLUMN
The following code works:
import earthquakes.tools
for item in dir(earthquakes.tools):
    if '_COLUMN' in item:
        exec(f'from earthquakes.tools import {item}')
Is this considered Pythonic? If not, is there a better way, or should this not be done at all?
For information, I have tried searching for regex in import statements and other solutions but did not find anything significant.
Please do not mention from ... import * as it is not considered Pythonic
All the _COLUMN-suffixed variables should probably be a dict in the first place, but if you can't change that in earthquakes.tools yourself, you can build a local dict using getattr.
import earthquakes.tools

column_names = ["foo", "bar", "baz", ...]
columns = {k: getattr(earthquakes.tools, f'{k}_COLUMN') for k in column_names}
Don't use dir to "infer" the column names. No one reading the script will know which variables are defined without knowing what earthquakes.tools defines in the first place, so be explicit and list the names you expect to use. (There is no point creating a variable you will never actually use.)
You can use this trick:
from earthquakes.tools import *
You can override the __all__ variable of the module:
tools.py
__all__ = ['var1_COLUMN', 'var2_COLUMN', 'var3_COLUMN']
main.py:
from earthquakes.tools import *
To know more: Importing * From a Package
This question already has answers here:
Python, doing conditional imports the right way
(4 answers)
Closed last month.
I'm new to conditional importing in Python, and am considering two approaches for my module design. I'd appreciate input on why I might want to go with one vs. the other (or if a better alternative exists).
The problem
I have a program that will need to call structurally identical but distinct modules under different conditions. These modules all have the same functions, inputs, outputs, etc., the only difference is in what they do within their functions. For example,
# module_A.py
def get_the_thing(input):
    # do the thing specific to module A
    return thing

# module_B.py
def get_the_thing(input):
    # do the thing specific to module B
    return thing
Option 1
Based on an input value, I would just conditionally import the appropriate module, in line with this answer.
if val == 'A':
    import module_A
if val == 'B':
    import module_B
Option 2
I use the input variable to generate the module name as a string, then I call the function from the correct module based on that string using this method. I believe this requires me to import all the modules first.
import module_A
import module_B
in_var = get_input() # Say my input variable is 'A', meaning use Module A
module_nm = 'module_' + in_var
function_nm = 'get_the_thing'
getattr(globals()[module_nm], function_nm)(my_args)
The idea is this would call module_A.get_the_thing() by generating the module and function names at runtime. This is a frivolous example for only one function call, but in my actual case I'd be working with a list of functions, just wanted to keep things simple.
Any thoughts on whether either design is better, or whether something superior to these two exists? I'd appreciate any reasons why. Of course, Option 1 is more concise and probably more intuitive, but I wasn't sure that necessarily equates to good design, or whether there are performance differences.
I'd go with Option 1. It's significantly neater, and you don't need to fiddle around with strings to do lookups. Dealing with strings will, at the very least, complicate refactoring: if you ever change any of the names involved, you must remember to update the strings as well, and even smart IDEs won't be able to help you with typical shift+F6 renaming there. The fewer places you have difficult-to-maintain code like that, the better.
I'd make a minor change to Option 1, though. As you have it now, each use of the module still requires a qualified name, like module_A.do_thing(). That means that whenever you want to call a function, you first need to know which module was imported, which leads to messier code. I'd import them under a common name:
if val == 'A':
    import module_A as my_module
if val == 'B':
    import module_B as my_module

. . .

my_module.do_thing()  # The exact function called will depend on which module was imported as my_module
You could also, as suggested in the comments, use a wildcard import to avoid needing to use a name for the module:
if val == 'A':
    from module_A import *
if val == 'B':
    from module_B import *

. . .

do_thing()
But this is discouraged by PEP8:
Wildcard imports (from <module> import *) should be avoided, as they make it unclear which names are present in the namespace, confusing both readers and many automated tools.
It also pollutes the namespace that you're importing into, making it easier to accidentally shadow a name from the imported file.
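The shadowing hazard is easy to reproduce. This self-contained sketch (no real module_A exists here) simulates what a wildcard import does to a namespace; from module_A import * performs essentially this dict update on your globals:

```python
def do_thing():
    return "defined locally"

# What `from module_A import *` effectively executes: copy the
# module's public names into the importing namespace, silently
# overwriting any existing bindings with the same name.
module_a_names = {"do_thing": lambda: "from module_A"}
globals().update(module_a_names)

print(do_thing())  # now the module_A version; the local one is shadowed
```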
I am in the habit of using raw_input(...) for certain debugging. However, in python3 this has changed to input(...). Is there a way to define an alias at the top of my project, such as:
# __init__.py
raw_input = input
I tried the above, but it only worked in the file I added it to, and not any other files in that directory. I'd like this to work basically in every file within my python repository.
You can define all aliases in a separate file (e.g. aliases.py) then import said file where needed (i.e. import aliases).
The con with this method is that you'll be referencing the alias through aliases.alias, unless you make the import stricter (i.e. from aliases import raw_input) or you don't mind a wildcard import (i.e. from aliases import *).
Additionally, if you don't mind another import in the aliases file you can use the builtins namespace:
import builtins
builtins.raw_input = input
You still have to define all aliases in a separate file (e.g. aliases.py) and then import that file where needed (i.e. import aliases), but the advantage of using the builtins namespace is that you can then use the alias exactly as given.
You can do it by creating a module that defines the renamed function and then importing it into every file where you want to use it, like this:
First, the function declaration in alias.py:
def raw_input(a):
    return input(a)
Second, import it in another file:
from alias import raw_input
x = raw_input("hello world")
print(x)
Sadly, you will have to add this import to every file in which you want to use the renamed function.
Hope it works for you!
Put this at the top, and you will get exactly what you want.
import builtins
builtins.raw_input = builtins.input
It is guaranteed to work, but it is generally considered bad practice (everybody will be confused about where that raw_input is defined).
I know that from module import * will import all the functions in current namespace but it is a bad practice. I want to use two functions directly and use module.function when I have to use any other function from the module. What I am doing currently is:
import module
from module import func1, func2
# DO REST OF MY STUFF
Is it a good practice? Does the order of first two statements matter?
Is there a better way using which I can use these two functions directly and use rest of the functions as usual with the module's name prepended to them?
Using just import module results in very long statements with a lot of repetition if I use the same function from the given module five times in a single statement. That's what I want to avoid.
The order doesn't matter, and it's not a Pythonic way. When you import the module, there is no need to import some of its functions separately as well. If you are not sure how many of the functions you'll need, just import the module and access the functions on demand with a simple attribute reference.
# The only import you need
import module
# Use module.funcX when you need any of its functions
After all, if you use some of the functions (much) more than the others, then since the cost of attribute access is greater than that of a direct name lookup, you are better off importing those functions separately, as you've done.
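To make the trade-off concrete: from module import func1 essentially binds an attribute to a local name, which you can also do yourself after a plain import. A sketch using the stdlib math module as a stand-in for your module:

```python
import math

# Equivalent to `from math import sqrt`: one attribute lookup now,
# plain name lookups on every later call.
sqrt = math.sqrt

print(sqrt(9))          # 3.0, no attribute access on each call
print(math.floor(2.7))  # other functions still reached via the module
```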
And still, the order doesn't matter. You can do:
import module
from module import func1, func2
For more info read the documentation https://www.python.org/dev/peps/pep-0008/#imports
It is not good to do this (though that may be opinion-based):
import module
from module import func1, func2 # `func1` and `func2` are already part of module
Because you already hold a reference to module.
If I were you, I would import it in the form import module. Since your issue is that module.func1() becomes too long, I would import the module with as to create an alias for the name. For example:
import module as mo
# ^ for illustration purpose. Even the name of
# your actual module wont be `module`.
# Alias should also be self-explanatory
# For example:
import database_manager as db_manager
Now I may access the functions as:
mo.func1()
mo.func2()
Edit: based on the edit to the question.
If you are calling the same function several times in one statement, there is a chance you are already doing something wrong. It would be great if you could share what that function does.
For example: do you want the return values of those functions to be passed as arguments to another function? As in:
test_func(mo.func1(x), mo.func1(y), mo.func1(z))
could be done as:
params_list = [x, y, z]
func_list = [mo.func1(param) for param in params_list]
test_func(*func_list)
This question was marked as a duplicate. However, the duplicate question deals with whole modules, while my question asks how to import parts of a module.
I know that I can import certain portions of the module by using from mymodule import myfunction. What if want to import several things, but I need to list them as strings. For example:
import_things = ['thing1', 'thing2', 'thing3']
from mymodule import import_things
I did ask this question, but it looks like I need to use the trick (if there is one) above for my code as well.
Any help is appreciated.
import importlib
base_module = importlib.import_module('mymodule')
imported_things = {thing: getattr(base_module, thing) for thing in import_things}
imported_things['thing1']() # Function call
If you want to be able to use the things which have been imported globally, then do:
globals().update(imported_things)
thing1() # function call
To remove the imports, you can do
del thing1
or
del globals()['thing1']
Another useful operation you can do is to reload a module with importlib.reload(module), for example after editing its source.
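For completeness, a reload sketch using the stdlib json module as a stand-in; importlib.reload re-executes the module's code in place and returns the same module object:

```python
import importlib
import json  # any already-imported module works here

reloaded = importlib.reload(json)

# reload mutates the existing module object rather than creating a new one
print(reloaded is json)          # True
print(json.dumps({"ok": True}))  # {"ok": true}
```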