python importlib.import_module requires explicitly imported module

I want to implement a sort of plugin architecture that dynamically loads modules and calls functions from them.
For instance, the plugin code looks like this (in the file "foo_func.py"):
foo_local = []

def foo_add(a: int, b: int) -> int:
    c = a + b
    foo_local.append(c)
    return c

def foo_print():
    print(foo_local)
I need to support two plugins with the same code but with separate memory state, so I created a directory structure like this:
<ROOT_PROJECT>
    app.py
    bar/
        apple/
            foo/
                foo_func.py
                __init__.py
        orange/
            foo/
                foo_func.py
                __init__.py
code in "apple" and "orange" folders is the same.
then in app file i try to load modules and invoke functions from them
import importlib
from bar.apple.foo.foo_func import foo_add as apple_foo_add, foo_print as apple_foo_print
from bar.orange.foo.foo_func import foo_add as orange_foo_add, foo_print as orange_foo_print
apple = importlib.import_module('bar.apple.foo')
orange = importlib.import_module('bar.orange.foo')
apple_foo = getattr(apple, 'foo_func')
orange_foo = getattr(orange, 'foo_func')
apple_foo_add_my = getattr(apple_foo, 'foo_add')
apple_foo_print_my = getattr(apple_foo, 'foo_print')
apple_foo_add_my(1, 2)
apple_foo_print_my()
This works fine, but notice these import lines at the top:
from bar.apple.foo.foo_func import foo_add as apple_foo_add, foo_print as apple_foo_print
from bar.orange.foo.foo_func import foo_add as orange_foo_add, foo_print as orange_foo_print
They are not used anywhere in the code (even PyCharm complains that they are unused), but if I comment them out and run the script, it fails:
AttributeError: module 'bar.apple.foo' has no attribute 'foo_func'
Why? I would expect a normal plugin loader to need only importlib.import_module and getattr; shouldn't that be enough?
What is wrong here?

Let's switch completely to direct imports for this explanation, because:
import something.whatever as name
is the same as:
name = importlib.import_module("something.whatever")
So let's rewrite your apple code:
apple = importlib.import_module('bar.apple.foo')
apple_foo = getattr(apple, 'foo_func')
will become:
import bar.apple.foo as apple
apple_foo = apple.foo_func
Now, the first line loads bar.apple.foo as a module. For a package, that means executing the package's __init__.py and treating the resulting module object as the package itself.
And what code is in the package's __init__.py? Usually nothing! That's why the attribute lookup fails.
However, when you execute any import my_package.whatever, Python imports the submodule and binds it as an attribute of the package, so the name becomes visible. Those unused-looking imports are effectively pre-loading the submodules for you.
Why is PyCharm flagging the imports as unused? Because the imported names are never used as variables anywhere; you rely only on the import's side effect, and PyCharm doesn't analyze the strings you pass to import_module or getattr.
A visual example, with part of the standard library:
>>> import xml
>>> dir(xml)
['__all__', '__builtins__', '__cached__', '__doc__', '__file__', '__loader__', '__name__', '__package__', '__path__', '__spec__']
>>> xml.etree
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: module 'xml' has no attribute 'etree'
>>>
>>> import xml.etree
>>> dir(xml)
['__all__', '__builtins__', '__cached__', '__doc__', '__file__', '__loader__', '__name__', '__package__', '__path__', '__spec__', 'etree']
Another example, showing what happens when the package contains multiple modules:
>>> import dateutil
>>> dir(dateutil)
['__all__', '__builtins__', '__cached__', '__doc__', '__file__', '__loader__', '__name__', '__package__', '__path__', '__spec__', '__version__', '_version']
but:
>>> import dateutil.parser
>>> dir(dateutil)
['__all__', '__builtins__', '__cached__', '__doc__', '__file__', '__loader__', '__name__', '__package__', '__path__', '__spec__', '__version__', '_common', '_version', 'parser', 'relativedelta', 'tz']
The imported submodule (and any sibling modules it pulls in) is now visible and usable under its qualified name.
tl;dr: import my_package == only my_package/__init__.py is executed. import my_package.whatever == Python imports the submodule and binds it as an attribute of my_package, so it becomes visible and usable.
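To avoid the side-effect imports entirely, import the plugin module itself rather than its parent package. A minimal self-contained sketch of the question's layout (the temp-directory setup only recreates the bar/ tree so the example runs anywhere; in the real project the packages already exist on disk):

```python
import importlib
import sys
import tempfile
from pathlib import Path

# Recreate the question's bar/ tree in a temp dir so this sketch is
# self-contained; in the real project the packages already exist.
root = Path(tempfile.mkdtemp())
plugin_src = """\
foo_local = []

def foo_add(a, b):
    c = a + b
    foo_local.append(c)
    return c

def foo_print():
    print(foo_local)
"""
(root / "bar").mkdir()
(root / "bar" / "__init__.py").write_text("")
for flavor in ("apple", "orange"):
    pkg = root / "bar" / flavor / "foo"
    pkg.mkdir(parents=True)
    (root / "bar" / flavor / "__init__.py").write_text("")
    (pkg / "__init__.py").write_text("")
    (pkg / "foo_func.py").write_text(plugin_src)

sys.path.insert(0, str(root))

# Import the plugin *module* itself -- no side-effect "from ... import"
# lines needed, because we never look the module up as a package attribute.
apple = importlib.import_module("bar.apple.foo.foo_func")
orange = importlib.import_module("bar.orange.foo.foo_func")

apple.foo_add(1, 2)
print(apple.foo_local)   # [3]
print(orange.foo_local)  # []  -- same code, independent state
```

Because each dotted path resolves to a distinct module object in sys.modules, the two plugins keep separate foo_local lists, which is exactly the per-plugin state the question asks for.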

Related

python not recognizing function within imported module

I am working in a Jupyter notebook. I created a simple module called conv.py for converting miles to km. When I try to import this module in separate code (in the same directory), the import seems to succeed, but it doesn't recognize either of the functions I defined in the conv module.
I have imported os, and os.getcwd() shows the correct folder for conv.py.
Code for conv.py:
in_n_ft = 12
ft_n_mile = 5280
m_n_km = 1000
cm_n_in = 2.54
cm_n_m = 100
mm_n_m = 1000

def ft_to_km(feet):
    return feet*in_n_ft*cm_n_in/cm_n_m/m_n_km

print(ft_to_km(5280))

def mil_to_km(mile):
    return mile*ft_n_mile*in_n_ft*cm_n_in/cm_n_m/m_n_km

print(mil_to_km(3.2))
Code for the new module:
import conv
km = conv.mil_to_km(5)
Error provided
AttributeError Traceback (most recent call last)
<ipython-input-111-bfd778724ae2> in <module>
3 import conv
4
----> 5 km = conv.mil_to_km(5)
AttributeError: module 'conv' has no attribute 'mil_to_km'
When I type
dir(conv)
I get
['__builtins__',
'__cached__',
'__doc__',
'__file__',
'__loader__',
'__name__',
'__package__',
'__spec__']
What am I doing wrong?
EDIT
I have also tried
from conv import mil_to_km
When I do that I get a different error:
cannot import name 'mil_to_km' from 'conv' (C:\Users\223023441\Documents\python\conv.py)
I have also queried the module using:
from inspect import getmembers, isfunction
import conv
print(getmembers(conv, isfunction))
from here I get:
['__builtins__',
'__cached__',
'__doc__',
'__file__',
'__loader__',
'__name__',
'__package__',
'__spec__']
I am also unable to access any of the variables in the conv.py file after import. Am I doing something wrong when I save the .py file? Jupyter uses .ipynb as its native format; when I 'Save As' to conv.py, is that what's breaking it?
You should import from the module.
Try this:
from conv import mil_to_km
km = mil_to_km(5)
Note that both forms execute the whole module the first time it is imported; from conv import ... simply binds only the requested names into your namespace instead of the module object itself.
So the ultimate issue was the way I was saving the .py file. I was using the 'Save As' command in Jupyter Notebook and typing 'conv.py' as the file name. This showed up in the directory as a .py file, but my main file wasn't recognizing it properly. Once I downloaded the file as a .py file and moved it from my Downloads folder into my working directory, everything worked.
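When an import misbehaves like this, it helps to check which file Python actually resolves before importing anything. A small sketch using the standard library's json as a stand-in for conv:

```python
import importlib.util

# find_spec locates a module without executing it; spec.origin is the
# path of the file that an "import" statement would actually load.
# json here is a stdlib stand-in for the question's conv module.
spec = importlib.util.find_spec("json")
print(spec.origin)
```

If the printed path is not the file you think you are editing, a stale or misplaced copy is shadowing it.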

Distinguish between imported and globally defined module attributes

How can I programmatically distinguish between attributes defined at module level and those imported from other modules? For instance, I want to know which module the names HIGHEST_PROTOCOL and MY_HIGHEST_PROTOCOL defined in mymod.py belong to.
Contents of mymod.py:
from pickle import HIGHEST_PROTOCOL
MY_HIGHEST_PROTOCOL = 123
Inspecting in IPython.
In [2]: import mymod
In [3]: dir(mymod)
Out[3]:
['HIGHEST_PROTOCOL',
'MY_HIGHEST_PROTOCOL',
'__builtins__',
'__cached__',
'__doc__',
'__file__',
'__loader__',
'__name__',
'__package__',
'__spec__']
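The question as captured here has no answer, but one common approach is to parse the module's source rather than inspect the live object, since a plain int like HIGHEST_PROTOCOL carries no record of where it came from. A sketch using the standard ast module, with the question's two-line mymod.py inlined as a string:

```python
import ast
import textwrap

# The contents of mymod.py from the question, inlined so the sketch
# is self-contained; in practice you would read the file's source.
src = textwrap.dedent("""
    from pickle import HIGHEST_PROTOCOL
    MY_HIGHEST_PROTOCOL = 123
""")

imported, defined = set(), set()
for node in ast.parse(src).body:
    if isinstance(node, (ast.Import, ast.ImportFrom)):
        # Record the name each import binds (honoring "as" aliases).
        for alias in node.names:
            imported.add(alias.asname or alias.name)
    elif isinstance(node, ast.Assign):
        # Record plain top-level assignments.
        for target in node.targets:
            if isinstance(target, ast.Name):
                defined.add(target.id)

print(imported)  # {'HIGHEST_PROTOCOL'}
print(defined)   # {'MY_HIGHEST_PROTOCOL'}
```

This only inspects top-level statements; names created by star imports or inside conditionals would need extra handling.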

Calling dir function on a module

When I called dir to list the attributes of boltons, I got the output below:
>>> import boltons
>>> dir(boltons)
['__builtins__', '__doc__', '__file__', '__name__', '__package__', '__path__']
When I explicitly did
>>> from boltons.strutils import camel2under
>>> dir(boltons)
['__builtins__', '__doc__', '__file__', '__name__', '__package__', '__path__', 'strutils']
I found that strutils gets added as an attribute of boltons.
Why is strutils not showing up before the explicit import?
From the docs on what dir does:
With an argument, attempt to return a list of valid attributes for
that object.
When we import the boltons package we can see that strutils is not an attribute on the boltons object. Therefore we do not expect it to show up in dir(boltons).
>>> import boltons
>>> getattr(boltons, 'strutils')
AttributeError: module 'boltons' has no attribute 'strutils'
The docs on importing submodules say:
For example, if package spam has a submodule foo, after importing spam.foo, spam will have an attribute foo which is bound to the submodule.
Importing a submodule creates an attribute on the package. In your example:
>>> import boltons
>>> getattr(boltons, 'strutils')
AttributeError: module 'boltons' has no attribute 'strutils'
>>> from boltons.strutils import camel2under
>>> getattr(boltons, 'strutils')
<module 'boltons.strutils' from '/usr/local/lib/python3.5/site-packages/boltons/strutils.py'>
Therefore, in this case, we do expect strutils to show up in dir(boltons).
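The same behavior can be reproduced with any standard-library package; a sketch using wsgiref, chosen because its __init__.py imports none of its own submodules:

```python
import wsgiref

# Before the submodule is imported, the package object has no such
# attribute (in a fresh interpreter).
print(hasattr(wsgiref, "util"))

import wsgiref.util

# Importing the submodule binds it onto the package object,
# so it now appears in dir(wsgiref) as well.
print(hasattr(wsgiref, "util"))  # True
```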

Why does python put 'A' in global namespace when I import A.B.C

I am reading about how import works in Python.
When I do:
import A.B.C
A, A.B, A.B.C are put in sys.modules. Expected.
A's __init__, A.B's __init__ get executed. Expected.
But here is a surprise: when I print globals(), only A is in the namespace; 'A.B.C' is not. I expected 'A.B.C' to be in the global namespace.
And this means I can access A.x defined in A's __init__.
Why is import implemented this way?
Only objects bound to valid names are put in the globals namespace, and A.B.C is not a valid name.
In your case, the object is the module object for A, and its name is A.
In this particular case, if you do
dir(A)
you would see B inside it, which means it's an attribute of the module object A. If you do
hasattr(A, 'B')
it would return True.
In the same way, dir(A.B) would show C in it; C is an attribute of A.B.
A very simple example to show this.
My directory structure:
shared/
    __init__.py
    pkg/
        __init__.py
        b.py
Then in code I do:
>>> import shared.pkg.b
>>> dir(shared)
['__builtins__', '__cached__', '__doc__', '__file__', '__loader__', '__name__', '__package__', '__path__', '__spec__', 'pkg']
>>> hasattr(shared,'pkg')
True
>>>
>>> dir(shared.pkg)
['__builtins__', '__cached__', '__doc__', '__file__', '__loader__', '__name__', '__package__', '__path__', '__spec__', 'b']
>>> hasattr(shared.pkg,'b')
True
B and C are reachable through A, e.g.:
import A.B.C
print(A.B.C)
If you want B and C to appear directly in your current namespace then do
from A import B
from A.B import C
print(B, C)
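The split between what lands in sys.modules and what lands in your namespace can be verified directly; a sketch using the standard library's xml.etree.ElementTree as the A.B.C chain:

```python
import sys
import xml.etree.ElementTree

# Every package on the dotted path is registered in sys.modules...
print('xml' in sys.modules)                    # True
print('xml.etree' in sys.modules)              # True
print('xml.etree.ElementTree' in sys.modules)  # True

# ...but only the top-level name 'xml' gets bound in the importing
# namespace, because 'xml.etree' is not a valid identifier.
print('xml' in globals())        # True (when run at module scope)
print('xml.etree' in globals())  # False
```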

python : module attributes missing in python script but not in interpreter

I have installed a Python package, plivo, using sudo pip install plivo,
and in the interpreter I test it with some code like:
>>> import plivo
>>> p = plivo.RestAPI('xxx', 'yyy')
Everything works fine in the Python interpreter.
Exactly the same code is not working in a Python script, test_plivo.py,
giving the error: AttributeError: 'module' object has no attribute 'RestAPI'
Then I checked with dir().
In the interpreter:
>>> dir(plivo)
['Account', 'Application', 'Call', 'Carrier', 'Conference', 'ConferenceMember', 'EndPoint', 'Message', 'Number', 'PLIVO_VERSION', 'PlivoError', 'PlivoResponse', 'Pricing', 'Recording', 'RestAPI', 'SubAccount', 'XML', '__builtins__', '__doc__', '__file__', '__name__', '__package__', 'base64', 'hmac', 'json', 'requests', 'sha1', 'validate_signature']
RestAPI is there.
While in test_plivo.py, dir(plivo) looks like:
['__builtins__', '__doc__', '__file__', '__name__', '__package__', 'main']
Clearly, dir(plivo) in the script is missing RestAPI along with the other attributes.
Why is that the behavior, and how do I fix it?
You are importing a different module; somewhere on your path there is a different plivo.py (or a plivo.pyc cached bytecode file).
Print out the __file__ attribute to see what is being imported instead:
print plivo.__file__
Then rename that file or move it somewhere else.
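The shadowing effect is easy to reproduce; a self-contained sketch that fabricates a local module (shadow_demo is an invented name) whose directory comes first on sys.path:

```python
import sys
import tempfile
from pathlib import Path

# Create a local file that will "shadow" any installed package of the
# same name, because its directory is placed first on sys.path.
tmp = Path(tempfile.mkdtemp())
(tmp / "shadow_demo.py").write_text("main = 'I am the local file'\n")
sys.path.insert(0, str(tmp))

import shadow_demo

print(shadow_demo.__file__)  # points into the temp dir, not site-packages
print(shadow_demo.main)      # 'I am the local file'
```

In the question's case, a local plivo.py was winning the same race against the pip-installed package; checking __file__ reveals exactly which copy was loaded.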
