How to import submodule into parent namespace? - python

I have a directory structure like this:
foo/
    __init__.py
    printsum.py
        '''
        def printsum(x, y):
            print("sum is " + str(x + y))
        '''
bar.py
    '''
    def printsum(x, y):
        print("sum is " + str(x + y))
    '''
example.py
    '''
    #!/usr/bin/env python3
    import foo.printsum
    import bar

    foo.printsum.printsum(1, 2)
    bar.printsum(3, 4)
    '''
The file example.py is meant to be run as a script, while foo and bar are meant to be imported modules. I would like the namespace of foo to work like bar's; that is, I don't want the double printsum.printsum. But I also don't want all of the foo module saved in one big monolithic file like bar.py. I want the printsum() function to live in a file by itself, yet still be available at the foo. namespace level.
Can this be done somehow?

Import the functions in __init__.py.
foo/__init__.py:
from .printsum import printsum
(If you are using Python 2, you might need to remove the dot or use the full import path. The rules for relative import paths are stricter in Python 3.)
Then you can call the function directly from foo:
import foo
foo.printsum(...)
The foo import will run __init__. You can add some prints in there as well if you want proof. This is a perfectly normal way to expose functionality in package modules. You can also do * imports, but that is generally not recommended. Import strictly what you want to expose.
__init__ in other words is a place you can glue stuff to the package, but also do automatic initialisation that triggers in the import.
Also look up the __all__ variable. It can be extremely useful for hiding code that is not relevant to a user of the package or module.
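For example, a module can declare __all__ to control what a star-import exposes. A minimal sketch (the module name shapes and its functions are hypothetical):
# shapes.py
__all__ = ['circle', 'square']   # _helper is deliberately not exported

def circle(r):
    return 3.14159 * r * r

def square(s):
    return s * s

def _helper():
    pass

# In another file, `from shapes import *` brings in circle and square only;
# plain `import shapes` still exposes shapes._helper to anyone who goes looking.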

Related

Call a function for every script inside a folder

Is there a way (using only Python, i.e. without a bash script or code in another language) to call a specific function in every script inside a folder, without needing to import all of them explicitly?
For example, let's say that this is my structure:
main.py
modules/
    module1.py
    module2.py
    module3.py
    module4.py
and every moduleX.py has this code:
import os

def generic_function(caller):
    print('{} was called by {}'.format(os.path.basename(__file__), caller))

def internal_function():
    print('ERROR: Someone called an internal function')
while main.py has this code:
import modules
import os

for module in modules.some_magic_function():
    module.generic_function(os.path.basename(__file__))
So if I run main.py, I should get this output:
module1.py was called by main.py
module2.py was called by main.py
module3.py was called by main.py
module4.py was called by main.py
*Please note that internal_function() shouldn't be called (unlike in this question). Also, I don't want to have to declare every module file explicitly, not even in an __init__.py.
By the way, I don't mind using classes for this. In fact, it could be even better.
You can use exec or eval to do that. So it would go roughly this way (for exec):
def magic_execute():
    import glob
    import os
    # MYPATH: path to the folder that holds the scripts
    for pyfl in glob.glob(os.path.join(MYPATH, '*.py')):
        with open(pyfl, 'rt') as fh:
            pycode = fh.read()
        # append a call to the function, passing this file's name as the caller
        pycode += '\ngeneric_function({!r})'.format(__file__)
        exec(pycode)
The assumption here is that you are not going to import the modules at all.
Please note that there are numerous security issues related to using exec in such an unrestricted manner. You can increase security a bit.
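For instance, one small improvement is to pass an explicit globals dictionary to exec so the executed code only sees the names you hand it. This is only an illustration, not a real sandbox:
# sketch: exec with a hand-built globals dict
code = "generic_function('main.py')"
restricted_globals = {'__builtins__': {}, 'generic_function': print}
exec(code, restricted_globals)   # the executed code can only reach names in restricted_globals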
While sophros' approach is quick and sufficient for implicitly importing the modules, you could have issues controlling every module or making more complex calls (like having conditions for each call). So I went with another approach:
First, I created a base class with the function(s) (now methods) declared. With this I can avoid checking whether the method exists, since I can fall back to the default one if I didn't override it:
# main.py
class BaseModule:
    def __init__(self):
        pass  # Any code

    def generic_function(self, caller):
        # This could be a print (or a default return value) or an exception
        raise Exception("generic_function wasn't overridden or it was used with super")
Then I created another class that extends BaseModule. Sadly, I wasn't able to find a good way of checking inheritance without knowing the name of the child class, so I used the same class name in every module:
# modules/moduleX.py
import os

from main import BaseModule

class GenericModule(BaseModule):
    def __init__(self):
        BaseModule.__init__(self)
        # Any code

    def generic_function(self, caller):
        print('{} was called by {}'.format(os.path.basename(__file__), caller))
Finally, in my main.py, I used importlib to import the modules dynamically and create an instance of each one, so I can use them later (for the sake of simplicity I didn't save the instances in the following code, but it's as easy as appending each one to a list):
# main.py
import importlib
import os

if __name__ == '__main__':
    relPath = 'modules'  # This has to be relative to the working directory
    for pyFile in os.listdir('./' + relPath):
        # Only load Python (.py) files, skipping __init__.py and the like
        if pyFile.endswith('.py') and not pyFile.startswith('__'):
            # Each module has to be loaded with dots instead of slashes in the path and
            # without the extension. Also, the modules folder must contain an __init__.py file.
            module = importlib.import_module('{}.{}'.format(relPath, pyFile[:-3]))
            # We have to test whether there actually is such a class defined in the module.
            # This was extracted from [1].
            try:
                moduleInstance = module.GenericModule()
                # You can do whatever you want here: save moduleInstance in a list and
                # call the method later, or store its return value.
                moduleInstance.generic_function(os.path.basename(__file__))
            except AttributeError:
                # NOTE: This fires on ANY AttributeError, including ones caused by a typo,
                # so you should print or raise something here for diagnosis.
                print('WARN:', pyFile, "doesn't have a GenericModule class or there was a typo in its content")
References:
[1] Check for class existence
[2] Import module dynamically
[3] Method Overriding in Python

Is there any way to restrict a method in a module from being imported in Python?

I have a Python directory like below. In test.py I have methods a, b, and c.
When importing test, I don't want the user to be able to import the c method. One way is making the c method private. Is there any way to achieve that in __init__.py or using __import__?
test/
    __init__.py
    test.py
I have seen a few solutions on Stack Overflow, but I am not finding a way to achieve this.
Thanks.
If you're looking for absolute private methods, then Python is the wrong language for you; go to Java, C++, C#, or some other language that supports the distinction between public and private. In Python, if it is known that something exists, then it is generally possible to access that thing no matter how hidden it is.
If you're simply trying to limit the convenient options of the user when they import your module, then you can simply selectively include or exclude methods in __init__.py. Say that you have
test.py
def a():
    pass

def b():
    pass

def c():
    pass
and you wanted a and b to be accessible to the user but c not to be, then you could do
__init__.py
from .test import a, b
and export the folder as a module. Now, when a user does
import test
they get access to only the stuff that's in the namespace by the end of __init__.py (that is, they can get test.a and test.b, but test.c doesn't exist). Since you never included c in __init__.py, it doesn't appear there.
Note that c would still be accessible by doing
from test.test import c
which accesses the sourcefile directly.
Alternatively, you can designate on a per-file basis which names should be immediately accessible by using the special variable __all__. The following will have the same effect as the above code:
test.py
...
__all__ = ['a', 'b'] # note that 'c' is excluded
__init__.py
from .test import *  # imports a and b, but not c
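From the importing side, the effect looks like this (a sketch, assuming the package directory is named test):
import test

test.a()   # works
test.b()   # works
test.c()   # AttributeError: module 'test' has no attribute 'c'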

Dynamically add a function to a python module (instead of a class)?

I'm aware I can add it to a particular class or its instance using the setattr method. But in my case I want to dynamically "add" it to say a utilities.py module in a way that every file that does 'import utilities' sees this new function.
Let's call your "new function" new_func(). I'm not quite sure which of the following you mean:
import utilities will make utilities.new_func() available
OR
import utilities will make new_func() available without the utilities prefix.
If it's the former:
Just put the function inside the module somewhere in that module's top-level namespace. In other words, def new_func() should not be indented at all.
If it's the latter:
You are out of luck; you'll need to change import utilities to from utilities import * which is not recommended and would be just as much work as explicitly importing new_func().
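If you really do want to attach the function at runtime rather than editing utilities.py, note that modules are ordinary objects cached in sys.modules, so setting an attribute on the imported module is visible to every other importer. A rough sketch (utilities is the module from the question; new_func is hypothetical):
import utilities

def new_func():
    return 'hello from new_func'

utilities.new_func = new_func   # equivalent to setattr(utilities, 'new_func', new_func)
# Every file that does `import utilities` gets the same module object back from
# sys.modules, so it will see utilities.new_func() as well.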

Python - Unexpected Import Occurring

I'm hoping someone can provide some insight on some extra name bindings that Python3 is creating during an import. Here's the test case:
I created a test package called spam (original, I know). It contains three files, __init__.py, foo.py and bar.py, with the following contents:
__init__.py:
from .foo import Foo
from .bar import Bar
foo.py:
def Foo():
pass
bar.py:
def Bar():
pass
Pretty simple stuff. When I import the spam package, I can see that it creates name bindings to the Foo() and Bar() functions in the spam namespace, which is expected. What isn't expected is that it also binds names for the foo and bar modules in the spam namespace.
What's even more interesting is that these extra name bindings to the modules don't occur if I import the Foo() and Bar() functions directly in __main__.
Reading through the documentation on the import statement (language ref and tutorial), I don't see anything that would cause this to be.
Can anyone shed some light on why, when importing a function from a module inside a package, it also binds a name to the module containing the function?
Yes, that is correct, and part of Python's import mechanism.
When you import a module a lot of things happen, but we can focus on a few:
1) Python checks if the module is already loaded; that is, it checks whether its qualified name (the dotted name) is in sys.modules.
2) If not, it actually loads the module: that includes checking for pre-compiled cached bytecode files, otherwise parsing and compiling the .py file, etc.
3) It makes the name bindings as stated in the import command: that is, "from .foo import Foo" creates a variable "Foo" in the current namespace that points to the "spam.foo.Foo" object.
Note that the module is always loaded as a whole and registered in the sys.modules dictionary. Besides that, the import process binds every sub-module of a package it loads as an attribute of that package; that is what causes the names "foo" and "bar" to be visible in your spam package.
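You can see both kinds of bindings directly (a sketch using the spam package from the question):
import sys
import spam

print(spam.Foo)                              # bound explicitly by __init__.py
print(spam.foo)                              # bound implicitly by the import machinery
print(sys.modules['spam.foo'] is spam.foo)   # True: they are the same module object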
You could, at the end of your __init__.py file, delete the names "foo" and "bar", but that will break the expected import and usage of spam.foo in fundamental ways; basically, sys.modules["spam.foo"] will exist, but sys.modules["spam"].foo won't, meaning that if one then tries to do:
import spam.foo
spam.foo.Foo()
Python will raise an attribute error on "foo". The import machinery sees the module as already loaded (it is in sys.modules), so it does nothing, but the "foo" attribute has been removed from "spam", so it can't be reached.
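A sketch of that failure mode (deleting the implicit bindings in __init__.py, which is not recommended):
# spam/__init__.py
from .foo import Foo
from .bar import Bar
del foo, bar   # drop the implicit submodule bindings

# some consumer script:
import sys
import spam
print('spam.foo' in sys.modules)   # True: the submodule was loaded while running __init__.py
print(hasattr(spam, 'foo'))        # False: the attribute is gone, so spam.foo.Foo() fails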

Is there a way to give parts of the local namespace to an importee?

a.py:
import b
import c
...
import z

class Foo(object):
    ...

Each of those modules b-z needs to use the class Foo.
Is there some way, like importing, which allows indirect access (e.g. via an object) to all values of all modules a-z, while still allowing each module b-z access to a's namespace (e.g. Foo)?
No. They must each in turn import A themselves.
I still cannot tell what you are trying to do or even asking, but this is my best guess:
Normally, just use classic imports.
IF a module is growing too large, or if you have an extremely good reason to split things up but desire to share the same namespace, you can "hoist" values into a dummy namespace. For example, if I had widget.Foo and widget.Bar and wanted them in different files, but I wanted to be able to type Foo and Bar in each file, I would normally have to from widget import Foo and from widget import Bar. If you have MANY of these files (foo.py, bar.py, baz.py, ..., zeta.py) it can get a bit unwieldy. Thus you can improve your situation by importing them only once, in widget/__init__.py (from foo import *, from bar import *, ...), and then doing from widget import * just once in each module. And you're done!... well... almost...
This gets you into a circular import scenario, which you have to be extremely careful of (see: Circular (or cyclic) imports in Python). It will be fine, for example, if you reference Bar inside a function in foo.py, because you don't use the value immediately. However, if you do x = Bar at the top level of foo.py, then the value may not have been defined yet!
sidenote: You can programmatically import using the __import__ function. If you couple this with os.walk then you can avoid having to type from ... import * for each file in your widget folder, as sketched below. This is a critical and necessary step to avoid bugs down the line.
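A rough sketch of that programmatic variant, assuming a package directory widget/ whose submodules sit next to __init__.py (illustration only, not a drop-in implementation):
# widget/__init__.py
import os

_pkg_dir = os.path.dirname(__file__)
for _fname in sorted(os.listdir(_pkg_dir)):
    if _fname.endswith('.py') and not _fname.startswith('__'):
        _mod = __import__('{}.{}'.format(__name__, _fname[:-3]), fromlist=['*'])
        # hoist every public name from the submodule into the package namespace
        for _name in dir(_mod):
            if not _name.startswith('_'):
                globals()[_name] = getattr(_mod, _name)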
