Suppose I have a module package containing the following files. An empty file C:\codes\package\__init__.py and some non-trivial files:
One located in C:\codes\package\first.py
def f():
    print 'a'
Another located in C:\codes\package\second.py
def f():
    print 'b'
There is also a third file: C:\codes\package\general.py with the following code
def myPrint(module_name):
    module = __import__(module_name)
    module.f()

if __name__ == '__main__':
    myPrint('first')
    myPrint('second')
When I run the latter file, everything goes fine. However, if I try to execute the file C:\codes\test.py containing
if __name__ == '__main__':
    from package import general
    general.myPrint('first')
    general.myPrint('second')
I get the import error ImportError: No module named first. How can I resolve this issue?
First, I suspect you forgot to mention that you have a (possibly empty) file package\__init__.py, which makes package a package. Otherwise, from package import general wouldn't work.
The second case differs from the first insofar as you are inside a package. From inside a package, you wouldn't write import first, but from . import first. The __import__ equivalent of that relative import is described here: you either add level=1 as a parameter, or (but I am not sure about that) you put .first into the string and set level to -1 (if that isn't the default anyway; the documentation isn't clear about this).
Additionally, you have to provide at least globals(), so the right line is
module = __import__(module_name, globals(), level=1)
I have found this solution here.
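Putting it together, general.py could look roughly like this (a sketch; note that with level=1 the relative lookup only works when general is itself imported as part of the package, e.g. via from package import general, not when general.py is run directly as a script):

def myPrint(module_name):
    # relative import: resolve module_name against this module's package
    module = __import__(module_name, globals(), level=1)
    module.f()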
In your case, you should import your module_name from package. Use the fromlist argument:
getattr(__import__("package", fromlist=[module_name]), module_name)
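For example, the question's myPrint could be adapted along these lines (a sketch; it assumes C:\codes is on sys.path so that package itself is importable):

def myPrint(module_name):
    # import package.<module_name> and fetch the submodule from the package object
    module = getattr(__import__("package", fromlist=[module_name]), module_name)
    module.f()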
Assuming you're using Python 3, that's simply because this version dropped support for implicit relative imports. With Python 2 it would work just fine.
So you'd either need to use relative imports in C:\codes\package\general.py, which would break running that file directly as a script, or add your package directory to the path. A slightly dirty but working hack would be:
import os
import sys

def myPrint(module_name):
    pkg = os.path.dirname(__file__)
    sys.path.insert(0, pkg)
    try:
        module = __import__(module_name)
    finally:
        sys.path.remove(pkg)
    module.f()
Maybe you can achieve a cleaner implementation with the importlib module.
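For instance, a sketch using importlib.import_module with an explicit relative import (again assuming general is imported as part of the package, e.g. via from package import general, so that __package__ is set):

import importlib

def myPrint(module_name):
    # '.first' / '.second', resolved relative to this module's package
    module = importlib.import_module('.' + module_name, package=__package__)
    module.f()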
Related
Suppose I have a module file like this:
# my_module.py
print("hello")
Then I have a simple script:
# my_script.py
import my_module
This will print "hello".
Let's say I want to "override" the print() function so that the module prints "world" instead. How could I do this programmatically (without manually modifying my_module.py)?
What I thought is that I need to somehow modify the source code of my_module before or while importing it. Obviously, I cannot do this after importing it, so solutions using unittest.mock are impossible.
I also thought I could read the file my_module.py, perform modification, then load it. But this is ugly, as it will not work if the module is located somewhere else.
The good solution, I think, is to make use of importlib.
I read the docs and found a very interesting method: get_source(fullname). I thought I could just override it:
def get_source(fullname):
    source = super().get_source(fullname)
    source = source.replace("hello", "world")
    return source
Unfortunately, I am a bit lost with all these abstract classes and I do not know how to perform this properly.
I tried vainly:
spec = importlib.util.find_spec("my_module")
spec.loader.get_source = mocked_get_source
module = importlib.util.module_from_spec(spec)
Any help would be welcome, please.
Here's a solution based on the content of this great talk. It allows any arbitrary modifications to be made to the source before importing the specified module. It should be reasonably correct as long as the slides did not omit anything important. This will only work on Python 3.5+.
import importlib.util
import sys

def modify_and_import(module_name, package, modification_func):
    spec = importlib.util.find_spec(module_name, package)
    source = spec.loader.get_source(module_name)
    new_source = modification_func(source)
    module = importlib.util.module_from_spec(spec)
    codeobj = compile(new_source, module.__spec__.origin, 'exec')
    exec(codeobj, module.__dict__)
    sys.modules[module_name] = module
    return module
So, using this you can do
my_module = modify_and_import("my_module", None, lambda src: src.replace("hello", "world"))
This doesn't answer the general question of dynamically modifying the source code of an imported module, but "overriding" or "monkey-patching" its use of the print() function is possible (since print is a built-in function in Python 3.x). Here's how:
#!/usr/bin/env python3
# my_script.py
import builtins

_print = builtins.print

def my_print(*args, **kwargs):
    _print('In my_print: ', end='')
    return _print(*args, **kwargs)

builtins.print = my_print

import my_module  # -> In my_print: hello
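If the patch should only affect the import itself, you can restore the original function afterwards, e.g.:

builtins.print = _print  # undo the monkey-patch once my_module has been imported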
I first needed to better understand the import operation. Fortunately, this is well explained in the importlib documentation, and digging through the source code helped too.
This import process is actually split into two parts. First, a finder is in charge of parsing the module name (including dot-separated packages) and instantiating an appropriate loader; built-in modules, for example, are not imported the same way as local modules. Then the loader is called based on what the finder returned. This loader gets the source from a file or from a cache, and executes the code if the module was not previously loaded.
This is very simple. It explains why I actually did not need to use the abstract classes from importlib.abc: I do not want to provide my own import process. Instead, I could create a subclass of one of the classes from importlib.machinery and override get_source() from SourceFileLoader, for example. However, this is not the way to go, because the loader is instantiated by the finder, so I have no control over which class is used. I cannot specify that my subclass should be used.
So the best solution is to let the finder do its job, and then replace the get_source() method of whatever loader has been instantiated.
Unfortunately, by looking through the source code I saw that the basic loaders do not use get_source() (it is only used by the inspect module). So my whole idea could not work.
In the end, I guess get_source() should be called manually, then the returned source should be modified, and finally the code should be executed. This is what Martin Valgur detailed in his answer.
If compatibility with Python 2 is needed, I see no other way than reading the source file:
import imp
import sys
import types

module_name = "my_module"
file, pathname, description = imp.find_module(module_name)
if file is not None:
    file.close()  # find_module also returns an open file handle; we read the file ourselves
with open(pathname) as f:
    source = f.read()
source = source.replace('hello', 'world')
module = types.ModuleType(module_name)
exec(source, module.__dict__)
sys.modules[module_name] = module
If importing the module before patching it is okay, then a possible solution would be
import inspect
import my_module
source = inspect.getsource(my_module)
new_source = source.replace('"hello"', '"world"')
exec(new_source, my_module.__dict__)
If you're after a more general solution, then you can also take a look at the approach I used in another answer a while ago.
My solution updates the source file itself, which works for the inner-import situation, i.e. when the source file does from transformers.models.albert import modeling_albert. In such a case, even the solution from Martin Valgur won't work, so I update the source file instead. Hope it helps people who have the same trouble as me.
import inspect
from transformers.models.albert import modeling_albert
# Get source
source = inspect.getsource(modeling_albert)
source_before = "AlbertModel(config, add_pooling_layer=False)"
source_after = "AlbertModel(config, add_pooling_layer=True)"
new_source = source.replace(source_before, source_after)
# Update file
file_path = modeling_albert.__spec__.origin
with open(file_path, 'w') as f:
    f.write(new_source)
Not elegant, but works for me (may have to add a path):
with open('my_module.py') as aFile:
    exec(aFile.read().replace(<something>, <something else>))
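A variation that locates the file through the import machinery instead of assuming it sits in the current directory might look like this (a sketch; it assumes my_module is findable on sys.path):

import importlib.util

spec = importlib.util.find_spec('my_module')   # locate my_module.py via the import system
with open(spec.origin) as aFile:               # spec.origin is the path to the source file
    exec(aFile.read().replace('"hello"', '"world"'))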
Question
The standard library clearly documents how to import source files directly (given the absolute file path to the source file), but this approach does not work if that source file uses implicit sibling imports as described in the example below.
How could that example be adapted to work in the presence of implicit sibling imports?
I already checked out this and this other Stack Overflow question on the topic, but they do not address implicit sibling imports within the file being imported by hand.
Setup/Example
Here's an illustrative example
Directory structure:
root/
  - directory/
    - app.py
  - folder/
    - implicit_sibling_import.py
    - lib.py
app.py:
import os
import importlib.util

# construct absolute paths
root = os.path.abspath(os.path.dirname(os.path.dirname(os.path.realpath(__file__))))
isi_path = os.path.join(root, 'folder', 'implicit_sibling_import.py')

def path_import(absolute_path):
    '''implementation taken from https://docs.python.org/3/library/importlib.html#importing-a-source-file-directly'''
    spec = importlib.util.spec_from_file_location(absolute_path, absolute_path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module

isi = path_import(isi_path)
print(isi.hello_wrapper())
lib.py:
def hello():
    return 'world'
implicit_sibling_import.py:
import lib  # this is the implicit sibling import. grabs root/folder/lib.py

def hello_wrapper():
    return "ISI says: " + lib.hello()

# if __name__ == '__main__':
#     print(hello_wrapper())
Running python folder/implicit_sibling_import.py with the if __name__ == '__main__': block uncommented yields ISI says: world in Python 3.6.
But running python directory/app.py yields:
Traceback (most recent call last):
File "directory/app.py", line 10, in <module>
spec.loader.exec_module(module)
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
File "/Users/pedro/test/folder/implicit_sibling_import.py", line 1, in <module>
import lib
ModuleNotFoundError: No module named 'lib'
Workaround
If I add import sys; sys.path.insert(0, os.path.dirname(isi_path)) to app.py, then python directory/app.py yields ISI says: world as intended, but I would like to avoid munging sys.path if possible.
Answer requirements
I'd like python app.py to print ISI says: world and I'd like to accomplish this by modifying the path_import function.
I'm not sure of the implications of mangling sys.path. Eg. if there was directory/requests.py and I added the path to directory to the sys.path, I wouldn't want import requests to start importing directory/requests.py instead of importing the requests library that I installed with pip install requests.
The solution MUST be implemented as a python function that accepts the absolute file path to the desired module and returns the module object.
Ideally, the solution should not introduce side-effects (eg. if it does modify sys.path, it should return sys.path to its original state). If the solution does introduce side-effects, it should explain why a solution cannot be achieved without introducing side-effects.
PYTHONPATH
If I have multiple projects doing this, I don't want to have to remember to set PYTHONPATH every time I switch between them. The user should just be able to pip install my project and run it without any additional setup.
-m
The -m flag is the recommended/pythonic approach, but the standard library also clearly documents How to import source files directly. I'd like to know how I can adapt that approach to cope with implicit relative imports. Clearly, Python's internals must do this, so how do the internals differ from the "import source files directly" documentation?
The easiest solution I could come up with is to temporarily modify sys.path in the function doing the import:
from contextlib import contextmanager

@contextmanager
def add_to_path(p):
    import sys
    old_path = sys.path
    sys.path = sys.path[:]
    sys.path.insert(0, p)
    try:
        yield
    finally:
        sys.path = old_path

def path_import(absolute_path):
    '''implementation taken from https://docs.python.org/3/library/importlib.html#importing-a-source-file-directly'''
    with add_to_path(os.path.dirname(absolute_path)):
        spec = importlib.util.spec_from_file_location(absolute_path, absolute_path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        return module
This should not cause any problems unless you do imports in another thread concurrently. Otherwise, since sys.path is restored to its previous state, there should be no unwanted side effects.
Edit:
I realize that my answer is somewhat unsatisfactory, but digging into the code reveals that the line spec.loader.exec_module(module) basically results in exec(spec.loader.get_code(module.__name__), module.__dict__) being called. Here spec.loader.get_code(module.__name__) is simply the code contained in implicit_sibling_import.py.
Thus a better answer to the question would have to find a way to make the import statement behave differently by injecting one or more global variables through the second argument of that exec call. However, "whatever you do to make the import machinery look in that file's folder, it'll have to linger beyond the duration of the initial import, since functions from that file might perform further imports when you call them", as stated by @user2357112 in the question comments.
Unfortunately, the only way to change the behavior of the import statement seems to be to change sys.path or, inside a package, __path__. module.__dict__ already contains __path__, so that doesn't seem to work, which leaves sys.path (or trying to figure out why exec does not treat the code as a package even though it has __path__ and __package__; but I don't know where to start with that. Maybe it has something to do with there being no __init__.py file).
Furthermore this issue does not seem to be specific to importlib but rather a general problem with sibling imports.
Edit2: If you don't want the module to end up in sys.modules the following should work (Note that any modules added to sys.modules during the import are removed):
from contextlib import contextmanager

@contextmanager
def add_to_path(p):
    import sys
    old_path = sys.path
    old_modules = sys.modules
    sys.modules = old_modules.copy()
    sys.path = sys.path[:]
    sys.path.insert(0, p)
    try:
        yield
    finally:
        sys.path = old_path
        sys.modules = old_modules
Add the path your application is on to the PYTHONPATH environment variable:
Augment the default search path for module files. The format is the same as the shell’s PATH: one or more directory pathnames
separated by os.pathsep (e.g. colons on Unix or semicolons on
Windows). Non-existent directories are silently ignored.
On bash it's like this:
export PYTHONPATH="./folder/:${PYTHONPATH}"
or run directly:
PYTHONPATH="./folder/:${PYTHONPATH}" python directory/app.py
The OP's idea is great. This works for this example by adding the sibling modules, under their proper names, to sys.modules; I would say it is the SAME as adding to PYTHONPATH. Tested and working with version 3.5.1.
import os
import sys
import importlib.util

class PathImport(object):

    def get_module_name(self, absolute_path):
        module_name = os.path.basename(absolute_path)
        module_name = module_name.replace('.py', '')
        return module_name

    def add_sibling_modules(self, sibling_dirname):
        for current, subdir, files in os.walk(sibling_dirname):
            for file_py in files:
                if not file_py.endswith('.py'):
                    continue
                if file_py == '__init__.py':
                    continue
                python_file = os.path.join(current, file_py)
                (module, spec) = self.path_import(python_file)
                sys.modules[spec.name] = module

    def path_import(self, absolute_path):
        module_name = self.get_module_name(absolute_path)
        spec = importlib.util.spec_from_file_location(module_name, absolute_path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        return (module, spec)

def main():
    pathImport = PathImport()
    root = os.path.abspath(os.path.dirname(os.path.dirname(os.path.realpath(__file__))))
    isi_path = os.path.join(root, 'folder', 'implicit_sibling_import.py')
    sibling_dirname = os.path.dirname(isi_path)
    pathImport.add_sibling_modules(sibling_dirname)
    (isi, spec) = pathImport.path_import(isi_path)
    print(isi.hello_wrapper())

if __name__ == '__main__':
    main()
Make sure your root is in a folder that is explicitly searched in the PYTHONPATH. Use an absolute import:
from root.folder import implicit_sibling_import #called from app.py
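As a sketch, assuming root's parent directory is on PYTHONPATH and both root/ and root/folder/ are packages (each containing an __init__.py, or relying on namespace packages), app.py becomes:

# app.py
from root.folder import implicit_sibling_import as isi

print(isi.hello_wrapper())

# implicit_sibling_import.py then needs an absolute (or explicit relative) import as well:
# from root.folder import lib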
To illustrate what I am trying to do, let's say I have a module testmod that lives in ./testmod.py. The entire contents of this module is
x = test
I would like to be able to successfully import this module into Python, using any of the tools available in importlib or any other built in library.
Obviously doing a simple import testmod statement from the current directory results in an error: NameError: name 'test' is not defined.
I thought that maybe passing either globals or locals to __import__ correctly would modify the environment inside the script being run, but it does not:
>>> testmod = __import__('testmod', globals={'test': 'globals'}, locals={'test': 'locals'})
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/jfoxrabi/testmod.py", line 1, in <module>
x = test
NameError: name 'test' is not defined
I was setting the value of test differently so I could see which dict testmod.x came from if this worked.
Since neither of these seems to work, I am stuck. Is it even possible to accomplish what I am trying to do? I would guess that yes, since this is Python, not Sparta.
I am using Python 3.5 on Anaconda. I would very much prefer not to use external libraries.
Update: The Why
I am importing a module into my program as a configuration file. The reason that I am not using JSON or INI is that I would like to have the full scope of Python's interpreter available to compute the values in the config from expressions. I would like to have certain values that I compute before-hand in the program available to do those calculations.
While I am aware of the fact that this is about as bad as calling eval (I do that too in my program), I am not concerned with the security aspect for the time being. I am, however, quite willing to entertain better solutions should this indeed turn out to be a case of XY.
I came up with a solution based on this answer and the importlib docs. Basically, I have access to the module object before it is loaded by using the correct sequence of calls to importlib:
from importlib.util import spec_from_file_location, module_from_spec
from os.path import splitext, basename

def loadConfig(fileName):
    test = 'This is a test'
    name = splitext(basename(fileName))[0]
    spec = spec_from_file_location(name, fileName)
    config = module_from_spec(spec)
    config.test = test
    spec.loader.exec_module(config)
    return config

testmod = loadConfig('./testmod.py')
This is a bit better than modifying builtins, which may have unintended consequences in other parts of the program, and may also restrict the names I can pass in to the module.
I decided to put all the configuration items into a single field accessible at load time, which I named config. This allows me to do the following in testmod:
if 'test' in config:
    x = config['test']
The loader now looks like this:
from importlib.util import spec_from_file_location, module_from_spec
from os.path import splitext, basename

def loadConfig(fileName, **kwargs):
    name = splitext(basename(fileName))[0]
    spec = spec_from_file_location(name, fileName)
    config = module_from_spec(spec)
    config.config = kwargs
    spec.loader.exec_module(config)
    return config

testmod = loadConfig('./testmod.py', test='This is a test')
After finding myself using this a bunch of times, I finally ended up adding this functionality to the utility library I maintain, haggis. haggis.load.load_module loads a text file as a module with injection, while haggis.load.module_as_dict does a more advanced version of the same that loads it as a potentially nested configuration file into a dict.
You could screw with Python's builtins to inject your own fake built-in test variable:
import builtins # __builtin__, no s, in Python 2
builtins.test = 5 # or whatever other placeholder value
import testmod
del builtins.test # clean up after ourselves
I need to make a copy of a socket module to be able to use it and to have one more socket module monkey-patched and use it differently.
Is this possible?
I mean to really copy a module, namely to get the same result at runtime as if I've copied socketmodule.c, changed the initsocket() function to initmy_socket(), and installed it as my_socket extension.
You can always do tricks like importing a module then deleting it from sys.modules or trying to copy a module. However, Python already provides what you want in its Standard Library.
import imp # Standard module to do such things you want to.
# We can import any module including standard ones:
os1=imp.load_module('os1', *imp.find_module('os'))
# Here is another one:
os2=imp.load_module('os2', *imp.find_module('os'))
# This returns True:
id(os1)!=id(os2)
Python3.3+
imp.load_module is deprecated in Python 3.3+; its documentation recommends using importlib instead:
#!/usr/bin/env python3
import sys
import importlib.util
SPEC_OS = importlib.util.find_spec('os')
os1 = importlib.util.module_from_spec(SPEC_OS)
SPEC_OS.loader.exec_module(os1)
sys.modules['os1'] = os1
os2 = importlib.util.module_from_spec(SPEC_OS)
SPEC_OS.loader.exec_module(os2)
sys.modules['os2'] = os2
del SPEC_OS
assert os1 is not os2, \
    "Module `os` instancing failed"
Here, we import the same module twice, but as two completely different module objects. If you check sys.modules, you can see the two names you passed as the first parameters to the load_module calls. Take a look at the documentation for details.
UPDATE:
To make the main difference of this approach obvious, I want to make this clearer: when you import the same module this way, you will have both versions globally accessible to every other module you import at runtime, which is exactly what the questioner needs as I understood it.
Below is another example to emphasize this point.
These two statements do exactly the same thing:
import my_socket_module as socket_imported
socket_imported = imp.load_module('my_socket_module',
                                  *imp.find_module('my_socket_module'))
On the second line, we repeat the string 'my_socket_module' twice; that is how the import statement works, but the two strings are in fact used for two different purposes.
The second occurrence, which we pass to find_module, is used as the file name to be found on the system. The first occurrence, which we pass to the load_module method, is used as the system-wide identifier of the loaded module.
So we can use different names for these, which means we can make it work exactly as if we had copied the Python source file of the module and loaded it.
socket = imp.load_module('socket_original', *imp.find_module('my_socket_module'))
socket_monkey = imp.load_module('socket_patched',*imp.find_module('my_socket_module'))
def alternative_implementation(blah1, blah2):
    return 'Happiness'

socket_monkey.original_function = alternative_implementation
import my_sub_module
Then in my_sub_module, I can import 'socket_patched', which does not exist on the file system! Here we are in my_sub_module.py:
import socket_patched
socket_patched.original_function('foo', 'bar')
# This call brings us 'Happiness'
This is pretty disgusting, but this might suffice:
import sys
# if socket was already imported, get rid of it and save a copy
save = sys.modules.pop('socket', None)
# import socket again (it's not in sys.modules, so it will be reimported)
import socket as mysock
if save is None:
# if we didn't have a saved copy, remove my version of 'socket'
del sys.modules['socket']
else:
# if we did have a saved copy overwrite my socket with the original
sys.modules['socket'] = save
Here's some code that creates a new module with the functions and variables of the old:
def copymodule(old):
    new = type(old)(old.__name__, old.__doc__)
    new.__dict__.update(old.__dict__)
    return new
Note that this does a fairly shallow copy of the module. The dictionary is newly created, so basic monkey patching will work, but any mutables in the original module will be shared between the two.
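To illustrate the shared-mutable caveat, here is a small sketch (mymod and its module-level list cache are hypothetical):

import mymod                    # hypothetical module containing: cache = []

mymod_copy = copymodule(mymod)
mymod_copy.cache.append('x')    # the change is also visible as mymod.cache,
                                # because both modules share the same list object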
Edit: According to the comment, a deep copy is needed. I tried messing around with monkey-patching the copy module to support deep copies of modules, but that didn't work. Next I tried importing the module twice, but since modules are cached in sys.modules, that gave me the same module twice. Finally, the solution I hit upon was removing the module from sys.modules after importing it the first time, then importing it again.
from imp import find_module, load_module
from sys import modules

def loadtwice(name, path=None):
    """Import two copies of a module.

    The name and path arguments are as for `find_module` in the `imp` module.

    Note that future imports of the module will return the same object as
    the second of the two returned by this function.
    """
    startingmods = modules.copy()
    foundmod = find_module(name, path)
    mod1 = load_module(name, *foundmod)
    newmods = set(modules) - set(startingmods)
    for m in newmods:
        del modules[m]
    mod2 = load_module(name, *foundmod)
    return mod1, mod2
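For example, applied to the socket module from the question (a sketch):

socket_original, socket_patched = loadtwice('socket')
assert socket_original is not socket_patched      # two independent module objects

# monkey-patch only one of the copies (placeholder implementation)
socket_patched.create_connection = lambda *args, **kwargs: None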
Physically copy the socket module to socket_monkey and go from there? I don't feel you need any "clever" work-around... but I might well be over simplifying!
I have a file called foobar (without .py extension). In the same directory I have another python file that tries to import it:
import foobar
But this only works if I rename the file to foobar.py. Is it possible to import a python module that doesn't have the .py extension?
Update: the file has no extension because I also use it as a standalone script, and I don't want to type the .py extension to run it.
Update2: I will go for the symlink solution mentioned below.
You can use the imp.load_source function (from the imp module), to load a module dynamically from a given file-system path.
import imp
foobar = imp.load_source('foobar', '/path/to/foobar')
This SO discussion also shows some interesting options.
Here is a solution for Python 3.4+:
from importlib.util import spec_from_loader, module_from_spec
from importlib.machinery import SourceFileLoader
spec = spec_from_loader("foobar", SourceFileLoader("foobar", "/path/to/foobar"))
foobar = module_from_spec(spec)
spec.loader.exec_module(foobar)
Using spec_from_loader and explicitly specifying a SourceFileLoader will force the machinery to load the file as source, without trying to figure out the type of the file from the extension. This means that you can load the file even though it is not listed in importlib.machinery.SOURCE_SUFFIXES.
If you want to keep importing the file by name after the first load, add the module to sys.modules:
sys.modules['foobar'] = foobar
You can find an implementation of this function in a utility library I maintain called haggis. haggis.load.load_module has options for adding the module to sys.modules, setting a custom name, and injecting variables into the namespace for the code to use.
Like others have mentioned, you could use imp.load_source, but it will make your code more difficult to read. I would really only recommend it if you need to import modules whose names or paths aren't known until run-time.
What is your reason for not wanting to use the .py extension? The most common case for not wanting to use the .py extension, is because the python script is also run as an executable, but you still want other modules to be able to import it. If this is the case, it might be beneficial to move functionality into a .py file with a similar name, and then use foobar as a wrapper.
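A sketch of that wrapper approach, using hypothetical names: keep the real code in a normal module (say foobar_impl.py) and make the extensionless foobar a thin executable wrapper:

#!/usr/bin/env python
# foobar (no extension) -- executable wrapper around the importable module
from foobar_impl import main

if __name__ == '__main__':
    main()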
imp.load_source(module_name, path) should do it, or you can take the more verbose imp.load_module(module_name, file_handle, ...) route if you have a file handle instead.
importlib helper function
Here is a convenient, ready-to-use helper to replace imp, with an example, based on what was mentioned at: https://stackoverflow.com/a/43602645/895245
main.py
#!/usr/bin/env python3
import os
import importlib.machinery
import importlib.util
import sys

def import_path(path):
    module_name = os.path.basename(path).replace('-', '_')
    spec = importlib.util.spec_from_loader(
        module_name,
        importlib.machinery.SourceFileLoader(module_name, path)
    )
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    sys.modules[module_name] = module
    return module
notmain = import_path('not-main')
print(notmain)
print(notmain.x)
not-main
x = 1
Run:
python3 main.py
Output:
<module 'not_main' from 'not-main'>
1
I replace - with _ because my importable Python executables without extension have hyphens. This is not mandatory, but produces better module names.
This pattern is also mentioned in the docs at: https://docs.python.org/3.7/library/importlib.html#importing-a-source-file-directly
I ended up moving to it because after updating to Python 3.7, import imp prints:
DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
and I don't know how to turn that off, this was asked at:
The imp module is deprecated
How to ignore deprecation warnings in Python
Tested in Python 3.7.3.
If you install the script with a package manager (deb or the like), another option would be to use setuptools:
"...there’s no easy way to have a script’s filename match local conventions on both Windows and POSIX platforms. For another, you often have to create a separate file just for the “main” script, when your actual “main” is a function in a module somewhere... setuptools fixes all of these problems by automatically generating scripts for you with the correct extension, and on Windows it will even create an .exe file..."
https://pythonhosted.org/setuptools/setuptools.html#automatic-script-creation
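A minimal sketch of that approach, assuming the real code lives in a module foobar_impl with a main() function, would be a console_scripts entry point in setup.py:

from setuptools import setup

setup(
    name='foobar',
    py_modules=['foobar_impl'],
    entry_points={
        # generates a 'foobar' command (foobar.exe on Windows) that calls foobar_impl.main()
        'console_scripts': ['foobar = foobar_impl:main'],
    },
)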
The imp module has been deprecated.
The following is clean and minimal for me:
import sys
import types
import typing
import pathlib

def importFileAs(
        modAsName: str,
        importedFilePath: typing.Union[str, pathlib.Path],
) -> types.ModuleType:
    """Import importedFilePath as modAsName and return the imported module,
    by loading importedFilePath and registering modAsName in sys.modules.

    importedFilePath can be any file and does not have to be a .py file.
    modAsName should be a valid Python identifier.
    Raises ImportError if the file cannot be imported, as well as any
    exception occurring during loading.

    Refs:
    Similar to https://stackoverflow.com/questions/19009932/import-arbitrary-python-source-file-python-3-3
    but allows for files other than .py as well, through importlib.machinery.SourceFileLoader.
    """
    import importlib.util
    import importlib.machinery

    # spec_from_loader does not enforce .py, but importlib.util.spec_from_file_location() does.
    spec = importlib.util.spec_from_loader(
        modAsName,
        importlib.machinery.SourceFileLoader(modAsName, importedFilePath),
    )
    if spec is None:
        raise ImportError(f"Could not load spec for module '{modAsName}' at: {importedFilePath}")

    module = importlib.util.module_from_spec(spec)
    try:
        spec.loader.exec_module(module)
    except FileNotFoundError as e:
        raise ImportError(f"{e.strerror}: {importedFilePath}") from e

    sys.modules[modAsName] = module
    return module
And then I would use it as so:
aasMarmeeManage = importFileAs('aasMarmeeManage', '/bisos/bpip/bin/aasMarmeeManage.cs')
def g_extraParams(): aasMarmeeManage.g_extraParams()