I've searched the web and this site and can't find an answer to this problem. I'm sure it's right in front of me somewhere, but I can't find it.
I need to be able to import a module based on a string, then execute a function within that module while passing arguments.
I can import based on the string and then execute using eval(), but I know this is not the best way to handle this. I also can't seem to pass arguments that way.
My current module that would be set based on a string is named TestAction.py and lives in a folder called Tasks.
This is the content of TestAction.py:
def doSomething(var):
    print var
This is the code I am executing to import TestAction and execute.
module = "Tasks.TestAction"
import Tasks
mymod = __import__(module)
eval(module + ".doSomething()")
How can I make this code #1 not use eval() and #2 pass the var argument to doSomething()?
Thanks in advance!
Thanks everyone for the help. It looks like importlib combined with getattr was what I needed. For future reference, here is the exact code that is working for me.
module = "FarmTasks.TestAction"
mymod = importlib.import_module(module)
ds = getattr(mymod, "doSomething")
ds("stuff")
Is the function name also variable? If not, just use your imported module:
mymod.doSomething('the var argument')
If it is, use getattr:
fun = 'doSomething'
getattr(mymod, fun)('the var argument')
According to the documentation, it is better to use importlib.import_module() to programmatically import a module.
Using this you can retrieve your module like this:
import importlib
TestAction = importlib.import_module(".TestAction", package="Tasks")  # a package-relative name needs the leading dot
After that you can simply call functions normally or by name:
TestAction.some_known_function(arg1, arg2)
getattr(TestAction, "some_other_function_name")(arg1, arg2)
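For reference, a minimal sketch of the equivalent call with an absolute module path (assuming the Tasks package is importable from sys.path):

TestAction = importlib.import_module("Tasks.TestAction")
TestAction.doSomething("the var argument")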
I hope this answered your question, feel free to elaborate if you are still missing something.
If you use Python 2.7 or 3.1+, the easiest way to go is to use the importlib module in combination with getattr. Your code would then look like this:
import importlib
module = "Tasks.TestAction"
mymod = importlib.import_module(module)
myfunc = getattr(mymod, "doSomething")
myfunc("the var argument")  # arguments are passed here
I recently wrote a simple function that imports a given function, it seems to work for me:
def import_function(name):
    """Import a function by name: module.function or
    module.submodule.function, etc. Return the function object."""
    mod, f = name.rsplit('.', 1)
    return getattr(__import__(mod, fromlist=[f]), f)
You can use it as:
f = import_function('Tasks.TestAction.doSomething')
f()
or just
import_function('Tasks.TestAction.doSomething')()
Related
I'm trying to store the method from a package in a variable, as it might change and I don't want to update it manually in multiple places in the code.
import hashlib as hashy
foo='hello world'
bar='hello world'
algo='md5'
hfoo=hashy.algo(foo.encode())
hbar=hashy.algo(bar.encode())
In this particular case you can use hashlib.new() to create a hasher by its name.
import hashlib # don't randomly rename standard libraries
ALGORITHM = 'md5'
h = hashlib.new(ALGORITHM)
h.update('hello world'.encode('utf-8'))
print(h.hexdigest())
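If the algorithm name comes from configuration or user input, you may want to validate it first; a small sketch, assuming Python 3:

import hashlib

algo = 'md5'
if algo not in hashlib.algorithms_available:  # names accepted by hashlib.new()
    raise ValueError('unsupported hash algorithm: %s' % algo)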
If you think you might want to change which function in a module you're calling, you can wrap it in your own function, which is the right generic answer to the question you're asking.
import hashlib

def hash(s):
    return hashlib.md5(s.encode('utf-8')).hexdigest()

print(hash('hello world'))
You can use getattr() on a module to retrieve a function by name, but that's not the usual way to do things.
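For completeness, a minimal sketch of that getattr() approach (assuming the name always matches a constructor in hashlib):

import hashlib

algo_name = 'md5'
hasher = getattr(hashlib, algo_name)  # resolves to hashlib.md5
h = hasher('hello world'.encode('utf-8'))
print(h.hexdigest())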
In Python, if you want to dynamically import a module (such as from a string name) you can use the module importlib and the function importlib.import_module("foo"), which essentially gives the same result as import foo (but it's dynamic).
Anyway, in my program, I'm using a function to import a module from a list, so it looks something like this:
# Note: this code does not produce the desired result.
# Please see the snippet below for the working version.
module_list = ["os"]

def import_module(name):
    exec("global {}".format(name))
    exec("import {}".format(name))

for item in module_list:
    import_module(item)
I haven't seen this type of solution anywhere else on the web. What I'm asking is: why? Is it bad practice because I'm using the exec() function (as I've read not to do countless times), or is it because it's simply more confusing?
Edit: I feel it's relevant to note that the code above isn't my exact code, just the part that's actually relevant to this question, so as not to confuse people.
Edit (2): Thanks to user Aran-Fey for discovering that my code doesn't work. I hadn't properly tested this specific snippet. Here's a version that works in Python 3.6:
module_list = ["os"]

def import_module(name):
    exec("global {}".format(name), globals())
    exec("import {}".format(name), globals())

for item in module_list:
    import_module(item)
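For comparison, a minimal sketch of the same loop using importlib instead of exec (assuming the goal is simply to bind each imported module to its name in the global namespace):

import importlib

module_list = ["os"]

for name in module_list:
    globals()[name] = importlib.import_module(name)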
I have a Python module that I want to dynamically import given only a string of the module name. Normally I use importlib or __import__, and this works quite well given that I know which objects I want to import from the module, but is there a way to do the equivalent of import * dynamically? Or is there a better approach?
I know in general it's bad practice to use import *, but the modules I'm trying to import are automatically generated on the fly, and I have no way of knowing the exact module which contains the class I'm addressing.
Thanks.
Use update for dicts:
globals().update(importlib.import_module('some.package').__dict__)
Note that using a_module.__dict__ is not the same as from a_module import *, because all names are "imported", not only those listed in __all__ or those not starting with _.
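If you want to mimic from a_module import * more closely, here is a rough sketch that honours __all__ when the module defines it and otherwise skips underscore-prefixed names:

import importlib

mod = importlib.import_module('some.package')
public = getattr(mod, '__all__', [n for n in dir(mod) if not n.startswith('_')])
globals().update({name: getattr(mod, name) for name in public})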
I came up with some ugly, hacky code that works in Python 2.6. I'm not sure if this is the smartest thing to do, though; perhaps some other people here have some insight:
test = __import__('os',globals(),locals())
for k in dir(test):
    globals()[k] = test.__dict__[k]
You probably want to put a check here to make sure you aren't overwriting anything in the global namespace. You could probably avoid the globals part and just look through each dynamically imported module for your class of interest. This would probably be much better than polluting the global namespace with everything you are importing.
For example, say the class you want is Request from urllib2:
test = __import__('urllib2',globals(),locals())
cls = None
if 'Request' in dir(test):
    cls = test.__dict__['Request']
    # you found the class, now you can use it!
    cls('http://test.com')
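A rough sketch of the overwrite check mentioned above, assuming you still want to copy names into the global namespace:

test = __import__('os', globals(), locals())
for k in dir(test):
    if k.startswith('_') or k in globals():
        continue  # skip private names and anything already defined here
    globals()[k] = getattr(test, k)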
The following is highly sinful and will condemn you to purgatory or worse
# module_a.py
myvar = "hello"

# module_b.py
import inspect

def dyn_import_all(modpath):
    """Incredibly hackish way to load into caller's global namespace"""
    exec('from ' + modpath + ' import *', inspect.stack()[1][0].f_globals)

# module_c.py
from module_b import dyn_import_all

def print_from(modpath):
    dyn_import_all(modpath)
    print(myvar)
Demo:
>>> import module_c
>>> module_c.print_from("module_a")
hello
(Important: See update below.)
I'm trying to write a function, import_something, that will import certain modules. (It doesn't matter which ones for this question.) The thing is, I would like those modules to be imported at the level from which the function is called. For example:
import_something() # Let's say this imports my_module
my_module.do_stuff()
Is this possible?
Update:
Sorry, my original phrasing and example were misleading. I'll try to explain my entire problem. What I have is a package, which has inside it some modules and packages. In its __init__.py I want to import all the modules and packages. So somewhere else in the program, I import the entire package, and iterate over the modules/packages it has imported.
(Why? The package is called crunchers, and inside it there are defined all kinds of crunchers, like CruncherThread, CruncherProcess, and in the future perhaps MicroThreadCruncher. I want the crunchers package to automatically have all the crunchers that are placed in it, so later in the program when I use crunchers I know it can tell exactly which crunchers I have defined.)
I know I can solve this if I avoid using functions at all, and do all imports on the main level with for loops and such. But it's ugly and I want to see if I can avoid it.
If anything more is unclear, please ask in comments.
Functions have the ability to return something to where they were called. It's called their return value :p
def import_something():
    # decide what to import
    # ...
    mod = __import__(something)
    return mod

my_module = import_something()
my_module.do_stuff()
good style, no hassle.
About your update: I think adding something like this to your __init__.py does what you want:
import os

# make a list of all .py files in the same dir that don't start with _
__all__ = installed = [name for (name, ext) in
                       (os.path.splitext(fn) for fn in os.listdir(os.path.dirname(__file__)))
                       if ext == '.py' and not name.startswith('_')]

for name in installed:
    # import them all
    __import__(name, globals(), locals())
somewhere else:
import crunchers
crunchers.installed # all names
crunchers.cruncherA # actual module object, but you can't use it since you don't know the name when you write the code
# turns out to be pretty much the same as the first solution :p
mycruncher = getattr(crunchers, crunchers.installed[0])
You can monkey with the parent frame in CPython to install the modules into the locals for that frame (and only that frame). The downsides are that a) this is really quite hackish and b) sys._getframe() is not guaranteed to exist in other Python implementations.
import sys

def importer():
    f = sys._getframe(1)  # get the parent (calling) frame
    f.f_locals["some_name"] = __import__(module_name, f.f_globals, f.f_locals)
You still have to install the module into f_locals yourself, since __import__ won't actually do that for you; you just supply the parent frame's locals and globals for the proper context.
Then in your calling function you can have:
def foo():
    importer()  # magically makes 'some_name' available to the calling function
    some_name.some_func()
Are you looking for something like this?
import sys

def my_import(*names):
    for name in names:
        sys._getframe(1).f_locals[name] = __import__(name)
then you can call it like this:
my_import("os", "re")
or
namelist = ["os", "re"]
my_import(*namelist)
According to __import__'s help:
__import__(name, globals={}, locals={}, fromlist=[], level=-1) -> module
Import a module. The globals are only used to determine the context;
they are not modified. ...
So you can simply get the globals of your parent frame and use that for the __import__ call.
import sys

def import_something(s):
    return __import__(s, sys._getframe(1).f_globals)
Note: Pre-2.6, __import__'s signature differed in that it simply had optional parameters instead of using kwargs. Since globals is the second argument in both cases, the way it's called above works fine. Just something to be aware of if you decide to use any of the other arguments.
I'm trying to dynamically load modules I've created.
Right now this works properly:
import structures.index
But if I try the same thing by importing it dynamically, it fails.
struct = __import__("structures.index")
Error supplied is:
Error ('No module named structures.index',)
Any ideas why?
Edit: When using full scope (it sort of works?):
struct = __import__("neoform.structures.index")
This doesn't throw any errors; however, it isn't loading the index module, it's loading the "neoform" module instead.
The result of "struct" is:
<module 'neoform' from '/neoform/__init__.py'>
Also, as a side question, how can I then instantiate a class within a dynamically loaded module? (assuming all the modules contain a common class name).
Edit: Solution: (thanks coonj & Rick) This ended up being what worked. Not sure why (yet), but the fromlist had to be something (anything, apparently, since it worked when I put the letter "a" as a value; strange, given that the file only had one class in it).
def get_struct_module(self, name):
    try:
        return __import__("neoform.structures." + name, fromlist='*')
    except ImportError, e:
        self.out.add("Could not load struct: neoform.structure." + name + "\n\n" + "Error " + str(e.args))
I'm not sure what "it fails" means, so I'll just mention that __import__('structures.index') should, in fact, work, but it doesn't assign the module name in the current scope. To do that (and then use a class in the dynamically imported module), you'll have to use:
structures = __import__('structures.index')
structures.index.SomeClass(...)
The complete details on __import__ are available here.
Edit: (based on question edit)
To import neoform.structures.index, and return the index module, you would do the following:
structures = __import__('neoform.structures.index',
fromlist=['does not in fact matter what goes here!'])
So if you have a list of package names packages, you can import their index modules and instantiate some MyClass class for each using the following code:
modules = [ __import__('neoform.%s.index' % pkg, fromlist=['a'])
for pkg in packages ]
objects = [ m.MyClass() for m in modules ]
To import sub-modules, you need to specify them in the fromlist arg of __import__(). For example, the equivalent of:
import structures.index
is:
structures = __import__('structures', fromlist=['index'])
To do this in a map is a little more tricky...
import mod1.index
import mod2.index
import mod3.index
For those imports, you would want to define a new function to get the index sub-module from each module:
def getIndexMods(mod_names):
    mod_list = map(lambda x: __import__(x, fromlist=['index']), mod_names)
    index_mods = [mod.index for mod in mod_list]
    return index_mods
Now, you can do this to get references to all index modules:
index_mods = getIndexMods(['mod1', 'mod2', 'mod3'])
Also, if you want to grab sub-modules that are not named 'index' then you could do this:
mod1, mod2, mod3 = map(lambda x, y: __import__(x, fromlist=[y]),
                       ['mod1', 'mod2', 'mod3'], ['index1', 'index2', 'index3'])
Use full scope ("neoform.structures.index") with this helper method.
def import_module(name):
    mod = __import__(name)
    components = name.split('.')
    for comp in components[1:]:
        mod = getattr(mod, comp)
    return mod
module = import_module("neoform.structures.index")
# do stuff with module
>>> import imp
>>> fm = imp.find_module('index', ['./structures']) # for submodule
>>> mymod = imp.load_module('structures.index', *fm)
>>> mymod
<module 'structures.index' from './structures/index.pyc'>
>>> x = mymod.insideIndex()
Initialising index class...
Voila!
Java programmer here, but I think you need the imp module
Why on earth would you replace
import structures.index
with
map(__import__, ["structures.index"])
The first one (a) works, (b) is dynamic and (c) is directly supported. What possible use case is there for replacing easy-to-change, plain-text source with something more complex?
In short: don't do this. It doesn't have any value.
Edit
The "I'm getting the import from a database" is a noble effort, but still not sensible. What code block depends on those imports? That whole code block -- imports and all -- is what you want to execute. That whole code block -- imports, statements and everything -- should be a plain old python module file.
Import that block of code from the file system. Use the database to identify which file, the author of the file -- anything you want to use the database for. But simply import and execute the module the simplest possible way.
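A rough sketch of that idea (fetch_module_name_from_db is a hypothetical function standing in for your database lookup):

import importlib

module_name = fetch_module_name_from_db()  # hypothetical: the database only tells us which module to use
mod = importlib.import_module(module_name)  # a plain, dynamic import of an ordinary .py module
mod.run()  # assumes each such module exposes a run() entry point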
Really late post here, but I was searching for this question on Google and did some trial and error. Not sure if this snippet will help, but here it is. I'm using it for a Flask site.
modules = ['frontend', 'admin']

for module in modules:
    mod = __import__('controllers.%s' % module, fromlist=[module])
    app.register_blueprint(mod.blueprint_mod)

# or

from importlib import import_module

modules = ['frontend', 'admin']

for module in modules:
    mod = import_module('controllers.%s' % module)
    app.register_blueprint(mod.blueprint_mod)