This question already has answers here:
How to list all functions in a module?
(20 answers)
Closed 5 months ago.
Is there a way to retrieve all the different functions/classes of a specific package?
For example, I'd like to receive something like this for scipy:
scipy.ndimage.center_of_mass
scipy.ndimage.binary_dilation
scipy.ndimage.binary_erosion
scipy.ndimage.gaussian_filter
scipy.ndimage.filters.gaussian_filter
scipy.ndimage.filters.convolve
scipy.ndimage.sobel
scipy.ndimage.zoom
scipy.ndimage.distance_transform_edt
scipy.spatial.distance.cdist
scipy.optimize.curve_fit
scipy.signal.find_peaks
scipy.signal.correlate
scipy.signal.peak_widths
scipy.interpolate.LinearNDInterpolator
scipy.interpolate.interp1d
scipy.interpolate.make_interp_spline
scipy.integrate.trapz
scipy.linalg.circulant
This is just a subset of scipy, but you can get the idea. I'd like to list all the different functions/classes of that package. Is there a tool that does that maybe?
There are many ways:
dir(module)
or
from inspect import getmembers, isfunction
import somemodule
print(getmembers(somemodule, isfunction))
But in your case, scipy contains other subpackages. To list all the contents of a package, you can use its __all__ attribute:
scipy.__all__
This produces a list of all submodules, methods, functions and attributes. You can then pick out the ones relevant to you according to their type (module, function or other); just loop over them and check their corresponding types:
for i in scipy.__all__:
print(f"{i}: {type(getattr(scipy, i))}")
For each subpackage, you can use the getmembers function from inspect to get its functions and classes. You can specify what you're really looking for using isfunction, ismodule and ismethod.
For more details, see https://docs.python.org/3/library/inspect.html#inspect.getmembers
import inspect
inspect.getmembers(scipy.signal, inspect.ismodule)
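If you want fully qualified names across all of a package's subpackages, a possible sketch combines pkgutil.walk_packages with inspect. It is shown here on the small stdlib json package as a stand-in for scipy, since importing every scipy submodule can be slow and may pull in optional dependencies:

```python
import importlib
import inspect
import pkgutil

def list_members(package):
    """Yield qualified names of functions/classes found in a package's submodules."""
    for info in pkgutil.walk_packages(package.__path__, package.__name__ + "."):
        try:
            mod = importlib.import_module(info.name)
        except ImportError:
            continue  # some submodules may require optional dependencies
        for name, obj in inspect.getmembers(mod):
            if (inspect.isfunction(obj) or inspect.isclass(obj)) and not name.startswith("_"):
                yield f"{info.name}.{name}"

import json  # small stdlib package used as a stand-in for scipy
names = sorted(set(list_members(json)))
print("json.decoder.JSONDecoder" in names)  # True
```

Note that walk_packages imports each submodule, so any import-time side effects will run, and names re-exported by several submodules will show up more than once, just as in the scipy listing above.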
This question already has an answer here:
inspect.getmembers() vs __dict__.items() vs dir()
(1 answer)
Closed 1 year ago.
Python's dir() is nice, don't get me wrong. But I'd like a list that actually tells me what kind of things the objects are: methods, variables, things that are inherited, etc.
As best I can tell, dir always returns a simple list of strings with no indication as to what the objects are. I checked the documentation for dir() and I don't see any way of getting better information.
Are there any other packages or tools for doing this?
I'd like a list that actually tells me what kind of things the objects are: methods, variables, things that are inherited, etc.
The built-in pyclbr module might be useful for you if you are interested in the classes in a certain *.py file. Let somefile.py's content be:
class MyClass():
def parent_method(self):
return None
class MyChildClass(MyClass):
def child_method(self):
return None
then
import pyclbr
objects = pyclbr.readmodule("somefile")
print(objects['MyChildClass'].super)
print(objects['MyChildClass'].methods)
output
['MyClass']
{'child_method': 6}
Explanation: pyclbr does not execute the code; it extracts information from the Python source. In the example above, .super tells us that MyChildClass is a child of MyClass, and .methods tells us that MyChildClass defines the method child_method on line 6 of the file.
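For classes that are already imported (rather than parsed from source), inspect.classify_class_attrs gives comparable information: each attribute's kind and the class that defines it. A small sketch reusing the classes above:

```python
import inspect

class MyClass:
    def parent_method(self):
        return None

class MyChildClass(MyClass):
    def child_method(self):
        return None

# classify_class_attrs reports name, kind, and the defining class of every attribute
for attr in inspect.classify_class_attrs(MyChildClass):
    if not attr.name.startswith("__"):
        print(attr.name, attr.kind, attr.defining_class.__name__)
# child_method method MyChildClass
# parent_method method MyClass
```

This answers the "is it inherited?" question directly: parent_method is reported with defining_class MyClass even though it is reachable on MyChildClass.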
This question already has answers here:
Callable modules
(7 answers)
Closed 3 years ago.
I'm writing a module called foo that has many internal functions, but ultimately boils down to a single external function foo(). When using this module/function, I'd like to do
import foo
foo(whatever)
instead of
from foo import foo
foo(whatever)
Is this possible?
You could monkey patch the sys.modules dictionary to make the name of your module point to the function instead of your module.
foo.py (the file defining your module foo) would look like this
import sys
def foo(x):
return x + x
sys.modules[__name__] = foo
then you can use this module from a different file like this
import foo
print(foo(3))
6
There are probably reasons for why you shouldn't do this. sys.modules isn't supposed to point to functions, when you do from some_module import some_function, the module some_module is what gets added to sys.modules, not the function some_function.
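A somewhat safer variant of the same idea keeps a real module object in sys.modules and only swaps its class: a module's __class__ may be reassigned to a types.ModuleType subclass, so it stays a module (its other attributes keep working) while also becoming callable. A sketch, demonstrated on a stand-in module object rather than a real foo.py:

```python
from types import ModuleType

def foo(x):
    return x + x

class CallableModule(ModuleType):
    # calling the module delegates to the module-level foo()
    def __call__(self, x):
        return foo(x)

# In foo.py you would write:  sys.modules[__name__].__class__ = CallableModule
# Demonstrated here on a stand-in module object:
mod = ModuleType("foo")
mod.foo = foo
mod.__class__ = CallableModule
print(mod(3))      # 6
print(mod.foo(3))  # 6 - other attributes still work
```

Because the object in sys.modules remains a genuine module, introspection, reloading, and attribute access behave more predictably than when a bare function is stored there.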
It is not strictly possible. Python module names are meant to help the programmer distinguish between modules, so even if your module foo contained only one function, bar, using import foo would still require calling it as foo.bar(). You're probably better off just using from foo import *.
However, there may be a way around this: Python also has built-in functions, and you can add your own to the builtins module at runtime, although shadowing or extending builtins is widely discouraged.
So, in conclusion: writing from foo import * isn't all that ugly, and is a lot easier and prettier than the long way around.
This question already has answers here:
Why is "import *" bad?
(12 answers)
How do I import other Python files?
(23 answers)
Closed 5 years ago.
We can import numpy and use its functions directly as:
from numpy import *
a = array([1,2,3]) # and it works well.
Why do some people use the following method?
import numpy as np
a= np.array([1,2,3])
The difference is simple: from numpy import * imports all names from the top-level NumPy module into your current namespace, while import numpy as np only makes the NumPy module itself available, so you access its contents as np.xxx.
However, there is one reason why you shouldn't use from any_module import *: it may overwrite existing names. For example, NumPy has its own any, max, all and min functions, which will happily shadow the built-in Python any, max, all and min (a very common gotcha).
My advice: avoid from numpy import * even if it seems like less effort than typing np. all the time!
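You can see how big that import-* footprint is, and spot shadowing candidates, without NumPy at all; for example, with the stdlib os module, os.open would shadow the built-in open:

```python
import os

# names that `from os import *` would dump into your namespace
star = getattr(os, "__all__", None) or [n for n in dir(os) if not n.startswith("_")]
print(len(star) > 100)   # True: hundreds of names
print("open" in star)    # True: os.open would shadow the builtin open()
```

The same check works for any module: if it defines __all__, that list is exactly what import * binds; otherwise every public top-level name is bound.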
It's a matter of neatness but also consistency: you might have multiple objects with the same name coming from different modules (for instance, both NumPy and SciPy expose the name random), so it's important to denote which exact name you're using from which exact module. This link has a great explanation and makes the point about code readability as well.
I know that from module import * will import all the module's functions into the current namespace, but it is bad practice. I want to use two functions directly, and use module.function when I have to use any other function from the module. What I am doing currently is:
import module
from module import func1, func2
# DO REST OF MY STUFF
Is it a good practice? Does the order of first two statements matter?
Is there a better way using which I can use these two functions directly and use rest of the functions as usual with the module's name prepended to them?
Using just import module results in very long statements with a lot of repetition if I use the same function from the given module five times in a single statement. That's what I want to avoid.
The order doesn't matter, but the combination isn't very Pythonic: when you import the module, there is no need to import some of its functions separately as well. If you are not sure how many of its functions you might need, just import the module and access the functions on demand with a simple attribute reference.
# The only import you need
import module
# Use module.funcX when you need any of its functions
That said, if you use some of the functions much more than the others, the repeated attribute access has a small cost, so importing those few functions separately, as you've done, is reasonable.
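The attribute-access overhead mentioned above is real but small; a quick micro-benchmark sketch with timeit (absolute numbers vary by machine, so only the rough comparison matters):

```python
import timeit

n = 200_000
# module attribute lookup on every call
t_attr = timeit.timeit("math.sqrt(2.0)", setup="import math", number=n)
# name bound directly in the calling namespace
t_local = timeit.timeit("sqrt(2.0)", setup="from math import sqrt", number=n)
print(t_attr > 0 and t_local > 0)  # t_attr is usually slightly larger
```

In practice the difference only matters in tight inner loops; readability should drive the choice elsewhere.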
And still, the order doesn't matter. You can do:
import module
from module import func1, func2
For more info read the documentation https://www.python.org/dev/peps/pep-0008/#imports
It is not good (though this may be opinion-based) to do:
import module
from module import func1, func2 # `func1` and `func2` are already part of module
Because you already hold a reference to module.
If I were you, I would import it in the form import module. Since your issue is that module.func1() becomes too long, you can import the module and use as to create an alias for the name. For example:
import module as mo
# ^ for illustration purposes; the name of
# your actual module won't be `module`.
# The alias should be self-explanatory,
# for example:
import database_manager as db_manager
Now I may access the functions as:
mo.func1()
mo.func2()
Edit: Based on the edit in actual question
If you are calling the same function several times in one line, there is a chance you are already doing something wrong. It would be great if you could share what that function does.
For example: do you want the return values of those calls to be passed as arguments to another function? As in:
test_func(mo.func1(x), mo.func1(y), mo.func1(z))
could be done as:
params = [x, y, z]
results = [mo.func1(p) for p in params]
test_func(*results)
This question already has answers here:
Injecting variables into an import namespace
(2 answers)
Closed 4 years ago.
Basically, I'd like to force a variable, lets call him jim into a plugin I load as a global, before the plugin loads, for instance:
load_plugin('blah', variables={'jim':1}) #For instance
And then inside blah.py:
print jim #prints 1
Is there any easy way to do this? Not a big deal if its not in the standard library.
No - there is no way to do that before the plug-in is imported in the first place. So, if your variable is used in the module body itself, you are out of luck.
If the variable is used as a global variable inside the module's functions or methods (but not class bodies), you can change it after the module is imported simply doing:
import module
module.jim = 5
as you probably know. (And I am aware this is not what you are asking for).
So, the only way to achieve that would be to parse the module's source code, change the variable assignment there, and import the modified source. There are ways to emulate importing with source code held in memory, but this approach is so impractical that we won't detail it.
If you have control over the source of the module you want to monkey-patch this way, my suggestion would be to use a configuration file from which the module would pick the variable names.
Then you generate the configuration file, perform the import (taking care that the module is not already in sys.modules), and you are done.
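A minimal sketch of that configuration-file approach (the file name and settings here are hypothetical): the host writes the settings before importing, and the plugin reads them at the top of its own body.

```python
import json
import os
import tempfile

cfg_path = os.path.join(tempfile.mkdtemp(), "plugin_config.json")

# host side: generate the configuration before importing the plugin
with open(cfg_path, "w") as f:
    json.dump({"jim": 1}, f)

# plugin side (this would sit at the top of blah.py):
with open(cfg_path) as f:
    jim = json.load(f)["jim"]

print(jim)  # 1
```

The plugin then uses jim like any ordinary module-level variable, and the host fully controls its value.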
You could use the __import__ function, which accepts a globals argument:
__import__('blah', dict(jim=1, **globals()))
Be aware, though, that the standard implementation only uses that argument to determine the package context for relative imports; it does not inject those names into the imported module, so jim will not actually be visible inside blah.py this way.
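If executing the plugin instead of importing it as a cached module is acceptable, the standard library's runpy does support exactly this: run_path and run_module accept an init_globals mapping that is seeded into the namespace before the code runs. A sketch with a temporary file standing in for blah.py:

```python
import os
import runpy
import tempfile

# stand-in for blah.py; the real plugin would just use `jim` directly
src = "result = jim + 1\n"
with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write(src)
    path = f.name

# init_globals is merged into the module globals before execution
globs = runpy.run_path(path, init_globals={"jim": 1})
print(globs["result"])  # 2
os.unlink(path)
```

Unlike import, run_path executes the file fresh each call and returns the resulting globals dict rather than creating a module object in sys.modules.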