I've got a Python module which has several variables with hard-coded values which are used throughout the project. I'd like to bind the variables somehow to a function to redirect to a config file. Is this possible?
# hardcoded_values.py
foo = 'abc'
bar = 1234
# usage somewhere else in another module
from hardcoded_values import *
print(foo)
print(bar)
What I want to do is change only hardcoded_values.py, so that print(foo) transparently calls a function.
# hardcoded_values.py
import config
foo = SomeWrapper(config.get_value, 'foo') # or whatever you can think of to call config.get_value('foo')
...
config.get_value would be a function that is called with parameter 'foo' when using variable foo (as in print foo).
I'm pretty sure that you can't do what you want to do if you import like from hardcoded_values import *.
What you want is to bind foo to some function and then apply the property decorator (or equivalent) so that you can write foo rather than foo(). You cannot apply the property decorator to modules, for reasons detailed here: Why Is The property Decorator Only Defined For Classes?
Now, if you were to import hardcoded_values then I think there is a way to do what you want to hardcoded_values.foo. I have a pretty good feeling that what I am about to describe is a BAD IDEA that should never be used, but I think it is interesting.
BAD IDEA???
So say you wanted to replace a constant like os.EX_USAGE which on my system is 64 with some function, and then call it as os.EX_USAGE rather than os.EX_USAGE(). We need to be able to use the property decorator, but for that we need a type other than module.
So what can be done is to create a new type on the fly, seeding its namespace from a module's __dict__, using a type factory function that takes a module as an argument:
def module_class_factory(module):
    ModuleClass = type('ModuleClass' + module.__name__,
                       (object,), module.__dict__)
    return ModuleClass
Now I will import os:
>>> import os
>>> os.EX_USAGE
64
>>> os.getcwd()
'/Users/Eric'
Now I will make a class OsClass, and bind the name os to an instance of this class:
>>> OsClass = module_class_factory(os)
>>> os = OsClass()
>>> os.EX_USAGE
64
>>> os.getcwd()
'/Users/Eric'
Everything still seems to work. Now define a function to replace os.EX_USAGE, noting that it will need to take a dummy self argument:
>>> def foo(self):
...     return 42
...
...and bind the class attribute OsClass.EX_USAGE to the function:
>>> OsClass.EX_USAGE = foo
>>> os.EX_USAGE()
42
It works when called by the os object! Now just apply the property decorator:
>>> OsClass.EX_USAGE = property(OsClass.EX_USAGE)
>>> os.EX_USAGE
42
Now the constant defined in the module has been transparently replaced by a function call.
You definitely cannot do this if the client code of your module uses from hardcoded_variables import *. That makes references to the contents of hardcoded_variables in the other module's namespace, and you can't do anything with them after that.
If the client code can be changed to just import the module (with e.g. import hardcoded_variables) and then access its attributes (with hardcoded_variables.foo) you do have a chance, but it's a bit awkward.
Python caches modules that have been imported in sys.modules (which is a dictionary). You can replace a module in that dictionary with some other object, such as an instance of a custom class, and use property objects or other descriptors to implement special behavior when you access the object's attributes.
Try making your new hardcoded_variables.py look like this (and consider renaming it, too!):
import sys

class DummyModule(object):
    def __init__(self):
        self._bar = 1233

    @property
    def foo(self):
        print("foo called!")
        return "abc"

    @property
    def bar(self):
        self._bar += 1
        return self._bar

if __name__ != "__main__":  # Note, this is the opposite of the usual boilerplate
    sys.modules[__name__] = DummyModule()
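With that in place, client code that does a plain import sees every attribute access routed through the properties. A self-contained sketch of the idea (the module is built in memory with exec, since the snippet above lives in a separate file, and the print in foo is trimmed for brevity):

```python
import sys
import types

# The DummyModule trick from above, executed into an in-memory module so the
# sketch is runnable on its own
source = '''
import sys

class DummyModule(object):
    def __init__(self):
        self._bar = 1233

    @property
    def foo(self):
        return "abc"

    @property
    def bar(self):
        self._bar += 1
        return self._bar

sys.modules[__name__] = DummyModule()
'''

mod = types.ModuleType("hardcoded_variables")
sys.modules["hardcoded_variables"] = mod
exec(source, mod.__dict__)

import hardcoded_variables  # fetched from sys.modules: now the DummyModule instance

print(hardcoded_variables.foo)  # abc
print(hardcoded_variables.bar)  # 1234
print(hardcoded_variables.bar)  # 1235
```

Note that each access to bar runs the property body, which is exactly the "variable that transparently calls a function" behavior the question asks for.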
If I understand correctly, you want your hardcoded_variables module evaluated every time you try to access a variable.
I would probably have hardcoded_variables in a document (e.g. json?) and a custom wrapper function like:
import json

def getSettings(var):
    with open('path/to/variables.json') as infl:
        data = json.load(infl)
    return data[var]
Related
What happens if I define a magic method outside a class?
For example, say I do:
def __str__(stuff):
    return "chicken"
directly inside the module.
When would something like this be useful? I thought it might be useful if I import this module named module1 elsewhere and try to do print(module1), but that just prints out the file location and other stuff.
So is there even any use for using a magic method outside a class? Is it even really a magic method any more?
At the moment (Python 3.9), two module-level magic methods can be defined: __getattr__ and __dir__ (see PEP 562 for details).
__getattr__ overrides attribute access on that module, including imports of the form from x import y.
__dir__ overrides what is returned by dir(module).
For example:
# foo.py
def __getattr__(name):
    return name

def __dir__():
    return ['foo', 'bar']
Then we can use it in the following way:
>>> import foo
>>> dir(foo)
['bar', 'foo']
>>> from foo import bar
>>> bar
'bar'
In Python, mocking an object using

@patch('foo.bar')
def test_things(self, bar):
    bar.return_value = ...
requires that all tested classes use
import foo
and can not use
from foo import bar
In the second case code under test uses the original object, as mock patches names rather than the function itself. This feels very brittle.
How do we write mocks which will work with both forms of import?
Short answer: No
The principle of a mock is to mock one object. If you import the same object in different ways in your code (which is somewhat odd), you need to create a mock for each name it is bound to.
Example:
>>> import os
>>> from os.path import isdir
>>> from unittest.mock import patch
>>> with patch('os.path') as mock_os_path:
...     mock_os_path.isdir.return_value = "Hello"
...     mocked_res = os.path.isdir("./")
...     res = isdir("./")
...     print(mocked_res)
...     print(res)
...
Hello
True
According to the docs:
target should be a string in the form 'package.module.ClassName'. The target is imported and the specified object replaced with the new object, so the target must be importable from the environment you are calling patch() from. The target is imported when the decorated function is executed, not at decoration time.
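In practice that quote means you patch the name where it is looked up, not where it is defined. A sketch of the difference (the mylib/client modules are hypothetical and built in memory so the snippet runs on its own):

```python
import sys
import types
from unittest import mock

# Build a tiny 'mylib' module and a 'client' module that does
# "from mylib import helper" -- the hard-to-patch import form.
mylib = types.ModuleType("mylib")
exec("def helper():\n    return 'real'", mylib.__dict__)
sys.modules["mylib"] = mylib

client = types.ModuleType("client")
exec(
    "from mylib import helper\n"
    "def run():\n"
    "    return helper()\n",
    client.__dict__,
)
sys.modules["client"] = client

# Patching 'mylib.helper' does NOT affect client.run, because client holds
# its own reference to the original function...
with mock.patch("mylib.helper", return_value="mocked"):
    print(client.run())   # real

# ...but patching the name in the client's namespace does:
with mock.patch("client.helper", return_value="mocked"):
    print(client.run())   # mocked
```

So a test suite that must work with both import forms patches each module that imported the object, one patch per name.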
I like small, self-contained modules, that sometimes contain a single class or a single function, e.g.
def decorator(function):
    return function
By convention, I use full, absolute imports only, e.g.
# Yes
import module
module.function()
# No
from module import function
function()
Together, this might become annoyingly verbose, e.g.
import decorator

@decorator.decorator
def function():
    pass
So I like to export things other than modules via sys.modules, e.g.
import sys

def decorator(function):
    return function

sys.modules[__name__] = decorator
And then,
import decorator

@decorator
def function():
    pass
This was the intro; whether I should do this or not is not the issue. The issue is this strange behaviour:
# foo.py
import sys

x = 1

def foo():
    print(x)

sys.modules[__name__] = foo
And then,
>>> import foo
>>> foo()
None
And stranger still, this only happens in Python 2.7; in Python 3.4 it works as expected! My question is, why does this happen, and how can I make this work in Python 2.7?
Thanks.
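For context on why this bites in 2.7: CPython 2 sets a module's globals to None as soon as the module object is garbage-collected, and replacing the sys.modules entry drops the last reference to it, so foo's x becomes None; Python 3.4+ keeps the globals alive as long as the function references them, which is why it works there. A commonly suggested workaround is to keep a reference to the original module on the replacement object. A runnable sketch (the in-memory module setup is only there to make it self-contained):

```python
import sys
import types

source = '''
import sys

x = 1

def foo():
    return x

foo._module = sys.modules[__name__]   # keep the original module alive (2.7 fix)
sys.modules[__name__] = foo
'''

mod = types.ModuleType("callable_mod")
sys.modules["callable_mod"] = mod
exec(source, mod.__dict__)

import callable_mod   # resolves to the function via the sys.modules cache

print(callable_mod())   # 1
```

With the _module reference in place the module's globals survive in Python 2.7 as well.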
The question is in the context of unit testing.
I have created an instance of the class I am testing and am trying to test one of its methods. The method uses data it gets from a different class defined in a separate module. I am going to mock that module.
How can I access my instance's namespace (I have to do it before running the method I am testing) in order to mock the module that contains the class my method gets its data from?
I am going to create an example here which I think parallels what you are trying to do.
Say you have some class that we'll call Data that is defined in the module foo. The foo module imports bar and a method of foo.Data calls bar.get_data() to populate itself.
You want to create a module test that will create an instance of foo.Data, but instead of using the actual module bar you want that instance to use a mocked version of this.
You can set this up by importing foo from your test module, and then rebinding foo.bar to your mocked version of the module.
Here is an example of how this might look:
bar.py:
def get_data():
    return 'bar'
foo.py:
import bar

class Data(object):
    def __init__(self):
        self.val = bar.get_data()

if __name__ == '__main__':
    d = Data()
    print(d.val)  # prints 'bar'
test.py:
import foo
class bar_mock(object):
#staticmethod
def get_data():
return 'test'
if __name__ == '__main__':
foo.bar = bar_mock
d = foo.Data()
print d.val # prints 'test'
Although this will get you by for a simple test case, you are probably better off looking into a mocking library to handle this for you.
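For instance, with unittest.mock the rebinding and, crucially, the cleanup are handled for you. A sketch along the lines of the example above (the foo/bar modules are recreated in memory so the snippet runs on its own):

```python
import sys
import types
from unittest import mock

# Recreate bar.py and foo.py from the example as in-memory modules
bar = types.ModuleType("bar")
exec("def get_data():\n    return 'bar'", bar.__dict__)
sys.modules["bar"] = bar

foo = types.ModuleType("foo")
exec(
    "import bar\n"
    "class Data(object):\n"
    "    def __init__(self):\n"
    "        self.val = bar.get_data()\n",
    foo.__dict__,
)
sys.modules["foo"] = foo

# mock.patch replaces foo.bar for the duration of the block and then
# restores the real module automatically
with mock.patch("foo.bar") as bar_mock:
    bar_mock.get_data.return_value = "test"
    print(foo.Data().val)   # test

print(foo.Data().val)       # bar
```

The automatic restore is what the manual foo.bar = bar_mock approach lacks: a forgotten reset leaks the mock into every later test.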
I'm writing a decorator, and for various annoying reasons[0] it would be expedient to check if the function it is wrapping is being defined stand-alone or as part of a class (and further which classes that new class is subclassing).
For example:
def my_decorator(f):
    defined_in_class = ??
    print("%r: %s" % (f, defined_in_class))

@my_decorator
def foo(): pass

class Bar(object):
    @my_decorator
    def bar(self): pass
Should print:
<function foo …>: False
<function bar …>: True
Also, please note:
At the point decorators are applied the function will still be a function, not an unbound method, so testing for an instance/unbound method (using type() or inspect) will not work.
Please only offer suggestions that solve this problem — I'm aware that there are many similar ways to accomplish this end (ex, using a class decorator), but I would like them to happen at decoration time, not later.
[0]: specifically, I'm writing a decorator that will make it easy to do parameterized testing with nose. However, nose will not run test generators on subclasses of unittest.TestCase, so I would like my decorator to be able to determine if it's being used inside a subclass of TestCase and fail with an appropriate error. The obvious solution - using isinstance(self, TestCase) before calling the wrapped function doesn't work, because the wrapped function needs to be a generator, which doesn't get executed at all.
Take a look at the output of inspect.stack() when you wrap a method. When your decorator's execution is underway, the current stack frame is the function call to your decorator; the next stack frame down is the @-decoration being applied to the new method; and the third frame will be the class definition itself, which merits its own stack frame because the class body is its own namespace (wrapped up to create a class when it is done executing).
I suggest, therefore:
import inspect

frames = inspect.stack()
defined_in_class = (len(frames) > 2 and
                    frames[2][4][0].strip().startswith('class '))
If all of those crazy indexes look unmaintainable, then you can be more explicit by taking the frame apart piece by piece, like this:
import inspect

frames = inspect.stack()
defined_in_class = False
if len(frames) > 2:
    maybe_class_frame = frames[2]
    statement_list = maybe_class_frame[4]
    first_statement = statement_list[0]
    if first_statement.strip().startswith('class '):
        defined_in_class = True
Note that I do not see any way to ask Python about the class name or inheritance hierarchy at the moment your wrapper runs; that point is "too early" in the processing steps, since the class creation is not yet finished. Either parse the line that begins with class yourself and then look in that frame's globals to find the superclass, or else poke around the frames[1] code object to see what you can learn — it appears that the class name winds up being frames[1][0].f_code.co_name in the above code, but I cannot find any way to learn what superclasses will be attached when the class creation finishes up.
A little late to the party here, but this has proven to be a reliable means of determining if a decorator is being used on a function defined in a class:
import inspect

frames = inspect.stack()
className = None
for frame in frames[1:]:
    if frame[3] == "<module>":
        # At module level, go no further
        break
    elif '__module__' in frame[0].f_code.co_names:
        className = frame[0].f_code.co_name
        break
The advantage of this method over the accepted answer is that it works with e.g. py2exe.
Some hacky solution that I've got:
import inspect

def my_decorator(f):
    args = inspect.getargspec(f).args
    defined_in_class = bool(args and args[0] == 'self')
    print("%r: %s" % (f, defined_in_class))

But it relies on the presence of a self argument in the function.
You can use the package wrapt to check for:
- instance/class methods
- classes
- freestanding functions/static methods
See the project page of wrapt: https://pypi.org/project/wrapt/
You could check if the decorator itself is being called at the module level or nested within something else.
import inspect

defined_in_class = inspect.currentframe().f_back.f_code.co_name != "<module>"
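A runnable sketch of that check, recording the result on the wrapped function. Note the assumption: decoration happens directly in a class body or at module level; a decorator applied inside some other function would also report True, since that caller's co_name is not "<module>" either.

```python
import inspect

def my_decorator(f):
    # The caller's code-object name is "<module>" at module level and the
    # class name inside a class body
    caller = inspect.currentframe().f_back.f_code.co_name
    f.defined_in_class = caller != "<module>"
    return f

@my_decorator
def standalone():
    pass

class Bar(object):
    @my_decorator
    def method(self):
        pass

print(standalone.defined_in_class)   # False when decorated at module level
print(Bar.method.defined_in_class)   # True
```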
I think the functions in the inspect module will do what you want, particularly isfunction and ismethod:
>>> import inspect
>>> def foo(): pass
...
>>> inspect.isfunction(foo)
True
>>> inspect.ismethod(foo)
False
>>> class C(object):
... def foo(self):
... pass
...
>>> inspect.isfunction(C.foo)
False
>>> inspect.ismethod(C.foo)
True
>>> inspect.isfunction(C().foo)
False
>>> inspect.ismethod(C().foo)
True
You can then follow the Types and Members table to access the function inside the bound or unbound method:
>>> C.foo.im_func
<function foo at 0x1062dfaa0>
>>> inspect.isfunction(C.foo.im_func)
True
>>> inspect.ismethod(C.foo.im_func)
False