To import or not to import classmethod? - python

I hope this isn't a stupid question, but I found some code that imported classmethod and some code that didn't, so is there a difference?
I'm using Python 3.6, but I think the code was originally written for Python 2.7 (it used from __builtin__ import).
import unittest
from selenium import webdriver
from builtins import classmethod  # original code was: from __builtin__ import classmethod

class HomePageTest(unittest.TestCase):
    @classmethod
    def setUp(cls):
        # create a new Firefox session
        cls.driver = webdriver.Firefox()
        cls.driver.implicitly_wait(30)
        cls.driver.maximize_window()
        # navigate to the application home page
        cls.driver.get("http://demo-store.seleniumacademy.com/")

    def test_search_field(self):
        pass

    # My tests without @classmethod
    @classmethod
    def tearDown(cls):
        # close the browser window
        cls.driver.quit()

if __name__ == '__main__':
    unittest.main(verbosity=2)

Normally you only import builtins (or __builtin__ in Python 2) if you also have a name in your own code that shadows a built-in and you still need access to the built-in of the same name. The documentation of the module explains it rather well:
builtins — Built-in objects
This module provides direct access to all ‘built-in’ identifiers of Python; for example, builtins.open is the full name for the built-in function open(). See Built-in Functions and Built-in Constants for documentation.
This module is not normally accessed explicitly by most applications, but can be useful in modules that provide objects with the same name as a built-in value, but in which the built-in of that name is also needed. For example, in a module that wants to implement an open() function that wraps the built-in open(), this module can be used directly:
import builtins

def open(path):
    f = builtins.open(path, 'r')
    return UpperCaser(f)

class UpperCaser:
    '''Wrapper around a file that converts output to upper-case.'''
    def __init__(self, f):
        self._f = f

    def read(self, count=-1):
        return self._f.read(count).upper()
However, in your case there is no classmethod definition in the file, so you don't actually need the from builtins import classmethod.

In Python 3, there's no need to import the builtins module, or anything inside it. When the lookup for a name in the current scope fails, builtins is looked up as a fallback.
If the code must keep running on Python 2 as well, consider checking the Python version explicitly:
import sys

if sys.version_info[0] == 2:
    from __builtin__ import classmethod
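The fallback lookup is easy to verify: classmethod resolves with no import at all, and it is the very same object the builtins module exposes. A minimal sketch:

```python
import builtins

# No import is needed for classmethod: names not found in the
# current scope fall back to the builtins module.
print(classmethod)                           # <class 'classmethod'>
print(builtins.classmethod is classmethod)   # True
```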


Pickle with a specific module name

I am using the pickle library to serialise a custom object, let's call it A, which is defined in a.py.
If I pickle an object of type A in a.py as follows:
import pickle

class A:
    ...

if __name__ == "__main__":
    inst = A("some param")
    with open("a.pickle", 'wb') as dumpfile:
        pickle.dump(inst, dumpfile)
Then there is a problem loading this object from storage anywhere the class A is not available under the __main__ namespace. See this other question. This is because pickle records that it should look for the class A in __main__, since that's where it was when pickle.dump() happened.
Now, there are two approaches to dealing with this:
Deal with it at the deserialisation end,
Deal with it at the serialisation end.
For 1, there are various options (see the above link, for example), but I want to avoid these, because I think it makes sense to give the 'pickler' responsibility regarding its data.
For 2, we could just avoid pickling when the module is under the __main__ namespace, but that doesn't seem very flexible. We could alternatively modify A.__module__, and set it to the name of the module (as done here).
Pickle uses this __module__ variable to find where to import the class A from, so setting it before .dump() works:
if __name__ == "__main__":
    inst = A("some param")
    A.__module__ = 'a'
    with open("a.pickle", 'wb') as dumpfile:
        pickle.dump(inst, dumpfile)
Q: is this a good idea? It seems like it's implementation dependent, not interface dependent. That is, pickle could decide to use another method of locating modules to import, and this approach would break. Is there an alternative that uses pickle's interface?
Another way around it would be to import the file itself:
import pickle
import a

class A:
    pass

def main():
    inst = a.A()
    print(inst.__module__)
    with open("a.pickle", 'wb') as dumpfile:
        pickle.dump(inst, dumpfile)

if __name__ == "__main__":
    main()
Note that this works because importing a inside a.py doesn't recurse infinitely: the second time the import statement runs, module a is already in sys.modules, so the statement is then just an assignment of the name a to that module object.
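The role of __module__ is visible directly in the pickle byte stream: the stream stores the class's module name and qualified name, which pickle.load() later uses to re-import the class. A small self-contained sketch:

```python
import pickle

class A:
    def __init__(self, param):
        self.param = param

inst = A("some param")
data = pickle.dumps(inst)

# The stream embeds A.__module__ and A.__qualname__ as plain strings;
# this is what pickle.load() uses to locate the class again.
print(A.__module__.encode() in data)  # True
print(b"A" in data)                   # True
```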

How to trigger a late import of another module only when a certain object is needed from my module?

The Situation
I want to have a module that roughly works like the following:
# my_module.py
my_number = 17

from other_module import foo
my_object = foo(23)
However, there is a problem: Installing other_module causes problems for some users and is only required for those who want to use my_object – which in turn is only a small fraction of users. I want to spare those users who do not need my_object from installing other_module.
I therefore want the import of other_module to happen only if my_object is imported from my_module. In other words, the user should be able to run the following without having installed other_module:
from my_module import my_number
My best solution so far
I could provide my_object via a function that contains the import:
# in my_module.py
def get_my_object():
    from other_module import foo
    my_object = foo(23)
    return my_object
The user would then have to do something like:
from my_module import get_my_object
my_object = get_my_object()
Question
Is there a better way to conditionally trigger the import of other_module? I am mostly interested in keeping things as simple as possible for the users.
I would prefer the get_my_object() approach, but as of Python 3.7, what you ask is possible by defining a module-level __getattr__ function:
# my_module.py
my_number = 17

def __getattr__(name):
    if name != 'my_object':
        raise AttributeError
    global my_object
    from other_module import foo
    my_object = foo(23)
    return my_object
This will attempt to import other_module and call foo only once my_object is accessed. A few caveats:
It will not trigger for attempts to access my_object by global variable lookup within my_module. It will only trigger on my_module.my_object attribute access, or a from my_module import my_object import (which performs attribute access under the hood).
If you forget to assign to the global my_object name in __getattr__, my_object will be recomputed on every access.
Module-level __getattr__ does nothing before Python 3.7, so you may want to perform a version check and do something else for Python 3.6 and below:
import sys

if sys.version_info >= (3, 7):
    def __getattr__(name):
        ...
else:
    # Do something appropriate. Maybe raise an error. Maybe unconditionally
    # import other_module and compute my_object up front.
    pass
Approach A – clean solution
Create a new separate module and have the user import the object from the other module. For example
from my_module import my_number # regular use
from my_module.extras import my_object # for a small part of the user base
This means in your code you create a module folder my_module with an __init__.py where you import the usual stuff and don't import the extras submodule.
If you don't want to put extras in my_module (simpler), just create my_object in an individual extras.py module.
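Concretely, the layout for Approach A could look like this (a sketch; everything beyond the my_module and extras names from the text is an assumption):

```text
my_module/
    __init__.py    # defines my_number; does NOT import extras
    extras.py      # from other_module import foo; my_object = foo(23)
```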
Approach B – signals bad architecture 99% of the time
You can use importlib.import_module to dynamically import a module inside get_my_object without polluting the global namespace. That is cleaner than an import inside a function, which has side effects such as shadowing a global variable with the imported name (see this question and answers). However, needing this is usually a sign of bad coding patterns elsewhere in the code.
Approach C – simple and effective
I usually tend to favour this simple pattern when some users might not have a library installed, since PEP 8 discourages imports that are not at top level:
try:
    import other_module
    _HAS_OTHER_MODULE_ = True
except ImportError:
    _HAS_OTHER_MODULE_ = False

def get_my_object():
    assert _HAS_OTHER_MODULE_, "Please install other module"
    return other_module.foo(23)
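The same pattern, made runnable here with a standard-library module standing in for other_module (json is purely an illustration, not the real dependency):

```python
try:
    import json  # stand-in for the optional "other_module" dependency
    _HAS_JSON_ = True
except ImportError:
    _HAS_JSON_ = False

def get_my_object():
    # fail loudly, but only when the optional feature is actually used
    assert _HAS_JSON_, "Please install json"
    return json.dumps({"answer": 23})

print(get_my_object())  # {"answer": 23}
```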
Another common hack is to replace the module in sys.modules with a wrapper object:
import sys

class MockModule:
    def __init__(self, module):
        self.module = module

    def __getattr__(self, attr):
        if attr == 'dependency_required_var':
            try:
                import foo
                return self.module.dependency_required_var
            except ImportError:
                raise Exception('foo library is required to use dependency_required_var')
        else:
            return getattr(self.module, attr)

MockModule.__name__ = __name__
sys.modules[__name__] = MockModule(sys.modules[__name__])

dependency_required_var = 0
With PEP 562, we should simply be able to do the following in Python 3.7 and higher (though I couldn't get it to work):
def __getattr__(attr):
    if attr == 'dependency_required_var':
        try:
            import foo
            return dependency_required_var
        except ImportError:
            raise Exception('foo library is required to use dependency_required_var')
    else:
        return globals()[attr]
PEP 562 was accepted and shipped in Python 3.7, so module-level __getattr__ does work there; note, however, that it only fires for names that are not already defined at module level, which is the likely reason the snippet above doesn't trigger.
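For reference, a working variant of the PEP 562 approach, as a sketch: the module name lazy_mod and the json stand-in dependency are assumptions, and the key point is that the lazily computed name must not already exist in the module's namespace, or __getattr__ never fires.

```python
import sys
from types import ModuleType

# Build a module whose missing attribute is computed on first access (PEP 562).
mod = ModuleType("lazy_mod")

def _module_getattr(name):
    if name == "my_object":
        import json  # stand-in for the optional dependency
        value = json.dumps({"n": 23})
        setattr(mod, name, value)  # cache it so __getattr__ runs only once
        return value
    raise AttributeError(name)

mod.__getattr__ = _module_getattr
sys.modules["lazy_mod"] = mod

import lazy_mod
print(lazy_mod.my_object)  # {"n": 23}
```

In a real package you would put the __getattr__ function at the top level of my_module.py instead of building the module object by hand; the hand-built module just makes the sketch self-contained.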

How do I do python unittest doc's recommended method of lazy import?

Python's docs say that there's an alternative to local imports to prevent loading the module on startup:
https://docs.python.org/3/library/unittest.mock-examples.html#mocking-imports-with-patch-dict
...to prevent “up front costs” by delaying the import.
This can also be solved in better ways than an unconditional local import (store the module as a class or module attribute and only do the import on first use).
I don't understand the explanation in brackets. How do I do this? However I think about it, I seem to end up with local imports anyway.
The documentation likely refers to the use of importlib.import_module, which exposes Python's import functionality:
import importlib

class Example():
    TO_IMPORT = "os"  # the name of the module to lazily import
    __module = None

    def __init__(self):
        cls = type(self)
        if cls.__module is None:
            cls.__module = importlib.import_module(cls.TO_IMPORT)
Note that this way the module is imported only once, when the class is first instantiated (the result is cached on the class), and it is not placed in the global namespace.
Further, it allows you to change which module is imported, which could be useful e.g. in cases where the same class is used as an interface to different backends:
import importlib

class Example():
    def __init__(self, backend="some_module"):
        self.module = importlib.import_module(backend)
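A quick usage sketch of this backend-switching idea, with a standard-library module (math) standing in for a real backend:

```python
import importlib

class Example:
    def __init__(self, backend="math"):
        # the import is deferred until an instance is created
        self.module = importlib.import_module(backend)

e = Example("math")
print(e.module.sqrt(16.0))  # 4.0
```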

Inheriting math methods into a class with only static methods

I need to define a class which extends python's standard math module, without instantiating it (no need for that, all methods in the class are static):
import math

class more_math(math):
    @staticmethod
    def add_func(x):
        return math.sqrt(x) + 1
The code above doesn't run properly (script exits), with the error:
TypeError: Error when calling the metaclass bases
module.__init__() takes at most 2 arguments (3 given)
When the class declaration above is changed to class more_math:, more_math.add_func(x) runs without error. However, more_math.sqrt(x) (sqrt is a function in the math module) can't be called, since more_math doesn't have math as a base class.
Ideas on how could this be setup properly?
Think long and hard about whether you actually need to provide the functions math implements. You almost certainly don't; you probably just need to provide your extras. That said, if you do need to provide a more_math module that implements all the standard math functions too, the easiest way to do this is to do a from math import *. This will bring every function the math module defines into your module. This is considered very bad practice, because it pollutes the namespace of your module and can make it difficult to tell what is actually being used. However, in this case, pollution of your module's namespace is exactly what you want.
As @user2357112 commented, math is a module, not a class. You can create a more_math module simply by creating a more_math.py file with:
from math import *

def add_func(x):
    return sqrt(x) + 1
this module can be imported with import more_math or from more_math import add_func.
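If you really do want a class (rather than a module) that transparently exposes everything in math, one alternative is a metaclass whose __getattr__ falls back to the math module. This is a sketch of one possible design, not an established idiom:

```python
import math

class MathMeta(type):
    def __getattr__(cls, name):
        # anything not defined on the class is looked up in the math module
        return getattr(math, name)

class more_math(metaclass=MathMeta):
    @staticmethod
    def add_func(x):
        return math.sqrt(x) + 1

print(more_math.sqrt(4.0))      # 2.0
print(more_math.add_func(4.0))  # 3.0
```

Because __getattr__ is only consulted when normal attribute lookup fails, add_func shadows nothing, and every math function remains reachable through the class.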
math isn't a class, it's an instance of class types.ModuleType. You can verify this with isinstance(math, types.ModuleType) which will return True. Normally you can't define a subclass that inherits from an instance of another class. However, it is possible with a bit of hackery.
(I got the idea from a recipe for inheriting from an instance on the ActiveState website.)
Since it is a hack, one might not want to use it in production code. However I thought you (and other readers) might find it a least interesting, if not useful.
Script more_math.py:
from copy import deepcopy
import math
import sys

def class_from_instance(instance):
    copy = deepcopy(instance.__dict__)

    def __init__(self, *args, **kwargs):
        super(InstanceFactory, self).__init__(*args, **kwargs)
        self.__dict__.update(copy)

    InstanceFactory = type('InstanceFactory',
                           (instance.__class__,),
                           {'__init__': __init__})
    return InstanceFactory

class MoreMathModule(class_from_instance(math)):
    @staticmethod
    def added_func(x):
        return math.sqrt(x) + 1

# Replace this module with an instance of the class above.
ref, sys.modules[__name__] = sys.modules[__name__], MoreMathModule('more_math')

if __name__ == '__main__':
    import more_math
    x = 42
    print('more_math.sqrt({}) -> {:.6f}'.format(x, more_math.sqrt(x)))
    print('more_math.added_func({}) -> {:.6f}'.format(x, more_math.added_func(x)))
Output:
more_math.sqrt(42) -> 6.480741
more_math.added_func(42) -> 7.480741

Can I "fake" a package (or at least a module) in python for testing purposes?

I want to fake a package in Python. I want to define something so that the code can do
from somefakepackage.morefakestuff import somethingfake
and somefakepackage is defined in code, and so is everything below it. Is that possible? The reason for doing this is to trick my unit test into thinking that I have a package (or, as I said in the title, a module) on the Python path, which is actually just something mocked up for this unit test.
Sure. Define a class, put the stuff you need inside it, and assign the class to the right key in sys.modules.
class fakemodule(object):
    @staticmethod
    def method(a, b):
        return a + b

import sys
sys.modules["package.module"] = fakemodule
You could also use a separate module (call it fakemodule.py):
import fakemodule, sys
sys.modules["package.module"] = fakemodule
Yes, you can make a fake module:
from types import ModuleType

m = ModuleType("fake_module")

import sys
sys.modules[m.__name__] = m

# Some scripts may expect a file. Even though this file doesn't exist,
# it may be used by Python in error messages or introspection.
m.__file__ = m.__name__ + ".py"

# Add a function
def my_function():
    return 10

m.my_function = my_function
Note that this example uses an actual module (an instance of ModuleType), since some Python code may expect real modules rather than a dummy class.
This can be made into a utility function:
def new_module(name, doc=None):
    import sys
    from types import ModuleType
    m = ModuleType(name, doc)
    m.__file__ = name + '.py'
    sys.modules[name] = m
    return m

print(new_module("fake_module", doc="doc string"))
Now other scripts can run:
import fake_module
I took some of the ideas from the other answers and turned them into a Python decorator @modulize which converts a function into a module. This module can then be imported as usual. Here is an example.
@modulize('my_module')
def my_dummy_function(__name__):  # the function takes one parameter __name__
    # put module code here
    def my_function(s):
        print(s, 'bar')

    # the function must return locals()
    return locals()

# import the module as usual
from my_module import my_function
my_function('foo')  # foo bar
The code for the decorator is as follows
import sys
from types import ModuleType

class MockModule(ModuleType):
    def __init__(self, module_name, module_doc=None):
        ModuleType.__init__(self, module_name, module_doc)
        if '.' in module_name:
            package, module = module_name.rsplit('.', 1)
            get_mock_module(package).__path__ = []
            setattr(get_mock_module(package), module, self)

    def _initialize_(self, module_code):
        self.__dict__.update(module_code(self.__name__))
        self.__doc__ = module_code.__doc__

def get_mock_module(module_name):
    if module_name not in sys.modules:
        sys.modules[module_name] = MockModule(module_name)
    return sys.modules[module_name]

def modulize(module_name, dependencies=[]):
    for d in dependencies:
        get_mock_module(d)
    return get_mock_module(module_name)._initialize_
The project can be found here on GitHub. In particular, I created this for programming contests which only allow the contestant to submit a single .py file. This allows one to develop a project with multiple .py files and then combine them into one .py file at the end.
You could fake it with a class which behaves like somethingfake:
try:
    from somefakepackage.morefakestuff import somethingfake
except ImportError:
    class somethingfake(object):
        # define what you'd expect of somethingfake, e.g.:
        @staticmethod
        def somefunc():
            ...
        somefield = ...
TL;DR
Patch sys.modules using unittest.mock:
mock.patch.dict(
    sys.modules,
    {'somefakepackage': mock.Mock()},
)
Explanation
Other answers correctly recommend fixing sys.modules, but the proper way to do it is by patching it with mock.patch. That means replacing it temporarily (only while the tests run) with a fake object that optionally imitates the desired behaviour, and restoring it once the tests are finished so other test cases are not affected.
The code in the TL;DR section simply makes your missing package not raise ImportError. To give the fake package contents and imitate the desired behaviour, initialise mock.Mock(…) with the proper arguments (e.g. add attributes via Mock's **kwargs).
Full code example
The code below temporarily patches sys.modules so that it includes somefakepackage and makes it importable from the dependent modules without ImportError.
import sys
import unittest
from unittest import mock

class SomeTestCase(unittest.TestCase):
    def test_smth(self):
        # implement your testing logic, for example:
        self.assertEqual(
            123,
            somefakepackage_dependent.some_func(),
        )

    @classmethod
    def setUpClass(cls):  # called once before all the tests
        # define what to patch sys.modules with
        cls._modules_patcher = mock.patch.dict(
            sys.modules,
            {'somefakepackage': mock.Mock()},
        )
        # actually patch it
        cls._modules_patcher.start()
        # make the package globally visible and import it,
        # just as if you had imported it in the usual way
        # with an import statement at the top of the file,
        # but relying on the patched dependency
        global somefakepackage_dependent
        import somefakepackage_dependent

    @classmethod
    def tearDownClass(cls):  # called once after all tests
        # restore the initial sys.modules state
        cls._modules_patcher.stop()
To read more about setUpClass/tearDownClass methods, see unittest docs.
unittest's built-in mock subpackage is actually a very powerful tool. Dive deeper into its documentation to get a better understanding.
