I have a class with only class methods. Is this a Pythonic way of namespacing? If not, what is the best way to group similar kinds of methods?
class OnlyClassMethods(object):

    @classmethod
    def method_1(cls):
        pass

    @classmethod
    def method_2(cls):
        pass
A class is meant to have instances, not to serve as a namespace. If your class is never instantiated, it is not doing the job classes exist for in Python.
If you want to namespace a group of related functions, create a new module (that is, another .py file) and import it.
Example
Here we create a module named helpers which contains some related methods. This module can then be imported in our main file.
helpers.py
def method_1():
    ...

def method_2():
    ...
main.py
import helpers
helpers.method_1()
helpers.method_2()
Related
I'm building a Python library magic_lib in which I need to instantiate a Python class (let's call it SomeClass) which is defined in the Python application that would import magic_lib.
What's the appropriate way to use/work on SomeClass when I develop magic_lib, since I don't have SomeClass in the magic_lib repo?
I'm thinking of creating a dummy SomeClass like this, and then excluding it during packaging:
from typing import Any

class SomeClass:
    def __init__(self, *args: Any, **kwargs: Any):
        pass
I'm wondering if this is the right approach. If not, any suggestions on how I could approach this problem would be welcome.
Thanks.
Additional thoughts: maybe I could use importlib like this?
import importlib

my_module = importlib.import_module('some.module.available.in.production')
some_class = my_module.SomeClass()
Here is a more specific example:
Let's say I have two repos: workflows and magic_lib. The workflows repo defines a class named Task. Generally, we define tasks directly within the workflows repo, and everything works just fine. Now let's say I want to use magic_lib to programmatically define tasks in the workflows repo, something like the following:
from magic_lib import Generator
tasks: List[Task] = Generator().generate_tasks()
In order to do that, within magic_lib, I need to somehow have access to the class Task so that I can have it returned through the function generate_tasks(). I cannot really import Task defined in workflows from magic_lib. My question is how I can have access to Task within magic_lib.
Original question:
In Python, there are decorators:
from <MY_PACKAGE> import add_method

@add_method
class MyClass:
    def old_method(self):
        print('Old method')
Decorators are functions which take classes/functions/... as an argument:
def add_method(cls):
    class Wrapper(cls):
        def new_method(self):
            print('New method')
    return Wrapper
MyClass is passed as the cls argument to the add_method decorator function. The function can return a new class which inherits from MyClass:
x = MyClass()
x.old_method()
x.new_method()
We can see that the method has been added. YAY!
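Putting the pieces together, here is a self-contained, runnable version of the example, with add_method defined locally instead of imported from a package (the methods return their messages so the results are easy to check):

```python
def add_method(cls):
    # The decorator receives the class object and returns a subclass
    # of it that adds new_method on top of the original methods.
    class Wrapper(cls):
        def new_method(self):
            return 'New method'
    return Wrapper

@add_method
class MyClass:
    def old_method(self):
        return 'Old method'

x = MyClass()
print(x.old_method())  # Old method
print(x.new_method())  # New method
```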
So to recap, decorators are a great way to pass your user's custom class to your library. Decorators are just functions so they are easy to handle.
Modified question:
Classes can be passed to functions and methods as arguments:
from magic_lib import generate_five_instances
tasks: List[Task] = generate_five_instances(Task)
def generate_five_instances(cls):
    return [cls() for _ in range(5)]
If you come from another language, you might find this weird, but classes are FIRST CLASS CITIZENS in Python. That means you can assign them to variables and pass them as arguments.
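A runnable sketch of this pattern; the Task class here is a stand-in for the real one defined in the workflows repo:

```python
from typing import List, Type

class Task:
    """Stand-in for the Task class defined in the workflows repo."""
    pass

def generate_five_instances(cls: Type) -> List:
    # cls is the class object itself; calling it creates new instances
    return [cls() for _ in range(5)]

tasks = generate_five_instances(Task)
print(len(tasks))                               # 5
print(all(isinstance(t, Task) for t in tasks))  # True
```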
I have a Python class called CreateDB, with a method execute(module_name). The module_name variable is a string and tells me which class by that name should be called. CreateDB does not know and does not care where class Car is defined; it only knows that it's not defined in the same file as it is. I know which class and which function to call, but I don't know how to access the class.
For example:
# in folder helpers
class CreateDB():
    def execute(self, module_name):
        # call the method from the class with that module_name
        global_dict[module_name].run_sql_create()

# in a different folder, classes
class Car():
    @staticmethod
    def run_sql_create():
        # do work
        ...

c = CreateDB()
c.execute("Car")
The question is how to do this: using signals, or by caching all classes in a global dictionary and accessing them that way?
What you're trying to do is not well-defined. There could be several classes with the same name defined in different modules, which is perfectly valid. How would you tell which one the user wants?
I'd suggest two alternatives:
pre-import all possible classes you want to support. You can then access them via globals() (e.g. cls = globals()[class_name]; cls.run_sql_create())
have the user also specify the module to import from. Then you can dynamically import the module by name (e.g. with importlib.import_module; the older __import__ also works, and the imp module is deprecated), then access the class defined in it: mod = importlib.import_module(mod_name); getattr(mod, class_name).run_sql_create()
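A sketch of the second alternative using importlib.import_module; the stdlib json module and its JSONDecoder class stand in for the user-specified module and class names:

```python
import importlib

def get_class(module_name, class_name):
    # Import the module by its dotted name, then look the class up on it
    mod = importlib.import_module(module_name)
    return getattr(mod, class_name)

# 'json' and 'JSONDecoder' stand in for the user-supplied names
decoder_cls = get_class('json', 'JSONDecoder')
print(decoder_cls.__name__)              # JSONDecoder
print(decoder_cls().decode('{"a": 1}'))  # {'a': 1}
```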
There are probably other ways to do it, but eval could be your friend here:
class Car(object):
    pass

classname = "Car"
the_class = eval(classname, globals(), locals())
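Note that eval executes arbitrary expressions, so if classname can come from untrusted input, a plain dictionary lookup achieves the same thing without that risk:

```python
class Car(object):
    pass

classname = "Car"
# No code execution; raises KeyError if the name is not defined here
the_class = globals()[classname]
print(the_class is Car)  # True
```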
This worked for me
I am a bit lost while writing the test case for the UserCompanyRateLimitValidation class. I am finding it difficult to mock the class that is instantiated inside it.
class UserCompanyRateLimitValidation:
    def __init__(self, user_public_key):
        self.adapter = UserAdapter(user_public_key)
        container = self.adapter.get_user_company_rate_limit()
        super(UserCompanyRateLimitValidation, self).__init__(
            container, UserCompanyRateLimitValidation.TYPE)
I have to test this class. I have written a test case something like this. I have tried to mock the UserAdapter class, but I am not able to do so completely.
def test_case_1(self):
    self.user_public_key = 'TEST_USER_PUBLIC_KEY_XXXXXX1234567890XXXXX'
    UserAdapter_mock = mock(UserAdapter)
    when(UserAdapter_mock).get_user_company_rate_limit().\
        thenReturn(get_fake_container_object())
    self.test_obj = UserCompanyRateLimitValidation(self.user_public_key)
Here, as you can see, I have mocked the get_user_company_rate_limit() call from the function under test, container = self.adapter.get_user_company_rate_limit(),
but I am still not able to figure out a way to mock this call:
self.adapter = UserAdapter(user_public_key)
It is quite simple if you know the trick.
Creating an object in Python is very much like a function call to the class object. UserCompanyRateLimitValidation is 'invoking' UserAdapter(user_public_key). You want to stub the return value of that 'call' to return UserAdapter_mock.
You can stub this like you would stub a function in a module. The line you're missing is:
when(module_declaring_UserAdapter)\
    .UserAdapter(self.user_public_key)\
    .thenReturn(UserAdapter_mock)
After that, calling module_declaring_UserAdapter.UserAdapter(self.user_public_key) will return UserAdapter_mock.
Here's the link to the section in the manual: https://code.google.com/p/mockito-python/wiki/Stubbing#Modules
You have to be careful to choose the right module_declaring_UserAdapter, due to the way the from ... import ... statement works. From your code, I'd say you have to pick the module in which UserCompanyRateLimitValidation is declared.
Here is another way of looking at it. Say I have this code in which I would like to mock MyClass:
from some.module import MyClass

class AnotherClass:
    def __init__(self):
        self.my_class = MyClass()
One would typically write the import as shown above. With a slight modification of the import, we can get it into a state where MyClass can be mocked using mockito:
from some import module

class AnotherClass:
    def __init__(self):
        self.my_class = module.MyClass()
Then the mocking would work like so:
from some import module
when(module).MyClass().thenReturn(mock())
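For comparison, here is the same stubbing idea with the standard library's unittest.mock instead of mockito-python; the key point is identical: patch the name in the module where it is looked up. The UserAdapter and UserCompanyRateLimitValidation below are simplified stand-ins for the classes in the question, and everything lives in one file so the patch target is this module itself:

```python
from unittest import mock

class UserAdapter:
    """Stand-in for the real adapter, which would talk to a backend."""
    def __init__(self, user_public_key):
        self.user_public_key = user_public_key

    def get_user_company_rate_limit(self):
        raise RuntimeError("would hit the network in production")

class UserCompanyRateLimitValidation:
    def __init__(self, user_public_key):
        self.adapter = UserAdapter(user_public_key)
        self.container = self.adapter.get_user_company_rate_limit()

fake_adapter = mock.Mock()
fake_adapter.get_user_company_rate_limit.return_value = {'limit': 100}

# Patch UserAdapter in the module where UserCompanyRateLimitValidation
# looks it up; in this single-file sketch, that is this very module.
with mock.patch(f'{__name__}.UserAdapter', return_value=fake_adapter):
    validation = UserCompanyRateLimitValidation('TEST_KEY')

print(validation.container)  # {'limit': 100}
```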
I'm currently writing a wrapper in python for a lot of custom company tools.
I'm basically going to break each tool into its own py file with a class containing the call to the tool as a method.
These will all be contained in a package.
Then there'll be a master class that will import all from the package, then inherit from each and every class, so as to appear as one cohesive class.
masterClass.py
pyPackage
- __init__.py
- module1.py
-- class Module1
--- method tool1
- module2.py
-- class Module2
--- method tool2
etc.
Right now, I'm autogenerating the master class file to inherit from the packages modules, but I was wondering if there was a more elegant way to do it?
ie
from package import *

class MasterClass(package.all):
    pass
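If you do want to build the combined class at runtime rather than autogenerating a file, the three-argument form of type() creates a class from a tuple of bases; the two classes below are inline stand-ins for the ones collected from the package's modules:

```python
# Stand-ins for the classes collected from pyPackage's modules
class Module1:
    def tool1(self):
        return 'tool1'

class Module2:
    def tool2(self):
        return 'tool2'

# type(name, bases, namespace) builds a new class object at runtime
MasterClass = type('MasterClass', (Module1, Module2), {})

m = MasterClass()
print(m.tool1(), m.tool2())  # tool1 tool2
```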
I am not really sure what your reasons are for trying to create a larger master class with inheritance from all of your other smaller classes, but the first thing that comes to mind is that the design might be backwards.
What you might want to do instead is have a base class, for which all your command modules can subclass. You can use packages as intelligent namespaces for collecting similar functionality such as "network" or "compression", etc.
class ToolBase(object):
    # common functionality here
    # class attributes
    # base constructor
    # methods: static, class, instance
    pass

class Module1(ToolBase):
    def __init__(self):
        super(Module1, self).__init__()

class Module2(ToolBase):
    def __init__(self):
        super(Module2, self).__init__()
In this base class example, every subclass can expect the functionality of ToolBase to be there, including any setup from the constructor such as a database connection, sockets, or other resources.
And maybe a project structure like this:
pyPackage
    __init__.py
        # class ToolBase
        # __all__ = ['network', 'compress']
    network/
        __init__.py
        module1.py
            # class Module1
    compress/
        __init__.py
        module2.py
            # class Module2
Update
As a way to use the base class and have a shared "port" object, you could make it a class-level attribute that is only initialized once:
class ToolBase(object):
    _PORT = None

    def __init__(self):
        if ToolBase._PORT is None:
            # assign on the class (not self) so every instance shares it
            ToolBase._PORT = "PORT"

    @property
    def port(self):
        return self._PORT

class Foo(ToolBase):
    def go(self):
        print(self.port)
Now your classes are useful on their own, and will share the port object.
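A self-contained check of the sharing behavior; note the assignment targets the class rather than self, because self._PORT = ... would create a per-instance attribute instead of a shared one:

```python
class ToolBase(object):
    _PORT = None

    def __init__(self):
        if ToolBase._PORT is None:
            ToolBase._PORT = "PORT"  # set on the class so it is shared

    @property
    def port(self):
        return self._PORT

class Foo(ToolBase):
    pass

class Bar(ToolBase):
    pass

print(Foo().port is Bar().port)  # True: one shared object
```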
I've written some code that contains a main and a number of subclasses that inherit variables from a superclass.
E.g.
class superclass(object):
    def __init__(self, var1, var2):
        self.var1 = var1
        self.var2 = var2

class subclass1(superclass):
    def method1(self):
        pass

class subclass2(superclass):
    def method1(self):
        pass
The main isn't shown, nor an option factory which is used to choose the subclass to call, but hopefully the info given will be sufficient.
I wish to convert those classes to standalone modules that can be imported.
It is expected that additional subclasses will be written in the future, so I was wondering whether it is possible to save each subclass and the superclass as separate modules, with the subclasses still able to use and access the superclass's variables and definitions.
The reasoning behind this is to simplify the writing of any future subclasses. They can then be written as standalone modules, and the previous subclasses and the superclass don't have to be touched as part of the development, while still having access to the superclass's variables and definitions.
I'm not sure how it would work or if it can be done that way. Do I just save all the classes as superclass.py, subclass1.py and subclass2.py and import them all?
Hope that made sense.
Thanks.
Yes, it's totally possible. Just don't forget to import the superclass in the subclass file:
from module_where_superclass_is import SuperClass

class SubClass(SuperClass):
    def method1(self):
        ...  # your implementation
Yes, obviously that is possible. That is the beauty of Python!
Module 1
class base:
    def p(self):
        print("Hello Base")
Module 2
from module1 import base

class subclass(base):
    def pp(self):
        print("Hello child")
Python Shell
>>> from module2 import subclass
>>> ob = subclass()
>>> ob.p()
Hello Base
>>> ob.pp()
Hello child
:)
Sure, no problem. You would just do it like
import ModuleWithSuperclass

class subclass1(ModuleWithSuperclass.superclass):
    def method1(self):
        pass
In superclass.py:
class superclass(object):
    def __init__(self, var1, var2):
        self.var1 = var1
        self.var2 = var2
Then in subclass1.py:
from superclass import superclass

class subclass1(superclass):
    def method1(self):
        pass
You can subclass any class that exists in the namespace.
If your superclass exists in some weird location you can add it to your script via this answer
I suggest you have a read of the python modules tutorial. It will give you a lot more information on how you can arrange your classes into hierarchically arranged modules and packages.
Hmm, it will definitely work, but there is always a "however".
If, for example, you used the answer from @juliomalegria, you may find it hard to get all the subclasses from the parent. That is, in your Python shell, if you go:
from module_where_superclass_is import SuperClass
SuperClass.__subclasses__()
you get nothing, because the module defining the subclass has never been imported. But if you go:
from module_where_superclass_is import SuperClass
from module_where_subclass_is import SubClass
SuperClass.__subclasses__()
then you can see the correct result come out.
The solution to access subclasses in different modules from the superclass is the following:
You need to import the modules containing your subclasses:
import importlib

importlib.import_module("example.module1")
importlib.import_module("other.module2")
Then you can use the built-in method MySuperClass.__subclasses__()
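A self-contained sketch of the pattern; the subclasses are defined inline here, whereas in a real project they would live in example.module1 and other.module2 and only appear in __subclasses__() after the importlib.import_module calls above have run their modules:

```python
class MySuperClass:
    pass

# In a real project these would live in separate modules and would only
# register as subclasses once those modules have been imported.
class SubClassA(MySuperClass):
    pass

class SubClassB(MySuperClass):
    pass

print([c.__name__ for c in MySuperClass.__subclasses__()])
# ['SubClassA', 'SubClassB']
```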