I'm currently writing my own event handling system in Python. It lets users assign a callback for each event with a function:
myEventHandlingSystem.addCallback(event_name, event_callback)
However, each client has its own set of events, and sometimes the supported event list is very long. In that situation, manually adding each event with the method above is tedious and error-prone: an event might be forgotten, or assigned multiple different callbacks by mistake.
What I want to achieve is this: since each event has a string name, the user should define the callback with exactly the same name as the event. In the end, the user just needs to provide a list of events, and each callback is automatically associated with its event (and of course, if a callback is named differently, nothing should be assigned to that event), like:
SUPPORTED_EVENT1 = 'evt_Event1'
SUPPORTED_EVENT2 = 'evt_Event2'

clientConfig = json.load(configHandle)
# the JSON config file contains a field 'SupportedEventList': [SUPPORTED_EVENT1, ...]

def evt_Event1(*args):
    ...

myEventHandlingSystem.addCallbackFromEventList(clientConfig['SupportedEventList'])
Note that SupportedEventList is always a list of strings, where each element is the name of an event. Also, the callbacks are handled in the file where the myEventHandlingSystem class is defined.
A solution to this would be a dict that associates the functions with their names. Remember, functions are first-class objects in Python, so you can store them in a dict just like references to any other data. Hence you could do something along the lines of:
def evt_Event1(*args):
    ...

ALLOWED_CALLBACKS = {"evt_Event1": evt_Event1}

supportedEventList = ["evt_Event1", ...]

for event in supportedEventList:
    myEventHandlingSystem.addCallback(event, ALLOWED_CALLBACKS[event])
EDIT: To address the question as it stands now. Python has a built-in function called globals() which returns a dict mapping the names defined at module level in a given Python file to their objects, e.g. {"evt_Event1": <function at ...>}.
You could do:
SUPPORTED_EVENT1 = 'evt_Event1'
SUPPORTED_EVENT2 = 'evt_Event2'

clientConfig = json.load(configHandle)
# the JSON config file contains a field 'SupportedEventList': [SUPPORTED_EVENT1, ...]

def evt_Event1(*args):
    ...

GLOBALS = globals()

for callback_name in clientConfig['SupportedEventList']:
    eventSystem.addCallback(callback_name, GLOBALS[callback_name])
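Since globals() can also contain names that are not callbacks (and the config may list an event with no matching function), it is worth guarding the lookup. A minimal self-contained sketch, with a stub EventSystem standing in for myEventHandlingSystem:

```python
def evt_Event1(*args):
    print("Event1 fired with", args)


class EventSystem:
    """Stub standing in for myEventHandlingSystem."""

    def __init__(self):
        self.callbacks = {}

    def addCallback(self, event_name, callback):
        # guard against assigning two callbacks to one event by mistake
        if event_name in self.callbacks:
            raise ValueError(f"{event_name} already has a callback")
        self.callbacks[event_name] = callback


eventSystem = EventSystem()
supported_events = ["evt_Event1", "evt_Missing"]  # e.g. from the JSON config

module_names = globals()
for event_name in supported_events:
    callback = module_names.get(event_name)
    if callable(callback):
        eventSystem.addCallback(event_name, callback)
    # events without a matching function are silently skipped,
    # as the question requires

print(sorted(eventSystem.callbacks))  # ['evt_Event1']
```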
If you wanted a register-type API, you could do something along the lines of:
# ./decorator.py
REGISTERED = {}

def register(func):
    REGISTERED[func.__name__] = func
    return func
and in the main code:

from .decorator import REGISTERED, register

@register
def callback():
    ...
Alternatively, you could register the new callback directly into your event handling system using this kind of decorator:
def register(func):
    myEventHandlingSystem.addCallback(func.__name__, func)
    return func
Also, you could check out pluggy and similar plugin frameworks that let people add plugin functionality, which roughly fits your pattern.
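Putting the register-decorator idea together, here is a runnable end-to-end sketch (the event names and the final wiring step are illustrative; real code would pass the bound callbacks to myEventHandlingSystem):

```python
REGISTERED = {}


def register(func):
    """Collect callbacks under their function names at definition time."""
    REGISTERED[func.__name__] = func
    return func


@register
def evt_Event1(*args):
    return ("Event1", args)


@register
def evt_Event2(*args):
    return ("Event2", args)


# later, wire up only the events this client actually supports
supported_events = ["evt_Event1"]
bound = {name: REGISTERED[name] for name in supported_events if name in REGISTERED}

print(sorted(bound))               # ['evt_Event1']
print(bound["evt_Event1"](1, 2))   # ('Event1', (1, 2))
```

Because the decorator runs at import time, simply importing the module that defines the callbacks fills the registry; no manual addCallback calls are needed.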
Related
I want to implement a server-side event analytics feature (using https://segment.com/). I am clear on using the API; we just have to add the event API calls inside any function whose action we need to monitor. For example, for creating a new user account in my application, I have an event inside the function create_user:
def create_user(email_id, name, id):
    # some code to add the user to my table
    ....
    ....
    # calling the Segment API to track the event
    analytics.track(user_id=email_id, event="Add User", properties={"username": name})
The above code works, but design-wise I feel it can be done better: create_user should only add the user, yet here it also contains the tracking call, and I have to modify every area I want to monitor by adding this analytics API, which fills my code with irrelevant logic. I read about decorators, but my analytics event depends on the logic inside the function (e.g. only if the user email is valid do I add the user to the DB and trigger the event), so that doesn't seem to help.
So I am seeking help in handling this scenario with a better approach while keeping the code clean. Is there any design pattern for solving this case?
We can achieve this using a decorator plus one separate function, as shown below. With this code you call confirm_logging() at the conditional point in your function where the data should actually be logged, while the inputs to the user function are logged temporarily on each call.
temp_log_data = []

def confirm_logging():
    '''
    Final logging function; once called from the main function it
    logs the data as needed. Customize how it should be logged.
    '''
    print("Finally logging the data", temp_log_data)
    # Can be extended to log into a DB.
    temp_log_data.clear()

def logging_func(func):
    '''
    Decorator that temporarily logs the arguments of every decorated
    call into temp_log_data; the mechanism can be customized as required.
    '''
    def wrapper_function(*args, **kwargs):
        # The print below can be customized per your requirement;
        # you can also call any other function instead of print.
        temp_log_data.append(list(args))
        print("Temporary logging data here -", args)
        return func(*args, **kwargs)
    return wrapper_function

@logging_func
def create_user(greet, name, surname):
    '''
    Your main function, containing only its core functionality
    '''
    print("{} {} {}".format(greet, name, surname))
    if name == 'Abhi':
        confirm_logging()

create_user('Welcome', 'Abhi', 'Jain')
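A variation on the same idea, if the function can signal success through its return value: the decorator fires the tracking call only when the wrapped function reports success, which keeps the analytics logic entirely outside create_user while still respecting its internal validation. The track() function below is a stand-in for analytics.track(); the event name and properties are illustrative:

```python
import functools


def track(event_name, properties):
    # stand-in for analytics.track(...)
    print("tracking", event_name, properties)


def track_event(event_name):
    """Decorator factory: track an event whenever the wrapped
    function returns a truthy result (i.e. reports success)."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            result = func(*args, **kwargs)
            if result:  # only track when the function reports success
                track(event_name, result)
            return result
        return wrapper
    return decorator


@track_event("Add User")
def create_user(email_id, name):
    if "@" not in email_id:   # validation stays inside the function
        return None           # no event fired on failure
    # ... add the user to the table ...
    return {"user_id": email_id, "username": name}


create_user("abhi@example.com", "Abhi")   # fires the event
create_user("not-an-email", "Abhi")       # does not
```

The trade-off is that the function's return value becomes part of the tracking contract, but in exchange the conditional "only track if valid" logic no longer needs a manual confirm_logging() call.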
I am creating a text adventure engine. The engine itself works great, but I am trying to implement custom events for game creators with callbacks. I have a main.py file that instantiates all of the game objects and builds the game. The problem is that I seem to have trouble accessing the objects after I have instantiated them. Have a look at this pseudo-example code:
import engine

def build():
    # create new item objects
    key = engine.Item()
    door = engine.Item(a=0)

    # set state of door
    door.locked(requirement=key)

    # CALLBACK FUNCTION
    def on_door_unlock():
        # ACCESSING THE ATTRIBUTE `a` ON `door` WORKS FINE
        door.a = 1
        # ACCESSING THE `key` OBJECT THROWS UnboundLocalError
        del key

    # assign callback to unlock event
    door.on_unlock(on_door_unlock)

build()
engine.run()
My file is obviously much larger than this, but it is just as simple, and this code isolates my problem. I can access the attributes of any object, but when I try to use the del keyword on an object itself I get UnboundLocalError: local variable 'key' referenced before assignment.
- my callback function is written after the creation of the key object
- my callback is assigned to the event after the function is defined
- I can access object attributes but can't access the object itself
Everything seems to be in order, so what is the problem? How can I write callback functions that can access the instances of the objects I create?
del key means no more and no less than "remove the name key from the local scope". But this name has never been bound in the local scope of on_door_unlock, and even if it had been, removing it from there would not do anything to the scope of build.
One of several better approaches is to create an explicitly named registry of objects, for example a dict called ALL_OBJECTS. Create the key inside it, and remove it by referring to it by name in your callback:
ALL_OBJECTS = {}

def build():
    ALL_OBJECTS['copperKey'] = engine.Item()
    ...

    def on_door_unlock():
        del ALL_OBJECTS['copperKey']
        ...
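The two behaviors can be reproduced without the engine at all. In the broken version, the del statement makes key a local variable of the inner function at compile time, so deleting it before any local assignment raises exactly the UnboundLocalError from the question; the registry version mutates a dict instead of rebinding a name, so it works:

```python
def build_broken():
    key = object()

    def on_unlock():
        # `del` is a binding operation, so `key` becomes local to
        # on_unlock and is unbound when this line runs
        del key

    return on_unlock


try:
    build_broken()()
except UnboundLocalError as exc:
    print("broken:", exc)


ALL_OBJECTS = {}

def build():
    ALL_OBJECTS["copperKey"] = object()

    def on_unlock():
        # mutating the dict removes the entry; no name is rebound
        del ALL_OBJECTS["copperKey"]

    return on_unlock


build()()
print("copperKey" in ALL_OBJECTS)  # False
```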
I am using APScheduler and I need to add jobs with a programmatically created list of trigger options. That is, I can't write code where I pass trigger parameters directly to add_job (such as "second"="*/5" etc.).
The documentation mentions that you can create a trigger instance and pass that to add_job as the trigger parameter, instead of "cron" or "interval", etc.
I would like to try that, since the trigger constructor takes kwargs-style parameters and I should be able to pass it a dictionary.
I have not found an example of how to do this. I have tried:
from apscheduler.triggers import cron

# skipping irrelevant code

class Schedules(object):

    # skipping irrelevant code

    def add_schedule(self, new_schedule):
        # here I create trigger_args as {'second': '*/5'}, for example
        trigger = cron(trigger_args)
This fails with: TypeError: 'module' object is not callable
How do I instantiate a trigger object?
I found a solution to my main problem without figuring out how to create a trigger instance (though I am still curious how to do that).
The main issue I had is that I need to programmatically create the trigger parameters. Knowing a bit more now about parameter passing in Python, I see that if I build a dict of all the add_job parameters, not just the trigger parameters, I can pass them this way:
job_def = {}
# here I programmatically create the trigger parameters and add them to the dict
job_def["func"] = job_function
job_def["trigger"] = "cron"
job_def["args"] = [3]

new_job = self.scheduler.add_job(**job_def)
I thought it would be possible to create a custom Dexterity factory that calls the default factory and then adds some subcontent (in my case Archetypes-based) to the created 'parent' Dexterity content.
I have no problem creating and registering the custom factory.
However, regardless of what method I use (to create the AT subcontent), the subcontent creation fails when attempted from within the custom factory.
I've tried everything from plone.api to invokeFactory to direct instantiation of the AT content class.
In most cases, traceback shows the underlying Plone/CMF code tries to get portal_types tool using getToolByName and fails; similarly when trying to instantiate the AT class directly, the manage_afterAdd then tries to access reference_catalog, which fails.
Is there any way to make this work?
A different approach is simply to add an event handler for IObjectAddedEvent and create your subcontent there using the standard APIs.
After some trials and errors, it turns out this is possible:
from zope.container.interfaces import INameChooser
from zope.component.hooks import getSite
from plone.dexterity.factory import DexterityFactory
class CustomDexterityFactory(DexterityFactory):
def __call__(self, *args, **kw):
folder = DexterityFactory.__call__(self, *args, **kw)
# we are given no context to work with so need to resort to getSite
# hook or zope.globalrequest.getRequest and then wrap the folder
# in the context of the add view
site = getSite()
wrapped = folder.__of__(site["PUBLISHED"].context)
# invokeFactory fails if the container has no id
folder.id = "tmp_folder_id"
# standard AT content creation
wrapped.invokeFactory("Page", "tmp_obj_id")
page = wrapped["tmp_obj_id"]
new_id = INameChooser(service_obj).chooseName(title, page)
page.setId(new_id)
page.setTitle(title)
# empty the id, otherwise it will keep
folder.id = None
return folder
While the above works, at some point the created Page gets indexed (perhaps by invokeFactory), which means there will be a bogus entry in the catalog. Code to remove the entry could be added to the factory.
Overall, it would be easier to just create an event handler, as suggested by @keul in his answer.
I'm working on a project in SQLAlchemy. I have a Command class with custom serialization/deserialization methods called toBinArray() and fromBinArray(bytes). I use them for TCP communication (I don't want to use pickle because my functions produce smaller output).
Command has several subclasses, let's call them CommandGet, CommandSet, etc. They have additional methods and attributes, and redefine the serialization methods to handle their own attributes. I keep all of them in one table using the polymorphic_identity mechanism.
The problem is that there are a lot of subclasses and each has different attributes. I previously wrote a mapping for each of them, but that way the table ends up with a huge number of columns.
I would like a mechanism that serializes (via self.toBinArray()) every instance into the attribute self._bin_array (stored in a Binary column) before every write to the DB, and loads (via self.fromBinArray(value)) the attributes after every load of an instance from the DB.
I have already found the answer to part of my question: I can call self.fromBinArray(self._bin_array) in a function decorated with @orm.reconstructor. It is inherited by every Command subclass and executes the proper inherited version of fromBinArray(). My question is how to automate serialization when writing to the DB (I know I can set self._bin_array manually, but that's very troublesome).
P.S. Part of my code, my main class:
class Command(Base):
    __tablename__ = "commands"

    dbid = Column(Integer, Sequence("commands_seq"), primary_key=True)
    cmd_id = Column(SmallInteger)
    instance_dbid = Column(Integer, ForeignKey("instances.dbid"))
    type = Column(String(20))
    _bin_array = Column(Binary)

    __mapper_args__ = {
        "polymorphic_on": type,
        "polymorphic_identity": "Command",
    }

    @orm.reconstructor
    def init_on_load(self):
        self.fromBinArray(self._bin_array)

    def fromBinArray(self, b):
        (...)

    def toBinArray(self):
        (...)
EDIT: I've found a solution (below, in my answer), but are there any other solutions? Maybe some shortcut to register the event-listening function inside the class body?
It turns out the solution was simpler than I expected: you need to use an event listener for the before_insert (and/or before_update) event. I found the information (source) that:
reconstructor() is a shortcut into a larger system of “instance level”
events, which can be subscribed to using the event API - see
InstanceEvents for the full API description of these events.
And that gave me the clue:
@event.listens_for(Command, 'before_insert', propagate=True)
def serialize_before_insert(mapper, connection, target):
    print("serialize_before_insert")
    target._bin_array = target.toBinArray()
You can also use the event.listen() function to bind the event listener to the class, but I personally prefer the decorator way. It's very important to add propagate=True in the declaration so that subclasses inherit the listener!
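For completeness, here is a minimal self-contained sketch of the event.listen() variant, assuming SQLAlchemy 1.4+. The Command class below is a stripped-down stand-in for the one in the question, and toBinArray() is reduced to a trivial encoder:

```python
from sqlalchemy import Column, Integer, LargeBinary, String, create_engine, event
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()


class Command(Base):
    __tablename__ = "commands"

    dbid = Column(Integer, primary_key=True)
    payload = Column(String)
    _bin_array = Column(LargeBinary)

    def toBinArray(self):
        # stand-in for the real serializer
        return self.payload.encode()


def serialize_before_save(mapper, connection, target):
    target._bin_array = target.toBinArray()


# event.listen() as an alternative to the decorator;
# propagate=True so subclasses inherit the listener
event.listen(Command, "before_insert", serialize_before_save, propagate=True)
event.listen(Command, "before_update", serialize_before_save, propagate=True)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    cmd = Command(payload="hello")
    session.add(cmd)
    session.commit()
    print(cmd._bin_array)  # b'hello'
```

Both registration styles use the same mapper-level event machinery, so the choice is purely stylistic.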