I have created a custom Archetypes content type called "Résumé" and would like to enforce a limitation that lets a Member add only one item of this type inside a folder. Even better would be redirecting the member to the edit page of his or her item, if it already exists in that folder.
How can I enforce this limitation and provide this extra functionality?
A solution to a similar use case for Plone 3 can be found in the eestec.base package. We did it by overriding createObject.cpy and adding a special check there.
Code is in the collective SVN, at http://dev.plone.org/collective/browser/eestec.base/trunk/eestec/base/skins/eestec_base_templates/createObject.cpy, lines 20-32.
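For reference, the check there is just a few lines in a CMFFormController script; roughly, it looks like the sketch below (a reconstruction for illustration, not the actual eestec.base code — the 'Resume' type name and the redirect target are placeholders):

# sketch of the kind of check added to createObject.cpy
existing = [brain for brain in context.getFolderContents()
            if brain.portal_type == type_name]
if type_name == 'Resume' and existing:
    # an item of this type already exists here:
    # send the member to the edit form of the existing item
    return context.REQUEST.RESPONSE.redirect(existing[0].getURL() + '/edit')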
Well, it is a sort of validation constraint, so maybe add a validator to the title field that in reality does not bother about the title, but checks the user and container instead? (I think a field validator is passed enough information to pull this off; if not, overriding the post_validate method or listening to the corresponding event should work.)
If you try this, bear in mind that post_validate is already called while the user is editing (i.e. on moving focus out of a field).
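A minimal sketch of such a validator, assuming the standard Products.validation interface and that the instance is passed in the keyword arguments (the class name and message are mine, purely illustrative):

from Acquisition import aq_inner, aq_parent
from Products.validation.interfaces.IValidator import IValidator

class OnePerFolderValidator:
    """A 'validator' that ignores the value and checks the container instead."""
    __implements__ = (IValidator,)

    def __init__(self, name):
        self.name = name

    def __call__(self, value, *args, **kwargs):
        instance = kwargs.get('instance')
        parent = aq_parent(aq_inner(instance))
        for sibling in parent.objectValues():
            if (sibling.portal_type == instance.portal_type
                    and sibling.UID() != instance.UID()):
                return "Only one item of this type is allowed per folder."
        return 1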
I don't know if it's best practice, but you can hook into at_post_create_script from BaseObject on creation and into manage_beforeDelete on deletion.
For example:
from Acquisition import aq_inner, aq_parent
from Products.Archetypes.BaseObject import BaseObject
from Products.ATContentTypes.lib import constraintypes

class YourContentType(folder.ATFolder):
    # [...]

    def at_post_create_script(self):
        origin = aq_parent(aq_inner(self))
        origin.setConstrainTypesMode(constraintypes.ENABLED)  # enable constrained types
        org_types = origin.getLocallyAllowedTypes()  # returns an immutable tuple
        new_types = [x for x in org_types if x != self.portal_type]  # filter the tuple into a list
        origin.setLocallyAllowedTypes(new_types)

    def manage_beforeDelete(self, item, container):
        BaseObject.manage_beforeDelete(self, item, container)  # from BaseObject
        self._v_cp_refs = None  # from BaseObject
        origin = aq_parent(aq_inner(self))
        origin.setConstrainTypesMode(constraintypes.ENABLED)  # enable constrained types
        org_types = origin.getLocallyAllowedTypes()  # returns an immutable tuple
        new_types = list(org_types) + [self.portal_type]  # note: list.append() returns None
        origin.setLocallyAllowedTypes(new_types)
Note: There is also a method called setImmediatelyAddableTypes(), which you may want to explore.
Note II: This does not survive content migration.
I'm using Odoo 10. After a new user signs up (through localhost:8069/web/signup), I want him to be automatically added to a group I created in my own custom module (the user will need approval from an admin later on so he can be converted to a regular portal user; after signup he will have restricted access).
I have tried many things. My latest effort looks like this:
class RestrictAccessOnSignup(auth_signup_controller.AuthSignupHome):

    def do_signup(self, *args):
        super(RestrictAccessOnSignup, self).do_signup(*args)
        request.env['res.groups'].sudo().write({'groups_id': 'group_unuser'})
Note that I do import odoo.addons.auth_signup.controllers.main as auth_signup_controller so that I can override the auth_signup controller.
I have located that method as the one responsible for doing the signup. So I call it in my new method and then try to change the newly created user's groups_id.
What I'm missing is a fundamental understanding of how to overwrite a field's value from another model inside a controller method context. I'm using the request object, although I'm not sure about it. I have seen people using self.pool['res.users'] (e.g.) for such purposes, but I don't understand how to apply it in my problem's context.
I believe, also, that there is a way to change the default group for a user after it is created (I would like to know it), but I also want to understand how to solve the general problem (accessing and overwriting a field's value from another module).
Another weird thing is that the field groups_id does exist in the res.users model, but it does not appear as a column when I look at the res.users table in my pgAdmin interface... Any idea why?
Thanks a lot!
I don't know if, after calling:
super(RestrictAccessOnSignup, self).do_signup(*args)
you will have access to the user record in the request object. If so, just add the group to the user like this; if not, you have to find where the user record (or its id) is saved after calling do_signup, because you need to update that record to add this group.
env = request.env  # shortcut to save typing
env.user.sudo().write({'groups_id': [
    # in Odoo, relational fields accept a list of commands;
    # command 4 means "link the record with the given id"
    # (the second element must be an integer);
    # env.ref() returns a record, so we take its .id
    (4, env.ref('your_module_name.group_unuser').id),
]})
And if the changes are not committed to the database, you may need to commit them yourself:
request.env.cr.commit()
Note: you must pass the full XML ID (module_name.record_id) to env.ref().
This is what worked for me:
def do_signup(self, *args):
    super(RestrictAccessOnSignup, self).do_signup(*args)
    group_id = request.env['ir.model.data'].get_object('academy2', 'group_unuser')
    group_id.sudo().write({'users': [(4, request.env.uid)]})
In get_object I pass the module name and the XML ID of the group I want to fetch as arguments.
It is still not clear to me why 'ir.model.data' is the model used here, but this works like a charm. Please note that here we are adding a user to the group, not a group to the user, which to me actually makes more sense.
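For comparison, the inverse direction (linking the group on the user record, as the other answer does) would look roughly like this, assuming the same academy2.group_unuser XML ID:

user = request.env['res.users'].sudo().browse(request.env.uid)
group = request.env.ref('academy2.group_unuser')  # full XML ID: module.record_id
user.write({'groups_id': [(4, group.id)]})  # command 4: link an existing record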
Any further elucidation or parallel solutions are welcome; the methods aren't as clear to me as they should be.
Thanks.
A mongoengine.DynamicEmbeddedDocument can be used to leverage MongoDB's flexible schema-less design. It's expandable and, as far as I know, doesn't apply type constraints to the fields.
A mongoengine.DictField similarly allows for use of MongoDB's schema-less nature. In the documentation they simply say (w.r.t. the DictField)
This is similar to an embedded document, but the structure is not defined.
Does that mean, then, the mongoengine.fields.DictField and the mongoengine.DynamicEmbeddedDocument are completely interchangeable?
EDIT (for more information):
mongoengine.DynamicEmbeddedDocument inherits from mongoengine.EmbeddedDocument which, from the code is:
A mongoengine.Document that isn't stored in its own collection. mongoengine.EmbeddedDocuments should be used as fields on mongoengine.Documents through the mongoengine.EmbeddedDocumentField field type.
A mongoengine.fields.EmbeddedDocumentField is
An embedded document field - with a declared document_type. Only valid values are subclasses of EmbeddedDocument.
Does this mean the only thing that makes the DictField and DynamicEmbeddedDocument not totally interchangeable is that the DynamicEmbeddedDocument has to be defined through the EmbeddedDocumentField field type?
From what I’ve seen, the two are similar, but not entirely interchangeable. Each approach may have a slight advantage based on your needs. First of all, as you point out, the two approaches require differing definitions in the document, as shown below.
from mongoengine import (DictField, Document, DynamicEmbeddedDocument,
                         EmbeddedDocumentField)

class ExampleDynamicEmbeddedDoc(DynamicEmbeddedDocument):
    pass

class ExampleDoc(Document):
    dict_approach = DictField()
    dynamic_doc_approach = EmbeddedDocumentField(
        ExampleDynamicEmbeddedDoc, default=ExampleDynamicEmbeddedDoc())
Note: The default is not required, but the dynamic_doc_approach field will need to be set to an ExampleDynamicEmbeddedDoc object in order to save (i.e. trying to save after setting example_doc_instance.dynamic_doc_approach = {} would throw an exception). Also, you could use the GenericEmbeddedDocumentField if you don't want to tie the field to a specific type of EmbeddedDocument, but the field would still need to point to an object subclassed from EmbeddedDocument in order to save.
Once set up, the two are functionally similar in that you can save data to them as needed and without restrictions:
e = ExampleDoc()
e.dict_approach["test"] = 10
e.dynamic_doc_approach.test = 10
However, the one main difference that I’ve seen is that you can query against any values added to a DictField, whereas you cannot with a DynamicEmbeddedDoc.
ExampleDoc.objects(dict_approach__test=10)  # Returns a QuerySet containing our entry.
ExampleDoc.objects(dynamic_doc_approach__test=10)  # Throws an exception.
That being said, using an EmbeddedDocument has the advantage of validating fields which you know will be present in the document. (We simply would need to add them to the ExampleDynamicEmbeddedDoc definition). Because of this, I think it is best to use a DynamicEmbeddedDocument when you have a good idea of a schema for the field and only anticipate adding fields minimally (which you will not need to query against). However, if you are not concerned about validation or anticipate adding a lot of fields which you’ll query against, go with a DictField.
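To illustrate the validation advantage: fields you declare on the dynamic embedded document are validated on save, while anything else assigned at runtime stays schema-less. A sketch (the score field is mine, for illustration):

from mongoengine import DynamicEmbeddedDocument, IntField

class ExampleDynamicEmbeddedDoc(DynamicEmbeddedDocument):
    score = IntField(min_value=0)  # declared, so validated on save
    # any other attribute set at runtime is stored without validation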
If we add a second entity of the same Model (NDB) with the same id, would the first entity get replaced by the second?
Is this the right way? In future, would this cause any problem?
I use GAE Python with NDB.
E.g.,
class X(ndb.Model):
    command = ndb.StringProperty()

x_record = X(id="id_value", command="c1")
x_record.put()

# After some time
x_record = X(id="id_value", command="c2")
x_record.put()
I did find a mention of this in official Google docs.
CONTEXT
I intend to use this to reduce code steps. Presently, the code first checks whether an entity with key X already exists; if it exists, it updates its properties, else it creates a new one with that key (X). The new approach would be to just blindly create a new entity with key X.
Yes, you would simply replace the entity: a put() with the same key overwrites whatever is stored under it.
Would it cause any problems? Only if you wanted the original entity back...
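A quick sketch of that behaviour, using the model from the question (note that the second put() replaces the entire entity, so any property you don't set again is lost, not merged):

x_record = X(id="id_value", command="c1")
x_record.put()

X(id="id_value", command="c2").put()   # same key, so this overwrites the first entity
print X.get_by_id("id_value").command  # -> "c2"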
I have a series of tests and cases in a database. Whenever a test is obsoleted, it gets end dated, and any sub-cases of that test should also be end dated. I see two ways to accomplish this:
1) Modify the save function to end date sub-cases.
2) Create a receiver which listens for Test models being saved, and then end dates their sub-cases.
Any reason to use one over the other?
Edit: I see this blog post suggests using the save method whenever you check given values of the model. Since I'm checking the end_date, maybe that suggests I should use a custom save?
Edit2: Also, for the record, the full hierarchy is Protocol -> Test -> Case -> Planned_Execution, and any time one is end-dated, every child must also be end-dated. I figure I'll end up doing basically the same thing for each.
Edit3: It turns out that in order to tell whether the current save() is the one that end-dates the Test, I need access to both the old and the new data, so I used a custom save. Here's what it looks like:
def save(self, *args, **kwargs):
    """Use a custom save to end date any sub-cases."""
    try:
        orig = Test.objects.get(id=self.id)
        enddated = (not orig.end_date) and self.end_date is not None
    except Test.DoesNotExist:  # a bare except would also swallow real errors
        enddated = False
    super(Test, self).save(*args, **kwargs)
    if enddated:
        for case in self.case_set.filter(end_date__isnull=True):
            case.end_date = self.end_date
            case.enddater = self.enddater
            case.save()
I generally use this rule of thumb:
If you have to modify data so that the save won't fail, then override save() (you don't really have another option). For example, in an app I'm working on, I have a model with a text field that has a list of choices. This interfaces with old code, and replaces an older model that had a similar text field, but with a different list of choices. The old code sometimes passes my model a choice from the older model, but there's a 1:1 mapping between choices, so in such a case I can modify the choice to the new one. Makes sense to do this in save().
Otherwise, if the save can proceed without intervention, I generally use a post-save signal.
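For this question's case, a post-save receiver would look roughly like the sketch below (the receiver name is mine). Note that, as the asker's Edit3 points out, a plain post_save receiver can't see the pre-save state, so it fires on every save where end_date is set, not only on the save that end-dates:

from django.db.models.signals import post_save
from django.dispatch import receiver

@receiver(post_save, sender=Test)
def end_date_sub_cases(sender, instance, **kwargs):
    # runs after every Test save; no access to the old end_date
    if instance.end_date:
        instance.case_set.filter(end_date__isnull=True).update(
            end_date=instance.end_date, enddater=instance.enddater)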
In my understanding, signals are a means for decoupling modules. Since your task seems to happen in only one module I'd customize save.
I'm using Google App Engine with Django 1.0.2 (and the django-helper) and wonder how people go about doing recursive deletes.
Suppose you have a model that's something like this:
class Top(BaseModel):
    pass

class Bottom(BaseModel):
    daddy = db.ReferenceProperty(Top)
Now, when I delete an object of type 'Top', I want all the associated 'Bottom' objects to be deleted as well.
As things are now, when I delete a 'Top' object, the 'Bottom' objects stay and then I get data that doesn't belong anywhere. When accessing the datastore in a view, I end up with:
Caught an exception while rendering: ReferenceProperty failed to be resolved.
I could of course find all objects and delete them, but since my real model is at least 5 levels deep, I'm hoping there's a way to make sure this can be done automatically.
I've found this article about how it works with Java and that seems to be pretty much what I want as well.
Anyone know how I could get that behavior in django as well?
You need to implement this manually, by looking up affected records and deleting them at the same time as you delete the parent record. You can simplify this, if you wish, by overriding the .delete() method on your parent class to automatically delete all related records.
For performance reasons, you almost certainly want to use key-only queries (allowing you to get the keys of entities to be deleted without having to fetch and decode the actual entities), and batch deletes. For example:
db.delete(Bottom.all(keys_only=True).filter("daddy =", top).fetch(1000))
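Wrapped into a delete() override, that might look like the following sketch (model names from the question; with the old 1000-result fetch limit you would loop for larger sets):

class Top(BaseModel):
    def delete(self):
        # remove dependent Bottom entities first, via a cheap keys-only query
        bottom_keys = Bottom.all(keys_only=True).filter("daddy =", self).fetch(1000)
        db.delete(bottom_keys)
        super(Top, self).delete()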
Actually, that behavior is GAE-specific; Django's own ORM simulates "ON DELETE CASCADE" on .delete().
I know that this is not an answer to your question, but maybe it can help you from looking in the wrong places.
Reconsider the data structure. If the relationship will never change during the record's lifetime, you could use the "ancestors" feature of GAE:
class Top(db.Model): pass
class Middle(db.Model): pass
class Bottom(db.Model): pass

top = Top()
top.put()  # a parent needs a complete (saved) key before it can be used as one
middles = [Middle(parent=top) for i in range(0, 10)]
db.put(middles)
bottoms = [Bottom(parent=middle) for middle in middles for i in range(0, 10)]
db.put(bottoms)
Then querying for ancestor=top will find all the records from all levels. So it will be easy to delete them.
descendants = list(db.Query().ancestor(top))
# should return [top] + middles + bottoms
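Deleting the whole subtree then becomes a keys-only ancestor query plus a batch delete, for example:

keys = db.Query(keys_only=True).ancestor(top).fetch(1000)
db.delete(keys)  # loop if there can be more than 1000 descendants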
If your hierarchy is only a small number of levels deep, then you might be able to do something with a field that looks like a file path:
daddy.ancestry = "greatgranddaddy/granddaddy/daddy/"
me.ancestry = daddy.ancestry + me.uniquename + "/"
sort of thing. You do need unique names, at least unique among siblings.
The path in object IDs sort of does this already, but IIRC that's bound up with entity groups, which you're advised not to use to express relationships in the data domain.
Then you can construct a query to return all of granddaddy's descendants using the initial substring trick, like this:
query = Person.all()
query.filter("ancestry >", gdaddy.ancestry + "\U0001")
query.filter("ancestry <", gdaddy.ancestry + "\UFFFF")
Obviously this is no use if you can't fit the ancestry into a 500 byte StringProperty.