I have a Django site, and some of the feeds are published through FeedBurner. I would like to ping FeedBurner whenever I save an instance of a particular model. FeedBurner's website says to use the XML-RPC ping mechanism, but I can't find much documentation on how to implement it.
What's the easiest way to do the XML-RPC ping in django/Python?
You can use Django's signals feature to get a callback after a model is saved:
import xmlrpclib

from django.db.models.signals import post_save
from app.models import MyModel

def ping_handler(sender, instance=None, **kwargs):
    if instance is None:
        return
    # Tell FeedBurner's XML-RPC endpoint that this entry changed
    rpc = xmlrpclib.Server('http://ping.feedburner.google.com/')
    rpc.weblogUpdates.ping(instance.title, instance.get_absolute_url())

# Fire the handler after every save of MyModel
post_save.connect(ping_handler, sender=MyModel)
Clearly, you should update this with what works for your app and read up on signals in case you want a different event.
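Note that xmlrpclib is the Python 2 module name; on Python 3 the same call would use xmlrpc.client (a minimal sketch, reusing the handler above):

import xmlrpc.client

# Same ping as above, with the renamed Python 3 module
rpc = xmlrpc.client.ServerProxy('http://ping.feedburner.google.com/')
rpc.weblogUpdates.ping(instance.title, instance.get_absolute_url())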
Use pluggable apps, Luke!
http://github.com/svetlyak40wt/django-pingback/
Maybe something like this:
import xmlrpclib

# Connect to the ping endpoint and send the weblogUpdates.ping call
j = xmlrpclib.Server('http://feedburnerrpc')
reply = j.weblogUpdates.ping('website title', 'http://urltothenewpost')
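Since the ping is a network call made during a save, you may want to guard it so an unreachable endpoint doesn't break the save (a sketch; the exception types are from the standard library):

import socket
import xmlrpclib

try:
    reply = j.weblogUpdates.ping('website title', 'http://urltothenewpost')
except (xmlrpclib.Fault, xmlrpclib.ProtocolError, socket.error):
    # The ping is best-effort; log the failure and carry on
    pass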
I have a Django project with some third-party APIs inside it. I have a script that changes product stocks in my store via a marketplace's API. In my views.py I have a CreateAPIView class that calls the marketplace's API method for retrieving product stocks and writes the result to a MySQL database. Now I need to add a signal that starts the CreateAPIView class (to fetch and store the changed stock data) immediately after the marketplace's change-stocks method has run. I know how to add a Django signal with pre_save and post_save, but I don't know how to add a signal on a request.
I found something like this:
from django.core.signals import request_finished
from django.dispatch import receiver

@receiver(request_finished)
def my_callback(sender, **kwargs):
    print("Request finished!")
But it is not what I'm looking for. I need a signal that starts the CreateAPIView class after another API class finishes its request. I will be very thankful for any advice on how to solve this problem.
You could create a custom Signal, which can be sent after the marketplace API is hit.
from django.dispatch import Signal

custom_signal = Signal(providing_args=["some_data"])
Send the signal when the marketplace API is hit:
def marketplace_api():
    data = 'some_data'
    # Notify listeners that the marketplace API has been hit
    custom_signal.send(sender=None, some_data=data)
Then simply define a receiver function which will contain the logic you need:
from django.dispatch import receiver

@receiver(custom_signal)
def run_create_api_view(sender, **kwargs):
    data = kwargs["some_data"]
    if data is not None:
        view = CreateAPIView()
        # Note: dispatch() normally takes a request object; adapt this
        # call to however your view actually consumes the data
        view.dispatch(data)
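One caveat: providing_args is deprecated as of Django 3.0 and removed in 4.0, so on newer versions define the signal without it:

from django.dispatch import Signal

# Django >= 3.0: no providing_args; document the kwargs in a comment instead
custom_signal = Signal()  # sends "some_data"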
I'd like to be notified whenever a model is saved, then do some processing and save another model. I need the model to already have an ID set by the database during the processing stage.
With Django one would override the model's .save() method or use signals like:
from django.db.models.signals import post_save
from django.dispatch import receiver

from .models import MyModel, OtherModel

@receiver(post_save, sender=MyModel)
def do_stuff(sender, instance, created, **kwargs):
    assert instance.id is not None
    ...
    OtherModel.create(related=instance, data=...)
How do I do something similar with SQLAlchemy and Flask? I looked up ORM Events, and it seemed that the expire InstanceEvent would fit the bill. It seems to fire whenever a model instance is saved, but when I try to do the same kind of thing:
from sqlalchemy import event

from . import db
from .models import MyModel, OtherModel

@event.listens_for(MyModel, "expire")
def do_stuff(target, attrs):
    assert target.id is not None
    ...
    db.session.add(OtherModel(related=target, data=...))
    db.session.commit()
It fails on assert target.id is not None with:
InvalidRequestError: This session is in 'committed' state; no further SQL can be emitted within this transaction.
It might be that I'm just approaching this the wrong way or I'm missing something crucial, but I cannot figure it out. The documentation is split among Flask, Flask-SQLAlchemy, and SQLAlchemy, and I have a hard time piecing this together.
How should I make this kind of post-save trigger with SQLAlchemy?
The event you want to listen for is 'after_insert', not 'expire':
@event.listens_for(MyModel, 'after_insert')
def do_stuff(mapper, connection, target):
    assert target.id is not None
    ...
Also, after creating the OtherModel instance inside the listener and calling db.session.add, don't call db.session.commit, as it will throw a ResourceClosedError exception.
Have a look at the accepted answer to this question, which gives an example of using SQLAlchemy's after_insert mapper event. It should do what you want, but using raw SQL rather than your session object is recommended.
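A minimal sketch of the connection-based approach (assuming OtherModel has a related_id foreign key column; adapt the column names to your schema):

from sqlalchemy import event

@event.listens_for(MyModel, 'after_insert')
def create_other(mapper, connection, target):
    # target.id is already populated for autoincrement keys;
    # write through the listener's connection instead of the session
    connection.execute(
        OtherModel.__table__.insert().values(related_id=target.id, data='...')
    )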
The following is my code from the signals.py file, placed in the package where the auth model is defined.
from django.conf import settings
from django.contrib.auth.models import Group
from django.db.models.signals import post_migrate
from django.dispatch import receiver

@receiver(post_migrate, sender=settings.AUTH_USER_MODEL)
def define_groups(sender, **kwargs):
    # Create groups
    Group.objects.get_or_create(name='Promoter')
    Group.objects.get_or_create(name='Client')
    Group.objects.get_or_create(name='Superuser')
    Group.objects.get_or_create(name='Staff')
The documentation (https://docs.djangoproject.com/en/1.8/topics/auth/customizing/#referencing-the-user-model) states that it should be set as
sender=settings.AUTH_USER_MODEL
but this only works for post_save, as in the documentation's example.
I've already tried get_user_model() and also directly using my_custom_user.models.
get_user_model() raises an error, while setting the models module as the sender works just fine:
from . import models

@receiver(post_syncdb, sender=models)
def define_groups(sender, **kwargs):
    # Create groups
    Group.objects.get_or_create(name='Promoter')
    Group.objects.get_or_create(name='Client')
    Group.objects.get_or_create(name='Superuser')
    Group.objects.get_or_create(name='Staff')
But according to the documentation this is not the right way to refer to a custom user model and is just an ugly workaround.
Would someone please be able to help me with a solution so I can add these Groups with the first migration of the user model?
Thank you.
EDIT: using get_user_model() returns the following error:
django.core.exceptions.AppRegistryNotReady: Models aren't loaded yet.
The sender for the post_migrate signal is never a model (custom or otherwise); it is the AppConfig instance for the app that was installed.
The docs give the following example for connecting your signal handler in the ready method.
from django.apps import AppConfig
from django.db.models.signals import post_migrate

def my_callback(sender, **kwargs):
    # Your specific logic here
    pass

class MyAppConfig(AppConfig):
    ...

    def ready(self):
        post_migrate.connect(my_callback, sender=self)
Similarly, the sender for the post_syncdb signal (note that this signal is deprecated) is the module containing the models that were installed.
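Applied to the question above, the callback would simply create the groups (a sketch reusing the group names from the original handler):

from django.contrib.auth.models import Group

def my_callback(sender, **kwargs):
    # Runs after every migrate; get_or_create keeps it idempotent
    for name in ('Promoter', 'Client', 'Superuser', 'Staff'):
        Group.objects.get_or_create(name=name)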
As the title suggests, I'd like to know if and how I can override the get and post methods of Tastypie.
For example, every time a user sends a JSON payload to the API endpoint, I don't want anything to be stored in the models; I only want to return a small message back.
How can I do this?
Thanks.
This example comes directly from the Tastypie Cookbook:
from tastypie.resources import ModelResource
from tastypie.utils import now

from myapp.models import MyObject  # wherever MyObject is defined

class MyResource(ModelResource):
    class Meta:
        queryset = MyObject.objects.all()

    def get_object_list(self, request):
        # Narrow the default list to objects whose start_date is now or later
        return super(MyResource, self).get_object_list(request).filter(start_date__gte=now())
A similar approach can be used for POST etc. as well. Hope it helps :)
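For the specific case in the question (accept a POST but store nothing), one option is to override post_list and short-circuit it; this is a sketch, not from the cookbook:

class MyResource(ModelResource):
    class Meta:
        queryset = MyObject.objects.all()

    def post_list(self, request, **kwargs):
        # Skip the default create logic entirely, so nothing is saved
        return self.create_response(request, {'message': 'received, not stored'})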
I am having trouble figuring out whether my signal handler is called during fixture loading or not. Most of my signal handlers receive an extra keyword raw when Django loads fixtures. However, this extra keyword is only passed through when handling pre/post save signals; it doesn't get passed through if the signal I am listening to is m2m_changed!
Is there any reliable way to tell whether I am in "fixture loading mode" or not with m2m_changed?
Well, if anyone found this just like me, one horrible, horrible way to solve this is the following:
https://code.djangoproject.com/ticket/8399#comment:7
In this old ticket of the Django project, a way to determine whether a signal was triggered from loaddata or not was requested.
Eventually, the raw keyword became the proposed solution, but it does not appear in the m2m_changed signal. Before that, the following workaround was proposed, and it still works:
try:
    from functools import wraps
except ImportError:
    from django.utils.functional import wraps

import inspect

def disable_for_loaddata(signal_handler):
    """Decorator that turns off signal handlers while loading fixture data."""
    @wraps(signal_handler)
    def wrapper(*args, **kwargs):
        # If loaddata appears anywhere on the call stack, skip the handler
        for fr in inspect.stack():
            if inspect.getmodulename(fr[1]) == 'loaddata':
                return
        signal_handler(*args, **kwargs)
    return wrapper
You can then use this decorator to disable any signal handler during loaddata, like this:
from django.db.models.signals import m2m_changed
from django.dispatch import receiver

@receiver(m2m_changed, sender=models.Foo.bar.through)
@disable_for_loaddata
def some_signal(sender, instance: models.Foo, action: str, **kwargs):
    # signal code
    ...