I have a Django project that talks to some external APIs. One script changes product stocks in my store via a marketplace's API. In views.py I have a CreateAPIView class that calls the marketplace's API method for fetching product stocks and writes the result to a MySQL database. Now I need a signal that triggers the CreateAPIView class (to fetch and store the changed stock data) immediately after the marketplace's change-stocks method has run. I know how to use Django signals with pre_save and post_save, but I don't know how to fire a signal on a request.
I found something like this:
from django.core.signals import request_finished
from django.dispatch import receiver

@receiver(request_finished)
def my_callback(sender, **kwargs):
    print("Request finished!")
But that is not what I'm looking for. I need a signal that starts a CreateAPIView class after another API class has finished its request. I would be very thankful for any advice on how to solve this problem.
You could create a custom Signal, which can be sent after the marketplace API is hit:
from django.dispatch import Signal

custom_signal = Signal(providing_args=["some_data"])
Send the signal when the marketplace API is hit:
def marketplace_api():
    data = 'some_data'
    custom_signal.send(sender=None, some_data=data)
Then simply define a receiver function which will contain the logic you need:
@receiver(custom_signal)
def run_create_api_view(sender, **kwargs):
    data = kwargs["some_data"]
    if data is not None:
        view = CreateAPIView()
        view.dispatch(data)
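Two side notes if you're on a recent Django: providing_args was deprecated in Django 3.0 and removed in 4.0, so the signal is declared simply as Signal(); and the receiver module has to be imported at startup, typically from AppConfig.ready(). A minimal sketch under those assumptions (the stocks app name and the update_stocks helper are made up for illustration, they are not part of your code):
# stocks/signals.py (hypothetical module)
from django.dispatch import Signal, receiver

stocks_changed = Signal()  # no providing_args on Django 3.0+

@receiver(stocks_changed)
def on_stocks_changed(sender, **kwargs):
    data = kwargs.get("some_data")
    if data is not None:
        update_stocks(data)  # hypothetical helper holding the stock-fetching logic

# stocks/apps.py
from django.apps import AppConfig

class StocksConfig(AppConfig):
    name = "stocks"

    def ready(self):
        from . import signals  # noqa: F401 -- connects the receivers
Rather than instantiating CreateAPIView and calling dispatch() (which expects a full request object), it is usually easier to pull the view's core logic into a plain function like the update_stocks helper above and call it from both the view and the receiver.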
I have game, genre and producer tables. Users create new games through the admin panel. After a user creates a new game I need to do something, but I have a problem. After a new game has been created, Django sends a signal and runs the game_changed method. The problem is that when I then send a GET request to /api/game/ (a Django REST Framework endpoint) from inside that handler, the response doesn't contain the new game. Even more interesting: first I get the count of the game table through Game.objects.count() and it returns 3 (correct); then I send the GET request to the game endpoint and the response contains only 2 games. What causes this?
Models:
from django.db import models

class Game(models.Model):
    name = models.CharField(max_length=50)
    producer = models.ForeignKey("Producer", on_delete=models.CASCADE)
    genres = models.ManyToManyField("Genre")

    def __str__(self):
        return self.name

class Producer(models.Model):
    name = models.CharField(max_length=50)

    def __str__(self):
        return self.name

class Genre(models.Model):
    name = models.CharField(max_length=50)

    def __str__(self):
        return self.name
Signals:
from django.db.models import signals
from django.dispatch import receiver

from .models import Game

@receiver(signals.post_save, sender=Game)
def game_changed(sender, **kwargs):
    print(Game.objects.count())  # returns 3
    import requests
    url = "http://localhost:8000/api/game/"
    response = requests.get(url)
    print(response.json())  # returns 2 games instead of 3
Views:
from rest_framework.viewsets import ReadOnlyModelViewSet

from .serializers import GenreSerializer, GameSerializer, ProducerSerializer
from .models import Genre, Game, Producer

class GenreViewSet(ReadOnlyModelViewSet):
    queryset = Genre.objects.all()
    serializer_class = GenreSerializer
    search_fields = ("name",)

class GameViewSet(ReadOnlyModelViewSet):
    queryset = Game.objects.all()
    serializer_class = GameSerializer
    filterset_fields = ("genres", "producer")
    search_fields = ("name",)

class ProducerViewSet(ReadOnlyModelViewSet):
    queryset = Producer.objects.all()
    serializer_class = ProducerSerializer
    search_fields = ("name",)
I believe what's occurring here is a transaction timing issue. The post_save signal fires while the transaction that created the Game is still open, so the separate request to /api/game/ is served over another database connection that cannot yet see the uncommitted row. You can configure the transaction isolation level in your database settings, or you can use transaction.on_commit.
I'd probably reach for transaction.on_commit, as modifying your transaction isolation level can have wide-reaching effects.
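A minimal sketch of the on_commit variant, reusing the names from the question; the HTTP request is deferred until the surrounding transaction has actually committed, so the new game is visible to the API's connection:
from django.db import transaction
from django.db.models import signals
from django.dispatch import receiver

from .models import Game

@receiver(signals.post_save, sender=Game)
def game_changed(sender, instance, **kwargs):
    def ping_api():
        import requests
        response = requests.get("http://localhost:8000/api/game/")
        print(response.json())  # now includes the newly created game

    # runs after the surrounding transaction commits
    # (or immediately, if no transaction is in progress)
    transaction.on_commit(ping_api)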
I'm building a single database/shared schema multi-tenant application using Django 2.2 and Python 3.7.
I'm attempting to use the new contextvars api to share the tenant state (an Organization) between views.
I'm setting the state in a custom middleware like this:
# tenant_middleware.py
from organization.models import Organization
import contextvars
import tenant.models as tenant_model

tenant = contextvars.ContextVar('tenant', default=None)

class TenantMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        response = self.get_response(request)
        user = request.user
        if user.is_authenticated:
            organization = Organization.objects.get(
                organizationuser__is_current_organization=True,
                organizationuser__user=user,
            )
            tenant_object = tenant_model.Tenant.objects.get(organization=organization)
            tenant.set(tenant_object)
        return response
I'm using this state by having my app's models inherit from a TenantAwareModel like this:
# tenant_models.py
from django.contrib.auth import get_user_model
from django.db import models
from django.db.models.signals import pre_save
from django.dispatch import receiver

from organization.models import Organization
from tenant_middleware import tenant

User = get_user_model()

class TenantManager(models.Manager):
    def get_queryset(self, *args, **kwargs):
        tenant_object = tenant.get()
        if tenant_object:
            return super(TenantManager, self).get_queryset(*args, **kwargs).filter(tenant=tenant_object)
        else:
            return None

@receiver(pre_save)
def pre_save_callback(sender, instance, **kwargs):
    tenant_object = tenant.get()
    instance.tenant = tenant_object

class Tenant(models.Model):
    organization = models.ForeignKey(Organization, null=False, on_delete=models.CASCADE)

    def __str__(self):
        return self.organization.name

class TenantAwareModel(models.Model):
    tenant = models.ForeignKey(
        Tenant,
        on_delete=models.CASCADE,
        related_name='%(app_label)s_%(class)s_related',
        related_query_name='%(app_label)s_%(class)ss',
    )

    objects = models.Manager()
    tenant_objects = TenantManager()

    class Meta:
        abstract = True
In my application the business logic can then retrieve querysets using .tenant_objects... on a model class rather than .objects...
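For example, with a hypothetical Invoice model that subclasses TenantAwareModel (not part of the code above):
from invoices.models import Invoice  # hypothetical TenantAwareModel subclass

tenant_invoices = Invoice.tenant_objects.all()  # filtered to the tenant held in the context variable
all_invoices = Invoice.objects.all()            # unfiltered, across all tenants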
The problem I'm having is that it doesn't always work - specifically in these cases:
In my login view after login() is called, the middleware runs and I can see the tenant is set correctly. When I redirect from my login view to my home view, however, the state is (initially) empty again and seems to get set properly after the home view executes. If I reload the home view, everything works fine.
If I log out and then log in again as a different user, the state from the previous user is retained, again until I do a reload of the page. This seems related to the previous issue, as it almost seems like the state is lagging (for lack of a better word).
I use Celery to spin off shared_tasks for processing. I have to manually pass the tenant to these, as they don't pick up the context.
Questions:
Am I doing this correctly?
Do I need to manually reload the state somehow in each module?
Frustrated, as I can find almost no examples of doing this and very little discussion of contextvars. I'm trying to avoid passing the tenant around manually everywhere or using thread-locals.
Thanks.
You're only setting the context after the response has been generated. That means it will always lag. You probably want to set it before, then check after if the user has changed.
Note, though, that I'm not really sure this will ever work exactly how you want. Context vars are by definition local; in an environment like Django you can never guarantee that consecutive requests from the same user will be served by the same server process, and one process can likewise serve requests from multiple users. Plus, as you've noted, Celery is yet another separate process, which won't share the context.
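To make the first point concrete, here is a minimal sketch of the reordered middleware, keeping the lookup from the question unchanged: the tenant is set before get_response so the view handling the current request already sees it, and reset afterwards so a reused worker doesn't leak the previous user's tenant into the next request:
class TenantMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        token = tenant.set(None)  # start each request with a clean slate
        user = request.user
        if user.is_authenticated:
            organization = Organization.objects.get(
                organizationuser__is_current_organization=True,
                organizationuser__user=user,
            )
            tenant.set(tenant_model.Tenant.objects.get(organization=organization))
        try:
            response = self.get_response(request)
        finally:
            tenant.reset(token)  # don't let the value bleed into later requests
        return response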
I'd like to notice whenever a model is saved and then do some processing and save another model. I need the model to already have an ID set by the database in the processing stage.
With Django one would override the .save() method of model or use signals like:
from django.db.models.signals import post_save
from django.dispatch import receiver

from .models import MyModel, OtherModel

@receiver(post_save, sender=MyModel)
def do_stuff(sender, instance, created, **kwargs):
    assert instance.id is not None
    ...
    OtherModel.create(related=instance, data=...)
How do I do something similar with SQLAlchemy and Flask? I looked up ORM events and it seemed that the expire InstanceEvent would fit the bill. It seems to fire whenever a model instance is saved, but when I try to do the same kind of thing:
from sqlalchemy import event

from . import db
from .models import MyModel, OtherModel

@event.listens_for(MyModel, "expire")
def do_stuff(target, attrs):
    assert target.id is not None
    ...
    db.session.add(OtherModel(related=target, data=...))
    db.session.commit()
It fails on the assert target.id is not None line with:
InvalidRequestError: This session is in 'committed' state; no further SQL can be emitted within this transaction.
It might be that I'm just approaching this the wrong way or I'm missing something crucial, but I cannot figure it out. The documentation is split among Flask, Flask-SQLAlchemy and SQLAlchemy, and I have a hard time piecing it together.
How should I make this kind of post-save trigger with SQLAlchemy?
The event you want to listen for is 'after_insert', not 'expire':
@event.listens_for(MyModel, 'after_insert')
def do_stuff(mapper, connection, target):
    assert target.id is not None
    ...
Also, after creating OtherModel inside the listener and calling db.session.add, don't call db.session.commit as it will throw a ResourceClosedError exception.
Have a look at the accepted answer to this question which gives an example of using SQLAlchemy's after_insert mapper event. It should do what you want, but using raw SQL rather than your session object is recommended.
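A minimal sketch of that approach, assuming OtherModel has a related_id foreign key column (a guess, since its real columns aren't shown): inside after_insert the extra row is written through the connection of the flush that is already in progress, and the outer commit takes care of persisting it:
from sqlalchemy import event

from .models import MyModel, OtherModel

@event.listens_for(MyModel, 'after_insert')
def create_other_model(mapper, connection, target):
    # target.id is already populated here, because the INSERT has been emitted
    connection.execute(
        OtherModel.__table__.insert().values(related_id=target.id, data='...')
    )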
I read the Django docs about signals and wrote this piece of code for my model Car:
from django.core.signals import request_finished
from django.db.models.signals import post_save
from django.dispatch import receiver

from .models import Car

@receiver(request_finished)
def signal_callback(sender, **kwargs):
    print 'Save Signal received'

@receiver(post_save, sender=Car)
def signal_handler(sender, **kwargs):
    pass

request_finished(signal_callback, sender=car, dispatch_url="Unique save id")
But the problem is that when I fire up my server and just open the admin, I get a lot of 'Save Signal received' messages in my terminal. What I'm wondering about is that I restricted signal_handler to post_save only, yet without even saving anything the message shows up many times. I don't understand this.
Note: I'll be honest, I understood parts of the documentation but not all of it.
There is a simpler way to bind post_save signals:
from django.db.models.signals import post_save

from myapp.models import Car

def do_something(sender, **kwargs):
    print 'the object is now saved.'
    car = kwargs['instance']  # now I have access to the object

post_save.connect(do_something, sender=Car)
The request_finished signal gets sent every time an HTTP request is made, which is a hog.
You bound the request_finished signal to signal_callback. Remove (or comment out) signal_callback, and change signal_handler as follows.
#receiver(post_save, sender=Car)
def signal_handler(sender, **kwargs):
print 'Save signal received'
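On the question's last line: a signal is connected with .connect() (or the @receiver decorator), not by calling the signal itself, and the keyword is dispatch_uid, not dispatch_url. A minimal sketch of the explicit form, reusing signal_handler from above:
from django.db.models.signals import post_save

from .models import Car

# dispatch_uid is an arbitrary unique string; it stops the handler from being
# connected (and therefore executed) more than once if this module is imported twice
post_save.connect(signal_handler, sender=Car, dispatch_uid="car_post_save_handler")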
I have a django site, and some of the feeds are published through FeedBurner. I would like to ping FeedBurner whenever I save an instance of a particular model. FeedBurner's website says to use the XML-RPC ping mechanism, but I can't find a lot of documentation on how to implement it.
What's the easiest way to do the XML-RPC ping in django/Python?
You can use Django's signals feature to get a callback after a model is saved:
import xmlrpclib

from django.db.models.signals import post_save

from app.models import MyModel

def ping_handler(sender, instance=None, **kwargs):
    if instance is None:
        return
    rpc = xmlrpclib.Server('http://ping.feedburner.google.com/')
    rpc.weblogUpdates.ping(instance.title, instance.get_absolute_url())

post_save.connect(ping_handler, sender=MyModel)
Clearly, you should update this with what works for your app and read up on signals in case you want a different event.
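One note if you're on Python 3: xmlrpclib was renamed to xmlrpc.client, so the handler body would look roughly like this (everything else as in the snippet above):
import xmlrpc.client

def ping_handler(sender, instance=None, **kwargs):
    if instance is None:
        return
    rpc = xmlrpc.client.ServerProxy('http://ping.feedburner.google.com/')
    rpc.weblogUpdates.ping(instance.title, instance.get_absolute_url())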
Use pluggable apps, Luke!
http://github.com/svetlyak40wt/django-pingback/
Maybe something like this:
import xmlrpclib

j = xmlrpclib.Server('http://feedburnerrpc')
reply = j.weblogUpdates.ping('website title', 'http://urltothenewpost')