Django rest auth user_logged_in signal - python

I have a Django REST app using django-rest-auth.
I'm trying to log something every time a user logs in, using signals.
I've searched the web for how to use signals and haven't found anything that makes this work. I think the problem may be with the allauth signals. Is there any problem with the following configuration?
signals.py

import logging

from allauth.account.signals import user_logged_in
from django.dispatch import receiver

logger = logging.getLogger(__name__)

@receiver(user_logged_in)
def login_logger(request, user, **kwargs):
    logger.info("{} logged in with {}".format(user.email, request))

apps.py

from django.apps import AppConfig

class UsersConfig(AppConfig):
    name = 'users'

    def ready(self):
        import users.signals

__init__.py

default_app_config = 'users.apps.UsersConfig'

Here's how I solved it using djangorestframework-jwt==1.11.0:
settings.py

from django.contrib.auth.signals import user_logged_in

def jwt_response_payload_handler(token, user=None, request=None):
    if user and request:
        user_logged_in.send(sender=user.__class__, request=request, user=user)
    return {
        'token': token,
    }

JWT_AUTH = {
    'JWT_RESPONSE_PAYLOAD_HANDLER': jwt_response_payload_handler,
}

models.py

from django.contrib.auth.signals import user_logged_in

def login_handler(sender, user, request, **kwargs):
    print('logged in')

user_logged_in.connect(login_handler)

It seems that Django REST Framework doesn't emit the user_logged_in signal when token-based authentication is configured: https://github.com/encode/django-rest-framework/issues/3869

For future googlers: the OP's original question was that he wanted to log something every time a user logs in when using rest_auth. You would expect rest_auth to emit a signal for something like this, but rest_auth does different things depending on the type of login. For a session login, rest_auth calls into Django's normal login routines and a signal does get emitted. For token-based authentication, however, rest_auth creates the token and returns it; there is no call into Django proper and no signal is emitted.
Here's the rest_auth login code.
To get the behavior you want, override rest_auth's default token handler (it's straightforward) so you know when a token gets created, then log the event however you want.
In Django's settings.py file add:

REST_AUTH_TOKEN_CREATOR = '<your_dotted_project_path>.create_login_token'

In some file in your project:

# this is the same as the default rest_auth token handler except we
# don't throw away the 'created' part because we care whether it was
# created or just retrieved.
def create_login_token(token_model, user, serializer):
    token, created = token_model.objects.get_or_create(user=user)
    if created:
        pass  # log it or emit your own signal or whatever
    return token
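The key detail is that get_or_create() returns an (object, created) tuple; the default handler discards the flag, and the override above keeps it. A pure-Python stand-in for the contract (the dict-backed store is hypothetical, just to show why `created` distinguishes a fresh login from token reuse):

```python
# Dict-backed stand-in for token_model.objects.get_or_create(user=user).
_tokens = {}

def get_or_create(user):
    """Return (token, created): created is True only on first issue."""
    if user in _tokens:
        return _tokens[user], False
    _tokens[user] = "token-for-%s" % user
    return _tokens[user], True

token, created = get_or_create("alice")   # first login: created is True
token2, again = get_or_create("alice")    # token reuse: created is False
```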

For some strange reason this doesn't seem to work when placed in signals.py, even though that signals.py layout works fine for every other part of the project except allauth (I had the same experience). Check this GitHub issue: https://github.com/pennersr/django-allauth/issues/347
For some strange reason, moving this code (signals.py) into models.py of the same app will work.
# place this in models.py
import logging

from allauth.account.signals import user_logged_in
from django.dispatch import receiver

logger = logging.getLogger(__name__)

@receiver(user_logged_in)
def login_logger(request, user, **kwargs):
    logger.info("{} logged in with {}".format(user.email, request))

It doesn't pick up from signals.py for some strange reason. However, you can still maintain the separation of concerns using the following. This worked for me while keeping the signal logic out of my models.
signals.py

import logging

from allauth.account.signals import user_logged_in
from django.dispatch import receiver

logger = logging.getLogger(__name__)

@receiver(user_logged_in)
def login_logger(request, user, **kwargs):
    logger.info("{} logged in with {}".format(user.email, request))

models.py

from .signals import *

# Your models here
class Foo(models.Model):
    pass

Related

Use Django Oscar Signals

I want to send an email to the admin when an Order is placed (currently only the user who placed the order receives the email). The order_placed Oscar signal should work for me here.
For this I have already forked the order app, and inside it an order_placed function is created in signals.py. I have also imported signals in config.py, but order_placed is still not getting fired when I place an order from the site.
Can anyone share an example of Oscar signal usage?
Code:
config.py

from oscar.apps.order import config

class OrderConfig(config.OrderConfig):
    name = 'catalogue.order'

    def ready(self):
        from oscar.apps.order import signals

signals.py

from django.db.models.signals import post_save
from django.dispatch import receiver
from oscar.apps.order.models import Order

@receiver(post_save, sender=Order)
def order_placed(*args, **kwargs):
    """
    :param args:
    :param kwargs:
    :return:
    """
    print("i am here ----------------------")
You don't need signals for this. As part of the payment flow (framework), Oscar provides the view PaymentDetailsView, which in turn inherits from the mixin OrderPlacementMixin.
In that mixin you'll find the method handle_successful_order, which is the correct place to send the messages and do other things, since there you can be sure the order was placed.
So, do not fork the order app; fork the checkout app and override this method to do something like this:
from django.conf import settings

class PaymentDetailsView:
    # ...
    def handle_successful_order(self, order):
        send_mail_to_admin(settings.ADMIN_EMAIL_ADDRESS)
        return super(PaymentDetailsView, self).handle_successful_order(order)
If you read the code of this method in Oscar you'll see that this is indeed where Oscar notifies the user about the order that has just been placed.
And of course, we can't ignore the docstring, which states:
Override this view if you want to perform custom actions when an
order is submitted.
def handle_successful_order(self, order):
    """
    Handle the various steps required after an order has been successfully
    placed.
    Override this view if you want to perform custom actions when an
    order is submitted.
    """
    # Send confirmation message (normally an email)
    self.send_confirmation_message(order, self.communication_type_code)
    # Flush all session data
    self.checkout_session.flush()
    # Save order id in session so thank-you page can load it
    self.request.session['checkout_order_id'] = order.id
    response = HttpResponseRedirect(self.get_success_url())
    self.send_signal(self.request, response, order)
    return response
Like @raydel-miranda pointed out, you don't need a signal to send email to the admin when an order is placed by the customer.
Just fork the checkout app using python manage.py oscar_fork_app checkout apps
Inside the forked checkout app, create a views.py file and override Oscar's default checkout/views.py with this code.
yourproject_name/apps/checkout/views.py
from django.conf import settings
from django.views.generic import FormView
from django.contrib import messages
from django.core.mail import EmailMessage, send_mail

from oscar.core.loading import get_class

OscarPaymentDetailsView = get_class("checkout.views", "PaymentDetailsView")

# CheckCountryPreCondition is this project's own checkout mixin
class PaymentDetailsView(CheckCountryPreCondition, OscarPaymentDetailsView):

    def handle_successful_order(self, order):
        print('order created')
        send_mail(
            subject="New Order Needs attention",
            message='Please login to the dashboard and complete the order',
            from_email=settings.EMAIL_HOST_USER,
            recipient_list=['admin_email_address_1', 'admin_email_address_2'],  # x.email for x in partner_id.users.all()
            fail_silently=False,
        )
        #send_mail_to_admin(settings.ADMIN_EMAIL_ADDRESS)
        return super(PaymentDetailsView, self).handle_successful_order(order)
This may save someone else time spent trying to create signals for sending the admin an email when an order is placed.

auditlog with Django and DRF

I need to implement an audit log feature in one of my projects, which uses Django 1.8 and Django REST Framework 3.2.2. I have extended the BaseUserManager class to create my user model, since I had to use email as the username in my application (if that information matters).
Below is my db design which will hold the logs:

field           type           desc
id              pk             auto_increment
cust_id         FK             customer
customer_name   FK             customer
user_id         FK             user
user_name       FK             user
module          varchar(100)   sales, order, billing, etc.
action          varchar(10)    Create/Update/Delete
previous_value  varchar(500)
current_value   varchar(500)
datetime        datetime       timestamp of change
I have tried https://pypi.python.org/pypi/django-audit-log but it has two issues for my requirements:
It does not capture data as per my requirement, which I understand is my issue, so I modified its code and added my fields to its model.
It is not capturing the module information; the behaviour is random.
I am seeking advice on how to proceed with this feature. Which package would be best suited to my task?
P.S. I have also tried django-reversion, but I have no requirement for data versioning.
Thanks
I achieved what I needed by modifying the auditlog code:
Added the required fields to auditlog's LogEntry model.
Modified the log_create, log_update and log_delete functions of receivers.py to save information into the newly added fields.
With this I am halfway done. The only issue I'm still facing is that a model instance of one table contains information from other tables as well, due to the FKs used in the table.
To solve this I came up with a solution that works well, but I am not satisfied with it.
I added a function like include_in_model() to each model and modified auditlog's registry.py register() function to fetch those fields and use only them when saving information into the LogEntry model.
This approach requires me to create this include_in_model() function in each of my model classes and pass the required fields for that particular model. This way I avoid the FK-related information.
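The include_in_model() idea can be sketched without the ORM. Everything below is hypothetical illustration: each model declares the fields worth auditing, and the modified register() asks the model rather than logging every field, which is how the FK noise gets excluded:

```python
class Customer:
    # Hypothetical model: only these plain fields should reach LogEntry.
    name = "ACME"
    vat_number = "GB123"
    account_manager = object()  # FK-like attribute we want to exclude

    @classmethod
    def include_in_model(cls):
        # Each model lists its own auditable fields.
        return ["name", "vat_number"]

def fields_to_audit(model_cls):
    # Stand-in for the modified auditlog registry.register(): it asks
    # the model which fields to track instead of taking all of them.
    return model_cls.include_in_model()
```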
Django Simple History is an excellent app that I've used in production projects in the past; it will give you per-model audits tied to your users.
Furthermore, you should create your own authentication class, which will be responsible for logging requests. Let's assume that a user uses a token to authenticate with your API. It gets sent in the header of each HTTP request to your API like so: Authorization: Bearer <My Token>. We should then log the user associated with the request, the time, the user's IP and the body.
This is pretty easy:
settings.py
REST_FRAMEWORK = {
    'DEFAULT_AUTHENTICATION_CLASSES': (
        'common.authentication.MyTokenAuthenticationClass',
    ),
    ...
}
common/authentication.py

from django.utils import timezone
from django.utils.translation import ugettext_lazy as _
from ipware.ip import get_real_ip
from rest_framework import authentication
from rest_framework import exceptions

from accounts.models import Token, AuditLog

class MyTokenAuthenticationClass(authentication.BaseAuthentication):

    def authenticate(self, request):
        # Grab the Authorization header from the HTTP request
        auth = authentication.get_authorization_header(request).split()
        if not auth or auth[0].lower() != b'bearer':
            return None
        # Check that the token header is properly formatted and present, raise errors if not
        if len(auth) == 1:
            msg = _('Invalid token header. No credentials provided.')
            raise exceptions.AuthenticationFailed(msg)
        elif len(auth) > 2:
            msg = _('Invalid token header. Credentials string should not contain spaces.')
            raise exceptions.AuthenticationFailed(msg)
        try:
            token = Token.objects.get(token=auth[1])
            # Using the `ipware.ip` module to get the real IP (if hosted on ElasticBeanstalk or Heroku)
            token.last_ip = get_real_ip(request)
            token.last_login = timezone.now()
            token.save()
            # Add the saved token instance to the request context
            request.token = token
        except Token.DoesNotExist:
            raise exceptions.AuthenticationFailed('Invalid token.')
        # At this point, insert the log into your AuditLog table:
        AuditLog.objects.create(
            user_id=token.user,
            request_payload=request.body,
            # Additional fields
            ...
        )
        # Return the authenticated user associated with the token
        return (token.user, token)
Another solution is to use django-auditlog with a custom middleware which does not capture request.user directly, but only at the moment it is needed; by that time DRF will have set the correct request.user, so the username is no longer missing from the audit logs.
Create a file named (for example) auditlog_middleware.py and include it in MIDDLEWARE in your settings.py instead of the default auditlog middleware.
from __future__ import unicode_literals

import threading
import time

from django.conf import settings
from django.db.models.signals import pre_save
from django.utils.functional import curry
from django.apps import apps
from auditlog.models import LogEntry
from auditlog.compat import is_authenticated

# Use MiddlewareMixin when present (Django >= 1.10)
try:
    from django.utils.deprecation import MiddlewareMixin
except ImportError:
    MiddlewareMixin = object

threadlocal = threading.local()

class AuditlogMiddleware(MiddlewareMixin):
    """
    Middleware to couple the request's user to log items. This is accomplished by currying the signal receiver with the
    user from the request (or None if the user is not authenticated).
    """

    def process_request(self, request):
        """
        Gets the current user from the request and prepares and connects a signal receiver with the user already
        attached to it.
        """
        # Initialize thread local storage
        threadlocal.auditlog = {
            'signal_duid': (self.__class__, time.time()),
            'remote_addr': request.META.get('REMOTE_ADDR'),
        }
        # In case of proxy, set 'original' address
        if request.META.get('HTTP_X_FORWARDED_FOR'):
            threadlocal.auditlog['remote_addr'] = request.META.get('HTTP_X_FORWARDED_FOR').split(',')[0]
        # Connect signal for automatic logging
        set_actor = curry(self.set_actor, request=request, signal_duid=threadlocal.auditlog['signal_duid'])
        pre_save.connect(set_actor, sender=LogEntry, dispatch_uid=threadlocal.auditlog['signal_duid'], weak=False)

    def process_response(self, request, response):
        """
        Disconnects the signal receiver to prevent it from staying active.
        """
        if hasattr(threadlocal, 'auditlog'):
            pre_save.disconnect(sender=LogEntry, dispatch_uid=threadlocal.auditlog['signal_duid'])
        return response

    def process_exception(self, request, exception):
        """
        Disconnects the signal receiver to prevent it from staying active in case of an exception.
        """
        if hasattr(threadlocal, 'auditlog'):
            pre_save.disconnect(sender=LogEntry, dispatch_uid=threadlocal.auditlog['signal_duid'])
        return None

    @staticmethod
    def set_actor(request, sender, instance, signal_duid, **kwargs):
        """
        Signal receiver with an extra, required 'user' kwarg. This method becomes a real (valid) signal receiver when
        it is curried with the actor.
        """
        if hasattr(threadlocal, 'auditlog'):
            if not hasattr(request, 'user') or not is_authenticated(request.user):
                return
            if signal_duid != threadlocal.auditlog['signal_duid']:
                return
            try:
                app_label, model_name = settings.AUTH_USER_MODEL.split('.')
                auth_user_model = apps.get_model(app_label, model_name)
            except ValueError:
                auth_user_model = apps.get_model('auth', 'user')
            if sender == LogEntry and isinstance(request.user, auth_user_model) and instance.actor is None:
                instance.actor = request.user
                instance.remote_addr = threadlocal.auditlog['remote_addr']
I know that this answer comes very late, but here it goes.
Because DRF authenticates at the view level, not at the middleware level, the user is not yet attached to the request when AuditlogMiddleware runs, resulting in AnonymousUser.
You can attach the logic from AuditlogMiddleware after your authentication; that logic connects some signals.
This solution has several benefits:
You don't have to decorate every view with it.
It doesn't assume anything about AuditlogMiddleware or the audit_log implementation in general, so if that code changes, this should still work.
It doesn't force or duplicate DRF authentication.
# token_authentication_wrapper.py
from auditlog.middleware import AuditlogMiddleware
from rest_framework.authentication import TokenAuthentication

class TokenAuthenticationWrapper(TokenAuthentication):

    def authenticate(self, request):
        user, token = super().authenticate(request)
        request.user = user  # necessary for preventing recursion
        AuditlogMiddleware().process_request(request)
        return user, token
Inherit from your favorite authentication class, e.g. BasicAuthentication, SessionAuthentication, TokenAuthentication, etc.
And in settings.py:

'DEFAULT_AUTHENTICATION_CLASSES': [
    'path.to.file.token_authentication_wrapper.TokenAuthenticationWrapper',
]
First of all, you can use the package https://github.com/jcugat/django-custom-user to solve the email-as-username requirement.
Then you can try to focus development with http://django-reversion.readthedocs.io/en/stable/
The answer by @hassaan-alansary would have been ideal, but unfortunately the auditlog devs made significant changes since he posted his answer, and I couldn't figure out how to reconcile those changes with Hassaan's answer.
The solution I ended up finding is based on what was shared here. Instead of writing a new DRF authentication method which invokes the middleware to do the logging, it creates a mixin which needs to be added to each of the DRF views you want included in the audit log. The solution below is the modified version of the one I ended up using from the link above.
# mixins.py
import threading
import time
from functools import partial

from django.db.models.signals import pre_save

from auditlog.middleware import AuditlogMiddleware
from auditlog.models import LogEntry

threadlocal = threading.local()

class DRFDjangoAuditModelMixin:
    """
    Mixin to integrate django-auditlog with Django Rest Framework.

    This is needed because DRF does not perform authentication at the
    middleware layer; instead it performs authentication at the view layer.

    This mixin adds behavior to connect/disconnect the signals needed by
    django-auditlog to auto-log changes on models.

    It assumes that AuditlogMiddleware is activated in settings.MIDDLEWARE_CLASSES
    """

    @staticmethod
    def _set_actor(user, sender, instance, signal_duid, **kwargs):
        # This is a reimplementation of auditlog.context._set_actor.
        # Unfortunately the original logic cannot be used, because
        # there is a type mismatch between user and auth_user_model.
        if signal_duid != threadlocal.auditlog["signal_duid"]:
            return
        if (
            sender == LogEntry
            # and isinstance(user, auth_user_model)
            and instance.actor is None
        ):
            instance.actor = user
            instance.remote_addr = threadlocal.auditlog["remote_addr"]

    def initial(self, request, *args, **kwargs):
        """Overwritten to use django-auditlog if needed."""
        super().initial(request, *args, **kwargs)

        remote_addr = AuditlogMiddleware._get_remote_addr(request)
        actor = request.user

        # Keep our own thread-local state; the middleware's threadlocal is a
        # separate object, so initialize it here before the partial reads it.
        threadlocal.auditlog = {
            "signal_duid": (self.__class__, time.time()),
            "remote_addr": remote_addr,
        }

        set_actor = partial(
            self._set_actor,
            user=actor,
            signal_duid=threadlocal.auditlog["signal_duid"],
        )
        pre_save.connect(
            set_actor,
            sender=LogEntry,
            dispatch_uid=threadlocal.auditlog["signal_duid"],
            weak=False,
        )

    def finalize_response(self, request, response, *args, **kwargs):
        """Overwritten to clean up django-auditlog if needed."""
        response = super().finalize_response(request, response, *args, **kwargs)
        if hasattr(threadlocal, 'auditlog'):
            pre_save.disconnect(sender=LogEntry, dispatch_uid=threadlocal.auditlog['signal_duid'])
            del threadlocal.auditlog
        return response
You then need to add this mixin to each of your views:

# views.py
...
class CustomerViewSet(DRFDjangoAuditModelMixin, ModelViewSet):
    queryset = Client.objects.all()
    serializer_class = ClientSerializer
...
The downside of this implementation is that it isn't DRY on a couple of levels. Not only do you need to add the mixin to each DRF view, but it also copies code from nearly all the logging behaviour of auditlog, particularly private methods. I therefore expect this solution to either need adjustment in the future, or to become obsolete.
The solution above is based on this revision of auditlog.

Django log user IP for user_login_failed signal

I would like to log the user IP address in my Django application, specifically for login, logout and failed login events. I'm using Django builtin functions as follows:
import logging

from django.contrib.auth.signals import user_logged_in, user_logged_out, user_login_failed
from ipware.ip import get_ip

logger = logging.getLogger(__name__)

def log_logged_in(sender, user, request, **kwargs):
    logger.info("%s User %s successfully logged in" % (get_ip(request), user))

def log_logged_out(sender, user, request, **kwargs):
    logger.info("%s User %s successfully logged out" % (get_ip(request), user))

def log_login_failed(sender, credentials, **kwargs):
    logger.warning("%s Authentication failure for user %s" % ("...IP...", credentials['username']))

user_logged_in.connect(log_logged_in)
user_logged_out.connect(log_logged_out)
user_login_failed.connect(log_login_failed)
The issue is that I haven't found a way to get the IP for the user_login_failed signal, since this handler does not receive the request in its parameters (https://docs.djangoproject.com/en/1.7/ref/contrib/auth/#module-django.contrib.auth.signals). The credentials parameter is a dictionary that only contains the username and password fields.
How could I get the IP address for this signal?
Many thanks in advance for your help.
Unfortunately the user_login_failed signal doesn't pass the request as an argument.
Check out django-axes: https://github.com/django-pci/django-axes/
It uses a custom view decorator to track failed logins.
https://github.com/django-pci/django-axes/blob/master/axes/decorators.py#L273
I just found that newer Django versions (I am using 2.1) have updated this, and the request object is now included in the user_login_failed signal:
https://docs.djangoproject.com/en/2.1/ref/contrib/auth/#django.contrib.auth.signals.user_login_failed
You could override the login form and intercept it there.
It has the request at that stage.
import logging

from django.contrib.admin.forms import AdminAuthenticationForm
from django import forms

log = logging.getLogger(__name__)

class AuthenticationForm(AdminAuthenticationForm):

    def clean(self):
        # to cover more complex cases:
        # http://stackoverflow.com/questions/4581789/how-do-i-get-user-ip-address-in-django
        ip = self.request.META.get('REMOTE_ADDR')
        try:
            data = super(AuthenticationForm, self).clean()
        except forms.ValidationError:
            log.info('Login Failed (%s) from (%s)', self.cleaned_data.get('username'), ip)
            raise
        if bool(self.user_cache):
            log.info('Login Success (%s) from (%s)', self.cleaned_data.get('username'), ip)
        else:
            log.info('Login Failed (%s) from (%s)', self.cleaned_data.get('username'), ip)
        return data
To install it into the site you need to attach it to django.contrib.admin.site.login_form.
I would suggest doing it in your app's ready() method, like so:

from django.contrib.admin import site as admin_site

class Config(AppConfig):
    ...

    def ready(self):
        # Attach the logging hook to the login form
        from .forms import AuthenticationForm
        admin_site.login_form = AuthenticationForm

Django 1.3 post login/logout signals in relation with authentication

First of all, user.is_authenticated() in both handlers below returns True. I'd expect the second one (after logout) to return False using the standard Django admin authentication procedure, or am I wrong?
def post_login(sender, **kwargs):
    """
    Django 1.3 post login signal handler
    """
    # do stuff
    user = kwargs['user']
    print user.is_authenticated()

user_logged_in.connect(post_login)

def post_logout(sender, **kwargs):
    """
    Django 1.3 post logout signal handler
    """
    # do stuff
    user = kwargs['user']
    print user.is_authenticated()

user_logged_out.connect(post_logout)
Anyway, I'm trying to understand why Django doesn't have a hook for authentication failure as well. I can use my own backend for users to log in and out of their accounts, but I'd also like to hook into the admin procedure, to cover everything in one portion of code. I found some topics but no real answer on how to fix this.
I came up with:

import settings

from django.dispatch import Signal

failed_login = Signal(providing_args=['user'])

from django.contrib.auth.backends import ModelBackend
from django.contrib.auth.models import User

class AuthSignalBackend(ModelBackend):

    def authenticate(self, username=None, password=None):
        try:
            user = User.objects.get(username=username)
            if user.check_password(password):
                return user
            else:
                failed_login.send(sender=None, user=user)
        except User.DoesNotExist:
            return None

def login_handler(sender, **kwargs):
    if settings.DEBUG:
        print "failed login detected...!"

failed_login.connect(login_handler)
That works great, but there's no request in the ModelBackend, while the post_login and post_logout signals do have the request. This is unfortunate, because it would be great for IP logging.
Any advice is welcome; I'm pretty sure people must have come across this one before.
If user is an instance of the User model, user.is_authenticated() will always return True. A model instance can't know what's going on at the request level; this method is intended for views (to distinguish real users from AnonymousUser).
If you want to deal with failed login attempts, take a look at django-axes. You can just use it, or look at the code and reimplement some of the ideas as you like.

custom signals in django

I'm having a problem with Django custom signals: they are not visible across applications. I made a simple call in my
core/signals.py
from django.dispatch.dispatcher import Signal

# Signal-emitting code... emits whenever a file upload is received
# ----------------------------------------------------------------
upload_recieved = Signal(providing_args=['data'])

def upload_received_handler(sender, data, **kwargs):
    print 'upload received handler'

print 'connecting signal'
upload_recieved.connect(upload_received_handler)
in core/models.py

import signals
[the model]

in blog/admin.py

from models import article, category, media
from django.contrib import admin
from libs.shared.core.tasks import Create_Audit_Record
from libs.shared.core import signals

class ArticleModelAdmin(admin.ModelAdmin):

    def save_model(self, request, obj, form, change):
        upload_recieved.send(sender=self, data='ddd')
        instance = form.save()
        return instance

admin.site.register(article, ArticleModelAdmin)
admin.site.register(category)
admin.site.register(media)
This is what I did, but at runtime I get an error that the upload_recieved name cannot be seen. Any ideas?
Regards,
You haven't imported the upload_recieved name into your admin.py. You imported the signals module, so either refer to it as signals.upload_recieved, or import the name directly with from libs.shared.core.signals import upload_recieved.
