I have a Django project and I need the ability to execute a function on the Django server at a specified time.
For example, in my Django project, if a client sends a friendship request to another person and that person doesn't answer within (say) 7 days, the request should be removed automatically.
So I want to be able to call a function on the Django server at a specified time that is stored in a MySQL table.
Create a custom management command and set up a cron job to run it; you can also check some Django apps that manage cron jobs/repetitive tasks. I know cron works on Linux (on Windows there should be alternatives; Scheduled Tasks comes to mind, but there must be others).
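For illustration, a minimal sketch of such a management command, assuming a hypothetical FriendRequest model with accepted and created_at fields (all names below are made up):

# myapp/management/commands/expire_friend_requests.py
from datetime import timedelta

from django.core.management.base import BaseCommand
from django.utils import timezone

from myapp.models import FriendRequest  # hypothetical model


class Command(BaseCommand):
    help = "Remove friendship requests that were not answered within 7 days"

    def handle(self, *args, **options):
        cutoff = timezone.now() - timedelta(days=7)
        # delete() returns (total_deleted, per-model counts)
        deleted, _ = FriendRequest.objects.filter(
            accepted=False, created_at__lt=cutoff
        ).delete()
        self.stdout.write("Removed %d expired friend requests" % deleted)

A crontab entry could then run it on whatever schedule you need, e.g. once a day (paths are placeholders):

0 3 * * * /path/to/venv/bin/python /path/to/project/manage.py expire_friend_requests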
I'm in the process of porting my local Django app to Heroku and am hitting a snag, mainly with my environment variables. I can't very well create a .env file on my web server; it would just get overwritten the next time I push from GitHub. So I've set environment variables using heroku config:set VAR='' --app <app>. These seem to work; however, I'm working with an API that requires me to refresh my token every 60 minutes. Locally, I developed a method to update my .env every time the task that refreshed this token was executed, but this solution clearly isn't sufficient for my web server... I've attempted to update server-level variables in Python, but I don't think that's possible. Has anyone had to deal with an issue like this? Am I approaching this all wrong?
What is the best way for me to update an environment variable on a web server (i.e. heroku config:set VAR='' --app <app>, but from my Python code)? I need this variable to update every 60 minutes (I already have the Celery task code built for this). Should I modify the task to simply write the value to a text file and use that text file as my "web server .env file"? I'm really lost here, so any help would be much appreciated. Thanks!
EDIT:
As requested, here's more information:
I'm building some middleware for two systems. The first system posts a record to my Django API. This event kicks off a task that subsequently updates a separate financial system. That financial system's API requires two things: an auth_code and an access_token. The access_token must be updated every 60 minutes.
I have a refresh_token that I use to get a new access_token. This refresh_token expires every 365 days. As a result, I can simply reuse this refresh_token every time I request a new access_token.
My app is in the really early stages and doesn't require anything but a simple API POST from the first system to kick off this process. This will eventually be built out to require my own sort of auth_token to access my Django API.
first system --> Django App --> Finance System
https://developer.blackbaud.com/skyapi/docs/authorization/auth-code-flow/tutorial
Process:
I currently have a Celery task that runs in the background every 55 minutes. This task gathers the new access_token and recreates my .env file with it.
I have a separate Celery task that runs an ETL pipeline and requires the access_token to post to the financial system's API.
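For reference, a rough sketch of what that refresh task might look like; the endpoint, setting names, and the file-based persistence are assumptions based on the linked SKY API docs, not a recommendation:

import json

import requests
from celery import shared_task
from django.conf import settings

TOKEN_FILE = "/tmp/sky_api_token.json"  # hypothetical location the ETL task can read


@shared_task
def refresh_access_token():
    # Exchange the long-lived refresh_token for a fresh access_token.
    resp = requests.post(
        "https://oauth2.sky.blackbaud.com/token",  # token endpoint per the auth-code-flow docs
        data={
            "grant_type": "refresh_token",
            "refresh_token": settings.SKY_REFRESH_TOKEN,   # hypothetical settings
            "client_id": settings.SKY_CLIENT_ID,
            "client_secret": settings.SKY_CLIENT_SECRET,
        },
    )
    resp.raise_for_status()
    with open(TOKEN_FILE, "w") as fh:
        json.dump(resp.json(), fh)  # the ETL task would read access_token from here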
I want to build an MQTT client which stores some data in my Django database.
This client should always be running while the webserver is running.
What is the best way to run a thread with database access (Django models) parallel to the webserver?
I've read about the Django background task model, but I'm not sure if it's a good approach.
Celery is the most common solution for this. You can also create a custom management command and execute it using cron or something similar.
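For the management-command route, a minimal sketch (written against paho-mqtt 1.x; the broker address, topic, and the Reading model are placeholders, not part of your project):

# myapp/management/commands/mqtt_listener.py
import json

import paho.mqtt.client as mqtt
from django.core.management.base import BaseCommand

from myapp.models import Reading  # hypothetical model


class Command(BaseCommand):
    help = "Listen to an MQTT broker and store incoming messages via the ORM"

    def handle(self, *args, **options):
        def on_message(client, userdata, msg):
            payload = json.loads(msg.payload)
            Reading.objects.create(topic=msg.topic, value=payload["value"])

        client = mqtt.Client()  # paho-mqtt 2.x also needs a callback_api_version argument
        client.on_message = on_message
        client.connect("localhost", 1883)
        client.subscribe("sensors/#")
        client.loop_forever()  # blocks; run this process alongside the webserver

You would then run python manage.py mqtt_listener as its own process (via supervisor, systemd, a Procfile worker, etc.) next to the webserver.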
I am setting up backend for an application, with Django and MySQL.
As a part of the set up, I need to keep on fetching latest content from Facebook and Twitter Graph APIs and keep updating my database with that.
The user of my application would pull this latest available data from the database.
Now, how and where do I put this code? Should I put it somewhere in the Django project, and if yes, where?
Or should I use it as an independent script, i.e. not attached to Django in any way, and update the DB directly with it?
Also, since this would be a continuous process, I need it to run as a background task. It should not consume any resources that might be needed by the foreground tasks.
The recommended way is using Celery. If you don't want to use async task handling, you can also just create a custom management command and run it via cron. Both of them work within the whole project's context (e.g. what you defined in your settings), so you can use the Django ORM to connect to your DB, etc.
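As a rough illustration of the Celery route, a task like the one below could be scheduled with Celery beat (or the same body could live in a management command run by cron); fetch_latest_items() and the Post model are placeholders for your API client code:

from celery import shared_task

from myapp.models import Post  # hypothetical model your users read from


def fetch_latest_items():
    # Placeholder: call the Facebook/Twitter APIs here and return a list of dicts.
    return []


@shared_task
def sync_social_feeds():
    # Upsert each fetched item so repeated runs only add what is new.
    for item in fetch_latest_items():
        Post.objects.update_or_create(
            external_id=item["id"],
            defaults={"text": item["text"], "source": item["source"]},
        )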
I've been trying to make a decision about my student project before going further. The main idea is to get disk usage data, active Linux user data, and so on from multiple internal servers and publish them with Django.
Before I came to RabbitMQ I was thinking about developing a client application for each Linux server and getting this data through a socket. But I want to keep this student project simple. Also, I don't know how difficult it is to make a socket connection via Django.
So, I thought I could solve my problem with RabbitMQ without socket programming. Basically, I send a message to rabbit queue. Then get whatever I want from the consumer server.
On the Django side, the client will select one of the internal servers and click the "details" button. Then I want to show this information on web page.
I have already read almost all the documentation about RabbitMQ, Celery and Pika. Sending messages to all internal servers (clients) and calculating the information I want works fine, but I can't figure out how to put this data on a web page with Django.
How would you approach this problem if you were me?
Thank you.
I solved my problem on my own. The solution is a RabbitMQ RPC call. You can execute your Python code on the remote server and get the result of the process via RPC requests. Details can be found here:
http://www.rabbitmq.com/tutorials/tutorial-six-python.html
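For reference, a rough sketch of the RPC client side following that tutorial pattern (pika 1.x; the server_stats queue name and the message body are assumptions, not working code for your servers):

import uuid

import pika


class ServerStatsRpcClient:
    def __init__(self):
        self.connection = pika.BlockingConnection(
            pika.ConnectionParameters(host="localhost"))
        self.channel = self.connection.channel()
        # Exclusive, auto-named queue to receive the reply on.
        result = self.channel.queue_declare(queue="", exclusive=True)
        self.callback_queue = result.method.queue
        self.channel.basic_consume(
            queue=self.callback_queue,
            on_message_callback=self.on_response,
            auto_ack=True)
        self.response = None
        self.corr_id = None

    def on_response(self, ch, method, props, body):
        if props.correlation_id == self.corr_id:
            self.response = body

    def call(self, server_name):
        self.response = None
        self.corr_id = str(uuid.uuid4())
        self.channel.basic_publish(
            exchange="",
            routing_key="server_stats",  # hypothetical queue the internal servers consume
            properties=pika.BasicProperties(
                reply_to=self.callback_queue,
                correlation_id=self.corr_id),
            body=server_name)
        while self.response is None:
            self.connection.process_data_events()
        return self.response

# Usage: print(ServerStatsRpcClient().call("web-01"))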
Thank you guys.
Looks like you have already done the hard work (Celery, RabbitMQ, etc.) but are missing the Django basics. Go through the polls tutorial and the getting-started guide for Django, or the many other resources on the web, and it will be quite simple. Basically:
create the models (objects represented in the DB)
declare the URLs
set up views to pass the data from the model to the web page template (a minimal sketch follows this list)
create the templates (or do it with a client-side framework and return a JSON response)
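A minimal sketch of that view step, assuming a made-up ServerStat model that your consumer fills and a hypothetical template name:

from django.shortcuts import render

from myapp.models import ServerStat  # hypothetical model populated by your consumer


def server_details(request, server_name):
    # Fetch and filter the rows for the server the user clicked on.
    stats = ServerStat.objects.filter(server=server_name).order_by("-created")
    return render(request, "servers/details.html", {"stats": stats})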
EDIT: (after you clarified the question) Actually, I just hit the same problem too. The answer is to run another Python process parallel to the Django process (in the same virtualenv). In this process you can set up a RabbitMQ consumer (using Pika, Puka, Kombu or whatever) and call specific Django functions/methods to do something with the information from RabbitMQ. You can also just call Celery tasks from there, to be executed in the Django app context.
A Procfile, for example (just to illustrate; you can run both processes in many other ways):
web: python manage.py runserver
worker: python listen_from_servers.py
Notice that you'll have to set the DJANGO_SETTINGS_MODULE environment variable to point at your settings file for the Django imports to work.
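A rough sketch of how listen_from_servers.py could bootstrap Django before touching any models (myproject.settings and the model are placeholders; the consumer body is omitted):

# listen_from_servers.py
import os

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")

import django

django.setup()  # must run before importing any Django models

from myapp.models import ServerStat  # hypothetical model; safe to import only after setup()

# ...set up your pika/kombu consumer here and write to ServerStat (or call
# Celery tasks) from the message callback.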
You need the following two programs running at all times:
The producer, which will populate the queue. This is the program that will collect the various messages and then post them on the queue.
The consumer, which will process messages from the queue. This consumer's job is to read each message and do something with it, so that it is processed and removed from the queue. What this consumer does is entirely up to you, but what you want to do in this scenario is write information from the message to a database model; the same database that is part of your Django app.
As the producer pushes messages and the consumer removes them from the queue, your database will get updated.
On the django side, the process is simply to filter this database and display records for a particular machine. As such, django does not need to be aware of how the records are being populated in the database - all django is doing is fetching, filtering, sending to the template and rendering the views.
The question is how best (well actually, how easily) to populate the database. You can do it the traditional way, using Python's well-documented DB-API and writing your own SQL statements; but since Celery is so well integrated with Django, you can use Django's ORM to do this work for you as well.
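A rough sketch of such a consumer using pika 1.x and the ORM; the queue name, message format, and the ServerStat model are assumptions, and Django has to be configured first (DJANGO_SETTINGS_MODULE plus django.setup()) before the model import works:

import json

import pika

from myapp.models import ServerStat  # hypothetical model shared with the Django app


def handle_message(ch, method, properties, body):
    data = json.loads(body)  # e.g. {"server": "web-01", "disk_used": 73}
    ServerStat.objects.create(server=data["server"], disk_used=data["disk_used"])
    ch.basic_ack(delivery_tag=method.delivery_tag)  # removes the message from the queue


connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()
channel.queue_declare(queue="server_stats", durable=True)
channel.basic_consume(queue="server_stats", on_message_callback=handle_message)
channel.start_consuming()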
I hope this gets you going in the right direction.
I'm trying to build a Django app that can monitor and interact with a remote database (interacting with the database in a basic way: just performing a look-up and only sometimes making a little change in the remote data). It also sometimes has to store the remote data in its own database.
The website which sits on the remote database is a community website, and anyone without an account is allowed to post on the website's forums. I want the app to be able to check the database every now and then for any updates in the discussions. The site gets at least 100 posts an hour, and since anyone is allowed to post on the forums without an account, it occasionally gets spammed; unfortunately the CMS being used does not have a good anti-spam system set up.
The only way I can think of at the moment is to make a Python file, and in that file import MySQLdb. I can connect to the remote (MySQL) database server and select all the posts that have been made in the last X minutes. Using a function that calculates the probability of a post being spam or not, I can talk to the remote database again and flag the candidates so they are not displayed on the website. I can have this file run "every now and then" using cron.
The problem here is a lack of control. I want a user interface that can show all the spam candidates on a single web page, with an "unflag" button so that accidentally flagged posts can be shown on the website again. This means I'll probably be better off writing a Django web app than a single Python script that just flags spam candidates.
How would I have a Django app, or perhaps a function within that app (which can perform all the actions that the stand-alone Python script described above can perform), run automatically every now and then (say, every minute)?
Maybe you should try django-celery?
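For illustration, a rough sketch of a periodic task along those lines (shown with modern Celery beat syntax rather than the old django-celery API; every helper, model, and task path below is a placeholder):

from celery import shared_task
from celery.schedules import crontab


def fetch_recent_posts(minutes):
    return []  # placeholder: SELECT recent posts from the remote MySQL DB


def is_probably_spam(post):
    return False  # placeholder: your spam-probability function


def flag_post_on_remote_site(post_id):
    pass  # placeholder: UPDATE the remote row so the CMS hides it


@shared_task
def flag_recent_spam():
    # Score everything posted recently and flag the likely spam; the Django app
    # can then list flagged posts and offer an "unflag" button.
    for post in fetch_recent_posts(minutes=5):
        if is_probably_spam(post):
            flag_post_on_remote_site(post["id"])


# settings.py (with Celery configured from Django settings):
CELERY_BEAT_SCHEDULE = {
    "flag-recent-spam": {
        "task": "spamapp.tasks.flag_recent_spam",
        "schedule": crontab(),  # every minute
    },
}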