How to set up a continuously running process alongside Django? - python

I am setting up backend for an application, with Django and MySQL.
As a part of the set up, I need to keep on fetching latest content from Facebook and Twitter Graph APIs and keep updating my database with that.
The user of my application would pull this latest available data from the database.
Now, how and where do I implement this code? Shall I put it somewhere in the Django project, and if yes, where?
Or shall I use it as an independent script, i.e. not attached to Django in any way, and update the DB directly with that?
Also, since this would be a continuous process, I need it to run as a background task. It should not consume any resources that might be needed by the foreground tasks.

The recommended way is using Celery. If you don't want to use async task handling, you can also just create a custom management command and run it via cron. Both of them work with the whole project's context (e.g. what you defined in your settings), so you can use the Django ORM to connect to your DB etc.
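For the Celery route, a minimal sketch could look like this (assuming a project named myproject and a five-minute interval; the task body and all names are placeholders, not your actual code):

    # myproject/celery.py -- minimal sketch, names are assumptions
    import os

    from celery import Celery

    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")

    app = Celery("myproject")
    app.config_from_object("django.conf:settings", namespace="CELERY")
    app.autodiscover_tasks()

    @app.task
    def fetch_social_content():
        # call the Facebook/Twitter APIs here and save the results
        # to your models through the Django ORM
        ...

    # a beat schedule triggers the task periodically; run
    # "celery -A myproject worker" and "celery -A myproject beat"
    app.conf.beat_schedule = {
        "fetch-social-every-five-minutes": {
            "task": "myproject.celery.fetch_social_content",
            "schedule": 300.0,  # seconds
        },
    }

The worker and beat processes run in the background next to your Django server, so the fetching never blocks a foreground request.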

Related

Can I use webhooks to create a log of when certain actions were taken?

Problem: Habitica is a habit-tracking app, but its personal data logs are not as detailed as I want. I want to create a local log of when I mark off habits/todo's in the app. Habitica offers certain webhooks that trigger when habits/todo's are checked off, which seems perfect for what I want, but how do I turn these triggers into a local log? I would like to use Python for this.
Ideas: It seems to me that I would need to set up some kind of personal cloud server to receive this data, turn it into a log, and then store it for download. I have previously deployed a Flask app using Heroku, so if this could be done similarly, that would be ideal. However, I don't know much about this, so I would welcome any ideas or advice.
Creating the Habitica webhook receiver as a Flask application is a good approach.
Heroku supports Python/Flask very nicely; however, the file system is ephemeral, so it gets wiped out at every application restart.
In order to persist data you can look at various options:
save the file to AWS S3
save the data into a DB (Heroku has a free plan for PostgreSQL)
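For the DB option, a minimal Flask sketch could look like this (the endpoint path and the habit_log table are assumptions; DATABASE_URL is the config var the Heroku Postgres add-on sets for you):

    # app.py -- minimal sketch; endpoint name and table are assumptions
    import json
    import os

    import psycopg2
    from flask import Flask, request

    app = Flask(__name__)

    @app.route("/habitica-webhook", methods=["POST"])
    def habitica_webhook():
        payload = request.get_json(force=True)
        # DATABASE_URL is set automatically by the Heroku Postgres add-on
        conn = psycopg2.connect(os.environ["DATABASE_URL"])
        with conn, conn.cursor() as cur:
            cur.execute(
                "INSERT INTO habit_log (received_at, payload) "
                "VALUES (now(), %s)",
                [json.dumps(payload)],
            )
        conn.close()
        return "", 204

You would then point the Habitica webhook at the deployed URL and query the habit_log table whenever you want to download your log.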

Managing a server application

I have a server which runs Flask with Python.
Now I want to make an application which can do various tasks like uploading files, updating a Redis database, and various other things.
Now, of course, this could be done using HTML pages, but since the operations could involve lots of files, real-time input of data, and other things, it might be better to make an application and manage the server from that point rather than through web pages.
Do you suggest using web pages anyway, or would you make an application for it?
And if I make an application, should I use HTTP or not?
Sorry if this is an uninformed question, but I would like to learn the best methods.
You might want to look into Flask-Script. It allows you to run various commands related to your flask application easily. It also allows you to easily add your own commands to it. This way you will be able to keep your administrative code still within the Flask app, but not necessarily have it accessible via a web page.
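A minimal sketch of a custom Flask-Script command (the manage.py filename, the module your app lives in, and the task body are all assumptions):

    # manage.py -- minimal sketch, assuming your app lives in myapp.py
    from flask_script import Manager

    from myapp import app  # your existing Flask application

    manager = Manager(app)

    @manager.command
    def sync_redis():
        """Update the Redis database outside of any web request."""
        # your administrative code goes here, with full app context
        ...

    if __name__ == "__main__":
        manager.run()

You would then run it from the shell as "python manage.py sync_redis", so the administrative work never goes through a web page at all.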

Using RabbitMQ with Django to get information from internal servers

I've been trying to make a decision about my student project before going further. The main idea is to get disk usage data, active Linux user data, and so on from multiple internal servers and publish them with Django.
Before I came to RabbitMQ I was thinking about developing a client application for each Linux server and getting this data through a socket. But I want to keep this student project simple. Also, I don't know how difficult it is to make a socket connection via Django.
So, I thought I could solve my problem with RabbitMQ without socket programming. Basically, I send a message to a Rabbit queue, then get whatever I want from the consumer server.
On the Django side, the client will select one of the internal servers and click the "details" button. Then I want to show this information on a web page.
I have already read almost all the documentation about RabbitMQ, Celery and pika. Sending messages to all internal servers (clients) and calculating the information that I want to get is okay, but I can't figure out how to put this data on a web page with Django.
How would you figure out that problem if you were me?
Thank you.
I solved my problem on my own. The solution is a RabbitMQ RPC call. You can execute your Python code on a remote server and get the result of the process via RPC requests. Details can be found here:
http://www.rabbitmq.com/tutorials/tutorial-six-python.html
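For reference, the client side of that RPC pattern looks roughly like this with pika (a sketch following the linked tutorial; the rpc_queue name and the command format are assumptions matching whatever your remote server consumes):

    # rpc_client.py -- sketch of the RabbitMQ RPC client pattern
    import uuid

    import pika

    class RpcClient:
        def __init__(self):
            self.connection = pika.BlockingConnection(
                pika.ConnectionParameters(host="localhost"))
            self.channel = self.connection.channel()
            # exclusive, auto-named queue to receive replies on
            result = self.channel.queue_declare(queue="", exclusive=True)
            self.callback_queue = result.method.queue
            self.channel.basic_consume(
                queue=self.callback_queue,
                on_message_callback=self.on_response,
                auto_ack=True)
            self.response = None
            self.corr_id = None

        def on_response(self, ch, method, props, body):
            # match the reply to our request via the correlation id
            if props.correlation_id == self.corr_id:
                self.response = body

        def call(self, command):
            self.response = None
            self.corr_id = str(uuid.uuid4())
            self.channel.basic_publish(
                exchange="",
                routing_key="rpc_queue",  # queue the remote server consumes
                properties=pika.BasicProperties(
                    reply_to=self.callback_queue,
                    correlation_id=self.corr_id),
                body=command)
            while self.response is None:
                self.connection.process_data_events(time_limit=1)
            return self.response

    # usage: ask the remote server for its disk usage, for example
    print(RpcClient().call("disk_usage"))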
Thank you guys.
Looks like you've already done the hard work (Celery, Rabbit, etc.) but are missing the Django basics. Go through the polls tutorial, "Getting started with Django", or the many other resources on the web, and it will be quite simple. Basically:
create the models (objects represented in the DB)
declare the URLs
set up views to pass the data from the model to the web page template
create the templates (or do it with a client-side framework and return a JSON response)
EDIT (after you clarified the question): Actually, I just hit the same problem too. The answer is running another Python process parallel to the Django process (in the same virtualenv). In this process you can set up a Rabbit consumer (using pika, puka, kombu or whatever) and call specific Django functions/methods to do something with the information from RabbitMQ. You can also just call Celery tasks from there to be executed in the Django app context.
A Procfile, for example (just illustrating - you can run both processes in many other ways):
web: python manage.py runserver
worker: python listen_from_servers.py
Notice that you'll have to set the DJANGO_SETTINGS_MODULE environment variable to your settings module for the Django imports to work.
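A sketch of what that second process could look like (the project name, queue name, "hostname:disk_usage" message format, and the ServerStat model are all assumptions - adapt them to your setup):

    # listen_from_servers.py -- run next to manage.py, same virtualenv
    import os

    import django

    # must be set (here or exported in the shell) before model imports
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")
    django.setup()

    import pika
    from monitor.models import ServerStat  # hypothetical app/model

    def on_message(ch, method, properties, body):
        # assumes messages like "hostname:disk_usage" -- adapt as needed
        hostname, disk_usage = body.decode().split(":")
        ServerStat.objects.create(hostname=hostname, disk_usage=disk_usage)
        ch.basic_ack(delivery_tag=method.delivery_tag)

    connection = pika.BlockingConnection(
        pika.ConnectionParameters("localhost"))
    channel = connection.channel()
    channel.queue_declare(queue="server_stats")
    channel.basic_consume(queue="server_stats", on_message_callback=on_message)
    channel.start_consuming()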
You need the following two programs running at all times:
The producer, which will populate the queue. This is the program that will collect the various messages and then post them on the queue (a sketch follows below this list).
The consumer, which will process messages from the queue. This consumer's job is to read each message and do something with it, so that it is processed and removed from the queue. What this consumer does is entirely up to you, but what you want to do in this scenario is write the information from the message to a database model - the same database that is part of your Django app.
As the producer pushes messages and the consumer removes them from the queue, your database will get updated.
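A producer on each internal server could be as small as this (the queue name, broker host, and "hostname:used-bytes" message format are assumptions, chosen to match the consumer sketch in the previous answer):

    # report_stats.py -- runs on each internal server, e.g. from cron
    import shutil
    import socket

    import pika

    # total/used/free disk figures for the root filesystem
    usage = shutil.disk_usage("/")

    connection = pika.BlockingConnection(
        pika.ConnectionParameters("rabbit-host"))
    channel = connection.channel()
    channel.queue_declare(queue="server_stats")
    channel.basic_publish(
        exchange="",
        routing_key="server_stats",
        body="%s:%d" % (socket.gethostname(), usage.used),
    )
    connection.close()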
On the Django side, the process is simply to filter this database and display records for a particular machine. As such, Django does not need to be aware of how the records are being populated in the database - all Django is doing is fetching, filtering, sending to the template and rendering the views.
The question is how best (well actually, most easily) to populate the database. You can do it the traditional way, using Python's well-documented DB-API and writing your own SQL statements; but since Celery is so well integrated with Django, you can use Django's ORM to do this work for you as well.
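That Django side can be as small as a single view (a sketch assuming a hypothetical ServerStat model and template; the URL would carry the hostname the user clicked):

    # monitor/views.py -- minimal sketch, names are assumptions
    from django.shortcuts import render

    from monitor.models import ServerStat  # hypothetical model

    def server_details(request, hostname):
        # fetch and filter; the consumer keeps this table up to date
        stats = ServerStat.objects.filter(hostname=hostname).order_by("-id")
        return render(request, "monitor/details.html", {"stats": stats})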
I hope this gets you going in the right direction.

Run a script at a specific time on a Django server

I have a Django project and I need the ability to execute a function at a specified time on the Django server.
For example, consider that in my Django project, if a client sends a friendship request to another person and after (say) 7 days that person hasn't answered, the request will be automatically removed.
So I want the ability to call a function on the Django server at a specified time
that is stored in a MySQL table.
Create a custom management command and a cron job to run it; you can also check out some Django apps for managing cron jobs/repetitive tasks. I know you can use cron on Linux (on Windows there should be alternatives - Task Scheduler comes to mind, but there must be others).
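A minimal sketch of such a command, assuming a FriendRequest model with answered and created fields (the app, model, and field names are all placeholders):

    # friends/management/commands/expire_requests.py -- hedged sketch
    from datetime import timedelta

    from django.core.management.base import BaseCommand
    from django.utils import timezone

    from friends.models import FriendRequest  # hypothetical model

    class Command(BaseCommand):
        help = "Remove friendship requests unanswered for 7 days"

        def handle(self, *args, **options):
            cutoff = timezone.now() - timedelta(days=7)
            stale = FriendRequest.objects.filter(
                answered=False, created__lt=cutoff)
            count = stale.count()
            stale.delete()
            self.stdout.write("Removed %d stale requests" % count)

A crontab entry to run it hourly could then look like:

    0 * * * * /path/to/venv/bin/python /path/to/manage.py expire_requests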

How Do You Have a Django App Frequently Monitor a Remote Database

I'm trying to build a Django app that can monitor and interact with a remote database (interacting with the database in a basic way - just performing a look-up and only sometimes making a small change in the remote data); it also sometimes has to store the remote data in its own database.
The website which sits on the remote database is a community website, and anyone without an account is allowed to post on the website's forums. I want the app to be able to check the database every now and then for any updates in the discussions. The site gets at least 100 posts an hour, and since anyone is allowed to post on the forums without an account, it occasionally gets spammed; unfortunately the CMS that is being used does not have a good anti-spam system set up.
The only way that I can think of at the moment is to make a Python file, and in that file I can import MySQLdb. I can connect to the remote (MySQL) database server and select all the posts that have been made in the last X minutes. Using a function that calculates the probability of a post being spam or not, I can again talk to the remote database and flag the candidates so they are not displayed on the website. I can have this file run "every now and then" using cron.
The problem here is a lack of control. I want to have a user interface that can show all the spam candidates on a single web page, with an "unflag" button to make accidentally flagged posts show up on the website again. This means that I'll probably be better off writing a Django web app than a single Python script that simply flags spam candidates.
How would I have a Django app, or a function within that app (which can perform all the actions that the stand-alone Python script described above can perform), run automatically every now and then (say every minute)?
Maybe you should try django-celery?
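With django-celery (or plain Celery), the flagging script you described could become a periodic task; a sketch under heavy assumptions (the connection details, table and column names, and the looks_like_spam() scoring function are all placeholders for your own):

    # tasks.py -- hedged sketch of the periodic spam-flagging task
    import MySQLdb
    from celery import shared_task

    from myapp.spam import looks_like_spam  # your probability function

    @shared_task
    def flag_spam_candidates():
        # connect to the remote forum database, not the app's own DB
        conn = MySQLdb.connect(
            host="forum.example.com", user="monitor",
            passwd="secret", db="forum",
        )
        cur = conn.cursor()
        cur.execute(
            "SELECT id, body FROM posts "
            "WHERE created > NOW() - INTERVAL 1 MINUTE AND flagged = 0"
        )
        for post_id, body in cur.fetchall():
            if looks_like_spam(body):
                cur.execute(
                    "UPDATE posts SET flagged = 1 WHERE id = %s", [post_id]
                )
        conn.commit()
        conn.close()

Schedule it every minute with celery beat; your Django views can then list the flagged posts, and the "unflag" button simply sets flagged back to 0 on the remote row.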
