I have built an application outside of Django that I would like to have interact with Django. Within a Django view I would like to start and stop this application. While the application is running it returns a JSON string that has to be pushed back to the view and processed client-side with JavaScript. The application is essentially blocking, as it is a while-loop that runs until it is killed.
It is important to stress that this is an experimental project that won't be used in production; it only has to work, nothing more, as I intend to use it for a single demonstration.
So to sum it up, I want to (1) start the application from the view, (2) have the application run and return JSON values to the view, and (3) be able to kill the application (the while-loop, in this case) at any given time.
Another option would be to use something like Tornado, but that seems very heavyweight for such a simple test.
Have you tried django-celery?
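If django-celery feels like too much for a one-off demo, a rough subprocess-based sketch could also work. Everything here is illustrative: the "./myapp" path and the view names are invented, and the module-level handle only holds up on a single-process dev server.

```python
import subprocess

from django.http import JsonResponse

_process = None  # module-level handle; only viable for a single-process demo

def start(request):
    global _process
    if _process is None or _process.poll() is not None:
        # "./myapp" is a placeholder for the external app printing JSON lines
        _process = subprocess.Popen(["./myapp"],
                                    stdout=subprocess.PIPE, text=True)
    return JsonResponse({"status": "started"})

def poll(request):
    # read one JSON line from the running app; note readline() blocks
    # until the app emits a line, which is acceptable for a demo
    if _process and _process.poll() is None:
        return JsonResponse({"data": _process.stdout.readline().strip()})
    return JsonResponse({"data": None})

def stop(request):
    global _process
    if _process and _process.poll() is None:
        _process.kill()  # ends the while-loop
    _process = None
    return JsonResponse({"status": "stopped"})
```

The client-side JavaScript would then hit the poll view on an interval to collect the JSON.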
So I'm currently writing a Flask application, and I am new to Flask.
There is some processing going on, which I outsourced to a separate function.
As this processing takes some time, I wanted to give the user a progress update on how many iterations have passed. No problem so far.
However, as soon as I call render_template, the function returns and I cannot update that variable anymore.
I was imagining a loop: if that variable changes, render the template again with the new variable as input.
But after the first iteration, that loop would break.
Currently the rendered template just displays the variable it receives; I want it to update as soon as the variable changes.
Do you have any suggestions on how I could achieve this "background update"?
Cheers and thanks!
You need some kind of ongoing request/response cycle.
As soon as your app sends the response with the rendered template back to the browser, this connection is closed and there's no way to send any more data.
There are a few things that need to happen in order to accomplish what you want:
1. The long-running function needs to run in the background so it doesn't block execution of the rest of the application.
2. There has to be a way to get a status update from the long-running function.
3. The client (i.e. the browser) needs a way to receive the status updates.
Points 1 and 2 can be solved using Celery. It allows you to run tasks in the background, and lets the task send information via a side channel to be consumed elsewhere.
The easiest way to achieve point 3 is to set up a route in your Flask application that returns information about the task, and to request it periodically from the browser using some JavaScript. The nicer method, in my opinion, would be to use WebSockets to actively push the information to the client, but that is a bit more complicated.
This is just a rough outline, but there's a tutorial by Miguel Grinberg on how to set this up using Celery and polling from JavaScript.
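A minimal sketch of that outline, assuming a Redis broker/backend; the route names are invented and time.sleep stands in for the real processing:

```python
import time

from celery import Celery
from flask import Flask, jsonify

app = Flask(__name__)
celery = Celery(__name__, broker="redis://localhost:6379/0",
                backend="redis://localhost:6379/0")

@celery.task(bind=True)
def long_job(self, n):
    # report progress through Celery's result backend (the "side channel")
    for i in range(n):
        time.sleep(1)                       # stand-in for one real iteration
        self.update_state(state="PROGRESS",
                          meta={"done": i + 1, "total": n})
    return {"done": n, "total": n}

@app.route("/start")
def start():
    task = long_job.delay(100)
    return jsonify(task_id=task.id)

@app.route("/status/<task_id>")
def status(task_id):
    # the browser polls this route periodically, e.g. with setInterval + fetch
    result = long_job.AsyncResult(task_id)
    return jsonify(state=result.state, info=result.info)
```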
I'm writing a small, simple Telegram bot in Python. I have never used this language in my work and decided this would be a good way to learn by practice.
To get updates, my app currently uses long polling called from an endless loop.
So I'm basically searching for the simplest way to run this app on OpenShift. I tried to use this example with Flask but it didn't work. There are plenty of other options for implementing infinite background processes (from Django and Celery to Tornado), but all of them seem far too advanced and complicated for my rather modest needs.
If the polling is not event-driven, you could use cron (you can add the cron cartridge to your gear) to periodically trigger your Python script, which does the work and then "dies".
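A sketch of such a run-and-die poller, calling Telegram's getUpdates endpoint directly; the token and offset file path are placeholders:

```python
import json

import requests

TOKEN = "123456:ABC..."          # placeholder bot token
OFFSET_FILE = "/tmp/bot_offset"  # placeholder path to persist the update offset

def load_offset():
    try:
        with open(OFFSET_FILE) as f:
            return int(f.read())
    except (IOError, ValueError):
        return 0

def handle(update):
    print(json.dumps(update))    # stub: the real bot logic goes here

def main():
    offset = load_offset()
    r = requests.get(
        "https://api.telegram.org/bot%s/getUpdates" % TOKEN,
        params={"offset": offset, "timeout": 0},
    )
    for update in r.json().get("result", []):
        handle(update)
        offset = update["update_id"] + 1
    with open(OFFSET_FILE, "w") as f:
        f.write(str(offset))     # next cron run picks up where we left off

if __name__ == "__main__":
    main()
```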
However, keep in mind that OpenShift is not really intended to be your worker thread (unless you are on the bronze plan or higher). Unless your gear receives an external request within a 24-hour period, it will be "idled" and your process will no longer run.
The way to get around this, "officially", is probably to get the bronze plan (you will not be charged unless you require a 4th gear instance).
"Unofficially", you can create a gear with python that will give you a default website. Then you create a new python script that does your job and trigger it using cron. To keep the gear from idling, use something like uptimerobot to ping your "website" every day.
How do I get SSL for my domains?
If you are still getting by on OpenShift Online's generous Free plan, you'll see a warning message at the top of your application's SSL configuration area. You can always take advantage of our *.rhcloud.com wildcard certificate in order to securely connect to any application via its original, OpenShift-provided hostname URL.
Tornado is very simple; I took my first steps in Telegram bot development using this server on the OpenShift platform.
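For a sense of how little code that takes, here is a minimal Tornado app; the webhook route is only illustrative:

```python
import tornado.escape
import tornado.ioloop
import tornado.web

class WebhookHandler(tornado.web.RequestHandler):
    def post(self):
        # Telegram posts each update as a JSON body to the webhook URL
        update = tornado.escape.json_decode(self.request.body)
        # handle the update here
        self.write({"ok": True})

if __name__ == "__main__":
    app = tornado.web.Application([(r"/webhook", WebhookHandler)])
    app.listen(8080)
    tornado.ioloop.IOLoop.current().start()
```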
I have searched the forums for my question, but either I'm naming the thing I'm searching for wrongly, or the question is hard, which I really doubt.
I am developing a web app that will have a web interface written in one of the MVC frameworks, like Django or even Flask, and will allow a user to log in, identify the user's session, and let the user make some settings. My app also needs to run some Python process (a script which is basically a separate file) on the server, on a per-session, per-settings basis. This process is quite long (it can take days to finish) and shouldn't affect the execution or performance of the MVC part of the app. Another issue is that this process should be run per user, so the basic usage model of such an app would be:
1. The user enters the site.
2. The user makes some settings, which are mirrored to the database.
3. The user pushes the launch button, which executes some Python script just for this user, with the settings they have made.
4. The user is able to monitor some parameters of the running script, based on messages the script itself generates.
I do understand that my question relates to the architecture of the app itself. I'm quite new to Python and have no experience developing such a complex application, but I'm eager to learn about it. I understand the bricks from which my app should be built (like Django or Flask, and the server-side script itself), but I know very little about how these elements should be glued together into a seamless environment. Please direct me to some articles on this topic, recommend some similar threads, or just give a clear high-level explanation of how such separate Python processes could be triggered, run, and monitored on a per-user basis from the controller part of the MVC app.
Celery is a great solution, but it can be overkill for many setups. If you just need tasks to run periodically (once an hour, once a day, etc.), consider just using cron.
There's a lot less setup and it can get you quite far.
Celery is the perfect solution for your purposes.
Celery can easily run long tasks, but you have to write the monitoring part yourself. That's simple: you can use the Django ORM from a Celery task.
Do not use the django-celery or flask-celery applications; they are deprecated.
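A rough sketch of that monitoring idea; the Job model, its fields, and do_step are all invented for illustration:

```python
import time

from celery import shared_task

from myapp.models import Job  # assumed model with status/progress/total_steps

def do_step(job, step):
    time.sleep(1)             # stand-in for the real per-user work

@shared_task
def run_user_job(job_id):
    job = Job.objects.get(pk=job_id)
    job.status = "running"
    job.save(update_fields=["status"])
    for step in range(job.total_steps):
        do_step(job, step)
        job.progress = step + 1              # plain ORM writes from the task
        job.save(update_fields=["progress"])
    job.status = "done"
    job.save(update_fields=["status"])
```

The view behind the launch button then just calls run_user_job.delay(job.id), and the monitoring page reads the same row back.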
This seems like a simple question, but I am having trouble finding the answer.
I am making a web app which would require the constant running of a task.
I'll use sites like Pingdom or Twitterfeed as an analogy. As you may know, Pingdom checks uptime, so it is constantly checking websites to see if they are up, and Twitterfeed checks RSS feeds to see if they've changed and then tweets the changes. I too need to run a simple script that cycles through URLs in a database and performs an action on them.
My question is: how should I implement this? I am familiar with cron, currently using it to do my server backups. Would this be the way to go?
I know how to make a Python script that runs indefinitely, starting back at the beginning of the database when it reaches the last URL. Should I just run that on the server? How will I know it is always running and hasn't crashed or something?
I hope this question makes sense and I hope I am not repeating someone else or anything.
Thank you,
Sam
Edit: To be clear, I need the task to run constantly. As in, check URL 1 in the database, check URL 2 in the database, check URL 3, and, when it reaches the last one, go right back to the beginning. Thanks!
If you need a repeatable task that can be run from the command line, that's exactly what cron is ideal for.
I don't see any drawbacks to this approach.
Update:
Okay, I saw the issue somewhat differently at first. Now I see several solutions:
a) run the cron task at set intervals and let it process one batch of data per run, picking up the next batch on the next run; use PID files, the database, or semaphores to avoid parallel processes (see the sketch below);
b) update the processes that insert/update data in the database, so the information is processed as it is inserted/updated;
c) write a daemon process that resides in memory and checks the data in real time.
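For option a), a minimal sketch of the PID-file guard (Unix-only; the DB query and per-URL action are stand-ins):

```python
import os
import sys

PIDFILE = "/tmp/url_checker.pid"   # placeholder lock path

def fetch_urls_from_db():
    return ["http://example.com"]  # stand-in for the real DB query

def check_url(url):
    print("checking", url)         # stand-in for the real per-URL action

def already_running():
    try:
        with open(PIDFILE) as f:
            pid = int(f.read())
        os.kill(pid, 0)            # signal 0: raises OSError if pid is gone
        return True
    except (IOError, ValueError, OSError):
        return False

def main():
    if already_running():
        sys.exit(0)                # previous run still going; back off
    with open(PIDFILE, "w") as f:
        f.write(str(os.getpid()))
    try:
        for url in fetch_urls_from_db():
            check_url(url)
    finally:
        os.remove(PIDFILE)

if __name__ == "__main__":
    main()
```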
cron would definitely be a way to go with this, as well as any other task scheduler you may prefer.
The main point is found in the title to your question:
Run a repeating task for a web app
The background task and the web application should be kept separate. They can share code, they can share access to a database, but they should be separate and discrete application contexts. (Consider them as separate UIs accessing the same back-end logic.)
The main reason for this is because web applications and background processes are architecturally very different and aren't meant to be mixed. Consider the structure of a web application being held within a web server (Apache, IIS, etc.). When is the application "running"? When it is "on"? It's not really a running task. It's a service waiting for input (requests) to handle and generate output (responses) and then go back to waiting.
Web applications are for responding to requests. Scheduled tasks or daemon jobs are for running repeated processes in the background. Keeping the two separate will make your management of the two a lot easier.
I have a website that, right now, runs by creating static HTML pages from a cron job that runs nightly.
I'd like to add some search and filtering features using a CGI-type script, but my script will have enough startup time (maybe a few seconds?) that I'd like it to stay resident and serve multiple requests.
This is a side-project I'm doing for fun, and it's not going to be super complex. I don't mind using something like Pylons, but I don't feel like I need or want an ORM layer.
What would be a reasonable approach here?
EDIT: I wanted to point out that for the load I'm expecting and processing I need to do on a request, I'm confident that a single python script in a single process could handle all requests without any slowdowns, especially since my dataset would be memory-resident.
That's exactly what WSGI is for ;)
Offhand, I don't know the simplest way to turn a CGI script into a WSGI application (I've always had that managed by a framework), but it shouldn't be too tricky.
That said, An Introduction to the Python Web Server Gateway Interface (WSGI) seems to be a reasonable introduction, and you'll also want to take a look at mod_wsgi (assuming you're using Apache…).
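For a sense of scale, a complete WSGI application is just a callable, and module-level state is what keeps your dataset resident between requests; the DATA dict here is a stand-in:

```python
# A complete WSGI app. Under mod_wsgi (or any long-lived WSGI server) the
# process survives between requests, so module-level state like DATA is
# loaded once and stays resident.
DATA = {"example": "loaded once at startup"}  # stand-in for the real dataset

def application(environ, start_response):
    query = environ.get("QUERY_STRING", "")
    body = ("searched %r against %d entries" % (query, len(DATA))).encode()
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]

if __name__ == "__main__":
    # quick local test without Apache, using the stdlib reference server
    from wsgiref.simple_server import make_server
    make_server("", 8000, application).serve_forever()
```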
Maybe you should direct your search towards inter-process communication and make a search process that returns the results to the web server. This search process would be running all the time, assuming you have your own server.
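One way to sketch that IPC idea (the socket path and search logic are invented): a resident process pays the startup cost once and answers queries over a Unix domain socket, while the web-facing script just connects, sends the query, and reads the reply.

```python
import os
import socket

SOCK_PATH = "/tmp/search.sock"  # placeholder socket path

def search(query):
    # stand-in for querying the slow-to-load, memory-resident index
    return "results for %r" % query

def serve():
    if os.path.exists(SOCK_PATH):
        os.remove(SOCK_PATH)
    srv = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    srv.bind(SOCK_PATH)
    srv.listen(5)
    while True:                 # stays resident; startup cost is paid once
        conn, _ = srv.accept()
        query = conn.recv(4096).decode()
        conn.sendall(search(query).encode())
        conn.close()

if __name__ == "__main__":
    serve()
```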