I have a Python Flask application where a user requests data by submitting a query from a form. This data goes through a Python script that makes some API requests and converts the data into a geographic standard.
The problem is that, because this can take some time depending on how many data points there are, the work has to happen in the background (we are researching this for Azure). There is also the problem of queueing: while one request is running, another one cannot be started. And the last command cannot be saved:
@app.route('/handle_data', methods=['POST'])
def handle_data():
    sper_year = int(request.form["Speryear"])
    email = request.form["inputEmail"]
    url = request.form["api-url-input"]
    random_string = get_random_string(5)
    # app.route('/request-completed')
    Requested_data = Program_Converter.main(url, sper_year, random_string)
Requested_data = Program_Converter.main(url, sper_year, random_string) is the call that needs to be queued.
How do I do this?
I believe the most recommended way is to run this task asynchronously. Take a look at Celery and pick a broker/result backend (I recommend Redis). With this setup you can give Celery a task that runs your GIPOD_Converter process in the background and stores the result somewhere you choose; it can then be sent back to the user.
Note that Celery will give you a task id, and it is up to your client (web interface or mobile app, I'm not sure what you're working with) to poll an endpoint and wait for the Celery task to finish.
There are a couple of examples of how to implement this all over the web, but take a look at the Flask official documentation and check out this article from the mighty Miguel Grinberg; I believe those are the best starting points for you.
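As a rough illustration, a minimal sketch of that setup could look like the following, assuming Redis on localhost and the Program_Converter module from your question (endpoint and task names are made up):

# tasks.py - a minimal sketch, assuming Redis is running on localhost:6379.
from celery import Celery
import Program_Converter

celery_app = Celery('tasks',
                    broker='redis://localhost:6379/0',
                    backend='redis://localhost:6379/0')

@celery_app.task
def convert(url, sper_year, random_string):
    # Runs in a Celery worker process, not in the Flask process.
    return Program_Converter.main(url, sper_year, random_string)

The Flask view then enqueues the task instead of calling the converter directly, and hands the client a task id it can poll:

# In the Flask app: enqueue, then let the client poll for the result.
from flask import jsonify
from tasks import convert

@app.route('/handle_data', methods=['POST'])
def handle_data():
    sper_year = int(request.form["Speryear"])
    url = request.form["api-url-input"]
    task = convert.delay(url, sper_year, get_random_string(5))
    return jsonify({'task_id': task.id}), 202  # accepted; running in the background

@app.route('/status/<task_id>')
def status(task_id):
    return jsonify({'state': convert.AsyncResult(task_id).state})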
I have a Flask application. It works with a database; I'm using SQLAlchemy for this. So I have one question:
Flask handles requests one by one. So, for example, I have two users who are modifying the same record in a database table, say A and B (they are concurrent).
How can I tell user B that user A has changed this record? There must be some message to user B.
In the development server, when you do app.run(), you get a single synchronous process, which means at most one request is processed at a time, so you cannot serve multiple users concurrently.
However, gunicorn is a solid, easy-to-use WSGI server that will let you spawn multiple workers (separate processes), and it even comes with asynchronous workers for when you need to deploy your application.
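For example, a typical invocation (the module path myapp:app is a placeholder for your own application):

gunicorn --workers 4 myapp:app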
However, to answer your question: since the workers are separate processes, each query sees whatever data exists in the database at the moment it runs in that worker.
I hope this answers your query.
I am writing a GAE application that when it starts needs to initialise a connection to a third party service, and then run a continuous check in the background (essentially pulling data from third party and pushing it to a GAE task queue)
I know that backends get a call to /_ah/start which initialises them and lets GAE know the backend has started. Is it safe to start the pull process from StartHandler, i.e.
f = urllib2.urlopen(...)  # truncated in the original; opens the third-party feed
for l in f:
    deferred.defer(doMyStuff, l)
I think the answer is to have a StartHandler along the lines of:
class StartHandler(webapp2.RequestHandler):
    def get(self):
        logging.info("Handler started")
        key = self.request.get('key')
        taskqueue.add('/backend/startdata', params={'key': key}, target='1.backend0')
and then have the handler for /backend/startdata run the loop.
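A sketch of what that /backend/startdata handler might look like (the URL constant is a placeholder, and doMyStuff is the worker function from the snippet above):

import urllib2
import webapp2
from google.appengine.ext import deferred

THIRD_PARTY_URL = 'http://example.com/feed'  # placeholder

class StartDataHandler(webapp2.RequestHandler):
    def post(self):  # taskqueue.add() issues a POST by default
        key = self.request.get('key')  # available if doMyStuff needs it
        f = urllib2.urlopen(THIRD_PARTY_URL)
        for l in f:
            deferred.defer(doMyStuff, l)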
Advice and comments welcome.
To answer my own question: Google App Engine will not let this work. I gave up and used a different cloud provider, because life's too short, and Python should be Python, anywhere.
I'm pretty new to web development, so I'm just trying to see if I have the big picture right for what I am trying to do. Forgive me if any terminology is wrong.
My Django app needs to do the following:
User uploads a file through his browser
File is processed by the server (can take up to an hour)
User sees the results in his browser
I'm having trouble with how to accomplish step 2... here is what I am thinking:
1. User uploads a file (pretty straightforward)
2. File is processed - a view function would go something like this:

def process(request):
    # a. get the file from the request
    # b. return a page which says "the server is running your job,
    #    results will be available in {ETA}"
    # c. start processing the data
    ...

3. User sees the results in his browser - the browser queries the server at regular intervals to see if the job is done. When the job is ready, the browser gets the results.
My question is, in step 2 parts b and c, how can I return a response to the browser without waiting for the processing to finish? Or, how can I ensure the processing keeps running after I return the response to the browser? The process should ideally have access to the Django environment, as it will work with a database through Django's interface.
You need to offload the processing. You could use django-celery.
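A rough sketch of that pattern (process_file and UploadedJob are hypothetical names; this assumes Celery is configured for your Django project):

# tasks.py
from celery import shared_task
from myapp.models import UploadedJob  # hypothetical model holding the file and a status field

@shared_task
def process_file(job_id):
    job = UploadedJob.objects.get(pk=job_id)
    # ... the long-running processing, up to an hour ...
    job.status = 'done'
    job.save()

# views.py - return immediately so the browser can poll a status endpoint
from django.shortcuts import render
from myapp.tasks import process_file
from myapp.models import UploadedJob

def process(request):
    job = UploadedJob.objects.create(file=request.FILES['file'], status='pending')
    process_file.delay(job.pk)
    return render(request, 'processing.html', {'job_id': job.pk})

The browser can then poll a small status view for the job's status until it flips to 'done'.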
I have a Celery task registered in my tasks.py file. When someone POSTs to /run/pk I run the task with the given parameters. This task also executes other tasks (normal Python functions), and I'd like to update my page (the HttpResponse returned at /run/pk) whenever a subtask finishes its work.
Here is my task:
from celery.decorators import task

@task
def run(project, branch=None):
    if branch is None:
        branch = project.branch
    print 'Creating the virtualenv'
    create_virtualenv(project, branch)
    print 'Virtualenv created'  ##### Here I want to send a signal or something to update my page
    runner = runner(project, branch)
    print 'Using {0}'.format(runner)
    try:
        result, output = runner.run()
    except Exception as e:
        print 'Error: {0}'.format(e)
        return False
    print 'Finished'
    run = Run(project=project, branch=branch,
              output=output, **result._asdict())
    run.save()
    return True
Sending push notifications to the client's browser using Django isn't easy, unfortunately. The simplest implementation is to have the client continuously poll the server for updates, but that increases the amount of work your server has to do by a lot. Here's a better explanation of your different options:
Django Push HTTP Response to users
If you weren't using Django, you'd use websockets for these notifications. However Django isn't built for using websockets. Here is a good explanation of why this is, and some suggestions for how to go about using websockets:
Making moves w/ websockets and python / django ( / twisted? )
With many years having passed since this question was asked, Channels is a way you could now achieve this with Django.
The Channels website describes it as a "project to make Django able to handle more than just plain HTTP requests, including WebSockets and HTTP2, as well as the ability to run code after a response has been sent for things like thumbnailing or background calculation."
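As a rough illustration (assuming Channels 2+ with a channel layer configured; the consumer and group names are made up), the page opens a WebSocket to a consumer, and your task pushes progress messages to its group:

import json
from asgiref.sync import async_to_sync
from channels.generic.websocket import WebsocketConsumer

class ProgressConsumer(WebsocketConsumer):
    def connect(self):
        self.accept()
        async_to_sync(self.channel_layer.group_add)('progress', self.channel_name)

    def progress_message(self, event):
        # Invoked for group messages sent with {'type': 'progress.message', ...}
        self.send(text_data=json.dumps({'status': event['status']}))

The task side would then call channel_layer.group_send('progress', {'type': 'progress.message', 'status': 'Virtualenv created'}) (wrapped in async_to_sync) at each step.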
There is a service called Pusher that will take care of all the messy parts of push notifications in HTML5. They supply client-side and server libraries to handle all the messaging and notifications, while taking care of all the HTML5 WebSocket nuances.
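On the server side that boils down to something like this (a sketch based on the pusher Python package; the credentials, channel and event names are placeholders):

import pusher

pusher_client = pusher.Pusher(app_id='APP_ID', key='KEY',
                              secret='SECRET', cluster='CLUSTER')
# The page subscribes to the 'jobs' channel with the Pusher JS client
# and reacts to 'step-finished' events.
pusher_client.trigger('jobs', 'step-finished', {'status': 'Virtualenv created'})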
I want to schedule an email to be sent to a user upon a specific action.
However, if the user takes another action I want to cancel that email and have it not send.
How would I do that in Django or Python?
Beanstalkd
If you can install beanstalkd and run a Python script from the command line, I would use that to schedule emails. With the beanstalkc client you can accomplish this easily. On Ubuntu you might first need to install:
sudo apt-get install python-yaml python-setuptools
consumer.py:
import beanstalkc

def main():
    beanstalk = beanstalkc.Connection(host='localhost', port=11300)
    while True:
        job = beanstalk.reserve()
        print job.body
        job.delete()

if __name__ == '__main__':
    main()
This will print the job 5 seconds after it gets inserted by producer.py. Of course the delay should be set to whenever you want your emails to go out, but for demonstration purposes this will do. You don't want to wait half an hour for a scheduled message when testing ;).
producer.py:
import beanstalkc

def main():
    beanstalk = beanstalkc.Connection(host='localhost', port=11300)
    jid = beanstalk.put('foo', delay=5)

if __name__ == '__main__':
    main()
GAE Task Queue
You could also use the Google App Engine task queue to accomplish this. You can specify an eta for your task, and Google App Engine has a generous free quota. In the task queue webhook, make asynchronous requests to fetch the URL on your server which does the actual sending of emails.
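A sketch of the scheduling side (the handler URL, parameters and task name are illustrative):

import datetime
from google.appengine.api import taskqueue

eta = datetime.datetime.utcnow() + datetime.timedelta(minutes=30)
taskqueue.add(url='/tasks/send_email',
              params={'user_id': '123'},
              name='email-user-123',  # a named task can be deleted later
              eta=eta)
# To cancel before it runs:
# taskqueue.Queue('default').delete_tasks(taskqueue.Task(name='email-user-123'))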
I would set up a cron job which could handle everything you want to do...
If you didn't have access to cron, you could easily do this:
Write a model that stores the email, the time to send, and a BooleanField indicating whether the email has been sent.
Write a view which selects all emails that haven't been sent yet but are now due, and sends them (a sketch of both follows below).
Use something like OpenACS Uptime, Pingdom or any other service capable of sending periodic HTTP GET requests to call that view and trigger the email sending. (Both are free; the former requests once every 15 minutes, and the latter can be configured to request as often as every minute, from several locations.)
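A sketch of the model and view (all names are hypothetical; in practice these live in models.py and views.py):

from django.db import models
from django.core.mail import send_mail
from django.http import HttpResponse
from django.utils import timezone

class ScheduledEmail(models.Model):
    recipient = models.EmailField()
    subject = models.CharField(max_length=200)
    body = models.TextField()
    send_at = models.DateTimeField()
    sent = models.BooleanField(default=False)

def send_due_emails(request):
    # Cancelling an email is just deleting its row before this view runs.
    due = ScheduledEmail.objects.filter(sent=False, send_at__lte=timezone.now())
    count = 0
    for e in due:
        send_mail(e.subject, e.body, 'noreply@example.com', [e.recipient])
        e.sent = True
        e.save()
        count += 1
    return HttpResponse('sent %d emails' % count)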
Sure, it's inelegant, but it's a method that works on basically any web host. I used to do something like this when I was writing PHP apps to run on a host that killed all processes after something like 15 seconds.
Are you using Celery? If so, see http://ask.github.com/celery/userguide/executing.html#eta-and-countdown
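With Celery the whole schedule-then-cancel flow is a few lines (send_email here is a hypothetical task in your project):

from myapp.tasks import send_email  # hypothetical Celery task

# Schedule: send in 5 minutes.
result = send_email.apply_async(args=['user@example.com'], countdown=300)
saved_task_id = result.id  # persist this, e.g. on the user's record

# Later, if the user takes the other action, cancel it
# (revoke prevents the task from running if it hasn't started yet):
send_email.AsyncResult(saved_task_id).revoke()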
You said that you want to do it through Python or Django, but it seems as though something else will need to be involved. Considering you are on a shared host, there is a chance installing other packages could also be a problem.
Another possible solution could be something like this:
Use a JavaScript framework which can set up timed events, start/cancel them, etc. I have done timed events using a framework called ExtJS. Although ExtJS is rather large, I'm sure you could do a similar thing with other frameworks such as jQuery, or even raw JavaScript.
Set up a task on a user action that will execute in 5 minutes. The action could be an AJAX call to a Python script which sends the email... If the user does something where the task needs to be stopped, just cancel the event.
It kind of seems complicated and convoluted, but it really isn't. If this seems like a path you would like to try out, let me know and I'll edit with some code