I have built an app using Google App Engine in Python for submitting weekly project and assessment reports.
On Friday I want to check who has submitted the report and who hasn't, and then send a scheduled notification email telling each person who hasn't that they did not submit last week's report.
On Monday I don't want the notification mail to go to anyone who submitted the report last week, only to those who haven't.
Please suggest an idea for how to do this.
It's hard to be sure exactly what you want, but in any case: besides Task Queues, which are much more flexible and powerful (and may be harder to use for simple jobs that cron covers perfectly), you can use cron to schedule App Engine tasks in Python by following the instructions here.
Not sure exactly what you want, but anything you can do with cron can also be done via Task Queues in GAE, so read this: http://code.google.com/appengine/docs/python/taskqueue/
App Engine applications can perform background processing by inserting tasks (modeled as web hooks) into a queue. App Engine will detect the presence of new, ready-to-execute tasks and automatically dispatch them for execution, subject to scheduling criteria.
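For the weekly reminder itself, a minimal sketch of the cron approach could look like the following. The URL, the Report model, and the all_user_emails() helper are hypothetical names standing in for whatever your app already has; the mail call uses App Engine's mail API.

In cron.yaml:

cron:
- description: "Weekly report reminder"
  url: /tasks/send_reminders
  schedule: every friday 17:00

And a handler, for example in reminders.py:

import datetime

import webapp2
from google.appengine.api import mail
from google.appengine.ext import ndb

class Report(ndb.Model):
    # hypothetical model: who submitted, and for which week
    author_email = ndb.StringProperty()
    week_start = ndb.DateProperty()

class SendReminders(webapp2.RequestHandler):
    def get(self):
        # only accept requests that really come from App Engine cron
        if self.request.headers.get('X-Appengine-Cron') != 'true':
            self.abort(403)
        week_start = datetime.date.today() - datetime.timedelta(days=7)
        submitted = set(r.author_email
                        for r in Report.query(Report.week_start >= week_start))
        for email in all_user_emails():  # hypothetical helper listing every user's address
            if email not in submitted:
                mail.send_mail(sender='reports@your-app-id.appspotmail.com',
                               to=email,
                               subject='Weekly report missing',
                               body='You have not submitted last week\'s report.')

app = webapp2.WSGIApplication([('/tasks/send_reminders', SendReminders)])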
I am currently working on a Django project hosted on PythonAnywhere that includes a notifications feature, and the app also receives data externally from sensors through AWS. I have been thinking about the best practice for implementing this.
My current implementation is simple: a view that checks all notifications and performs the required actions, with an always-on task (which simply means a script running independently) sending a REST request to the server every minute.
Server side:
views.py:
def checkNotifications(request):
    notificationsObject = notifications.objects.order_by('thing').values_list('thing').distinct()
    thingsList = list(notificationsObject)
    for thing in thingsList:
        valuesDic = returnAllField(thing)
        thingNotifications = notifications.objects.filter(thing=thing)
        # Do stuff for each notification
urls:
path('notifications/',views.checkNotifications,name="checkNotification")
and the client just sends a GET request to my URL /notifications/, which works.
Now, while researching I saw some other options such as the ones discussed here with django background tasks and/or celery:
How to initialize repeating tasks using Django Background Tasks?
Celery task best practices in Django/Python
as well as some other options.
My question is: is there a benefit to moving from my first implementation to one of these? The only benefit I can see directly is avoiding abuse from another service hitting my URL to check notifications too often, but I can (and do) require authentication to prevent that. Also, is there a "best practice" here? Given that I am running this repeating check so often, it feels like there should be a cleaner solution. For one, I am not sure whether running a repeating task is the best option on PythonAnywhere.
(https://help.pythonanywhere.com/pages/AsyncInWebApps/ suggests using always-on tasks, but it also mentions django background tasks)
Thank you
To use Django Background Tasks on PythonAnywhere you need to run it from an always-on task, so it is not an alternative to always-on tasks, just another way of using them.
You can also access your Django code in your always-on task directly with some kind of long-running management command, so you do not need to hit your web app with a special request.
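For example, the always-on task could run a long-lived custom management command instead of curling the view every minute. A minimal sketch, assuming the notifications model from the question lives in a Django app called notifications:

# notifications/management/commands/check_notifications.py
import time

from django.core.management.base import BaseCommand

from notifications.models import notifications  # model name from the question

class Command(BaseCommand):
    help = 'Check notifications once a minute; meant to run as an always-on task'

    def handle(self, *args, **options):
        while True:
            things = (notifications.objects.order_by('thing')
                      .values_list('thing', flat=True).distinct())
            for thing in things:
                for notification in notifications.objects.filter(thing=thing):
                    pass  # do stuff for each notification, as in the view
            time.sleep(60)

The always-on task then just runs: python manage.py check_notifications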
I'm trying to learn Google App Engine (and general web app programming) by building a simple app that periodically polls a radio station's RSS feed (~1 request/min), writes the result to a database, and updates a Spotify playlist with the current song. I am using Python with the Flask framework for the web app.
I have a simple front-end site which implements the Spotify authentication protocol; however, I am now struggling with the best way to poll information from the RSS feed in the background. I have looked into using the deferred task workflow with Google App Engine Task Queues, but it seems like cron might be a better option for something this simple.
The Google App Engine cron docs say to implement a URL call, which is then handled in my app. Is this handled by my Flask URL handlers (i.e. routes), or by the App Engine handlers? My initial thought was that it would look something like this:
In the cron.yaml file:
cron:
- description: "Poll Song RSS"
  url: /playlistupdate
  schedule: every 1 minutes
And then in my routes.py I would have a route to do the work:
@app.route('/playlistupdate')
def playlistupdate():
    # Send HTTP request to RSS site, store results in SQLite db,
    # add the song to the Spotify playlist via the Spotify API
    pass
Is this the right idea? Or am I missing something about how the cron flow should work? What happens if a user tries to go to http://[MY_HOSTNAME]/playlistupdate?
Any help on what my options are for a simple background polling flow like this, and how it would work with the Flask framework would be much appreciated. Thanks in advance.
Yes, use cron.
Yes, you could implement the handler exactly as you describe.
Yes, you should secure the endpoint; otherwise anyone would be able to access it.
See here for a way to secure the endpoint.
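One documented way to do that on App Engine, sketched here in Flask, is to check the X-Appengine-Cron header: App Engine strips that header from any request arriving from outside, so only genuine cron requests carry it. The handler body below is just a placeholder.

from flask import Flask, request, abort

app = Flask(__name__)

@app.route('/playlistupdate')
def playlistupdate():
    # real cron requests carry this header; external callers cannot forge it
    if request.headers.get('X-Appengine-Cron') != 'true':
        abort(403)
    # poll the RSS feed, store the result, update the Spotify playlist here
    return 'ok'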
I'm trying to build an app in Python with Google App Engine that fetches the followers of specific accounts and then their tweets. I'm basing it on this template and adapting it to what I need.
The issue at the moment is that when I try to fetch followers, I get a DeadlineExceededError because of the time spent waiting on the Twitter API.
I have found this post on how to fix the same problem and I think that in my case the best solution would be to use backends, but I noticed that they are deprecated.
Does someone know how I can achieve the same result without the deprecated module?
You have a couple options that you can use for long-running tasks:
Use GAE Task Queues: GAE provides push and pull queues which allow you to do work asynchronously, outside of the individual request (see the sketch after this list).
Use Cloud Pub/Sub: A type of pull queue, this would allow your App Engine app to publish a message every time you wanted fetch followers or fetch tweets. The subscriber would then take the message from the queue, perform a long-running task, and then put the result into some datastore.
Use GAE Services: This would allow you to create a background service and manually scale it to run as long as you need.
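As a sketch of the first option: the deferred library bundled with the Python 2.7 runtime lets you push a function call onto a task queue from inside a normal request, and the queued task gets roughly ten minutes instead of the 60 seconds a user-facing request gets. fetch_followers and account_id are hypothetical stand-ins for your Twitter code:

from google.appengine.ext import deferred

def fetch_followers(account_id):
    # long-running Twitter API calls go here, outside the user request
    pass

def handle_request(account_id):
    # instead of doing the slow work inline, enqueue it and return immediately
    deferred.defer(fetch_followers, account_id)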
Backends (modules) have been deprecated in favor of Services:
https://cloud.google.com/appengine/docs/flexible/python/an-overview-of-app-engine
For the service that needs to handle requests longer than 60 seconds, set it to manual scaling. A request can then run for up to 24 hours (or until you shut the instance down). See:
https://cloud.google.com/appengine/docs/standard/python/how-instances-are-managed#instance_scaling
Of course, your costs may go up with long-running instances and requests.
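For instance, a separate manually scaled service could be declared with a yaml file along these lines (the service name and script path are hypothetical):

# worker.yaml -- a long-running 'worker' service on the standard environment
service: worker
runtime: python27
api_version: 1
threadsafe: true

manual_scaling:
  instances: 1

handlers:
- url: /.*
  script: worker.app  # a WSGI application object defined in worker.py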
I am thinking about implementing resource throttling in my Google App Engine application.
My idea is to check whether I am running out of resources (for example, bandwidth) and disable part of the website, using the last portion of the available daily traffic to inform users that the site is running in a "resource saving" mode.
I read the GAE documentation, but all I found is that once I run out of traffic, App Engine directly returns HTTP 403.
Is there a way to make my Python application aware of the resources it has already used, so it does not have to be so abrupt with my users?
Unfortunately this is not possible; there is no API that you can use for this.
Looking at the App Engine roadmap there is no such feature coming along any time soon.
The only thing I can recommend is that you sign up for billing and receive the $50 free quota; the offer runs until 31 October. You can enable billing, then disable it, and keep the free $50!
Hope this helped.
I am designing a Python web app where people can have an email sent to them on a particular day. A user enters their email address and a date in a form, and it gets stored in my database.
My script would then search the database for all records with today's date, retrieve the email addresses, send the emails out, and delete the entries from the table.
Is it possible to have a setup where the script starts up automatically at a given time, say 1 pm every day, sends out the emails, and then quits? If I have a continuously running script, I might go over the CPU limit of my shared web hosting. Or is the effect negligible?
Ali
Is it possible to have a setup where the script starts up automatically at a given time, say 1 pm every day, sends out the email and then quits?
It's certainly possible in general, but it depends entirely on what your shared web hosting provider offers. For this purpose you'd use some kind of cron on any version or variant of Unix, scheduled tasks on Google App Engine, and so on. But since you tell us nothing about your provider and what services it offers you, we can't guess whether it makes such functionality available at all, or in what form.
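If the host does offer cron, the whole job reduces to one crontab entry that fires the script once a day and lets it exit on its own; the interpreter path and script path below are hypothetical:

# run the mailing script every day at 13:00
0 13 * * * /usr/bin/python /home/youruser/send_scheduled_emails.py

Because the script only runs for a few seconds a day and then quits, nothing stays resident, so the CPU impact on a shared host should be negligible compared with a continuously running process.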
(Incidentally, this isn't really a programming question, so if you want to post more details and get help, you might have better luck at serverfault.com, the companion site to stackoverflow.com that deals with system administration questions.)