I'm in the process of porting my local Django app to Heroku and am hitting a snag, mainly with my environment variables. I can't very well create a .env file on my web server; it would just get overwritten the next time I push from GitHub. So I've set environment variables using heroku config:set VAR='' --app <app>. These seem to work; however, I'm working with an API that requires me to refresh my token every 60 minutes. Locally, I developed a method that updates my .env file every time the token-refresh task executes, but this solution clearly isn't sufficient for my web server. I've attempted to update server-level variables from Python, but I don't think that's possible. Has anyone had to deal with an issue like this? Am I approaching this all wrong?
What is the best way for me to update an environment variable on a web server (i.e., heroku config:set VAR='' --app <app>, but from my Python code)? I need this variable to update every 60 minutes (I already have the Celery task code built for this). Should I modify the task to simply write the token to a text file and use that file as my "web server .env"? I'm really lost here, so any help would be much appreciated. Thanks!
EDIT:
As requested, here's more information:
I'm building some middleware for two systems. The first system posts a record to my Django API. This event kicks off a task that subsequently updates a separate financial system. That financial system's API requires two things: an auth_code and an access_token. The access_token must be refreshed every 60 minutes.
I have a refresh_token that I use to get a new access_token. The refresh_token expires every 365 days, so within that window I can simply reuse it every time I request a new access_token.
My app is in the really early stages and doesn't require anything but a simple API POST from the first system to kick off this process. It will eventually be built out to require my own sort of auth_token for access to my Django API.
first system --> Django App --> Finance System
https://developer.blackbaud.com/skyapi/docs/authorization/auth-code-flow/tutorial
Process:
I currently have a Celery task that runs in the background every 55 minutes. This task fetches a new access_token and recreates my .env file with it (a sketch of this task is below).
I have a separate Celery task that runs an ETL pipeline and needs the access_token to post to the financial system's API.
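For reference, here is a rough sketch of what that refresh task could look like. The token endpoint and parameter names follow the linked Blackbaud tutorial and should be verified there; Django's cache (backed by something like Redis or Memcached) deliberately stands in for the .env file, since Heroku's ephemeral filesystem won't preserve file writes across restarts, and SKY_CLIENT_ID, SKY_CLIENT_SECRET, and REFRESH_TOKEN are placeholder config vars:

    # tasks.py -- rough sketch, not a drop-in implementation.
    import os

    import requests
    from celery import shared_task
    from django.core.cache import cache

    # OAuth token endpoint per the Blackbaud auth-code-flow tutorial linked above.
    TOKEN_URL = "https://oauth2.sky.blackbaud.com/token"


    @shared_task
    def refresh_access_token():
        """Runs every 55 minutes; swaps the refresh_token for a fresh access_token."""
        resp = requests.post(
            TOKEN_URL,
            data={
                "grant_type": "refresh_token",
                # Fall back to the Heroku config var on the very first run.
                "refresh_token": cache.get("refresh_token") or os.environ["REFRESH_TOKEN"],
            },
            auth=(os.environ["SKY_CLIENT_ID"], os.environ["SKY_CLIENT_SECRET"]),
        )
        resp.raise_for_status()
        payload = resp.json()
        # Stored in the cache instead of .env: survives dyno restarts if the
        # backend is Redis/Memcached, and is readable from any worker.
        cache.set("access_token", payload["access_token"], timeout=None)
        cache.set("refresh_token", payload["refresh_token"], timeout=None)

The ETL task can then read cache.get("access_token") instead of an environment variable, which sidesteps the need to mutate Heroku config vars from Python at all.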
Related
Problem: Habitica is a habit-tracking app, but its personal data logs are not as detailed as I want. I want to create a local log of when I mark off habits/todo's in the app. Habitica offers certain webhooks that trigger when habits/todo's are checked off, which seems perfect for what I want, but how do I turn these triggers into a local log? I would like to use Python for this.
Ideas: It seems to me that I would need to set up some kind of personal cloud server to receive this data, turn it into a log, and then store it for download. I have previously deployed a Flask app using Heroku, so if this could be done similarly, that would be ideal. However, I don't know much about this, so I would welcome any ideas or advice.
Creating the Habitica webhook as a Flask application is a good approach.
Heroku supports Python/Flask very nicely; however, the file system is ephemeral, so it gets wiped out at every application restart.
In order to persist data you can look at various options:
save the file to AWS S3
save the data into a DB (Heroku has a free plan for PostgreSQL); a minimal sketch of this option follows
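To make the second option concrete, here is a minimal sketch of the receiver, assuming Heroku's standard DATABASE_URL config var and an events table created ahead of time; the route name is a placeholder, and the shape of Habitica's payload should be checked against its webhook docs:

    # app.py -- minimal sketch; route name and table are placeholders.
    import json
    import os

    import psycopg2
    from flask import Flask, request

    app = Flask(__name__)


    @app.route("/habitica-webhook", methods=["POST"])
    def habitica_webhook():
        payload = request.get_json(force=True)
        conn = psycopg2.connect(os.environ["DATABASE_URL"])  # set by Heroku Postgres
        try:
            with conn, conn.cursor() as cur:  # the `with conn` block commits on success
                cur.execute(
                    "INSERT INTO events (received_at, raw) VALUES (now(), %s)",
                    [json.dumps(payload)],
                )
        finally:
            conn.close()
        return "", 204

Storing the raw JSON per delivery keeps the receiver simple; the habit log can be reconstructed or exported later with a plain SQL query.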
I'm trying to build an app in Python with Google App Engine that fetches the followers of specific accounts and then their tweets. I'm basing it on this template and adapting it to what I need.
The issue at the moment is that when I try to fetch followers, I get a DeadlineExceededError because the request spends too long waiting on the Twitter API.
I have found this post on how to fix the same problem and I think that in my case the best solution would be to use backends, but I noticed that they are deprecated.
Does someone know how I can achieve the same result without the deprecated module?
You have a couple of options for long-running tasks:
Use GAE Task Queues: GAE provides push and pull queues, which let you do work asynchronously outside of the individual request (see the sketch after this list).
Use Cloud Pub/Sub: A type of pull queue; this would allow your App Engine app to publish a message every time you want to fetch followers or fetch tweets. The subscriber would then take the message from the queue, perform the long-running work, and put the result into some datastore.
Use GAE Services: This would allow you to create a background service and manually scale it to run as long as you need.
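For the first option, here is a rough sketch of handing the Twitter call off to a push queue on the Python 2 standard environment; the handler paths and parameter names are placeholders. Push-queue tasks get roughly ten minutes under automatic scaling, rather than the 60-second request deadline:

    # main.py -- sketch of deferring work to a push queue (GAE standard, Python 2).
    import webapp2
    from google.appengine.api import taskqueue


    class KickoffHandler(webapp2.RequestHandler):
        def post(self):
            # Enqueue the slow work instead of calling Twitter in this request.
            taskqueue.add(
                url="/fetch-followers-worker",
                params={"screen_name": self.request.get("screen_name")},
            )
            self.response.set_status(202)  # accepted; work happens asynchronously


    class FetchFollowersWorker(webapp2.RequestHandler):
        def post(self):
            screen_name = self.request.get("screen_name")
            # ...call the Twitter API here and write the results to the datastore...


    app = webapp2.WSGIApplication([
        ("/kickoff", KickoffHandler),
        ("/fetch-followers-worker", FetchFollowersWorker),
    ])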
Backends (modules) have been deprecated in favor of Services:
https://cloud.google.com/appengine/docs/flexible/python/an-overview-of-app-engine
For the service that needs to handle requests longer than 60 seconds, set it to manual scaling. A request can then run for up to 24 hours (or until you shut the instance down). See:
https://cloud.google.com/appengine/docs/standard/python/how-instances-are-managed#instance_scaling
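A sketch of the app.yaml for such a manually scaled service (the service name and script path are placeholders):

    # app.yaml for the background service (GAE standard, Python 2.7 era).
    service: twitter-worker
    runtime: python27
    api_version: 1
    threadsafe: true

    manual_scaling:
      instances: 1   # runs until you stop it; requests may take up to 24 hours

    handlers:
    - url: /.*
      script: main.app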
Of course, your costs may go up with long-running instances and requests.
I've just started using Firebase + React to build a website. One of the planned features is to crawl data from another website (e.g., stock price changes) and show it to users. I already have a Python crawler responsible for parsing the data, but I have no idea how to execute this crawler (in the background) on my server in Firebase, or whether that is even possible.
Here is an example usage of my system:
a user logs in and subscribes to the website/data they are interested in
my crawler parses that website every hour and updates the data in the database
the user can see a summary of the website's changes from the database
One option I have in mind is running the crawler on my local machine and using the REST API to push the parsed data to the Firebase database (a sketch of this is below). However, it seems a very inefficient/naive approach, because it loses much of the point of deploying my server on a cloud service like Firebase.
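For what it's worth, that local-machine option is at least simple to wire up. Here is a minimal sketch against the Realtime Database REST API; the project URL, data paths, and payload fields are placeholders, and a real deployment would also need to send an auth credential rather than writing to an open database:

    # crawler_push.py -- sketch of pushing crawl results over the RTDB REST API.
    import requests

    FIREBASE_URL = "https://your-project.firebaseio.com"  # placeholder project URL


    def push_summary(ticker, summary):
        # PATCH merges the given fields into the node instead of replacing it.
        resp = requests.patch(
            "{}/stocks/{}.json".format(FIREBASE_URL, ticker),
            json=summary,
        )
        resp.raise_for_status()


    if __name__ == "__main__":
        push_summary("GOOG", {"price": 1234.5, "change_pct": -0.8})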
Firebase does not have any service/feature that allows you to periodically run Python or any other code. The closest thing to that is Cloud Functions, which can be triggered through an external service like cron-job.org.
For more on this, see:
Firebase Hosting Flask Application (on running Python on Firebase Hosting)
Using google cloud function to spawn a python script (for an elaborate way in which you might apparently run Python on Cloud Functions, though I never have, nor am I likely to ever try this myself)
Cloud Functions for Firebase trigger on time? (for running Cloud Functions periodically either through AppEngine, or cron-job.org).
I have a Django project and I need the ability to execute a function at a specified time on the Django server.
For example, if a client sends a friend request to another person and that person doesn't answer within (say) 7 days, the request should be removed automatically.
So I want to be able to call a function on the Django server at a specified time that is stored in a MySQL table.
Create a custom management command and a cron job to run it; you can also look into Django apps that manage cron jobs/repetitive tasks. I know this works on Linux (on Windows there should be alternatives; Task Scheduler comes to mind, but there must be others). A sketch of such a command is below.
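As an illustration of that approach, here is a sketch of the management command for the friend-request example; the FriendRequest model and its fields are hypothetical:

    # myapp/management/commands/expire_requests.py -- sketch of a custom command.
    from datetime import timedelta

    from django.core.management.base import BaseCommand
    from django.utils import timezone

    from myapp.models import FriendRequest  # hypothetical model


    class Command(BaseCommand):
        help = "Remove friend requests unanswered for 7 days"

        def handle(self, *args, **options):
            cutoff = timezone.now() - timedelta(days=7)
            expired = FriendRequest.objects.filter(answered=False, created__lt=cutoff)
            count = expired.count()
            expired.delete()
            self.stdout.write("Removed %d expired requests" % count)

And a crontab entry to run it, say every 30 minutes:

    */30 * * * * /path/to/venv/bin/python /path/to/project/manage.py expire_requests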
I'm trying to build a Django app that can monitor and interact with a remote database (interaction is basic: just performing look-ups and only occasionally making a small change to the remote data); it also sometimes has to store the remote data in its own database.
The website that sits on the remote database is a community website, and anyone without an account is allowed to post on its forums. I want the app to check the database every now and then for any updates in the discussions. The site gets at least 100 posts an hour, and since anyone can post on the forums without an account, it occasionally gets spammed; unfortunately, the CMS being used does not have a good anti-spam system set up.
The only way I can think of at the moment is to write a Python file that imports MySQLdb. I can connect to the remote MySQL server and select all the posts made in the last X minutes. Using a function that estimates the probability of a post being spam, I can then talk to the remote database again and flag the candidates so they are not displayed on the website. I can run this file "every now and then" using cron.
The problem here is a lack of control. I want a user interface that shows all the spam candidates on a single web page, with an "unflag" button so that accidentally flagged posts can be shown on the website again. This means I'm probably better off writing a Django web app than a standalone Python script that just flags spam candidates.
How would I have a Django app, or a function within that app (one that can perform everything the standalone script described above can), run automatically every now and then (say, every minute)?
Maybe you should try django-celery?
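As a sketch of what that could look like with django-celery (Celery 3.x style), where the Post model mapped onto the remote MySQL tables, its flagged column, and the is_probably_spam() scorer are all hypothetical stand-ins for the plan described in the question:

    # tasks.py -- sketch of the per-minute spam sweep (django-celery / Celery 3.x).
    from datetime import timedelta

    from celery.task import periodic_task
    from django.utils import timezone

    from forum.models import Post  # hypothetical model over the remote MySQL tables
    from spam.scoring import is_probably_spam  # hypothetical classifier


    @periodic_task(run_every=timedelta(minutes=1))
    def flag_recent_spam():
        cutoff = timezone.now() - timedelta(minutes=5)
        for post in Post.objects.filter(created__gte=cutoff, flagged=False):
            if is_probably_spam(post.body):
                post.flagged = True  # hidden on the site, reversible from the unflag UI
                post.save(update_fields=["flagged"])

The same Django models can then power the review page with the "unflag" button, so flagging and unflagging share one code path.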