I am looking to run commands that I would typically run through the Heroku CLI via a Python script, namely:
heroku pg:backups:capture
Historically, for running Heroku commands in Python I have used Heroku3.py, which works for me for things like restarting dynos. However, I am having difficulty finding a way to execute commands for add-ons, such as the one listed above.
Is there a way to call CLI commands through Python?
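If all you need is the CLI behavior, one option is to shell out to the Heroku CLI itself with subprocess. A minimal sketch, assuming the heroku CLI is installed and authenticated on the machine running the script (the app name my-app is a placeholder):

import subprocess

# Run `heroku pg:backups:capture` for a given app;
# check=True raises CalledProcessError if the CLI exits non-zero.
result = subprocess.run(
    ["heroku", "pg:backups:capture", "--app", "my-app"],
    capture_output=True,
    text=True,
    check=True,
)
print(result.stdout)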
I have a web app that I deployed to a machine running Ubuntu 20.
To run the app, I open an SSH session to the Ubuntu machine and then run:
cd mywebapp
python3 app.py
It works successfully, but once I close the SSH console, reboot the machine, or anything else happens, the app stops and I have to repeat these commands.
I tried to add it as a cron job to be run after the machine reboots, but it does not work.
I posted a question at the following link: run python app after server restart does not work using crontab
Nothing has worked for me, and I have to make sure that this web app is always running, because it sends push notifications to mobile devices.
Can anyone please advise? I have been searching and trying for a long time.
I'm not an expert in this, but two solutions come to mind:
1- Using systemd:
systemd can be responsible for keeping services up.
You can write a custom unit for your app and configure it to always stay up; see the sketch after this list.
This tutorial may be useful: writing unit
2- Using Docker:
Once your app is containerized, you can configure it to come back up on failure, restart, and so on.
Read about it here
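To make the systemd option concrete, here is a minimal sketch of a unit file; every name and path in it (mywebapp, /home/ubuntu/mywebapp) is a placeholder for your own setup:

# /etc/systemd/system/mywebapp.service
[Unit]
Description=My web app
After=network.target

[Service]
WorkingDirectory=/home/ubuntu/mywebapp
ExecStart=/usr/bin/python3 app.py
Restart=always

[Install]
WantedBy=multi-user.target

Then enable and start it with sudo systemctl enable --now mywebapp; systemd will restart the app on failure and start it again after a reboot.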
What if you put the Python invocation inside a bash script and run that as a daemon?
Your bash script could look like this (test.sh):
#!/bin/sh
cd desired/directory
python3 app.py
and you can run the bash script like this using nohup:
nohup ./test.sh 0<&- &>/dev/null &
You can refer to this if you want to store the output of nohup.
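For example, a variant that keeps the output in a log file instead of discarding it (app.log is an arbitrary name):

nohup ./test.sh > app.log 2>&1 &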
I am using Airflow locally to execute some ETL tasks (everything is done locally using Airflow, Python, and Docker), and I have a task which failed.
If I could use the PyCharm debugger it would be great. I am looking for a way to have PyCharm listen to what is happening on Airflow (localhost/airflow), so that once I run a task on Airflow I only need to jump to PyCharm to start debugging and see the logs.
I have read about the remote debug server, but in all the tutorials I have seen, people run their program from PyCharm with a main function inside the file.
What I want is to launch my task through Airflow and then jump to PyCharm to see the logs and start debugging.
So I started something, but I am not sure if this is the right way: when I try to add a remote interpreter to my project (Preferences/Interpreter/Add/Docker Compose), here is what I get.
(screenshot of the Add Interpreter dialog)
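For what it's worth, the usual pattern for this kind of setup is PyCharm's Python Debug Server: PyCharm listens, and the task connects out to it. A rough sketch, assuming the pydevd-pycharm package matching your PyCharm version is installed in the container, with the host and port as placeholders for your own debug configuration:

import pydevd_pycharm

# Connect back to the debug server configured in PyCharm
# (Run > Edit Configurations > Python Debug Server).
pydevd_pycharm.settrace(
    "host.docker.internal",  # host where PyCharm is listening
    port=5678,               # must match the port in the debug configuration
    stdoutToServer=True,
    stderrToServer=True,
)

Dropping this at the top of the task's callable makes execution pause there once Airflow runs the task, and you can then step through it in PyCharm.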
How do I schedule and run a specific command in Django using the Windows Task Scheduler? My Django project is not deployed through xampp or laragon; I run it manually by activating the virtual environment and typing python manage.py runserver in the terminal. I am a bit confused about how to schedule and run a command like python manage.py get_source through the Windows Task Scheduler.
Don't know if you're still looking for this, but I found a working solution here - configure .bat file to run commands in Django Virtual Env
Very simple; point to the location of your project directory...
cd C:\webapps\my-project-dir-with-manage.py-inside
copy and paste the contents of the system-generated activate.bat, found inside your venv/Scripts folder...
add your command line...
.\manage.py <your command here>
save as myfile.bat, and schedule via the Windows Task Scheduler.
Super simple.
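Put together, the finished file might look like this sketch (all paths are placeholders, and calling activate.bat is a shortcut for pasting its contents as described above):

cd C:\webapps\my-project-dir-with-manage.py-inside
call venv\Scripts\activate.bat
python manage.py get_source

Save it as myfile.bat and point the Task Scheduler action at that file.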
I had the same problem.
The main issue is that you missed the full path to python.exe as the application to execute; "Python" on its own will not work.
Then pass your script as the argument.
Additionally, you can add a w to the .py extension (manage.pyw). This makes it runnable on Windows without a .bat file.
Application to execute: C:\user\python.exe
Argument: manage.pyw runserver
How do I run a Python script that is in my main directory with the Heroku Scheduler?
Normally I run this through the command line with heroku run python script.py, but this syntax is clearly not correct for the Heroku Scheduler. Where it says "rake do_something", what should the correct syntax be to run a Python script? I've tried "python script.py" and this does not work either.
Thanks!
The Heroku Scheduler will try to run any command you give it. For Python, if you added a mytask.py to your app repo, you could have the Scheduler run:
python mytask.py
Instead of waiting for the Scheduler to run to see if the command works as expected, you can also test run it like this:
heroku run python mytask.py # or heroku run:detached ...
heroku logs --tail
Another way to use the Scheduler is to extend your app with a CLI tool or a script runner that shares the app context. A popular one for Flask is Flask-Script.
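As a rough sketch of that approach with Flask-Script (myapp and the manage.py filename are placeholders; the flask-script package must be installed alongside Flask):

# manage.py
from flask_script import Manager
from myapp import app  # your Flask application object

manager = Manager(app)

@manager.command
def mytask():
    # work that needs the app (models, config, etc.) goes here
    print("running scheduled work")

if __name__ == "__main__":
    manager.run()

The Scheduler entry would then be python manage.py mytask.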
Note: the "rake" reference in the Heroku Scheduler docs example is for running tasks with Ruby.
I have a free account on Heroku to get started.
A Python app that accepts web requests starts a shell subprocess; does starting this subprocess count against, and need, a dyno?
I am seeing this specific code not do its intended purpose, but when run through the shell on Heroku it does what it is supposed to.