I have a web app that I deployed to a machine running Ubuntu 20.04.
To run the app, I have to SSH into the Ubuntu machine and then run these commands:
cd mywebapp
python3 app.py
It works, but as soon as I close the SSH session, the machine reboots, or anything else happens, the app stops and I have to repeat these commands.
I tried to add it as a cron job to run after the machine reboots, but it does not work.
I posted a question about that here: run python app after server restart does not work using crontab
Nothing has worked for me, and I have to make sure that this web app is always running, because it needs to send push notifications to mobile devices.
Can anyone please advise? I have been searching and trying for a long time.
I'm not an expert in this, but two solutions come to mind:
1- Using systemd:
systemd is responsible for keeping services up.
You can write a custom unit for your app and configure it so it is always kept up; a minimal unit sketch is shown after these two options.
This tutorial may be useful: writing unit
2- Using Docker:
Once you have a containerized app, you can configure the container to come back up on failure, after a reboot, or anything like that; see the Docker example after this list.
Read about it here
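For the systemd option, a minimal unit sketch could look like the following (the user name, paths, and service name are assumptions based on your commands, so adjust them to your setup):
[Unit]
Description=My web app
After=network.target

[Service]
# adjust User and WorkingDirectory to your actual user and app path
User=ubuntu
WorkingDirectory=/home/ubuntu/mywebapp
ExecStart=/usr/bin/python3 app.py
# bring the app back up whenever it exits
Restart=always
RestartSec=5

[Install]
WantedBy=multi-user.target
Save it as /etc/systemd/system/mywebapp.service, then run sudo systemctl daemon-reload and sudo systemctl enable --now mywebapp.service. Restart=always restarts the app if it crashes, and enabling the unit starts it on every boot.
For the Docker option, once you have built an image for the app (the image name below is just a placeholder), a restart policy does the same job:
# --restart unless-stopped restarts the container after crashes and reboots,
# unless you stopped it yourself
docker run -d --name mywebapp --restart unless-stopped mywebapp-image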
What if you put the call to the Python script inside a shell script and run that as a daemon?
Your shell script could look like this (test.sh):
#!/bin/sh
cd desired/directory
python3 app.py
and you can run the shell script with nohup like this:
nohup ./test.sh 0<&- &>/dev/null &
You could refer to this if you want to store the output of nohup.
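For example, if you do want to keep the output instead of discarding it, you could redirect it to a log file (the log file name here is just an example):
# append stdout and stderr to a log file instead of throwing them away
nohup ./test.sh >> test.log 2>&1 &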
Related
I am executing my Robot Framework Selenium tests on a remote machine (it's a Docker container, but I need it to work with Podman too, so I guess relying on docker commands wouldn't help me), and on this remote machine there is an automatic process running in the background that produces terminal logs.
I can read this output when I execute docker logs <container_id> in my terminal, but I need to get it using Python and extract some information from these logs to show in the Robot Framework test log file.
Any ideas how to do it?
I have found several ways to execute a command on the remote machine and get its output, but here I am not executing any command; I just need to read what is being produced automatically.
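What I have in mind is something like the sketch below, using subprocess to call the container engine's own logs command (the container name is a placeholder, and I'm not sure this is the right approach):
import subprocess

def read_container_logs(container_id, engine="podman"):
    # both "podman logs" and "docker logs" print the container output,
    # so the engine binary can be swapped without changing anything else
    result = subprocess.run(
        [engine, "logs", container_id],
        capture_output=True,
        text=True,
        check=True,
    )
    # some engines send part of the output to stderr
    return result.stdout + result.stderr

logs = read_container_logs("<container_id>")
print(logs)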
Thank you for your advice
I need to set up a Jetson Nano device so that a Python script is launched every time an Internet connection becomes available.
So, referring to this question, I did the following:
I created the 'run_when_connection_available' script:
#!/bin/sh
# create a dummy folder to check script execution
mkdir /home/user_name/dummy_folder_00
# kill previous instances of the script
pkill python3
# move to folder with python script and launch it
cd /home/user_name/projects/folder
/usr/bin/python3 launcher.py --arg01 --arg02 ...
# create another dummy folder to check script execution
mkdir /home/user_name/dummy_folder_01
I made this script executable and I copied it to /etc/network/if-up.d
Now, every time I unplug the Ethernet cable and plug it back in, I can see that the dummy folders are created in /home/user_name, but the Python script isn't launched (at least, it doesn't appear in the system monitor). I tried running the command from the script in a terminal, and everything works fine; the Python program starts as expected. Am I doing something wrong?
I'm trying to figure out something similar, but not quite the same.
This solution got my Python script running on internet connection; I can check the logs and everything is working fine:
Raspbian - Running A Script After An Internet Connection Is Established
However, my script uses notify-send to send notifications to my window manager, which I can't seem to get working with systemd. The script works when run in user space, so I assume it's something to do with systemd and Xorg. Hopefully that shouldn't be a problem for you; I hope this solves your issue.
You shouldn't need a shell script in the middle. I got a systemd service to run my Python script by running chmod u+x <file>.py and putting #!/usr/bin/env python3 at the top of the Python file, so that it can be called directly from the .service file like so:
ExecStart=/path/to/file/file.py
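In context, a minimal .service file along those lines might look like this (the description and Restart policy are placeholders, and since the goal is to start once the network is up, I'm assuming network-online.target is the dependency you want):
[Unit]
Description=Launcher script
Wants=network-online.target
After=network-online.target

[Service]
# the script is executable thanks to its shebang, so no interpreter is needed here
ExecStart=/path/to/file/file.py
Restart=on-failure

[Install]
WantedBy=multi-user.target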
OK, I guess it was a matter of permissions. I solved it by running everything as user_name, so I modified the script to use
sudo -u user_name /usr/bin/python3 launcher.py --arg01 --arg02 ...
I am looking for help deploying my Flask app. I've already written the app and it works well. I'm currently using the following command in the directory of my Flask code:
sudo uwsgi --socket 0.0.0.0:70 --protocol=http -w AppName:app --buffer-size=32768
This is on my Amazon Lightsail instance. I have the instance linked to a static public IP, and if I navigate to the website, it works great. However, to keep the command running in the background even after logging out of Lightsail, I first start a screen session, execute the above command, and then detach the screen with Ctrl-A D.
The problem is, if the app crashes (which is understandable, since it is very large and under development), or if the command is left running for too long, the process is killed and the app is no longer being served.
I am looking for a better method of deploying a Flask app on Amazon Lightsail, so that the app is restarted in the event of a crash without any interaction from me.
Generally you would write your own unit file for systemd to keep your application running, restart it automatically when it crashes, and start it when your instance boots.
There are many tutorials out there showing how to write such a unit file. Some examples:
Systemd: Service File Examples
Creating a Linux service with systemd
How to write startup script for Systemd?
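As a rough sketch for your case, the unit could wrap the exact uwsgi command you already run (the user name, working directory, and uwsgi path are assumptions; check where uwsgi is installed with which uwsgi):
[Unit]
Description=uWSGI server for my Flask app
After=network.target

[Service]
User=ubuntu
WorkingDirectory=/home/ubuntu/myflaskapp
ExecStart=/usr/local/bin/uwsgi --socket 0.0.0.0:70 --protocol=http -w AppName:app --buffer-size=32768
# bring uwsgi back up automatically if the app crashes
Restart=always
RestartSec=5

[Install]
WantedBy=multi-user.target
Save it as /etc/systemd/system/myflaskapp.service and enable it with sudo systemctl enable --now myflaskapp.service; after that, screen is no longer needed.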
You can use pm2.
Starting an application with PM2 is straightforward. It will auto-discover the interpreter to run your application depending on the script extension. This can be configured via the Ecosystem config file, as I will show you later in this article.
All you need is to install pm2 and then run:
pm2 start app.py
Great, this application will now run forever, meaning that if the process exits or throws an exception it will get automatically restarted. If you exit the console and connect again, you will still be able to check the application state.
To list the applications managed by PM2, run:
pm2 ls
You can also check the logs:
pm2 logs
Keeping Processes Alive at Server Reboot
If you want to keep your application online across unexpected (or expected) server restarts, you will want to set up an init script to tell your system to boot PM2 and your applications.
It’s really simple with PM2, just run this command (without sudo):
pm2 startup
Pm2 Manage-Python-Processes
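If I remember the PM2 workflow correctly, pm2 startup prints a command that you run once with sudo, and after that you save the current process list so it is restored on reboot:
pm2 save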
I have created the basic "Hello World" web app and deployed it, so I have the URL with my project name included in it. I wanted to make a couple of changes to the source code, but when I refresh the page it shows no changes. How do I make the changes show up?
If you deployed the app, redeploy it so your changes are uploaded:
gcloud app deploy
If you're testing it in Cloud Shell, see below.
I missed where you stated the app was deployed and wrote the answer below. I'll leave it for future readers or in case it might help.
To stop the local server from the command line, press the following:
Mac OS X or Linux: Control-C
Windows: Control-Break
Then restart it with:
dev_appserver.py app.yaml
This assumes you're completing the App Engine hello world tutorial.
Still not working? Maybe you have multiple processes running or something else is wrong. Let's try killing all of the web server processes and restarting.
ps aux | grep dev_appserver.py
Look at the process number.
kill PROCESS_NUM_HERE
If you see multiple processes after running ps aux... then you should kill all of them.
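If there are several, you could also kill them all in one go (assuming pkill is available, as it is on most Linux systems):
pkill -f dev_appserver.py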
Then restart
dev_appserver.py app.yaml
I'm running Ubuntu server 16.04 and still getting to grips with it. I have a python script that runs in an endless loop, performing a task related to fetching data from an external source.
What I'm trying to do, is make this python script start after (or during) boot and then run in the background.
I've tried editing rc.local but the boot sequence just hangs since the script keeps running.
Any advice would be greatly appreciated.
As one of the comments mentions, you can use cron jobs to start scripts at certain times, such as at startup (as you would like to do). It also would not halt execution like you mentioned happens with rc.local.
The line that you need to add to your crontab is:
@reboot python /home/MyPythonScript.py
Here are a couple of useful tutorials that show you how to do this: http://kvz.io/blog/2007/07/29/schedule-tasks-on-linux-using-crontab/
https://help.ubuntu.com/community/CronHowto
If you would like to do it from Python itself, there is this handy Python library: https://pypi.python.org/pypi/python-crontab/
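For example, to add the entry, run crontab -e and, if you want to capture the script's output for debugging, redirect it to a log file (the log path and the full python3 path below are just examples):
@reboot /usr/bin/python3 /home/MyPythonScript.py >> /tmp/MyPythonScript.log 2>&1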
tmux is a great utility for detached background sessions. You can use it for this:
sudo apt-get install tmux
Then add it to your rc.local:
/usr/bin/tmux new-session -d 'python /path/to/your/script'
After boot you can use it as follows:
tmux attach
And your console will be attached to the session that was running in the background.
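If you end up with more than one background session, it can help to give this one a name so you can attach to it directly (the session name is just an example):
/usr/bin/tmux new-session -d -s datafetcher 'python /path/to/your/script'
tmux attach -t datafetcher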