supervisord python import error

I'm trying to daemonize my bash script, which in turn runs a Python script.
Here is the program section of my supervisord.conf:
[program:source]
directory=/home/vagrant/
command=/usr/local/bin/python /home/vagrant/start.py
process_name=%(program_name)s
user=vagrant
autostart=true
When I start supervisord it doesn't work. From the log I receive:
No module named monitor.tasks
When I run the program directly it works. It seems to be a working directory issue, but I don't know how to solve it. Any suggestions?

Found where my mistake was. I just had to add the -m flag after the python command and run the script as a module (resolved against the working directory), as follows:
command=/usr/local/bin/python -m start
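If the import error comes back, a quick way to see what supervisord actually hands the script is to log the interpreter, working directory, and sys.path from the entry point itself. This is only a debugging sketch; the log path is an arbitrary choice:
import os
import sys

# Throwaway diagnostics: record the environment the script actually sees
# when supervisord starts it, so the import failure can be traced.
with open("/tmp/supervisord_env.log", "w") as log:
    log.write("executable: %s\n" % sys.executable)
    log.write("cwd: %s\n" % os.getcwd())
    log.write("sys.path:\n")
    for entry in sys.path:
        log.write("  %s\n" % entry)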

I had a similar problem, but mine was related to the PYTHONPATH. All I had to do was add a single line to my program configuration:
[program:myProgram]
environment=PYTHONPATH=/home/nectu/.local/lib/python3.6/site-packages
(...)
Running on: Lubuntu 18.04 / Python 3.6
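If you are not sure which site-packages path to put on that environment= line, you can ask the interpreter that works outside supervisord to print it; a minimal sketch:
import site
import sys

print(sys.executable)               # interpreter you are actually running
print(site.getusersitepackages())   # per-user site-packages (~/.local/lib/...)
print(site.getsitepackages())       # global site-packages directories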

Related

Ubuntu EC2 - Run Python script on reboot

I have an Ubuntu 16.04 EC2 instance and I need to run a Python script every time the instance is started.
I tried everything suggested on every question in the forum and haven't had any luck yet.
Specifically I've tested:
Adding @reboot python3 /home/project/script.py to crontab
Adding @reboot /bin/startup.sh and having the bash file configured to run /home/project/script.py
Using /etc/rc.local, /etc/init/mystartup.conf, /etc/systemd/mystartup.conf
Passing User Data
I'm probably missing a few others, and literally nothing worked, even though running the script manually works fine.
Thanks a lot in advance for the help!
Put the script into: /var/lib/cloud/scripts/per-boot/
Cloud-Init, which runs User Data, will also check this directory.
From Modules — cloud-init documentation:
Any scripts in the scripts/per-boot directory on the datasource will be run every time the system boots. Scripts will be run in alphabetical order.
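As an illustration, here is a minimal sketch of a Python script that could be dropped into that directory (the file name and log path are hypothetical); give it a shebang and make it executable with chmod +x so cloud-init can run it, and have it hand off to the script from the question:
#!/usr/bin/env python3
# Hypothetical /var/lib/cloud/scripts/per-boot/10-run-project.py
import datetime
import subprocess

# Leave a marker so you can confirm the per-boot hook actually fired.
with open("/var/log/per-boot-marker.log", "a") as log:
    log.write("boot hook ran at %s\n" % datetime.datetime.now().isoformat())

# Hand off to the script from the question.
subprocess.call(["python3", "/home/project/script.py"])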

Ubuntu run python script on system startup that is using Firefox

I wrote a Python script that uses subprocess.Popen() to launch and control two GUI programs: Firefox and VLC player. I am using Ubuntu 14.04 LTS in desktop mode.
My problem is that when I run the Python script at system startup, the script runs but Firefox and VLC don't start.
So far, I tried writing a shell script that runs my Python script and executing it from crontab with @reboot /home/user/startup.sh. I set all the permissions for every script involved, and I gave my user root permissions, so that part is OK.
I also tried running my script by putting the command "sudo python /path/to/my/script.py" in /etc/rc.local, but that doesn't help either.
I googled and found people using .desktop files placed in the ~/.config/autostart/ directory, but that also failed. Here is an example of what I wrote:
[Desktop Entry]
Type=Application
Exec="sudo python /home/user/path_to_my_script/my_script.py"
X-GNOME-Autostart-enabled=true
Name=screensplayer
Comment=screensplayer
And I saved this as program.desktop in the ~/.config/autostart/ directory, but it does not work. I am sure there is a way to fix this, but I don't know how. Any help will be appreciated!
I found the solution to my problem. When you run commands with Popen in Python like this:
import os
import subprocess

FNULL = open(os.devnull, 'w')
_FIREFOX_START = ["sudo", "firefox", "test.com/index.html"]
subprocess.Popen(_FIREFOX_START, stdout=FNULL, stderr=subprocess.STDOUT)
it won't run the apps because of the "sudo" word; when I removed it, it worked.
Also, run gnome-session-properties in a terminal and add a new startup application; be aware that you have to execute the Python script without sudo, like this:
python /home/user/path_to_script/script.py
Also, I granted my user root privileges, so keep that in mind.
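Putting that together, a minimal sketch of the working launcher (the URL is the one from the snippet above; the VLC video path is just a placeholder):
import os
import subprocess

FNULL = open(os.devnull, 'w')

# No "sudo" in front of the commands; they run as the logged-in desktop user.
firefox = subprocess.Popen(["firefox", "test.com/index.html"],
                           stdout=FNULL, stderr=subprocess.STDOUT)
vlc = subprocess.Popen(["vlc", "/home/user/video.mp4"],
                       stdout=FNULL, stderr=subprocess.STDOUT)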

Run a python script from bamboo

I'm trying to run a Python script from Bamboo. I created a script task and wrote inline "python myFile.py". Should I be listing the full path for python?
I changed the working directory to the location of myFile.py, so that is not a problem. Is there anything else I need to do within the configuration plan to properly run this script? It isn't running, but I know it should be, because the script works fine from the terminal on my local machine. Thanks
I run a lot of python tasks from bamboo, so it is possible. Using the Script task is generally painless...
You should be able to use your script task to run commands directly and have their stdout written to the build logs. With that in mind, you can run:
'which python' -- outputs the path of the python binary that is being used.
'pip list' -- outputs a list of the modules installed with pip.
You should verify that the output from these commands matches the output when they are run directly on the server. I'm guessing they won't match up, and once that is addressed, everything will work fine.
If not, comment back and we can look at a few other things.
For the future, there are a handful of different ways you can package things with python which could assist with this problem (e.g. automatically installing missing modules, etc).
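If you would rather do the which python / pip list comparison from inside the task itself, a small sketch like this run as the Bamboo script will show which interpreter and module search path the agent is really using:
import subprocess
import sys

print("interpreter:", sys.executable)
print("version:", sys.version)
print("sys.path:")
for entry in sys.path:
    print("   " + entry)

# Same idea as 'pip list', but guaranteed to use the interpreter above.
subprocess.call([sys.executable, "-m", "pip", "list"])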
You can also use the Script Task directly with an inline Python script to run your myFile.py:
/usr/bin/python <<EOF
print "Hello, World!"
EOF
Check this page for a more complex example:
https://www.langhornweb.com/display/BAT/Run+Python+script+as+a+Bamboo+task?desktop=true&macroName=seo-metadata

Running a python script in virtual environment with node.js pm2

I would like to reference this question because I am certain that someone will flag this as a duplicate.
I am not looking for another reference to supervisord. I'm sure that it is great and all, but the node PM2 has the functionality that I require and is more straightforward to implement and test.
Manual Start
During prototyping, I created a virtual environment called 'p3env'. At the top of each script, I place a shebang line:
#!./py3env/bin/python
This allows me to execute each script in the directory with this particular Python environment without having to activate it. It is very convenient and useful, and the Python script works well when I start it by hand.
I should be clear about what I mean when I say 'start it by hand'. My script is called 'strain_to_db.py'. When I start it by hand, I am on the shell via ssh:
./strain_to_db.py
This gets everything working that I need to have working.
PM2 Start
Moving from relative to absolute paths
To get pm2 working, I started with:
pm2 start ./strain_to_db.py
Specifying the Interpreter
Apparently pm2 ignores the shebang at the top of the Python script and attempts to execute it with the global 'python'. No problem, I can specify the interpreter:
pm2 start ./strain_to_db.py --interpreter /home/ubuntu/db_if/p3env/bin/python
No dice. Again, maybe try more absolute paths:
pm2 start /home/ubuntu/db_if/strain_to_db.py --interpreter /home/ubuntu/db_if/p3env/bin/python
Running script as command-line option
Now I'm getting frustrated. I try another tactic. I attempt to run the python executable in the command line using:
/home/ubuntu/db_if/p3env/bin/python /home/ubuntu/db_if/strain_to_db.py
This works fine when pm2 isn't involved. When I try to pass this to pm2 using the 'command line argument' style:
pm2 start /home/ubuntu/db_if/p3env/bin/python -- /home/ubuntu/db_if/strain_to_db.py
Frustration
Same error. The error is always 'can't import pymysql', and pymysql is installed only in the virtual environment.
I am not sure where else to go with this. I have several scripts that I want to add to the pm2 execution monitor, but I can't seem to get one of them to start and run correctly.
After looking around a bit more, the question that I referenced at the top of this post had a clue in one of the answers, but not the answer.
When files end in '.py', pm2 calls 'python'... no matter what. I believe there is a configuration file in pm2 that you could modify to change this behavior. I simply removed the '.py' from my script name and specified the interpreter:
pm2 start ./strain_to_db --interpreter ./py3env/bin/python
Works perfectly. When I use pm2 to create a startup script, I will use absolute paths. Thanks to anyone who was looking, and I hope this helps someone in the future.
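A quick way to confirm that pm2 really picked up the venv interpreter is to log it from the top of the script; a minimal sketch (the log path is arbitrary):
#!./py3env/bin/python
import sys

# pymysql is installed only inside the virtualenv, so this import doubles
# as a check that pm2 started the right interpreter.
import pymysql

with open("/tmp/pm2_interpreter.log", "a") as log:
    log.write("running under: %s\n" % sys.executable)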
This Medium article solved this problem for me.
https://medium.com/@gokhang1327/deploying-flask-app-with-pm2-on-ubuntu-server-18-04-992dfd808079
Command for running a Python script in a virtual environment:
pm2 start app.py --name flask-app --interpreter=python3
--name is optional; it's the name of the process displayed in pm2 status
Result: in the pm2 status output, "new" is the name of my virtualenv environment.
A bit late to this question, but for anyone coming here with a fresh pair of eyes: I've found that if you activate the virtual environment (e.g. source venv/Scripts/activate) and then start your script via pm2 (e.g. pm2 start main.py --name migration), it will automatically use the environment you've activated.

Executing a simple Python script not working with Sidekiq

class SeedWorker
  include Sidekiq::Worker

  def perform
    `python lib/assets/python_scripts/seed.py`
  end
end
I am trying to execute this from terminal like so:
bundle exec SeedWorker.perform_async()
I have a Redis server and Sidekiq running, as well as a Rails server. The script works fine by itself, but I am wondering if this is even possible. Sinatra is also running. Any help would be greatly appreciated.
SeedWorker.perform_async() is not an executable, so that will not work. Also, Sidekiq is already running, but it may not have loaded your worker file. Last, Sidekiq just requires Redis; Rails and Sinatra are not related to your problem.
That statement will work in an environment like irb. Alternatively you can use the sidekiq executable:
bundle exec sidekiq -r <path to your worker file>
How about following the good documentation provided with Sidekiq?
