I need to execute mysqldump from within a Django function. I can do so easily enough from the terminal command line, but when I try to run it from within the Python script, I get an error:
sh: mysqldump: command not found
when running the following.
filestamp = date.today()
dumpcmd = "mysqldump -u root appdb > appdb%s.out" % (filestamp)
os.system(dumpcmd)
I think the problem has something to do with the PATH in either the Django application or Eclipse, but I can't figure out why mysqldump can't be found from within the Django app when it can be found from the command line / virtualenv.
Make sure mysqldump is in your PATH:
$ whereis mysqldump; echo $PATH
mysqldump: /usr/bin/mysqldump /usr/share/man/man1/mysqldump.1.gz
/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
and/or
using mysqldump and mysql inside python
import subprocess
subprocess.Popen('mysqldump -h localhost -P 3306 -u root appdb > appdb.sql', shell=True)
Or use the full path in the Python statement, e.g. /usr/bin/mysqldump.
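For example, a minimal sketch that uses the absolute path and performs the redirection in Python rather than the shell (the path and credentials are illustrative; confirm yours with whereis mysqldump):
import subprocess
from datetime import date

filestamp = date.today()
# Absolute path, so the PATH seen by Django/Eclipse no longer matters:
with open('appdb%s.out' % filestamp, 'wb') as outfile:
    subprocess.check_call(['/usr/bin/mysqldump', '-u', 'root', 'appdb'],
                          stdout=outfile)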
Through Fabric, I am trying to start a celerycam process using the nohup command below. Unfortunately, nothing is happening. Running the same command manually, I can start the process, but not through Fabric. Any advice on how I can solve this?
def start_celerycam():
    '''Start celerycam daemon'''
    with cd(env.project_dir):
        virtualenv('nohup bash -c "python manage.py celerycam --logfile=%scelerycam.log --pidfile=%scelerycam.pid &> %scelerycam.nohup &> %scelerycam.err" &' % (env.celery_log_dir,env.celery_log_dir,env.celery_log_dir,env.celery_log_dir))
I'm using Erich Heine's suggestion to use 'dtach' and it's working pretty well for me:
def runbg(cmd, sockname="dtach"):
    return run('dtach -n `mktemp -u /tmp/%s.XXXX` %s' % (sockname, cmd))
This was found here.
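For example (the command and log path are illustrative), the celerycam process from the question could be started with:
runbg('python manage.py celerycam --logfile=/var/log/celerycam.log')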
As I found through experimentation, the solution is a combination of two factors:
run the process as a daemon: nohup ./command &> /dev/null &
use pty=False for the Fabric run
So, your function should look like this:
def background_run(command):
    command = 'nohup %s &> /dev/null &' % command
    run(command, pty=False)
And you can launch it with:
execute(background_run, your_command)
This is an instance of this issue. Background processes will be killed when the command ends. Unfortunately, CentOS 6 doesn't support pty-less sudo commands.
The final entry in the issue mentions using sudo('set -m; service servicename start'). This turns on Job Control and therefore background processes are put in their own process group. As a result they are not terminated when the command ends.
For even more information see this link.
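A minimal sketch of that workaround as a Fabric task (servicename is a placeholder):
from fabric.api import sudo

def start_service(servicename):
    # set -m turns on job control, so the background process lands in its
    # own process group and is not killed when the sudo command returns.
    sudo('set -m; service %s start' % servicename)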
You just need to run:
run("(nohup yourcommand >& /dev/null < /dev/null &) && sleep 1")
dtach is the way to go. It's software you need to install, like a lightweight version of screen.
This is a better version of the "dtach" method found above; it will install dtach if necessary. It can be found here, where you can also learn how to get the output of the process running in the background:
from fabric.api import run
from fabric.api import sudo
from fabric.contrib.files import exists
def run_bg(cmd, before=None, sockname="dtach", use_sudo=False):
    """Run a command in the background using dtach

    :param cmd: The command to run
    :param before: The command to run before the dtach. E.g. exporting
        an environment variable
    :param sockname: The socket name to use for the temp file
    :param use_sudo: Whether or not to use sudo
    """
    if not exists("/usr/bin/dtach"):
        sudo("apt-get install dtach")
    if before:
        cmd = "{}; dtach -n `mktemp -u /tmp/{}.XXXX` {}".format(
            before, sockname, cmd)
    else:
        cmd = "dtach -n `mktemp -u /tmp/{}.XXXX` {}".format(sockname, cmd)
    if use_sudo:
        return sudo(cmd)
    else:
        return run(cmd)
May this help you, like it helped me to run omxplayer via Fabric on a remote Raspberry Pi!
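Usage might look like this (the omxplayer command and the exported variable are just illustrations):
run_bg('omxplayer /home/pi/video.mp4', before='export DISPLAY=:0')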
You can use:
run('nohup /home/ubuntu/spider/bin/python3 /home/ubuntu/spider/Desktop/baidu_index/baidu_index.py > /home/ubuntu/spider/Desktop/baidu_index/baidu_index.py.log 2>&1 &', pty=False)
nohup did not work for me, and I did not have tmux or dtach installed on all the boxes I wanted to use this on, so I ended up using screen like so:
run("screen -d -m bash -c '{}'".format(command), pty=False)
This tells screen to start a bash shell in a detached terminal that runs your command.
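A reusable sketch of the same idea with a named session, so you can reattach and inspect the output later (the helper name is mine):
from fabric.api import run

def run_in_screen(command, name='bgtask'):
    # -S names the session; reattach later with `screen -r <name>`.
    run("screen -d -m -S {} bash -c '{}'".format(name, command), pty=False)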
You could be running into this issue.
Try adding pty=False to the sudo command (I assume virtualenv is calling sudo or run somewhere?).
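For example, something along these lines (a sketch with an illustrative command):
from fabric.api import sudo

sudo('nohup python manage.py celerycam &> /dev/null &', pty=False)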
This worked for me:
sudo('python %s/manage.py celerycam --detach --pidfile=celerycam.pid' % siteDir)
Edit: I had to make sure the pid file was removed first, so this is the full code:
# Create new celerycam
sudo('rm celerycam.pid', warn_only=True)
sudo('python %s/manage.py celerycam --detach --pidfile=celerycam.pid' % siteDir)
I was able to circumvent this issue by running nohup ... & over ssh in a separate local shell script. In fabfile.py:
@task
def startup():
    local('./do-stuff-in-background.sh {0}'.format(env.host))
and in do-stuff-in-background.sh:
#!/bin/sh
set -e
set -o nounset
HOST=$1
ssh $HOST -T << HERE
nohup df -h 1>>~/df.log 2>>~/df.err &
HERE
Of course, you could also pass in the command and standard output / error log files as arguments to make this script more generally useful.
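A sketch of what that generalization might look like on the fabfile side (the extra parameters are hypothetical; the shell script would have to be extended to read $2, $3 and $4 accordingly):
from fabric.api import task, local, env

@task
def run_in_background(cmd, log='~/task.log', err='~/task.err'):
    # Hypothetical extension: forward the command and log paths to the script.
    local('./do-stuff-in-background.sh {0} "{1}" {2} {3}'.format(
        env.host, cmd, log, err))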
(In my case, I didn't have admin rights to install dtach, and neither screen -d -m nor pty=False / sleep 1 worked properly for me. YMMV, especially as I have no idea why this works...)
I am struggling to launch my fabfile within my Python script. I have looked at similar posts on Stack Overflow regarding this, but they don't solve my problem... Or maybe they do, but I am not understanding them... Not sure.
My script writes to the fab file depending on what the user wants to run on the remote host. Here is an example of the fabfile:
[root@ip-50-50-50-50 bakery]# cat fabfile.py
from fabric.api import run
def deploy():
    run('wget -P /tmp https://s3.amazonaws.com/MyBucket/httpd-2.2.26-1.1.amzn1.x86_64.rpm')
    run('sudo yum localinstall /tmp/httpd-2.2.26-1.1.amzn1.x86_64.rpm')
I then need to run the fabfile from my script. If I run the following manually from the command line, it works fine:
fab -f fabfile.py -u ec2-user -i id_rsa -H 10.10.15.150 deploy
1) How do I run that from inside my script with all of the options?
2) The IP address is a variable called "bakery_internalip". How do I call that variable as part of the fab line?
Try with subprocess:
import subprocess
subprocess.call(['fab', '-f', 'fabfile.py', '-u', 'ec2-user', '-i', 'id_rsa', '-H', bakery_internalip, 'deploy'])
should do the trick
You can call Fabric-enabled commands directly from code. Usually you will have to set the env dictionary first to specify keys and hosts, but it is very straightforward:
# the settings for the env dict
from fabric.api import env, execute
env.hosts = ["10.10.15.150", ]
env.user = "ec2-user"
env.key_filename = "id_rsa"
# and call the function itself
from fabfile import deploy
execute(deploy, hosts=env.hosts)
You can find more inspiration in the fabric documentation:
http://docs.fabfile.org/en/1.11/usage/execution.html#using-execute-with-dynamically-set-host-lists
I use the Python module pysftp to connect to a remote server. Below you can see the Python code:
import pysftp
import sys
import sqr_common
srv = pysftp.Connection(host="xxxxxx", username="xxxx",
                        password="xxxxx")
command = "/usr/bin/bash"
command2="APSHOME=/all/aps/msc_2012; export APSHOME; "
srv.execute(command)
srv.execute(command2)
srv.close()
The problem is that the command /usr/bin/bash is an infinite process, so my script will never be executed. Can anyone help me with how to choose a shell (for example bash) on the remote server and execute a command in it? Is there any pysftp function that allows me to choose the shell?
Try this:
/usr/bin/bash -c "APSHOME=/all/aps/msc_2012; export APSHOME; "
This problem is not specific to Python; it is more about how to execute commands under a specific shell.
If you need to run only a single command, you can use the bash -c switch:
bash -c "echo 123"
You can run multiple commands separated by ;:
bash -c "echo 123 ; echo 246"
If you need to run many commands under a specific shell, remotely create a shell script file (a .bash file) and execute it:
bash myscript.bash
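Applied to the pysftp example from the question (assuming your pysftp version provides Connection.execute), that might look like:
import pysftp

srv = pysftp.Connection(host="xxxxxx", username="xxxx", password="xxxxx")
# Single-quote the command so the remote login shell passes it to bash intact:
output = srv.execute("/usr/bin/bash -c 'APSHOME=/all/aps/msc_2012; export APSHOME; echo $APSHOME'")
print(output)
srv.close()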
I have some FreeBSD servers and don't have sudo, but I want to run some commands automatically as root, like in the following:
def autodeploy(url):
    with cd('/tmp'):
        if not exists('releasetar.sh'):
            put('/tmp/releasetar.sh', 'releasetar.sh', mode=0644)
        run("wget '{}'".format(url))
        run('su - -m -c "cd /tmp && bash /tmp/releasetar.sh"')
su with the -c option worked on Linux but didn't work on FreeBSD. How can I solve this problem? I hope your solution can work on both Linux and FreeBSD. Thank you for your answers!
If you're using Fabric, you can just provide the -u argument on the command line to specify which user you want to run the task as:
fab -u root <task name>
For more options from the command line check out http://docs.fabfile.org/en/1.7/usage/fab.html#command-line-options
You can also set your username programmatically:
from fabric.api import run, settings

with settings(user="root"):
    run("some-command")
Install sudo from ports (/usr/ports/security/sudo).