I have a custom Django management command in one Django project and I would like to call it from another Django project. Both use virtualenv. I also need to check the output of the command.
The command I want to call looks something like this:
/home/path/to/bin/python /home/path/to/manage.py my-command --input-csv=/tmp/w2qGM1 --input-type=csv
Where /tmp/w2qGM1 is a temp csv file I've already created.
I've tried subprocess.call:
subprocess.call(CMD, shell=True, env={
    'PYTHONPATH': os.pathsep.join(sys.path)})
subprocess.check_output:
output = subprocess.check_output(
    shlex.split(CMD),
    shell=True,
    stderr=subprocess.STDOUT,
    env={'PYTHONPATH': os.pathsep.join(sys.path)}
)
and subprocess.Popen:
args = shlex.split(CMD)
output = subprocess.Popen(args).wait()
All of these return:
Unknown command: 'my-command'
Type 'manage.py help' for usage.
I've also tried prefixing the command with cd /path/to/project/with/management/command/ && source bin/activate to pick up that project's Django settings.
I should also mention that I have all the correct paths in my PYTHONPATH, and the command works when run from the command line and programmatically.
Both projects have to use Django 1.7.
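One approach that may help is to invoke the other project's virtualenv python directly while passing a copy of the current environment rather than replacing it, since env={...} wipes out everything else (including any DJANGO_SETTINGS_MODULE), and the calling project's PYTHONPATH can make manage.py load the wrong project's apps. A minimal sketch, using only the placeholder paths from the question:
import os
import subprocess

CMD = ('/home/path/to/bin/python /home/path/to/manage.py '
       'my-command --input-csv=/tmp/w2qGM1 --input-type=csv')

env = os.environ.copy()                   # start from the full environment
env.pop('PYTHONPATH', None)               # don't leak this project's paths into the other one
env.pop('DJANGO_SETTINGS_MODULE', None)   # let the other manage.py pick its own settings

output = subprocess.check_output(CMD, shell=True, stderr=subprocess.STDOUT, env=env)
print(output)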
I created a Cassandra database in DataStax Astra. I'm able to connect to it in Python (using the cassandra-driver module and the secure_connect_bundle). I wrote a few APIs in my Python application to query the database.
I read that I can upload csv to it using dsbulk. I am able to run the following command in Terminal and it works.
dsbulk load -url data.csv -k foo_keyspace -t foo_table \
-b "secure-connect-afterpay.zip" -u username -p password -header true
Then I try to run this same line in Python using subprocess:
ret = subprocess.run(
    ['dsbulk', 'load', '-url', 'data.csv', '-k', 'foo_keyspace', '-t', 'foo_table',
     '-b', 'secure-connect-afterpay.zip', '-u', 'username', '-p', 'password',
     '-header', 'true'],
    capture_output=True
)
But I got FileNotFoundError: [Errno 2] No such file or directory: 'dsbulk': 'dsbulk'. Why is dsbulk not recognized if I run it from Python?
As a related question: it's probably not best practice to rely on subprocess. Are there better ways to upload batch data to Cassandra?
I think it has to do with the way the PATH is handled by subprocess. Try specifying the command as an absolute path, or as a relative path like "./dsbulk" or "bin/dsbulk".
Alternatively, if you add the bin directory from the DS Bulk package to your PATH environment variable, it will work as you have it.
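For example, a sketch of the absolute-path variant (the /opt/dsbulk/bin location is an assumption; substitute wherever you unpacked the DS Bulk package):
import subprocess

DSBULK = '/opt/dsbulk/bin/dsbulk'  # assumed install location, adjust to yours

ret = subprocess.run(
    [DSBULK, 'load', '-url', 'data.csv', '-k', 'foo_keyspace', '-t', 'foo_table',
     '-b', 'secure-connect-afterpay.zip', '-u', 'username', '-p', 'password',
     '-header', 'true'],
    capture_output=True
)
print(ret.stdout.decode())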
I'm sure this is something simple, but I've tried several settings and I just can't seem to get this to work.
I have the following code:
import subprocess
p = subprocess.Popen('mkdir -p /backups/my_folder', stdout=subprocess.PIPE, stderr=subprocess.STDOUT, shell=True)
This is running in a Flask application on nginx with Python 3.
When this executes I'm getting the following error:
/bin/sh: 1: mkdir: not found
I've tried with shell=False, I've tried with Popen(['mkdir', ...]), and I've tried subprocess.run like this question/answer
If I run with shell=False, I get the following error:
Error: [Errno 2] No such file or directory: 'mkdir -p
/backups/my_folder': 'mkdir -p /backups/my_folder'
When I use /bin/mkdir, it works. But there are other commands that invoke sub-commands which then fail (tar calling gzip, for instance).
What am I missing to get this to work?
Running:
Debian 9.8, Nginx 1.14.0, Python 3.6.8
EDIT
I need this to work for other commands as well. I know I can use os.makedirs, but I have several different commands I will be executing (rsync, ssh, tar, and more)
For these simple commands, try to use Python instead of invoking the shell - it makes you more independent of the environment:
import os
os.makedirs('/backups/my_folder', exist_ok=True)
I found the problem.
I realized that my /etc/systemd/system/site.service uWSGI settings had a hard-coded path:
Environment = /usr/local/bin
Once I changed this to include /bin, all my subprocess commands executed just fine.
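If editing the service file is not an option, another way is to pass an augmented environment to each call. A sketch; the directories prepended here are typical Debian locations, not something taken from the original setup:
import os
import subprocess

env = dict(os.environ)
env['PATH'] = '/bin:/usr/bin:' + env.get('PATH', '')  # prepend standard tool dirs

p = subprocess.Popen('mkdir -p /backups/my_folder', shell=True, env=env,
                     stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
result, _ = p.communicate()
print(result)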
import subprocess
p = subprocess.Popen('mkdir -p my_folder', stdout=subprocess.PIPE, stderr=subprocess.STDOUT, shell=True)
(result, error) = p.communicate()
print(result)
This is for Windows 10 only.
OS X 10.13.6 Python 3.6
I am trying to run the following command from a jupyter notebook:
from subprocess import Popen, PIPE, STDOUT

vpn_cmd = '''
sudo openvpn
--config ~/Downloads/configs/ipvanish-US-Chicago-chi-a49.ovpn
--ca ~/Downloads/configs/ca.ipvanish.com.crt'''
proc = Popen(vpn_cmd.split(), stdout=PIPE, stderr=STDOUT)
stdout, stderr = proc.communicate()
print(stdout.decode())
But get the error:
sudo: openvpn: command not found
What I've tried:
added export PATH="/usr/local/sbin:$PATH" to my ~/.bash_profile and can run the sudo openvpn command from my terminal
edited my sudoers file so sudo no longer prompts for a password
called sudo which openvpn and tried adding /usr/local/sbin/openvpn to my sys.path within python
not splitting vpn_cmd and setting shell=True
tried packaging it in a test.py script and executing from the terminal, but it just hangs at the proc.communicate() line
specified the full path for the --config and --ca flags
So far, nothing has fixed this. I can run openvpn from my terminal just fine. It seems like a simple path issue but I can't figure out what I need to add to my python path. Is there something particular with the jupyter notebook kernel?
Jupyter probably isn't picking up your personal .bashrc settings, depending also on how you are running it. Just hardcode the path or augment the PATH in your Python script instead.
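The PATH-augmentation variant could be as small as this (a sketch, assuming openvpn lives in /usr/local/sbin, as sudo which openvpn reported):
import os
os.environ['PATH'] = '/usr/local/sbin:' + os.environ.get('PATH', '')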
With shell=False you don't get the tildes expanded; so you should change those to os.environ["HOME"], or make sure you know in which directory you run this, and use relative paths.
You should not be using Popen() if run can do what you require.
import os
import subprocess

home = os.environ["HOME"]
r = subprocess.run(
    ['sudo', '/usr/local/sbin/openvpn',
     '--config', home + '/Downloads/configs/ipvanish-US-Chicago-chi-a49.ovpn',
     '--ca', home + '/Downloads/configs/ca.ipvanish.com.crt'],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE, universal_newlines=True)
print(r.stdout)
I have the following code that works great to run the ls command. I also use a bash alias, alias ll='ls -alFGh'. Is it possible to get Python to run the bash command ll without Python loading my bash alias file, parsing it, and then actually running the full command?
import subprocess
command = "ls" # the shell command
process = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=None, shell=True)
#Launch the shell command:
output = process.communicate()
print (output[0])
Trying with command = "ll", the output I get is:
/bin/sh: ll: command not found
b''
You cannot. When you run a Python process, it has no knowledge of a shell alias. Other than IPC, there are two simple ways of passing text from a parent to a child process: the command line and environment (i.e. exported) variables. Bash does not support exporting aliases.
From the bash man page: "For almost every purpose, aliases are superseded by shell functions."
Bash does, however, support exporting functions, so I suggest you make your alias a simple function instead. That way it is exported from shell to Python to shell. For example:
In the shell:
ll() { ls -l; }
export -f ll
In python:
import subprocess
command = "ll" # the shell command
process = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=None, shell=True)
output = process.communicate()
print(output[0].decode()) # Required if using Python 3
Since you are using the print() function, I have assumed you are using Python 3, in which case you need the .decode(), since a bytes object is returned.
With a bit of hackery it is possible to create and export shell functions from Python as well, as sketched below.
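A minimal sketch of that hackery, assuming a post-Shellshock bash (4.3 or later) that imports exported functions from environment variables named BASH_FUNC_<name>%% - that encoding is bash-specific and can vary between builds:
import os
import subprocess

env = dict(os.environ)
env['BASH_FUNC_ll%%'] = '() { ls -l; }'  # define the function in the environment

# Run bash explicitly: /bin/sh (what shell=True uses) may be dash,
# which ignores exported bash functions.
out = subprocess.run(['bash', '-c', 'll'], env=env, stdout=subprocess.PIPE)
print(out.stdout.decode())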
I have some code that uses subprocess to look at the logs from a git directory. My code seems to work fine when executed in a local Django dev environment. Once deployed, however (with Apache / mod_wsgi), the output from stdout read() comes back empty. My development and production machines are the same right now, and I have also tried making sure every file is readable.
Does anybody have an idea why Popen is not returning any output once deployed here? Thanks.
def getGitLogs(projectName, since, searchTerm=None):
    os.chdir(os.path.join(settings.SCM_GIT, projectName))
    cmd = "git log --since {0} -p".format(since)
    p = subprocess.Popen(cmd, shell=True, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, close_fds=True)
    output = p.stdout.read()
    ### here output comes back as expected in dev environment, but empty once deployed
    return filterCommits(parseCommits(output), searchTerm)
Chain your chdir as part of your command (i.e., cd /foo/bar/zoo)
Pass the full path to git
So your command would end up as: cd /foo/bar/zoo && /usr/bin/git log --since
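Applied to the function from the question, that might look like this (a sketch; the /usr/bin/git location is an assumption, and settings, filterCommits, and parseCommits are carried over from the original code):
import os
import subprocess

def getGitLogs(projectName, since, searchTerm=None):
    repo = os.path.join(settings.SCM_GIT, projectName)
    # chain the cd into the command and call git by its full path
    cmd = "cd {0} && /usr/bin/git log --since {1} -p".format(repo, since)
    p = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE,
                         stderr=subprocess.STDOUT, close_fds=True)
    output, _ = p.communicate()
    return filterCommits(parseCommits(output), searchTerm)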