I need to invoke make (build a makefile) in a directory different from the one I'm in, from inside a Python script. If I simply do:
build_ret = subprocess.Popen("../dir1/dir2/dir3/make",
shell = True, stdout = subprocess.PIPE)
I get the following:
/bin/sh: ../dir1/dir2/dir3/make: No such file or directory
I've tried:
build_ret = subprocess.Popen("(cd ../dir1/dir2/dir3/; make)",
shell = True, stdout = subprocess.PIPE)
but the make command is ignored. I don't even get the "Nothing to build for" message.
I've also tried using "communicate" but without success. This is running on Red Hat Linux.
I'd go with @Philipp's solution of using cwd, but as a side note you could also use the -C option to make:
make -C ../dir1/dir2/dir3/
-C dir, --directory=dir
Change to directory dir before reading the makefiles or doing anything else. If multiple -C options are specified, each is interpreted relative to the previous one: -C / -C etc is equivalent to -C /etc. This is typically used with recursive invocations of make.
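For completeness, a minimal sketch of running that form from Python without a shell (the directory path is the one from the question and is assumed to contain the Makefile):
import subprocess

# Let make change into the build directory itself via -C; no shell needed
proc = subprocess.Popen(["make", "-C", "../dir1/dir2/dir3"], stdout=subprocess.PIPE)
out, _ = proc.communicate()
print(out.decode())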
Use the cwd argument, and use the list form of Popen:
subprocess.Popen(["make"], stdout=subprocess.PIPE, cwd="../dir1/dir2/dir3")
Invoking the shell is almost never required and is likely to cause problems because of the additional complexity involved.
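Since the question also mentions communicate, here is a minimal sketch of reading make's output that way (assuming, as in the question, that the Makefile lives in ../dir1/dir2/dir3):
import subprocess

# Run make in the directory that contains the Makefile and capture its output
proc = subprocess.Popen(["make"], stdout=subprocess.PIPE, cwd="../dir1/dir2/dir3")
out, _ = proc.communicate()
print(out.decode())
print("make exited with", proc.returncode)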
Try using the full path. You can get the location of the Python script by calling
sys.argv[0]
and to get just its directory:
os.path.dirname(sys.argv[0])
You'll find plenty of other path-manipulation helpers in the os.path module.
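A sketch of combining that with the cwd suggestion above (the dir1/dir2/dir3 layout is just the one from the question, adjust to your tree):
import os
import subprocess
import sys

# Directory of the running script, via sys.argv[0] as suggested above
script_dir = os.path.dirname(os.path.abspath(sys.argv[0]))
# Assumed layout: the Makefile lives in ../dir1/dir2/dir3 relative to the script
make_dir = os.path.join(script_dir, "..", "dir1", "dir2", "dir3")
subprocess.Popen(["make"], stdout=subprocess.PIPE, cwd=make_dir)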
From within a Python-based installation script running under the Fish shell and supporting Python 2.7 and 3.4+, I want to add a directory to the PATH environment variable if it is not already present. When the script exits, PATH should contain the new directory, both for the current Fish shell session as well as future logins.
To achieve this, I can think of two potential approaches:
Create a localpath.fish file in ~/.config/fish/conf.d/ containing: set PATH ~/.local/bin $PATH
Use Python’s subprocess to run: set -U fish_user_paths ~/.local/bin $fish_user_paths
I would prefer the latter approach, but I have tried invoking Fish via subprocess several different ways, all to no avail. For example:
>>> subprocess.check_output(["fish", "-c", "echo", "Hello", "World!"], shell=True)
On Fish 3.0.2 and Python 3.7.4, the above yields:
<W> fish: Current terminal parameters have rows and/or columns set to zero.
<W> fish: The stty command can be used to correct this (e.g., stty rows 80 columns 24).
… and the Python REPL prompt does not re-appear. Tapping CTRL-C does not exit properly, leaving the Python REPL in a mostly unusable state until quit and relaunched. (This appears to be a known issue.)
Similarly, using subprocess.run() with Python 3.7’s capture_output parameter fails to yield the expected output:
>>> subprocess.run(["fish", "-c", "echo", "Hello", "World!"], capture_output=True)
CompletedProcess(args=['fish', '-c', 'echo', 'Hello', 'World!'], returncode=0, stdout=b'\n', stderr=b'')
My questions:
Is there a better strategy than either of the two methods above?
If using the first approach, how would I adjust PATH for the current Fish shell session from within a Python script?
If using the set -U fish_user_paths […] approach, what would be the appropriate way to invoke that from within a Python script?
You're on the right track, but the whole command has to be passed to fish -c as a single argument. Also, you don't want shell=True adding an extra layer of shell expansion beforehand.
So you could write:
subprocess.check_output(["fish", "-c", "echo hello world"])
and it will do what you expect. For the PATH case:
subprocess.check_output(["fish", "-c", "set -U fish_user_paths ~/.local/bin $fish_user_paths"])
and because fish_user_paths is a universal variable, the change will be reflected in $PATH for the parent Fish session and for future logins as well.
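Since the question also wants to skip the update when the directory is already present, one way to sketch that check is with fish's contains builtin (the directory name here is just an example):
import subprocess

new_dir = "~/.local/bin"  # example directory to add; adjust as needed
# fish's contains builtin exits non-zero if the entry is not in the list yet
already_there = subprocess.call(
    ["fish", "-c", "contains -- {} $fish_user_paths".format(new_dir)]) == 0
if not already_there:
    subprocess.check_call(
        ["fish", "-c", "set -U fish_user_paths {} $fish_user_paths".format(new_dir)])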
I have been trying to use the output of one system command as part of the next command. However, I cannot seem to join them up properly, so the second command does not run correctly. The OS is Kali Linux, with Python 2.7.
# IMPORTS
import commands, os, subprocess

os.system('mkdir -p ~/Desktop/TOOLS')
checkdir = commands.getoutput('ls ~/Desktop')

if 'TOOLS' in checkdir:
    currentwd = subprocess.check_output('pwd', shell=True)
    cmd = 'cp -R {}/RAW ~/Desktop/TOOLS/'.format(currentwd)
    os.system(cmd)
    os.system('cd ~/Desktop/TOOLS')
    os.system('pwd')
The errors are:
cp: missing destination file operand after ‘/media/root/ARSENAL’
Try 'cp --help' for more information.
sh: 2: /RAW: not found
/media/root/ARSENAL
It seems the output of the first command is read correctly, but it won't join up with the /RAW portion. I have read many other solutions, but they seem to be aimed at shell scripting instead.
Assuming you haven't called os.chdir() anywhere before the cp -R, you can simply use a relative path. (The original failure happens because subprocess.check_output() returns the output with a trailing newline, which breaks the cp command in two.) Changing the code to...
if 'TOOLS' in checkdir:
    cmd = 'cp -R RAW ~/Desktop/TOOLS'
    os.system(cmd)
...should do the trick.
Note that the line...
os.system('cd ~/Desktop/TOOLS')
...will not do what you expect. os.system() spawns a subshell, so it will just change the working directory for that process and then exit. The calling process's working directory will remain unchanged.
If you want to change the working directory for the calling process, use...
os.chdir(os.path.expanduser('~/Desktop/TOOLS'))
However, Python has all this functionality built-in, so you can do it without spawning any subshells...
import os, shutil
# Specify your path constants once only, so it's easier to change
# them later
SOURCE_PATH = 'RAW'
DEST_PATH = os.path.expanduser('~/Desktop/TOOLS/RAW')
# Recursively copy the files, creating the destination path if necessary.
shutil.copytree(SOURCE_PATH, DEST_PATH)
# Change to the new directory
os.chdir(DEST_PATH)
# Print the current working directory
print os.getcwd()
In Python I'm using subprocess to call gsutil copy and move commands, but am currently unable to select multiple extensions.
The same gsutil command works at the terminal, but not in Python:
cmd_gsutil = "sudo gsutil -m mv gs://xyz-ms-media-upload/*.{mp4,jpg} gs://xyz-ms-media-upload/temp/"
p = subprocess.Popen(cmd_gsutil, shell=True, stderr=subprocess.PIPE)
output, err = p.communicate()
If, say, there are four filetypes to move but the bucket is empty, the gsutil error returned at the terminal is:
4 files/objects could not be transferred.
Whereas the error returned when run through subprocess is:
1 files/objects could not be transferred.
So clearly subprocess is mucking up the command somehow...
I could always inefficiently repeat the command for each of the filetypes, but would prefer to get to the bottom of this!
It seems /bin/sh (the default shell used by shell=True) doesn't support the {mp4,jpg} brace-expansion syntax.
Pass executable='/bin/bash' to have the command run by bash instead.
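For instance, a sketch of the question's snippet with that one change applied (everything else left as in the question):
import subprocess

cmd_gsutil = ("sudo gsutil -m mv "
              "gs://xyz-ms-media-upload/*.{mp4,jpg} "
              "gs://xyz-ms-media-upload/temp/")
# bash performs the {mp4,jpg} brace expansion that /bin/sh does not
p = subprocess.Popen(cmd_gsutil, shell=True, executable='/bin/bash',
                     stderr=subprocess.PIPE)
output, err = p.communicate()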
You could also run the command without the shell, e.g. using the glob or fnmatch modules to get the filenames and construct the gsutil command. Note: you should pass the command as a list in that case.
I'm making a Python script for Travis CI.
.travis.yml
...
script:
- support/travis-build.py
...
The python file travis-build.py is something like this:
#!/usr/bin/env python
from subprocess import check_call
...
check_call(r"mv !(my_project|cmake-3.0.2-Darwin64-universal) ./my_project/final_folder", shell=True)
...
When Travis building achieves that line, I'm getting an error:
/bin/sh: 1: Syntax error: "(" unexpected
I've tried writing it in a lot of different forms, but I get the same result. Any ideas?
Thanks in advance!
Edit
My current directory layout:
- my_project/final_folder/
- cmake-3.0.2-Darwin64-universal/
- fileA
- fileB
- fileC
With this command I'm trying to move all the current files (fileA, fileB and fileC), excluding the my_project and cmake-3.0.2-Darwin64-universal folders, into ./my_project/final_folder. If I execute this command in a Linux shell it does what I want, but not through check_call().
Note: I can't move the files one by one, because there are many others
I don't know which shell Travis is using by default because I don't specify it; I only know that if I write the command directly in my .travis.yml:
.travis.yml
...
script:
# Here is the previous Travis code
- mv !(my_project|cmake-3.0.2-Darwin64-universal) ./my_project/final_folder
...
It works. But if I use the script, it fails.
I found this command from the following issue:
How to use 'mv' command to move files except those in a specific directory?
You're using the bash feature extglob to exclude the files you're specifying, so it has to be enabled for the pattern to exclude those two entries.
The Python subprocess module uses /bin/sh when you pass shell=True, and that shell doesn't enable bash features like extglob (it deliberately behaves like a plain POSIX sh).
If you want bash to interpret the command, you have to invoke bash explicitly, for example:
subprocess.check_call(["bash", "-O", "extglob", "-c", "mv !(my_project|cmake-3.0.2-Darwin64-universal) ./my_project/final_folder"])
I would not choose to do the job in this manner, though.
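One shell-free way to sketch it (this is not part of the original answer; the folder names are taken from the question):
import os
import shutil

# Move every entry in the current directory except the two excluded folders
EXCLUDE = {"my_project", "cmake-3.0.2-Darwin64-universal"}
for entry in os.listdir("."):
    if entry not in EXCLUDE:
        shutil.move(entry, os.path.join("my_project", "final_folder", entry))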
Let me try again: in which shell do you expect your !(...) syntax to work? Is it bash? Is it ksh? I have never used it, and a quick search for a corresponding bash feature led nowhere. I suspect your syntax is just wrong, which is what the error message is telling you. In that case, your problem is entirely independent of Python and the subprocess module.
If some particular shell on your system supports this syntax, you need to make sure Python invokes your command with that same shell. The error tells you which shell it used: /bin/sh. That is usually just a link to the real shell executable. Does it point to the same shell you tested your command in?
Edit: the SO solution you referenced contains the solution in the comments:
Tip: Note however that using this pattern relies on extglob. You can
enable it using shopt -s extglob (If you want extended globs to be
turned on by default you can add shopt -s extglob to .bashrc)
Just to demonstrate that different shells might deal with your syntax in different ways, first using bash:
$ !(uname)
-bash: !: event not found
And then, using /bin/dash:
$ !(uname)
Linux
Unless you use shell=True, the argument to a subprocess function must be a list of command-line arguments. Use e.g. shlex.split() to split the string into proper command-line arguments:
import shlex, subprocess
subprocess.check_call(shlex.split("mv !(...)"))
EDIT:
So, the goal is to move files/directories, with the exception of some file(s)/directory(ies). By playing around with bash, I could get it to work like this:
mv `ls | grep -v -e '\(exclusion1\|exclusion2\)'` my_project
So in your situation that would be:
mv `ls | grep -v -e '\(my_project\|cmake-3.0.2-Darwin64-universal\)'` my_project
This could go into subprocess.check_call(..., shell=True) and it should do what you expect.
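For instance, a sketch of wiring the backtick command above into check_call (the pattern is the same one shown just above):
import subprocess

# grep -v filters out the two folders; backticks substitute the survivors into mv
cmd = r"mv `ls | grep -v -e '\(my_project\|cmake-3.0.2-Darwin64-universal\)'` my_project"
subprocess.check_call(cmd, shell=True)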
In a python script I call a bash script as follows:
subprocess.Popen(["./scan.sh", dir])
Inside that script, there is,
find $1 -name "*013*.txt" > found_files.txt
For some reason, the dir argument from Python appears to be wrapped in quotes inside the bash script. Printing dir in Python yields the exact path as the user typed it:
~/Desktop/Files
however, find fails with
find: '~/Desktop/Files' no such directory
Running scan.sh manually with ~/Desktop/Files as the argument works fine. How come quotes are being put around it...?
There aren't any quotes. What's happening is that the ~ is not being expanded, because that is the shell's job. Use os.path.expanduser() to expand the path before passing it to subprocess.
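A minimal sketch of that fix (the path and script name are the ones from the question):
import os
import subprocess

# Expand ~ in Python, since no shell is involved to do it for us
dir = os.path.expanduser("~/Desktop/Files")
subprocess.Popen(["./scan.sh", dir])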
Use $HOME; '~' is not expanded inside single or double quotes.
python -c "import subprocess;subprocess.Popen(['./scan.sh', '~'])"
python -c "import subprocess;subprocess.Popen(['./scan.sh', '$HOME'])"
My scan.sh contains:
#!/bin/sh
echo =$1=
The first one prints =~=, the second prints =/Users/jomo=.