Using wget with subprocess - python

I'm trying to use wget with subprocess.
My attempts worked until I tried to download the page to a specified directory with this code:
url = 'google.com'
location = '/home/patrick/downloads'
args = ['wget', 'r', 'l 1' 'p' 'P %s' % location, url]
output = Popen(args, stdout=PIPE)
If I run this code in /home/patrick, I get index.html in /home/patrick and not in /home/patrick/downloads.
Can you help me?
Thanks ;)

You need to have hyphens and location should be just another argument:
args = ['wget', '-r', '-l', '1', '-p', '-P', location, url]
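For completeness, a minimal runnable sketch of the corrected call, with the imports added and a communicate() call to wait for wget to finish:
from subprocess import Popen, PIPE

url = 'google.com'
location = '/home/patrick/downloads'
args = ['wget', '-r', '-l', '1', '-p', '-P', location, url]
output = Popen(args, stdout=PIPE)
stdout, _ = output.communicate()  # block until wget exits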

Edit: the subprocess module is intended to replace os.popen, so using os.popen is not recommended. Initially I thought the question was about popen from os.
If you are using popen from os:
#wget 'http://google.com/' -r -l 1 -p -P /Users/abhinay/Downloads
from os import popen
url = 'google.com'
location = '/Users/abhinay/Downloads'
args = ['wget', '-r', '-l 1', '-p', '-P %s' % location, url]
output = popen(' '.join(args))
and using Popen from subprocess
#wget 'http://google.com/' -r -l 1 -p -P Downloads/google
from subprocess import Popen, PIPE
url = 'google.com'
location = '/Users/abhinay/Downloads'
# as suggested by @SilentGhost, `location` and `url` should be separate arguments
args = ['wget', '-r', '-l', '1', '-p', '-P', location, url]
output = Popen(args, stdout=PIPE)
Let me know if I'm missing something.
Thx!


List of commands with Popen

Hi guys,
I have three commands that copy folders with similar paths to my home directory. I'm using this code:
from subprocess import Popen, PIPE
cmd_list = [
    'cp -r /opt/some_folder_1/ /home/user_name/',
    'cp -r /var/some_folder_2/ /home/user_name/',
    'cp -r /etc/some_folder_3/ /home/user_name/',
]
copy_paste = Popen(
    cmd_list,
    shell = True,
    stdin = PIPE,
    stdout = PIPE,
    stderr = PIPE
)
stdout, stderr = copy_paste.communicate()
But to copy all three folders this way, I would have to run the code three times.
Could you help me with this, guys?
Thank you!
Do:
import subprocess
cmd_list = [
    ['cp', '-r', '/opt/some_folder_1/', '/home/user_name/'],
    ['cp', '-r', '/var/some_folder_2/', '/home/user_name/'],
    ['cp', '-r', '/etc/some_folder_3/', '/home/user_name/'],
]

for cmd in cmd_list:
    res = subprocess.check_output(cmd)
    # res holds the command's stdout (check_output returns it directly)
    # a CalledProcessError is raised for non-zero return codes
When you pass a list to Popen or any of the other subprocess functions, it is treated as a single command: the first item is the program to run and everything else is arguments for that one program, so you can't pack multiple commands into cmd_list. This restriction helps keep your code safer, especially when using user-supplied input.
Another option is to join everything together into a single command with double ampersands. When you do that, if one command fails the remaining commands won't run:
copy_paste = Popen(
    ' && '.join(cmd_list),
    shell = True,
    stdin = PIPE,
    stdout = PIPE,
    stderr = PIPE
)
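For completeness, a minimal sketch of checking the result of the joined command (assuming the cmd_list of plain strings from the question):
stdout, stderr = copy_paste.communicate()
if copy_paste.returncode != 0:
    # one of the chained cp commands failed; its error message is in stderr
    print(stderr)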

Properly automate a docker script in Python

Based on this tutorial to build a TF image classifier, I have a bash shell in which I run a Docker image with the following command:
docker run --name fooldocker -it -v $HOME/tf_files:/tf_files/ gcr.io/tensorflow/tensorflow:latest-devel
And then in this docker image I run my Python script:
python /tf_files/label_image.py /tf_files/myimages
exit
It works.
But now I need to automate these commands in a Python script. I tried:
p = Popen(['docker', 'run', '--rm', '--name', 'fooldocker','-it', '-v', '$HOME/tf_files:/tf_files/', 'gcr.io/tensorflow/tensorflow:latest-devel'], stdout=PIPE)
p = Popen(['docker', 'exec', 'fooldocker', 'python', '/tf_files/label_NES.py', '/tf_files/NES/WIP'])
p = Popen(['docker', 'kill', 'fooldocker'], shell=True, stdout=PIPE, stderr=PIPE)
p = Popen(['docker', 'rm', 'fooldocker'], shell=True, stdout=PIPE, stderr=PIPE)
Leading to this error after Popen #2 is run :
docker: Error response from daemon: create $HOME/tf_files: "$HOME/tf_files" includes invalid characters for a local volume name, only "[a-zA-Z0-9][a-zA-Z0-9_.-]" are allowed.
The problem is that $HOME is not expanded in that string: without a shell, Popen does no variable expansion, so the literal text $HOME/tf_files is passed to docker. Evaluate the variable in Python beforehand (for example via os.environ or os.path.expanduser) and put the result into the command.
Also: If you set shell=True, you don't split your command into a list:
p = Popen('docker kill fooldocker', shell=True, stdout=PIPE, stderr=PIPE)
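A minimal sketch of the "evaluate it beforehand" approach, assuming the same image and paths as in the question (-it is swapped for -dit here so the container keeps running in the background for the later docker exec call):
import os
from subprocess import Popen, PIPE

home = os.environ['HOME']  # or os.path.expanduser('~')
volume = '%s/tf_files:/tf_files/' % home  # expanded in Python, not by a shell
p = Popen(['docker', 'run', '--rm', '--name', 'fooldocker', '-dit', '-v', volume,
           'gcr.io/tensorflow/tensorflow:latest-devel'],
          stdout=PIPE)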
This happens because Popen does not expand $HOME to your home path; the literal string $HOME is passed to the docker command, and docker does not allow $ in a volume name.
You can use subprocess.call with shell=True for convenience, for example:
import subprocess
subprocess.call("echo $HOME", shell=True)
The shell expands $HOME when shell=True is specified.

python Popen chmod error starting Postgresql

I can run:
sudo service postgresql start
from the command line with no issues. However when I try running the following:
import os
from subprocess import Popen,PIPE
pwd = getsudopwd()
cmd = ['sudo','service',process,'state']
p = Popen(cmd,stdout=PIPE,stdin=PIPE,stderr=PIPE,universal_newlines=True)
out,err = p.communicate(pwd+'\n')
if err: raise RuntimeError(err)
I get the following error
chmod: changing permissions of '/var/run/postgresql': Operation not permitted. So why is there an error accessing the pid directory for postgresql when this is run from Python?
You can simply use -S with sudo:
from subprocess import Popen, PIPE
import getpass
pwd = getpass.getpass()
proc = Popen(['sudo', '-S', 'service', process, 'state'],
             stdout=PIPE, stdin=PIPE, stderr=PIPE, universal_newlines=True)
out,err= proc.communicate(input="{}\n".format(pwd))
I suggest you use the sh library.
It's very simple and easy to use:
from sh import sudo
print sudo('service postgresql start')
Running the sudo command with the -S option and piping your password to the stdin of the sudo command should solve your problem.
from subprocess import Popen, PIPE

echo = Popen(('echo', 'mypasswd'), stdout=PIPE)
p = Popen(['sudo', '-S', 'service', 'postgresql', 'restart'],
          stdin=echo.stdout, stdout=PIPE, stderr=PIPE, universal_newlines=True)
out, err = p.communicate()
print out

How to use subprocess Popen?

I'm trying to execute a command using Popen.
The command uses some PostGIS/Postgresql utility programs to upload a raster file to a database and works when executed from the command line. It uses unix style pipes to chain 2 commands and looks like this:
"C:\\Program Files\\PostgreSQL\\9.2\\bin\\raster2pgsql.exe" -d -I -C -e -Y -F -t 128x128 "C:\\temp\\SampleDTM\\SampleDTM.tif" test | "C:\\Program Files\\PostgreSQL\\9.2\\bin\\psql.exe" -h localhost -p 5432 -d adr_hazard -U postgres
When using it within Python, I make it a single string (wrapped in ' quotes):
command = '"C:\\Program Files\\PostgreSQL\\9.2\\bin\\raster2pgsql.exe" -d -I -C -e -Y -F -t 128x128 "C:\\temp\\SampleDTM\\SampleDTM.tif" test | "C:\\Program Files\\PostgreSQL\\9.2\\bin\\psql.exe" -h localhost -p 5432 -d adr_hazard -U postgres'
Attempting to execute it results in an error:
p = subprocess.Popen(command)
ERROR: Unable to read raster file: test
The error seems like the command was not parsed correctly (it is interpreting the wrong argument as the raster file)
Am I using Popen wrong?
Your command uses pipe |. It requires a shell:
p = subprocess.Popen(command, shell=True)
The command itself as far as I can tell looks ok.
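If you also want to capture the output, a minimal extension of that call (assuming the same command string):
p = subprocess.Popen(command, shell=True,
                     stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = p.communicate()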
It's not necessary to use shell=True to achieve this. The pipe can be set up programmatically, which also works where insecure input is a concern. Here, conn_params is a dictionary with PASSWORD, NAME (database name), USER, and HOST keys.
raster2pgsql_ps = subprocess.Popen([
    'raster2pgsql', '-d', '-I', '-C', '-e', '-Y', '-F', '-t', '128x128',
    'C:\\temp\\SampleDTM\\SampleDTM.tif',
    'test'
], stdout=subprocess.PIPE)

# Connection made using conninfo parameters
# http://www.postgresql.org/docs/9.0/static/libpq-connect.html
psql_ps = subprocess.check_output([
    'psql',
    'password={PASSWORD} dbname={NAME} user={USER} host={HOST}'.format(**conn_params),
], stdin=raster2pgsql_ps.stdout)
The following worked for me on Windows, while avoiding shell=True
One can make use of Python's f-string formatting to make sure the commands will work on Windows.
Please note that I used shp2pgsql, but it should be a very similar process for raster2pgsql.
Parameters for shp2pgsql: srid is the coordinate system of the shapefile, filename is the path to the shapefile to be imported, tablename is the name you'd like to give your table.
import os
import subprocess
import sys

shp2pgsql_binary = os.path.join(pgsql_dir, "bin", "shp2pgsql")
psql_binary = os.path.join(pgsql_dir, "bin", "psql")

command0 = f'"{shp2pgsql_binary}" -s {srid} "{filename}" {tablename}'
command1 = f'"{psql_binary}" "dbname={databasename} user={username} password={password} host={hostname}"'

try:
    shp2pgsql_ps = subprocess.Popen(command0, stdout=subprocess.PIPE)
    psql_ps = subprocess.check_output(command1, stdin=shp2pgsql_ps.stdout)
except:
    sys.stderr.write("An error occurred while importing data into the database, "
                     "you might want to check the SQL commands below:\n")
    sys.stderr.write(command0 + "\n" + command1 + "\n")
    raise
To adapt this to raster2pgsql, you just need to modify the string in command0, e.g. -s {srid} becomes -d -I -C -e -Y -F -t 128x128. The string for command1 can remain the same; see the sketch below.
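A minimal sketch of that adaptation, reusing the pgsql_dir, filename and tablename variables from above (raster2pgsql_binary is just an assumed name for the new path):
raster2pgsql_binary = os.path.join(pgsql_dir, "bin", "raster2pgsql")
command0 = f'"{raster2pgsql_binary}" -d -I -C -e -Y -F -t 128x128 "{filename}" {tablename}'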
PIPE = subprocess.PIPE
pd = subprocess.Popen(['C:\\Program Files\\PostgreSQL\\9.2\\bin\\raster2pgsql.exe',
                       '-d', '-I', '-C', '-e', '-Y', '-F', '-t', '128x128',
                       'C:\\temp\\SampleDTM\\SampleDTM.tif', 'test'],
                      stdout=PIPE, stderr=PIPE)
stdout, stderr = pd.communicate()
It would be better to use subprocess.Popen in this way:
command = ('"C:\\Program Files\\PostgreSQL\\9.2\\bin\\raster2pgsql.exe" -d -I -C -e -Y -F -t 128x128 '
           '"C:\\temp\\SampleDTM\\SampleDTM.tif" test | '
           '"C:\\Program Files\\PostgreSQL\\9.2\\bin\\psql.exe" -h localhost -p 5432 -d adr_hazard -U postgres')
proc = subprocess.Popen(command, shell=True,
                        stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
result = proc.stdout.readlines()  # if you want to process the result of your command
proc.wait()
By the way, it's good to normalize the path first:
path = os.path.normpath("C:\\Program Files\\PostgreSQL\\9.2\\bin\\raster2pgsql.exe")
This avoids path problems across OS platforms.
shell=True is important if you want to execute your command just as you would in a local shell.
Hope this helps.

How to execute a shell command, get the output and pwd after the command in Python

How can I execute a shell command (which can be as complicated as a normal command on the bash command line), and get both the output of that command and the pwd after it has executed?
I use a function like this:
import subprocess as sub
def execv(command, path):
    p = sub.Popen(['/bin/bash', '-c', command],
                  stdout=sub.PIPE, stderr=sub.STDOUT, cwd=path)
    return p.stdout.read()[:-1]
I check whether the user ran a cd command, but that doesn't work when the user changes directory through a symlink or some other weird way.
I need a dictionary that holds {'cwd': '<NEW PATH>', 'result': '<COMMAND OUTPUT>'}
If you use subprocess.Popen, you get a Popen object: its communicate() method returns the command output, and its pid attribute gives you the process id. I'd be really surprised if you can't find a way to get the current working directory of a process by pid...
e.g.: http://www.cyberciti.biz/tips/linux-report-current-working-directory-of-process.html
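For example, a minimal Linux-only sketch of that idea (an assumption on my part; /proc/<pid>/cwd exists only while the process is running):
import os
from subprocess import Popen

proc = Popen(['sleep', '30'], cwd='/tmp')
# /proc/<pid>/cwd is a symlink to the process's current working directory
print(os.readlink('/proc/%d/cwd' % proc.pid))  # -> /tmp
proc.terminate()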
I redirect the stdout of the pwd command to stderr. If stdout is empty and stderr is not a path, then stderr holds the error output of the command.
import os
import subprocess as sub

def execv(command, path):
    command = 'cd %s && %s && pwd 1>&2' % (path, command)
    proc = sub.Popen(['/bin/bash', '-c', command],
                     stdout=sub.PIPE, stderr=sub.PIPE)
    stderr = proc.stderr.read()[:-1]
    stdout = proc.stdout.read()[:-1]
    if stdout == '' and not os.path.exists(stderr):
        raise Exception(stderr)
    return {
        "cwd": stderr,
        "stdout": stdout
    }
UPDATE: here is a better implementation (it uses the last line of output for pwd and doesn't use stderr):
def execv(command, path):
    command = 'cd %s && %s 2>&1;pwd' % (path, command)
    proc = sub.Popen(['/bin/bash', '-c', command],
                     env={'TERM': 'linux'},
                     stdout=sub.PIPE)
    stdout = proc.stdout.read()
    if len(stdout) > 1 and stdout[-1] == '\n':
        stdout = stdout[:-1]
    lines = stdout.split('\n')
    cwd = lines[-1]          # pwd output is the last line
    stdout = '\n'.join(lines[:-1])
    return {
        "cwd": cwd,
        "stdout": man_to_ansi(stdout)  # man_to_ansi is the author's own helper
    }
To get output of an arbitrary shell command with its final cwd (assuming there is no newline in the cwd):
from subprocess import check_output
def command_output_and_cwd(command, path):
    lines = check_output(command + "; pwd", shell=True, cwd=path).splitlines()
    return dict(cwd=lines[-1], stdout=b"\n".join(lines[:-1]))
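A quick usage sketch (hypothetical command and path; output shown as Python 3 bytes):
result = command_output_and_cwd("cd /tmp && ls", "/")
print(result["cwd"])     # -> b'/tmp'
print(result["stdout"])  # output of ls run in /tmp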
