I am having difficulty calling a command line tool from my script. I run the script, but I don't get any result. Through this command line in my script I want to run a tool that produces a folder containing the output files for each line. The inputpath is already defined. Can you please help me?
for line in inputFile:
    cmd = 'python3 CRISPRcasIdentifier.py -f %s/%s.fasta -o %s/%s.csv -st dna -co %s/' % (inputpath, line.strip(), outputfolder, line.strip(), outputfolder)
    os.system(cmd)
You really want to use the Python standard library module subprocess. Using functions from that module, you can construct your command line as a list of strings, and each one is processed as one file name, option, or value. This bypasses the shell's escaping and eliminates the need to massage your script arguments before calling.
Besides, the code as posted would not work, because the body of the for statement is not indented. Python would simply not accept it (it could be that the indentation was lost when pasting into the question).
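For example, a minimal sketch of the question's loop rewritten with subprocess (assuming inputFile, inputpath and outputfolder are defined as in the question):

import subprocess

for line in inputFile:
    name = line.strip()
    # each argument is its own list item, so no shell quoting is needed
    subprocess.run([
        'python3', 'CRISPRcasIdentifier.py',
        '-f', '%s/%s.fasta' % (inputpath, name),
        '-o', '%s/%s.csv' % (outputfolder, name),
        '-st', 'dna',
        '-co', '%s/' % outputfolder,
    ], check=True)

check=True makes the loop raise CalledProcessError if the tool exits with a non-zero status, so failures are not silently ignored.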
As mentioned before, executing commands via os.system(command) is not recommended. Please use subprocess instead (read about this module in the Python docs). See the code here:
import subprocess

for command in input_file:
    # command should be a list of arguments (e.g. from shlex.split) since shell=False
    p = subprocess.Popen(command, stdout=subprocess.PIPE, stdin=subprocess.PIPE, stderr=subprocess.PIPE)
    # use communicate() if you want to exchange data with the child process
    p.communicate()
    # --- do the rest
I usually do it like this for a static command:
from subprocess import check_output

def sh(command):
    return check_output(command, shell=True, universal_newlines=True)

output = sh('echo hello world | sed s/h/H/')
BUT THIS IS NOT SAFE! It's vulnerable to shell injection. Instead you should do:
from subprocess import check_output
from shlex import split

def sh(command):
    return check_output(split(command), universal_newlines=True)

output = sh('echo hello world')
The difference is subtle but important. shell=True will create a new shell, so pipes etc. will work. I use this when I have a big command line with pipes and it is static, I mean, it does not depend on user input. This variant is vulnerable to shell injection: if a user can sneak in something like ; rm -rf /, it will run.
The second variant only accepts one command; it will not spawn a shell but run the command directly. So pipes and other shell features will not work, and it is safer.
universal_newlines=True is for getting the output as a string instead of bytes. Use it for text output; if you need binary output, just omit it. The default is False.
So here is the full example
from subprocess import check_output
from shlex import split

def sh(command):
    return check_output(split(command), universal_newlines=True)

for line in inputFile:
    cmd = 'python3 CRISPRcasIdentifier.py -f %s/%s.fasta -o %s/%s.csv -st dna -co %s/' % (inputpath, line.strip(), outputfolder, line.strip(), outputfolder)
    sh(cmd)
PS: I didn't test this.
I'm trying to use subprocess.Popen() to run a command in my script. The code is:
output = Popen(["hrun DAR_MeasLogDump " + log_file_name], stdout=subprocess.PIPE, stderr = subprocess.PIPE, executable="/bin/csh", cwd=cwdir, encoding='utf-8')
When I print the output, it's printing out the created shell output and not the actual command that's in the list. I tried getting rid of executable='/bin/csh', but then Popen wouldn't even run.
I also tried using subprocess.communicate(), but it didn't work either. I would also get the shell output and not the actual command run.
I want to completely avoid using shell=True because of security issues.
EDIT: In many different attempts, "hrun" is not being recognized. "hrun" is a Perl script that is being called, DAR_MeasLogDump is the action, and log_file_name is the file the script will perform its action on. Is there any sort of setup or configuration that needs to be done in order for "hrun" to be recognized?
I think the problem is that Popen requires a list of every part of the command (command + options); the documentation for Popen inside subprocess has an example of that. So for that line in your script to work, you would need to write it like this:
output = Popen(["/bin/csh", "hrun", "DAR_MeasLogDump", log_file_name], stdout=subprocess.PIPE, stderr = subprocess.PIPE)
I've removed the executable argument, but I guess it could work that way as well.
Try:
output = Popen(["-c", "hrun DAR_MeasLogDump " +log_file_name], stdout=subprocess.PIPE, stderr = subprocess.PIPE, executable="/bin/csh", cwd=cwdir, encoding='utf-8')
csh is expecting -c "full command here". Without -c I think it just tries to open it as a file.
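If you would rather drop the executable argument entirely, the same idea can be expressed by invoking csh explicitly; a sketch, assuming csh really is required here:

from subprocess import Popen, PIPE

output = Popen(["/bin/csh", "-c", "hrun DAR_MeasLogDump " + log_file_name],
               stdout=PIPE, stderr=PIPE, cwd=cwdir, encoding='utf-8')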
Specifying an odd shell and an explicit cwd seems completely out of place here (assuming cwdir is defined to the current directory).
If the first argument to subprocess is a list, no shell is involved.
result = subprocess.run(["hrun", "DAR_MeasLogDump", log_file_name],
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                        universal_newlines=True, check=True)
output = result.stdout
If you need this to be run under a legacy version of Python, maybe use check_output instead of run.
You generally want to avoid Popen unless you need to do something which the higher-level wrapper functions cannot do.
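For reference, a check_output variant for older Pythons might look roughly like this (a sketch, reusing the same argument list as above):

import subprocess

output = subprocess.check_output(["hrun", "DAR_MeasLogDump", log_file_name],
                                 stderr=subprocess.STDOUT,
                                 universal_newlines=True)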
You are creating an instance of subprocess.Popen but never collecting its output.
You should try:
p = Popen(["hrun", "DAR_MeasLogDump", log_file_name], stdout=subprocess.PIPE, stderr=subprocess.PIPE, cwd=cwdir, encoding='utf-8')
out, err = p.communicate() # This will get you output
Args should be passed as a sequence if you do not use shell=True, and then using executable should not be required.
Note that if you are not using advanced features from Popen, the doc recommends using subprocess.run:
from subprocess import run

p = run(["hrun", "DAR_MeasLogDump", log_file_name], capture_output=True, cwd=cwdir, encoding='utf-8')
out, err = p.stdout, p.stderr  # This will get you the output
This works with a cat example:
import subprocess
log_file_name='-123.txt'
output = subprocess.Popen(['cat', 'DAR_MeasLogDump' + log_file_name],
                          stdout=subprocess.PIPE,
                          stderr=subprocess.STDOUT)
stdout, stderr = output.communicate()
print(stdout)
print(stderr)
I think you only need to change 'cat' to your 'hrun' command.
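Adapted to the question, that would be something along these lines (an untested sketch):

import subprocess

output = subprocess.Popen(['hrun', 'DAR_MeasLogDump', log_file_name],
                          stdout=subprocess.PIPE,
                          stderr=subprocess.STDOUT)
stdout, stderr = output.communicate()
print(stdout)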
It seems like the same problem I had at the beginning of a project with Windows environment variables: when you open CMD or PowerShell, it does not recognize perl, java, etc. unless you first cd into the folder where the .exe, .py, .java, etc. lives and run the command from there.
In my ADB project, once I added the tool's folder to my environment variables (PATH), I no longer needed to go to the folder where the .exe, .py or adb executable was located.
Now you can open a CMD and it will execute any command, even your Perl script, because the interpreter that PowerShell uses will find and recognize the command.
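If changing the system PATH is not an option, you can also extend PATH just for the child process; a sketch, where the directory is a made-up placeholder:

import os
import subprocess

env = os.environ.copy()
# hypothetical location of the hrun script; adjust to wherever it actually lives
env["PATH"] = r"C:\tools\hrun" + os.pathsep + env["PATH"]

result = subprocess.run(["hrun", "DAR_MeasLogDump", log_file_name],
                        env=env, capture_output=True, universal_newlines=True)
print(result.stdout)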
INPUTS is the variable I gave for the absolute path of a directory of possible input files. I want to check their status before going through my pipeline. So I tried:
import subprocess
import argparse
INPUTS = '/home/username/WinterResearch/Inputs'
status = subprocess.Popen(['ls', '-lh', INPUTS], shell=True, stdout=subprocess.PIPE)
stdout = status.communicate()
status.stdout.close()
I have also tried the often used
from shlex import split
import subprocess
import argparse
cmd = 'ls -lh INPUTS'
status = subprocess.Popen(cmd.split(), shell=True, stdout=subprocess.PIPE)
and
cmd = "ls -lh 'INPUTS'"
I do not receive an error code. The process simply does not output anything to the terminal window. I am not sure why the Python script just skips over this instead of reporting an error. I do receive an error when I include close_fds=True, stating that int cannot use communicate(). So how can I get the output of an ls -lh INPUTS equivalent using subprocess.Popen()?
You don't see any output because you're not printing it to the console; it's saved into a variable (named "stdout"). Popen is overkill for this task anyway, since you aren't piping the command into another one. check_output should work fine for this purpose:
import subprocess
subprocess.check_output("ls -lh {0}".format(INPUTS), shell=True)
subprocess.check_output(args, *, stdin=None, stderr=None, shell=False,
                        universal_newlines=False)

Run command with arguments and return its output as a byte string.
A method with less security risk (see the shell=True warnings plastered throughout this page):
EDIT: Passing the arguments as a list and reading the output with communicate() avoids the potential shell=True security risk:
output = subprocess.Popen(["ls", "-lh", INPUTS], stdout=subprocess.PIPE).communicate()[0]
print(output)
From your first snippet:
stdout = status.communicate()
status.stdout.close()
Nothing is being printed here. You may need to change it to the following (or your preferred form/format):
stdout = status.communicate()
print(stdout)
status.stdout.close()
I tried to write code that can execute Python scripts easily,
but when I used the subprocess library like this:
import subprocess
print(subprocess.Popen("py setup.py install", shell = True, stdout = subprocess.PIPE).stdout.read())
print(subprocess.Popen("py setup.py py2exe", shell = True, stdout = subprocess.PIPE).stdout.read())
I just saw this result:
b''
Please help me.
Most likely the commands you are trying to run are producing output on stderr, which your code does not display. It is possible to redirect the stderr messages to stdout if you don't want to handle them separately.
I'll use a different command in the subprocess that is relatively safe. And I will break it up a little instead of having one long line.
import subprocess
p = subprocess.Popen("python filedoesntexist",
shell=True,
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT)
print(p.stdout.read())
See that I added the parameter stderr=subprocess.STDOUT; this sends all the error messages to stdout. The subprocess tries to run "python filedoesntexist" and since filedoesntexist is a file that doesn't exist, it will print this message:
b"python: can't open file 'filedoesntexist': [Errno 2] No such file or directory\n"
But you might just want to get the string instead of bytes, and you can add the parameter universal_newlines=True like this:
p = subprocess.Popen("python filedoesntexist",
shell=True,
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT,
universal_newlines=True)
print(p.stdout.read())
Now it prints just the string like this:
python: can't open file 'filedoesntexist': [Errno 2] No such file or directory
For additional information, visit the Python documentation.
Edit
The documentation recommends using run(), which can be done like this (updated after comments from J.F. Sebastian) :
subprocess.run(["python", "filedoesntexist"])
If you need to handle stdout in some way, add parameters described earlier in the Popen examples.
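For instance, capturing both streams as text with run() might look like this (a sketch reusing the same nonexistent-file example):

import subprocess

result = subprocess.run(["python", "filedoesntexist"],
                        stdout=subprocess.PIPE,
                        stderr=subprocess.STDOUT,
                        universal_newlines=True)
print(result.stdout)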
I was trying to run and process the stdout of some java program and found that my Python script was eternally waiting. Then I've wrote a new test script to test subprocess and found that, again, I see no output when running this:
$ cat test.py
#!/usr/bin/env python
import subprocess
c = ['/usr/bin/tail', '-f', '/var/log/dmesg']
proc = subprocess.Popen(c,
bufsize=1,
shell=False,
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT)
for line in proc.stdout:
print line
Why is subprocess ignoring my bufsize argument? Is there some intermediate buffering I'm missing to take into account? I expect to read the first 10 lines of tail and then wait forever until new lines are appended to the dmesg file. My user does have permissions, and running the command in bash gives output.
Changing tail to yes seems to fill some buffers and I can see lots of output.
You can use iter(proc.stdout.readline,''):
for line in iter(proc.stdout.readline, ''):
    print line
for line in proc.stdout reads all the input before iterating over the content.
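Putting it together, the question's test script could be adapted roughly like this (a sketch, written for Python 3, hence print() and the text-mode pipe):

import subprocess

c = ['/usr/bin/tail', '-f', '/var/log/dmesg']
proc = subprocess.Popen(c,
                        bufsize=1,
                        stdout=subprocess.PIPE,
                        stderr=subprocess.STDOUT,
                        universal_newlines=True)

# read lines as they arrive instead of waiting for the stream to end
for line in iter(proc.stdout.readline, ''):
    print(line, end='')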
I am trying to mirror the following shell command using subprocess.Popen():
echo "SELECT employeeid FROM Users WHERE samaccountname=${1};" | bsqldb -S mdw2k8sqlp02.dow.com -D PhoneBookClient -U PortManUser -P plum45\\torts -q
It currently looks like:
stdout = subprocess.Popen(["echo", "\"SELECT", "employeeid", "FROM", "Users", "WHERE", "samaccountname=${1};\"", "|", "bsqldb", "arg1etc"], stdout=subprocess.PIPE)
for line in stdout.stdout.readlines():
    print line
It seems that this is wrong, it returns the following standard out:
"SELECT employeeid FROM Users WHERE samaccountname=${1};" | bsqldb arg1etc
Does anyone know where my syntax for subprocess.Popen() has gone wrong?
The problem is that you're trying to run a shell command without the shell. What happens is that you're passing all of those strings, including "|" and everything after it, as arguments to the echo command.
Just add shell=True to your call to fix that.
However, you almost definitely want to pass the command line as a string, instead of trying to guess at the list that will be joined back up into the string to pass to the shell.
Or, even better, don't use the shell, and instead pipe within Python. The docs have a nice section about Replacing shell pipeline (and all kinds of other things) with subprocess code.
But in your case, the thing you're trying to pipe is just echo, which is quite silly, since you already have exactly what echo would return, and can just feed it as the input to the second program.
Also, I'm not sure what you expect that ${1} to get filled in with. Presumably you're porting a shell script that took some arguments on the command line; your Python script may have the same thing in sys.argv[1], but without knowing more about what you're doing, that's little more than a guess.
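To illustrate the "pipe within Python" idea mentioned above, connecting two processes without a shell can look roughly like this (a sketch with generic placeholder commands, not the bsqldb invocation itself):

from subprocess import Popen, PIPE

# producer | consumer, with no shell involved
producer = Popen(["ls", "-lh"], stdout=PIPE)
consumer = Popen(["grep", "txt"], stdin=producer.stdout, stdout=PIPE)
producer.stdout.close()  # let the producer receive SIGPIPE if the consumer exits early
output, _ = consumer.communicate()
print(output)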
The analog of echo some string | command arg1 arg2 shell pipeline in Python is:
from subprocess import Popen, PIPE
p = Popen(["command", "arg1", "arg2"], stdin=PIPE)
p.communicate("some string")
In your case, you could write it as:
import shlex
import sys
from subprocess import Popen, PIPE
cmd = shlex.split("bsqldb -S mdw2k8sqlp02.dow.com -D PhoneBookClient "
                  "-U PortManUser -P plum45\\torts -q")
# sql_escape is a placeholder for whatever quoting/escaping your SQL layer provides
sql = """SELECT employeeid FROM Users
WHERE samaccountname={name};""".format(name=sql_escape(sys.argv[1]))
p = Popen(cmd, stdin=PIPE)
p.communicate(input=sql)
sys.exit(p.returncode)
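One caveat (an assumption about your Python version on my part): on Python 3 the stdin pipe is binary by default, so communicate() needs either bytes or a text-mode pipe, roughly:

p = Popen(cmd, stdin=PIPE, universal_newlines=True)  # text mode, so a str can be passed as input
p.communicate(input=sql)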