Write to file with tee command in Python

I have a Python script with variables named e.g. V and It. I build a filename from these parameters as follows:
file_out = io.open("file_Iter" + str(It) + "_V_" + str(V) + ".txt", 'w')
Then I'd like to redirect all my terminal output to this file, so I use this command:
os.system("echo - START RUN $(LANG=en_US date +%b_%d_%Y_%k_%M)- | tee -a $file_out")
file_out is created, and the echo output is shown correctly on the terminal, but nothing is written to file_out. If I give tee a literal name instead, e.g. tee testfile.txt, then that file is created and the echo output is written to it.
Q: How should I change the tee command so that it writes to the file whose name is built from the variables?

I'm not sure I understood it correctly, but I guess that what you want to do is the following:
fileName = "file_Iter" + str(It) + "_V_" +str(V)+".txt"
file_out=io.open(fileName, 'w')
os.system("echo - START RUN $(LANG=en_US date +%b_%d_%Y_%k_%M)- | tee -a " + fileName)
Pay attention to the end of the command, where fileName is concatenated.

$file_out refers to a shell variable -- a variable in the context of the shell that you're executing this command. However, file_out is a Python variable in the context of your Python script.
So, we need to pass this from Python into the shell. Luckily, Python provides an easy way to do this with string formatting. Note that you want to interpolate the file name string, not the open file object. This should do what you want, I think:
file_name = "file_Iter" + str(It) + "_V_" + str(V) + ".txt"
os.system("echo - START RUN $(LANG=en_US date +%b_%d_%Y_%k_%M)- | " +
          "tee -a {}".format(file_name))
However, there are a couple of concerns with this. First, you should be using the subprocess module (e.g. subprocess.Popen) instead of os.system -- it is the newer interface, intended to replace os.system.
Also, you should be aware that, depending on the content of V and It, it may be possible for this shell to run unexpected commands. Imagine that V has the value '.txt ; sudo rm -rf / #'. Now your file name is 'file_IterIt_V_.txt ; sudo rm -rf / #.txt', and your shell command is 'echo - START RUN $(LANG=en_US date +%b_%d_%Y_%k_%M)- | tee -a file_IterIt_V_.txt ; sudo rm -rf / #.txt', which will happily remove many files from your computer. If you're going to do this, you should make sure to escape the filename in your shell command.
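For example, a minimal sketch of that escaping using the standard library's shlex.quote (the hostile value of V is purely illustrative):
import os
import shlex

It = 3
V = "2.5 ; rm -rf ~"   # hostile value, for illustration only
file_name = "file_Iter" + str(It) + "_V_" + str(V) + ".txt"

# shlex.quote wraps the name in single quotes, so the shell sees
# the whole thing as one literal argument to tee
os.system("echo - START RUN $(LANG=en_US date +%b_%d_%Y_%k_%M)- | tee -a "
          + shlex.quote(file_name))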

Related

How to embed Python code in a fish script?

I'm trying to convert over a Bash script that includes the following commands:
PYCODE=$(cat << EOF
#INSERT_PYTHON_CODE_HERE
EOF
)
RESPONSE=$(COLUMNS=999 /usr/bin/env python3 -c "$PYCODE" "$@")
The idea is that a sed find/replace is then used to inject an arbitrary Python script where #INSERT_PYTHON_CODE_HERE is, creating the script that is then run.
The corresponding Fish command would seem to be something like this
set PYCODE "
#INSERT_PYTHON_CODE_HERE
"
set RESPONSE (COLUMNS=999 /usr/bin/env python3 -c "$PYCODE" $argv)
but this falls apart when you have a Python script that can include both ' and " (and any other valid) characters.
What is the correct way to translate this use of EOF?
As a side note, I would prefer not to modify the sed command that is injecting the python code, but for reference here it is:
set FISH_SCRIPT (sed -e "/#INSERT_PYTHON_CODE_HERE/r $BASE_DIR/test_sh.py" $BASE_DIR/../src/mfa.fish)

Can a script provide input when prompted by the shell? [duplicate]

Suppose I wanted to make a bunch of files full of gibberish.
If I wanted to make one file of gibberish and then encrypt it using ccrypt, I could do this:
$ echo "12 ddsd23" > randomfile.txt
Now using ccrypt:
$ ccrypt -e randomfile.txt
Enter encryption key:
Enter encryption key: (repeat)
As you can see, I am prompted to input the key.
I want to automate this and create a bunch of gibberish files.
Script in Python to produce random gibberish:
import random as rd
import string as st

alphs = st.ascii_letters
digits = st.digits

word = ""
while len(word) < 1000:
    word += rd.choice(alphs)   # choice() picks a single character
    word += rd.choice(digits)
print(word)
Now running this from a bash script, saving the gibberish to files:
#!/bin/bash
count=1
while [ $count -le 100 ]
do
    python3 /path/r.py > "file$count.txt"
    ccrypt -e "file$count.txt"
    ((count=count+1))
done
Problem, as you can see:
$ bash random.sh
Enter encryption key:
ccrypt does not have an option to provide the passphrase as an argument.
Question: is there a way for the bash script to provide the passphrase when the shell prompts for it?
I am aware this can be solved just by doing the encryption in Python, but I am curious whether something like this can be done with bash.
If it matters: there is an option for ccrypt to ask for just one prompt.
[Edited]
My original answer suggested doing:
printf "$PASSPHRASE\n$PASSPHRASE\n" | ccrypt -e "file$count.txt"
which is the generic solution that should work with many tools that expect input on their STDIN; but it doesn't seem to work with ccrypt, presumably because ccrypt reads the passphrase from the controlling terminal rather than from stdin.
However, ccrypt also has options for providing the passphrase in different (non-interactive) ways:
$ ccrypt --help
...
-K, --key key        give keyword on command line (unsafe)
-k, --keyfile file   read keyword(s) as first line(s) from file
...
Here's an example using -K. Note that it is "unsafe": if you execute this command in your interactive shell, the passphrase may end up in ~/.bash_history, and if you run your script with -x (to print each executed command), it may end up in logs. If that matters, dump the passphrase to a file and use -k instead.
#!/bin/bash
# read the passphrase, do not display it to screen
read -p "Please provide a passphrase:" -s PASSPHRASE
count=1
while [ $count -le 100 ]
do
    python script.py > "file$count.txt"
    ccrypt -e "file$count.txt" -K "$PASSPHRASE"
    ((count=count+1))
done
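For completeness, a sketch of the safer -k (keyfile) variant driven from Python instead of bash, based on the --help excerpt above; the file names mirror the question's loop and are illustrative:
import getpass
import os
import subprocess
import tempfile

passphrase = getpass.getpass("Please provide a passphrase: ")

# ccrypt -k reads the keyword from the first line of a file
with tempfile.NamedTemporaryFile("w", delete=False) as keyfile:
    keyfile.write(passphrase + "\n")

try:
    for count in range(1, 101):
        name = "file{}.txt".format(count)
        with open(name, "w") as out:
            subprocess.run(["python3", "/path/r.py"], stdout=out, check=True)
        subprocess.run(["ccrypt", "-e", "-k", keyfile.name, name], check=True)
finally:
    os.unlink(keyfile.name)   # don't leave the passphrase on disk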
You could also use the yes command in your bash code. This command repeatedly provides a fixed input line to a program (i.e. ccrypt here) whenever it asks for one.

Send parameters to python from bash

I have a bash script that calls a python script with parameters.
In the bash script, I'm reading a file that contains one row of parameters, each wrapped in double quotes, and then calling the Python script with the line I read.
My problem is that the Python script receives the parameters split on spaces.
The line looks like this: "param_a" "Param B" "Param C"
Code Example:
Bash Script:
LINE=`cat $tmp_file`
id=`python /full_path/script.py $LINE`
Python Script:
print sys.argv[1]
print sys.argv[2]
print sys.argv[3]
Received output:
"param_a"
"Param
B"
Wanted output:
param_a
Param B
Param C
How can I send the parameters to the Python script the way I need?
Thanks!
What about
id=`python /full_path/script.py $tmp_file`
and
import sys
for line in open(sys.argv[1]):
    print(line)
?
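A related stdlib option, just a sketch and not part of the answer above: since the line uses shell-style double quotes, Python's shlex module can split it while respecting the quotes:
import shlex
import sys

# read the single line of quoted parameters from the file given as argv[1]
with open(sys.argv[1]) as f:
    line = f.read()

# shlex.split honors the double quotes, so "Param B" stays one element
params = shlex.split(line)
print(params)   # ['param_a', 'Param B', 'Param C']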
The issue is in how bash passes the arguments. Python has nothing to do with it.
So you have to solve all of this before handing it to Python. I decided to use awk and xargs for this (but xargs is the actual MVP here).
LINE=$(cat $tmp_file)
awk -v ORS="\0" -v FPAT='"[^"]+"' '{for (i=1;i<=NF;i++){print substr($i,2,length($i)-2)}}' <<<$LINE |
xargs -0 python ./script.py
First, $(..) is preferred over backticks because it is more readable. You are making a variable, after all.
awk only reads from stdin or a file, but you can force it to read from a variable with <<<, also called a "here string".
With awk I loop over all fields (as defined by the regex in the FPAT variable) and print them without the surrounding quotes.
The output record separator I chose is the NULL character (-v ORS='\0'); xargs will split on this character.
xargs will now parse the piped input by separating the arguments on NULL characters (set with -0) and execute the given command with the parsed arguments.
Note: while awk is found on most UNIX systems, I make use of FPAT, which is a GNU awk extension, so it might not be available by default (for example on Ubuntu); however, GNU awk is usually just an install gawk away.
Also, the next command would be a quick and easy solution, but it is generally considered unsafe, since eval will execute everything it receives.
eval "python ./script "$LINE
This can be done using bash arrays:
tmp_file='gash.txt'
# Set IFS to " which splits on double quotes and removes them
# Using read is preferable to using the external program cat
# read -a reads into the array called "line"
# UPPERCASE variable names are discouraged because of collisions with bash variables
IFS=\" read -ra line < "$tmp_file"
# That leaves blank and space elements in "line",
# we create a new array called "params" without those elements
declare -a params
for ((i=0; i < ${#line[@]}; i++))
do
    p="${line[i]}"
    if [[ -n "$p" && "$p" != " " ]]
    then
        params+=("$p")
    fi
done
# `backticks` are frowned upon because of poor readability
# I've called the python script "gash.py"
id=$(python ./gash.py "${params[@]}")
echo "$id"
gash.py:
import sys
print "1",sys.argv[1]
print "2",sys.argv[2]
print "3",sys.argv[3]
Gives:
1 param_a
2 Param B
3 Param C

Chain of UNIX commands within Python

I'd like to execute the following UNIX command in Python:
cd 2017-02-10; pwd; echo missing > 123.txt
The date directory DATE = 2017-02-10 and OUT = 123.txt are already variables in Python so I have tried variations of
call("cd", DATE, "; pwd; echo missing > ", OUT)
using the subprocess.call function, but I'm struggling to find documentation on running multiple UNIX commands at once, which are normally separated by ; or chained with redirections like >.
Doing the commands on separate lines in Python doesn't work either, because it "forgets" what was executed on the previous line and essentially resets.
You can pass a shell script as a single argument, with strings to be substituted as out-of-band arguments, as follows:
import subprocess

date = '2017-02-10'
out = '123.txt'
subprocess.call(
    ['cd "$1"; pwd; echo missing >"$2"',  # shell script to run
     '_',   # $0 for that script
     date,  # $1 for that script
     out],  # $2 for that script
    shell=True)
This is much more secure than substituting your date and out values into a string which is evaluated by the shell as code, because these values are treated as literals: A date of $(rm -rf ~) will not in fact try to delete your home directory. :)
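For comparison, a sketch of the substitution approach made safe with the standard library's shlex.quote (the values are the ones from the question):
import shlex
import subprocess

date = '2017-02-10'
out = '123.txt'

# each value is quoted so the shell sees it as a single literal word,
# even if it contains spaces, semicolons or $(...)
cmd = "cd {}; pwd; echo missing > {}".format(shlex.quote(date), shlex.quote(out))
subprocess.call(cmd, shell=True)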
Doing the commands on separate lines in Python doesn’t work either
because it “forgets” what was executed on the previous line and
essentially resets.
This is because if you have separate calls to subprocess.call it will run each command in its own shell, and the cd call has no effect on the later shells.
One way around that would be to change the directory in the Python script itself before doing the rest. Whether or not this is a good idea depends on what the rest of the script does. Do you really need to change directory? Why not just write "missing" to 2017-02-10/123.txt from Python directly? Why do you need the pwd call?
Assuming you're looping through a list of directories and want to output the full path of each and also create files with "missing" in them, you could perhaps do this instead:
import os
base = "/path/to/parent"
for DATE, OUT in [["2017-02-10", "123.txt"], ["2017-02-11", "456.txt"]]:
    date_dir = os.path.join(base, DATE)
    print(date_dir)
    out_path = os.path.join(date_dir, OUT)
    out = open(out_path, "w")
    out.write("missing\n")
    out.flush()
    out.close()
The above could use some error handling in case you don't have permission to write to the file or the directory doesn't exist, but your shell commands don't have any error handling either.
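A sketch of the same loop with that error handling added (paths unchanged from the snippet above):
import os

base = "/path/to/parent"
for DATE, OUT in [["2017-02-10", "123.txt"], ["2017-02-11", "456.txt"]]:
    date_dir = os.path.join(base, DATE)
    out_path = os.path.join(date_dir, OUT)
    try:
        with open(out_path, "w") as out:
            out.write("missing\n")
        print(date_dir)
    except OSError as e:   # missing directory, bad permissions, ...
        print("skipping {}: {}".format(date_dir, e))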
>>> date = "2017-02-10"
>>> command = "cd " + date + "; pwd; echo missing > 123.txt"
>>> import os
>>> os.system(command)

Inserting python code in a bash script

I've got the following bash script:
#!/bin/bash
while read line
do
    ORD=`echo $line | cut -c 7-21`
    if [[ -r ../FASTA_SEC/${ORD}.fa ]]
    then
        WCR=`fgrep -o N ../FASTA_SEC/$ORD.fa | wc -l`
        WCT=`wc -m < ../FASTA_SEC/$ORD.fa`
        PER1=`echo print $WCR/$WCT.*100 | python`
        WCTRIN=`fgrep -o N ../FASTA_SEC_EDITED/$ORD"_Trimmed.fa" | wc -l`
        WCTRI=`wc -m < ../FASTA_SEC_EDITED/$ORD"_Trimmed.fa"`
        PER2=`echo print $WCTRIN/$WCTRI.*100 | python`
        PER3=`echo print $PER1-$PER2 | python`
        echo $ORD $PER1 $PER2 $PER3 >> Log.txt
        if [ $PER2 -ge 30 -a $PER3 -lt 10 ]
        then
            mv ../FASTA_SEC/$ORD.fa ./TRASH/$ORD.fa
            mv ../FASTA_SEC_EDITED/$ORD"_Trimmed.fa" ./TRASH/$ORD"_Trimmed.fa"
        fi
    fi
done < ../READ/Data.txt
The $PER variables are floating-point numbers, as you might have noticed, so I cannot use them normally in the nested if conditional. I'd like to do this conditional in Python, but I have no clue how to do that within a bash script, and I don't know how to import the values of the variables $PER2 and $PER3 into Python. Could I write Python code directly in the same bash script, invoking Python somehow?
Thank you for your help, first time facing this.
You can use python -c CMD to execute a piece of python code from the command line. If you want bash to interpolate your environment variables, you should use double quotes around CMD.
You can return a value by calling sys.exit, but keep in mind that true and false in Python have the reverse meaning in bash.
So your code would be:
if python -c "import sys; sys.exit(not ($PER2 >= 30 and $PER3 < 10))"
It is possible to feed Python code to the standard input of the python executable with the help of here-document syntax:
variable=$(date)
python2.7 <<SCRIPT
print "The current date: %s" % "${variable}"
SCRIPT
In order to avoid parameter substitution (interpretation within the block), quote the first limit string: <<'SCRIPT'.
If you want to assign the output to a variable, use command substitution:
output=$(python2.7 <<SCRIPT
print "The current date: %s" % "${variable}"
SCRIPT
)
Note, it is not recommended to use back quotes for command substitution, as it is impossible to nest them, and the form $(...) is more readable.
Maybe this helps?
$ X=4; Y=7; Z=$(python -c "print($X * $Y)")
$ echo $Z
28
python -c "str" takes "str" as input and runs it.
But then, why not rewrite it all in Python? Bash commands can be executed nicely with subprocess, which is included with Python, or with sh (which needs to be installed).
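For instance, a minimal sketch of running one of the shell pipelines from the question via subprocess (the file name is illustrative):
import subprocess

# run a shell pipeline and capture its output; capture_output
# requires Python 3.7+
result = subprocess.run("fgrep -o N ../FASTA_SEC/example.fa | wc -l",
                        shell=True, capture_output=True, text=True)
print(result.stdout.strip())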
