Appending 100 random numbers to a file in Bash - python

I am trying to make a Python program based on the 20w14infinite Minecraft snapshot. The world generation was going to be done in Python 3 using os.system(), but the lines were very long, so I wrote an SH script to make the worlds for me. It should append a random number between 0 and 32767, the signed 16-bit limit, to the end of a file.
Here's my code:
Python:
# imports
import random
import os
# variables
game_name = "testing-world"
# functions
def mk_world():
    os.system(f"./mk_world.sh {game_name}")
mk_world()
Bash (mk_world.sh):
#!/bin/bash
game_name=$1
cd ./games/$game_name/worlds/
seed=$RANDOM
mkdir $seed
cd $seed
touch world.dimension
echo $RANDOM
ls
for i in {1..100}; do
    echo $RANDOM > world.dimension
done
cat world.dimension

for i in {1..100}; do
    echo $RANDOM > world.dimension
done
This part will execute "echo $RANDOM > world.dimension" 100 times, and the ">" redirection means that world.dimension will be overwritten each time, so you should use ">>" to append to the file.
Possibly you want to do just the following:
echo $RANDOM >> world.dimension
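For completeness, the corrected loop in Bash would look something like this (a minimal sketch; the `: >` truncation is an addition so that each run leaves exactly 100 lines):

```shell
#!/bin/bash
# Start from an empty file so each run leaves exactly 100 numbers
: > world.dimension
for i in {1..100}; do
    echo $RANDOM >> world.dimension   # >> appends instead of overwriting
done
wc -l < world.dimension               # prints 100
```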

A possible Python solution would be something like:
import random
# Open world.dimension in append mode
with open("world.dimension", "a") as world_dimension:
    # 100 times
    for i in range(100):
        # Write a random integer between 0 and 32767 to the file,
        # with a trailing newline character to split the lines
        world_dimension.write("{}\n".format(random.randint(0, 32767)))

Related

Send parameters to python from bash

I have a bash script that calls a python script with parameters.
In the bash script, I'm reading a file that contains one row of parameters delimited by double quotes, and then calling the python script with the line I read.
My problem is that python receives the parameters split on spaces.
The line looks like this: "param_a" "Param B" "Param C"
Code Example:
Bash Script:
LINE=`cat $tmp_file`
id=`python /full_path/script.py $LINE`
Python Script:
print sys.argv[1]
print sys.argv[2]
print sys.argv[3]
Received output:
"param_a"
"Param
B"
Wanted output:
param_a
Param B
Param C
How can I send the parameters to the Python script the way I need?
Thanks!
What about
id=`python /full_path/script.py $tmp_file`
and
import sys
for line in open(sys.argv[1]):
    print(line)
?
The issue is in how bash passes the arguments. Python has nothing do to with it.
So you have to solve all this before sending it to Python. I decided to use awk and xargs for this (but xargs is the actual MVP here).
LINE=$(cat $tmp_file)
awk -v ORS="\0" -v FPAT='"[^"]+"' '{for (i=1;i<=NF;i++){print substr($i,2,length($i)-2)}}' <<<$LINE |
xargs -0 python ./script.py
First, $(..) is preferred over backticks because it is more readable. You are making a variable, after all.
awk only reads from stdin or a file, but you can force it to read from a variable with the <<<, also called "here string".
With awk I loop over all fields (as defined by the regex in the FPAT variable), and print them without the "".
The output record separator I chose is the NULL character (-v ORS='\0'); xargs will split on this character.
xargs will now parse the piped input by separating the arguments on NULL characters (set with -0) and execute the command given with the parsed arguments.
Note: while awk is found on most UNIX systems, I make use of FPAT, which is a GNU awk extension, so you might not have GNU awk as the default (on Ubuntu, for example), but GNU awk is usually just an install of gawk away.
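To see why the NULL separator matters, here is a small self-contained demo (using printf instead of the awk pipeline, so it runs without gawk): an argument containing a space survives intact when xargs splits on \0:

```shell
#!/bin/bash
# Emit two NUL-terminated "arguments", one containing a space,
# and let xargs -0 hand them to a command as separate arguments.
printf '%s\0' "param_a" "Param B" |
    xargs -0 printf '<%s>\n'
# <param_a>
# <Param B>
```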
Also, the next command would be a quick and easy solution, but it is generally considered unsafe, since eval will execute everything it receives.
eval "python ./script.py $LINE"
This can be done using bash arrays:
tmp_file='gash.txt'
# Set IFS to " which splits on double quotes and removes them
# Using read is preferable to using the external program cat
# read -a reads into the array called "line"
# UPPERCASE variable names are discouraged because of collisions with bash variables
IFS=\" read -ra line < "$tmp_file"
# That leaves blank and space elements in "line",
# we create a new array called "params" without those elements
declare -a params
for ((i=0; i < ${#line[@]}; i++))
do
    p="${line[i]}"
    if [[ -n "$p" && "$p" != " " ]]
    then
        params+=("$p")
    fi
done
# `backticks` are frowned upon because of poor readability
# I've called the python script "gash.py"
id=$(python ./gash.py "${params[@]}")
echo "$id"
gash.py:
import sys
print "1",sys.argv[1]
print "2",sys.argv[2]
print "3",sys.argv[3]
Gives:
1 param_a
2 Param B
3 Param C

How to pass parameters to a python script that will be run using a bash script?

I have a python script which actually does some test and write the results to a file. This python file must be provided with list of arguments:
myPc#myPc:~/Desktop/tests$ ./testScript.py --final fromTo 201983:310029 -o res.txt
Every time I need new result for different ranges I have to do them manually so I decided to write a bash script as follows:
#!/bin/bash
a=`wc -l < inFile.txt`
declare -a periods
declare -i index=0
echo "Number of Periods is: $a"
while IFS='' read -r line || [[ -n "$line" ]]; do
    periods[$index]=$line
    ((index += 1))
done < "inFile.txt"
for ((i=0;i<1300;i++)); do
    f1=${period[$i]}
    f2=${period[$i+1]}
    python testScript.py "--final fromTo $f1:$f2 -o res$i.txt"
done
#END
The ranges are already in a text file so I am reading them and store them into an array.
I wish to use bash to re-run the python script each time with different ranges until all ranges are covered and the results for all ranges are written to separate files. But I am facing an issue: the parameters I try to pass to python are not even passed. I don't really understand what the issue is here.
"--final fromTo $f1:$f2 -o res$i.txt"
This is the parameter string I want to pass; once a run finishes I will pass new params and run it again, but it seems the script is not even looking at them.
The quotes around arguments:
python testScript.py "--final fromTo $f1:$f2 -o res$i.txt"
are causing the complete string to be passed to the Python script as a single argument containing spaces.
By removing quotes (or at least putting them around words):
python testScript.py --final fromTo "$f1:$f2" -o "res$i.txt"
sys.argv will be populated correctly, and your argument parser will work (note that argparse uses sys.argv by default).
As a little proof, test.py script:
#!/usr/bin/env python
import sys
print(sys.argv)
Now try:
$ python test.py 1 2 3
['test.py', '1', '2', '3']
$ python test.py "1 2 3"
['test.py', '1 2 3']
Unrelated to this, but for completeness' sake, your bash script can be simplified by using the mapfile built-in (aka readarray, available since bash 4.0):
#!/bin/bash
mapfile -t periods <inFile.txt
cnt="${#periods[@]}"
echo "Number of Periods is: $cnt"
for ((i=0; i<cnt; i++)); do
    python testScript.py --final fromTo "${periods[i]}:${periods[i+1]}" -o "res$i.txt"
done
What you type on the commandline is six strings: "./testScript.py" "--final" "fromTo" "201983:310029" "-o" "res.txt"
$ cat args.py
#!/usr/bin/env python
import sys
print sys.argv
$ ./args.py --final fromTo 201983:310029 -o res.txt
['./args.py', '--final', 'fromTo', '201983:310029', '-o', 'res.txt']
When you put double-quotes around the last four strings, you force your shell to treat them as a single string. Not Python's fault.
./args.py "--final fromTo 201983:310029 -o res.txt"
['./args.py', '--final fromTo 201983:310029 -o res.txt']
Just put the quotes around the items that must be handled as single strings:
$ f1=201983; f2=310029; i=1
$ ./args.py --final fromTo "$f1:$f2" -o "res$i.txt"
['./args.py', '--final', 'fromTo', '201983:310029', '-o', 'res1.txt']

Inserting python code in a bash script

I've got the following bash script:
#!/bin/bash
while read line
do
    ORD=`echo $line | cut -c 7-21`
    if [[ -r ../FASTA_SEC/${ORD}.fa ]]
    then
        WCR=`fgrep -o N ../FASTA_SEC/$ORD.fa | wc -l`
        WCT=`wc -m < ../FASTA_SEC/$ORD.fa`
        PER1=`echo print $WCR/$WCT.*100 | python`
        WCTRIN=`fgrep -o N ../FASTA_SEC_EDITED/$ORD"_Trimmed.fa" | wc -l`
        WCTRI=`wc -m < ../FASTA_SEC_EDITED/$ORD"_Trimmed.fa"`
        PER2=`echo print $WCTRIN/$WCTRI.*100 | python`
        PER3=`echo print $PER1-$PER2 | python`
        echo $ORD $PER1 $PER2 $PER3 >> Log.txt
        if [ $PER2 -ge 30 -a $PER3 -lt 10 ]
        then
            mv ../FASTA_SEC/$ORD.fa ./TRASH/$ORD.fa
            mv ../FASTA_SEC_EDITED/$ORD"_Trimmed.fa" ./TRASH/$ORD"_Trimmed.fa"
        fi
    fi
done < ../READ/Data.txt
The $PER variables are floating-point numbers, as you might have noticed, so I cannot use them normally in the nested if conditional. I'd like to do this conditional check in python, but I have no clue how to do it within a bash script, and I don't know how to pass the values of the variables $PER2 and $PER3 into python. Could I write python code directly in the same bash script, invoking python somehow?
Thank you for your help, first time facing this.
You can use python -c CMD to execute a piece of python code from the command line. If you want bash to interpolate your environment variables, you should use double quotes around CMD.
You can return a value by calling sys.exit, but keep in mind that true and false in Python have the reverse meaning in bash.
So your code would be:
if python -c "import sys; sys.exit(not($PER2 > 30 and $PER3 < 10 ))"
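A complete sketch of how that exit-status test slots into an if/then/fi (the values for PER2 and PER3 are made up here, and I use python3 as the interpreter name, which is what most current systems install):

```shell
#!/bin/bash
PER2=45.2
PER3=3.7
# Python evaluates the float comparison; its exit status drives the bash if:
# exit(not True) is exit(0), which bash treats as success
if python3 -c "import sys; sys.exit(not ($PER2 > 30 and $PER3 < 10))"
then
    echo "both conditions hold"
else
    echo "conditions not met"
fi
```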
It is possible to feed Python code to the standard input of python executable with the help of here document syntax:
variable=$(date)
python2.7 <<SCRIPT
print "The current date: %s" % "${variable}"
SCRIPT
In order to avoid parameter substitution (interpretation within the block), quote the first limit string: <<'SCRIPT'.
If you want to assign the output to a variable, use command substitution:
output=$(python2.7 <<SCRIPT
print "The current date: %s" % "${variable}"
SCRIPT
)
Note, it is not recommended to use back quotes for command substitution, as it is impossible to nest them, and the form $(...) is more readable.
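A quick illustration of why $(...) nests where backticks cannot (with backticks, the inner command would need escaping):

```shell
#!/bin/bash
# The inner $(...) runs first; its output is substituted into the outer echo
msg=$(echo "shouting: $(echo hello | tr a-z A-Z)")
echo "$msg"    # shouting: HELLO
```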
maybe this helps?
$ X=4; Y=7; Z=$(python -c "print($X * $Y)")
$ echo $Z
28
python -c "str" takes "str" as input and runs it.
But then, why not rewrite it all in python? bash commands can be executed nicely with subprocess, which is included in python, or with sh (which you need to install).

Need for Performance in bash script

I have 50000 files and each one has 10000 lines. Each line is in the form:
value_1 (TAB) value_2 (TAB) ... value_n
I wanted to remove specific values from every line in every file (I used cut to remove values 14-17) and write the results to a new file.
For doing that in one file, i wrote this code:
file=nameOfFile
newfile=$file".new"
i=0
while read line
do
    let i=i+1
    echo line: $i
    a=$i"p"
    lineFirstPart=$(sed -n -e $a $file | cut -f 1-13)
    #echo lineFirstPart: $lineFirstPart
    lineSecondPart=$(sed -n -e $a $file | cut -f 18-)
    #echo lineSecondPart: $lineSecondPart
    newline=$lineFirstPart$lineSecondPart
    echo $newline >> $newfile
done < $file
This takes ~45 secs for one file, which means for all it will take about: 45x50000 = 625h ~= 26 days!
Well, I need something faster, e.g. a solution that reads the whole file once and applies the two cut commands simultaneously, or something like that, I guess.
Solutions in python are also accepted and appreciated, but bash scripting is preferable!
The entire while loop can be replaced with one line:
cut -f1-13,18- $file > $newfile
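To sanity-check the field ranges on a toy line (20 tab-separated fields whose values are their own field numbers):

```shell
#!/bin/bash
# Build one tab-separated line with fields 1..20, then drop fields 14-17
seq 20 | paste -s - | cut -f1-13,18-
# Fields 14-17 are gone; 1-13 and 18-20 remain
```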

Zip function for shell scripts

I'm trying to write a shell script that will make several targets into several different paths. I'll pass in a space-separated list of paths and a space-separated list of targets, and the script will make DESTDIR=$path $target for each pair of paths and targets. In Python, my script would look something like this:
for path, target in zip(paths, targets):
    exec_shell_command('make DESTDIR=' + path + ' ' + target)
However, this is my current shell script:
#! /bin/bash
packages=$1
targets=$2
target=
set_target_number () {
    number=$1
    counter=0
    for temp_target in $targets; do
        if [[ $counter -eq $number ]]; then
            target=$temp_target
        fi
        counter=`expr $counter + 1`
    done
}

package_num=0
for package in $packages; do
    package_fs="debian/tmp/$package"
    set_target_number $package_num
    echo "mkdir -p $package_fs"
    echo "make DESTDIR=$package_fs $target"
    package_num=`expr $package_num + 1`
done
Is there a Unix tool equivalent to Python's zip function or an easier way to retrieve an element from a space-separated list by its index? Thanks.
Use an array:
#!/bin/bash
packages=($1)
targets=($2)
if (( "${#packages[@]}" != "${#targets[@]}" ))
then
    echo 'Number of packages and number of targets differ' >&2
    exit 1
fi
for index in "${!packages[@]}"
do
    package="${packages[$index]}"
    target="${targets[$index]}"
    package_fs="debian/tmp/$package"
    mkdir -p "$package_fs"
    make "DESTDIR=$package_fs" "$target"
done
Here is the solution
paste -d ' ' paths targets | sed 's/^/make DESTDIR=/' | sh
paste is the shell equivalent of zip. sed is used to prepend the make command (using a regex anchored at the start of each line), and the result is piped to sh for execution.
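A concrete run of the same idea, with the generated commands printed instead of piped to sh so the pairing is visible (the file names paths and targets and their contents are hypothetical):

```shell
#!/bin/bash
cd "$(mktemp -d)"
printf '%s\n' /opt/a /opt/b > paths
printf '%s\n' install clean > targets
# paste pairs line i of "paths" with line i of "targets", like Python's zip
paste -d ' ' paths targets | sed 's/^/make DESTDIR=/'
# make DESTDIR=/opt/a install
# make DESTDIR=/opt/b clean
```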
There's no way to do that in bash. You'll need to create two arrays from the input and then iterate through a counter using the values from each.