Calling nc/netcat in Python with a variable

I am attempting to call netcat/nc from inside a loop, passing the loop control variable (lcv) as the last octet in the command. I seem to be running into an issue because the variable is in the middle of the command. What I would like to do is:
for i in 1 to 50
print i
nc -nvv 192.168.1.i 21 -w 15 >> local_net.txt
So far I have tried:
os.system("nc -nvv 192.168.1.",i,"21 -w 15 >> local_net.txt")
and
subprocess.call(["nc", "-nvv 192.168.1.", i, "21 -w 15 >> local_net.txt"])
Is there an easier way to reference the lcv from within a command that executes a system command?

The reason this doesn't work:
os.system("nc -nvv 192.168.1.",i,"21 -w 15 >> local_net.txt")
… is that os.system takes a single string, as a shell command. You're giving it three separate arguments. (Plus, even if this worked as you hoped, it would have to either put a space before the i or not put one after the i.) You could fix it by any of the usual ways of combining strings:
os.system("nc -nvv 192.168.1.{} 21 -w 15 >> local_net.txt".format(i))
os.system("nc -nvv 192.168.1." + str(i) + " 21 -w 15 >> local_net.txt")
However, your intuition to try subprocess.call instead was a good one. The reason this doesn't work:
subprocess.call(["nc", "-nvv 192.168.1.", i, "21 -w 15 >> local_net.txt"])
… is two-fold.
First, you're joining up multiple arguments. This is equivalent to the shell command:
nc '-nvv 192.168.1.' '1' '21 -w 15 >> local_net.txt'
When what you want is the equivalent to these:
nc -nvv 192.168.1.1 21 -w 15 >> local_net.txt
nc '-nvv' '192.168.1.1' '21' '-w' '15' >> 'local_net.txt'
You need to split up the arguments properly to get that.
Second, >> local_net.txt isn't part of the command itself at all; it's a shell redirection directive. You can't do that without the shell.
If you really want to use the shell, you can do it the same way as with system, by putting together a single string:
subprocess.call("nc -nvv 192.168.1.{} 21 -w 15 >> local_net.txt".format(i), shell=True)
But a better way to do this is to break the arguments up properly, and do the redirection Python-style instead of shell-style:
with open('local_net.txt', 'a+b') as f:
    subprocess.call(["nc", "-nvv", "192.168.1.{}".format(i), "21", "-w", "15"],
                    stdout=f)
In the file mode, the a means to append to the end of the file, creating it if it doesn't exist; the + opens it for updating (reading as well as writing, which isn't strictly needed here); the b means to open it in binary mode (so you don't translate newlines, if this is Windows; I'm assuming Python 2.x here).
Meanwhile, the Python version of for i in 1 to 50 is for i in range(1, 51):. (Note that 51 at the end, because Python ranges are half-open.) So, putting it all together:
for i in range(1, 51):
    with open('local_net.txt', 'a+b') as f:
        subprocess.call(["nc", "-nvv", "192.168.1.{}".format(i), "21", "-w", "15"],
                        stdout=f)
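To sanity-check this pattern anywhere, here is the same loop with echo standing in for nc (the probe command is the only thing swapped out; the log file name comes from the question):

```python
import subprocess

# echo stands in for nc here; its stdout is redirected into the log file
# exactly the way nc's output would be.
with open("local_net.txt", "ab") as f:
    for i in range(1, 51):
        subprocess.call(["echo", "probing 192.168.1.{}".format(i)], stdout=f)

with open("local_net.txt") as f:
    print(sum(1 for _ in f))  # one line per probed address
```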
It's worth noting that you can do this in most sh-like shells, including bash, without needing Python at all. And for something this simple, it may be easier:
for i in {1..50}; do
    nc -nvv 192.168.1.${i} 21 -w 15 >> local_net.txt
done

Related

Manually reading multiple lines of input in terminal

I am writing Python code for a class which needs multiple lines of input.
For example I need the input to be in the format:
3 14
12 10
12 5
10 5
When entering this manually on the terminal, I do not know how to signal an end of input.
I have been working around it by entering the inputs in txt files and reading these.
On Linux, use Ctrl+D to type "end of file". On Windows, use Ctrl+Z.
Besides the control-d, in bash, you can also:
pythonscript <<EOF
3 14
12 10
12 5
10 5
EOF
On Linux and Unix you can find out what the EOF character is using
stty -a
it will show something like
...
cchars: discard = ^O; dsusp = ^Y; eof = ^D; eol = <undef>;
...
indicating the eof is ^D, which you can also change using stty.
Then, you can type ^D to signal EOF to a process that's reading its input from the terminal.
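On the Python side, a read loop simply ends when EOF arrives, so no sentinel value is needed. A minimal sketch (read_pairs is a hypothetical helper; a StringIO stands in for the terminal so the snippet is self-contained):

```python
import io

def read_pairs(stream):
    # Iterating a stream yields lines until EOF (Ctrl+D / Ctrl+Z); no sentinel needed.
    return [tuple(map(int, line.split())) for line in stream if line.strip()]

# In a real run you would pass sys.stdin instead of this stand-in.
sample = io.StringIO("3 14\n12 10\n12 5\n10 5\n")
print(read_pairs(sample))  # [(3, 14), (12, 10), (12, 5), (10, 5)]
```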

subprocess.getoutput() returns limited-length string

I use subprocess.getoutput() to get the output of a bash command running inside a Python script, in my case the Linux top command:
output = subprocess.getoutput("top -bc -n 1 | grep some_process_name")
Unfortunately, the output string of the function is limited to 80 characters; if a line is longer, I only get its first 80 characters.
Is there another way to get the long output of shell commands in full?
You could pipe the output to a text file and then read the file back. That said, the truncation usually comes from top itself, not from Python: with no terminal attached, top assumes an 80-column screen, so widening that assumed width (for example via the COLUMNS environment variable, or the -w option in newer versions of top) yields the full lines.
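A sketch of that idea, widening the width top assumes via COLUMNS (some_process_name is the question's placeholder; COLUMNS handling may vary between top versions):

```python
import os
import subprocess

# Widen the terminal width top assumes in batch mode so long command
# lines survive the -c flag instead of being cut at 80 columns.
env = dict(os.environ, COLUMNS="512")
result = subprocess.run(
    "top -bc -n 1 | grep some_process_name",
    shell=True, env=env, capture_output=True, text=True,
)
print(result.stdout)
```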

Send parameters to python from bash

I have a bash script that calls a python script with parameters.
In the bash script, I'm reading a file that contains a single row of parameters wrapped in double quotes, and then I call the Python script with the line I read.
My problem is that the Python script receives the parameters split on spaces.
The line looks like this: "param_a" "Param B" "Param C"
Code Example:
Bash Script:
LINE=`cat $tmp_file`
id=`python /full_path/script.py $LINE`
Python Script:
print sys.argv[1]
print sys.argv[2]
print sys.argv[3]
Received output:
"param_a"
"Param
B"
Wanted output:
param_a
Param B
Param C
How can I send the parameters to the Python script the way I need?
Thanks!
What about
id=`python /full_path/script.py $tmp_file`
and
import sys
for line in open(sys.argv[1]):
    print(line)
?
The issue is in how bash passes the arguments; Python has nothing to do with it.
So you have to sort all of this out before handing it to Python. I decided to use awk and xargs for this (but xargs is the actual MVP here).
LINE=$(cat $tmp_file)
awk -v ORS="\0" -v FPAT='"[^"]+"' '{for (i=1;i<=NF;i++){print substr($i,2,length($i)-2)}}' <<<$LINE |
xargs -0 python ./script.py
First $(..) is preferred over backticks, because it is more readable. You are making a variable after all.
awk only reads from stdin or a file, but you can force it to read from a variable with the <<<, also called "here string".
With awk I loop over all fields (as defined by the regex in the FPAT variable), and print them without the "".
The output record separator I chose is the NUL character (-v ORS='\0'); xargs will split the input on this character.
xargs will now parse the piped input by separating the arguments on NULL characters (set with -0) and execute the command given with the parsed arguments.
Note: while awk is found on most Unix systems, I make use of FPAT, which is a GNU awk extension, and GNU awk may not be your default awk (on Ubuntu, for example). It is usually just an install of gawk away, though.
Also, the next command would be a quick and easy solution, but it is generally considered unsafe, since eval will execute everything it receives:
eval "python /full_path/script.py $LINE"
This can be done using bash arrays:
tmp_file='gash.txt'
# Set IFS to " which splits on double quotes and removes them
# Using read is preferable to using the external program cat
# read -a reads into the array called "line"
# UPPERCASE variable names are discouraged because of collisions with bash variables
IFS=\" read -ra line < "$tmp_file"
# That leaves blank and space elements in "line",
# we create a new array called "params" without those elements
declare -a params
for ((i=0; i < ${#line[@]}; i++))
do
    p="${line[i]}"
    if [[ -n "$p" && "$p" != " " ]]
    then
        params+=("$p")
    fi
done
# `backticks` are frowned upon because of poor readability
# I've called the python script "gash.py"
id=$(python ./gash.py "${params[@]}")
echo "$id"
gash.py:
import sys
print "1",sys.argv[1]
print "2",sys.argv[2]
print "3",sys.argv[3]
Gives:
1 param_a
2 Param B
3 Param C
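Another route worth sketching: let Python itself do the quote-aware splitting with the standard-library shlex module, which understands shell-style double quotes (the sample line and script path are the question's):

```python
import shlex

# The file's single line, exactly as the question describes it.
line = '"param_a" "Param B" "Param C"'

args = shlex.split(line)   # quote-aware splitting
print(args)                # ['param_a', 'Param B', 'Param C']

# The script could then be invoked without any bash quoting gymnastics:
# subprocess.call(["python", "/full_path/script.py"] + args)
```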

Python plumbum: Passing $ in an cmd argument

I am trying to execute the following command in python using plumbum:
sort -u -f -t$'\t' -k1,1 file1 > file2
However, I am having issues passing the -t$'\t' argument. Here is my code:
from plumbum.cmd import sort
separator = r"-t$'\t'"
print separator
cmd = (sort["-u", "-f", separator, "-k1,1", "file1"]) > "file2"
print cmd
print cmd()
I can see problems right away after print separator and print cmd() executes:
-t$'\t'
/usr/bin/sort -u -f "-t\$'\\t'" -k1,1 file1 > file2
The argument is wrapped in double quotes.
An extra \ before $ and \t is inserted.
How should I pass this argument to plumbum?
You have stumbled into a limitation of shell escaping: $'\t' is ANSI-C quoting, which the shell expands into a literal tab character before the program ever runs. plumbum does not pass your arguments through a shell, so the string -t$'\t' reaches sort verbatim. Pass a real tab character instead.
I could make it work using the subprocess module, passing a literal tab character:
import subprocess

with open("file2", "w") as out:
    p = subprocess.Popen(["sort", "-u", "-f", "-t\t", "-k1,1", "file1"], stdout=out)
    p.wait()
Also, full python short solution that does what you want:
with open("file1") as fr, open("file2", "w") as fw:
    fw.writelines(sorted(set(fr), key=lambda x: x.split("\t")[0]))
The full-Python solution doesn't behave exactly like sort when it comes to uniqueness: if two lines have the same first field but different second fields, sort keeps only one of them, whereas the set keeps both.
EDIT: unchecked at first, but you have confirmed that it works: just tweak your plumbum code with:
separator = "-t\t"
Of the three options, I'd recommend the full-Python solution, since it doesn't involve an external process and is therefore more pythonic and portable.
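To see why the literal tab works, compare what the shell does with $'\t' against what a no-shell invocation passes through. A quick sketch (assumes bash is available):

```python
import subprocess

# The shell expands the ANSI-C quoted $'\t' into a real tab before the
# program ever runs; plumbum skips the shell, so no such expansion happens,
# which is why you must supply the tab character yourself.
shell_view = subprocess.run(
    ["bash", "-c", "printf '%s' $'\\t'"],
    capture_output=True, text=True,
).stdout
print(shell_view == "\t")  # True: the program received a literal tab
```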

Inserting python code in a bash script

I've got the following bash script:
#!/bin/bash
while read line
do
    ORD=`echo $line | cut -c 7-21`
    if [[ -r ../FASTA_SEC/${ORD}.fa ]]
    then
        WCR=`fgrep -o N ../FASTA_SEC/$ORD.fa | wc -l`
        WCT=`wc -m < ../FASTA_SEC/$ORD.fa`
        PER1=`echo print $WCR/$WCT.*100 | python`
        WCTRIN=`fgrep -o N ../FASTA_SEC_EDITED/$ORD"_Trimmed.fa" | wc -l`
        WCTRI=`wc -m < ../FASTA_SEC_EDITED/$ORD"_Trimmed.fa"`
        PER2=`echo print $WCTRIN/$WCTRI.*100 | python`
        PER3=`echo print $PER1-$PER2 | python`
        echo $ORD $PER1 $PER2 $PER3 >> Log.txt
        if [ $PER2 -ge 30 -a $PER3 -lt 10 ]
        then
            mv ../FASTA_SEC/$ORD.fa ./TRASH/$ORD.fa
            mv ../FASTA_SEC_EDITED/$ORD"_Trimmed.fa" ./TRASH/$ORD"_Trimmed.fa"
        fi
    fi
done < ../READ/Data.txt
The $PER variables are floating-point numbers, as you might have noticed, so I cannot use them normally in the nested if conditional. I'd like to do this conditional in Python, but I have no clue how to do that within a bash script, and I don't know how to get the values of $PER2 and $PER3 into Python. Could I write Python code directly in the same bash script, invoking Python somehow?
Thank you for your help, first time facing this.
You can use python -c CMD to execute a piece of python code from the command line. If you want bash to interpolate your environment variables, you should use double quotes around CMD.
You can return a truth value by calling sys.exit, but keep in mind that the exit-status convention is the reverse of Python's: in bash, 0 means true (success) and non-zero means false.
So your code would be:
if python -c "import sys; sys.exit(not ($PER2 >= 30 and $PER3 < 10))"
then
    mv ../FASTA_SEC/$ORD.fa ./TRASH/$ORD.fa
    mv ../FASTA_SEC_EDITED/$ORD"_Trimmed.fa" ./TRASH/$ORD"_Trimmed.fa"
fi
It is possible to feed Python code to the standard input of the python executable with the help of here-document syntax:
variable=$(date)
python2.7 <<SCRIPT
print "The current date: %s" % "${variable}"
SCRIPT
In order to avoid parameter substitution (interpretation within the block), quote the first limit string: <<'SCRIPT'.
If you want to assign the output to a variable, use command substitution:
output=$(python2.7 <<SCRIPT
print "The current date: %s" % "${variable}"
SCRIPT
)
Note, it is not recommended to use back quotes for command substitution, as it is impossible to nest them, and the form $(...) is more readable.
maybe this helps?
$ X=4; Y=7; Z=$(python -c "print($X * $Y)")
$ echo $Z
28
python -c "str" takes "str" as input and runs it.
But then, why not rewrite it all in Python? Bash commands can be executed nicely with subprocess, which is included with Python, or with the third-party sh module (which you would need to install).
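Applied to the question's threshold test, the python -c trick might look like this (the variable values are illustrative; python3 is used here because the bare python name may be absent on newer systems):

```shell
PER2=35.2
PER3=4.7
# Exit status 0 means "true" to bash, so success takes the then-branch.
if python3 -c "import sys; sys.exit(0 if $PER2 >= 30 and $PER3 < 10 else 1)"
then
    echo "thresholds met, would move the files"
fi
```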
