Python: Read data from STDIN, unless it's not provided - python

I have a python program which reads from STDIN:
#!/usr/bin/env python

def my_func(data):
    print(data)

if __name__ == '__main__':
    import sys
    data = sys.stdin.read()
    my_func(data)
I see the expected results when I execute this with:
cat file.txt | ./app.py
I want to add some other functionality to the program:
if __name__ == '__main__':
    import sys
    data = sys.stdin.read()
    if data:
        my_func(data)
    else:
        print('I am some other functionality')
However when I execute this with:
./app.py
... the program just hangs, as if it is waiting for STDIN input.
What's the correct way to write this, so it will handle both methods of execution?
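(A minimal sketch of one common workaround, not taken from the answers below: sys.stdin.read() blocks until EOF, so when nothing is piped in it waits forever; checking sys.stdin.isatty() before reading avoids the blocking call.)
import sys

def my_func(data):
    print(data)

if __name__ == '__main__':
    if sys.stdin.isatty():
        # stdin is an interactive terminal: nothing was piped in
        print('I am some other functionality')
    else:
        my_func(sys.stdin.read())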

As @Klaus D. suggested, using different command line arguments is the most straightforward option.
sys.argv[i] returns the argument at index i used when launching the script. Index 0 is always the script name while any other index represents subsequent arguments passed to the script. With that in mind:
if __name__ == '__main__':
    import sys
    objective = sys.argv[1]
    if objective == 'read':
        data = sys.stdin.read()
        if data:
            my_func(data)
    else:
        print('I am some other functionality')

Many Unix-style programs look for a dash - to indicate input from stdin. So,
import sys

if sys.argv[1] == '-':
    read_from_stdin()
else:
    other_action()
$ ./app.py - < /path/to/input
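If you later switch to argparse, note that argparse.FileType already follows this dash convention: passing - yields sys.stdin. A small sketch:
import sys
import argparse

parser = argparse.ArgumentParser()
# argparse.FileType('r') maps '-' to sys.stdin automatically
parser.add_argument('infile', nargs='?', type=argparse.FileType('r'), default=sys.stdin)
args = parser.parse_args()
print(args.infile.read())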


How to print the first N lines of a file in python with N as argument

How would I go about getting the first N lines of a text file in Python, with N given as an argument?
usage:
python file.py datafile -N 10
My code
import sys
from itertools import islice

args = sys.argv
print(args)
if args[1] == '-h':
    print("-N for printing the number of lines: python file.py datafile -N 10")
if args[-2] == '-N':
    datafile = args[1]
    number = int(args[-1])
    with open(datafile) as myfile:
        head = list(islice(myfile, number))
        head = [item.strip() for item in head]
        print(head)
        print('\n'.join(head))
I wrote this program; can you let me know of something better than this code?
Assuming that the print_head logic you've implemented need not be altered, here's the script I think you're looking for:
import sys
from itertools import islice

def print_head(file, n):
    if not file or not n:
        return
    with open(file) as myfile:
        head = [item.strip() for item in islice(myfile, n)]
        print(head)

def parse_args():
    result = {'script': sys.argv[0]}
    args = iter(sys.argv)
    for arg in args:
        if arg == '-F':
            result['filename'] = next(args)
        if arg == '-N':
            result['num_lines'] = int(next(args))
    return result

if __name__ == '__main__':
    script_args = parse_args()
    print_head(script_args.get('filename', ''), script_args.get('num_lines', 0))
Running the script
python file.py -F datafile -N 10
Note: The best way to implement it would be to use the argparse module.
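A minimal argparse variant of the same interface could look like this (the flag names mirror -F/-N above; treat it as a sketch rather than a drop-in replacement):
import argparse
from itertools import islice

parser = argparse.ArgumentParser(description='Print the first N lines of a file.')
parser.add_argument('-F', dest='filename', required=True, help='file to read')
parser.add_argument('-N', dest='num_lines', type=int, default=10, help='number of lines to print')
args = parser.parse_args()

with open(args.filename) as myfile:
    print([line.strip() for line in islice(myfile, args.num_lines)])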
You can access arguments passed to the script through sys:
sys.argv
The list of command line arguments passed to a Python script. argv[0] is the script name (it is operating system dependent whether this is a full pathname or not). If the command was executed using the -c command line option to the interpreter, argv[0] is set to the string '-c'. If no script name was passed to the Python interpreter, argv[0] is the empty string.
So in code it would look like this:
import sys
print("All of argv")
print(sys.argv)
print("Last element every time")
print(sys.argv[-1])
Reading the documentation you'll see that the first values stored in sys.argv vary according to how the user calls the script. If you run the code I pasted with different types of calls, you can see for yourself the kind of values stored.
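For example, a call like the following would print something along these lines (argv[0] may be a full path depending on the OS):
$ python file.py datafile 10
All of argv
['file.py', 'datafile', '10']
Last element every time
10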
For a basic first approach: access n through sys.argv[-1], which returns the last element every time, assuming n is always passed last. You still have to do a try and beg for forgiveness to make sure the argument passed is a number. For that you would have:
import sys

try:
    n = int(sys.argv[-1])
except ValueError as v_e:
    print(f"Please pass a valid number as argument, not {sys.argv[-1]}")
That's pretty much it. Obviously, it's quite basic; you can improve this even more by having users pass values with flags, like --skip-lines 10, and that would be your n, and it could be in any position when executing the script. I'd create a function in charge of translating sys.argv into a key/value dictionary for easy access within the script, as sketched below.
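A sketch of such a helper, assuming every option comes as a --flag value pair (the names here are illustrative):
import sys

def argv_to_dict(argv):
    result = {'script': argv[0]}
    rest = iter(argv[1:])
    for arg in rest:
        if arg.startswith('--'):
            # pair each flag with the value that follows it
            result[arg.lstrip('-')] = next(rest, None)
    return result

options = argv_to_dict(sys.argv)
n = int(options.get('skip-lines', 10))  # e.g. --skip-lines 10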
Arguments are available via the sys module.
Example 1: ./file.py datafile 10
#!/usr/bin/env python3
import sys

filename = sys.argv[1]
N = int(sys.argv[2])
with open(filename) as myfile:
    head = myfile.readlines()[0:N]
print(head)
Example 2: ./file.py datafile --N 10
If you want to pass multiple optional arguments you should have a look at the argparse package.
#!/usr/bin/env python3
import argparse

parser = argparse.ArgumentParser(description='Read head of file.')
parser.add_argument('file', help='Textfile to read')
parser.add_argument('--N', type=int, default=10, help='Number of lines to read')
args = parser.parse_args()

with open(args.file) as myfile:
    head = myfile.readlines()[0:args.N]
print(head)

Python's sh module - is it at all possible for a script to request input?

Using Python's sh, I am running a third-party shell script that requests my input (not that it matters much, but to be precise, I'm running an Ansible 2 playbook with the --step option).
As an oversimplification of what is happening, I built a simple bash script that requests an input. I believe that if I make this simple example work, I can make the original case work too.
So please consider this bash script hello.sh:
#!/bin/bash
echo "Please input your name and press Enter:"
read name
echo "Hello $name"
I can run it from Python using the sh module, but it fails to receive my input...
import errno
import sh

cmd = sh.Command('./hello.sh')
for line in cmd(_iter=True, _iter_noblock=True):
    if line == errno.EWOULDBLOCK:
        pass
    else:
        print(line)
How could I make this work?
After following this tutorial, this works for my use case:
#!/usr/bin/env python3
import errno
import sh
import sys

def sh_interact(char, stdin):
    global aggregated
    sys.stdout.write(char)
    sys.stdout.flush()
    aggregated += char
    if aggregated.endswith(":"):
        val = input()
        stdin.put(val + "\n")

cmd = sh.Command('./hello.sh')
aggregated = ""
cmd(_out=sh_interact, _out_bufsize=0)
For example, the output is:
$ ./testinput.py
Please input your name and press Enter:arod
Hello arod
There are two ways to solve this:
Using _in:
Using _in, we can pass a list which will be taken as input by the shell script:
cmd = sh.Command('./read.sh')
stdin = ['hello']
for line in cmd(_iter=True, _iter_noblock=True, _in=stdin):
    if line == errno.EWOULDBLOCK:
        pass
    else:
        print(line)
Using command line args, if you are willing to modify the script (a sketch follows).
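For instance, assuming hello.sh were changed to read the name from its first positional parameter ("$1") instead of calling read, sh could pass the value directly as an argument:
import sh

cmd = sh.Command('./hello.sh')
print(cmd('arod'))  # hello.sh would use "$1" instead of `read name`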

Python "template" module for both command line scripts and imported module

I would like to write a python "template" module in order to give all my scripts the same behavior.
The behavior is the following:
if the script runs on the command line, it accepts arguments handled with argparse. These arguments basically:
take as input a JSON from stdin, from a file, or from a string argument;
give as output a JSON on stdout or to a file.
if the script is imported as a module, it has classes/functions managing the following situations:
take as input an object from the caller;
give as output an object that the caller can use.
What I have done:
The "template" part template.py
It behaves from the command line exactly as I want, thanks to these suggestions:
Python argparse mutually exclusive with stdin being one of the options
#!/usr/bin/env python3
# -*- coding: utf-8 -*-

import json, sys, argparse, os

def main():
    parser = argparse.ArgumentParser(description='Template for python script managing JSON as input/output format. \
A JSON file can be [], {}, "string", 123, true, false, null.')
    infile = ['-i', '--input-file']
    kwinfile = {'type': argparse.FileType('r'), 'help': 'Input file name containing a valid JSON. Default and priority: standard input.'}
    jstring = ['-j', '--json']
    kwjstring = {'type': str, 'nargs': '?', 'help': 'Input file name containing a valid JSON. Default and priority: standard input.'}
    outfile = ['-o', '--output-file']
    kwoutfile = {'type': argparse.FileType('w'), 'help': 'Output file name. Default: standard output.', 'default': sys.stdout}
    pretty = ['-p', '--pretty']
    kwpretty = {'action': 'store_true', 'help': 'If set, JSON output will be formatted in pretty print.'}
    group = parser.add_mutually_exclusive_group()
    group.add_argument(*infile, **kwinfile)
    group.add_argument(*jstring, **kwjstring)
    parser.add_argument(*outfile, **kwoutfile)
    parser.add_argument(*pretty, **kwpretty)
    args = parser.parse_args()
    return args

def input(*data):
    args = main()
    # if data:
    #     datain = data[0]
    # else:
    if not sys.stdin.isatty():  # pipe
        data = sys.stdin.read()
    else:  # no pipe
        if not len(sys.argv) > 1 or (args.input_file is None and args.json is None):  # no arguments or no input
            data = 'null'
        else:
            data = args.json or args.input_file.read()
    try:
        datain = json.loads(data)
    except:
        output({'script_name': sys.argv[0],
                'error': 'Input is not a valid JSON.',
                'data': data})
        sys.exit(0)
    return datain

def output(*datain):
    args = main()
    if datain:
        datain = datain[0]
    indent = 2 if args.pretty else None
    dataout = json.dumps(datain, indent=indent, ensure_ascii=False)
    args.output_file.write(dataout + os.linesep)
    return dataout

if __name__ == "__main__":
    main()
I hope this is the best way to realize it.
The example "calculate_area"
Now, if I import it in a script using
import template as t

def main():
    inp = t.input()  # {"x":8, "y":2}
    out = {'area': inp['x'] * inp['y']}
    return t.output(out)

if __name__ == "__main__":
    main()
the script acts on the command line as I want:
$ echo '{"x":8, "y":2}' | ./calculate_area.py -p
{
"area": 16
}
The "calculate_sqrt" script to test it as a module
Now I want a third script to import it as a module.
import template as t
import calculate_area as i
import numpy as np
import json

def main():
    inp = json.loads(i.main())
    out = {'sqrt of area': np.sqrt(inp['area'])}
    return t.output(out)

if __name__ == "__main__":
    main()
Here the problems start:
$ echo '{"x":8, "y":2}' | ./calculate_sqrt.py -p
{
  "area": 16
}
{
  "sqrt of area": 4.0
}
Why do I obtain both outputs instead of only the last one?
Moreover:
How do I avoid passing the input as JSON, and say instead: "if the module is called via import, then the input/output will be via objects; otherwise it will be via JSON on the command line"?
I saved my code here:
https://github.com/orsa-unige/python-templates/tree/simplified-example
Here's the outline of what is, in my opinion, a good base script:
import json, sys, argparse, os

def parser(argv=None):
    # if argv is None, parse_args uses sys.argv[1:]
    parser = argparse.ArgumentParser(....)
    ...
    args = parser.parse_args(argv)
    return args

def input(args, *data):
    # if data:
    #     datain = data[0]
    if args.input_file is not None:
        # input_file might be sys.stdin (if '-')
        data = args.input_file.read()
        # stdin should work for < redirection
        # I don't know if it works for a pipe
    ...
    return datain

def output(args, *datain):
    if datain:
        datain = datain[0]
    # output_file might be stdout
    ....
    return dataout

def main(args):
    datain = input(args, [])
    dataout = output(args, datain)
    return dataout

if __name__ == "__main__":
    args = parser()
    main(args)
This runs the parser only once if called as a script. If imported, it is up to the importing script to run the parser.
A parser may be run multiple times, but usually that isn't needed, at least not if the Namespace can be passed around. Also, each call to the parser opens the input/output files; since one file is opened in write mode, that could result in overlapping opens.
The parser might be tested with:
args = parser(['-i', 'inputfile.py', ....])
Another script could do
from template import parser, input, output

def main(args):
    ... input
    # do its own thing
    ... output
    # etc
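For example, a hypothetical rework of calculate_area against this outline (a sketch, not tested against the actual template) might look like:
from template import parser, input, output

def main(args):
    inp = input(args)                      # e.g. {"x": 8, "y": 2}
    out = {'area': inp['x'] * inp['y']}
    return output(args, out)

if __name__ == "__main__":
    main(parser())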

Passing a string as an argument to a python script

I want to pass a string of ZPL codes from one python script to another python script. The string becomes malformed when used in the second script. How can I pass a string literal as an argument to another python script without it being malformed?
Original String
^XA^FO20,20^BQ,2,3^FDQA,001D4B02107A;1001000;49681207^FS^FO50,50^ADN,36,20^FDMAC: 001D4B02107A^FS^FO50,150^ADN,36,20^FDSN: 1001000^FS^FO50,250^ADN,36,20^FDCode: 49681207^FS^XZ
Malformed string
XAFO20,20BQ,2,3FDQA,001D4B02107A;1001000;49681207FSFO50,50ADN,36,20FDMAC:
Code where I call the second script
def printLabel():
    label = "^XA"+"^FO20,20^BQ,2,3^FDQA,"+"001D4B02107A;1001000;49681207"+"^FS"+"^FO50,50"+"^ADN,36,20"+"^FD"+"MAC: "+"001D4B02107A"+"^FS"+"^FO50,150"+"^ADN,36,20"+"^FD"+"SN: "+"1001000"+"^FS"+"^FO50,250"+"^ADN,36,20"+"^FD" + "Code: "+"49681207"+"^FS"+"^XZ"
    command = "zt320print.py "+label
    print command
    sys.stdout.flush()
    exitCode = os.system(str(command))
Code that receives the argument
if __name__ == "__main__":
    zplString = str(sys.argv[1])
    print zplString
    printZPL(zplString)
If your code needs to be written just as it is (including the rather odd way of stringing together the ZPL code, and calling a separate script via a shell intermediary, and the avoidance of subprocess, for that matter), you can resolve your issue with a few small adjustments:
First, wrap your code string in double-quotes.
label= '"^XA'+"^FO20,20^BQ,2,3^FDQA,"+"001D4B02107A;1001000;49681207"+"^FS"+"^FO50,50"+"^ADN,36,20"+"^FD"+"MAC: "+"001D4B02107A"+"^FS"+"^FO50,150"+"^ADN,36,20"+"^FD"+"SN: "+"1001000"+"^FS"+"^FO50,250"+"^ADN,36,20"+"^FD" + "Code: "+"49681207"+"^FS"+'^XZ"'
Second, make sure you're actually calling python from the shell:
command = "python script2.py "+label
Finally, if you're concerned about special characters not being read in correctly from the command line, use codecs.decode with 'unicode_escape' to ensure correct transmission.
See this answer for more on unicode_escape.
# contents of second script
if __name__ == "__main__":
    from codecs import decode
    import sys
    zplString = decode(sys.argv[1], 'unicode_escape')
    print(zplString)
Now the call from your first script will transmit the code correctly:
import sys
import os

command = "python script2.py " + label  # label already wrapped in double-quotes as above
sys.stdout.flush()
exitCode = os.system(str(command))
Output:
^XA^FO20,20^BQ,2,3^FDQA,001D4B02107A;1001000;49681207^FS^FO50,50^ADN,36,20^FDMAC: 001D4B02107A^FS^FO50,150^ADN,36,20^FDSN: 1001000^FS^FO50,250^ADN,36,20^FDCode: 49681207^FS^XZ
Some demo code:
import sys

if __name__ == "__main__":
    for i, arg in enumerate(sys.argv):
        print("{}: '{}'".format(i, arg))
when called like
python test.py ^this^is^a^test
it gives
0: 'test.py'
1: 'thisisatest'
when called like
python test.py "^this^is^a^test"
it gives
0: 'test.py'
1: '^this^is^a^test'
Solution: enclose your parameter string in double quotes, i.e.
label = '"' + label + '"'
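As an alternative sketch (not what the question used): if you can switch from os.system to subprocess, quoting becomes unnecessary, because list arguments are passed to the child process without going through the shell:
import subprocess
import sys

label = "^XA...^XZ"  # the ZPL string, no extra quoting needed
# each list element lands in the child's sys.argv exactly as written
exit_code = subprocess.call([sys.executable, "zt320print.py", label])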
You can put your string inside double quotes, or just import the other python script:
a.py
import sys, os
text = "a b c d"
# or '{} {} "{}"'.format("python", "b.py", text)
command = "python b.py \"" + text + "\""
os.system(str(command))
b.py
import sys

if __name__ == "__main__":
    first_argument = str(sys.argv[1])
    print(first_argument)
Output
a b c d

How can I execute an os/shell command from python [duplicate]

This question already has answers here:
How do I execute a program or call a system command?
(65 answers)
Closed 6 years ago.
Say, in the terminal, I did cd Desktop; you know it moves you to that directory. But how do I do that in Python, using raw_input("") to pick my command (e.g. entering Desktop)?
The following code reads your command using raw_input, and executes it using os.system():
import os

if __name__ == '__main__':
    while True:
        exec_cmd = raw_input("enter your command:")
        os.system(exec_cmd)
Best Regards,
Yaron
To go with your specific example, you'd do the following:
import os

if __name__ == "__main__":
    directory = raw_input("Please enter absolute path: ")
    old_dir = os.getcwd()  # in case you need your old directory
    os.chdir(directory)
I've used this technique before in some directory maintenance functions I've written and it works. If you want to run shell commands more generally, you'd do something like:
import subprocess

if __name__ == "__main__":
    command_list = raw_input("").split(" ")
    ret = subprocess.call(command_list)
    # from here you can check ret if you need to
But be careful with this method: the code here has no knowledge of whether it's passing a valid command, so it's likely to fail with unhandled errors. A better version might look like:
import subprocess
import sys

if __name__ == "__main__":
    command_kb = {
        "cd": True,
        "ls": True
        # etc etc
    }
    command_list = raw_input("").split(" ")
    command = command_list[0]
    if command in command_kb:
        # do some stuff here to the input depending on the
        # function being called
        pass
    else:
        print "Command not supported"
        sys.exit(-1)
    ret = subprocess.call(command_list)
    # from here you can check ret if you need to
This method represents a list of supported commands. You can then manipulate the list of args as needed to verify it's a valid command. For instance, you can check if the directory you're about to cd exists and return an error to the user if not. Or you can check if the path name is valid, but only when joined by an absolute path.
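A sketch of that kind of validation, assuming the command entered is cd <path> (Python 2 style, to match the code above):
import os

command_list = raw_input("").split(" ")
if command_list[0] == "cd":
    # verify the target exists before changing directory
    if len(command_list) > 1 and os.path.isdir(command_list[1]):
        os.chdir(command_list[1])
    else:
        print "No such directory"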
Maybe you can do this:
>>> import subprocess
>>> input = raw_input("")
>>> subprocess.call(input.split())  # for detailed usage, search subprocess
For details, see the subprocess module documentation.
