I have a python file (example.py) to run, which contains 3 variables.
I have three .txt files (var1.txt, var2.txt, var3.txt), each of which contains the value of one of these 3 variables.
I want to write a batch file (bfile.bat) that
retrieves the three values from the text files,
passes these values to the variables in the python file,
and then runs it.
Any help would be appreciated!
-added-
Thank you.
The thing is, there is no particular reason for the three files.
A coworker of mine wrote this program in python (which I am new to);
I'll say this program's name is "program.py".
She also made a file to show how it works,
by setting the values of the variables used in program.py;
that file is "example.py".
This example.py contains the values of the variables which are used in program.py.
But I want to use many values for these variables,
so I wanted to make a batch file for this - which apparently is not a good idea.
So the example.py looks something like this:
SOUND = 'sound.wav'
TIMESTEP = 50
END = 500
program(SOUND, TIMESTEP, END)
and I want to change the values of SOUND, TIMESTEP, END.
please help me!
Thank you!
You should consider using input parameters in your python file, making use of sys.argv. Then use bash output pipelining to pipe your information to your python script. However, I do not recommend modifying the python file from bash by writing to it.
[Also, why is it not possible to read the files with python?]
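A minimal sketch of that pipe idea (assuming example.py is changed to read from stdin, and assuming the second and third files hold numbers):
import sys

# invoked as, e.g.:  cat var1.txt var2.txt var3.txt | python example.py
# (on Windows:       type var1.txt var2.txt var3.txt | python example.py)
values = sys.stdin.read().split()
SOUND = values[0]
TIMESTEP = int(values[1])
END = int(values[2])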
// Edit: Regarding the new information.
So when you already have a python script, this is the best thing that can happen, as you now have multiple ways of dealing with this.
First of all, a python script can import another python script, or just parts of it. So what you can do is from program import <function> or import program and then use it. Now you can write your own python script, using her function!
You can simply create a list of your parameters and values. For example, a list of tuples:
import program

# Of course you can also generate this list, depending on which
# combinations of parameters you want to run :)
# Hardcoding it here for simplicity!
myparams = [('sound1.wav', 30, 50), ('sound2.wav', 20, 100), ...]
for (p1, p2, p3) in myparams:
    program.function(p1, p2, p3)
This will use the function() out of your program.py.
You could also import the program and write your own script using it with sys.argv so you could run it from bash using command line parameters python myprogram.py p1 p2 p3.
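A minimal sketch of that wrapper (assuming, going by example.py, that program.py exposes a function called program):
# myprogram.py -- hypothetical name
import sys
import program

# usage: python myprogram.py sound.wav 50 500
sound = sys.argv[1]
timestep = int(sys.argv[2])
end = int(sys.argv[3])
program.program(sound, timestep, end)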
However, looking at the question, you seem to be working with Windows, and writing a simple python script is probably better/easier/more convenient. But that depends on your experience with the matter :)
Cheers!
It sounds like passing arguments would be an easier way to accomplish what you want. So instead of using files to hold the variables, try:
import sys

numArgs = len(sys.argv)
# You want 3 variables, and the 0th index will be '[file].py'.
if numArgs >= 4:
    SOUND = sys.argv[1]
    TIMESTEP = sys.argv[2]
    END = sys.argv[3]
else:
    print 'Not enough arguments.'
Then you can simply run:
python [file].py arg1 arg2 arg3
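And if you still want to take the values from the three text files in the question instead of typing them, you can skip the batch file and read them directly in Python; a sketch, assuming each file holds a single value:
SOUND = open('var1.txt').read().strip()
TIMESTEP = int(open('var2.txt').read().strip())
END = int(open('var3.txt').read().strip())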
Related
I was trying to automate the launching of a python code for an application. The code outputs multiple variables, and I only want to grab a single variable's content into a PowerShell variable. Can this be done? I am very new to PowerShell.
I was trying to use System.Diagnostics.Process, but seem to have hit a roadblock. Any clue as to how this can be done?
from difflib import SequenceMatcher

text = r'Text'
image = r'Path\to\image'
lang = r'lang'
read_Text = r'Text_from_image'
s = SequenceMatcher(None, read_Text, text)
r = round(s.ratio(), 3)
print(r)
So, as we can see, there are multiple variables storing values. Only the last line outputs to the stream and is grabbed when I use
$stdout = $Process.StandardOutput.ReadToEnd().
PS $HOME> $HOME\Time.ps1
65.2252356
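One way to approach this from the Python side (a sketch, not your actual script): print only the value you want on stdout and push everything else to stderr, so that ReadToEnd() on StandardOutput captures just that one value.
import sys
from difflib import SequenceMatcher

text = r'Text'
read_Text = r'Text_from_image'

s = SequenceMatcher(None, read_Text, text)
r = round(s.ratio(), 3)

sys.stderr.write('any other output goes here\n')  # not captured from StandardOutput
print(r)  # the single value PowerShell grabs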
I have a python script that successfully sends SOAP to insert a record into a system. The values are static in the test. I need to make the value a dynamic argument that is passed through the command line or another stored value.
execute: python myscript.py
<d4p1:Address>MainStreet</d4p1:Address> ....this works to add hard coded "MainStreet"
execute: python myscript.py MainStreet
...this is now trying to pass the argument MainStreet
<d4p1:Address>sys.argv[1]</d4p1:Address> ....this does not work
It saves the literal text address as "sys.argv[1]" ... I have imported sys. I have tried %, {}, etc. from web searches; what syntax am I missing?
You need to read a little about how to create strings in Python; below is how it could look in your code. Sorry, it's hard to say more without seeing your actual code. Also, you actually shouldn't create XMLs like that; you should use, for instance, the xml module from the standard library.
test = "<d4p1:Address>" + sys.argv[1] + "</d4p1:Address>"
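And if you go the xml-module route instead, a sketch (the namespace URI below is a placeholder; use the one from your real envelope):
import sys
import xml.etree.ElementTree as ET

D4P1 = 'http://example.com/d4p1'  # hypothetical namespace URI
ET.register_namespace('d4p1', D4P1)

addr = ET.Element('{%s}Address' % D4P1)
addr.text = sys.argv[1]  # the address comes from the command line
print(ET.tostring(addr).decode())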
I have a text file /etc/default/foo which contains one line:
FOO="/path/to/foo"
In my python script, I need to reference the variable FOO.
What is the simplest way to "source" the file /etc/default/foo into my python script, same as I would do in bash?
. /etc/default/foo
Same answer as #jil; however, that answer is specific to a historical version of Python.
In modern Python (3.x):
exec(open('filename').read())
replaces execfile('filename') from 2.x
You could use execfile:
execfile("/etc/default/foo")
But please be aware that this will evaluate the contents of the file as-is into your program source. It is a potential security hazard unless you can fully trust the source.
It also means that the file needs to be valid python syntax (your given example file is).
Keep in mind that if you have a "text" file with this content that has a .py as the file extension, you can always do:
import mytextfile
print(mytextfile.FOO)
Of course, this assumes that the text file is syntactically correct as far as Python is concerned. On a project I worked on we did something similar to this. Turned some text files into Python files. Wacky but maybe worth consideration.
Just to give a different approach, note that if your original file is set up as
export FOO=/path/to/foo
You can do source /etc/default/foo; python myprogram.py (or . /etc/default/foo; python myprogram.py) and within myprogram.py all the values that were exported in the sourced file are visible in os.environ, e.g.
import os
os.environ["FOO"]
If you know for certain that it only contains VAR="QUOTED STRING" style variables, like this:
FOO="some value"
Then you can just do this:
>>> with open('foo.sysconfig') as fd:
... exec(fd.read())
Which gets you:
>>> FOO
'some value'
(This is effectively the same thing as the execfile() solution
suggested in the other answer.)
This method has substantial security implications; if instead of FOO="some value" your file contained:
os.system("rm -rf /")
Then you would be In Trouble.
Alternatively, you can do this:
>>> import shlex
>>> with open('foo.sysconfig') as fd:
...     settings = {var: shlex.split(value) for var, value in [line.split('=', 1) for line in fd]}
Which gets you a dictionary settings that has:
>>> settings
{'FOO': ['some value']}
That settings = {...} line is using a dictionary comprehension. You could accomplish the same thing in a few more lines with a for loop and so forth.
And of course if the file contains shell-style variable expansion like ${somevar:-value_if_not_set} then this isn't going to work (unless you write your very own shell style variable parser).
There are a couple of ways to do this sort of thing.
You can indeed import the file as a module, as long as the data it contains corresponds to Python's syntax. But then either the file in question is a .py in the same directory as your script, or you have to use imp (or importlib, depending on your version) like here.
Another solution (that has my preference) can be to use a data format that any python library can parse (JSON comes to my mind as an example).
/etc/default/foo :
{"FOO":"path/to/foo"}
And in your python code :
import json

with open('/etc/default/foo') as file:
    data = json.load(file)

FOO = data["FOO"]
## ...
This way, you don't risk executing some uncertain code...
You have the choice, depending on what you prefer. If your data file is auto-generated by some script, it might be easier to keep a simple syntax like FOO="path/to/foo" and use imp.
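A minimal importlib sketch of that route (Python 3; because /etc/default/foo has no .py extension, the loader has to be named explicitly):
import importlib.util
from importlib.machinery import SourceFileLoader

loader = SourceFileLoader('foo_settings', '/etc/default/foo')
spec = importlib.util.spec_from_loader('foo_settings', loader)
foo_settings = importlib.util.module_from_spec(spec)
loader.exec_module(foo_settings)
print(foo_settings.FOO)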
Hope that helps!
The Solution
Here is my approach: parse the bash file myself and process only variable assignment lines such as:
FOO="/path/to/foo"
Here is the code:
import shlex

def parse_shell_var(line):
    """
    Parse such lines as:
        FOO="My variable foo"

    :return: a tuple of var name and var value, such as
        ('FOO', 'My variable foo')
    """
    return shlex.split(line, posix=True)[0].split('=', 1)

if __name__ == '__main__':
    with open('shell_vars.sh') as f:
        shell_vars = dict(parse_shell_var(line) for line in f if '=' in line)
    print(shell_vars)
How It Works
Take a look at this snippet:
shell_vars = dict(parse_shell_var(line) for line in f if '=' in line)
This line iterates through the lines in the shell script, processing only those lines that have an equals sign (not a fool-proof way to detect variable assignment, but the simplest). Next, it runs those lines through the function parse_shell_var, which uses shlex.split to correctly handle the quotes (or the lack thereof). Finally, the pieces are assembled into a dictionary. The output of this script is:
{'MOO': '/dont/have/a/cow', 'FOO': 'my variable foo', 'BAR': 'My variable bar'}
Here is the contents of shell_vars.sh:
FOO='my variable foo'
BAR="My variable bar"
MOO=/dont/have/a/cow
echo $FOO
Discussion
This approach has a couple of advantages:
It does not execute the shell (either bash or Python), which avoids any side effects
Consequently, it is safe to use, even if the origin of the shell script is unknown
It correctly handles values with or without quotes
This approach is not perfect; it has a few limitations:
The method of detecting variable assignment (by looking for the presence of the equals sign) is primitive and not accurate. There are ways to better detect these lines, but that is a topic for another day
It does not correctly parse values which are built upon other variables or commands. That means it will fail for lines such as:
FOO=$BAR
FOO=$(pwd)
Based on the answer that uses exec(open(...).read()): if you use value = eval(open(...).read()) instead, it will return only the value. E.g., with the file contents on the left and the returned value on the right:
1 + 1: 2
"Hello World": "Hello World"
float(2) + 1: 3.0
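A sketch of that eval variant (the file name is hypothetical, and the file must contain a single Python expression; the same security caveats as exec apply):
with open('value.txt') as fd:  # e.g. the file contains: 1 + 1
    value = eval(fd.read())
print(value)  # -> 2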
I want to pass an input fasta file, stored in a variable say inp_a, from python to bowtie and write the output into another variable, out_a. I want to use
os.system('bowtie [options] inp_a out_a')
Can you help me out?
Your question asks for two things, as far as I can tell: writing data to disk, and calling an external program from within Python. Without more detailed requirements, here's what I would write:
import subprocess
data_for_bowtie = "some genome data, lol"
with open("input.fasta", "wb") as input_file:
input_file.write(data_for_bowtie)
subprocess.call(["bowtie", "input.fasta", "output.something"])
There are some fine details here which I have assumed. I'm assuming that you mean bowtie, the read aligner. I'm assuming that your file is a binary, non-human-readable one (which is why there's that b in the second argument to open) and I'm making baseless assumptions about how to call bowtie on the command line because I'm not motivated enough to spend the time learning it.
Hopefully, that provides a starting point. Good luck!
I have written a piece of code which runs sextractor from python; however, I only know how to do this for one file, and I need to loop it over 62 files. I'm not sure how I would go about doing this. I have attached my code below:
#!/usr/bin/env python
# build a catalog using sextractor on des image here
import os
import sys

sys.path.append('/home/fitsfiles')  # not sure if this does anything/is correct

def sex(image, output, sexdir='/home/sextractor-2.5.0', check_img=None, config=None, l=None):
    '''Construct a sextractor command and run it.'''
    # creates a sextractor line e.g. sex img.fits -catalog_name -checkimage_name
    q = "/home/fitsfiles/" + "01" + ".fits"
    com = ["sex ", q, " -CATALOG_NAME " + output]
    s0 = ''
    com = s0.join(com)
    res = os.system(com)
    return res

img_name = sys.argv[0]
output = img_name[0:1] + '_star_catalog.fits'
t = sex(img_name, output)
print '----done !---'
So this code produces a command in my main terminal of sex /home/fitsfiles/01.fits -CATALOG_NAME g_star_catalog.fits,
which successfully produces a star catalogue as I want.
However, I want my code to do this for 62 fits files and change the name of star_catalog.fits depending upon which fits file is being used. Any help would be appreciated.
There are many ways you could approach this. Let's assume you want to run your script as something like
python extract_stars.py /home/fitsfiles/*.fits
Then, you could try something like this:
for arg in sys.argv[1:]:
    # replace('.fits', '') drops the extension; strip('.fits') would
    # wrongly remove those characters from both ends of the name
    filename = arg.split('/')[-1].replace('.fits', '')
    t = sex(arg, filename + '_star_catalog.fits')
    # Whatever else
This assumes that you remove the line in sex that reformats the input filename. Also, you do not need to append the fits directory to your path.
The alternative approach, if you do not plan to do anything else in python, is to write a bash script, which would really simplify the task.
And, as a side note, if you had asked this question more generally (i.e., "I wish to apply a function I wrote to a number of input files") and without reference to a rather uncommonly used application, you would likely have received an answer much more quickly.
The community has now developed some python wrappers which allow you to run sextractor as if it were a python command. These are: pysex, sewpy and astromatic_wrapper.
The good thing about sextractor wrappers is that they allow you to write much cleaner code without the need to define extra functions, invoke os commands or have the configuration files and the output files on your machine. Moreover, the output can be an astropy table, a pandas dataframe or a numpy array.
For your specific case, you could use pysex and do:
import glob
import pysex

filelist = glob.glob('/directory/*.fits')
for fitsfile in filelist:
    cat = pysex.run(fitsfile, params=['X_IMAGE', 'Y_IMAGE', 'FLUX_APER'],
                    conf_args={'PHOT_APERTURES': 5})
    print cat['FLUX_APER']