Passing SOAP arguments from the command line - Python

I have a Python script that successfully sends a SOAP request to insert a record into a system. The values are static in the test. I need to make the value dynamic: an argument passed through the command line, or some other stored value.
execute: python myscript.py
<d4p1:Address>MainStreet</d4p1:Address> ....this works to add hard coded "MainStreet"
execute: python myscript.py MainStreet
...this is now trying to pass the argument MainStreet
<d4p1:Address>sys.argv[1]</d4p1:Address> ....this does not work
It saves the literal text "sys.argv[1]" as the address. I have imported sys, and I have tried %, {}, etc. from web searches. What syntax am I missing?

You need to read a little about how to build strings in Python; below is how it could look in your code. Sorry, it's hard to say more without seeing your actual code. Also, you shouldn't really build XML like that; use, for instance, the xml module from the standard library.
test = "<d4p1:Address>" + sys.argv[1] + "</d4p1:Address>"

Related

Can't get a command line that works at the prompt to work with subprocess

I need to extract text from a PDF. I tried PyPDF2, but the extractText method returned encrypted text, even though the PDF is not encrypted according to the isEncrypted method.
So I moved on to trying to access a program that does the job from the command prompt, so I could call it from Python with the subprocess module. I found this program called textExtract, which did the job I wanted with the following command line on cmd:
"textextract.exe" "download.pdf" /to "download.txt"
However, when I tried running it with subprocess I couldn't get a 0 return code.
Here is the code I tried:
textextract = shlex.split(r'"textextract.exe" "download.pdf" /to "download.txt"')
subprocess.run(textextract)
I already tried it with shell=True, but it didn't work.
Can anyone help me?
I was able to get the following script to work from the command line after installing the PDF2Text Pilot application you're trying to use:
import shlex
import subprocess
args = shlex.split(r'"textextract.exe" "download.pdf" /to "download.txt"')
print('args:', args)
subprocess.run(args)
Sample screen output of running it from a command line session:
> C:\Python3\python run-textextract.py
args: ['textextract.exe', 'download.pdf', '/to', 'download.txt']
Progress:
Text from "download.pdf" has been successfully extracted...
Text extraction has been completed!
The above output was generated using Python 3.7.0.
I don't know whether your use of Spyder on Anaconda affects things, since I'm not familiar with it/them. If you continue to have problems with this, then, if possible, I suggest you see if you can get things working directly, i.e. running the Python interpreter on the script manually from the command line, similar to what's shown above. If that works but using Spyder doesn't, then you'll at least know the cause of the problem.
There's no need to build a string of quoted strings and then parse that back out to a list of strings. Just create a list and pass that:
command=["textextract.exe", "download.pdf", "/to", "download.txt"]
subprocess.run(command)
All that shlex.split is doing is creating a list by removing all of the quotes you had to add when creating the string in the first place. That's an extra step that provides no value over just creating the list yourself.
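Since the original problem was a non-zero return code, a sketch along these lines (assuming the executable and the PDF are in the current working directory, and Python 3.7+ for capture_output) also records stdout and stderr, which usually explain why the tool failed:

import subprocess

command = ["textextract.exe", "download.pdf", "/to", "download.txt"]
result = subprocess.run(command, capture_output=True, text=True)

print("return code:", result.returncode)
print("stdout:", result.stdout)
print("stderr:", result.stderr)  # often contains the actual error message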

How could I get the command line that I entered in the terminal to run a python file?

Suppose I have a Python file main.py, and it has some optional parameters, such as --learning-rate, --batch-size, etc.
If I want to run this file, I can enter the following in the terminal (Ubuntu Linux, for example):
python3 main.py --learning-rate 0.1 --batch-size 100
Now, I want to write some code in main.py so that after I enter the command above, I can get that command as a string by executing that code. The following is the string I want to get:
"python3 main.py --learning-rate 0.1 --batch-size 100"
The reason I want to do this is that I want to write this string into my recording file, so that I can keep better track of which commands I have run.
Could anyone tell me what package I should import and what code I should write to get that command information while the Python file is running?
Thanks!
You cannot always get precisely what you typed, because the shell will have first done substitutions and expanded filenames before starting your script. For example, if you type python "foo.py" *.txt, your script won't see *.txt, it will see the list of files, and it won't see the quotes around foo.py.
With that caveat out of the way, the sys module has a variable named argv that contains all of the arguments. argv[0] is the name of the script.
To get the name of the python executable you can use sys.executable.
To tie it all together, you can do something like this:
print(sys.executable + " " + " ".join(sys.argv))
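Since the goal is to write that string into a recording file, a minimal sketch could look like the following (the log file name is just an example):

import sys

# rebuild an approximation of the command line that started this script
command_line = sys.executable + " " + " ".join(sys.argv)

# append it to a (hypothetical) recording file
with open("run_history.log", "a") as log:
    log.write(command_line + "\n")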
Why not just remake the command using the arguments you parsed? It won't be exactly what you typed, but that might be nice as it will be in a common format.
Ex (assuming the learning rate and batch size are stored in similarly named variables):
command = "python3 main.py --learning-rate {} --batch-size {}".format(learning_rate, batch_size)
Of course it will be a little more complicated if some arguments are optional, but I assume in that case there would be a default value for those parameters, since your network needs them every time.
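For example, with argparse the rebuilt command could come straight from the parsed values; a sketch (the argument names mirror the question, the defaults are made up):

import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--learning-rate", type=float, default=0.1)
parser.add_argument("--batch-size", type=int, default=100)
args = parser.parse_args()

# reconstruct the command in a canonical form, defaults included
command = "python3 main.py --learning-rate {} --batch-size {}".format(
    args.learning_rate, args.batch_size)
print(command)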

same string gives different result in Python

So I'm using the approach in this post
to extract a double-quoted string from a string. If the input string comes from a terminal argument, it works fine. But if the input string comes from a txt file like the following, it gives a NoneType error. I computed the hashes of two strings (one from the file and one from the terminal) with identical text content, and it turns out they are different. Does anyone know how to solve this? (Python 3.x)
That said, I have set the default encoding to "utf-8" in my code.
python filename.py < input.txt
If you run it with the python command, it may be executed by Python 2.x.
If you want Python 3.x, just change the command to python3,
like this:
python3 filename.py < input.txt
Two things. First, if you want to ingest a txt file into a Python script, you need to specify it. Add these two lines:
import sys
text = str(sys.argv[1])
This means text would be your 'input.txt'.
Second, if your script contains only a function definition, it won't know what you want to do with that function; you have to tell the script explicitly to execute the function through the main entry point:
import re
import sys
def doit(text):
    matches = re.findall(r'\"(.+?)\"', text)
    # matches is now ['String 1', 'String 2', 'String3']
    return ",".join(matches)

if __name__ == '__main__':
    text_file = str(sys.argv[1])
    text = open(text_file).read()
    print(doit(text))
Alternatively, you can just execute it line by line without wrapping the re call in a function, since it is only one line.
I just figured it out; the bug doesn't come from my code. I had "smart quotes" enabled on my Mac, so whenever it reads a quote, it is a different (curly) character. Disabling this under the keyboard settings does the trick.
LOL what a "bug".
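If changing the macOS keyboard setting is not an option, normalizing curly quotes to plain ASCII quotes before running the regex would sidestep the problem; a minimal sketch:

def normalize_quotes(text):
    # map "smart" (curly) quotes to their plain ASCII equivalents
    return (text.replace("\u201c", '"')   # left double quote
                .replace("\u201d", '"')   # right double quote
                .replace("\u2018", "'")   # left single quote
                .replace("\u2019", "'"))  # right single quote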

how to "source" file into python script

I have a text file /etc/default/foo which contains one line:
FOO="/path/to/foo"
In my python script, I need to reference the variable FOO.
What is the simplest way to "source" the file /etc/default/foo into my python script, same as I would do in bash?
. /etc/default/foo
Same as jil's answer; however, that answer is specific to a historical version of Python.
In modern Python (3.x):
exec(open('filename').read())
replaces execfile('filename') from 2.x
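If you would rather not dump the file's names into your own globals, exec can be pointed at a separate dictionary; a small sketch, assuming the same /etc/default/foo file:

# execute the file in an isolated namespace instead of the current globals
namespace = {}
with open("/etc/default/foo") as fd:
    exec(fd.read(), namespace)

FOO = namespace["FOO"]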
You could use execfile:
execfile("/etc/default/foo")
But please be aware that this will evaluate the contents of the file as-is into your program source. It is a potential security hazard unless you can fully trust the source.
It also means that the file needs to be valid python syntax (your given example file is).
Keep in mind that if you have a "text" file with this content that has a .py as the file extension, you can always do:
import mytextfile
print(mytextfile.FOO)
Of course, this assumes that the text file is syntactically correct as far as Python is concerned. On a project I worked on we did something similar: we turned some text files into Python files. Wacky, but maybe worth considering.
Just to give a different approach, note that if your original file is set up as
export FOO=/path/to/foo
You can do source /etc/default/foo; python myprogram.py (or . /etc/default/foo; python myprogram.py), and within myprogram.py all the values that were exported in the sourced file are visible in os.environ, e.g.:
import os
os.environ["FOO"]
If you know for certain that it only contains VAR="QUOTED STRING" style variables, like this:
FOO="some value"
Then you can just do this:
>>> with open('foo.sysconfig') as fd:
... exec(fd.read())
Which gets you:
>>> FOO
'some value'
(This is effectively the same thing as the execfile() solution
suggested in the other answer.)
This method has substantial security implications; if instead of FOO="some value" your file contained:
os.system("rm -rf /")
Then you would be In Trouble.
Alternatively, you can do this:
>>> import shlex
>>> with open('foo.sysconfig') as fd:
...     settings = {var: shlex.split(value) for var, value in [line.split('=', 1) for line in fd]}
Which gets you a dictionary settings that has:
>>> settings
{'FOO': ['some value']}
That settings = {...} line is using a dictionary comprehension. You could accomplish the same thing in a few more lines with a for loop and so forth.
And of course if the file contains shell-style variable expansion like ${somevar:-value_if_not_set} then this isn't going to work (unless you write your very own shell style variable parser).
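If you do need full shell semantics (defaults, command substitution and so on), one option is to let the shell itself do the expansion and read back the resulting environment; a sketch, assuming bash is available and the values contain no newlines:

import subprocess

def source_with_shell(path):
    # source the file in a bash subshell and print the resulting environment
    output = subprocess.check_output(
        ["bash", "-c", "set -a && source {} && env".format(path)],
        text=True)
    # splitting on newlines breaks if a value itself contains a newline
    return dict(line.split("=", 1) for line in output.splitlines() if "=" in line)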
There are a couple ways to do this sort of thing.
You can indeed import the file as a module, as long as its contents are valid Python syntax. But then either the file in question must be a .py in the same directory as your script, or you have to use imp (or importlib, depending on your version) to load it from its path.
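For reference, a sketch of the importlib approach, which can load a module from an arbitrary path so the file does not have to sit next to your script or carry a .py extension (the module name "foo_settings" is arbitrary, and the file still has to be valid Python):

import importlib.util
from importlib.machinery import SourceFileLoader

loader = SourceFileLoader("foo_settings", "/etc/default/foo")
spec = importlib.util.spec_from_loader("foo_settings", loader)
module = importlib.util.module_from_spec(spec)
loader.exec_module(module)

print(module.FOO)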
Another solution (which I prefer) is to use a data format that Python can parse with a standard library module (JSON comes to mind as an example).
/etc/default/foo :
{"FOO":"path/to/foo"}
And in your python code :
import json

with open('/etc/default/foo') as file:
    data = json.load(file)

FOO = data["FOO"]
## ...
This way, you don't risk executing untrusted code...
The choice is yours, depending on what you prefer. If your data file is auto-generated by some script, it might be easier to keep a simple syntax like FOO="path/to/foo" and use imp.
Hope that helps!
The Solution
Here is my approach: parse the bash file myself and process only variable assignment lines such as:
FOO="/path/to/foo"
Here is the code:
import shlex
def parse_shell_var(line):
    """
    Parse such lines as:
        FOO="My variable foo"

    :return: a tuple of var name and var value, such as
        ('FOO', 'My variable foo')
    """
    return shlex.split(line, posix=True)[0].split('=', 1)

if __name__ == '__main__':
    with open('shell_vars.sh') as f:
        shell_vars = dict(parse_shell_var(line) for line in f if '=' in line)
    print(shell_vars)
How It Works
Take a look at this snippet:
shell_vars = dict(parse_shell_var(line) for line in f if '=' in line)
This line iterates through the lines of the shell script, processing only those lines that contain an equal sign (not a fool-proof way to detect variable assignments, but the simplest). Next, those lines are run through the function parse_shell_var, which uses shlex.split to correctly handle the quotes (or the lack thereof). Finally, the pieces are assembled into a dictionary. The output of this script is:
{'MOO': '/dont/have/a/cow', 'FOO': 'my variable foo', 'BAR': 'My variable bar'}
Here is the contents of shell_vars.sh:
FOO='my variable foo'
BAR="My variable bar"
MOO=/dont/have/a/cow
echo $FOO
Discussion
This approach has a couple of advantages:
It does not execute the shell (either in bash or in Python), which avoids any side-effect
Consequently, it is safe to use, even if the origin of the shell script is unknown
It correctly handles values with or without quotes
This approach is not perfect, it has a few limitations:
The method of detecting variable assignment (by looking for the presence of the equal sign) is primitive and not accurate. There are ways to better detect these lines but that is the topic for another day
It does not correctly parse values which are built upon other variables or commands. That means, it will fail for lines such as:
FOO=$BAR
FOO=$(pwd)
Based on the answer using exec(open(...).read()): value = eval(open(...).read()) will return only the value of the expression in the file. E.g.
1 + 1: 2
"Hello World": "Hello World"
float(2) + 1: 3.0

Python and Plone help

I'm using the Plone CMS and am having trouble with a Python script. I get a NameError: "global name 'open' is not defined". When I put the code in a separate Python script it works fine, and the information is being passed to the script because I can print the query. Code is below:
#Import a standard function, and get the HTML request and response objects.
from Products.PythonScripts.standard import html_quote
request = container.REQUEST
RESPONSE = request.RESPONSE
# Insert data that was passed from the form
query=request.query
#print query
f = open("blast_query.txt","w")
for i in query:
    f.write(i)
return printed
I also have a second question: can I tell Python to open a file in a certain directory? For example, if the script is in one location, say the home folder, but I want the script to open a file at home/some_directory/some_directory, can it be done?
Python Scripts in Plone are restricted and have no access to the filesystem. The open call is thus not available. You'll have to use an External Method or a full Python module to get full access to the filesystem.
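For illustration only, an External Method is a plain Python function in a module under your instance's Extensions/ directory, registered in the ZMI and then callable from the Python Script (the module name, function name and path below are made up):

# Extensions/blast.py  (hypothetical module)
def write_blast_query(self, query):
    # runs as unrestricted Python, so open() is available here
    with open("/home/plone/blast_query.txt", "w") as f:
        for chunk in query:
            f.write(chunk)
    return "query written"

From the Python Script you would then call something like context.write_blast_query(query). Since unrestricted code can use any absolute path with open(), this also covers the second question about opening files in other directories.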
