From Python I want to call a PowerShell script and pass a parameter to it.
The PowerShell function header is listed here:
Function Invoke-Neo4j
{
    [CmdletBinding(SupportsShouldProcess=$false,ConfirmImpact='Low')]
    param (
        [Parameter(Mandatory=$false,ValueFromPipeline=$false,Position=0)]
        [string]$Command = ''
    )
How do I pass the parameter $Command = 'start' to the function from Python? I can't seem to get it to work.
import subprocess
cmd = ['powershell.exe', '-ExecutionPolicy', 'RemoteSigned', '-File',
'C:\\PathName\\Invoke-Neo4j.ps1']
returncode = subprocess.call(cmd)
print(returncode)
I couldn't make the PowerShell command work from Python. I actually found a .bat file with a command that works; I just inserted the 'start' parameter and call the .bat file from Python. It works like a charm. It's not optimal, but it does the job for me.
SETLOCAL
Powershell -NoProfile -NonInteractive -NoLogo -ExecutionPolicy Bypass -Command "try { Unblock-File -Path '%~dp0Neo4j-Management\*.*' -ErrorAction 'SilentlyContinue' } catch {};Import-Module '%~dp0Neo4j-Management.psd1'; Exit (Invoke-Neo4j 'start')"
EXIT /B %ERRORLEVEL%
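For reference, calling that batch file from Python is a one-liner. A minimal sketch, assuming the batch file above is saved under a hypothetical name such as C:\PathName\neo4j-start.bat:
import subprocess

# Hypothetical path; the 'start' argument is already hard-coded
# inside the .bat file itself, so no extra arguments are needed.
returncode = subprocess.call([r'C:\PathName\neo4j-start.bat'])
print(returncode)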
Running a script, in either PowerShell or Python, that only defines a function but never calls it has no effect beyond defining the function. The script needs to include an explicit call to that function. In your case, you might as well put the logic at the top level of the script instead of wrapping it in a function and then calling the function. This would look like:
[CmdletBinding(SupportsShouldProcess=$false,ConfirmImpact='Low')]
param (
    [Parameter(Mandatory=$false,ValueFromPipeline=$false,Position=0)]
    [string]$Command = ''
)
# rest of code not surrounded by { }
exit 0
Invoke the revised script from Python as you were initially doing and it should work properly.
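To pass 'start' for $Command, append it as a positional argument after the script path; with -File, anything that follows the script path is bound to the script's param() block. A minimal sketch based on the original call:
import subprocess

# 'start' is bound to the script's first positional parameter ($Command).
cmd = ['powershell.exe', '-ExecutionPolicy', 'RemoteSigned', '-File',
       'C:\\PathName\\Invoke-Neo4j.ps1', 'start']
returncode = subprocess.call(cmd)
print(returncode)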
Related
I am trying to execute a command from a Lua script. The command simply runs a Python script named "sha_compare.py", which receives 3 arguments, two of which are variables from the Lua script, body_data and sha:
local method = ngx.var.request_method
local headers = ngx.req.get_headers()
if method == "POST" then
ngx.req.read_body()
local body_data = ngx.req.get_body_data()
local sha = headers['X-Hub-Signature-256']
ngx.print(os.execute("python3 sha_compare.py"..sha..body_data))
else
The script fails because of the way I pass the arguments. The actual command, if I had run it from the command line, would have been something like:
python3 sha_compare.py sha256=ffs8df aaaaa
Please tell me how I should change my code to call the Python script with the 3 vars properly.
If that is not possible or is hard to implement, please let me know how I can call a .sh script that will receive those 3 params.
You're not providing spaces between the arguments: you're trying to execute
python3 sha_compare.pysha256=ffs8dfaaaaa
Do this:
os.execute("python3 sha_compare.py "..sha.." "..body_data)
It's often easier to build the command up as a table and then concat it for execution:
local cmd = { 'python3', 'sha_compare.py', sha, body_data }
os.execute(table.concat(cmd, " "))
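For completeness, the receiving side: sha_compare.py (not shown in the question) would read the values positionally, roughly like this sketch (the examples show two arguments, even though the text mentions three):
import sys

# argv[0] is the script name; the values passed from Lua follow it.
sha = sys.argv[1]        # e.g. 'sha256=ffs8df'
body_data = sys.argv[2]  # e.g. 'aaaaa'
print(sha, body_data)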
I am trying to achieve the following:
I have a Python script which calls a shell script and passes a parameter to it. The shell script creates a tar.gz file in some location, using the parameter passed in.
Now the shell script should pass back the name and location of the tar.gz so created. The Python script uses that to form a JSON and pass it to some other code.
Along with this, I want to add a check so that a value is returned to Python only if the tar.gz was actually generated.
Here is the code:
Python script:
#!/usr/bin/python
import json
import subprocess
json_data = '{"name": "StackOverflow", "uid": "8fa36334-ce51"}'
data = json.loads(json_data)
for keys, values in data.items():
    print(keys)
    print(values)
UID = data.get('uid')
rc = subprocess.check_output(
    ["/home/cyc/Cyc-Repo/cyc_core/cyc_platform/src/package/cyc_bsc/scripts/test.sh", UID])
print(rc)
if rc != 0:
    print("failed for passed in uid")
data_op = {}
data_op['pathTOCompressfile'] = 'value_should_be_return_from_shell'
data_op['status'] = 'OK'
json_data_op = json.dumps(data_op)
print(json_data_op)
Shell script:
#!/bin/bash
if [ "$1" != "" ]; then
    uid=$1
    echo "Positional parameter 1 contains something $1"
else
    echo "Positional parameter 1 is empty"
fi
LOG_TMP="tmp_log_file_location"
log_location="log_file_location"
filename="${log_location}/$uid.tar.gz"
echo $filename
tar -zcvf $filename -C $LOG_TMP/dc .
This is what I am not able to understand:
How to pass the value of the variable "filename" back to the Python script if the tar -zcvf command is successful.
How, in the Python script, I can take the value of filename and create the JSON using it.
How, in case the value cannot be generated, the status becomes "fail" in the JSON (within Python), so that this is captured as well.
Your shell script writes the name of the generated file to standard output, so your question boils down to how to catch the standard output of a subprocess started from Python. This has been answered here. BTW, when asking questions about Python, it would be a good idea to always specify the Python version you are using.
In your case however, I would redesign your shell script a bit:
Your script outputs not only the generated filename, but also messages about the "positional parameter", which means you would have to fiddle them apart in your Python script, deciding whether a line is a message or valid output. You could send the messages to standard error instead, to keep them apart.
BTW, if there is no positional parameter, the generated file name is just .tar.gz. Is this really what you want? Wouldn't it be better to do an exit 1 if there is no parameter?
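Put together, the Python side would then look roughly like this; a minimal sketch, assuming the shell script has been reworked as suggested, i.e. it prints only the tar.gz path on stdout, sends messages to stderr, and exits non-zero on failure:
import json
import subprocess

UID = "8fa36334-ce51"  # uid taken from the question's JSON

# Hypothetical invocation; check_output returns the script's stdout
# and raises CalledProcessError on a non-zero exit status.
try:
    out = subprocess.check_output(["./test.sh", UID])
    data_op = {"pathTOCompressfile": out.decode().strip(), "status": "OK"}
except subprocess.CalledProcessError:
    data_op = {"status": "FAIL"}
print(json.dumps(data_op))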
I have this script below where I start a Python program.
The Python program outputs to stdout/the terminal, but I want the program to be started via the rc script silently.
I can start and stop the program perfectly, and it also creates the log file, but it doesn't write anything to it. I tried a lot of different ways, even using daemon as the starter.
Where is my problem?
#!/bin/sh
# REQUIRE: DAEMON
# KEYWORD: shutdown
. /etc/rc.subr
location="/rpiVent"
name="rpiVentService"
rcvar=`set_rcvar`
command="$location/$name"
#command_args="> $location/$name.log" // Removed
command_interpreter="/usr/bin/python"
load_rc_config $name
run_rc_command "$1"
Redirection with > is a feature of the shell and not an actual part of the command line. When commands are invoked programmatically, the arguments given to them cannot contain shell directives (unless the parent process has special support for the shell, as with Python's subprocess.Popen(shell=True)).
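To illustrate that difference in Python (a sketch, unrelated to the rc script itself):
import subprocess

# The '>' here is passed to echo as a literal argument; no redirection
# happens and out.log is never created.
subprocess.call(['echo', 'hello', '>', 'out.log'])

# With shell=True the string is handed to the shell, which performs
# the redirection before running the command.
subprocess.call('echo hello > out.log', shell=True)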
What you can do in this case is wrap your command (/rpiVent/rpiVentService) in a shell script and then invoke that shell script from the FreeBSD rc script:
Create /rpiVent/run.sh:
#!/bin/sh
/rpiVent/rpiVentService > /rpiVent/rpiVentService.log
and then use this as the command (no args needed).
The correct way to do this is probably by "overriding" the start command using the start_cmd variable, like this:
#!/bin/sh
# REQUIRE: DAEMON
# KEYWORD: shutdown
. /etc/rc.subr
location="/rpiVent"
name="rpiVentService"
rcvar=`set_rcvar`
load_rc_config $name
command="$location/$name"
command_interpreter="/usr/bin/python"
start_cmd=rpivent_cmd
rpivent_cmd()
{
    $command_interpreter $command > $location/$name.log
}
run_rc_command "$1"
I would like to know if it's possible for a bash script to include a Python script in order to use (in the bash script) the return value of a function I wrote in my Python program.
For example:
my file "file.py" has a function which returns a value, "my_value" (which represents the name of a file, but anyway)
I want to create a bash script which has to be able to execute a command line like "ingest my_value"
So do you know how to include a Python file in a bash script (import ...?), and how is it possible to get a value from a Python file inside a bash script?
Thank you in advance.
Update
Actually, my Python file looks like this:
import subprocess

class formEvents():
    def __init__(self):
        ...
    def myFunc1(self):  # function which returns the name of a file that the user chose on his computer
        ...
        return name_file
    def myFunc2(self):  # function which calls an existing bash script (bash_file.sh), writing name_file inside it (in the middle of a line)
        subprocess.call(['./bash_file.sh'])

if __name__ == "__main__":
    FE = formEvents()
I don't know if it's clear enough, but here is my problem: it's to be able to write name_file inside bash_file.sh.
Jordane
The easiest way of doing this is via the standard UNIX Pipeline and your Shell.
Here's an example:
foo.sh:
#!/bin/bash
my_value=$(python file.py)
echo $my_value
file.py:
#!/usr/bin/env python
def my_function():
return "my_value"
if __name__ == "__main__":
print(my_function())
The way this works is simple:
You launch foo.sh
Bash spawns a subprocess and runs python file.py
Python (the interpreter running file.py) executes the function my_function and prints its return value to "Standard Output"
Bash captures the "Standard Output" of the Python process in my_value
Bash then simply echoes the value stored in my_value also to "Standard Output" and you should see "my_value" printed to the Shell/Terminal.
If the Python script outputs the return value to the console, you should be able to just do this:
my_value=$(command)
Edit: Damn, beat me to it
Alternatively, you can make the bash script process arguments.
#!/usr/bin/bash
if [[ -n "$1" ]]; then
    name_file="$1"
else
    echo "No filename specified" >&2
    exit 1
fi
# And use $name_file in your script
In Python, your subprocess call should be changed accordingly:
subprocess.call(['./bash_file.sh', name_file])
I am really fond of Python's capability to do things like this:
if __name__ == '__main__':
    # set up testing code here
    # or set up a call to a function with parameters and human-format the output
    # etc...
This is nice because I can treat a Python script file as something that can be called from the command line, while its functions and classes remain available for me to import into a separate Python script file without triggering the default "run from the command line" behavior.
Does PowerShell have a similar facility that I could exploit? And if it doesn't, how should I be organizing my library of function files so that I can easily execute some of them while I am developing them?
$MyInvocation has information about how the script was started.
If ($MyInvocation.InvocationName -eq '&') {
    "Called using operator: '$($MyInvocation.InvocationName)'"
} ElseIf ($MyInvocation.InvocationName -eq '.') {
    "Dot sourced: '$($MyInvocation.InvocationName)'"
} ElseIf ((Resolve-Path -Path $MyInvocation.InvocationName).ProviderPath -eq $MyInvocation.MyCommand.Path) {
    "Called using path: '$($MyInvocation.InvocationName)'"
}
$MyInvocation has lots of information about the current context, and those of callers. Maybe this could be used to detect whether a script is being dot-sourced (i.e. imported) or executed as a script.
A script can act like a function: use param as the first non-comment/non-whitespace statement in the file to define parameters. It is not clear (one would need to try different combinations) what happens if you dot-source a script that starts with param...
Modules can directly execute code as well as export functions, variables, etc., and can take parameters. Maybe $MyInvocation in a module would allow the two cases to be detected.
EDIT: Additional:
$MyInvocation.Line contains the command line used to execute the current script or function; when dot-sourcing, this text will start with ".", but not if run as a script (obviously a case for a regex match to allow for variable whitespace around the period).
In a script run as a function
As of now I see 2 options that work:
if ($MyInvocation.InvocationName -ne '.') {#do main stuff}
and
if ($MyInvocation.CommandOrigin -eq 'Runspace') {#do main stuff}
Disclaimer: This is only tested with PowerShell Core on Linux. It may not work the same on Windows. If anyone tries it on Windows, I would appreciate it if you could verify in the comments.
function IsMain() {
    (Get-Variable MyInvocation -Scope Local).Value.PSCommandPath -Eq (Get-Variable MyInvocation -Scope Global).Value.InvocationName
}
Demonstrated with a gist