I am executing exec_command in Python. The command is: find -path . -mmin -$time -print, and I am assigning time=100. But while executing the command, exec_command does not pick up the variable value.
Even if I do exec_command('echo $time'), it does not pick up the value.
Python and the shell have separate variables. Setting time in your Python script does not magically create a variable called $time that you can then use in shell commands. It especially does not magically create a shell variable on some other computer, which is what it appears you're trying to do.
Instead, put the Python value into the string you're passing. For example:
command = 'find -path . -mmin -%d -print' % time
print(command) # shows the exact command that will be executed
exec_command(command)
Be careful with this. It's very easy to create security holes if you don't know what you're doing, especially when you take strings from user input or other data sources you don't control. In this example I've used %d to substitute the value into the command, which prevents anything but a number from being used, so it should be safe.
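If the value being substituted were a string (a path or file name, say) rather than a number, quoting it before interpolation is the usual precaution. A minimal sketch, assuming Python 3 and a hypothetical directory variable coming from your Python code:
import shlex
directory = "some dir/with spaces"   # hypothetical string value held in Python
command = 'find %s -mmin -%d -print' % (shlex.quote(directory), time)
print(command)   # e.g. find 'some dir/with spaces' -mmin -100 -print
exec_command(command)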
Related
I have a python script a.py that returns a tuple of two values.
I am running this script from a Jenkins bash shell, and I need to be able to retrieve the return values and use them in subsequent steps of the job.
As of now, the call to the script looks like:
ret_tuple=($($ENV_PATH/bin/python a.py))
Then I try to access the return values and assign them to variables that I later inject into the Jenkins job:
echo "${ret_tuple[0]}"
echo "${ret_tuple[1]}"
echo SRC_BUCKET=${ret_tuple[0]} > variables.properties
echo DST_BUCKET=${ret_tuple[1]} > variables.properties
Later, I forward these variables to another job that this job triggers, and I can see that the parameters being sent from the variables are incorrect.
One of them holds the literal $SRC_BUCKET and the second one holds Disabled! (which doesn't make a lot of sense to me).
I pass the variables like that:
data_path=${SRC_BUCKET}
destination_path=${DST_BUCKET}
Am I doing this the right way and should I look for the problem elsewhere, or is there something wrong with the variable assignment above?
EDIT:
The Python script returns two strings:
src_bucket_path = os.path.join(export_path, file_name)
dst_bucket_path = os.path.join(destination_path, dst_bucket_suffix_path)
return src_bucket_path, dst_bucket_path
Maybe use command substitution, so the command is evaluated before being written into the variables.properties file:
$(echo SRC_BUCKET=${ret_tuple[0]} > variables.properties)
$(echo DST_BUCKET=${ret_tuple[1]} >> variables.properties)
Note that the second line appends with >> rather than >; a second > would truncate variables.properties, and only the DST_BUCKET line would survive.
A good article for you to read:
http://www.compciv.org/topics/bash/variables-and-substitution/
If you run your scripts in the same shell session window, you could create an environment variable from one script, and then when you run the next script you can access the environment variable.
$env:PYTHONSTARTUP = "E:\ftlcheckout\ps1XstoreCmds\WindowsPowerShell\oneCallMultipleTickets.py"
$Host.UI.RawUI.WindowTitle = "One Call Multiple Tickets"
Write-Host "example: oneCallMultipleTickets(['123', '456', '789'])"
python
But when Python runs, it's as if the PYTHONSTARTUP variable was never set; I can't run the oneCallMultipleTickets function.
However, if I exit Python and then run it again, it works...
So is the %PYTHONSTARTUP% variable getting set, just too late?
I am trying to run a sort of application that uses both Python and PowerShell scripts. I already wrote the Python script and the PowerShell script, which are meant to work simultaneously but separately from each other. What I want to do is create a Python program that launches them both. Is there a way? Thanks!
What I have right now, as part of a larger script, is:
import subprocess
autom = r"C:\Users\mrmostacho\Desktop\Robot\Autom.ps1","-ExecutionPolicy","Unrestricted"
powershell = r"C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe"
subprocess.Popen("%s %s" % (powershell, autom,))
First: I think you don't want "-ExecutionPolicy","Unrestricted" as script arguments, but instead want to set PowerShell's execution policy to allow the execution of your script. Therefore you should pass those parameters before the actual script.
Second: It's not enough to pass the script as an argument to powershell.exe (that way the script name is interpreted as a PowerShell command, and one has to escape the name according to PowerShell's quoting rules). Instead, the script name should be given after the -File parameter. From the online documentation:
-File <filePath> [<args>]
Runs the specified script in the local scope ("dot-sourced"), so that the functions and variables that the script creates are available in the current session. Enter the script file path and any parameters. File must be the last parameter in the command, because all characters typed after the File parameter name are interpreted as the script file path followed by the script parameters.
You can include the parameters of a script, and parameter values, in the value of the File parameter. For example: -File .\Get-Script.ps1 -Domain Central
Typically, the switch parameters of a script are either included or omitted. For example, the following command uses the All parameter of the Get-Script.ps1 script file: -File .\Get-Script.ps1 -All
In rare cases, you might need to provide a Boolean value for a switch parameter. To provide a Boolean value for a switch parameter in the value of the File parameter, enclose the parameter name and value in curly braces, such as the following: -File .\Get-Script.ps1 {-All:$False}.
Third: As cdarke already commented, it's better to use a list instead of a string as the argument to Popen. That way one doesn't need to worry about command-line parsing on Windows.
Altogether, this should be the way to go. (Tested with a small test script.)
import subprocess
autom = r"C:\Users\mrmostacho\Desktop\Robot\Autom.ps1"
powershell = r"C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe"
subprocess.Popen([powershell,"-ExecutionPolicy","Unrestricted","-File", autom])
If you need to pass arguments to the script, do it like this:
subprocess.Popen([powershell,"-ExecutionPolicy","Unrestricted","-File", autom, 'arg 1', 'arg 2'])
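If you also want to wait for the script and inspect its output or exit code, subprocess.run can do that (capture_output requires Python 3.7+). A small sketch along the same lines, not something the original answer requires:
import subprocess
autom = r"C:\Users\mrmostacho\Desktop\Robot\Autom.ps1"
powershell = r"C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe"
# run the script, wait for it to finish, and capture what it printed
result = subprocess.run(
    [powershell, "-ExecutionPolicy", "Unrestricted", "-File", autom],
    capture_output=True, text=True)
print(result.returncode)
print(result.stdout)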
I am using CentOS 7.0 and PyDev in Eclipse. I am trying to pass a variable from Python into a C shell script, but I am getting an error.
This is my Python script, named raw2waveconvert.py:
num = 10
print(num)
import subprocess
subprocess.call(["csh", "./test1.csh"])
Output/Error when I run the Python script:
10
num: Undefined variable.
The file test1.csh contains:
#!/bin/csh
set nvar=`/home/nishant/workspace/codec_implement/src/NTTool/raw2waveconvert.py $num`
echo $nvar
Okay, so apparently it's not so easy to find a nice and clear duplicate. This is how it's usually done: you either pass the value as an argument to the script, or via an environment variable.
The following example shows both ways in action. Of course you can drop whatever you don't like.
import subprocess
import shlex
var = "test"
env_var = "test2"
script = "./var.sh"
# prepare a command (append the variable to the script name)
command = "{} {}".format(script, var)
# prepare environment variables
environment = {"test_var": env_var}
# Note: shlex.split splits a textual command into a list suited for subprocess.call
subprocess.call(shlex.split(command), env=environment)
This is the corresponding shell script; from what I've read, addressing command-line arguments works the same way, so it should work whether bash or csh is set as the default shell.
var.sh:
#!/bin/sh
echo "I was called with a command line argument '$1'"
echo "Value of enviormental variable test_var is '$test_var'"
Test:
luk32$ python3 subproc.py
I was called with a command line argument 'test'
Value of environment variable test_var is 'test2'
Please note that the Python interpreter needs to have appropriate access to the called script. In this case var.sh needs to be executable for the user luk32. Otherwise, you will get a Permission denied error.
I also urge you to read the docs on subprocess. Many other materials use shell=True; I won't discuss it here, but I dislike and discourage it. The presented examples should work and be safe.
subprocess.call(..., env=dict(os.environ, num=str(num)))
The only way to do what you want here is to export/pass the variable value through the shell environment, which requires using the env dictionary argument.
But it is more likely that what you should do is pass arguments to your script instead of assuming pre-existing variables. Then you would put num in the list argument to subprocess.call (probably better to use check_call unless you know the script is supposed to fail) and then use $1 etc. as normal.
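A minimal sketch of that argument-passing approach, assuming test1.csh is changed to read its first command-line argument as $1 instead of expecting a pre-set $num:
import subprocess
num = 10
# the value arrives in the csh script as $1
subprocess.check_call(["csh", "./test1.csh", str(num)])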
To use a logarithmic function, I used export to pass a variable $var1 from a bash script to a Python script. After the calculation, I used
os.environ['var1']=str(result)
to send the result back to the bash script.
However, bash still shows the unmodified value.
You can have a look at the os.putenv(key, value) function here; maybe it could help you.
Although, as noted in the docs:
When putenv() is supported, assignments to items in os.environ are automatically translated into corresponding calls to putenv(); however, calls to putenv() don’t update os.environ, so it is actually preferable to assign to items of os.environ.
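In other words, a minimal sketch of the pattern the docs recommend (assigning to os.environ, which also calls putenv() where supported):
import os
result = 1.234                       # hypothetical value computed in Python
os.environ['var1'] = str(result)     # visible to child processes started after this point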
EDIT:
moooeeeep thought about it just a minute before me, but the variable you change only applies to the current process. In such a case I can see two solutions:
- Write the data to a file and read it with your bash script
- Call your bash script directly from within Python so that the bash process inherits your modified variable.
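A minimal sketch of the second option, assuming a hypothetical bash script named consumer.sh that reads $var1:
import os
import subprocess
result = 1.234
# the child bash process inherits the modified environment
env = dict(os.environ, var1=str(result))
subprocess.call(["bash", "./consumer.sh"], env=env)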