So, I'm writing a Python script that gets data from a Google Sheet and returns it to an ExtendScript script I'm writing for After Effects.
The relevant bits are:
getSpreadsheetData.py
def main():
    values = getSpreadsheetRange("1M337m3YHCdCDcVyS4fITvAGJsw7rGQ2XGbZaKIdkJPc", "A1:Q41")
    return processValues(values)
afterEffectsScript.jsx
var script_file = File("getSpreadsheetData.py");
var results = script_file.execute();
$.writeln(results);
alert("done!");
So, I have three questions:
How do I pass variables from afterEffectsScript.jsx to the Python script (for example the spreadsheet id and range)?
How do I get a return value from the Python script back to the .jsx file?
How do I make afterEffectsScript.jsx work asynchronously so that it can wait for the Python script to fetch what it needs?
Thanks in advance for the advice!
-P
After Effects can call system commands and capture their stdout.
var cmd = "pwd";
var stdout = system.callSystem(cmd);
$.writeln(stdout);
Take a look at the AE Scripting Guide.
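Combined with the script above, one way to cover the first two questions is to pass the spreadsheet id and range as command-line arguments and print the result, since whatever the Python script writes to stdout is what callSystem returns. A minimal sketch, reusing the existing getSpreadsheetRange and processValues functions; the argument handling is an addition, not part of the original script:
import sys

def main():
    # Spreadsheet id and range arrive on the command line built in the .jsx file.
    spreadsheet_id = sys.argv[1]
    cell_range = sys.argv[2]
    values = getSpreadsheetRange(spreadsheet_id, cell_range)  # existing function
    # Anything printed here is what system.callSystem() returns in ExtendScript.
    print(processValues(values))                              # existing function

if __name__ == "__main__":
    main()
On the ExtendScript side the call would then look something like system.callSystem("python " + script_file.fsName + " <id> <range>"). Note that callSystem blocks until the command finishes, so the .jsx script effectively waits for the Python result, which loosely covers the third question.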
You can pass variables by setting environment variables.
A small example of how to call an external script with arguments from ExtendScript:
var script_file = File("getSpreadsheetData.py");
$.setenv("arg_1", "arg1_value");
$.setenv("arg_2", "arg2_value");
script_file.execute();
Your Python script should then start by reading these variables from the environment: Access environment variables from Python
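On the Python side, a minimal sketch of reading those variables back (the names match the ExtendScript example above):
import os

# Read the values set via $.setenv() in the ExtendScript snippet above.
arg_1 = os.environ.get("arg_1")
arg_2 = os.environ.get("arg_2")
print("arg_1 = " + str(arg_1))
print("arg_2 = " + str(arg_2))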
I'm working on cloning a Virtual Machine (VM) in a vCenter environment using this code. It takes command-line arguments for the name of the VM, template, datastore, etc. (e.g. $ clone_vm.py -s <host_name> -p <password> -nossl ....)
I have another Python file where I've been able to list the Datastore volumes in descending order of free_storage. I have stored the datastore with maximum available storage in a variable ds_max. (Let's call this ds_info.py)
I would like to use the ds_max variable from ds_info.py as the datastore command-line argument for clone_vm.py.
I tried importing the os module in ds_info.py and running os.system(python clone_vm.py ....arguments...) but it did not take the ds_max variable as an argument.
I'm new to coding and am not confident enough to change clone_vm.py to take in the datastore with maximum free storage.
Thank you for taking the time to read through this.
I suspect there is something wrong in your os.system call, but you don't provide it, so I can't check.
Generally it is a good idea to use the current paradigm, and the received wisdom (TM) is that we use subprocess. See the docs, but the basic pattern is:
from subprocess import run
cmd = ["mycmd", "--arg1", "--arg2", "val_for_arg2"]
run(cmd)
Since this is just a list, you can easily drop arguments into it:
var = "hello"
cmd = ["echo", var]
run(cmd)
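Applied to the question, a hedged sketch of how ds_info.py could hand ds_max to clone_vm.py; the flag names here are assumptions about clone_vm.py's interface, so use whatever it actually expects:
# ds_info.py -- sketch; assumes ds_max already holds the chosen datastore name
from subprocess import run

ds_max = "datastore1"  # placeholder; computed earlier from the free-storage listing
cmd = [
    "python", "clone_vm.py",
    "-s", "host_name",        # placeholder host
    "-nossl",
    "--datastore", ds_max,    # assumed flag name
]
run(cmd)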
However, if your other command is in fact a python script, it is more normal to refactor your script so that the main functionality is wrapped in a function, called main by convention:
# script 2
...

def main(arg1, arg2, arg3):
    do_the_work

if __name__ == "__main__":
    args = get_sys_args()  # dummy fn
    main(*args)
Then you can simply import script2 from script1 and run the code directly:
# script 1
from script2 import main
args = get_args() # dummy fn
main(*args)
This is 'better' as it doesn't involve spawning a whole new python process just to run python code, and it generally results in neater code. But nothing stops you from calling a python script the same way you'd call anything else.
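If it helps, one concrete way to fill in the dummy get_sys_args() is argparse; a sketch:
# script 2 -- sketch of a concrete __main__ guard using argparse
import argparse

def main(arg1, arg2, arg3):
    ...  # do the work

if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("arg1")
    parser.add_argument("arg2")
    parser.add_argument("arg3")
    ns = parser.parse_args()
    main(ns.arg1, ns.arg2, ns.arg3)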
I have a python script a.py that returns a tuple of two values.
I am running this script from a Jenkins bash shell and I need to be able to retrieve the return values and use them in the further steps of the job.
As of now, the call to the script looks like:
ret_tuple=($($ENV_PATH/bin/python a.py))
Then I try to access the return values and assign them to variables that I later inject into the Jenkins job:
echo "${ret_tuple[0]}"
echo "${ret_tuple[1]}"
echo SRC_BUCKET=${ret_tuple[0]} > variables.properties
echo DST_BUCKET=${ret_tuple[1]} > variables.properties
Later, I forward these variables into another job that this job triggers, and I can see that the parameters being sent from the variables are incorrect.
One of them holds $SRC_BUCKET and the second one Disabled! (which doesn't make a lot of sense to me).
I pass the variables like that:
data_path=${SRC_BUCKET}
destination_path=${DST_BUCKET}
Am I doing this the right way and should I look for the problem elsewhere, or is there something wrong with the variable assignment above?
EDIT:
The python script returns two strings
src_bucket_path = os.path.join(export_path, file_name)
dst_bucket_path = os.path.join(destination_path, dst_bucket_suffix_path)
return src_bucket_path, dst_bucket_path
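Note that a Python return value never reaches the shell; for ret_tuple=($(...)) to capture anything, the script has to print the two paths to stdout. A minimal sketch, with placeholder paths standing in for the real ones:
# a.py -- sketch; the real paths come from os.path.join as in the snippet above
import os

def main():
    src_bucket_path = os.path.join("/export", "file.csv")              # placeholder
    dst_bucket_path = os.path.join("/destination", "suffix", "file")   # placeholder
    # Space-separated output is what bash splits into the two array elements.
    # (This breaks if either path contains spaces.)
    print(src_bucket_path + " " + dst_bucket_path)

if __name__ == "__main__":
    main()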
Maybe use command substitution, so the command is evaluated before being written to the variables.properties file.
$(echo SRC_BUCKET=${ret_tuple[0]} > variables.properties)
$(echo DST_BUCKET=${ret_tuple[1]} > variables.properties)
A good article for you to read: http://www.compciv.org/topics/bash/variables-and-substitution/
If you run your scripts in the same shell session window, you could create an environment variable from one script, and then when you run the next script you can access the environment variable.
I am trying to execute a command from a Lua script. The command simply runs a Python script named "sha_compare.py", which receives 3 arguments, two of which are variables from the Lua script, body_data and sha:
local method = ngx.var.request_method
local headers = ngx.req.get_headers()
if method == "POST" then
    ngx.req.read_body()
    local body_data = ngx.req.get_body_data()
    local sha = headers['X-Hub-Signature-256']
    ngx.print(os.execute("python3 sha_compare.py"..sha..body_data))
else
The script fails because of the way I pass the arguments. The actual command, if I had run it from the command line, would have been something like:
python3 sha_compare.py sha256=ffs8df aaaaa
Please tell me how I should change my code to call the Python script with the 3 variables properly.
If that is not possible or is hard to implement, please let me know how I can call a .sh script that will receive those 3 params.
You're not providing spaces between the arguments: you're trying to execute
python3 sha_compare.pysha256=ffs8dfaaaaa
Do this:
os.execute("python3 sha_compare.py "..sha.." "..body_data)
It's often easier to build the command up as a table, and then concat it for execution:
local cmd = { 'python3', 'sha_compare.py', sha, body_data }
os.execute(table.concat(cmd, " "))
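For reference, a minimal sketch of how the Python side might read those arguments (the variable names here are assumptions, not taken from the real sha_compare.py). Also note that if body_data can contain spaces you will need to quote it in the command string, otherwise it arrives split across several arguments:
# sha_compare.py -- sketch of the argument handling only
import sys

def main():
    # sys.argv[0] is the script name; the signature and body follow.
    sha = sys.argv[1]        # e.g. "sha256=ffs8df"
    body_data = sys.argv[2]  # the raw request body
    print("signature: " + sha)
    print("body length: " + str(len(body_data)))

if __name__ == "__main__":
    main()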
I'm having more than a little trouble running a python script from an AIR application using the NativeProcess interface. In theory, this should be quite simple. Adobe even uses this as their example in the ActionScript 3.0 documentation for NativeProcess, as follows:
var nativeProcessStartupInfo:NativeProcessStartupInfo = new NativeProcessStartupInfo();
var file:File = File.applicationDirectory.resolvePath("test.py");
nativeProcessStartupInfo.executable = file;
They even include the contents of what test.py might include:
#!/usr/bin/python
# ------------------------------------------------------------------------------
# Sample Python script
# ------------------------------------------------------------------------------
import sys
for word in sys.argv:  # echo the command line arguments
    print word
print "HI FROM PYTHON"
print "Enter user name"
line = sys.stdin.readline()
sys.stdout.write("hello," + line)
The problem is that, as far as I can see, this simply doesn't work. I get the following error when I attempt it:
Error #3219: The NativeProcess could not be started. '%1 is not a valid Win32 application.
Presumably the latest version of AIR (19.0) doesn't allow the execution of anything without an "exe" file extension. The following code does seem to do what I want:
var nativeProcessStartupInfo:NativeProcessStartupInfo = new NativeProcessStartupInfo();
var file:File = new File("C:/Python/Python35/python.exe");
nativeProcessStartupInfo.executable = file;
nativeProcessStartupInfo.workingDirectory = File.applicationDirectory.resolvePath(".");
var processArgs:Vector.<String> = new Vector.<String>();
processArgs[0] = "test.py";
nativeProcessStartupInfo.arguments = processArgs;
The problem here is twofold. First, you need to know the absolute path to the executable, which I can't assume. Second, the code is no longer platform independent. The file extension would be something else on Linux or Mac.
I thought I might solve the first problem by requiring a %PYTHON_PATH% environment variable and then making the executable dependent on that. However, I can't figure out a way to use an environment variable within the ActionScript File object. It "helpfully" escapes all the "%" characters before ever sending something to the command line.
At this point this fairly simple problem has turned into a showstopper. Could someone help me understand a way to either:
Execute something with the "py" extension with NativeProcess
Successfully resolve a path that depends on an environment variable in the File object?
I have a python script that needs to access and query a MaxMind (.mmdb) file. My current thought is to load the MaxMind file into HDFS's distributed cache and then pass it through Pig to my Python script. My current Pig script is:
SET mapred.cache.file /path/filelocation/;
SET mapred.createsymlink YES;
SET mapred.cache.file hdfs://localserver:8020/pathtofile#filename;
REGISTER 'pythonscript' USING jython AS myudf;
logfile= LOAD 'filename' USING PigStorage(',') AS (x:int);
RESULT = FOREACH logfile GENERATE myudf.pyFunc(x,"how to pass in MaxMind file");
Any thoughts as to how to access the file from inside the Python script once it's loaded into the distributed cache?
Thanks
I think you can do it like this:
set mapred.cache.files hdfs:///user/cody.stevens/testdat//list.txt#filename;
SET mapred.createsymlink YES;
REGISTER 'my.py' USING jython AS myudf;
a = LOAD 'hdfs:///user/cody.stevens/pig.txt' as (x:chararray);
RESULT = FOREACH a GENERATE myudf.list_files(x,'filename');
STORE RESULT into '$OUTPUT';
and here is the corresponding my.py that I used for this example:
#!/usr/bin/env python
import os

# outputSchema("ls:chararray")
def list_files(x, f):
    # ls = os.listdir('.')
    fin = open(f, 'rb')
    return [x, fin.readlines()]

if __name__ == '__main__':
    print "ok"
Almost forgot... I was calling it like this:
pig -param OUTPUT=/user/cody.stevens/pigout -f dist.pig
It should be in your local dir, so Python should be able to access it. In that example 'filename' is the name of the symbolic link, so you will have to update it accordingly. In your case you will want 'filename' to be your MaxMind file, and depending on what the values in 'a' are, you may need to change that back to 'as (x:int)'.
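Adapted to the question, a rough sketch of what a MaxMind lookup UDF might look like once the .mmdb file is symlinked into the task's working directory. The function name, schema, and field access are assumptions, and it relies on the pure-Python maxminddb reader being importable under Jython, which you would need to verify:
#!/usr/bin/env python
import maxminddb

# Pig's Jython engine supplies the outputSchema decorator when the script is REGISTERed.
@outputSchema("country:chararray")
def lookup_country(ip, mmdb_symlink):
    # mmdb_symlink is the symlink name (the part after '#' in mapred.cache.files).
    reader = maxminddb.open_database(mmdb_symlink)
    record = reader.get(ip)
    reader.close()
    if record and 'country' in record:
        return record['country']['iso_code']
    return None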
Good luck!