I am trying to execute a Node.js script from Python code running in Google App Engine, something like this:
from Naked.toolshed.shell import muterun_js

def foo(parameter):
    response_from_js = muterun_js('./views/api/generateSignedTransaction.js',
                                  parameter)
    response = response_from_js.stdout
But Naked cannot be used inside GAE because of its C dependencies, and built-in approaches like subprocess.call(), subprocess.check_output() and os.system() did not work either, I am guessing for the same reason: they make system calls and so have C dependencies. Is there any alternative for passing parameters to and executing the Node.js script that would work in GAE?
In your case, depending on what generateSignedTransaction.js does, you can create a Google Cloud Function from your JS script and then just call it from your Python code.
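If the script is deployed as an HTTP-triggered Cloud Function, the Python side reduces to a plain HTTP request. A minimal sketch, assuming a hypothetical function named generateSignedTransaction and a JSON request/response shape:

import requests

# Hypothetical HTTPS endpoint of the Cloud Function wrapping generateSignedTransaction.js
FUNCTION_URL = 'https://REGION-PROJECT_ID.cloudfunctions.net/generateSignedTransaction'

def foo(parameter):
    resp = requests.post(FUNCTION_URL, json={'parameter': parameter})
    resp.raise_for_status()
    return resp.json()  # whatever the function returns, instead of reading stdout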
This problem is showing up only on a specific project in Google Cloud (on multiple VMs). The exact same code works perfectly on other projects and on my Mac.
I have a Python module which runs the streaming Speech API with something like this:
responses = client.streaming_recognize(
    streaming_config,
    get_requests()
)
The code is based on the Google sample here
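For context, get_requests() in that sample is essentially a generator that the API pulls from lazily, roughly like this (paraphrased, not the exact sample code; the audio source is a placeholder and the type name may be speech.types.StreamingRecognizeRequest in older library versions):

from google.cloud import speech

def get_requests():
    # The streaming API consumes this generator lazily, which is why
    # "get_requests() is never called" is observable from the outside.
    for chunk in audio_chunk_source():  # placeholder for the real audio source
        yield speech.StreamingRecognizeRequest(audio_content=chunk)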
I can run this module directly and see that it is getting result callbacks. This is true even on the Google VM where later (see below) I have the issues.
But when I run the same module via multiprocessing.Process, the call to streaming_recognize just does nothing.
I used logging to see it is being called and the code continues to run. I get no errors.
But get_requests() is NEVER getting called.
I have tried different service accounts in each of the environments, but the failure shows up only in the one project, regardless of which service account is used.
Is there any reason the behavior should be different in a subprocess?
UPDATE:
I found out that if I start the process with subprocess.Popen instead, everything works correctly. And just as a reminder, this problem (the difference in behavior based on how I start the process) happens only in one specific Google Cloud project, and not anywhere else.
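For reference, the two launch styles being compared look roughly like this (run_streaming_recognition and my_speech_module are placeholders for the real entry points):

import multiprocessing
import subprocess
import sys

def run_streaming_recognition():
    # Placeholder for the module's entry point that calls
    # client.streaming_recognize(streaming_config, get_requests())
    pass

if __name__ == '__main__':
    # Launch style that silently did nothing in the affected project:
    # a child process forked by multiprocessing.
    p = multiprocessing.Process(target=run_streaming_recognition)
    p.start()
    p.join()

    # Launch style that worked: a completely fresh interpreter via Popen.
    proc = subprocess.Popen([sys.executable, '-m', 'my_speech_module'])
    proc.wait()

The practical difference is that Popen starts a clean interpreter, whereas multiprocessing on Linux forks the parent, so the child inherits whatever state (network clients, threads) existed before the fork.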
I know I can access a Dynamics instance from a Python script by using the OData API, but what about the other way around? Is it possible to somehow call a Python script from within Dynamics, and possibly even pass arguments?
Would this require me to use custom JS/C#/other code within Dynamics?
You won't be able to natively execute a Python script within Dynamics.
I would approach this by placing the Python script behind a service that can be called via a web service call from Dynamics. You could make the call from form JavaScript or from a plugin written in C#.
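On the Python side, that service can be as small as a standard-library HTTP endpoint wrapping the script, with arguments passed as query parameters. A rough sketch (the port, path, and parameter names are arbitrary choices for illustration):

from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs
import json

class ScriptHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Arguments from Dynamics arrive as query parameters, e.g. /run?account_id=123
        params = parse_qs(urlparse(self.path).query)
        result = {'echo': params}  # replace with a call into your real script
        body = json.dumps(result).encode('utf-8')
        self.send_response(200)
        self.send_header('Content-Type', 'application/json')
        self.end_headers()
        self.wfile.write(body)

if __name__ == '__main__':
    HTTPServer(('0.0.0.0', 8080), ScriptHandler).serve_forever()

Form JavaScript or a C# plugin in Dynamics would then just issue an HTTP request to this endpoint.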
I have been trying to run Python from SSIS, so I needed to create a package in SQL Server. I can run small inline scripts in SQL Server, but I am not sure how to run script files.
The below works, but my Python code is in test_db.py. How do I run that Python script in SQL Server?
EXEC sp_execute_external_script @language = N'Python',
    @script = N'print(3+4)'
STDOUT message(s) from external script:
7
If the Python engine is installed on the server where you are trying to run this script, you can use the Execute Process Task and call python.exe. Pass the .py file as an argument to the task and that will run the script as well.
There are two approaches to execute python scripts from SSIS:
(1) Executing Python Script using Execute Process Task
You can run the Python script from an Execute Process Task, have it write its output to a flat file, and then read from the flat file into SQL Server (a short sketch of the flat-file step follows the link). You can refer to the following link for more information:
Scraping data with SSIS and Python
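As a rough illustration of that hand-off (not taken from the linked article), the script launched by the Execute Process Task writes a delimited file that a Flat File Source can then load; the path and columns below are made up:

import csv

# Hypothetical test_db.py: compute or fetch rows, then dump them to a CSV
# that a later Data Flow Task reads with a Flat File Source.
rows = [
    {'id': 1, 'value': 'alpha'},
    {'id': 2, 'value': 'beta'},
]

with open(r'C:\ssis\staging\test_db_output.csv', 'w', newline='') as f:
    writer = csv.DictWriter(f, fieldnames=['id', 'value'])
    writer.writeheader()
    writer.writerows(rows)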
(2) Using IronPython
IronPython is an open-source implementation of the Python programming language which is tightly integrated with the .NET Framework. IronPython can use the .NET Framework and Python libraries, and other .NET languages can use Python code just as easily.
You can use a Script Component to integrate the IronPython library:
I haven't used this library before, so I can't speak from experience, but I have read a comment written by @billinkc linking to the answer below, which contains a great guide on how to do that:
SSIS: Execute Ironpython or Ironruby scripts through SSIS
References
Predicting data with Python script in an SSIS package
How to create a SSIS package to ETL JSON from Python REST API request into MSSQL server?
We are using AngularJS as the frontend for our web application, and for some of the functions we use Python to do the calculations and return the results.
I would like to know: is there any way to call the Python script directly from AngularJS? Right now we use the $http service to call PHP, and PHP uses the exec command to call Python; it all works fine.
The problem is we noticed there is about a 5 second delay every time the Python script is called, and I guess it is because of the overhead of starting the Python interpreter each time; we would like to eliminate that delay.
We are running on Red Hat 6.8 / AngularJS 1.4.x and Python 3.6 (Anaconda3).
Has anyone tried something like that? Any suggestions are welcome.
Thank you!
You could write a Python function that calls your calculation code and expose that function as a REST API.
You would need a library like Flask.
This tutorial explains how to do that : https://www.codementor.io/sagaragarwal94/building-a-basic-restful-api-in-python-58k02xsiq
This way you can call the Python API directly.
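A minimal sketch of that idea, keeping the interpreter and any heavy imports resident so each request only pays for the calculation itself (the route and payload shape are assumptions):

from flask import Flask, request, jsonify

app = Flask(__name__)

# Heavy imports / model loading would happen once here, at startup,
# instead of on every call as with the PHP exec approach.

@app.route('/api/calculate', methods=['POST'])
def calculate():
    data = request.get_json()
    # Replace with the real calculation currently done by the script
    result = sum(data.get('values', []))
    return jsonify({'result': result})

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)

AngularJS's $http service can then POST JSON to this endpoint directly, removing the PHP exec hop and the per-call interpreter startup.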
I built a simple web app using Flask. What it basically does is take data from a form and send a POST, whose value is then passed as a command-line argument to a script using
os.popen("python3 script.py " + postArgument).read()
The command's output is stored in a variable which is then passed to an element on a new page with the results.
About the script: it runs the string from the POST through an API, gets some data, processes it, sends it to another API, and finally prints the results (which end up stored in that variable).
It works fine on a local server. But Azure fails to return anything. The string is empty.
How do I get some terminal logs?
Is there a solution?
Per my experience, it seems the issue is caused by the Python 3 (or even Python 2) interpreter being called python on Azure, not python3.
So if you have configured the Python 3 runtime environment in the Application settings on the Azure portal as in the figure below, please use python script.py instead of python3 script.py in your code.
Or you can also use the absolute path of Python 3 on Azure Web Apps, D:\Python34\python, instead of python3 in your code, as below.
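For illustration, the call from the question might then look like this (postArgument stands in for the value taken from the form):

import os

postArgument = "example-input"  # value taken from the submitted form

# Using the interpreter name registered on Azure:
output = os.popen("python script.py " + postArgument).read()

# Or using the absolute path of the Python 3 installation on the web app:
output = os.popen(r"D:\Python34\python script.py " + postArgument).read()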
However, I also suspect another possible issue besides the above case: you may be using some Python packages which were not installed via pip on Azure. If so, please refer to the Troubleshooting - Package Installation section of the official Azure documentation for Python to resolve it.
Hope it helps. If you have any concerns or updates, please feel free to let me know.