Run Python in background of NodeJS

I'm not sure whether what I'm trying to do is possible, but I'm writing a NodeJS application that needs to call some Python functions, in a way similar to How to call Python function from NodeJS by using child_process.
If the Python script has a few imports that add a noticeable delay on startup, and you call the script often, surely that would hurt the application's running time. Is there a way around this: can I keep a child-process Python call open permanently and then invoke a function on it whenever it's needed?
Thanks

There's nothing specific to Python or node.js here. If you don't want the overhead of spawning a child process for every call, make the "other" script / application / whatever a long-running process and use any kind of inter-process communication.
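For example, a minimal sketch of such a long-running worker in Python (the JSON-lines protocol and the function names here are illustrative, not a fixed API):

# worker.py - sketch of a long-running worker: the expensive imports
# happen once at startup, then each JSON line read from stdin is
# answered with one JSON line on stdout.
import sys
import json

def handle(request):
    # dispatch to whatever functions you need; "add" is just a placeholder
    if request.get("op") == "add":
        return {"result": request["a"] + request["b"]}
    return {"error": "unknown op"}

for line in sys.stdin:
    line = line.strip()
    if not line:
        continue
    response = handle(json.loads(line))
    sys.stdout.write(json.dumps(response) + "\n")
    sys.stdout.flush()  # flush so the parent sees each reply immediately

The node.js side would spawn this once with child_process.spawn and write one JSON line per call, instead of starting a new interpreter for every function call.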

Related

Electron python shell vs sockets

What I have is an Electron app that runs the frontend and a Python app that runs in the backend. At some point I need to execute a Python script and get info from it.
I'm currently using python-shell for node.js (communicating through standard I/O), but I found another method that uses sockets. What are the advantages of using sockets compared to standard I/O? From what I understand, I can only pass strings either way. Which performs better?

How To Call Python script from AngularJS App

We are using AngularJS as the frontend for our web application, and for some of the functions we use Python to do the calculations and return the results.
I would like to know whether there is any way to call the Python script directly from AngularJS. Right now we use the $http service to call PHP, and PHP uses the exec command to call Python; it all works fine.
The problem is that we noticed a delay of about 5 seconds every time the Python script is called, and I guess it is the overhead of starting the Python interpreter each time. We would like to eliminate that delay.
We run on Red Hat 6.8 / AngularJS 1.4.x and Python 3.6 (Anaconda3).
Has anyone tried something like this? Any suggestions are welcome.
Thank you!
You could write a Python method that calls your calculation code and expose that method as a REST API.
You would need a library like Flask.
This tutorial explains how to do that: https://www.codementor.io/sagaragarwal94/building-a-basic-restful-api-in-python-58k02xsiq
This way you can call the Python API directly.
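For example, a minimal sketch with Flask might look like this (the route, port and run_calculation function are illustrative, not from the question):

# app.py - sketch: wrap the existing calculation code in a REST endpoint
from flask import Flask, jsonify, request

app = Flask(__name__)

def run_calculation(params):
    # placeholder for your existing calculation code
    return {"answer": sum(params.get("values", []))}

@app.route("/calculate", methods=["POST"])
def calculate():
    result = run_calculation(request.get_json())
    return jsonify(result)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)

Because the interpreter and its imports stay resident between requests, the per-call startup delay disappears, and AngularJS can call the endpoint with $http directly instead of going through PHP.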

Running Octave tasks from Python

I have pretty complex computation code written in Octave and a Python script that receives user input and needs to run the Octave code based on that input. As I see it, I have these options:
Port the Octave code to Python.
Use external libraries (e.g. oct2py) which let you run the Octave/Matlab engine from Python.
Communicate between a Python process and an Octave process. One possibility would be to use subprocess from the Python code and wait for the answer.
Since I'm pretty reluctant to port my code to Python and I don't want to rely on the maintenance of external libraries such as oct2py, I am in favor of option 3. However, since the system should scale well, I do not want to spawn a new Octave process for every request; a task queue system seems more reasonable. Is there any (recommended) task queue system that lets me enqueue tasks in Python and have an Octave worker process them on the other end?
The way it is described here, option 3 degenerates to option 2 because Octave does not have an obvious way (an API or package) for the 'Octave worker' to connect to a task queue.
The only way Octave does "networking" is through the sockets package, and this means implementing the protocol for communicating with the task queue from scratch (in Octave).
The original motivation for having an 'Octave worker' is to have the main process of Octave launch once and then "direct it" to execute functions and return results, rather than launching the main process of Octave for every call to a function.
Since Octave cannot do 'a worker' (that launches, listens to a 'channel' and executes code) out of the box, the only other way to achieve this is to have the task queue framework all work in Python and only call Octave when you need its functionality, most likely via oct2py (i.e. option 2).
There are many different ways to do this, ranging from Redis to PyPubSub, Celery and RabbitMQ. All of them are straightforward and very well documented; PyPubSub does not require any additional components.
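As a rough sketch of what this framing could look like with Celery (assuming a Redis broker on its default port; my_computation stands in for your actual Octave function):

# tasks.py - sketch: a Celery worker owns one persistent Octave session
from celery import Celery
from oct2py import Oct2Py

app = Celery("tasks", broker="redis://localhost:6379/0")
oc = Oct2Py()  # one Octave process per worker, launched once

@app.task
def run_octave(x, y):
    # round-trips through the resident Octave session via oct2py
    return float(oc.my_computation(x, y))

The Python side then enqueues work with run_octave.delay(...), and the Celery worker keeps a single Octave session alive to serve the calls.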
(Just as a note: the solution of having an 'executable' Octave script, calling it via Python and blocking until it returns is not as bad as it sounds, however, and for some parallel-processing frameworks it is the only way to have multiple copies of the same Octave script operate on different data segments.)
All three options are reasonable depending on your particular case.
"I don't want to rely on maintenance of external libraries such as oct2py, I am in favor of option 3"
oct2py is implemented using option 3. You can reinvent what it already does, or use it directly. oct2py is pure Python and has a permissive license: if its development were to stop tomorrow, you could include its code alongside yours.
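For illustration, a minimal sketch of using it directly (my_computation stands in for your actual Octave function):

# sketch: one persistent Octave session, reused across calls
from oct2py import Oct2Py

oc = Oct2Py()  # launches a single Octave process that stays alive
result = oc.my_computation(1.0, 2.0)  # each call reuses the same session
print(result)
oc.exit()  # shut the session down when done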

How to automatically rerun a python program after it finishes? Supervisord?

I have a Python program that I would like to have constantly running, updating and gathering new data. Essentially, I am gathering data from a bunch of domains. My processors take about a day and a half to run. Once they finish, I'd like them to automatically start over again.
I don't want to use a while loop that just restarts the processes without killing everything related first, because some of the packages I am using to support these processors (mainly PyV8) slowly accumulate memory, and I'm not a good enough programmer to dive into debugging a memory leak in a big package like that. So I need all of the related processes to die cleanly and then come back to life.
I have heard that supervisord can do this type of work, but I don't like messing around with .conf files and would prefer to keep everything inside Python.
Summary: is there a package that will kill all related processes, which I could put into a while loop or use to create this kind of behavior inside a Python script?
I don't see why you couldn't use supervisord. The configuration is really simple and very flexible, and it's not limited to Python programs.
For example, you can create a file /etc/supervisor/conf.d/myprog.conf:
[program:myprog]
command=/opt/myprog/bin/myprog --opt1 --opt2
directory=/opt/myprog
user=myuser
Then reload supervisor's config:
$ sudo supervisorctl reload
and it's on. Isn't it simple enough?
More about supervisord configuration: http://supervisord.org/subprocess.html
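For comparison, if you really did want to stay inside Python, the restart loop from the question boils down to a few lines (worker.py is a stand-in for your actual program); supervisord additionally handles crashes, logging and daemonization for you:

# rerun.py - sketch: run the worker in a fresh process each cycle, so
# all memory (including leaks in native packages) is reclaimed between runs
import subprocess
import sys

while True:
    # a brand-new interpreter per cycle; everything it allocated
    # dies with it when the run finishes
    subprocess.run([sys.executable, "worker.py"], check=False)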

Is there a Python equivalent to PHP-FPM?

I'm starting a web project in Python and I'm looking for a process manager that offers reloading in the same manner as PHP-FPM.
I've built things with Python before, and Paste seems similar to what I want, but not quite.
I need the ability to reload the process rather than restart it, so that long-running tasks can complete uninterrupted where necessary.
How about supervisor with uwsgi?
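For instance, a minimal sketch (paths, module name and option values are illustrative): supervisord keeps the uWSGI master alive, and uWSGI's master mode performs graceful reloads so in-flight requests can finish.

; /etc/supervisor/conf.d/myapp.conf - supervisord keeps uWSGI running
[program:myapp]
command=/usr/local/bin/uwsgi --ini /opt/myapp/uwsgi.ini
directory=/opt/myapp
autorestart=true

; /opt/myapp/uwsgi.ini - master mode enables graceful worker reloads
[uwsgi]
module = myapp:application
master = true
processes = 4
socket = /tmp/myapp.sock
; touching this file triggers a graceful reload of the workers
touch-reload = /opt/myapp/reload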
