NativeProcess communication giving error - python

I am trying to communicate with a Python script from ActionScript. It gives me an error on this line:
var stdOut:ByteArray = process.standardOutput;
from the function shown below:
public function onOutputData(event:ProgressEvent):void
{
    var stdOut:ByteArray = process.standardOutput; //error
    var data:String = stdOut.readUTFBytes(process.standardOutput.bytesAvailable);
    trace("Got: ", data);
}
Error is:
Implicit coercion of a value with static type IDataInput to a possibly
unrelated type ByteArray.
I am following the same approach as on Adobe's page. Here is some testable code:
package
{
    import flash.display.Sprite;
    import flash.desktop.NativeProcessStartupInfo;
    import flash.filesystem.File;
    import flash.desktop.NativeProcess;
    import flash.events.ProgressEvent;
    import flash.utils.ByteArray;

    public class InstaUtility extends Sprite
    {
        public var nativeProcessStartupInfo:NativeProcessStartupInfo = new NativeProcessStartupInfo();
        public var file:File = new File("C:/Python27/python.exe");
        public var process:NativeProcess = new NativeProcess();

        public function InstaUtility()
        {
            nativeProcessStartupInfo.executable = file;
            nativeProcessStartupInfo.workingDirectory = File.applicationDirectory.resolvePath(".");
            trace("Location " + File.applicationDirectory.resolvePath(".").nativePath);
            var processArgs:Vector.<String> = new Vector.<String>();
            processArgs[0] = "test.py";
            nativeProcessStartupInfo.arguments = processArgs;
            var process:NativeProcess = new NativeProcess();
            process.addEventListener(ProgressEvent.STANDARD_OUTPUT_DATA, onOutputData);
            process.start(nativeProcessStartupInfo);
        }

        public function onOutputData(event:ProgressEvent):void
        {
            var stdOut:ByteArray = process.standardOutput; //error
            var data:String = stdOut.readUTFBytes(process.standardOutput.bytesAvailable);
            trace("Got: ", data);
        }
    }
}

The NativeProcess could not be started. Not supported in current
profile.
Are you testing in the Flash IDE?
Testing within the IDE: in your AIR Publish Settings, make sure you tick only "Extended Desktop" when debugging through the IDE. This way you also get traces, etc.
Testing after publish: you must tick both "Desktop" and "Extended Desktop", and also tick "Windows Installer (.exe)". Install your app using the generated .exe file (not the .air file).
Implicit coercion of a value with static type IDataInput to a possibly
unrelated type ByteArray.
var stdOut:ByteArray = process.standardOutput; //error is not how it's done. Don't create a new var each time the progress event fires: each firing holds only around 32 KB or 64 KB of bytes (I can't remember which), so if the expected result is larger, the event will continue to fire in multiple chunks. Use and recycle a single public ByteArray to hold all the result data.
Try a setup like the one below:
//# Declare the public variables
public var stdOut : ByteArray = new ByteArray();
public var data_String : String = "";
Your process also needs a NativeProcessExitEvent.EXIT listener:
process.addEventListener(NativeProcessExitEvent.EXIT, on_Process_Exit );
Before you .start() a process, also clear the ByteArray, ready for new data, with stdOut.clear();.
Now your progress handler can look like the one below (the process puts its result data into the stdOut bytes).
public function onOutputData (event:ProgressEvent) : void
{
    //var stdOut:ByteArray = process.standardOutput; //error
    //# Progress could fire many times so keep adding data to build the final result
    //# "stdOut.length" will be zero at first but add more data to tail end (ie: length)
    process.standardOutput.readBytes( stdOut, stdOut.length, process.standardOutput.bytesAvailable );

    //# Below should be in a Process "Exit" listener but might work here too
    stdOut.position = 0; //move pointer back before reading bytes
    data_String = stdOut.readUTFBytes( stdOut.length );
    trace("function onOutputData -- Got : " + data_String );
}
But you really should add an "onProcessExit" listener and only check for results when the process itself has completed. (Tracing there is much safer for a guaranteed result.)
public function on_Process_Exit (event : NativeProcessExitEvent) : void
{
    trace ("PYTHON Process finished : ############# ");
    stdOut.position = 0; //# move pointer back before reading bytes
    data_String = stdOut.readUTFBytes( stdOut.length );
    trace("PYTHON Process Got : " + data_String );
}
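For completeness, here is a minimal test.py the AIR app above could invoke — a sketch, assuming the script only needs to write something to stdout (the filename test.py comes from the question's processArgs; the message is arbitrary):

```python
# test.py -- minimal script for the NativeProcess example.
# Flushing stdout ensures the AIR app's STANDARD_OUTPUT_DATA
# event fires promptly instead of waiting on the pipe buffer.
import sys

def main():
    sys.stdout.write("hello from python\n")
    sys.stdout.flush()

if __name__ == "__main__":
    main()
```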

Related

How to make a progress bar for an Electron app by using child_process on a Python Script?

The problem
I am creating an Electron-React frontend for my Python script.
Electron and Python communicate via Node.js child_process module, using the function spawn to call my Python script.
The script does the following task:
Gets path to a folder, which contains 15 pdf files.
Loads 1 pdf file at a time.
Skims through it to find the first word of the 5th page.
Saves the result in a list.
Continues until all pdf files have been read.
As you can see, this script takes some time to run.
What I want to do: once the process is called from Electron, I want a progress bar to appear, indicating how many pdf files have been processed out of the 15 given. For example: 1/15 done, 2/15 done, etc.
I tried googling the answer, but after a week I'm at my wits' end. I would very much appreciate any help I could get.
Disclaimer: I know of other good packages like Python Eel which do the job (sort of, in a hackish way). But I really want it to work with the CLI.
The setup
I used Electron React Boilerplate for my purposes.
For simplicity, I used a dummy function for Python:
assets/python/main.py
import sys
import time

# Track the progress
progress = 0

def doStuff():
    # Access the global variable
    global progress
    # Simulate time taken to do stuff. Set at 30 seconds per iteration.
    for i in range(15):
        progress += 1
        time.sleep(30)
    # The result
    return "It is done!"

# Print is used to get the result for stdout.
print(doStuff())
src/main/main.ts
import { spawn } from 'child_process';
import path from 'path';
import { app, BrowserWindow, ipcMain, webContents } from 'electron';
...
let mainWindow: BrowserWindow | null = null;
...
app
  .whenReady()
  .then(() => {
    ipcMain.handle('readPdfs', async (event, arg) => {
      // Spawn a python instance
      const python = spawn('python', [
        path.join(__dirname, '..', '..', 'assets', 'python', 'main.py')
      ]);
      // Get the result and pass it on to the frontend
      python.stdout.on('data', (data) => {
        const result = data.toString('utf8');
        mainWindow.webContents.send('output', result);
      });
    });
...
src/main/preload.ts
import { contextBridge, ipcRenderer, IpcRendererEvent } from 'electron';
...
let mainWindow: BrowserWindow | null = null;
...
contextBridge.exposeInMainWorld('electron', {
  ...
  // Just sending a command forward and getting a response.
  readPdfs: async () => ipcRenderer.invoke('readPdfs'),
  readPdfsOutput: (callback: any) => ipcRenderer.on('output', callback),
});
src/renderer/App.tsx
const Hello = () => {
  const [text, setText] = useState('');
  const handleReadPdfs = async (): Promise<void> => {
    await window.electron.readPdfs();
    await window.electron.output((event: any, response: string) => {
      setText(response);
    });
  };
The question
What I have set up works. I manage to get the "It is done!" message all right. The problem is that I want to get the value of progress from main.py every time it increments, while the process is busy reading the pdfs.
Is there a way to get that value from the Python script using child_process without interrupting a time-consuming process?
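One common approach (a sketch, not taken from the original thread): have the Python side print a progress line after each file and flush stdout, so each line arrives as a separate 'data' event that the existing python.stdout.on('data', ...) handler can forward to the renderer. The PROGRESS n/total line format below is an assumed protocol, not anything Electron mandates:

```python
# Sketch: emit machine-readable progress lines that the Electron
# main process can parse from the child's stdout stream.
import time

TOTAL = 15  # number of pdf files in the question

def do_stuff(delay=0):
    for i in range(TOTAL):
        # ... process one pdf here ...
        time.sleep(delay)
        # flush=True is essential: without it the output sits in
        # Python's buffer and Node receives everything in one burst
        # only when the process exits.
        print(f"PROGRESS {i + 1}/{TOTAL}", flush=True)
    return "It is done!"

if __name__ == "__main__":
    print(do_stuff(), flush=True)
```

On the Node side, the 'data' handler would then split each chunk on newlines and treat lines starting with PROGRESS as progress-bar updates rather than the final result.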

How to replace get_declared_classes() from CakePHP to Python?

I have been moving a website made in CakePHP to Django. In one place I found get_declared_classes(). I think this function returns the list of classes declared before the current class runs.
The first time I encountered this, I just stored the list of classes manually in one file and used that. That solution worked only for one particular web page, but now I have multiple pages calling this class, and every time the list of class names is very different, so I cannot store a fixed class-name list.
This code is actually connecting to and fetching data from here, and I want to rewrite this class in Python (we are replacing the whole website). The only problem I have is how to replace get_declared_classes.
class HrbcConnect {
    private $scope = '';
    private $token = null;
    private $token_expire = 0;
    private $request_opt = array();
    public $id;
    public $id_arr = array();
    public $write_error;

    public function read($resource, $condition = '', $field = array(), $order = null) {
        $declared_classes = get_declared_classes();
        foreach (App::objects('Model') as $v) {
            if (in_array($v, $declared_classes)) {
                $instance = new $v;
                if (property_exists($instance, 'hrbc_cols') && array_key_exists(ucfirst($resource), $instance->hrbc_cols)) {
                    foreach ($instance->hrbc_cols[ucfirst($resource)] as $vv) {
                        if (is_array($vv)) {
                            $field[] = $vv[0];
                        } else {
                            $field[] = $vv;
                        }
                    }
                } elseif (property_exists($instance, 'hrbc_cols_arr')) {
                    foreach ($instance->hrbc_cols_arr as $kk => $vv) {
                        if (array_key_exists(ucfirst($resource), $vv)) {
                            foreach ($vv[ucfirst($resource)] as $vvv) {
                                if (is_array($vvv) && !in_array($vvv[0], $field)) {
                                    $field[] = $vvv[0];
                                } elseif (!is_array($vvv) && !in_array($vvv, $field)) {
                                    $field[] = $vvv;
                                }
                            }
                        }
                    }
                }
            }
        }
    }
}
When I print $v in the above code to find which classes are being used, I get the list of classes defined in my models.
If the question is not clear, please let me know; I can provide more information.
Is there another library that can replace this function in Python? Or is there any other solution I can try?
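There is no direct equivalent of get_declared_classes() in Python, but introspection usually gets the same effect. A sketch, assuming your models share a common base class (in Django itself, django.apps.apps.get_models() returns the registered models directly); BaseModel, Candidate, Client, and the hrbc_cols contents below are hypothetical stand-ins:

```python
# Sketch: collect the fields declared on all known model classes,
# mirroring the PHP loop over get_declared_classes().
class BaseModel:
    """Stand-in for your common model base class."""

def declared_models():
    # __subclasses__() lists every subclass defined (imported) so far --
    # roughly what get_declared_classes() gave the PHP code,
    # already filtered down to models.
    return BaseModel.__subclasses__()

def collect_fields(resource):
    fields = []
    for cls in declared_models():
        cols = getattr(cls, "hrbc_cols", None)
        if cols and resource.capitalize() in cols:
            for col in cols[resource.capitalize()]:
                value = col[0] if isinstance(col, (list, tuple)) else col
                if value not in fields:
                    fields.append(value)
    return fields

# Hypothetical models for illustration:
class Candidate(BaseModel):
    hrbc_cols = {"Resource": ["name", ("age", "int")]}

class Client(BaseModel):
    hrbc_cols = {"Resource": ["name", "email"]}
```

The key design difference from PHP: classes only appear in __subclasses__() once their module has been imported, so make sure the model modules are imported before calling collect_fields.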

PythonKit/PythonLibrary.swift:46: Fatal error: Python library not found. Set the PYTHON_LIBRARY environment variable with the path to a Python library

OK, I'm fairly new to Swift and am trying to create an app with a button that uses Python. I have some code looking like this:
//
//  ContentView.swift
//  Shared
//
//  Created by Ulto4 on 10/17/21.
//
import SwiftUI
import PythonKit

struct ContentView: View {
    @State private var showDetails = false
    @State var result: String = " "

    var body: some View {
        HStack {
            Text("Hello, world!")
                .padding()
            Button(action: {
                self.coolPerson()
            }, label: {
                Text("Respones")
            })
            Text("\(result)")
        }
    }

    func coolPerson() {
        let sys = Python.import("sys")
        sys.path.append("/Users/ulto4/Documents/vsc")
        let example = Python.import("ahhhhh")
        let response = example.hi()
        result = response.description
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
            .preferredColorScheme(.dark)
    }
}
The code in the Python file is:
def hello():
    return "cool"
However, when I click the button I get this error:
2021-10-17 17:53:16.943097-0700 GAAIN[27059:2277939] PythonKit/PythonLibrary.swift:46: Fatal error: Python library not found. Set the PYTHON_LIBRARY environment variable with the path to a Python library.
(lldb)
I also clicked the .xcodeproj and deleted the Apple Sandbox, but it still doesn't work. Since I'm fairly new, I don't know how to do this. Any help would be appreciated.
EDIT: As per the comments, iOS doesn't support PythonKit.
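For the macOS case, the usual fix is to point PYTHON_LIBRARY at the Python dynamic library before the app starts. A sketch — the path below is hypothetical and machine-specific, so locate yours first:

```shell
# Find your Python library directory:
#   python3 -c "import sysconfig; print(sysconfig.get_config_var('LIBDIR'))"
# Then point PythonKit at the dynamic library (example path shown):
export PYTHON_LIBRARY="/Library/Frameworks/Python.framework/Versions/3.9/Python"
```

When running from Xcode, the same variable can be set per-scheme under Product > Scheme > Edit Scheme > Run > Arguments > Environment Variables.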

How to trigger a Python or C# script to run when .Net/C# Windows service stops?

We have a C#/.NET Windows service that parses big log files for us and updates a meta table as it does so. The problem is that whenever you need to stop the service (or services; we have multiple of them running), one must manually delete the files that are in the process of being parsed in the local folder, and also update the queue DB table where it tracks files to process.
I want to automate this. I am much more familiar with Python, so ideally it would be a Python script as opposed to .NET. Is it possible to have a script that triggers when the service is stopped? How would one do this?
I have tried doing this internally in the .NET service, but since it's multithreaded, files don't get cleaned up neatly. There's always a "can't stop service because another process is using it" error. It's as if the service gets stuck trying to delete files when the OnStop() method is called. This is how I had tried to do it internally within the service:
protected override void OnStop()
{
    ProducerConsumerQueue.Dispose();
    Logger.Info($"{ProducerConsumerQueue.Count()} logs will be canceled");
    CancellationTokenSource.Cancel();
    FileUtil.DeleteFilesInProgress(Constants.ODFS_STAGING);
    MetadataDbContext.UpdateServiceEntriesOnServiceReset();
    //look into some staging directory, delete all files.
    Logger.Info($"{ProducerConsumerQueue.Count()} logs canceled");
}

public static void DeleteFilesInProgress(string directory)
{
    var constantsutil = new ConstantsUtil();
    constantsutil.InitializeConfiguration();
    try
    {
        System.IO.DirectoryInfo di = new DirectoryInfo(directory);
        foreach (FileInfo file in di.GetFiles())
        {
            file.Delete();
        }
    }
    catch (Exception ex)
    {
        Logger.Error(ex.Message);
        string subject = Constants.GENERAL_EMAIL_SUBJECT;
        string body = "The following error occurred in Client.Util.ConstantsUtil:";
        string error = ex.ToString(); //ex.ToString makes it more verbose so you can trace it.
        var result = EmailUtil.Emailalert(subject, body, error);
    }
}

public static int UpdateServiceEntriesOnServiceReset()
{
    int rowsAffected = 0;
    try
    {
        string connectionString = GetConnectionString();
        using (SqlConnection connection = new SqlConnection())
        {
            connection.ConnectionString = connectionString;
            SqlCommand cmd = new SqlCommand();
            cmd.CommandType = CommandType.Text;
            cmd.CommandText = $"UPDATE {Constants.SERVICE_LOG_TBL} SET STATUS = '0'";
            cmd.Connection = connection;
            connection.Open();
            rowsAffected = cmd.ExecuteNonQuery();
        }
    }
    catch (Exception ex)
    {
        Logger.Error($"{ex.Message.ToString()}");
        string subject = Constants.GENERAL_EMAIL_SUBJECT;
        string body = "The following error occurred in Client.MetadatDbContext while Parser was processing:";
        string error = ex.ToString(); //ex.ToString makes it more verbose so you can trace it.
        var result = EmailUtil.Emailalert(subject, body, error);
    }
    return rowsAffected;
}
You can run your script from OnStop:
using System.Diagnostics;
Process.Start("python", "yourscript.py");
// or pass whatever command executes your Python script on your system.
(Note that the file name and arguments must be separate parameters; Process.Start("python yourscript.py") would look for a file literally named "python yourscript.py".)
Then use something like pywin32's win32service to find out the status of the service that launched the script, wait for it to die and release its hold on the files, and then wipe them.
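A sketch of the Python side, polling the service status instead of depending on pywin32 — on Windows, sc query <name> prints a STATE line that can be parsed. The service name and staging folder below are hypothetical:

```python
# Sketch: wait for a Windows service to stop, then clean up its
# staging files. Parsing "sc query" output avoids a pywin32 dependency.
import re
import subprocess
import time
from pathlib import Path

def parse_sc_state(sc_output):
    # "sc query" prints a line like: "    STATE    : 1  STOPPED"
    match = re.search(r"STATE\s*:\s*\d+\s+(\w+)", sc_output)
    return match.group(1) if match else None

def wait_until_stopped(service_name, poll_seconds=2):
    while True:
        out = subprocess.run(
            ["sc", "query", service_name],
            capture_output=True, text=True,
        ).stdout
        if parse_sc_state(out) == "STOPPED":
            return
        time.sleep(poll_seconds)

def wipe_staging(directory):
    # Delete every file in the staging directory, mirroring
    # what DeleteFilesInProgress does in the service.
    for f in Path(directory).iterdir():
        if f.is_file():
            f.unlink()

if __name__ == "__main__":
    wait_until_stopped("MyLogParserService")  # hypothetical service name
    wipe_staging(r"C:\odfs\staging")          # hypothetical staging folder
```

Since the script runs outside the service process, it avoids the "another process is using it" race inside OnStop: cleanup only begins once the service has fully exited and released its file handles.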

Pass a pickle buffer from Node to Python

I have a Node application that subscribes to JSON data streams. I would like to extend this to subscribe to Python pickle data streams (I am willing to drop or convert non-primitive types). The node-pickle and jpickle packages have failed me, so I now wish to write my own Python script to convert pickles to JSON.
I fiddled with the node-pickle source code to get part of it working (I can pass JSON from Node to Python and get back a pickle string, and I can also take a predefined Python dict and pass it to Node as JSON). My problem is getting Python to recognize the data from Node as pickled data. I am passing the data-stream buffer from Node to Python and trying desperately to get the string buffer argument into a format I can pickle.loads.
After much trial and error I have ended up with this:
main.js
const pickle = require('node-pickle');
const amqp = require('amqplib/callback_api');

amqp.connect(`amqp://${usr}:${pwd}@${url}`, (err, conn) => {
  if (err) {
    console.error(err);
  }
  conn.createChannel((err, ch) => {
    if (err) {
      console.error(err);
    }
    ch.assertExchange(ex, 'fanout', { durable: false });
    ch.assertQueue('', {}, (err, q) => {
      ch.bindQueue(q.queue, ex, '');
      console.log('consuming');
      ch.consume(q.queue, msg => {
        console.log('Received [x]');
        const p = msg.content.toString('base64');
        pickle.loads(p).then(r => console.log('Res:', r));
        // conn.close();
      });
    });
  });
});
index.js (node-pickle)
const spawn = require('child_process').spawn,
  Bluebird = require('bluebird');

module.exports.loads = function loads(pickle) {
  return new Bluebird((resolve, reject) => {
    const convert = spawn('python', [__dirname + '/convert.py', '--loads']),
      stdout_buffer = [];
    convert.stdout.on('data', function (data) {
      stdout_buffer.push(data);
    });
    convert.on('exit', function (code) {
      const data = stdout_buffer.join('');
      // console.log('buffer toString', stdout_buffer[0] ? stdout_buffer[0].toString() : null);
      if (data == -1) {
        resolve(false);
      } else {
        let result;
        try {
          result = JSON.parse(data);
        } catch (err) {
          console.log('failed parse');
          result = false;
        }
        resolve(result);
      }
    });
    convert.stdin.write(pickle);
    convert.stdin.end();
  });
};
convert.py (node-pickle)
import sys
try:
    import simplejson as json
except ImportError:
    import json
try:
    import cPickle as pickle
except ImportError:
    import pickle
import codecs
import jsonpickle

def main(argv):
    try:
        if argv[0] == '--loads':
            buffer = sys.stdin.buffer.read()
            decoded = codecs.decode(buffer, 'base64')
            d = pickle.loads(decoded, encoding='latin1')
            j = jsonpickle.encode(d, False)
            sys.stdout.write(j)
        elif argv[0] == '--dumps':
            d = json.loads(argv[1])
            p = pickle.dumps(d)
            sys.stdout.write(str(p))
    except Exception as e:
        print('Error: ' + str(e))
        sys.stdout.write('-1')

if __name__ == '__main__':
    main(sys.argv[1:])
The error I come up against at the moment is:
invalid load key, '\xef'
EDIT 1:
I am now sending the buffer's string representation, instead of the buffer, to Python, then using stdin to read it in as bytes. I started writing the bytes object to a file so I could compare the data received from Node with the buffer received when I subscribe to the data stream from a Python script. I found that they seem to be identical, apart from certain \x.. sequences found when subscribing from Python being represented as \xef\xbf\xbd when subscribing from Node. I assume this has something to do with string encoding? Some examples of the misrepresented sequences are: \x80 (this is the first sequence after the b'; however, \x80 does appear elsewhere), \xe3, and \x85.
EDIT 2:
I have now encoded the string I'm sending to Python as base64 and then, in Python, decoded the stdin buffer using codecs.decode. The buffer I'm writing to the file now looks nearly identical to the Python-only stream, with no more \xef\xbf\xbd substitutions. However, I now come up against this error:
'ascii' codec can't decode byte 0xe3 in position 1: ordinal not in range(128)
Also, I found a slight difference when trying to match the last 1000 characters of each stream. There is a section in the Python stream (\x0c,'\x023) that looks like this (\x0c,\'\x023) in the stream from Node. I'm not sure how much that'll affect things.
EDIT 3 (Success!):
After searching for my new error, I found the last piece of this encoding puzzle. Since I was working in Python 3 and the pickle came from Python 2.x, I needed to specify the encoding for pickle.loads as bytes or latin1 (the one I needed). I was then able to make use of the wonderful jsonpickle package to do the work of JSON-serializing the dict, converting datetime objects into date strings.
So I was able to get the node-pickle npm package to work. My flow for getting a buffer of pickled data from Node to Python and getting back JSON is:
In Node
Encode the buffer as a base64 string
Send the string to the Python child process as a stdin input, not an argument
In Python
Read in the buffer from stdin as bytes
Use codecs to decode it from base64
If using Python 3, specify bytes or latin1 encoding for pickle.loads
Use jsonpickle to serialize python objects in JSON
In Node
Collect the buffer from stdout and JSON.parse it
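The Python half of that flow can be exercised on its own — a sketch using only the standard library (plain json stands in for jsonpickle here, so it only covers primitive types), assuming the pickle arrives base64-encoded as described above:

```python
# Sketch: decode a base64-wrapped pickle and re-serialize it as JSON,
# mirroring what convert.py does for the --loads path.
import base64
import json
import pickle

def pickled_base64_to_json(b64_text):
    raw = base64.b64decode(b64_text)
    # encoding="latin1" lets Python 3 load byte strings pickled
    # by Python 2.x, as discovered in EDIT 3 above.
    obj = pickle.loads(raw, encoding="latin1")
    return json.dumps(obj)

# Round-trip demonstration with a pickle we create ourselves:
payload = base64.b64encode(pickle.dumps({"status": "ok", "count": 3})).decode("ascii")
```

For real Python 2 pickles containing datetimes or custom classes, jsonpickle (as in convert.py) takes the place of json.dumps here.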