I've got the following setup:
A VB.NET web service is running, and it needs to regularly call a Python script with a machine learning model to predict some stuff. To do this, my web service generates a file with the input for Python and runs the Python script as a subprocess. The script makes predictions and returns them, via standard output, back to the web service.
The problem is that the script requires a few seconds to import all the machine learning libraries and load the saved model from disk, which is much longer than the actual prediction takes. During this time the web service is blocked by the running subprocess. I have to reduce this time drastically.
What I need is a solution to either:
1. Improve the library-import and model-loading time.
2. Keep Python running all the time, with the imports and ML model already loaded, and have the VB.NET web service communicate with it (see the sketch below).
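To make option 2 concrete, here is a minimal sketch of what such a long-running worker could look like. Everything in it is an assumption on my part: the joblib/scikit-learn model format, the file name, and the one-JSON-request-per-line protocol over stdin/stdout.

```python
# Minimal sketch of option 2 (assumptions: joblib/scikit-learn model,
# one JSON request per line over stdin/stdout).
import sys
import json
from joblib import load

# The expensive part runs exactly once, when the worker starts.
model = load("model.joblib")

# The worker then stays alive and answers requests as they arrive.
for line in sys.stdin:
    features = json.loads(line)                          # e.g. [1.0, 2.5, 0.3]
    prediction = model.predict([features])
    print(json.dumps(prediction.tolist()), flush=True)   # flush so the caller isn't kept waiting
```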
Not sure I understood the question, but here are some things I can think of.
If this is a network issue, you could have Python compress the data before sending it over the web.
Or, if you're able to, use multithreading while reading the file from the web.
Maybe you should upload some more code so we can help you better.
I've found what I needed.
I used web.py to convert the Python script into a web service, and now the VB.NET and Python web services can communicate with each other. Python is running all the time, so there is no delay from loading libraries and data each time a calculation has to be done.
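For anyone who finds this later, the service ended up looking roughly like the sketch below. The /predict route, the JSON payload, and the joblib model file are my own choices here, not anything web.py requires.

```python
# Rough sketch of the web.py service (route name, payload format,
# and joblib model are assumptions).
import json
import web
from joblib import load

urls = ("/predict", "Predict")
model = load("model.joblib")  # loaded once, when the service starts

class Predict:
    def POST(self):
        features = json.loads(web.data())  # request body: a JSON list of features
        return json.dumps(model.predict([features]).tolist())

if __name__ == "__main__":
    app = web.application(urls, globals())
    app.run()  # listens on port 8080 by default
```

The VB.NET side now just sends an HTTP POST instead of spawning a subprocess, so the only per-request cost is the prediction itself.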
Related
We're using grpc and protobuf objects as our communication method for a reinforcement learning project, using Python as a client and an external program as the server.
We're currently trying to get multiple environments running in parallel to speed up training times, hence the use of Ray.
However, I am running into issues getting our existing communication objects to serialize properly for ray.remote and similar calls.
This is the main error message that comes up:
"TypeError: cannot pickle 'google.protobuf.pyext._message.MessageDescriptor' object"
Any potential solutions or ideas? My Python knowledge is lacking a little, so I'm not exactly sure what the best way around this is.
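One possible workaround (an assumption on my part, not something confirmed in this thread): protobuf message objects aren't picklable, but their wire format is plain bytes, so you can call SerializeToString() before the message crosses the Ray boundary and ParseFromString() inside the task. A sketch, where my_protos_pb2 and Observation stand in for your generated module and message type:

```python
# Hypothetical sketch: my_protos_pb2 / Observation are placeholders for
# your generated protobuf module and message type.
import ray
from my_protos_pb2 import Observation

ray.init()

@ray.remote
def run_step(obs_bytes: bytes) -> bytes:
    obs = Observation()
    obs.ParseFromString(obs_bytes)  # rebuild the message inside the worker
    # ... step the environment / talk to the grpc server here ...
    return obs.SerializeToString()  # hand plain bytes back across the boundary

obs = Observation()
result = Observation()
result.ParseFromString(ray.get(run_step.remote(obs.SerializeToString())))
```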
So far, when dealing with web scraping projects, I've used GAppsScript, meaning that I can easily trigger the script to run once a day.
Is there an equivalent service for Python scripts? I have a Raspberry Pi, so I guess I can keep it on 24/7 and use cron jobs to trigger the script daily. But that seems rather wasteful, since I'm talking about a few small scripts that take only a few seconds to run.
Is there any service that allows me to trigger a Python script once a day, without needing to keep a local machine on 24/7? The simpler the solution the better; I wouldn't want to overengineer such a basic use case if a ready-made system already exists.
The only service I've found so far that does this is WayScript; here's a Python example running in the cloud. The free tier should be enough for most simple/hobby-tier use cases.
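For completeness, the Raspberry Pi route mentioned in the question is a one-line crontab entry; the paths below are placeholders:

```
# crontab -e on the Pi: run the script every day at 07:00,
# appending its output to a log file (paths are placeholders)
0 7 * * * /usr/bin/python3 /home/pi/scripts/scrape.py >> /home/pi/scrape.log 2>&1
```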
I was working on a project that renders images on the server, which then sends them to the client. The server is currently an AWS Lambda function that runs a Python script. The script uses the bpy module, but the problem is that importing that module takes too much time. I want to somehow reduce that time so that my overall function cost and runtime are reduced. Any sort of help will be appreciated. Thanks.
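One pattern that sometimes helps here (an assumption about your setup, not a guaranteed fix): keep the bpy import at module scope, so only cold starts pay for it; warm invocations of the same Lambda container reuse the already-imported module. A sketch, with the handler name, render settings, and response shape all as placeholders:

```python
# Sketch: pay the bpy import cost only on cold starts (handler name,
# render settings, and response shape are assumptions).
import base64

import bpy  # module-scope import: runs once per container, not once per request

def handler(event, context):
    scene = bpy.context.scene
    scene.render.filepath = "/tmp/out.png"   # /tmp is Lambda's only writable path
    bpy.ops.render.render(write_still=True)  # render the current scene to disk
    with open("/tmp/out.png", "rb") as f:
        return {
            "statusCode": 200,
            "isBase64Encoded": True,  # API Gateway convention for binary bodies
            "body": base64.b64encode(f.read()).decode(),
        }
```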
I have written a Python program with a CSV file as its backend. The algorithm involved performs long, deep numerical computations, so I am not able to run it on my local computer. Last time I ran it, it took more than 2 hours but didn't fully execute. Can anyone tell me whether there is any way of executing the program, with the CSV file as its backend, on some servers?
Or is there any other way to do it?
Also, I don't want to use Spark for this purpose.
I'm running Caffe using Python on AWS.
The script uses the GPU, loads an existing model, and checks its output for each image URL.
On the first few tries the script ran well. Then it got stuck, at a different phase each time.
Using top, I could see that ksoftirqd/0 gets about 93% of the CPU while the process is stuck.
I don't think there is a bug in my script, because originally it ran well on the server. Rebooting the server sometimes helps, but later we get the same problem.
Killing all Python processes on the server doesn't help; only rebooting it does.
Any ideas what I can do here?
It seems like you are experiencing a very high network load.
What exactly are you trying to download from URLs?
Are there any other processes running at the same time on the machine?
It is difficult to diagnose your problem without the specifics of your script.