I have trained a Tiny YOLOv3 model with custom data in Keras and want to deploy the model on a Raspberry Pi.
I have converted the Keras .h5 model to an int-quantized .tflite model and wanted to run inference with tflite_support.task, but that API seems to support only SSD-style networks, so it doesn't work for YOLO and raises an error that the model requires metadata.
So my question is: what is the best way to deploy a .h5 Keras model on a Raspberry Pi? I have also tried converting it to a frozen .pb and using OpenCV's dnn module, but the conversion doesn't seem to work even when following this: How to export Keras .h5 to tensorflow .pb?
Running Keras directly on the Raspberry Pi isn't really an option, since it would require a full TensorFlow installation.
Is there a lightweight way to deploy it using opencv.dnn or the TFLite interpreter?
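For what it's worth, here is a minimal sketch of what inference could look like with the standalone tflite-runtime interpreter instead of tflite_support.task (no metadata needed). The model path is a placeholder, an int-quantized Tiny YOLOv3 export is assumed, and box decoding/NMS still has to be done by hand:

```python
# Minimal tflite-runtime inference sketch (pip install tflite-runtime).
# Model path is a placeholder; an int-quantized Tiny YOLOv3 export is assumed.
import numpy as np
from tflite_runtime.interpreter import Interpreter

interpreter = Interpreter(model_path="tiny_yolov3_int8.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy input matching the model's declared shape and dtype; replace this with a
# real frame resized to the network input size.
dummy = np.zeros(tuple(input_details[0]["shape"]), dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()

# YOLO outputs are raw feature maps; decoding boxes and applying NMS is left to NumPy.
raw_outputs = [interpreter.get_tensor(d["index"]) for d in output_details]
print([o.shape for o in raw_outputs])
```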
Related
Working on my local computer, I've created a TensorFlow object detector. I have exported the model (which I've tested using the checkpoints) to a protobuf file as well as several other formats (TF Lite, TF.js, etc.). I now need to transfer this trained model to another computer that doesn't have the Object Detection API or the other packages I needed to build the model.
Do I need all of these dependencies on the new machine, or does the protobuf file contain everything the machine will need? The new machine only has the basic Anaconda environment packages plus TensorFlow.
Protobuf files most commonly contain both the model architecture and the weights, so in theory you can load your model on any machine that has TensorFlow.
The only problems I can think of are custom layers/losses/optimizers and data pre-/post-processing.
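For example, if the export is a TF2 SavedModel directory (the usual output of the Object Detection API's exporter), loading it on the new machine needs nothing beyond TensorFlow itself. A minimal sketch, with a placeholder path and input size:

```python
# Minimal sketch: load an exported SavedModel with plain TensorFlow only.
# The directory name and the 320x320 input size are placeholders.
import numpy as np
import tensorflow as tf

model = tf.saved_model.load("exported_model/saved_model")
infer = model.signatures["serving_default"]

# Object-detection exports typically take a uint8 image batch [1, H, W, 3].
image = np.zeros((1, 320, 320, 3), dtype=np.uint8)
outputs = infer(tf.constant(image))
print({name: tensor.shape for name, tensor in outputs.items()})
```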
I'm using Azure ML Studio to create an automated ML pipeline. I've successfully gotten my model trained and tested in Azure, but it fails on model.to_json() and model.save_weights().
I believe these functions do not exist on my model because scikit-multilearn is a wrapper around Keras. However, I want to be able to save my model and weights so I can deploy them to a web service. The scikit-multilearn model I'm using is Binary Relevance.
Thanks to anyone who helps.
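Not a full answer, but since to_json()/save_weights() are Keras-specific methods, one generic way to persist a scikit-multilearn wrapper is to pickle the whole fitted estimator with joblib. A minimal sketch with toy data, assuming the base classifier is picklable (a Keras-backed base classifier may need separate handling):

```python
# Minimal persistence sketch for a scikit-multilearn BinaryRelevance model.
# Toy data and file names are illustrative; assumes the base classifier pickles.
import joblib
import numpy as np
from sklearn.naive_bayes import GaussianNB
from skmultilearn.problem_transform import BinaryRelevance

X = np.random.rand(6, 4)          # toy features
y = np.array([[1, 0, 1],          # toy multi-label targets
              [0, 1, 0],
              [1, 1, 0],
              [0, 0, 1],
              [1, 0, 0],
              [0, 1, 1]])

clf = BinaryRelevance(classifier=GaussianNB())
clf.fit(X, y)

joblib.dump(clf, "binary_relevance.joblib")    # model + fitted weights in one file
restored = joblib.load("binary_relevance.joblib")
print(restored.predict(X).toarray())
```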
I have trained my Keras model and converted it into a Core ML model.
I have also developed an iPhone app using Swift.
Now I want to extract features from the input audio files using librosa library and pass those features to the trained model to get predictions. The prediction results will be displayed on the iPhone.
How can I achieve this? Am I missing something? Kindly help with this!
I am new to Swift and the iOS development world.
I have a similar task.
I partially ported libRosa to Swift.
It's in development, but please try:
https://github.com/dhrebeniuk/RosaKit
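For reference, here is a minimal Python sketch of the librosa feature step that would have to be mirrored on-device (for example via RosaKit). The MFCC settings and file name are assumptions and must match whatever the Keras model was actually trained with:

```python
# Reference feature extraction in Python; the on-device Swift code must
# reproduce the same settings. File name and MFCC parameters are placeholders.
import librosa
import numpy as np

audio, sr = librosa.load("example.wav", sr=22050)        # load audio at a fixed rate
mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=40)   # 40 MFCC coefficients
features = np.mean(mfcc, axis=1)                         # simple pooling over time
print(features.shape)                                    # vector fed to the model
```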
I'm trying to use a custom Keras model, trained with tensorflow-gpu on my desktop, with Python on a mobile phone (Android); however, I need to run it with Python on the phone as well. I looked up TensorFlow Lite, but that appears to be written for Java.
Is there any lite (Python) version of TensorFlow, some kind of bare-bones package that's just set up for making predictions from a TensorFlow/Keras model file? I'm trying to save space, so a solution under 50 MB would be ideal.
Thanks
TensorFlow Serving was built for the specific purpose of serving pre-trained models. I'm not sure whether it runs on Android (or how difficult it would be to make it run there), or whether its compiled footprint is under 50 MB. If you can make it work, please do report back here!
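As a lighter-weight alternative to Serving, the Keras model could be converted to TFLite and run with the small tflite-runtime Python package; whether that package can be installed in a Python-on-Android environment (Termux, Chaquopy, etc.) is a separate question. A minimal conversion sketch, with the .h5 path as a placeholder:

```python
# Minimal Keras -> TFLite conversion sketch; the .h5 path is a placeholder.
import tensorflow as tf

model = tf.keras.models.load_model("my_model.h5")

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]   # post-training quantization to shrink the file
tflite_model = converter.convert()

with open("my_model.tflite", "wb") as f:
    f.write(tflite_model)
print(f"TFLite model size: {len(tflite_model) / 1e6:.1f} MB")
```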
I have a model built using python and tensorflow.
The model is trained and works well, but I don't understand how to deploy it. I mean, how can I call this model in order to obtain a score on actual data?
I cannot use Watson ML deploy because of TensorFlow.
DSX supports training TensorFlow models (without GPUs). I hear DSX will support training TensorFlow with GPUs, and then deploying into Watson Machine Learning (WML), in early 2018.
For other models that you've built in DSX using SparkML, scikit-learn, XGBoost, and SPSS, see the following for details on how to deploy them using WML:
Scala Jupyter Notebook end-to-end tutorial - Train and deploy a SparkML model
Python Jupyter Notebook end-to-end tutorial - Train and deploy a SparkML model
Python Jupyter Notebook: Recognition of hand-written digits - Train and deploy a scikit-learn model
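Outside of WML, if the goal is simply to call the trained model and get a score on new data, another common pattern is to load the saved model inside a small web service you host yourself. A minimal sketch, assuming the model was saved as a Keras .h5 file and using an illustrative JSON payload:

```python
# Minimal self-hosted scoring sketch (not the WML route above). The model path,
# port, and JSON payload shape are illustrative assumptions.
import numpy as np
import tensorflow as tf
from flask import Flask, jsonify, request

app = Flask(__name__)
model = tf.keras.models.load_model("my_model.h5")

@app.route("/score", methods=["POST"])
def score():
    # Expect a JSON body like {"instances": [[...feature values...], ...]}.
    payload = request.get_json(force=True)
    x = np.array(payload["instances"], dtype=np.float32)
    predictions = model.predict(x)
    return jsonify({"scores": predictions.tolist()})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```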