How to convert Dlib weights into tflite format? - tensorflow

I want to convert the Dlib weights for face detection, face landmarks, and face recognition, which are in .dat format, into .tflite format. TensorFlow Lite requires its input as a TensorFlow SavedModel, a frozen graph (.pb), or a Keras model (.h5). Converting the Dlib .dat files to any of these formats would also work. Can anyone explain how to do this, or point to already-converted files?

TensorFlow Lite requires its input as a TensorFlow SavedModel, a frozen graph (.pb), or a Keras model (.h5). Converting the Dlib .dat files to any of these formats would also work.
I think you're on the right track. Note that there is no off-the-shelf Dlib-to-TensorFlow converter, so in practice this means re-implementing the network architecture in TensorFlow, porting the Dlib weights into it, and exporting a frozen graph. You can then convert that frozen graph to TensorFlow Lite format following the guide.
Have you tried this? Did you run into any problem when running tflite_convert? If you have further questions, please update the original question with detailed error messages.
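The frozen-graph-to-tflite step can be sketched as follows. A tiny toy graph stands in for the Dlib-derived one, and the tensor names (`x`, `y`) and file names are illustrative, not anything Dlib produces:

```python
import tensorflow as tf

# Build a tiny TF1-style graph and freeze it, standing in for the
# graph you would reconstruct from the Dlib weights.
tf.compat.v1.disable_eager_execution()
g = tf.Graph()
with g.as_default():
    x = tf.compat.v1.placeholder(tf.float32, [1, 4], name="x")
    w = tf.Variable(tf.ones([4, 2]), name="w")
    y = tf.identity(tf.matmul(x, w), name="y")
    with tf.compat.v1.Session(graph=g) as sess:
        sess.run(tf.compat.v1.global_variables_initializer())
        # Bake the variable values into constants ("freezing").
        frozen = tf.compat.v1.graph_util.convert_variables_to_constants(
            sess, g.as_graph_def(), ["y"])

with open("frozen_graph.pb", "wb") as f:
    f.write(frozen.SerializeToString())

# Convert the frozen graph to TFLite; input/output array names must
# match the node names in the graph.
converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
    "frozen_graph.pb", input_arrays=["x"], output_arrays=["y"])
with open("model.tflite", "wb") as f:
    f.write(converter.convert())
```

The `tflite_convert` command-line tool wraps the same converter, so running into errors there usually means the same call fails from Python too.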

Related

Build a TensorFlow model from saved_model file

I trained a model using yolov5, then exported it to TensorFlow saved_model format; the result was a yolo5s.pt file. As far as I know yolov5 uses PyTorch, but I prefer TensorFlow. Now I want to build a model in TensorFlow from the saved_model file. How can I do it?
It would be preferable if the solution ran in Google Colab. I didn't include my code because I have no idea how to start.
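Setting aside the filename confusion (a .pt file is a PyTorch checkpoint; yolov5's export script writes an actual SavedModel directory when asked for that format), loading a SavedModel back into TensorFlow looks like this minimal sketch, with a made-up module and path in place of the real model:

```python
import tensorflow as tf

# A tiny stand-in model to save and reload in SavedModel format.
class Scale(tf.Module):
    def __init__(self):
        super().__init__()
        self.w = tf.Variable(2.0)

    @tf.function(input_signature=[tf.TensorSpec([None], tf.float32)])
    def __call__(self, x):
        # Multiply the input by a trainable scalar.
        return x * self.w

# Writes a SavedModel *directory* (saved_model.pb + variables/).
tf.saved_model.save(Scale(), "demo_saved_model")

# Load the directory back and run inference.
reloaded = tf.saved_model.load("demo_saved_model")
print(reloaded(tf.constant([1.0, 3.0])).numpy())  # [2. 6.]
```

This runs as-is in Google Colab; for a real export you would pass the exported directory's path to `tf.saved_model.load`.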

Is it possible to quantize a Tensorflow Lite model to 8-bit weights without the original HDF5 file?

I'm trying to compile a tflite model with the Edge TPU compiler to make it compatible with Google's Coral USB Accelerator, but when I run edgetpu_compiler the_model.tflite I get a Model not quantized error.
I then wanted to quantize the tflite model to an 8-bit integer format, but I don't have the model's original .h5 file.
Is it possible to quantize a tflite-converted model to an 8-bit format?
@garys unfortunately, TensorFlow doesn't have an API to quantize a float tflite model. For post-training quantization, the only APIs they have go from a full TensorFlow model (.pb, HDF5/.h5, SavedModel, ...) to tflite. The quantization happens during tflite conversion, so to my knowledge there isn't a way to do this.
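The conversion-time quantization described above looks like the following sketch. A throwaway Keras model stands in for the real one, since the point is that you need the full model, not the already-converted float .tflite; the Edge TPU additionally requires full-integer quantization calibrated with a representative dataset:

```python
import numpy as np
import tensorflow as tf

# Throwaway model standing in for the real full model.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(2),
])

def representative_data():
    # A few samples from the input distribution, used to calibrate
    # the activation ranges for integer quantization.
    for _ in range(10):
        yield [np.random.rand(1, 4).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data
# Restrict to int8 ops and int8 I/O, as the Edge TPU compiler expects.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8
with open("model_int8.tflite", "wb") as f:
    f.write(converter.convert())
```

The resulting model_int8.tflite is what you would feed to edgetpu_compiler.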

Mask_RCNN: How to save trained model and convert it to tflite

I followed the tutorial at
https://machinelearningmastery.com/how-to-train-an-object-detection-model-with-keras/
After successful training I got 5 .h5 files:
mask_rcnn_kangaroo_cfg_0001.h5
mask_rcnn_kangaroo_cfg_0002.h5
mask_rcnn_kangaroo_cfg_0003.h5
mask_rcnn_kangaroo_cfg_0004.h5
mask_rcnn_kangaroo_cfg_0005.h5
I am a newbie to this, so my understanding may be wrong:
How can I convert these .h5 files to .pb files or better to .tflite files, so I can use them in an Android Object Detection app?
You don't need to convert these .h5 files to .pb; you can convert Keras .h5 files directly to tflite. Here is the official documentation on how to do it.
Make sure to have the model with layers supported by TFLite, as mentioned here.
Once you have the .tflite model you can run an interpreter on Android.
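The direct .h5-to-.tflite path can be sketched as follows. A toy model stands in for the Mask R-CNN weights here, since the real mask_rcnn_kangaroo_cfg_*.h5 files would also need their custom layers registered at load time and supported by TFLite:

```python
import tensorflow as tf

# Toy model saved in HDF5 format, standing in for the trained .h5 file.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(2),
])
model.save("demo.h5")

# Load the .h5 and convert straight to TFLite -- no .pb step needed.
loaded = tf.keras.models.load_model("demo.h5")
converter = tf.lite.TFLiteConverter.from_keras_model(loaded)
with open("demo.tflite", "wb") as f:
    f.write(converter.convert())
```

The demo.tflite file is what you would then bundle into the Android app and run with the TFLite interpreter.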

Can you convert a .tflite model file to .coreml - or back to a Tensorflow .pb file or keras h5 file?

General question: is there tooling to convert from tflite format to any other format?
I'm trying to convert a Keras model to a CoreML model, but I can't because the model uses a layer type unsupported by CoreML (Gaussian Noise). Converting the Keras .h5 model to .tflite is simple, removes the offending layer (which is only used in training anyway), and performs some other optimisations. But it doesn't seem possible to convert out of the resulting tflite to any other format, and coremltools doesn't support tflite.
I thought I could load the model from tflite into a TensorFlow session, save a .pb from there, and convert that to CoreML using coremltools, but I can't see a way to load the tflite model into a TensorFlow session. I saw the documentation linked to in this question, but that seems to use the tflite interpreter to read the tflite model, rather than a "true" TensorFlow session.

How to retrieve original TensorFlow frozen graph from .tflite?

Basically I am trying to use Google's pretrained speaker-id model for speaker detection. But since this is a TensorFlow Lite model, I can't use it as-is on my Linux PC, so I am trying to find a converter back to its frozen graph form.
Any help with such a converter, or any direct way to use TensorFlow Lite pretrained models on desktop, would be appreciated.
You can use the same converter that generates tflite models to convert one back to a .pb file, if that is what you're searching for.
https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/lite/toco/g3doc/cmdline_examples.md
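For the "use it on desktop directly" part: the TFLite interpreter that ships with TensorFlow runs .tflite files on a Linux PC without any conversion back. A minimal sketch, where a toy model is converted first so the snippet is self-contained; the real model's path and tensor shapes would differ:

```python
import numpy as np
import tensorflow as tf

# Build and convert a toy model so we have a .tflite file to run.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(2),
])
with open("toy.tflite", "wb") as f:
    f.write(tf.lite.TFLiteConverter.from_keras_model(model).convert())

# Run it on desktop with the TFLite interpreter -- no TF session needed.
interpreter = tf.lite.Interpreter(model_path="toy.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], np.zeros(inp["shape"], dtype=np.float32))
interpreter.invoke()
print(interpreter.get_tensor(out["index"]).shape)  # (1, 2)
```

For the pretrained speaker-id model, you would point `model_path` at the downloaded .tflite file and fill the input tensor with real audio features.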