What is the difference between .pb SavedModel and .tf SavedModel? - tensorflow

For the .pb SavedModel: model.save("my_model") saves in this format by default.
For the .tf SavedModel: model.save("my_model", save_format='tf')
I would like to know the difference between these two formats. Are they both SavedModel? Are they the same? Which is better? Are both TensorFlow file extensions?

See the documentation of tf.keras.Model.save. save_format can have one of two values:
tf (default in TensorFlow 2.x) means TensorFlow format, a SavedModel protocol buffers file.
h5 (default in TensorFlow 1.x) means the HDF5 Keras format, defined back when Keras was completely independent of TensorFlow and aimed to support multiple backends without being tied to any one in particular.
In TensorFlow 2.x you should never need h5, unless you want to produce a file compatible with older versions of Keras or other tooling. SavedModel is also better integrated into the TensorFlow ecosystem, for example if you want to serve the model with TensorFlow Serving.
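A minimal sketch of the two save paths (TF 2.x API; the model and the paths are illustrative):

```python
import tensorflow as tf

# A tiny model just to demonstrate the two formats.
model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
model.build(input_shape=(None, 4))

# SavedModel format: a directory containing saved_model.pb,
# variables/ and assets/.
tf.saved_model.save(model, "my_model_savedmodel")

# HDF5 format: a single file; with tf.keras the .h5 extension
# selects it (equivalent to save_format='h5').
model.save("my_model.h5")
```

The SavedModel directory is what TensorFlow Serving and most TF 2.x deployment tooling expect; the .h5 file exists mainly for compatibility with older Keras workflows.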

Related

Build a Tensor Flow model from saved_model file

I trained a model using yolov5, then exported it to TensorFlow saved_model format; the result was a yolo5s.pt file. As far as I know, yolov5 uses PyTorch, but I prefer TensorFlow. Now I want to build a model in TensorFlow using the saved_model file. How can I do it?
It would be preferable if the solution worked in Google Colab. I didn't include my code because I don't have any idea how to start.
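One way to start: a SavedModel directory can be loaded with tf.saved_model.load and called through its serving signature. A hedged sketch (the Scale module and the demo path are stand-ins; in practice you would point tf.saved_model.load at the directory yolov5's export produced, typically named something like yolov5s_saved_model/):

```python
import tensorflow as tf

# Stand-in model: in practice skip this and load the directory
# that yolov5's export step wrote.
class Scale(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([None], tf.float32)])
    def __call__(self, x):
        return 2.0 * x

m = Scale()
tf.saved_model.save(m, "demo_saved_model",
                    signatures={"serving_default": m.__call__})

# Loading and inference work the same way for any SavedModel.
loaded = tf.saved_model.load("demo_saved_model")
infer = loaded.signatures["serving_default"]
result = infer(x=tf.constant([1.0, 2.0]))
```

The signature function returns a dict of output tensors; inspecting infer.structured_outputs shows what the exported model actually produces.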

modifying a tensorflow savedmodel pb file for inference with a custom op

I have a Tensorflow model trained in Python, exported to a .pb file and then used with Tensorflow Serving.
I have written a custom op that greatly speeds up the inference of some operators in this Tensorflow model, but only works for inference -- I can't use this custom op during training time.
I am wondering if it's possible for me to use this custom op with the .pb file in Tensorflow Serving. I figure I will probably have to edit the .pb file so that it uses my custom op in place of the original op; Tensorflow Serving should then look up the custom op implementation, which I can link into its runtime.
So -- how does one go about modifying a Tensorflow .pb file and swap out operators? Are there example codes doing this that I can refer to?
Your best bet, if you for some reason can't train with the original ops, is probably proto surgery. Look for tools that let you convert a proto to ASCII format, modify it, and convert it back to binary. I found this gist of someone doing just that for a SavedModel. You could then write tooling on top to replace the original op with your custom op.
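A sketch of that proto-surgery idea using the SavedModel protobuf classes that ship with TensorFlow (the tiny model, the export path, and the custom-op name MyFastMatMul are all illustrative):

```python
import tensorflow as tf
from google.protobuf import text_format
from tensorflow.core.protobuf import saved_model_pb2

# Produce a small SavedModel to operate on (stand-in for the
# trained model from the question).
class Net(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([None, 4], tf.float32)])
    def __call__(self, x):
        return tf.matmul(x, tf.ones([4, 2]))

m = Net()
tf.saved_model.save(m, "export", signatures={"serving_default": m.__call__})

# Proto surgery: parse saved_model.pb, swap op types, write it back.
sm = saved_model_pb2.SavedModel()
with open("export/saved_model.pb", "rb") as f:
    sm.ParseFromString(f.read())

for meta_graph in sm.meta_graphs:
    # In TF 2.x graphs the real ops usually live inside function defs.
    for fn in meta_graph.graph_def.library.function:
        for node in fn.node_def:
            if node.op == "MatMul":       # original op
                node.op = "MyFastMatMul"  # hypothetical custom op
    for node in meta_graph.graph_def.node:
        if node.op == "MatMul":
            node.op = "MyFastMatMul"

# The ASCII form is handy for manual inspection and editing.
ascii_proto = text_format.MessageToString(sm)

with open("export/saved_model.pb", "wb") as f:
    f.write(sm.SerializeToString())
```

Serving will still need the custom op's kernel registered (e.g. loaded as a shared library) before it can execute the rewritten graph.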

Keras SavedModel vs Tensorflow SavedModel

I'm reading the documentation and source code of the TensorflowJS converter and it makes a clear distinction between Keras SavedModel and Tensorflow SavedModel.
What are the differences between the formats and what is the cross-format support story?
Please refer to:
https://www.tensorflow.org/tutorials/keras/save_and_load#:~:text=Saving%20custom%20objects,-If%20you%20are&text=The%20key%20difference%20between%20HDF5,without%20requiring%20the%20orginal%20code.
It discusses the differences between the two formats.

How to run tensorflow 2.0 model inference in Java?

I have a Java application that uses my old tensorflow models. I used to convert the .h5 weights and .json model into a frozen graph in .pb.
I used code similar to this github: https://github.com/amir-abdi/keras_to_tensorflow.
But that code is not compatible with tf 2.0 models.
I couldn't find any other resources.
Is it even possible?
Thank you :)
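One route that works for TF 2.x without a frozen-graph step: export a SavedModel from Python and load it on the Java side with the TensorFlow Java API's org.tensorflow.SavedModelBundle. The Python export side might look like this (model and path are illustrative):

```python
import tensorflow as tf

# Export a TF 2.x model as a SavedModel directory; SavedModelBundle.load
# in TensorFlow Java reads this directly, so no .pb frozen graph
# conversion is needed.
model = tf.keras.Sequential([tf.keras.layers.Dense(2)])
model.build(input_shape=(None, 3))
tf.saved_model.save(model, "export_for_java")
```

On the Java side, SavedModelBundle.load("export_for_java", "serve") then gives you a session and graph to run.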

Tensorflow Frozen Graph to SavedModel

I've been trying to use tensorflow.js, but I need the model in the SavedModel format. So far, I only have the Frozen Graph, as I used Tensorflow for Poets Codelab.
How can I convert the Frozen Graph into SavedModel?
I've been using the latest Python version and Tensorflow 1.8
A SavedModel is essentially a wrapper around a frozen graph that adds assets and the serving signature. For a code implementation, see this answer.
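Along those lines, a hedged sketch of the wrapping step with the TF 1.x-style API (written against tf.compat.v1 so it also runs under TF 2.x; the tiny frozen graph and its input:0/output:0 tensor names are stand-ins for the ones from the Poets codelab):

```python
import tensorflow as tf

# Build a tiny variable-free ("frozen") graph as a stand-in for
# the frozen graph produced by the codelab.
with tf.Graph().as_default() as g:
    x = tf.compat.v1.placeholder(tf.float32, [None, 2], name="input")
    tf.identity(x * 2.0, name="output")
    frozen_graph_def = g.as_graph_def()

# Wrap the frozen GraphDef in a SavedModel with a serving signature.
with tf.Graph().as_default() as g2:
    tf.import_graph_def(frozen_graph_def, name="")
    with tf.compat.v1.Session(graph=g2) as sess:
        inp = g2.get_tensor_by_name("input:0")
        out = g2.get_tensor_by_name("output:0")
        tf.compat.v1.saved_model.simple_save(
            sess, "saved_model_dir",
            inputs={"input": inp}, outputs={"output": out})
```

The resulting saved_model_dir can then be fed to the tensorflow.js converter in place of the frozen graph.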