Temporal Fusion Transformer in savedModel format - tensorflow

I am trying to save the model from here https://github.com/greatwhiz/tft_tf2/blob/master/README.md
in SavedModel format (preferably built with the Functional API). The source code only saves checkpoints.
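The general pattern is to rebuild the model, restore the checkpoint weights into it, and then export. A minimal sketch, assuming the TFT model can be rebuilt as a Keras model (the tiny Functional model below is a placeholder, not the real TFT architecture, and the checkpoint paths are hypothetical):

```python
import tensorflow as tf

# Placeholder Functional model -- stands in for the real TFT architecture.
inputs = tf.keras.Input(shape=(16,), name="features")
outputs = tf.keras.layers.Dense(1, name="forecast")(inputs)
model = tf.keras.Model(inputs, outputs)

# If the repo's training loop wrote tf.train.Checkpoint files, restore them
# into the rebuilt model first (paths here are hypothetical):
# ckpt = tf.train.Checkpoint(model=model)
# ckpt.restore(tf.train.latest_checkpoint("checkpoints/")).expect_partial()

# Export in SavedModel format -- a directory containing saved_model.pb.
tf.saved_model.save(model, "tft_saved_model")

# Reload from disk to confirm the export is usable.
reloaded = tf.saved_model.load("tft_saved_model")
```

`tf.saved_model.save` works on any Keras model and avoids version-specific Keras file-extension rules.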

Related

modifying a tensorflow savedmodel pb file for inference with a custom op

I have a Tensorflow model trained in Python, exported to a .pb file and then used with Tensorflow Serving.
I have written a custom op that greatly speeds up the inference of some operators in this Tensorflow model, but only works for inference -- I can't use this custom op during training time.
I am wondering if it's possible for me to use this custom op with the .pb file in Tensorflow serving. I figure I will probably have to edit the .pb file such that it uses my custom op in place of the original op, and Tensorflow serving should then go about looking for the custom op implementation which I can link against its runtime.
So -- how does one go about modifying a Tensorflow .pb file and swap out operators? Are there example codes doing this that I can refer to?
Your best bet, if for some reason you can't train with the original ops, is probably proto surgery. I would look for a tool that lets you convert a proto to ASCII format, modify it, and convert it back to binary. I found this gist of someone doing just that for a SavedModel. You could then write tooling on top of it to replace the original op with your custom op.
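The pb-to-text-and-back round trip can be done with the protobuf `text_format` module. A sketch of that "proto surgery" workflow (the GraphDef here is built in code as a stand-in for a real exported .pb, and "SlowOp"/"FastCustomOp" are illustrative names, not real ops):

```python
import tensorflow as tf
from google.protobuf import text_format

# Build a tiny GraphDef to stand in for the exported .pb file.
graph_def = tf.compat.v1.GraphDef()
node = graph_def.node.add()
node.name = "layer_1/matmul"
node.op = "SlowOp"  # illustrative: the op we want to swap out

# Dump to ASCII text format, edit (here programmatically, but this could
# just as well be done by hand in an editor), and parse back.
ascii_graph = text_format.MessageToString(graph_def)
ascii_graph = ascii_graph.replace('op: "SlowOp"', 'op: "FastCustomOp"')
patched = text_format.Parse(ascii_graph, tf.compat.v1.GraphDef())

# Serialize back to the binary .pb format that TensorFlow Serving loads.
pb_bytes = patched.SerializeToString()
```

For a real model you would parse the existing file with `graph_def.ParseFromString(open("model.pb", "rb").read())` instead of building the GraphDef in code.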

Distinguish types of on-disk models

TensorFlow has several types of model formats:
1. TensorFlow SavedModel
2. Frozen Model
3. Session Bundle
4. TensorFlow Hub module
How can you distinguish between them on-disk? (to later use with tensorflowjs-converter)
And how is each model created?
Yup, there are a LOT of different model types, and they all exist for good reasons. I won't claim I understand each one perfectly, but here's what I know (or think I know).
The .pb file: PB stands for protobuf (Protocol Buffer). This is the model structure, generally without the trained weights, stored in a binary format.
The .pbtxt file: A human-readable text version of the .pb file.
Protobuf files that aren't frozen also need a checkpoint (.ckpt) file; the checkpoint supplies the weights that the .pb is missing.
The .h5 file: The model plus weights from a Keras save.
The .tflite file: A TensorFlow Lite model.
Frozen Model: A frozen model combines the .pb with the weights, so you don't have to manage two files. Usually this means adding the word "frozen" to the filename. I'm sure this could be inferred when loading the file, but on disk it's more of an honor system: no .ckpt file. Freezing also strips out extraneous graph info; it's essentially the "production-ready" version of the model.
Session Bundle: A directory. Session bundles are no longer used and are rare.
TensorFlow Hub module: These are pre-existing popular models that are very likely already exported to TFJS and don't require you to convert them manually. I assume they are supported for Google's benefit more than ours, but it's nice to know that if you use Hub, you can always convert.
A single training run is often exported to several of these formats side by side, and from such a grouping you can pick quite a few that could be turned into TFJS.
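These formats can be told apart on disk with plain file-layout checks. A sketch based on the conventional filenames (this is a heuristic, not an official API; the Session Bundle check in particular is approximate):

```python
import os

def guess_model_format(path):
    """Heuristically classify an on-disk TensorFlow artifact by its layout.
    Based on conventional filenames -- a sketch, not an official API."""
    if os.path.isdir(path):
        entries = set(os.listdir(path))
        if "saved_model.pb" in entries or "saved_model.pbtxt" in entries:
            return "SavedModel"
        if "tfhub_module.pb" in entries:
            return "TF Hub module"
        if "export.meta" in entries:  # approximate legacy-bundle check
            return "Session Bundle (legacy)"
        return "unknown directory"
    ext = os.path.splitext(path)[1]
    return {
        ".pb": "GraphDef / frozen graph (binary protobuf)",
        ".pbtxt": "GraphDef (text protobuf)",
        ".h5": "Keras HDF5 model",
        ".tflite": "TensorFlow Lite model",
        ".ckpt": "checkpoint",
    }.get(ext, "unknown file")

# Example: a directory containing saved_model.pb is a SavedModel.
os.makedirs("demo_model", exist_ok=True)
open("demo_model/saved_model.pb", "wb").close()
print(guess_model_format("demo_model"))
```

Note the asymmetry: a SavedModel and a Hub module are directories, while frozen graphs, Keras saves, and TFLite models are single files, which is why extension alone isn't enough.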

How to save the tensorflow model in .h5 format using tensorflow model checkpoints?

I am working on a TensorFlow project. I have successfully trained, tested, and made predictions using Python Flask. But to make a prediction, I currently have to reload the full model from checkpoints each time. If I saved the model in .h5 format, I wouldn't need to reload everything just to make predictions. I don't know how to save the TensorFlow model in .h5 format from the checkpoints. If anybody knows how, please help me or forward me a link if possible.
Thanks.
You can save and restore TensorFlow models using the tf.train.Saver class, although this doesn't store the models in .h5 format.
You can refer to these sections for further clarity:
Save
Restore
Hope this helps!
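tf.train.Saver is the TF1 API; in TF2/Keras the same idea looks like this sketch: restore the checkpoint weights into a model object, then save that object as HDF5. The toy model below is a placeholder for the project's real architecture, and the checkpoint is written in-place so the example is self-contained:

```python
import numpy as np
import tensorflow as tf

# Placeholder architecture -- stands in for the project's real model.
inputs = tf.keras.Input(shape=(8,))
outputs = tf.keras.layers.Dense(1)(inputs)
model = tf.keras.Model(inputs, outputs)

# Write and restore a TF checkpoint, mirroring the situation in the
# question where only checkpoint files exist after training.
ckpt = tf.train.Checkpoint(model=model)
path = ckpt.save("ckpt/weights")
ckpt.restore(path).expect_partial()

# With the weights now in the model object, export architecture + weights
# as one HDF5 file that can be reloaded without the checkpoints or dataset.
model.save("model.h5")
reloaded = tf.keras.models.load_model("model.h5")
pred = reloaded(np.ones((1, 8), dtype="float32"))
```

After this, the Flask app only needs `tf.keras.models.load_model("model.h5")` at startup.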

Can you convert a .tflite model file to .coreml - or back to a Tensorflow .pb file or keras h5 file?

General question: is there tooling to convert from tflite format to any other format?
I'm trying to convert a Keras model to a CoreML model, but I can't because the model uses a layer type unsupported by CoreML (Gaussian Noise). Converting the Keras .h5 model to a .tflite is simple, removes the offending layer (which is only used in training anyway), and performs some other optimisations. But it doesn't seem possible to convert out of the resultant tflite to any other format, and coremltools doesn't support tflite.
I thought I could probably load the model from tflite into a TensorFlow session, save a .pb from there, and convert that to CoreML using coremltools, but I can't see a way to load the tflite model into a TensorFlow session. I saw the documentation linked to in this question, but that seems to use the tflite interpreter to read the tflite model, rather than a "true" TensorFlow session.
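That matches how TFLite is designed: a .tflite file is executed by the interpreter, and there is no supported path from .tflite back to a GraphDef or Keras model. A minimal sketch of the interpreter route, with a tiny stand-in model converted in memory:

```python
import numpy as np
import tensorflow as tf

# Tiny stand-in Keras model, converted to TFLite in memory.
inputs = tf.keras.Input(shape=(4,))
outputs = tf.keras.layers.Dense(2)(inputs)
model = tf.keras.Model(inputs, outputs)
tflite_bytes = tf.lite.TFLiteConverter.from_keras_model(model).convert()

# A .tflite model runs through the TFLite interpreter, not a tf.Session.
interp = tf.lite.Interpreter(model_content=tflite_bytes)
interp.allocate_tensors()
inp = interp.get_input_details()[0]
out = interp.get_output_details()[0]
interp.set_tensor(inp["index"], np.ones(inp["shape"], dtype=np.float32))
interp.invoke()
result = interp.get_tensor(out["index"])
```

This is why the practical route for CoreML is to strip the training-only layer from the Keras model itself (Gaussian Noise only acts during training) and feed the .h5 or SavedModel to coremltools, rather than going through .tflite.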

Convert frozen graph to tensorflow-js format

I have an SSD model (trained on a custom dataset) using the Google Object Detection API. I have frozen a checkpoint, which generates a couple of files (including a *.pb file).
Question: How do I convert that frozen inference graph into a web-convenient format that can be used by TF.js?
(PS: The official website does mention an example along similar lines, but it expects the SavedModel format, not a frozen graph.)
I found the answer. This is a two-step conversion process: (1) Freeze the checkpoint to a frozen graph with input_type set to encoded_image_string_tensor (help). (2) Now we can use the TensorFlow.js exporter.
(Note: It is possible that step 2 will fail, because not all layers are supported for conversion.)