The Android example that comes with Tensorflow downloads a protobuf file for InceptionV3 which contains both the graph and the values from the model. In the docs, I could only find how to serialize the graph (tf.Graph.as_graph_def) or save the variable values with a tf.train.Saver. How can you save everything to a single file, as done for that example?
I answered a similar question on this topic: Is there an example on how to generate protobuf files holding trained Tensorflow graphs?
The basic idea is to replace the variables in the original (training) graph with constants -- either by re-importing the graph with tf.import_graph_def() and an input_map that maps each variable to a constant holding its trained value, or with tf.graph_util.convert_variables_to_constants() -- and then write out the resulting GraphDef using tf.Graph.as_graph_def().
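For example, a minimal TF 1.x sketch of that constant-folding step, assuming a trained session sess and an output node named 'output' (a placeholder for your graph's real output node names):

import tensorflow as tf
from tensorflow.python.framework import graph_util

# Fold the trained variable values into the graph as constants.
frozen_graph_def = graph_util.convert_variables_to_constants(
    sess, sess.graph_def, ['output'])  # 'output' is hypothetical

# Graph structure and weights now live in one protobuf file.
with tf.gfile.GFile('frozen_model.pb', 'wb') as f:
    f.write(frozen_graph_def.SerializeToString())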
I am looking for a method to read and further modify a tensorflow saved_model.
The current saved_model was converted from an ONNX model and contains a model.pb, a fingerprint.pb, an asset folder and a variables folder.
I am new to tensorflow and this might be a stupid question. Is there any recommended method or guide? Thank you for helping.
I tried the tf.saved_model.load() method, but the object it returns does not seem to be editable.
I expect to be able to read and modify a tensorflow saved_model.
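For reference, a hedged sketch of reading the model at the protobuf level instead of through tf.saved_model.load(), assuming the standard SavedModel layout (the path is a placeholder):

from tensorflow.core.protobuf import saved_model_pb2

# Parse the SavedModel protobuf directly.
saved_model = saved_model_pb2.SavedModel()
with open('saved_model_dir/saved_model.pb', 'rb') as f:
    saved_model.ParseFromString(f.read())

# The graph lives inside the first MetaGraphDef; its nodes can be
# inspected or edited, then re-serialized with SerializeToString().
graph_def = saved_model.meta_graphs[0].graph_def
print(len(graph_def.node))

Note that for TF2-style SavedModels much of the state lives in the variables folder, so editing the graph proto alone may not be enough.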
I have a Tensorflow model trained in Python, exported to a .pb file and then used with Tensorflow Serving.
I have written a custom op that greatly speeds up inference for some of the operators in this Tensorflow model, but it only works for inference -- I can't use this custom op during training.
I am wondering if it's possible for me to use this custom op with the .pb file in Tensorflow Serving. I figure I will probably have to edit the .pb file so that it uses my custom op in place of the original op; Tensorflow Serving should then look for the custom op implementation, which I can link into its runtime.
So -- how does one go about modifying a Tensorflow .pb file and swapping out operators? Is there example code doing this that I can refer to?
Your best bet, if for some reason you can't train with the original ops, is probably proto surgery. I would look for some tools that let you convert a proto to text format, modify it, and convert it back to binary format. I found this gist of someone doing just that for a saved model. You could then write tooling on top to replace the original op with your custom op.
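As a sketch of what that surgery can look like directly in Python (skipping the text-format round trip), assuming the original op type is 'OriginalOp' and your custom op is registered as 'MyCustomOp' -- both names are hypothetical, and the custom op must accept the same inputs and attributes:

from tensorflow.core.framework import graph_pb2

graph_def = graph_pb2.GraphDef()
with open('model.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())

# Swap the op type on every matching node; inputs and outputs are
# untouched, so the replacement must be signature-compatible.
for node in graph_def.node:
    if node.op == 'OriginalOp':
        node.op = 'MyCustomOp'

with open('model_patched.pb', 'wb') as f:
    f.write(graph_def.SerializeToString())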
I want to convert a Tensorflow model with the following structure to a .mlmodel file for use in an iOS app:
cub_image_experiment/
    logdir/
        val_summaries/
        test_summaries/
        finetune/
            val_summaries/
    cmds.txt
    config_train.yaml
    config_test.yaml
I'm following this tutorial: https://github.com/visipedia/tf_classification/wiki/CUB-200-Image-Classification
However, I'm having trouble understanding the structure of the project. Which files are important, and how do I convert all the separate config files and everything else into a single .mlmodel file that I can use in my application?
I've looked online and all I could find was how to convert a .caffemodel or a .pb file to .mlmodel. These are all single files; my project, however, has multiple files. I found a tutorial on how to convert a tf model into a single .pb file, but that model's structure was different and it did not contain any yaml files. My project is not focused on creating a model at the moment, but merely on integrating a model into an iOS app. I found this model interesting for an app idea and wanted to know whether it can be integrated. If there are any tutorials out there that might help with this sort of problem, please let me know.
None of that stuff is used by the Core ML model. The yaml files etc. are only used to train the TF model.
All you need to provide is a frozen graph (a .pb file) and then convert it to an mlmodel using tfcoreml.
It looks like your project doesn't have a frozen graph but only checkpoints. There is a TF utility that you can use to convert a checkpoint to a frozen graph; see https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/tools/freeze_graph.py
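A hedged sketch of that two-step pipeline (checkpoint to frozen graph, frozen graph to mlmodel); the file names, node names and input shape below are placeholders for your graph's real ones:

from tensorflow.python.tools import freeze_graph
import tfcoreml

# Step 1: fold the checkpoint weights into a single frozen .pb.
freeze_graph.freeze_graph(
    input_graph='graph.pbtxt', input_saver='', input_binary=False,
    input_checkpoint='model.ckpt', output_node_names='Softmax',
    restore_op_name='', filename_tensor_name='',
    output_graph='frozen_graph.pb', clear_devices=True,
    initializer_nodes='')

# Step 2: convert the frozen graph to Core ML.
tfcoreml.convert(
    tf_model_path='frozen_graph.pb',
    mlmodel_path='model.mlmodel',
    input_name_shape_dict={'input:0': [1, 299, 299, 3]},
    output_feature_names=['Softmax:0'])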
When I quantized a model with the lite modules in tensorflow, I couldn't check the weight values that had been quantized. Is there any way to view these values in the .tflite file, or any way to parse .tflite files?
There are some neural network visualizers that can also provide an interface to inspect the file. I have been using Netron. You can click on the "weights" tab of the layer you are interested in to view the data. I haven't tried it yet, but there appears to be a floppy-disk save icon when you view weights/biases in the right side-bar.
The data format is a Google FlatBuffer, defined by the schema file here. You may prefer this route if you want to do something with the data, like output it in a different format. For the CNNs I passed in, the output from parsing the file myself using the schema.fbs file matched Netron's. You can check out the FlatBuffers documentation here.
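If you'd rather not hand-parse the FlatBuffer, one alternative sketch is the TF Lite Python interpreter, which exposes tensor names, shapes and quantization parameters (the model path is a placeholder):

import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path='model.tflite')
interpreter.allocate_tensors()

# Each entry carries name, shape, dtype and (scale, zero_point) pairs.
for detail in interpreter.get_tensor_details():
    print(detail['name'], detail['shape'], detail['quantization'])
    # Constant tensors (weights/biases) can be read with:
    # interpreter.get_tensor(detail['index'])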
The first answer here is a guide to creating a JSON view of a .tflite model; there you can see the quantized values.
I want to use Tensorboard with a non-Tensorflow app. I can see how to make the graph using GraphDef and associated classes, but I'm not sure how to write it out so that Tensorboard will read it. That is, I have the graph in a serialized form, not as the python graph class from tensorflow.
To see a graph in Tensorboard, you need the weights, tensor names and structure of the graph.
I don't totally understand your question, but if you are able to create a graph.pb file, then it is simple to view it in Tensorboard: you just have to run this file here.
Actually, here we are creating a dummy graph structure and then filling in the weights and names from our .pb file. Note that you have to open the file in binary mode and read its bytes; a bare path string has no read() method:

import tensorflow as tf
from tensorflow.core.framework import graph_pb2

graph_def = graph_pb2.GraphDef()
with open('/path/to/graph.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())
tf.import_graph_def(graph_def)
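Once the GraphDef is imported, a minimal sketch of making it visible to Tensorboard (TF 1.x API; './logdir' is an arbitrary output directory):

import tensorflow as tf

# Write the imported graph to an event file that Tensorboard can read.
with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name='')
    writer = tf.summary.FileWriter('./logdir', graph)
    writer.close()

Then run tensorboard --logdir ./logdir and open the Graphs tab.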
Let me know more details so that I can help you in a better way.