In TensorFlow 1.10 I could call tensorflow.keras.applications.InceptionV3() to create an InceptionV3 model.
However, the keras.applications module seems to have been removed in TensorFlow 1.11.
According to the API r1.11 reference, keras Applications are canned architectures with pre-trained weights.
Should I download the model file and load it myself? If so, where can I get it from?
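For context, this is the kind of call that worked in TF 1.10 and that I'm still after (weights='imagenet' is just the standard Keras Applications option):

import tensorflow as tf

# Builds InceptionV3 and downloads the pre-trained ImageNet weights on first
# use (cached under ~/.keras/models/).
model = tf.keras.applications.InceptionV3(include_top=True, weights='imagenet')
model.summary()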
I've been using the keras module from TensorFlow 1.12.0 for training and saving models. I recently came across a seemingly useful library for visualizing the weights/outputs, but it requires the models to be loaded as Keras models. I'm running into an error when I try to load my tf.keras models with Keras, and was hoping someone could provide a solution. Python version 3.5.2, Keras version 2.2.4.
I've defined a custom object for GlorotUniform, since Keras doesn't recognize that initializer. Even so, when I try to load the model, I get a TypeError.
import keras
import tensorflow as tf

# This works
model = tf.keras.models.load_model('./densenet_model.h5')
# This does not work
model = keras.models.load_model('./densenet_model.h5',
    custom_objects={"GlorotUniform": tf.keras.initializers.glorot_uniform})
# This is the error that is raised:
# TypeError: tuple indices must be integers or slices, not list
In summary, I was wondering if there was a simple way to convert a model created with tf.keras to a keras model.
I figured out a workaround. I load the model with tf.keras.models.load_model and then call save_weights on it. Then I build the same architecture with Keras and just use load_weights. I verified that the weights were loaded correctly by checking the output against my validation dataset.
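A rough sketch of that workaround (build_densenet below is a hypothetical helper that redefines the same architecture in plain Keras):

import tensorflow as tf

# Load the tf.keras model and dump only its weights.
tf_model = tf.keras.models.load_model('./densenet_model.h5')
tf_model.save_weights('./densenet_weights.h5')

# Rebuild the identical architecture in plain Keras (build_densenet is a
# hypothetical helper you would write yourself), then load the weights into it.
keras_model = build_densenet()
keras_model.load_weights('./densenet_weights.h5')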
Instead of from keras.models import load_model I used from tensorflow.python.keras.models import load_model, and that solved the problem.
Now that Keras has become an API within TensorFlow, there is a lot of older Keras code around, such as https://github.com/keiserlab/keras-neural-graph-fingerprint/blob/master/examples.py, which uses imports like
from keras import models
With the current version of TensorFlow, do we need to change every such Keras import to the following?
from tensorflow.keras import models
You are mixing things up:
Keras (https://keras.io/) is a library independent from TensorFlow, which specifies a high-level API for building and training neural networks and is capable of using one of multiple backends (among which, TensorFlow) for low-level tensor computation.
tf.keras (https://www.tensorflow.org/guide/keras) implements the Keras API specification within TensorFlow. In addition, the tf.keras API is optimized to work well with other TensorFlow modules: you can pass a tf.data Dataset to the .fit() method of a tf.keras model, for instance, or convert a tf.keras model to a TensorFlow estimator with tf.keras.estimator.model_to_estimator. Currently, the tf.keras API is the high-level API to look for when building models within TensorFlow, and the integration with other TensorFlow features will continue in the future.
So to answer your question: no, you don't need to convert Keras code to tf.keras code. Keras code uses the Keras library, potentially even runs on top of a backend other than TensorFlow, and will continue to work just fine in the future. Moreover, it's important not to mix Keras and tf.keras objects within the same script, since this can produce incompatibilities, as you can see for example in this question.
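As a minimal illustration of the tf.data integration mentioned above (the model and data here are made up), a tf.data Dataset can be passed straight to .fit():

import numpy as np
import tensorflow as tf

# Made-up data: 100 samples with 8 features each, binary labels.
features = np.random.rand(100, 8).astype(np.float32)
labels = np.random.randint(0, 2, size=(100, 1)).astype(np.float32)
dataset = tf.data.Dataset.from_tensor_slices((features, labels)).batch(10).repeat()

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation='relu', input_shape=(8,)),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy')

# steps_per_epoch is needed here because the dataset repeats indefinitely.
model.fit(dataset, epochs=2, steps_per_epoch=10)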
Update: Keras will be abandoned in favor of tf.keras: https://twitter.com/fchollet/status/1174019423541157888
I have trained a DNNClassifier using Python (conda TensorFlow installation). The trained model needs to be used for evaluation via the C API. Is there a way to load both the graph and the weights of the trained model using the C API?
There is a way to load an h5 file and other data through the C API; some googling should help. I've found this article to be helpful.
As for DNNClassifier on the C API, I think you would have to implement it manually with raw tensor operations in the C API. Correct me if I'm wrong.
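One extra suggestion, not from the original answer: export the trained estimator as a SavedModel from Python first, since that format bundles the graph and weights and is what the C API's TF_LoadSessionFromSavedModel expects. The feature names and shapes below are assumptions, so adjust them to match the trained model:

import tensorflow as tf

# Assumed feature setup; must match how the classifier was trained.
feature_columns = [tf.feature_column.numeric_column("x", shape=[4])]
classifier = tf.estimator.DNNClassifier(
    feature_columns=feature_columns,
    hidden_units=[32, 16],
    n_classes=3,
    model_dir="./dnn_model")  # directory containing the trained checkpoints

serving_input_fn = tf.estimator.export.build_raw_serving_input_receiver_fn(
    {"x": tf.placeholder(dtype=tf.float32, shape=[None, 4], name="x")})

# Writes a SavedModel (graph + variables) that the C API can load with the
# "serve" tag.
export_dir = classifier.export_savedmodel("./export", serving_input_fn)
print("SavedModel written to:", export_dir)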
I see that there are many similar functions between TensorFlow and Keras, like argmax, boolean_mask... I wonder why people use Keras as a backend together with TensorFlow instead of using TensorFlow alone.
Keras is not a backend; it is a high-level API for building and training neural networks. Keras is capable of running on top of TensorFlow, Theano and CNTK. Most people prefer Keras due to its simplicity compared to lower-level libraries like TensorFlow. I recommend Keras for beginners in deep learning.
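To give a feel for that simplicity, here is a minimal sketch of defining and training a small classifier with plain Keras (the data is made up):

import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# Made-up data: 200 samples with 10 features each, binary labels.
x = np.random.rand(200, 10)
y = np.random.randint(0, 2, size=(200,))

# A complete model definition, compilation and training loop in a few lines;
# Keras delegates the low-level tensor work to its backend (e.g. TensorFlow).
model = Sequential([
    Dense(32, activation='relu', input_shape=(10,)),
    Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.fit(x, y, epochs=5, batch_size=32)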
A Keras tensor is a tensor object from the underlying backend (Theano, TensorFlow or CNTK), which we augment with certain attributes that allow us to build a Keras model just by knowing the inputs and outputs of the model.
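A small sketch of what that means in practice (layer sizes are arbitrary): Input() returns a Keras tensor, and a Model can be built just from the input and output tensors.

from keras.layers import Input, Dense
from keras.models import Model

# Input() returns a Keras tensor: a backend tensor augmented with Keras metadata.
inputs = Input(shape=(10,))
hidden = Dense(32, activation='relu')(inputs)
outputs = Dense(1, activation='sigmoid')(hidden)

# That extra metadata is what lets Keras reconstruct the whole graph from just
# the input and output tensors.
model = Model(inputs=inputs, outputs=outputs)
model.summary()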
Theano vs Tensorflow
TensorFlow is necessary if you wish to use coremltools. Apple has promised support for architectures created using Theano, but I haven't seen it yet.
Keras requires some backend-specific syntactic sugar depending on which backend is in use. I like the flexibility of TensorFlow input layers and the easy access to strong Google neural networks.