How to step into a TensorFlow/Keras built-in class in the PyCharm debugger? - tensorflow

from keras import models
from keras import layers
model = models.Sequential()
model.add(layers.Dense(2, activation='relu', input_shape=(10000,)))
For example, I want to step into the Sequential class in PyCharm, but it seems the debugger doesn't do that. When I tried, it stepped into Model's __new__() method, not Sequential's initializer. I suppose it should go here:
https://github.com/keras-team/keras/blob/master/keras/engine/sequential.py
Is there a way to make this kind of debugging possible in PyCharm? Similarly, I found that I can't step into or inspect other TF/Keras classes or methods in PyCharm.

Related

Inspecting functional keras model structure

I would like to inspect the layers and connections in a model after having created it with the Functional API in Keras; essentially, I want to start at the output and recursively enumerate the inputs of each layer instance. Is there a way to do this in the Keras or TensorFlow API?
The purpose is to create a more detailed visualisation than the ones provided by Keras (tf.keras.utils.plot_model). The model is generated procedurally based on a parameter file.
I have successfully used attributes of the KerasTensor objects to do this inspection:
output = Dense(1)(...)
print(output)
print(output.node)
print(output.node.keras_inputs)
print(output.node.keras_inputs[0].node)
This wasn't available in TF 2.6, only 2.7, and I realise it's not documented anywhere.
Is there a proper way to do this?
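A minimal sketch of the recursive walk described above, relying on the same undocumented KerasTensor/Node attributes observed in TF 2.7; the function and attribute names here are illustrative, not an official API:
# Recursively print each layer, starting from an output KerasTensor
def walk(tensor, depth=0):
    node = tensor.node                 # undocumented: the Node that produced this tensor
    print("  " * depth + node.layer.name)
    for parent in node.keras_inputs:   # KerasTensors feeding that node
        walk(parent, depth + 1)

walk(output)                           # e.g. the output tensor from Dense(1)(...)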

How to fix warning that the Keras Input layer is deprecated in Google Colab

Hi, I am currently studying machine learning in Google Colab and I am getting this warning: WARNING:tensorflow:Please add keras.layers.InputLayer instead of keras.Input to Sequential model. keras.Input is intended to be used by Functional model.
Here is my import classes:
# Import classes
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Dense, Dropout
Here is my model definition:
# Model definition
model = Sequential()
model.add(Input(shape=(8,)))
model.add(Dense(16,activation='relu'))
model.add(Dropout(0.2))
model.add(Dense(8,activation='relu'))
model.add(Dropout(0.2))
model.add(Dense(1,activation='sigmoid'))
Generally it is recommended to use the functional layer API via Input (which creates an InputLayer under the hood) rather than using InputLayer directly.
When using a Keras Sequential model, the InputLayer can be skipped entirely by moving the input_shape argument to the first layer after it, as sketched below. For more details you can refer to this.
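A minimal sketch of that fix, assuming the same architecture as above: drop the Input layer and pass input_shape to the first Dense layer instead.
# Same model without keras.Input / InputLayer; the input spec moves to the first layer
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout

model = Sequential()
model.add(Dense(16, activation='relu', input_shape=(8,)))  # input_shape replaces Input(shape=(8,))
model.add(Dropout(0.2))
model.add(Dense(8, activation='relu'))
model.add(Dropout(0.2))
model.add(Dense(1, activation='sigmoid'))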

TF keras layer is no longer saveable?

After a recent upgrade to TensorFlow 2.3 I can no longer save TF-Agents layers; I get this:
AttributeError: 'ActorDistributionNetwork' object has no attribute 'save_weights'
Since ActorDistributionNetwork is a subclass of tf.keras.layers.Layer, has the ability of individual Keras layers to save themselves been removed? I could not find anything about this in the release notes for either tensorflow or tf-agents.
Using model.save_weights is not very convenient for tf-agents, since I have to use different combinations of layers for a custom agent.
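One possible alternative, not mentioned in the post: save the network's variables with tf.train.Checkpoint, which can track any trackable object, including Keras layers such as a TF-Agents network. A rough sketch (paths and variable names are illustrative):
# Save the variables of the ActorDistributionNetwork directly
import tensorflow as tf

ckpt = tf.train.Checkpoint(actor_net=actor_net)   # actor_net: the network instance
save_path = ckpt.save('/tmp/actor_net/ckpt')

# Later: rebuild the same network (so its variables exist) and restore them
ckpt = tf.train.Checkpoint(actor_net=actor_net)
ckpt.restore(save_path)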

How to load tf.keras models with keras

I've been using the keras module from tensorflow 1.12.0 for training and saving models. I recently came across a seemingly useful library for visualizing the weights/outputs, but it requires the models to be loaded as Keras models. I'm running into an error trying to load my tf.keras models using keras and was hoping someone could provide a solution. Python version 3.5.2, Keras version 2.2.4.
I've defined the custom object for the GlorotUniform since keras doesn't recognize that initializer. Afterwards, when I try to load the model, I get a TypeError.
import tensorflow as tf
import keras

# This works
model = tf.keras.models.load_model('./densenet_model.h5')

# This does not work
model = keras.models.load_model('./densenet_model.h5',
                                custom_objects={"GlorotUniform": tf.keras.initializers.glorot_uniform})

# This is the error that is raised
TypeError: tuple indices must be integers or slices, not list
In summary, I was wondering if there was a simple way to convert a model created with tf.keras to a keras model.
I figured out a workaround: load the model with tf.keras.models.load_model and call save_weights, then build the same model with Keras and use load_weights. I verified that the weights were loaded correctly by checking the output on my validation dataset.
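A rough sketch of that workaround; build_densenet() is a hypothetical function that recreates the same architecture with standalone Keras:
# Load with tf.keras, dump only the weights, then reload them into a Keras-built model
import tensorflow as tf

tf_model = tf.keras.models.load_model('./densenet_model.h5')
tf_model.save_weights('./densenet_weights.h5')

keras_model = build_densenet()                    # hypothetical: same architecture in Keras
keras_model.load_weights('./densenet_weights.h5')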
Alternatively: instead of from keras.models import load_model, I used from tensorflow.python.keras.models import load_model, and the problem was solved.

Cannot Reload saved Keras model using tensorflow

I am working in a single jupyter notebook. I create and train a very simple CNN with keras. It compiles, fits, and predicts fine. I save it with:
model.save("mymodel.hd5")
Model is a keras.models.Sequential.
I then read that back in with:
reload_keras_model = keras.models.load_model("mymodel.hd5")
That also works fine. However if I try to read the model in using tensorflow via:
from tensorflow.keras.models import load_model
reload_tf_model = load_model("mymodel.hd5")
That fails with:
ValueError: Unknown layer:layers
Most of the threads I've read on GitHub say "update your model" or mention custom objects (I'm not using any). My target platform is the Raspberry Pi Zero, where I've been able to install tensorflow but not keras, which is why I want to load the model via tf.keras. Why would keras and tf.keras handle this model differently, and what do I need to update/change to read it in with tf.keras?
While Keras can use TensorFlow as a backend, that does not guarantee that a model saved with Keras is also readable with tf.keras.
Note that you can use Keras with both Theano and TensorFlow, so reload_keras_model = keras.models.load_model("mymodel.hd5") will work with either backend, since the saving/loading is done in the Keras part and does not involve the backend.
You can use a conversion tool such as keras_to_tensorflow, or something similar.
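Another option, offered here only as a sketch and not part of the answer above: rebuild the same architecture with tf.keras on the Pi and load just the weights from the HDF5 file (full-model HDF5 files also contain the weights). build_cnn() is a hypothetical function that recreates the original model.
# Rebuild in tf.keras and pull the weights out of the saved file
from tensorflow import keras

tf_model = build_cnn()                  # hypothetical: same layers, same order
tf_model.load_weights("mymodel.hd5")    # reads the weights stored in the full-model file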