AttributeError: module 'tensorflow.keras.layers' has no attribute 'Rescaling' - tensorflow

When I try:
normalization_layer = layers.Rescaling(1./255)
I get this error message:
AttributeError: module 'tensorflow.keras.layers' has no attribute 'Rescaling'
How do I fix it?

I had the same error in v2.5.0. In that version the layer lives at
tf.keras.layers.experimental.preprocessing.Rescaling()
I guess this is the "old" path for this layer.

Yes, I was using the wrong version of TF: layers.Rescaling is available in tf v2.7.0, but I had v2.6.0 installed.
From the docs: a preprocessing layer which rescales input values to a new range.
Inherits from: Layer, Module
tf.keras.layers.Rescaling(
scale, offset=0.0, **kwargs
)
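A minimal usage sketch, with a fallback to the experimental path for older 2.x versions (the version cutoff follows the answers above):
import tensorflow as tf

# On newer TF (v2.7.0 per the answer above) the layer is a top-level
# export; on older 2.x versions it lives under the experimental
# preprocessing namespace.
try:
    Rescaling = tf.keras.layers.Rescaling
except AttributeError:
    Rescaling = tf.keras.layers.experimental.preprocessing.Rescaling

# Map pixel values in [0, 255] to floats in [0, 1].
normalization_layer = Rescaling(1.0 / 255)
images = tf.random.uniform((1, 32, 32, 3), maxval=255)
normalized = normalization_layer(images)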

Related

TypeError: 'AutoTrackable' object is not callable

I am trying to run inference on my trained model following this tutorial. I am using TF 2.1.0 and I have tried with tf-nightly 2.5.0.dev20201202.
But I get TypeError: 'AutoTrackable' object is not callable when I hit the following line detections = detect_fn(input_tensor)
I am aware that the question "'AutoTrackable' object is not callable in Python" exists, but I am not using TensorFlow Hub and I don't understand how that answer could help me.
Thanks
Try using detect_fn.signatures['default'](input_tensor)
Changing detections = detect_fn(input_tensor) to
detections = detect_fn.signatures['serving_default'](input_tensor)
fixed the issue for me.
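For reference, a minimal sketch of that fix, assuming an object-detection SavedModel (the export path and input size here are hypothetical):
import numpy as np
import tensorflow as tf

# tf.saved_model.load returns an AutoTrackable object, which is not
# itself callable; call one of its concrete signatures instead.
detect_fn = tf.saved_model.load('exported_model/saved_model')
infer = detect_fn.signatures['serving_default']

# Dummy uint8 image batch standing in for a real input image.
input_tensor = tf.convert_to_tensor(np.zeros((1, 320, 320, 3), dtype=np.uint8))
detections = infer(input_tensor)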

Problem in Keras with 'merge' - TypeError: 'module' object is not callable

I tried to merge layer3, layer4 and layer5 with following line of code:
layer = merge([layer3,layer4,layer5],mode='sum')
But it throws this error:
TypeError: 'module' object is not callable
Why is my code not working?
I assume you're trying to run source code written for an older Keras version; the lowercase merge helper was removed in later releases. mode='sum' just adds your layers element-wise. You could also use TensorFlow to do the same:
layer = tf.add(layer3, layer4)
layer = tf.add(layer, layer5)
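In current Keras the equivalent is the Add layer (or the add helper), assuming layer3, layer4 and layer5 are Keras tensors from earlier in the model:
from tensorflow.keras.layers import Add, add

# Element-wise sum of the three tensors, the modern replacement for
# merge(..., mode='sum').
layer = Add()([layer3, layer4, layer5])
# or, equivalently:
layer = add([layer3, layer4, layer5])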

Tensorflow error: An initializer for variable dense_1/kernel of dtype: 'complex64' is required

I am getting the following error when execution reaches the dense layer, i.e. the second line below (y_est = ...):
Tensorflow error: An initializer for variable dense_1/kernel of dtype:'complex64' is required
My variable y_in has a complex value and it seems I have to initialize my dense layer with the same variable type (complex64) but I don't know how to do it.
Any ideas?
y_in = tf.reshape(input, shape=[-1,self.n])
y_est = tf.layers.dense(y_in, 20, activation= tf.nn.tanh) # line with error
h_hat = tf.layers.dense(y_est, 2, activation= None)
Thank you very much.
You have not specified your own custom kernel_initializer, and the standard initializers in TensorFlow do not support complex weights yet. See this ticket for the details and possible solutions.
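A minimal sketch of such a custom initializer (TF 1.x API to match the question's code; the uniform-limit scaling here is an ad-hoc choice, not an established recipe for complex weights):
import numpy as np
import tensorflow as tf

def complex_initializer(shape, dtype=tf.complex64, partition_info=None):
    # Draw real and imaginary parts separately, then combine them
    # into a single complex64 tensor.
    fan_in, fan_out = int(shape[0]), int(shape[1])
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    real = tf.random_uniform(shape, -limit, limit)
    imag = tf.random_uniform(shape, -limit, limit)
    return tf.complex(real, imag)

y_est = tf.layers.dense(y_in, 20, activation=tf.nn.tanh,
                        kernel_initializer=complex_initializer)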

how to create a tf.layers.Dense object

I want to create a dense layer in TensorFlow. I tried tf.layers.dense(input_placeholder, units), which directly creates the layer and returns its output, but what I want is just a "layer module", i.e. an object of the class tf.layers.Dense(units). I want to first declare these modules/layers in a class, and then have several member functions apply1(x, y), apply2(x, y) that use these layers.
But when I do tf.layers.Dense(units) in TensorFlow, it returns:
layer = tf.layers.Dense(100)
AttributeError: 'module' object has no attribute 'Dense'
But if I do tf.layers.dense(x, units), there's no problem.
Any help is appreciated, thanks.
tf.layers.Dense returns a callable layer object that you later apply to your input; the layer's variables are created when it is first called.
func = tf.layers.Dense(out_dim)
out = func(inputs)
tf.layers.dense both creates the variables and applies the layer to your input, returning the output in one call.
out = tf.layers.dense(inputs, out_dim)
Try to avoid using placeholders; you have to feed them via feed_dict into the tf.Session, so they are probably causing this issue.
Try the new Estimator API to load the data and then use dense layers, as is done in TensorFlow's GitHub examples: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/examples/tutorials/layers/cnn_mnist.py
tf.layers.Dense was not exported in TensorFlow before version 1.4. You probably have version 1.3 or earlier installed. (You can check the version with python -c 'import tensorflow as tf; print(tf.__version__)'.)
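A short sketch of the pattern the question asks for, assuming TF 1.4+ (the method names come from the question; the layer sizes are arbitrary):
import tensorflow as tf

class Model(object):
    def __init__(self):
        # Declare the layer objects once; their variables are reused
        # by every call below.
        self.hidden = tf.layers.Dense(100, activation=tf.nn.relu)
        self.out = tf.layers.Dense(10)

    def apply1(self, x, y):
        return self.out(self.hidden(x + y))

    def apply2(self, x, y):
        return self.out(self.hidden(x * y))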

Tensorflow's fancy optimizers don't work with variable input size after upgrade to v1.2.1

I'm currently facing issues with the fancier TensorFlow optimizers. The cost function is a simple cross-entropy, and the input sizes vary (dimensions defined as None). No optimizer works other than GradientDescentOptimizer. Below are the errors I get:
Momentum Optimizer: AttributeError: 'Tensor' object has no attribute 'is_fully_defined'
RMSPropOptimizer: ValueError: Shape of a new variable (expanding/step4/deconv/bias/RMSProp/) must be fully defined, but instead was <unknown>.
AdamOptimizer: AttributeError: 'Tensor' object has no attribute 'is_fully_defined'
GradientDescentOptimizer: Works!
I worked with AdamOptimizer (with the same code) on TF1.0, which broke after an upgrade to TF1.2.1. I then replaced it with MomentumOptimizer, which initially worked (for a few runs), and then it never worked (weird, I know!).
This problem is confusing me too; if you solve it, please share why it happens. Thanks a lot.
I used the Adam optimizer and got the same error:
AttributeError: 'Tensor' object has no attribute 'is_fully_defined'
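A hedged guess at a workaround: Momentum, RMSProp, and Adam create slot variables shaped like each trainable variable, so every trainable variable needs a fully defined static shape, even when the input's batch dimension is None. Pinning the variable shape at creation time (a hypothetical deconv bias below) avoids the "must be fully defined" error:
import tensorflow as tf

# The input may keep a dynamic batch dimension...
x = tf.placeholder(tf.float32, shape=[None, 64, 64, 3])

# ...but variables must have fully defined shapes, or the optimizer
# cannot create matching slot variables for them.
bias = tf.get_variable('deconv/bias', shape=[32],
                       initializer=tf.zeros_initializer())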