ValueError: Input 0 of layer sequential is incompatible with the layer: - tensorflow

model = keras.models.Sequential([
    keras.layers.Dense(30, activation="relu", input_shape=[8]),
    keras.layers.Dense(100, activation="relu"),
    keras.layers.Dense(1)])
model.compile(loss="mse", optimizer=keras.optimizers.SGD(lr=1e-3))
checkpoint_cb = keras.callbacks.ModelCheckpoint("Model-{epoch:02d}.h5")
history = model.fit(X_train, y_train, epochs=10,
                    validation_data=(X_valid, y_valid),
                    callbacks=[checkpoint_cb])
I am trying to fit a model using callbacks, but I am getting the following error:
ValueError: Input 0 of layer sequential is incompatible with the layer: expected axis -1 of input shape to have value 8 but received input with shape [None, 28, 28]
What could be causing this error?

The shape of your X_train is (None, 28, 28), but your first Dense layer expects input of shape (None, 8). Reshape X_train so that each 28x28 sample becomes a flat vector of 784 values:
X_train = X_train.reshape(-1, 28*28)
The model should then be:
model = keras.models.Sequential([
    keras.layers.Dense(30, activation="relu", input_shape=(784,)),
    keras.layers.Dense(100, activation="relu"),
    keras.layers.Dense(1)])
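Alternatively (a minimal sketch, not part of the original answer), you can keep X_train as (None, 28, 28) and let the model do the flattening with a Flatten layer:
# Sketch: Flatten turns each (28, 28) sample into a 784-long vector inside the model,
# so no manual reshape of X_train is needed.
model = keras.models.Sequential([
    keras.layers.Flatten(input_shape=[28, 28]),
    keras.layers.Dense(30, activation="relu"),
    keras.layers.Dense(100, activation="relu"),
    keras.layers.Dense(1)])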

ValueError: Input 0 of layer lstm_27 is incompatible with the layer: expected ndim=3, found ndim=2. Full shape received: (None, 5)

I have some pixel-movement data with 5 features and 3,715,489 training samples. I keep getting this error and I don't know what to set the input_shape of the LSTM to.
X_train has shape (3715489, 5). Do I need to reshape it?
y_train has shape (3715489, 8).
Here is my code:
model = Sequential()
model.add(LSTM(256, return_sequences=True, input_shape=(5,)))
model.add(Dense(8, activation='sigmoid'))
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
print(model.summary())
model.fit(x_train, y_train, epochs=100, batch_size=320)
As mentioned in the error,
ValueError: Input 0 of layer lstm_27 is incompatible with the layer: expected ndim=3, found ndim=2
The LSTM layer expects 3D inputs of shape (batch_size, timesteps, input_dim), so the input_shape argument should be (timesteps, input_dim). You are passing input_shape=(5,), which does not include the timesteps dimension. Adding that extra dimension to both the data and input_shape solves the problem.
import tensorflow as tf
from tensorflow.keras.models import Sequential

# Reshape the data to (samples, timesteps, features)
x_train = tf.reshape(x_train, (-1, 1, 5))
y_train = tf.reshape(y_train, (-1, 1, 8))

model = Sequential()
model.add(tf.keras.layers.LSTM(256, return_sequences=True, input_shape=(1, 5)))
model.add(tf.keras.layers.Dense(8, activation='sigmoid'))
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
print(model.summary())
model.fit(x_train, y_train, epochs=1, batch_size=320)
Please refer to this link for more information. Thank you!
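Note (an addition beyond the original answer): if you only need one output vector per sample rather than one per timestep, you can keep y_train as (3715489, 8) and drop return_sequences, for example:
# Sketch, assuming one 8-way label per sample: without return_sequences the LSTM
# returns only its last hidden state of shape (batch, 256), so y_train needs no reshape.
x_train = tf.reshape(x_train, (-1, 1, 5))
model = Sequential()
model.add(tf.keras.layers.LSTM(256, input_shape=(1, 5)))  # return_sequences defaults to False
model.add(tf.keras.layers.Dense(8, activation='sigmoid'))
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])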

How can I set 'input_shape' of keras.layers.SimpleRNN when the data is univariate?

I am trying to do time-series forecasting with an RNN, but I keep getting an error related to the 'input_shape' of keras.layers.SimpleRNN and have not been able to solve it, so I would like to ask a question.
First, here is the error message, followed by my code:
ValueError: Input 0 of layer sequential is incompatible with the layer: expected ndim=3, found ndim=2. Full shape received: (None, 1)
# X_train.shape = (58118,)
# y_train.shape = (58118,)
X_train, X_test, y_train, y_test = train_test_split(x, y, test_size=0.2, shuffle=False, random_state=1004)
X_train, X_val, y_train, y_val = train_test_split(X_train, y_train, test_size=0.125, shuffle=False, random_state=1004)
print(X_train.shape)
print(y_train.shape)
with tf.device('/gpu:0'):
    model = keras.models.Sequential([
        keras.layers.SimpleRNN(20, return_sequences=True, input_shape=[None, 1]),
        keras.layers.SimpleRNN(20, return_sequences=True),
        keras.layers.TimeDistributed(keras.layers.Dense(10))
    ])
    model.compile(loss="mse", optimizer="adam")
    history = model.fit(X_train, y_train, epochs=20, validation_data=(X_val, y_val))  # Error
    model.save('rnn.h5')
SimpleRNN expects its input to be a 3D tensor with shape [batch, timesteps, feature].
Sample code:
inputs = np.random.random([32, 10, 8]).astype(np.float32)
simple_rnn = tf.keras.layers.SimpleRNN(4)
output = simple_rnn(inputs)
Output has shape [32, 4].
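Your X_train has shape (58118,), i.e. a univariate series, so it needs a timesteps and a feature dimension before it can be fed to SimpleRNN. One common way to build such 3D input (a sketch with a hypothetical window length, not tied to the exact model above) is a sliding window:
import numpy as np

window = 20  # hypothetical window length

def make_windows(series, window):
    # Build overlapping windows: X gets shape (samples, timesteps, 1),
    # y gets the value that follows each window.
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])
        y.append(series[i + window])
    return np.array(X).reshape(-1, window, 1), np.array(y)

X_train_3d, y_train_w = make_windows(X_train, window)
print(X_train_3d.shape)  # (58118 - window, 20, 1)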

Logits and Labels must have the same shape : Tensorflow

I am trying to classify cats vs. dogs using a CNN, but despite checking twice I am not able to find where the error is coming from. As far as I can tell, the loss function and shapes are in order, yet I still cannot find the source of the error.
!unzip cats_and_dogs.zip
PATH = 'cats_and_dogs'
train_dir = os.path.join(PATH, 'train')
train_image_generator = ImageDataGenerator(rescale=1./255)
train_data_gen = train_image_generator.flow_from_directory(batch_size=batch_size,
                                                            directory=train_dir,
                                                            target_size=(IMG_HEIGHT, IMG_WIDTH),
                                                            class_mode='binary')
augmented_images = [train_data_gen[0][0][0] for i in range(5)]
plotImages(augmented_images)
model = Sequential()
model.add(Conv2D(25,kernel_size=3,input_shape=(IMG_HEIGHT, IMG_WIDTH, 3),activation="relu"))
model.add(MaxPooling2D())
model.add(Conv2D(25,kernel_size=3,activation="relu"))
model.add(MaxPooling2D())
model.add(Conv2D(25,kernel_size=3,activation="relu"))
model.add(MaxPooling2D())
model.add(Conv2D(25,kernel_size=3,activation="relu"))
model.add(Dense(64,activation="relu"))
model.add(Dense(1,activation="sigmoid"))
model.summary()
model.compile(optimizer="adam",metrics=['accuracy'],loss='binary_crossentropy')
history = model.fit_generator(train_data_gen)
The error that I'm struggling with is
ValueError: logits and labels must have the same shape ((None, 15, 15, 1) vs (None, 1))
I forgot to add a Flatten layer before passing the tensor to the Dense layers.
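A minimal sketch of that fix (reusing the layers from the question; the only change is the Flatten before the Dense layers, so the output shape becomes (None, 1) and matches the binary labels):
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

model = Sequential()
model.add(Conv2D(25, kernel_size=3, input_shape=(IMG_HEIGHT, IMG_WIDTH, 3), activation="relu"))
model.add(MaxPooling2D())
model.add(Conv2D(25, kernel_size=3, activation="relu"))
model.add(MaxPooling2D())
model.add(Conv2D(25, kernel_size=3, activation="relu"))
model.add(MaxPooling2D())
model.add(Conv2D(25, kernel_size=3, activation="relu"))
model.add(Flatten())  # collapses the (15, 15, 25) feature map to a vector
model.add(Dense(64, activation="relu"))
model.add(Dense(1, activation="sigmoid"))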

Tensorflow neural network does not work, incompatible types

This is my code:
X_train, Y_train, X_test, Y_test = load_data(DATA_PATH)

model = keras.Sequential([
    # input layer
    # 1st dense layer
    keras.layers.Dense(256, activation='relu', input_shape=(X_train.shape[1], X_train.shape[2], X_train.shape[3])),
    # 2nd dense layer
    keras.layers.Dense(128, activation='relu'),
    # 3rd dense layer
    keras.layers.Dense(64, activation='relu'),
    # output layer
    keras.layers.Dense(10, activation='softmax')
])
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
model.summary()
classifier = model.fit(X_train,
                       Y_train,
                       epochs=100,
                       batch_size=128)
Y_train, X_train and Y_test, X_test are NumPy arrays. X_train contains 800 and X_test 200 .png pictures of size 128x128.
Y_train contains the 800 training labels (80x1, 80x2, etc.) and Y_test contains the testing targets (20x1, 20x2, etc.).
When I try to run this program I get the following error:
ValueError: Shapes (None, 1) and (None, 128, 128, 10) are incompatible
You need to reshape your input so that each sample is a single flat feature vector, and give the first Dense layer an input_shape that does not include the batch dimension.
Here is working code:
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

X_train = tf.random.normal(shape=(800, 128, 128, 3))
X_test = tf.random.normal(shape=(200, 128, 128, 3))
Y_train = tf.random.normal(shape=(800, 10))
Y_test = tf.random.normal(shape=(200, 10))

# Reshape: flatten each 128x128x3 image into a single feature vector
X_train = tf.reshape(X_train, shape=(800, 128*128*3))

model = keras.Sequential([
    # 1st dense layer; input_shape is the per-sample shape (features only, no batch dimension)
    keras.layers.Dense(256, activation='relu', input_shape=(X_train.shape[1],)),
    # 2nd dense layer
    keras.layers.Dense(128, activation='relu'),
    # 3rd dense layer
    keras.layers.Dense(64, activation='relu'),
    # output layer
    keras.layers.Dense(10, activation='softmax')
])
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
model.summary()
classifier = model.fit(X_train,
                       Y_train,
                       epochs=100,
                       batch_size=128)
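Alternatively (an addition beyond the original answer, sketched under the assumption that the images stay in their 4D (samples, 128, 128, 3) layout), you can let a Flatten layer do the reshaping inside the model:
model = keras.Sequential([
    keras.layers.Flatten(input_shape=(128, 128, 3)),  # flattens each image to 128*128*3 values
    keras.layers.Dense(256, activation='relu'),
    keras.layers.Dense(128, activation='relu'),
    keras.layers.Dense(64, activation='relu'),
    keras.layers.Dense(10, activation='softmax')
])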

Keras ImageDataGenerator() expect 3D data and 4-rank tensor at once?

I've been trying to use keras.preprocessing.image.ImageDataGenerator() on MNIST to generate extra examples.
I'm using a fully-connected network in Keras. I begin by reshaping the 3D MNIST arrays into 4D tensors before building and compiling the model, then I use data augmentation to help fit the model.
X_train = X_train.reshape(X_train.shape[0], 28, 28, 1)
X_test = X_test.reshape(X_test.shape[0], 28, 28, 1)
model = Sequential(name="mlp")
model.add(ll.InputLayer([28, 28]))
model.add(ll.Flatten())
model.add(ll.Dense(512, kernel_regularizer=regularizers.l2(0.04)))
model.add(ll.Activation('relu'))
model.add(ll.Dense(512, kernel_regularizer=regularizers.l2(0.04)))
model.add(ll.Activation('relu'))
model.add(ll.Dense(256, kernel_regularizer=regularizers.l2(0.04)))
model.add(ll.Activation('relu'))
model.add(ll.Dense(128, kernel_regularizer=regularizers.l2(0.04)))
model.add(ll.Activation('relu'))
model.add(ll.Dense(32, kernel_regularizer=regularizers.l2(0.04)))
model.add(ll.Activation('relu'))
model.add(ll.Dense(10, activation='softmax'))
model.compile("adam", "categorical_crossentropy", metrics=["accuracy"])
gen = ImageDataGenerator(rotation_range=8, width_shift_range=0.08, shear_range=0.3,
                         height_shift_range=0.08, zoom_range=0.08)
test_gen = ImageDataGenerator()
train_generator = gen.flow(X_train, y_train, batch_size=64)
test_generator = test_gen.flow(X_test, y_test, batch_size=64)
...
model.fit_generator(train_generator, steps_per_epoch=60000//64, epochs=5,
                    validation_data=test_generator, validation_steps=10000//64)
I get this error:
5 model.fit_generator(train_generator, steps_per_epoch=60000//64, epochs=5,
----> 6 validation_data=test_generator, validation_steps=10000//64)
ValueError: Error when checking input: expected input_6 to have 3 dimensions, but got array with shape (256, 28, 28, 1)
But when I omit the conversion to a 4D tensor, this happens instead:
52 test_gen = ImageDataGenerator()
---> 53 train_generator = gen.flow(X_train, y_train, batch_size=64)
54 test_generator = test_gen.flow(X_test, y_test, batch_size=64)
ValueError: ('Input data in `NumpyArrayIterator` should have rank 4. You passed an array with shape', (50000, 28, 28))
It seems that the train and test generators produce 4D tensors, but the model itself wants 3D tensors.
Well, your data is (batch, 28, 28, 1) while your model (the input layer) is expecting (batch, 28, 28).
Solving this is as simple as changing the input shape of the model:
# You don't need to add an InputLayer; just pass the shape to the first layer:
model = Sequential(name="mlp")
model.add(ll.Flatten(input_shape=(28, 28, 1)))
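As a side note (an addition, not part of the original answer): in recent TensorFlow/Keras versions fit_generator is deprecated, and the generator can be passed to fit directly, for example:
# Sketch, assuming the corrected model and the generators defined above.
model.fit(train_generator,
          steps_per_epoch=60000 // 64,
          epochs=5,
          validation_data=test_generator,
          validation_steps=10000 // 64)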