Custom Layer not supporting serialized custom activation function - tensorflow

Since the release of TensorFlow 2.6, I am having an issue I did not have with version 2.5.
The following code works OK:
from tensorflow.keras.utils import get_custom_objects
from tensorflow.keras.layers import Dense

def my_act(x):
    return x

get_custom_objects().update({"my_act": my_act})
dense = Dense(3, activation="my_act")
However, if I try to do the same with a custom layer instead of one of TensorFlow's built-in layers, I get the error:
ValueError: Unknown activation function: my_act. Please ensure this object is passed to the `custom_objects` argument. See https://www.tensorflow.org/guide/keras/save_and_serialize#registering_the_custom_object for details.
Here is the minimum code to reproduce it, plus a demonstration that it works with version 2.5 (you need to restart the runtime to run it, though).
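The original notebook is not reproduced here, but a minimal sketch of the kind of custom layer that triggers the error might look like this (the layer name MyDense and its internals are illustrative; the key assumption is that the layer resolves its activation string through tensorflow.python.keras, which the answer below points at):

from tensorflow.keras.utils import get_custom_objects
from tensorflow.keras.layers import Layer
from tensorflow.python.keras import activations  # problematic import on TF 2.6

def my_act(x):
    return x

get_custom_objects().update({"my_act": my_act})

class MyDense(Layer):
    def __init__(self, units, activation=None, **kwargs):
        super().__init__(**kwargs)
        self.units = units
        # tensorflow.python.keras keeps its own custom-object registry in
        # TF 2.6, so it cannot see "my_act" registered via tensorflow.keras.
        self.activation = activations.get(activation)

    def build(self, input_shape):
        self.kernel = self.add_weight("kernel", shape=(input_shape[-1], self.units))

    def call(self, inputs):
        return self.activation(inputs @ self.kernel)

dense = MyDense(3, activation="my_act")  # raises the ValueError above on TF 2.6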

Try importing activations like this:
from tensorflow.keras import activations
instead of from tensorflow.python.keras import activations.
In TensorFlow 2.7 and later, tensorflow.python will no longer exist, and in TF 2.6 it already seems to be incompatible with some other functions.
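Applied to the sketch above, the only change needed is the import (again illustrative, since the original layer is not shown):

from tensorflow.keras import activations  # instead of tensorflow.python.keras

class MyDense(Layer):
    def __init__(self, units, activation=None, **kwargs):
        super().__init__(**kwargs)
        self.units = units
        # tensorflow.keras.activations.get() consults the same registry that
        # get_custom_objects() writes to, so "my_act" is now found.
        self.activation = activations.get(activation)
    # build() and call() stay as before

dense = MyDense(3, activation="my_act")  # works on TF 2.6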

Related

Pylance "layers" is not a known member of "None" when calling model.layers in Tensorflow v2?

I'm getting this Pylance warning in the 2nd code line when I call model.layers and I have no idea why.
Yet, I can't find any questions regarding the same issue.
Although it is just a warning, I'd like to understand it.
Does anyone have any solution to this?
import tensorflow as tf
from tensorflow import keras
from keras.models import load_model

model = load_model('my_model_final.h5', compile=False)
for layer in model.layers[0:20]:
    layer.trainable = False
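There is no confirmed explanation here, but one common workaround sketch: Pylance appears to infer an Optional-like return type for load_model, so explicitly ruling out None lets it narrow the type and drop the warning. The assert is purely for the type checker (it does nothing when the model loads successfully), and the import is moved to tensorflow.keras here to keep a single Keras namespace, which can also affect what Pylance resolves:

import tensorflow as tf
from tensorflow.keras.models import load_model

model = load_model('my_model_final.h5', compile=False)
assert model is not None  # helps Pylance narrow the inferred type

for layer in model.layers[0:20]:
    layer.trainable = False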

Module not found using Input function from tensorflow.keras.layers

I'm quite new to machine learning, and my first project is a neural network to detect facial key points on Google Colab. Everything has been working fine, but today when I wanted to train my neural network I came across an error that had never appeared before.
The error is:
---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
<ipython-input-189-47fd3efd0229> in <module>()
      5
      6
----> 7 X_input = Input(input_shape)
      8
      9 # Zero-padding

4 frames

/usr/lib/python3.7/importlib/_bootstrap.py in _find_and_load_unlocked(name, import_)

ModuleNotFoundError: No module named 'keras.engine.base_layer_v1'
I don't understand the line ModuleNotFoundError: No module named 'keras.engine.base_layer_v1', because the line that fails is the one using Input from tensorflow.keras.layers.
I really don't know what is going on, because I never got this error before. I've seen that it could be the TensorFlow version or maybe my libraries.
I am using version 2.3.0 of both TensorFlow and Keras, and these are the libraries I am importing:
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras.applications import DenseNet121
from tensorflow.keras.models import Model, load_model
from tensorflow.keras.initializers import glorot_uniform
from tensorflow.keras.utils import plot_model
from tensorflow.keras.callbacks import ReduceLROnPlateau, EarlyStopping, ModelCheckpoint, LearningRateScheduler
from IPython.display import display
from tensorflow.python.keras import *
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras import layers, optimizers
from tensorflow.keras.applications.resnet50 import ResNet50
from tensorflow.keras.layers import *
from tensorflow.keras import backend as K
from keras import optimizers
I would really appreciate any help :)
Re-installing TensorFlow and Keras worked for me.
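Re-installing aside, one thing worth checking (an assumption based on the import list above, not a confirmed diagnosis): the script mixes from tensorflow.python.keras import *, from keras import optimizers, and tensorflow.keras imports, and errors about keras.engine internals often come from that kind of mixing. A sketch of the same imports drawn from tensorflow.keras only, with an illustrative input shape since the original input_shape is not shown:

import tensorflow as tf
from tensorflow.keras.applications import DenseNet121
from tensorflow.keras.applications.resnet50 import ResNet50
from tensorflow.keras.models import Model, load_model
from tensorflow.keras.initializers import glorot_uniform
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.layers import Input, Dense, ZeroPadding2D
from tensorflow.keras import layers, optimizers, backend as K

X_input = Input(shape=(96, 96, 1))  # illustrative shape; not taken from the question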

AttributeError: module 'tensorflow.compat.v2.__internal__' has no attribute 'tf2'

The error is this:
LOCAL.GENERATED_WITH_V2 = tf.__internal__.tf2.enabled()
AttributeError: module 'tensorflow.compat.v2.__internal__' has no attribute 'tf2'
Thanks
From TensorFlow 2.x onward, the standalone multi-backend Keras is no longer maintained; Keras became part of TensorFlow. Instead of import keras, I would recommend from tensorflow import keras, or import tensorflow as tf and using tf.keras.
You can import Sequential from tensorflow.keras as shown below:
import tensorflow as tf
from tensorflow.keras import Sequential
For more information, you can refer to this and this.
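For example, a minimal sketch of the tf.keras path (the layer sizes are arbitrary, not taken from the question):

import tensorflow as tf

# Build and compile a small model entirely through tf.keras,
# without importing the standalone keras package.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])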

An error ocurred while starting the kernel. I think it is because of two Python versions, but I am unable to figure it out

I am trying to fit a model here, but the thing is that every time I fit a model my kernel dies. I tried every other method, but it didn't work.
I think it may be possible that two Python versions are installed, but I don't know how to fix that or even verify it.
Also, I am using a Mac.
I have tried updating and reinstalling everything.
#Importing libraries
import numpy as np
import pandas as pd
from sklearn.preprocessing import LabelEncoder,OneHotEncoder,StandardScaler
from sklearn.model_selection import train_test_split,cross_val_score
from keras.layers import Dense
import keras
from sklearn.metrics import confusion_matrix,accuracy_score
from keras.wrappers.scikit_learn import KerasClassifier
#Importing Datasets
dataset=pd.read_csv('Churn_Modelling.csv')
X=dataset.iloc[:,3:13].values
y=dataset.iloc[:,13].values
#Data preprocessing
le1=LabelEncoder()
X[:,1]=le1.fit_transform(X[:,1])
le2=LabelEncoder()
X[:,2]=le2.fit_transform(X[:,2])
h1=OneHotEncoder(categorical_features=[1])
X=h1.fit_transform(X).toarray()
X=X[:,1:]
#Splitting Dataset
X_train,X_test,y_train,y_test=train_test_split(X,y,test_size=0.2,random_state=0)
#Feature Scaling
sc=StandardScaler()
X_train=sc.fit_transform(X_train)
X_test=sc.transform(X_test)
#Making ANN hidden layer
classifier=keras.models.Sequential()
classifier.add(Dense(units=6,activation="relu",kernel_initializer="uniform",input_shape=(11,)))
#Adding second hidden layer
classifier.add(Dense(units=6,activation='relu',kernel_initializer='uniform'))
#Adding output layer
classifier.add(Dense(units=1,activation='sigmoid',kernel_initializer='uniform'))
#Compiling ANN
classifier.compile(optimizer='adam',loss='binary_crossentropy',metrics=['accuracy'])
Up to here it works like a charm, with some warnings.
#Making predictions and evaluating it
classifier.fit(X_train,y_train,epochs=100,batch_size=10)
But when I execute this, it shows:
An error ocurred while starting the kernel
b''
Does anyone know how to solve this?
This might help you: https://github.com/spyder-ide/spyder/issues/2812
If you are using Spyder, try:
conda update setuptools
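To check the suspicion about two Python installations (a diagnostic sketch, not part of the original answer), compare the interpreter the Spyder/IPython kernel runs against the one your terminal uses:

import sys

# Run this in the Spyder console, then run
#   python -c "import sys; print(sys.executable)"
# in a terminal; different paths mean the kernel and your shell
# are using different Python installations.
print(sys.executable)
print(sys.version)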

Error importing keras backend - cannot import name has_arg

I attempt to import the Keras backend to call get_session as follows, but I encounter an error:
There should be no need to import the tensorflow_backend explicitly.
Look at the first lines of an example from the Keras documentation:
# TensorFlow example
>>> from keras import backend as K
>>> tf_session = K.get_session()
[...]
As long as you are using the tensorflow backend, the get_session() function should be available.
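As an aside (not part of the original answer): in standalone TensorFlow 2.x, tf.keras.backend no longer exposes get_session(); if session semantics are really needed there, the compatibility path is:

import tensorflow as tf

# Only meaningful when graph/session semantics are required;
# TF 2.x runs eagerly by default.
sess = tf.compat.v1.keras.backend.get_session()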