Use another Activation Function in YOLOv7 - yolo

How do you change the activation function in YOLOv7? For example, the default activation function of YOLOv7 (except the tiny variant) is SiLU according to the YOLOv7 paper, and I want to change it to ReLU or another activation function.
I found a solution, but it's for YOLOv5.
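For what it's worth, the YOLOv5 approach should carry over almost unchanged, assuming the YOLOv7 repo keeps the same Conv module layout in models/common.py (the file name, the Conv class, and the autopad helper below are taken from that layout and may differ between releases). A minimal sketch of the change:

# models/common.py (sketch) -- the default activation lives in the Conv block
import torch.nn as nn

class Conv(nn.Module):
    # Standard convolution: conv -> batch norm -> activation
    def __init__(self, c1, c2, k=1, s=1, p=None, g=1, act=True):
        super().__init__()
        # autopad is the repo's existing padding helper
        self.conv = nn.Conv2d(c1, c2, k, s, autopad(k, p), groups=g, bias=False)
        self.bn = nn.BatchNorm2d(c2)
        # The default was nn.SiLU(); swap in nn.ReLU() (or any nn.Module) here.
        self.act = nn.ReLU() if act is True else (act if isinstance(act, nn.Module) else nn.Identity())

    def forward(self, x):
        return self.act(self.bn(self.conv(x)))

Any other module in common.py that hard-codes its own activation would need the same one-line swap.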

Related

Multiple Activation Functions for multiple Layers (Neural Networks)

I have a binary classification problem for my neural network.
I already got good results using the ReLU activation function in my hidden layer and the sigmoid function in the output layer.
Now I'm trying to get even better results.
I added a second hidden layer with the ReLU activation function, and the results got even better.
I tried to use the leaky ReLU function for the second hidden layer instead of the ReLU function and got even better results, but I'm not sure if this is even allowed.
So I have something like this:
Hidden layer 1: ReLU activation function
Hidden layer 2: leaky ReLU activation function
Output layer: sigmoid activation function
I can't find many resources on it, and those I found always use the same activation function on all hidden layers.
If you mean the Leaky ReLU, I can say that, in fact, the Parametric ReLU (PReLU) is the activation function that generalizes the traditional rectified unit as well as the leaky ReLU. And yes, PReLU improves model fitting with no significant extra computational cost and little overfitting risk.
For more details, you can check out this link: Delving Deep into Rectifiers
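As an illustration (a minimal Keras sketch, not part of the original answer; the layer sizes, input shape, and LeakyReLU slope are made up), mixing activations across hidden layers is just a matter of specifying them per layer:

from tensorflow.keras import layers, models

# Binary classifier with a different activation in each hidden layer.
model = models.Sequential([
    layers.Dense(64, activation='relu', input_shape=(20,)),  # hidden layer 1: ReLU
    layers.Dense(64),                                        # hidden layer 2 (activation applied next)
    layers.LeakyReLU(alpha=0.1),                             # leaky ReLU; layers.PReLU() would learn the slope instead
    layers.Dense(1, activation='sigmoid'),                   # output layer: sigmoid
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])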

Do you need to define derivative function for custom activation function in tensorflow 2 keras?

I see in some places that you need to define the derivative function for your custom activation. Is this true? Or do you just need to pass a TensorFlow-compatible function to the wrapper, and tensorflow.keras takes care of the rest?
I.e.

def my_actv(x):
    return x * x

model.add(Activation(my_actv))
The derivative needs to be defined if your function is not differentiable at every point. For example, relu is not differentiable at zero.
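As a sketch of both cases (TF 2.x; my_actv is the questioner's function, my_actv_with_grad is a hypothetical name): an activation built from differentiable TF ops needs no explicit derivative, and tf.custom_gradient is the mechanism if you do want to supply one.

import tensorflow as tf

# Built from TF ops: autodiff provides the gradient, nothing extra to define.
def my_actv(x):
    return x * x

# Explicitly supplying the derivative, if you want to override autodiff:
@tf.custom_gradient
def my_actv_with_grad(x):
    def grad(dy):
        return dy * 2.0 * x  # d/dx (x^2) = 2x
    return x * x, grad

model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, input_shape=(4,)),
    tf.keras.layers.Activation(my_actv),
    tf.keras.layers.Dense(1),
])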

What is the default activation function of cudnnlstm in tensorflow

What's the default activation function of CudnnLSTM in TensorFlow? How can I set an activation function such as relu? Maybe it's just a linear model? I read the documentation, but I did not find it.
For example, the code is below:
lstmcell = tf.contrib.cudnn_rnn.CudnnLSTM(1, encoder_size, direction="bidirectional")
hq, _ = lstmcell(query)
And I read the TensorFlow documentation from this link.
The constructor signature is below:
__init__(
num_layers,
num_units,
input_mode=CUDNN_INPUT_LINEAR_MODE,
direction=CUDNN_RNN_UNIDIRECTION,
dropout=0.0,
seed=None,
dtype=tf.float32,
kernel_initializer=None,
bias_initializer=None,
name=None
)
And there is no keyword to set a parameter such as activation="tanh", like in tf.nn.rnn_cell.LSTMCell.
So what's the default activation function of CudnnLSTM in TensorFlow, and how can I change it to leaky_relu?
tf.contrib.cudnn_rnn.CudnnLSTM() : Tanh
This was given in the Keras GitHub:
https://github.com/keras-team/keras/issues/8510#issuecomment-429255318
Nvidia documentation.
https://devblogs.nvidia.com/optimizing-recurrent-neural-networks-cudnn-5/
To answer the OP's second question, which was edited in later: there is currently no way to set a custom activation function for CudnnLSTM and CudnnGRU.
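If a non-default activation such as leaky_relu is really needed, one workaround (a sketch, not part of the original answer; the shapes are made up) is to fall back to the standard LSTMCell, which does take an activation argument, at the cost of losing the cuDNN-fused speedup:

import tensorflow as tf

encoder_size = 128
# Hypothetical input standing in for the question's `query`: [batch, time, features]
query = tf.placeholder(tf.float32, [None, 20, 300])

cell_fw = tf.nn.rnn_cell.LSTMCell(encoder_size, activation=tf.nn.leaky_relu)
cell_bw = tf.nn.rnn_cell.LSTMCell(encoder_size, activation=tf.nn.leaky_relu)
(out_fw, out_bw), _ = tf.nn.bidirectional_dynamic_rnn(
    cell_fw, cell_bw, query, dtype=tf.float32)
hq = tf.concat([out_fw, out_bw], axis=-1)  # bidirectional output, analogous to hq above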

How to change the activation function in LSTM cell in Tensorflow

I am trying to change the activation function in the LSTM cell in the new 1.0 release of TensorFlow, but am having difficulty.
There is tf.contrib.rnn.LSTMCell, which the API states should allow for changing activation functions, but this does not seem to be implemented yet for this cell.
Furthermore, tf.contrib.rnn.BasicLSTMCell, which should also allow for different activation functions, doesn't seem to exist anymore.
Do I just need to wait or is there another solution?
When you instantiate both tf.contrib.rnn.LSTMCell and tf.contrib.rnn.BasicLSTMCell, you can pass the activation function as the activation parameter. If you look at the linked documentation, you'll see, for example, that the constructor's signature for BasicLSTMCell is
__init__(num_units, forget_bias=1.0, input_size=None, state_is_tuple=True, activation=tf.tanh)
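So a minimal sketch (TF 1.x contrib API; the unit count is made up):

import tensorflow as tf

# Pass the desired activation when constructing the cell.
cell = tf.contrib.rnn.BasicLSTMCell(num_units=64, activation=tf.nn.relu)

The same activation keyword works for tf.contrib.rnn.LSTMCell.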

how to change activation function in DNNClassifier in tensorflow r0.9?

I couldn't find a way to change the activation function in DNNClassifier. The documentation is not well written. I want to do something like:
classifier = learn.DNNClassifier(hidden_units=[8,16,8], n_classes=2, activation_fn=relu)
But there is no activation_fn parameter in the function, so I can't change it.
Can anyone help? Thanks,
So there are a bunch of different activation functions out there. The dictionary below just gives you the more common ones. You can find out about all activation functions here: https://www.tensorflow.org/versions/r0.11/api_docs/python/nn.html
activation = {'relu': tf.nn.relu,
              'tanh': tf.nn.tanh,
              'sigmoid': tf.nn.sigmoid,
              'elu': tf.nn.elu,
              'softplus': tf.nn.softplus,
              'softsign': tf.nn.softsign,
              'relu6': tf.nn.relu6,
              'dropout': tf.nn.dropout}
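For completeness, a sketch of wiring that lookup into the estimator; this assumes a release where tf.contrib.learn.DNNClassifier exposes an activation_fn argument (later contrib.learn releases do; r0.9 itself may not), so treat it as illustrative rather than a drop-in fix:

import tensorflow as tf
from tensorflow.contrib import learn

activation = {'relu': tf.nn.relu, 'tanh': tf.nn.tanh, 'elu': tf.nn.elu}

# Pick the activation by name and hand it to the estimator.
classifier = learn.DNNClassifier(hidden_units=[8, 16, 8],
                                 n_classes=2,
                                 activation_fn=activation['tanh'])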