tf.nn.relu vs tf.keras.activations.relu [closed] - tensorflow

I see that both tf.nn.relu and tf.keras.activations.relu compute only the ReLU function (no additional fully connected layer or anything, as described here), so what's the difference between them? Does one just wrap the other?

tf.nn.relu: It comes from the core TensorFlow library and lives in the nn module, so it is used as a low-level operation in neural networks. If x is a tensor, then:
y = tf.nn.relu(x)
It is used when creating custom layers and networks. If you use it with Keras, you may face some problems while saving or loading models or converting them to TF Lite.
tf.keras.activations.relu: It comes from the Keras library bundled with TensorFlow and lives in the activations module, which also provides other activation functions. It is mostly used in Keras layers (tf.keras.layers) via the activation= argument:
model.add(keras.layers.Dense(25, activation=tf.keras.activations.relu))
But it can also be used as in the example in the section above. It is more specific to Keras (Sequential or Model) than to raw TensorFlow computations.
In short, tf.nn.relu is TensorFlow-specific, whereas tf.keras.activations.relu has more uses within Keras's own library. If I create a network with only TF, I will most probably use tf.nn.relu, and if I am creating a Keras Sequential model, then I will use tf.keras.activations.relu.
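To see that they compute the same values elementwise, here is a minimal sketch (assuming TF 2.x with eager execution); note that tf.keras.activations.relu additionally accepts arguments such as alpha and max_value:

import tensorflow as tf

x = tf.constant([-2.0, -0.5, 0.0, 1.5, 3.0])

y_nn = tf.nn.relu(x)                    # low-level op from the nn module
y_keras = tf.keras.activations.relu(x)  # Keras activation function

print(y_nn.numpy())     # [0.  0.  0.  1.5 3. ]
print(y_keras.numpy())  # identical output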

Related

Is there a PyTorch lightning equivalent for Tensorflow? [closed]

I saw PyTorch Lightning advertised as PyTorch for people who don't want to worry so much about the underlying methodology. This narrative is on the PyTorch Lightning website, but also here, for example.
For hardware reasons, does something similar exist for TensorFlow? I have a code example for neural nets here, written in PyTorch and PyTorch Lightning, but I am not sure how to rewrite it in TensorFlow.
Probably the closest equivalent would be Keras (formerly separate from, but for some time now integrated into, TF): you can use Keras as a high-level API.
Note that you can also use the tensorflow_addons package (I personally enjoy working with it) and other libraries and wrappers that complement TensorFlow; since Keras is integrated into TF, you are very likely to use them alongside your Keras code as well.
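As a minimal sketch of how Keras hides the training loop much like Lightning does for PyTorch (the dataset and layer sizes here are illustrative assumptions, not from the question):

import tensorflow as tf

(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0

model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# compile() + fit() replace the hand-written optimization loop
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=2, batch_size=64)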

Pretrained alexnet in tensorflow [closed]

I want to use a pretrained AlexNet for transfer learning, but I don't see it available in the Keras library. Am I missing something here?
The other alternative I see is to create the model myself and then either load pretrained weights or train from scratch.
Training from scratch on the ImageNet dataset is not possible for me due to resource constraints; loading pre-trained weights would work.
Would you provide any pointers for getting the pretrained weights for AlexNet?
Thanks,
As of right now, Keras does not (officially) seem to offer a pre-trained AlexNet model. PyTorch, on the other hand, does. If you are willing to use a different framework for the task, you can use PyTorch and retrieve a pre-trained version of AlexNet like so:
import torchvision.models as models

# Downloads the ImageNet weights on first use
alexnet = models.alexnet(pretrained=True)
You can find the list of available pre-trained models here, and a transfer learning tutorial for image classification here.
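If you then want to use it for transfer learning, a common pattern (a sketch; the 10 target classes below are a hypothetical example) is to freeze the pretrained features and replace the final classifier layer:

import torch.nn as nn
import torchvision.models as models

alexnet = models.alexnet(pretrained=True)

# Freeze the pretrained convolutional feature extractor
for param in alexnet.features.parameters():
    param.requires_grad = False

# Replace the last fully connected layer to match your task
alexnet.classifier[6] = nn.Linear(4096, 10)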
Hope that answers your question!

How to use Variational Autoencoder as a Feature Extractor? [closed]

I want to use my VAE, trained on an image dataset, as a feature extractor for another task, so that I could, for example, replace a ResNet feature extractor with my VAE.
Which layers do I use for this?
With "standard" autoencoders you just take the encoder network, but since the latent layer of a VAE consists of a mean and a variance (the parameters of a distribution), I do not know which layers I should use for feature extraction.
Does somebody know how to use a VAE as a feature extractor and what to consider when using the different components?
The hidden variables z are used in VAEs as the extracted features for dimensionality reduction. Here is an example of dimensionality reduction from four features in the original space ([x1,x2,x3,x4]) to two features in the reduced space ([z1,z2]) (source).
Once you have trained the model, you can pass a sample to the encoder and it extracts the features. You may find a Keras implementation example on MNIST data here (see the plot_label_clusters function).
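In practice, a common choice is to use the mean of the approximate posterior (z_mean) as a deterministic feature vector rather than a random sample z. A minimal sketch, assuming your trained VAE exposes its encoder as a separate Keras model returning (z_mean, z_log_var, z) as in the linked Keras example:

# Assumption: vae.encoder returns (z_mean, z_log_var, z)
def extract_features(vae, images):
    z_mean, z_log_var, z = vae.encoder(images)
    # The posterior mean gives deterministic features;
    # sampling z instead would make them stochastic.
    return z_mean

# features = extract_features(vae, x_batch)  # shape: (batch, latent_dim)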

Is there an AlexNet model written with tensorflow without pre-trained weights? [closed]

I have been looking for AlexNet models written in TensorFlow, and all I found was code using pre-trained weights.
Do you have any idea whether there exists code in which the weights are built during the execution of the model?
Thanks.
You can find a nice article here: Finetuning AlexNet with TensorFlow.
It contains a link to the GitHub code.
You can find a definition of the AlexNet model in TensorFlow at the path tensorflow/contrib/slim/python/slim/nets/alexnet.py of the TensorFlow repository (among the examples of what used to be TF-Slim and is now just tf.contrib.layers).
Another alternative is here, with a link to the model. But you can always train from scratch and check for yourself.
Note: this only runs for 30 or so epochs (at least at the time of writing) with lower accuracy than claimed in the paper, but you can always tweak the learning rate and run for more epochs to get better accuracy.
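If you only need the architecture with weights built (randomly initialized) at execution time, here is a compact Keras sketch of the AlexNet layout; it follows the kernel sizes of the original paper but omits details such as local response normalization for brevity:

import tensorflow as tf
from tensorflow.keras import layers

def alexnet(num_classes=1000):  # 1000 classes assumed (ImageNet)
    return tf.keras.Sequential([
        layers.Conv2D(96, 11, strides=4, activation="relu",
                      input_shape=(227, 227, 3)),
        layers.MaxPooling2D(3, strides=2),
        layers.Conv2D(256, 5, padding="same", activation="relu"),
        layers.MaxPooling2D(3, strides=2),
        layers.Conv2D(384, 3, padding="same", activation="relu"),
        layers.Conv2D(384, 3, padding="same", activation="relu"),
        layers.Conv2D(256, 3, padding="same", activation="relu"),
        layers.MaxPooling2D(3, strides=2),
        layers.Flatten(),
        layers.Dense(4096, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(4096, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),
    ])

model = alexnet()
model.summary()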

Is there an implementation of convolutional lstm in tensorflow? [closed]

I am currently studying TensorFlow. I have written some simple code like CNNs, RNNs, LSTMs and so on, and now I want to implement a convolutional LSTM. I read this paper and tried to implement it as an exercise. However, as far as I searched, there was no code available on the internet. If someone knows where available source code is, please let me know.
Yes, this is done in the Neural GPU TensorFlow model by Łukasz Kaiser and Ilya Sutskever.
It uses GRUs rather than LSTMs, but those are very similar cell types. The model is also a little different from typical RNN implementations. The nuance is that the model does not accept new inputs at time steps past the first one: the inputs are fed into the initial cell state, and this "mental image" state evolves through the time steps.
The paper is here.
The Neural GPU model implementation in TensorFlow can be found here.
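Note that more recent TensorFlow versions also ship a convolutional LSTM layer directly in Keras, tf.keras.layers.ConvLSTM2D, so a small sketch is possible without writing a custom cell (the input shapes below are illustrative assumptions):

import tensorflow as tf

# Input: (time, height, width, channels); ConvLSTM2D applies
# convolutional gates instead of a regular LSTM's dense gates.
model = tf.keras.Sequential([
    tf.keras.layers.ConvLSTM2D(filters=16, kernel_size=(3, 3),
                               padding="same", return_sequences=False,
                               input_shape=(10, 64, 64, 1)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.summary()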