How to add the Hugging Face library to a Kaggle notebook [closed] - kaggle

How do I add the Hugging Face library to a Kaggle notebook? I want to add this one to my notebook. The code sample below does not work in the notebook I have. Is there some additional step I have missed?

All models on the Hugging Face model hub are available once you install the library. This one in particular is only available in PyTorch, so you should install both transformers and torch:
!pip install transformers torch
You can then use the model:
from transformers import BartForConditionalGeneration
model = BartForConditionalGeneration.from_pretrained("facebook/bart-large-cnn")
This model is a summarization model, so I would recommend reading the summarization tutorial on Hugging Face's website.
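If it helps, here is a minimal sketch of running that checkpoint through the summarization pipeline; the example text and generation parameters below are just placeholders, not part of the original question:
from transformers import pipeline

# Load the BART CNN/DailyMail checkpoint into the summarization pipeline
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

# Hypothetical input text; replace with your own article
article = "Kaggle notebooks come with many libraries preinstalled, but transformers is not always among them, so it has to be installed with pip first."
summary = summarizer(article, max_length=60, min_length=10, do_sample=False)
print(summary[0]["summary_text"])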

Related

Pretrained AlexNet in TensorFlow [closed]

I want to use a pretrained AlexNet for transfer learning, but I don't see it available in the Keras library.
Am I missing something here?
The other alternative I see is to create the model myself and then either
load pre-trained weights, or
train from scratch.
Training from scratch on the ImageNet dataset is not possible for me due to resource constraints; loading pre-trained weights would work.
Could anyone provide pointers for getting the pre-trained weights for AlexNet?
Thanks.
As of right now, Keras does not (officially) seem to offer a pre-trained AlexNet model. PyTorch, on the other hand, does. If you are willing to use a different framework for the task, you can use PyTorch. You can retrieve a pre-trained version of AlexNet like so:
import torchvision.models as models
alexnet = models.alexnet(pretrained=True)
You can find the list of available pre-trained models here, and a transfer learning tutorial for image classification here.
Hope that answers your question!
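As a rough sketch of the transfer-learning step itself (the number of target classes below is just a placeholder), you could freeze the convolutional features and swap out the final classifier layer:
import torch.nn as nn
import torchvision.models as models

alexnet = models.alexnet(pretrained=True)

# Freeze the convolutional feature extractor
for param in alexnet.features.parameters():
    param.requires_grad = False

# Replace the last fully connected layer with one sized for your own dataset
num_classes = 10  # placeholder: set to the number of classes in your data
alexnet.classifier[6] = nn.Linear(alexnet.classifier[6].in_features, num_classes)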

need guidance on using pre-trained weights in segmentation_models API [closed]

I want to use a pre-trained U-Net model from the segmentation_models API on the Cityscapes dataset, but I need the pre-trained weights for it. Where can I find pre-trained weights for a U-Net model trained on the Cityscapes dataset?
Please guide me on this!
U-Net is absent from the Cityscapes benchmark, so I assume it is not well suited to this dataset (probably too slow and not performant enough). However, I advise you to start with DeepLabv3+ from Google, which is not so complicated and is better adapted to this dataset.
You can use this repository, where it is implemented, well documented, and usable with pre-trained weights from the Cityscapes dataset (and also the Pascal VOC dataset).
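For reference, here is a minimal sketch of building a U-Net with the segmentation_models API: the library only ships ImageNet-pretrained encoder weights, not Cityscapes weights, so the decoder still has to be trained on your own Cityscapes data (the backbone choice and the 19-class Cityscapes label count are assumptions here):
import segmentation_models as sm

# Encoder initialized from ImageNet weights; no Cityscapes weights ship with the library
model = sm.Unet(
    backbone_name='resnet34',
    encoder_weights='imagenet',
    classes=19,          # standard Cityscapes label set
    activation='softmax',
)
model.compile(optimizer='adam', loss='categorical_crossentropy')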

I cannot train my model using transfer learning (VGG16) on my GPU [closed]

I am trying to use transfer learning to train a model that detects diseases in rice from images of the plant. I attempted to use VGG16, but I could not get it to train on my GPU. I have an NVIDIA GeForce MX150.
Below is the code that I used to fit the model:
import tensorflow as tf
from tensorflow.python.client import device_lib

print(device_lib.list_local_devices())

with tf.device('/device:GPU:1'):
    # fit the model
    r = model.fit(
        training_set,
        validation_data=test_set,
        epochs=20,
        steps_per_epoch=len(training_set),
        validation_steps=len(test_set)
    )
TensorFlow GPU support requires a few dependencies. Please see https://www.tensorflow.org/install/gpu
Then try tf.test.is_gpu_available() - if this returns True, TensorFlow can see your GPU and will use it for training by default.
On a single GPU, you should not need a with tf.device() block to train on the GPU. For me to help more, please provide any logs or errors.
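As a sketch of what this describes (assuming TensorFlow 2.x, with model, training_set, and test_set defined as in the question), you can check GPU visibility and then call fit without any device context:
import tensorflow as tf

# List the GPUs TensorFlow can see; an empty list means CUDA/cuDNN are not set up
print(tf.config.list_physical_devices('GPU'))

# On a single-GPU machine, Keras places training on the GPU automatically
r = model.fit(
    training_set,
    validation_data=test_set,
    epochs=20,
    steps_per_epoch=len(training_set),
    validation_steps=len(test_set),
)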

What is the advantage of using tensorflow instead of scikit-learn for doing regression? [closed]

I am new to machine learning and I want to start doing basic regression analysis. I saw that scikit-learn provides a simple way to do this. But why do people use TensorFlow for regression instead? Thanks!
If the only thing you are doing is regression, scikit-learn is good enough and will definitely do the job. TensorFlow is more of a deep learning framework for building deep neural networks.
There are people using TensorFlow for regression, perhaps out of personal interest or because they think TensorFlow is more famous or "advanced".
TensorFlow is a deep learning framework and involves far more complex decisions about algorithm design.
As a first step, I recommend scikit-learn, because you will get a first ML model working faster. Later you can build a deep learning model with TensorFlow. :-)
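To make the comparison concrete, here is a small sketch fitting the same linear regression with both libraries on made-up data (the data and hyperparameters are placeholders):
import numpy as np
import tensorflow as tf
from sklearn.linear_model import LinearRegression

# Toy data: 100 samples, 3 features, a known linear relationship plus noise
X = np.random.rand(100, 3)
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * np.random.randn(100)

# scikit-learn: one line to fit ordinary least squares
sk_model = LinearRegression().fit(X, y)

# TensorFlow/Keras: the same model as a one-layer network trained by gradient descent
tf_model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(3,))])
tf_model.compile(optimizer='adam', loss='mse')
tf_model.fit(X, y, epochs=200, verbose=0)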

Is there an AlexNet model written with tensorflow without pre-trained weights? [closed]

I have been looking for AlexNet models written in TensorFlow, and all I found was code that uses pre-trained weights.
Do you have any idea whether there is code in which the weights are built (initialized and trained) during execution of the model?
Thanks.
You can find a nice article here: Finetuning AlexNet with TensorFlow.
It contains the address of the GitHub code.
You can find a definition of the AlexNet model in TensorFlow in the path tensorflow/contrib/slim/python/slim/nets/alexnet.py of the TensorFlow repository (among the examples of what used to be TF-Slim and now is just tf.contrib.layers).
Another alternative is here, with a link to the model. But you can always train from scratch and check yourself.
Note: this only runs for 30 or so epochs (at least at the time of writing), with less accuracy than claimed in the paper. But you can always tweak the learning rate and run for more epochs to get better accuracy.
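If you just need an AlexNet-style network whose weights are built at run time rather than loaded, a minimal Keras sketch would look like the one below (layer sizes follow the original paper; the input shape and class count are assumptions):
import tensorflow as tf
from tensorflow.keras import layers, models

def build_alexnet(input_shape=(227, 227, 3), num_classes=1000):
    # All weights below are randomly initialized; nothing is pre-trained
    return models.Sequential([
        layers.Conv2D(96, 11, strides=4, activation='relu', input_shape=input_shape),
        layers.MaxPooling2D(3, strides=2),
        layers.Conv2D(256, 5, padding='same', activation='relu'),
        layers.MaxPooling2D(3, strides=2),
        layers.Conv2D(384, 3, padding='same', activation='relu'),
        layers.Conv2D(384, 3, padding='same', activation='relu'),
        layers.Conv2D(256, 3, padding='same', activation='relu'),
        layers.MaxPooling2D(3, strides=2),
        layers.Flatten(),
        layers.Dense(4096, activation='relu'),
        layers.Dropout(0.5),
        layers.Dense(4096, activation='relu'),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation='softmax'),
    ])

model = build_alexnet()
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])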