Insert custom layer into a pre-trained model - TensorFlow

I'm having trouble inserting a squeeze-and-excitation module into a pre-trained model.
How do you insert custom layers inside a pre-trained model (for example Inception v3)? Could you please help me? Thank you very much.
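One common approach with the Keras functional API is to branch off at an intermediate layer, apply the SE block there, and build a new head on top (a minimal sketch; the "mixed5" cut point and the 10-class head are illustrative assumptions, and truly splicing a block mid-graph while keeping the later Inception layers would require rebuilding the graph layer by layer):

```python
import tensorflow as tf
from tensorflow.keras import layers

def squeeze_excite(x, ratio=16):
    """Squeeze-and-Excitation: re-weight channels by a learned gate."""
    channels = x.shape[-1]
    s = layers.GlobalAveragePooling2D()(x)                 # squeeze
    s = layers.Dense(channels // ratio, activation="relu")(s)
    s = layers.Dense(channels, activation="sigmoid")(s)    # excite
    s = layers.Reshape((1, 1, channels))(s)
    return layers.Multiply()([x, s])                       # re-scale features

base = tf.keras.applications.InceptionV3(
    weights="imagenet", include_top=False, input_shape=(299, 299, 3))

# Illustrative choice: branch off at the "mixed5" block, apply SE there,
# and build a new classification head on top.
x = squeeze_excite(base.get_layer("mixed5").output)
x = layers.GlobalAveragePooling2D()(x)
outputs = layers.Dense(10, activation="softmax")(x)        # hypothetical 10 classes

model = tf.keras.Model(base.input, outputs)
```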

Related

How to convert a TensorFlow saved_model to a frozen inference graph?

I trained a model with TensorFlow 2 to detect vehicles, but I want to convert the TensorFlow SavedModel to a frozen inference graph.
Can anyone help?
A frozen graph is not the recommended way to save your model, and I would suggest you use SavedModel instead.
People around here can help if you explain why you want a frozen graph specifically and why SavedModel won't help.
If you still want to try freezing, you can use this internal method to do so.
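The internal helper usually meant in answers like this is convert_variables_to_constants_v2 (an assumption here, since the original link did not survive). A minimal sketch, assuming the model was exported with a "serving_default" signature:

```python
import tensorflow as tf
from tensorflow.python.framework.convert_to_constants import (
    convert_variables_to_constants_v2)

# Load the SavedModel and grab its serving signature.
model = tf.saved_model.load("path/to/saved_model")
concrete_func = model.signatures["serving_default"]

# Inline the variables as constants ("freeze" the graph).
frozen_func = convert_variables_to_constants_v2(concrete_func)

# Write the frozen GraphDef to disk.
tf.io.write_graph(frozen_func.graph.as_graph_def(),
                  logdir=".", name="frozen_graph.pb", as_text=False)
```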

Is it a good idea to use transfer learning in real-world projects?

SCENARIO
What if my intention is to train on a dataset of medical images and I have chosen a COCO pre-trained model?
My Doubts
1. Since I have chosen medical images, there is no point in training on the COCO dataset, right? If so, what is a possible solution?
2. Will adding more layers to a pre-trained model screw up the entire model, with around 10+ classes and tens of thousands of training samples?
3. Without training from scratch, what are the possible solutions, such as fine-tuning the model?
PS - let's assume this scenario is based on deploying the model for business purposes.
Thanks.
Yes, it is a good idea to reuse pre-trained models (transfer learning) in real-world projects, as it saves computation time and the architectures are proven.
If your use case is to classify medical images, that is, image classification, then:
Since I have chosen medical images, there is no point in training on the COCO dataset, right? If so, what is a possible solution?
Yes, the COCO dataset is not a good fit for image classification, as it is designed for object detection. You can reuse VGGNet, ResNet, Inception Net, or EfficientNet. For more information, refer to the TF Hub modules.
Will adding more layers to a pre-trained model screw up the entire model, with around 10+ classes and tens of thousands of training samples?
No. We can remove the top layer of the pre-trained model and add our own custom layers without affecting the performance of the pre-trained model.
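For instance, a minimal Keras sketch (EfficientNetB0, the 224x224 input size, and the 10-class head are illustrative assumptions, not something fixed by the question):

```python
import tensorflow as tf

# Hypothetical setup: a 10-class medical-image classifier on a frozen base.
base = tf.keras.applications.EfficientNetB0(
    include_top=False,           # drop the original 1000-class ImageNet head
    weights="imagenet",
    input_shape=(224, 224, 3))
base.trainable = False           # keep the pre-trained weights fixed at first

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(10, activation="softmax"),  # your custom head
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```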
Without training from scratch, what are the possible solutions, such as fine-tuning the model?
In addition to using the pre-trained models, you can tune the hyper-parameters of the model (the custom layers added by you) using the HParams dashboard of TensorBoard.
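A minimal sketch of logging such runs with the HParams plugin (the search space and the build_model helper below are hypothetical placeholders for your own head-building code):

```python
import tensorflow as tf
from tensorboard.plugins.hparams import api as hp

# Hypothetical search space for the custom head only; the base stays frozen.
HP_UNITS = hp.HParam("units", hp.Discrete([128, 256, 512]))
HP_DROPOUT = hp.HParam("dropout", hp.RealInterval(0.2, 0.5))

def run_trial(run_dir, hparams, build_model, train_ds, val_ds):
    # build_model is an assumed helper that assembles the head from hparams.
    model = build_model(units=hparams[HP_UNITS], dropout=hparams[HP_DROPOUT])
    with tf.summary.create_file_writer(run_dir).as_default():
        hp.hparams(hparams)  # record which hyper-parameters this run used
        history = model.fit(train_ds, validation_data=val_ds, epochs=5)
        tf.summary.scalar("val_accuracy",
                          history.history["val_accuracy"][-1], step=1)
```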

Has ssd_mobilenet_v1_coco_2017_11_17 been trained on a labelled dataset?

Was this model pre-trained on the labelled COCO dataset?
I am trying to figure out whether the object detection that happens after using this pre-trained model is actually the output of unlabelled data, or whether there was some labelling which made the model more efficient.
Thanks in advance.

How to train a DeepLab model from scratch in TensorFlow?

I am using DeepLab to generate semantic segmentation masks for a video from the Cityscapes dataset. So I started with the pre-trained model xception65_cityscapes_trainfine provided in the model zoo and trained it further on the dataset.
I am curious to know how I can start training it from scratch, rather than just using the pre-trained model. Could anyone suggest a direction on how I can achieve this?
Any contribution from the community will be helpful and appreciated.

TensorFlow seq2seq with multiple outputs

I started working with TensorFlow not long ago. I'm working on the seq2seq model, using the seq2seq example code.
I want to modify the seq2seq model code to get the top-k outputs (k is 5 or 10) for a reinforcement learning model, rather than only the top-1 output.
First, I think I should modify the decoder part of the seq2seq model somehow, but I don't know which part to change.
Are there any references or code for this problem?
Check out https://github.com/tensorflow/tensorflow/issues/654. There is some discussion there, but no worked example yet.
tf.contrib.seq2seq.BeamSearchDecoder would do the magic for you.
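A rough TF 1.x sketch of getting the top-k beams (the contrib API matches the era of this thread; all sizes and token ids below are placeholders, and the zero encoder state is a stand-in for your encoder's real final state):

```python
import tensorflow as tf

# Illustrative placeholder sizes and special-token ids.
vocab_size, embed_dim, num_units = 10000, 128, 256
batch_size, beam_width, GO_ID, EOS_ID = 32, 5, 1, 2

embedding_matrix = tf.get_variable("embedding", [vocab_size, embed_dim])
decoder_cell = tf.nn.rnn_cell.BasicLSTMCell(num_units)

# Stand-in for the encoder's final state, just so the sketch builds.
encoder_state = decoder_cell.zero_state(batch_size, tf.float32)

# Each of the k beams needs its own copy of the encoder state.
tiled_state = tf.contrib.seq2seq.tile_batch(encoder_state, multiplier=beam_width)

decoder = tf.contrib.seq2seq.BeamSearchDecoder(
    cell=decoder_cell,
    embedding=embedding_matrix,
    start_tokens=tf.fill([batch_size], GO_ID),
    end_token=EOS_ID,
    initial_state=tiled_state,
    beam_width=beam_width,
    output_layer=tf.layers.Dense(vocab_size))

outputs, _, _ = tf.contrib.seq2seq.dynamic_decode(decoder, maximum_iterations=50)

# outputs.predicted_ids has shape [batch_size, max_time, beam_width]:
# the k best decoded sequences, best beam first.
top_k_ids = outputs.predicted_ids
```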