How to read and load a tar file to extract feature vectors - TensorFlow

I'm trying to use inception_v3 feature vectors for image classification. I have downloaded the model from TF Hub as a tar.gz file; the link to the feature vectors is "https://tfhub.dev/google/imagenet/inception_v3/feature_vector/5". This gives me the model without the classification layer.
I have extracted the tar file using the tarfile library, but I'm not sure how to load/read the extracted files to use the feature vectors for classification.

You can use the tarfile module to extract the tar.gz file and then load the extracted SavedModel with tf.keras.models.load_model:
import tarfile
import tensorflow as tf

# extract the downloaded archive into a destination folder
with tarfile.open('imagenet_inception_v3_feature_vector_5.tar.gz') as file:
    file.extractall('/content/Destination_FolderName')

# load the extracted SavedModel
model = tf.keras.models.load_model('/content/Destination_FolderName')
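A TF Hub feature-vector archive is a plain SavedModel rather than a Keras model, so if load_model complains you can instead wrap the extracted folder with hub.KerasLayer from the tensorflow_hub package. A minimal sketch for attaching your own classification head (the class count is an assumption; Inception V3 expects 299x299 RGB inputs):

import tensorflow as tf
import tensorflow_hub as hub

NUM_CLASSES = 5  # assumption: replace with your number of classes

# hub.KerasLayer accepts the path of the extracted SavedModel directly
feature_extractor = hub.KerasLayer(
    '/content/Destination_FolderName',
    input_shape=(299, 299, 3),  # Inception V3 expects 299x299 RGB images
    trainable=False)            # keep the pretrained weights frozen

model = tf.keras.Sequential([
    feature_extractor,
    tf.keras.layers.Dense(NUM_CLASSES, activation='softmax'),
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

You can then train this model on your labelled images as usual with model.fit.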

Related

How do I add folders and other files to a Kaggle Notebook

I'm trying to use a lot of other helper files such as utils.py and data.py with my Kaggle Notebook. How do I add these files and any other folders to my Kaggle Notebook?

Using .config files to load models in tensorflow

I have a .config file, ssd_mobilenet_v1.config, which is supposed to configure a pretrained model from TensorFlow. However, I am unable to find out how to do that.
I have searched the internet, and there are instructions for loading a model from a dict, but not directly from a .config file.
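For reference, .config files like this belong to the TensorFlow Object Detection API, whose config_util helper parses them into the dict those instructions expect. A minimal sketch under that assumption (the file path is hypothetical, and restoring the pretrained weights still requires the accompanying checkpoint):

from object_detection.utils import config_util
from object_detection.builders import model_builder

# parse the pipeline .config file into a dict of config protos
configs = config_util.get_configs_from_pipeline_file('ssd_mobilenet_v1.config')

# build the model architecture described by the config
model = model_builder.build(model_config=configs['model'], is_training=False)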

libnvinfer.so.5: cannot open shared object file: No such file or directory

I'm using Ubuntu 16.04 and TensorFlow 1.13.1, and I want to integrate TensorRT to improve my model's inference time. I downloaded and extracted the TensorRT 7 tar, installed the uff and graphsurgeon wheels it ships with, and added its lib path to the system's LD_LIBRARY_PATH.
However, when I try to import tensorflow.contrib.tensorrt, it gives me a file-not-found error: there is no libnvinfer.so.5 in my TensorRT 7 folder, only libnvinfer.so.7.
Does this mean that TensorFlow 1.13.1 doesn't support TensorRT 7? Should I use TensorRT 5 instead?

How can I use the Darknet library?

I want to use the Darknet library in Python.
I've cloned the Darknet repository with this command:
git clone https://github.com/AlexeyAB/darknet.git
But when I type import darknet as dn, it says No module named darknet.
How can I install the darknet module? Is it possible using pip?
Check this tutorial and its second part: https://www.youtube.com/watch?v=5pYh1rFnNZs&t=408s
Follow those steps, then try creating a Python file in which you import darknet. Make sure the file from which you import darknet is in the same folder as darknet.py; otherwise you need to specify the path to darknet.py in your import statement (e.g. import f1.f2.f3.darknet).
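If you would rather keep your script elsewhere, another option is to put the cloned repository on sys.path before importing. A minimal sketch (the clone location is an assumption):

import sys

# assumption: the darknet repository was cloned (and built) at this path
sys.path.append('/content/darknet')

import darknet as dn  # resolvable now that the repo folder is on sys.path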

How to upload my dataset into Google Colab?

I have my dataset on my local device. Is there any way to upload this dataset to Google Colab directly?
Note:
I tried this code:
from google.colab import files
uploaded = files.upload()
But it uploads file by file, and I want to upload the whole dataset at once.
Here's the workflow I used to upload a zip file and create a local data directory:
Zip the directory locally, e.g.: $ zip -r data.zip data
Upload the zip file to Colab using Google's instructions:
from google.colab import files
uploaded = files.upload()
Once the zip file is uploaded, unpack it:
import zipfile
import io

# files.upload() returns a dict mapping file names to their bytes
zf = zipfile.ZipFile(io.BytesIO(uploaded['data.zip']), "r")
zf.extractall()
Your data directory should now be in Colab's working directory under a 'data' directory.
Zip or tar the files first, and then use tarfile or zipfile to unpack them.
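For the tar route, a minimal sketch (assuming the archive was uploaded as data.tar.gz):

import tarfile

# unpack the uploaded archive into the current working directory
with tarfile.open('data.tar.gz', 'r:gz') as archive:
    archive.extractall()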
Another way is to store the whole dataset in a NumPy array, upload that to Google Drive, and retrieve it from there. (Zipping and unzipping is also fine, but I had difficulty with it.)
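A minimal sketch of that approach (the file name and Drive path are assumptions):

import numpy as np

# locally: pack the dataset into a single .npy file, then upload it to Drive
# np.save('dataset.npy', images)  # 'images' being your array of image data

# in Colab: mount Drive and load the array back
from google.colab import drive
drive.mount('/content/drive')
data = np.load('/content/drive/MyDrive/dataset.npy')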