Is there any way to load a FaceNet model as a tf.keras.layers.Layer using TensorFlow 2.3?

I want to use FaceNet as an embedding layer (which won't be trainable).
I tried loading FaceNet like so:
tf.keras.models.load_model('./path/tf_facenet')
where the directory ./path/tf_facenet contains the 4 files that can be downloaded at https://drive.google.com/file/d/0B5MzpY9kBtDVZ2RpVDYwWmxoSUk/edit
but an error message shows up:
OSError: SavedModel file does not exist at: ./path/tf_facenet/{saved_model.pbtxt|saved_model.pb}
And the h5 files downloaded from https://github.com/nyoki-mtl/keras-facenet don't seem to work either (they use TensorFlow 1.3).

I had the same issue when loading the facenet-keras model. Your Python environment may be missing the h5py module, so install it with conda install h5py.
Hope this helps!
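If the h5 export from the keras-facenet repo does load under TF 2.x, a minimal sketch of using it as a frozen embedding layer could look like the following. The file name facenet_keras.h5, the 160x160x3 input size, and the Dense head are assumptions for illustration, not verified against your files:

import tensorflow as tf

# Assumption: the h5 export from nyoki-mtl/keras-facenet, loadable once h5py is installed.
facenet = tf.keras.models.load_model('./path/facenet_keras.h5')
facenet.trainable = False  # freeze FaceNet so it only produces embeddings

# A Keras Model is itself a Layer, so the frozen model can be called inside a new model.
inputs = tf.keras.Input(shape=(160, 160, 3))   # keras-facenet expects 160x160 RGB crops
embeddings = facenet(inputs, training=False)   # face embedding vectors
outputs = tf.keras.layers.Dense(1, activation='sigmoid')(embeddings)  # hypothetical classification head
model = tf.keras.Model(inputs, outputs)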

Related

Streamlit Cloud OSError: [E053] Could not read config file

I am deploying an app that specifically requires spacy==3.3.1 to Streamlit Cloud. I added that requirement to requirements.txt, along with the link to download and install en_core_web_sm:
https://github.com/explosion/spacy-models/releases/download/en_core_web_sm-2.2.0/en_core_web_sm-2.2.0.tar.gz#egg=en_core_web_sm
The import is okay, but when I load the model, i.e. nlp = spacy.load("en_core_web_sm"), I get the OSError below:
OSError: [E053] Could not read config file from /home/appuser/venv/lib/python3.9/site-packages/en_core_web_sm/en_core_web_sm-2.2.0/config.cfg
What am I doing wrong?
Thanks in advance for your help.
My requirements.txt:
spacy==3.3.1
https://github.com/explosion/spacy-models/releases/download/en_core_web_sm-2.2.0/en_core_web_sm-2.2.0.tar.gz#egg=en_core_web_sm
Main code:
import spacy
import en_core_web_sm
nlp = spacy.load("en_core_web_sm")
I expected no error but got the OSError: [E053] above.
Packages trained with spaCy v2.x are not compatible with spaCy v3.x
(source)
Note that the model has version 2.2.0, but spaCy has version 3.3.1.
I'd suggest using a newer version of the model. The requirements could look like:
spacy==3.3.1
en_core_web_sm @ https://github.com/explosion/spacy-models/releases/download/en_core_web_sm-3.3.0/en_core_web_sm-3.3.0-py3-none-any.whl
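As a quick sanity check after installing, you can compare the spaCy and model versions at runtime; the values in the comments below are just what the requirements above should produce:

import spacy

print(spacy.__version__)           # e.g. 3.3.1
nlp = spacy.load("en_core_web_sm")
print(nlp.meta["version"])         # model version, e.g. 3.3.0
print(nlp.meta["spacy_version"])   # spaCy range the model was packaged for

Running python -m spacy validate in the deployed environment also reports whether any installed model is incompatible with the installed spaCy.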

OSError: [E053] Could not read config.cfg from C:\Users

I'm trying to run spaCy's lemmatizer on a text by calling nlp = spacy.load("en_core_web_sm", disable=["parser", "ner"]), but I get the following error:
OSError: [E053] Could not read config.cfg from C:\Users.
I'm using spaCy version 3.2.1, and I also installed en-core-web-sm-2.2.0.
I also get this warning, but I'm not sure what it means:
UserWarning: [W094] Model 'en_core_web_sm' (2.2.0) specifies an under-constrained spaCy version requirement: >=2.2.0. This can lead to compatibility problems with older versions, or as new spaCy versions are released, because the model may say it's compatible when it's not. Consider changing the "spacy_version" in your meta.json to a version range, with a lower and upper pin. For example: >=3.2.1,<3.3.0.
Hope someone can help me.
A v2 spaCy model won't work with spaCy v3 - you need to update your model. Do this:
spacy download en_core_web_sm
The error could be easier to understand, but it's not a situation that comes up much: usually you'd have to upgrade from spaCy v2 to v3 without upgrading your models for this to happen, so I'm not sure how you got into that state.
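Once a v3-compatible en_core_web_sm is installed, the call from the question should work as-is; a minimal sketch (the sample sentence is only an illustration):

import spacy

# Assumes `spacy download en_core_web_sm` installed a model matching the installed spaCy 3.x.
nlp = spacy.load("en_core_web_sm", disable=["parser", "ner"])
doc = nlp("The striped bats were hanging on their feet")
print([token.lemma_ for token in doc])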

How to make TensorFlow vid2depth inference work?

I tried to test TensorFlow's vid2depth inference (https://github.com/tensorflow/models/tree/master/research/vid2depth).
I followed the instructions in the README. However, I got the error below:
ValueError: Couldn't find 'checkpoint' file or checkpoints in given directory vid2depth/trained-model/model-119496
I tested with TF 0.12, 1.0.0, 1.8, and 1.15.
I used the below command:
python3.6 inference.py --kitti_dir ~/vid2depth/kitti-raw-uncompressed --output_dir ~/vid2depth/inference --kitti_video 2011_09_26/2011_09_26_drive_0009_sync --model_ckpt ~/vid2depth/trained-model/model-119496/
The content of the checkpoint folder I downloaded is:
model-119496.data-00000-of-00001 model-119496.index model-119496.meta
It seems the provided files are not in the correct format, but I followed the link in the README.
It seems to be an old checkpoint format (Loading older checkpoint in tensorflow).
Could anyone make vid2depth work?
Thank you.
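One thing worth checking, as a sketch based only on the file names above rather than a working run: the three files form a single TF1-style checkpoint whose prefix is model-119496, so the checkpoint path should point at that prefix rather than at the directory. You can confirm the checkpoint itself is readable with tf.train.list_variables:

import os
import tensorflow as tf  # the call below exists in both TF 1.15 and TF 2.x

# Assumption: the downloaded folder model-119496/ holds the three files, so the
# checkpoint prefix is <folder>/model-119496 (no trailing slash, no file extension).
ckpt_prefix = os.path.expanduser("~/vid2depth/trained-model/model-119496/model-119496")

# List the variables stored in the checkpoint to confirm it is readable.
for name, shape in tf.train.list_variables(ckpt_prefix):
    print(name, shape)

If that prints a variable list, the download itself is fine and the ValueError is more likely about how --model_ckpt is passed than about the checkpoint format.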

ValueError: GRU(reset_after=False) is not compatible with GRU(reset_after=True)

How do I solve this error, which is raised when loading the weights from the h5 file?
ValueError: GRU(reset_after=False) is not compatible with GRU(reset_after=True)
Github link : https://github.com/emilwallner/Screenshot-to-code
Colab link : https://colab.research.google.com/drive/106_QEi_Wp6mfDDE1E2lPSPh7S9CABk6B#revisionId=0Byh7i7xj0YHlMU0xaTJCWDA3ZzZNTlA1VFFRWU5xQWdtc2tFPQ
dataset drive link:
https://drive.google.com/drive/folders/1BTeUbXO7qBvOT4VkhOrr7SOcSocSZyeb?usp=sharing
Set reset_after=True in your GRU layer.
Probably the version of TensorFlow is not compatible with the other packages. Try using tensorflow==1.15.0. You can install it with pip install tensorflow==1.15.0.
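A minimal sketch of the first suggestion; the layer sizes and file name are placeholders rather than the actual Screenshot-to-code architecture, and the rebuilt model must otherwise match the saved weights exactly:

from tensorflow import keras

model = keras.Sequential([
    # reset_after must match the variant the weights were saved with; the error says True.
    keras.layers.GRU(256, return_sequences=True, reset_after=True, input_shape=(None, 128)),
    keras.layers.Dense(64, activation="softmax"),
])
model.load_weights("weights.h5")  # placeholder path for the project's h5 weights file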

How to convert an ONNX model (.onnx) to a TensorFlow (.pb) model

I am trying to convert an .onnx model to a .pb model. I have written the code, but I am getting this error:
# tf_func(tf.ceil) AttributeError: module 'tensorflow' has no attribute 'ceil'
Code:
import onnx
from onnx_tf.backend import prepare
from tensorflow.python.tools.import_pb_to_tensorboard import import_to_tensorboard

onnx_model = onnx.load("original_3dlm.onnx")     # load the ONNX model
tf_rep = prepare(onnx_model)                     # convert it to a TensorFlow representation
tf_rep.export_graph("model_var.pb")              # export the TensorFlow graph
import_to_tensorboard("model_var.pb", "tb_log")  # write the graph to TensorBoard logs
How do I resolve this issue? Is there any other way to convert ONNX to TensorFlow?
I solved this issue with the TensorFlow Backend for ONNX.
Let me know if you have any issues.
Changing from TensorFlow 2.0 to 1.14 might solve the problem.
Your code, as far as I can tell, should be fine. The problem probably lies in the onnx-tf version you are currently using; pip currently installs a version that only supports TensorFlow <= 1.15.
Run this in the terminal to install a more up-to-date version of onnx-tf:
pip uninstall onnx_tf
pip install git+https://github.com/onnx/onnx-tensorflow.git
Refer to this issue for further details.
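With a recent onnx-tensorflow installed as above, the original snippet stays essentially the same; a hedged sketch, where the output path model_var_tf is an assumption and newer onnx-tf releases write a SavedModel directory rather than a single .pb file:

import onnx
from onnx_tf.backend import prepare

onnx_model = onnx.load("original_3dlm.onnx")
onnx.checker.check_model(onnx_model)   # sanity-check the ONNX graph before converting

tf_rep = prepare(onnx_model)           # build the TensorFlow representation
tf_rep.export_graph("model_var_tf")    # newer onnx-tf versions write a SavedModel directory here

The exported model can then be loaded back with tf.saved_model.load("model_var_tf") to verify the conversion.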