Loading TensorFlow models in Java

exportDir = "gs://testbucket/export/";
SavedModelBundle b = SavedModelBundle.load(exportDir, "serve");
This gives me the error:
org.tensorflow.TensorFlowException: SavedModel not found in export directory: gs://testbucket/export/
Copying saved_model.pb to a local directory and then providing the path to the local filesystem works.
Whereas
tf.saved_model.loader.load(session, [tf.saved_model.tag_constants.SERVING], export_dir)
works with the GCS bucket. Does anyone know whether loading models via the SavedModelBundle API simply does not support GCS buckets? How can I load saved_model.pb and the variables from a GCS bucket in Java without copying them over to the local filesystem?

tf.saved_model.tag_constants.SERVING is the string 'serve'. If the tag set your model was exported with doesn't match the tag you pass to SavedModelBundle.load, loading fails.
Perhaps that's the problem?
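If copying to the local filesystem is acceptable, one way to automate that step is to let TensorFlow's own file API do the copy, since it understands gs:// paths when the build includes GCS support. A minimal sketch (paths are placeholders; on older 1.x releases the same calls live under tf.gfile rather than tf.io.gfile):

import os
import tensorflow as tf  # tf.io.gfile understands gs:// when built with GCS support

export_dir = "gs://testbucket/export"   # bucket path from the question
local_dir = "/tmp/export"               # placeholder local target

tf.io.gfile.makedirs(local_dir)
for name in tf.io.gfile.listdir(export_dir):
    name = name.rstrip("/")
    src = export_dir + "/" + name
    dst = os.path.join(local_dir, name)
    if tf.io.gfile.isdir(src):
        # e.g. the variables/ subdirectory
        tf.io.gfile.makedirs(dst)
        for child in tf.io.gfile.listdir(src):
            tf.io.gfile.copy(src + "/" + child, os.path.join(dst, child), overwrite=True)
    else:
        tf.io.gfile.copy(src, dst, overwrite=True)

# SavedModelBundle.load("/tmp/export", "serve") can then read the local copy.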

Related

When trying to convert a StyleGan2 pickle file, getting a pickle.load() error: No module named 'torch_utils.persistence'

I trained a StyleGan2-ADA model on a custom dataset, which generated a .pkl file. I'm now trying to load the .pkl file so that I can convert it to a .pt file, but when I load it using:
pickle.load(f)
I'm getting a ModuleNotFoundError: No module named 'torch_utils.persistence'
I've installed torch_utils and other dependencies, but I'm not sure how to fix this issue when loading the file. If anyone has hit this while loading a .pkl file, any help would be greatly appreciated!
The same issue is reported on GitHub here, but with no clear solution.
I have tried installing torch_utils multiple times, but the error still persists.
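One thing worth checking: the torch_utils the unpickler is asking for is not the PyPI package of that name, but a package that ships inside the NVlabs stylegan2-ada-pytorch repository, so it can only be imported if that repo is on sys.path. A minimal sketch, assuming the pickle was produced by that code base (paths and file names are placeholders, and the dict keys follow that repo's snapshot format):

import sys
import pickle

# Point Python at a clone of https://github.com/NVlabs/stylegan2-ada-pytorch
# so that its torch_utils/ and dnnlib/ packages are importable.
sys.path.insert(0, "/path/to/stylegan2-ada-pytorch")

with open("network-snapshot.pkl", "rb") as f:  # placeholder file name
    data = pickle.load(f)

G = data["G_ema"]  # the snapshot dict also contains 'G' and 'D'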

React Native, Tensorflow js, load model from local storage

I'm trying to develop a mobile application that should be able to load a TensorFlow.js model and then provide predictions to the final user. Everything works as intended when the files the model needs (model.json, weights.bin) are bundled into the application as assets. The code I'm using to load those files looks as follows:
import { fetch, bundleResourceIO } from "@tensorflow/tfjs-react-native";
import * as tf from "@tensorflow/tfjs";
const modelWeights = require("./assets/explain_model/weights.bin");
const modelFile = require("./assets/explain_model/model.json");
const model = await tf.loadGraphModel(
  bundleResourceIO(modelFile, modelWeights)
);
console.log("model loaded...");
Later on, I want to store both files externally and download them at runtime, i.e. weights.bin and model.json are uploaded to a web server and then downloaded by the application using react-native's fetch-blob library when the application runs. Both files end up in the DocumentDir path, for example /data/user/0/com.tensorflowexample/files/explain-weights.bin
So I would like to load the model from that local storage. I saw that bundleResourceIO only works for assets that are bundled when compiling the application, and TensorFlow.js for React Native also offers asyncStorageIO as an alternative.
Is it possible to load the model from local storage without going through AsyncStorage? If not, how could I copy both weights.bin and model.json from local storage into AsyncStorage so that I can use asyncStorageIO?
Thank you so much!

Is there any way to load a FaceNet model as a tf.keras.layers.Layer using Tensorflow 2.3?

I want to use FaceNet as an embedding layer (which won't be trainable).
I tried loading FaceNet like so:
tf.keras.models.load_model('./path/tf_facenet')
where the directory ./path/tf_facenet contains the 4 files that can be downloaded at https://drive.google.com/file/d/0B5MzpY9kBtDVZ2RpVDYwWmxoSUk/edit
but an error message shows up:
OSError: SavedModel file does not exist at: ./path/tf_facenet/{saved_model.pbtxt|saved_model.pb}
And the h5 files downloaded from https://github.com/nyoki-mtl/keras-facenet don't seem to work either (they use tensorflow 1.3).
I had the same issue when loading the facenet-keras model. Maybe your Python env is missing the h5py module.
So you should install it: conda install h5py
Hope you succeed!
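If the answer above is right and the only blocker was a missing h5py module, the .h5 export from nyoki-mtl/keras-facenet can be loaded with tf.keras and frozen so it behaves as a fixed embedding layer. A minimal sketch, assuming the downloaded file is named facenet_keras.h5 (a placeholder) and that it loads under TF 2.3:

import tensorflow as tf

facenet = tf.keras.models.load_model("facenet_keras.h5", compile=False)
facenet.trainable = False  # freeze it so it is used purely as an embedding extractor

inputs = tf.keras.Input(shape=(160, 160, 3))  # that FaceNet port expects 160x160 RGB crops
embeddings = facenet(inputs, training=False)  # embedding vector for each face
model = tf.keras.Model(inputs, embeddings)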

Cannot load image "./darknet/dataset/z115.jpg" STB Reason: can't fopen

I am training Darknet YOLO-V3 on a Google Cloud VM instance to detect a custom object.
The darknet directory contains the following:
dataset directory: it includes all the annotated data (screenshot of an annotated image).
obj.data file: its contents are shown in a screenshot.
obj.names file: its contents are shown in a screenshot.
train.txt file: its contents are shown in a screenshot.
When I run this command:
./darknet detector train obj.data yolov3-tiny.cfg darknet53.conv.74
the following error is generated:
Cannot load image "./darknet/dataset/z115.jpg" STB Reason: can't fopen
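The "can't fopen" message usually just means the path listed in train.txt does not resolve from the directory where ./darknet is executed, or that the file is missing or named differently. A quick, hypothetical sanity check in Python, assuming train.txt sits in the directory you run the command from:

import os

with open("train.txt") as f:
    paths = [line.strip() for line in f if line.strip()]

missing = [p for p in paths if not os.path.isfile(p)]
print(f"{len(missing)} of {len(paths)} listed images are missing")
for p in missing[:10]:
    print("missing:", p)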

How to use uploaded files in colab tensorflow?

I have uploaded my train.txt and valid.txt into Colab using the files.upload() snippet:
User uploaded file "valid.txt" with length 3387762 bytes
User uploaded file "train.txt" with length 9401172 bytes
Running some tensorflow code that runs fine locally and fetches files from the current directory causes the following error in Colab:
InvalidArgumentError: assertion failed: [string_input_producer requires a non-null input tensor]
[[Node: input_producer/Assert/Assert = Assert[T=[DT_STRING], summarize=3, _device="/job:localhost/replica:0/task:0/device:CPU:0"](input_producer/Greater, input_producer/Assert/Assert/data_0)]]
I assume the code can't see the files? What's the path to the uploaded files?
Do the answers on this question help?
How to import and read a shelve or Numpy file in Google Colaboratory?
(files.upload stores the uploaded files in memory. To work with them as files on your filesystem, you'll need to save them explicitly, as in the sketch below.)
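A minimal sketch of that saving step: files.upload() returns a dict mapping each uploaded file name to its bytes, so writing those bytes to the working directory lets code that expects real files (e.g. train.txt, valid.txt) find them.

from google.colab import files
import os

uploaded = files.upload()  # {filename: file contents as bytes}
for name, data in uploaded.items():
    with open(name, "wb") as f:
        f.write(data)

print(os.listdir("."))  # the files should now be visible in the current directory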