How to convert an ONNX model (.onnx) to a TensorFlow (.pb) model - tensorflow

I am trying to convert a .onnx model to a .pb model. I have written the code but I am getting an error:
# tf_func(tf.ceil)
AttributeError: module 'tensorflow' has no attribute 'ceil'
Code:
import onnx
from tensorflow.python.tools.import_pb_to_tensorboard import import_to_tensorboard
from onnx_tf.backend import prepare
onnx_model = onnx.load("original_3dlm.onnx")
tf_rep = prepare(onnx_model)
tf_rep.export_graph("model_var.pb")
import_to_tensorboard("model_var.pb", "tb_log")
How do I resolve this issue? Is there any other way to convert ONNX to TensorFlow?

I solved this issue with the Tensorflow Backend for ONNX.
Let me know if you have any issues.
Changing from TensorFlow 2.0 to 1.14 may also solve the problem.
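The downgrade itself is just a pip install (assuming a pip-managed environment):
pip install tensorflow==1.14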

Your code, as far as I can tell, should be fine. The problem probably lies in the onnx-tf version you currently use; pip currently installs a version that only supports TensorFlow <= 1.15.
Run this in the terminal to install a more up-to-date version of onnx-tf:
pip uninstall onnx_tf
pip install git+https://github.com/onnx/onnx-tensorflow.git
Refer to this issue for further details.
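After reinstalling, the snippet from the question should run unchanged. A minimal sketch, noting that recent onnx-tf builds may export a SavedModel directory rather than a single frozen .pb file, so the output path may end up as a directory:
import onnx
from onnx_tf.backend import prepare

onnx_model = onnx.load("original_3dlm.onnx")  # same model file as in the question
tf_rep = prepare(onnx_model)                  # build the TensorFlow representation
tf_rep.export_graph("model_var.pb")           # recent onnx-tf builds may write a SavedModel directory here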

Related

Using Huggingface pipeline transformers on Mac M1, fresh PyTorch install errors

I am running a very basic sentiment analysis pipeline utilising the XLM-Roberta model on Huggingface. I am trying to ensure I am utilising the M1 chip as I will be looping over ~10e7 entries.
To be consistent, I am running a fresh install of PyTorch following the yml file and steps outlined in this (very useful) video; I subsequently pip install sentencepiece and protobuf (version 3.2.0) to deal with a few subsequent errors. When running a simple pipeline model, I am however faced with the below:
# Imports
import pandas as pd
import datetime as dt
import itertools
from transformers import pipeline, AutoTokenizer
sentiment_model = pipeline(model="cardiffnlp/twitter-xlm-roberta-base-sentiment", return_all_scores = True)
ValueError: google.__spec__ is None
Interestingly, following the install methods for TensorFlow from the same channel runs fine, but it does not access the M1 chip and simply runs on the CPU.
Has anyone faced this before, or does anyone have a method by which I can run PyTorch?
Many thanks in advance.

Error when saving model with tensorflow-agents

I am trying to save a model with tensorflow-agents. First I define the following:
from tf_agents.policies.policy_saver import PolicySaver  # import needed for PolicySaver
collect_policy = tf_agent.collect_policy
saver = PolicySaver(collect_policy, batch_size=None)
and then save the model like this:
saver.save('my_directory/')
This works OK in Google Colab, but I am getting the following error on my local PC.
AttributeError: module 'tensorflow.python.saved_model.nested_structure_coder' has no attribute 'StructureCoder'
These are the library versions I am using:
tensorflow 2.9.1
tf-agents 0.11.0
Tl;dr
Make sure you have the right tensorflow-probability version that is compatible with TF 2.9.x and tf-agents 0.11.0:
pip uninstall tensorflow-probability
pip install tensorflow-probability==0.17.0
(0.19.0 for TF 2.11, 0.18.0 for TF 2.10 or look at the release notes)
Also make sure to restart your kernel from the notebook.
What the problem was
StructureCoder has been moved into the TensorFlow API. Therefore, other dependent libraries have made changes like this in tf-agents and like this in tensorflow-probability. Your machine is somehow picking up an older version that depends on the previous version of nested_structure_coder.
For me, I was using
tensorflow 2.9.0
tf-agents 0.13.0
tensorflow-probability 0.17.0
Try making an explicit import in your notebook:
import tensorflow_probability
print(tensorflow_probability.__version__) # was 0.17.0 for me
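A minimal sketch to confirm that all three libraries line up in the environment the notebook is actually using (the version numbers in the comments are just the combination from this answer):
import tensorflow as tf
import tf_agents
import tensorflow_probability as tfp

# Mismatches here are usually why nested_structure_coder.StructureCoder
# cannot be found at save time.
print(tf.__version__)         # e.g. 2.9.1
print(tf_agents.__version__)  # e.g. 0.11.0
print(tfp.__version__)        # e.g. 0.17.0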

'No schema registered' in ONNX model conversion

I am using a Kaggle notebook. I am trying to convert my PyTorch model into a TensorFlow model to run with TensorFlow.js. I used the code below to convert the ONNX model to a TensorFlow model:
import onnx
from onnx_tf.backend import prepare
onnx_model = onnx.load("../input/onnx-model/model.onnx")
tf_rep = prepare(onnx_model)
tf_rep.export_graph("output/model.pb")
I got
SchemaError: No schema registered for 'BitShift'!
I tried with onnx versions 1.8.1 and 1.8.0, and then downgraded further to 1.6.0.
Also, I tried to run the ONNX model directly with onnx.js, but I faced issues with image normalization and resizing, hence I decided to switch to tfjs.
I faced the same issue. Uninstall onnx-tf and run
pip install git+https://github.com/onnx/onnx-tensorflow.git. The issue seems to be with some exception type.
I have tested it for tf 1.15.0, downgrading onnx with pip install onnx==1.8.0.
For more details, use this answer.
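If the end goal is TensorFlow.js, the exported graph can then be converted with the tensorflowjs_converter CLI; a sketch, assuming the newer onnx-tf export produced a SavedModel directory at output/model and that tensorflowjs is installed:
pip install tensorflowjs
tensorflowjs_converter --input_format=tf_saved_model output/model output/tfjs_model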

ValueError: GRU(reset_after=False) is not compatible with GRU(reset_after=True)

How do I solve this error, which is raised when loading the weights from an h5 file?
ValueError: GRU(reset_after=False) is not compatible with GRU(reset_after=True)
Github link : https://github.com/emilwallner/Screenshot-to-code
Colab link : https://colab.research.google.com/drive/106_QEi_Wp6mfDDE1E2lPSPh7S9CABk6B#revisionId=0Byh7i7xj0YHlMU0xaTJCWDA3ZzZNTlA1VFFRWU5xQWdtc2tFPQ
dataset drive link:
https://drive.google.com/drive/folders/1BTeUbXO7qBvOT4VkhOrr7SOcSocSZyeb?usp=sharing
Set reset_after=True in your GRU layer.
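A minimal sketch of that fix when rebuilding the model before loading the h5 weights (the layer sizes and input shape here are placeholders, not the repository's actual architecture):
from tensorflow.keras.layers import GRU
from tensorflow.keras.models import Sequential

# reset_after=True matches the weight layout the saved h5 file expects;
# with reset_after=False the GRU bias/kernel shapes differ, which raises
# the incompatibility error on load_weights.
model = Sequential([
    GRU(128, reset_after=True, return_sequences=True, input_shape=(None, 48)),
    GRU(128, reset_after=True),
])
model.load_weights("weights.h5")  # hypothetical weights file path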
Probably your TensorFlow version is not compatible with the other packages. Try using tensorflow==1.15.0. You can install it with pip install tensorflow==1.15.0.

Error on Scope Variable While Using Tensorflow Hub

I am using Colab to run a text analysis code. I want to get universal-sentence-encoder-large from tensorflow_hub.
But any time I run the block containing the code below:
module = hub.Module("https://tfhub.dev/google/universal-sentence-encoder-large/3")
I get this error:
RuntimeError: variable_scope module_8/ was unused but the corresponding name_scope was already taken.
I would appreciate any idea of how this error can be fixed.
The TF Hub USE-3 module doesn't work with TensorFlow version 2.0.
Hence, if you change the version from 2.0 to 1.15, it works without any error.
Please find the working code mentioned below:
!pip install tensorflow==1.15
!pip install "tensorflow_hub>=0.6.0"
!pip3 install tensorflow_text==1.15
import tensorflow as tf
import tensorflow_hub as hub
import numpy as np
import tensorflow_text
module = hub.Module("https://tfhub.dev/google/universal-sentence-encoder-large/3")
Please find the GitHub Gist of the Google Colab notebook as well.
With TensorFlow 2 in Google Colab, you should use hub.load(url) instead of hub.Module(url).
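A minimal sketch of the TF2-style usage, assuming the /5 release of the model (the TF2 SavedModel version of universal-sentence-encoder-large):
import tensorflow_hub as hub

# In TF2, hub.load returns a callable SavedModel object instead of a hub.Module.
embed = hub.load("https://tfhub.dev/google/universal-sentence-encoder-large/5")
embeddings = embed(["Hello world.", "How are you?"])
print(embeddings.shape)  # (2, 512)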