I am using Colab to run a text analysis code. I want to get universal-sentence-encoder-large from tensorflow_hub.
But any time I run the block containing the code below:
module = hub.Module("https://tfhub.dev/google/universal-sentence-encoder-large/3")
I get this error:
RuntimeError: variable_scope module_8/ was unused but the
corresponding name_scope was already taken.
I would appreciate any ideas on how this error can be fixed.
The TF Hub USE-3 module doesn't work with TensorFlow version 2.0.
Hence, if you change the version from 2.0 to 1.15, it works without any error.
Please find the working code mentioned below:
!pip install tensorflow==1.15
!pip install "tensorflow_hub>=0.6.0"
!pip3 install tensorflow_text==1.15
import tensorflow as tf
import tensorflow_hub as hub
import numpy as np
import tensorflow_text
module = hub.Module("https://tfhub.dev/google/universal-sentence-encoder-large/3")
Please find the GitHub Gist of the Google Colab notebook as well.
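For completeness, here is a minimal sketch of how the module is typically evaluated under TF 1.15 (the session and initializer boilerplate below is my addition, not part of the original answer):

module = hub.Module("https://tfhub.dev/google/universal-sentence-encoder-large/3")
sentences = ["Hello world.", "Text analysis in Colab."]
embeddings = module(sentences)  # a graph-mode tensor, not yet evaluated
with tf.Session() as session:
    # hub.Module variables and lookup tables must be initialized before running
    session.run([tf.global_variables_initializer(), tf.tables_initializer()])
    vectors = session.run(embeddings)
    print(vectors.shape)  # expected: (2, 512)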
With TensorFlow 2 in Google Colab, you should use hub.load(url) instead of hub.Module(url).
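A minimal sketch of that approach (assuming TensorFlow 2.x with a recent tensorflow_hub and the TF2-compatible /5 version of the model, since the /3 URL above is in the older TF1 Hub format):

import tensorflow_hub as hub

embed = hub.load("https://tfhub.dev/google/universal-sentence-encoder-large/5")
embeddings = embed(["Hello world.", "Text analysis in Colab."])
print(embeddings.shape)  # expected: (2, 512)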
Related
I am trying to import the tensorflow module in my Jupyter notebook; however, this error appears. I have imported tensorflow before, therefore I am not sure why it is showing the error: ModuleNotFoundError: No module named 'tensorflow.python.eager.polymorphic_function'. Could someone help me fix this?
I tried checking if tensorflow is installed, and indeed it is.
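A quick diagnostic sketch (my suggestion, not from the thread) to confirm that the notebook kernel is using the same environment where TensorFlow is installed, and which version it actually imports:

import sys
print(sys.executable)   # the Python interpreter the notebook kernel runs on

import tensorflow as tf
print(tf.__version__)   # the version that interpreter actually imports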
I am running a very basic sentiment analysis pipeline utilising the XLM-RoBERTa model on Hugging Face. I am trying to ensure I am utilising the M1 chip, as I will be looping over ~10e7 entries.
So as to be consistent, I am running a fresh install of PyTorch following the yml file and steps outlined in this (very useful) video. I subsequently pip install sentencepiece and protobuf (version 3.2.0) to deal with a few subsequent errors. When running a simple pipeline model, I am however faced with the below:
# Imports
import pandas as pd
import datetime as dt
import itertools
from transformers import pipeline, AutoTokenizer
sentiment_model = pipeline(model="cardiffnlp/twitter-xlm-roberta-base-sentiment", return_all_scores=True)
ValueError: google.__spec__ is None
Interestingly, following the install method for TensorFlow from the same channel runs fine, but it does not access the M1 chip and simply runs on the CPU.
Has anyone faced this before, or does anyone have a method by which I can run PyTorch?
Many thanks in advance.
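One thing worth checking (my own sketch, not part of the original question, and assuming torch >= 1.12 plus a transformers release whose pipeline accepts a device string) is whether the PyTorch build actually exposes the M1 GPU via the MPS backend, and then passing that device to the pipeline explicitly:

import torch
from transformers import pipeline

# "mps" is Apple's Metal backend; fall back to CPU if it is unavailable
device = "mps" if torch.backends.mps.is_available() else "cpu"
print("Using device:", device)

sentiment_model = pipeline(
    model="cardiffnlp/twitter-xlm-roberta-base-sentiment",
    return_all_scores=True,
    device=device,
)
print(sentiment_model("a short test sentence"))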
I am trying to save a model with tensorflow-agents. First I define the following:
from tf_agents.policies.policy_saver import PolicySaver

collect_policy = tf_agent.collect_policy
saver = PolicySaver(collect_policy, batch_size=None)
and then save the model like this:
saver.save('my_directory/')
This works OK in Google Colab, but I am getting the following error on my local PC:
AttributeError: module 'tensorflow.python.saved_model.nested_structure_coder' has no attribute 'StructureCoder'
These are the library versions I am using:
tensorflow 2.9.1
tf-agents 0.11.0
TL;DR
Make sure you have the right tensorflow-probability version, one that is compatible with TF 2.9.x and tf-agents 0.11.0:
pip uninstall tensorflow-probability
pip install tensorflow-probability==0.17.0
(0.19.0 for TF 2.11, 0.18.0 for TF 2.10 or look at the release notes)
Also make sure to restart your kernel from the notebook.
What the problem was
StructureCoder has been moved into the TensorFlow API. Therefore, other dependent libraries have made changes like this in tf-agents and like this in tensorflow-probability. Your machine is somehow picking up an older version that depends on the previous version of nested_structure_coder.
For me, I was using
tensorflow 2.9.0
tf-agents 0.13.0
tensorflow-probability 0.17.0
Try making an explicit import in your notebook:
import tensorflow_probability
print(tensorflow_probability.__version__) # was 0.17.0 for me
I am trying to convert an .onnx model to a .pb model. I have written the code, but I am getting the error:
@tf_func(tf.ceil)
AttributeError: module 'tensorflow' has no attribute 'ceil'
Code:
import onnx
from tensorflow.python.tools.import_pb_to_tensorboard import import_to_tensorboard
from onnx_tf.backend import prepare
onnx_model = onnx.load("original_3dlm.onnx")
tf_rep = prepare(onnx_model)
tf_rep.export_graph("model_var.pb")
import_to_tensorboard("model_var.pb", "tb_log")
How can I resolve this issue? Is there any other way to convert ONNX to TensorFlow?
I solved this issue with this:
Tensorflow Backend for ONNX.
Let me know if you have any issues.
Changing from TensorFlow 2.0 to 1.14 may solve the problem.
Your code, as far as I can tell, should be fine. The problem probably lies in the onnx-tf version you currently use. pip currently installs a version that only supports TensorFlow <= 1.15.
Run this in the terminal to install a more up-to-date version of onnx-tf:
pip uninstall onnx_tf
pip install git+https://github.com/onnx/onnx-tensorflow.git
Refer to this issue for further details.
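After reinstalling, a quick check (my addition) that the newer onnx-tf build is the one actually being picked up:

import importlib.metadata
print(importlib.metadata.version("onnx-tf"))  # should report the version installed from the git repository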
I tried to code in PyCharm, but when I use from keras import backend as K it throws an import error like "cannot import name backend". But I can do it in the terminal.
How can I fix this?
(Screenshots: the error in PyCharm, and the working import in the terminal.)
Are you sure your PyCharm sees the same Python environment as the one you are using in the terminal?
See if this possibly works for you:
use tensorflow on pyCharm
I got the same problem. Check here: Pycharm cannot find installed packages: keras.
After adding a new package to the project interpreter, you need to quit PyCharm and restart it again.
Use
from tensorflow.python.keras import backend as K
instead.
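A minimal check (my addition, assuming a TensorFlow install where Keras is still bundled under tensorflow.python) that the import resolves inside PyCharm's interpreter:

from tensorflow.python.keras import backend as K
print(K.epsilon())  # prints a small float such as 1e-07 if the backend is importable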