I'm new to Kaggle and deep learning. I want to use scikeras on Kaggle, but it seems to require TensorFlow >= 2.7.0, while Kaggle currently ships version 2.6.4.
I tried the following, but it didn't work. So how can I use a higher version of TensorFlow? Thanks for your help!
(screenshot of the attempted upgrade omitted)
I'm trying to run the following google colab:
https://colab.research.google.com/gist/zsyzzsoft/5fbb71b9bf9a3217576bebae5de46fc2/data-efficient-gans.ipynb?authuser=1#scrollTo=Re5R6VX8VNgo
Colab no longer recognises GPUs with TensorFlow 1.x, so is there any way to get this Colab working again?
I have tried reinstalling TensorFlow 1.x and also upgrading the code to TensorFlow 2, but nothing seems to work.
Google Colab removed support for TensorFlow 1, and the %tensorflow_version 1.x magic no longer works. You have to install a specific TensorFlow 1.x release yourself, using
pip install tensorflow==1.x
(replace 1.x with the exact release you need), and you can then check device placement with
sess = tf.Session(config=tf.ConfigProto(log_device_placement=True))
For more details please refer to this link. Thank you.
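As a minimal sketch (assuming you pin 1.15, the last 1.x release), you can confirm the install and GPU visibility afterwards:
import tensorflow as tf
print(tf.__version__)               # should report 1.15.x
print(tf.test.is_gpu_available())   # True only if TF 1.x can see the runtime's GPU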
I am trying to run code that was written with TensorFlow v1 and I am struggling to migrate it to TensorFlow v2. I thought it might be easiest to install TensorFlow v1, but I couldn't find a tutorial on how to do that. Is it even still possible to install TensorFlow 1?
Code written using TensorFlow v1 can be upgraded to TensorFlow v2 by following the TensorFlow migration guide.
You can also convert TensorFlow v1 code to TensorFlow v2 automatically by running the upgrade script; to learn more about it, see here.
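For example, the tf_upgrade_v2 script that ships with TensorFlow 2.x can convert a single file or a whole project tree (file and directory names below are placeholders):
tf_upgrade_v2 --infile legacy_model.py --outfile legacy_model_v2.py
tf_upgrade_v2 --intree legacy_project/ --outtree legacy_project_v2/ --reportfile upgrade_report.txt
The report file lists every change the script made and anything that still needs manual attention.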
To install a TensorFlow v1 version:
pip install tensorflow==1.15
Follow the instructions mentioned there to install TensorFlow.
I'm trying to run a program on my Raspberry Pi but I can't, because it needs at least TensorFlow 2.2.0 while I have TensorFlow 2.0.0. I tried installing TensorFlow 2.2.0 and 2.3.0 several times, but after installing, the reported version is still 2.0.0.
(screenshot: installed TensorFlow versions)
Can somebody tell me what is happening? Thank you!!
Try to find the package under /python3.x/site-packages and remove the tensorflow directory with rm.
Then install the needed TensorFlow version by following the installation instructions found here in the official TensorFlow documentation.
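A minimal sketch of that cleanup, assuming a system-wide Python 3.7 install (adjust the site-packages path to your environment):
pip3 show tensorflow                                         # the Location field shows where the package lives
rm -rf /usr/lib/python3.7/site-packages/tensorflow*          # placeholder path: remove the stale copy
pip3 install tensorflow==2.2.0                               # reinstall the version the program needs
python3 -c "import tensorflow as tf; print(tf.__version__)"  # should now report 2.2.0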
Also attaching the image from the comment by Pablo Gracia S.
I am a university professor trying to learn deep learning for a possible class in the future. I have been using Google Colab with GPU support for the past couple of months. Just recently, the GPU device is no longer found, even though I am doing everything the same way I have in the past. I can't imagine that I have done anything wrong, because I am just working through tutorials from books and the TensorFlow 2.0 tutorials site.
TensorFlow 2 GPU support on Colab was broken recently due to an upgrade from CUDA 10.0 to CUDA 10.1. As of this afternoon, the issue should be resolved for the TensorFlow builds bundled with Colab. That is, if you run the following magic command:
%tensorflow_version 2.x
then import tensorflow will give you a working, GPU-compatible TensorFlow 2.0 build.
Note, however, if you attempt to install a version of tensorflow using pip install tensorflow-gpu or similar, the result may not work in Colab due to system incompatibilities.
See https://colab.research.google.com/notebooks/tensorflow_version.ipynb for more information.
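As a quick sanity check after switching the runtime (a sketch; the experimental device-listing API is used because the bundled build is 2.0):
%tensorflow_version 2.x
import tensorflow as tf
print(tf.__version__)                                        # should report a 2.x build
print(tf.config.experimental.list_physical_devices('GPU'))   # non-empty when a GPU runtime is attached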
I have installed TensorFlow r1.14 and want to use TF-TRT. However, the following error occurs:
"ModuleNotFoundError: No module named 'tensorflow.contrib.tensorrt'"
when running the sample code. The same error occurs with TensorFlow r1.13. So my question is: do I need to install the tensorflow.contrib.tensorrt library separately? If yes, how?
Additionally, I can run the TensorRT sample code, e.g. sampleINT8, successfully (click here to see my successful sample run). This leads me to believe that TensorRT is installed properly. However, TF-TRT still doesn't work.
Any help would be greatly appreciated!
In TF 1.14, TF-TRT was moved from contrib into the core.
You need to import it like this: from tensorflow.python.compiler.tensorrt import trt_convert as trt
https://github.com/tensorflow/tensorrt/blob/master/tftrt/examples/image-classification/image_classification.py#L22
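A minimal sketch of how that import is typically used in TF 1.14 to convert a SavedModel with TF-TRT (directory names are placeholders):
from tensorflow.python.compiler.tensorrt import trt_convert as trt
converter = trt.TrtGraphConverter(input_saved_model_dir='my_saved_model')   # placeholder path
converter.convert()                                                         # build the TensorRT-optimized graph
converter.save('my_trt_saved_model')                                        # placeholder output path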
This is the correct answer for Linux.
However, if you're using Windows: the TensorRT Python API (and therefore TF-TRT) is not supported for Windows at the moment, so the TensorFlow python packages aren't built with TensorRT.
In order to be able to import tensorflow.contrib.tensorrt you need to have tensorflow-gpu version >= 1.7 installed on your system. Maybe you could try installing the tensorflow-gpu library with a:
pip install tensorflow-gpu
Check out the Windows section of the GPU documentation as well. Also, I would try updating your tensorflow version with a:
pip install --upgrade tensorflow
to ensure you're up to date there as well. Check out this section of the TensorFlow documentation for additional support.
Hopefully that helps!
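As a quick sketch, you can check afterwards that the GPU build is active and recent enough:
python -c "import tensorflow as tf; print(tf.__version__); print(tf.test.is_gpu_available())"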
Two possibilities:
Have you installed tensorflow-gpu instead of tensorflow?
From your screenshot it looks like you're using Windows. I had the same problem: there seems to be no tensorrt module under contrib in the TF Windows distribution, although Linux has it (I tried 1.13.1).