I'm having a minor problem running TensorFlow with Colab

I am a beginner who just learned about TensorFlow using Google Colab.
As shown in the attached screenshot, lines 10 to 13 of my tensorflow.keras imports are underlined. What is the problem?
The underline presumably flags a typo, but the code runs without any errors.

This underline error was an ongoing issue in Google Colab earlier and has since been resolved. Please try replicating the same code from your screenshot in Google Colab again and let us know if the issue still persists on your end.
Please check the code below (I replicated the same in Google Colab with Python 3.8.10 and TensorFlow 2.9.2):
from keras.preprocessing import Sequence shows the underline error because the import path is wrong; the Sequence API needs the proper prefix (tensorflow.keras.preprocessing or tensorflow.keras.utils), as in the sketch below.
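For reference, a minimal sketch of imports that do resolve in TF 2.x (pad_sequences is only an illustrative pick from that module):

from tensorflow.keras.utils import Sequence                        # base class for data generators
from tensorflow.keras.preprocessing.sequence import pad_sequences  # preprocessing helper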

Related

NotFoundError: iris_training.csv

I am using Colab to repeat the exercise provided here: https://nbviewer.jupyter.org/gist/yufengg/a6dff912ab48f7a273f5704ad9ab1311
I changed TensorFlow to version 1.3.0; however, I got the NotFoundError shown in the screenshot.
Any help would be greatly appreciated.
You need to download the data too.
Try adding this:
!wget https://raw.githubusercontent.com/h2oai/h2o-2/master/smalldata/iris/iris_train.csv
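Note that the notebook loads iris_training.csv while the mirror above serves iris_train.csv, so you may need to save it under the expected name. A minimal Python sketch of the same download (the target filename is an assumption based on the error message):

import urllib.request

url = "https://raw.githubusercontent.com/h2oai/h2o-2/master/smalldata/iris/iris_train.csv"
urllib.request.urlretrieve(url, "iris_training.csv")  # save under the filename the notebook expects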

Google Colab - Your session crashed for an unknown reason

I receive the message "Your session crashed for an unknown reason" when I run the following cell in Google Colab:
from keras import backend as K

if 'tensorflow' == K.backend():
    import tensorflow as tf
    from keras.backend.tensorflow_backend import set_session
    config = tf.ConfigProto()
    config.gpu_options.allow_growth = True
    config.gpu_options.visible_device_list = "0"
    set_session(tf.Session(config=config))
I have been receiving this message since I uploaded two data sets to Google Drive.
Does anyone recognize this message and can give me some advice?
Many thanks for every hint.
Update: I always receive the message.
Update: I have removed the data sets from Google Drive, but the session is still crashing.
Google Colab is crashing because you are trying to run GPU-related code with the runtime set to CPU.
The execution succeeds if you change the runtime to GPU. The steps are:
Runtime -> Change runtime type -> Hardware accelerator: GPU (select from the dropdown).
Please find the working code in the GitHub Gist.
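As a quick sanity check (a sketch, not part of the Gist above), you can confirm the runtime actually sees a GPU before running GPU-specific code:

import tensorflow as tf

# Prints '/device:GPU:0' on a GPU runtime and an empty string on a CPU runtime.
print(tf.test.gpu_device_name())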
Just a side note: sometimes you may want to reinstall a slightly older version of the related module (check the error log to see which one). That worked for me in one case.
This error happens when the expected device and the actual device are different.
For example, if you run code written with torch_xla, which is for TPU training, on a GPU (CUDA) runtime, Colab will return this error.
It is really tricky, since Colab does not give you an actual debugging message, which makes it hard to find the actual problem.
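To illustrate, a minimal sketch of the kind of TPU-only code that can crash a GPU runtime this way (assuming torch_xla is installed):

import torch_xla.core.xla_model as xm

# Requires an XLA/TPU backend; on a plain GPU/CPU runtime this can fail
# or crash the session without a useful error message.
device = xm.xla_device()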

Extract Grid-Anchors from Object-Detection API Model

I'm currently trying to get my SSDLite network, which I trained with the TensorFlow Object Detection API, working with iOS.
So I'm using the open-source code of SSDMobileNet_CoreML.
The graph already works with some limitations. For running on iOS I had to extract the FeatureExtractor from my graph, and I was unable to keep the Preprocessor, Postprocessor, and MultipleGridAnchorBox, the same way they did in SSDMobileNet_CoreML.
Here you can see the anchors they have used.
Because my anchors seem to be a little different, I tried to understand how they got this array.
I found an explanation in a GitHub issue, where the user who created the anchors explains how he got them.
He says:
I just exported them out of the Tensorflow Graph from the import/MultipleGridAnchorGenerator/Identity tensor
I already found the matching tensor in my graph, but I don't know how to export the graph and retrieve the correct anchor encoding.
Can somebody explain this to me?
I already figured it out. A little below the quote there was a link to a Python notebook which explains everything in detail.
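For anyone who can't follow the link, a minimal sketch of one way to evaluate that tensor from a frozen graph (TF 1.x style via tf.compat.v1; the filename frozen_inference_graph.pb is an assumption):

import tensorflow as tf

# Load the frozen graph definition.
graph_def = tf.compat.v1.GraphDef()
with tf.io.gfile.GFile("frozen_inference_graph.pb", "rb") as f:
    graph_def.ParseFromString(f.read())

# Import it and look up the anchor tensor by name.
graph = tf.Graph()
with graph.as_default():
    tf.compat.v1.import_graph_def(graph_def, name="import")
anchors_t = graph.get_tensor_by_name("import/MultipleGridAnchorGenerator/Identity:0")

# Evaluate the tensor; the result holds the anchor boxes (typically shape (num_anchors, 4)).
with tf.compat.v1.Session(graph=graph) as sess:
    anchors = sess.run(anchors_t)
print(anchors.shape)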

Unable to download .h5 weights file trained on Google Colab (approx. 500 MB in size)

I have re-trained a VGG model on Google Colab.
However, when I try to download the model weights using the code below, it throws the following error:
Code:
from google.colab import files
files.download('vgg_retrained_colab_24epochs_loss_0_47_accuracy_76_88.h5')
Error report:
error report link
How can I overcome this? The same code works for smaller files.
Maybe it's too late now; however, this answer may at least help someone else. I faced a similar problem with a checkpoint file; you can find the solution that worked for me here.
The answer to your question can be found here: https://stackoverflow.com/a/48774782/5544055
Just change "file_name.csv" to
"vgg_retrained_colab_24epochs_loss_0_47_accuracy_76_88.h5"

JupyterLab output doesn't show visualization

I encountered this issue with two different visualization libraries:
pyLDAvis and displaCy (spaCy).
When executing code in JupyterLab (Python 3 kernel), the expected output is for the notebook to show the graph or web content. But my JupyterLab doesn't show any graph / dependency image; I only see textual output.
e.g.
displacy.serve(doc, style='dep')
I'm using the Kaggle Docker image, which includes JupyterLab, and on top of that I have updated to the latest packages.
Any pointers as to whether this is JupyterLab-related or due to the underlying packages?
I can only really comment on the spaCy part of this, but one thing I noticed is that you are using displacy.serve instead of displacy.render, which would be the correct method to call from within a Jupyter environment (see the spaCy visualizer docs for a full example and more details). The reason behind this is that displacy.serve will start a web server to show the visualization in a browser – all of which is not necessary if you're already in a Jupyter Notebook. So when you call displacy.render, it will detect your Jupyter environment, and wrap the visualization accordingly. You can also set jupyter=True to force this behaviour.
Try:
from spacy import displacy
displacy.render(doc, style="dep", jupyter=True, options={'distance': 140})
or, for entity highlighting:
displacy.render(doc, style="ent", jupyter=True)  # the 'distance' option only applies to the dep style