How to connect to a local runtime in Google Colab for this specific notebook (WhisperWithVAD) - google-colaboratory

I hope someone can help me connect a local runtime to this specific notebook on Google Colab, at this link:
https://colab.research.google.com/github/ANonEntity/WhisperWithVAD/blob/main/WhisperWithVAD.ipynb
Basically, it is a modified version of OpenAI's Whisper speech-to-text that I use on my videos to learn languages; the model makes use of GPU acceleration.
I have been using the free version, but it has been restricted because use of this model through Google's hosted runtime is not unlimited. Since I am not really a dev, I have difficulties with the setup. It would also be great if you could give me clear instructions.
The corresponding error in Colab: Unable to connect to runtime

You can follow these instructions to install Jupyter on your local machine and then connect the WhisperWithVAD notebook to your local runtime: https://research.google.com/colaboratory/local-runtimes.html
I was able to get this up and running, but I had to comment out some code in the Run Whisper block where the notebook was getting hung up:
#from google.colab import files (near the top of the block)
and
#files.download(out_path) (at the bottom of the block)
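If you prefer not to delete those lines outright, a guarded import keeps the cell working on both the hosted and the local runtime. A minimal sketch (not part of the original notebook; out_path is just a placeholder for whatever output path the Run Whisper block already sets):

out_path = "transcript.srt"  # placeholder; the real notebook sets this itself
try:
    from google.colab import files  # only available on hosted Colab runtimes
    IN_COLAB = True
except ImportError:
    IN_COLAB = False

# ... the Run Whisper block writes its transcript to out_path here ...

if IN_COLAB:
    files.download(out_path)  # triggers a browser download on hosted Colab
else:
    print(f"Transcript saved locally at {out_path}")  # local runtime: file is already on disk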

Related

Google Cloud Deep Learning on Linux VM throws Unknown CUDA Error

I am trying to set up a deep learning VM on Google Cloud but I keep running into the same issue over and over again.
I follow all the steps: set up an N1-highmem-8 (8 vCPUs, 52 GB memory) instance, add a single T4 GPU, and select the Deep Learning image "TensorFlow 2.4, M69, CUDA 11.0". That's it.
After that, I SSH into the VM, run the script that installs all the NVIDIA drivers and... when I begin using it, by simply running
from tensorflow.keras.layers import Input, Dense
i = Input((100,))
x = Dense(500)(i)
I keep getting failed call to cuInit: CUDA_ERROR_UNKNOWN: unknown error. By that point I haven't installed anything and haven't done anything custom, just the vanilla image from GCP.
What is more concerning is that, even if I delete the VM and then create a new one with the same config, sometimes the error won't happen immediately and sometimes it's present right off the bat.
Has anyone encountered this? I've googled around to see if anyone has faced this issue, and while I came across suggestions, all of them are old and have not worked for me. Moreover, the suggestions on NVIDIA support forums tell me to re-install everything, and the whole point of using a pre-built GCP image specifically for deep learning is so that I don't have to enter the hell of installing and resolving issues with NVIDIA drivers.
The issue is fixed with the M74 image, but you are using M69, so follow one of the two fixes provided in the Google Cloud public forum.
You can mitigate the issue in one of two ways:
Fix #1: Use the latest DLVM image (M74 or later) in a new VM instance. A fix was released with the M74 DLVM image, so instances created from it are no longer affected by this issue (see the gcloud sketch after Fix #2).
Fix #2: Patch your existing instance running images older than M74.
Run the following via an SSH session on the affected instance:
gsutil cp gs://dl-platform-public-nvidia/b191551132/restart_patch.sh /tmp/restart_patch.sh
chmod +x /tmp/restart_patch.sh
sudo /tmp/restart_patch.sh
sudo service jupyter restart
This only needs to be done once, and does not need to be rerun each time the instance is rebooted.
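For Fix #1, creating a fresh instance from a newer image can be done from the command line. A rough sketch (instance name, zone, and image family are assumptions, not from the original answer; run gcloud compute images list --project deeplearning-platform-release to pick the current family):

gcloud compute instances create my-dlvm \
  --zone=us-central1-a \
  --machine-type=n1-highmem-8 \
  --accelerator=type=nvidia-tesla-t4,count=1 \
  --maintenance-policy=TERMINATE \
  --image-project=deeplearning-platform-release \
  --image-family=tf-latest-gpu \
  --metadata=install-nvidia-driver=True \
  --boot-disk-size=200GB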

How to use local Coral USB TPU with Google Colab (instead of Cloud TPU)

I have a USB TPU and would like to use it as LOCAL RUNTIME in Google Colab.
I was not able to find any resources on this topic.
You can use a local runtime (local Jupyter); it is explained here:
https://research.google.com/colaboratory/local-runtimes.html
Do I need to install all the TPU libraries in my local Jupyter and then connect to local Jupyter as local runtime to start using my USB TPU in Colab?
I'm not familiar with Google Colab, but it looks like it allows you to expose your local hardware to the notebook. You'll then need to locate your model in order to run inference with it. There are multiple ways you can choose to run it, all of which are listed here:
https://coral.withgoogle.com/docs/edgetpu/inference/
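And yes: the Edge TPU runtime and Python bindings have to be installed on the machine that runs the local Jupyter server, because the hosted Colab VM cannot see your USB device. A minimal sketch of local inference with the tflite_runtime interpreter and the Edge TPU delegate (the model path is a placeholder; the delegate library is libedgetpu.so.1 on Linux, edgetpu.dll on Windows):

import numpy as np
from tflite_runtime.interpreter import Interpreter, load_delegate

# Load a model compiled for the Edge TPU and attach the USB accelerator via its delegate.
interpreter = Interpreter(
    model_path="model_edgetpu.tflite",  # placeholder path
    experimental_delegates=[load_delegate("libedgetpu.so.1")])
interpreter.allocate_tensors()

# Feed a dummy input of the right shape and read back the output.
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], np.zeros(inp["shape"], dtype=inp["dtype"]))
interpreter.invoke()
print(interpreter.get_tensor(out["index"]))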

TensorFlow 2.0 beta GPU running in Jupyter notebook, but not in Google Colab

I am working with TensorFlow 2.0 beta, and while I managed to get my GPU working in Anaconda through a few YouTube tutorials, I am unable to get my GPU running in Google Colab. I know Google has the option to enable a GPU on one of their servers, but my GTX 1070 is much faster, and I need to run from Colab, not just Jupyter exclusively.
So I read the documentation like a good boy, and the only thing I think I could have done wrong is my path settings; I have screenshots below.
I followed several different YouTube tutorials faithfully until the final one here gave me a way to install it for Jupyter. Which is great, but I also need it to run on Google Colab as well.
I've been trying this since Friday and it's now Tuesday, and I'm losing my mind over this. Help me Stack Overflow, you're my only hope.
https://imgur.com/a/8WibGWT
If you can get it running on your own Jupyter server, then you can point Colab at that local server.
Full instructions are here: https://research.google.com/colaboratory/local-runtimes.html, but the edited highlights are:
Install jupyter_http_over_ws:
pip install jupyter_http_over_ws
jupyter serverextension enable --py jupyter_http_over_ws
Start your local server, allowing the Colab domain:
jupyter notebook \
--NotebookApp.allow_origin='https://colab.research.google.com' \
--port=8888 \
--NotebookApp.port_retries=0
Click 'Connect to local runtime' in Colab.
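Once the notebook is connected to the local runtime, a quick check that Colab is actually using your GTX 1070 (a sketch; the experimental namespace is what TF 2.0 beta uses, later versions also expose tf.config.list_physical_devices):

import tensorflow as tf
print(tf.__version__)
# Should list your local GPU, e.g. PhysicalDevice(name='/physical_device:GPU:0', ...)
print(tf.config.experimental.list_physical_devices("GPU"))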

"First steps with Tensorflow", how to access the data files outside of colab?

I'm attempting to run "First Steps with TensorFlow" locally, outside of Colab. I'm not really familiar with Colab, so I don't know how to access the dataframes such as california_housing_dataframe, etc. Evidently Colab "knows" how to access the dataframes in the example, but I am attempting to run the exercise natively on my local system.
Thank you
I think you need to have the pandas library installed locally; then it should run natively.
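Concretely, the exercise loads the data straight from a public CSV with pandas, so nothing Colab-specific is needed. A sketch (the URL is the one the course used at the time; verify it still resolves):

import pandas as pd

california_housing_dataframe = pd.read_csv(
    "https://download.mlcc.google.com/mledu-datasets/california_housing_train.csv",
    sep=",")
print(california_housing_dataframe.describe())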

GUI is not possible on Google Colab

I understand that GUIs (such as those powered by tkinter) do not work on Google Colab. Any alternatives at this point?
Error message:
TclError: no display name and no $DISPLAY environment variable in Google Colab
To use these notebooks you need to install the binary MoebInv libraries and their dependencies.
In short, you simply need to execute the next cell in Colab or on your Ubuntu 18.04 desktop.
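Separately from the MoebInv route above, a common workaround for the TclError itself (not from the answer above) is to start a virtual X display so that $DISPLAY exists before any GUI toolkit is imported. A sketch for a Colab cell using Xvfb and pyvirtualdisplay:

# Install the virtual framebuffer and the Python wrapper.
!apt-get -qq install -y xvfb
!pip -q install pyvirtualdisplay

from pyvirtualdisplay import Display

# Start a headless X display; $DISPLAY is set for the rest of the session.
display = Display(visible=0, size=(1024, 768))
display.start()

import tkinter as tk
root = tk.Tk()   # no longer raises TclError, though nothing is shown on screen
root.destroy()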