I've been using Colab Free for a long time. My runtime gets disconnected every few minutes, so I did some research on Stack Overflow. I found some Chrome DevTools console snippets (How to prevent Google Colab from disconnecting?) and they were working until today, when the runtime started getting disconnected again.
[Q] How can I keep my runtime alive?
You should move to an EC2 instance on Amazon.
Related
I am running my code on Google Colab to bring up the MLflow dashboard, and whenever I run !mlflow ui it takes forever to execute. The last text on my screen is Booting worker with pid. This is my first time working with MLflow; can anyone tell me why this is happening and what I can do to fix it?
mlflow ui is for viewing logs; it doesn't install anything. It actually hosts a server using gunicorn, which is why the cell appears to hang: the server keeps running in the foreground. To connect to the tracking server created inside Colab, reading this thread could be useful. (Also this doc)
I recommend running the mlflow ui command on your local machine and then opening the listening address to see what happens. (Runs tracked on Colab won't show up there!)
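Since the cell blocks while gunicorn serves in the foreground, a quick way to confirm the server actually came up is to probe its port from another cell. A minimal stdlib sketch (port 5000 is MLflow's default; the helper name is my own, not part of MLflow):

```python
import socket

def port_is_serving(host="127.0.0.1", port=5000, timeout=1.0):
    """Return True if something accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# After starting `mlflow ui --port 5000` in another cell (or as a
# background process), this tells you whether the UI is reachable:
print(port_is_serving(port=5000))
```

If this returns False long after the "Booting worker" message, the server likely failed to bind rather than just being slow.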
aimlflow might be helpful. It runs a nice UI on top of MLflow logs.
The code: https://github.com/aimhubio/aimlflow
Some background
My computer's fan goes crazy when I am using Google Colab, so the client clearly uses local resources somehow. I am running very long processes (over 4 hours). Yesterday I was disconnected, and I thought my session had crashed since I stopped receiving the status updates from my task's progress bar. But then, after clicking Connect to a hosted runtime, I was able to reconnect to that session and interact with it just fine. Given that Google Colab uses some of my local resources, I am looking for a way to put the client application on hold for a little while.
Question
How to manually disconnect from my remote session without crashing/terminating it? Is that even possible?
Note:
There is an answer for Does Google Colab stay connected when I close my browser? that says
The current cell will continue executing once you close your browser, but the outputs will not end up in the notebook in Drive.
I would be fine with leaving the session running remotely without being able to access the outputs in the notebook, since I save the results to Google Drive when the process is done. So not being able to see the output in the notebook would not be an issue for me.
I keep trying to connect to the hosted runtime on Google Colab, but it disconnects me after a few seconds (up to 10 secs).
The JavaScript Console of Firefox (the latest version) shows this:
Error: "SocketIO error: xhr poll error: 0 - TransportError"
I experience a similar disconnect with Kaggle. How can I fix this? Thanks.
Edit: This may not be the final answer, but turning off my VPN seems to fix the issue with Colab. Turning the VPN back on did not immediately reproduce the issue, but Kaggle still does not work, so that might just be a Kaggle issue.
Edit 2: With VPN back on the issue comes up again.
I am using a Google Colab Jupyter notebook for algorithm training and have been struggling with an annoying problem. Since Colab runs in a VM environment, all my variables become undefined if my session is idle for a few hours. I come back from lunch and the training dataframe that takes a while to load has become undefined, and I have to call read_csv again to reload my dataframes.
Does anyone know how to rectify this?
If the notebook is idle for some time, it might get recycled: "Virtual machines are recycled when idle for a while" (see colaboratory faq)
There is also a hard limit on how long a virtual machine can run (about 12 hours).
What could also happen is that your notebook gets disconnected from the internet / google colab. This could be an issue with your network. Read more about this here or here
There is no way to "rectify" this, but if you have processed some data, you can add a step that saves it to Google Drive before the session goes idle.
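For example, a lightweight checkpoint step could look like this (a sketch using only the stdlib; the function names and paths are illustrative, and in Colab you would point the path at a mounted Drive folder):

```python
import os
import pickle

def save_checkpoint(obj, path):
    """Pickle obj to path, writing via a temp file so a recycled VM
    can't leave a half-written checkpoint behind."""
    tmp = path + ".tmp"
    with open(tmp, "wb") as f:
        pickle.dump(obj, f)
    os.replace(tmp, path)  # atomic rename

def load_checkpoint(path, default=None):
    """Load a previous checkpoint if one exists, otherwise return default."""
    if os.path.exists(path):
        with open(path, "rb") as f:
            return pickle.load(f)
    return default

# In Colab you would mount Drive first and use e.g.
# "/content/drive/MyDrive/checkpoint.pkl" as the path.
state = load_checkpoint("checkpoint.pkl", default={"epoch": 0})
state["epoch"] += 1
save_checkpoint(state, "checkpoint.pkl")
```

When the VM is recycled, re-running the notebook picks up from the last saved state instead of starting over.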
You can use a local runtime with Google Colab. That way the notebook uses your own machine's resources and you won't have any of the hosted limits. More on this: https://research.google.com/colaboratory/local-runtimes.html
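Roughly, the setup that page describes is to run a local Jupyter server with the WebSocket bridge extension enabled, then paste its URL into Colab's "Connect to local runtime" dialog (check the linked page for the current commands, as they may have changed):

```shell
# Install and enable the bridge extension Colab uses to talk to local Jupyter
pip install jupyter_http_over_ws
jupyter serverextension enable --py jupyter_http_over_ws

# Start Jupyter so it accepts connections from colab.research.google.com
jupyter notebook \
  --NotebookApp.allow_origin='https://colab.research.google.com' \
  --port=8888 \
  --NotebookApp.port_retries=0
```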
There are various ways to save your data in the process:
you can save to the notebook VM's filesystem, e.g. df.to_csv("my_data.csv") for a DataFrame df
you can import sqlite3, the Python interface to the popular SQLite database. The difference between SQLite and other SQL databases is that the DBMS runs inside your application, and the data is saved to a single file on that application's filesystem. Info: https://docs.python.org/2/library/sqlite3.html
you can save to your google drive, download to your local file system through your browser, upload to GCP... more info here: https://colab.research.google.com/notebooks/io.ipynb#scrollTo=eikfzi8ZT_rW
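To illustrate the sqlite3 option with just the standard library (the file and table names here are mine, purely for illustration):

```python
import sqlite3

# A single-file database on the VM's filesystem; point the path at a
# mounted Drive folder if you want it to survive the VM being recycled.
conn = sqlite3.connect("results.db")
conn.execute("CREATE TABLE IF NOT EXISTS metrics (step INTEGER, loss REAL)")
conn.executemany("INSERT INTO metrics VALUES (?, ?)", [(1, 0.9), (2, 0.7)])
conn.commit()

# Reading the data back works from any later session that opens the file.
rows = conn.execute("SELECT step, loss FROM metrics ORDER BY step").fetchall()
print(rows)
conn.close()
```

Because everything lives in one file, copying results.db to Drive is enough to checkpoint all of it at once.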
My trial period with Google Cloud ended and I have upgraded my account, but my virtual machine is still stopped and I get the following error, which I haven't managed to solve.
Starting VM instance "hasoffer-api" failed. Error: The default network interface [nic0] is frozen.
Can anyone give me a tip to solve this issue?
Many thanks in advance
Most of the time, that error is due to a deletion process on the project or a recent change to your billing account, which seems to be your case.
If so, it is necessary to file a case with Google Support.