How to save an image using toyplot (failed example)

In Google's Colab Jupyter notebook I have a canvas object which displays nicely on the screen. Now I want to save it to disk as a figure using toyplot:
import toyplot.pdf
toyplot.pdf.render(canvas, "fig.pdf")
Upon execution the code runs quietly and returns no error.
However, no file is saved and no save dialog is shown.
I'm using Chrome on Ubuntu.
Am I missing something?
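
A plausible explanation (an editorial note, not from the original post): render() writes the PDF to the Colab VM's filesystem, not to your local machine, which is why no download dialog appears. Assuming canvas is a valid toyplot canvas, the file can be fetched explicitly with the google.colab.files helper:

import toyplot.pdf
from google.colab import files

# render() saves fig.pdf to the VM's current working directory,
# visible via the Files pane or !ls, but not on the local disk
toyplot.pdf.render(canvas, "fig.pdf")

# explicitly trigger a browser download of the file from the VM
files.download("fig.pdf")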

Related

How do you embed pdf image in Jupyter Notebook markdown cell?

I'm trying to view this pdf image in my Jupyter Notebook markdown cell using:
![](https://www.cs.ubc.ca/~tmm/vadbook/eamonn-figs/fig5.2.pdf)
but the image is not rendering.
I've tried manually saving the pdf file into my directory and using:
![](fig5.2.pdf) and ![](./fig5.2.pdf)
but that didn't work either.
I do want to point out that converting the pdf into png works:
![](fig5.2.png)
However, since these .pdf images are present throughout my Jupyter Notebooks, I don't want to have to manually convert every single .pdf into .png.
I suspect this is a browser issue (I'm using Microsoft Edge Version 109.0.1518.78 (Official build) (64-bit)), since my classmates and professor, who use Safari, do get the embedded PDF images to render.
I appreciate any help.
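
An editorial aside: since the sticking point is converting many PDFs by hand, here is a minimal batch-conversion sketch, assuming the pdf2image package (which needs poppler installed) and single-page figure PDFs; the paths are illustrative:

from pathlib import Path
from pdf2image import convert_from_path

# convert every .pdf in the notebook's directory to a same-named .png
for pdf_path in Path(".").glob("*.pdf"):
    pages = convert_from_path(str(pdf_path), dpi=200)
    # figure PDFs are assumed to be single-page; save the first page
    pages[0].save(pdf_path.with_suffix(".png"))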

Same code running matplotlib through Anaconda Prompt and Spyder gives images of different clarity

I'm trying to run some deep learning code. I usually run it in Spyder on Windows 10. The images produced by matplotlib are saved using something like:
plt.savefig('./output/'+filename+'_'+str(num)+'.png',dpi=360)
Occasionally, I run the code through the Anaconda Prompt using:
python abc.py
In this case, the images appear on-screen. I've realised that if I enlarge them, I get a much clearer image with more detail compared to the saved image. Why is this so? I have attached some images for comparison.
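
One likely factor (an assumption on my part, not confirmed by the post): the interactive window redraws the figure from its vector description every time you resize it, so enlarging re-renders at full detail, whereas savefig rasterises once at a fixed pixel count of figsize times dpi. A minimal sketch of controlling both sizes:

import matplotlib.pyplot as plt

# figsize (inches) times dpi fixes the saved image's pixel dimensions:
# 8x6 inches at 360 dpi -> a 2880x2160 pixel PNG
fig, ax = plt.subplots(figsize=(8, 6))
ax.plot([0, 1], [0, 1])
fig.savefig("figure.png", dpi=360)

# a vector format (PDF/SVG) sidesteps rasterisation entirely
fig.savefig("figure.pdf")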

What is Colab uploading (repeatedly/persistently), and why?

I am running a python-tensorflow-keras Jupyter notebook on Colab, training a CNN on Caltech-256 images. The data is loaded from the Caltech site directly into the Colab area with a wget, and never touches my PC. The notebook includes TensorBoard and some callbacks.

Obviously, when I first upload the notebook to Colab, that uses some internet bandwidth. I would expect that the rest of the time there should be very little traffic: only enough to update my screen at the end of each epoch (every 600 seconds) or when I click on it. However, there is actually quite a lot of traffic, enough to impact the other people in my house significantly. I believe the problem is with our upload speed (i.e. data going from my PC to Colab). I am using the Firefox web browser. When I switch to Colab Playground mode, the issue disappears.
What is being uploaded, and why?
Is there any workaround?
I found the answer here: https://github.com/tensorflow/tensorboard/issues/3196
I don't fully understand it, but the workaround listed there works for me:
While the notebook is running, in the tensorboard display, click the settings cog (top right), and uncheck the auto-update box. (You then need to click the refresh icon whenever you want to update your graphs.)
This has to be repeated every time you open the notebook, but it's a small price to pay for family harmony.

Fill form fields in Colab notebook from URL query parameters

How can I dynamically populate a form field in my Colab notebook from a URL query parameter, so I can construct preconfigured links to it from another system? I tried adapting "Can a Jupyter / IPython notebook take arguments in the URL?", but the JS in Colab runs in a sandboxed iframe with a referrer policy that strips the path and query from document.referrer.
I would also like to make it as easy as possible for someone following such a link to run the notebook on the data supplied in the form fields. Is it possible to connect and run all automatically on opening a notebook? Is there a way to display a 'Run all' button so the user does not have to hunt for it in the menu?
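
For context (my illustration, not from the question): Colab form fields are declared with #@param comments, so the goal would be to overwrite a value like this one from the link's query string:

# a Colab form field: rendered as an editable widget in Colab;
# dataset_url is a hypothetical parameter name
dataset_url = "https://example.com/data.csv"  #@param {type:"string"}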

How can I preserve cell outputs on uploaded notebooks?

I uploaded a previously created Jupyter notebook. I could initially see all the cell outputs in Colab right after uploading it, but if I close the notebook and come back to it later, or if I share the notebook with a coworker, all the cell outputs have been cleared, which is quite annoying.
This is happening even though I've verified that the following two checkboxes are UNCHECKED:
Edit > Notebook settings > Omit code cell output when saving this notebook
Tools > Preferences > New notebooks use private outputs (omit outputs when saving)
From what I can tell, it looks like the cell outputs get preserved across sessions for notebooks created and edited in Colab, but not for notebooks that were created elsewhere and then uploaded. What am I missing? How can I preserve cell outputs across sessions in uploaded notebooks?
Are you trying to open the file from Drive directly in Jupyter?
If so, you'll need to save the full file using the File -> Download ipynb menu item.
By default, Colab saves outputs using a different format to support incremental saves. The Drive file created during auto-save will show outputs only in Colab itself; you'll need to download the full ipynb to export to other notebook-viewing tools.
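
As a quick sanity check (my own sketch, assuming the nbformat package and an illustrative filename), you can confirm whether a downloaded .ipynb actually contains saved outputs:

import nbformat

# count the saved outputs in each code cell of a downloaded notebook
nb = nbformat.read("notebook.ipynb", as_version=4)
for i, cell in enumerate(nb.cells):
    if cell.cell_type == "code":
        print(f"cell {i}: {len(cell.outputs)} output(s)")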