I just set up an IPython (0.13.1) Notebook server in order to use it during my introductory course on Python. The server is password-protected and is currently up and running.
Now, I need my students (about 15 people) to access the same ipynb document at the same time, play around with it and eventually modify the code examples, while making sure no one overwrites the uploaded version of the ipynb file.
How can I set this up?
First, take a look at teaching with the IPython Notebook. Try to list what kinds of applications you want to run on it. Alternatively, it is possible to use cloud computing resources, for example Heroku.
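One common workaround for the overwrite problem, offered here only as a sketch rather than anything from the resources above: give each student their own copy of the master notebook before class, so the uploaded original stays untouched. The paths and student roster below are hypothetical:

# Copy the master notebook into one working directory per student,
# so nobody can overwrite the uploaded original.
import os
import shutil

MASTER = "notebooks/lesson01.ipynb"   # hypothetical master copy
STUDENTS = ["alice", "bob", "carol"]  # hypothetical roster

for name in STUDENTS:
    workdir = os.path.join("students", name)
    if not os.path.isdir(workdir):
        os.makedirs(workdir)
    shutil.copy(MASTER, workdir)

Each student can then open the copy in their own directory from the server's dashboard.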
Goal
I'd like to have a credentials file for things like API keys stored in a place where someone I've shared the Colab notebook with can't access the file or the information.
Situation
I'm calling several APIs in a Colab Notebook and have multiple keys for different APIs. If there are approaches at different levels of complexity, I'd prefer a simpler one.
Current attempts
I'm storing the keys in the main Python notebook, as I'm researching the best way to approach this. I'm pretty new at authentication, so would prefer a simpler solution. I haven't seen any articles addressing this directly.
Greatly appreciate any input on this.
You can store the credential files in your Google Drive.
Only you can access them at /content/drive/MyDrive/ after mounting it. Other people need their own credential files in their own Drive.
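A minimal sketch of that pattern, assuming the keys live in a JSON file in your Drive (the file name api_keys.json and the key names are hypothetical):

# Mount your own Drive; people you share the notebook with cannot
# read it, so the key file stays private to you.
import json
from google.colab import drive

drive.mount('/content/drive')

with open('/content/drive/MyDrive/api_keys.json') as f:  # hypothetical file
    keys = json.load(f)

some_api_key = keys['some_api']  # hypothetical key name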
My friend and I are working on a project together on Google Colab, for which we require a dataset, but we keep running into the same problem while uploading it.
What we're doing right now is uploading it to Drive, giving each other access, and then mounting Google Drive each time. This becomes time-consuming and irritating, as we need to authorize and mount every time.
Is there a better way, so that we can upload the dataset to the home directory and access it directly each time? Or is that not possible because we're assigned a different machine each time?
If you create a new notebook, you can set it to mount Drive automatically, so there's no need to authenticate every time.
See this demo.
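For the manual route, a sketch like this at the top of the shared notebook also helps (the dataset path is hypothetical). Copying the data to the Colab VM's local disk once per session speeds up reads, and yes, the VM is wiped between sessions, which is why the home directory doesn't persist:

# Mount Google Drive and copy the shared dataset to the VM's local disk.
import shutil
from google.colab import drive

drive.mount('/content/drive')

src = '/content/drive/MyDrive/project/dataset.csv'  # hypothetical shared path
shutil.copy(src, '/content/dataset.csv')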
I was recently working in a notebook on Google Colab and my computer ran out of battery and died. All the progress I had made was not saved anywhere!
I'm very used to Jupyter notebooks, which save my files pretty much every time I execute a cell.
Is there a way to have an equivalent feature in Google Colab?
Autosave is already implemented in Google Colab, but there is a certain delay between the moment you execute a cell and the moment the save occurs.
You can check this yourself by going to File > Revision history, executing a cell, and waiting for the list to refresh.
That being said, I have also experienced loss of data in the past, which I can't explain. It might be a glitch.
As a good practice, I try to save every time I remember.
Good luck.
Autosave every 60 seconds by running this magic command in a new code cell:
%autosave 60
Colab will confirm it when you run the cell by printing: "Autosave changes every 60 seconds"
To display the list of all magic commands, use:
%lsmagic
Additionally, you can open the Quick Reference guide, which describes all the magic commands and what they do, using:
%quickref
Enjoy!
I have some scripts running from a Google Sheet that get data from BigQuery. However, in order to make the files run, I need to manually enable the API every time for a given sheet.
So the question is: How to enable API within the code, so that if I share the GSheet or make a copy I don't have to go to the script editor and enable the API from there?
Thanks
I am a huge fan of this particular use of the Google ecosystem, so I'm happy to help get others up and running using GSheets with BigQuery! Hopefully it is working well for you!
When sharing the sheet with others, there is no need to alter anything in the script editor at all. The scripts should run and query BigQuery without issue; this has been my experience at least. The obvious caveat to this is that the users you share it with must have access to the Google Developer Project that the BigQuery instance is associated with.
However, when copying the sheet, I do not believe it is possible to have it replicate the connection. This is because when the file is copied, it becomes associated with a new Google Developer Project. Thus, you have to go into the script editor, then go to Resources > Developers Console Project and change the project listed to the one in which you have BigQuery enabled.
Hopefully this helps! Sorry I don't have better news for you!
Assume I have a thesis or similar and want to give the audience the possibility to download the code and test it.
Is there a platform for uploading it professionally that also keeps it there permanently (of course, it should not be deleted within a couple of months)?
Thanks for your help...
You can use tools such as GitHub or Bitbucket. These allow you to upload code and even provide version control. Users can download your code directly and use it if they need to.