How do you use ColabTurtle? - google-colaboratory

Although I know I can install ColabTurtle to use Turtle in a Colab notebook, I cannot figure out how.
Could someone please give me an example of how to run turtle code on Colab?

Here is an example notebook.
First, you need to install it.
!pip install ColabTurtle
Then import all of its functions and initialize the turtle:
from ColabTurtle.Turtle import *
initializeTurtle()
Then call normal turtle commands, e.g.
forward(100)
right(90)
forward(100)
right(90)
forward(100)
You can see all supported commands in the code.
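The three forward/right pairs above trace three sides of a square; one more pair closes it. To see the geometry without installing anything, here is a small pure-Python sketch (my own illustration, not part of ColabTurtle) that simulates forward/right and prints the points the turtle visits:

```python
import math

def run(commands, heading=90.0):
    """Simulate turtle-style forward/right calls; return the visited points."""
    x, y = 0.0, 0.0
    pts = [(0.0, 0.0)]
    for name, arg in commands:
        if name == "forward":
            x += arg * math.cos(math.radians(heading))
            y += arg * math.sin(math.radians(heading))
            pts.append((round(x, 6), round(y, 6)))
        elif name == "right":
            heading -= arg  # turning right decreases the CCW heading

    return pts

# The three sides drawn above, plus one more pair to close the square:
path = run([("forward", 100), ("right", 90)] * 4)
print(path)  # the last point is the starting point again
```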


How to get variables from a python script

I'm running this code for ALBERT, one of Google's machine learning models, on Google Colab. At the end of the code, everything is done by running a script via ! (the shell). However, I'd like to do some things with the resulting model after the script has run.
Is there any way to either access or get the script to output particular variables in a way that my code in Colab could access afterwards?
Here's another way of phrasing my question. Putting $HELLO_WORLD into the shell command accesses the HELLO_WORLD variable in my Colab code. Is there any way to get the script to set the HELLO_WORLD variable in my Colab code?
You can use os.environ like this.
import os
os.environ['HELLO_WORLD']='hello world from Python'
Then later
!echo $HELLO_WORLD
# hello world from Python
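Note that the reverse direction doesn't work through os.environ: a variable set inside the child script's environment disappears when the script exits. A common workaround (my own sketch, not from the answer above) is to have the script write its results to a file and read them back in the notebook:

```python
import json
import os
import subprocess
import sys
import tempfile

# Hypothetical child script: it writes its results to a JSON file whose
# path is passed as the first command-line argument.
script = """\
import json, sys

results = {"HELLO_WORLD": "hello world from the script"}
with open(sys.argv[1], "w") as f:
    json.dump(results, f)
"""

with tempfile.TemporaryDirectory() as tmp:
    script_path = os.path.join(tmp, "child.py")
    out_path = os.path.join(tmp, "results.json")
    with open(script_path, "w") as f:
        f.write(script)
    # In a Colab cell this line would be:  !python child.py results.json
    subprocess.run([sys.executable, script_path, out_path], check=True)
    with open(out_path) as f:
        results = json.load(f)

print(results["HELLO_WORLD"])
```

The JSON file plays the role the environment variable cannot: it survives the child process and is readable from the notebook afterwards.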

Please let me know if you were able to resolve the issue.

How do I select Python 3 as the runtime type in Google Colab? There is no such option in it.
Python 3 is now the default. You cannot choose Python 2 anymore.
So, for now, Python = Python3.
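You can confirm this from any notebook cell:

```python
import sys

# Print the interpreter version the runtime is using.
print(sys.version)
```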

Where is dask.datasets?

I'm trying to follow some examples on Dask's website for Bags and DataFrames. The examples rely on dask.datasets, but as far as I can tell my installation doesn't include it: I get an error telling me dask has no attribute datasets, and I don't see it in the dask folder.
I can't find help using Google or Dask's website. Where is datasets?
https://examples.dask.org/bag.html
https://examples.dask.org/dataframe.html
Thanks in advance.
dask.datasets was introduced in version 0.18.2, so check your version to make sure you have 0.18.2 or later. To install the latest release (as of this writing), run:
pip install dask==2.10.1
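If you're not sure which version you have, one way to check programmatically (this sketch assumes Python 3.8+, where importlib.metadata is in the standard library):

```python
from importlib import metadata

def installed_version(package):
    """Return the installed version string of a package, or None if missing."""
    try:
        return metadata.version(package)
    except metadata.PackageNotFoundError:
        return None

print(installed_version("dask"))  # e.g. '2.10.1', or None if dask is absent
```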

python vincent map does not display

I'm trying to use the vincent package to visualize my data (in pandas) in a Jupyter notebook, but I'm having trouble on my initial attempt. Here is the code I'm using (copied from http://wrobstory.github.io/2013/10/mapping-data-python.html):
import vincent
import pandas
world_topo=r'world-countries.topo.json'
geo_data = [{'name': 'countries',
             'url': world_topo,
             'feature': 'world-countries'}]
vis = vincent.Map(geo_data=geo_data, scale=200)
vis.to_json('vega.json')
vis.display()
After I ran the code, nothing was displayed. I checked the type of vis:
vincent.charts.Map
I'm not sure how to proceed here, I appreciate any input on this problem.
I'm not sure how far along you are in implementing this.
Assuming you just used pip to install vincent and tried the code in Python IDLE, you might be missing two important steps:
As far as I know, vincent only generates JSON to be presented using Vega via a Jupyter notebook.
To render with Vega you will need to install:
1) Jupyter and dependencies
2) Vega and dependencies
I was able to do so using these instructions.
Once Jupyter launches, a window opens in the browser; I had to choose 'Python 3' under 'New' and put the code in a cell on that page.
Alternatively, you can use this online Vega renderer. Please also see the Vega docs.
Note that vincent does not seem to be the latest technology for this purpose; its page points to Altair.
Also, I noticed that the JSON generated in 'vega.json' by the code you posted, using the original data, does not render anywhere. That is also an issue, probably because it uses an outdated format, but I am not sure.
I have limited experience with this technology, but I was able to get graphs to render, specifically this, and that is also how it looked for me.
I know this post is old, but I found your error and thought I would answer here to help future users of vincent, as it has worked beautifully for me. I am working with the Anaconda version of vincent and Jupyter notebook.
First, you have to initialize vincent in your notebook:
import vincent
vincent.core.initialize_notebook()
Your next problem is that your URL isn't actually pointing anywhere. For the world map topography you need:
world_topo="https://raw.githubusercontent.com/wrobstory/vincent_map_data/master/world-countries.topo.json"
A decent map printed out for me with those two fixes.
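Since the original failure was a local filename that pointed nowhere, a small defensive sketch (my own helper, not part of vincent) can prefer a local copy of the topojson file and fall back to the hosted one when no local file exists:

```python
from pathlib import Path

# Hosted copy of the world-countries topojson file used above.
REMOTE_TOPO = ("https://raw.githubusercontent.com/wrobstory/"
               "vincent_map_data/master/world-countries.topo.json")

def resolve_topo(local_name="world-countries.topo.json"):
    """Use the local topojson file if present, otherwise the hosted copy."""
    local = Path(local_name)
    return str(local) if local.is_file() else REMOTE_TOPO

world_topo = resolve_topo()
geo_data = [{"name": "countries",
             "url": world_topo,
             "feature": "world-countries"}]
```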

Pyspark not recognized by IntelliJ even though visible in the configured Python SDK

The Python interpreter has been selected.
We can see that pyspark is available (via pip) and visible to that Python interpreter.
However, the interpreter does not recognize the pyspark package.
pyspark is the only package that seems to suffer from this issue: pandas, numpy, sklearn, etc. all work. So what is different about pyspark?
While the following is not really an answer to the original question, it is a partial workaround.
We need to add several environment variables to the Run configuration.
In particular, SPARK_HOME, PYSPARK_SUBMIT_ARGS (set to pyspark-shell), and PYTHONPATH are needed.
It is inconvenient to have to set these for every pyspark run configuration, but it works as a last resort.
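As an alternative to editing every Run configuration, the same variables can be set at the top of the script itself. This is only a sketch: the SPARK_HOME path and the py4j zip name below are hypothetical and must match your actual Spark installation:

```python
import os
import sys

SPARK_HOME = "/opt/spark"  # hypothetical path; point this at your install

os.environ["SPARK_HOME"] = SPARK_HOME
os.environ["PYSPARK_SUBMIT_ARGS"] = "pyspark-shell"

# Make pyspark and its bundled py4j importable. The py4j zip name is
# version-specific, so check $SPARK_HOME/python/lib for yours.
sys.path.insert(0, os.path.join(SPARK_HOME, "python", "lib",
                                "py4j-0.10.9-src.zip"))
sys.path.insert(0, os.path.join(SPARK_HOME, "python"))
```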