xlrd missing in GCP AI notebooks - pandas

Since today, every AI notebook I provision seems to be missing xlrd, and installing it with conda lands me in package hell. Trying to load an xlsx file, the first error says xlrd is missing and to install it. After installing xlrd, the next error says the current version (>=2) only supports xls, not xlsx.
One of the reasons I'm using GCP AI Platform notebooks is that this used to be hassle-free. Does anyone have a fix, or know what is going on?

According to this documentation, the xlrd library will no longer read anything other than .xls files. They recommend looking at Working with Excel Files in Python for alternatives for reading xlsx files. As #ArthurBorshenko commented on your question, openpyxl is one option.
To install dependencies in AI Platform Notebooks follow the official documentation.
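For what it's worth, here is a minimal sketch of reading an xlsx file with pandas through the openpyxl engine, after a pip install openpyxl (the file name below is just a placeholder):
import pandas as pd
# xlrd >= 2.0 only reads legacy .xls files, so ask pandas to use openpyxl for .xlsx
df = pd.read_excel("my_workbook.xlsx", engine="openpyxl")
print(df.head())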

Related

Code formatter like nb_black for google colab

I know that for Jupyter notebooks and JupyterLab there are code formatter extensions available, such as nb_black or blackcellmagic. However, when I installed them, they don't seem to work on Google Colab.
Do you know if there is any native option in Colab, or an extension that formats code (PEP 8 compliant)?
I don't think there's an extension directly in Colab.
What you could do, though, is to download your notebook, run
pip install -U nbqa
nbqa black notebook.ipynb
and then reupload your (now formatted) notebook to Colab
disclaimer: I'm the author of nbQA
UPDATE: as of version 21.8b0, black runs directly on notebooks, no third-party tool required
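If I read the black release notes correctly, that should come down to something like (the notebook name is a placeholder):
pip install "black[jupyter]"
black notebook.ipynb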
I have tried everything; none of the JupyterLab/Notebook backend hack methods seem to work as of February 2022. However, until that changes, here is a relatively simple workaround:
[Run only once, at startup]
Connect to your drive
from google.colab import drive
drive.mount("/content/drive")
Install black for jupyter
!pip install black[jupyter]
Restart kernel
[Then]
Place your .ipynb file somewhere on your drive
Anytime you want to format your code, run:
!black /content/drive/MyDrive/YOUR_PATH/YOUR_NOTEBOOK.ipynb
Don't save your notebook, hit F5 to refresh the page
Voila!
Now save!

"First steps with Tensorflow", how to access the data files outside of colab?

I'm attempting to run "First steps with Tensorflow" locally, outside of Colab. I'm not really familiar with Colab, so I don't know how to access the "dataframes" such as "california_housing_dataframe", etc. Evidently Colab "knows" how to access the dataframes in the example, but I am attempting to run the exercise natively on my local system.
Thank You
I think you need to have the Pandas library installed locally. Then, I think it would run natively.
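If it helps, the exercise builds the dataframe by downloading a public CSV with pandas, so outside Colab it should be enough to run the same call locally. A minimal sketch, assuming the dataset URL below is still the one used in the exercise (check your copy of the notebook for the exact link):
import pandas as pd
# URL assumed from the "First steps with TensorFlow" exercise; verify it against the notebook
california_housing_dataframe = pd.read_csv(
    "https://download.mlcc.google.com/mledu-datasets/california_housing_train.csv",
    sep=",",
)
print(california_housing_dataframe.describe())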

How to download Keras_contrib (and other packages that is not avaliable in conda) to Pycharm?

I am quite familiar with PyCharm except for one thing that I can't seem to figure out: how to download Keras_contrib, which is not available in conda's default channel or in the conda-forge channel that is also often used.
I have read the following article, which suggests adding an additional channel to conda:
"How to Install a Package in PyCharm when project interpreter is set to conda, and the package is not provided/listed by conda?"
But, as I mentioned, Keras_contrib is not provided by any channel, and I am not quite sure how to download it.
I managed to install Keras_contrib successfully into my environment, which is also the one used by the PyCharm interpreter, but for some reason PyCharm does not recognize it.
I followed the instructions given in https://github.com/keras-team/keras-contrib,
which amount to running setup.py install.
Here are my questions:
By doing this, does it get installed into site-packages automatically? Because I do not see it there.
If I have to do it manually, how come my environment can recognize it but PyCharm cannot?
Is there a default location that the environment and PyCharm usually look at?
Because it would make sense in this case that one may recognize it while the other may not.
How can I download Keras_contrib when it is not available in any well-known channel?
Is there another way to check that the PyCharm interpreter is compatible with my anaconda environment, other than looking at the folder it is linked to?
In my case they link to the same environment, but PyCharm just cannot recognize the package.
I just figured it out.
PyCharm looks at the site-packages of your environment.
I solved the problem of PyCharm not recognizing the package (while the anaconda env can) by copying and pasting Keras_contrib into site-packages. (I still find this strange; if anyone has an answer, feel free to comment.)
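For anyone debugging a similar mismatch, a small sketch of how you might check which interpreter and site-packages each side actually uses (run it both in PyCharm's Python console and in your anaconda prompt; the import name keras_contrib is assumed from the project's README):
import sys
import site

# The interpreter path should be identical in PyCharm and in the conda environment
print("Interpreter:", sys.executable)
print("site-packages:", site.getsitepackages())

# If the import works, this shows where the package actually landed
import keras_contrib
print("keras_contrib location:", keras_contrib.__file__)
If the interpreter paths differ, PyCharm is simply pointed at a different environment. If I remember correctly, the keras-contrib repository also documents installing directly from Git with pip (pip install git+https://www.github.com/keras-team/keras-contrib.git), which drops the package into the site-packages of whichever interpreter runs pip.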

TensorFlow without jupyter notebook

Do I absolutely need to use Jupyter Notebook to run TensorFlow on Windows?
I tried the object detection example with the Jupyter notebook; it works, but I'm not really comfortable with it. I'm used to Notepad++ and running Python directly on Windows without a virtual environment.
I tried to copy-paste all the code, but I ran into many bugs.
No, it is not compulsory to use Jupyter notebook to run Tensorflow on Windows. I personally use PyCharm as my IDE and Anaconda for dependency management (this is completely optional).
I would recommend using a proper IDE instead of Notepad++ because it's much easier to debug in an IDE. You'll also be cloning a lot from Git when you start developing your own model, and the open-source models out there usually have a lot of classes and methods in them (take Google's Inception net, for example).
Another alternative would be to start posting about the bugs you are facing, and then we can all start helping you.
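To illustrate, here is a minimal sketch of a plain script (assuming a TensorFlow 2.x install) that you can write in Notepad++ or any editor and run with python sanity_check.py from a normal command prompt, no notebook involved:
# sanity_check.py - confirms TensorFlow imports and runs outside a notebook
import tensorflow as tf

print("TensorFlow version:", tf.__version__)
print("GPU devices:", tf.config.list_physical_devices("GPU"))

# Tiny computation to confirm the runtime actually executes
x = tf.constant([[1.0, 2.0], [3.0, 4.0]])
print("Matrix product:\n", tf.matmul(x, x).numpy())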

Install Tensorflow pip wheel without internet

I do not have internet access on my Linux computer, therefore I installed TF from source by following TensorFlow Get Started.
I ran into some trouble building trainer_example due to the lack of an internet connection; fortunately, someone from TensorFlow helped me through it by creating local repositories for re2, gemmlowp, jpegsrc v9a, libpng and six, and by modifying WORKSPACE accordingly.
When I try to bazel build pip_package to create the wheel, I think I run into the same problem, but:
- the list of repositories is insanely long (too long to manually mirror each of them), even though they seem to be mostly part of PolymerElements.
Is there an easy workaround?
If you are happy to create a pip package without TensorBoard, you should be able to avoid mirroring the Polymer dependencies by removing the "//tensorflow/tensorboard" entry from the build_pip_package dependencies in tensorflow/tools/pip_package/BUILD.
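After editing the BUILD file, the wheel should build with the usual two steps from the install-from-source guide (the output directory below is just an example); the resulting .whl can then be copied to the offline machine and installed with pip, assuming its Python dependencies (numpy, six, etc.) are already present there:
bazel build -c opt //tensorflow/tools/pip_package:build_pip_package
bazel-bin/tensorflow/tools/pip_package/build_pip_package /tmp/tensorflow_pkg
pip install /tmp/tensorflow_pkg/tensorflow-*.whl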