Warning in xml.etree in openpyxl

I am using openpyxl from a different Python version in the following way:
sys.path.insert(0,
'/remote/Python-2.7.2-shared/linux32/lib/python2.7/site-packages/openpyxl-1.6.1-py2.7.egg')
sys.path.insert(1,
'/remote/Python-2.7.2-shared/linux32/lib/python2.7/site-packages')
I do not receive this warning when I use that particular Python version directly; with the setup above, I get:
Python-2.7.2-shared/linux32/lib/python2.7/site-packages/openpyxl-1.6.1-py2.7.egg/openpyxl/shared/compat/elementtree.py:30:
UserWarning: Unable to import 'xml.etree.cElementree'. Falling back on
'xml.etree.Elementree'
I am reading more than 100 xlsx files; I did manual testing previously and need to provide a quick fix.
As per my understanding, the xlsx files I am reading do not contain any XML elements, so the warning should not affect reading data from them. Can you confirm this, or can I ignore the warning?
One small thing not related to openpyxl: is it possible to hide this warning? I do not have root permission.

You can ignore the warning; it just means that without cElementTree your code may run more slowly. That said, what you are doing is not recommended: Python does support installing packages in user home directories, but using virtual environments (virtualenv) is preferable.
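If you want to silence it rather than just live with it, the standard warnings module can filter it out before openpyxl is imported. A minimal sketch, assuming the pattern below (copied from the start of the message quoted above) matches your warning text:

import warnings

# Suppress the fallback warning; the message pattern is matched against the
# start of the warning text, so the prefix quoted in the question is enough.
warnings.filterwarnings('ignore', message='Unable to import', category=UserWarning)

import openpyxl  # imported after the filter, since the warning fires at import time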

Related

How to read GCS path in local environment

I usually do my machine learning work on Kaggle/Colab; however, I'm trying to modularize my code on GitHub. I face one big problem when I try to read files from GCS.
For example, I have GCS_PATH = "gs://kds-432679f77c5f716920e51fb4289eb7c6d9d6" and wish to do this:
TRAINING_FILENAMES = tf.io.gfile.glob(GCS_PATH + "/train*.tfrec")
However, VS Code throws this error: "in get_matching_files_v2 compat.as_bytes(pattern)) tensorflow.python.framework.errors_impl.UnimplementedError: File system scheme 'gs' not implemented (file: 'gs://kds-432679f77c5f716920e51fb4289eb7c6d9d6/train*.tfrec')"
Everything works fine on Colab but fails immediately in a local environment. I am quite new to this; please advise on how to approach this problem.
This might be due to a version incompatibility in GCS support. You may try downgrading tensorflow_datasets from 3.2.1 to 3.1.0. See a similar issue here:
https://github.com/tensorflow/tensorflow/issues/38477#issuecomment-659279614
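For reference, here is a minimal sketch of what the local read can look like once the gs:// scheme is usable in your environment (for example after the downgrade above) and Google credentials are available. The service-account path is a placeholder; the bucket path is the one from the question.

import os
import tensorflow as tf

# Assumption: a service-account JSON key is available locally; exporting it lets
# the GCS client authenticate gs:// reads.
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = '/path/to/service-account.json'

GCS_PATH = 'gs://kds-432679f77c5f716920e51fb4289eb7c6d9d6'

# Same call as in the question; it only succeeds once TensorFlow's 'gs'
# filesystem scheme is registered in the local environment.
TRAINING_FILENAMES = tf.io.gfile.glob(GCS_PATH + '/train*.tfrec')
print(len(TRAINING_FILENAMES), 'training shards found')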
Good luck!

Want to upload a file to S3 using an Apache Airflow DAG file

I want to make a DAG file (Apache Airflow) for uploading a rar file to an S3 bucket. Has anyone tried this? Please suggest an approach.
I tried the following in my DAG file, but it shows some errors:
from airflow.operators import SimpleHttpOperator, HttpSensor, EmailOperator, S3KeySensor
The error is
/usr/local/lib/python3.6/dist-packages/airflow/utils/helpers.py:439: DeprecationWarning: Importing 'SimpleHttpOperator' directly from 'airflow.operators' has been deprecated. Please import from 'airflow.operators.[operator_module]' instead. Support for direct imports will be dropped entirely in Airflow 2.0.
DeprecationWarning)
/usr/local/lib/python3.6/dist-packages/airflow/utils/helpers.py:439: DeprecationWarning: Importing 'HttpSensor' directly from 'airflow.operators' has been deprecated. Please import from 'airflow.operators.[operator_module]' instead. Support for direct imports will be dropped entirely in Airflow 2.0.
DeprecationWarning)
/usr/local/lib/python3.6/dist-packages/airflow/utils/helpers.py:439: DeprecationWarning: Importing 'EmailOperator' directly from 'airflow.operators' has been deprecated. Please import from 'airflow.operators.[operator_module]' instead. Support for direct imports will be dropped entirely in Airflow 2.0.
How do I solve this issue?
This is simply a warning, not an error. A DeprecationWarning usually hints that something you're doing will work now, but may break in future versions. If your task is failing, ignore these messages and look for a proper error.
Code for operators has always been located under airflow.operators.[operator_module], but it was also made available under airflow.operators directly for convenience. For example, SimpleHttpOperator is defined in https://github.com/apache/airflow/blob/1.10.9/airflow/operators/http_operator.py, so importing it from airflow.operators.http_operator will definitely work. However, importing it from airflow.operators will also work due to the code that currently exists in https://github.com/apache/airflow/blob/1.10.9/airflow/operators/__init__.py#L97-L99, at least for now in your current version of Airflow. Basically, you can address these warnings by updating your imports to the following:
from airflow.operators.http_operator import SimpleHttpOperator
from airflow.operators.email_operator import EmailOperator
from airflow.sensors.http_sensor import HttpSensor
from airflow.sensors.s3_key_sensor import S3KeySensor
Just a heads up: currently only on the master branch, and not yet in any released version, some of the third-party operators and sensors have been moved again. For example, S3KeySensor will be found under airflow/providers/amazon/aws/sensors/s3_key.py. As expected, importing from the "old" path will get you a similar deprecation message; see https://github.com/apache/airflow/blob/97a429f9d0cf740c5698060ad55f11e93cb57b55/airflow/sensors/s3_key_sensor.py#L25-L28.
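To address the original goal of uploading the file itself, here is a minimal sketch of a DAG that pushes a local file to S3 via S3Hook. The connection id, bucket name, and file path are placeholders, not taken from the question, and the import paths assume an Airflow 1.10-style installation.

from datetime import datetime

from airflow import DAG
from airflow.hooks.S3_hook import S3Hook
from airflow.operators.python_operator import PythonOperator

def upload_to_s3():
    # load_file copies the local file into the given bucket under the given key.
    hook = S3Hook(aws_conn_id='aws_default')
    hook.load_file(
        filename='/path/to/archive.rar',
        key='uploads/archive.rar',
        bucket_name='my-example-bucket',
        replace=True,
    )

with DAG(
    dag_id='upload_rar_to_s3',
    start_date=datetime(2020, 1, 1),
    schedule_interval=None,
) as dag:
    upload_task = PythonOperator(
        task_id='upload_file',
        python_callable=upload_to_s3,
    )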

Collectd : Could not find plugin "rrdtool" in /opt/collectd/lib/collectd

This is the first time I am working with collectd.
I have performed the following steps :
Downloaded https://collectd.org/files/collectd-5.5.2.tar.gz
Extracted the tar.
Executed configure
Executed make all install
Changed the collectd.conf in /opt/collectd/etc/collectd.conf
Uncommented the necessary plugin and made changes to file paths.
I have used the following link.
I am getting the above error when I try to run collectd.
However, when I use the csv plugin it works correctly.
As far as I understand, rrdtool is necessary in order to visualize data, and I need it so that I can visualize mine.
Is there any alternative to rrdtool for viewing data in my browser, or any other tool or plugin with which I can visualize my CSV data?
This is what I have figured out after running configure:
Thank you
In that same output you have to find the missing lib.
Dependencies:
collectd(x86-64) = 5.6.0-1.sdl7
libc.so.6(GLIBC_2.14)(64bit)
libdl.so.2()(64bit)
librrd_th.so.4()(64bit)
rtld(GNU_HASH)
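On the side question about viewing the data without rrdtool: the csv plugin writes plain text files (one per metric and day, with an epoch column followed by the value columns), so any generic charting tool can plot them. A minimal sketch, assuming matplotlib is installed; the file path and the single "value" column are placeholders for one of your own csv outputs.

import csv
from datetime import datetime
import matplotlib.pyplot as plt

# Placeholder path; adjust it to the DataDir configured for your csv plugin.
CSV_FILE = '/opt/collectd/var/lib/csv/myhost/memory/memory-used-2016-01-01'

times, values = [], []
with open(CSV_FILE) as fh:
    for row in csv.DictReader(fh):
        times.append(datetime.fromtimestamp(float(row['epoch'])))
        values.append(float(row['value']))  # column name depends on the metric type

plt.plot(times, values)
plt.xlabel('time')
plt.ylabel('value')
plt.show()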

OMP warning when numpy 1.8.0 is packaged with py2exe

import numpy
When I package the above one-line script as a single-executable windowed application using py2exe, I get the following warnings upon launch.
OMP: Warning #178: Function GetModuleHandleEx failed:
OMP: System error #126: The specified module could not be found.
These warnings appear only when I build a single executable (i.e., only when bundle_files=1). Here's my setup.py:
from distutils.core import setup
import py2exe

setup(
    options={'py2exe': {'bundle_files': 1}},
    windows=['testnumpy.py'],
    zipfile=None,
)
This problem started with numpy 1.8.0. When I revert back to 1.6.2, the warnings don't show up.
Usually a single executable packaged by py2exe will catch warnings and tracebacks and save them to a log file, but somehow these warnings are not captured and the app opens a console window to show them. I want to suppress this additional console window.
How can I fix this warning problem?
What I tried (nothing worked):
I tried redirecting sys.stderr.
I searched the numpy source on GitHub for OpenMP, assuming that is what OMP stands for, as mentioned here, but nothing useful came out.
I copied libiomp5md.dll to the same folder as setup.py.
I tried warnings.filterwarnings.
I tried sys.excepthook.
As I wrote in the comment, installing numpy 1.8.1rc1 from SourceForge did fix the issue, although I don't really know what the difference is...
I had this issue with numpy 1.13.1+mkl and scipy 1.19.1. Reverting to numpy 1.8.1rc1 is not an acceptable solution.
I tracked this issue to the scipy.integrate subpackage. The warning message pops up when this package is imported. It seems that perhaps libraries that use MKL don't like being invoked from library.zip, which is where py2exe places packages when using bundle option 2.
The solution is to exclude scipy and numpy in the py2exe setup script, copy their entire package folders into the distribution directory, and add that directory to the system path at the top of the main Python script.
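A minimal sketch of that approach, assuming the excluded packages are copied by hand into a lib folder next to the generated executable (the folder name and script names are placeholders):

# setup.py: keep numpy/scipy out of the bundle entirely.
from distutils.core import setup
import py2exe

setup(
    options={'py2exe': {'bundle_files': 1, 'excludes': ['numpy', 'scipy']}},
    windows=['testnumpy.py'],
    zipfile=None,
)

# Top of testnumpy.py: make the hand-copied packages importable again at runtime.
import os
import sys
sys.path.insert(0, os.path.join(os.path.dirname(sys.executable), 'lib'))

import numpy  # now resolved from the copied folder rather than from library.zip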

Library not loaded - ogr2ogr - topojson (Mike Bostock's d3.js map tutorial)

I'm trying to use ogr2ogr to filter a shapefile. I'm working through Mike Bostock's Let's Make a Map tutorial. A bit of googling - including here - hasn't led to any solutions yet. I'm also VERY new to topojson (and shapefiles in general; my background is in economics/statistical software like Stata), so I'm not sure what I'm doing and where things are going wrong. Either way - here's the error result I'm getting:
dyld: Library not loaded: /usr/local/lib/liblwgeom-2.1.1.dylib
Referenced from: /usr/local/Cellar/libspatialite/4.1.1/lib/libspatialite.5.dylib
Reason: image not found
Trace/BPT trap: 5
No idea what liblwgeom-2.1.1.dylib is, what it does, where to get it, etc. Google hasn't helped much in defining it either.
FWIW, I'm on a Mac, I brew installed npm and gdal, and then npm installed topojson.
Thanks,
a
Edited to add: I just brew reinstalled gdal, because I remembered getting a warning (Caveats). See below:
==> Caveats
For non-homebrew python (2.x), you need to amend your PYTHONPATH like so:
export PYTHONPATH=/usr/local/lib/python2.7/site-packages:$PYTHONPATH
This version of GDAL was built with Python support. In addition to providing
modules that makes GDAL functions available to Python scripts, the Python
binding provides additional command line tools.
I actually tried to run export PYTHONPATH=/usr/local/lib/python2.7/site-packages:$PYTHONPATH literally as-is, and it returned nothing. (Not sure if something happened in the background?) Basically fumbling in the dark!
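For what it's worth, the export line prints nothing by design; it only sets an environment variable for the current shell so that Python can find the GDAL bindings. A minimal sketch, assuming the Homebrew GDAL Python bindings were installed into that site-packages directory, of checking that they actually load:

import sys

# Assumption: this is the site-packages path from the brew caveat above.
sys.path.insert(0, '/usr/local/lib/python2.7/site-packages')

from osgeo import gdal  # GDAL's Python bindings, installed by Homebrew

print(gdal.VersionInfo())  # prints the GDAL version string if the bindings load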