I had a wx script working on Windows XP (at work). The machine was upgraded to 64-bit Windows 7, and I installed Python 2 and wxPython (both 32-bit). Now my script won't run; it says "ImportError: NumPy not found.". So I installed numpy from numpy.org, but that didn't change anything. I can import wx and I can import numpy, but when I try to run my wx script it says that numpy is not installed. I removed and reinstalled everything, but nothing changed.
What should I do?
Presumably your numpy is too "new" or your wxPython is too old.
For example the combination wxPython < 3.0 and numpy > 1.9 will not work for the plot module (2.9.5 + numpy 1.8.0 and 3.0.2 + numpy 1.9.2 do actually work).
The reason should be this code in <site-packages>/wx/lib/plot.py (2.9.5):
# Needs NumPy
try:
    import numpy.oldnumeric as _Numeric
except:
    msg= """
    This module requires the NumPy module, which could not be
    imported. It probably is not installed (it's not part of the
    standard Python distribution). See the Numeric Python site
    (http://numpy.scipy.org) for information on downloading source or
    binaries."""
    raise ImportError, "NumPy not found.\n" + msg
and as used in 3.0.2:
# Needs NumPy
try:
    import numpy as np
except:
numpy.oldnumeric is no longer part of numpy 1.9.2; wx.lib.plot was developed for ancient array libraries, and you can clearly see its age.
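A minimal diagnostic sketch based on the version pairs above; it only prints the installed versions and shows whether wx.lib.plot imports:

# Illustrative check only: wxPython < 3.0 relies on numpy.oldnumeric,
# which was removed from newer numpy releases, so wx.lib.plot raises
# "NumPy not found." for mismatched combinations.
import numpy
import wx

print("wxPython:", wx.version())
print("numpy:", numpy.__version__)

try:
    from wx.lib import plot
    print("wx.lib.plot imported OK")
except ImportError as exc:
    print("wx.lib.plot failed to import:", exc)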
I am using Colab to run text analysis code. I want to get universal-sentence-encoder-large from tensorflow_hub.
But any time I run the block containing the code below:
module = hub.Module("https://tfhub.dev/google/universal-sentence-encoder-large/3")
I get this error:
RuntimeError: variable_scope module_8/ was unused but the
corresponding name_scope was already taken.
I would appreciate any ideas on how this error can be fixed.
TF Hub USE-3 Module doesn't work with Tensorflow Version 2.0.
Hence, if you change the version from 2.0 to 1.15, it works without any error.
Please find the working code mentioned below:
!pip install tensorflow==1.15
!pip install "tensorflow_hub>=0.6.0"
!pip3 install tensorflow_text==1.15
import tensorflow as tf
import tensorflow_hub as hub
import numpy as np
import tensorflow_text
module = hub.Module("https://tfhub.dev/google/universal-sentence-encoder-large/3")
Please find the GitHub Gist of the Google Colab notebook as well.
With TensorFlow 2 in Google Colab you should use hub.load(url) instead of hub.Module(url).
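A minimal sketch of the TF2 style (this assumes the TF2-compatible /5 release of the encoder; the /3 module used above is in the older TF1 Hub format):

# Sketch only: in TF2, hub.load() returns a callable SavedModel
# rather than the TF1-style hub.Module.
import tensorflow_hub as hub

embed = hub.load("https://tfhub.dev/google/universal-sentence-encoder-large/5")
embeddings = embed(["Hello world.", "Universal Sentence Encoder embeddings."])
print(embeddings.shape)  # (2, 512)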
I'm attempting to read a CSV file using Modin, and it results in the following error. This issue seems to happen on all DataFrame operations:
RayWorkerError: The worker died unexpectedly while executing this
task.
Python 3.7.3
Pandas 0.24.2
Modin 0.5.4
Ray 0.7.1
import modin.pandas as pd
import numpy as np
frame_data = np.random.randint(0, 100, size=(2**10, 2**8))
pd.DataFrame(frame_data).to_csv('frame_data.csv')
pd.read_csv('frame_data.csv').head()
OP confirmed that the reason for the failure was the presence of the typing package, and that uninstalling typing fixed the issue. That was a temporary fix for the issue tracked on Ray here. That issue was closed once Modin fixed the order of imports for the typing library. The latest version of Modin (0.12.0) should not have that problem.
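If you want to verify whether the PyPI backport of typing is the one being picked up before uninstalling it, a small hypothetical diagnostic:

# Hypothetical diagnostic: on Python 3.5+ the standard-library 'typing'
# module lives in the interpreter's lib directory; a path under
# site-packages means the PyPI backport is installed and can be removed
# with `pip uninstall typing`.
import typing
print(typing.__file__)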
Please read carefully. In my Python script I have the following:
import json
import pandas
from pandas.io.json import json_normalize
and it returns the following error:
from pandas.io.json import json_normalize
ModuleNotFoundError: No module named 'pandas.io'; 'pandas' is not a package
My steps:
I have uninstalled and installed Pandas
I have upgraded pip and pandas
I have installed io (pip install -U pandas.io)
I have installed data_reader and replaced the pandas.io.json part with that: from pandas_datareader import json_normalize
I have tried every solution I saw on Stack Overflow and GitHub and nothing worked. The only one I have not tried is installing Anaconda, but it should work with what I tried before. Do you think there is a Windows setting I must change?
PS: My Python version is 3.7.4
Try:
Go to ...\Lib\site-packages\pytrends on your local disk and open the file request.py.
Change
from pandas.io.json._normalize import nested_to_record
to
from pandas.io.json.normalize import nested_to_record
I had the same error, and this helped me.
Also, depending on your pandas version, you may need to change
from pandas.io.json.normalize
to
from pandas.io.json._normalize
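For reference, on recent pandas releases (1.0 and later) json_normalize is exposed at the top level, so imports like the ones above can usually be replaced with a direct call; a minimal sketch assuming pandas >= 1.0:

# Minimal sketch, assuming pandas >= 1.0 where json_normalize is public API.
import pandas as pd

records = [{"id": 1, "info": {"name": "a"}}, {"id": 2, "info": {"name": "b"}}]
print(pd.json_normalize(records))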
The cause of the problem was that the Python file had the name pandas: the filename was pandas.py, so it shadowed the real package. After renaming it, the code worked normally without errors.
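A quick way to confirm this kind of shadowing (diagnostic sketch only): print where the import actually resolved; a path pointing at your own script rather than site-packages means a local pandas.py is being imported instead of the library.

# Diagnostic sketch: shows which file the 'pandas' import resolved to.
import pandas
print(pandas.__file__)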
I had the same problem and solved it by uninstalling the extra Python versions installed on my Windows machine. Now I have only the one Python installed by Anaconda, and everything is working perfectly.
My R version is 3.4.1, python version is 3.5.2 , and OS is Ubuntu 16.04.2
I set RPYTHON_PYTHON_VERSION=3.5 when installing rPython, so that is the default Python version rPython uses.
> python.exec('import sys')
> python.exec('print(sys.version)')
3.5.2 (default, Nov 17 2016, 17:05:23)
[GCC 5.4.0 20160609]
When I import numpy through rPython (there is no issue with import numpy directly in Python 3.5; everything works fine there), I get this:
> python.exec('import numpy')
Error in python.exec("import numpy") :
Importing the multiarray numpy extension module failed. Most
likely you are trying to import a failed build of numpy.
If you're working with a numpy git repo, try `git clean -xdf` (removes all
files not under version control). Otherwise reinstall numpy.
Original error was: /usr/local/lib/python3.5/dist-packages/numpy/core/multiarray.cpython-35m-x86_64-linux-gnu.so: undefined symbol: PyType_GenericNew
However, if I set RPYTHON_PYTHON_VERSION=2 and reinstall rPython, the import numpy works. How can I successfully import numpy under rPython with python 3.5?
First off, can you import any packages into python 3.5.3 from R/rPython?
I am also having this problem. The error I get is exactly the same as the poster's (numpy won't load). I later found that I cannot import any packages at all. I can, however, import packages in Python 2.7.13 and Python 3.5.3 directly (just not through R/rPython). This leads me to believe that this is an error in the rPython R package. Here are the things I have tried in order to fix it:
1) I have tried installing/reinstalling the R package rPython to use either python 2.7.13 or python 3.5.3. I could connect R to python 2.7.13 via reinstall of the rPython package:
install.packages("rPython",lib= "home/myusername/R/x86_64-pc-linux-gnu-library/3.4", configure.vars= "RPYTHON_PYTHON_VERSION=2")
Using "RPYTHON_PYTHON_VERSION=3" during install similarly allowed me to connect R with python 3.5.3. I could call "import numpy" from R when rPython was connected to python 2.7.13, but not when connected with 3.5.3.
2) I have tracked down all copies of numpy and scipy that had previously been installed and uninstalled them. I had several copies of each for both Python 2.7.13 and Python 3.5.3. Reinstalling using pip and pip3 did not fix the problem (I restarted R beforehand to be safe).
From both accounts this seems to be a problem with the R package 'rPython'. You could try the newer 'reticulate' package for R and see if it works better for you. However, I have not been able to get parallel threads to work when using reticulate to connect R with Python, and that is unfortunately what I need to do. Threading did, however, work perfectly when using 'rPython', but the package I need requires Python 3+. I will keep troubleshooting and update this post if I am able to solve it. In the meantime, give 'reticulate' a shot; it is a very neat package.
EDIT
I was able to load numpy from python 3.5.3 in R using the 'reticulate' package.
EDIT 2: For those who find this post in the future, the only solution I could find to run Python 3 code with multithreading from R was to call Python files with system("python3 path_to_python_script arg1 arg2 arg3").
Using Enthought Canopy, the command import pandas produces this error message:
ImportError: C extension: hashtable not built. If you want to import pandas
from the source directory, you may need to run 'python setup.py build_ext --inplace'
to build the C extensions first.
I understand this means that the package hasn't been built with its C dependencies? I thought Canopy's environment handled module installations. I have tried removing and updating Pandas with no luck.
Does anyone know how to correctly use Pandas in Enthought Canopy?
Forcing a reinstallation of Pandas and its dependencies with enpkg pandas --forceall run from a Canopy Terminal/Command Prompt seems to have fixed the problem.