The command I typed is "python3 -m pip install -r requirements.txt".
I think the first error is that the library mkl_rt is missing, but I'm not sure how to add it.
Complete log:
Defaulting to user installation because normal site-packages is not writeable
Collecting Flask==1.1.1
Using cached Flask-1.1.1-py2.py3-none-any.whl (94 kB)
Collecting imutils==0.5.3
Using cached imutils-0.5.3.tar.gz (17 kB)
Preparing metadata (setup.py) ... done
Collecting Keras==2.4.0
Using cached Keras-2.4.0-py2.py3-none-any.whl (170 kB)
Collecting opencv-python==4.4.0.46
Using cached opencv-python-4.4.0.46.tar.gz (88.9 MB)
Installing build dependencies ... error
error: subprocess-exited-with-error
× pip subprocess to install build dependencies did not run successfully.
│ exit code: 1
╰─> [4837 lines of output]
Ignoring numpy: markers 'python_version == "3.6"' don't match your environment
Ignoring numpy: markers 'python_version == "3.7"' don't match your environment
Ignoring numpy: markers 'python_version >= "3.9"' don't match your environment
Collecting setuptools
Using cached setuptools-63.4.2-py3-none-any.whl (1.2 MB)
Collecting wheel
Using cached wheel-0.37.1-py2.py3-none-any.whl (35 kB)
Collecting scikit-build
Using cached scikit_build-0.15.0-py2.py3-none-any.whl (77 kB)
Collecting cmake
Using cached cmake-3.24.0-py2.py3-none-macosx_10_10_universal2.macosx_10_10_x86_64.macosx_11_0_arm64.macosx_11_0_universal2.whl (77.9 MB)
Collecting pip
Using cached pip-22.2.2-py3-none-any.whl (2.0 MB)
Collecting numpy==1.17.3
Using cached numpy-1.17.3.zip (6.4 MB)
Preparing metadata (setup.py): started
Preparing metadata (setup.py): finished with status 'done'
Collecting distro
Using cached distro-1.7.0-py3-none-any.whl (20 kB)
Collecting packaging
Using cached packaging-21.3-py3-none-any.whl (40 kB)
Collecting pyparsing!=3.0.5,>=2.0.2
Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Building wheels for collected packages: numpy
Building wheel for numpy (setup.py): started
Building wheel for numpy (setup.py): finished with status 'error'
error: subprocess-exited-with-error
× python setup.py bdist_wheel did not run successfully.
│ exit code: 1
╰─> [4428 lines of output]
Running from numpy source directory.
blas_opt_info:
blas_mkl_info:
customize UnixCCompiler
libraries mkl_rt not found in ['/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.8/lib', '/usr/lib']
NOT AVAILABLE
..............
For now I'm going to try to install the requirements without the file.
This error occurs when there is a mismatch between the versions of Python and/or pip and the versions of the packages you're trying to install.
If you know the version of Python used to generate the requirements.txt file that you have, please make sure to use the same version. Otherwise, try upgrading your Python version.
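For example, a minimal sketch of pinning the interpreter (the 3.8 here is an assumption; use whatever version the requirements.txt was written for):

```shell
# Sketch: build a fresh venv from the interpreter version the
# requirements were written for (3.8 is an assumption; adjust it)
python3.8 -m venv .venv
source .venv/bin/activate
python -m pip install --upgrade pip
python -m pip install -r requirements.txt
```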
When I tried running pip install pandas===1.1.2, it throws this long error:
Collecting pandas===1.1.2
Using cached pandas-1.1.2.tar.gz (5.2 MB)
Installing build dependencies ... error
error: subprocess-exited-with-error
× pip subprocess to install build dependencies did not run successfully.
│ exit code: 1
╰─> [3794 lines of output]
Ignoring numpy: markers 'python_version == "3.6" and platform_system != "AIX"' don't match your environment
Ignoring numpy: markers 'python_version == "3.7" and platform_system != "AIX"' don't match your environment
Ignoring numpy: markers 'python_version == "3.6" and platform_system == "AIX"' don't match your environment
Ignoring numpy: markers 'python_version == "3.7" and platform_system == "AIX"' don't match your environment
Ignoring numpy: markers 'python_version >= "3.8" and platform_system == "AIX"' don't match your environment
Collecting setuptools
Using cached setuptools-65.4.1-py3-none-any.whl (1.2 MB)
Collecting wheel
Using cached wheel-0.37.1-py2.py3-none-any.whl (35 kB)
Collecting Cython<3,>=0.29.16
Using cached Cython-0.29.32-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl (1.9 MB)
Collecting numpy==1.17.3
Using cached numpy-1.17.3.zip (6.4 MB)
Preparing metadata (setup.py): started
Preparing metadata (setup.py): finished with status 'done'
Building wheels for collected packages: numpy
Building wheel for numpy (setup.py): started
Building wheel for numpy (setup.py): finished with status 'error'
error: subprocess-exited-with-error
× python setup.py bdist_wheel did not run successfully.
│ exit code: 1
╰─> [3080 lines of output]
Running from numpy source directory.
blas_opt_info:
blas_mkl_info:
customize UnixCCompiler
libraries mkl_rt not found in ['/usr/local/lib', '/usr/lib64', '/usr/lib', '/usr/lib/x86_64-linux-gnu']
NOT AVAILABLE
...........
note: This error originates from a subprocess, and is likely not a problem with pip.
error: legacy-install-failure
× Encountered error while trying to install package.
╰─> numpy
note: This is an issue with the package mentioned above, not pip.
hint: See above for output from the failure.
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error
× pip subprocess to install build dependencies did not run successfully.
│ exit code: 1
╰─> See above for output.
note: This error originates from a subprocess, and is likely not a problem with pip.
I cut the error short because of post length limitations. I tried downgrading the Python version from 3.10 to 3.8, but the issue still persists. Also, I tried running pip install numpy on its own and it installed fine. So does anyone know how to fix this issue?
EDIT
Another main issue was that python is version 3.8 but python3 is 3.10. So I downgraded python3 to 3.8, and pip install pandas===1.1.2 worked perfectly.
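For reference, invoking pip through a specific interpreter is a way to sidestep the python/python3 mismatch without downgrading anything (the python3.8 command name is an assumption; use whichever interpreter you actually have installed):

```shell
# Sketch: run pip as a module of the exact interpreter you want,
# instead of relying on whatever "python3" happens to point to
python3.8 -m pip install pandas===1.1.2
python3.8 -c "import pandas; print(pandas.__version__)"
```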
Could you try installing an older version of pip and then trying again?
pip install pip==21.3.1
I am using the steps from the Hugging Face website (https://huggingface.co/docs/transformers/installation) to start using Hugging Face in Visual Studio Code and install the transformers library.
I was on the last step, where I had to type "pip install transformers[flax]". I got an error, so I installed the Rust toolchain; however, I still ended up getting an error:
Requirement already satisfied: transformers[flax] in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (4.22.2)
Requirement already satisfied: filelock in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from transformers[flax]) (3.8.0)
Requirement already satisfied: requests in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from transformers[flax]) (2.28.1)
Requirement already satisfied: tokenizers!=0.11.3,<0.13,>=0.11.1 in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from transformers[flax]) (0.12.1)
Requirement already satisfied: huggingface-hub<1.0,>=0.9.0 in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from transformers[flax]) (0.10.0)
Requirement already satisfied: packaging>=20.0 in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from transformers[flax]) (21.3)
Requirement already satisfied: tqdm>=4.27 in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from transformers[flax]) (4.64.1)
Requirement already satisfied: regex!=2019.12.17 in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from
transformers[flax]) (2022.9.13)
Requirement already satisfied: numpy>=1.17 in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from transformers[flax]) (1.23.3)
Requirement already satisfied: pyyaml>=5.1 in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from transformers[flax]) (6.0)
Collecting transformers[flax]
Using cached transformers-4.22.1-py3-none-any.whl (4.9 MB)
Using cached transformers-4.22.0-py3-none-any.whl (4.9 MB)
Using cached transformers-4.21.3-py3-none-any.whl (4.7 MB)
Using cached transformers-4.21.2-py3-none-any.whl (4.7 MB)
Using cached transformers-4.21.1-py3-none-any.whl (4.7 MB)
Using cached transformers-4.21.0-py3-none-any.whl (4.7 MB)
Using cached transformers-4.20.1-py3-none-any.whl (4.4 MB)
Using cached transformers-4.20.0-py3-none-any.whl (4.4 MB)
Using cached transformers-4.19.4-py3-none-any.whl (4.2 MB)
Using cached transformers-4.19.3-py3-none-any.whl (4.2 MB)
Using cached transformers-4.19.2-py3-none-any.whl (4.2 MB)
Using cached transformers-4.19.1-py3-none-any.whl (4.2 MB)
Using cached transformers-4.19.0-py3-none-any.whl (4.2 MB)
Using cached transformers-4.18.0-py3-none-any.whl (4.0 MB)
Collecting sacremoses
Using cached sacremoses-0.0.53-py3-none-any.whl
Collecting jax!=0.3.2,>=0.2.8
Using cached jax-0.3.21.tar.gz (1.1 MB)
Preparing metadata (setup.py) ... done
Collecting flax>=0.3.5
Using cached flax-0.6.1-py3-none-any.whl (185 kB)
Collecting optax>=0.0.8
Using cached optax-0.1.3-py3-none-any.whl (145 kB)
Collecting transformers[flax]
Using cached transformers-4.17.0-py3-none-any.whl (3.8 MB)
Using cached transformers-4.16.2-py3-none-any.whl (3.5 MB)
Using cached transformers-4.16.1-py3-none-any.whl (3.5 MB)
Using cached transformers-4.16.0-py3-none-any.whl (3.5 MB)
Using cached transformers-4.15.0-py3-none-any.whl (3.4 MB)
Collecting tokenizers<0.11,>=0.10.1
Using cached tokenizers-0.10.3.tar.gz (212 kB)
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
Collecting transformers[flax]
Using cached transformers-4.14.1-py3-none-any.whl (3.4 MB)
Using cached transformers-4.13.0-py3-none-any.whl (3.3 MB)
Using cached transformers-4.12.5-py3-none-any.whl (3.1 MB)
Using cached transformers-4.12.4-py3-none-any.whl (3.1 MB)
Using cached transformers-4.12.3-py3-none-any.whl (3.1 MB)
Using cached transformers-4.12.2-py3-none-any.whl (3.1 MB)
Using cached transformers-4.12.1-py3-none-any.whl (3.1 MB)
Using cached transformers-4.12.0-py3-none-any.whl (3.1 MB)
Using cached transformers-4.11.3-py3-none-any.whl (2.9 MB)
Using cached transformers-4.11.2-py3-none-any.whl (2.9 MB)
Using cached transformers-4.11.1-py3-none-any.whl (2.9 MB)
Using cached transformers-4.11.0-py3-none-any.whl (2.9 MB)
Using cached transformers-4.10.3-py3-none-any.whl (2.8 MB)
Using cached transformers-4.10.2-py3-none-any.whl (2.8 MB)
Using cached transformers-4.10.1-py3-none-any.whl (2.8 MB)
Using cached transformers-4.10.0-py3-none-any.whl (2.8 MB)
Using cached transformers-4.9.2-py3-none-any.whl (2.6 MB)
Collecting huggingface-hub==0.0.12
Using cached huggingface_hub-0.0.12-py3-none-any.whl (37 kB)
Collecting transformers[flax]
Using cached transformers-4.9.1-py3-none-any.whl (2.6 MB)
Using cached transformers-4.9.0-py3-none-any.whl (2.6 MB)
Using cached transformers-4.8.2-py3-none-any.whl (2.5 MB)
Using cached transformers-4.8.1-py3-none-any.whl (2.5 MB)
Using cached transformers-4.8.0-py3-none-any.whl (2.5 MB)
Using cached transformers-4.7.0-py3-none-any.whl (2.5 MB)
Collecting huggingface-hub==0.0.8
Using cached huggingface_hub-0.0.8-py3-none-any.whl (34 kB)
Collecting transformers[flax]
Using cached transformers-4.6.1-py3-none-any.whl (2.2 MB)
Using cached transformers-4.6.0-py3-none-any.whl (2.3 MB)
Using cached transformers-4.5.1-py3-none-any.whl (2.1 MB)
Using cached transformers-4.5.0-py3-none-any.whl (2.1 MB)
Using cached transformers-4.4.2-py3-none-any.whl (2.0 MB)
Using cached transformers-4.4.1-py3-none-any.whl (2.1 MB)
Using cached transformers-4.4.0-py3-none-any.whl (2.1 MB)
Using cached transformers-4.3.3-py3-none-any.whl (1.9 MB)
Using cached transformers-4.3.2-py3-none-any.whl (1.8 MB)
Using cached transformers-4.3.1-py3-none-any.whl (1.8 MB)
Using cached transformers-4.3.0-py3-none-any.whl (1.8 MB)
Using cached transformers-4.2.2-py3-none-any.whl (1.8 MB)
Collecting tokenizers==0.9.4
Using cached tokenizers-0.9.4.tar.gz (184 kB)
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
Collecting transformers[flax]
Using cached transformers-4.2.1-py3-none-any.whl (1.8 MB)
Using cached transformers-4.2.0-py3-none-any.whl (1.8 MB)
Using cached transformers-4.1.1-py3-none-any.whl (1.5 MB)
Using cached transformers-4.1.0-py3-none-any.whl (1.5 MB)
Using cached transformers-4.0.1-py3-none-any.whl (1.4 MB)
Collecting flax==0.2.2
Using cached flax-0.2.2-py3-none-any.whl (148 kB)
Collecting transformers[flax]
Using cached transformers-4.0.0-py3-none-any.whl (1.4 MB)
Using cached transformers-3.5.1-py3-none-any.whl (1.3 MB)
Requirement already satisfied: protobuf in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from transformers[flax]) (3.19.6)
Collecting sentencepiece==0.1.91
Using cached sentencepiece-0.1.91.tar.gz (500 kB)
Preparing metadata (setup.py) ... done
Collecting tokenizers==0.9.3
Using cached tokenizers-0.9.3.tar.gz (172 kB)
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
Collecting transformers[flax]
Using cached transformers-3.5.0-py3-none-any.whl (1.3 MB)
Using cached transformers-3.4.0-py3-none-any.whl (1.3 MB)
Collecting tokenizers==0.9.2
Using cached tokenizers-0.9.2.tar.gz (170 kB)
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
Collecting sentencepiece!=0.1.92
Using cached sentencepiece-0.1.97-cp310-cp310-win_amd64.whl (1.1 MB)
Collecting transformers[flax]
Using cached transformers-3.3.1-py3-none-any.whl (1.1 MB)
WARNING: transformers 3.3.1 does not provide the extra 'flax'
Collecting tokenizers==0.8.1.rc2
Using cached tokenizers-0.8.1rc2.tar.gz (97 kB)
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
Requirement already satisfied: colorama in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from tqdm>=4.27->transformers[flax]) (0.4.5)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from packaging>=20.0->transformers[flax]) (3.0.9)
Requirement already satisfied: idna<4,>=2.5 in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from requests->transformers[flax]) (3.4)
Requirement already satisfied: charset-normalizer<3,>=2 in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from requests->transformers[flax]) (2.1.1)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from requests->transformers[flax]) (1.26.12)
Requirement already satisfied: certifi>=2017.4.17 in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from requests->transformers[flax]) (2022.9.24)
Collecting joblib
Using cached joblib-1.2.0-py3-none-any.whl (297 kB)
Requirement already satisfied: six in c:\users\user\desktop\artificial intelligence\.env\lib\site-packages (from sacremoses->transformers[flax]) (1.16.0)
Collecting click
Using cached click-8.1.3-py3-none-any.whl (96 kB)
Building wheels for collected packages: tokenizers
Building wheel for tokenizers (pyproject.toml) ... error
error: subprocess-exited-with-error
× Building wheel for tokenizers (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [48 lines of output]
C:\Users\user\AppData\Local\Temp\pip-build-env-hhrbpvks\overlay\Lib\site-packages\setuptools\dist.py:530: UserWarning: Normalizing '0.8.1.rc2' to '0.8.1rc2'
warnings.warn(tmpl.format(**locals()))
running bdist_wheel
running build
running build_py
creating build
creating build\lib.win-amd64-cpython-310
creating build\lib.win-amd64-cpython-310\tokenizers
copying tokenizers\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers
creating build\lib.win-amd64-cpython-310\tokenizers\models
copying tokenizers\models\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\models
creating build\lib.win-amd64-cpython-310\tokenizers\decoders
copying tokenizers\decoders\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\decoders
creating build\lib.win-amd64-cpython-310\tokenizers\normalizers
copying tokenizers\normalizers\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\normalizers
creating build\lib.win-amd64-cpython-310\tokenizers\pre_tokenizers
copying tokenizers\pre_tokenizers\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\pre_tokenizers
creating build\lib.win-amd64-cpython-310\tokenizers\processors
copying tokenizers\processors\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\processors
creating build\lib.win-amd64-cpython-310\tokenizers\trainers
copying tokenizers\trainers\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\trainers
creating build\lib.win-amd64-cpython-310\tokenizers\implementations
copying tokenizers\implementations\base_tokenizer.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
copying tokenizers\implementations\bert_wordpiece.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
copying tokenizers\implementations\byte_level_bpe.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
copying tokenizers\implementations\char_level_bpe.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
copying tokenizers\implementations\sentencepiece_bpe.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
copying tokenizers\implementations\__init__.py -> build\lib.win-amd64-cpython-310\tokenizers\implementations
copying tokenizers\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers
copying tokenizers\models\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\models
copying tokenizers\decoders\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\decoders
copying tokenizers\normalizers\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\normalizers
copying tokenizers\pre_tokenizers\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\pre_tokenizers
copying tokenizers\processors\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\processors
copying tokenizers\trainers\__init__.pyi -> build\lib.win-amd64-cpython-310\tokenizers\trainers
running build_ext
running build_rust
error: can't find Rust compiler
If you are using an outdated pip version, it is possible a prebuilt wheel is available for this package but pip is not able to install from it. Installing from the wheel would avoid the need for a Rust compiler.
To update pip, run:
pip install --upgrade pip
and then retry package installation.
If you did intend to build this package from source, try installing a Rust compiler from your system package manager and
ensure it is on the PATH during installation. Alternatively, rustup (available at https://rustup.rs) is the recommended way to
download and update the Rust compiler toolchain.
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects
Do you know how I can successfully install this into VS Code and use Hugging Face properly?
If you did intend to build this package from source, try installing a Rust compiler from your system package manager and
ensure it is on the PATH during installation. Alternatively, rustup (available at https://rustup.rs) is the recommended way to
download and update the Rust compiler toolchain.
[end of output]
That's the primary error that you're having. You're going to need to install the Rust compiler in order to finish the install.
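A sketch of that fix (rustup, from https://rustup.rs, is the recommended installer; on Windows you run rustup-init.exe instead of the shell one-liner):

```shell
# Sketch: install the Rust toolchain, then retry in a fresh shell so
# rustc/cargo are on PATH (on Windows, run rustup-init.exe instead)
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
pip install --upgrade pip        # a newer pip may find a prebuilt wheel
pip install "transformers[flax]"
```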
Can anybody find me a way to install this package? All the solutions I found by searching were specific to the package named in the question.
I have Windows 10 and I use PyCharm as my IDE.
The problem seems to be in a package named cryptacular.
I already tried downloading and installing it manually, but it didn't work.
'''
Collecting pyshop
Using cached pyshop-1.3.0-py3-none-any.whl
Collecting docutils
Using cached docutils-0.18.1-py2.py3-none-any.whl (570 kB)
Collecting pyramid-filterwarnings
Using cached pyramid_filterwarnings-0.4.tar.gz (3.6 kB)
Preparing metadata (setup.py) ... done
Collecting zope.sqlalchemy
Using cached zope.sqlalchemy-1.6-py2.py3-none-any.whl (22 kB)
Collecting pyramid-tm
Using cached pyramid_tm-2.5-py2.py3-none-any.whl (6.5 kB)
Collecting pyramid-jinja2
Using cached pyramid_jinja2-2.10-py3-none-any.whl (43 kB)
Collecting pyramid-rpc
Using cached pyramid_rpc-0.8-py2.py3-none-any.whl (24 kB)
Collecting requests
Using cached requests-2.27.1-py2.py3-none-any.whl (63 kB)
Collecting cryptacular
Using cached cryptacular-1.6.2.tar.gz (75 kB)
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
Requirement already satisfied: setuptools in c:\users\meir israeli\appdata\local\programs\python\python310\lib\site-packages (from pyshop) (58.1.0)
Collecting SQLAlchemy
Using cached SQLAlchemy-1.4.36-cp310-cp310-win_amd64.whl (1.6 MB)
Collecting pyramid>=1.5
Using cached pyramid-2.0-py3-none-any.whl (246 kB)
Collecting plaster
Using cached plaster-1.0-py2.py3-none-any.whl (14 kB)
Collecting zope.deprecation>=3.5.0
Using cached zope.deprecation-4.4.0-py2.py3-none-any.whl (10 kB)
Collecting webob>=1.8.3
Using cached WebOb-1.8.7-py2.py3-none-any.whl (114 kB)
Collecting hupper>=1.5
Using cached hupper-1.10.3-py2.py3-none-any.whl (26 kB)
Collecting plaster-pastedeploy
Using cached plaster_pastedeploy-0.7-py2.py3-none-any.whl (7.8 kB)
Collecting zope.interface>=3.8.0
Using cached zope.interface-5.4.0.tar.gz (249 kB)
Preparing metadata (setup.py) ... done
Collecting venusian>=1.0
Using cached venusian-3.0.0-py3-none-any.whl (13 kB)
Collecting translationstring>=0.4
Using cached translationstring-1.4-py2.py3-none-any.whl (15 kB)
Collecting pbkdf2
Using cached pbkdf2-1.3-py3-none-any.whl
Collecting markupsafe
Using cached MarkupSafe-2.1.1-cp310-cp310-win_amd64.whl (17 kB)
Collecting jinja2!=2.11.0,!=2.11.1,!=2.11.2,>=2.5.0
Using cached Jinja2-3.1.2-py3-none-any.whl (133 kB)
Collecting transaction>=2.0
Using cached transaction-3.0.1-py2.py3-none-any.whl (47 kB)
Collecting charset-normalizer~=2.0.0
Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting urllib3<1.27,>=1.21.1
Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting certifi>=2017.4.17
Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting greenlet!=0.4.17
Using cached greenlet-1.1.2-cp310-cp310-win_amd64.whl (101 kB)
Collecting PasteDeploy>=2.0
Using cached PasteDeploy-2.1.1-py2.py3-none-any.whl (17 kB)
Building wheels for collected packages: cryptacular, pyramid-filterwarnings, zope.interface
Building wheel for cryptacular (pyproject.toml) ... done
WARNING: Building wheel for cryptacular failed: [Errno 2] No such file or directory: 'C:\\Users\\Meir Israeli\\AppData\\Local\\Temp\\pip-wheel-lnvsxd9w\\cryptacu
lar-1.6.2-cp310-cp310-win_amd64.whl'
Building wheel for pyramid-filterwarnings (setup.py) ... done
Created wheel for pyramid-filterwarnings: filename=pyramid_filterwarnings-0.4-py3-none-any.whl size=3751 sha256=8c8c99a9acb034a282fabc0d76d0dc0aa5eda94845fdce3d7
a19cb7e34c2e934
Stored in directory: c:\users\meir israeli\appdata\local\pip\cache\wheels\83\8c\6b\c3b2da1653b88af98992da7f4774d7907450706d4d1dff3298
Building wheel for zope.interface (setup.py) ... done
Created wheel for zope.interface: filename=zope.interface-5.4.0-cp310-cp310-win_amd64.whl size=211254 sha256=d6735f1745a8f576611ae0689f0a590e336212133aaf517d554e
e3682a605080
Stored in directory: c:\users\meir israeli\appdata\local\pip\cache\wheels\21\a9\8b\0bfc5594d8e109d5b25d6b69e0cff14d09d93e3522dcb16d2b
Successfully built pyramid-filterwarnings zope.interface
Failed to build cryptacular
ERROR: Could not build wheels for cryptacular, which is required to install pyproject.toml-based projects
'''
If you are using Anaconda, you can try conda install cryptacular before installing apex.
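A minimal sketch of that suggestion, assuming a working Anaconda/Miniconda setup (retrying the original pyshop install afterwards):

```shell
# Sketch: pull cryptacular from conda instead of building it from
# source with pip, then retry the install that previously failed
conda install cryptacular
pip install pyshop
```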
I can't "pip install scipy" on my M1 Mac; I get an error:
Collecting scipy
Using cached scipy-1.7.3.tar.gz (36.1 MB)
Installing build dependencies ... error
error: subprocess-exited-with-error
× pip subprocess to install build dependencies did not run successfully.
│ exit code: 1
╰─> [3280 lines of output]
Looking in indexes: https://pypi.org/simple, https://pypi.dev.project.com/pypi/, https://pypi.dev.project.com/pypi/
Ignoring numpy: markers 'python_version == "3.7" and platform_machine == "aarch64"' don't match your environment
Ignoring numpy: markers 'python_version == "3.8" and platform_machine == "aarch64"' don't match your environment
Ignoring numpy: markers 'python_version == "3.8" and (platform_machine != "arm64" or platform_system != "Darwin") and platform_python_implementation != "PyPy"' don't match your environment
Ignoring numpy: markers 'python_version == "3.9" and (platform_machine != "arm64" or platform_system != "Darwin") and platform_python_implementation != "PyPy"' don't match your environment
Ignoring numpy: markers 'python_version == "3.10" and platform_python_implementation != "PyPy"' don't match your environment
Ignoring numpy: markers 'python_version == "3.7" and platform_python_implementation == "PyPy"' don't match your environment
Ignoring numpy: markers 'python_version == "3.8" and platform_python_implementation == "PyPy"' don't match your environment
Ignoring numpy: markers 'python_version == "3.9" and platform_python_implementation == "PyPy"' don't match your environment
Ignoring numpy: markers 'python_version == "3.10" and platform_python_implementation == "PyPy"' don't match your environment
Collecting wheel<0.38.0
Using cached wheel-0.37.1-py2.py3-none-any.whl (35 kB)
Collecting setuptools<58.0.0
Using cached setuptools-57.5.0-py3-none-any.whl (819 kB)
Collecting Cython<3.0,>=0.29.18
Using cached Cython-0.29.28-py2.py3-none-any.whl (983 kB)
Collecting pybind11<2.8.0,>=2.4.3
Using cached pybind11-2.7.1-py2.py3-none-any.whl (200 kB)
Collecting pythran<0.10.0,>=0.9.12
Using cached pythran-0.9.12.post1-py3-none-any.whl (4.3 MB)
Collecting numpy==1.16.5
Using cached numpy-1.16.5.zip (5.1 MB)
Preparing metadata (setup.py): started
Preparing metadata (setup.py): finished with status 'done'
Collecting ply>=3.4
Using cached ply-3.11-py2.py3-none-any.whl (49 kB)
Collecting gast~=0.5.0
Using cached gast-0.5.3-py3-none-any.whl (19 kB)
Collecting beniget~=0.4.0
Using cached beniget-0.4.1-py3-none-any.whl (9.4 kB)
Building wheels for collected packages: numpy
Building wheel for numpy (setup.py): started
Building wheel for numpy (setup.py): finished with status 'error'
error: subprocess-exited-with-error
× python setup.py bdist_wheel did not run successfully.
│ exit code: 1
╰─> [2943 lines of output]
Running from numpy source directory.
blas_opt_info:
blas_mkl_info:
customize UnixCCompiler
libraries mkl_rt not found in ['/Users/danieljohnson/Documents/code/project/venv/lib', '/usr/local/lib', '/usr/lib']
NOT AVAILABLE
blis_info:
customize UnixCCompiler
libraries blis not found in ['/Users/danieljohnson/Documents/code/project/venv/lib', '/usr/local/lib', '/usr/lib']
NOT AVAILABLE
openblas_info:
customize UnixCCompiler
customize UnixCCompiler
libraries openblas not found in ['/Users/danieljohnson/Documents/code/project/venv/lib', '/usr/local/lib', '/usr/lib']
NOT AVAILABLE
... the end of the error is:
error: Command "clang -Wno-unused-result -Wsign-compare -Wunreachable-code -DNDEBUG -g -fwrapv -O3 -Wall -I/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/include -I/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/include -I/Users/danieljohnson/Documents/code/project/venv/lib/python3.7/site-packages/numpy/core/include -DNPY_INTERNAL_BUILD=1 -DHAVE_NPY_CONFIG_H=1 -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE=1 -D_LARGEFILE64_SOURCE=1 -Inumpy/core/include -Ibuild/src.macosx-12.3-arm64-3.7/numpy/core/include/numpy -Inumpy/core/src/common -Inumpy/core/src -Inumpy/core -Inumpy/core/src/npymath -Inumpy/core/src/multiarray -Inumpy/core/src/umath -Inumpy/core/src/npysort -I/Users/danieljohnson/Documents/code/project/venv/include -I/Users/danieljohnson/.pyenv/versions/3.7.13/include/python3.7m -Ibuild/src.macosx-12.3-arm64-3.7/numpy/core/src/common -Ibuild/src.macosx-12.3-arm64-3.7/numpy/core/src/npymath -Ibuild/src.macosx-12.3-arm64-3.7/numpy/core/src/common -Ibuild/src.macosx-12.3-arm64-3.7/numpy/core/src/npymath -c build/src.macosx-12.3-arm64-3.7/numpy/core/src/multiarray/_multiarray_tests.c -o build/temp.macosx-12.3-arm64-3.7/build/src.macosx-12.3-arm64-3.7/numpy/core/src/multiarray/_multiarray_tests.o -MMD -MF build/temp.macosx-12.3-arm64-3.7/build/src.macosx-12.3-arm64-3.7/numpy/core/src/multiarray/_multiarray_tests.o.d" failed with exit status 1
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
error: legacy-install-failure
× Encountered error while trying to install package.
╰─> numpy
note: This is an issue with the package mentioned above, not pip.
hint: See above for output from the failure.
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error
× pip subprocess to install build dependencies did not run successfully.
│ exit code: 1
╰─> See above for output.
note: This error originates from a subprocess, and is likely not a problem with pip.
I've tried pip and pip3, uninstalling numpy, using older versions, and installing scipy with --no-deps.
I can import numpy in a python console.
A "pip show numpy" displays:
Name: numpy
Version: 1.21.6
Summary: NumPy is the fundamental package for array computing with Python.
Home-page: https://www.numpy.org
Author: Travis E. Oliphant et al.
Author-email:
License: BSD
Location: /Users/danieljohnson/Documents/code/project/venv/lib/python3.7/site-packages
Requires:
Required-by: pandas
I'm not sure what to try next. I think it either has something to do with the M1 architecture, or the install not being able to find numpy for some reason.
I would suggest using conda to manage your Python resources on your Mac M1 machine, unless you have a specific reason to stick with pip or pip3.
Below are the key steps that should help you quickly install the scipy package:
Install miniconda on your Mac: please follow steps 1 to 6 on this page, under the title Installing on macOS. Don't worry about the subpoint, Anaconda Installer for macOS, in step 1 on the link attached above.
Create a new environment for your work. It's recommended not to install packages into conda's base environment, which is why I suggest creating a new one. Please follow the sections Managing conda and Managing environments on this page.
Set conda-forge as conda's highest-priority channel. This means conda will always first check whether the Python packages you want to install are available via conda-forge. To do this, type the following lines in the terminal:
conda config --add channels conda-forge
conda config --set channel_priority strict
You can read about conda-forge here.
Finally, type conda install scipy in the terminal after activating your non-base conda environment. This should install scipy along with its dependencies on your machine.
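Put together, the steps above look roughly like this in a terminal (the environment name work is just an example):

```shell
# Sketch of the steps above; "work" is an arbitrary environment name
conda config --add channels conda-forge
conda config --set channel_priority strict
conda create -n work
conda activate work
conda install scipy
```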
I load this dataset in this way:
Add data > Search for the dataset name (clova deep text ...) > Add
After the dataset is loaded and visible in the sidebar, I found a data.mdb and a lock.mdb inside every subfolder. I need to examine the contents, view images, view labels... What should I do to open, view, or modify this weird format?
Based on Luke's suggestion I tried apt install mbtools. The installation starts and I'm prompted to enter y/n, but I'm unable to because the notebook cell doesn't accept input. If I try passing -y, I get an "unrecognized argument" error. Then I tried the following, which complains about the missing package.
pip install mbtools meza
from meza import io
records = io.read('/kaggle/input/clova-deeptext/clova_deeptext/data_lmdb_release/training/ST/data.mdb')
print(next(records))
result:
Collecting mbtools
Downloading mbtools-0.1.1-py3-none-any.whl (3.7 kB)
Collecting meza
Downloading meza-0.42.5-py2.py3-none-any.whl (55 kB)
|████████████████████████████████| 55 kB 165 kB/s eta 0:00:01
Collecting ijson<3.0.0,>=2.3
Downloading ijson-2.6.1-cp37-cp37m-manylinux1_x86_64.whl (65 kB)
|████████████████████████████████| 65 kB 352 kB/s eta 0:00:01
Collecting python-slugify<2.0.0,>=1.2.5
Downloading python-slugify-1.2.6.tar.gz (6.8 kB)
Collecting dbfread==2.0.4
Downloading dbfread-2.0.4-py2.py3-none-any.whl (19 kB)
Requirement already satisfied: beautifulsoup4<5.0.0,>=4.6.0 in /opt/conda/lib/python3.7/site-packages (from meza) (4.10.0)
Collecting pygogo<0.15.0,>=0.13.2
Downloading pygogo-0.13.2-py2.py3-none-any.whl (20 kB)
Requirement already satisfied: requests<3.0.0,>=2.18.4 in /opt/conda/lib/python3.7/site-packages (from meza) (2.25.1)
Collecting xlrd<2.0.0,>=1.1.0
Downloading xlrd-1.2.0-py2.py3-none-any.whl (103 kB)
|████████████████████████████████| 103 kB 1.0 MB/s eta 0:00:01
Requirement already satisfied: PyYAML<6.0.0,>=4.2b1 in /opt/conda/lib/python3.7/site-packages (from meza) (5.4.1)
Collecting chardet<4.0.0,>=3.0.4
Downloading chardet-3.0.4-py2.py3-none-any.whl (133 kB)
|████████████████████████████████| 133 kB 987 kB/s eta 0:00:01
Requirement already satisfied: python-dateutil<3.0.0,>=2.7.2 in /opt/conda/lib/python3.7/site-packages (from meza) (2.8.0)
Requirement already satisfied: soupsieve>1.2 in /opt/conda/lib/python3.7/site-packages (from beautifulsoup4<5.0.0,>=4.6.0->meza) (2.2.1)
Requirement already satisfied: six>=1.5 in /opt/conda/lib/python3.7/site-packages (from python-dateutil<3.0.0,>=2.7.2->meza) (1.15.0)
Requirement already satisfied: Unidecode>=0.04.16 in /opt/conda/lib/python3.7/site-packages (from python-slugify<2.0.0,>=1.2.5->meza) (1.2.0)
Requirement already satisfied: idna<3,>=2.5 in /opt/conda/lib/python3.7/site-packages (from requests<3.0.0,>=2.18.4->meza) (2.10)
Requirement already satisfied: certifi>=2017.4.17 in /opt/conda/lib/python3.7/site-packages (from requests<3.0.0,>=2.18.4->meza) (2021.5.30)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in /opt/conda/lib/python3.7/site-packages (from requests<3.0.0,>=2.18.4->meza) (1.26.6)
Building wheels for collected packages: python-slugify
Building wheel for python-slugify (setup.py) ... done
Created wheel for python-slugify: filename=python_slugify-1.2.6-py2.py3-none-any.whl size=4609 sha256=8c4763108a666b347806541ae6fa0fb59656f9ea38406507f7c83fd06d7621e9
Stored in directory: /root/.cache/pip/wheels/c5/02/83/9904a9436aa0205c8daa9127109e9ed50d3eab25a5ea2fcb9f
Successfully built python-slugify
Installing collected packages: chardet, xlrd, python-slugify, pygogo, ijson, dbfread, meza, mbtools
Attempting uninstall: chardet
Found existing installation: chardet 4.0.0
Uninstalling chardet-4.0.0:
Successfully uninstalled chardet-4.0.0
Attempting uninstall: python-slugify
Found existing installation: python-slugify 5.0.2
Uninstalling python-slugify-5.0.2:
Successfully uninstalled python-slugify-5.0.2
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
caip-notebooks-serverextension 1.0.0 requires google-cloud-bigquery-storage, which is not installed.
jupyterlab-git 0.11.0 requires nbdime<2.0.0,>=1.1.0, but you have nbdime 3.1.0 which is incompatible.
gcsfs 2021.7.0 requires fsspec==2021.07.0, but you have fsspec 2021.8.1 which is incompatible.
earthengine-api 0.1.283 requires google-api-python-client<2,>=1.12.1, but you have google-api-python-client 1.8.0 which is incompatible.
aiobotocore 1.4.1 requires botocore<1.20.107,>=1.20.106, but you have botocore 1.21.44 which is incompatible.
Successfully installed chardet-3.0.4 dbfread-2.0.4 ijson-2.6.1 mbtools-0.1.1 meza-0.42.5 pygogo-0.13.2 python-slugify-1.2.6 xlrd-1.2.0
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
You must install [mdbtools](http://sourceforge.net/projects/mdbtools/) in order to use this function
Try using the following snippet:
!sudo apt install mdbtools  # Library that allows access to MDB files programmatically
!pip install meza # A Python toolkit for processing tabular data
from meza import io
records = io.read('database.mdb') # Use only file path, not file objects
print(next(records))
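Note that io.read returns a lazy generator, so print(next(records)) shows only the first row. A minimal sketch of pulling a few rows at a time; the read_records function below is a hypothetical stand-in for meza's io.read (the real call needs an actual .mdb file and mdbtools installed):

```python
from itertools import islice

def read_records():
    # Stand-in for meza's io.read, which yields one dict per row.
    # The 'id'/'label' fields here are made up for illustration.
    yield from ({'id': i, 'label': f'word{i}'} for i in range(100))

records = read_records()

# Take only the first five rows without loading the whole table
first_five = list(islice(records, 5))
for row in first_five:
    print(row)
```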