MemoryError installing numpy on Ubuntu 16.04 on DigitalOcean

I have a Python application that uses numpy running on my DigitalOcean droplet. I am trying to pip install numpy into my virtual environment, and each time I get an error like this:
Collecting numpy
Downloading numpy-1.13.1-cp27-cp27mu-manylinux1_x86_64.whl (16.6MB)
99% |████████████████████████████████| 16.6MB 40.5MB/s eta 0:00:01
Exception:
Traceback (most recent call last):
  File "/usr/lib/python2.7/dist-packages/pip/basecommand.py", line 209, in main
    status = self.run(options, args)
  File "/usr/lib/python2.7/dist-packages/pip/commands/install.py", line 328, in run
    wb.build(autobuilding=True)
  File "/usr/lib/python2.7/dist-packages/pip/wheel.py", line 748, in build
    self.requirement_set.prepare_files(self.finder)
  File "/usr/lib/python2.7/dist-packages/pip/req/req_set.py", line 360, in prepare_files
    ignore_dependencies=self.ignore_dependencies))
  File "/usr/lib/python2.7/dist-packages/pip/req/req_set.py", line 577, in _prepare_file
    session=self.session, hashes=hashes)
  File "/usr/lib/python2.7/dist-packages/pip/download.py", line 810, in unpack_url
    hashes=hashes
  File "/usr/lib/python2.7/dist-packages/pip/download.py", line 649, in unpack_http_url
    hashes)
  File "/usr/lib/python2.7/dist-packages/pip/download.py", line 871, in _download_http_url
    _download_url(resp, link, content_file, hashes)
  File "/usr/lib/python2.7/dist-packages/pip/download.py", line 595, in _download_url
    hashes.check_against_chunks(downloaded_chunks)
  File "/usr/lib/python2.7/dist-packages/pip/utils/hashes.py", line 46, in check_against_chunks
    for chunk in chunks:
  File "/usr/lib/python2.7/dist-packages/pip/download.py", line 563, in written_chunks
    for chunk in chunks:
  File "/usr/lib/python2.7/dist-packages/pip/utils/ui.py", line 139, in iter
    for x in it:
  File "/usr/lib/python2.7/dist-packages/pip/download.py", line 552, in resp_read
    decode_content=False):
  File "/usr/share/python-wheels/urllib3-1.13.1-py2.py3-none-any.whl/urllib3/response.py", line 344, in stream
    data = self.read(amt=amt, decode_content=decode_content)
  File "/usr/share/python-wheels/urllib3-1.13.1-py2.py3-none-any.whl/urllib3/response.py", line 301, in read
    data = self._fp.read(amt)
  File "/usr/share/python-wheels/CacheControl-0.11.5-py2.py3-none-any.whl/cachecontrol/filewrapper.py", line 54, in read
    self.__callback(self.__buf.getvalue())
  File "/usr/share/python-wheels/CacheControl-0.11.5-py2.py3-none-any.whl/cachecontrol/controller.py", line 224, in cache_response
    self.serializer.dumps(request, response, body=body),
  File "/usr/share/python-wheels/CacheControl-0.11.5-py2.py3-none-any.whl/cachecontrol/serialize.py", line 81, in dumps
    ).encode("utf8"),
MemoryError
Can anyone help me figure out how to solve this problem? I have tried installing numpy outside the virtualenv as well, but it still refuses to download and install.
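The traceback bottoms out in CacheControl's serializer, which suggests pip ran out of memory while caching the downloaded wheel rather than while installing it; on a small droplet with little RAM and no swap this is a common failure mode. Two frequently suggested workarounds, sketched here under the assumption that the droplet is simply short on memory: disable pip's cache, and, if that is not enough, add a temporary swap file.

pip install --no-cache-dir numpy

# temporary 1 GB swap file (the size is an assumption; pick what the droplet's disk allows)
sudo fallocate -l 1G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile
pip install numpy
sudo swapoff /swapfile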

Related

Failure at building wheel for numpy in a virtualenv, apparently not python version related - Windows 10

I'm trying to install numpy in a virtual environment using pip, but building the wheel for it fails. The problem only occurs when installing inside the virtualenv; I can install and update it on my system just fine.
It is running on Windows 10, Python 3.10.5, pip 22.3.1.
Running from numpy source directory.
setup.py:67: DeprecationWarning:
`numpy.distutils` is deprecated since NumPy 1.23.0, as a result
of the deprecation of `distutils` itself. It will be removed for
Python >= 3.12. For older Python versions it will remain present.
It is recommended to use `setuptools < 60.0` for those Python versions.
For more details, see:
https://numpy.org/devdocs/reference/distutils_status_migration.html
import numpy.distutils.command.sdist
Processing numpy/random\_bounded_integers.pxd.in
Processing numpy/random\bit_generator.pyx
Processing numpy/random\mtrand.pyx
Processing numpy/random\_bounded_integers.pyx.in
Processing numpy/random\_common.pyx
Processing numpy/random\_generator.pyx
Processing numpy/random\_mt19937.pyx
Processing numpy/random\_pcg64.pyx
Processing numpy/random\_philox.pyx
Processing numpy/random\_sfc64.pyx
Cythonizing sources
INFO: blas_opt_info:
INFO: blas_armpl_info:
Looking for python310.dll
Traceback (most recent call last):
File "C:\stable_diffusion\diffusers_venv\lib\python3.10\site-packages\pip\_vendor\pep517\in_process\_in_process.py", line 363, in <module>
main()
File "C:\stable_diffusion\diffusers_venv\lib\python3.10\site-packages\pip\_vendor\pep517\in_process\_in_process.py", line 345, in main
json_out['return_val'] = hook(**hook_input['kwargs'])
File "C:\stable_diffusion\diffusers_venv\lib\python3.10\site-packages\pip\_vendor\pep517\in_process\_in_process.py", line 261, in build_wheel
return _build_backend().build_wheel(wheel_directory, config_settings,
File "C:\Users\Vitor\AppData\Local\Temp\pip-build-env-amr3d935\overlay\lib\python3.10\site-packages\setuptools\build_meta.py", line 230, in build_wheel
return self._build_with_temp_dir(['bdist_wheel'], '.whl',
File "C:\Users\Vitor\AppData\Local\Temp\pip-build-env-amr3d935\overlay\lib\python3.10\site-packages\setuptools\build_meta.py", line 215, in _build_with_temp_dir
self.run_setup()
File "C:\Users\Vitor\AppData\Local\Temp\pip-build-env-amr3d935\overlay\lib\python3.10\site-packages\setuptools\build_meta.py", line 267, in run_setup
super(_BuildMetaLegacyBackend,
File "C:\Users\Vitor\AppData\Local\Temp\pip-build-env-amr3d935\overlay\lib\python3.10\site-packages\setuptools\build_meta.py", line 158, in run_setup
exec(compile(code, __file__, 'exec'), locals())
File "setup.py", line 479, in <module>
setup_package()
File "setup.py", line 471, in setup_package
setup(**metadata)
File "C:\Users\Vitor\AppData\Local\Temp\pip-install-1sr3obr4\numpy_fa19a5e39e9448e7b6553e0f5a275159\numpy\distutils\core.py", line 135, in setup
config = configuration()
File "setup.py", line 118, in configuration
config.add_subpackage('numpy')
File "C:\Users\Vitor\AppData\Local\Temp\pip-install-1sr3obr4\numpy_fa19a5e39e9448e7b6553e0f5a275159\numpy\distutils\misc_util.py", line 1050, in add_subpackage
config_list = self.get_subpackage(subpackage_name, subpackage_path,
File "C:\Users\Vitor\AppData\Local\Temp\pip-install-1sr3obr4\numpy_fa19a5e39e9448e7b6553e0f5a275159\numpy\distutils\misc_util.py", line 1016, in get_subpackage
config = self._get_configuration_from_setup_py(
File "C:\Users\Vitor\AppData\Local\Temp\pip-install-1sr3obr4\numpy_fa19a5e39e9448e7b6553e0f5a275159\numpy\distutils\misc_util.py", line 958, in _get_configuration_from_setup_py
config = setup_module.configuration(*args)
File "C:\Users\Vitor\AppData\Local\Temp\pip-install-1sr3obr4\numpy_fa19a5e39e9448e7b6553e0f5a275159\numpy\setup.py", line 9, in configuration
config.add_subpackage('core')
File "C:\Users\Vitor\AppData\Local\Temp\pip-install-1sr3obr4\numpy_fa19a5e39e9448e7b6553e0f5a275159\numpy\distutils\misc_util.py", line 1050, in add_subpackage
config_list = self.get_subpackage(subpackage_name, subpackage_path,
File "C:\Users\Vitor\AppData\Local\Temp\pip-install-1sr3obr4\numpy_fa19a5e39e9448e7b6553e0f5a275159\numpy\distutils\misc_util.py", line 1016, in get_subpackage
config = self._get_configuration_from_setup_py(
File "C:\Users\Vitor\AppData\Local\Temp\pip-install-1sr3obr4\numpy_fa19a5e39e9448e7b6553e0f5a275159\numpy\distutils\misc_util.py", line 958, in _get_configuration_from_setup_py
config = setup_module.configuration(*args)
File "C:\Users\Vitor\AppData\Local\Temp\pip-install-1sr3obr4\numpy_fa19a5e39e9448e7b6553e0f5a275159\numpy\core\setup.py", line 853, in configuration
blas_info = get_info('blas_opt', 0)
File "C:\Users\Vitor\AppData\Local\Temp\pip-install-1sr3obr4\numpy_fa19a5e39e9448e7b6553e0f5a275159\numpy\distutils\system_info.py", line 585, in get_info
return cl().get_info(notfound_action)
File "C:\Users\Vitor\AppData\Local\Temp\pip-install-1sr3obr4\numpy_fa19a5e39e9448e7b6553e0f5a275159\numpy\distutils\system_info.py", line 845, in get_info
self.calc_info()
File "C:\Users\Vitor\AppData\Local\Temp\pip-install-1sr3obr4\numpy_fa19a5e39e9448e7b6553e0f5a275159\numpy\distutils\system_info.py", line 2077, in calc_info
if self._calc_info(blas):
File "C:\Users\Vitor\AppData\Local\Temp\pip-install-1sr3obr4\numpy_fa19a5e39e9448e7b6553e0f5a275159\numpy\distutils\system_info.py", line 2063, in _calc_info
return getattr(self, '_calc_info_{}'.format(name))()
File "C:\Users\Vitor\AppData\Local\Temp\pip-install-1sr3obr4\numpy_fa19a5e39e9448e7b6553e0f5a275159\numpy\distutils\system_info.py", line 1979, in _calc_info_armpl
info = get_info('blas_armpl')
File "C:\Users\Vitor\AppData\Local\Temp\pip-install-1sr3obr4\numpy_fa19a5e39e9448e7b6553e0f5a275159\numpy\distutils\system_info.py", line 585, in get_info
return cl().get_info(notfound_action)
File "C:\Users\Vitor\AppData\Local\Temp\pip-install-1sr3obr4\numpy_fa19a5e39e9448e7b6553e0f5a275159\numpy\distutils\system_info.py", line 845, in get_info
self.calc_info()
File "C:\Users\Vitor\AppData\Local\Temp\pip-install-1sr3obr4\numpy_fa19a5e39e9448e7b6553e0f5a275159\numpy\distutils\system_info.py", line 1337, in calc_info
info = self.check_libs2(lib_dirs, armpl_libs)
File "C:\Users\Vitor\AppData\Local\Temp\pip-install-1sr3obr4\numpy_fa19a5e39e9448e7b6553e0f5a275159\numpy\distutils\system_info.py", line 1001, in check_libs2
exts = self.library_extensions()
File "C:\Users\Vitor\AppData\Local\Temp\pip-install-1sr3obr4\numpy_fa19a5e39e9448e7b6553e0f5a275159\numpy\distutils\system_info.py", line 960, in library_extensions
c = customized_ccompiler()
File "C:\Users\Vitor\AppData\Local\Temp\pip-install-1sr3obr4\numpy_fa19a5e39e9448e7b6553e0f5a275159\numpy\distutils\system_info.py", line 216, in customized_ccompiler
global_compiler = _customized_ccompiler()
File "C:\Users\Vitor\AppData\Local\Temp\pip-install-1sr3obr4\numpy_fa19a5e39e9448e7b6553e0f5a275159\numpy\distutils\__init__.py", line 62, in customized_ccompiler
c = ccompiler.new_compiler(plat=plat, compiler=compiler, verbose=verbose)
File "C:\Users\Vitor\AppData\Local\Temp\pip-install-1sr3obr4\numpy_fa19a5e39e9448e7b6553e0f5a275159\numpy\distutils\ccompiler.py", line 780, in new_compiler
compiler = klass(None, dry_run, force)
File "C:\Users\Vitor\AppData\Local\Temp\pip-install-1sr3obr4\numpy_fa19a5e39e9448e7b6553e0f5a275159\numpy\distutils\mingw32ccompiler.py", line 64, in __init__
build_import_library()
File "C:\Users\Vitor\AppData\Local\Temp\pip-install-1sr3obr4\numpy_fa19a5e39e9448e7b6553e0f5a275159\numpy\distutils\mingw32ccompiler.py", line 348, in build_import_library
return _build_import_library_amd64()
File "C:\Users\Vitor\AppData\Local\Temp\pip-install-1sr3obr4\numpy_fa19a5e39e9448e7b6553e0f5a275159\numpy\distutils\mingw32ccompiler.py", line 397, in _build_import_library_amd64
dll_file = find_python_dll()
File "C:\Users\Vitor\AppData\Local\Temp\pip-install-1sr3obr4\numpy_fa19a5e39e9448e7b6553e0f5a275159\numpy\distutils\mingw32ccompiler.py", line 219, in find_python_dll
raise ValueError("%s not found in %s" % (dllname, lib_dirs))
ValueError: python310.dll not found in ['C:\\stable_diffusion\\diffusers_venv\\', 'C:\\stable_diffusion\\diffusers_venv\\lib', 'C:\\stable_diffusion\\diffusers_venv\\bin', 'C:\\Program Files\\Inkscape\\', 'C:\\Program Files\\Inkscape\\lib', 'C:\\Program Files\\Inkscape\\bin', 'C:\\WINDOWS\\System32']
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for numpy
Failed to build numpy
ERROR: Could not build wheels for numpy, which is required to install pyproject.toml-based projects
As I had faced a similar problem before, my instinct was to update pip, change the running Python version, or try an older, specific version of numpy, but nothing seems to work and the problem persists.
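For what it's worth, the root failure here is python310.dll not being found while numpy.distutils sets up a MinGW import library, i.e. pip fell back to building numpy from source instead of using a prebuilt wheel. A hedged sketch of the usual way around that, assuming a Windows wheel for Python 3.10 exists for the numpy version you need: upgrade the packaging tools inside the venv, then force pip to refuse source builds so it must pick a binary wheel.

python -m pip install --upgrade pip setuptools wheel
python -m pip install --only-binary=:all: numpy

If --only-binary=:all: finds no wheel, that at least confirms the failure is "no compatible binary for this interpreter" rather than anything specific to the virtualenv.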

pandas 1.3.3 to_feather giving ArrowMemoryError

I have a dataset of around 270 MB and I use the following to write it to a feather file:
df.reset_index().to_feather(feather_path)
This gives me an error:
File "C:\apps\Python\lib\site-packages\pandas\util\_decorators.py", line 207, in wrapper
return func(*args, **kwargs)
File "C:\apps\Python\lib\site-packages\pandas\core\frame.py", line 2519, in to_feather
to_feather(self, path, **kwargs)
File "C:\apps\Python\lib\site-packages\pandas\io\feather_format.py", line 87, in to_feather
feather.write_feather(df, handles.handle, **kwargs)
File "C:\apps\Python\lib\site-packages\pyarrow\feather.py", line 152, in write_feather
table = Table.from_pandas(df, preserve_index=False)
File "pyarrow\table.pxi", line 1553, in pyarrow.lib.Table.from_pandas
File "C:\apps\Python\lib\site-packages\pyarrow\pandas_compat.py", line 607, in dataframe_to_arrays
arrays[i] = maybe_fut.result()
File "C:\apps\Python\lib\concurrent\futures\_base.py", line 438, in result
return self.__get_result()
File "C:\apps\Python\lib\concurrent\futures\_base.py", line 390, in __get_result
raise self._exception
File "C:\apps\Python\lib\concurrent\futures\thread.py", line 52, in run
result = self.fn(*self.args, **self.kwargs)
File "C:\apps\Python\lib\site-packages\pyarrow\pandas_compat.py", line 575, in convert_column
result = pa.array(col, type=type_, from_pandas=True, safe=safe)
File "pyarrow\array.pxi", line 302, in pyarrow.lib.array
File "pyarrow\array.pxi", line 83, in pyarrow.lib._ndarray_to_array
File "pyarrow\error.pxi", line 114, in pyarrow.lib.check_status
pyarrow.lib.ArrowMemoryError: realloc of size 3221225472 failed
Note: this works fine in PyCharm, with no issues writing the feather file.
But when the Python program is called from a Windows batch file like:
call python "myprogram.py"
and the batch file is scheduled as a task using Task Scheduler, it fails with the above memory error.
PyArrow version is 5.0.0 if that helps.
Any ideas please?
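One hedged diagnostic, given that the failed realloc is 3221225472 bytes (3 GiB): the scheduled task may be launching a different interpreter than PyCharm does, for example a 32-bit Python, which cannot allocate 3 GiB. Pinning the full interpreter path in the batch file (C:\apps\Python is taken from the traceback above) and printing the pointer width would confirm or rule that out:

call "C:\apps\Python\python.exe" -c "import struct; print(struct.calcsize('P') * 8)"
call "C:\apps\Python\python.exe" "myprogram.py"

A 64-bit interpreter prints 64; if the scheduled task prints 32, Task Scheduler's environment is resolving python to a different install than the one PyCharm uses.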

TensorFlow on Raspberry Pi - memory error

I am trying to install tensorflow on a Raspberry Pi 4 with the following command:
pip install tensorflow
The following error occurs:
Looking in indexes: https://pypi.org/simple, https://www.piwheels.org/simple
Collecting tensorflow
Downloading https://www.piwheels.org/simple/tensorflow/tensorflow-1.14.0-cp37-none-linux_armv7l.whl (79.6MB)
100% |████████████████████████████████| 79.6MB 8.8MB/s
Exception:
Traceback (most recent call last):
File "/usr/lib/python3/dist-packages/pip/_internal/cli/base_command.py", line 143, in main
status = self.run(options, args)
File "/usr/lib/python3/dist-packages/pip/_internal/commands/install.py", line 338, in run
resolver.resolve(requirement_set)
File "/usr/lib/python3/dist-packages/pip/_internal/resolve.py", line 102, in resolve
self._resolve_one(requirement_set, req)
File "/usr/lib/python3/dist-packages/pip/_internal/resolve.py", line 256, in _resolve_one
abstract_dist = self._get_abstract_dist_for(req_to_install)
File "/usr/lib/python3/dist-packages/pip/_internal/resolve.py", line 209, in _get_abstract_dist_for
self.require_hashes
File "/usr/lib/python3/dist-packages/pip/_internal/operations/prepare.py", line 283, in prepare_linked_requirement
progress_bar=self.progress_bar
File "/usr/lib/python3/dist-packages/pip/_internal/download.py", line 836, in unpack_url
progress_bar=progress_bar
File "/usr/lib/python3/dist-packages/pip/_internal/download.py", line 677, in unpack_http_url
unpack_file(from_path, location, content_type, link)
File "/usr/lib/python3/dist-packages/pip/_internal/utils/misc.py", line 600, in unpack_file
flatten=not filename.endswith('.whl')
File "/usr/lib/python3/dist-packages/pip/_internal/utils/misc.py", line 489, in unzip_file
data = zip.read(name)
File "/usr/lib/python3.7/zipfile.py", line 1429, in read
return fp.read()
File "/usr/lib/python3.7/zipfile.py", line 885, in read
buf += self._read1(self.MAX_N)
File "/usr/lib/python3.7/zipfile.py", line 975, in _read1
data = self._decompressor.decompress(data, n)
MemoryError
I have tried installing it with the following command, which I've seen suggested on the internet as a fix, but it didn't help:
pip install --no-cache-dir tensorflow
Any clue on what I could do?
Thanks in advance.
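Since the MemoryError is raised while pip decompresses the 79.6MB wheel in memory, the Pi is likely simply running out of RAM during unpacking. As --no-cache-dir alone did not help, a commonly suggested next step, assuming Raspberry Pi OS with dphys-swapfile installed, is to enlarge the swap file before retrying:

sudo dphys-swapfile swapoff
# raise the swap size to 1024 MB (an assumption; revert it afterwards to spare the SD card)
sudo sed -i 's/^CONF_SWAPSIZE=.*/CONF_SWAPSIZE=1024/' /etc/dphys-swapfile
sudo dphys-swapfile setup
sudo dphys-swapfile swapon
pip install --no-cache-dir tensorflow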

Apache BEAM pipeline fails when writing TF Records - AttributeError: 'str' object has no attribute 'iteritems'

The issue started appearing over the weekend. For some reason, it feels like a Dataflow issue.
Previously, I was able to execute the script and write TF records just fine. However, now I am unable to initialize the computation graph to process the data.
The traceback is:
Traceback (most recent call last):
File "my_script.py", line 1492, in <module>
MyBeamClass()
File "my_script.py", line 402, in __init__
self.run()
File "my_script.py", line 514, in run
transform_fn_io.WriteTransformFn(path=self.JOB_DIR + '/transform/'))
File "/anaconda3/envs/ml27/lib/python2.7/site-packages/apache_beam/pipeline.py", line 426, in __exit__
self.run().wait_until_finish()
File "/anaconda3/envs/ml27/lib/python2.7/site-packages/apache_beam/runners/dataflow/dataflow_runner.py", line 1238, in wait_until_finish
(self.state, getattr(self._runner, 'last_error_msg', None)), self)
apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error:
Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 649, in do_work
work_executor.execute()
File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
op.start()
File "apache_beam/runners/worker/operations.py", line 531, in apache_beam.runners.worker.operations.DoOperation.start
def start(self):
File "apache_beam/runners/worker/operations.py", line 532, in apache_beam.runners.worker.operations.DoOperation.start
with self.scoped_start_state:
File "apache_beam/runners/worker/operations.py", line 533, in apache_beam.runners.worker.operations.DoOperation.start
super(DoOperation, self).start()
File "apache_beam/runners/worker/operations.py", line 202, in apache_beam.runners.worker.operations.Operation.start
def start(self):
File "apache_beam/runners/worker/operations.py", line 206, in apache_beam.runners.worker.operations.Operation.start
self.setup()
File "apache_beam/runners/worker/operations.py", line 480, in apache_beam.runners.worker.operations.DoOperation.setup
with self.scoped_start_state:
File "apache_beam/runners/worker/operations.py", line 485, in apache_beam.runners.worker.operations.DoOperation.setup
pickler.loads(self.spec.serialized_fn))
File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 247, in loads
return dill.loads(s)
File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 317, in loads
return load(file, ignore)
File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 305, in load
obj = pik.load()
File "/usr/lib/python2.7/pickle.py", line 864, in load
dispatch[key](self)
File "/usr/lib/python2.7/pickle.py", line 1232, in load_build
for k, v in state.iteritems():
AttributeError: 'str' object has no attribute 'iteritems'
I am using tensorflow==1.13.1, tensorflow-transform==0.9.0, and apache_beam==2.7.0.
with beam.Pipeline(options=self.pipe_opt) as p:
    with beam_impl.Context(temp_dir=self.google_cloud_options.temp_location):
        # rest of the script
        _ = (
            transform_fn
            | 'WriteTransformFn' >>
            transform_fn_io.WriteTransformFn(path=self.JOB_DIR + '/transform/'))
I was experiencing the same error.
It seems to be triggered by a mismatch between the tensorflow-transform versions on your local (or master) machine and on the workers (specified in the setup.py file).
In my case I was running tensorflow-transform==0.13 on my local machine whereas the workers were running 0.8.
Downgrading the local version to 0.8 fixed the issue.
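For reference, a minimal sketch of a worker-side setup.py that pins the versions, under the assumption that the launching machine runs the same pins (the package name and versions below are illustrative, not taken from the question):

# setup.py shipped to the Dataflow workers via --setup_file
import setuptools

setuptools.setup(
    name='my-pipeline',  # hypothetical package name
    version='0.1.0',
    packages=setuptools.find_packages(),
    install_requires=[
        # keep these identical to the launching machine's versions
        'tensorflow-transform==0.8.0',
        'apache-beam[gcp]==2.7.0',
    ],
)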

fetch chromium - No such file or directory

I've cloned depot_tools following the instructions from here, on Ubuntu 17.04.
When I run fetch chromium, I get the following error.
./fetch chromium
Running: gclient root
Traceback (most recent call last):
File "./fetch.py", line 299, in <module>
sys.exit(main())
File "./fetch.py", line 294, in main
return run(options, spec, root)
File "./fetch.py", line 280, in run
if not options.force and checkout.exists():
File "./fetch.py", line 82, in exists
gclient_root = self.run_gclient('root').strip()
File "./fetch.py", line 78, in run_gclient
return self.run(cmd_prefix + cmd, **kwargs)
File "./fetch.py", line 68, in run
return subprocess.check_output(cmd, **kwargs)
File "/usr/lib/python2.7/subprocess.py", line 212, in check_output
process = Popen(stdout=PIPE, *popenargs, **kwargs)
File "/usr/lib/python2.7/subprocess.py", line 390, in __init__
errread, errwrite)
File "/usr/lib/python2.7/subprocess.py", line 1024, in _execute_child
raise child_exception
OSError: [Errno 2] No such file or directory
I have come across similar errors on Stack Overflow while searching for a resolution; however, none of them helped resolve the problem.
I'm not clear on whether there are any steps required after cloning to install depot_tools. The readme
Any thoughts on what is required to solve the problem?
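A note on the error itself: OSError: [Errno 2] No such file or directory from subprocess.Popen means the child executable, gclient here, could not be found at all. With a fresh depot_tools clone that usually means the checkout is not on PATH, which the depot_tools setup requires before fetch will work. A sketch, assuming the clone lives at ~/depot_tools:

# make gclient (and the rest of depot_tools) visible to fetch
export PATH="$PATH:$HOME/depot_tools"
fetch chromium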