I want to use the NumPy C array API (https://numpy.org/doc/stable/reference/c-api/array.html) through the ctypes Python library, similarly to how I can use the Python C API with ctypes. For example, I can reference the PyLong_FromLong function like this:
ctypes.pythonapi.PyLong_FromLong
Is this possible? Is there some compiled .so file that I can load like this:
np_api = ctypes.CDLL(file_path)
to be able to use the NumPy API?
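For concreteness, this is the ctypes pattern I mean for the CPython API (a minimal sketch; the argument and return types have to be declared by hand):

import ctypes

# Call a CPython C-API function through ctypes.pythonapi.
PyLong_FromLong = ctypes.pythonapi.PyLong_FromLong
PyLong_FromLong.argtypes = [ctypes.c_long]
PyLong_FromLong.restype = ctypes.py_object
print(PyLong_FromLong(42))  # prints 42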
I have a function which returns a pyspark.pandas object (and I cannot change it to return a normal pandas object), and I was trying to unit test this function, but it only gives me a null object.
I see that pytest could be used as an option, but if possible I would like to try the unittest framework. Is there any way to use pyspark.pandas with unittest?
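Roughly, the kind of test I am trying to set up looks like this (just a sketch; build_frame is a hypothetical stand-in for my real function, and a local SparkSession is started before the tests run):

import unittest

import pyspark.pandas as ps
from pyspark.sql import SparkSession


def build_frame():
    # Hypothetical stand-in for the real function that returns a pyspark.pandas object.
    return ps.DataFrame({"a": [1, 2, 3]})


class PandasOnSparkTest(unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        # pandas-on-Spark needs an active SparkSession; a small local one is enough for tests.
        cls.spark = SparkSession.builder.master("local[1]").appName("ps-unittest").getOrCreate()

    @classmethod
    def tearDownClass(cls):
        cls.spark.stop()

    def test_build_frame(self):
        result = build_frame()
        self.assertIsNotNone(result)
        # Convert to plain pandas so ordinary unittest assertions work on concrete values.
        self.assertEqual(result.to_pandas()["a"].tolist(), [1, 2, 3])


if __name__ == "__main__":
    unittest.main()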
I have a C++ class which is interchangeable with NumPy arrays through the buffer protocol, and I can already return objects from C++ to Python that are convertible to NumPy via the numpy.asarray() call.
I would like to make my class even easier to use, so I would like to return NumPy arrays which wrap my class directly from C++.
Is it possible to construct a NumPy array from the C++ side using pybind11 and return it?
I found out how to do this: you can just call NumPy's asarray function directly on the buffer type.
#include <pybind11/pybind11.h>

namespace py = pybind11;

// Wrap a buffer-protocol object in a NumPy array by calling numpy.asarray from C++.
py::buffer pybuf_to_numpy(py::buffer& b)
{
    py::object np = py::module::import("numpy");
    py::object asarray = np.attr("asarray");
    return asarray(b);
}
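As a self-contained illustration of why this works (numpy.asarray accepts any object that exposes the buffer protocol and wraps its memory without copying), here is a small Python sketch that uses a bytearray as a stand-in for the C++ buffer type:

import numpy as np

# Stand-in for the C++ object's storage: any object exposing the buffer protocol.
backing = bytearray(8 * 4)
arr = np.asarray(memoryview(backing).cast('d'))  # float64 view of the buffer, no copy
arr[:] = [1.0, 2.0, 3.0, 4.0]
print(backing[:8])  # the first 8 bytes now encode 1.0, so the memory is shared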
I am a beginner trying to learn simple pandas and numpy, and I am using PyCharm. Why doesn't PyCharm show all of the objects in the numpy library as I am typing code? For instance, I can't find int64, int32, float32 and similar number types. After importing numpy as np, when I start typing np.int, PyCharm acts as if int is available but as if int32 and int64 didn't exist at all.
Here is a screenshot of what the code suggestion box looks like:
I have the newest numpy (1.18.0) installed both through the command line for the Python interpreter itself and in PyCharm for this particular project. I can also confirm that, for example, np.int64 is valid code; it executes in PyCharm without error:
import numpy as np
print(np.int64(123))  # Executes correctly even though PyCharm suggests there is no int64 in the numpy library
Is it possible to convert a Breeze dense matrix to a numpy array using Spark?
I have a Breeze dense matrix that I want to convert to a numpy array.
Here is a way that works correctly but is slow and inefficient (it creates multiple copies). I used the Zeppelin spark and pyspark interpreters (I guess Toree should also work):
In Spark:
%spark
import breeze.linalg._
import breeze.numerics._
z.put("matrix", DenseMatrix.eye[Double](4));
z.get("matrix")
Then in Python:
%pyspark
import numpy as np
def breeze2numpy(breeze_matrix):
    # copy the column-major Breeze data into a Python list, then into a numpy array
    data = list(breeze_matrix.copy().data())
    return np.array(data).reshape(breeze_matrix.rows(), breeze_matrix.cols(), order='F')

breeze2numpy(z.z.get("matrix"))
This works, but it will be impractical for big datasets (because of the multiple copies involved via a Python list). It would be nice to have a zero-copy method using Python's buffer protocol, like there is for C++ Eigen matrix --> numpy array conversion.
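To illustrate what such a path would look like on the Python side (a sketch only; it assumes the JVM side could hand over the raw column-major bytes, which is not shown here), np.frombuffer wraps an existing buffer without a per-element copy:

import numpy as np

# Stand-in for raw column-major bytes exported from the Breeze/JVM side.
raw = np.eye(4).tobytes(order='F')

# Wrap the existing buffer (no per-element copy), then reshape as a Fortran-order view.
m = np.frombuffer(raw, dtype=np.float64).reshape(4, 4, order='F')
print(m)
print(np.shares_memory(m, np.frombuffer(raw, dtype=np.float64)))  # True: same underlying buffer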
I was trying to reproduce this example from the matplotlib website using the PyPlot package for Julia. As far as I know, PyPlot is essentially the matplotlib.pyplot module, so I imported the other matplotlib modules that I needed (with the @pyimport macro):
using PyCall
@pyimport matplotlib.path as mpath
@pyimport matplotlib.patches as mpatches
Then I proceeded to define the Path object:
Path = mpath.Path
but then I get:
fn (generic function with 1 method)
As if I had defined a function. Moreover, when I assign the path_data I get the following error:
ERROR: type Function has no field MOVETO
Of course, that's because Julia treats Path as a function and not as a type or something like that. As you might guess, the same happens when I try to define the variable patch.
So there seem to be incompatibilities with matplotlib modules other than pyplot in Julia, since the expected objects (types) are treated as functions. This behaviour can be expected; if it were different, the PyPlot.jl file wouldn't be needed.
My questions are:
- Am I doing something wrong?
- Is there a simple way to make it work?
- Do you know another package for Julia in which I can define patches and work in a similar way to matplotlib?
I have in mind doing this kind of animation.
Thanks for your ideas.
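For reference, this is roughly the kind of matplotlib code I am trying to port (a simplified sketch, not the exact example from the website):

import matplotlib.pyplot as plt
import matplotlib.path as mpath
import matplotlib.patches as mpatches

# Build a Path from (code, vertex) pairs and wrap it in a PathPatch.
Path = mpath.Path
path_data = [
    (Path.MOVETO, (0.0, 0.0)),
    (Path.LINETO, (1.0, 0.0)),
    (Path.LINETO, (1.0, 1.0)),
    (Path.CLOSEPOLY, (0.0, 0.0)),
]
codes, verts = zip(*path_data)
path = Path(verts, codes)
patch = mpatches.PathPatch(path, facecolor='green', alpha=0.5)

fig, ax = plt.subplots()
ax.add_patch(patch)
ax.set_xlim(-0.5, 1.5)
ax.set_ylim(-0.5, 1.5)
plt.show()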
You need to get the "raw" Python object for Path. By default, PyCall converts Python type objects into functions (which call the corresponding constructor), but then you cannot access static members of the class.
Instead, do e.g. Path = mpath.pymember("Path") to get the "raw" PyObject, and then you can do Path["MOVETO"] or Path[:MOVETO] to access the MOVETO member.
(This difficulty will hopefully go away in Julia 0.4 once something like https://github.com/JuliaLang/julia/pull/8008 gets merged, so that we can make PyObjects callable directly.)