CGAL::centroid for Polyhedron / 3D

I already use the function CGAL::centroid with Polygon_2, and it works well:
return CGAL::centroid(vertices_begin(), vertices_end(), CGAL::Dimension_tag<0>());
When I try to use the function with Polyhedron_3, it does not work.
CGAL::centroid shows only three possible signatures, all returning 2D results.
Do you have an example of using the 3D centroid?
Many thanks for your help.
Gilles

Use points_begin(), points_end() instead.

In CGAL::Polygon_2 the "vertices" are points; in Polyhedron_3 they are Vertex_handles. You could use a boost::transform_iterator to obtain a sequence of points.
Here you can see how I did this for CGAL::Surface_mesh
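For illustration, a minimal sketch (untested, not the exact code from the Surface_mesh example) of the simpler points_begin()/points_end() route; the helper name polyhedron_centroid is just a placeholder:

#include <CGAL/Exact_predicates_inexact_constructions_kernel.h>
#include <CGAL/Polyhedron_3.h>
#include <CGAL/centroid.h>

typedef CGAL::Exact_predicates_inexact_constructions_kernel K;
typedef CGAL::Polyhedron_3<K>                               Polyhedron;

// Centroid of the polyhedron's vertex points, analogous to the Polygon_2 call above.
K::Point_3 polyhedron_centroid(const Polyhedron& P)
{
    // points_begin()/points_end() iterate over Point_3 objects directly,
    // so Dimension_tag<0> selects the "centroid of a point set" overload.
    return CGAL::centroid(P.points_begin(), P.points_end(),
                          CGAL::Dimension_tag<0>());
}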

Related

Numpy - how do I erase elements of an array if they are found in another array

TLDR: I have two arrays: indices = numpy.arange(9) and another that contains some of the numbers in indices (maybe none at all, maybe [2,4,7]). The output I'd like for this example is [0,1,3,5,6,8]. What method can be used to achieve this?
Edit: I found a method that somewhat works: casting both arrays to a set and taking the difference does give the correct result, but as a set, even if I pass the result to numpy.array().
Edit 2: Casting the result of the subtraction to a list, then passing that to numpy.array(), resolved my issue.
I guess I posted this question a little prematurely, given that I found the solution myself, but maybe it'll be useful to somebody in the future!
You can make use of boolean masking:
indices[~numpy.isin(indices, [2,4,7])]
Explanation:
We use numpy.isin() to find out whether each value of indices exists in the other array, then apply ~ to invert the result, and finally pass this boolean mask to indices.
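Putting it together as a small runnable example (the values [2,4,7] are taken from the question):

import numpy as np

indices = np.arange(9)            # array([0, 1, 2, 3, 4, 5, 6, 7, 8])
found = np.array([2, 4, 7])       # values to drop

mask = np.isin(indices, found)    # True where a value of indices is in found
result = indices[~mask]           # keep only the values NOT in found
print(result)                     # [0 1 3 5 6 8]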

SQL: How to find similar strings in a tuple

I tried to use difflib's get_close_matches on tuple data, but it does not work. I have used difflib on a JSON file before, but couldn't use it with SQL. Expected result: I want to find words similar to the given input, even if there is a spelling mistake. For example, if the input is treeeee, TREEEEE, or Treeea, my program should return the nearest match, i.e. tree, similar to the "Did you mean?" feature in Google. I also tried SELECT * FROM Dictionary WHERE Expression LIKE '%s but the problem persists. Please help me solve this. Thanks in advance.
The SQL functions SOUNDEX and DIFFERENCE look like the closest fit.
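A rough sketch of how that might look, assuming SQL Server (DIFFERENCE is T-SQL) and reusing the Dictionary table and Expression column from the question; 'treeeee' is just the question's example input:

-- DIFFERENCE compares the SOUNDEX codes of two strings and returns 0-4,
-- where 4 means the codes match exactly (e.g. 'treeeee' and 'tree' are both T600).
SELECT *
FROM Dictionary
WHERE DIFFERENCE(Expression, 'treeeee') >= 3
ORDER BY DIFFERENCE(Expression, 'treeeee') DESC;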

Cloudinary stuck on Displacement

I am trying to generate a mockup for t-shirts.
The result I am trying to achieve is the following: https://prnt.sc/kzhjk7
Using Cloudinary, this is the closest result I have been able to produce:
https://res.cloudinary.com/worldwide-buy-llc/image/upload/c_scale,o_0,w_380/a_0,c_scale,l_TemplateSquare,r_0,w_380,x_900,y_190/c_scale,l_TemplateSquare,w_380,x_-310,y_240/c_scale,u_Mockups:Kids_White,w_3623,x_0,y_0/c_scale,l_Mockups:Kids_Whiteover,o_100,w_3623,x_0/v1538036215/TemplateSquare.png
However, it still looks different from the image that I would like to achieve.
I read that I could apply displacement; to that end, I have a displacement map stored at Mockups:Kids_WhiteOver.
Do you know how I can apply it? Also, the colors of the TemplateSquare layers appear weak compared to the target result ( https://prnt.sc/kzhjk7 ).
Any suggestion is very much appreciated, since I am stuck trying to achieve that result. Many thanks in advance!
You can try removing the opacity from the transformation and using the multiply effect.
How about this one: https://res.cloudinary.com/shirly/image/upload/o_0/l_TemplateSquare,w_380,y_300,x_-450/l_TemplateSquare,w_380,y_100,x_650/l_Kids_Whiteover,e_displace,x_10,y_10/u_kids_white,e_multiply/TemplateSquare.png
Let me know if that result can work for you.

Shannon Capacity Formula in AMPL

I want to use the Shannon capacity formula as one of my constraints in a minimization problem. I am not sure how I can use it, as AMPL does not support log2.
In simple form it is: C = B * log2(1 + S/N)
Please guide.
As we know, log2(x) = ln(x)/ln(2).
We can apply this in AMPL, since AMPL supports the natural logarithm.
You can also use the extended function library provided for AMPL at https://ampl.com/resources/extended-function-library/ ; by loading this package you can use a logarithm function as well:
gsl_sf_log(x)
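For illustration, a minimal sketch of the constraint written with AMPL's built-in log(); the names B, SNR, and C are placeholders, and the <= form is just one possible way to use the formula in a minimization problem:

# log() is AMPL's natural logarithm, so log(...)/log(2) gives log2(...)
subject to ShannonCapacity:
    C <= B * log(1 + SNR) / log(2);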

Dynamic function name definition in Julia.. possible?

I have a DataFrame structured as parName|region|year, and an access function getData(parName,reg,year) (I use an access function because I implement my own query logic).
Would it be possible, based on unique(df[:parName]), to dynamically create a set of functions like par1(region,year) "pointing to" getData("par1",region,year) ?
If so, using which approach?
This is a bit the opposite of this question: there it is explained how to dynamically call a function, while I wonder if it is possible to dynamically declare/define one.
EDIT:
I am using this approach in order to get the cleanest and most compact syntax possible in writing multi-dimensional equations.
I managed (thanks to @Liso's answer) to implement it as:
for par in unique(dropna(df[:parName]))
    @eval ($(Symbol("$(par)_"))) = (r,d1,d2="",y=-1,op=sum) -> gd($par,r,d1,d2,y,op)
    @eval ($(Symbol("$(par)!"))) = (v,r,d1,d2="",y=-1) -> sd(v,$par,r,d1,d2,y)
end
i.e., I am using the convention that par!() is a setData-type and par_() is a getData-type equation.
Once I am able to complete the macro that transforms f(dim1,dim2) = value into f(value,dim1,dim2), I will be able to write my model using a very clear LaTeX-like (and AMPL-like) syntax:
@meq price!(tp in secProducts, r in fr2) = sum(price_(r,pp,"",y2)*a_(r,pp,tp,y2) for pp in priProducts) + margin_(r,tp,"",y2)
I am just a beginner trying to understand Julia, so I am not sure if it is a good idea or not!
See https://docs.julialang.org/en/stable/manual/metaprogramming/#Code-Generation-1 .
I was able to adapt that example to this:
julia> for i in 4:6
           @eval ($(Symbol("func$i")))(a) = a^$i
       end
julia> func4(2), func5(2), func6(2)
(16, 32, 64)
Maybe it could help you to play and learn :)