Delaunay_triangulation_2 in CGAL doesn't keep order of input vertices - cgal

I have a set of points and I am building a CGAL::Delaunay_triangulation_2 from them. However, the order of the points in the resulting triangulation is not the same as in the input. For example, if input point 0 is at (-1,-1), output point 0 in the triangulation is not at the same position. The point at (-1,-1) is in the triangulation, but not necessarily as the 0th vertex.
For me it is important to keep the order, because I hold references (as indices) into the original set of points, so vertex number i in the input set and in the output set must be the same point.
Is there any way to make the output set follow the same order as the input set? I don't mind reordering the input set if needed, as I can easily do that before taking the references.

As documented here: "Note that this function is not guaranteed to insert the points following the order of PointInputIterator, as spatial_sort() is used to improve efficiency."
If you insert your points one by one, they will be kept in insertion order (provided there are no duplicates).
See also this example, which shows how to store the input id in the info() field of each vertex (a vector can then be built for direct id -> vertex access).
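The info() approach can be mimicked outside CGAL. The following plain-Python sketch (made-up data, not the CGAL API) shows how carrying the original input index with each vertex gives direct id -> vertex access even after the points have been reordered internally:

```python
# Plain-Python sketch (not CGAL) of the info() idea: each vertex carries the
# original input index, and a lookup table restores direct id -> point access
# even after something like spatial_sort() has reordered the input.
import random

points = [(-1.0, -1.0), (0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]

# Pair each point with its original index, the way
# Triangulation_vertex_base_with_info_2 stores an id in info().
tagged = list(enumerate(points))

# Simulate the reordering done during insertion.
random.shuffle(tagged)

# Rebuild direct access: id_to_vertex[i] is the point inserted with input id i.
id_to_vertex = [None] * len(points)
for info, point in tagged:
    id_to_vertex[info] = point

assert id_to_vertex[0] == (-1.0, -1.0)  # input index 0 is recoverable
```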

Related

Contact pressure representation in Abaqus

The main question concerns extracting the contact pressure from an .odb file.
The issue comes down to three observations:
Imagine that we have a simple 3D contact model in Abaqus/CAE.
1. If we make a plot of CPRESS on a deformed shape in the visualisation module, we get one value of CPRESS for each node. We get the same (one value per node) if we request XY data field output for all frames. This seems fine, because as far as I know Abaqus/CAE averages the surface output (CPRESS) to make it available as nodal output.
2. If we use the "Probe values" tool to examine the CPRESS value at a node, we get four values for one node. This still seems fine, because, I suppose, it shows the values before averaging.
3. If we request the CPRESS value from the command window using this script:
odb.steps['step_name'].frames[frame_number].fieldOutputs['CPRESS'].getSubset(region='node_path').values
the length of this vector of CPRESS values at a single node may be from 1 to 6 depending on the chosen node. And the number of CPRESS values obtained this way has no connection with the number obtained using method 2.
So the trick is that I can't understand how the vector of CPRESS values at a node is formed.
I found very little information about this topic in the Abaqus Manual.
Hope somebody can help.
"Probe values" extracts the CPRESS values for the whole element. It shows the face number and its node IDs together with their corresponding values.
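A hedged illustration of the averaging described in observations 1 and 2 (plain Python with made-up numbers, not the Abaqus API): each node receives one CPRESS contribution per element face touching it, and the nodal output is their average:

```python
# Hypothetical sketch of the averaging described above: "Probe values" shows
# the raw per-face CPRESS contributions at a node, while the contour/XY output
# shows one averaged value per node.
per_face_cpress = {
    # node_id: CPRESS contribution from each element face touching the node
    101: [2.0, 2.4, 1.8, 2.2],   # four values, as in observation 2
    102: [3.0, 3.2],
}

# One value per node, as in observation 1.
averaged = {node: sum(vals) / len(vals) for node, vals in per_face_cpress.items()}

print(averaged[101])
```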

How to obtain a set of closest points surrounding a specific point in a map?

I have a set of points on a map.
For each point (say 'a'), I want to obtain the n (say 5) nearest neighbours that surround that point.
Currently, I build a list of Euclidean distances from 'a' to the other points, then sort the list and choose the n smallest values.
I found that with this approach the n closest points sometimes form a polygon that does not contain 'a'.
Can you suggest a way to choose n points that surround 'a'?
I know how to determine whether a point lies in a polygon. What I cannot decide is which of the closest points to discard when I bring in the (n+1)th neighbour.
Thanks.
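The current approach from the question can be sketched as follows (plain Python; the point set is made up). This is exactly the step that can return n points all lying on one side of 'a':

```python
# Sketch of the approach described in the question: sort all other points by
# Euclidean distance from 'a' and keep the n smallest.
import math

def n_nearest(a, points, n):
    others = [p for p in points if p != a]
    others.sort(key=lambda p: math.dist(a, p))  # sort by distance from 'a'
    return others[:n]

pts = [(0, 0), (1, 0), (2, 0), (3, 0), (0, 1), (5, 5)]
print(n_nearest((0, 0), pts, 3))
```

Note that for the "surrounding" requirement, the three nearest points here do not enclose (0, 0), which illustrates the problem described above.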

Compute Maya Output Attr From Previous Frame's Outputs

Does Maya allow one to compute the output attributes at frame N using the output attributes calculated at frame N-1 as inputs? With the proviso that at (e.g.) frame 0 we don't look at the previous frame but use some sort of initial condition. Negative frames would be calculated by looking forward in time.
e.g. The translate of the ball at frame N is computed to be the translate of the ball at frame N-1, plus 1 cm. At frame zero the ball is given an initial translate of zero.
The DataBlock has a setContext function but the docs appear to forbid using that to do 'timed evaluation'. I could hit the attribute plugs directly and get value with a different time but that would be using inputs outside of the data block.
Is the Maya dependency API essentially timeless - only allowing calculation using the state at the current time? With the only solution to use animation curves which are also essentially timeless (their input state of key frames remaining the same regardless of the time)?
A simple node connection is supposed to be updated on demand, i.e. for the 'current' frame. It's supposed to be ahistorical -- you should be able to jump to a given frame directly and get a complete evaluation of the scene state without history.
If you need offset values you can use a frame cache node to access a different point in the value stream. You connect the attribute you want to lag to the frameCache's 'stream' plug, and then connect either the 'future' or 'past' attribute to the plug on your node. The offset is applied by specifying the index value for the connection, i.e. frameCache1.past[5] is 5 frames behind the value that was fed into the frameCache.
You can also do this in a less performant, but more flexible way by using an expression node. The expression can poll an attribute value at a particular time by calling getAttr() with the -t flag to specify the time. This is much slower to evaluate but lets you apply any arbitrary logic to the time offset you might want.
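Outside Maya, the ball example from the question amounts to a simple recurrence with an initial condition. This plain-Python sketch (not the Maya API) shows why such a rule is inherently historical: evaluating frame N means walking forward from the initial condition, unlike Maya's jump-to-any-frame evaluation:

```python
# The ball example as a recurrence: translate(0) = 0 (initial condition),
# translate(N) = translate(N-1) + 1 (1 cm higher per frame).
# A history-free evaluation would instead compute translate(N) directly
# from N, e.g. translate(N) = N * 1.0 -- which is what an animation curve
# or expression effectively does.
def translate(frame):
    value = 0.0                # initial condition at frame 0
    for _ in range(frame):
        value += 1.0           # 1 cm higher than the previous frame
    return value

print(translate(10))
```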

"Extension" of the first Data Unit

I'm starting to study the FITS format and I'm in the process of reading the Definition of FITS document.
I know that a FITS file can have one or more HDUs, the primary being the first one and the extensions being the following ones (if there is more than one HDU). I also know that for the extensions there is a mandatory header keyword (XTENSION) that tells us whether the Data Unit is an Image, a Binary Table, or an ASCII Table. But how can I tell the data type (Image, Binary Table, or ASCII Table) of the first HDU?
I don't understand why XTENSION isn't a mandatory keyword in the primary header.
The "type" of the PRIMARY HDU is essentially IMAGE in most cases. From v3.0 of the standard:
3.3.2. Primary data array
The primary data array, if present, shall consist of a single data array with from 1 to 999 dimensions (as specified by the NAXIS keyword defined in Sect. 4.4.1). The random groups convention in the primary data array is a more complicated structure and is discussed separately in Sect. 6. The entire array of data values are represented by a continuous stream of bits starting with the first bit of the first data block. Each data value shall consist of a fixed number of bits that is determined by the value of the BITPIX keyword (Sect. 4.4.1). Arrays of more than one dimension shall consist of a sequence such that the index along axis 1 varies most rapidly, that along axis 2 next most rapidly, and those along subsequent axes progressively less rapidly, with that along axis m, where m is the value of NAXIS, varying least rapidly. There is no space or any other special character between the last value on a row or plane and the first value on the next row or plane of a multi-dimensional array. Except for the location of the first element, the array structure is independent of the FITS block structure. This storage order is shown schematically in Fig. 1 and is the same order as in multi-dimensional arrays in the Fortran programming language (ISO 2004). The index count along each axis shall begin with 1 and increment by 1 up to the value of the NAXISn keyword (Sect. 4.4.1).
If the data array does not fill the final data block, the remainder of the data block shall be filled by setting all bits to zero.
The individual data values shall be stored in big-endian byte order such that the byte containing the most significant bits of the value appears first in the FITS file, followed by the remaining bytes, if any, in decreasing order of significance.
Though it isn't until later on (in section 7.1) that it makes this connection:
7.1. Image extension
The FITS image extension is nearly identical in structure to the primary HDU and is used to store an array of data. Multiple image extensions can be used to store any number of arrays in a single FITS file. The first keyword in an image extension shall be XTENSION= ’IMAGE ’.
It isn't immediately apparent what it means by "nearly identical" here. I guess the only difference is that the PRIMARY HDU may also use the aforementioned "random groups" structure, whereas for IMAGE extension HDUs PCOUNT is always 0 and GCOUNT is always 1.
You'll only rarely see the "random groups" convention. This is sort of a precursor to the BINTABLE format. It was used traditionally in radio interferometry data, but hardly at all outside that.
The reason for all this is for backwards compatibility with older versions of FITS that predate even the existence of extension HDUs. Many FITS-based formats don't put any data in the PRIMARY HDU and use the primary header only for metadata keywords that pertain to the entire file (e.g. most HST data).
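The distinction can be sketched in plain Python (hypothetical minimal header cards, not a full FITS parser): a primary header begins with the SIMPLE keyword, while every extension header begins with XTENSION, which is why XTENSION is not (and cannot be) a mandatory keyword in the primary header:

```python
# Sketch: distinguish a primary HDU header from an extension header by its
# first 80-character card. Keyword occupies bytes 1-8; the value here is
# right-justified after "= " as in fixed-format cards.
def card(keyword, value):
    """Build one 80-character FITS header card (simplified)."""
    return f"{keyword:<8}= {value:>20}".ljust(80)

primary_card = card("SIMPLE", "T")
image_ext_card = card("XTENSION", "'IMAGE   '")

def hdu_kind(first_card):
    keyword = first_card[:8].strip()
    if keyword == "SIMPLE":
        return "primary"
    if keyword == "XTENSION":
        # e.g. 'IMAGE   ', 'BINTABLE', 'TABLE   '
        return "extension: " + first_card[10:].strip().strip("'").strip()
    raise ValueError("not a FITS HDU header")

print(hdu_kind(primary_card))     # primary
print(hdu_kind(image_ext_card))   # extension: IMAGE
```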

How to distinguish master data from calculated/interpolated data?

I'm getting a bunch of vectors with data points for a fixed set of time points. Below is an example of a vector with a value per time point:
1D:2
2D:
7D:5
1M:6
6M:6.5
But alas, a value is not available for all of the time points. All vectors are stored in a database, and with a trigger we calculate the missing values by interpolation, or possibly a more advanced algorithm. Somehow I want to be able to tell which data points have been calculated and which were originally delivered to us. Of course I can add a flag column to the table, with a value indicating whether each entry is a master value or a calculated one, but I'm wondering whether there is a more sophisticated way. We probably don't need to make this determination on a regular basis, so CPU cycles are not an issue, either for the determination or for insertion.
The example above shows some nice-looking numbers, but in reality it would look more like 3.1415966533.
The database used for storage is Oracle 10.
cheers.
Could you deactivate the trigger temporarily?
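The flag-column approach mentioned in the question can be sketched in plain Python (hypothetical tenors and values; a real implementation would interpolate over actual day counts rather than vector positions, and the flag would be a column in the Oracle table):

```python
# Sketch of the flag-column idea: missing points are filled by linear
# interpolation between the nearest delivered neighbours and marked
# is_master=False, so delivered and derived values stay distinguishable.
tenors = ["1D", "2D", "7D", "1M", "6M"]
delivered = {"1D": 2.0, "7D": 5.0, "1M": 6.0, "6M": 6.5}  # 2D is missing

rows = []  # (tenor, value, is_master)
for i, tenor in enumerate(tenors):
    if tenor in delivered:
        rows.append((tenor, delivered[tenor], True))     # master value
    else:
        # interpolate linearly between the nearest delivered neighbours
        # (by index position -- a simplification)
        lo = max(j for j in range(i) if tenors[j] in delivered)
        hi = min(j for j in range(i + 1, len(tenors)) if tenors[j] in delivered)
        x0, x1 = delivered[tenors[lo]], delivered[tenors[hi]]
        value = x0 + (x1 - x0) * (i - lo) / (hi - lo)
        rows.append((tenor, value, False))               # calculated value

print(rows)
```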