Proj4js - convert WGS84 to UTM (military grid reference system) - utm

Very new to the proj4js library and most things geospatial, so I'm still getting my head around all the naming conventions and the different types of coordinate systems.
Essentially I'm trying to take a set of WGS84 lat/long coordinates from a polygon and convert them to UTM, in order to get the corresponding zone in the Military Grid Reference System.
I've tried
const coordinates = [ [ [ -1.5321158470515384, 52.34135509678963 ],
[ 0.0777579252987236, 52.310366914514184 ],
[ 0.01125412258311688, 51.324523354307196 ],
[ -1.5638793748853044, 51.354439389788006 ],
[ -1.5321158470515384, 52.34135509678963 ] ] ]
function latLngToMgrsUtmZones(coords) {
try {
coords[0] = coords[0].map((p) =>
proj4(
"WGS84",
`utm`,
p
)
);
console.log('coords', coords);
} catch (err) {
console.log(err);
}
}
latLngToMgrsUtmZones(coordinates);
But this just returns utm unchanged. I thought one simply needed to pass the projection name to proj4js and it would convert it.
How does one go about doing this using this library?

Based on your coordinates list, UTM zone 31 is a reasonable choice (note that the two westernmost points, with negative longitudes, technically fall in zone 30, since zone 31 spans 0°–6°E). The proj4 definition string for UTM zone 31 is:
"+proj=utm +zone=31 +datum=WGS84 +units=m +no_defs"
You can register a new proj4 definition as follows:
// add a definition of a UTM zone
proj4.defs("EPSG:32631","+proj=utm +zone=31 +datum=WGS84 +units=m +no_defs");
Then, the troublesome function can be updated to:
function latLngToMgrsUtmZones(coords) {
try {
coords[0] = coords[0].map((p) =>
proj4(
"WGS84",
"EPSG:32631",
p
)
);
console.log('coords', coords);
} catch (err) {
console.log("ERR: "+err);
}
}
If you run
latLngToMgrsUtmZones(coordinates);
you will get:
coords: [
[
[ 191321.4903300695, 5808679.516222329 ],
[ 300798.59040295583, 5799580.45321534 ],
[ 291767.8812803207, 5690155.850511101 ],
[ 182274.43537953158, 5699133.648491979 ],
[ 191321.4903300695, 5808679.516222329 ]
]
]
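As an aside, the UTM zone number itself can be computed from longitude alone, which is useful when picking the right EPSG code per point. A minimal sketch of the standard formula (utmZone is an illustrative helper, not part of proj4js):

```javascript
// Standard formula for the UTM zone number of a longitude in degrees
// (ignores the Norway/Svalbard exception zones).
function utmZone(lon) {
  return Math.floor((lon + 180) / 6) + 1;
}

// The polygon above straddles two zones:
console.log(utmZone(-1.5321158470515384)); // 30
console.log(utmZone(0.0777579252987236));  // 31
```

This is why the EPSG code used above is 32631 (WGS84 / UTM zone 31N); points west of 0° would strictly belong to zone 30 (EPSG:32630).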


Typesense 'OR' query mode

Description
I have a case where I'd like to search for multiple query tokens in a single collection like:
let searchRequests = {
'searches': [
{
'collection': 'products',
'q': 'shoe hat dress perfume',
}
]
}
Each token returns results if I query it individually, and so does a two-token query such as 'q': 'shoe hat'.
Is there a way to allow for more than two query terms?
Expected Behavior
I expect to have results returned based on my query tokens 'shoe hat dress perfume', or in other words an OR query mode:
{
"results": [
{
"facet_counts": [],
"found": 100,
"hits": [
...
}
]
}
Actual Behavior
The actual behavior is that nothing is found:
{
"results": [
{
"facet_counts": [],
"found": 0,
"hits": [
...
}
]
}
Metadata
Typesense Version: 0.22.0
Typesense doesn't support strict ORs and currently has no plans to do so.
To solve my problem I used filter_by instead, like so:
let searchRequests = {
'searches': [
{
'collection': 'products',
'q': '*',
'filter_by': 'category:[shoe, hat, dress, perfume]'
}
]
}
This would return all products in category shoe/hat/dress/perfume.
More details on usage of filter_by here: https://typesense.org/docs/0.19.0/api/documents.html#index-a-document
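If the category list changes often, the filter string can be generated from an array so the list lives in one place. A small sketch (buildCategoryFilter is a hypothetical helper, shown only to illustrate the expected category:[...] filter syntax):

```javascript
// Build a Typesense filter_by clause of the form "category:[a, b, c]"
function buildCategoryFilter(categories) {
  return `category:[${categories.join(', ')}]`;
}

let searchRequests = {
  searches: [
    {
      collection: 'products',
      q: '*',
      filter_by: buildCategoryFilter(['shoe', 'hat', 'dress', 'perfume']),
    },
  ],
};
console.log(searchRequests.searches[0].filter_by);
// → category:[shoe, hat, dress, perfume]
```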

How to get data from nested array in DataTables?

I used to load data from a JSON file in this format into my DataTables table:
{
"data": [
["n/a","668","01.11.2021 14:16:20", ... ],
["n/a","670","05.11.2021 23:23:54", ...]
]
}
...
"ajax": "first.json",
columns: [
{ data: 0 },
{ data: 1 }
...
And everything was OK.
But now the format of my JSON has changed:
{
"data": {
"deals": [["n/a", "718", "30/11/2021 21:46:14"], ["", "718", "30/11/2021 21:46:14"], ... ],
"stops": [["07/10/2021 21:48:28", "BTCUSDT"], ["07/10/2021 21:48:28", "BTCUSDT"], ... ]
}
}
And when I try to get the data like this, I get "No data available in table":
...
"ajax": "first.json",
columns: [
{ data: 'deals.0' },
{ data: 'deals.1' }
...
How can I get data from the new JSON format into my table?
Here is your new JSON structure, provided by the URL:
{
"data": {
"deals": [
["n/a", "718", "30/11/2021 21:46:14"],
["", "718", "30/11/2021 21:46:14"]
],
"stops": [
["07/10/2021 21:48:28", "BTCUSDT"],
["07/10/2021 21:48:28", "BTCUSDT"]
]
}
}
In this structure, the location of the deals data is data.deals. This location points to an array of arrays, which is what DataTables needs (or an array of objects).
(This means the table will only have access to the deals data, since the stops data is in a separate location entirely. But I will assume you only want the deals data to match your original example.)
You therefore need to use the DataTables dataSrc option to tell DataTables where to look in your new JSON:
<table id="example" class="display" style="width:100%"></table>
and:
$(document).ready(function() {
$('#example').DataTable( {
ajax: {
method: "GET",
url: "first.json", // or whatever URL you want to use
dataSrc: "data.deals"
},
"columns": [
{ "title": "Col 1" },
{ "title": "Col 2" },
{ "title": "Col 3" }
]
} );
} );
Because each row of data is an array, you don't need to specify the specific array indexes in your columns - DataTables will iterate over each row array for you.
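For intuition, a dotted dataSrc string is essentially a property path into the Ajax response. A rough plain-JavaScript equivalent of what DataTables does with it (this is an illustration, not the actual DataTables internals):

```javascript
// Resolve a dotted path like "data.deals" against a response object,
// roughly what DataTables does with a string dataSrc.
function resolveDataSrc(path, response) {
  return path.split('.').reduce((obj, key) => obj[key], response);
}

const response = {
  data: {
    deals: [["n/a", "718", "30/11/2021 21:46:14"]],
    stops: [["07/10/2021 21:48:28", "BTCUSDT"]],
  },
};

console.log(resolveDataSrc('data.deals', response));
// → [["n/a", "718", "30/11/2021 21:46:14"]]
```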
The result is a table populated with the three deals columns.

getting lvalue and rvalue of a declaration

I'm parsing C++ with an ANTLR4 grammar and I have a visitor function, visitDeclarationStatement. When the C++ code I'm parsing contains Person p; (or a declaration of any custom type), I get two almost identical nodes in the tree and I cannot differentiate between the type and the name!
"declarationStatement": [
{
"blockDeclaration": [
{
"simpleDeclaration": [
{
"declSpecifierSeq": [
{
"declSpecifier": [
{
"typeSpecifier": [
{
"trailingTypeSpecifier": [
{
"simpleTypeSpecifier": [
{
"theTypeName": [
{
"className": [
{
"type": 128,
"text": "Person"
}
]
}
]
}
]
}
]
}
]
}
]
},
{
"declSpecifier": [
{
"typeSpecifier": [
{
"trailingTypeSpecifier": [
{
"simpleTypeSpecifier": [
{
"theTypeName": [
{
"className": [
{
"type": 128,
"text": "p"
}
]
}
]
}
]
}
]
}
]
}
]
}
]
},
{
"type": 124,
"text": ";"
}
]
}
I want to be able to get the variable type and the variable name separately. What is the right way of doing that? How can I change the .g4 file so that the results let me differentiate between the type and the name?
Thanks
You don't need to change the grammar. In the case of Person p;, which is matched by:
declSpecifierSeq: declSpecifier+ attributeSpecifierSeq?;
the first child of declSpecifierSeq (which is declSpecifier+) will be a List of declSpecifier-contexts, of which the first is the type and the second the name.
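Applied to the JSON dump of the tree above, picking the two declSpecifier entries apart looks like this (plain JavaScript over the printed JSON, not the ANTLR visitor API):

```javascript
// Walk the printed parse tree: declSpecifierSeq contains a list of
// declSpecifier contexts; the first one is the type, the second the name.
function typeAndName(tree) {
  const declSpecifiers = tree.declarationStatement[0]
    .blockDeclaration[0]
    .simpleDeclaration[0]
    .declSpecifierSeq[0]
    .declSpecifier;
  const text = (d) =>
    d.typeSpecifier[0].trailingTypeSpecifier[0]
      .simpleTypeSpecifier[0].theTypeName[0].className[0].text;
  return { type: text(declSpecifiers[0]), name: text(declSpecifiers[1]) };
}
// For the tree above this returns { type: "Person", name: "p" }.
```

In the actual visitor you would do roughly the same walk through the context objects, indexing into the declSpecifier list of the declSpecifierSeq context.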

FaunaDB get entries by date range with index binding not working

I am struggling to get an Index by Date to work with a Range.
I have this collection called orders:
CreateCollection({name: "orders"})
And I have these sample entries, with one attribute called mydate. As you can see, it is just a string. I do need to keep the date as a string, since my DB already has around 12K records with dates like that, so I can't just start using Date() to create them.
Create(Collection("orders"), {data: {"mydate": "2020-07-10"}})
Create(Collection("orders"), {data: {"mydate": "2020-07-11"}})
Create(Collection("orders"), {data: {"mydate": "2020-07-12"}})
I have created this index, which computes the string into an actual Date object:
CreateIndex({
name: "orders_by_my_date",
source: [
{
collection: Collection("orders"),
fields: {
date: Query(Lambda("order", Date(Select(["data", "mydate"], Var("order"))))),
},
},
],
terms: [
{
binding: "date",
},
],
});
If I try to fetch a single date the index works.
// this works
Paginate(
Match(Index("orders_by_my_date"), Date("2020-07-10"))
);
// ---
{
data: [Ref(Collection("orders"), "278496072502870530")]
}
But when I try to get a Range it never finds data.
// This does NOT work :(
Paginate(
Range(Match(Index("orders_by_my_date")), Date("2020-07-09"), Date("2020-07-15"))
);
// ---
{
data: []
}
Why does the index not work with a Range?
Range operates on the values of an index, not on the terms.
See: https://docs.fauna.com/fauna/current/api/fql/functions/range?lang=javascript
You need to change your index definition to:
CreateIndex({
name: "orders_by_my_date",
source: [
{
collection: Collection("orders"),
fields: {
date: Query(Lambda("order", Date(Select(["data", "mydate"], Var("order"))))),
},
},
],
values: [
{ binding: "date" },
{ field: ["ref"] },
],
})
Then you can get the results that you expect:
> Paginate(Range(Match(Index('orders_by_my_date')), Date('2020-07-11'), Date('2020-07-15')))
{
data: [
[
Date("2020-07-11"),
Ref(Collection("orders"), "278586211497411072")
],
[
Date("2020-07-12"),
Ref(Collection("orders"), "278586213229658624")
],
[
Date("2020-07-13"),
Ref(Collection("orders"), "278586215000703488")
],
[
Date("2020-07-14"),
Ref(Collection("orders"), "278586216887091712")
],
[
Date("2020-07-15"),
Ref(Collection("orders"), "278586218585784832")
]
]
}
Another alternative is to use Filter with a Lambda expression to select the values you want:
Filter(
Paginate(Documents(Collection('orders'))),
Lambda('order',
And(
GTE(Select(['data', 'mydate'], Var('order')), '2020-07-09'),
LTE(Select(['data', 'mydate'], Var('order')), '2020-07-15')
)
)
)
You can update the conditions as you need.
I believe this will work with the strings you already have.
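The Filter approach works on the raw strings because ISO-8601 dates (YYYY-MM-DD) sort lexicographically. The same predicate in plain JavaScript, just to illustrate the idea (this is not FQL):

```javascript
// ISO-8601 date strings compare correctly as plain strings,
// which is why GTE/LTE on "mydate" works without a Date() conversion.
function inRange(mydate, from, to) {
  return mydate >= from && mydate <= to;
}

const orders = ['2020-07-08', '2020-07-10', '2020-07-12', '2020-07-16'];
console.log(orders.filter((d) => inRange(d, '2020-07-09', '2020-07-15')));
// → ['2020-07-10', '2020-07-12']
```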
There are some mistakes here. First of all, you would have to create the documents this way:
Create(Collection("orders"), {data: {"mydate": ToDate("2020-07-10")}})
The index has to be created like this:
CreateIndex(
{
name: "orders_by_my_date",
source: Collection("orders"),
values:[{field:['data','mydate']},{field:['ref']}]
}
)
and finally, you can query your index and range:
Paginate(Range(Match(Index('orders_by_my_date')), [Date("2020-07-09")], [Date("2020-07-15")]))
{ data:
[ [ Date("2020-07-10"),
Ref(Collection("orders"), "278532030954734085") ],
[ Date("2020-07-11"),
Ref(Collection("orders"), "278532033804763655") ],
[ Date("2020-07-12"),
Ref(Collection("orders"), "278532036737630725") ] ] }
or if you want to get the full doc:
Map(Paginate(Range(Match(Index('orders_by_my_date')), [Date("2020-07-09")], [Date("2020-07-15")])), Lambda(['date','ref'], Get(Var('ref'))))
{ data:
[ { ref: Ref(Collection("orders"), "278532030954734085"),
ts: 1601887694290000,
data: { mydate: Date("2020-07-10") } },
{ ref: Ref(Collection("orders"), "278532033804763655"),
ts: 1601887697015000,
data: { mydate: Date("2020-07-11") } },
{ ref: Ref(Collection("orders"), "278532036737630725"),
ts: 1601887699800000,
data: { mydate: Date("2020-07-12") } } ] }

GraphDB Elasticsearch Connector

Is there a working example of mapping lat/long properties from GraphDB to geo_point objects in Elasticsearch?
{
"fieldName": "location",
"propertyChain": [
"http://example.com/coordinates"
],
"objectFields": [
{
"fieldName": "lat",
"propertyChain": [
"http://www.w3.org/2003/01/geo/wgs84_pos#lat"
]
},
{
"fieldName": "lon",
"propertyChain": [
"http://www.w3.org/2003/01/geo/wgs84_pos#long"
]
}
]
}
thanks
The only way to index data as geo_point with the current version of GraphDB and the Elasticsearch connector is to have the latitude and the longitude in a single literal, e.g. with the property http://www.w3.org/2003/01/geo/wgs84_pos#lat_long. The connector would look like this:
PREFIX : <http://www.ontotext.com/connectors/elasticsearch#>
PREFIX inst: <http://www.ontotext.com/connectors/elasticsearch/instance#>
INSERT DATA {
inst:geopoint :createConnector '''
{
"elasticsearchNode": "localhost:9300",
"types": ["http://geopoint.ontotext.com/Point"],
"fields": [
{
"fieldName": "location",
"propertyChain": [
"http://www.w3.org/2003/01/geo/wgs84_pos#lat_long"
],
"datatype": "native:geo_point"
}
]
}
''' .
}
Note that "datatype": "native:geo_point" is important, as it tells Elasticsearch what type of data this is.
We are currently looking into possible ways to introduce support for latitude and longitude coming from separate literals.
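In the meantime, one workaround is to materialise a combined literal from the separate lat/long values so the connector can index it. A sketch as a SPARQL update (the property names are taken from the question; the "lat,lon" literal format is the string form Elasticsearch accepts for geo_point):

```sparql
PREFIX geo: <http://www.w3.org/2003/01/geo/wgs84_pos#>
INSERT {
  ?point geo:lat_long ?latLong .
}
WHERE {
  ?point geo:lat ?lat ;
         geo:long ?long .
  # Elasticsearch accepts a geo_point as a "lat,lon" string
  BIND(CONCAT(STR(?lat), ",", STR(?long)) AS ?latLong)
}
```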