When I try to convert a username to a UUID using the Mojang API, the response is always in the {"username": username, "id": shortened UUID} format. My problem is that I need the player's UUID for another API which requires a non-shortened UUID, and it returns a "malformed UUID" error when I make the request with a short UUID. The basic difference between short and long UUIDs in Minecraft is that the long ones contain dashes while the short ones don't. Is there any way to convert between the two, or another endpoint that returns the dashed form?
Mojang does not provide an endpoint that returns dashed UUIDs; however, there are many answers on how to insert the dashes yourself. See Creating a UUID from a string with no dashes.
You can use StringBuilder#insert(int, char) to insert the dashes where they belong, like this:
// Insert from the highest offset first, so each insertion
// doesn't shift the offsets still to come.
new StringBuilder("005056963AB75FD48BDC59C100314C40")
    .insert(20, '-')
    .insert(16, '-')
    .insert(12, '-')
    .insert(8, '-')
    .toString();
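If you ultimately need a java.util.UUID object rather than a string, a regex-based sketch along the same lines (using the example value from above; it assumes an import of java.util.UUID):

String raw = "005056963AB75FD48BDC59C100314C40";
// Group the 32 hex digits into the 8-4-4-4-12 layout and let
// java.util.UUID parse the dashed result.
UUID uuid = UUID.fromString(raw.replaceFirst(
        "(\\p{XDigit}{8})(\\p{XDigit}{4})(\\p{XDigit}{4})(\\p{XDigit}{4})(\\p{XDigit}{12})",
        "$1-$2-$3-$4-$5"));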
I am using the Google Cloud Vision API to extract text from Aadhaar and PAN cards. How can I get specific user details like name, father's name, and address?
Raw Data
ଭାରତ ସରକାର
Government of India
ଜିତ୍ୟାନନ୍ଦ ଖେମୁକୁ
NITYANANDA KHEMUDU
ପିତା : ସୀତାରାମ ଖେମୁକୁ
Father: Sitaram Khemudu
ଜନ୍ମ ତାରିଖ / DOB : 01.07.1999
ପୁରୁଷ / Male
ମୋ ଆଧାର, ମୋ ପରିଚୟ
I have built 5-6 OCR pipelines to date (Aadhaar, PAN, ITR, driving licence, etc.) using the Google Cloud Vision API. I think you are looking for a response like:
{"pan_card_no":"ECXXXXXX123",
"name":"fshksj"
}
To get such a response you need to build your own logic. Here is some of the logic I can share with you:
Perform OCR on your document using the Google Cloud Vision API and store the response in one array (Google returns the text line by line).
For example, in the case above, if you want to grab the DOB first, you can build logic like: if "DOB" is in a list item, then grab the numeric values from it.
To get the name, drop the unnecessary items from the list using conditions like (if "India" in i) or (if i.isdigit()); likewise you can drop the other unnecessary items from the main list until the name remains.
To grab the address: 95% of the time the address ends with a pincode, so treat the pincode as the last index of the address, look for an "Address"-like keyword, and add all the elements from the keyword's index to the pincode's index (this is easy to do with a list). To validate the pincode you can use a library like Pyzipin.
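A minimal, hypothetical Python sketch of the filtering logic described above; the sample lines, noise words, and regexes are illustrative assumptions, not the exact rules of a real pipeline:

import re

lines = [
    "Government of India",
    "NITYANANDA KHEMUDU",
    "Father: Sitaram Khemudu",
    "DOB : 01.07.1999",
    "Male",
]

def extract_dob(items):
    # DOB logic: if "DOB" appears in an item, grab the numeric value.
    for item in items:
        if "DOB" in item:
            match = re.search(r"\d{2}\.\d{2}\.\d{4}", item)
            if match:
                return match.group()
    return None

def extract_name(items):
    # Name logic: drop items containing known noise words or digits;
    # the first survivor is taken as the name.
    noise = ("India", "Father", "DOB", "Male", "Female")
    survivors = [i for i in items
                 if not any(w in i for w in noise)
                 and not any(ch.isdigit() for ch in i)]
    return survivors[0] if survivors else None

def extract_address(items):
    # Address logic: join everything from an "Address"-like keyword up to
    # the item holding the 6-digit pincode (treated as the last index).
    start = next((n for n, i in enumerate(items) if "Address" in i), None)
    end = next((n for n, i in enumerate(items) if re.search(r"\b\d{6}\b", i)), None)
    if start is not None and end is not None and start <= end:
        return " ".join(items[start:end + 1])
    return None

print(extract_dob(lines))      # 01.07.1999
print(extract_name(lines))     # NITYANANDA KHEMUDU
print(extract_address(lines))  # None (no address block in this sample)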
There are multiple conditions you can use; the above are very basic ones. If you need any specific logic, you can ask me.
I want to extract 4 values out of one field, called msg, from a Splunk query; the msg is of the form:
msg: "Service call successful k1=v1 k2=v2 k3=v3 k4=v4 k5=v5 something else can be ignored"
The keys are always static but the values are not; for instance, v2 could be XXX or XXYYZZ, and the possible values for v3 have unpredictable length.
I ran a query to get some sample results and hoped to use the Field Extractor to generate a regex, but the generated regex can't get all the values out; I guess that's because the values don't all have the same length?
Do I need to change my logging format by separating each key=value pair with a comma? Or am I not using the Field Extractor correctly?
[Update 1]: A few sample records:
msg:Service call successful k1=XXX k2=BBBB k3=Something I made up k4=YYYNNN k5=do not need to retrieve this value
msg:Service call successful k1=SSSSSS k2=AAA k3=This could contain space and comma, like this one k4=YYYNNM k5=can be ignored
I could change the logging format if that makes it easier to query and extract fields. Would adding a separator like a dot or a pipe help?
Normally Splunk will pull key-value pairs out automatically
However, when it doesn't, go try your regular expression(s) on regex101 - the field extractor is often a good[ish] start, but rarely creates efficient (or complete) regular expressions
An inline version of this would be as follows (presuming the "value" half of the key-value pair is contiguous characters):
| rex field=_raw "k1=(?<k1>\S+)\s+k2=(?<k2>\S+)\s+k3=(?<k3>\S+)\s+k4=(?<k4>\S+)\s+k5=(?<k5>\S+)"
Normally I prefer to do sequential rex calls, in case something's out of order or missing, but if your data's consistent, this will work
Once you have it the way you want it, update your props.conf and transforms.conf as appropriate for the sourcetype
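For example, a minimal props.conf sketch; the sourcetype name is an assumption, and a simple EXTRACT- entry like this doesn't need a matching transforms.conf stanza at all:

# props.conf -- sourcetype name is hypothetical
[your_sourcetype]
# Mirrors the rex above; k3/k5 use .+ since those values may contain spaces
EXTRACT-msg_kv = k1=(?<k1>\S+)\s+k2=(?<k2>\S+)\s+k3=(?<k3>.+)\s+k4=(?<k4>\S+)\s+k5=(?<k5>.+)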
EDIT for updated sample data / comment response:
...
| rex field=_raw "k3=(?<k3>.+)\s+k4="
| rex field=_raw "k4=(?<k4>.+)\s+k5="
...
I have a PostgreSQL RPC that aims to select filtered rows of a view.
This RPC requires some parameters (name_article, catg_article, color_article, etc).
Most of these parameters are int[]/bigint[] because I want the user to be able to request "all blue articles or all red articles", etc. But I also want the user to be able to post empty parameters, meaning he doesn't care about the color or category, so the request returns all possibilities.
The problem is that, from what I've seen in many topics on the Internet, ANY() or IN() can't be empty, which I'd like to allow; otherwise my filter system would have to handle every combination, and I really don't want to cry.
This is what I've read on the Internet to try (param is null or in()/any()), but it doesn't work; it returns no articles at all. (The first WHERE clause is fine. Also, don't pay attention to the cast: catg_et_type is JSON, so I have to cast id_catgarticle from that JSON to a bigint, and that part works fine.)
SELECT *
FROM dev.get_all_articles
WHERE get_all_articles.lib_article ILIKE '%' || $1 || '%'
AND ($2 is null or CAST(get_all_articles.catg_et_type->>'id_catgarticle' AS BIGINT) = any ($2));
Do you have any idea how I could allow empty arrays to be processed by the IN/ANY conditions?
Thanks a lot.
Problem solved, as mentioned in my reply to #LaurenceIsla's answer on this topic.
When sending an array parameter to a PostgREST API endpoint, the syntax is /rpc/endpoint?param={1,2,3}. So in order to make the request understand an empty param in the URL (endpoint?param={}), I had to add this to the WHERE clause: OR $2 = '{}'. That's all. A tricky syntax when you don't know it.
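For reference, a sketch of the query from the question with that empty-array guard added:

SELECT *
FROM dev.get_all_articles
WHERE get_all_articles.lib_article ILIKE '%' || $1 || '%'
  AND ($2 IS NULL
       OR $2 = '{}'  -- matches the empty URL parameter: endpoint?param={}
       OR CAST(get_all_articles.catg_et_type->>'id_catgarticle' AS BIGINT) = ANY ($2));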
I am using an OUT parameter in ORDS to send a response which is already JSON and stored as a CLOB in the DB. When I send it out in the response, ORDS adds a lot of backslashes to it.
Can someone help me understand how I can remove the escape characters here (all the backslashes)? I tried a different approach explained by Jeff in another thread, using an alias for the JSON key, but it did not work for me.
Here I have set the response as an OUT parameter in my code.
e.g. payload:
{
"response": "{\n\"Order\":{\n\"OriginalShipmentID\":1\n,\"orderNumber\":1\n,\"orgID\":1\n,\"orderShipmentNumber\":1\n,\"oracleShipSet\":\"1\"\n,\"OeHeaderId\":1\n,\"GsHeaderId\":1\n,\"CustomerPoNumber\":\"1\"\n,\"PaymentTerms\":1\n,\"FreightTerms\":\"PAID\"\n,\"CurrencyCode\":\"USD\"\n,\"BillToSiteUseID\":1\n,\"ShipToSiteUseID\":1\n,\"SalesChannel\":\"DIRECT\"\n,\"HeaderKeyCode\":\"1 1\"\n,\"OrigSysDocumentRef\":\"1\"\n,\"OrderTypeCode\":\"XYZ yXYZ\"\n,\"OrderTypeID\":1\n,\"SalesRepID\":-3\n,\"IsAtgOrder\":\"N\"\n,\"OrderSource\":\"MCP\"\n,\"OdiDocSet\":\"PO\"\n,\"CsServiceSymbol\":\"XYZ\"\n,\"CsServiceFriendlyName\":\"XYZ\"\n,\"CsShipper\":\"NDC_MX\"\n,\"GsShipToAddressID\":1\n,\"GsBillFrtAddressID\":-1\n,\"BillingMethod\":\"DoNotBillFreight\"\n,\"ReasonForNoInvoice\":\"NA\"\n,\"ShippedBy\":\"NA\"\n,\"ManifestID\":1771748\n,\"IsVoided\":\"N\"\n,\"CustomerNumber\":\"1\"\n,\"OrderLines\":[\n]\n,\"HeaderShortNotes\":{\n}\n}\n}\n"
}
Some more details:
SELECT get_json ( :ordernumber,
:warehouseid,
:shipset,
:ordershipmentno,
:gsshipmentid,
:countrycode,
:shipperid,
:fromcurrency,
:tocurrency,
:groupid,
:optionvalues,
:linehaul,
:servicename,
:sigid,
:lineid,
:customerid) "{}Order"
INTO l_response_clob
FROM DUAL;
This function returns a CLOB with JSON-formatted data, and l_response_clob is defined as a STRING output parameter in ORDS.
Essentially, I want to stop ORDS from JSON-encoding what is already JSON.
I've been banging my head on this for some time but can't seem to make it work.
Thanks for any help I can get here.
I have a small index with ~1000 documents with only two fields:
- id (string)
- content (text_general)
I noticed that when I do an MLT search by id for similar content, the original document (whose id is the searched id) has a score of 5.241327.
There is a 1:1 duplicated document, and for the duplicated content it returns score = 1.5258181. Why? Why is it not 5.241327 when it is a 100% duplicate?
Another question: is there any way to get similar documents by content, by passing some text in the query?
Example:
/mlt/?q=content:Some encoded long text&mlt.fl=content
I am trying to check whether similar content has already been uploaded, and the check must be performed at new-content upload time.
It might be worth trying some different parameters. I also use MLT on only one field; I use the following parameters:
'mlt.boost': 'true',
'mlt.fl': 'my_field_name',
'mlt.maxqt': 1000,
'mlt.mindf': '0',
'mlt.mintf': '0',
'qt': 'mlt',
'rows': '10'
See http://wiki.apache.org/solr/MoreLikeThis for an explanation of the parameters. I think with a small index mindf might be important, and I see the default mintf (minimum term frequency) is 2; since an ID is only one term, it is probably being ignored!
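For example, a request with those parameters might look like this (the handler path and document id are made up for illustration):

/mlt?q=id:12345&mlt.fl=my_field_name&mlt.mintf=0&mlt.mindf=0&mlt.maxqt=1000&mlt.boost=true&rows=10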
First, how does Solr More-Like-This work?
A regular Solr query is conducted (e.g. ?q=content:Some encoded long text&...).
For each document returned by that query, More-Like-This conducts a "more like this" query...
So the first result set, "response", is just like any Solr query result set.
The More-Like-This section appears below it and starts with something like this (JSON format):
"moreLikeThis":{
"57375":{"numFound":18155,"start":0,"docs":["
For an explanation of the More Like This algorithm, please read these:
http://blog.brattland.no/node/18
http://cephas.net/blog/2008/03/30/how-morelikethis-works-in-lucene/
If you haven't solved the problem yet, please let me know and I will guide you through it.