How to index CouchDB in Hyperledger Fabric - SAP

I'm trying to get the result sorted by the posting date, for which I have defined an index:
{
  "index": {
    "fields": [{
      "PostingDate": "DESC"
    }]
  },
  "ddoc": "indexPostingDate",
  "name": "indexPostingDate",
  "type": "json"
}
According to the Hyperledger Fabric documentation, I have to place this in META-INF/statedb/couchdb/indexes.
With the SAP Hyperledger Fabric platform I tried placing this folder both in the vendor folder and directly under the src folder, but neither worked for me.
This is my code for querying the result:
queryString := fmt.Sprintf("{\"selector\":{\"id\":\"%s\"},\"sort\": [{\"PostingDate\": \"desc\"}],\"use_index\": \"indexPostingDate\"}",
id)
logger.Infof("Getting data for %s", id)
oResultsIterator, responseMetaData, oFetchErr := stub.GetQueryResultWithPagination(queryString,
pageSize, bookmark)
On invoking this function, I get the following error:
{
  "error": {
    "message": "GET_QUERY_RESULT failed: transaction ID: 9b7c42c09069855758ccdfbc30f5d75d38b51569a78e101584051e0fa142ebc3: error handling CouchDB request. Error:no_usable_index, Status Code:400, Reason:No index exists for this sort, try indexing by the sort fields.",
    "code": "CustomError",
    "status": 500
  }
}
How do I resolve this?

It's a known limitation that you can't specify the sort direction in the query with Fabric. If you need an ordered search, your index creation should specify the direction, so you can simply exclude it from your query.
Now that we support and recommend the official CouchDB 3.1 image, this may work, but I would need to test it.
By the way, you can simplify your query by using backticks, so you don't have to escape the quotes; this makes it easier to read and write:
fmt.Sprintf(`{"selector":{"id":"%s"},"sort": [{"PostingDate": "desc"}],"use_index": "indexPostingDate"}`, id)
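Following that advice, a minimal sketch of what the index/query pair might look like (untested against the SAP platform; the direction is baked into the index and the sort clause lists only the field name; the ["_design/...", "..."] form of use_index is plain CouchDB syntax):
{
  "index": {
    "fields": [{"PostingDate": "desc"}]
  },
  "ddoc": "indexPostingDate",
  "name": "indexPostingDate",
  "type": "json"
}
queryString := fmt.Sprintf(`{"selector":{"id":"%s"},"sort":["PostingDate"],"use_index":["_design/indexPostingDate","indexPostingDate"]}`, id)
As for placement, the Fabric docs expect the index file next to the chaincode source, e.g. src/mychaincode/META-INF/statedb/couchdb/indexes/indexPostingDate.json (mychaincode is a placeholder); how the SAP platform maps that layout is something I can't verify.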

Related

Wit.ai response for API requests

I'm using Wit.ai for a bot and I think it's amazing. However, I must provide the customer with screens in my web app to train and manage the app, and here I found a big problem (or maybe I'm just lost). The documentation of the REST API is not enough to design a client that acts like the Wit console (not even close). It's like a tutorial on which endpoints you can hit and an overview of the parameters, but no clean explanation of the structure of the response.
For example, there is no endpoint to get the insights edge. Also, and most importantly, there is no clear documentation of the response structure when hitting the message endpoints (i.e. the structure of the returned entities: are they prebuilt or not, and if they are, is the value a string, an object, or an array, and what the object might contain [e.g. datetime]). There is also the problem of the deprecated guide versus the new guide (the new guide should be done and complete by now). I'm building parts of the code based on my testing. Sometimes when I test something new (like adding a range in the datetime entity instead of just a value), I get an error when I try to set the values for the user, since I haven't parsed the response right, and the new info I get sometimes makes me modify the DB structure at my end.
So, the bottom line: is there a complete reference from which I can implement a complete client in my web app (my web app is in Java, by the way, and I couldn't find a client library that handles the latest version of the API)? Again, the tool is AWESOME but the documentation is not enough, or maybe I'm missing something.
The documentation is not enough, of course, but I think it's pretty straightforward. From what I read, the response structure is documented under "Return the meaning of a sentence".
The response is in JSON format, so you need to decode it first.
Example Request:
$ curl -XGET 'https://api.wit.ai/message?v=20170307&q=how%20many%20people%20between%20Tuesday%20and%20Friday' \
-H "Authorization: Bearer $TOKEN"
Example Response:
{
  "msg_id": "387b8515-0c1d-42a9-aa80-e68b66b66c27",
  "_text": "how many people between Tuesday and Friday",
  "entities": {
    "metric": [{
      "metadata": "{'code': 324}",
      "value": "metric_visitor",
      "confidence": 0.9231
    }],
    "datetime": [{
      "value": {
        "from": "2014-07-01T00:00:00.000-07:00",
        "to": "2014-07-02T00:00:00.000-07:00"
      },
      "confidence": 1
    }, {
      "value": {
        "from": "2014-07-04T00:00:00.000-07:00",
        "to": "2014-07-05T00:00:00.000-07:00"
      },
      "confidence": 1
    }]
  }
}
You can read more about the response structure under "Return the meaning of a sentence".
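Since the advice boils down to "decode the JSON", here is a minimal sketch of parsing that response in Go (the struct names are mine; json.RawMessage is used for value because, as the question notes, it can be a string or an object depending on the entity):
package main

import (
	"encoding/json"
	"fmt"
)

// Shapes inferred from the example response above; names are illustrative.
type witEntity struct {
	Metadata   string          `json:"metadata,omitempty"`
	Value      json.RawMessage `json:"value"` // string for "metric", object for "datetime"
	Confidence float64         `json:"confidence"`
}

type witMessage struct {
	MsgID    string                 `json:"msg_id"`
	Text     string                 `json:"_text"`
	Entities map[string][]witEntity `json:"entities"`
}

func main() {
	body := []byte(`{"msg_id":"387b8515","_text":"how many people between Tuesday and Friday","entities":{"datetime":[{"value":{"from":"2014-07-01T00:00:00.000-07:00","to":"2014-07-02T00:00:00.000-07:00"},"confidence":1}]}}`)

	var m witMessage
	if err := json.Unmarshal(body, &m); err != nil {
		panic(err)
	}
	// Walk every entity and print its raw value; a real client would
	// switch on the entity name before deciding how to parse Value.
	for name, hits := range m.Entities {
		for _, e := range hits {
			fmt.Printf("entity=%s value=%s confidence=%.2f\n", name, e.Value, e.Confidence)
		}
	}
}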

SoftLayer API: Does flavor-based VSI ordering support specifying image_id?

We want to programmatically order a VSI using a flavor (for example, the Balanced type); however, instead of using the standard os_code, we want the VSI to be created from a public image template (i.e. CentOS7-ChangeStable). From the following doc it seems to be possible.
http://softlayer-python.readthedocs.io/en/latest/_modules/SoftLayer/managers/vs.html
However I tried but got the following error:
SoftLayer.exceptions.SoftLayerAPIError: SoftLayerAPIError(SoftLayer_Exception_InvalidValue): Invalid value provided for 'blockDevices'. Block devices may not be provided when using an image template.
Using slcli is failing as well with a different error:
# slcli vs create --hostname testvsi --domain vmonic.local --flavor BL2_4X8X100 --image 1cc8be72-f230-4ab9-b4b2-329c3e747853 --datacenter tok02 --private
This action will incur charges on your account. Continue? [y/N]: y
SoftLayerAPIError(SoftLayer_Exception_Public): Order is missing the following category: Operating System.
Please advise whether using "image_id" with "flavor" is supported in the SL API / Python API. Thanks!
This is an issue with the API. The Python client uses the http://sldn.softlayer.com/reference/services/softlayer_virtual_guest/createObject method to create the VSI; using REST, the same request would be something like this:
POST: https://$USERNAME:$APIKEY@api.softlayer.com/rest/v3.1/SoftLayer_Virtual_Guest/createObject
Payload:
{
  "parameters": [{
    "datacenter": {
      "name": "tok02"
    },
    "domain": "softlayer.local",
    "hourlyBillingFlag": true,
    "blockDeviceTemplateGroup": {
      "globalIdentifier": "1cc8be72-f230-4ab9-b4b2-329c3e747853"
    },
    "hostname": "rcabflav",
    "privateNetworkOnlyFlag": true,
    "supplementalCreateObjectOptions": {
      "flavorKeyName": "BL2_4X8X100"
    }
  }]
}
and you will get the same error. I reported this error to SoftLayer; if you want, you can submit a ticket with SoftLayer and report it as well.

Checking a SQL Azure Database is available from C# code

I do an upscale on an Azure SQL Database with code like this:
ALTER DATABASE [MYDATABASE] Modify (SERVICE_OBJECTIVE = 'S1');
How is it possible to know from C# code when Azure has completed the job and the database is available again?
Checking the SERVICE_OBJECTIVE value is not enough; the process still continues after it changes.
Instead of performing this task in T-SQL, I would perform it from C# using a call to the REST API; you can find all of the details on MSDN.
Specifically, you should look at the Get Create or Update Database Status API method, which allows you to call the following URL:
GET https://management.azure.com/subscriptions/{subscription-id}/resourceGroups/{resource-group-name}/providers/Microsoft.Sql/servers/{server-name}/databases/{database-name}/operationResults/{operation-id}?api-version={api-version}
The JSON body allows you to pass the following parameters:
{
  "id": "{uri-of-database}",
  "name": "{database-name}",
  "type": "{database-type}",
  "location": "{server-location}",
  "tags": {
    "{tag-key}": "{tag-value}",
    ...
  },
  "properties": {
    "databaseId": "{database-id}",
    "edition": "{database-edition}",
    "status": "{database-status}",
    "serviceLevelObjective": "{performance-level}",
    "collation": "{collation-name}",
    "maxSizeBytes": {max-database-size},
    "creationDate": "{date-create}",
    "currentServiceLevelObjectiveId": "{current-service-id}",
    "requestedServiceObjectiveId": "{requested-service-id}",
    "defaultSecondaryLocation": "{secondary-server-location}"
  }
}
In the properties section, the serviceLevelObjective property is the one you can use to resize the database. To finish off, you can then perform a GET on the Get Database API method and compare the currentServiceLevelObjectiveId and requestedServiceObjectiveId properties to ensure your command has been successful.
Note: Don't forget to pass all of the common parameters required to make API calls in Azure.
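To make that flow concrete, here is a rough sketch of the polling loop (shown in Go for brevity; the same logic ports directly to C# with HttpClient). The property names come from the response body above; the URL placeholders and token handling are assumptions:
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
	"time"
)

// Only the two properties the comparison needs, per the answer above.
type database struct {
	Properties struct {
		CurrentServiceLevelObjectiveID string `json:"currentServiceLevelObjectiveId"`
		RequestedServiceObjectiveID    string `json:"requestedServiceObjectiveId"`
	} `json:"properties"`
}

// waitForScale polls the Get Database endpoint until the requested
// service objective has been applied.
func waitForScale(url, token string) error {
	for {
		req, err := http.NewRequest("GET", url, nil)
		if err != nil {
			return err
		}
		req.Header.Set("Authorization", "Bearer "+token)
		resp, err := http.DefaultClient.Do(req)
		if err != nil {
			return err
		}
		var db database
		err = json.NewDecoder(resp.Body).Decode(&db)
		resp.Body.Close()
		if err != nil {
			return err
		}
		if db.Properties.CurrentServiceLevelObjectiveID == db.Properties.RequestedServiceObjectiveID {
			return nil // the scale operation has completed
		}
		fmt.Println("still scaling; retrying in 10 seconds")
		time.Sleep(10 * time.Second)
	}
}

func main() {
	// Placeholders; fill in your subscription, resource group, server, and database.
	url := "https://management.azure.com/subscriptions/{subscription-id}/resourceGroups/{resource-group-name}/providers/Microsoft.Sql/servers/{server-name}/databases/{database-name}?api-version={api-version}"
	if err := waitForScale(url, "{bearer-token}"); err != nil {
		panic(err)
	}
}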

BigQuery Command Line Tool: get error details

One of my jobs keeps failing, and when I looked into why (by requesting the job details), I got the following output:
status": {
"errorResult": {
"location": "gs://sf_auto/Datastore Mapper modules.models.userData/15716706166748C8426AD/output-46",
"message": "JSON table encountered too many errors, giving up. Rows: 1; errors: 1.",
"reason": "invalid"
},
"errors": [
{
"location": "gs://sf_auto/Datastore Mapper modules.models.userData/15716706166748C8426AD/output-46",
"message": "JSON table encountered too many errors, giving up. Rows: 1; errors: 1.",
"reason": "invalid"
}
],
"state": "DONE"
Problem is, it doesn't help at all, and I need more details. Is there any way to understand which column or attribute caused the failures? Is there any way to get more information?
Edit: Additional details
We're running a MapReduce job on App Engine to transfer our datastore from App Engine to BigQuery
The files are stored on Google Cloud Storage
It's creating a brand new table instead of adding to an existing one
Update #2
I played around with the query, trying lots of things as well as adjusting the schema, and I've narrowed down the problem to the uuid. For some reason this type of data messes everything up:
"uuid": "XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX"
The schema defines it as a String
OK, after loads of debugging I found the error... in the newline-delimited JSON file we had two attributes that were similar:
uuid: "XXX..."
uuId: "XXX..."
This has been there for a while, so I think some change within BigQuery started to require that keys be unique regardless of capitalization. Will test some more and confirm!
A recent change made loading of JSON data case-insensitive in field names, to be consistent with how SQL queries treat field names. I have opened a work item to track improving the error message for this case.

Error importing data to BQ

I updated our BigQuery client to the new Google API client, and suddenly I started seeing this error when uploading via JSON:
"errors": [
{
"reason": "invalid",
"location": "Offset:0 / Line:1 / Column:159 / Field:q1",
"message": "Could not convert value to string"
},
Job reference:
"jobReference": {
"projectId": "dot-metrics",
"jobId": "job_8e0511a40c1845cca5717daf78b605f7"
},
This worked before we updated our client; afterwards it just stopped working, so it must be some change inside BigQuery. Any help is appreciated!
This looks like a regression in a recent release that broke importing null values in JSON. A fix should be forthcoming.
Note that if you drop the null fields (i.e. instead of "field": null you just don't include "field" at all), it should continue to work.
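Concretely, using the q1 field from the error above (the other field name is made up for illustration), a row like the first line fails on the affected release, while the second line, with the null field omitted, should load:
{"q1": null, "answered": true}
{"answered": true}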