Add Smart Checklist to a Jira task from the API

I am trying to create a Jira user story from PowerShell using the REST API.
I want to add a Smart Checklist to the task when it's created.
I send the following JSON to the endpoint:
{
    "fields":
    {
        "project":
        {
            "key": "dep"
        },
        "issuetype":
        {
            "name": "Story"
        },
        "summary": "Order: 5, Tasktype: New_table",
        "description": "Auto created Jira task for a data task\n ",
        "assignee":
        {
            "key": "da",
            "name": "da",
            "emailAddress": "da@mycompany.com"
        },
        "labels": ["DATA"],
        "customfield_10001": "DWH-62",
        "customfield_10006": 0,
        "--data":                          /* here I try to add the checklist */
        {
            "- ToDo \n+ Checked\nx Skipped\n~ In Progress\n"
        }
    }
}
But it won't work.
The error response is:
Invoke-RestMethod : The remote server returned an error: (400) Bad Request.
At S:\XXX\XXX1\SIJ\scripts\jira-test.ps1:49 char:1
+ Invoke-RestMethod -uri $restapiuri -Headers $headers -Method POST - ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidOperation: (System.Net.HttpWebRequest:HttpWebRequest) [Invoke-RestMethod], WebException
    + FullyQualifiedErrorId : WebCmdletWebResponseException,Microsoft.PowerShell.Commands.InvokeRestMethodCommand
Any suggestions on how to achieve this?

I had a similar problem. Although the Atlassian documentation indicates that you can create a Smart Checklist when you create an issue, I could not get it to work. What I ended up doing was creating the issue first, then making a second call to add the checklist using the checklist API, something like this:
await Client.RestClient.ExecuteRequestAsync(RestSharp.Method.PUT, $"{JiraUri}/rest/api/2/issue/{issue}/properties/com.railsware.SmartChecklist.checklist", "\"- Item 1\\n- Item 2\\n- Item 3\\n\"");
I found the documentation here: https://railsware.atlassian.net/wiki/spaces/CHK/pages/92176610/Jira+REST+API.+Read+Write+checklists
and it worked for me.
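For anyone doing the same from PowerShell, as in the question, here is a minimal sketch of that two-step approach. It assumes $jiraUri points at your Jira instance and $headers already carries working authentication; the field values are taken from the question, and the Smart Checklist property key is the one from the Railsware documentation above.

# Step 1: create the issue first, without any checklist data in the body.
$issueBody = @{
    fields = @{
        project   = @{ key = "dep" }
        issuetype = @{ name = "Story" }
        summary   = "Order: 5, Tasktype: New_table"
        labels    = @("DATA")
    }
} | ConvertTo-Json -Depth 5
$issue = Invoke-RestMethod -Uri "$jiraUri/rest/api/2/issue" -Headers $headers -Method POST -ContentType "application/json" -Body $issueBody

# Step 2: PUT the checklist as an entity property on the new issue.
# The property value is itself a JSON string, hence the quotes inside quotes.
$checklist = '"- ToDo\n+ Checked\nx Skipped\n~ In Progress\n"'
Invoke-RestMethod -Uri "$jiraUri/rest/api/2/issue/$($issue.key)/properties/com.railsware.SmartChecklist.checklist" -Headers $headers -Method PUT -ContentType "application/json" -Body $checklist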

Related

Withings API Body Sample

I'm trying to get data back from this Withings endpoint: https://developer.withings.com/api-reference/#operation/measure-getmeas
But every combination of things I've tried simply returns:
status body error
------ ---- -----
   503      Invalid Params
This is the most recent body that isn't working: action=getmeas&meastype=meastype&meastypes=11&category=1&startdate=1641168000&enddate=1641254399
For reference: https://developer.withings.com/api-reference/#operation/measure-getmeas
Based on what you posted, the problem is the parameter meastype=meastype. If you remove it, the call should run fine.
Assuming you have followed the procedure to get an access token, your call from PowerShell would look like this:
Invoke-RestMethod -Method 'Post' -Headers @{ "Authorization" = "Bearer XXXXXXXXXXXXXXXXXX" } -Body "action=getmeas&meastypes=11&category=1&startdate=1641168000&enddate=1641254399" -Uri 'https://wbsapi.withings.net/measure'
This will return a JSON structure as per the docs you link to in the question, e.g.:
{
    "status": 0,
    "body": {
        "updatetime": "string",
        "timezone": "string",
        "measuregrps": [
            {
                "grpid": 12,
                "attrib": 1,
                "date": 1594245600,
                "created": 1594246600,
                "category": 1594257200,
                "deviceid": "892359876fd8805ac45bab078c4828692f0276b1",
                "measures": [
                    {
                        "value": 65750,
                        "type": 1,
                        "unit": -3,
                        "algo": 3425,
                        "fm": 1,
                        "fw": 1000
                    }
                ],
                "comment": "A measurement comment"
            }
        ],
        "more": 0,
        "offset": 0
    }
}
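A note on reading the measures, since it is easy to trip over: per the same Withings docs, the real value is the value multiplied by 10 to the power of unit, so the sample above decodes as 65750 x 10^-3 = 65.75 (kilograms, since type 1 is weight). A quick sketch, assuming $response holds the parsed result of the Invoke-RestMethod call above:

# Decode the first measure: real value = value * 10^unit.
$m = $response.body.measuregrps[0].measures[0]
$real = $m.value * [math]::Pow(10, $m.unit)   # 65750 * 10^-3 = 65.75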
If your "measuregrps" is empty (like mine is below) then it means there is no data available for the time period you selected so either your device doesn't record that parameter or the data has not been synchronised to your Withings account.
What I get when I run it (my device doesn't record HR):
status body
------ ----
0 @{updatetime=1641470158; timezone=Europe/London; measuregrps=System.Object[]}
Another option is to use Windows Subsystem for Linux to run curl commands. You essentially get the same thing:
curl --header "Authorization: Bearer XXXXXXXXXXXXXXXXXXXXXX" --data "action=getmeas&meastype=11&category=1&startdate=1609925332&enddate=1641461360" 'https://wbsapi.withings.net/measure'
gives
{
    "status": 0,
    "body": {
        "updatetime": 1641470640,
        "timezone": "Europe\/London",
        "measuregrps": []
    }
}
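As an aside, the startdate/enddate values are Unix epoch seconds. If you need to compute them for your own range, a quick way in PowerShell (the dates here reproduce the values used in the question):

# Unix epoch seconds for 2022-01-03 00:00:00 UTC through 23:59:59 UTC.
$start = [DateTimeOffset]::Parse("2022-01-03T00:00:00Z").ToUnixTimeSeconds()   # 1641168000
$end   = [DateTimeOffset]::Parse("2022-01-03T23:59:59Z").ToUnixTimeSeconds()   # 1641254399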

How do I create an event using the SocialTables API?

I'm trying to use the /4.0/legacyvm3/teams/{team}/events endpoint to create an event. I'm running into some trouble with spaces.
I used the /4.0/legacyvm3/teams/{team}/venues endpoint to get a list of venues. I chose one to include in the spaces section and posted this:
{
    "name": "Event via API Test 04",
    "category": "athletic event",
    "public": true,
    "attendee_management": true,
    "start_time": "2017-04-05T16:13:54.217Z",
    "end_time": "2017-04-05T16:13:54.217Z",
    "uses_metric": false,
    "venue_mapper_version": 0,
    "spaces": [
        {
            "venue_id": 128379,
            "name": "Snurrrggggg"
        }
    ]
}
The endpoint returns a 400 code and this error:
{
    "code": 400,
    "message": "Cannot read property 'toLowerCase' of undefined"
}
I tried including the wizard section, but each time it would return this error:
{
    "message": "Access Denied to this feature"
}
After some experimentation, this body succeeded:
{
    "name": "Event via API Test 03",
    "category": "athletic event",
    "public": true,
    "attendee_management": true,
    "start_time": "2017-04-05T16:13:54.217Z",
    "end_time": "2017-04-05T16:13:54.217Z",
    "uses_metric": false,
    "venue_mapper_version": 0,
    "spaces": [
        {
            "name": "Fake News Room"
        }
    ]
}
But the application itself would not display the diagram, and the newly created room did not show up in my list of venues. Perhaps it did not assign permissions to it?
In any case, I don't actually want to create a new venue/space. I want to pass in an existing venue/space. How do I do that?
The short answer is that to create a working diagram in 4.0, you will need to POST some data to the /4.0/diagrams endpoint.
The room you created doesn't map to the same concept as venues. When you create an event the way you did, it creates a new space entity; the spaces endpoints can return information on those.

Get Camunda TaskID after creation in response

We are using Camunda for the approval process implementation in our application.
We created a BPMN process with a Human Task and start it via the URL below:
engine-rest/engine/default/process-definition/key/processKey/start
We pass our form parameters as input to this service:
{
    "variables": {
        "requestId": { "value": "xxxxx", "type": "String" },
        "catalog": { "value": "yyyy", "type": "String" },
        "businessReason": { "value": "yyyyy", "type": "String" },
        "link": { "value": "", "type": "String" }
    }
}
The response of this start call is below:
{
    "links": [
        {
            "method": "GET",
            "href": "http://localhost:8080/engine-rest/engine/default/process-instance/31701",
            "rel": "self"
        }
    ],
    "id": "31701",
    "definitionId": "xxxxx:7:31605",
    "businessKey": null,
    "caseInstanceId": null,
    "ended": false,
    "suspended": false,
    "tenantId": null
}
The id in the response is not the actual task ID (which we use to get the task details, etc.); instead it's the execution ID.
Is there a way to get the task ID back in the response? Also, can we add some parameters to the above response, like
"status" : "success"
I have a listener class created for the Human Task, but I'm not sure how to add response parameters. Any help is appreciated.
This is not possible unless you build a custom REST resource on top of Camunda's Java API. See https://docs.camunda.org/manual/7.6/reference/rest/overview/embeddability/ for information on how to embed the default REST resources into a custom JAX-RS application.
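That said, if a second request is acceptable, you can look the task up right after the start call by querying the task list for the new process instance id. A sketch in PowerShell ($base and $vars are placeholders for your host and the variables payload above, and this assumes the user task is created synchronously, so it already exists when the start request returns):

# Start the process, then fetch the user task(s) belonging to the new instance.
$start = Invoke-RestMethod -Method POST -Uri "$base/engine-rest/engine/default/process-definition/key/processKey/start" -ContentType "application/json" -Body $vars
$tasks = Invoke-RestMethod -Method GET -Uri "$base/engine-rest/engine/default/task?processInstanceId=$($start.id)"
$tasks[0].id   # the actual task id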

Error loading file stored in Google Cloud Storage to Big Query

I have been trying to create a job to load a compressed JSON file from Google Cloud Storage into a Google BigQuery table. I have read/write access to both Google Cloud Storage and Google BigQuery, and the uploaded file belongs to the same project as the BigQuery table.
The problem happens when I send a POST request to https://www.googleapis.com/upload/bigquery/v2/projects/NUMERIC_ID/jobs. The content of the request is as follows:
{
    "kind": "bigquery#job",
    "projectId": NUMERIC_ID,
    "configuration": {
        "load": {
            "sourceUris": ["gs://bucket_name/document.json.gz"],
            "schema": {
                "fields": [
                    { "name": "id", "type": "INTEGER" },
                    { "name": "date", "type": "TIMESTAMP" },
                    { "name": "user_agent", "type": "STRING" },
                    { "name": "queried_key", "type": "STRING" },
                    { "name": "user_country", "type": "STRING" },
                    { "name": "duration", "type": "INTEGER" },
                    { "name": "target", "type": "STRING" }
                ]
            },
            "destinationTable": {
                "datasetId": "DATASET_NAME",
                "projectId": NUMERIC_ID,
                "tableId": "TABLE_ID"
            }
        }
    }
}
However, I get the following error, which doesn't make sense to me:
{
    "error": {
        "errors": [
            {
                "domain": "global",
                "reason": "invalid",
                "message": "Job configuration must contain exactly one job-specific configuration object (e.g., query, load, extract, spreadsheetExtract), but there were 0: "
            }
        ],
        "code": 400,
        "message": "Job configuration must contain exactly one job-specific configuration object (e.g., query, load, extract, spreadsheetExtract), but there were 0: "
    }
}
I know the problem lies neither in the project ID nor in the access token placed in the authentication header, because I have successfully created an empty table before. I also set the Content-Type header to application/json, which I don't think is the issue here, because the body content is JSON-encoded.
Thanks in advance
Your HTTP request is malformed -- BigQuery doesn't recognize this as a load job at all.
Look into the POST request and check the body you actually send.
Ensure that all of the above (which seems correct) is the body of the POST call. The JSON should be on a single line, and if you are manually creating the multipart message, make sure there is an extra newline between the headers and the body of each MIME part.
If you are using some sort of library, make sure the body is not expected in some other form, such as resource, content, or body; I've seen libraries that use these differently.
Try out the BigQuery API explorer: https://developers.google.com/bigquery/docs/reference/v2/jobs/insert and ensure your request body matches the one made by the explorer.
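One more thing worth checking, as an observation about the URL in the question rather than part of the original answer: the /upload/ prefix addresses the media-upload endpoint, which expects the file data itself in a multipart body. Since the file already lives in Cloud Storage, posting the job configuration to the plain jobs endpoint may be all you need. A sketch in PowerShell, assuming $token holds a valid OAuth2 access token and job.json contains the configuration above:

# POST the load-job configuration to the plain jobs endpoint (no /upload/ prefix);
# the data stays in GCS, so no multipart body is needed.
Invoke-RestMethod -Method POST -Uri "https://www.googleapis.com/bigquery/v2/projects/NUMERIC_ID/jobs" -Headers @{ Authorization = "Bearer $token" } -ContentType "application/json" -InFile "job.json"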

AWS xray put trace segment command return error

I am trying to send a segment document manually using the CLI, following the example on this page: https://docs.aws.amazon.com/xray/latest/devguide/xray-api-sendingdata.html#xray-api-segments
I created my own trace ID and also the start and end times.
The commands I used are:
> DOC='{"trace_id": "'$TRACE_ID'", "id": "6226467e3f841234", "start_time": 1581596193, "end_time": 1581596198, "name": "test.com"}'
> echo $DOC
{"trace_id": "1-5e453c54-3dc3e03a3c86f97231d06c88", "id": "6226467e3f845502", "start_time": 1581596193, "end_time": 1581596198, "name": "test.com"}
> aws xray put-trace-segments --trace-segment-documents $DOC
{
    "UnprocessedTraceSegments": [
        {
            "ErrorCode": "ParseError",
            "Message": "Invalid segment. ErrorCode: ParseError"
        },
        {
            "ErrorCode": "MissingId",
            "Message": "Invalid segment. ErrorCode: MissingId"
        },
        {
            "ErrorCode": "MissingId",
            "Message": "Invalid segment. ErrorCode: MissingId"
        },
        .................
The put-trace-segments call keeps giving me errors. The segment document complies with the JSON schema too. Am I missing something else?
Thanks.
I needed to enclose the JSON in double quotes. The command that worked for me was: aws xray put-trace-segments --trace-segment-documents "$DOC"
Without the quotes, the shell word-splits $DOC at its spaces, so the CLI receives each fragment as a separate segment document; that is why you get a whole list of errors instead of a single one.
This is probably due to an error in the documentation, or the X-Ray team was using a different kind of shell.