I am working with the Zoho API. Using Postman, I want to create/file a bug in the system, so I am looking into the Zoho Bug API. The linked documentation lists the following request parameters for creating a bug.
Create a Bug
POST /portal/[PORTALID]/projects/[PROJECTID]/bugs/
Creates a bug.
Request Parameters
title* (String): Name of the bug.
description (String): Description of the bug.
assignee (Long): Assignee for the bug.
flag (String): Bug flag; must be Internal or External.
classification_id (Long): Classification ID of the project.
milestone_id (Long): Milestone ID of the project.
due_date (String, [MM-DD-YYYY]): Due date of the bug.
module_id (Long): Module ID of the project.
severity_id (Long): Severity ID of the project.
reproducible_id (Long): Reproducible ID of the project.
affectedmile_id (Long): Milestone ID of the project.
bug_followers (Long): Follower ID of the user.
uploaddoc (File): The maximum size to upload a file is 128 MB.
Custom Fields
CHAR1 - CHAR12 (String): Any text-type custom field with string or picklist values.
LONG1 - LONG4 (Long): Numeric-type custom field.
DATE1 - DATE4 (String, [MM-DD-YYYY]): Bug custom field in date format.
Sample Response
Status: 201 Created
Content Type: application/json;charset=utf-8
{
"bugs": [{
"id": 170876000001851001,
"key": "543",
"project": {
"id": 170876000000147021
},
"flag": "Internal",
"title": "UI issue in Status text box",
"reporter_id": "2060758",
"reported_person": "Patricia Boyle",
"created_time": "05-27-2014 08:38 AM",
"created_time_long": 1401188920000,
"assignee_name": "Not Assigned",
"classification": {
"id": 170876000000133041,
"type": "Feature(New)"
},
"severity": {
"id": 170876000000065005,
"type": "Major"
},
"status": {
"id": 170876000001077429,
"type": "known limitation"
},
"closed": false,
"reproducible": {
"id": 170876000000133005,
"type": "Always"
},
"module": {
"id": 170876000000494013,
"name": "ERP Phase I"
},
"link": {
"self": {
"url": "https://projectsapi.zoho.com/restapi/portal/2063927/projects/
170876000000147021/bugs/170876000001851001/"
},
"timesheet": {
"url": "https://projectsapi.zoho.com/restapi/portal/2063927/projects/
170876000000147021/bugs/170876000001851001/logs/"
}
}
}]
}
What I am doing
My request URL:
https://projectsapi.zoho.com/restapi/portal/[PORTALID]/projects/[PROJECTID]/bugs/
Headers:
`Authorization: myKey`
`Content-Type: application/json`
In the body:
[{
"title": "My First Bug",
"description" :"This is my first bug",
"assignee" : "engr.usman" ,
"flag": "internal",
"classification_id": "1139168000000297069",
"milestone_id": "",
"due_date": "02-15-2018",
"module_id" : "1139168000000019372",
"severity_id" : "1139168000000007003",
"reproducible_id" : "1139168000000017069",
"status_id" :"1139168000000007045",
"resolution": "",
"affectedmile_id" : "",
"customfields": [
{
"column_name": "LONG1",
"label_name": "MSN#",
"value": "2999000190"
},
{
"column_name": "CHAR1",
"label_name": "Circle-Division-SubDivision",
"value": "Hyderabad - Latifabad - Tando Jam"
},
{
"column_name": "CHAR3",
"label_name": "LCD Indication",
"value": "S7"
},
{
"column_name": "CHAR2",
"label_name": "Reference #",
"value": "28371430034961U"
}
],
"uploaddoc" : [""]
}]
Response
{
"error": {
"code": 6831,
"message": "Input Parameter Missing"
}
}
Update 1
Just to test it again, I tried sending only the mandatory and default fields.
[{
"title": "My First Bug",
"flag": "internal",
"classification_id": "1139168000000297069",
"module_id" : "1139168000000019372",
"severity_id" : "1139168000000007003",
"customfields": [
{
"column_name": "CHAR2",
"label_name": "Reference #",
"value": "28371430034961U"
}
]
}]
But again I am getting the same error, Input Parameter Missing.
I don't know why this error occurs, and the linked documentation doesn't say how the request should be sent (JSON body, form data, etc.).
Any help would be highly appreciated.
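For reference, one detail worth checking: the documentation lists these as request parameters rather than a JSON body schema, so the endpoint may expect the fields as form data (x-www-form-urlencoded, or Postman's form-data tab) instead of raw JSON. A minimal sketch using only values from the request above; treat this as an assumption to verify, not confirmed Zoho behavior:

POST https://projectsapi.zoho.com/restapi/portal/[PORTALID]/projects/[PROJECTID]/bugs/
Authorization: myKey
Content-Type: application/x-www-form-urlencoded

title=My%20First%20Bug&flag=Internal&classification_id=1139168000000297069&module_id=1139168000000019372&severity_id=1139168000000007003

If that succeeds, the remaining fields (dates, custom fields, followers) can be added one at a time to find which parameter the API considers missing. Note also that the docs spell the flag values as Internal/External, so a lowercase "internal" may be rejected.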
Hi, can someone help me with this scenario? Below is the response I got; I want to extract every alertId whose name property contains "test". Your response is highly appreciated. Thank you so much.
Response:
[
{
"duplicateCount": 0,
"fqdn": "qa-ubuntu14-4",
"appName": "TEST_APD_UB14",
"stateString": "OPEN",
"category": "FILESCAN",
"alkey": {
"agentId": "8470ea64-a710-3e46-ba6b-ccd37ebc4074",
"role": "AD SERVER",
"alertId": "0258a7ca-bc72-3a53-aa98-3098c87411ba",
"id": "6695a7fa-ab9f-43fa-871b-620cd1eeb75054af7770-604b-11e9-b486-8d59ab9344597cea0ea2-d897-3696-852d-5f3cb36f270e8470ea64-a710-3e46-ba6b-ccd37ebc4074/var/log/test321.txttest321.txtA",
"applicationContextId": "7cea0ea2-d897-3696-852d-5f3cb36f270e"
},
"properties": {
"name": "test321.txt",
"acl": ""
}
},
{
"duplicateCount": 0,
"fqdn": "qa-ubuntu14-4",
"appName": "TEST_APD_UB18",
"stateString": "OPEN",
"category": "FILESCAN",
"alkey": {
"agentId": "8470ea64-a710-3e46-ba6b-ccd37ebc4074",
"role": "AD SERVER",
"alertId": "0258a7ca-bc72-3a53-aa98-3098c8741CDA",
"id": "6695a7fa-ab9f-43fa-871b-620cd1eeb75054af7770-604b-11e9-b486-8d59ab9344597cea0ea2-d897-3696-852d-5f3cb36f270e8470ea64-a710-3e46-ba6b-ccd37ebc4074/var/log/test321.txttest321.txtA",
"applicationContextId": "7cea0ea2-d897-3696-852d-5f3cb36f270e"
},
"properties": {
"name": "test555.txt",
"acl": ""
}
}
]
Expected Result:
I want to extract all alertId values where the name property contains "test".
You could use the following JSON query to extract the values:
[*].[?(#.properties.name contains 'test')]alkey.alertId
I found this JSONPath syntax reference really useful.
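If your tool supports Jayway-style JSONPath, an equivalent expression (a sketch; adjust to your tool's dialect) would be:

$[?(@.properties.name =~ /.*test.*/)].alkey.alertId

This filters the top-level array on a regex match of properties.name and projects alkey.alertId from each matching element.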
Responses from the Podio API return a JSON array of items with a fields property. Each field carries its values and its config.
For example a category field for the Gender:
{
"type": "category",
"field_id": 219922852,
"label": "Gender",
"values": [
{
"value": {
"status": "active",
"text": "Prefer not to say",
"id": 3,
"color": "F7F0C5"
}
}
],
"config": {
"settings": {
"multiple": true,
"options": [
{
"status": "active",
"text": "Male",
"id": 1,
"color": "DCEBD8"
},
{
"status": "active",
"text": "Female",
"id": 2,
"color": "F7F0C5"
},
{
"status": "active",
"text": "Prefer not to say",
"id": 3,
"color": "F7F0C5"
}
],
"display": "inline"
},
"mapping": null,
"label": "Gender"
},
"external_id": "gender"
}
How can I fetch the config without having to query a specific item?
Is there a way to get every field in the response? If the queried item has no value set for a field, Podio doesn't return that field in the response.
I would like to get the field config for ALL the fields, if possible with a single API request. In particular, I am interested in all the possible values (for Category or Relationship fields) so that I can match them against local values I have.
That way I can use the field structure to programmatically map local values to the format required by the Podio API, and then generate a fields payload to update/create Podio items via API calls.
You can call the Podio Get App method to get the app configuration.
Podio Doc Ref: https://developers.podio.com/doc/applications/get-app-22349
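For example, a minimal sketch (the app_id 123456 is hypothetical; Podio expects an OAuth2 access token):

GET https://api.podio.com/app/123456
Authorization: OAuth2 <access_token>

The response contains the app's fields array, and each field carries its config (including settings.options for category fields) regardless of whether any item has a value set.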
I'm trying to send a message to my broker using an Avro schema, but I always get this error:
2020-02-01 11:24:37.189 [nioEventLoopGroup-4-1] ERROR Application - Unhandled: POST - /api/orchestration/
org.apache.kafka.common.errors.SerializationException: Error registering Avro schema: "string"
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Schema being registered is incompatible with an earlier schema; error code: 409
Here is my Docker container:
connect:
image: confluentinc/cp-kafka-connect:5.4.0
hostname: confluentinc-connect
container_name: confluentinc-connect
depends_on:
- zookeeper
- broker
- schema-registry
ports:
- "8083:8083"
environment:
CONNECT_BOOTSTRAP_SERVERS: 'broker:29092'
CONNECT_REST_ADVERTISED_HOST_NAME: connect
CONNECT_REST_PORT: 8083
CONNECT_GROUP_ID: confluentinc-connect
CONNECT_CONFIG_STORAGE_TOPIC: confluentinc-connect-configs
CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR: 1
CONNECT_OFFSET_FLUSH_INTERVAL_MS: 10000
CONNECT_OFFSET_STORAGE_TOPIC: confluentinc-connect-offsets
CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR: 1
CONNECT_STATUS_STORAGE_TOPIC: confluentinc-connect-status
CONNECT_STATUS_STORAGE_REPLICATION_FACTOR: 1
CONNECT_KEY_CONVERTER: org.apache.kafka.connect.storage.StringConverter
CONNECT_KEY_CONVERTER_SCHEMAS_ENABLE: "true"
CONNECT_VALUE_CONVERTER: io.confluent.connect.avro.AvroConverter
CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_URL: 'http://schema-registry:8081'
CONNECT_INTERNAL_KEY_CONVERTER: "org.apache.kafka.connect.json.JsonConverter"
CONNECT_INTERNAL_VALUE_CONVERTER: "org.apache.kafka.connect.json.JsonConverter"
CONNECT_ZOOKEEPER_CONNECT: 'zookeeper:2181'
CONNECT_LOG4J_ROOT_LOGLEVEL: "INFO"
CONNECT_LOG4J_LOGGERS: "org.apache.kafka.connect.runtime.rest=WARN,org.reflections=ERROR,org.apache.zookeeper=ERROR,org.I0Itec.zkclient=ERROR"
CONNECT_PRODUCER_INTERCEPTOR_CLASSES: "io.confluent.monitoring.clients.interceptor.MonitoringProducerInterceptor"
CONNECT_CONSUMER_INTERCEPTOR_CLASSES: "io.confluent.monitoring.clients.interceptor.MonitoringConsumerInterceptor"
CONNECT_PLUGIN_PATH: "/usr/share/java,/usr/share/extras"
My producer (written in Kotlin):
val prop: HashMap<String, Any> = HashMap()
// Broker endpoint
prop[BOOTSTRAP_SERVERS_CONFIG] = bootstrapServers
// String keys, Avro values; KafkaAvroSerializer registers schemas with the Schema Registry
prop[KEY_SERIALIZER_CLASS_CONFIG] = StringSerializer::class.java.name
prop[VALUE_SERIALIZER_CLASS_CONFIG] = KafkaAvroSerializer::class.java.name
prop[SCHEMA_REGISTRY_URL] = schemaUrl
// Delivery semantics and throughput tuning
prop[ENABLE_IDEMPOTENCE_CONFIG] = idempotence
prop[ACKS_CONFIG] = acks.value
prop[RETRIES_CONFIG] = retries
prop[MAX_IN_FLIGHT_REQUESTS_PER_CONNECTION] = requestPerConnection
prop[COMPRESSION_TYPE_CONFIG] = compression.value
prop[LINGER_MS_CONFIG] = linger
prop[BATCH_SIZE_CONFIG] = batchSize.value
return KafkaProducer(prop)
My Avro Schema:
{
"type": "record",
"namespace": "com.rjdesenvolvimento",
"name": "create_client_value",
"doc": "Avro Schema for Kafka Command",
"fields": [
{
"name": "id",
"type": "string",
"logicalType": "uuid",
"doc": "UUID for indentifaction command"
},
{
"name": "status",
"type": {
"name": "status",
"type": "enum",
"symbols": [
"Open",
"Closed",
"Processing"
],
"doc": "Can be only: Open, Closed or Processing"
},
"doc": "Status of the command"
},
{
"name": "message",
"type": {
"type": "record",
"name": "message",
"doc": "Avro Schema for insert new client",
"fields": [
{
"name": "id",
"type": "string",
"logicalType": "uuid",
"doc": "UUID for indentifaction client transaction"
},
{
"name": "active",
"type": "boolean",
"doc": "Soft delete for client"
},
{
"name": "name",
"type": "string",
"doc": "Name of the client"
},
{
"name": "email",
"type": "string",
"doc": "Email of the client"
},
{
"name": "document",
"type": "string",
"doc": "CPF or CPNJ of the client"
},
{
"name": "phones",
"doc": "A list of phone numbers",
"type": {
"type": "array",
"items": {
"name": "phones",
"type": "record",
"fields": [
{
"name": "id",
"type": "string",
"logicalType": "uuid",
"doc": "UUID for indentifaction of phone transaction"
},
{
"name": "active",
"type": "boolean",
"doc": "Soft delete for phone number"
},
{
"name": "number",
"type": "string",
"doc": "The phone number with this regex +xx xx xxxx xxxx"
}
]
}
}
},
{
"name": "address",
"type": "string",
"logicalType": "uuid",
"doc": "Adrres is an UUID for a other address-microservice"
}
]
}
}
]
}
And my POST body:
{
"id" : "9ec818da-6ee0-4634-9ed8-c085248cae12",
"status" : "Open",
"message": {
"id" : "9ec818da-6ee0-4634-9ed8-c085248cae12",
"active" : true,
"name": "name",
"email": "email#com",
"document": "document",
"phones": [
{
"id" : "9ec818da-6ee0-4634-9ed8-c085248cae12",
"active" : true,
"number": "+xx xx xxxx xxxx"
},
{
"id" : "9ec818da-6ee0-4634-9ed8-c085248cae12",
"active" : true,
"number": "+xx xx xxxx xxxx"
}
],
"address": "9ec818da-6ee0-4634-9ed8-c085248cae12"
}
}
What am I doing wrong?
GitHub project: https://github.com/rodrigodevelms/kafka-registry
UPDATE
Briefly: I'm not generating my classes with the Gradle Avro plugin.
In this example, my POST sends a Client object. The service then assembles a Command-type object as follows:
id: the same client id
status: Open
message: the POST body that was sent
So I send this to Kafka, and in Connect (JDBC sink to Postgres) I set fields.whitelist to only the attributes of the message (the client), so I persist neither the command id nor the status.
On GitHub, the only files that matter for understanding the code are:
1 - https://github.com/rodrigodevelms/kafka-registry/blob/master/kafka/src/main/kotlin/com/rjdesenvolvimento/messagebroker/producer/Producer.kt
2 - https://github.com/rodrigodevelms/kafka-registry/blob/master/kafka/src/main/kotlin/com/rjdesenvolvimento/messagebroker/commnad/Command.kt
3 - https://github.com/rodrigodevelms/kafka-registry/blob/master/src/client/Controller.kt
4 - https://github.com/rodrigodevelms/kafka-registry/blob/master/src/client/Service.kt
5 - docker-compose.yml, insert-client-value.avsc, postgresql.json
If I set the compatibility mode of the Avro schema to "none", I can send a message, but some unknown characters are shown, as in the screenshot below.
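For context, compatibility is set per subject through the Schema Registry REST API; a sketch, assuming the registry runs at schema-registry:8081 and the default <topic>-value subject naming:

curl -X PUT -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  --data '{"compatibility": "NONE"}' \
  http://schema-registry:8081/config/<topic>-value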
I suspect that you're trying to do multiple things and haven't been cleaning up state after previous attempts. You should not get the following error in a fresh installation:
Schema being registered is incompatible with an earlier schema
Your data has changed in a way that the schema in the registry is not compatible with the one you're sending.
You can send an HTTP DELETE request to http://registry:8081/subjects/[name]/ to delete all versions of the schema; then you can restart your connector.
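A sketch, assuming the default TopicNameStrategy (the value schema is registered under the subject <topic>-value) and the registry address from the Compose file:

curl -X DELETE http://schema-registry:8081/subjects/<topic>-value

The next message produced after that registers a fresh version 1 of the schema.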
I have completed a Node.js app using the LINE APIs. I have the following request object. How can I define an array of different objects, here the messages field, which contains a different object structure for each message type? I hope Swagger permits this very common scenario.
Request Body:
{
"to": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
"messages":[
{
"type":"text",
"text":"Hello, world1"
},
{
"type": "audio",
"originalContentUrl": "https://example.com/original.m4a",
"duration": 240000
},
{
"type": "location",
"title": "my location",
"address": "〒150-0002 東京都渋谷区渋谷2丁目21−1",
"latitude": 35.65910807942215,
"longitude": 139.70372892916203
}
]
}
My Swagger definition for the messages array:
"Messages Object": {
"type": "array",
"items": {
"allOf": [
{
"$ref": "#/definitions/multicast Message Error Response"
},
{
"$ref": "#/definitions/multicast Message Error Response"
}
]
}
}
And this is the rendered messages array. It has only one entry; I want it to allow many different entry types:
"messages": [
{
"code": 500,
"httpCode": 400,
"name": "string",
"message": "string"
}
]
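For what it's worth, Swagger 2.0 has no oneOf, so a heterogeneous array cannot be expressed directly there; OpenAPI 3.0 does support it. A sketch in OpenAPI 3.0 syntax, where TextMessage, AudioMessage, and LocationMessage are hypothetical schemas you would define under components/schemas:

"messages": {
  "type": "array",
  "items": {
    "oneOf": [
      { "$ref": "#/components/schemas/TextMessage" },
      { "$ref": "#/components/schemas/AudioMessage" },
      { "$ref": "#/components/schemas/LocationMessage" }
    ],
    "discriminator": { "propertyName": "type" }
  }
}

Each message type then gets its own schema, and the type property selects which one applies.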
I have been given the attached RAML file to use in Mule, but I am having problems working out how to clean up the errors in it, and I am not even sure the RAML conforms to the standard. The errors I get are for missing {}, and another, missing block entry, appears when I remove the version. I can't figure out how to resolve them.
Below is a cut-down version of the RAML:
#%RAML 0.8
---
title: Databox
version: v1
protocols: [HTTPS]
baseUri: https://databox/v1/{version}
mediaType: application/json
traits:
- http-data: !include http-data.raml
resourceTypes: !include types.raml
documentation:
- title: Home
content: |
Databox 1st draft
/stores:
type:
store:
description: Stores
dataSchema: !include stores.json
The traits (http-data.raml):
responses:
200:
description: |
Success
The resourceType (types.raml):
- store:
head:
description: Retrieve data for <<description>>.
is: [ http-data ]
get:
description: Retrieve data for <<description>>.
responses:
200:
body:
application/json:
schema: |
{
"type": "object",
"properties": {
"meta": {
"title": "Data",
"type": "object",
"properties": {
"createdOn": {
"type": "string",
"format": "date-time"
}
},
"required": [
"createdOn"
]
},
"data": {
"type": "array",
"items": <<dataSchema>>
}
},
"required": [
"data"
]
}
description: |
Success. Returns a JSON object containing all <<description>>.
The schema (stores.json):
{
"id": "http://localhost:8000/stores.json#",
"$schema": "http://json-schema.org/draft-04/schema",
"title": "Databox Store Schema",
"type": "object",
"properties": {
"storeId": {
"type": "string"
},
"storeDescription": {
"type": "string"
}
},
"required": [
"storeId"
],
"additionalProperties": false
}
Thanks
The RAML is valid except for that <<dataSchema>> parameter used inside the JSON schema; I'm not sure that's a valid use of parameters.
I would start by replacing <<dataSchema>> with the JSON from stores.json and trying again.
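A sketch of what that could look like in types.raml, with the parameter replaced by a reference to the stores schema (this assumes stores.json is actually reachable at the id URL it declares; otherwise, paste the stores.json object inline in place of the $ref):

get:
  description: Retrieve data for <<description>>.
  responses:
    200:
      body:
        application/json:
          schema: |
            {
              "type": "object",
              "properties": {
                "data": {
                  "type": "array",
                  "items": { "$ref": "http://localhost:8000/stores.json#" }
                }
              },
              "required": ["data"]
            }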
Let me know if that works or what errors you get.
UPDATE:
MuleSoft's Anypoint Platform validates your RAML with just that single change; you can see it here.