Zeppelin REST API error getting paragraph results - 405 Method not allowed - impala

From various other StackOverflow posts I understand I can do a Zeppelin API call to run and get the output from a paragraph using the URL:
https://[zeppelin url]:[port]/api/notebook/run/[note ID]/[paragraph ID]
but this gives me:
HTTP ERROR 405
Problem accessing /api/notebook/run/2GG52SU6/2025492809-066545_207456631. Reason:
Method Not Allowed
Is there a way of fixing this? Other API calls work fine and the paragraph runs fine in the Zeppelin Web UI (it just does a simple Impala query). I just want to get the output via a REST API so I can call it from an Angular paragraph and manipulate the results before display.
Thanks!

The documentation for the run-paragraph API states that it is a POST request; if you send a GET request instead, it fails with 405 Method Not Allowed.
curl -X POST http://localhost:8000/zeppelin/api/notebook/run/2GUEWJDQ4/paragraph_1642773079113_366171993|jq
{
  "status": "OK",
  "body": {
    "code": "SUCCESS",
    "msg": [
      {
        "type": "TEXT",
        "data": "common.cmd\ncommon.sh\nfunctions.cmd\nfunctions.sh\ninstall-interpreter.sh\ninterpreter.cmd\ninterpreter.sh\nstop-interpreter.sh\nupgrade-note.sh\nzeppelin-daemon.sh\nzeppelin-systemd-service.sh\nzeppelin.cmd\nzeppelin.sh\n"
      }
    ]
  }
}
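The same POST can be issued from client code rather than curl. A minimal sketch using Python's standard library (the host, note ID, and paragraph ID below are placeholders, not values from a real server):

```python
from urllib import request
import json

def run_paragraph_url(base_url, note_id, paragraph_id):
    """Build the Zeppelin run-paragraph endpoint URL (must be requested with POST)."""
    return f"{base_url}/api/notebook/run/{note_id}/{paragraph_id}"

def run_paragraph(base_url, note_id, paragraph_id):
    """POST to the run endpoint and return the parsed JSON response.

    A plain GET to this same URL is what produces 405 Method Not Allowed.
    """
    url = run_paragraph_url(base_url, note_id, paragraph_id)
    req = request.Request(url, data=b"", method="POST")
    with request.urlopen(req) as resp:
        return json.loads(resp.read().decode())

# Example usage (placeholder host and IDs -- replace with your own):
# result = run_paragraph("http://localhost:8080", "2GG52SU6", "paragraph_123")
# text = [m["data"] for m in result["body"]["msg"] if m["type"] == "TEXT"]
```

From there the TEXT entries in `body.msg` can be manipulated before display, which covers the Angular-paragraph use case in the question.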

Related

cube.js API load endpoint responds with 413

When calling the load endpoint with a query larger than roughly 1700 bytes, we receive a 413 (Request Entity Too Large) error. We have narrowed the threshold down to somewhere between 1706 and 1758 bytes.
Steps to reproduce the behavior:
POST a large query to <host>:<port>/cubejs-api/v1/load
Receive a 413
Removing one or two entries from dimensions causes the query to work as expected (standard response JSON and a 200 status)
Version: 0.31.14
The smallest failing query we have is:
{
  "query": {
    "measures": [
      "rpt_clm_clm_990_clab9f47d0000356e2szw7p19.count",
      "rpt_clm_clm_990_clab9f47d0000356e2szw7p19.totalDiscAmt",
      "rpt_clm_clm_990_clab9f47d0000356e2szw7p19.totalCopayAmt",
      "rpt_clm_clm_990_clab9f47d0000356e2szw7p19.totalDedctAmt",
      "rpt_clm_clm_990_clab9f47d0000356e2szw7p19.totalRejAmt",
      "rpt_clm_clm_990_clab9f47d0000356e2szw7p19.totalExGrtaAmt",
      "rpt_clm_clm_990_clab9f47d0000356e2szw7p19.totalClmsGrsAmt",
      "rpt_clm_clm_990_clab9f47d0000356e2szw7p19.totalClmsStlAmt",
      "rpt_clm_clm_990_clab9f47d0000356e2szw7p19.totalClmsRbnsAmt",
      "rpt_clm_clm_990_clab9f47d0000356e2szw7p19.totalClmsIbnrAmt",
      "rpt_clm_clm_990_clab9f47d0000356e2szw7p19.totalClmsIncrAmt",
      "rpt_clm_clm_990_clab9f47d0000356e2szw7p19.totalClmsRskStlAmt",
      "rpt_clm_clm_990_clab9f47d0000356e2szw7p19.totalClmsRskRbnsAmt",
      "rpt_clm_clm_990_clab9f47d0000356e2szw7p19.totalClmsRskIbnrAmt",
      "rpt_clm_clm_990_clab9f47d0000356e2szw7p19.totalClmsRskIncrAmt",
      "rpt_clm_clm_990_clab9f47d0000356e2szw7p19.totalClmsReinsAmt",
      "rpt_clm_clm_990_clab9f47d0000356e2szw7p19.totalClmsReinsRbnsAmt",
      "rpt_clm_clm_990_clab9f47d0000356e2szw7p19.totalClmsReinsIbnrAmt",
      "rpt_clm_clm_990_clab9f47d0000356e2szw7p19.totalClmsReinsIncrAmt"
    ],
    "dimensions": [
      "rpt_clm_clm_990_clab9f47d0000356e2szw7p19.clntCoCd",
      "rpt_clm_clm_990_clab9f47d0000356e2szw7p19.trtyClntGrpNm",
      "rpt_clm_clm_990_clab9f47d0000356e2szw7p19.grpNbr",
      "rpt_clm_clm_990_clab9f47d0000356e2szw7p19.primLfNbr",
      "rpt_clm_clm_990_clab9f47d0000356e2szw7p19.insLfNbr",
      "rpt_clm_clm_990_clab9f47d0000356e2szw7p19.rskIncptnDt",
      "rpt_clm_clm_990_clab9f47d0000356e2szw7p19.rskRnwlDt",
      "rpt_clm_clm_990_clab9f47d0000356e2szw7p19.trtmtStrtDt",
      "rpt_clm_clm_990_clab9f47d0000356e2szw7p19.trtmtEndDt",
      "rpt_clm_clm_990_clab9f47d0000356e2szw7p19.admsnDt"
    ]
  },
  "queryType": "multi"
}
I tried submitting an issue on the cube.js GitHub, but they marked it as a question and asked that I post it here. I have also searched their docs and have not been able to find any configuration that relates to this. It looks like the max payload size is hard-coded to 1M (see link), but here we are failing at 1758 bytes.
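Since 1758 bytes is far below the 1 MB limit mentioned, one thing worth ruling out is a reverse proxy or gateway sitting in front of the API imposing its own body-size cap. To pin down the exact threshold, it can help to measure the encoded size of each request body and bisect; a small Python sketch (the measure-and-bisect approach and the sample cube names are illustrative, not from the cube.js docs):

```python
import json

def payload_size(query: dict) -> int:
    """Byte length of the JSON request body as it goes over the wire."""
    return len(json.dumps(query).encode("utf-8"))

# Hypothetical query shaped like the one in the question.
query = {
    "query": {
        "measures": ["someCube.count", "someCube.totalAmt"],
        "dimensions": ["someCube.someDim"],
    },
    "queryType": "multi",
}
print(payload_size(query))  # compare against the observed 1706-1758 byte threshold
```

Logging this number alongside each request that fails with 413 makes it easy to see whether the cutoff is a fixed byte count, which would point at a size limit somewhere in the request path rather than at the query contents.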

Stress Test hit an HTTP error 400 during stress test with 600 threads

I'm running a stress test against two endpoints of my API: /api/register and /api/verify_tac.
The request body for /api/register is:
{
  "provider_id": "lifecare.com.my",
  "user_id": ${random},
  "secure_word": "Aa123456",
  "id_type": "0",
  "id_number": "${id_number}",
  "full_name": "test",
  "gender": "F",
  "dob": "2009/11/11",
  "phone_number": ${random},
  "nationality": "MY"
}
where ${random} and ${id_number} come from a CSV Data Set Config.
The request body for /api/verify_tac is:
{
  "temp_token": "${temp_token}",
  "tac": "123456"
}
${temp_token} is extracted from the /api/register response body.
I have run six tests:
100 users with a 60-second ramp-up period: all successful.
200 users with a 60-second ramp-up period: all successful.
300 users with a 60-second ramp-up period: all successful.
400 users with a 60-second ramp-up period: all successful.
500 users with a 60-second ramp-up period: all successful.
600 users with a 60-second ramp-up period: most of the /api/register responses are empty, so /api/verify_tac returns an error. The failing /api/verify_tac request body is:
{
  "temp_token": "NotFound",
  "tac": "123456"
}
Why does the sixth test return errors while the other five do not, even though they use the same parameters? Does this mean my API is overloaded with requests, or is my test configuration wrong?
If the response body is empty at 600 users, my expectation is that your application simply gets overloaded and cannot handle 600 concurrent users.
You can add a listener such as the Simple Data Writer configured to save failures; this way you will be able to see request and response details for the failing requests. If you untick the Errors box, JMeter will store request and response details for all requests, so you can inspect the response message, headers, body, etc. of the preceding request and determine the failure reason.
Also it would be good to:
Monitor the essential resource usage (CPU, RAM, disk, network, swap, etc.) on the application-under-test side; this can be done using, for example, the JMeter PerfMon Plugin
Check your application logs for any suspicious entries
Re-run your test with a profiler for .NET such as YourKit; this way you will be able to see the most "expensive" functions, identify where the application spends most of its time, and find the root cause of the problem
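The "NotFound" value in the failing request is characteristic of a JMeter extractor default: when the /api/register response body is empty, extraction of temp_token fails and the configured default value is substituted. The behavior can be sketched in Python (the helper function is hypothetical; only the temp_token field name comes from the question):

```python
import json

def extract_token(body: str, default: str = "NotFound") -> str:
    """Mimic a JMeter JSON extractor: return the token if present, else the default."""
    try:
        return json.loads(body)["temp_token"]
    except (ValueError, KeyError):
        # Empty or malformed body, or missing field -> fall back to the default,
        # exactly the value that then shows up in the /api/verify_tac request.
        return default

print(extract_token('{"temp_token": "abc123"}'))  # abc123
print(extract_token(""))                          # NotFound
```

So the verify_tac failures are a downstream symptom; the real question is why /api/register starts returning empty bodies at 600 users, which the monitoring steps above should answer.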

IBM Watson Concept Insights get related concepts (corpus) using cURL timing out

I am getting the error {"code": 500, "message": "Forwarding error"} every time I try to get related concepts from my private account and corpus. It appears to be a timeout, since the request always dies at 2:30.
I've adapted the sample provided by IBM to point to my account and corpus. Does anybody know why this is occurring?
curl -u "{username}":"{password}" "https://gateway.watsonplatform.net/concept-insights/api/v2/corpora/accountid/corpus/related_concepts?limit=3&level=0"
cURL result
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 44 0 44 0 0 0 0 --:--:-- 0:02:30 --:--:-- 10
Corpus status
{"id":"/corpora/accountid/corpus","documents":10,"last_updated":"0001-01-01T00:00:00Z","build_status":{"ready":10,"error":0,"processing":0}}
NOTE: I do not get this error if I use the public example provided by IBM on the API. I have also masked my account id, corpus, username, and password for this public posting.
Unfortunately, since the error is corpus-specific (you mentioned you can get the API to work on the public corpus), we would need more information about your corpus (such as the account ID and corpus ID) in order to help you.
One way to provide this information privately is to open a ticket with the Bluemix support system (two options are described here):
https://developer.ibm.com/bluemix/support/#support
If you list the "Watson Concept Insights" service in the ticket, we will get your information.

BigQuery Load Job [invalid] Too many errors encountered

I'm trying to insert data into BigQuery using the BigQuery Api C# Sdk.
I created a new Job with Json Newline Delimited data.
When I use:
100 lines of input: OK
250 lines of input: OK
500 lines of input: KO
2500 lines of input: KO
The error encountered is:
"status": {
  "state": "DONE",
  "errorResult": {
    "reason": "invalid",
    "message": "Too many errors encountered. Limit is: 0."
  },
  "errors": [
    {
      "reason": "internalError",
      "location": "File: 0",
      "message": "Unexpected. Please try again."
    },
    {
      "reason": "invalid",
      "message": "Too many errors encountered. Limit is: 0."
    }
  ]
}
The file works well when I use the Bq Tools with command :
bq load --source_format=NEWLINE_DELIMITED_JSON dataset.datatable pathToJsonFile
Something seems to be wrong on the server side, or maybe in how I transmit the file, but I cannot get any more detail than "internal server error".
Does anyone have more information on this?
Thank you
"Unexpected. Please try again." could either indicate that the contents of the files you provided had unexpected characters, or it could mean that an unexpected internal server condition occurred. There are several questions which might help shed some light on this:
Does this consistently happen no matter how many times you retry?
Does this depend directly on the lines in the file, or can you construct a simple upload file which doesn't trigger the error condition?
One option to potentially avoid these problems is to send the load job request with configuration.load.maxBadRecords higher than zero.
Feel free to comment with more info and I can maybe update this answer.
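As a sketch of the maxBadRecords suggestion: whatever SDK you use, the load job configuration sent to the jobs API carries maxBadRecords alongside sourceFormat. A minimal Python illustration of that configuration object (the field names follow the BigQuery jobs REST API; the project, dataset, and table names are placeholders):

```python
import json

def load_job_config(project, dataset, table, max_bad_records=10):
    """Build a jobs.insert configuration for a newline-delimited JSON load."""
    return {
        "configuration": {
            "load": {
                "sourceFormat": "NEWLINE_DELIMITED_JSON",
                # Tolerate up to N bad rows instead of failing at the first one
                # ("Too many errors encountered. Limit is: 0.").
                "maxBadRecords": max_bad_records,
                "destinationTable": {
                    "projectId": project,
                    "datasetId": dataset,
                    "tableId": table,
                },
            }
        }
    }

print(json.dumps(load_job_config("my-project", "dataset", "datatable"), indent=2))
```

With a non-zero limit, rows that fail to parse are reported in the job's error list without aborting the whole load, which can reveal which specific lines (if any) the server considers malformed.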

Yodlee executeUserSearchRequest error

I am trying to get information from the Yodlee API.
I have a test user for which I've implemented adding an account, and I got a refresh OK from the site:
{
  siteRefreshStatus: {
    siteRefreshStatusId: 8
    siteRefreshStatus: "REFRESH_COMPLETED_WITH_UNCERTAIN_ACCOUNT"
  }
  siteRefreshMode: {
    refreshModeId: 2
    refreshMode: "NORMAL"
  }
  updateInitTime: 0
  nextUpdate: 1391603301
  code: 403
  noOfRetry: 0
}
Now when I try to perform search and get the actual transactions I get this error:
{
  errorOccured: "true"
  exceptionType: "com.yodlee.core.IllegalArgumentValueException"
  refrenceCode: "_57c250a9-71e8-4d4b-830d-0f51a4811516"
  message: "Invalid argument value: Container type cannot be null"
}
The problem is that I have container type!
Check out the parameters I send:
cobSessionToken=08062013_2%3Ad02590d4474591e507129bf6baaa58e81cd9eaacb5753e9441cd0b1ca3b8bd00a3e6b6a943956e947458307c1bb94b505e2eb4398f890040a3db8c98606c0392&userSessionToken=08062013_0%3A8e8ef9dd4f294e0f16dedf98c1794b96bf33f2e1f2686eda2f35dfe4901dd3a871eed6d08ce52c99a74deb004c025ebf4bf94c7b17baf8ba18aacb331588f5f5&transactionSearchRequest.containerType=bank&transactionSearchRequest.higherFetchLimit=1000&transactionSearchRequest.lowerFetchLimit=1&transactionSearchRequest.resultRange.endNumber=500&transactionSearchRequest.resultRange.startNumber=1&transactionSearchRequest.searchClients.clientId=1&transactionSearchRequest.searchClients.clientName=DataSearchService&transactionSearchRequest.ignoreUserInput=true&transactionSearchRequest.searchFilter.currencyCode=USD&transactionSearchRequest.searchFilter.postDateRange.fromDate=01-01-2014&transactionSearchRequest.searchFilter.postDateRange.toDate=01-31-2014&transactionSearchRequest.searchFilter+.transactionSplitType=ALL_TRANSACTION&transactionSearchRequest.searchFilter.itemAccountId+.identifier=10008425&transactionSearchRequest.searchClients=DEFAULT_SERVICE_CLIENT
An error occurred while adding the account, which can be seen from the parameter code: 403, and hence you will not see that account when you call the getItemSummary API. An account is successfully linked only if the code has zero as its value, e.g. code: 0. A 403 is shown when Yodlee's data agent has encountered an unhandled use case; for any such error you should file a service request using the Yodlee customer care tool.
To learn more about error codes, please visit:
https://developer.yodlee.com/FAQs/Error_Codes
The status is shown as completed (siteRefreshStatus: "REFRESH_COMPLETED_WITH_UNCERTAIN_ACCOUNT") because the addition of any account is followed by a refresh in which Yodlee's data agent logs into the FI websites and tries to scrape data. Completion of this activity is denoted as REFRESH_COMPLETED even when an error has occurred.
TransactionSearch issue:
Two of the parameters contain a stray "+" sign. Since transactionSplitType and containerType depend on each other, the error is thrown:
&transactionSearchRequest.searchFilter+.transactionSplitType=ALL_TRANSACTION
&transactionSearchRequest.searchFilter.itemAccountId+.identifier=10008425
The correct parameters are:
&transactionSearchRequest.searchFilter.transactionSplitType=ALL_TRANSACTION
&transactionSearchRequest.searchFilter.itemAccountId.identifier=10008425
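Building the query string programmatically avoids stray characters like the "+" above creeping in through manual concatenation. A minimal sketch using Python's standard urllib (the parameter names and values are taken from the request in the question; only the subset relevant to the error is shown):

```python
from urllib.parse import urlencode

params = {
    "transactionSearchRequest.containerType": "bank",
    "transactionSearchRequest.searchFilter.transactionSplitType": "ALL_TRANSACTION",
    "transactionSearchRequest.searchFilter.itemAccountId.identifier": "10008425",
}
# urlencode escapes each key and value, so no stray "+" can split a parameter name.
query_string = urlencode(params)
print(query_string)
```

Any values that do contain reserved characters get percent-encoded automatically, so the parameter names always arrive at the Yodlee API intact.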