Starting a Kibana rollup job via the API

I have made a rollup job in Kibana with the Kibana Dev Tools, pictured below.
However, I have trouble starting this rollup job, as I get the following error below.
I'm following the documentation found here: https://www.elastic.co/guide/en/elasticsearch/reference/7.1/rollup-start-job.html
I've had trouble finding anyone else with this problem. Do you have any ideas?
I'm using Kibana version 7.1.1.

It should be a POST method, e.g.:
POST _xpack/rollup/job/<job_id>/_start
Also, there is no request body for the Start Job API.
Please refer to the documentation.
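For reference, a minimal sketch of the start call (the job id, host, and credentials are placeholders; note that in the 7.x documentation the path is _rollup/job/..., while the _xpack/rollup prefix is the older 6.x-style path):

# Dev Tools console form, per the linked 7.1 documentation:
POST _rollup/job/<job_id>/_start

# Equivalent curl call from outside Dev Tools; adjust host and credentials:
curl -u elastic:changeme -X POST "http://localhost:9200/_rollup/job/<job_id>/_start"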

Related

BigQuery to Redshift data transfer using AWS SCT failing?

I am trying to migrate data from BigQuery to Redshift using this article. I followed it through and successfully got to "Start the Local Data Migration Task". I had to set up an AWS profile to access "Data Migration View (Other)". The AWS profile was set up using the access key and access secret of an admin user account in AWS.
However, upon starting the task I keep getting the following error:
class com.amazon.dmt.model.FileCredentials cannot be cast to class com.amazon.dmt.model.UserCredentials (com.amazon.dmt.model.FileCredentials and com.amazon.dmt.model.UserCredentials are in unnamed module of loader 'app')
I tried to check the AWS documentation and looked around, but this error is not listed anywhere. I cannot understand why a cast from FileCredentials to UserCredentials is being done. What am I missing?
Has anyone faced a similar issue, or can anyone point me in the right direction?
Based on my testing, I have determined that this is an issue in version 1.0.670 of SCT. A request has been submitted to correct the issue. In the meantime, to allow you to continue with your project, please revert to AWS SCT version 1.0.666 using this link: https://d211wdu1froga6.cloudfront.net/builds/1.0/666/Windows/aws-schema-conversion-tool-1.0.zip
You will have to uninstall SCT and the extractor agent, then reinstall and configure the previous version(s) as you did before.

WT 8-1: Getting HTTP GET on resource 'http://training4-american-api-maryumsiddique.us-e2.cloudhub.io:80/flights' failed: not found (404)

I'm doing the Mule 4 training and am currently on WT 8-1. I followed all the steps as instructed, and when I run the project it deploys successfully. But when I call "http://localhost:8081/american" from Postman, it gives me the following error.
This happens although the status is Started on Runtime Manager, as shown below.
In API Manager the status is Active as well.
I cannot figure out what the issue is. Can anyone please tell me what I should do?
Really appreciate the help 🙂
Likely the app deployed locally is not calling the right URL for the API deployed in CloudHub. Maybe the URL should be http://training4-...cloudhub.io/api/flights or something like it.
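A quick way to test that guess from the command line (the /api base path is an assumption taken from the answer above, not confirmed from the actual deployment):

# Call the CloudHub API directly with the suggested /api prefix:
curl -i "http://training4-american-api-maryumsiddique.us-e2.cloudhub.io/api/flights"

# Then call the locally deployed app as in the question:
curl -i "http://localhost:8081/american"

If the first call returns 200 while the second still fails, the local app's HTTP Request configuration is pointing at the wrong path.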

Trying to run Apache Apex's Yahoo Finance example on YARN

I've downloaded Apache Apex 3.5.0 along with Malhar 3.5.0.
I've successfully started the apex client and submitted the Yahoo Finance demo example to our YARN cluster (running CDH 5.10). The cluster is running and configured properly (many Spark and MR jobs are running on it).
I see the application I submitted as RUNNING in YARN as well as in the Apex CLI. However, when I try to connect to the Application Master I get a 404:
org.apache.hadoop.yarn.webapp.WebAppException: /: controller for default not found
I also tried connecting directly to the appMasterTrackingUrl reported by the get-app-info command, and I get the same error.
I tried a couple of the Apex examples, and I always get the same error.
Any idea why?
It is somewhat expected. Add "/ws/v2/stram/info" to the URL path.
When you connect to the App Master you need to provide the complete URL for a REST API to invoke. There is nothing to show or return for "/", so what you are seeing is expected. What are you trying to do by connecting to the App Master?
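For example, assuming the tracking URL reported by get-app-info looks like http://<host>:<port> (both placeholders here), a sketch of a valid request:

# "/" serves nothing; append a concrete stram REST path instead:
curl "http://<host>:<port>/ws/v2/stram/info"

This should return JSON describing the running application rather than the 404 you get for the bare root path.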

How to define the leak period date in SonarQube using the API?

I am using SonarQube 6.0.
Has anyone defined the leak period date in SonarQube using the API?
I have looked at the Web API (localhost:9000/web_api/) but have not seen a section that details this operation.
Any advice is appreciated. Thanks.
As indicated in the documentation, go to Administration > General Settings > General > Differential Views. If you want to do it with a script, have a look at the api/properties web services.
So I've tried this URL:
curl -u d****#****.com:****** -X POST https://*****.com/api/properties?id=sonar.timemachine.period1&value=2018-06-01&resource=AdManager
and my response code is;
{"err_code":200,"err_msg":"property created"}
But the property is not updated. Is there a way I can see why this failed? The logs don't seem to mention anything.
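One possibility worth ruling out (an assumption, since the exact shell invocation isn't fully visible): if the URL is not quoted, the shell treats each & as a background operator, so only the id parameter ever reaches the server. Quoting the URL avoids that:

# Quote the URL so & stays part of the query string; the credentials,
# host, and resource key below are placeholders:
curl -u user:password -X POST "https://<sonar-host>/api/properties?id=sonar.timemachine.period1&value=2018-06-01&resource=AdManager"

That would be consistent with a "property created" response while the value never actually changes.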

Unable to add Apache NiFi in Ambari?

I am trying to add Apache NiFi in Ambari, but it continuously fails with the error "Error occured during stack advisor command invocation:
Unable to delete directory /var/run/ambari-server/stack-recommendations/1".
There is a similar thread with the same error in the Hortonworks Community; I have tried everything mentioned in that thread but am unable to fix it. My sandbox is installed in VMware Workstation 12 Player. I also tried to create and remove the directory manually, but that fails with the error "invalid argument". I have created a thread for that error on Stack Exchange as well. Please help!
Here is a link to the Hortonworks forum thread. It seems like the sandbox is just broken:

This is due to a docker issue in this 2.5 sandbox build. It will be fixed in the next revision of the sandbox.

There are also some workarounds described (like using the older HDP 2.4 sandbox or building your own cluster based on the HDP 2.5 Docker image).
An updated sandbox has arrived: http://hortonworks.com/downloads
Trust me, active members of the community see your posts in multiple locations. In a good, non-Big-Brother way :) but cross-posting is as old as the world... Well, you got it.
Did you see the notice for this service in Ambari, saying it has been deprecated? There is the same note on GitHub. There's a good reason for that: it has now been implemented properly by the dev team, with many more features. I.e., all the action is there now.
I think I replied to a similar question, though I'm not sure it was yours; take a look on HCC.