https://docs.sonarqube.org/display/SONAR/Upgrading
I am just going through this documentation to upgrade SonarQube.
One of the steps is to open a URL in the browser and follow the instructions.
Is there a CLI command available for this step, so that I can automate it in my upgrade automation?
Most (or even all?) UI interactions only trigger Web API calls.
In your case, api/system/migrate_db seems to serve your purpose.
From the API documentation:
Migrate the database to match the current version of SonarQube.
Sending a POST request to this URL starts the DB migration. It is
strongly advised to make a database backup before invoking this WS.
To call it from the command line use:
curl -s -u admin:admin -XPOST "localhost:9000/api/system/migrate_db"
curl is a command-line tool for communicating via HTTP
-s enables "silent mode"
-u admin:admin provides the authentication credentials
-XPOST sets the HTTP method to POST (instead of the default GET)
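The migration runs asynchronously, so for full automation you will also want to wait for it to finish. The Web API exposes GET api/system/db_migration_status for that; here is a minimal bash sketch (the exact state names should be verified against the Web API docs of your SonarQube version):

#!/bin/bash
# Trigger the DB migration, then poll until SonarQube reports it finished.
# Assumes a local instance on port 9000 with default admin credentials.
BASE="http://localhost:9000"

curl -s -u admin:admin -XPOST "$BASE/api/system/migrate_db"

while true; do
  # The status response is JSON with a "state" field; extract it without jq.
  state=$(curl -s -u admin:admin "$BASE/api/system/db_migration_status" \
          | grep -o '"state":"[^"]*"' | cut -d'"' -f4)
  echo "migration state: $state"
  case "$state" in
    MIGRATION_SUCCEEDED|NO_MIGRATION) break ;;
    MIGRATION_FAILED) echo "migration failed" >&2; exit 1 ;;
  esac
  sleep 5
done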
I am trying to generate PDF reports and download them using a script. I followed the instructions below.
https://github.com/elastic/kibana/blob/master/docs/user/reporting/automating-report-generation.asciidoc
I am able to queue the report and I also got a download URL ()/api/.../download/xyzdrfd, but when I try wget on that URL it doesn't work. I have no idea how to download the report using the API, so I just tried wget.
Can anyone tell me how to download the reports from an API call?
The download might not be happening because of redirects on the page. Use the -L option with the curl command to get it working; I did this specifically against the Kibana endpoint to download a PDF file. Replace the username and password with your own basic-auth credentials, and use the -o option to specify the name of the downloaded file. Below is the complete command:
curl -L -u username:password -o download.pdf https://endpoint.com:9244/s/bi-/api/reporting/jobs/download/ktl8n95q001edfc210feaz0r
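Note that report generation is asynchronous, so the download URL may not return the PDF immediately after the job is queued. A hedged bash sketch that retries until the file is ready (username, password, and the job URL are the same placeholders as above):

#!/bin/bash
# Poll the Kibana reporting download URL until the PDF is available.
URL="https://endpoint.com:9244/s/bi-/api/reporting/jobs/download/ktl8n95q001edfc210feaz0r"

for i in $(seq 1 30); do
  # -w '%{http_code}' prints the final status; -o writes the body either way,
  # so only trust download.pdf when the status is 200.
  code=$(curl -s -L -u username:password -o download.pdf -w '%{http_code}' "$URL")
  if [ "$code" = "200" ]; then
    echo "report saved to download.pdf"
    exit 0
  fi
  echo "attempt $i: HTTP $code, report not ready yet"
  sleep 10
done
echo "gave up waiting for the report" >&2
exit 1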
I need to forward some database-related logs to a Splunk indexer using scripted inputs (shell scripts).
My questions are:
1) Do I need to install the Universal Forwarder on the host side?
2) Is there any other way, without installing a UF on the host, to get the logs into the indexer using scripted inputs?
3) What steps do I need to follow to accomplish this?
1) To run a scripted input you need either a Universal Forwarder or a Heavy Forwarder. You'll need the HF to run a Python script.
2) See #Akah's answer.
3) See http://docs.splunk.com/Documentation/Forwarder/7.2.1/Forwarder/Abouttheuniversalforwarder
You can use the HTTP Event Collector which permits you to send data to the indexer via HTTP in JSON format.
There are examples showing how to do it via curl (and therefore from a script):
curl -k https://<host>:8088/services/collector -H 'Authorization: Splunk <token>' -d '{"sourcetype": "mysourcetype", "event":"Hello, World!"}'
You can follow the walkthrough too.
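As a sketch of how this could look without any forwarder, the following shell script posts each line of a database log to HEC. Host, token, and log path are placeholders, and the naive JSON quoting would need proper escaping for lines containing quotes or backslashes:

#!/bin/bash
# Send each line of a database log to the Splunk HTTP Event Collector.
HEC_URL="https://splunk-host:8088/services/collector"
TOKEN="00000000-0000-0000-0000-000000000000"
LOGFILE="/var/log/mydb/db.log"

while IFS= read -r line; do
  curl -k -s "$HEC_URL" \
       -H "Authorization: Splunk $TOKEN" \
       -d "{\"sourcetype\": \"mysourcetype\", \"event\": \"$line\"}"
done < "$LOGFILE"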
I am looking for a persistent key-value DB that can be accessed via HTTP. I need it for storing Postman test script data. I have heard of rocksdb and leveldb, but I am not sure whether they can be accessed via HTTP.
leveldb and rocksdb don't have a network component.
I created a small Python project that exposes a document-datastore-like API you can query using REST. Have a look at it: https://github.com/amirouche/deuspy. It relies on leveldb for persistence.
There is a Python asyncio client, and it's very easy to create a client of your own.
To get started, you can simply do the following:
pip3 install deuspy
python3 -m deuspy.server
And then start querying.
Here is an example curl-based session:
$ curl -X GET http://localhost:9990
{}
$ curl -X POST --data '{"héllo": "world"}' http://localhost:9990
3252169150753703489
$ curl -X GET http://localhost:9990/3252169150753703489
{"h\u00e9llo": "world"}
You can also filter documents; have a look at how the asyncio client is implemented.
Take a look at Webdis, which provides HTTP REST API access to the Redis key-value store. Redis has very good performance and scalability.
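With Webdis every Redis command maps onto a URL path, so your Postman scripts can read and write keys with plain HTTP requests. A short curl session against a default local Webdis instance (port 7379) would look roughly like this:

$ curl -s http://127.0.0.1:7379/SET/mykey/hello
{"SET":[true,"OK"]}
$ curl -s http://127.0.0.1:7379/GET/mykey
{"GET":"hello"}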
I have a question about APIs and cURL. I'm not sure if this is all Python, but I am trying to access JSON data using an API, and the server isn't as simple as grabbing the data with an XMLHttpRequest... The support team gave me this line of code:
curl -k -s --data "api_id=xxxx&api_key=xxxx&time_range=today&site_id=xxxxx" \
  https://my.incapsula.com/api/stats/v1
And I have no idea what this even means, because all the API requests I've been making were as easy as using a link and parsing through it with some JavaScript. Can anyone break down the -k -s --data for me, or point me to the right tutorial?
(NOT PYTHON; Sorry guys...)
The right tutorial is the man page.
-k/--insecure
(SSL) This option explicitly allows curl to perform "insecure" SSL connections and transfers. All SSL connections are attempted to be made secure by using the CA certificate bundle installed by default. This makes all connections considered "insecure" fail unless -k/--insecure is used.
See this online resource for further details: http://curl.haxx.se/docs/sslcerts.html
-s/--silent
Silent or quiet mode. Don't show progress meter or error messages. Makes Curl mute.
As for --data, it specifies the data you are sending to the server.
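Worth knowing: --data also turns the request into a POST and sends the payload as application/x-www-form-urlencoded, which is why no explicit -X POST is needed. Since the API returns JSON, you can pretty-print the response by piping it through Python's built-in formatter, e.g.:

curl -k -s --data "api_id=xxxx&api_key=xxxx&time_range=today&site_id=xxxxx" \
  https://my.incapsula.com/api/stats/v1 | python -m json.tool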
This question is (for now) not related to Python at all, but possibly to shell scripting.
I am trying to automatically deploy our Java EE application from our build server (Jenkins) to a remote Glassfish server via the command line.
At the moment I am using asadmin for this and it works fine, but this option requires me to have Glassfish installed on the build server as well - which I would like to avoid as I do not need it there. The build server is really only running the builds and the deployment so I would like to keep the server as "clean" as possible.
I can't find any download that installs only the asadmin tools, and my attempt to manually copy over just the required files failed because of dependencies on certain *.jar files that I don't know of, so it always fails unless I copy the whole Glassfish installation folder to the build server.
So my question is:
Does anybody know how to install only the asadmin tools without installing the whole Glassfish server?
Alternatively I would also be happy to use any other command line tools as long as they allow me to deploy to a remote Glassfish instance using secure communication.
After doing a bit more research, I gave up on trying to install asadmin without the full Glassfish installation and instead used Glassfish's REST admin interface.
I got it working using curl in a simple batch file:
curl.exe ^
--user glassfish_username:glassfish_password ^
--insecure ^
-H "Accept: application/json" ^
-H "X-Requested-By: dummy" ^
-X POST ^
-F id=@yourfile.war ^
-F contextroot=yourcontextroot ^
-F force=true ^
https://yourservername:4848/management/domain/applications/application/
The REST API is fairly straightforward once you know what you need to do, but just in case somebody else needs this, here are a couple of important points:
--insecure is required (by curl) to allow self-signed and untrusted SSL certificates.
The header attributes "Accept" and "X-Requested-By" must be set, otherwise Glassfish doesn't process the request and simply returns a blank document as an answer. No idea why, but setting these parameters made it work.
The content of the war file is passed as the "id" parameter of the POST.
The URL needs to be exactly as shown in the snippet above, i.e. do not replace "domain" with your domain name or "application" with your application name. This is the actual REST interface endpoint; there is no need to specify the domain/application name anywhere.
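For reference, if your Jenkins job runs on a Linux agent rather than Windows, the same call only needs backslash line continuations instead of ^ (same placeholders as the batch version above):

curl \
  --user glassfish_username:glassfish_password \
  --insecure \
  -H "Accept: application/json" \
  -H "X-Requested-By: dummy" \
  -X POST \
  -F id=@yourfile.war \
  -F contextroot=yourcontextroot \
  -F force=true \
  https://yourservername:4848/management/domain/applications/application/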