Is there a way to log all messages in RabbitMQ without using rabbitmq_management?

I know that rabbitmq_tracing, a RabbitMQ plugin, provides a GUI to capture traced messages and log them to text or JSON files. But the plugin has a significant performance cost. Is there a way to log all messages without it?
Or is there an alternative way to log messages automatically without using the management plugin? Configuring traces through the GUI is not acceptable for some customers.
Any response would be appreciated.

I couldn't find a good solution to log all messages without rabbitmq_management. But with the plugin enabled, you can add and delete RabbitMQ traces from the command line:
Add a new trace:
[windows:]
curl -i -u guest:guest -H "content-type:application/json" -XPUT ^
  http://localhost:15672/api/traces/%2f/my-trace ^
  -d"{""format"":""json"",""pattern"":""#"",""max_payload_bytes"":1000}"
[linux:]
curl -i -u guest:guest -H "content-type:application/json" -XPUT \
  http://localhost:15672/api/traces/%2f/my-trace \
  -d'{"format":"text","pattern":"#", "max_payload_bytes":1000}'
Delete a trace:
[windows:]
curl -i -u guest:guest -H "content-type:application/json" -XDELETE ^
  http://localhost:15672/api/traces/%2f/my-trace
[linux:]
curl -i -u guest:guest -H "content-type:application/json" -XDELETE \
  http://localhost:15672/api/traces/%2f/my-trace
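If you script this, it also helps to check which traces are currently configured. A minimal sketch, assuming the same default credentials and port as above (GET /api/traces is part of the same rabbitmq_tracing HTTP API):
curl -s -u guest:guest -H "content-type:application/json" \
  http://localhost:15672/api/traces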

Triggering Airflow DAG via API

I have installed Airflow 2.0.1 on EC2 with PostgreSQL RDS as the metadata db. I want to trigger a DAG from Lambda, so I tried to test the call with curl, but I am receiving Unauthorized as the response. What, if anything, should I be doing differently?
Steps:
Create user for lambda
airflow users create -u lambda_user -p some_pwd -f Lambda -l User -r User -e someone@nowhere.com
Define variables on shell (for lambda user, password and endpoint url)
Make the curl call
curl -H "Authorization: Basic Base64(username:password)" -H "Content-type: application/json" -H "Accept: application/json" -X GET --user "${LAMBDA_USER}:${LAMBDA_PWD}" "${ENDPOINT_URL}/api/v1/dags/sns_test/dagRuns"
The response I receive is this:
{
"detail": null,
"status": 401,
"title": "Unauthorized",
"type": "https://airflow.apache.org/docs/2.0.1/stable-rest-api-ref.html#section/Errors/Unauthenticated"
}
After revising the call to
curl -H "Content-type: application/json" -H "Accept: application/json" \
  -X POST --user "${LAMBDA_USER}:${LAMBDA_PWD}" "${ENDPOINT_URL}/api/v1/dags/sns_test/dagRuns" -d '{"conf": {}}'
the DAG was triggered!
You are creating a user with the role User, because you have -r User in the command.
Airflow requires at least Viewer permissions for the endpoint you are calling; you can find that information in the Apache Airflow documentation.
If you change your command it will work.
Change it from
airflow users create -u lambda_user -p some_pwd -f Lambda -l User -r User -e someone@nowhere.com
to
airflow users create -u lambda_user -p some_pwd -f Lambda -l User -r Viewer -e someone@nowhere.com
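For completeness, here is a minimal sketch of the trigger call itself, combining the pieces above; the base64 step just makes the Base64(username:password) placeholder from the first attempt concrete and is equivalent to curl's --user option:
# Build the Basic auth header explicitly (placeholder variables from the steps above)
AUTH=$(printf '%s' "${LAMBDA_USER}:${LAMBDA_PWD}" | base64)
curl -H "Authorization: Basic ${AUTH}" \
  -H "Content-Type: application/json" \
  -X POST "${ENDPOINT_URL}/api/v1/dags/sns_test/dagRuns" \
  -d '{"conf": {}}'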

Using curl to list a GitHub repository tree (GitHub API)

Using the GitHub API, you can get a repository tree with (example done with curl):
curl -H "Accept: application/vnd.github.v3+json" \
https://api.github.com/repos/{owner}/{repo}/git/trees/{tree_sha}
Assuming I have a repo called dev by owner NewCo, and I want to list the repo tree called XXXX, I would:
curl -H "Accept: application/vnd.github.v3+json" \
https://api.github.com/repos/NewCo/dev/git/trees/{tree_sha}
How can I find the {tree_sha} value for tree XXXX? Any idea where I can find this value?
You can use a commit sha from the commits endpoint for that: /repos/{owner}/{repo}/commits. For example:
#!/usr/bin/env bash
set -e
owner=zacanger
repo=fetchyeah
# Grab the sha of the most recent commit
sha=$(curl -s -H "Accept: application/vnd.github.v3+json" \
  "https://api.github.com/repos/$owner/$repo/commits?per_page=1" \
  | jq -r '.[0].sha')
# Fetch the tree for that commit
curl -H "Accept: application/vnd.github.v3+json" \
  "https://api.github.com/repos/$owner/$repo/git/trees/$sha"
You could also use the pull requests API, or any other endpoint that returns commit info (meaning, most of them except for user and org APIs).
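If you want the whole tree in one call instead of one level at a time, the trees endpoint also accepts a recursive parameter. A sketch built on the same variables as the script above, printing only the paths:
curl -s -H "Accept: application/vnd.github.v3+json" \
  "https://api.github.com/repos/$owner/$repo/git/trees/$sha?recursive=1" \
  | jq -r '.tree[].path'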

Can I get a working curl command to remove a system from RHEL subscription?

I want to automate the addition and removal of VMs from the RHEL Subscription. I want to use a curl command if possible and keep it simple.
I tried executing curl commands against the api.access.redhat.com/management/v1/subscriptions endpoint, but it returns errors like "Authentication parameters missing".
Below is an example command I am using:
curl -X GET -s -k -u username:Password "https://api.access.redhat.com/management/v1/subscriptions" -H "accept: application/json"
I expected to see the list of subscribed systems but instead get the "Authentication parameters missing" message.
In order to get all the subscriptions you have, run the following command:
curl -H "Authorization: Bearer $access_token" "https://api.access.redhat.com/management/v1/subscriptions"
You can retrieve the access_token variable by running the following command:
curl https://sso.redhat.com/auth/realms/redhat-external/protocol/openid-connect/token -d grant_type=refresh_token -d client_id=rhsm-api -d refresh_token=$offline_token
The offline_token, in turn, has to be generated from the API Tokens page.
Check this article for further details.
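Putting the two steps together, a minimal sketch (it assumes $offline_token has already been generated and that jq is available to pull access_token out of the token response):
# Exchange the offline (refresh) token for a short-lived access token
access_token=$(curl -s https://sso.redhat.com/auth/realms/redhat-external/protocol/openid-connect/token \
  -d grant_type=refresh_token -d client_id=rhsm-api -d refresh_token=$offline_token \
  | jq -r '.access_token')
# Call the subscriptions endpoint with the Bearer token
curl -H "Authorization: Bearer $access_token" \
  "https://api.access.redhat.com/management/v1/subscriptions"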

Bamboo Trigger Deployment plan with variables

I'm trying to deploy a plan that has artefacts from an external service. For this I want to download those files via curl, passing their URLs as variables... however, I am not able to set the variables programmatically with the deployment call:
curl -k -u user:password -X POST -d "bamboo.myVariable=someurl" BASE_BAMBOO_URL/bamboo/rest/api/latest/queue/PROJECT-ID
Trying to do the same with the deployment API fails
curl BASE_BAMBOO_URL/bamboo/rest/api/latest/deploy/project/1321123123 -u user:password -X POST -d "bamboo.myVariable=callMEwithDATA"
Adding the variable to that call fails, as does trying to pass it through JSON:
curl -X POST BASE_BAMBOO/bamboo/rest/api/latest/deploy/project/1320058 -u user:password -H "Accepts: application/json" -H "Content-Type: application/json" -d '{"name":"release-1", "myVariable":"ARTEFACT_URL"}'
For the request to go through, the variables have to be passed as query params... the sad reality is that the Bamboo API is very messy:
bamboourl?executeAllStages=true&bamboo.variable.MYVAR=1234

opendaylight: How to view config database

I am using the OpenDaylight Carbon release and the openflow plugin. I am writing code to install a flow. The flow gets written to MD-SAL and is picked up and installed by the southbound plugin. I want to see what is in the config database for the switch. How can I do this? Thanks.
With the MD-SAL OpenFlow plugin (and MD-SAL usage in general), flows are first written to the config datastore, which represents the intended state. Then, if a switch is connected for those flows, they are written to the switch and to the operational datastore, which is where the resulting state is stored.
Let's assume you're using OVS and have set its manager and controller to OpenDaylight. You can query the flows in the config and operational datastores as follows:
Get the OVS datapath ID (needed in the queries below):
curl -H "Content-Type: application/json" -X GET --user admin:admin http://localhost:8181/restconf/config/opendaylight-inventory:nodes/ | python -m json.tool | grep "openflow:"
"id": "openflow:156930464280132",
"id": "openflow:156930464280132:1",
"id": "openflow:156930464280132:LOCAL",
Query the flows in the configuration data store:
curl -H "Content-Type: application/json" -X GET --user admin:admin http://localhost:8181/restconf/config/opendaylight-inventory:nodes/node/openflow:156930464280132 | python -m json.tool
Query the flows in the operational data store:
curl -H "Content-Type: application/json" -X GET --user admin:admin http://localhost:8181/restconf/operational/opendaylight-inventory:nodes/node/openflow:156930464280132 | python -m json.tool
Note that you can go into more detail with the URL to get flows in specific tables; for instance, to get the table 4 flows:
curl -H "Content-Type: application/json" -X GET --user admin:admin http://localhost:8181/restconf/config/opendaylight-inventory:nodes/node/openflow:156930464280132/table/4 | python -m json.tool
Also note that piping the output through "python -m json.tool" formats it so it's not all on one line; it's not mandatory to use.
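You can also drill down one level further to a single flow entry if you know the id it was written with. A sketch following the same URL pattern, where the flow id my-flow-1 is just a placeholder:
curl -H "Content-Type: application/json" -X GET --user admin:admin http://localhost:8181/restconf/config/opendaylight-inventory:nodes/node/openflow:156930464280132/table/4/flow/my-flow-1 | python -m json.tool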