I was writing a bash script to back up my organization's repos (private ones included), convert each into a tar file, and then send them off to an S3 bucket.
curl -H "Authorization: token {PAT}" -L https://api.github.com/repos/{org}/tarball/main > main.tar.gz
When I run this command against a single repo it works, but my task is to grab all 100+ repos in the organization. Any thoughts on what I am missing?
I don't think GitHub offers an API that does what you want in a single HTTP call out of the box.
You can try to create a Bash/PowerShell script to:
List the organization's repositories:
curl \
-H "Accept: application/vnd.github+json" \
-H "Authorization: Bearer <YOUR-TOKEN>" \
https://api.github.com/orgs/ORG/repos
(reference: https://docs.github.com/en/rest/repos/repos#list-organization-repositories)
Take the result and convert it to an array.
For each element in the array, run your command:
curl -H "Authorization: token {PAT}" -L
https://api.github.com/repos/{org}/tarball/main > main.tar.gz
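Putting those steps together, a minimal sketch (assuming jq is available and each repo's default branch is main, as in your command; note that the list endpoint is paginated, so with 100+ repos you have to page through with per_page/page or you only get the first 30):
#!/usr/bin/env bash
set -euo pipefail

org="ORG"
token="YOUR-TOKEN"
page=1

# Walk every page of the org's repository list (private repos included,
# provided the token has access) and download each repo's tarball.
while : ; do
  repos=$(curl -s \
    -H "Accept: application/vnd.github+json" \
    -H "Authorization: Bearer $token" \
    "https://api.github.com/orgs/$org/repos?per_page=100&page=$page" \
    | jq -r '.[].name')
  if [ -z "$repos" ]; then
    break
  fi

  for repo in $repos; do
    # Assumes the default branch is main; adjust (or read the repo's
    # default_branch field from the listing) if that is not the case.
    curl -s -L \
      -H "Authorization: Bearer $token" \
      "https://api.github.com/repos/$org/$repo/tarball/main" \
      > "$repo.tar.gz"
  done

  page=$((page + 1))
done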
Using the GitHub API, you can get a repository tree with (example done with curl):
curl -H "Accept: application/vnd.github.v3+json" \
https://api.github.com/repos/{owner}/{repo}/git/trees/{tree_sha}
Assuming I have a repo called dev owned by NewCo, and I want to list the tree called XXXX, I would run:
curl -H "Accept: application/vnd.github.v3+json" \
https://api.github.com/repos/NewCo/dev/git/trees/{tree_sha}
How can I find the {tree_sha} value for tree XXXX? Any idea where I can get this value?
You can use a commit sha from the commits endpoint for that: /repos/{owner}/{repo}/commits. For example:
#!/usr/bin/env bash
set -e

owner=zacanger
repo=fetchyeah

# Grab the sha of the most recent commit on the default branch.
sha=$(curl -s -H "Accept: application/vnd.github.v3+json" \
  "https://api.github.com/repos/$owner/$repo/commits?per_page=1" \
  | jq -r '.[0].sha')

# List the tree at that commit.
curl -H "Accept: application/vnd.github.v3+json" \
  "https://api.github.com/repos/$owner/$repo/git/trees/$sha"
You could also use the pull requests API, or any other endpoint that returns commit info (meaning, most of them except for user and org APIs).
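For instance, a hedged variant using the pull requests endpoint (reusing $owner and $repo from the script above, and assuming the repo has at least one open pull request; .head.sha is the tip commit of that PR's branch):
# Sketch: take the head commit of the most recent open pull request
# and list the tree at that commit.
sha=$(curl -s -H "Accept: application/vnd.github.v3+json" \
  "https://api.github.com/repos/$owner/$repo/pulls?per_page=1" \
  | jq -r '.[0].head.sha')

curl -H "Accept: application/vnd.github.v3+json" \
  "https://api.github.com/repos/$owner/$repo/git/trees/$sha"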
-H "Authorization: Bearer [OAUTH2_TOKEN]" \
-o "[SAVE_TO_LOCATION]" \
"https://storage.googleapis.com/storage/v1/b/[BUCKET_NAME]/o/[OBJECT_NAME]?alt=media"
I don't know where we need to give the local directory(SAVE_TO_LOCATION) path.
1. List the bucket:
gsutil ls gs://your-bucket
#gs://your-bucket/file
2. Assuming you want to download the blob gs://your-bucket/file to /home/user/file, pass the destination path to the -o flag (that is the SAVE_TO_LOCATION):
curl -X GET -H "Authorization: Bearer $(gcloud auth print-access-token)" -o "/home/user/file" "https://storage.googleapis.com/storage/v1/b/your-bucket/o/file?alt=media"
3. Check that the file was downloaded:
cat /home/user/file
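If the object sits under a "folder" prefix in the bucket, a minimal sketch (assuming a hypothetical object named dir/file; the slash in the object name must be percent-encoded as %2F in the JSON API path, and -o can point at any local path you like):
# Download gs://your-bucket/dir/file to /home/user/downloads/file.
# -o is the SAVE_TO_LOCATION: any local file path you choose.
mkdir -p /home/user/downloads
curl -X GET \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -o "/home/user/downloads/file" \
  "https://storage.googleapis.com/storage/v1/b/your-bucket/o/dir%2Ffile?alt=media"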
I want to automate the addition and removal of VMs from the RHEL subscription. I want to use a curl command if possible and keep it simple.
I tried executing curl commands against the api.access.redhat.com/management/v1/subscriptions endpoint, but I am getting errors like "Authentication parameters missing".
Below is an example command I am using:
curl -X GET -s -k -u username:Password "https://api.access.redhat.com/management/v1/subscriptions" -H "accept: application/json"
I expected to see the list of subscribed systems, but I'm getting the "Authentication parameters missing" message instead.
In order to get all the subscriptions you have, run the following command:
curl -H "Authorization: Bearer $access_token" "https://api.access.redhat.com/management/v1/subscriptions"
You can retrieve the access_token variable by running the following command:
curl https://sso.redhat.com/auth/realms/redhat-external/protocol/openid-connect/token -d grant_type=refresh_token -d client_id=rhsm-api -d refresh_token=$offline_token
The offline_token, in turn, has to be generated from the API Tokens page.
Check this article for further details.
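Putting the two calls together, a minimal sketch (this assumes jq is installed and that $offline_token has already been generated from the API Tokens page):
#!/usr/bin/env bash
set -e

# Exchange the offline token for a short-lived access token.
access_token=$(curl -s https://sso.redhat.com/auth/realms/redhat-external/protocol/openid-connect/token \
  -d grant_type=refresh_token \
  -d client_id=rhsm-api \
  -d refresh_token="$offline_token" \
  | jq -r '.access_token')

# List your subscriptions.
curl -H "Authorization: Bearer $access_token" \
  "https://api.access.redhat.com/management/v1/subscriptions"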
I'm trying to deploy a plan that has artefacts from an external service. For this I want to download those files via curl and pass them in as variables. However, I am not able to set the variables programmatically with the deployment call. Queuing a plan build with a variable works:
curl -k -u user:password -X POST -d "bamboo.myVariable=someurl" BASE_BAMBOO_URL/bamboo/rest/api/latest/queue/PROJECT-ID
Trying to do the same with the deployment API fails:
curl BASE_BAMBOO_URL/bamboo/rest/api/latest/deploy/project/1321123123 -u user:password -X POST -d "bamboo.myVariable=callMEwithDATA"
Trying to add the variable to the API call fails, as does trying to pass it through JSON:
curl -X POST BASE_BAMBOO_URL/bamboo/rest/api/latest/deploy/project/1320058 -u user:password -H "Accept: application/json" -H "Content-Type: application/json" -d '{"name":"release-1", "myVariable":"ARTEFACT_URL"}'
To get the request through, the variables have to be passed as query params... the sad, sad reality is that the Bamboo API is very messed up:
BAMBOO_URL?executeAllStages=true&bamboo.variable.MYVAR=1234
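As a concrete sketch, applied to the plan queue endpoint from your first command (the URL and project key come from the question; the bamboo.variable.NAME query-param form is the one described above):
curl -k -u user:password -X POST \
  "BASE_BAMBOO_URL/bamboo/rest/api/latest/queue/PROJECT-ID?executeAllStages=true&bamboo.variable.MYVAR=1234"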
Using the endpoint /folders of the Orange Cloud API, I can only get the listing of the files in the main directory:
curl -X GET -H "X-Orange-CA-ESID: OFR-2588c...2e64f249ab" -H \
"Authorization: Bearer OFR-2588c...2e64f249ab" \
https://api.orange.com/cloud/v1/folders/Lw
How could I get photos entries only, including the ones in subdirectories?
You can get all photos this way:
curl -X GET \
  -H "Authorization: Bearer OFR-948ef5..." \
  "https://api.orange.com/cloud/v1/folders?filter=image&flat=true"
By the way, the session header (X-Orange-CA-ESID) is no longer necessary. See:
https://developer.orange.com/apis/cloud-france/getting-started#filtering-on-photos,-videos,-audio-files