Artifactory REST API to modify Build Info JSON

I need to change the BuildUser name (principal) in the Build Info JSON while publishing a package to Artifactory.
Kindly let me know if there is any REST API (PUT/POST) available to update the user details in the build info.
Thanks,

Builds are supposed to be immutable, so there is no way to modify one. If you really need to do this, the closest you can get is deleting and re-deploying the existing build info:
curl -uuser:pass -XGET 'http://localhost:8081/artifactory/api/build/foobar/10' >build.json
curl -uuser:pass -XDELETE 'http://localhost:8081/artifactory/api/build/foobar?buildNumbers=10'
curl -uuser:pass -XPUT 'http://localhost:8081/artifactory/api/build' -H 'Content-Type: application/json' -T build.json
This should re-deploy the build exactly as it already is, except that Artifactory will overwrite the principal field with the current user (so make sure you run these as the user you want the principal to be set to). By default, the DELETE will only delete the build info, and not the build artifacts.
If you're looking to deploy a build with a different principal from the user you're deploying as, I don't think that's possible.

Downloading files (not from repo) from private GitLab server

I need to get a file from a private GitLab in a script (actually a Yocto recipe, if it matters).
Opening https://gitlab2server.com/api/v4/projects/53/packages/generic/paCKAGE/21.08.16/FILE.tar.xz in a browser works fine, but wget <same URL> fails with a "401 Unauthorized".
I can get around the problem with curl --header "PRIVATE-TOKEN: xxxx" ..., but that means encoding my private token into a shell script, which doesn't seem right.
To access a regular git repo I can use git clone git:... and it works because of the uploaded keys.
The equivalent scp gitlab2server.com:/api/v4/... . does not work; it fails with "Permission denied (publickey)."
What is the right way to do this?
Ideally I would have ssh (actually scp, of course) access using pre-shared keys to fetch the files. I would hate to put large binaries into the git repo just to be able to access them.
The only way to authenticate with the GitLab API (including the Packages API here) is with a personal access token, or with the CI_JOB_TOKEN if running within GitLab CI/CD. CI_JOB_TOKEN is one of the predefined variables available to every CI/CD pipeline job and holds a non-admin token.
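For example, from within a CI job you can fetch the same generic package file using the job token header (a sketch based on the URL from the question; adjust host, project ID, package name, and version to your setup):
curl --header "JOB-TOKEN: ${CI_JOB_TOKEN}" \
  "https://gitlab2server.com/api/v4/projects/53/packages/generic/paCKAGE/21.08.16/FILE.tar.xz" \
  --output FILE.tar.xz
Outside of CI, the same request works with a personal access token in a PRIVATE-TOKEN header, as in the question.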

Using the BitBucket API, list which branches have all content in master and which don't

I have a Bitbucket server which has hundreds of branches. I would like to separate the branches which contain code already on the master branch from those that don't.
The only way I've thought of to get the info would be to get a list of branches:
https://<bitbucket host>/rest/api/1.0/projects/<project>/repos/<MyRepo>/branches
and get the head of each branch
and see whether the commit exists in the list of commits from master:
https://<bitbucket host>/rest/api/1.0/projects/<project>/repos/<MyRepo>/commits?branch=master&limit=1000
This is extremely slow. Can anybody think of a better way of doing this? Basically I'm hoping to identify all branches which have been merged into master and so can be deleted somewhat safely.
I'm thinking that I want the equivalent of:
for branch in $(git for-each-ref --format='%(refname:short)' refs/remotes/origin); do
  echo "$branch: $(git rev-list --left-right --count origin/master..."$branch")"
done
Thanks in advance
What you can do first is get the list of all branches available in that particular repository using an API call and collect the names into a list. Since you have hundreds of branches and the endpoint is paged, loop the curl, iterating the page value until the response comes back empty.
curl --url "https://api.bitbucket.org/2.0/repositories/workspace/repository_name/refs/branches?pagelen=100&page={Iterate}" --user username:password --request GET --header "Accept: application/json"
Then there is an API available to check whether all the commits on a source branch are available on a destination branch:
curl --url "https://api.bitbucket.org/2.0/repositories/workspace/repository_name/commits/source_branch?exclude=destination_branch" --user username:password --request GET --header "Accept: application/json"
So you could take each branch as the source and master as the excluded destination, and check the response: if it is empty, everything on that branch is already in master; if not, the branch still has unmerged commits. That lets you categorize the branches that have all their content in master and those that don't.
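Putting both calls together, a rough sketch of the whole loop (assuming the Bitbucket Cloud 2.0 API as in the commands above, jq installed, and USER/PASS set; workspace and repository_name are placeholders, and branch names with special characters may need URL-encoding):
BASE="https://api.bitbucket.org/2.0/repositories/workspace/repository_name"
page=1
while :; do
  # List branch names page by page; stop when a page comes back empty.
  names=$(curl -s -u "$USER:$PASS" "$BASE/refs/branches?pagelen=100&page=$page" | jq -r '.values[].name')
  [ -z "$names" ] && break
  for branch in $names; do
    # No commits left after excluding master => the branch is fully merged.
    count=$(curl -s -u "$USER:$PASS" "$BASE/commits/$branch?exclude=master&pagelen=1" | jq '.values | length')
    if [ "$count" -eq 0 ]; then
      echo "fully in master: $branch"
    else
      echo "has unmerged commits: $branch"
    fi
  done
  page=$((page + 1))
done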

Is there an API for fetching the list of stopped containers

I need to get the list of all stopped containers via an API, but I have only found commands to get the list.
If no API is available, please suggest how I can create one around the docker commands, so that whenever I hit the API, I get the list of stopped containers.
First, if you need other machines to reach the Docker daemon, you need to enable remote access in /lib/systemd/system/docker.service, like this:
ExecStart=/usr/bin/dockerd -H fd:// -H tcp://0.0.0.0:2375
Second, you can paste the following URL into a browser to get all containers with status exited:
http://10.192.244.188:2375/containers/json?filters={"status":["exited"]}
If you use curl, you need to URL-encode the special characters in the filter JSON, like this:
curl http://10.192.244.188:2375/containers/json?filters=%7B%22status%22%3A%5B%22exited%22%5D%7D
You could also pipe the output through python -m json.tool to make it easier to read:
curl http://10.192.244.188:2375/containers/json?filters=%7B%22status%22%3A%5B%22exited%22%5D%7D | python -m json.tool
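If you only need the list on the Docker host itself, you can skip the TCP setup entirely and query the daemon over its Unix socket; curl's -G/--data-urlencode also spares you hand-encoding the filter (a sketch assuming the default socket path):
curl -sG --unix-socket /var/run/docker.sock \
  --data-urlencode 'filters={"status":["exited"]}' \
  http://localhost/containers/json | python -m json.tool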

CLI command for Sonarqube Upgrade browser step

https://docs.sonarqube.org/display/SONAR/Upgrading
I am just going through this documentation to upgrade SonarQube.
One of the steps is to open a URL in the browser and follow the instructions.
Is there any CLI command available for this step, so that I can automate it in my upgrade automation?
Most (or even all?) UI interactions only trigger Web API calls.
In your case, api/system/migrate_db seems to serve your purpose.
From the API documentation:
Migrate the database to match the current version of SonarQube. Sending a POST request to this URL starts the DB migration. It is strongly advised to make a database backup before invoking this WS.
To call it from the command line use:
curl -s -u admin:admin -XPOST "localhost:9000/api/system/migrate_db"
curl is a command-line tool for communicating via HTTP
-s enables silent mode
-u admin:admin provides authentication
-XPOST sets the HTTP method to POST (instead of the default GET)
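To fold this into an upgrade script, you can trigger the migration and then poll api/system/status until the instance reports UP (a sketch; host and credentials are the same placeholders as above, and the grep is only a rough check on the compact JSON response):
curl -s -u admin:admin -XPOST "localhost:9000/api/system/migrate_db"
until curl -s "localhost:9000/api/system/status" | grep -q '"status":"UP"'; do
  echo "DB migration still running..."
  sleep 10
done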

How to download all my caldav and carddav data with one wget / curl?

Until now, I used Google Calendar and did my personal backup with a daily wget of the public .ics link.
Now I want to switch to a new service which only offers CalDAV access.
Is there a way to download all my CalDAV and CardDAV data with one wget/curl?
The downloaded data should give me the possibility to restore lost data.
Thanks in advance.
Edit:
I created a very simple PHP file which works in the way hmh explained. I don't know if this way works for other providers, but for mailbox.org it works well.
You can find it here https://gist.github.com/ahemwe/a2eaae4d56ac85969cf2.
Please be more specific: what is the new service/server you are using?
This is not specifically CalDAV, but most DAV servers still provide a way to grab all events/todos using a single GET, usually by targeting the relevant collection, e.g. like either one of these:
curl -X GET -u login -H "Accept: text/calendar" https://myserver/joe/home/
curl -X GET -u login -H "Accept: text/calendar" https://myserver/joe/home.ics
In CalDAV/CardDAV you can grab the whole contents of a collection using a PROPFIND:
curl -X PROPFIND -u login -H "Content-Type: text/xml" -H "Depth: 1" \
--data "<propfind xmlns='DAV:'><prop><calendar-data xmlns='urn:ietf:params:xml:ns:caldav'/></prop></propfind>" \
https://myserver/joe/home/
Replace calendar-data with
<address-data xmlns="urn:ietf:params:xml:ns:carddav"/>
for CardDAV.
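Spelled out, the CardDAV variant could look like this (same placeholders as above; the address-book path is just a hypothetical example):
curl -X PROPFIND -u login -H "Content-Type: text/xml" -H "Depth: 1" \
--data "<propfind xmlns='DAV:'><prop><address-data xmlns='urn:ietf:params:xml:ns:carddav'/></prop></propfind>" \
https://myserver/joe/addressbook/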
This will give you an XML entity which has the iCal/vCard contents embedded. To restore it, you would need to parse the XML and extract the data (not hard).
Note: although this is plain standard, some servers reject it or just omit the content (lame! file bug reports ;-).
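To extract the embedded data, something like this could work (a sketch using xmllint; it assumes the PROPFIND response was saved to response.xml, and depending on the server you may still need to unescape XML entities in the output):
xmllint --xpath "//*[local-name()='calendar-data']/text()" response.xml > backup.ics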
Specifically for people using Baïkal (>= 0.3.3; other Sabre/dav-based solutions will be similar), you can go directly to
https://<Baïkal location>/html/dav.php/
in a browser and get an HTML interface that allows you to download .ics files, and so also lets you find the right links for use with curl/wget.
I tried the accepted answer, but it did not work for me. With my CalDAV calendar provider I can, however, retrieve all calendar files using
wget -c -r -l 1 -nc --user='[myuser]' --password='[mypassword]' --accept=ics '[url]'
where [myuser] and [mypassword] are what you expect and [url] is the same URL as the one that you enter in your regular CalDAV software (as specified by your provider).
The command creates a directory containing all the ICS files representing the calendar items. A similar command works for my address book.