GitLab CI and Artifactory without Maven?

I would like to build an RPM package from a Python module and store it in Artifactory. Do I have to use Maven, or are there alternatives?

You can build the packages the way you are used to and deploy them to Artifactory via the REST API, the JFrog CLI, the Artifactory Java client, or the Go client.
The best approach is to use the JFrog CLI, which can also collect and publish build info.
There is a blog post about integrating Artifactory with GitLab CI. Although it is a bit outdated, it shows some examples and explains the advantages of collecting build info.
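For example, a GitLab CI job's script could upload the RPM and publish the build info roughly as below. This is a minimal sketch, assuming the JFrog CLI is already configured with your Artifactory URL and credentials; the repository name rpm-local, the build name my-module, and the use of $CI_PIPELINE_ID as the build number are assumptions, not part of the original answer.
# Upload the built rpm and attach it to a named build (placeholders throughout).
$ jfrog rt upload "dist/*.rpm" rpm-local/my-module/ \
    --build-name=my-module --build-number=$CI_PIPELINE_ID
# Capture environment details for the build info, then publish it to Artifactory.
$ jfrog rt build-collect-env my-module $CI_PIPELINE_ID
$ jfrog rt build-publish my-module $CI_PIPELINE_ID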

Related

Deployit integration with GitLab

I was using Deployit (XL Deploy) with Jenkins and SVN as the repository, but now I am planning to move to GitLab. Is there any good documentation on how to integrate GitLab with Deployit? I have searched but didn't find any reference documents.
Where should I start with this?
Has XL Deploy come up with any integration for GitLab?

JFrog Artifactory: what is it used for?

I have a question about JFrog Artifactory.
I don't know what it is used for. Can you explain it to me in a practical way?
Thanks in advance!
From the tag description of artifactory:
Artifactory is a binary repository manager for use by build tools (like Maven and Gradle), dependency management tools (like Ivy, NuGet and RubyGems) and build servers (like Jenkins, TeamCity and Bamboo). It comes as downloadable version (OSS, Pro and Enterprise with additional features) and cloud SaaS version.
Simply put, it's a self-hosted/on-premises remote location in which you can store Java/Maven dependencies (with the open source version) and various other package types (npm, RubyGems, etc.) with the Pro version.

Has anyone tried to deploy artifacts to Rational Asset Manager?

I am experimenting with using Rational Asset Manager (RAM) to store our binaries and/or build artifacts. I am running an mvn deploy command to deploy my build artifacts to RAM. Although it recognizes the connection, it throws an HTTP status code 500 error.
I have also checked the RAM logs for more information, but I don't see any specific exception. All the examples and documents out on the internet say we have to configure the RTC build engine to run the builds.
I just want to know if anyone has tried publishing to RAM from the command line using mvn deploy (without using the RTC client)? Is this doable?
If you have successfully published artifacts to RAM using Maven, can you please elaborate on how you did it?
It seems to be possible if you follow the steps described in "Creating and using Maven assets" and "Deploying from Maven to assets":
Before you can use Maven assets, a repository administrator must enable the Maven model library. For more information, see Enabling the Maven library.
The mvn client can integrate with Rational Asset Manager, using Rational Asset Manager as a Maven repository.
Before you deploy from Maven to Rational Asset Manager, you must add a repository entry to the pom.xml file on the computer where you plan to run the mvn client. See Creating and using Maven assets for more information.
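As a rough illustration of a command-line publish once RAM is exposed as a Maven repository: the repository id "ram", the URL and the coordinates below are placeholders, and deploy:deploy-file is used here only so the sketch does not depend on a particular pom.xml.
# Hypothetical sketch: push one artifact to a RAM-backed Maven repository.
# "ram" must match a <server> entry with credentials in ~/.m2/settings.xml,
# and the URL is a placeholder for your own RAM instance.
$ mvn deploy:deploy-file \
    -Dfile=target/my-app-1.0.0.war \
    -DgroupId=com.example -DartifactId=my-app -Dversion=1.0.0 \
    -Dpackaging=war \
    -DrepositoryId=ram \
    -Durl=https://ram.example.com/ram/repository/maven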
I was able to deploy the artifact using mvn deploy after changing the following things in RAM:
Pointed RAM to a JRE version (it was originally pointing to a JDK version), and
Changed the permissions for my role for saving my artifacts in RAM ("Save Work Item").

Deploy from Maven release repository to application

I just figured out how to release to the CloudBees-hosted Maven "release" repository. Now I am trying to figure out how to deploy a tagged version to a CloudBees application.
I understand I can manually upload the WAR file, but is there any way to script it? As far as I know, the Maven plugin for CloudBees doesn't support it.
I have one app server running snapshot builds from Jenkins.
I have another app server to which I want to deploy only tagged/released artifacts.
There are four ways to deploy applications to the CloudBees RUN@cloud service:
Using the bees command provided by the SDK
Using the bees-maven-plugin
Using the manual upload via the web GUI
Using the CloudBees Deployer plugin for Jenkins
Which option you choose depends on where the deployment will take place from... and by "from" I mean which machine is doing the deployment, not where the file is sourced.
If running from a Jenkins job, the best bet is the Jenkins plugin.
If running from your own laptop, the web ui or the bees command is simplest.
If running as part of a maven build, the maven plugin is simplest... (Though I should warn that the maven plugin (temporarily removing my cloudbees hat and putting on my maven PMC hat) is shite and does it all arsewise ;-) )
Your best bet is to set up a Jenkins job that uses dependency:get to pull the artifact from the repo and then add a CloudBees Deployer build step to push to RUN@cloud.
The good news is that bashing the Maven plugin into something more Maven-like is on our roadmap... Hopefully that will enable actions like those you can achieve with the ship-maven-plugin, where you can specify a specific released version for "shipping" to production.
I suppose that what you want to do is deploy a release artifact to your repository.
Have a look at the maven-release-plugin.
Briefly, what you need to do is:
$ mvn release:prepare
$ mvn release:perform
It's not entirely trivial, since you need to configure your pom.xml appropriately to get it working. Have a look at the maven-release-plugin examples and usage pages.
Are you creating the tag/release from a Jenkins build? If so, you could probably use the Deploy to CloudBees post-build step with target/checkout/something.war.
More generally, I guess you would want to write a script that uses mvn dependency:get followed by the Bees SDK to obtain the latest released artifact and deploy it.
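A rough sketch of such a script, assuming the CloudBees SDK is installed and configured: the coordinates, the repository URL, the account/application id and the local path below are all placeholders, and the exact bees options may differ between SDK versions.
# Fetch the released WAR into the local Maven repository...
$ mvn dependency:get \
    -Dartifact=com.example:my-app:1.2.3:war \
    -DremoteRepositories=https://repository-myaccount.forge.cloudbees.com/release
# ...then push it to the application with the Bees SDK (check `bees help app:deploy`).
$ bees app:deploy -a myaccount/myapp \
    ~/.m2/repository/com/example/my-app/1.2.3/my-app-1.2.3.war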

Set up a shared Ivy repository

I am setting up an Ant build system on a project, with dependency resolution managed by Ivy. I have it up and running, with the file system currently used for both the local and the shared repository. My ultimate goal is that when developers are fixing bugs or creating new functionality, they would only be able to put artifacts into their local repository. When they believe their code is ready to be used by the rest of the team, it would be promoted to the proper branch in SVN, and the group in charge of official builds would compile and publish the new artifacts.
So I guess my questions are: how can you control who can publish to a repository? Does Ivy just rely on filesystem permissions?
Also, I would eventually like to make my shared repository available via HTTP. I think I could point Apache at the file-system repository directories for retrieving artifacts, but how do you set up publishing to an HTTP repository?
I would suggest that you set up a repository manager to manage your project's build artifacts.
The best choices are one of the following:
Nexus
Artifactory
Archiva
Publishing to a Maven repository means that your artifacts can be consumed by projects using other build technologies. All modern build systems support Maven (including Ivy; see the ibiblio resolver).
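With a repository manager in front of the shared repository, both questions above largely go away: publishing over HTTP is an authenticated request, and who may publish is controlled by the manager's permission model rather than filesystem permissions. A minimal sketch against Artifactory's deploy endpoint, where the host, repository name, path and credentials are placeholders:
# Hypothetical sketch: publish one artifact over HTTP with an authenticated PUT.
# Only accounts granted deploy permission on libs-release-local can do this.
$ curl -u builduser:secret \
    -T build/mylib-1.2.jar \
    "http://repo.example.com/artifactory/libs-release-local/com/example/mylib/1.2/mylib-1.2.jar"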
You could specify three resolvers in your Ivy settings file. The first would be a chain resolver that includes the remote and local Ivy repositories. The second would be a resolver for the local Ivy repository only. The third would be a resolver for the remote Ivy repository only.
Every developer retrieves artifacts using the first (chain) resolver.
A regular developer publishes artifacts using the second (local) resolver.
Your special team could use the third (remote) resolver to publish to the remote Ivy repository.
To protect the remote repository from regular developers, place it on an (S)FTP server with write access protected by a password.
The only problem in this case is how to version artifacts so that artifacts published in the remote repository override locally published ones in some cases and not in others.
Our team used such a scheme a few years ago. But now we use only local Ivy repositories and a CI server to build and run tests from various branches. We arrived at this after switching to Git.
For an existing Ivy repository, it is easy to set this up with rest-ivy.