I have a project that implements a shared CI library, and other projects include it in their .gitlab-ci.yml files.
All of these projects also share the same service account, which is added as a CI/CD variable to each project that includes the library.
I tried setting this service account as a CI/CD variable on the library project, but I cannot share it with the projects consuming the library.
All these projects are in different groups and, at least until now, there is no need to move them.
Is there another way to share service accounts/API keys among the different projects consuming a CI library?
If you are using your own runners, then possibly: for example, credentials provisioned in the runners' own environment configuration are available to every project those runners serve.
If not, you might be able to write some tooling to retrieve the variables via the API.
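For example (a rough sketch, not tested): a job in the library could fetch the variable at pipeline time from one designated project via the variables API. The project ID 12345, the variable name SERVICE_ACCOUNT_KEY, and the GITLAB_API_TOKEN bootstrap token are all assumptions here:

```yaml
# Sketch only. Assumes GITLAB_API_TOKEN (a token that can read project 12345)
# is the one bootstrap secret each consuming project still defines, and that
# SERVICE_ACCOUNT_KEY is a CI/CD variable set on project 12345.
fetch-service-account:
  script:
    - |
      export SERVICE_ACCOUNT_KEY=$(curl --silent --fail \
        --header "PRIVATE-TOKEN: ${GITLAB_API_TOKEN}" \
        "${CI_API_V4_URL}/projects/12345/variables/SERVICE_ACCOUNT_KEY" \
        | sed -n 's/.*"value":"\([^"]*\)".*/\1/p')
    - echo "service account key retrieved (not printed)"
```

You still have to distribute GITLAB_API_TOKEN itself, but that reduces the problem to sharing one bootstrap credential instead of every key.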
I have an MVC 5 application we're moving from on-premises to the Azure cloud. Currently we have several publish profiles, one per environment, selected by a PowerShell script. One of our goals is to keep the build scripts and infrastructure as simple as possible, so I was wondering whether I could set the publish profile to be used from my appveyor.yml file alone.
Is there a way to set the publish profile from the appveyor.yml file?
If not, what are my choices?
You can run your PowerShell script as part of the desired build step in the pipeline. You can run commands right from the YAML file or the UI, or check your PowerShell script into the repository and run the .ps1 file. You might consider using secure variables to avoid checking things like connection strings into the repo in clear text.
However, this custom script/profiles approach will not let you use the built-in WAP artifact packaging, and you will also need a custom script instead of the automatic MSBuild mode. Which is OK, but a little more scripting. You will also need to publish the artifacts so they are available for deployment.
Maybe an easier option is to let AppVeyor do all the build and WAP artifact packaging/publishing automatically, and then use the built-in Web Deploy deployment with Web Deploy parameterization instead of multiple publish profiles.
But if you decide to go with custom scripts and multiple publish profiles, you can still use the built-in Web Deploy deployment with artifacts created by your scripts.
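As a rough sketch of the custom-script route (the profile name, solution file, and variable name are all placeholders), you can select the publish profile straight from appveyor.yml:

```yaml
# appveyor.yml (sketch). PUBLISH_PROFILE and MyApp.sln are invented names.
environment:
  PUBLISH_PROFILE: Staging

build_script:
  - ps: >-
      msbuild MyApp.sln
      /p:Configuration=Release
      /p:DeployOnBuild=true
      /p:PublishProfile=$env:PUBLISH_PROFILE
      /verbosity:minimal
```

Overriding PUBLISH_PROFILE per branch or via a build matrix then lets one YAML file drive every environment.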
I would like to build a custom component and share it with my team. I don't want to upload the new component online. Is there an offline way to share components?
I can define my own GitHub account instead of Mule's when creating a project. Will it then be private?
Adding details:
I created JARs which I want to use in my Mule project. I added 2 Java classes to my project that use the JARs. I also have 3 apps on different computers that need the JARs.
Instead of duplicating these JARs and the Java code, I would like to wrap them in a component/connector and share that between the apps/developers on my team. As far as I understood from reading, either a connector or a component would fit my needs. However, I couldn't figure out how to share what I built offline.
The best option is to Maven-deploy your common JARs to a private Maven repository. This can be as simple as an S3 bucket or as refined as a Nexus server.
This way, your different Mule projects will be able to pull these common JARs in their builds by simply adding them to their pom.xml files.
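A minimal sketch of both halves (the repository URL and coordinates are made up):

```xml
<!-- In the shared library's pom.xml: where `mvn deploy` publishes to.
     The URL is a placeholder for your Nexus/S3-backed repository. -->
<distributionManagement>
  <repository>
    <id>internal-releases</id>
    <url>https://nexus.example.com/repository/releases/</url>
  </repository>
</distributionManagement>

<!-- In each Mule app's pom.xml: consume the shared code like any dependency. -->
<dependency>
  <groupId>com.example.shared</groupId>
  <artifactId>common-components</artifactId>
  <version>1.0.0</version>
</dependency>
```

(For an authenticated repository, the machine running mvn deploy typically also needs matching <server> credentials in its settings.xml.)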
What is a 'scoped' repository workspace?
Does this mean only members of the current team will be able to view the repository, rather than 'public', where everyone in the project area can view it?
Scoped is what I always recommend when creating a repo workspace:
It allows other members of the project area to access your repo workspace by adding it to their flow targets.
That lets them accept change sets from your repo workspace even though you haven't delivered those change sets to the stream.
This is a nice change from the "reserved checkout" issue in ClearCase, where a file stays blocked because the collaborator who checked it out is no longer there.
Here, as long as you have checked in your changes, even if you are not there to deliver them, those changes aren't locked away on your computer but are available to the other members.
This is different from a scoped flow target.
And RTC 4.0 introduced scoped read permissions on files and folders.
That being said, a "public" repo workspace has its use (see this thread):
The idea of using a public repository workspace is to provide an up-and-running development environment for the team.
I do not want a developer to spend 2-3 hours with the support of somebody else to set up his workspaces to run a web application with the J2EE artifacts. Currently we are using Maven to build our applications and set up the development environment, and we are struggling with it.
Too much knowledge is required of the developer, and way too much money is spent trying to automagically configure the RAD 7.5 workspace with Maven and our own scripts.
The idea is to set up a pre-configured public repository workspace with all the necessary RAD 7.5 artifacts (server, EAR configuration, web configuration, links between projects, and a link to the Maven repository for components that you don't want to load in your workspace).
For our team we may have around 8 public repository workspaces, some with only the front-end projects, others with only the back-end projects, or a mix of both, depending on our specific needs.
Each morning the developer picks the proper public repository workspace for his task and is up and running in 10 minutes. He can see ongoing changes from others, and accept changes from his teammates or not. Of course, from the workspace the changes can be delivered to the stream used for continuous integration.
I think it's cool.
I know Maven has a central repository to which you can upload build artifacts (assemblies) and then reference them in your build script so that you get the latest versions.
Is there any similar tool (other than Maven for .NET) that provides a way to centrally store artifacts and reference them in MSBuild scripts?
I'm trying to figure out how to incorporate a library solution we have that is used across all our other solutions (it contains common data access, schemas, etc.).
We don't always want it automatically included in our projects, as sometimes one project needs to stay on a particular version until it is ready to upgrade to the latest.
If I were you, I'd create a build system that allows publishing packaged 'modules' from one end and importing them from the other end.
You create a shared directory at a global place within your organization, and this becomes the "central repository" you're talking of.
Alas, I'm not aware of any public implementation of such an MSBuild system.
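A homegrown version can be surprisingly small, though. As a sketch (the UNC path, project names, and version folder are all invented), the library publishes its versioned output to the share after each build, and consumers reference a pinned version from it:

```xml
<!-- In the library's .csproj: copy the build output to the shared
     "central repository" after a successful build. -->
<Target Name="AfterBuild">
  <Copy SourceFiles="$(TargetPath)"
        DestinationFolder="\\server\artifacts\CommonLib\1.4.0\" />
</Target>

<!-- In a consuming .csproj: pin a specific published version. -->
<ItemGroup>
  <Reference Include="CommonLib">
    <HintPath>\\server\artifacts\CommonLib\1.4.0\CommonLib.dll</HintPath>
  </Reference>
</ItemGroup>
```

Pinning the version folder in the HintPath is what lets one project stay on an older build while the others move ahead.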
We have a Java codebase that is currently one Web-based NetBeans project. As our organization and codebase grow, it seems obvious that we should partition the various independent pieces of our system into individual JARs: one JAR library for the data access layer, one for a general lib, one for specialized knowledge access, etc. Then we'd have a separate project for the web application, and could eventually have one for a command-line tools app, another web app, etc.
What is the recommended practice for doing and managing this? Is it Maven? Can it all be done effectively with NetBeans alone, by simply creating individual projects and setting the dependencies of one project on the JAR files of the others?
I'd agree with SteveG above on using Maven 2 to help you modularise your code base, but I'd use Nexus as the local repository for Maven instead of Archiva. The guys at Sonatype also have an excellent (free HTML/PDF) book on how to use Maven and Nexus, and integrate them into IDEs.
Be careful on how you decide to partition up your projects, though. There's no sense in over-complicating your dependencies just for the sake of it.
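For instance, pointing all of your builds at the Nexus instance takes one settings.xml fragment (the URL is a placeholder):

```xml
<!-- ~/.m2/settings.xml fragment (sketch). -->
<mirrors>
  <mirror>
    <id>company-nexus</id>
    <mirrorOf>*</mirrorOf>
    <url>http://nexus.example.com/repository/maven-public/</url>
  </mirror>
</mirrors>
```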
I would definitely say check out Maven (2). It is very good for doing this sort of thing. You can define individual modules and version them very easily. NetBeans also does a decent job of integrating with it.
Also, I suggest you set up Archiva, which will let you depend on binaries of other artifacts that your company generates internally. It also acts as a proxy and will keep a local copy of any external dependencies your projects have, so it's very quick to get new versions internally.
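As a sketch of what the consuming side looks like (the Archiva URL and coordinates are invented), each project points at the internal repository and then depends on internal artifacts by version:

```xml
<!-- pom.xml fragment (sketch). URL and coordinates are placeholders. -->
<repositories>
  <repository>
    <id>company-internal</id>
    <url>http://archiva.example.com/repository/internal/</url>
  </repository>
</repositories>

<dependencies>
  <dependency>
    <groupId>com.example</groupId>
    <artifactId>data-access</artifactId>
    <version>1.2.0</version>
  </dependency>
</dependencies>
```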
I would create Ant scripts to build the pieces and handle deployment. Then you are not depending on your IDE for build/deployment.
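A minimal build.xml along those lines (the module name and paths are invented) might look like:

```xml
<!-- build.xml sketch: compile one library piece and package it as a JAR. -->
<project name="data-access" default="jar" basedir=".">
  <target name="compile">
    <mkdir dir="build/classes"/>
    <javac srcdir="src" destdir="build/classes"/>
  </target>
  <target name="jar" depends="compile">
    <mkdir dir="dist"/>
    <jar destfile="dist/data-access.jar" basedir="build/classes"/>
  </target>
</project>
```

The same pattern repeats per module, with a top-level script invoking them in dependency order.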
It sounds like your code is getting to the point where you're graduating from the WAR approach to the EAR level.
An EAR is just another archive that contains all the other JARs and WARs that get combined to create an application. There are four types of modules that can reside inside it: Web, EJB, Connectors, and Utilities. Most people only use Web and Utilities, so they go with the WEB-INF/lib approach.
But if you're starting to get a lot of interdependencies, what you do is create an EAR project and make your web project a child of it. Each utility JAR (straight Java code used by other modules) also becomes a child of the EAR. Finally, each of your projects should have a META-INF/MANIFEST.MF file that lists the JARs that JAR/WAR depends on.
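For example, a WAR that uses two utility JARs packaged at the root of the EAR might declare (the JAR names are invented):

```
Manifest-Version: 1.0
Class-Path: common-util.jar data-access.jar
```

The Class-Path entries are resolved relative to the referencing archive's location inside the EAR.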
I'm an Eclipse guy and most of this gets taken care of for you in Eclipse, but I'm sure NetBeans has very similar functionality.
Now the only problem is that you have to use a full Java EE server to deploy an EAR, so I don't think you can keep using Tomcat if that's what you're on currently.