Compiling assets for multiple environments in Rails - ruby-on-rails-3

In my application I have created different environments besides development, such as staging, qa_test and production (all are like production but with different settings/configurations).
My question is that I want to compile the assets for staging, qa_test and production all together, so that each environment (Rails.env) picks up its own assets.
P.S. I have dynamic (environment-dependent) code in my asset files.
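A dry-run sketch of one way to do this, assuming the standard Rails 3.1+ asset pipeline rake task (the loop only prints the commands; drop the echo to actually run them):

```shell
# One precompile pass per target environment (environment names taken from
# the question). Remove the echo to actually execute each command.
for env in staging qa_test production; do
  echo RAILS_ENV="$env" bundle exec rake assets:precompile
done
```

Because each pass writes to the same public/assets directory by default, one option is to give each environment its own config.assets.prefix in its config/environments/*.rb file so the compiled outputs don't overwrite each other (an assumption about your setup, not something stated in the question).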

Related

msbuild not collecting static web assets from dependencies in restricted environment

I have a Blazor webassembly project which requires some web assets from its dependencies (most of such projects require some).
When I build the project on my machine, everything works fine.
However, when I try to run the very same build in a containerized environment, the application still builds without any errors or warnings, but static web assets are not included in the publishable output.
Within the containerized environment I have exactly the same toolchain and dependencies installed (dotnet core 6.0.102). The environment variables are different (a restricted subset).
I have checked the usual culprits (strace, etc.). I can see the relevant asset files and associated prop files being read by msbuild successfully. Yet, when I inspect the "Initial items" section in the msbuild log, the StaticWebAsset subsection is missing outright, even if I invoke the ResolveStaticWebAssetsConfiguration target directly.
Yet, on my normal workstation everything "just works".
Does anybody know what can cause msbuild to ignore all the static web asset property files after successfully reading them (as confirmed by strace)?

One Repository With Multiple Deployments, Environment Variables, and Secrets on Vercel?

I'm doing some early research for a project I plan to deploy to Vercel. I am wondering if the following is possible:
I want to have one GitHub repository. This repository will use environment variables for API tokens and basic settings.
I have three versions of the project that I want to create. Instead of creating three separate repositories, I'd rather have one repository and make the slight differences using environment variables. This will make updates, fixes, etc. much easier.
So, my question is: Is it possible to deploy one repository three times, each with different environment variables, using Vercel?
Yes, it is possible to deploy multiple environments from one repository. This can be done by importing your project to Vercel. For every commit you make to the git repo, a completely new environment is created for it. See https://vercel.com/docs/v2/git-integrations
You may also opt to create different git branches for each environment, and Vercel will take care of creating a new environment for each of them. See https://vercel.com/docs/v2/git-integrations/vercel-for-github#a-deployment-for-each-push
With regards to environment variables, here's what the doc says:
The maximum number of Environment Variables per Environment per Project is 100. For example, you can not have more than 100 Production Environment Variables.
Moreover, the total size of Environment Variables applied to a Deployment (including all the Environment Variables Names and Values) is limited to 4kb. Deployments made with Environment Variables exceeding the 4kb limit will fail during the Build Step.
- https://vercel.com/docs/v2/platform/limits?query=environment%20va#environment-variables
Environment Variables: https://vercel.com/docs/v2/build-step#environment-variables
Yes, they give you Production, Preview, and Development environments. Each has its own environment variables that you can save via the UI, or you can download the .env file via the CLI with vercel env pull.
https://vercel.com/docs/build-step#environment-variables
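As a dry-run sketch, that pull could be scripted per environment (the --environment flag and the per-environment file names are assumptions based on current Vercel CLI docs; the loop only prints the commands):

```shell
# Print one "vercel env pull" command per Vercel environment. Drop the echo
# to actually run them (requires the Vercel CLI installed and a linked project).
for target in development preview production; do
  echo vercel env pull ".env.$target.local" --environment="$target"
done
```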
Multiple Vercel projects can be created for the same GitHub repo.
In other words, there is no restriction that only a single Vercel project can be created for a single GitHub repo.
Then, different environment variables can be set for different Vercel projects.
Pushing a commit to the GitHub repo triggers build & deploy of multiple Vercel projects.
Reference: https://github.com/vercel/vercel/discussions/4879#discussioncomment-356114

TFS Build continuous integration with multiple branches

In my TFS solution, I have two branches Main and Dev. We have four different hosting environments Dev, ITST, QA and Prod, and a different build script for each environment.
Whenever there is a check-in, a build runs and deploys the solution to the Dev environment. However, it is only building our Main branch and not whatever is checked into the Dev branch.
For the Dev build script, how would I go about specifying which branch to build? I've already tried configuring the Source Settings, but when I specified the Dev branch for the Source Control Folder, I kept getting errors related to the mappings.
Update
Here is the error I'm getting: There is no working folder mapping for $/DLS/Application/P1/P1.sln
It sounds like TFS is looking for:
$/DLS/Application/P1/P1.sln
You have it mapped to:
$/DLS/Application/DEV
Try changing the 'Active' mapping to something like:
$/DLS/Application
Or
$/DLS/Application/P1

ANT build config with dependency on core files

Currently working on a PHP/HTML/JS project that is made up of a 'core' directory and multiple individual 'product' directories.
Dir structure as follows...
Core
  JS
  PHP
Product1
  JS
  PHP
  build.xml
Product2
  JS
  PHP
  build.xml
The ANT build script lives in each respective product directory. When an ANT build is triggered, the core files are copied into a deploy directory, then the respective product files are merged into (and overwrite, if necessary) the same directory. ANT then runs the QUnit and PHPUnit tests, concatenates and minifies the JavaScript, etc. This new deploy directory is then copied to the CI environment.
My question is, is this the best way of doing this? Is there a way to introduce versioning of the core files - so product1 might use v1.2 of core files?
This method also causes problems when running the application locally - effectively on every change of a file, the build script needs to be run again.
Does anyone have any suggestions of how to improve this?
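For reference, the copy-then-overlay step described above might look roughly like this in a product's build.xml (a hypothetical sketch; the directory names and target name are assumptions, not taken from the post):

```xml
<!-- Stage core files first, then overlay the product's own files on top. -->
<target name="stage">
    <property name="deploy.dir" value="deploy"/>
    <!-- 1. Copy the shared core files into the deploy directory. -->
    <copy todir="${deploy.dir}">
        <fileset dir="../Core"/>
    </copy>
    <!-- 2. Overlay product files, overwriting core files where they clash. -->
    <copy todir="${deploy.dir}" overwrite="true">
        <fileset dir=".">
            <exclude name="build.xml"/>
        </fileset>
    </copy>
</target>
```

For versioning the core files, one common approach is to publish Core as a versioned artifact and resolve it with a dependency manager such as Apache Ivy, so Product1 can declare a dependency on, say, core 1.2 instead of copying from a sibling directory.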

Maven best practice for generating artifacts for multiple environments [prod, test, dev] with CI/Hudson support?

I have a project that needs to be deployed into multiple environments (prod, test, dev). The differences mainly consist of configuration properties/files.
My idea was to use profiles and overlays to copy/configure the specialized output. But I'm stuck on whether I should generate multiple artifacts with specialized classifiers (e.g. "my-app-1.0-prod.zip/jar", "my-app-1.0-dev.zip/jar") or create multiple projects, one for every environment.
Should I use the maven-assembly-plugin to generate multiple artifacts for every environment?
Anyway, I'll need to generate all of them at once, so it seems that profiles do not fit ... still puzzled :(
Any hints/examples/links will be more than welcomed.
As a side issue, I'm also wondering how to achieve this with CI (Hudson/Bamboo): how do I generate and deploy these artifacts for all the environments to their proper servers (e.g. using the SCP Hudson plugin)?
I prefer to package configuration files separately from the application. This allows you to run the EXACT same application and supply the configuration at run time. It also allows you to generate configuration files after the fact for an environment you didn't know you would need at build time. e.g. CERT
I use the "assembly" tool to zip up each domain's config files into named files.
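A hypothetical maven-assembly-plugin descriptor for zipping one environment's config files into its own named artifact might look like this (the id and directory layout are assumptions, not from the answer):

```xml
<!-- src/main/assembly/dev-config.xml: bundle the dev configuration files
     into their own zip, separate from the application artifact. -->
<assembly>
    <id>dev-config</id>
    <formats>
        <format>zip</format>
    </formats>
    <includeBaseDirectory>false</includeBaseDirectory>
    <fileSets>
        <fileSet>
            <directory>src/main/config/dev</directory>
            <outputDirectory>/</outputDirectory>
        </fileSet>
    </fileSets>
</assembly>
```

One such descriptor per environment yields artifacts like my-app-1.0-dev-config.zip, which can be deployed alongside the single, environment-agnostic application artifact.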
I would use the version element (like 1.0-SNAPSHOT, 1.0-UAT, 1.0-PROD) and thus tags/branches at the VCS level in combination with profiles (for environments specific things like machines names, user name passwords, etc), to build the various artifacts.
We implemented a m2 plugin to build the final .properties using the following approach:
The common, environment-unaware settings are read from common.properties.
The specific, environment-aware settings are read from dev.properties, test.properties or production.properties, thus overriding default values if necessary.
The final .properties file is written to disk from the Properties instance after reading the files in the given order.
Such .properties file is what gets bundled depending on the target environment.
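The override order described above ("later files win") can be illustrated with a small sketch (the awk merge and the sample keys are an illustration only, not the plugin's actual implementation):

```shell
# Sample inputs: a common file plus an environment-specific override file.
printf 'db.host=localhost\napp.name=my-app\n' > common.properties
printf 'db.host=dev-db\n' > dev.properties

merge_props() {
  # Keep the last value seen for each key, preserving first-seen key order.
  awk -F= '!/^#/ && NF {
             v[$1] = $0
             if (!($1 in seen)) { order[++n] = $1; seen[$1] = 1 }
           }
           END { for (i = 1; i <= n; i++) print v[order[i]] }' "$@"
}

merge_props common.properties dev.properties > merged.properties
cat merged.properties
# db.host=dev-db
# app.name=my-app
```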
We use profiles to achieve that, but we only have the default profile - which we call the "development" profile, and which includes the configuration files - plus a "release" profile, where we don't include the configuration files (so they can be properly configured when the application is installed).
I would use profiles to do it, and I would append the profile name to the artifact name if you need to deploy it. I think it is somewhat similar to what Pascal suggested, only that you would be using profiles and not versions.
PS: Another reason why we have dev/ release profiles only, is that whenever we send something for UAT or PROD, it has been released, so if there is a bug we can track down what the state of the code was when the application was released - it is easier to tag it in SVN than trying to find its state from the commit history.
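As a sketch of the profile-plus-classifier idea, a pom.xml fragment might look like this (the profile id and the jar-plugin configuration are assumptions; check the plugin docs for your versions):

```xml
<!-- Activate with: mvn package -Pprod -->
<profiles>
    <profile>
        <id>prod</id>
        <build>
            <plugins>
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-jar-plugin</artifactId>
                    <configuration>
                        <!-- Appends the environment name, e.g. my-app-1.0-prod.jar -->
                        <classifier>prod</classifier>
                    </configuration>
                </plugin>
            </plugins>
        </build>
    </profile>
</profiles>
```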
I had this exact scenario last summer.
I ended up using profiles for each higher environment with classifiers. Default profile was "do no harm" development build. I had a DEV, INT, UAT, QA, and PROD profile.
I ended up defining multiple jobs within Hudson to generate the region specific artifacts.
The one thing I would have done differently was to architect the projects a bit differently so that the region-specific build was outside of the modularized main project. That way it would simply pull in the latest artifacts for each specific build rather than rebuilding the entire project for each region.
In fact, when I setup the jobs, the QA and PROD jobs were always setup to build off of a tag. Clearly this is something that you would tailor to your specific workplace rules on deployment.
Try using https://github.com/khmarbaise/multienv-maven-plugin to create one main WAR and one configuration JAR for each environment.