How to set up an alias using the Serverless Framework - serverless-framework

I have a project I'm working on with another developer using the Serverless Framework on AWS. I need us both to be able to deploy the stack without stepping on each other's changes. I've been looking for an alias feature where I can provide some prefix or something that will make each deployment unique, but so far I've been unsuccessful. Is there such a feature in Serverless? If not, how do teams deploy multiple versions of the same code without stepping on each other?

You can use Serverless stages. Set your stage to your name, and your teammate can set the stage to theirs.
Production and dev can also be separate stages.
https://serverless-stack.com/chapters/stages-in-serverless-framework.html
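For example - a minimal sketch only, where the service name and the default stage are placeholders - serverless.yml can pick the stage up from the CLI:

service: my-service   # placeholder name

provider:
  name: aws
  # use the --stage CLI option if given, otherwise fall back to "dev"
  stage: ${opt:stage, 'dev'}

Each of you then deploys your own copy of the stack with something like serverless deploy --stage alice or serverless deploy --stage bob; because the stage is part of the CloudFormation stack name, the two deployments don't collide.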

Related

Providing environment variables with Vue.js and Azure DevOps

Right now I am building a project using Vue.js for the front end. When testing locally, creating a .env.development and a .env.production works fine across the different environments and the variables show up correctly. My issue comes when building in Azure DevOps. I am pointing to the dist folder, and this is, obviously, only providing production variables, which makes sense.
Is there a way to pass dev vs. prod environment variables to Vue to build against in an Azure DevOps/Vue project?
It seems like there is something "magical" about the way Vue injects these files into the index.html file, and I can't pinpoint how Vue decides which env variables to use.
This seems to me a question related not to Azure DevOps Pipelines but to the Vue build process.
I don't know a thing about Vue, but if it works similarly to other JavaScript/TypeScript frameworks, you should specify the environment in your build tasks.
In my Angular projects I create an npm task specifying which environment to build (e.g. npm run build:prod or npm run build:pre). Then, in my Azure Pipelines, I run the right task depending on the environment I'm going to deploy (you can even store the output in different build artifacts per environment, so all of them are available to your deployment pipeline).
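If the project uses the Vue CLI, the same idea applies: the --mode flag controls which .env.* file gets loaded at build time. A rough sketch of what the Azure Pipelines steps could look like (the script layout and mode names are assumptions, not something from your project):

steps:
  - script: npm ci
    displayName: Install dependencies
  - script: npx vue-cli-service build --mode production   # loads .env.production
    displayName: Build front end

Swap --mode production for --mode development (or a custom mode with its own .env file) in the pipeline that builds your dev environment.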
Finally (just a recommendation), review which values you store in your .env.production file, to be sure that it's safe to keep that file in a repository. If it contains sensitive information, use Pipeline Variables instead; Pipeline Variables can be kept hidden, available only to the DevOps team.
Regards.

ASP.NET Core front-end developer workflow with VSCode and VS 2019

I haven't done any cshtml front-end development for a few years.
What's the current, generally accepted way for ASP.NET Core front-end developers to work across a range of tools on Windows?
By that, I mean a way to have the front-end JS build and the .NET project(s) build together, and to iterate rapidly between the browser and the code.
My thinking is:
We have a much better command-line story around dotnet today.
Some folk like VS Code.
Some folk prefer VS 2019, and some like either, depending.
We need to work on UI aspects sometimes.
But we also need to attach a debugger and debug the server logic sometimes.
The build server should have no problem, be simple, and rely mostly on build logic held in the repo.
Tooling, and kicking off the whole build and serve process should be understandable and familiar.
It should be pretty simple to get going after a team noob clones the repo.
My initial thought would be to set up npm and then use something like Gulp to kick off everything, including running dotnet run.
Then when running under the Visual Studio 2019 debugger, use the Task Runner Explorer to kick off the Gulp stuff but skip the dotnet run part.
(It's a shame there doesn't seem to be a command line to start VS (Code or 2019) and attach the debugger.)
Now I'm expecting to get a "primarily opinion based" SO beating, but there are general trends and ideas behind how all these tools are designed to play together and what the dev story looks like.
You've pretty much already described the process. However, I'll add a few things:
You don't need the dotnet run bit. Visual Studio and VS Code are both capable of debugging directly.
You can assign the gulp tasks to build tasks in Task Runner Explorer, so you really don't even need to think about running those directly. I'm less sure about this aspect of VS Code, but there's probably an extension to handle it, if it's not already built in.
If you want true ease of development, the best thing you can do is use Docker. Just add a Dockerfile to each project that actually runs (i.e. not a class library) and set up the steps to build and run it there. In Visual Studio, you can right-click the project and choose Add > Docker Support, and it will actually generate a ready-made Dockerfile, though you may need to add a step or two to handle the client-side build steps. In any case, this then becomes truly click and run, with nothing to worry about. The story is even better when you use docker-compose, as then Visual Studio and VS Code can spin up your entire application stack all at once, including external dependencies such as a database, Redis instance, etc. If you haven't used Docker before, start now. It's absolutely revolutionary for development.
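As a rough illustration of the compose part (service names, ports, image, and paths below are placeholders, not your actual project layout):

# docker-compose.yml (sketch)
version: "3.8"
services:
  web:
    build: ./src/MyApp.Web        # uses the Dockerfile generated by Add > Docker Support
    ports:
      - "5000:80"
    depends_on:
      - db
  db:
    image: postgres:15
    environment:
      POSTGRES_PASSWORD: example   # dev-only placeholder, not for real secrets

A single docker-compose up then builds and starts the whole stack, and Visual Studio's container tooling can attach the debugger to the running web container.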
One note for CI/CD: as much as possible, you should add a YAML file to describe your CI/CD pipeline. Depending on the actual provider you're using for build/release, there might be some differences, so consult the relevant documentation. (Azure DevOps, for example, doesn't currently support describing release pipelines in YAML, though you can still do your build that way.) In any case, this allows you to configure all of this in code and have it committed to source control.
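A minimal build definition in that style might look like the following - a sketch that assumes a gulp task named build and tests in the solution; adjust names and paths to your repo:

# azure-pipelines.yml (sketch)
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: npm ci && npx gulp build
    displayName: Client-side build
  - script: dotnet build --configuration Release
    displayName: Server build
  - script: dotnet test --configuration Release --no-build
    displayName: Tests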
You may consider the same for your infrastructure. Azure has ARM templates, AWS has CloudFormation, GCP has Deployment Manager. There are also third-party tools like Terraform or Ansible. All of these, in some form or fashion (usually JSON or YAML), allow you to define all the characteristics of the infrastructure you're going to deploy to and commit that to source control. This makes deployment, and things like creating new environments, a breeze.

Azure DevOps multi CI/CD

I have the following use case:
We have one solution that contains 5-10 different services (.NET Framework web apps of various versions). We have to set up CI/CD in Azure DevOps to be able to automate the deployments of each service separately (or all services at once). There will be around 5 different environments for each service.
Challenges:
We are trying to avoid having (# of services x # of environments) separate builds and releases (~50 builds / ~50 releases).
We do have to be able to deploy one service alone without others being affected.
We do have to be able to deploy ALL services all at once for mass deployments.
P.S. We are currently using trunk-based development, but I am thinking about moving to gitflow to have branch-based triggers, as I feel it would be easier to manage in this case.
CI - handled by your build server (e.g. TeamCity). Responsibility: build, test, obfuscate, create packages and, lastly, push packages to a NuGet server (.NET specific). Traditionally, besides the app code, you also need at least two other packages: db migrations and infra migrations.
You build packages once and deploy the exact version everywhere else you want it to go.
https://gist.github.com/leblancmeneses/1d352bb79447cd7a486598c4dc796ef1
This script works in conjunction with https://github.com/leblancmeneses/RobustHaven.DevOps
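As a sketch of the pack-and-push step (the package id, version, feed URL, and API key below are placeholders):

nuget pack MyService.nuspec -Version 1.4.27
nuget push MyService.1.4.27.nupkg -Source https://nuget.example.com/api/v2/package -ApiKey %NUGET_API_KEY%

The same 1.4.27 package is then what every environment receives; nothing gets rebuilt per environment.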
CD - handled by something like Octopus Deploy. Responsibility: orchestrate the deployment process across your cluster. Octopus pulls packages from the NuGet server and moves them to whatever environment you want and to whatever machines make up that environment.
https://www.robusthaven.com/presentations/DevOps
You don't really need 50 builds; you can use a single build per service (assuming builds for different environments are identical) and build from different branches. Technically you could get away with a single release covering all 50 environments if you set up your triggers/phases properly, but that would be a mess; just create one release per environment. I can't see how managing 50 environments on a single release would be manageable.
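One way to keep it to a single build per service in Azure DevOps is a YAML build per service with a path filter, so a push only triggers builds for the services that actually changed. A sketch, assuming a repo layout like services/service-a/ (the layout, branch name, and task choice are assumptions):

# services/service-a/azure-pipelines.yml (sketch)
trigger:
  branches:
    include:
      - master
  paths:
    include:
      - services/service-a/*
pool:
  vmImage: windows-latest
steps:
  # NuGet restore and test steps omitted for brevity
  - task: MSBuild@1
    inputs:
      solution: services/service-a/ServiceA.csproj
      configuration: Release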
When YAML release pipelines arrive, this will become trivial; right now it's not, unfortunately.

Automatic Jenkins deployment

I want to be able to automate Jenkins server installation using a script.
I want, given a Jenkins release version and a list of {(plugin, version)} pairs, to run a script that will deploy a new Jenkins server for me and start it using Jetty or Tomcat.
It sounds like a common thing to do (a need to replicate a Jenkins master environment or create a clean one). Do you know what the best practice is in this case?
Searching Google only gives me examples of how to deploy products with Jenkins but I want to actually deploy Jenkins.
Thanks!
This may require some additional setup at the beginning but could save you time in the long run. You could use a product called Puppet (puppetlabs.com) to trigger the script automatically whenever you want. I'm basically using that to trigger build-outs of my development environments. As I find new things that need to be modified, I simply update my Puppet modules and don't need to worry about what has to be done to recreate the environments through testing for the next go-round.

Best mix of ASP.NET deployment tools

I know how many questions have been asked around this, but I still can't figure out the answer to my question.
I need to deploy an ASP.NET 4.0 site, and I want to do something like this:
Get the latest version of the entire solution - a website and a couple of class projects used by the web app (I am already doing this using CCNet - not a problem)
Build and deploy in debug config to a test site
Build and deploy in a release configuration to a staging site
If everything looks fine on the staging site, I'll run a script that deploys the release build from the staging site to 7-8 similar sites used by different customers on the same server. In the future this will be on another server.
There is MSDeploy (Web Deploy 2.0), aspnet_compiler, MSBuild, PowerShell (my weapon of choice...) and probably more... I am not 100% sure what to use where.
I would love to mimic the "only deploy files needed to run the site" behavior of the VS2010 GUI-assisted deployment, and I'd love to have the possibility of not touching some existing folders in the websites that we deploy to.
I feel like I should use MSDeploy a lot... but I find it pretty hard to GET. I'm reading away on IIS.NET and I've heard the Scott Hanselman/Jon Arild Tørresdal podcast. I am not 100% sure where to start... and I'm not an MSBuild expert, so PowerShell is looking pretty good to me. But I feel I'm missing out on the right tools by going that way...
What tool would you use in which step??
In my opinion, MSDeploy is built to achieve all of your needs with much more control. MSDeploy is a product of the IIS team and it's far more feature-rich.
Answers to your questions
Considerations: SVN as source control, MSDeploy for deployment.
You can use the SVN command-line tool to get the latest version of your source code into some temp directory and build it using the MSBuild command-line tool. Then you can create a deployment package or a publish content package which can be deployed to a specific environment. I am currently working on a project which creates a deployment package independent of the environment, meaning the package can be deployed to any stage/test/live environment. A sample for creating a deployment package is:
msbuild C:\Projects\NimBuildDeployTestApp\NimBuildDeployWebApp\NimBuildDeployWebApp.csproj /t:package /p:Configuration=Release /p:PackageLocation=C:\Temp\DeployPackage\NimTestWebApp.zip /p:EnablePackageProcessLoggingAndAssert=true
Use the MSDeploy command-line tool to deploy the package as below:
msdeploy -verb:sync -source:package=packagelocation -dest:CONFIGURABLE_ACCORDING_TO_YOUR_NEEDS
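For illustration only - the computer name, site name, and credentials below are placeholders, not values from your environment - a dest for pushing that package to a remote IIS box through the Web Deploy handler typically looks something like:

msdeploy -verb:sync -source:package="C:\Temp\DeployPackage\NimTestWebApp.zip" -dest:auto,computerName="https://staging-server:8172/msdeploy.axd",userName=deployUser,password=*****,authType=Basic -setParam:name="IIS Web Application Name",value="Default Web Site/NimTestWebApp" -allowUntrusted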
Use Step 2 to build the solution in Release mode using MSBuild and configure the dest parameter in msdeploy.
If everything seems fine on your staging server, run the deploymentToLive.batch file, which will deploy a particular PublishContent/package to your live server. Execute the 7 msdeploy scripts in the same batch file, or else use DFS (which is a far more complex solution).
MSDeploy is definitely the way to go. One thing you may consider is having a deployment package ready which can be deployed onto any of your environments. What I mean by that is your web.config file won't be the same for all environments, so create a deployment package which contains a web.config file for each environment. Refer to my solution here.
Hope this helps and gives you confidence in moving forward with automating your deployment process.