Providing environment variables with Vue.js and Azure DevOps

Right now I am building a project using Vue.js for the front end. When testing locally, creating a .env.development and a .env.production works fine; the correct variables show up in each environment. My issue comes when building in Azure DevOps. I am pointing to the dist folder, and this, obviously, only provides production variables, which makes sense.
Is there a way to pass dev vs. prod environment variables to Vue.js to build against in an Azure DevOps/Vue project?
There seems to be something "magical" about the way Vue injects these values into the built index.html, and I can't pinpoint how Vue decides which env variables to use.

This seems to me like a question related not to Azure DevOps Pipelines but to the Vue build process.
I don't know a thing about Vue, but if it works similarly to other JavaScript/TypeScript frameworks, you should specify the environment in your build tasks.
In my Angular projects I create an npm task specifying which environment to use (e.g. npm run build:prod or npm run build:pre). Then, in my Azure Pipelines, I run the right task depending on the environment I'm deploying to (you can even store the output in different build artifacts per environment, so all of them are available in your deployment pipeline).
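For Vue CLI specifically (assuming that's what your project uses), the equivalent knob is the --mode flag: vue-cli-service build --mode development loads .env.development, while a plain vue-cli-service build runs in production mode and loads .env.production. A minimal pipeline sketch, with a hypothetical buildMode pipeline variable:

```yaml
# azure-pipelines.yml (sketch) -- assumes a Vue CLI project and a
# pipeline variable "buildMode" set to "development" or "production"
steps:
  - script: npm ci
    displayName: Install dependencies
  - script: npm run build -- --mode $(buildMode)
    displayName: Build Vue app against the matching .env file
```

The extra `--` passes the --mode flag through npm to vue-cli-service, so the same build script works for both environments.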
Finally (just a recommendation), review which values you store in your .env.production file, to be sure it's safe to keep that file in a repository. If it contains sensitive information, I'd recommend using Pipeline Variables instead: they can be kept hidden, available only to the DevOps team.
Regards.

Related

Do I have to rebuild my frontend for production every time I edit it? I'm using Vue

Basically what I have is my frontend (Vue) and my backend (Node.js, etc.). Following a guide, I built the frontend for production using npm run build. I got a bunch of files in a build folder I had set up in a previous step. These files were then moved to a folder in the backend. It works, but it's more a demo than anything else, and the frontend and backend will be modified further as I continue.
I'm just wondering, if and when I edit the frontend more (say, when I add a new page), am I supposed to go through this process again? So I'll modify the frontend folder, build it, move files, etc.
Thanks.
Yes, definitely.
If we are in a development environment, we use npm run dev or yarn dev, which keeps the development server running and updates the browser whenever the code changes. We don't use a final build in development because we change the code so frequently that making a build after every modification and checking the results against it would be a tedious process.
But production is distinct from the development environment. We deploy only code that is bug-free, fully working, and ready for users. Deploying to production means all changes have been made and the final code is ready to ship. So we make a final production build and deploy it to our server.
So, don't rush to deploy to production every time you make a small change in the code. First complete all your changes and test them in the development environment; only if everything works correctly should you create a final production build and deploy it to the server.
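If it helps, the "move files to the backend" step usually boils down to having the Node server statically serve the build output. A minimal sketch, assuming an Express (v4) backend and a dist/ output folder (your folder names may differ):

```js
// server.js -- sketch: serve the production frontend build from the backend.
const express = require('express');
const path = require('path');

const app = express();

// Serve the compiled frontend assets produced by `npm run build`.
app.use(express.static(path.join(__dirname, 'dist')));

// Fall back to index.html so client-side routes survive a refresh.
app.get('*', (req, res) => {
  res.sendFile(path.join(__dirname, 'dist', 'index.html'));
});

app.listen(3000, () => console.log('Listening on port 3000'));
```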
I hope this helps.

How to make changes to a hosted Vue project

I have a Vue project published on DigitalOcean. The main problem is that when I make changes via FileZilla, they don't affect the website. How can I solve this issue?
This is not an issue per se; this is just how modern web development works. Vue.js (and also Nuxt) uses a bundler these days (Webpack and Vite are the most common), so to go to production the code needs to be bundled each time you push a change.
If you upload something via FTP or SSH and edit some source code, a bundling step is required before any changes appear in the actual web app.
Backend languages may not need that; for example, you could SSH into a server, change some .php file, and if you hit F5 the page is updated in real time. But this is not how frontend JS code works: it needs to be optimised.
Another thing: sending code via SSH/FTP is not really a good workflow, because it is not easily trackable, not version-controlled, won't fail a build when something is broken, etc.
The best approach is to have a git repo + some build step included in some CI.
A common platform for this is Netlify: you connect a GitHub repo and tell it which command builds the project, and each time you push some code it can run checks/tests/optimizations (via its build pipeline, or GitHub Actions if you prefer) before the result is automatically released to production (updated on your web app).
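The Netlify side of that is usually just a couple of lines of config. A sketch, assuming a standard Vue setup where npm run build writes to dist/:

```toml
# netlify.toml (sketch) -- adjust command/publish to your project
[build]
  command = "npm run build"
  publish = "dist"
```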
This workflow has a lot of benefits, as one may tell, and is also the de facto, standard approach for modern frontend web development.

Prevent Expo from doing a full config load during expo install

We have extra config for our app inside app.config.ts, plus some environment-variable validation used to populate it. As I understand it, expo install reads some of the core Expo config in order to make decisions about library versions. It does not need the full configuration with the extra params. We have no way of detecting expo install vs. a normal build at runtime (i.e. it does not set any specific environment variables, or anything like that).
For our application, rather than using dotenv at runtime we simply require certain environment variables to exist.
Our local development scripts to start the server use dotenv-cli to populate some environment variables. Our CI builds rely on the environment variables set in CI. For this reason we always validate the required environment variables and don’t pre-populate anything.
We would like either a pre-script hook, so we can make the same dotenv call before expo install happens, or a way to detect inside app.config.ts that expo install is running (some env var), so we don't need to expose the full config.
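For illustration, the pre-script workaround we have in mind is roughly the following (a sketch using dotenv-cli, which our local scripts already use, with a hypothetical script name and .env path); the open question is whether expo install can be forced through it, or detected from app.config.ts:

```json
{
  "scripts": {
    "expo:install": "dotenv -e .env -- expo install"
  }
}
```

With that, npm run expo:install -- some-package appends the package name to the wrapped command, so the env vars are populated before Expo evaluates app.config.ts.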
Does anyone know how this could be achieved?
FYI, this exact question was raised on the Expo forums over a week ago, but there does not seem to be enough attention/activity there: https://forums.expo.dev/t/expo-install-does-full-config-load-including-extra-any-way-to-pre-hook-and-set-env-vars-or-detect-expo-install-at-runtime/62123

Options for locally hosted client-side package management in VS2019?

A common issue I keep bumping into on web projects in .NET Core is the need to share JavaScript in easy-to-use modules from project to project. Oftentimes, large quantities of code written in VS project A could be put to good use in project B, sometimes in the same solution.
Restrictions:
Must be self-hosted and not publicly exposed; only machines on the local network, etc., can access the libs/modules/packages
Ideally can be driven from Visual Studio projects and make use of build tasks, PowerShell, MSBuild, or other such automation tools to deploy, package, minify, bundle, etc. the JavaScript libraries
The absolute ideal is if this can all be hosted from just a network folder
NPM/Yarn
I'm not super familiar with either of these, but is there a way to drag and drop JavaScript code we've built into some designated folder, perhaps modify some form of manifest (JSON or XML file or what have you), and then anyone can just npm install those packages? I guess what I'm wondering is: is there a way to tell npm, "This folder is now a source of packages you can install from"?
Bonus points: if that "trust this folder" config can be set inside the VS project, so if someone new grabs the git repo it will just work out of the box, and they don't need to go through steps configuring npm or Yarn so it knows how to find those packages.
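From some initial digging, npm's file: protocol looks like it might be exactly this: you can point a dependency at any folder that contains a package.json (or at a tarball produced by npm pack), and since the path lives in package.json itself, it should work out of the box for anyone cloning the repo. A sketch with a hypothetical package name and path:

```json
{
  "dependencies": {
    "shared-ui-utils": "file:../packages/shared-ui-utils"
  }
}
```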
Libman
Same as above, but mostly I'm trying to figure out whether there is any way at all to configure LibMan from VS. It's the default and what is currently in use, but it just has the four default CDNs it comes with, and I am not seeing any way to tell LibMan, "Here's a new source of files to trust; add it to the selectable dropdown."
So far I'm seeing basically zero configuration options for LibMan, which is quite disappointing.
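The one lead I've found is that LibMan does ship a filesystem provider alongside the CDN ones, which can copy files from a local or network folder, though I haven't confirmed how well it surfaces in the VS tooling. A libman.json sketch with a hypothetical UNC path:

```json
{
  "version": "1.0",
  "defaultProvider": "cdnjs",
  "libraries": [
    {
      "provider": "filesystem",
      "library": "\\\\fileserver\\js-packages\\shared-lib\\",
      "destination": "wwwroot/lib/shared-lib"
    }
  ]
}
```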
Nuget
This is the other option that is already popular locally, but something about using NuGet to deliver JS files when npm, Yarn, and LibMan already exist sets my teeth on edge. That said, I believe we already have a locally hosted NuGet server that could be used, so the infrastructure is in place (and if not, I know how to set it up). I do like the fact that NuGet 100% for sure can leverage actual projects, build steps, MSBuild, etc. for deploying.
Conclusion
What's the popular and easy way to do this nowadays? The best-case scenario is if there's a way to go, "Put a manifest.json file in the folder root that points to all the modules inside, add it as a trusted source to your package manager, and now you can install those packages."
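For npm specifically, if the answer turns out to be a self-hosted registry (Verdaccio seems to be the common choice), I'm assuming a project-level .npmrc would satisfy the out-of-the-box requirement, since npm picks it up automatically after a clone. Something like (hypothetical internal URL):

```ini
; .npmrc at the repo root (hypothetical self-hosted registry URL)
registry=http://npm.internal.example:4873/
```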

Why does Aurelia install so many dependencies?

I am curious to know why, when I create a new Aurelia project, each project installs 600+ packages into node_modules. Understandably, the modules collectively don't take up a lot of space, but are all of these modules necessary? I was under the impression that Aurelia's aim was to help developers move away from depending on 3rd-party libraries, so it seems odd that each project comes with a massive dump of them.
My guess is that you are starting your project from the CLI, which comes preset with an HTTP server, ES6/2015, SASS, live-reloading, and more.
I created a clean Aurelia project and looked at the package.json: there were 5 dependencies and 34 dev dependencies. Using all of the above-mentioned tools is fairly standard in today's JS web development, and generating a project from the CLI reduces the time needed for upfront setup. All of these features come with their own dependencies, and that's why the node_modules/ folder grows rapidly.
The bottom line is: you could start a new Aurelia project with far fewer dependencies. On their home page you can find a starter project with just three. But that also means you won't have access to most of the tools used today.
Also, and correct me if I'm wrong, I never got the impression Aurelia aimed to move devs away from third-party libs and modules, just to be modern, fast, and unobtrusive.
All modern web frameworks have a host of tooling. The reasons, in no particular order:
1. Transpiling ESNext or TypeScript - If you want to write in future JavaScript but have it work in all browsers, you need this step. Both Babel and the TypeScript tooling come with extra stuff too. If you want to see coverage (everyone does), there's another tool.
2. Testing - Unit tests and end-to-end tests require testing frameworks and test runners, and if you want to write them as above (ESNext or TypeScript) you also need transpiling.
3. Module Loading / Bundling - Require.js, JSPM/System.js, webpack, etc. are used to allow your code to actually run in the browser. Without a module loader you could not break your code out into separate files. Without a bundler you would be loading a lot of extra files in production.
4. Serving your application - If you want to run your app locally you need a way to serve it up and watch for changes.
5. Debugging - You want to debug? Now you need a way to map the file that gets served to the browser back to the original source.
6. Linting - Lint your code base for style consistencies.
Each of these packages usually has its own dependencies, which get pulled down as well.
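To make that concrete, here is roughly how those six categories show up as devDependencies in a typical project today: transpiling (typescript), testing (jest), bundling (webpack), serving (webpack-dev-server), debugging via source maps (source-map-loader), and linting (eslint). These are illustrative package choices, not what Aurelia's CLI actually generates:

```json
{
  "devDependencies": {
    "typescript": "^5.4.0",
    "jest": "^29.7.0",
    "webpack": "^5.90.0",
    "webpack-dev-server": "^5.0.0",
    "source-map-loader": "^4.0.0",
    "eslint": "^8.57.0"
  }
}
```

Each entry pulls in its own dependency tree, which is how six top-level tools become hundreds of installed packages.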
This convention of small, single-focus packages is arguably better than massive packages that do everything for you, because it allows you to remove a package and replace it with one that does the same thing in the way you want.