Bamboo continuous integration with yarn test (Jest framework)

I am very new to Atlassian Bamboo and CI builds, and I could use some help.
My job is to create a continuous integration build plan for my React application, so I started with Bamboo.
My application's test cases are written with the Jest framework.
On my local machine I run the tests with the following command:
"yarn test"
I installed yarn on the Bamboo agent with "npm install yarn".
My requirement is that whenever I merge my code in GitHub, a build is triggered automatically in Bamboo, and if the test cases pass the code is deployed. The build plan is already being triggered whenever I merge code into GitHub (because in step 1 of the build plan I added a job to check out the code from my GitHub repo).
But I don't understand how to tell the build plan to run my Jest test cases with "yarn test".
The question might look very easy to you, so please help me.

The agent (local or remote) running your builds needs:
Node.js installed
npm installed - this typically comes with the Node.js install
yarn installed globally (npm i -g yarn)
Then you can use a Script task to run the yarn test command.
You can build on this by checking whether there are plugins that abstract the Script task into some sort of yarn task, and by processing the test results in Bamboo so that builds show the test results and pass/fail accordingly.
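For reference, a minimal sketch of what the Script task body could contain, assuming Node.js and yarn are already on the agent and that the jest-junit reporter has been added as a dev dependency (that reporter is an assumption on my part, so a JUnit Parser task can pick up the results afterwards):
set -e                                                         # fail the task if any command fails
yarn install --frozen-lockfile                                 # reproducible install from yarn.lock
CI=true yarn test --reporters=default --reporters=jest-junit   # run Jest once (no watch mode) and emit JUnit XML
A follow-up JUnit Parser task pointed at the reporter's output file then lets Bamboo display the test results and pass or fail the build accordingly.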

Related

How to create a Pipeline with Gitlab CI, just for the affected packages with Turborepo

So, I am in the process of integrating Turborepo into our Node.js (React, Next, Node) monorepo, which uses GitLab CI. The thing is, the example in the docs doesn't quite cover what I want.
For reference, here is what they have in their docs:
image: node:latest
# To use Remote Caching, uncomment the next lines and follow the steps below.
# variables:
#   TURBO_TOKEN: $TURBO_TOKEN
#   TURBO_TEAM: $TURBO_TEAM
stages:
  - build
build:
  stage: build
  script:
    - npm install
    - npm run build
    - npm run test
We have a few stages beyond the ones in their example:
install
build
package
What I would ideally like is to use Turborepo and GitLab Downstream Pipelines to run as follows:
The install stage should run when the root package.json has changed.
The build and package stages should run only for the affected packages (i.e., if shared-lib is changed, then shared-lib should run as well as its 2 consumers app-a and app-b, in parallel).
I read the docs and I can somehow make the Downstream job run, but not only for the affected packages; instead it runs for all of them. The main problem is how to read the affected packages and their consumers, and run just those.
I read that with the latest version I can use the --dry commands to read those. But let's say that works reliably (which, from my testing, it doesn't): how can I feed those packages in as Downstream jobs in GitLab?
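One possible direction (a rough, untested sketch; the jq expression and the exact JSON field names are assumptions and differ between Turborepo versions) would be to use the --dry=json output to list the affected packages, generate a child-pipeline YAML from it, and trigger that file as a dynamic child pipeline:
#!/bin/sh
set -e
# Packages affected since origin/main ("packages" as the field name is an assumption).
AFFECTED=$(npx turbo run build --filter='...[origin/main]' --dry=json | jq -r '.packages[]')
{
  echo "stages: [build]"
  for pkg in $AFFECTED; do
    echo "build-$pkg:"
    echo "  stage: build"
    echo "  script:"
    echo "    - npx turbo run build --filter=$pkg"
  done
} > child-pipeline.yml
# child-pipeline.yml can then be triggered as a dynamic child pipeline
# via trigger:include:artifact in .gitlab-ci.yml.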

How to chain vite build and tests?

It seems that the vite build command doesn't run tests (unit tests written with vitest, for example) before building the application for production. This seems quite unsafe.
Is there a way to configure vite build to run tests before building?
I couldn't find anything in the Build Options.
What do you have in your package.json scripts section? Don't you have a test script or the like?
build is used to bundle your app and ship it to production. It's not supposed to run any tests by default.
You could achieve this by doing something like
"scripts": {
"build": "vite build && npm run test:unit && npm run test:e2e",
"test:unit": "vitest --environment jsdom",
"test:e2e": "start-server-and-test preview http://localhost:4173/ 'cypress open --e2e'",
},
If you generate a new project via the CLI, most of these scripts will already be written for you; then it's a matter of chaining them properly with && to ensure they all succeed before proceeding further.
You can also add some Git hooks with something like husky + lint-staged so that your flow runs these checks by default before the code is even pushed to a remote repo.
Otherwise, it's part of your CI, whether that is a Docker Compose file, some GitHub Actions, GitLab pipelines, or anything else your DevOps team may have set up for your deploy environments.
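If you go the Git hook route, a minimal sketch of a husky pre-push hook (assuming husky v8-style hooks created with npx husky add, and the test:unit script from above) could look like this:
#!/usr/bin/env sh
. "$(dirname -- "$0")/_/husky.sh"
# Run the unit tests once (no watch mode) before every push;
# a non-zero exit code aborts the push.
npm run test:unit -- --run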

AppCenter Yarn 1.19 error Incorrect integrity when fetching from the cache

I have a React Native app hosted on Microsoft App center. The builds (both iOS and android) are failing because of yarn 1.19 (error Incorrect integrity when fetching from the cache)
I tried to remove the package-lock.json but it didn't help.
I would like to downgrade yarn or execute the cache clean command but don't know where to execute it.
I have installed the appcenter cli version 2.2.1 and successfully connected to it.
Where could I execute yarn cache clean for example?
I've read that I could also create a script, but I have no idea where to place it or what it should look like. Should it be in both the ios and android directories, or in the root? Thank you
I actually noticed that I had a yarn.lock back from the initial commit, even though we are using npm in the team.
In the build logs, there is this line:
/bin/sh -c if [ -f yarn.lock ]; then { yarn install && yarn list --depth=0; } else npm install; fi
So I deleted the yarn.lock file and now it builds successfully using npm!
I'm not sure whether you build your React app with an Azure DevOps pipeline. If yes, you can use a Command Line task to achieve what you want to do.
If the agent you use during the build is a hosted agent, then since each build uses a completely new VM, you need to install the App Center CLI once per build.
Use the following command to install the App Center CLI:
sudo npm install -g appcenter-cli
Then log in to it:
appcenter login --token {Token}
Here you first need to generate the token as described in the docs: go to https://appcenter.ms/ and log in -> click your avatar -> choose Account Settings -> click API Tokens -> click New API token, then select the corresponding scope for the token.
Copy the token and use it in this pipeline task. Note: I recommend storing this token in a secret variable for security.
Now, you can execute the clean command: yarn cache clean.
Where to place it and what it should look like?
According to your description, you should place this Command Line task as the first step, so that it cleans the Yarn cache before anything else.
Also, because of the image configuration the hosted agent uses, its installed Node.js version is 6.7.0, which does not match the runtime requirements of the App Center CLI. You also need to run a Node.js Tool Installer task to install Node.js v10.0.0.
Put together, the build contains the Node.js Tool Installer task and the Command Line task at the start of the pipeline.
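As a rough sketch (my own assumption of how the Command Line task's inline script could look, not part of the original answer):
sudo npm install -g appcenter-cli            # install the CLI on the fresh hosted VM
appcenter login --token "$APPCENTER_TOKEN"   # token supplied via a secret pipeline variable
yarn cache clean                             # clear the corrupted Yarn cache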
Should it be in both the ios and android directories?
As I mentioned previously, for a hosted agent each build uses a completely new VM, so yes, you must execute these two steps first in every build.
If what you use is your own private agent, then since you have installed the App Center CLI locally, the agent will automatically pick up the local configuration when running the Command Line task. In that case you just need to skip the install command in the task.
We fixed it by adding "yarn cache clean" to appcenter-post-clone.sh; you can add this shell script to the root of the project.
See the docs for details.
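For reference, a minimal sketch of appcenter-post-clone.sh (assuming it sits next to the package.json that App Center builds):
#!/usr/bin/env bash
# Runs right after App Center clones the repo, before dependencies are installed.
set -e
yarn cache clean   # clear the corrupted Yarn cache so the integrity error goes away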

Gitlab CI: create dist folder inside repository?

I just recently started using GitLab CI to automate some build/deploy steps. It works perfectly for building Docker images etc., but I was wondering whether it's possible to create a folder in the repository during a build step. For example, I'm now making an npm utility package, and I'm importing it in my other projects via a private GitLab repo (using a deploy token), but the code of the util package is written in ES6 and needs to be transpiled to CommonJS to be used in the other packages. Manually I can run npm run build and it will output a dist folder with the transpiled code.
I was trying (and researching) whether it's possible to automate this build process using .gitlab-ci, but so far I couldn't find anything.
Anyone know how I can achieve this and/or if this is possible?
Thanks in advance!
Not sure if I got your question correctly, so add more details if not.
When your CI build creates new folders or files, they are written to the task runner's file system (no surprise here, I assume).
If you want to access these files from Gitlab's web UI you can define them as artifacts in your build job (see https://docs.gitlab.com/ee/user/project/pipelines/job_artifacts.html)
Your build job would look something like this (pseudo-code written from memory, not tested on GitLab):
build:
  script:
    - npm run build
  artifacts:
    paths:
      - dist/
    expire_in: 1 week
UPDATE: If you want to upload the build artifact to an npm registry, you could just build and publish together:
build:
  script:
    - npm run build
    - npm publish <PARAMETERS>

Missing node_modules when deploying AngularJS2 application to Bluemix

We're trying to deploy an AngularJS2 application to Bluemix, but the "node_modules" folder is missing after the application is deployed to the server. We're using npm to build the application.
I found the following post that is mentioning the problem: (https://developer.ibm.com/answers/questions/181207/npm-install-within-subdirectory-not-creating-node.html)
My question would now be: what's the recommended best practice?
I believe you are installing the node modules using npm install; you should also save those modules in your package.json file, which you can do with npm install --save.
The recommended best practice would be to set up a Build Pipeline.
There could be 3 stages or more:
Build Stage: builds the app, so do things like npm install there so that the node_modules folder gets created for you (see the sketch after this list).
Test Stage: tests the app, so something like npm test runs all the tests in your app.
Deploy Stage: once the Build and Test stages run successfully, the Deploy stage actually deploys the app to the Bluemix domain.
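A minimal sketch of what the Build stage's build script could contain (the delivery-pipeline specifics, such as which directory gets handed to the Deploy stage, are assumptions here):
#!/bin/bash
set -e
npm install      # creates the node_modules folder on the pipeline's build machine
npm run build    # produce the production bundle
# Whatever ends up in the stage's build archive directory (including
# node_modules) is what the Deploy stage pushes to the Bluemix domain.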