Serverless to AWS - GitLab CI/CD - gitlab-ci

I'm developing an application using the Serverless Framework and I want to deploy it to AWS using GitLab CI/CD.
Following the best-practices documentation, I've set up a single repo with the following structure:
│
├── services
│   ├── customer-api
│   │   └── handler.js, serverless.yml
│   └── payment-api
│       └── handler.js, serverless.yml
│
├── serverless-common.yml
└── .gitlab-ci.yml
So inside the services folder I have several folders (one for each API / Lambda).
I've also set up the following simple GitLab CI/CD configuration:
image: node:latest
stages:
  - deploy
dev:
  environment: dev
  stage: deploy
  before_script:
    - npm config set prefix /usr/local
    - npm install -g serverless
  script:
    - serverless deploy --stage dev --verbose
The problem I have is that I want to deploy only the Lambdas that have changes.
Should I go directory by directory inside the services folder and run serverless deploy for each API, like this?
script:
  - cd services/customer-api
  - serverless deploy --stage dev --verbose
  - cd ..
  - cd payment-api
  - serverless deploy --stage dev --verbose
How are you managing serverless deployments using GitLab CI/CD?

Regarding your folder structure: if you can do something like this:
│
├── services
│   ├── customer-api
│   │   └── handler.js, customer-api.yml
│   └── payment-api
│       └── handler.js, payment-api.yml
│
├── serverless.yml
└── .gitlab-ci.yml
and reference those functions in the main serverless.yml file like this:
functions:
  customer-api: ${file(./services/customer-api/customer-api.yml)}
  payment-api: ${file(./services/payment-api/payment-api.yml)}
then you don't need to go into each folder to run the deployment; a single
script:
  - serverless deploy --stage dev --verbose
will be enough.
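Each of the referenced files then holds just that function's configuration. A minimal sketch of what ./services/customer-api/customer-api.yml could contain (the handler name and HTTP event below are assumptions for illustration, not from the original post):
# ./services/customer-api/customer-api.yml (illustrative sketch)
# the handler path is resolved relative to the root serverless.yml
handler: services/customer-api/handler.customer
events:
  - http:
      path: customers
      method: get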
For more info, you can check here.
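If you'd rather keep one serverless.yml per service, as in the original layout, and still deploy only the services that changed, another option (not covered in the answer above) is to give each service its own GitLab CI job scoped with rules:changes. A minimal sketch; the job name and paths are illustrative:
deploy-customer-api:
  stage: deploy
  environment: dev
  rules:
    - changes:
        - services/customer-api/**/*   # run this job only when customer-api files change
  before_script:
    - npm install -g serverless
  script:
    - cd services/customer-api
    - serverless deploy --stage dev --verbose
You would declare one such job per service (e.g. a matching deploy-payment-api job for services/payment-api).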

Related

Yarn Workspaces seemingly not recognizing directory

Wasn't sure of the best verbiage for the title, but basically I have a monorepo set up using yarn workspaces. It works for every other directory I have. Here's my layout to make it easy to visualize:
root
├── package.json
├── node_modules
├── appsync
│   └── package.json
├── aurora
│   └── package.json
└── lambdas
    ├── lambda-function-1
    │   └── package.json
    └── lambda-function-2
        └── package.json
Roughly, this is as simple as I can make it without getting into too much unnecessary detail. In my root package.json I have my yarn workspaces set up like so:
"workspaces": {
"packages": [
"./*",
"aurora/*",
"lambdas/**",
]
}
The issue is this:
For every other directory I have, Yarn Workspaces is working perfectly. appsync is covered by the ./* declaration, and the packages in its package.json get installed properly to the root directory and not in the appsync directory. Same with my lambdas: I declare those differently because they won't be covered by the ./* declaration, and this works great too, no issues.
Today I added a new directory, and to my surprise saw modules being installed to aurora and not to the root. So I'm very confused: what's wrong with my root package.json that's causing this? Or any ideas I can try to get more information? I have run yarn both at the root and in the aurora directory, and no matter what I try, it always installs modules to the aurora directory and also adds a yarn.lock file to that directory.
Any help is appreciated.

CopyWebpackPlugin not copying files when running dev server

what I am trying to achieve
I am trying to synchronise locale files from a yarn workspace to multiple Vue apps in a monorepo so they can be used with i18next in each app. They need to be:
- kept in sync during development
- automatically updated when the translation files get updated
- eventually present in the dist folder so that they get deployed with the rest of the app (which should happen automatically when the files are in the public folder)
In order to keep the bundle size small, I can't just bundle the whole package - the files need to be copied individually with their original file names and stored in the public folder of each app and the UI library.
the problem
I am trying to configure CopyWebpackPlugin, but I am either
- just getting an initial copy from translation/locales to public/locales when starting up the dev server,
- or the dev server ends up in a loop when I try to enable the writeToDisk option, plus it starts flooding the dist folder with hot-reload files.
my vue.config.js *
const path = require("path");
const CopyPlugin = require("copy-webpack-plugin");

module.exports = {
  devServer: {
    // write copied files to disk instead of keeping them only in memory
    writeToDisk: true,
  },
  configureWebpack: {
    plugins: [
      new CopyPlugin({
        patterns: [
          {
            // resolve the locales folder of the translations workspace package
            from: `${path.dirname(
              require.resolve(`#namespace/translations/package.json`)
            )}/locales`,
            to: "./public/locales",
            toType: "dir",
          },
        ],
      }),
    ],
  },
};
*based on instructions from https://webpack.js.org/plugins/copy-webpack-plugin/, it includes a reference to yarn workspaces
Running yarn serve with this config results in a loop. The correct files are copied to the ./public folder, but at the same time it creates the ./dist folder and floods it with ...hot-update.json files.
If I run yarn build, the locale files get copied to the ./public folder the first time, but not to the ./dist folder (so it seems it copies the files at the end of the process, and the latest files aren't included in the ./dist folder).
current folder structure
Monorepo
└── packages
    ├── applications
    │   ├── app1
    │   │   ├── public
    │   │   └── dist
    │   ├── app2
    │   └── ...
    └── common
        ├── translations
        │   └── locales
        │       ├── en-GB
        │       │   └── common.json
        │       └── de-DE
        ├── ui
        └── ...
versions
@vue/cli 4.5.12
webpack@4.46.0
copy-webpack-plugin@6.4.1
Any help with getting this setup to work would be very much appreciated.

Npm script to run both child directory folders?

How do I write a script in my parent folder's package.json file so that when I run npm install it installs the node modules in each folder, and npm start goes into each folder and runs npm start?
The FrontEnd and Backend folders both use npm start to start up, and I want to do the same in the parent folder to start both simultaneously.
This is the file structure:
ParentFolder
├── package.json   <--- npm install && npm start scripts
├── FrontEnd
│   ├── node_modules
│   ├── package.json
│   └── index.js
└── Backend
    ├── node_modules
    ├── package.json
    ├── routes.js
    └── server.js
Installing in two directories is easy with find:
find ./*/* -maxdepth 1 -name package.json -execdir npm install \;
This looks in each directory for a package.json and executes npm install there.
npm start becomes a bit harder. At least on Windows using Cygwin, I wanted to do:
npm --prefix ./FrontEnd start ./FrontEnd & npm --prefix ./Backend start ./Backend
But it wasn't actually running in the background as I expected, and FrontEnd was the only one that actually started. Depending on your start script, this could work for you.
Possible solutions for this could be concurrently or npm-run-all -p.
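For example, with concurrently installed in the parent folder (npm install -D concurrently), the parent package.json scripts could look roughly like this (a sketch; the script names and paths assume the FrontEnd/Backend layout above):
"scripts": {
  "install:all": "npm --prefix ./FrontEnd install && npm --prefix ./Backend install",
  "start": "concurrently \"npm --prefix ./FrontEnd start\" \"npm --prefix ./Backend start\""
}
npm-run-all works similarly if you split the two commands into separate scripts (e.g. start:frontend and start:backend) and run npm-run-all --parallel start:frontend start:backend.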

How to organise multi package flutter project and use it as dependency

I would like to organise a flutter project into a multi-package one with the following requirements:
- use one repository for this project
- allow developers to work on the packages in this repository locally
- make the packages accessible as dependencies from other projects outside of this repository
The file setup for the repository I have now is:
.
├── app_base
│   ├── ...
│   └── pubspec.yaml
├── feature
│   ├── ...
│   └── pubspec.yaml
└── README.md
I tried using path dependencies like this in app_base/pubspec.yaml:
name: app_base
dependencies:
  feature:
    path: ../feature
and it works for local development, but if I try to use app_base in a completely different project and use a git dependency instead of paths:
name: actual_app
dependencies:
  app_base:
    git:
      url: ssh://address.to/the_repo.git
      path: app_base
      ref: deadbaca
it cannot resolve the transitive feature dependency:
Running "flutter packages get" in actual_app...
Error on line 21, column 11: Invalid description: "../feature" is a relative path, but this isn't a local pubspec.
path: ../feature
^^^^^^^^^^
pub get failed (65)
Process finished with exit code 65
Is there a way to make it work both for local development and when used as a git dependency from another project?
Just use Git dependencies for both scenarios (locally and for other projects).
If you think this is cumbersome during local development, use path dependencies locally and change them back to Git before committing.
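Concretely, that means declaring feature in app_base/pubspec.yaml as a git dependency on the same repository, along these lines (a sketch; the url is the one from the question, and you may want to pin a ref here too):
name: app_base
dependencies:
  feature:
    git:
      url: ssh://address.to/the_repo.git   # same repo, resolved remotely instead of via a relative path
      path: feature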

Why can't CircleCI find my tests?

Please see the repo structure below. I want to run the tests in root/app2/tests/. I am using py.test.
After CircleCI couldn't infer the test directory automatically, I added a circle.yml file to the root directory, but the tests are still not found. Any help greatly appreciated.
circle.yml file content:
general:
  build_dir: app2/tests
Repository structure:
root
├── circle.yml
├── app1
│   ├── xxx
│   └── yyy
└── app2
    ├── src
    └── tests
        ├── test_module_1.py
        └── test_module_2.py
OK, with help from CircleCI I figured out how to use py.test with CircleCI.
In the circle.yml file add:
test:
  override:
    - py.test <optional path to subdir with tests>
dependencies:
  pre:
    - pip install pytest
If your code relies on several packages in addition to py.test, you can create a requirements.txt file:
pip freeze > requirements.txt
Place the requirements.txt file in the same directory as the circle.yml file (or in the build_dir if you have specified that in your circle.yml file) and add to circle.yml:
dependencies:
  pre:
    - pip install -r requirements.txt
CircleCI support also suggested: "It's generally enough to set the build_dir to just the application directory; we'll take care of finding the tests subdirectory ourselves. Could you please try replacing your current command in the circle.yml with this?"
general:
  build_dir: app2
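Putting those snippets together, a complete circle.yml for this layout could look roughly like this (a sketch assembled from the pieces above; the py.test path assumes build_dir is app2):
general:
  build_dir: app2                        # run everything from the app2 directory

dependencies:
  pre:
    - pip install -r requirements.txt    # requirements.txt lives in the build_dir

test:
  override:
    - py.test tests                      # tests/ is relative to build_dir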