Is it possible to create a local dbt deps repository so that the "dbt deps" command downloads packages from that local repository?
N.B.: Our client does not want to connect to any external network.
Yes, this is possible, provided that the repositories have already been cloned or copied locally.
The dbt docs page on Packages tells you exactly how to do this.
Packages that you have stored locally can be installed by specifying the path to the project, like so:
packages:
- local: /opt/dbt/redshift # use a local path
Local packages should only be used for specific situations, for example, when testing local changes to a package.
Note: I think it is worth reiterating the caveat given in the docs. You now own cloning the correct versions of the packages, along with the ongoing work of keeping them up to date.
As for how this works in practice, consider the following example:
/Users/michelle/repos/my_dbt_project is where my dbt project lives (it contains dbt_project.yml and packages.yml).
/Users/michelle/repos/dbt_utils is the location where I previously cloned the dbt-utils repo.
In this example, my packages.yml should look like:
packages:
- local: /Users/michelle/repos/dbt_utils # use a local path
Please note that the external package does not live within my dbt project directory, but outside of it. While it should work to keep it within the repo, this is not best practice. This article on external package development goes into even more depth.
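For a rough end-to-end sketch under the air-gapped constraint (the paths and the internal mirror URL below are only illustrations): copy or clone each package onto the machine that runs dbt, then point packages.yml at those paths before running dbt deps.

# one-time: clone from an internal mirror, or copy the package over by hand
git clone https://git.internal.example.com/mirrors/dbt-utils.git /opt/dbt/packages/dbt_utils

# packages.yml
packages:
  - local: /opt/dbt/packages/dbt_utils   # absolute path
  # a path relative to the project root also works, e.g.
  # - local: ../dbt_utils

Running dbt deps then copies the package into the project's dbt_packages directory (dbt_modules on older versions) without touching the network.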
Related
We have developed a Terraform module and keep it inside a repo, and people access it by putting the following in their main.tf:
module "standard_ingress" {
source = "git::https://xxx.xx.xx/scm/xxxx/xxxx-terraform-eks-ingress-module.git?ref=master"
When they run terraform init, the whole repo is cloned into a folder (~/.terraform/modules/standard_ingress).
We also have some non-module (non-Terraform) folders in the same repo and on the same branch.
Is there a way we can make terraform init exclude those folders from being cloned?
Thanks.
The Git transfer protocols all work by transferring batches of commits associated with a particular remote ref (branch or tag), so there is no way for a Git client to fetch only a subset of the directories or files in the selected commit.
Terraform can only use the Git protocol as it's already defined, and so it cannot provide any capabilities that the underlying protocol lacks.
If your concern is the amount of time taken to clone the entire repository, you may be able to optimize by excluding anything except the most recent commit rather than by ignoring files within that commit. You can do that by setting the depth argument to 1:
source = "git::https://xxx.xx.xx/scm/xxxx/xxxx-terraform-eks-ingress-module.git?ref=master&depth=1"
If even that isn't sufficient then I think your only further option would be to add a separate build and release step for your modules where you capture the subset of files that are relevant to the Terraform modules into a .zip or .tar.gz archive, publish that archive somewhere that Terraform can fetch it over HTTP, and then use fetching archives over HTTP as the source type. In this case Terraform will download only the contents of the archive, allowing you to curate exactly what's included. (It would also be equivalent to put the archive into one of the supported cloud storage services, such as Amazon S3.)
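As a hedged sketch of that archive-based approach (the artifact host, bucket, and file names below are made up for illustration), the consuming main.tf would reference the archive directly:

module "standard_ingress" {
  # Terraform downloads the archive and extracts only its contents
  source = "https://artifacts.example.com/terraform/eks-ingress-module-1.2.0.zip"
}

# or, using Amazon S3 as the storage service:
module "standard_ingress_s3" {
  source = "s3::https://s3-eu-west-1.amazonaws.com/my-terraform-modules/eks-ingress-module-1.2.0.zip"
}

Either way, only the curated archive contents end up under .terraform/modules, not the rest of the repository.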
I've been trawling the documentation on Netlify's site for how to deploy a Node application (specifically a Vue application) that has a dependency from a private npm registry.
I have an .npmrc file setup as follows:
//npm.pkg.github.com/:_authToken=${GITHUB_TOKEN}
@my:registry=https://npm.pkg.github.com
I also set the GITHUB_TOKEN variable in my build process's environment variables to the correct value.
However, the build fails.
There is an outline of how to achieve a successful build with a private Github package repository here: https://docs.netlify.com/configure-builds/repo-permissions-linking/#npm-packages.
However, it's still unclear what exactly I need to do to my package.json file and how to configure the build process's environment variables ...
Could anyone show me a full working example which doesn't leave me scratching my head?
Do I add:
"package-name": "git+https://<github_token>:x-oauth-basic#github.com/<user>/<repo>.git"
to the dependencies section of the package.json?
If yes, how do I then hide my <github_token> and provide the build process with this in Netlify?
How do I also get the same process working locally?
Why can't Netlify just inject the GITHUB_TOKEN into my .npmrc file so I have parity with my local development environment?
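For reference, one common arrangement (the scope and package names below are placeholders, not something taken verbatim from the Netlify docs) keeps the token out of the repository entirely: package.json references the scoped package normally, .npmrc routes the scope to GitHub Packages and reads the token from an environment variable, and that same variable is set both locally and in Netlify's build environment settings.

# .npmrc (committed, contains no secret)
@my-org:registry=https://npm.pkg.github.com
//npm.pkg.github.com/:_authToken=${GITHUB_TOKEN}

# package.json (dependencies section)
"@my-org/private-package": "^1.0.0"

# locally, before npm install / yarn install
export GITHUB_TOKEN=<personal access token with read:packages>

npm expands ${GITHUB_TOKEN} from the environment at install time, so the build works the same way on Netlify once GITHUB_TOKEN is defined there.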
My Environment
When I develop my Vue project, I keep track of the files in the dist folder.
My laptop
I develop the Vue project -> npm (yarn) run build -> git add everything in the dist folder -> git commit -> push
Production server
git pull -> Everything is good to go (since I already built the dist folder on my laptop).
I have the URLs of the production server and the localhost server in my .env file:
VUE_APP_PRODUCTION_API=https://myprodserver.com/
VUE_APP_LOCAL_API=http://localhost:3000/
When I use
vue-cli-service serve, it uses the localhost server URL.
vue-cli-service build, it uses the production server URL.
(so the URL is allocated at build time, not run time)
The Problem
But I came across a problem when I added a qa server.
Now, there are three environments:
1. My laptop
2. qa server
3. production server
I found that the Vue project on the qa server tries to talk to the PRODUCTION_API (since it pulled the dist folder for the production server, which I built on my laptop and committed).
What I tried
I created a .env.local on the qa server and overrode VUE_APP_PRODUCTION_API:
# qa server .env.local
VUE_APP_PRODUCTION_API=http://qaserver.com/
then installed npm and yarn, and built.
It worked! But... the problem is that it makes the git status output dirty.
When I run git status:
changes:
file removed: dist/file...
file removed: dist/file...
file removed: dist/file...
file removed: dist/file...
I know I can .gitignore the dist folder and build every time I git pull in each environment... but that would be bothersome. It would mean I have to install npm and yarn, and the whole dependency tree, on the production server too. (That wasn't necessary before, since I used to build on my laptop.)
Question
Do I have to build the project every time just to change the base URL?
Was it right to keep track of the dist folder in the first place?
Is there any best practice you could share with me?
Ideally, you would want to deploy your project using a tool like GitHub Actions or GitLab CI.
This would allow you to build your project with the necessary environment variables for the given target environment.
In general, the dist files aren't included in version control, as each developer working on your project would have different versions of them and you would run into merge conflicts really often.
Here's an example using GitHub Actions, deploying on GitHub Pages.
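A minimal sketch of such a workflow (the file name, action versions, and secret name are assumptions, so adapt them to your repo):

# .github/workflows/deploy.yml
name: Deploy
on:
  push:
    branches: [main]
jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 18
      - run: yarn install --frozen-lockfile
      # the API base URL is injected at build time, per environment
      - run: yarn build
        env:
          VUE_APP_PRODUCTION_API: ${{ secrets.PRODUCTION_API_URL }}
      - uses: peaceiris/actions-gh-pages@v3
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: ./dist

With this in place, dist never needs to be committed; each environment's pipeline builds it with its own API URL.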
I would like to create a private npm repo on our Nexus.
So I guess I need to take the current node_modules folder on my machine and put it in Nexus, right?
I can't proxy npm, because our Jenkins machine doesn't have internet access.
So how do I put all my node modules (lots of folders) onto Nexus?
And in what structure and format?
See https://github.com/DarthHater/nexus-repository-import-scripts for uploading a lot of files into NXRM3.
If that doesn't work for you, you can likely modify that script for your needs.
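At a high level, what ends up on Nexus is a hosted npm repository containing one tarball per package version, and clients simply point their registry at it. A rough sketch, assuming a hosted repository named npm-private on a placeholder host:

# publish a package tarball into the hosted repo (repeat for each package/version)
npm publish ./lodash-4.17.21.tgz --registry https://nexus.example.com/repository/npm-private/

# .npmrc on the Jenkins machine, so installs resolve against Nexus instead of the internet
registry=https://nexus.example.com/repository/npm-private/

So rather than copying the node_modules folder as-is, you upload the package tarballs and let npm install rebuild node_modules from Nexus.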
I'm evaluating GitLab CI/CD at the moment and trying to get the pipelines working; however, I'm running into a simple(?) problem.
I'm trying to automate the build process by automatically updating ISO images. However, these ISO images are not tracked by the repository and are ignored via a .gitignore file. This leads to the issue that, when I try to run make, it can't find the ISO image...
I'm just using a simple .gitlab-ci.yml file:
stages:
  - build

build:
  stage: build
  script:
    - make
And when I try running this in the GitLab CI interface, it clones the repository (without the ISO images) and then fails, as there is no rule to make that target (since the ISO images are missing). I have tried moving the files into the "build" directory which GitLab creates; however, that gives an error saying it has failed to remove ...
How do I use the local repository rather than having GitLab clone the repository to a "build" location?
You can't use GitLab CI with the files that are on your computer, or at least you shouldn't. You could do it with an SSH executor that logs in to a machine that stores the ISO files (and it should be a server rather than your development machine), or you could have the Docker executor pull them from FTP or an object store. For testing purposes, you can run the builds locally on your machine.
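As a hedged sketch of the object-store variant (the storage URL below is a placeholder), the job fetches the untracked ISO before invoking make:

stages:
  - build

build:
  stage: build
  script:
    # pull the ISO from internal storage, since it is not in the repository
    - curl -fSL -o image.iso https://storage.internal.example.com/isos/image.iso
    - make

For local testing you can run roughly the same steps by hand, or via gitlab-runner exec if your runner version supports it.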