Is there a way to change directories in dbt Cloud?

Is there a good/accepted way to change directories in dbt Cloud? Currently dbt Cloud pulls from the main branch of my repo and looks for the dbt_project.yml file in the top-level folder.
I was hoping to have that file in a subfolder like src/dbt and have dbt Cloud recognize it. I was wondering if anyone else has run into this issue. I have no issues running dbt locally, but I was hoping to develop in the cloud.

Navigate to your project settings in dbt Cloud and set DBT PROJECT SUBDIRECTORY under Overview.
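With that setting pointed at src/dbt (the subfolder from the question), dbt Cloud would then expect a repo layout roughly like this; the models path is only illustrative:

src/dbt/dbt_project.yml
src/dbt/packages.yml
src/dbt/models/...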

Related

"dbt deps" from local repository

Is it possible to create a local dbt deps repository so that the "dbt deps" command downloads libraries from a local repository?
N.B.: Our client does not want to connect to an external network.
Yes, this is possible, provided that the repositories have already been cloned or copied locally.
The dbt docs page on Packages tells you exactly how to do this.
Packages that you have stored locally can be installed by specifying the path to the project, like so:
packages:
- local: /opt/dbt/redshift # use a local path
Local packages should only be used for specific situations, for example, when testing local changes to a package.
Note: I think it is worth reiterating the caveat given in the docs. You will now own cloning the correct versions of the packages, along with the ongoing work of keeping them up to date.
As for how this works in practice, consider the following example:
/Users/michelle/repos/my_dbt_project is where my dbt project lives (the folder that contains dbt_project.yml and packages.yml).
/Users/michelle/repos/dbt_utils is the location where I previously cloned the dbt-utils repo.
In this example, my packages.yml should look like:
packages:
- local: /Users/michelle/repos/dbt_utils # use a local path
Please note that the external package does not live within my dbt project directory, but outside of it. While it should work to have it within the repo, this is not best practice. This external package development article goes into even more depth.
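To make the example concrete, the end-to-end steps might look like the following; the GitHub URL assumes the public dbt-utils package, and the paths are the ones from the example above:

git clone https://github.com/dbt-labs/dbt-utils.git /Users/michelle/repos/dbt_utils
cd /Users/michelle/repos/my_dbt_project
dbt deps   # installs the package from the local path listed in packages.yml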

Serverless: How to remove/deploy deployment without .serverless directory for team collaboration

How do I remove/deploy a deployment without the .serverless directory, for team collaboration?
For example, if I run sls deploy --aws-profile profile1 with a .yml file, it creates this .serverless directory, which I am not including in my git push in order to hide secrets. Now, when someone else on my team clones this repo, how can they manage the same deployment? Are the .yml file and the same AWS profile sufficient?
The .serverless folder is created by Serverless to hold the generated CloudFormation files. You should not handle them manually (and the folder and its contents should not be included in source control).
The serverless.yml is the source of truth for the deployment, so the deployment should behave the same when run with the same environment.
The AWS account/profile can be set using the AWS CLI. Provided all the devs use the same account, or accounts with the same level of permissions, each of you should be able to run deploy/remove.
If your project uses a .env file or environment variables, each member of the team has to set them in their own environment.
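As a rough sketch of that workflow for a teammate (the profile name profile1 comes from the question; adjust the commands to your own setup):

# keep the generated folder out of source control
echo ".serverless/" >> .gitignore

# each teammate configures the shared profile locally
aws configure --profile profile1

# then deploys from the same serverless.yml
sls deploy --aws-profile profile1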

How to version control with IntelliJ

I'm looking for a way to control versions of my project through IntelliJ. I know Git can manage it best, and I have already started experimenting with Git with the help of Madara Uchiha's Git tutorial. I must say it is incredibly useful, but I would rather have version control arranged on my hard drive, which is constantly backed up.
I decided to do my version control manually, and it's pretty slow and annoying. Is there an easier and more efficient way to clone the current project files into another folder?
For example, cloning the current project files into another folder named v1.4.2 outside my project structure, without relocating my project files, and having that copy set up as a project of its own so it is runnable whenever needed.
Set up a local Git repository for the project. It will start with a master branch. Then create a working branch that you make your changes in. You can merge this branch back into master as you are ready. You can create as many branches as you need and switch between them very quickly, all using the one directory.
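A minimal sketch of that workflow on the command line might look like this (the branch name is just an example, and depending on your Git version the default branch may be called main instead of master):

git init                      # create a local repository in the project directory
git add . && git commit -m "initial commit"
git checkout -b working       # create and switch to a working branch
# ...make and commit changes on the working branch...
git checkout master           # switch back to master
git merge working             # merge the working branch into master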
If you are new to Git, you can use something like Sourcetree (a GUI for Git); it will allow you to manage the repository. It makes it really fast to switch between branches of your repository. It also helps with pushing changes to another location: GitHub, Bitbucket, etc.
For backup, you could always set up the project on Bitbucket. You can create public and private repositories for free. I really recommend setting this part up.
Depending on the environment that you are building on, you could write a shell script or batch script that copies files to the duplicate location. Without knowing what type of project you are developing in/for, it is hard to say what the best strategy would be.
Ideally, if your project has a build output, you could have the compiler/IntelliJ IDEA place the results into your result folder. You could then copy the results to your Builds/v1.4.2 folder, or wherever. Whether you check in the files that are built will depend on your project. You can always exclude files/folders like your ../Builds that you don't want to track via your .gitignore file.
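For instance, the exclusion plus a copy of the build output might be as simple as the following; the Builds/v1.4.2 folder comes from the question, and out/ is just a placeholder for wherever IntelliJ writes its build results:

# .gitignore
Builds/

# copy the latest build output into a versioned folder (macOS/Linux)
cp -r out/ Builds/v1.4.2/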

Adding external Jar to Pentaho Kettle

I am working with Pentaho Kettle version 5.0.1. In one of my transformations I am using a JavaScript component to call a method located in a JAR which I have copied to the lib folder of data-integration, and everything works fine locally. But in my dev environment (where I run it using Kitchen) I don't have permission to copy my JAR file to the lib folder, due to restrictions on the server. Is there another way to supply the path of my custom JAR at run time so that the Kettle job/transformation can use it while being executed? Is there a way Kettle can pick up the JAR from a location other than data-integration/lib? Any help will be appreciated.
Take a look into kitchen.sh (and pan.sh). At some point the script starts adding stuff to the classpath. You can add more folders to the classpath there.
You still need permission to edit the kitchen.sh file, though. If you can't do that, I suggest creating a writable copy of kitchen.sh in a separate location and changing its $BASEDIR folder to point at the actual PDI installation, so that Kitchen can live elsewhere.
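As a purely illustrative sketch of that kind of edit inside your copy of kitchen.sh (the variable name and JAR path are assumptions; the exact place where the script builds its classpath differs between PDI versions):

# illustrative only - adjust to where your kitchen.sh assembles the classpath
CLASSPATH=$CLASSPATH:/export/home/my-custom.jar
export CLASSPATH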
Alternatively, if you have permission, you can put your JAR in another directory and then specify that directory in the launcher.properties file, which you will find in data-integration/launcher.
For example, if you put your JAR in the directory /export/home, then in launcher.properties you would append that path to the libraries entry: libraries=../test:../lib:../libswt:../export/home

Teamcity 2 configurations merge and deploy

I have two TeamCity configurations: one is becoming my common helpers and reusable components, and the other is a website which uses the common project.
I use a third configuration to publish to a test environment.
When the third configuration is run, I would like it to get the artifacts from the common project, merge them with the website output, and deploy. Am I asking for too much?
This ought to be pretty straightforward.
On ThirdConfig, add two artifact dependencies: one whose source is CommonProject, and another whose source is WebProject. When configuring an artifact dependency, it will allow you to specify which artifact files are actually pulled from CommonProject and WebProject into ThirdConfig via the 'Artifact paths'. The artifact files can then be placed into some new folder hierarchy specific to ThirdConfig by using the 'Destination path'. These two options ought to be enough to create a directory structure that is the merging of CommonProject and WebProject. That takes care of the merge part.
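As an illustration, the artifact rules on ThirdConfig might look roughly like this; the folder names are invented, and the actual patterns depend on what CommonProject and WebProject publish as artifacts:

# artifact dependency on CommonProject
Artifact paths: common/** => merged/common
# artifact dependency on WebProject
Artifact paths: website/** => merged/site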
The deploy is a bit more tricky. To my knowledge, TeamCity does not support any sort of 'copy or upload to external location' function out of the box. For this bit you'll need to create an MSBuild script (or batch file, or anything that can be run from the command line). Said script can expect the file/directory structure you've created via artifact dependencies, where the root of the structure is the initial working directory of the script, and it need only push these files out to your specific deploy location. That 'push', of course, is going to be specific to your environment: FTP, UNC share, etc.
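For example, a minimal deploy step run as a command-line build step could be little more than a copy to a UNC share; the share path and folder names below are placeholders:

rem deploy.bat - push the merged output to the test environment
xcopy merged\* \\testserver\wwwroot\ /E /Y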