dbt: could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS

I'm trying to connect dbt to BigQuery in VS Code. To do that, I downloaded a BigQuery keyfile (JSON) and put it in the root directory of my dbt project.
I then created a profiles.yml file that looks as follows:
my-bigquery-db:
  target: dev
  outputs:
    dev:
      type: bigquery
      method: service-account
      project: civil-parsec-350114
      dataset: dbt_dataset
      threads: 1
      keyfile: bigquery.json
When I run dbt, I get:
Database Error
Runtime Error
dbt encountered an error while trying to read your profiles.yml file.
Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credentials and re-run the application. For more information, please see https://cloud.google.com/docs/authentication/getting-started
When I use an empty profiles.yml file I get the same error, so I'm not even sure that file is being loaded at all. How can I best debug this? What could be the problem?
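Since the question asks how to debug this, two things usually narrow it down: by default dbt reads profiles.yml from ~/.dbt/, not from the project root, and a relative keyfile path is resolved from the directory you invoke dbt in, not from the project directory. A minimal sketch using standard dbt CLI options (the keyfile name is the one from the question):

# Tell dbt to use the profiles.yml in the current directory instead of
# the default ~/.dbt/profiles.yml, and validate the connection settings.
dbt debug --profiles-dir .

# Alternatively, make the credentials unambiguous for the Google SDK
# by pointing at the keyfile with an absolute path.
export GOOGLE_APPLICATION_CREDENTIALS="$PWD/bigquery.json"

If dbt debug succeeds with --profiles-dir but plain dbt does not, the file in the project root simply isn't the one being read.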


CodePipeline ElasticBeanstalk [ERROR] An error occurred during execution of command [app-deploy] - [CheckProcfileForDotNetCoreApplication]

I've built a CodePipeline (Source > Build > Deploy) and it's failing on the deploy step.
It's a .NET Core 3.1 API project.
I checked the Elastic Beanstalk logs and see:
2020/07/02 14:14:00.600060 [ERROR] An error occurred during execution of command [app-deploy] - [CheckProcfileForDotNetCoreApplication]. Stop running the command. Error: error stat /var/app/staging/MyApi/MyApi.dll: no such file or directory with file /var/app/staging/MyApi/MyApi.dll
As far as I know I have no control over /var/app/staging/; that's built-in AWS machinery, right?
The build step succeeds, so I'm unsure what's behind this error.
My buildspec.yml is:
version: 0.2
phases:
  build:
    commands:
      - dotnet publish -c release -o ./build_output ./MyApi/MyApi.csproj
artifacts:
  files:
    - '**/*'
      base-directory: 'build_output'
This is the "zipfile/build_output" folder:
This is the zip file root folder:
These are the files in the build artifacts zip file that pipeline is using. The error says it cannot find MyAppName.dll (renamed to MyApi in the pic). It's there so I wonder why the problem.
Perhaps it doesnt like the folder structure in the zip file - see pic.
I had the same problem.
In my case, when the solution name and the project name were different, I got this error deploying from Visual Studio to Beanstalk; when I added a project with the same name as the solution and built that, the error went away.
I suspect the deployment assumes the .dll file has the same name as the solution.
Warning: This is a workaround, not a solution!
On the project that's failing to deploy, change the "Assembly name" on the Project Properties / Application tab to the name of the DLL it's missing (typically the solution name, or the first period-separated part of the namespace), e.g. "SLNNAME".
Then, redeploy your beanstalk app and it should work.
As Marcin correctly noticed, the indentation was incorrect for base-directory:
artifacts:
  files:
    - '**/*'
      base-directory: 'build_output'
Should be:
artifacts:
  files:
    - '**/*'
  base-directory: 'build_output'
As others have noted, the deploy looks for a .dll named after only the first period-separated part of your project name. In my case, my project and assembly name were both UC.Web, which yielded this error during deployment:
Error: error stat /var/app/staging/UC.dll: no such file or directory with file /var/app/staging/UC.dll
What worked for me was renaming my assembly from UC.Web to simply UC, after which it deployed successfully. While not a solution for everyone, it is a workaround for the time being until Amazon fixes this.
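If you prefer not to edit the project file by hand, the same workaround can be applied at publish time, since the assembly name is an ordinary MSBuild property (a sketch under that assumption; the names and paths are taken from the answers above):

# Override the assembly name so the produced DLL is UC.dll instead of
# UC.Web.dll, matching what the Beanstalk deploy hook appears to expect.
dotnet publish -c release -o ./build_output -p:AssemblyName=UC ./UC.Web/UC.Web.csproj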

drone.io: containerd: write /proc/14/oom_score_adj: permission denied

I am trying to reverse engineer the drone.io docker plugin and understand how to run the docker daemon in a pipeline step (DinD).
drone.io uses the library github.com/cncd/pipeline to compile and execute .drone.yml files.
The first thing plugins/docker does is start the Docker daemon:
+ /usr/local/bin/dockerd -g /var/lib/docker
This works fine in the official plugin, but I cannot get it to work with my own image, where I do the same thing:
pipeline.yml
workspace:
  base: /go
  path: src/github.com/fnbk/hello

pipeline:
  test:
    image: fnbk/drone-daemon
fnbk/drone-daemon/run.sh
#!/bin/sh
/usr/local/bin/dockerd # <= ERROR: containerd: write /proc/17/oom_score_adj: permission denied
# ...
It will give me the following error:
containerd: write /proc/14/oom_score_adj: permission denied
The full example can be found on github: https://github.com/cncd/pipeline/pull/45
Any suggestions are highly appreciated.
You need to add your plugin to a whitelist via the DRONE_ESCALATE environment variable, which is passed to the server. This is the default value:
DRONE_ESCALATE=plugins/docker,plugins/gcr,plugins/ecr
So you would pass something like this:
-DRONE_ESCALATE=plugins/docker,plugins/gcr,plugins/ecr
+DRONE_ESCALATE=plugins/docker,plugins/gcr,plugins/ecr,fnbk/my-custom-plugin
Note that this should be the image name only. It must not include the tag.
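For reference, a minimal sketch of passing the extended whitelist to the drone server container (the drone/drone:0.8 image tag and the bare docker run invocation are illustrative assumptions; only DRONE_ESCALATE itself comes from the answer above):

# Start the drone server with the custom image whitelisted as a trusted
# plugin (all other required flags, secrets, and volumes omitted for brevity).
docker run \
  -e DRONE_ESCALATE=plugins/docker,plugins/gcr,plugins/ecr,fnbk/drone-daemon \
  drone/drone:0.8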

Global variable in Jenkins Repository URL

I am trying to use a global Jenkins variable in the Repository URL field:
Repository URL: ${BUILD-PEND-SRC}
BUILD-PEND-SRC is defined in Configure System and set to a proper URL. If I run a shell-execution job with echo ${BUILD-PEND-SRC}, it does display the correct value.
However, when I run the job, I get
ERROR: Failed to check out ${BUILD-PEND-SRC}
org.tmatesoft.svn.core.SVNException: svn: E125002: Malformed URL '${BUILD-PEND-SRC}'
This tells me that Jenkins did not resolve ${BUILD-PEND-SRC}.
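One caveat worth knowing while debugging: hyphens are not legal in shell variable names, so in a shell step ${BUILD-PEND-SRC} is POSIX default-value expansion rather than a lookup of a hyphenated variable. A quick sketch:

# ${VAR-fallback} prints $VAR if it is set, otherwise the literal fallback,
# so this echoes "PEND-SRC" even though no hyphenated variable exists:
unset BUILD
echo "${BUILD-PEND-SRC}"

So the echo test can appear to work without proving that Jenkins itself resolved the variable.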
I am summarizing the SO answer that solved it for Git-based Jenkins pipeline jobs but also applies to SVN-based jobs: https://stackoverflow.com/a/57065165/1994888 (credits go to @rupesh).
Summary:
- edit your job config
- go to the Pipeline section
- go to the definition "Pipeline script from SCM"
- uncheck "Lightweight checkout"
The issue seems to be with the scm-api-plugin (see the bug report in the Jenkins issue tracker), hence, it is not specific to a version control system.

Moonmail/Serverless: "s-variables-<stage>-<region>" location?

I'm looking to install Moonmail. One early step of installing Moonmail is:
Add variables to s-variables-<stage>-<region>:
{
  ...,
  "apiHost": "yourendpointhost.com"
}
I can't find the relevant file to enter this information. Where exactly do I enter this?
The files created by serverless in my Moonmail location are:
s-project.json, s-resources-cf.json, and s-templates.json
The file is inside a new folder created when you ran sls project init -c -n your-lower-case-project-name. For example, if your stage is dev and your region is us-east-1, you should see this file:
_meta/variables/s-variables-dev-useast1.json
apiHost is the URL that points to your API Gateway, so you have to set it after deploying at least one endpoint.

Pentaho : Error while running pan.bat file in cmd line

I am trying to run my .ktr file from the command line. I have my Data Integration setup in this path:
C:\Users\dhamodharan.a\Desktop\pdi-ce-4.4.0-stable\data-integration
and my .ktr file in this path:
C:\Users\dhamodharan.a\Desktop\test.ktr
While trying to run it from the command line I get the following error:
DEBUG: Using PENTAHO_JAVA_HOME
DEBUG: _PENTAHO_JAVA_HOME=C:\Program Files (x86)\Java\jre7
DEBUG: _PENTAHO_JAVA=C:\Program Files (x86)\Java\jre7\bin\java.exe
WARN 11-08 11:47:09,728 - Unable to load Hadoop Configuration from "file:///C:/Users/dhamodharan.a/Desktop/pdi-ce-4.4.0-stable/data-integration/plugins/pentaho-big-data-plugin/hadoop-configurations/mapr". For more information enable debug logging.
INFO 11-08 11:47:09,759 - Pan - Start of run.
ERROR: No repository provided, can't load transformation.
C:\Users\dhamodharan.a\Desktop\pdi-ce-4.4.0-stable\data-integration>e:C:\Users\dhamodharan.a\Desktop.test.ktr /level:Basic
I am trying to read an input Excel file and write the output as Excel. Do I also need to create a repository for that? When I try the create-repository option, I only see choices for a DBMS, not for Excel.
Make sure the environment variable PENTAHO_JAVA_HOME is set correctly and then it'll work.
For some reason the Java install is not on your path, but if Spoon works you must have it somewhere.
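A minimal sketch of setting it for the current cmd session, reusing the JRE path from the DEBUG output above (adjust to your install):

rem Point Pentaho at an explicit Java runtime, then run the transformation.
set "PENTAHO_JAVA_HOME=C:\Program Files (x86)\Java\jre7"
pan.bat /file:C:\Users\dhamodharan.a\Desktop\test.ktr /level:Basic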
I set the JAVA_HOME environment variable and now pan.bat and kitchen.bat work fine.
Here is the command:
pan.bat /file:C:\Users\dhamodharan.a\Desktop\dhamu\test.ktr /level:Basic > C:\Users\dhamodharan.a\Desktop\dhamu\test.log
Thanks.