DVSA Error in deployment with Serverless - 'Bucket name should not contain uppercase characters' - serverless-framework

I'm trying to deploy the DVSA Serverless App via Serverless (SLS) and I'm hitting this error:
Serverless: Packaging service...
Serverless Error ---------------------------------------
FeedbackBucket - Bucket name should not contain uppercase characters. Please check provider.s3.FeedbackBucket and/or s3 events of function "FeedbackUploads".
Get Support --------------------------------------------
Docs: docs.serverless.com
Bugs: github.com/serverless/serverless/issues
Issues: forum.serverless.com
Your Environment Information ---------------------------
Operating System: linux
Node Version: 10.21.0
Framework Version: 1.73.1
Plugin Version: 3.6.13
SDK Version: 2.3.1
Components Version: 2.31.2
The repo is here: https://github.com/OWASP/DVSA, and the file causing the issue looks to be https://github.com/OWASP/DVSA/blob/master/backend/src/functions/processing/sls.yml. For the S3 line - should this be a reference rather than a string? (New to SLS, so apologies if this is an obvious question.) Thanks!

It looks like the error comes from line 26, where the bucket name contains uppercase characters: https://github.com/OWASP/DVSA/blob/b26c8a744293cd192383e4a61e0699563505c5a8/backend/src/functions/processing/sls.yml#L26
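S3 bucket names must be DNS-compliant, i.e. all lowercase, so the quickest fix is to lowercase the name on that line. A minimal sketch of what a corrected s3 event could look like - the handler path and bucket name below are illustrative, not copied from the repo:

functions:
  FeedbackUploads:
    handler: handler.feedback_uploads    # illustrative handler path
    events:
      - s3:
          bucket: dvsa-feedback-bucket   # lowercase; was e.g. FeedbackBucket
          event: s3:ObjectCreated:*

A plain string is accepted here; the framework is rejecting the value because of the uppercase characters, not because it is a string rather than a reference.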

Related

dbt found two resources with the same name

I'm hitting a strange error in dbt
I am using the following package to enable external tables in bigquery.
packages:
  - package: dbt-labs/dbt_external_tables
    version: 0.8.0
$ dbt run-operation stage_external_sources
An error occurred in the dbt Server. Please contact support if this issue persists.
RPC server failed to compile project, call the "status" method for compile status: Compilation Error
dbt found two resources with the name "dbt_user_test_ext". Since these resources have the same name, dbt will be unable to find the correct resource when source("dbt_user", "test_ext") is used. To fix this, change the name of one of these resources:
source.dbt_user.test_ext (models/sources.yml)
source.dbt_user.test_ext (models/sources.yml) Code: 10011
models/sources.yml looks something like this:
sources:
  - name: "{{ target.schema }}"
    tables:
      - name: test_ext
        external:
          location: "gs://test-bucket/test_ext.csv"
          options:
            format: csv
            skip_leading_rows: 1
This seems to happen only after the first time I run the command.
Question: what am I doing wrong, and/or how do I fix it?
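One hedged workaround sketch (an assumption based on the error message, not a confirmed fix): give the source a fixed literal name and move the templated value to the schema key, so repeated compiles cannot register two resources under the same rendered identifier:

sources:
  - name: dbt_user                  # literal name instead of "{{ target.schema }}"
    schema: "{{ target.schema }}"   # the schema itself can still be templated
    tables:
      - name: test_ext
        external:
          location: "gs://test-bucket/test_ext.csv"
          options:
            format: csv
            skip_leading_rows: 1

With a literal name, source("dbt_user", "test_ext") always resolves to exactly one resource regardless of the active target.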

failure on serverless deploy Template format error: Unresolved resource dependencies

Hi all,
As the title says, I deleted the whole CloudFormation stack and tried to do a fresh deploy. But when I run sls deploy --verbose, it shows this error:
The CloudFormation template is invalid: Template format error:
Unresolved resource dependencies [ServerlessDeploymentBucket] in the
Resources block of the template
For debugging logs, run again after setting the "SLS_DEBUG=*"
environment variable.
I double-checked my serverless.yml file but found I don't define a bucket named ServerlessDeploymentBucket.
Then I went to the AWS console and clicked into my root stack (I am using the serverless-plugin-split-stacks plugin);
there is only a simple template:
AWSTemplateFormatVersion: 2010-09-09
Description: The AWS CloudFormation template for this Serverless application
Resources:
  ServerlessDeploymentBucket:
    Type: 'AWS::S3::Bucket'
Outputs:
  ServerlessDeploymentBucketName:
    Value: !Ref ServerlessDeploymentBucket
while my original serverless.yml is actually more than 1,200 lines. I'm quite puzzled about what in Serverless this error relates to. Any help would be appreciated. Thanks.
Edit: here is the list of plugins I am using, which may be useful for troubleshooting the error:
serverless-content-encoding
serverless-pseudo-parameters
serverless-webpack
serverless-offline
serverless-plugin-split-stacks
serverless-plugin-custom-roles
serverless-domain-manager
serverless-s3-deploy
serverless-plugin-tracing
Regards.
Is there any chance you are using a template exported from an old stack? By the way, what happens if you just delete the stack and run sls deploy? It will create the template (and the stack) for you and deploy it.
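For reference - assuming the standard Serverless CLI, the reset sequence would be roughly:
$ sls remove
$ sls deploy --verbose
(sls remove tears down the deployed stack; deleting the stack in the CloudFormation console works too.)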

Serverless: WARNING: Inappropriate call of provider.request()

I'm using Serverless & the Serverless-Finch plugin to deploy a Serverless website to S3. Serverless Finch config is below.
custom:
  client:
    bucketName: my-site-${self:provider.stage}
    distributionFolder: build
    indexDocument: index.html
    errorDocument: index.html
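For context, ${self:provider.stage} is resolved from the provider block of the same serverless.yml; a minimal sketch of the assumed setup (values illustrative, not taken from the question):

provider:
  name: aws
  stage: dev   # with this, bucketName above resolves to my-site-dev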
When I run serverless client deploy, the deployment is successful and my site is online, hosted on S3; however, the logs in the terminal show this warning repeated about 30 times:
Serverless: WARNING: Inappropriate call of provider.request()
I've tried searching for the cause/meaning of this warning but haven't been able to find any info. Any help explaining what the warning means, or pointing me to the right bit of documentation, is much appreciated.
I've tried changing my YML so the bucketName doesn't take the stage from the provider object, but the warning persisted, so I know that is not the source of the issue.
I think this is due to us using the old provider API. We're fixing this in the next release.
https://github.com/fernando-mc/serverless-finch/pull/42
Disclosure - I'm the maintainer for this project.

Serverless getting started demo errors on deploy

I'm following the instructions for the Serverless quickstart demo https://github.com/serverless/serverless-starter
and I'm getting an error on serverless dash deploy.
What am I doing wrong? It seems like such a basic walkthrough.
Serverless: Deploying the specified functions in "dev" to the following regions: us-east-1
/Users/bar/serverless-starter/node_modules/serverless-optimizer-plugin/index.js:266
if (!result || !result.code) {
^
This issue was fixed with my PR here:
https://github.com/serverless/serverless-optimizer-plugin/pull/41

ERROR: Talend S3 - AWS authentication requires a valid Date or x-amz-date header

I'm using Talend Open Studio to push Salesforce data to my Redshift database, pushing the data through the following components:
1. tSalesforceInput
2. tMap
3. tFileOutputDelimited
4. tRedshiftOutput
I am only getting about 2-5 rows/s, which does not work at all for me.
By pushing the delimited file to tS3Put and then loading the data into Redshift, the transfer goes MUCH faster, about 500 rows/s. The issue I continue to face is that I get this error:
AWS authentication requires a valid Date or x-amz-date header (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: CC9C86CCC65625C0)
I have no idea how to solve it. I have tried using tLibraryLoad to load joda-time 2.8.2 before the run, then running again, but it still fails. Any advice is greatly appreciated.
I also had this problem using Talend 6.1. The issue is an incompatibility between Java 8, the AWS SDK, and the joda-time 2.3 library that Talend bundles.
The solution I found was adapted from TalendForge:
1. Download the joda-time 2.8.2 jar from Joda Time.
2. Add a tLibraryLoad and point it to the new joda-time jar file you downloaded.
3. Go to your project's Run tab / Advanced Settings and add an additional JVM argument:
-Xbootclasspath/p:$ROOT_PATH/../lib/joda-time-2.8.2.jar