Reference a variable in Serverless Framework

I'm trying to define my serverless framework deployment bucket.
My serverless.yml looks like this:
provider:
  name: aws
  runtime: nodejs14.x
  region: us-east-1
  stage: dev
  deploymentBucket:
    name: ${self:environment.DEPLOYMENT_BUCKET}
  environment:
    ${file(../evn.${opt:stage, 'dev'}.json)}
and the evn.dev.json file looks like this:
{
  "DEPLOYMENT_BUCKET": "myBucketName"
}
(both of these files have non-relevant parts removed)
I'm getting a "cannot resolve variable at 'provider.deploymentBucket.name'" error when trying to deploy.
How do I reference the DEPLOYMENT_BUCKET variable in the serverless.yml file?
EDIT: Other attempts and their errors:

name: ${environment}:DEPLOYMENT_BUCKET -> Could not locate deployment bucket. Error: The specified bucket is not valid
name: ${environment:DEPLOYMENT_BUCKET} -> Unrecognized configuration variable sources: "environment"
name: ${self:provider.environment:DEPLOYMENT_BUCKET} and name: ${self:environment:DEPLOYMENT_BUCKET} -> Cannot resolve serverless.yml: Variables resolution errored with - Cannot resolve variable at "provider.deploymentBucket.name": Value not found at "self" source
I was able to solve the problem with this:
${file(../evn.${opt:stage, 'dev'}.json):DEPLOYMENT_BUCKET}
But 'reading' that file twice, both here and in the environment block, seems to somewhat defeat the purpose of having an environment block.
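One way to avoid the double read, as a minimal sketch (the custom key stageEnv is my own naming, and I'm assuming Serverless resolves object references in environment, which it normally does): load the file once into custom and reference it from both places.

custom:
  # Load the per-stage file a single time.
  stageEnv: ${file(../evn.${opt:stage, 'dev'}.json)}

provider:
  name: aws
  deploymentBucket:
    name: ${self:custom.stageEnv.DEPLOYMENT_BUCKET}
  # Reuse the same object for all environment variables.
  environment: ${self:custom.stageEnv}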

Related

Serverless: Importing file to custom + other variables

I have a serverless.common.yml with properties that should be shared by all the services:

service: ixxxx

custom:
  stage: ${opt:stage, self:provider.stage}
  resourcesStages:
    prod: prod
    dev: dev
  resourcesStage: ${self:custom.resourcesStages.${self:custom.stage}, self:custom.resourcesStages.dev}
  lambdaPolicyXRay:
    Effect: Allow
    Action:
      - xray:PutTraceSegments
      - xray:PutTelemetryRecords
    Resource: "*"
And another serverless.yml inside a services folder, which uses properties from the common file:

...
custom: ${file(../../serverless.common.yml):custom}
...
environment:
  stage: ${self:custom.stage}
...
In that way, I can access the custom variables (from the common file) without a problem.
Now I want to keep importing this file into custom while adding new, service-specific variables to it, so I tried this:

custom:
  common: ${file(../../serverless.common.yml):custom}
  wsgi:
    app: app.app
    packRequirements: false
  pythonRequirements:
    dockerizePip: non-linux
And it seems it's possible to access, for example:
environment:
  stage: ${self:custom.common.stage}
But now I'm receiving this error:

Serverless Warning --------------------------------------
A valid service attribute to satisfy the declaration 'self:custom.stage' could not be found.

Serverless Warning --------------------------------------
A valid service attribute to satisfy the declaration 'self:custom.stage' could not be found.

Serverless Error ---------------------------------------
Trying to populate non string value into a string for variable ${self:custom.stage}. Please make sure the value of the property is a string.
What am I doing wrong?
In serverless.common.yml you must write references as they will resolve from the importing serverless.yml. Since the common custom block is now imported under custom.common, ${self:custom.stage} no longer exists, but ${self:custom.common.stage} does:
service: ixxxx

custom:
  stage: ${opt:stage, self:provider.stage}
  resourcesStages:
    prod: prod
    dev: dev
  resourcesStage: ${self:custom.common.resourcesStages.${self:custom.common.stage}, self:custom.common.resourcesStages.dev}
  lambdaPolicyXRay:
    Effect: Allow
    Action:
      - xray:PutTraceSegments
      - xray:PutTelemetryRecords
    Resource: "*"
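For completeness, a minimal sketch of a consuming serverless.yml under this pattern (the service and function names are placeholders of mine, not from the question):

service: my-service

custom:
  common: ${file(../../serverless.common.yml):custom}

provider:
  name: aws
  iamRoleStatements:
    # An object reference can be dropped into a list item; this pulls
    # the shared X-Ray policy in from the common file.
    - ${self:custom.common.lambdaPolicyXRay}

functions:
  hello:
    handler: handler.hello
    environment:
      stage: ${self:custom.common.stage}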

Error in YAML file while trying to create multiple S3 buckets in Serverless Framework for an AWS Lambda function

So I'm pretty new to CloudFormation and also to the Serverless Framework. I've been working through some exercises (such as an automatic thumbnail generator) and creating some simple projects that I can hopefully generalize for my own purposes.
Right now I'm attempting to create a stack/function that creates two S3 buckets and has the Lambda function take a CSV file from one, perform some simple transformations, and place it in the other, receiving bucket.
Building off an exercise I've done, I created a YAML file with the following code:
provider:
  name: aws
  runtime: python3.8
  region: us-east-1
  profile: serverless-admin
  timeout: 10
  memorySize: 128
  iamRoleStatements:
    - Effect: "Allow"
      Action:
        - "s3:*"
      Resource: "*"

custom:
  assets:
    targets:
      - bucket1: csvbucket1-08-16-2020
        pythonRequirements:
          dockerizePip: true
      - bucket2: csvbucket2-08-16-2020
        pythonRequirements:
          dockerizePip: true

functions:
  protomodel-readcsv:
    handler: handler.readindata
    events:
      s3:
        - bucket: ${self:custom.bucket1}
          event: s3:ObjectCreated:*
          suffix: .csv
        - bucket: ${self:custom.bucket2}

plugins:
  - serverless-python-requirements
  - serverless-s3-deploy
However, when I do a serverless deploy from my command prompt, I get:
Serverless Warning --------------------------------------
A valid service attribute to satisfy the declaration 'self:custom.bucket1' could not be found.
Serverless Warning --------------------------------------
A valid service attribute to satisfy the declaration 'self:custom.bucket2' could not be found.
Serverless Error ---------------------------------------
Events for "protomodel-readcsv" must be an array, not an object
I've tried to make events in protomodel-readcsv an array by adding a -, but then I get a bad-indentation error that for some reason I cannot reconcile. More fundamentally, I'm not exactly sure why that item would need to be an array anyway, and I wasn't clear about the warnings for the buckets either.
Sorry for the newbie question, but generalizing from online tutorials and examples leaves a lot to figure out.
Your buckets are defined under a nested list:

custom:
  assets:
    targets:
      - bucket1

so I guess you need self:custom.assets.targets.bucket1, though I'm not sure that nested assets structure will resolve. The example below is supposed to work:
service: MyService

custom:
  deploymentBucket: s3_my_bucket

provider:
  name: aws
  deploymentBucket: ${self:custom.deploymentBucket}
  stage: dev
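As for the "must be an array" error itself, here is a minimal sketch of the events fix (flattening the bucket names into plain custom keys is my choice, and the rules form of the suffix filter follows the Serverless docs; neither comes from the answer above):

custom:
  bucket1: csvbucket1-08-16-2020
  bucket2: csvbucket2-08-16-2020

functions:
  protomodel-readcsv:
    handler: handler.readindata
    events:
      # Each event must be a list item ("- s3:"), not a key of a mapping.
      - s3:
          bucket: ${self:custom.bucket1}
          event: s3:ObjectCreated:*
          rules:
            - suffix: .csv
      - s3:
          bucket: ${self:custom.bucket2}
          event: s3:ObjectCreated:*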

In Serverless (AWS), how do I get a variable reference from the serverless.yml file?

In serverless.yml, I defined a resource:

provider:
  name: aws
  runtime: nodejs6.10
  region: us-east-1
  stage: dev
  environment:
    customerDef: myvariable

resources:
  Resources:
    NewResource:
      Type: AWS::S3::Bucket
      Properties:
        BucketName: ${self:service.name}-${self:provider.stage}-uploads
In handler.js, where I write the handler function:
How do I get a reference to the BucketName?
How do I get the bucket URI?
How do I get the customerDef variable value (provider -> environment -> customerDef)?
All the environment variables defined under the environment node are available in any .js file via process.env.<variable_name>.
In your case, to access the customerDef variable you should use process.env.customerDef.
You can do the same with the BucketName and bucket URI by adding them to the environment block.

If you have your variables under the environment key, you can reference them via process.env.yourVariable anywhere in your project.
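A minimal sketch of that approach (the BUCKET_NAME key and the ${self:service} reference are my additions, not from the answers): define the name once in environment, reference it from the resource, and read it in handler.js as process.env.BUCKET_NAME.

provider:
  name: aws
  stage: dev
  environment:
    customerDef: myvariable
    # Hypothetical variable; exposes the bucket name to the Lambda,
    # readable in handler.js as process.env.BUCKET_NAME.
    BUCKET_NAME: ${self:service}-${self:provider.stage}-uploads

resources:
  Resources:
    NewResource:
      Type: AWS::S3::Bucket
      Properties:
        BucketName: ${self:provider.environment.BUCKET_NAME}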

A valid option to satisfy the declaration could not be found in Serverless Framework

I'm using serverless framework and using bitbucket-pipeline to configure CI/CD.
I have the following configuration in the serverless.yml file
provider:
  name: aws
  runtime: nodejs10.x
  region: ${opt:region, ${env:AWS_REGION, 'ap-south-1'}}
  memorySize: ${opt:memory, ${env:MEMORY_SIZE, 512}}
  tracing:
    apiGateway: ${env:TRACING_API_GATEWAY, true}
I want to be able to pass the variables from the CLI as well as environment variables.
I have set up an environment variable for AWS_REGION in the Bitbucket pipeline variables, but not for MEMORY_SIZE, since I want to use the default value.
But I get the following while running the pipeline:
Serverless Warning --------------------------------------
A valid option to satisfy the declaration 'opt:region,ap-south-1' could not be found.
Serverless Warning --------------------------------------
A valid environment variable to satisfy the declaration 'env:MEMORY_SIZE,512' could not be found.
Serverless Warning --------------------------------------
A valid environment variable to satisfy the declaration 'env:TRACING_API_GATEWAY,true' could not be found.
First, those are warnings, not errors. A Serverless error would look something like:
Serverless Error ---------------------------------------
Trying to populate non string value into a string for variable ${opt:stage}. Please make sure the value of the property is a string.
This specific error happens when, for example, the custom block declares var: ${opt:stage}-something; it should be changed to:

provider:
  stage: ${opt:stage, 'local'}

custom:
  var: ${self:provider.stage}-something

In your case, I think you need to update region like this:

region: ${opt:region, env:AWS_REGION, 'ap-south-1'}

I couldn't reproduce the last warning, but I reckon env variables should be defined in bitbucket-pipelines.yml (or a similar CI pipeline YAML) under args or variables, and then they can be accessed using ${env:VAR}.
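Putting that together, a minimal sketch of the corrected provider block (extending the same comma-separated fallback chain to memorySize is my extrapolation, not something the answer states):

provider:
  name: aws
  runtime: nodejs10.x
  # CLI option first, then the environment variable, then the literal default.
  region: ${opt:region, env:AWS_REGION, 'ap-south-1'}
  memorySize: ${opt:memory, env:MEMORY_SIZE, 512}
  tracing:
    apiGateway: ${env:TRACING_API_GATEWAY, true}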

Serverless provider.environment variables not available in custom

I am trying to reference variables in self:provider.environment in my custom variables block; however, I get the following warning:
Serverless Warning --------------------------------------
A valid service attribute to satisfy the declaration
'self:provider.environment.myVar' could not be found.
We are using Serverless 1.28.0; here's a sample config:

service: testing-vars

provider:
  region: 'us-west-2'
  environment:
    myVar: ${env:myVar, self:custom.dotenv.myVar}

custom:
  refToAbove: ${self:provider.environment.myVar}
...
I would like to reference the provider.environment vars in my custom block.
This was due to a plugin not handling variables properly and has since been fixed.
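If you're stuck on an affected version, a possible workaround (my suggestion, not part of the original answer) is to repeat the fallback chain in custom instead of routing through provider.environment:

custom:
  # Duplicates the chain from provider.environment.myVar to avoid
  # the buggy self:provider.environment lookup.
  refToAbove: ${env:myVar, self:custom.dotenv.myVar}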