I'm getting an `Unable to import module 'handler'` error from my Serverless deployment.
Here is my handler:
'use strict';
const fetch = require('fetch');

module.exports.hello = (event, context, callback) => {
  console.log('Hello, world!');
  callback(null);
};
Here is my serverless.yml:
service: salsifyapicronjob

provider:
  name: aws
  region: eu-central-1
  runtime: nodejs6.10

functions:
  hello:
    handler: handler.hello
    events:
      - schedule:
          rate: cron(*/5 * * * ? *)
          enabled: true
I did notice inside the .serverless zip that handler.js is not in there. Could this be the reason, and if so, how do I fix it?
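If handler.js really is missing from the packaged zip, the usual cause is packaging configuration (or a plugin) excluding it. As a sketch, assuming default packaging and this project layout, you can force the file into the artifact with an explicit include in serverless.yml:

```yaml
# Hypothetical packaging section: make sure handler.js ships in the artifact.
# Newer Serverless versions use `package.patterns` instead of include/exclude.
package:
  include:
    - handler.js
```

You can then re-run `serverless package` and list the zip under `.serverless/` to confirm handler.js is present before deploying.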
I have a basic Serverless Express app in a Lambda, with one route set to `async: true`. I want to trigger this route asynchronously from a different application and expect it to run in the background, without having to wait for the response.
My full serverless.yml
service: service-name
useDotenv: true

custom:
  serverless-offline:
    useChildProcesses: true
  webpack:
    webpackConfig: ./webpack.config.js
    packager: "yarn"
    includeModules:
      forceExclude:
        - aws-sdk
  prune:
    automatic: true
    includeLayers: true
    number: 3
  envStage:
    staging: staging
  domainPrefix:
    staging: service.staging
  customDomain:
    domainName: ${self:custom.domainPrefix.${opt:stage}}.mydomain.com
    basePath: ""
    stage: ${self:custom.envStage.${opt:stage}}
    createRoute53Record: true

plugins:
  - serverless-domain-manager
  - serverless-webpack
  - serverless-prune-plugin
  - serverless-offline

provider:
  lambdaHashingVersion: "20201221"
  name: aws
  runtime: nodejs14.x
  region: us-east-1
  apiGateway:
    minimumCompressionSize: 1024
  iamRoleStatements:
    - Effect: Allow
      Action: ssm:Get*
      Resource: "arn:aws:ssm:*:*:parameter/myparams/*"
    - Effect: Allow
      Action: kms:Decrypt
      Resource: "*"

functions:
  express:
    handler: src/index.middyHandler
    events:
      - http:
          path: /
          method: options
      - http:
          path: /{any+} # Catch all routes
          method: options
      - http:
          path: foo/{any+}
          method: get
      - http:
          path: foo/{any+}
          method: post
          async: true
Note: the role that deploys this app has read/write permissions for CloudWatch, and I can see logs from the synchronous invocations, but not from async invocations.
My index.middyHandler
import serverless from "serverless-http";
import express from "express";
import helmet from "helmet";
import bodyParser from "body-parser";
import cookieParser from "cookie-parser";
import middy from "@middy/core";
import ssm from "@middy/ssm";
import doNotWaitForEmptyEventLoop from "@middy/do-not-wait-for-empty-event-loop";
import cors from "cors";
import fooRoutes from "./routes/foo";

const app = express();

app.use(
  cors({
    methods: "GET,HEAD,OPTIONS,POST",
    preflightContinue: false,
    credentials: true,
    origin: true,
    optionsSuccessStatus: 204,
  })
);
app.use(helmet({ contentSecurityPolicy: false, crossOriginEmbedderPolicy: false }));
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));
app.use(cookieParser());

app.get("/ping", (req, res) => {
  res.send("Pong!");
});

// Register routes
app.use("/foo", fooRoutes);

const handler = serverless(app);

export const middyHandler = middy(handler)
  .use(
    doNotWaitForEmptyEventLoop({
      runOnError: true,
      runOnAfter: true,
      runOnBefore: true,
    })
  )
  .use(
    ssm({
      setToEnv: true,
      fetchData: {
        MY_KEYS: "ssm/path",
      },
    })
  );
When I call this method, it correctly returns a 200 response immediately, but the actual code never runs: I have a DB insert in there and it doesn't happen. In API Gateway I can see the X-Amz-Invocation-Type header is correctly being passed as the Event type, and the method is not a proxy integration, as required for async invocation.
What am I missing here? The route controller is a test and the code is very simple:
testAsync: async (req, res) => {
  console.log("In Test Async"); // Does not display in CloudWatch
  try {
    const { value } = req.body;
    const resp = await updateTest(value); // This just inserts an entry in the DB with value
    return res.send(resp);
  } catch (err) {
    return res.status(500).send(err);
  }
},
Is there any other setting I'm missing here? I'm not an AWS expert, so any help would be highly appreciated. Thanks!
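As a point of comparison, a fire-and-forget call can also bypass API Gateway and invoke the Lambda directly with InvocationType "Event". A minimal sketch with aws-sdk v2; the function name and region are assumptions, not taken from the question:

```javascript
// Sketch: triggering the Lambda in the background from another Node app.
// Build the invocation parameters; InvocationType "Event" makes Lambda
// queue the request and return immediately.
function buildAsyncInvokeParams(functionName, payload) {
  return {
    FunctionName: functionName,
    InvocationType: "Event", // fire-and-forget: accepted with HTTP 202
    Payload: JSON.stringify(payload),
  };
}

async function triggerInBackground() {
  const AWS = require("aws-sdk"); // lazy require; assumes aws-sdk v2 is installed
  const lambda = new AWS.Lambda({ region: "us-east-1" });
  // "service-name-dev-express" is a hypothetical deployed function name
  const params = buildAsyncInvokeParams("service-name-dev-express", { value: 42 });
  // response.StatusCode === 202 means the event was accepted for async processing
  return lambda.invoke(params).promise();
}

module.exports = { buildAsyncInvokeParams, triggerInBackground };
```

With a direct async invoke, the logs of the background run land in the invoked function's own CloudWatch log group, which can make the "no logs from async invocations" symptom easier to isolate.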
I'm using the Serverless Framework to try and test EventBridge.
The documentation is a little sparse, but for my test I would like to have two Lambda functions: the first publishes an event, and the second consumes it.
Here is my YAML:
service: events
frameworkVersion: '2'

provider:
  name: aws
  runtime: nodejs12.x
  lambdaHashingVersion: '20201221'

functions:
  vehicle:
    handler: handler.vehicle
    events:
      - httpApi:
          path: /vehicle
          method: '*'
  bundle:
    handler: handler.bundle
    events:
      - httpApi:
          path: /bundle
          method: '*'
      - eventBridge:
          eventBus: vehicle-bus
          pattern:
            source:
              - aos.vehicle.upload
            detail-type:
              - VehicleUpload
and my handler.js
"use strict";
const AWS = require('aws-sdk');

module.exports.vehicle = async (event) => {
  const eventBridge = new AWS.EventBridge({ region: 'us-east-1' });
  const vrm = 'WR17MMN';

  return eventBridge.putEvents({
    Entries: [
      {
        EventBusName: 'vehicle-bus', // must match the bus name in serverless.yml exactly
        Source: 'aos.vehicle.upload',
        DetailType: 'VehicleUpload',
        Detail: `{ "Registration": "${vrm}" }`,
      },
    ]
  }).promise();
};
module.exports.bundle = async (event) => {
  return {
    statusCode: 200,
    body: JSON.stringify(
      {
        message: "BUNDLE",
        input: event,
        aos: "First test OK",
      },
      null,
      2
    ),
  };
};
(I realise I can't just return that from the Lambda; it also needs to be an endpoint. If I make the function body of bundle empty, I still get a server error.)
What am I missing?
So you need this minimal setup:
org: myOrg
app: my-events
service: event-bridge-serverless

provider:
  name: aws
  runtime: nodejs10.x
  region: eu-west-1
  lambdaHashingVersion: 20201221
  environment:
    DYNAMODB_TABLE: ${self:service}-dev
  eventBridge:
    useCloudFormation: true
  iamRoleStatements:
    - Effect: "Allow"
      Action:
        - "events:PutEvents"
      Resource: "*"

functions:
  asset:
    handler: handler.asset
    events:
      - eventBridge:
          eventBus: my-events
          pattern:
            source:
              - my.event
I have quite a big app built with Serverless, and now we are trying out serverless-stack (sst). I am trying to reference a user pool created by sst in a serverless.yml function. Is it possible? Below are the steps I've tried.
I have created a user pool
import * as cdk from '@aws-cdk/core'
import * as cognito from '@aws-cdk/aws-cognito'
import * as sst from '@serverless-stack/resources'

export default class UserServiceStack extends sst.Stack {
  constructor(scope: cdk.Construct, id: string, props: sst.StackProps = {}) {
    super(scope, id, props)

    const userPool = new cognito.UserPool(this, 'userPool', {
      signInAliases: {
        email: true,
        phone: true,
      },
      autoVerify: {
        email: true,
        phone: true,
      },
      passwordPolicy: {
        minLength: 8,
        requireDigits: false,
        requireLowercase: false,
        requireSymbols: false,
        requireUppercase: false,
      },
      signInCaseSensitive: false,
      selfSignUpEnabled: true,
    })

    new cdk.CfnOutput(this, 'UserPoolId', {
      value: userPool.userPoolId,
    })

    const userPoolClient = new cognito.UserPoolClient(this, 'userPoolClient', {
      userPool,
      authFlows: {
        adminUserPassword: true,
        userPassword: true,
      },
    })

    new cdk.CfnOutput(this, 'UserPoolClientId', {
      value: userPoolClient.userPoolClientId,
    })
  }
}
and want to update my post confirmation trigger defined in serverless.yml
...
createUser:
  handler: createUser.default
  events:
    - cognitoUserPool:
        pool: !ImportValue '${self:custom.sstApp}...' # what to put here?
        trigger: PostConfirmation
        existing: true
Figured it out.
First: how to use CDK output variables in serverless.yml. Export them into a file:
AWS_PROFILE=<profile-name> npx sst deploy --outputs-file ./exports.json
and in serverless.yml you can reference it like so
...
createUser:
  handler: createUser.default
  events:
    - cognitoUserPool:
        pool: ${file(../../infrastructure/exports.json):${self:custom.sstApp}-UserServiceStack.userPoolName}
        trigger: PostConfirmation
        existing: true
Second: serverless is set up such that you have to pass a userPoolName, not a userPoolId. So I had to generate a user pool name and output it:
import * as uuid from 'uuid'
...
const userPoolName = uuid.v4()
const userPool = new cognito.UserPool(this, 'userPool', {
  userPoolName,
  ...
})
...
// eslint-disable-next-line no-new
new cdk.CfnOutput(this, 'userPoolName', {
  value: userPoolName,
})
Third: to avoid an AccessDeniedException when Cognito calls the Lambda trigger, you need to add the following to your resources:
- Resources:
    OnCognitoSignupPermission:
      Type: 'AWS::Lambda::Permission'
      Properties:
        Action: "lambda:InvokeFunction"
        FunctionName:
          Fn::GetAtt: [ "CreateUserLambdaFunction", "Arn" ] # the name must be the uppercased name of your lambda + "LambdaFunction" at the end
        Principal: "cognito-idp.amazonaws.com"
        SourceArn: ${file(../../infrastructure/exports.json):${self:custom.sstApp}-UserServiceStack.userPoolArn}
Anyone have any idea why I'm getting "Access Denied" when trying to put an object into S3 inside a Lambda function? I have the serverless AWS user with AdministratorAccess and allow access to the S3 resource inside serverless.yml:
iamRoleStatements:
  - Effect: Allow
    Action:
      - s3:PutObject
    Resource: "arn:aws:s3:::*"
Edit - here are the files
serverless.yml
service: testtest
app: testtest
org: workx

provider:
  name: aws
  runtime: nodejs12.x
  iamRoleStatements:
    - Effect: Allow
      Action:
        - s3:PutObject
      Resource: "arn:aws:s3:::*/*"

functions:
  hello:
    handler: handler.hello
    events:
      - http:
          path: users/create
          method: get
handler.js
'use strict';
const AWS = require('aws-sdk');

// get reference to S3 client
const S3 = new AWS.S3();

// Upload the content to S3 and allow download
async function uploadToS3(content) {
  console.log('going to upload to s3!');
  const Bucket = 'mtest-exports';
  const key = 'testtest.csv';
  try {
    const destparams = {
      Bucket,
      Key: key,
      Body: content,
      ContentType: "text/csv",
    };
    console.log('going to put object', destparams);
    const putResult = await S3.putObject(destparams).promise();
    return putResult;
  } catch (error) {
    console.log(error);
    throw error;
  }
}
module.exports.hello = async event => {
  const result = await uploadToS3('hello world');
  return {
    statusCode: 200,
    body: JSON.stringify(result),
  };
};
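Since the comment above says the upload should "allow download", a common companion step is returning a time-limited pre-signed GET URL for the object. A sketch using the synchronous aws-sdk v2 `getSignedUrl` API; the expiry value and helper names are assumptions:

```javascript
// Parameters for S3.getSignedUrl("getObject", ...): a time-limited link
// to download an object after it has been uploaded.
function buildDownloadUrlParams(bucket, key, expiresSeconds) {
  return { Bucket: bucket, Key: key, Expires: expiresSeconds || 3600 };
}

function downloadLinkFor(S3, bucket, key) {
  // aws-sdk v2: getSignedUrl returns the signed URL string directly
  return S3.getSignedUrl("getObject", buildDownloadUrlParams(bucket, key, 900));
}

module.exports = { buildDownloadUrlParams, downloadLinkFor };
```

Note that generating a pre-signed GET URL implies the Lambda role also needs `s3:GetObject` on the bucket, not just `s3:PutObject`.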
I was using the TypeScript plugin @serverless/typescript. I used it to create a Lambda function that resizes images uploaded to S3 and does some content moderation.
Here is the content of serverless.ts file:
import type { AWS } from '@serverless/typescript';

import resizeImageLambda from '@functions/resizeImageLambda';

const serverlessConfiguration: AWS = {
  service: 'myservice-image-resize',
  frameworkVersion: '3',
  plugins: ['serverless-esbuild'],
  provider: {
    name: 'aws',
    stage: 'dev',
    region: 'us-east-1',
    profile: 'myProjectProfile', // reference to your local AWS profile created by the serverless config command
    // architecture: 'arm64', // to support Lambda w/ Graviton
    iam: {
      role: {
        statements: [
          {
            Effect: 'Allow',
            Action: [
              's3:GetObject',
              's3:PutObject',
              's3:PutObjectAcl',
              's3:ListBucket',
              'rekognition:DetectModerationLabels'
            ],
            Resource: [
              'arn:aws:s3:::myBucket/*',
              'arn:aws:s3:::myBucket',
              'arn:aws:s3:::/*',
              '*'
            ]
          },
          {
            Effect: 'Allow',
            Action: [
              's3:ListBucket',
              'rekognition:DetectModerationLabels'
            ],
            Resource: ['arn:aws:s3:::myBucket']
          }
        ]
      }
    },
    runtime: 'nodejs16.x',
    environment: {
      AWS_NODEJS_CONNECTION_REUSE_ENABLED: '1',
      NODE_OPTIONS: '--enable-source-maps --stack-trace-limit=1000',
      SOURCE_BUCKET_NAME:
        '${self:custom.myEnvironment.SOURCE_BUCKET_NAME.${self:custom.myStage}}',
      DESTINATION_BUCKET_NAME:
        '${self:custom.myEnvironment.DESTINATION_BUCKET_NAME.${self:custom.myStage}}'
    }
  },
  // import the function via paths
  functions: { resizeImageLambda },
  package: { individually: true },
  custom: {
    esbuild: {
      bundle: true,
      minify: false,
      sourcemap: true,
      exclude: ['aws-sdk'],
      target: 'node16',
      define: { 'require.resolve': undefined },
      platform: 'node',
      concurrency: 10,
      external: ['sharp'],
      packagerOptions: {
        scripts:
          'rm -rf node_modules/sharp && SHARP_IGNORE_GLOBAL_LIBVIPS=1 npm install --arch=x64 --platform=linux --libc=glibc sharp'
      }
    },
    myEnvironment: {
      SOURCE_BUCKET_NAME: {
        dev: 'myBucket',
        prod: 'myBucket-prod'
      },
      DESTINATION_BUCKET_NAME: {
        dev: 'myBucket',
        prod: 'myBucketProd'
      }
    },
    myStage: '${opt:stage, self:provider.stage}'
  }
};

module.exports = serverlessConfiguration;
resizeImageLambda.ts
/* eslint-disable no-template-curly-in-string */
// import { Config } from './config';

export const handlerPath = (context: string) =>
  `${context.split(process.cwd())[1].substring(1).replace(/\\/g, '/')}`;

export default {
  handler: `${handlerPath(__dirname)}/handler.main`,
  events: [
    {
      s3: {
        bucket: '${self:custom.myEnvironment.SOURCE_BUCKET_NAME.${self:custom.myStage}}',
        event: 's3:ObjectCreated:*',
        existing: true,
        forceDeploy: true // for existing buckets
      }
    }
  ],
  timeout: 15 * 60, // 15 min
  memorySize: 2048
};
I remember there were a few issues when I wanted to connect it to existing buckets (created outside the Serverless Framework), such as the IAM policy not being re-created/updated properly (see the forceDeploy and existing parameters in functions.events[0].s3 in the resizeImageLambda.ts file).
Turns out I was an idiot and had the custom config in the wrong place, which ruined the serverless.yml file!
I'm trying to deploy my project to AWS Lambda with serverless-webpack, but I get an error on importing a model into Sequelize.
Here is the error log from AWS:
module initialization error: ReferenceError
    at t.default (/var/task/app.js:1:8411)
    at Sequelize.import (/var/task/node_modules/sequelize/lib/sequelize.js:398:32)
My serverless.yml file:
service: backend-aquatru

plugins:
  - serverless-webpack
  - serverless-offline
  - serverless-dotenv-plugin

custom:
  webpackIncludeModules:
    forceInclude:
      - pg
      - pg-hstore
  webpackConfig: 'webpack.config.js'
  includeModules: true
  packager: 'npm'

provider:
  name: aws
  runtime: nodejs8.10
  stage: dev
  region: us-west-1

functions:
  app:
    handler: app.handler
    events:
      - http: 'ANY /'
      - http: 'ANY {proxy+}'
webpack.config.js
const path = require('path')
const Dotenv = require('dotenv-webpack')
const nodeExternals = require('webpack-node-externals') // was missing, but is used below

module.exports = {
  entry: './app.js',
  target: 'node',
  externals: [nodeExternals()],
  output: {
    libraryTarget: 'commonjs',
    path: path.resolve(__dirname, '.webpack'),
    filename: 'app.js', // this should match the first part of the function handler in serverless.yml
  },
  plugins: [
    new Dotenv(),
  ],
  module: {
    rules: [
      {
        test: /\.js$/,
        exclude: /node_modules/,
        include: __dirname,
        loaders: ['babel-loader'],
      },
    ],
  },
}
And my sequelize import model:
import { sequelize as dbConfig } from '../config/vars'
import Sequelize from 'sequelize';

const db = {
  Sequelize,
  sequelize: new Sequelize(dbConfig.makeUri(), {
    operatorsAliases: Sequelize.Op,
    dialect: dbConfig.dialect
  }),
};

// require each model from every endpoint with sequelize
db.User = db.sequelize.import('User', require('../api/models/user.model'));

// run associations from every model here
Object.keys(db).forEach((modelName) => {
  if ('associate' in db[modelName]) {
    db[modelName].associate(db);
  }
});

export const User = db.User;
export default db;
I saw the serverless-webpack issue about importing model files with require and modified my code that way, but it doesn't help.
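One workaround worth noting: sequelize.import resolves model files with a runtime require that webpack cannot see when bundling (and it was removed entirely in Sequelize v6). The models can instead be registered by calling the factory functions directly. `registerModels` below is a hypothetical helper; it assumes each model module has the usual `(sequelize, DataTypes) => Model` export shape:

```javascript
// Register model factories without sequelize.import, so webpack can
// statically bundle every model module.
function registerModels(sequelize, Sequelize, factories) {
  const db = { sequelize, Sequelize };
  for (const [name, factory] of Object.entries(factories)) {
    // each factory receives the instance and DataTypes, mirroring
    // what sequelize.import used to pass through
    db[name] = factory(sequelize, Sequelize.DataTypes);
  }
  // run associations once every model is registered
  for (const name of Object.keys(db)) {
    const model = db[name];
    if (model && typeof model.associate === "function") model.associate(db);
  }
  return db;
}

module.exports = { registerModels };

// Usage sketch:
// const db = registerModels(sequelizeInstance, Sequelize, {
//   User: require('../api/models/user.model'),
// });
```

Because the `require('../api/models/user.model')` call is now a literal path in the source, webpack can include the module in the bundle instead of failing at runtime inside Sequelize.import.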