MongoDB connection created for every AWS serverless function call - Express

I have set up Express.js with Serverless and am connecting to MongoDB Atlas.
The code works fine, but it creates a new connection for each call. I also tried the caching method, but had no luck with it.
Here is my code:
// server.js
const sls = require('serverless-http');
const connectToDatabase = require('./lib/db');
const app = require('./lib/app');

connectToDatabase();

module.exports.run = sls(app);
// db.js
const mongoose = require('mongoose');
const Promise = require('bluebird');
// console.log("Connecting to " + process.env.DB);

const connection = {};
mongoose.Promise = Promise;

module.exports = async () => {
  if (connection.isConnected) {
    console.log('=> using existing database connection');
    return;
  }
  console.log('=> using new database connection');
  const db = await mongoose.connect(process.env.DB, { useNewUrlParser: true });
  connection.isConnected = db.connections[0].readyState;
};

There are a few things to check:
How long does your Lambda function take to execute? A function instance only handles one request at a time, so if a second request arrives before the first completes, a new instance (with its own connection) will spin up.
Ensure that mongoose isn't closing the connection after your function completes. A common fix is to cache the connection in module scope and stop Lambda from waiting on the open socket, as in the sketch below.
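Here is a minimal sketch of that caching pattern (not your exact setup; it assumes the same serverless-http wiring and DB env var as above). The connection promise lives in module scope so warm invocations of the same instance reuse it, and callbackWaitsForEmptyEventLoop lets the function return while the MongoDB socket stays open:

// server.js - a sketch of the usual caching pattern
const sls = require('serverless-http');
const mongoose = require('mongoose');
const app = require('./lib/app');

let connPromise = null; // cached across warm invocations of this instance

function connectToDatabase() {
  if (!connPromise) {
    connPromise = mongoose.connect(process.env.DB, { useNewUrlParser: true });
  }
  return connPromise;
}

const handler = sls(app);

module.exports.run = async (event, context) => {
  // Let Lambda freeze the instance without killing the open connection.
  context.callbackWaitsForEmptyEventLoop = false;
  await connectToDatabase();
  return handler(event, context);
};

Note that this only reduces connections per warm instance; concurrent requests will still each get their own instance and their own connection.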

Related

react-native (Expo) upload file in background

In my Expo (react-native) application, I want to run the upload task even if the application is in the background or killed.
The upload goes to Firebase Storage, so we don't have our own REST API.
I checked out the Expo TaskManager library, but I could not figure out how this should be done. Is it even possible to achieve this goal with Expo? Is TaskManager the correct package for this task?
Only some Expo packages can be registered as a task (e.g. BackgroundFetch), and it is not possible to register a custom function (in this case the uploadFile method below).
I got even more confused because on iOS we should add the UIBackgroundModes key, but it only has audio, location, voip, external-accessory, bluetooth-central, bluetooth-peripheral, fetch, remote-notification, and processing as possible values.
I would appreciate it if you could at least guide me on where to start or what to search for, so I can upload the file even if the app is in the background or killed/terminated.
import { getStorage, ref, uploadBytes } from "firebase/storage";

const storage = getStorage();
const storageRef = ref(storage, 'videos');

const uploadFile = async (file) => {
  // the file is a Blob object
  await uploadBytes(storageRef, file);
};
I have already reviewed react-native-background-fetch, react-native-background-upload, and react-native-background-job. background-upload requires ejecting from Expo, background-job does not support iOS, and background-fetch is designed for running tasks at intervals.
If there is a way to use the mentioned libraries for my purpose, please guide me :)
To my understanding, the Firebase Cloud JSON API does not accept files, does it? If it does, please give me an example. If I can make the Storage JSON API work with file uploads, then I can probably use Expo's async upload without ejecting.
I have done something similar to what you want; you can use expo-task-manager and expo-background-fetch. Here is the code as I used it. I hope this is useful for you.
import * as BackgroundFetch from 'expo-background-fetch';
import * as TaskManager from 'expo-task-manager';

const BACKGROUND_FETCH_TASK = 'background-fetch';

const [isRegistered, setIsRegistered] = useState(false);
const [status, setStatus] = useState(null);

// Minimum interval value so the task runs on iOS
BackgroundFetch.setMinimumIntervalAsync(60 * 15);

// Define the task to execute
TaskManager.defineTask(BACKGROUND_FETCH_TASK, async () => {
  const now = Date.now();
  console.log(`Got background fetch call at date: ${new Date(now).toISOString()}`);
  // Your function or instructions you want
  return BackgroundFetch.Result.NewData;
});

// Register the task as BACKGROUND_FETCH_TASK
async function registerBackgroundFetchAsync() {
  return BackgroundFetch.registerTaskAsync(BACKGROUND_FETCH_TASK, {
    minimumInterval: 60 * 15, // 15 minutes
    stopOnTerminate: false, // android only
    startOnBoot: true, // android only
  });
}

// Task status
const checkStatusAsync = async () => {
  const status = await BackgroundFetch.getStatusAsync();
  const isRegistered = await TaskManager.isTaskRegisteredAsync(
    BACKGROUND_FETCH_TASK
  );
  setStatus(status);
  setIsRegistered(isRegistered);
};

// Check if the task is already registered
const toggleFetchTask = async () => {
  if (isRegistered) {
    console.log('Task ready');
  } else {
    await registerBackgroundFetchAsync();
    console.log('Task registered');
  }
  checkStatusAsync();
};

useEffect(() => {
  toggleFetchTask();
}, []);
Hope this isn't too late to be helpful.
I've been dealing with a variety of expo <-> firebase storage integrations recently, and here's some info that might be helpful.
First, I'd recommend not using the uploadBytes / uploadBytesResumable methods from Firebase. This thread has a long ongoing discussion about it, but basically it's broken in v9. Maybe in the future the Firebase team will solve the issues, but right now it's pretty broken with Expo.
Instead, I'd recommend going down the route of writing a small Firebase function that either returns a signed upload URL or handles the upload itself.
Basically, if you can get storage uploads to work via an HTTP endpoint, you can get any kind of upload mechanism working (e.g. the FileSystem.uploadAsync() method you're probably looking for here, as @brentvatne pointed out, or fetch, or axios; I'll show a basic wiring at the end).
Server Side
Option 1: Signed URL Upload.
Basically, have a small Firebase function that returns a signed URL. Your app calls a cloud function like /get-signed-upload-url, which returns the URL, which you then upload to. Check out https://cloud.google.com/storage/docs/access-control/signed-urls for how you'd go about this.
This might work well for your use case. It can be configured just like any httpsCallable function, so it's not much work to set up compared to option 2.
However, this doesn't work with the Firebase Storage / Functions emulator! For this reason, I don't use this method, because I like to use the emulators intensively, and they only offer a subset of all the functionality.
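For reference, a rough sketch of that approach (the endpoint name and object path here are hypothetical; the getSignedUrl call is the @google-cloud/storage API that firebase-admin exposes):

import * as functions from "firebase-functions";
import * as admin from "firebase-admin";

export const getSignedUploadUrl = functions.https.onCall(async (data, context) => {
  if (!context.auth) {
    throw new functions.https.HttpsError("unauthenticated", "Sign in first");
  }
  const bucket = admin.storage().bucket();
  const file = bucket.file(`uploads/${context.auth.uid}/${data.fileName}`);
  // V4 signed URL, valid for 15 minutes, restricted to a PUT upload
  // with the declared content type.
  const [url] = await file.getSignedUrl({
    version: "v4",
    action: "write",
    expires: Date.now() + 15 * 60 * 1000,
    contentType: data.contentType,
  });
  return { url };
});

The client then PUTs the file bytes to the returned URL with the same content type it declared.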
Option 2: Upload the file entirely through a function
This is a little hairier, but gives you a lot more control over your uploads, and it will work on an emulator! I also like it because it lets you run the upload processing within the endpoint execution, instead of as a side effect.
For example, you can have a photo-upload endpoint generate thumbnails, and if the endpoint returns a 201, then you're good! This beats the traditional Firebase approach of attaching a listener to Cloud Storage that generates thumbnails as a side effect, which has all kinds of bad race conditions (checking for processing completion via exponential backoff? Gross!).
Here are three resources I'd recommend to go about this approach:
https://cloud.google.com/functions/docs/writing/http#multipart_data
https://github.com/firebase/firebase-js-sdk/issues/5848
https://github.com/mscdex/busboy
Basically, if you can make a Firebase cloud endpoint that accepts a File within formdata, you can have busboy parse it, and then you can do anything you want with it... like upload it to Cloud Storage!
An outline of this:
import * as functions from "firebase-functions";
import * as busboy from "busboy";
import * as os from "os";
import * as path from "path";
import * as fs from "fs";

type FieldMap = {
  [fieldKey: string]: string;
};

type Upload = {
  filepath: string;
  mimeType: string;
};

type UploadMap = {
  [fileName: string]: Upload;
};

const MAX_FILE_SIZE = 2 * 1024 * 1024; // 2MB

export const uploadPhoto = functions.https.onRequest(async (req, res) => {
  verifyRequest(req); // Verify parameters, auth, etc. Better yet, use a middleware system for this like express.

  // This object will accumulate all the fields, keyed by their name
  const fields: FieldMap = {};

  // This object will accumulate all the uploaded files, keyed by their name.
  const uploads: UploadMap = {};

  // This will accumulate errors during the busboy process, allowing us to end early.
  const errors: string[] = [];

  const tmpdir = os.tmpdir();

  const fileWrites: Promise<unknown>[] = [];

  function cleanup() {
    Object.entries(uploads).forEach(([filename, { filepath }]) => {
      console.log(`unlinking: ${filename} from ${filepath}`);
      fs.unlinkSync(filepath);
    });
  }

  const bb = busboy({
    headers: req.headers,
    limits: {
      files: 1,
      fields: 1,
      fileSize: MAX_FILE_SIZE,
    },
  });

  bb.on("file", (name, file, info) => {
    verifyFile(name, file, info); // Verify your mimeType / filename, etc.
    file.on("limit", () => {
      console.log("too big of file!");
    });
    const { filename, mimeType } = info;
    // Note: os.tmpdir() points to an in-memory file system on GCF
    // Thus, any files in it must fit in the instance's memory.
    console.log(`Processed file ${filename}`);
    const filepath = path.join(tmpdir, filename);
    uploads[filename] = {
      filepath,
      mimeType,
    };
    const writeStream = fs.createWriteStream(filepath);
    file.pipe(writeStream);

    // File was processed by Busboy; wait for it to be written.
    // Note: GCF may not persist saved files across invocations.
    // Persistent files must be kept in other locations
    // (such as Cloud Storage buckets).
    const promise = new Promise((resolve, reject) => {
      file.on("end", () => {
        writeStream.end();
      });
      writeStream.on("finish", resolve);
      writeStream.on("error", reject);
    });
    fileWrites.push(promise);
  });

  bb.on("close", async () => {
    await Promise.all(fileWrites);

    // Fail if errors:
    if (errors.length > 0) {
      functions.logger.error("Upload failed", errors);
      res.status(400).send(errors.join());
    } else {
      try {
        const upload = Object.values(uploads)[0];
        if (!upload) {
          functions.logger.debug("No upload found");
          res.status(400).send("No file uploaded");
          return;
        }
        // userId would come from your auth verification in verifyRequest.
        const { fileId } = await processUpload(upload, userId);
        cleanup();
        res.status(201).send({
          fileId,
        });
      } catch (error) {
        cleanup();
        functions.logger.error("Error processing file", error);
        res.status(500).send("Error processing file");
      }
    }
  });

  bb.end(req.rawBody);
});
Then, that processUpload function can do anything you want with the file, like upload it to Cloud Storage:

// assumes: import { v4 as uuidv4 } from "uuid"; and an initialized firebase-admin
async function processUpload({ filepath, mimeType }: Upload, userId: string) {
  const fileId = uuidv4();
  const bucket = admin.storage().bucket();
  await bucket.upload(filepath, {
    destination: `users/${userId}/${fileId}`,
    metadata: {
      contentType: mimeType,
    },
  });
  return { fileId };
}
Mobile Side
Then, on the mobile side, you can interact with it like this:
import * as FileSystem from "expo-file-system";
import Constants from "expo-constants";

// USE_EMULATOR, PROJECT_NAME, PROJECT_LOCATION, mimeType, and idToken are
// assumed to be defined elsewhere in your app.
async function uploadFile(uri: string) {
  function getFunctionsUrl(): string {
    if (USE_EMULATOR) {
      const origin =
        Constants?.manifest?.debuggerHost?.split(":").shift() || "localhost";
      const functionsPort = 5001;
      return `http://${origin}:${functionsPort}/${PROJECT_NAME}/${PROJECT_LOCATION}`;
    } else {
      return `https://${PROJECT_LOCATION}-${PROJECT_NAME}.cloudfunctions.net`;
    }
  }

  // The url of your endpoint. Make this as smart as you want.
  const url = `${getFunctionsUrl()}/uploadPhoto`;

  await FileSystem.uploadAsync(url, uri, {
    httpMethod: "POST",
    uploadType: FileSystem.FileSystemUploadType.MULTIPART,
    fieldName: "file", // Important! Make sure this matches whatever busboy validates as the "name" field on the file.
    mimeType,
    headers: {
      "content-type": "multipart/form-data",
      Authorization: `${idToken}`,
    },
  });
}
TLDR
Wrap Cloud Storage in your own endpoint, treat it like a normal http upload, everything plays nice.

getInitialProps is called on the front end in Next.js

I have code in _app.tsx that fetches the user from the session.
MyApp.getInitialProps = async ({ ctx }: any) => {
  const { req, res } = ctx;
  const session = auth0.getSession(req, res);
  return { user: session?.user };
};
The problem is that getInitialProps is sometimes called on the client side. I don't understand why. The documentation says:
getInitialProps enables server-side rendering in a page and allows you to do initial data population, which means sending the page with the data already populated from the server. This is especially useful for SEO.
https://nextjs.org/docs/api-reference/data-fetching/getInitialProps
What is wrong? Why is my function called on the client side?
In case I am wrong, how can I fetch user session data on the server side?
The link you shared explains that you need to use getServerSideProps instead of getInitialProps for Next.js versions > 9.3.
Since getInitialProps is called on both client and server, you have to write authentication logic for both cases. To get the session on the client, use the useUser hook; on the server, use getSession:
import { useUser } from '@auth0/nextjs-auth0';

let user;
if (typeof window === 'undefined') {
  const session = auth0.getSession(req, res);
  console.log("check session to get user", session);
  user = session.user;
} else {
  // Note: calling a hook conditionally like this violates React's rules of
  // hooks; in a real component, call useUser() unconditionally.
  const data = useUser();
  user = data.user;
}

BigQuery: authenticating with a service account from a string and not a file path

I am using
this Node.js BigQuery client.
I have the service account JSON as a string; I want to avoid writing it to a temporary file for security reasons.
Is there an option to call new BigQuery() and provide the service account as a string and not as a file path?
I couldn't find this option anywhere; in all the examples you need to provide the file path or export the GOOGLE_APPLICATION_CREDENTIALS variable.
Thanks.
It is possible to use the values from your service account as strings for authentication. You can use BigQueryOptions and pass a credentials object. The credentials object needs client_email and private_key, which can be found in your service account JSON.
Using the sample code you linked in your question, BigQueryOptions can be implemented in this manner:

const creds = {
  client_email: 'your_service_account@project-id.iam.gserviceaccount.com',
  private_key: '-----BEGIN PRIVATE KEY-----\nxxxxxxxxxxxxxxxxxxx\n-----END PRIVATE KEY-----\n'
};

const bigquery = new BigQuery({ credentials: creds });
The whole code will be:

const { BigQuery } = require('@google-cloud/bigquery');

const creds = {
  client_email: 'your_service_account@project-id.iam.gserviceaccount.com',
  private_key: '-----BEGIN PRIVATE KEY-----\nxxxxxxxxxxxxxxxxxxx\n-----END PRIVATE KEY-----\n'
};

const bigquery = new BigQuery({ credentials: creds });

async function query() {
  // Queries the U.S. given names dataset for the state of Texas.
  const query = `SELECT name
    FROM \`bigquery-public-data.usa_names.usa_1910_2013\`
    WHERE state = 'TX'
    LIMIT 100`;

  // For all options, see https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs/query
  const options = {
    query: query,
    // Location must match that of the dataset(s) referenced in the query.
    location: 'US',
  };

  // Run the query as a job
  const [job] = await bigquery.createQueryJob(options);
  console.log(`Job ${job.id} started.`);

  // Wait for the query to finish
  const [rows] = await job.getQueryResults();

  // Print the results
  console.log('Rows:');
  rows.forEach(row => console.log(row));
}

query();
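Since you already hold the whole service account JSON as a string, a minimal variation (assuming the string lives in a hypothetical SERVICE_ACCOUNT_JSON environment variable) is to parse it and pass the relevant fields through, with no temporary file involved:

const { BigQuery } = require('@google-cloud/bigquery');

// Parse the service account JSON from a string rather than a file on disk.
const sa = JSON.parse(process.env.SERVICE_ACCOUNT_JSON);

const bigquery = new BigQuery({
  projectId: sa.project_id,
  credentials: {
    client_email: sa.client_email,
    private_key: sa.private_key,
  },
});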

Mocking runtime config value with sinon

I am adding some config values before the Hapi server starts. The application works fine, although in tests I cannot use config.get(). I can get around it with proxyquire, so I was wondering:
Is adding config values "dynamically" bad design?
Is there a way I can use config.get() in such a situation?
Any alternative approach?
// initialize.js
const config = require('config');

async function startServer() {
  const someConfigVal = await callAPIToGetSomeJSONObject();
  config.dynamicValue = someConfigVal;
  server.start();
}

// doSomething.js
const config = require('config');

function doesWork() {
  const valFromConfig = config.dynamicValue.X;
  // In test I can use proxyquire by creating a config object
  ...
}

function doesNotWork() {
  const valFromConfig = config.get('dynamicValue.X');
  // Does not work with sinon mocking as this value does not exist in config when the test runs.
  // sinon.stub(config, 'get').withArgs('dynamicValue.X').returns(someVal);
  ...
}
Context: testing.
Is adding config values "dynamically" bad design? => No. I have done it before. My test code changes the configuration file default.json mid-test to check whether the function under test behaves as expected. I used several config utilities.
Is there a way I can use config.get() in such a situation? => Yes. For sinon usage, see the example below, which uses mocha. You need to define the stub/mock before the function under test uses it, and do not forget to restore the stub/mock afterwards. There is also official documentation related to this: Altering configuration values for testing at runtime, which does not use sinon (see the sketch after the example).
const config = require('config');
const sinon = require('sinon');
const { expect } = require('chai');

// Example: simple function under test.
function other() {
  const valFromConfig = config.get('dynamicValue.X');
  return valFromConfig;
}

describe('Config', function () {
  it('without stub or mock.', function () {
    // Config dynamicValue.X does not exist.
    // Expect to throw error.
    try {
      other();
      expect.fail('expect never get here');
    } catch (error) {
      expect(error.message).to.equal('Configuration property "dynamicValue.X" is not defined');
    }
  });

  it('get using stub.', function () {
    // Create stub.
    const stubConfigGet = sinon.stub(config, 'get');
    stubConfigGet.withArgs('dynamicValue.X').returns(false);
    // Call get.
    const test = other();
    // Validate the result.
    expect(test).to.equal(false);
    expect(stubConfigGet.calledOnce).to.equal(true);
    // Restore stub.
    stubConfigGet.restore();
  });

  it('get using mock.', function () {
    // Create mock.
    const mockConfig = sinon.mock(config);
    mockConfig.expects('get').once().withArgs('dynamicValue.X').returns(false);
    // Call get.
    const test = other();
    // Validate the result.
    expect(test).to.equal(false);
    // Restore mock.
    expect(mockConfig.verify()).to.equal(true);
  });
});
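For the documented non-sinon route mentioned above, a minimal sketch (relying on node-config's documented behavior of merging the NODE_CONFIG environment variable over file-based values) is to inject the test value before the config module is first required:

// test-setup.js - must run before require('config') happens anywhere
process.env.NODE_CONFIG = JSON.stringify({
  dynamicValue: { X: false },
});

const config = require('config');
console.log(config.get('dynamicValue.X')); // => false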
Hope this helps.

How can I use Express code to map a route in a Sails hook?

In Express code:

var kue = require('kue');
var express = require('express');
var ui = require('kue-ui');

var app = express();
app.use('/api', kue.app);
app.use('/kue', ui.app);

I can access http://localhost:1337/kue and http://localhost:1337/api just fine.
I tried to move this into my Sails hook:

var kue = require('kue');
var ui = require('kue-ui');

module.exports = function galaxyKueSyncHook(sails) {
  return {
    routes: {
      before: {
        'get /kue': ui.app,
        'get /api': kue.app
      }
    }
  };
};

It doesn't work; I get a blank page when accessing the same URLs.
How do I properly get this to work in Sails?
Additionally, I was able to get the Express code to work in config/http.js with:

module.exports.http = {
  customMiddleware: function (app) {
    app.use(...);

But I really want the function to be added in an installable hook.
You can use Express directly by grabbing the app that the Sails http hook exposes:

var app = sails.hooks.http.app;
app.use('/api', kue.app);
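Wired into an installable hook, a minimal sketch (assuming Sails fires the hook:http:loaded event once the http hook and its Express app are ready, per the hook lifecycle docs) looks like this:

var kue = require('kue');
var ui = require('kue-ui');

module.exports = function galaxyKueSyncHook(sails) {
  return {
    initialize: function (done) {
      // Mount the kue middleware once the http hook's Express app exists.
      sails.after('hook:http:loaded', function () {
        var app = sails.hooks.http.app;
        app.use('/api', kue.app);
        app.use('/kue', ui.app);
      });
      return done();
    }
  };
};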