CustomPouchError - react-native

I'm trying to sync my local DB to a remote one like this:
const DB_NAME = "my_db";
const REMOTE_DB_URL = "http://<admin>:<password>@<ip-address>:5984/my_db";
const localDB = new PouchDB(DB_NAME);
const remoteDB = new PouchDB(REMOTE_DB_URL);
localDB.sync(remoteDB)
.then(() => {
console.log("Sync done");
})
.catch(err => {
console.log(err);
});
This is the error I get:
message:"getCheckpoint rejected with " name:"unknown" result:{ok:
false, start_time: Mon Dec 18 2017 14:14:03 GMT+0100 (CET), docs_read:
0, docs_written: 0, doc_write_failures: 0, ...} status: 0
The local DB works fine, but whenever I try to replicate/sync to the remote one I get the error above.
I am using
React Native 0.50.0
pouchdb-react-native: 6.3.4
Remote DB is CouchDB 2.1.1

To sync your CouchDB databases, follow these steps:
Step 1:
Serve the remote DB over HTTPS, not HTTP, using CouchDB's native SSL support, as indicated here:
http://docs.couchdb.org/en/1.3.0/ssl.html
Step 2:
Make sure you have CORS enabled, as indicated here: http://docs.couchdb.org/en/1.3.0/cors.html
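For a concrete picture of what the sync call could look like once both steps are in place, here is a rough sketch (the host, port 6984 and credentials are placeholders for your setup; the auth option is simply PouchDB's way of passing credentials outside the URL):
// Sketch only: assumes CouchDB is reachable over HTTPS (6984 is CouchDB's
// conventional TLS port) and that CORS has been enabled on the server.
const PouchDB = require('pouchdb-react-native');

const localDB = new PouchDB('my_db');
const remoteDB = new PouchDB('https://<ip-address>:6984/my_db', {
  auth: { username: '<admin>', password: '<password>' }
});

localDB.sync(remoteDB)
  .then(() => console.log('Sync done'))
  .catch(err => console.log('Sync error', err));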

Related

Frequent timeout with app using Serverless Framework (AWS Lambda/Gateway), Express, Mongoose/MongoDB Atlas

Trigger warning: beginner question.
I built an API using Express and Mongoose with a MongoDB Atlas DB.
Most of the time it works normally, but I often get timeout errors. This seems to happen randomly and affects all routes. Precisely, I get:
`502 Internal server error via POSTMAN`
and in the Serverless Dashboard, I get :
invocation
time invoked 1 day ago, mar 08 at 1:38pm
fatal error Function execution duration going to exceeded configured timeout limit.
cold start
duration 48.9 s
memory used na
request
endpoint /{proxy+}
method POST
status 502
message Internal server error
latency 27 ms
and span & log:
I used this tutorial to wrap my Express app and deploy it with the Serverless Framework: https://dev.to/adnanrahic/a-crash-course-on-serverless-apis-with-express-and-mongodb-193k
Serverless.yml file :
service: serviceName
app: appName
org: orgName
provider:
  name: aws
  runtime: nodejs12.x
  stage: ${env:NODE_ENV}
  region: eu-central-1
  environment:
    NODE_ENV: ${env:NODE_ENV}
    DB: ${env:DB}
functions:
  app:
    handler: server.run
    events:
      - http:
          path: /
          method: ANY
          cors: true
      - http:
          path: /{proxy+}
          method: ANY
          cors: true
plugins:
  - serverless-offline # Used for local testing
  - serverless-dotenv-plugin
server.js file :
const sls = require('serverless-http')
const app = require('./app')
module.exports.run = sls(app)
app.js file :
const express = require('express')
const cors = require('cors')
const bodyParser = require('body-parser')
const newRoutes = require('./routes/file')
const app = express()
app.use(bodyParser.json())
const helmet = require('helmet')
app.use(helmet())
app.options('*', cors())
app.use(cors({ allowedHeaders: 'Content-Type, Authorization' }))
app.use('/new-route', newRoutes)
app.use((error, req, res, next) => {
  console.log(error)
  const status = error.status || 500
  const message = error.message
  res.status(status).json({
    status: status,
    message: message
  })
})
// Handles the database connection:
require('./db')
module.exports = app
and finally db.js file :
const mongoose = require('mongoose')
mongoose
  .connect(
    process.env.DB, {
      useNewUrlParser: true,
      useUnifiedTopology: true
    })
  .then(() => {
    console.log('connected')
  })
  .catch(err => console.log(err))
From what I have read, it is related to cold starts in Lambda and the way API Gateway handles timeouts (!?). I have read the Mongoose documentation on this (https://mongoosejs.com/docs/lambda.html) and other tutorials as well, but I don't know exactly how I should adapt it to my situation.
Thank you for your help
Under your provider, add a timeout. The maximum timeout for a Lambda is 900 seconds; set it according to your execution time, for example 30 seconds, and see what happens:
provider:
  timeout: 30
The error clearly says that the execution exceeded the timeout. Since you had not configured a timeout, it was using the default of 3 seconds. Hopefully this solves the issue.
The issue is likely due to your open database connection. While this connection is open, any calls to the callback won't be returned to the client and your function will time out.
You need to set context.callbackWaitsForEmptyEventLoop to false.
Here is the explanation from the docs:
callbackWaitsForEmptyEventLoop – Set to false to send the response right away when the callback executes, instead of waiting for the Node.js event loop to be empty. If this is false, any outstanding events continue to run during the next invocation.
With serverless-http you can set this option quite easily within your server.js file:
const sls = require('serverless-http')
const app = require('./app')
module.exports.run = sls(app, { callbackWaitsForEmptyEventLoop: false })
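If you also want to apply the guidance from the Mongoose Lambda page the question links to, a rough sketch of its connection-reuse pattern could look like this (the connectToDatabase helper and the two buffering/timeout options are illustrations based on that guide, not part of the original code):
// db.js - sketch: cache the connection so warm Lambda invocations reuse it.
// Assumes process.env.DB holds the MongoDB Atlas connection string.
const mongoose = require('mongoose')

let conn = null // survives between invocations while the container stays warm

module.exports.connectToDatabase = async () => {
  if (conn === null) {
    conn = mongoose.connect(process.env.DB, {
      useNewUrlParser: true,
      useUnifiedTopology: true,
      bufferCommands: false,          // fail fast instead of buffering queries
      serverSelectionTimeoutMS: 5000  // don't wait the default 30 s to error out
    })
  }
  return conn
}
You would then call await connectToDatabase() at the start of your routes or in a middleware, instead of connecting at module load time as db.js currently does.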

socketIO over SSL on Smartphone Browser

I have an Apache webserver with a valid SSL certificate. It runs my web application on it. Let's call it Server A.
Then I have a second server running a Node-Js server with a valid SSL certificate. There also socket.IO runs. And this one we call Server B.
A client requests the web application from server A and gets the desired page displayed. Once the page has been set up on the client, a connection to server B is established via websockets. If another client changes something on the page, the change is applied for all currently connected clients.
Websockets work as desired, as long as the page is accessed from a desktop browser.
If I now open the website on my smartphone (iPhone 7) via Safari or Chrome (over WLAN), no connection to the websocket server (Server B) is established.
I then set up a small websocket example over http, without encryption.
There the websockets do work in the smartphone browser.
I hope I could describe my problem understandably. I am very grateful for hints, examples or similar.
// This script run on my Server
const fs = require('fs');
const server = require('https').createServer({
key: fs.readFileSync('myserver.key', 'utf8'),
cert: fs.readFileSync('myserver.cer', 'utf8'),
passphrase: ''
});
let io = require('socket.io')(server);
server.listen(3003);
io.on('connection', function (socket) {
console.log("User Connected connect " + socket.id);
socket.on('disconnect', function () {
console.log("User has close the browser " + socket.id);
});
socket.on('feedback', function (data) {
io.sockets.emit('feedback', data);
});
});
// On Clientsite
socket = io.connect('wss://adressOfServer:3003', {
// secure: true,
transports: ['websocket'],
upgrade: false,
rejectUnauthorized: false
//Here I have already tried many combinations
});
socket.on('connect_error', function (error) {
// alert(error);
});

pouchdb - secure replication with remote LevelDB

I am keen on using PouchDB in browser memory for an Angular application. This PouchDB will replicate from a remote LevelDB database that is fed key-value pairs from an algorithm. So, on the remote end, I would install PouchDB-Server. On the local end, I would do the following (as described here) on a node prompt.
var localDB = new PouchDB('mylocaldb')
var remoteDB = new PouchDB('https://remote-ip-address:5984/myremotedb')
localDB.sync(remoteDB, {
live: true
}).on('change', function (change) {
// yo, something changed!
}).on('error', function (err) {
// yo, we got an error! (maybe the user went offline?)
});
How do we start a PouchDB instance that supports TLS for live replication as described in the snippet above?
How do I start a PouchDB instance that supports TLS for live replication?
So after some more searching, it is clear from this topic that HTTPS is not supported by PouchDB-Server.
Sorry, I misunderstood your question. I thought you intended to connect to a CouchDB server with PouchDB over HTTPS, so the following answer doesn't actually answer your question.
I created a server.js file like the one below to communicate with my CouchDB over HTTPS. Please note that the SSL certificate is (in my case) self-signed, and that CouchDB listens on port 6984 by default for TLS:
process.env.NODE_TLS_REJECT_UNAUTHORIZED = "0"; // Ignore rejection, because the CouchDB SSL certificate is self-signed
//import PouchDB from 'pouchdb'
const PouchDB = require('pouchdb')
const db = new PouchDB('https://admin:****@192.168.1.106:6984/reproduce')
db.allDocs({
include_docs: true,
attachments: false
}).then(function (result) {
// handle result
console.log(result)
}).catch(function (err) {
console.log(err);
});
I'm running the above file with $ node server.js and I'm getting the expected results:
$ node server.js
{ total_rows: 3,
offset: 0,
rows:
[ { id: '5d6590d3-41c7-4011-be5d-b21f80079ae5',
key: '5d6590d3-41c7-4011-be5d-b21f80079ae5',
value: [Object],
doc: [Object] },
{ id: 'ec6a36d1-952e-4d86-9865-3587c6079fb5',
key: 'ec6a36d1-952e-4d86-9865-3587c6079fb5',
value: [Object],
doc: [Object] },
{ id: 'f508e7aa-b4dc-42fc-96be-b7c1ffa54172',
key: 'f508e7aa-b4dc-42fc-96be-b7c1ffa54172',
value: [Object],
doc: [Object] } ] }
I created the above code with NodeJS on server-side. However, if you want to communicate with CouchDB through HTTPS inside the browser, i.e. on client-side, you have to enable CORS on CouchDB.

Socket.io redis How data stored and cleared

I am hosting an app on Heroku that uses socket.io. It uses sockets and runs on 4 Heroku standard 1X dynos, so I used the Redis To Go service and the socket.io-redis plugin. It's working great, but I want to know whether socket.io-redis also clears the data from the Redis DB when a socket disconnects. The Heroku Redis To Go plan provides only 20 MB of storage. Please explain how socket.io-redis inserts and clears data in the Redis database.
Assuming that you are referring to https://github.com/Automattic/socket.io-redis/blob/master/index.js, it appears that the plugin uses Redis' PubSub functionality. PubSub does not maintain state in the Redis database so there's no need to clear any data.
The session store is responsible for session clean up upon socket disconnection. I use https://github.com/tj/connect-redis for my session store.
Here is an example of cleaning up the socket connection properly upon disconnecting.
const websocket = require('socket.io')(app.get('server'), {
  transports: process.env.transports
})
websocket.setMaxListeners(0)
websocket.adapter(require('socket.io-redis')({
  host: process.env.redis_host,
  port: process.env.redis_port,
  key: 'socket_io',
  db: 2
}))
websocket.use((socket, next) => {
  app.get('session')(socket.request, socket.request.res || {}, next)
})
websocket.on('connection', socket => {
  var sess = socket.request.session
  socket.use((packet, next) => {
    if (!socket.rooms[sess.id]) {
      socket.join(sess.id, () => {
        websocket.of('/').adapter.remoteJoin(socket.id, sess.id, err => {
          delete socket.rooms[socket.id]
          next()
        })
      })
    } else {
      next() // already joined, let the packet through
    }
  })
  socket.on('disconnecting', () => {
    websocket.of('/').adapter.remoteDisconnect(sess.id, true, err => {
      delete socket.rooms[sess.id]
      socket.removeAllListeners()
    })
  })
})

Socket.IO Authentication

I am trying to use Socket.IO in Node.js, and am trying to allow the server to give an identity to each of the Socket.IO clients. As the socket code is outside the scope of the http server code, it doesn't have easy access to the request information sent, so I'm assuming it will need to be sent up during the connection. What is the best way to
1) get the information to the server about who is connecting via Socket.IO
2) authenticate who they say they are (I'm currently using Express, if that makes things any easier)
Use connect-redis and have redis as your session store for all authenticated users. Make sure on authentication you send the key (normally req.sessionID) to the client. Have the client store this key in a cookie.
On socket connect (or anytime later) fetch this key from the cookie and send it back to the server. Fetch the session information in redis using this key. (GET key)
Eg:
Server side (with redis as session store):
req.session.regenerate...
res.send({rediskey: req.sessionID});
Client side:
//store the key in a cookie
SetCookie('rediskey', <%= rediskey %>); //http://msdn.microsoft.com/en-us/library/ms533693(v=vs.85).aspx
//then when socket is connected, fetch the rediskey from the document.cookie and send it back to server
var socket = new io.Socket();
socket.on('connect', function() {
var rediskey = GetCookie('rediskey'); //http://msdn.microsoft.com/en-us/library/ms533693(v=vs.85).aspx
socket.send({rediskey: rediskey});
});
Server side:
//in io.on('connection')
io.on('connection', function(client) {
  client.on('message', function(message) {
    if(message.rediskey) {
      //fetch session info from redis
      redisclient.get(message.rediskey, function(e, c) {
        client.user_logged_in = c.username;
      });
    }
  });
});
I also liked the way pusherapp does private channels.
A unique socket id is generated and sent to the browser by Pusher. This is sent to your application (1) via an AJAX request which authorizes the user to access the channel against your existing authentication system. If successful your application returns an authorization string to the browser signed with your Pusher secret. This is sent to Pusher over the WebSocket, which completes the authorization (2) if the authorization string matches.
socket.io also has a unique socket_id for every socket:
socket.on('connect', function() {
  console.log(socket.transport.sessionid);
});
They used signed authorization strings to authorize users.
I haven't mirrored this in socket.io yet, but I think it could be a pretty interesting concept.
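A rough sketch of that idea on top of socket.io, assuming Node's built-in crypto module and a shared secret (the helper names, route payload and channel format below are made up for illustration):
// Server side: sign the socket id for an authenticated user, mirroring
// Pusher's "authorization string" idea. Sketch only, not a drop-in solution.
const crypto = require('crypto');
const APP_SECRET = 'keep-this-secret-on-the-server';

// e.g. an Express route the page calls via AJAX after the socket connects
function authorizeSocket(req, res) {
  const socketId = req.body.socket_id;  // sent by the client
  const channel = req.body.channel;     // e.g. 'private-user-123'
  // Only sign if your existing authentication system allows this user in.
  const signature = crypto
    .createHmac('sha256', APP_SECRET)
    .update(socketId + ':' + channel)
    .digest('hex');
  res.json({ auth: signature });
}

// Later, when the client sends the signature back over the WebSocket,
// recompute and compare it before letting the socket join the channel.
function isValidAuth(socketId, channel, auth) {
  const expected = crypto
    .createHmac('sha256', APP_SECRET)
    .update(socketId + ':' + channel)
    .digest('hex');
  if (typeof auth !== 'string' || auth.length !== expected.length) return false;
  return crypto.timingSafeEqual(Buffer.from(expected), Buffer.from(auth));
}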
I know this is a bit old, but for future readers, in addition to the approach of parsing the cookie and retrieving the session from the storage (e.g. passport.socketio), you might also consider a token-based approach.
In this example I use JSON Web Tokens which are pretty standard. You have to give to the client page the token, in this example imagine an authentication endpoint that returns JWT:
var jwt = require('jsonwebtoken');
// other requires
app.post('/login', function (req, res) {
  // TODO: validate the actual user
  var profile = {
    first_name: 'John',
    last_name: 'Doe',
    email: 'john@doe.com',
    id: 123
  };
  // we are sending the profile in the token
  var token = jwt.sign(profile, jwtSecret, { expiresInMinutes: 60*5 });
  res.json({token: token});
});
Now, your socket.io server can be configured as follows:
var socketioJwt = require('socketio-jwt');
var sio = socketIo.listen(server);
sio.set('authorization', socketioJwt.authorize({
  secret: jwtSecret,
  handshake: true
}));
sio.sockets
  .on('connection', function (socket) {
    console.log(socket.handshake.decoded_token.email, 'has joined');
    //socket.on('event');
  });
The socketio-jwt middleware expects the token in a query string, so from the client you only have to attach it when connecting:
var socket = io.connect('', {
  query: 'token=' + token
});
I wrote a more detailed explanation about this method and cookies here.
Here is my attempt to have the following working:
express: 4.14
socket.io: 1.5
passport (using sessions): 0.3
redis: 2.6 (Really fast data structure to handle sessions; but you can use others like MongoDB too. However, I encourage you to use this for session data + MongoDB to store other persistent data like Users)
Since you might want to add some API requests as well, we'll also use the http package to have both HTTP and WebSocket traffic served on the same port.
server.js
The following extract only includes everything you need to set the previous technologies up. You can see the complete server.js version which I used in one of my projects here.
import http from 'http';
import express from 'express';
import passport from 'passport';
import Session from 'express-session';
import { createClient as createRedisClient } from 'redis';
import connectRedis from 'connect-redis';
import Socketio from 'socket.io';
// Your own socket handler file, it's optional. Explained below.
import socketConnectionHandler from './sockets';
// Express app and the HTTP server that both the API and socket.io will share.
const app = express();
const server = http.createServer(app);
// Configuration about your Redis session data structure.
const redisClient = createRedisClient();
const RedisStore = connectRedis(Session);
const dbSession = new RedisStore({
  client: redisClient,
  host: 'localhost',
  port: 27017,
  prefix: 'stackoverflow_',
  disableTTL: true
});
// Let's configure Express to use our Redis storage to handle
// sessions as well. You'll probably want Express to handle your
// sessions and share the same storage as your socket.io
// does (i.e. for handling AJAX logins).
const session = Session({
  resave: true,
  saveUninitialized: true,
  key: 'SID', // this will be used for the session cookie identifier
  secret: 'secret key',
  store: dbSession
});
app.use(session);
// Let's initialize passport by using its middlewares, which do
// everything pretty much automatically (you have to configure login
// / register strategies on your own though, see reference 1).
app.use(passport.initialize());
app.use(passport.session());
// Socket.IO
const io = Socketio(server);
io.use((socket, next) => {
  session(socket.handshake, {}, next);
});
io.on('connection', socketConnectionHandler);
// socket.io is ready; remember that socketConnectionHandler is just the
// name that we gave to our own socket.io handler file (explained
// just after this).
// Start server. This will start both socket.io and our optional
// AJAX API on the given port.
const port = 3000; // Move this onto an environment variable,
// it'll look more professional.
server.listen(port);
console.info(`🌐 API listening on port ${port}`);
console.info(`🗲 Socket listening on port ${port}`);
sockets/index.js
Our socketConnectionHandler. I just don't like putting everything inside server.js (even though you perfectly could), especially since this file can end up containing quite a lot of code pretty quickly.
export default function connectionHandler(socket) {
  const userId = socket.handshake.session.passport &&
    socket.handshake.session.passport.user;
  // If the user is not logged in, you might find ^this^
  // socket.handshake.session.passport variable undefined.
  // Give the user a warm welcome.
  console.info(`⚡︎ New connection: ${userId}`);
  socket.emit('Grettings', `Grettings ${userId}`);
  // Handle disconnection.
  socket.on('disconnect', () => {
    if (process.env.NODE_ENV !== 'production') {
      console.info(`⚡︎ Disconnection: ${userId}`);
    }
  });
}
Extra material (client):
Just a very basic version of what the JavaScript socket.io client could be:
import io from 'socket.io-client';
const socketPath = '/socket.io'; // <- Default path, but you could configure
                                 // your server to something like /api/socket.io
const socket = io.connect('localhost:3000', { path: socketPath });
socket.on('connect', () => {
  console.info('Connected');
  socket.on('Grettings', (data) => {
    console.info(`Server gretting: ${data}`);
  });
});
socket.on('connect_error', (error) => {
  console.error(`Connection error: ${error}`);
});
References:
I just couldn't add the references inside the code, so I moved them here.
1: How to set up your Passport strategies: https://scotch.io/tutorials/easy-node-authentication-setup-and-local#handling-signupregistration
This article (http://simplapi.wordpress.com/2012/04/13/php-and-node-js-session-share-redi/) shows how to store the HTTP server's sessions in Redis (using Predis) and how to get these sessions from Redis in node.js via the session id sent in a cookie. Using this code you are able to get them in socket.io, too.
var io = require('socket.io').listen(8081);
var cookie = require('cookie');
var redis = require('redis'), client = redis.createClient();
io.sockets.on('connection', function (socket) {
  var cookies = cookie.parse(socket.handshake.headers['cookie']);
  console.log(cookies.PHPSESSID);
  client.get('sessions/' + cookies.PHPSESSID, function(err, reply) {
    console.log(JSON.parse(reply));
  });
});
Use sessions and Redis between client/server.
Server side:
io.use(function(socket, next) {
  // get the session id from the cookie here and match it against the session data in Redis
  console.log(socket.handshake.headers.cookie);
  next();
});
this should do it
//server side
io.sockets.on('connection', function (con) {
console.log(con.id)
})
//client side
var io = io.connect('http://...')
console.log(io.sessionid)