binding mqtt subscriptions to node, ejs, and alpine states - express

I'm an old Rails and PHP programmer who is new to JS, trying something and feeling like I'm missing something pretty basic. I have a Node server running a small webapp with Express, EJS, and jQuery, as well as Alpine.js (very experienced with jQuery, but learning Alpine.js). I have an MQTT broker running; I can manipulate a couple of topics via my phone, and I can even see them in the Node console (code below).
What I fail to understand is how to get, or maybe "bind", the subscription to a variable that I can "pass" into the Express app so I can just show the MQTT message in the HTML. I'm currently just trying to set things like fridgeFan = true or something. My idea, I think, is to have MQTT update the Alpine.js variable, hence updating the page, and then vice versa...
Can someone point me in the general direction of what I'm attempting?
const options = {
  clientId: "node-admin",
  clean: true,
};

const express = require('express');
const app = express();
const mqtt = require('mqtt');

const client = mqtt.connect('mqtt://192.168.10.66', options);

client.on("connect", function () {
  console.log("connected");
});

client.on('message', function (topic, message, packet) {
  console.log("message is " + message);
  console.log("topic is " + topic);
  // assume I'd like to bind topic and message to
  // some variable that I can pass or have access to
  // in the res.render calls below??
  // couple of things I was trying
  //document.getElementById('mqtt_div').innerHTML += topic;
  //document.querySelector('[x-data]').__x.getUnobservedData().fridgeFan = message;
});

client.subscribe("sensors/sensor1", { qos: 1 });

app.set('view engine', 'ejs');
app.use(express.static(__dirname + '/public'));

app.get('/', (req, res) => {
  res.render('index', {
    title: 'Homepage'
  });
});

app.listen(3000); // port assumed for this snippet
With the above code, I can publish to "sensors/sensor1" via a phone app and see it in the Node console, though of course not in the browser console.
How can I bind the "message" to an Alpine.js variable so it will update?
OR do I go in a different direction and put the client.on calls elsewhere, say in the EJS file? In my testing, the EJS files don't know about require('mqtt').

You can't move your on('message') handler into the EJS files, as they are only "executed" at the point you call render().
You can declare a global variable in the Express app and update fields within it from the on('message', ...) callback; you then pass this variable into the call to render(). But this means the page would only show the value at the time the page was loaded.
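As a rough sketch of that approach (the names latestReadings and fridgeFan are just placeholders, not from the question's code): keep a module-level object updated from the MQTT callback and pass a snapshot of it into render().

// snapshot approach: the page only shows the value it had when it was rendered
const latestReadings = {};

client.on('message', function (topic, message) {
  latestReadings[topic] = message.toString();   // message is a Buffer
});

app.get('/', (req, res) => {
  res.render('index', {
    title: 'Homepage',
    fridgeFan: latestReadings['sensors/sensor1'] || 'unknown'   // snapshot, not live
  });
});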
If you want live updates in the page as messages are published, then you have one real option: use a broker that supports MQTT over WebSockets (nearly all modern brokers support this). With this you can load an MQTT client (either the Paho JavaScript client or the MQTT.js client) into the page and subscribe to the topics you are interested in directly from the page. This way new messages are delivered directly to the page, and you can then use whatever JavaScript framework you want to update the page.
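A minimal sketch of that WebSocket approach, assuming the broker also listens for WebSocket connections on port 9001, and using the MQTT.js browser bundle from a CDN together with an Alpine.js component (the topic, port and CDN URL are assumptions, not something from the question):

<!-- in index.ejs -->
<script src="https://unpkg.com/mqtt/dist/mqtt.min.js"></script>

<div x-data="sensorPanel()" x-init="init()">
  <p>sensors/sensor1: <span x-text="fridgeFan"></span></p>
</div>

<script>
  function sensorPanel() {
    return {
      fridgeFan: 'unknown',
      init() {
        // the browser must connect over WebSockets, not plain mqtt://
        const client = mqtt.connect('ws://192.168.10.66:9001');
        client.on('connect', () => client.subscribe('sensors/sensor1', { qos: 1 }));
        client.on('message', (topic, message) => {
          this.fridgeFan = message.toString();   // Alpine reactivity re-renders the span
        });
      }
    };
  }
</script>

This keeps Express/EJS responsible only for serving the page; the live data flows from the broker to the browser directly.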

Update all clients after a change in vuex state

I am using Vue 2 syntax and Vuex, versions: vue/cli 4.5.13, vue 2.6.14 and vuex 3.6.2.
I have a simple to-do project for adding todos to a list, based on the 2019 Vue tutorial by Traversy.
I have a simple form in my component to add a to-do:
<form @submit.prevent="onSubmit" >
and in my vuex store I have
const state = {
  todos: ''
};

const getters = {
  allTodos: (state) => { return state.todos; }
};

const actions = {
  async addTodo({ commit }, title) {
    const res = await axios.post('https://jsonplaceholder.typicode.com/todos', {
      title,
      completed: false
    });
    commit('newTodo', res.data);
  }
};

const mutations = {
  newTodo: (state, todo) => (
    state.todos.unshift(todo)
  )
};
Is there a way to update all clients that view the todos, without the clients having to refresh anything, as soon as a new todo is added to the state, using only Vuex/Vue?
Thank you
Is there a way to update all clients that view the todos, without the clients having to refresh anything, as soon as a new todo is added to the state, using only Vuex/Vue?
No, it is not possible.
There is no link between all your clients. All your Vue/Vuex code lives in a single client. Here's what you need to do to get where you want to go, and it's a long way from here:
Build a backend server. Here's a Node.js guide
Build an API in your server. Your clients will make requests to this server to get all todos and to post new todos. Here's an express.js guide
You need a database to store your todos on the server. You can use something like MongoDB, or an ORM like Sequelize for Node.js.
Now you can either write code that periodically requests the todos from the server in the background and updates them in your Vue components, or you can use a pub/sub library like Pusher. Pusher uses WebSockets under the hood to maintain a persistent bidirectional connection. If you want to, you can implement this on your own; you can read about it here, thanks to @Aurora for the link to the tutorial. (A rough sketch of the WebSocket broadcast idea follows the guide link below.)
Here's a consolidated guide for doing all this:
https://pusher.com/tutorials/realtime-app-vuejs/
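As a rough sketch of the WebSocket broadcast idea on the server side (this uses the ws package; the port, route and message shape are placeholders, not taken from the Pusher tutorial):

// server.js: broadcast every newly created todo to all connected clients
const express = require('express');
const WebSocket = require('ws');

const app = express();
app.use(express.json());

const server = app.listen(3000);
const wss = new WebSocket.Server({ server });

app.post('/todos', (req, res) => {
  const todo = { title: req.body.title, completed: false };
  // persist the todo in your database here, then notify every connected client
  const payload = JSON.stringify({ type: 'new-todo', todo });
  wss.clients.forEach((client) => {
    if (client.readyState === WebSocket.OPEN) client.send(payload);
  });
  res.status(201).json(todo);
});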
Is there a way to update all clients that view the todos, without the clients having to refresh anything, as soon as a new todo is added to the state, using only Vuex/Vue?
There are a couple of errors in your code:
change todos: '' to todos: []
change state.todos.unshift(todo) to state.todos.push(todo)
This way, every time you call the addTodo action, all components connected to the allTodos getter will show the latest todos.
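Put together, a minimal version of the corrected store could look like this (same shape as the store in the question, with the two fixes applied):

const state = {
  todos: []                    // an array, not a string
};

const mutations = {
  newTodo: (state, todo) => state.todos.push(todo)   // Vue 2 tracks push/unshift on arrays
};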
NOTE:
Vuex/Vue are reactive, so every component that uses the allTodos getter will show the latest update. But that only applies within a single browser tab; if you want every CONNECTED USER to see the change, an HTTP request on its own isn't enough, you need WEBSOCKETS.
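For completeness, a hedged client-side counterpart to the server sketch above (the URL and message shape are assumptions): a plain WebSocket listener that commits each broadcast todo into the store, so every open client updates without a refresh.

// e.g. in main.js, where the store instance is available
const socket = new WebSocket('ws://localhost:3000');

socket.addEventListener('message', (event) => {
  const msg = JSON.parse(event.data);
  if (msg.type === 'new-todo') {
    store.commit('newTodo', msg.todo);   // reactivity updates every component using allTodos
  }
});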

electron.js and sql - correct way to set it up?

I am new to electron.js - been reading the documentation and some similar post here:
How do I make a database call from an Electron front end?
Secure Database Connection in ElectronJS Production App?
Electron require() is not defined
How to use preload.js properly in Electron
But it's still not super clear how to properly implement a secure SQL integration. Basically, I want to create a desktop database client. The app will connect to the remote db and users can run all kind of predefined queries and the results will show up in the app.
The documentation says that if you are working with a remote connection you shouldn't run node in the renderer. Should I then require the SQL module in the main process and use IPC to send data back and forth and preload IPCremote?
Thanks for the help
Short answer: yes
Long answer:
Allowing Node in your renderer poses a big security risk for your app. In this case it is best practice to pass functions to your renderer through the preload script. There are a few options you can use to do this:
Pass an ipcRenderer.invoke function, wrapped in another function, to your renderer in your preload. You can then invoke a call to your main process, which can either send info back via the same function or send it via the window.webContents.send command, with the renderer listening for it on the exposed window API. E.g.:
Preload.js:
const { contextBridge, ipcRenderer } = require("electron");

const invoke = (channel, args, cb = () => { return; }) => {
  ipcRenderer.invoke(channel, args).then((res) => {
    cb(res);
  });
};

const handle = (channel, cb) => {
  ipcRenderer.on(channel, function (event, message) {
    cb(event, message);
  });
};

contextBridge.exposeInMainWorld("GlobalApi", {
  invoke: invoke,
  handle: handle
});
Renderer:
let users;
window.GlobalApi.handle("users", (event, data) => { users = data; });
window.GlobalApi.invoke("get", "users");
or:
let users;
window.GlobalApi.invoke("get", "users", (data) => { users = data; });
Main:
ipcMain.handle("get", async (path) => {
let data = dbFunctions.get(path)
window.webContents.send(
path,
data
);
}
Create a DB interface in your preload script that exposes certain functions to your renderer which, when called, return the value you need from your DB. E.g.:
Renderer:
let users = window.myCoolApi.get("users");
Preload.js:
const { contextBridge } = require("electron");

let get = function (path) {
  let data = dbFunctions.readSomeDatafromDB(path);  // dbFunctions: your own database helper module
  return data; // Returning the function itself is a no-no, shown below
  // return dbFunctions.readSomeDatafromDB(path); Don't do this
};

contextBridge.exposeInMainWorld("myCoolApi", {
  get: get
});
There are more options, but these should generally ensure security as far as my knowledge goes.
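One more variant worth mentioning (my own addition, not from the answer above): ipcMain.handle can simply return a value, and ipcRenderer.invoke returns a promise that resolves with it, which avoids the separate send/handle round trip. A sketch, assuming a dbFunctions helper like the one above:

// preload.js
const { contextBridge, ipcRenderer } = require("electron");
contextBridge.exposeInMainWorld("db", {
  get: (path) => ipcRenderer.invoke("db:get", path)   // returns a Promise to the renderer
});

// main.js
const { ipcMain } = require("electron");
ipcMain.handle("db:get", async (event, path) => {
  return dbFunctions.get(path);   // whatever handle returns resolves the renderer's promise
});

// renderer
window.db.get("users").then((users) => {
  console.log(users);
});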

Vue PWA caching routes in advance

I'm hoping someone can tell me if I'm barking up the wrong tree. I have built a basic web app using Vue CLI and included the PWA support. Everything seems to work fine, I get the install prompt etc.
What I want to do is cache various pages (routes) that the user hasn't visited before, so that they can view them when offline.
The reason here is that I'm planning to build an app for an airline, and part of that app will act as an in-flight magazine, allowing users to read various articles. However, the aircraft do not have wifi, so users need to download the app in the boarding area, and my goal is to then pre-cache, say, the top 10 articles so they can read them during the flight.
Is this possible? And is PWA caching the right way to go about it? Has anyone done this sort of thing before?
Thanks in advance
To "convert" your website to an PWA, you just need few steps.
You need to know that the service worker is not running on the main thread and you cant access for example the DOM inside him.
First create an serviceworker.
For example, go to your root directory of your project and add a javascript file called serviceworker.js this will be your service worker.
Register the service worker.
To register the service worker, you will need to check if its even possible in this browser, and then register him:
if ('serviceWorker' in navigator) {
  window.addEventListener('load', function() {
    navigator.serviceWorker.register('/serviceworker.js').then(function(registration) {
      // Registration was successful
      console.log('ServiceWorker registration successful with scope');
    }, function(err) {
      // registration failed :(
      console.log('ServiceWorker registration failed: ', err);
    });
  });
}
In Vue.js you can put this inside the mounted() or created() hook.
If you run this code, it will say that the service worker was successfully registered, even though we haven't written any code inside serviceworker.js yet.
The fetch handler
Inside serviceworker.js it's good to create a variable, for example CACHE_NAME. This will be the name of the cache where your cached content will be saved.
var CACHE_NAME = "mycache_v1";

self.addEventListener('fetch', function(event) {
  event.respondWith(
    caches.open(CACHE_NAME).then(function(cache) {
      return cache.match(event.request).then(function(response) {
        return response || fetch(event.request).then(function(response) {
          cache.put(event.request, response.clone());
          return response;
        });
      });
    })
  );
});
Every time you make a network request, the request runs through the service worker's fetch handler first. You need to respond with event.respondWith().
The next step is to open your cache called mycache_v1 and look inside for a match with your request.
Remember: cache.match() won't get rejected if there is no match, it just returns undefined; that's why there is a || operator in the return statement.
If there is a match available, return the match out of the cache; if not, fetch() the request from the network.
In the fetch() you save the response inside the cache AND return the response to the user.
This is called a cache-first approach, because you first look inside the cache and only fall back to the network if there is no match.
Actually you could go a step further by adding a catch() to your fetch, like this:
return response || fetch(event.request).then(function(response) {
  cache.put(event.request, response.clone());
  return response;
})
.catch(err => {
  // nothing in the cache and the network request failed: fall back to the pre-cached offline page
  return caches.match("/offline.html");
});
In case there is nothing inside the cache AND the network request fails, you can respond with an offline page.
You might ask yourself: "OK, no cache available and no internet, how is the user supposed to see the offline page? It requires an internet connection too, right?"
For that case you can pre-cache some pages.
First you create an array with the routes that you want to cache:
var PRE_CACHE = ["/offline.html"];
In our case it's just the offline.html page. You can add CSS and JS files as well.
Now you need the install handler:
self.addEventListener('install', function(event) {
  event.waitUntil(
    caches.open(CACHE_NAME)
      .then(function(cache) {
        return cache.addAll(PRE_CACHE);
      })
  );
});
The install event is only called once per service worker version, when it gets registered.
This just means: open your cache and add the routes to it. Now, once you register your SW, offline.html is pre-cached.
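For the in-flight magazine use case from the original question, the same mechanism could pre-cache the app shell plus the data for the top articles. A rough sketch (every path here is a placeholder, assuming the articles are fetched from a JSON endpoint):

var PRE_CACHE = [
  "/offline.html",
  "/index.html",          // the SPA shell
  "/js/app.js",           // the bundled JS/CSS the shell needs
  "/css/app.css",
  "/api/articles/1",      // hypothetical endpoints for the top articles
  "/api/articles/2"
];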
I suggest reading the "Web Fundamentals" guides from the Google folks: https://developers.google.com/web/fundamentals/instant-and-offline/offline-cookbook
There are other strategies, like network-first.
To be honest I don't know exactly how routing works with an SPA, because an SPA is just one index.html file shipped to the client and the routing is handled by JavaScript, so you will need to check which strategy is best for your app.
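For reference, a hedged sketch of the network-first variant mentioned above (same CACHE_NAME as before): try the network first, keep the cache fresh with whatever comes back, and fall back to the cache when the request fails.

self.addEventListener('fetch', function(event) {
  event.respondWith(
    fetch(event.request)
      .then(function(response) {
        // network worked: store a fresh copy, then return it
        return caches.open(CACHE_NAME).then(function(cache) {
          cache.put(event.request, response.clone());
          return response;
        });
      })
      .catch(function() {
        // offline: serve whatever was cached last time
        return caches.match(event.request);
      })
  );
});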

Express routes and middleware with Firebase Cloud Functions

Question
How do I use Express within Firebase Cloud Functions?
Expectations
Using either of the URLs I've setup, I expect to see, "Hello from Express on Firebase!" in the console logs.
Why? My understanding is that "*" means every requested route should hit response.send("Hello from Express on Firebase!");
app.get("*", (_request, response) => {
response.send("Hello from Express on Firebase!");
});
Issue
When I use, https://us-central1-myapp.cloudfunctions.net/helloWorld I get the expected Hello from Firebase! in the logs. Should I also see "Hello from Express on Firebase!"?
When I use, https://us-central1-myapp.cloudfunctions.net/api, I get a 404 error
The URL, https://us-central1-myapp.cloudfunctions.net/api is the issue. See why in the answer below.
Code
// Express
import express = require("express");
const app = express();
const cors = require("cors")({
  origin: "*"
});
app.use("*", cors);

// Firebase Functions SDK
import functions = require("firebase-functions");

app.get("*", (_request, response) => {
  response.send("Hello from Express on Firebase!");
});

exports.api = functions.https.onRequest(app);

exports.helloWorld = functions.https.onRequest((_request, response) => {
  response.send("Hello from Firebase!");
});
tl;dr
An example of what I'm hoping to accomplish is here, but none of the code examples worked for me; I get a 404 error with each one.
The Express documentation here shows a similar Hello World example, but I'm confused about how Firebase takes the place of app.listen(3000, () => console.log('Example app listening on port 3000!')).
Is cors working properly in my example code? Although I get the expected response and log, the Chrome console warns: Cross-Origin Read Blocking (CORB) blocked cross-origin response https://appengine.google.com/_ah/lo....
I have a Slack app that is hitting these URLs (I hit them with Chrome too). Eventually, I'd like to use Botkit middleware in my Google Cloud Functions. I don't yet grasp the proper setup of Express app.use() and app.get().
Answer
I made a simple mistake by treating /api as a function when it's actually a part of the path.
By using this URL with the trailing /
https://us-central1-myapp.cloudfunctions.net/api/
I'm now hitting the Express route and function.
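To make the mapping a little more concrete (a sketch that builds on the code in the question, so the same imports and app apply): functions.https.onRequest(app) takes the place of app.listen(), and everything after /api in the URL becomes the path Express sees, so named routes can sit alongside the catch-all.

// no app.listen() needed; Firebase invokes the app for each request to the api function
app.get("/users", (_request, response) => {
  // reachable at https://us-central1-myapp.cloudfunctions.net/api/users
  response.send("Hello from /users");
});

app.get("*", (_request, response) => {
  // catches https://us-central1-myapp.cloudfunctions.net/api/ and any other path
  response.send("Hello from Express on Firebase!");
});

exports.api = functions.https.onRequest(app);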

Can't figure out Parse Hosting - Cloud Code Integration

I've been working on this seemingly simple problem for about a week now and feel like there is conflicting information, and I'm hoping someone can shed some light on this for me. I'm trying to use Parse Hosting for a marketing site with Bootstrap, just HTML and CSS with a little JS, and Cloud Code to do some simple server-side tasks like charging a card via Stripe. Everything in the documentation makes it seem like this is easily doable, but the documentation also seems to lead me to believe certain methods aren't.
For example, this video shows a Stripe engineer building exactly what I want. However, it's not abundantly clear that he is using pure HTML and CSS for the front end instead of an Express templating engine (which I am not using) - http://blog.parse.com/videos/parse-developer-day-2013-a-new-kind-of-checkout/
This post says Parse Hosting and Express now work hand in hand, GREAT!
http://blog.parse.com/announcements/building-parse-web-apps-with-the-express-web-framework/
But the documentation (JS > Cloud Hosting > Dynamic Websites) says you have to delete index.html >> "If you choose to use Express or Node.js, you'll first need to delete public/index.html so that requests can get through to your custom handler functions."
I want to have a single page website hosted at public/index.html that uses Stripe Checkout v3 to create a token then pass that to Parse for a quick execution of the charge, but again, every which way I try has been unsuccessful so far.
In addition, I'm starting to think Parse Hosting of pure HTML/CSS won't work with Cloud Code the way I want, because a simple call to /hello below returns nothing.
Here's my code:
<!-- public/index.html -->
<form action="/charge" method="POST">
  <script
    src="https://checkout.stripe.com/checkout.js" class="stripe-button"
    data-key="pk_test_zippitydoo"
    data-image="http://image.jpg"
    data-name="Thing"
    data-description="Shut up and take my money"
    data-amount="4000">
  </script>
</form>
// cloud/main.js
var express = require('express');
var app = express();
var Stripe = require('stripe');
Stripe.initialize('sk_test_blahblahblah');

app.get('/hello', function(req, res) {
  res.send('hello world');
});

app.post('/charge', function(req, res) {
  res.send('Charge Attempt');
  token_id = req.body.stripe_token;
  Stripe.Tokens.retrieve(token_id).then(function(token) {
    return Stripe.Charges.create({
      amount: 1000,
      currency: "usd",
      source: token_id
    });
  });
});
What you need is for express to serve your HTML. To do this, register a static resources directory. In your main.js, after you instantiate your app with var app = express(), do this:
app.use(express.static('public'));
Express should then treat your /public/index.html file as the directory index by default, and your app will serve any other files under /public. More info: http://expressjs.com/4x/api.html#express.static
There are several things I did wrong here. I'll explain my mistakes, then you can compare the code below that works with the code in the question above that doesn't.
1) I wasn't parsing the data I was receiving (see underneath the // App configuration section)
2) The JSON that is passed needs to be read using camelCase (stripeToken, not stripe_token)
3) The charge is assigned to a variable rather than returned (var charge = ... instead of return ...). Return may work; I didn't test it, however.
4) It is imperative that you include app.listen(); in order to connect the public folder to the cloud folder
// cloud/main.js
var express = require('express');
var Stripe = require('stripe');
Stripe.initialize('sk_test_blahblahblah');

var app = express();

// App configuration section
app.use(express.bodyParser()); // Middleware for reading the request body

app.post('/charge', function(req, res) {
  var stripeToken = req.body.stripeToken;
  var stripeEmail = req.body.stripeEmail;
  res.send('Charging your card...');
  var charge = Stripe.Charges.create({
    amount: price, // price: the amount in cents, defined elsewhere in your app
    currency: "usd",
    source: stripeToken,
    receipt_email: stripeEmail
  }, function(err, charge) {
    if (err && err.type === 'StripeCardError') {
      res.send('The card has been declined. Please check your card and try again.');
    }
  });
});
// Attach the Express app to your Cloud Code
app.listen();
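One side note that only matters outside the legacy Parse Cloud Code environment: express.bodyParser() was removed in later Express versions, so on Express 4.16+ the equivalent configuration for a form post like Stripe Checkout's would be:

// modern replacement for the bodyParser middleware used above
app.use(express.urlencoded({ extended: false }));   // parses application/x-www-form-urlencoded bodies
app.use(express.json());                            // parses JSON bodies, if needed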