Auth info required to subscribe to private-App.User.undefined using Laravel, Vue and Pusher

I am trying to send a notification that broadcasts on a private channel using Pusher with a Laravel API. Everything is fine while sending the notification, and the Pusher server receives the event. However, when I try to listen to those events, nothing displays in the console. The error logs in Pusher show the auth error from the title: Auth info required to subscribe to private-App.User.undefined.
Here is my main.js code that listens for the broadcast:
let userId = document.head.querySelector('meta[name="user-id"]').content;
console.log(userId)
window.Echo.private('App.User.' + userId)
    .notification((notification) => {
        console.log(notification.type);
    });
This is my bootstrap.js code:
import Echo from 'laravel-echo';
window.Pusher = require('pusher-js');
window.Echo = new Echo({
    broadcaster: 'pusher',
    key: process.env.MIX_PUSHER_APP_KEY,
    cluster: process.env.MIX_PUSHER_APP_CLUSTER,
    forceTLS: true,
});
These are my meta tags from welcome.blade.php:
<meta name="csrf-token" content="{{ csrf_token() }}">
<meta name="user-id" content="{{Auth::check() ? Auth::user()->id:''}}">
This is my channels.php file code
Broadcast::channel('App.User.{id}', function ($user, $id) {
    return (int) $user->id === (int) $id;
}, ['guards' => ['api']]);
Finally the boot function from BroadcastServiceProvider.php
public function boot()
{
    Broadcast::routes(['middleware' => ['auth:api']]);
    require base_path('routes/channels.php');
}
Any help would be highly appreciated. Thanks in advance.

I had the same problem, but using Django and JavaScript. My suggestions are:
make sure the URL that Pusher uses for private channel authentication is actually working (see the logging snippet below)
check that "Enable client events" under "App settings" is ON

I'm grappling with the same issue. From what I've read, if you're using the api guard, you need to add authentication headers to the Echo instance in your bootstrap.js file, like so:
window.Echo = new Echo({
    broadcaster: 'pusher',
    key: process.env.MIX_PUSHER_APP_KEY,
    cluster: process.env.MIX_PUSHER_APP_CLUSTER,
    forceTLS: true,
    auth: {
        headers: {
            Accept: 'application/json',
            Authorization: 'Bearer ', // append the user's API access token here
        },
    },
});
Try this out.
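For what it's worth, here is a sketch of filling in that Bearer value dynamically. Assumptions: the API token is kept in localStorage under access_token after login, and the default /broadcasting/auth endpoint is used (set authEndpoint if yours lives elsewhere).
window.Echo = new Echo({
    broadcaster: 'pusher',
    key: process.env.MIX_PUSHER_APP_KEY,
    cluster: process.env.MIX_PUSHER_APP_CLUSTER,
    forceTLS: true,
    auth: {
        headers: {
            Accept: 'application/json',
            // Send the logged-in user's API token with every auth request.
            Authorization: 'Bearer ' + localStorage.getItem('access_token'),
        },
    },
});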

Cloudflare R2: how to upload images?

I'm new to the Cloudflare world. I can upload pictures by dragging them into the dashboard, but how do I upload an image programmatically, from an application? Do I have to use those "Workers" things?
I have uploaded files to R2 successfully with rclone.
Configure rclone
First, install rclone on your machine, then create an rclone.conf file under the path ~/.config/rclone/.
[r2]
type = s3
provider = Cloudflare
access_key_id = <ACCESS_KEY>
secret_access_key = <SECRET_ACCESS_KEY>
region = auto
endpoint = https://<ACCOUNT_ID>.r2.cloudflarestorage.com
acl = private
[r2]: a custom name (alias) for the storage service; you use it to refer to the remote when operating on files.
type = s3: the type of file-operation API. R2 supports the standard S3 protocol.
provider = Cloudflare: the storage provider ID. You can use man rclone to see the supported providers.
access_key_id: you need to create a token with edit permission in the R2 console.
secret_access_key: same as above.
endpoint: the URL that rclone uses to operate on files. You can find the account ID at the top right of the R2 homepage.
Usage
Run rclone lsf r2: to list your buckets and rclone lsf r2:my-bucket to show the files within a bucket.
Pay special attention to the trailing colon in r2:.
Upload a file:
rclone copy /path/to/file r2:my-bucket
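Since R2 speaks the S3 protocol, you can also upload from code without rclone. A minimal Node.js sketch using the AWS SDK v3 (assumes @aws-sdk/client-s3 is installed and reuses the same access key, secret, and account ID placeholders as the rclone config above):
const { S3Client, PutObjectCommand } = require('@aws-sdk/client-s3');
const fs = require('fs');

const s3 = new S3Client({
    region: 'auto',
    endpoint: 'https://<ACCOUNT_ID>.r2.cloudflarestorage.com',
    credentials: {
        accessKeyId: '<ACCESS_KEY>',
        secretAccessKey: '<SECRET_ACCESS_KEY>',
    },
});

async function uploadImage(localPath, key) {
    // Push the file bytes into the bucket under the given key.
    await s3.send(new PutObjectCommand({
        Bucket: 'my-bucket',
        Key: key,
        Body: fs.readFileSync(localPath),
        ContentType: 'image/png',
    }));
}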
Hey, I got stuck on this for two days and was not able to figure it out, so I want to share my solution.
My Goal
I was developing software for collecting data from our end users. They needed to submit an image so we could verify them, which is why I needed a form that lets users upload a file and an API endpoint that stores it.
Solution
I set up a Cloudflare Worker as an API, since it connects well with R2; you just have to bind the bucket in your Worker settings.
My Cloudflare Worker script and an example form that lets users upload files:
// CLOUDFLARE WORKER SCRIPT
// ------------------------
export default {
    async fetch(request, env) {
        const corsHeaders = {
            'Access-Control-Allow-Origin': '*',
            'Access-Control-Allow-Methods': 'GET, HEAD, POST, OPTIONS',
            'Access-Control-Max-Age': '86400',
        };
        // Check for preflight request from the browser.
        if (request.method === 'OPTIONS') {
            return new Response(null, {
                headers: {
                    ...corsHeaders,
                    'Access-Control-Allow-Headers': request.headers.get(
                        'Access-Control-Request-Headers'
                    ),
                }
            });
        } else {
            // Handle actual request and store the image in the bucket.
            // MY_BUCKET is the R2 bucket binding configured in the Worker settings.
            const { headers } = request;
            // Key is the current timestamp since we want keys to be unique.
            const key = Date.now().toString();
            await env.MY_BUCKET.put(key, request.body, {
                httpMetadata: {
                    contentType: headers.get('content-type')
                }
            });
            return new Response('success!', {
                headers: {
                    ...corsHeaders,
                    'Access-Control-Allow-Headers': request.headers.get(
                        'Access-Control-Request-Headers'
                    ),
                }
            });
        }
    }
}
<!DOCTYPE html>
<html>
<head>
    <title>Upload Images with Cloudflare R2</title>
</head>
<body>
    <form method="POST" enctype="multipart/form-data">
        <label for="image">Select image to upload:</label>
        <input type="file" name="image" id="image" /><br /><br />
        <input type="submit" value="Upload" />
    </form>
    <script>
        async function uploadImage(file) {
            // Access-Control-Allow-Origin is a response header set by the Worker,
            // so it does not need to be sent with the request.
            fetch('https://<YOUR-OWN-WORKER>.workers.dev', {
                method: 'POST',
                headers: {
                    'Content-Type': file.type
                },
                body: file
            })
                .then((response) => response.text())
                .then((data) => console.log(data))
                .catch((error) => console.error(error));
        }
        const image = document.getElementById('image');
        const onSelectFile = () => uploadImage(image.files[0]);
        image.addEventListener('change', onSelectFile, false);
    </script>
</body>
</html>

No Host in request URL for Grafana datasource plugin tutorial - Add authentication

I'm trying to follow the example for developing a datasource plugin from Grafana. Ultimately I want my plugin to use Oauth, but even with just the basic Grafana datasource proxy example I seem to be having issues.
I have updated my plugin.json, class and constructor.
I have set up this hard-coded example.
In plugin.json:
"routes": [
    {
        "path": "grafana",
        "url": "https://github.com"
    }
],
And a sample testDatasource():
async testDatasource() {
    return getBackendSrv()
        .datasourceRequest({
            url: this.url + '/grafana/grafana',
            method: 'GET',
        })
        .then(response => {
            if (response.status === 200) {
                return { status: 'success', message: 'Data source is working', title: 'Success' };
            } else {
                return { status: 'failure', message: 'Data source is not working: ' + response.status, title: 'Failure' };
            }
        });
}
When I try to save/test this datasource, which calls that method, the frontend shows
HTTP Error Bad Gateway
And in the logs
t=2021-09-17T14:31:22+0000 lvl=eror msg="Data proxy error" logger=data-proxy-log userId=1 orgId=1 uname=admin path=/api/datasources/proxy/9/grafana/grafana remote_addr=172.17.0.1 referer=http://localhost:3000/datasources/edit/9/ error="http: proxy error: http: no Host in request URL"
I would've expected the request to be routed to the datasource proxy, and for that to make the request to GitHub, but it seems Grafana is making a request to /api/datasources/proxy/9/grafana/grafana and nothing is picking it up?
Looking up my datasource via the API, there's nothing listed for the URL.
You will need to render this in your ConfigEditor.tsx
<DataSourceHttpSettings
    defaultUrl="http://localhost:8080"
    dataSourceConfig={options}
    onChange={onOptionsChange}
/>
Which will give you the basic form with URL, whitelist, and auth options that you see on most plugins. The URL there, I guess, should match what you have in your routes.
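For context, a minimal ConfigEditor that renders just this component might look like the sketch below (plain JSX rather than the tutorial's TypeScript; DataSourceHttpSettings comes from @grafana/ui, and the prop names are taken from the snippet above):
import React from 'react';
import { DataSourceHttpSettings } from '@grafana/ui';

export function ConfigEditor({ options, onOptionsChange }) {
    // Renders the standard URL / whitelist / auth form and writes the values
    // back into the datasource options.
    return (
        <DataSourceHttpSettings
            defaultUrl="http://localhost:8080"
            dataSourceConfig={options}
            onChange={onOptionsChange}
        />
    );
}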

How do I call third-party API data via Fastify?

I have a small Node server and I use the Fastify framework.
In one of my routes, I want to get the data from a third-party API.
I tried the following snippet:
fastify.route({
    method: 'GET',
    url: 'https://demo.api.com/api/v2/project/',
    handler: async function ({ params, body }, reply) {
        if (!body) return reply.send({ success: false })
        console.log('testing')
        console.log(body)
        return reply.send({ success: true })
    }
})
Unfortunately, I cannot register that URL as a GET route, because Fastify route URLs can only start with '/'.
How do I call a third-party API via Fastify? Do I need an extension?
If you need to define a route (like http://localhost:3000/) that proxies another server, you need to use fastify-http-proxy.
Or if you need to call another endpoint and manage the response, there is the fastify.inject() utility, but it is designed for testing.
Anyway, I think the best approach is to use an HTTP client like got:
const got = require('got') // npm install got

fastify.get('/my-endpoint', async function (request, reply) {
    const response = await got('https://sindresorhus.com')
    console.log(response.body)
    // DO SOMETHING WITH BODY
    return { success: true }
})
To proxy your HTTP requests to another server (with Fastify hooks support), here is the example from fastify-http-proxy:
server.register(require('fastify-http-proxy'), {
    upstream: 'http://my-api.example.com',
    prefix: '/api', // optional
    http2: false // optional
})
https://github.com/fastify/fastify-http-proxy/blob/master/example.js
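Putting those pieces together with the route from the question, a sketch might look like this (the demo.api.com URL is the one from the question, got is assumed to be installed, and the JSON parsing is done by hand to stay version-agnostic):
const fastify = require('fastify')()
const got = require('got') // npm install got

fastify.get('/project', async function (request, reply) {
    // Call the third-party API server-side and forward its JSON body.
    const response = await got('https://demo.api.com/api/v2/project/')
    return { success: true, data: JSON.parse(response.body) }
})

fastify.listen(3000)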

mootools Request class and CORS

I'm trying to use CORS to have a script do an Ajax request to geonames.
My script calls this web service method: http://www.geonames.org/export/web-services.html#findNearby
If you check the response headers of the sample call, they include:
Access-Control-Allow-Origin: *
When I try this with MooTools (version 1.4.5, just downloaded):
var urlGeonames = "http://api.geonames.org/findNearbyPlaceName";
var req = new Request({
    method: 'get',
    url: urlGeonames,
    data: {
        'lat': '89.18',
        'lng': '-0.37',
        'username': 'myusername',
        'radius': '5'
    }
}).send();
then I get an error that says:
XMLHttpRequest cannot load
http://api.geonames.org/findNearbyPlaceName?lat=89.18&lng=-0.37&username=myusername&radius=5.
Origin http://127.0.0.1 is not allowed by Access-Control-Allow-Origin.
On the other hand, when I try old style Ajax code like this:
invocation = new XMLHttpRequest();
if(invocation)
{
    invocation.open('GET', urlFlickr, true);
    invocation.onreadystatechange = handler;
    invocation.send();
}
then it works and I get the XML response in the XHR responseXML.
I found this post: A CORS POST request works from plain javascript, but why not with jQuery?, which is similar. But here I'm not dealing with my own server, so I can only work on the JavaScript side.
Has anyone worked with CORS and MooTools who can help with this issue?
Thanks so much,
JM
Hey, check out MooTools More's Request.JSONP; this will solve your problem:
http://mootools.net/docs/more/Request/Request.JSONP
Also, it looks like you're forgetting to ask geonames.org for the JSON format.
Try something like:
var myJSONP = new Request.JSONP({
    url: 'http://api.geonames.org/findNearbyPlaceNameJSON',
    data: {
        'lat': '89.18',
        'lng': '-0.37',
        'username': 'myusername'
    },
    onRequest: function(url){
        // a script tag is created with a src attribute equal to url
    },
    onComplete: function(data){
        // the request was completed.
        console.log(data);
    }
}).send();
Hope this helps!
The first answer on this other thread might help:
MooTools CORS request vs native Javascript
Basically, the X-Requested-With header is automatically sent by MooTools with the request, but the server either has to be configured to accept that header, or you can remove it using
delete foo.headers['X-Requested-With'];
Before calling
foo.send();
To allow it on the server, you can add this to the .htaccess file of the script that returns the JSON data:
Header set Access-Control-Allow-Origin "*"
Header set Access-Control-Allow-Headers "Origin, X-Requested-With, Content-Type, Accept"
So yours would look like:
var myJSON = new Request({
    url: 'http://api.geonames.org/findNearbyPlaceNameJSON',
    data: {
        'lat': '89.18',
        'lng': '-0.37',
        'username': 'myusername'
    },
    onRequest: function(url){
        // fired when the request is sent
    },
    onComplete: function(data){
        // the request was completed.
        console.log(data);
    }
});
delete myJSON.headers['X-Requested-With'];
myJSON.send();

Socket.IO Authentication

I am trying to use Socket.IO in Node.js, and am trying to allow the server to give an identity to each of the Socket.IO clients. As the socket code is outside the scope of the http server code, it doesn't have easy access to the request information sent, so I'm assuming it will need to be sent up during the connection. What is the best way to
1) get the information to the server about who is connecting via Socket.IO
2) authenticate who they say they are (I'm currently using Express, if that makes things any easier)
Use connect-redis and have Redis as your session store for all authenticated users. Make sure that on authentication you send the key (normally req.sessionID) to the client, and have the client store this key in a cookie.
On socket connect (or any time later), fetch this key from the cookie and send it back to the server, then fetch the session information from Redis using this key (GET key).
Eg:
Server side (with redis as session store):
req.session.regenerate...
res.send({rediskey: req.sessionID});
Client side:
//store the key in a cookie
SetCookie('rediskey', <%= rediskey %>); //http://msdn.microsoft.com/en-us/library/ms533693(v=vs.85).aspx
//then when socket is connected, fetch the rediskey from the document.cookie and send it back to server
var socket = new io.Socket();
socket.on('connect', function() {
    var rediskey = GetCookie('rediskey'); //http://msdn.microsoft.com/en-us/library/ms533693(v=vs.85).aspx
    socket.send({rediskey: rediskey});
});
Server side:
//in io.on('connection')
io.on('connection', function(client) {
    client.on('message', function(message) {
        if(message.rediskey) {
            //fetch session info from redis (stored as a JSON string)
            redisclient.get(message.rediskey, function(e, c) {
                var session = JSON.parse(c);
                // assuming you stored username on the session at login
                client.user_logged_in = session.username;
            });
        }
    });
});
I also liked the way Pusher does private channels:
A unique socket id is generated and sent to the browser by Pusher. This is sent to your application (1) via an AJAX request which authorizes the user to access the channel against your existing authentication system. If successful, your application returns an authorization string to the browser signed with your Pusher secret. This is sent to Pusher over the WebSocket, which completes the authorization (2) if the authorization string matches.
Socket.io also has a unique socket_id for every socket:
socket.on('connect', function() {
    console.log(socket.transport.sessionid);
});
They used signed authorization strings to authorize users.
I haven't yet mirrored this to socket.io, but I think it could be a pretty interesting concept; a rough sketch of the idea follows.
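A minimal sketch of that signed-authorization-string idea on top of Express and socket.io; everything here (route path, secret, session fields) is illustrative, not an existing API:
var crypto = require('crypto');
var SECRET = 'your-app-secret';

// 1) The browser POSTs its socket id here after connecting. If the session is
//    authenticated, we return an HMAC signature over that id.
app.post('/socket/auth', function (req, res) {
    if (!req.session.user) return res.status(401).end();
    var socketId = req.body.socket_id;
    var signature = crypto.createHmac('sha256', SECRET)
        .update(socketId + ':' + req.session.user.id)
        .digest('hex');
    res.json({ auth: signature, userId: req.session.user.id });
});

// 2) The client sends {socket_id, userId, auth} over the socket; the server
//    recomputes the HMAC with the same inputs and compares before trusting it.
function verifySocketAuth(socketId, userId, auth) {
    var expected = crypto.createHmac('sha256', SECRET)
        .update(socketId + ':' + userId)
        .digest('hex');
    return expected === auth;
}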
I know this is a bit old, but for future readers: in addition to the approach of parsing the cookie and retrieving the session from storage (e.g. passport.socketio), you might also consider a token-based approach.
In this example I use JSON Web Tokens, which are pretty standard. You have to give the token to the client page; imagine an authentication endpoint that returns a JWT:
var jwt = require('jsonwebtoken');
// other requires

app.post('/login', function (req, res) {
    // TODO: validate the actual user
    var profile = {
        first_name: 'John',
        last_name: 'Doe',
        email: 'john@doe.com',
        id: 123
    };
    // we are sending the profile in the token
    var token = jwt.sign(profile, jwtSecret, { expiresInMinutes: 60*5 });
    res.json({ token: token });
});
Now, your socket.io server can be configured as follows:
var socketioJwt = require('socketio-jwt');
var sio = socketIo.listen(server);

sio.set('authorization', socketioJwt.authorize({
    secret: jwtSecret,
    handshake: true
}));

sio.sockets.on('connection', function (socket) {
    console.log(socket.handshake.decoded_token.email, 'has joined');
    //socket.on('event');
});
The socketio-jwt middleware expects the token in a query string, so from the client you only have to attach it when connecting:
var socket = io.connect('', {
query: 'token=' + token
});
I wrote a more detailed explanation about this method and cookies here.
Here is my attempt to have the following working:
express: 4.14
socket.io: 1.5
passport (using sessions): 0.3
redis: 2.6 (a really fast data store to handle sessions; you can use others like MongoDB too, but I encourage you to use Redis for session data and MongoDB for other persistent data like users)
Since you might want to add some API requests as well, we'll also use the http package to have both HTTP and WebSocket working on the same port.
server.js
The following extract only includes everything you need to set the previous technologies up. You can see the complete server.js version which I used in one of my projects here.
import http from 'http';
import express from 'express';
import Session from 'express-session';
import passport from 'passport';
import { createClient as createRedisClient } from 'redis';
import connectRedis from 'connect-redis';
import Socketio from 'socket.io';

// Your own socket handler file, it's optional. Explained below.
import socketConnectionHandler from './sockets';

// Express app and the shared HTTP server that socket.io attaches to.
const app = express();
const server = http.createServer(app);

// Configuration about your Redis session data structure.
const redisClient = createRedisClient();
const RedisStore = connectRedis(Session);
const dbSession = new RedisStore({
  client: redisClient,
  host: 'localhost',
  port: 27017,
  prefix: 'stackoverflow_',
  disableTTL: true
});

// Let's configure Express to use our Redis storage to handle
// sessions as well. You'll probably want Express to handle your
// sessions and share the same storage as your socket.io does
// (i.e. for handling AJAX logins).
const session = Session({
  resave: true,
  saveUninitialized: true,
  key: 'SID', // this will be used for the session cookie identifier
  secret: 'secret key',
  store: dbSession
});
app.use(session);

// Let's initialize passport by using their middlewares, which do
// everything pretty much automatically. (You have to configure login
// / register strategies on your own though; see reference 1.)
app.use(passport.initialize());
app.use(passport.session());

// Socket.IO
const io = Socketio(server);
io.use((socket, next) => {
  session(socket.handshake, {}, next);
});
io.on('connection', socketConnectionHandler);
// socket.io is ready; remember that ^this^ variable is just the
// name that we gave to our own socket.io handler file (explained
// just after this).

// Start server. This will start both socket.io and our optional
// AJAX API in the given port.
const port = 3000; // Move this onto an environment variable,
                   // it'll look more professional.
server.listen(port);
console.info(`🌐 API listening on port ${port}`);
console.info(`🗲 Socket listening on port ${port}`);
sockets/index.js
Our socketConnectionHandler; I just don't like putting everything inside server.js (even though you perfectly could), especially since this file can end up containing quite a lot of code pretty quickly.
export default function connectionHandler(socket) {
  const userId = socket.handshake.session.passport &&
    socket.handshake.session.passport.user;
  // If the user is not logged in, you might find ^this^
  // socket.handshake.session.passport variable undefined.

  // Give the user a warm welcome.
  console.info(`⚡︎ New connection: ${userId}`);
  socket.emit('Grettings', `Grettings ${userId}`);

  // Handle disconnection.
  socket.on('disconnect', () => {
    if (process.env.NODE_ENV !== 'production') {
      console.info(`⚡︎ Disconnection: ${userId}`);
    }
  });
}
Extra material (client):
Just a very basic version of what the JavaScript socket.io client could be:
import io from 'socket.io-client';

const socketPath = '/socket.io'; // <- Default path.
                                 // But you could configure your server
                                 // to something like /api/socket.io
const socket = io.connect('localhost:3000', { path: socketPath });

socket.on('connect', () => {
  console.info('Connected');
  socket.on('Grettings', (data) => {
    console.info(`Server gretting: ${data}`);
  });
});

socket.on('connect_error', (error) => {
  console.error(`Connection error: ${error}`);
});
References:
I just couldn't reference inside the code, so I moved it here.
1: How to set up your Passport strategies: https://scotch.io/tutorials/easy-node-authentication-setup-and-local#handling-signupregistration
This article (http://simplapi.wordpress.com/2012/04/13/php-and-node-js-session-share-redi/) shows how to:
store sessions of the HTTP server in Redis (using Predis)
get these sessions from Redis in Node.js using the session id sent in a cookie
Using this code, you are able to get them in socket.io, too.
var io = require('socket.io').listen(8081);
var cookie = require('cookie');
var redis = require('redis'), client = redis.createClient();

io.sockets.on('connection', function (socket) {
    var cookies = cookie.parse(socket.handshake.headers['cookie']);
    console.log(cookies.PHPSESSID);
    client.get('sessions/' + cookies.PHPSESSID, function(err, reply) {
        console.log(JSON.parse(reply));
    });
});
Use the session and Redis between client and server.
Server side:
io.use(function(socket, next) {
    // Get the session id from the cookie here and match it against
    // the session data stored in Redis.
    console.log(socket.handshake.headers.cookie);
    next();
});
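A fuller sketch of that middleware, under a few assumptions: the cookie package is installed, a Redis client already exists, the session cookie is express-session's default connect.sid, and sessions are stored under connect-redis's default sess: prefix. Note that express-session signs its cookie as s:<id>.<sig>, so you may need cookie-signature to unsign it before the lookup.
var cookie = require('cookie');
var redis = require('redis');
var redisClient = redis.createClient();

io.use(function (socket, next) {
    var cookies = cookie.parse(socket.handshake.headers.cookie || '');
    var sessionId = cookies['connect.sid']; // assumed session cookie name
    if (!sessionId) return next(new Error('not authenticated'));
    redisClient.get('sess:' + sessionId, function (err, sessionData) {
        if (err || !sessionData) return next(new Error('not authenticated'));
        // Attach the parsed session so later handlers can read the user.
        socket.session = JSON.parse(sessionData);
        next();
    });
});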
This should do it:
//server side
io.sockets.on('connection', function (con) {
    console.log(con.id)
})
//client side
var io = io.connect('http://...')
console.log(io.sessionid)