Resolving audio broadcasting error: client join failed DYNAMIC_KEY_EXPIRED (Agora.io)

I'm a server side developer with rudimentary JS knowhow. I'm tinkering with Agora's audio broadcasting functionality (specifically for the web). For reference, I've been following this: https://docs.agora.io/en/Audio%20Broadcast/start_live_audio_web?platform=Web
I'm attempting to broadcast audio as a host. I have an HTML button which fires a JS function where I:
Initialize a client
Set the role
Join a predefined channel
Publish a local stream
My understanding is that accomplishing the above will enable me to broadcast audio. When I try this, I end up getting a client join failed DYNAMIC_KEY_EXPIRED error. I'm unable to find documentation on how to resolve this. Can you help me resolve it? An illustrative example would be nice.
My JS code is below. Note that I'm using a temp token to test this functionality on localhost.
// rtc object
var rtc = {
    client: null,
    joined: false,
    published: false,
    localStream: null,
    remoteStreams: [],
    params: {}
};
// Options for joining a channel
var option = {
    appID: "anAppID",     // from 'Project Management' dashboard
    channel: "AudioLive",
    uid: null,            // The user ID should be unique in a channel. If you set the user ID as null or 0, the Agora server assigns a user ID and returns it in the onSuccess callback.
    token: "aTempToken"   // TEMP token
};
function createBroadcast(role) {
    console.log("entered createBroadcast");
    // Create a client
    rtc.client = AgoraRTC.createClient({mode: "live", codec: "h264"});
    // Initialize the client
    rtc.client.init(option.appID, function () {
        console.log("init success");
        // Note: in a live broadcast, only the host can be heard and seen. You can also call setClientRole() to change the user role after joining a channel.
        rtc.client.setClientRole(role);
        console.log("role is set");
        // Call Client.join in the onSuccess callback of Client.init
        rtc.client.join(option.token ? option.token : null, option.channel, option.uid ? +option.uid : null, function (uid) {
            console.log("join channel: " + option.channel + " success, uid: " + uid);
            rtc.params.uid = uid;
            // Call AgoraRTC.createStream to create a stream in the onSuccess callback of Client.join
            rtc.localStream = AgoraRTC.createStream({
                streamID: rtc.params.uid,
                audio: true,
                video: false,
                screen: false
            });
            // Call Stream.init to initialize the stream after 'creating' the stream above
            // Initialize the local stream
            rtc.localStream.init(function () {
                console.log("init local stream success");
                // play stream with html element id "local_stream"
                rtc.localStream.play("local_stream");
                // Call Client.publish in the onSuccess callback of Stream.init to publish the local stream
                // Publish the local stream
                rtc.client.publish(rtc.localStream, function (err) {
                    console.log("publish failed");
                    console.error(err);
                });
            }, function (err) {
                console.error("init local stream failed ", err);
            });
        }, function (err) {
            console.error("client join failed", err);
        });
    }, (err) => {
        console.error(err);
    });
}
<div style="background:#f0f3f4;padding:20px">
    <button id="broadcast" style="height:40px;width:200px" onclick="createBroadcast('host')">Start Live Broadcast</button>
</div>
I've not added the actual values for appID and token in the code above.
Note: Please ask for more information in case you require it.

The error you are facing is caused by the expiry of the token that was generated (alongside the App ID) for authentication purposes. To resolve it you will have to generate a new token, as described in the links below:
Token-expired
renewToken
A token (or a temporary token) expires after a certain period of time. When the SDK notifies the client that the token is about to expire or has already expired through the onTokenPrivilegeWillExpire or onTokenPrivilegeDidExpire callbacks, you need to generate a new token and call the renewToken method.
client.on("onTokenPrivilegeWillExpire", function(){
//After requesting a new token
client.renewToken(token);
});
client.on("onTokenPrivilegeDidExpire", function(){
//After requesting a new token
client.renewToken(token);
});
Include the above handlers in your JavaScript code along with the rest of your event listeners.
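For example, a minimal sketch of what the "after requesting a new token" part could look like with the rtc object from your question. The /rtc-token endpoint here is a hypothetical route on your own server that returns JSON like {"token": "..."}; it is not part of the Agora SDK:
// Hypothetical helper: ask your own token server for a fresh token.
function fetchNewToken(channel) {
    return fetch("/rtc-token?channel=" + encodeURIComponent(channel))
        .then(function (response) { return response.json(); })
        .then(function (body) { return body.token; });
}
rtc.client.on("onTokenPrivilegeWillExpire", function () {
    fetchNewToken(option.channel).then(function (newToken) {
        rtc.client.renewToken(newToken);
    });
});
rtc.client.on("onTokenPrivilegeDidExpire", function () {
    fetchNewToken(option.channel).then(function (newToken) {
        rtc.client.renewToken(newToken);
    });
});
Note that renewToken still needs a newly generated token string; a temporary token from the console has to be regenerated manually, so for anything beyond quick localhost tests you will typically run a token server that builds tokens with your App Certificate.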
In case your application doesn't require security, you can opt not to use a token and instead generate an App ID without a certificate.
App ID without certificate
Do get back for further support in case the issue remains unresolved.

Related

getaddrinfo ENOTFOUND API Google Cloud

I'm trying to execute the API.AI tutorial for building a weather bot for Google Assistant (the one here: https://dialogflow.com/docs/getting-started/basic-fulfillment-conversation)
I made everything successfully: created the bot within API.AI, created the fulfillments, installed NodeJS on my PC, connected Google Cloud Platform, etc.
Then I created the index.js file by copying it exactly as it's given in the API.AI tutorial, with my API key from World Weather Online (see below).
But when I use the bot, it doesn't work. On the Google Cloud Platform the error is always the same:
Error: getaddrinfo ENOTFOUND api.worldweatheronline.com api.worldweatheronline.com:80
    at errnoException (dns.js:28)
    at GetAddrInfoReqWrap.onlookup (dns.js:76)
No matter how often I do it I get the same error, so I don't actually reach the API. I tried to see if anything changed on WWO's side (URL, etc.), but apparently not. I updated NodeJS and still have the same issue. I refreshed the Google Cloud Platform completely and it didn't help.
That one I really can't debug. Could anyone help?
Here's the code from API.ai:
'use strict';

const http = require('http');
const host = 'api.worldweatheronline.com';
const wwoApiKey = '[YOUR_API_KEY]';

exports.weatherWebhook = (req, res) => {
  // Get the city and date from the request
  let city = req.body.result.parameters['geo-city']; // city is a required param
  // Get the date for the weather forecast (if present)
  let date = '';
  if (req.body.result.parameters['date']) {
    date = req.body.result.parameters['date'];
    console.log('Date: ' + date);
  }
  // Call the weather API
  callWeatherApi(city, date).then((output) => {
    // Return the results of the weather API to Dialogflow
    res.setHeader('Content-Type', 'application/json');
    res.send(JSON.stringify({ 'speech': output, 'displayText': output }));
  }).catch((error) => {
    // If there is an error let the user know
    res.setHeader('Content-Type', 'application/json');
    res.send(JSON.stringify({ 'speech': error, 'displayText': error }));
  });
};

function callWeatherApi (city, date) {
  return new Promise((resolve, reject) => {
    // Create the path for the HTTP request to get the weather
    let path = '/premium/v1/weather.ashx?format=json&num_of_days=1' +
      '&q=' + encodeURIComponent(city) + '&key=' + wwoApiKey + '&date=' + date;
    console.log('API Request: ' + host + path);
    // Make the HTTP request to get the weather
    http.get({host: host, path: path}, (res) => {
      let body = ''; // var to store the response chunks
      res.on('data', (d) => { body += d; }); // store each response chunk
      res.on('end', () => {
        // After all the data has been received parse the JSON for desired data
        let response = JSON.parse(body);
        let forecast = response['data']['weather'][0];
        let location = response['data']['request'][0];
        let conditions = response['data']['current_condition'][0];
        let currentConditions = conditions['weatherDesc'][0]['value'];
        // Create response
        let output = `Current conditions in the ${location['type']}
          ${location['query']} are ${currentConditions} with a projected high of
          ${forecast['maxtempC']}°C or ${forecast['maxtempF']}°F and a low of
          ${forecast['mintempC']}°C or ${forecast['mintempF']}°F on
          ${forecast['date']}.`;
        // Resolve the promise with the output text
        console.log(output);
        resolve(output);
      });
      res.on('error', (error) => {
        reject(error);
      });
    });
  });
}
Oh boy, in fact the reason was the silliest one possible: I hadn't enabled billing on Google Cloud Platform, and that's why everything was blocked (even though I'm using a free trial of the API). They just wanted my credit card number. It works now.
I had the same issue trying to hit my db. Billing wasn't the fix as I had billing enabled already.
For me it was the knexfile.js setup for MySQL, specifically the connection object. In that object, you should replace the host key with socketPath and prepend /cloudsql/ to the value. Here's an example:
connection: {
    // host: process.env.APP_DB_HOST, // The problem
    socketPath: `/cloudsql/${process.env.APP_DB_HOST}`, // The fix
    database: process.env.APP_DB_NAME,
    user: process.env.APP_DB_USR,
    password: process.env.APP_DB_PWD
}
Where process.env.APP_DB_HOST is your Instance connection name.
PS: I imagine that even if you're not using Knex, the host or server parameter of a typical DB connection string will have to be called socketPath when connecting to Google Cloud SQL.
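For instance, if you were using the plain mysql driver instead of Knex, the same fix would look roughly like this (a sketch; the environment variable names mirror the Knex example above and are placeholders for your own values):
const mysql = require('mysql');

const connection = mysql.createConnection({
    // Connect through the Cloud SQL unix socket instead of a host/port pair.
    socketPath: `/cloudsql/${process.env.APP_DB_HOST}`, // instance connection name
    user: process.env.APP_DB_USR,
    password: process.env.APP_DB_PWD,
    database: process.env.APP_DB_NAME
});

connection.connect((err) => {
    if (err) {
        console.error('Cloud SQL connection failed', err);
        return;
    }
    console.log('Connected through the /cloudsql socket');
});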

How to store the data in local device using JSONStore in worklight?

I'm building a login page in Worklight using JavaScript and jQuery, and the username and password should be validated against data stored in JSONStore.
How do I store the data locally using JSONStore in Worklight, and how do I get the data back from JSONStore while validating the username and password?
In the code below, where would my data be stored and retrieved, and where would the typed username and password be validated?
var collections = {
    people : {
        searchFields : {name: 'string'}
    },
    orders : {
        searchFields : {name: 'string'}
    }
};

WL.JSONStore.init(collections)
    .then(function () {
        return WL.JSONStore.init(collections);
    })
    .then(function () {
        return WL.JSONStore.init(collections);
    })
    .then(function () {
        alert('Multiple inits worked');
    })
    .fail(function (err) {
        alert('Multiple inits failed ' + err.toString());
    });
How to solve the issue?
You really should never, ever store the username and password locally on the device. That does not sound very secure...
Additionally, where are the username and password coming from? How should the logic be able to validate the credentials? It needs to compare whatever is inputted with something in order to know that it is correct. An implementation cannot be done otherwise, so you need to provide this information...
In the meanwhile, you can take a look at the following tutorial: Offline Authentication.
The included sample application assumes you have first authenticated with a backend system, and later allows for authenticating locally, "offline", in case an Internet connection is not available. For this it uses JSONStore to securely authenticate.
The tutorial includes a thorough implementation example; be sure to follow it, and to provide the missing information in your question.
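To give a rough idea of what that tutorial does (a minimal sketch, not the full implementation; it reuses the collections object from your question and assumes the username/password options of WL.JSONStore.init, which encrypt the store per user):
function offlineLogin(username, password) {
    WL.JSONStore.init(collections, {
        username: username, // one store per user
        password: password  // used to encrypt/decrypt that user's store
    })
    .then(function () {
        // The store opened, so the password matches the one used when it was first created.
        alert('Offline login succeeded for ' + username);
    })
    .fail(function (err) {
        // Wrong password (or no store yet): the credentials could not be validated locally.
        alert('Offline login failed: ' + err.toString());
    });
}
The first init (done while online and authenticated against your backend) creates the store; later inits with the same credentials act as the offline check.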
This tutorial explains how to use the JSONStore API, including the Add method: https://developer.ibm.com/mobilefirstplatform/documentation/getting-started-7-1/foundation/data/jsonstore/jsonstore-javascript-api/
var collectionName = 'people';
var options = {};
var data = {name: 'yoel', age: 23};

WL.JSONStore.get(collectionName).add(data, options).then(function () {
    // handle success
}).fail(function (error) {
    // handle failure
});
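Reading documents back works the same way, for example with find (a sketch: the query fields are illustrative, and you should check the find/query syntax for your exact Worklight version in the API documentation linked above):
var query = {name: 'yoel'};
var options = {exact: true};

WL.JSONStore.get('people').find(query, options).then(function (results) {
    // results is an array of matching documents, e.g. [{_id: 1, json: {name: 'yoel', age: 23}}]
    console.log(results);
}).fail(function (error) {
    // handle failure
});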

What should I consider when I am doing an authentication process with a titanium app?

Hello, it's my first time implementing a sign-in process in a mobile app with Titanium, and I wonder what information I should save and what the best practice is for doing so.
My server is configured in this way:
The server requires that I send a username and password, and if the information matches it provides a session token.
This is the code I use for signing in:
function signIn(e) {
    // Function to use HTTP to connect to a web server and transfer the data.
    var sendit = Ti.Network.createHTTPClient({
        onerror : function(e) {
            Ti.API.debug(e.error);
            alert('There was an error during the connection');
        },
        timeout : 100000
    });
    // Here you have to change it for your local IP
    sendit.open('POST', 'http://myserver');
    var params = {
        user : $.txtUsuario.value,
        password : $.txtPassword.value
    };
    sendit.send(params);
    // Function to be called upon a successful response
    sendit.onload = function() {
        var json = this.responseText;
        var response = JSON.parse(json);
        if (response.success == "true") {
            var landing = Alloy.createController("menu").getView();
            $.index.close();
            landing.open();
        } else {
            alert(response);
        }
    };
}
The code above is working; however, I do not know how to manage the sign-out. I would like my application to work like most apps do, e.g.:
You sign in once, and after that, as long as you do not close the app, you are able to continue using it and even make requests.
Thank you for any explanation.
It depends on your app requirements. For example, if you will use the token in your app later, you can save it as an app property:
Ti.App.Properties.setString('token', yourTokenGoHere);
and at app startup you can read it back:
var myToken = Ti.App.Properties.getString('token');
and then you can test, for example, whether the token is still valid or not:
if (myToken === 'invalidtoken') {
    youShouldLogin();
} else {
    youCanGoFurther();
}
and when the user signs out, reset the token to an invalid value:
Ti.App.Properties.setString('token', 'invalidtoken');
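If you want to verify the stored token against the server on startup rather than only comparing it with a hard-coded 'invalidtoken' string, a rough sketch could look like this. The /me endpoint and the Bearer header are assumptions about your backend, not something Titanium provides:
function checkToken(onValid, onInvalid) {
    var token = Ti.App.Properties.getString('token', '');
    if (!token) {
        onInvalid();
        return;
    }
    var client = Ti.Network.createHTTPClient({
        onload : function() {
            onValid();
        },
        onerror : function() {
            // The server rejected the token (expired or revoked), so clear it.
            Ti.App.Properties.setString('token', '');
            onInvalid();
        },
        timeout : 100000
    });
    client.open('GET', 'http://myserver/me'); // hypothetical "who am I" endpoint on your server
    client.setRequestHeader('Authorization', 'Bearer ' + token);
    client.send();
}
You would call something like checkToken(openMenu, openLoginScreen) when the app starts, where those two callbacks are your own functions that open the menu or login controllers; for signing out you can simply clear the property with Ti.App.Properties.removeProperty('token').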

Can't get GCM push messages being sent properly

So my GCM push message works if I use this test link
http://www.androidbegin.com/tutorial/gcm.html
Here's the response
{ "multicast_id":7724943165862866717,
"success":1,
"failure":0,
"canonical_ids":0,
"results":[{"message_id":"0:1418649384921891% 7fd2b314f9fd7ecd"}]}
However, if I push using my own Node push service built on the toothlessgear/node-gcm lib (https://github.com/ToothlessGear/node-gcm), I get a success message on the server but no message makes it to the client:
{
    multicast_id: 5130374164465991000,
    success: 1,
    failure: 0,
    canonical_ids: 0,
    results: [ { message_id: '0:1418649238305331%7fd2b3145bca2e79' } ]
}
I also tried the same message using Pushwoosh, and Pushwoosh doesn't work either. How come I'm getting a success message on the server, but no push is received on the client with the latter two services? Is there some sort of IP configuration that I need to do, or some sort of certificate? I've used the same Google API server key, which is open to all IPs, on all three of these services.
Why does the response show success but no message gets received on the client?
Node service server side code
var gcm = require('node-gcm');

// create a message with default values
var message = new gcm.Message();

// or with object values
var message = new gcm.Message({
    collapseKey: 'demo',
    delayWhileIdle: true,
    timeToLive: 3,
    data: {
        key1: 'message1',
        key2: 'message2'
    }
});

var sender = new gcm.Sender('insert Google Server API Key here');
var registrationIds = ['regId1'];

/**
 * Params: message-literal, registrationIds-array, No. of retries, callback-function
 **/
sender.send(message, registrationIds, 4, function (err, result) {
    console.log(result);
});
So the pushes were being sent correctly; my issue was with the Cordova plugin on the client, which requires that the Android payload have "message" or "title" set. The sample PHP just happened to set the message property, and that's why it worked.
Updating the code to add the following to the data object
data: {message: 'test'}
works correctly.
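So with node-gcm, the working version of the message from above only needs message (and optionally title) inside data, e.g.:
var message = new gcm.Message({
    collapseKey: 'demo',
    delayWhileIdle: true,
    timeToLive: 3,
    data: {
        title: 'Test push', // used by the Cordova plugin for the notification title
        message: 'test'     // required by the Cordova plugin for the notification text
    }
});

sender.send(message, registrationIds, 4, function (err, result) {
    console.log(err || result);
});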

Socket.IO Authentication

I am trying to use Socket.IO in Node.js, and am trying to allow the server to give an identity to each of the Socket.IO clients. As the socket code is outside the scope of the http server code, it doesn't have easy access to the request information sent, so I'm assuming it will need to be sent up during the connection. What is the best way to
1) get the information to the server about who is connecting via Socket.IO
2) authenticate who they say they are (I'm currently using Express, if that makes things any easier)
Use connect-redis and have redis as your session store for all authenticated users. Make sure on authentication you send the key (normally req.sessionID) to the client. Have the client store this key in a cookie.
On socket connect (or anytime later) fetch this key from the cookie and send it back to the server. Fetch the session information in redis using this key. (GET key)
Eg:
Server side (with redis as session store):
req.session.regenerate...
res.send({rediskey: req.sessionID});
Client side:
//store the key in a cookie
SetCookie('rediskey', <%= rediskey %>); //http://msdn.microsoft.com/en-us/library/ms533693(v=vs.85).aspx
//then when socket is connected, fetch the rediskey from the document.cookie and send it back to server
var socket = new io.Socket();
socket.on('connect', function() {
    var rediskey = GetCookie('rediskey'); //http://msdn.microsoft.com/en-us/library/ms533693(v=vs.85).aspx
    socket.send({rediskey: rediskey});
});
Server side:
//in io.on('connection')
io.on('connection', function(client) {
    client.on('message', function(message) {
        if (message.rediskey) {
            //fetch session info from redis
            redisclient.get(message.rediskey, function(e, c) {
                client.user_logged_in = c.username;
            });
        }
    });
});
I also liked the way pusherapp does private channels.
A unique socket id is generated and sent to the browser by Pusher. This is sent to your application (1) via an AJAX request which authorizes the user to access the channel against your existing authentication system. If successful your application returns an authorization string to the browser signed with your Pusher secret. This is sent to Pusher over the WebSocket, which completes the authorization (2) if the authorization string matches.
Socket.io also has a unique socket id for every socket:
socket.on('connect', function() {
    console.log(socket.transport.sessionid);
});
They used signed authorization strings to authorize users.
I haven't yet mirrored this to socket.io, but I think it could be a pretty interesting concept.
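For reference, the Pusher-style authorization string is essentially an HMAC over the socket id and channel name, so mirroring the idea in Node could look roughly like this (a sketch of the concept, not Pusher's exact algorithm):
var crypto = require('crypto');
var appSecret = 'your-secret'; // shared only between your web app and your socket server

// Your authentication endpoint calls this once it decides the user may join the channel.
function signChannelAccess(socketId, channel) {
    return crypto.createHmac('sha256', appSecret)
                 .update(socketId + ':' + channel)
                 .digest('hex');
}

// The socket server recomputes the signature and compares it with the one the client sent.
function isAuthorized(socketId, channel, signature) {
    return signChannelAccess(socketId, channel) === signature;
}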
I know this is a bit old, but for future readers: in addition to the approach of parsing the cookie and retrieving the session from storage (e.g. passport.socketio), you might also consider a token-based approach.
In this example I use JSON Web Tokens, which are pretty standard. You have to give the token to the client page; in this example, imagine an authentication endpoint that returns a JWT:
var jwt = require('jsonwebtoken');
// other requires

app.post('/login', function (req, res) {
    // TODO: validate the actual user
    var profile = {
        first_name: 'John',
        last_name: 'Doe',
        email: 'john@doe.com',
        id: 123
    };
    // we are sending the profile in the token
    var token = jwt.sign(profile, jwtSecret, { expiresInMinutes: 60*5 });
    res.json({token: token});
});
Now, your socket.io server can be configured as follows:
var socketioJwt = require('socketio-jwt');

var sio = socketIo.listen(server);
sio.set('authorization', socketioJwt.authorize({
    secret: jwtSecret,
    handshake: true
}));

sio.sockets.on('connection', function (socket) {
    console.log(socket.handshake.decoded_token.email, 'has joined');
    //socket.on('event');
});
The socketio-jwt middleware expects the token in a query string, so from the client you only have to attach it when connecting:
var socket = io.connect('', {
    query: 'token=' + token
});
I wrote a more detailed explanation about this method and cookies here.
Here is my attempt to have the following working:
express: 4.14
socket.io: 1.5
passport (using sessions): 0.3
redis: 2.6 (Really fast data structure to handle sessions; but you can use others like MongoDB too. However, I encourage you to use this for session data + MongoDB to store other persistent data like Users)
Since you might want to add some API requests as well, we'll also use the http package to have both HTTP and WebSocket working on the same port.
server.js
The following extract only includes everything you need to set the previous technologies up. You can see the complete server.js version which I used in one of my projects here.
import http from 'http';
import express from 'express';
import passport from 'passport';
import Session from 'express-session'; // session middleware shared by Express and socket.io
import { createClient as createRedisClient } from 'redis';
import connectRedis from 'connect-redis';
import Socketio from 'socket.io';
// Your own socket handler file, it's optional. Explained below.
import socketConnectionHandler from './sockets';

// Express app plus a plain HTTP server, so socket.io and the AJAX API
// can share the same port.
const app = express();
const server = http.createServer(app);

// Configuration about your Redis session data structure.
const redisClient = createRedisClient();
const RedisStore = connectRedis(Session);
const dbSession = new RedisStore({
  client: redisClient,
  host: 'localhost',
  port: 27017,
  prefix: 'stackoverflow_',
  disableTTL: true
});

// Let's configure Express to use our Redis storage to handle
// sessions as well. You'll probably want Express to handle your
// sessions as well and share the same storage as your socket.io
// does (i.e. for handling AJAX logins).
const session = Session({
  resave: true,
  saveUninitialized: true,
  key: 'SID', // this will be used for the session cookie identifier
  secret: 'secret key',
  store: dbSession
});
app.use(session);

// Let's initialize passport by using their middlewares, which do
// everything pretty much automatically. (you have to configure login
// / register strategies on your own though; see reference 1)
app.use(passport.initialize());
app.use(passport.session());

// Socket.IO
const io = Socketio(server);
io.use((socket, next) => {
  session(socket.handshake, {}, next);
});
io.on('connection', socketConnectionHandler);
// socket.io is ready; remember that ^this^ variable is just the
// name that we gave to our own socket.io handler file (explained
// just after this).

// Start server. This will start both socket.io and our optional
// AJAX API in the given port.
const port = 3000; // Move this onto an environment variable,
                   // it'll look more professional.
server.listen(port);
console.info(`🌐 API listening on port ${port}`);
console.info(`🗲 Socket listening on port ${port}`);
sockets/index.js
Our socketConnectionHandler; I just don't like putting everything inside server.js (even though you perfectly could), especially since that file can end up containing quite a lot of code pretty quickly.
export default function connectionHandler(socket) {
  const userId = socket.handshake.session.passport &&
                 socket.handshake.session.passport.user;
  // If the user is not logged in, you might find ^this^
  // socket.handshake.session.passport variable undefined.

  // Give the user a warm welcome.
  console.info(`⚡︎ New connection: ${userId}`);
  socket.emit('Greetings', `Greetings ${userId}`);

  // Handle disconnection.
  socket.on('disconnect', () => {
    if (process.env.NODE_ENV !== 'production') {
      console.info(`⚡︎ Disconnection: ${userId}`);
    }
  });
}
Extra material (client):
Just a very basic version of what the JavaScript socket.io client could be:
import io from 'socket.io-client';

const socketPath = '/socket.io'; // <- Default path.
                                 // But you could configure your server
                                 // to something like /api/socket.io
const socket = io.connect('localhost:3000', { path: socketPath });
socket.on('connect', () => {
  console.info('Connected');

  socket.on('Greetings', (data) => {
    console.info(`Server greeting: ${data}`);
  });
});
socket.on('connect_error', (error) => {
  console.error(`Connection error: ${error}`);
});
References:
I just couldn't reference inside the code, so I moved it here.
1: How to set up your Passport strategies: https://scotch.io/tutorials/easy-node-authentication-setup-and-local#handling-signupregistration
This article (http://simplapi.wordpress.com/2012/04/13/php-and-node-js-session-share-redi/) shows how to
store sessions of the HTTP server in Redis (using Predis)
get these sessions from Redis in node.js by the session id sent in a cookie
Using this code you are able to get them in socket.io, too.
var io = require('socket.io').listen(8081);
var cookie = require('cookie');
var redis = require('redis'), client = redis.createClient();

io.sockets.on('connection', function (socket) {
    var cookies = cookie.parse(socket.handshake.headers['cookie']);
    console.log(cookies.PHPSESSID);
    client.get('sessions/' + cookies.PHPSESSID, function(err, reply) {
        console.log(JSON.parse(reply));
    });
});
Use sessions and Redis between client and server.
Server side
io.use(function(socket, next) {
    // Get the session id from the cookie here and match it against the session data in Redis
    console.log(socket.handshake.headers.cookie);
    next();
});
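Filling in that placeholder, a sketch of what the middleware could look like, assuming the same PHP-style session keys in Redis as in the previous answer (adjust the cookie name and key prefix to your own setup):
var cookie = require('cookie');
var redis = require('redis');
var redisClient = redis.createClient();

io.use(function(socket, next) {
    var cookies = cookie.parse(socket.handshake.headers.cookie || '');
    var sessionId = cookies.PHPSESSID; // cookie name depends on your HTTP server
    if (!sessionId) {
        return next(new Error('No session cookie'));
    }
    redisClient.get('sessions/' + sessionId, function(err, sessionData) {
        if (err || !sessionData) {
            return next(new Error('Session not found'));
        }
        socket.session = JSON.parse(sessionData); // make the session available to later handlers
        next();
    });
});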
This should do it:
//server side
io.sockets.on('connection', function (con) {
    console.log(con.id);
});
//client side
var io = io.connect('http://...');
console.log(io.sessionid);