How is everyone going about implementing scheduled jobs / cloud jobs on parse-server?

According to the parse-server migration guide, we could use something like Kue and Kue-UI to emulate the parse.com scheduled jobs functionality.
I haven't implemented Kue or Kue-UI, but looking at the guides, it doesn't look like they provide anywhere near the same level of functionality as the existing parse.com scheduled jobs. Is this observation correct? Has someone implemented this? Is it true that jobs have to be scheduled through Kue in JavaScript, and that Kue-UI only provides a summary of the current status of the jobs, so new schedules can't be added through Kue-UI?
Has anyone tried to achieve the same outcome with something like Jenkins? This is what I had in mind:
each job would still be defined in cloud code: Parse.Cloud.job("job01", function(request, response) { ... });
modify parse-server slightly to expose each job at a URL similar to existing cloud functions, e.g. /parse/jobs/job01 (this might exist in parse-server soon: github.com/ParsePlatform/parse-server/pull/2560)
create a new Jenkins job that does a curl against that URL
define a cron-like schedule for that Jenkins job from within the Jenkins web UI
I can see the benefits being:
little to no coding
setting up Jenkins sounds like much less work than setting up Kue, Redis and Kue-UI
existing cloud job definitions stay exactly the same
schedule and manually trigger jobs through the jenkins web ui
The only thing the current parse.com scheduled jobs / cloud jobs can do that a Jenkins-based solution can't is letting you pick a job name from a drop-down list when creating a new schedule.
Am I missing something? How is everyone else going about this? Thanks.

I ended up doing this with node-schedule. Not sure if it is the best option but it is working fine for me.
index.js
// Schedule a POST to a Parse cloud function every 15 minutes.
var schedule = require('node-schedule');
var request = require('request');

schedule.scheduleJob('*/15 * * * *', function() {
  var options = {
    url: serverUrl + '/functions/{function name}',
    headers: {
      'X-Parse-Application-Id': appID,
      'X-Parse-Master-Key': masterKey,
      'Content-Type': 'application/json'
    }
  };
  request.post(options, function (error, response, body) {
    if (!error && response.statusCode == 200) {
      console.log(body);
    }
  });
});
main.js
Parse.Cloud.define('{function name}', function(req, res) {
  // Check for the master key to prevent users from calling this directly.
  if (req.master === true) {
    // Do operations here
  } else {
    res.error('Need Master Key');
  }
});
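If it helps, here is a minimal sketch (an assumption about the app layout, not part of the original answer) of running this scheduler in the same process as parse-server by requiring it from the Express entry point; the file name scheduler.js and the config values are placeholders:
var express = require('express');
var ParseServer = require('parse-server').ParseServer;

var app = express();

// Mount parse-server as Express middleware (values below are placeholders).
var api = new ParseServer({
  databaseURI: 'mongodb://localhost:27017/dev',
  appId: 'myAppId',
  masterKey: 'myMasterKey',
  serverURL: 'http://localhost:1337/parse'
});
app.use('/parse', api);

// Load the node-schedule jobs from the answer above (saved here as scheduler.js)
// so they run alongside the API server.
require('./scheduler');

app.listen(1337, function() {
  console.log('parse-server and scheduled jobs running on port 1337');
});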

I use kue for this purpose. I've described the approach in my article. In short, this function:
Parse.Cloud.job("sendReport", function(request, response) {
Parse.Cloud.httpRequest({
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
url: "https://example.com/url/", // Webhook url
body: "body goes here",
success: function(httpResponse) {
console.log("Successfully POSTed to the webhook");
},
error: function(httpResponse) {
console.error("Couldn't POST to webhook: " + httpResponse);
}
});
});
becomes this:
// Create a kue instance and a queue.
var kue = require('kue-scheduler');
var Queue = kue.createQueue();
var jobName = "sendReport";

// Create a job instance in the queue.
var job = Queue
  .createJob(jobName)
  // Priority can be 'low', 'normal', 'medium', 'high' and 'critical'
  .priority('normal')
  // We don't want to keep the job in memory after it's completed.
  .removeOnComplete(true);

// Schedule it to run every 60 minutes. every(interval, job) accepts the interval
// in either a human-interval String format or a cron String format.
Queue.every('60 minutes', job);

// Process the scheduled job.
Queue.process(jobName, sendReport);

// The body of the job goes here.
function sendReport(job, done) {
  Parse.Cloud.httpRequest({
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
    },
    url: "https://example.com/url/", // Webhook url
    body: "body goes here"
  }).then(function(httpResponse) {
    console.log("Successfully POSTed to the webhook");
    // Don't forget to run done() when the job is done.
    done();
  }, function(httpResponse) {
    var errorMessage = "Couldn't POST to webhook: " + httpResponse;
    console.error(errorMessage);
    // Pass an Error object to done() to mark this job as failed.
    done(new Error(errorMessage));
  });
}
It works fine, though I noticed that sometimes kue-scheduler fires the event more often than needed. See this issue for more info: https://github.com/lykmapipo/kue-scheduler/issues/45

If you are on AWS this may be an option:
Create an AWS CloudWatch Event Rule that is triggered at specific intervals and calls a Lambda function. The Event Rule can pass parameters to the Lambda function.
Create a simple Lambda function that calls a Cloud Code function / job. If you receive the cloud code function name and other parameters from the Event Rule, you only need one generic Lambda function for any Cloud Code call.
This has several advantages as Event Rules are part of AWS infrastructure and can be easily integrated with other AWS services. For example, you can set up intelligent queueing of Event Rule calls so that if the previous call did not complete yet, you discard the next call in the queue, overflow to another queue, notify an operator, etc.
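As an illustration, here is a minimal sketch (my assumption, not code from the answer) of such a generic Lambda function in Node.js. It expects the Event Rule to pass a constant JSON input like { "functionName": "...", "params": { ... } }, and the Parse Server URL, app id and master key to be provided as environment variables:
// Generic Lambda handler: calls the Parse cloud function named in the event.
// PARSE_SERVER_URL, PARSE_APP_ID and PARSE_MASTER_KEY are assumed environment variables.
const https = require('https');

exports.handler = (event, context, callback) => {
  const body = JSON.stringify(event.params || {});
  const url = new URL(process.env.PARSE_SERVER_URL + '/functions/' + event.functionName);

  const req = https.request({
    method: 'POST',
    hostname: url.hostname,
    path: url.pathname,
    headers: {
      'X-Parse-Application-Id': process.env.PARSE_APP_ID,
      'X-Parse-Master-Key': process.env.PARSE_MASTER_KEY,
      'Content-Type': 'application/json',
      'Content-Length': Buffer.byteLength(body)
    }
  }, (res) => {
    let data = '';
    res.on('data', (chunk) => { data += chunk; });
    res.on('end', () => callback(null, { status: res.statusCode, body: data }));
  });

  req.on('error', callback);
  req.write(body);
  req.end();
};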

You can use the parse-server-scheduler npm module.
It does not require any external server and simply allows you to set up scheduling in parse-dashboard.

Axios onUploadProgress jumps at 100% even before the request is done

I have a controller that calls multiple external APIs from the Gmail API concurrently and then saves the retrieved data into the database, all of this in ONE POST request.
To start, I pooled the external API requests, mapped the results and saved them to the database.
In my Controller:
$resp = Http::pool(fn (Pool $pool) => [
    $pool->as('pool-1')->withToken($user->token)->get('https://....call-1'),
    $pool->as('pool-2')->withToken($user->token)->get('https://....call-2'),
    $pool->as('pool-3')->withToken($user->token)->get('https://....call-3'),
]);

collect($resp)->map(function ($req) {
    Email::firstOrCreate(...);
});
In my vue component:
const config = {
  onUploadProgress: progressEvent => {
    console.log(`sent: ${Math.floor((progressEvent.loaded * 100) / progressEvent.total)}`)
  }
}

axios.post(url, param, config).then(response => {}).catch(err => {})
Now, when I check the console, it should be logging:
sent:0
sent:1
sent:2
...
sent:100
But instead, it immediately logs sent:100 even though the POST request is still pending.
Is this a bug in Chrome or Axios, or does it perhaps have something to do with the external API calls?
Or if it isn't, can someone point out where I went wrong?

Trying to log in to gmail while using TestCafe

I am learning TestCafe and am trying to create an account on a website and then log in to Gmail to find the activation link. When I try to do this, I just get a "browser isn't secure" message when I get to the part where I enter the password. How do I get Gmail to trust TestCafe?
While you might succeed in doing so, this is not a good approach because:
it's slow doing this via GUI
it's brittle because selectors will likely change, and you have no control over Google's email selectors, so you won't even know if they change them
A better approach would be to use a service like Mailosaur, where you can create an account and receive emails that you can later query via an API. Instead of doing the whole e2e flow over the GUI, you request an email from Mailosaur's API, and if such an email exists, you'll receive a response you can parse and check for various things.
I've done this in the past, you can see my post here: https://sqa.stackexchange.com/questions/40427/automating-verification-of-sent-email-sms-messages/45721#45721 It's exactly Mailosaur and Testcafe (plus it requires axios as a package), so it seems to be what you're looking for.
To add the same code here:
import config from '../config';
import { customAlphabet, nanoid } from 'nanoid';
import axios from 'axios';
import Newsletter from '../Objects/newsletter';

async function request (reqObject) {
    try {
        return await axios(reqObject);
    } catch (error) {
        console.error(error);
    }
}

function serverId () {
    return process.env.MAILOSAUR_SERVER_ID;
}

function mailosaurFullEmail (id) {
    return (id ? id : nanoid()) + '.' + serverId()
        + '@' + config.mailosaurDomain;
}

fixture `Newsletter`
    .page(baseUrl);

test
    ('Sign Up For Newsletter', async t => {
        const id = (customAlphabet('1234567890', 10))();

        await t
            .typeText(Newsletter.newsEmailInput, mailosaurFullEmail(id))
            .click(Newsletter.consent)
            .click(Newsletter.sendButton);

        let res = await request({
            method: 'POST',
            url: config.mailosaurUrlEmail + serverId(),
            headers: {
                'Authorization': 'Basic '
                    + Buffer.from(process.env.MAILOSAUR_API_KEY)
                        .toString('base64'),
                'Content-Type': 'application/json'
            },
            data: {
                sentTo: mailosaurFullEmail(id)
            }
        });

        await t
            .expect(res.status).eql(200);
    });
and it requires some config values:
{
"mailosaurUrlEmail": "https://mailosaur.com/api/messages/await?server=",
"mailosaurDomain": "mailosaur.io"
}
This is definitely much better, but it still has some limitations:
Mailosaur's API can still change, so it won't be exactly without any maintenance
it assumes that an email is sent immediately after a user action (newsletter sign-up in my case), but that might be far from reality in many situations, such as when emails are sent to a queue, where it can easily take several minutes to send an email
If you absolutely have to do it via Gmail, you will still be better off looking at their API, which should allow you to search and query email messages as well.
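For illustration, here is a minimal sketch (an assumption on my part, not tested code from this answer) of querying Gmail with the official googleapis Node client instead of driving the UI; oauth2Client, the search query and the subject text are placeholders you would adapt:
const { google } = require('googleapis');

// oauth2Client must be an authorized OAuth2 client with a Gmail read scope.
async function findActivationMessage(oauth2Client) {
  const gmail = google.gmail({ version: 'v1', auth: oauth2Client });

  // Search recent messages for the activation email.
  const list = await gmail.users.messages.list({
    userId: 'me',
    q: 'subject:"activate your account" newer_than:1d',
    maxResults: 1
  });

  if (!list.data.messages || list.data.messages.length === 0) {
    return null; // nothing received yet
  }

  // Fetch the full message so the body can be parsed for the activation link.
  const message = await gmail.users.messages.get({
    userId: 'me',
    id: list.data.messages[0].id,
    format: 'full'
  });
  return message.data;
}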
There is an issue related to the Google login. You can try turning on the "Allow less secure apps" Google account setting to work around this issue. Please note that this setting is only available when 2-Step Verification is disabled.

Watch for changes to calendar, when to make request

When watching for changes to a collection of events on a given calendar, how often do I need to make a watch request?
Where would I put my code to make a watch request? Does it only need to be done once?
My code below gets an access token and makes a POST to create a watch channel; however, I'm not sure where to host the code or how often I need to run it:
let { google } = require("googleapis");
let functions = require("firebase-functions");
let privatekey = require("./config.json");
let axios = require("axios");

let jwt = new google.auth.JWT(
  privatekey.client_email,
  null,
  privatekey.private_key,
  ["https://www.googleapis.com/auth/calendar"]
);

const token = await jwt.authorize();

let headers = {
  "Access-Control-Allow-Origin": "*",
  "Content-Type": "application/json;charset=UTF-8",
  Authorization: token.token_type + " " + token.access_token
};

let data = {
  id: randomId, // unique channel id
  type: "web_hook",
  address: "https://rguc-calendars.firebaseapp.com/notifications",
  params: {
    ttl: 3600
  }
};

axios
  .post(
    "https://www.googleapis.com/calendar/v3/calendars/thirdyear%40rguc.co.uk/events/watch",
    data,
    { headers }
  )
  .then(function(response) {
    // success
  })
  .catch(function(error) {
    // error
  });
Push notifications
The Google Calendar API provides push notifications that let you watch for changes to resources. You can use this feature to improve the performance of your application. It allows you to eliminate the extra network and compute costs involved with polling resources to determine if they have changed. Whenever a watched resource changes, the Google Calendar API notifies your application.
Register the domain of your receiving URL.
For example, if you plan to use https://example.com/notifications as your receiving URL, you need to register https://example.com.
Set up your receiving URL, or "Webhook" callback receiver.
This is an HTTPS server that handles the API notification messages that are triggered when a resource changes.
Set up a notification channel for each resource endpoint you want to watch.
A channel specifies routing information for notification messages. As part of the channel setup, you identify the specific URL where you want to receive notifications. Whenever a channel's resource changes, the Google Calendar API sends a notification message as a POST request to that URL.
Once you have set up the watch, Google will notify you whenever there is a change; you won't have to call it again.
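For the receiving URL step, a minimal sketch (assumed naming, not the poster's code) of a Firebase HTTPS function that handles the notification POSTs could look like this; Google sends a typically empty-bodied request with X-Goog-* headers, and you should respond with a 2xx quickly and then sync the calendar:
const functions = require('firebase-functions');

exports.notifications = functions.https.onRequest((req, res) => {
  const channelId = req.get('X-Goog-Channel-ID');
  const resourceId = req.get('X-Goog-Resource-ID');
  // 'sync' when the channel is created, 'exists' when the calendar changed.
  const resourceState = req.get('X-Goog-Resource-State');

  console.log('Calendar notification', { channelId, resourceId, resourceState });

  if (resourceState === 'exists') {
    // Trigger an incremental sync of the watched calendar here,
    // e.g. events.list with a stored syncToken.
  }

  res.status(200).send();
});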

AWS Error: Proxy integrations cannot be configured to transform responses

I'm a beginner in Amazon's Lambda-API implementations.
I'm deploying a very simple API: a Python 2.7 Lambda function that prints "Hello World", triggered through API Gateway. However, when I click on the Invoke URL link, I get {"message": "Internal server error"}.
So I'm trying to see what is wrong. When I click on the API itself, I see the following greyed out in my Method Execution: "Integration Response: Proxy integrations cannot be configured to transform responses."
I have tested many different configurations but I still face the same error. I have no idea why this step is greyed out.
I had the same problem when trying to integrate API Gateway and a Lambda function. After spending a couple of hours on it, I figured it out.
When you create a new resource or method, the Use Lambda Proxy integration option is enabled by default.
You need to remove this: go to Integration Request and untick Use Lambda Proxy integration.
Then, in your Resources, open the Actions tab and choose Enable CORS.
Once this is done, deploy your API once again and test the function. Also, this topic explains what's happening under the hood.
Good luck...
The Lambda response should be in a specific format for API Gateway to process it. You can find details in this post: https://aws.amazon.com/premiumsupport/knowledge-center/malformed-502-api-gateway/
exports.handler = (event, context, callback) => {
  var responseBody = {
    "key3": "value3",
    "key2": "value2",
    "key1": "value1"
  };
  var response = {
    "statusCode": 200,
    "headers": {
      "my_header": "my_value"
    },
    "body": JSON.stringify(responseBody),
    "isBase64Encoded": false
  };
  callback(null, response);
};
My API was working in Postman but not locally when I was developing the front end. I was getting the same errors when trying to enable CORS on my resources for GET, POST and OPTIONS, and after searching all over, @aditya's answer got me on the right track, but I had to tweak my code slightly.
I needed to add res.statusCode and the two headers, and it started working.
// GET
// get all myModel
app.get('/models/', (req, res) => {
  const query = 'SELECT * FROM MyTable'
  pool.query(query, (err, results, fields) => {
    //...
    const models = [...results]
    const response = {
      data: models,
      message: 'All models successfully retrieved.',
    }
    //****** needed to add the next 3 lines
    res.statusCode = 200;
    res.setHeader('content-type', 'application/json');
    res.setHeader('Access-Control-Allow-Origin', '*');
    res.send(response)
  })
})
If you're using Terraform to provision your AWS resources, you can set the aws_api_gateway_integration type to "AWS" instead of "AWS_PROXY", and that should resolve the problem.

Why doesn't dojo.io.script.get() execute the provided error function when receiving a 404?

I am trying to use the following to do a cross-domain get:
dojo.io.script.get({
  url: myUrl,
  callbackParamName: "callback",
  preventCache: true,
  load: dojo.hitch( this, loadFunction ),
  error: dojo.hitch( this, function() {
    console.log('Error!!!');
  })
});
The load function runs fine, however, when the server returns a 404, the error function does not run. Can anyone tell me why?
EDIT
After some investigation, I found that a timeout and handler could be implemented in the following way:
dojo.io.script.get({
  url: myUrl,
  callbackParamName: "callback",
  timeout: 2000
}).then(function(data){
  console.log(data);
}, function(error){
  alert(error);
});
This uses functionality provided by the dojo.Deferred object.
When accessing a server with script tags (which is what dojo.io.script.get does), the status code and headers are not available.
You may try some other ways to detect a problem, like using a timeout and analyzing the content of the script. The latter is problematic for JSONP calls (like in your example).
I realize this is old, but I thought I'd share a solution in case others come across this thread like I did.
dojo.io.script essentially adds a <script/> to your HTML page, so you can try this:
var script = document.createElement('script');
script.setAttribute('type', 'text/javascript');
script.setAttribute('src', myUrl);
script.onerror = function() {
  debugger
}
script.onload = function() {
  debugger
}
document.getElementsByTagName('body')[0].appendChild(script);
That way, if the script fails to load, the onerror event is called.
*This may not work in every instance, but it's a good start