I am updating my application from Ionic 1 to Ionic 2.
For the first app (Ionic 1) I used AngularFire with custom authentication (via the Slim Framework). With Ionic 2 I am trying to do the same thing with AngularFire2 (and Firebase 2.4.2), but I get the following error when I authenticate to Firebase.
Code (App.ts):
@App({
templateUrl: './build/app.html',
providers: [
FIREBASE_PROVIDERS,
defaultFirebase('https://<APP>.firebaseio.com/'),
firebaseAuthConfig({
provider: AuthProviders.Custom,
method: AuthMethods.CustomToken
})
]
})
Code (Login.ts):
export class LoginPage {
n_adherent:number = null;
password:string = '';
constructor(..., private af:AngularFire, private _authService:AuthService) {}
connect() {
let credentials = {
n_adherent: parseInt(this.n_adherent, 10),
password: this.password
};
// Send credentials to my PHP server
this._authService.login(credentials)
.subscribe(data => {
if (data.token) {
// I get the token
let token = data.token;
// Authenticate to Firebase
this.af.auth.login(token)
.then((data) => console.log(data))
.catch((error) => console.log(error));
}
});
}
}
Error (in console):
You must include credentials to use this auth method.
Code from firebase/php-jwt:
<?php
use \Firebase\JWT\JWT;
$key = "example_key";
$token = array(
"iss" => "http://example.org",
"aud" => "http://example.com",
"iat" => 1356999524,
"nbf" => 1357000000
);
/**
* IMPORTANT:
* You must specify supported algorithms for your application. See
* https://tools.ietf.org/html/draft-ietf-jose-json-web-algorithms-40
* for a list of spec-compliant algorithms.
*/
$jwt = JWT::encode($token, $key);
$decoded = JWT::decode($jwt, $key, array('HS256'));
print_r($decoded);
/*
NOTE: This will now be an object instead of an associative array. To get
an associative array, you will need to cast it as such:
*/
$decoded_array = (array) $decoded;
/**
* You can add a leeway to account for when there is a clock skew times between
* the signing and verifying servers. It is recommended that this leeway should
* not be bigger than a few minutes.
*
* Source: http://self-issued.info/docs/draft-ietf-oauth-json-web-token.html#nbfDef
*/
JWT::$leeway = 60; // $leeway in seconds
$decoded = JWT::decode($jwt, $key, array('HS256'));
?>
Any help would be appreciated.
Your token is a string, right?
We just had the same error and had to debug the source code.
We realised that the this.af.auth.login method expects two parameters.
Simply put, use the following:
this.af.auth.login(token, {})
Cheers,
Marcell
This should fix your issue:
this.af.auth.login(data.token, {
provider: AuthProviders.Custom,
method: AuthMethods.CustomToken
})
.then(data => {
console.log(data);
})
.catch(error => {
console.log(error);
});
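For reference, plugged into the connect() method from the question, the whole flow would look roughly like this (a sketch reusing the same AuthService as above; not tested against your backend):
connect() {
  let credentials = {
    n_adherent: parseInt(this.n_adherent, 10),
    password: this.password
  };
  // Send credentials to the PHP server, then pass the returned JWT to Firebase
  this._authService.login(credentials)
    .subscribe(data => {
      if (data.token) {
        this.af.auth.login(data.token, {
          provider: AuthProviders.Custom,
          method: AuthMethods.CustomToken
        })
        .then(authState => console.log(authState))
        .catch(error => console.log(error));
      }
    });
}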
Cheers.
Related
I am using Laravel Fortify in my single-page Vue application. When I send an XHR request to sitePath + '/login' I get {two_factor: false} in the response and the user is logged in. How can I add user information to that response so I can save the data in local storage?
let response = await this.$root.requestPost(data, url);
async requestPost(data, requestUrl) {
const response = await axios.post(requestUrl, {
// ...data
}).catch(error => {
//
});
return response;
},
Sorry for the late answer, but today I solved the same problem.
In your FortifyServiceProvider's register() method, you need to bind your own LoginResponse:
$this->app->instance(LoginResponse::class, new class implements LoginResponse {
public function toResponse($request)
{
/**
* @var User $user
*/
$user = $request->user();
return $request->wantsJson()
? response()->json(['two_factor' => false, 'email' => $user->email ])
: redirect()->intended(Fortify::redirects('login'));
}
});
Documentation - https://laravel.com/docs/8.x/fortify#customizing-authentication-redirects
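On the client side, once the customized LoginResponse returns the extra fields, a minimal sketch for saving them to local storage could look like this (a hypothetical variation of the requestPost method from the question, assuming the response now carries email):
async requestPost(data, requestUrl) {
  // axios is assumed to be imported as in the question
  const response = await axios.post(requestUrl, data).catch(error => null);
  if (response && response.data && response.data.email) {
    // persist the extra user info returned by the customized LoginResponse
    localStorage.setItem('user', JSON.stringify({ email: response.data.email }));
  }
  return response;
},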
I have a SPA which uses the solution provided here to authenticate with Azure AD and everything works as expected. Now I want to migrate this to use MSAL.js.
I use the code below for login:
import * as MSAL from 'msal'
...
const config = {
auth: {
tenantId: '<mytenant>.com',
clientId: '<myclientid>',
redirectUri: <redirecturi>,
},
cache: {
cacheLocation: 'localStorage',
}
};
const tokenRequest = {
scopes: ["User.Read"]
};
export default {
userAgentApplication: null,
/**
* @return {Promise}
*/
initialize() {
let redirectUri = config.auth.redirectUri;
// create UserAgentApplication instance
this.userAgentApplication = new MSAL.UserAgentApplication(
config.auth.clientId,
'',
() => {
// callback for login redirect
},
{
redirectUri
}
);
// return promise
return new Promise((resolve, reject) => {
if (this.userAgentApplication.isCallback(window.location.hash) || window.self !== window.top) {
// redirect to the location specified in the url params.
}
else {
// try pull the user out of local storage
let user = this.userAgentApplication.getUser();
if (user) {
resolve();
}
else {
// no user at all - go sign in.
this.signIn();
}
}
});
},
signIn() {
this.userAgentApplication.loginRedirect(tokenRequest.scopes);
},
And then I use the code below to get the token:
getCachedToken() {
var token = this.userAgentApplication.acquireTokenSilent(tokenRequest.scopes);
return token;
},
isAuthenticated() {
// getCachedToken will only return a valid, non-expired token.
var user = this.userAgentApplication.getUser();
if (user) {
// get token
this.getCachedToken()
.then(token => {
axios.defaults.headers.common["Authorization"] = "Bearer " + token;
// get current user email
axios
.get('<azureapi-endpoint>' + '/GetCurrentUserEmail')
.then(response => { })
.catch(err => { })
.finally(() => {
});
})
.catch(err => { })
.finally(() => { });
return true;
}
else {
return false;
}
},
}
But after login I get the error below:
Access to XMLHttpRequest at 'https://login.windows.net/common/oauth2/authorize?response_type=code+id_token&redirect_uri=<encoded-stuff>' (redirected from '<my-azure-api-endpoint>') from origin 'http://localhost:8080' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource.
Also, the token that I get seems to be invalid, as I get 401 errors when trying to call the API using it. Checking the token against https://jwt.io/ gives an invalid signature.
I really appreciate anyone's input, as I've already spent a good few days on this and haven't got anywhere yet.
I'm not sure if this is your issue. However, for msal.js there is no tenantId parameter in the config; it's supposed to be authority. Here is a sample for the Graph API using msal.js:
https://github.com/Azure-Samples/active-directory-javascript-graphapi-v2
Specifically, the config is here: https://github.com/Azure-Samples/active-directory-javascript-graphapi-v2/blob/quickstart/JavaScriptSPA/authConfig.js
As per https://learn.microsoft.com/en-us/azure/active-directory/develop/msal-js-initializing-client-applications, it is supposed to hit login.microsoftonline.com, not login.windows.net.
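For comparison, a minimal sketch of the config shape those samples use (the tenant, client id and redirect URI below are placeholders, not values from your app):
const msalConfig = {
  auth: {
    clientId: '<myclientid>',
    // authority replaces tenantId and points at login.microsoftonline.com
    authority: 'https://login.microsoftonline.com/<mytenant>.onmicrosoft.com',
    redirectUri: '<redirecturi>',
  },
  cache: {
    cacheLocation: 'localStorage',
  }
};
// msal.js 1.x takes the whole config object rather than positional arguments
const userAgentApplication = new MSAL.UserAgentApplication(msalConfig);
userAgentApplication.handleRedirectCallback((error, response) => {
  // called after loginRedirect / acquireTokenRedirect round-trips
});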
I need help getting my Firebase Apollo/GraphQL Cloud Function to authenticate and receive query requests.
I implemented an Apollo/GraphQL server as a Cloud Function in
Firebase/Firestore using this repository from this post.
I set permissions for the cloud function to
allAuthenticatedUsers and I am using Firebase Phone
Authentication to authenticate.
I used code from this stackoverflow answer to help structure the
authentication portion not included in the initial repository.
The Apollo/GraphQL function works fine (tested with playground) when permissions are set to allUsers. After setting permissions to allAuthenticatedUsers and attempting to send authenticated queries I am receiving the following error response:
Bearer error="invalid_token" error_description="The access token could not be verified"
I believe I am making a mistake with the request sent by the client, and/or with how the ApolloServer handles verification and its "context". I have confirmed the initial user token is correct. My current theory is that I am sending the wrong header, or messing up the syntax somehow at either the client or server level.
To explain what I believe the appropriate flow of the request should be:
Token generated in client
Query sent from client with token as header
ApolloServer cloud function receives request
Token is verified by Firebase, provides new verified header token
Server accepts query with new verified header token and returns data
If anyone can explain how to send valid authenticated client queries to a Firebase Apollo/GraphQL Cloud Function the help would be greatly appreciated. Code for server and client below.
Server.js (ApolloServer)
/* Assume proper imports */
/* Initialize Firebase Admin SDK */
admin.initializeApp({
credential: admin.credential.cert(serviceAccount),
databaseURL: "[db-url]",
});
/* Async verification with user token */
const verify = async (idToken) => {
var newToken = idToken.replace("Bearer ", "");
let header = await admin.auth().verifyIdToken(newToken)
.then(function(decodedToken) {
let uid = decodedToken.uid;
// Not sure if I should be using the .uid from above as the token?
// Also, not sure if returning the below object is acceptable, or
// if this is even the correct header to send to firebase from Apollo
return {
"Authorization": `Bearer ${decodedToken}`
}
}).catch(function(error) {
// Handle error
return null
});
return header
}
/* Server */
function gqlServer() {
const app = express();
const apolloServer = new ApolloServer({
typeDefs: schema,
resolvers,
context: async ({ req, res }) => {
const verified = await verify(req.headers.Authorization)
console.log('log verified', verified)
return {
headers: verified ? verified: '',
req,
res,
}
},
// Enable graphiql gui
introspection: true,
playground: true
});
apolloServer.applyMiddleware({app, path: '/', cors: true});
return app;
}
export default gqlServer;
Client.js (ApolloClient)
Client query constructed using these instructions.
/* Assume appropriate imports */
/* React Native firebase auth */
firebase.auth().onAuthStateChanged(async (user) => {
const userToken = await user.getIdToken();
/* Client creation */
const client = new ApolloClient({
uri: '[Firebase Cloud Function URL]',
headers: {
Authorization: userToken ? `Bearer ${userToken}` : ''
},
cache: new InMemoryCache(),
});
/* Query test */
client.query({
query: gql`
{
hello
}
`
}).then(
(result) => console.log('log query result', result)
).catch(
(error) => console.log('query error', error)
)
})
UPDATE 05/03/20
I may have found the source of the error. I won't post an answer until I confirm, but here's the update. Looks like allAuthenticatedUsers is a role specific to Google accounts and not Firebase. See this part of the google docs and this stackoverflow answer.
I will do some testing but the solution may be to change the permissions to allUsers which may still require authentication. If I can get it working I will update with an answer.
I was able to get things working. Working requests required the following changes:
Change the cloud function "invoker" role to include allUsers instead of allAuthenticatedUsers. This is because the allUsers role makes the function available to HTTP requests (you can still require authentication through SDK verification).
Adjusting the code for the server and client as shown below. Minor change to header string construction.
Server.js (ApolloServer)
/* Assume proper imports */
/* Initialize Firebase Admin SDK */
admin.initializeApp({
credential: admin.credential.cert(serviceAccount),
databaseURL: "[db-url]",
});
/* Async verification with user token */
const verify = async (idToken) => {
if (idToken) {
var newToken = idToken.replace("Bearer ", "");
// var newToken = idToken
let header = await admin.auth().verifyIdToken(newToken)
.then(function(decodedToken) {
// ...
return {
"Authorization": 'Bearer ' + decodedToken
}
}).catch(function(error) {
// Handle error
return null
});
return header
} else {
throw 'No Access'
}
}
/* Server */
function gqlServer() {
const app = express();
const apolloServer = new ApolloServer({
typeDefs: schema,
resolvers,
context: async ({ req, res }) => {
// headers: req.headers,
const verified = await verify(req.headers.authorization)
console.log('log verified', verified)
return {
headers: verified ? verified: '',
req,
res,
}
},
// Enable graphiql gui
introspection: true,
playground: true
});
apolloServer.applyMiddleware({app, path: '/', cors: true});
return app;
}
export default gqlServer;
Client.js (ApolloClient)
/* Assume appropriate imports */
/* React Native firebase auth */
firebase.auth().onAuthStateChanged(async (user) => {
const userToken = await user.getIdToken();
/* Client creation */
const client = new ApolloClient({
uri: '[Firebase Cloud Function URL]',
headers: {
"Authorization": userToken ? 'Bearer ' + userToken : ''
},
cache: new InMemoryCache(),
});
client.query({
query: gql`
{
hello
}
`
}).then(
(result) => console.log('log query result', result)
).catch(
(error) => console.log('query error', error)
)
})
I have a single-page application hidden behind Auth0 Lock, using @auth0/auth0-spa-js. I would like to test it using Cypress, so I have decided to follow the official Auth0 blog post, as well as Johnny Reilly's blog post.
I am able to successfully retrieve a valid JWT token from Auth0 using the suggested request. I have no idea what to do with it :(
The trouble I am facing is that both of the above approaches rely on the app storing the JWT token locally (either in a cookie or localStorage). @auth0/auth0-spa-js, however, uses a different approach, and I assume all the relevant cookies/localStorage entries are stored on Auth0 domains.
Do you have any idea if there is a way to get around it?
There is a similar issue reported here, raised in July 2018, which does not really provide any solution.
I found a resolved issue on the @auth0/auth0-spa-js GitHub. The approach suggested by cwmrowe seems to be working.
The solution is to mock the response of the oauth/token endpoint with a token generated on the e2e test side.
The approach seems to be working for us.
I am copying over the sample code cwmrowe has provided:
Cypress.Commands.add(
'login',
(username, password, appState = { target: '/' }) => {
cy.log(`Logging in as ${username}`);
const options = {
method: 'POST',
url: Cypress.env('Auth0TokenUrl'),
body: {
grant_type: 'password',
username,
password,
audience: Cypress.env('Auth0Audience'),
scope: 'openid profile email',
client_id: Cypress.env('Auth0ClientId'),
client_secret: Cypress.env('Auth0ClientSecret')
}
};
cy.request(options).then(({ body }) => {
const { access_token, expires_in, id_token } = body;
cy.server();
// intercept Auth0 request for token and return what we have
cy.route({
url: 'oauth/token',
method: 'POST',
response: {
access_token,
expires_in,
id_token,
token_type: 'Bearer'
}
});
// Auth0 SPA SDK will check for value in cookie to get appState
// and validate nonce (which has been removed for simplicity)
const stateId = 'test';
const encodedAppState = encodeURI(JSON.stringify(appState));
cy.setCookie(
`a0.spajs.txs.${stateId}`,
`{%22appState%22:${encodedAppState}%2C%22scope%22:%22openid%20profile%20email%22%2C%22audience%22:%22default%22}`
);
const callbackUrl = `/auth/callback?code=test-code&state=${stateId}`;
return cy.visit(callbackUrl);
});
}
);
declare namespace Cypress {
interface Chainable<Subject> {
login(
username: string,
password: string,
appState?: any
): Chainable<Subject>;
}
}
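A spec can then call the custom command directly. A hypothetical example (the env variable names, route and selector are placeholders, not taken from the posts above):
// hypothetical spec using the cy.login() command defined above
describe('Dashboard', () => {
  beforeEach(() => {
    cy.login(
      Cypress.env('Auth0Username'),
      Cypress.env('Auth0Password'),
      { target: '/dashboard' }
    );
  });

  it('shows the signed-in user', () => {
    cy.contains('Log out').should('be.visible');
  });
});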
Whilst it's not recommended to use the UI to log in, I do this myself once prior to all tests and then use silent auth for the rest: cy.visit("/") silently auths and allows access to the app.
integration/app.js
describe("App", () => {
before(() => {
Cypress.config("baseUrl", "http://localhost:3000");
cy.login();
});
/** Uses silent auth for successive tests */
beforeEach(() => {
cy.restoreLocalStorage();
});
afterEach(() => {
cy.saveLocalStorage();
});
/** tests */
});
support/commands.js
/**
* Auth0 login
* https://github.com/cypress-io/cypress/issues/461#issuecomment-392070888
*
* Allows silent auth login between tests
*/
let LOCAL_STORAGE_MEMORY = {};
Cypress.Commands.add("saveLocalStorage", () => {
Object.keys(localStorage).forEach(key => {
LOCAL_STORAGE_MEMORY[key] = localStorage[key];
});
});
Cypress.Commands.add("restoreLocalStorage", () => {
Object.keys(LOCAL_STORAGE_MEMORY).forEach(key => {
localStorage.setItem(key, LOCAL_STORAGE_MEMORY[key]);
});
});
Cypress.Commands.add("clearLocalStorage", () => {
LOCAL_STORAGE_MEMORY = {};
});
For those who have issues with Google Sign-In for Cypress, look at this plugin: https://github.com/lirantal/cypress-social-logins/
it('Login through Google', () => {
const username = Cypress.env('googleSocialLoginUsername')
const password = Cypress.env('googleSocialLoginPassword')
const loginUrl = Cypress.env('loginUrl')
const cookieName = Cypress.env('cookieName')
const socialLoginOptions = {
username,
password,
loginUrl,
headless: false,
isPopup: true,
logs: false,
loginSelector: 'a[href="/auth/auth0/google-oauth2"]',
postLoginSelector: '.account-panel'
}
return cy.task('GoogleSocialLogin', socialLoginOptions).then(({cookies}) => {
cy.clearCookies()
const cookie = cookies.filter(cookie => cookie.name === cookieName).pop()
if (cookie) {
cy.setCookie(cookie.name, cookie.value, {
domain: cookie.domain,
expiry: cookie.expires,
httpOnly: cookie.httpOnly,
path: cookie.path,
secure: cookie.secure
})
Cypress.Cookies.defaults({
whitelist: cookieName
})
}
})
});
I'm a beginner with Ember and service workers. My goal is to set up a simple Ember app that works offline. I basically have a list of elements that are available through an API (GET/POST).
When I'm online, everything works as expected. I can GET the list and POST new items. When I'm offline the app works, but network requests are not executed once I go back online. All network requests are actually executed while I'm offline (and obviously fail). I would expect that the service worker caches the network requests and executes them only once I'm back online. Is this wrong?
Here some information about my setup:
Ember version:
ember-cli: 2.13.1
node: 7.10.0
os: darwin x64
Service Worker Add-ons (as listed in app/package.json):
"ember-service-worker": "^0.6.6",
"ember-service-worker-asset-cache": "^0.6.1",
"ember-service-worker-cache-fallback": "^0.6.1",
"ember-service-worker-index": "^0.6.1",
I should probably also mention that I use ember-django-adapter in version 1.1.3.
This is my app/ember-cli-build.js
var EmberApp = require('ember-cli/lib/broccoli/ember-app');
module.exports = function(defaults) {
var app = new EmberApp(defaults, {
'esw-cache-fallback': {
// RegExp patterns specifying which URLs to cache.
patterns: [
'http://localhost:8000/api/v1/(.*)',
],
// changing this version number will bust the cache
version: '1'
}
});
return app.toTree();
};
My network requests (GET/POST) go to http://localhost:8000/api/v1/properties/.
This is my app/adapters/applications.js
import DS from 'ember-data';
import DataAdapterMixin from 'ember-simple-auth/mixins/data-adapter-mixin';
export default DS.JSONAPIAdapter.extend(DataAdapterMixin, {
namespace: 'api/v1',
host: 'http://localhost:8000',
authorizer: 'authorizer:token',
headers: { 'Accept': 'application/json', 'Content-Type': 'application/json' },
buildURL: function(type, id, record) {
return this._super(type, id, record) + '/';
}
});
The service worker registers when I open the app:
(function () {
'use strict';
self.addEventListener('install', function installEventListenerCallback(event) {
return self.skipWaiting();
});
self.addEventListener('activate', function installEventListenerCallback(event) {
return self.clients.claim();
});
const FILES = ['assets/connect.css', 'assets/connect.js', 'assets/connect.map', 'assets/failed.png', 'assets/passed.png', 'assets/test-support.css', 'assets/test-support.js', 'assets/test-support.map', 'assets/tests.js', 'assets/tests.map', 'assets/vendor.css', 'assets/vendor.js', 'assets/vendor.map'];
const PREPEND = undefined;
const VERSION$1 = '1';
const REQUEST_MODE = 'cors';
/*
* Deletes all caches that start with the `prefix`, except for the
* cache defined by `currentCache`
*/
var cleanupCaches = (prefix, currentCache) => {
return caches.keys().then((cacheNames) => {
cacheNames.forEach((cacheName) => {
let isOwnCache = cacheName.indexOf(prefix) === 0;
let isNotCurrentCache = cacheName !== currentCache;
if (isOwnCache && isNotCurrentCache) {
caches.delete(cacheName);
}
});
});
};
const CACHE_KEY_PREFIX = 'esw-asset-cache';
const CACHE_NAME = `${CACHE_KEY_PREFIX}-${VERSION$1}`;
const CACHE_URLS = FILES.map((file) => {
return new URL(file, (PREPEND || self.location)).toString();
});
/*
* Removes all cached requests from the cache that aren't in the `CACHE_URLS`
* list.
*/
const PRUNE_CURRENT_CACHE = () => {
caches.open(CACHE_NAME).then((cache) => {
return cache.keys().then((keys) => {
keys.forEach((request) => {
if (CACHE_URLS.indexOf(request.url) === -1) {
cache.delete(request);
}
});
});
});
};
self.addEventListener('install', (event) => {
event.waitUntil(
caches
.open(CACHE_NAME)
.then((cache) => {
return Promise.all(CACHE_URLS.map((url) => {
let request = new Request(url, { mode: REQUEST_MODE });
return fetch(request)
.then((response) => {
if (response.status >= 400) {
throw new Error(`Request for ${url} failed with status ${response.statusText}`);
}
return cache.put(url, response);
})
.catch(function(error) {
console.error(`Not caching ${url} due to ${error}`);
});
}));
})
);
});
self.addEventListener('activate', (event) => {
event.waitUntil(
Promise.all([
cleanupCaches(CACHE_KEY_PREFIX, CACHE_NAME),
PRUNE_CURRENT_CACHE()
])
);
});
self.addEventListener('fetch', (event) => {
let isGETRequest = event.request.method === 'GET';
let shouldRespond = CACHE_URLS.indexOf(event.request.url) !== -1;
if (isGETRequest && shouldRespond) {
event.respondWith(
caches.match(event.request, { cacheName: CACHE_NAME })
.then((response) => {
if (response) {
return response;
}
return fetch(event.request);
})
);
}
});
const VERSION$2 = '1';
const PATTERNS = ['http://localhost:8000/api/v1/(.*)'];
/**
* Create an absolute URL, allowing regex expressions to pass
*
* @param {string} url
* @param {string|object} baseUrl
* @public
*/
function createNormalizedUrl(url, baseUrl = self.location) {
return decodeURI(new URL(encodeURI(url), baseUrl).toString());
}
/**
* Create an (absolute) URL Regex from a given string
*
* @param {string} url
* @returns {RegExp}
* @public
*/
function createUrlRegEx(url) {
let normalized = createNormalizedUrl(url);
return new RegExp(`^${normalized}$`);
}
/**
* Check if given URL matches any pattern
*
* @param {string} url
* @param {array} patterns
* @returns {boolean}
* @public
*/
function urlMatchesAnyPattern(url, patterns) {
return !!patterns.find((pattern) => pattern.test(decodeURI(url)));
}
const CACHE_KEY_PREFIX$1 = 'esw-cache-fallback';
const CACHE_NAME$1 = `${CACHE_KEY_PREFIX$1}-${VERSION$2}`;
const PATTERN_REGEX = PATTERNS.map(createUrlRegEx);
self.addEventListener('fetch', (event) => {
let request = event.request;
if (request.method !== 'GET' || !/^https?/.test(request.url)) {
return;
}
if (urlMatchesAnyPattern(request.url, PATTERN_REGEX)) {
event.respondWith(
caches.open(CACHE_NAME$1).then((cache) => {
return fetch(request)
.then((response) => {
cache.put(request, response.clone());
return response;
})
.catch(() => caches.match(event.request));
})
);
}
});
self.addEventListener('activate', (event) => {
event.waitUntil(cleanupCaches(CACHE_KEY_PREFIX$1, CACHE_NAME$1));
});
const VERSION$3 = '1';
const INDEX_HTML_PATH = 'index.html';
const CACHE_KEY_PREFIX$2 = 'esw-index';
const CACHE_NAME$2 = `${CACHE_KEY_PREFIX$2}-${VERSION$3}`;
const INDEX_HTML_URL = new URL(INDEX_HTML_PATH, self.location).toString();
self.addEventListener('install', (event) => {
event.waitUntil(
fetch(INDEX_HTML_URL, { credentials: 'include' }).then((response) => {
return caches
.open(CACHE_NAME$2)
.then((cache) => cache.put(INDEX_HTML_URL, response));
})
);
});
self.addEventListener('activate', (event) => {
event.waitUntil(cleanupCaches(CACHE_KEY_PREFIX$2, CACHE_NAME$2));
});
self.addEventListener('fetch', (event) => {
let request = event.request;
let isGETRequest = request.method === 'GET';
let isHTMLRequest = request.headers.get('accept').indexOf('text/html') !== -1;
let isLocal = new URL(request.url).origin === location.origin
if (isGETRequest && isHTMLRequest && isLocal) {
event.respondWith(
caches.match(INDEX_HTML_URL, { cacheName: CACHE_NAME$2 })
);
}
});
}());
This is how network requests appear in Chrome (screenshot: network requests while offline).
I assume the problem is in the configuration of ember-service-worker-cache-fallback. But I'm not quite sure about that. Any idea or link to a working example is welcome. I didn't find a lot about ember-service-worker-cache-fallback so far.
Thanks!
What you've described is the correct and expected behaviour of ember-service-worker-cache-fallback, that is: first try to fetch from the network, and only if that fails fall back to the cached version in the service worker.
I believe what you are looking for is some kind of queuing mechanism for failed requests. This is not covered in the scope of ember-service-worker-cache-fallback.
Fear not though, I had similar ambitions and came up with my own solution called ember-service-worker-enqueue. It's an ember-service-worker plugin that queues only failed mutation requests (POST, PUT, PATCH, DELETE) using Mozilla's localForage and then sends them when the network is stable.
It's perfect for protecting your Ember app against network failures or server errors that respond with 5xx status codes.
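The core idea, reduced to a hypothetical sketch (not the plugin's actual code; a real implementation persists the queue, e.g. with localForage, rather than keeping it in memory):
// queue failed mutation requests and replay them later
const MUTATION_METHODS = ['POST', 'PUT', 'PATCH', 'DELETE'];
let queue = []; // in-memory only for this sketch

self.addEventListener('fetch', (event) => {
  if (MUTATION_METHODS.indexOf(event.request.method) === -1) {
    return;
  }
  event.respondWith(
    fetch(event.request.clone()).catch(() =>
      // network failed: remember the request and answer with a synthetic 202
      event.request.clone().text().then((body) => {
        queue.push({
          url: event.request.url,
          method: event.request.method,
          headers: [...event.request.headers],
          body,
        });
        return new Response(JSON.stringify({ queued: true }), {
          status: 202,
          headers: { 'Content-Type': 'application/json' },
        });
      })
    )
  );
});

// replay the queue when the page tells the worker it is back online
self.addEventListener('message', (event) => {
  if (event.data !== 'replay-queue') {
    return;
  }
  const pending = queue;
  queue = [];
  pending.forEach((entry) => {
    fetch(entry.url, { method: entry.method, headers: entry.headers, body: entry.body })
      .catch(() => queue.push(entry)); // keep it for the next attempt
  });
});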
NOTE: In my experience, service workers are best treated on a per-use-case basis, so don't blindly install my plugin and expect things to work the same way for you; rather, go through the heavily commented code (< 200 lines), fork the plugin and adjust it to fit your needs. Enjoy!
P.S.: I'm also working on another one called ember-service-worker-push-notifications. It's still early days, but it will follow the same heavy commenting for anyone looking to gain from it. Cheers!