How to import a response-time interceptor into Axios - vue.js

All my imported interceptors work fine in Axios, but the response-time interceptor doesn't.
NON-WORKING EXAMPLE
Assume I want to measure response times by importing interceptors into Axios. I'd like to define them in a separate file, logTimeInterceptor.js:
export const responseTimeInterceptor = (response) => {
  // to avoid overwriting if another interceptor
  // already defined the same object (meta)
  response.meta = response.meta || {}
  response.meta.requestStartedAt = new Date().getTime()
  return response
}

export const logResponseTimeInterceptor = ((response) => {
    // Logs response time - only for successful requests
    console.log(
      `📡 API | Execution time for: ${response.config.url} - ${
        new Date().getTime() - response.config.meta.requestStartedAt
      } ms`
    )
    return response
  },
  // Support for failed requests
  // Handle 4xx & 5xx responses
  (response) => {
    console.error(
      `📡 API | Execution time for: ${response.config.url} - ${
        new Date().getTime() - response.config.meta.requestStartedAt
      } ms`
    )
    throw response
  }
)
Then I import and use them in httpClient.js:
import axios from 'axios'
import { ApiSettings } from '@/api/config'
import {
  responseTimeInterceptor,
  logResponseTimeInterceptor
} from '@/api/interceptors/logTimeInterceptor'

// ============== A P I  C L I E N T =============
const httpClient = axios.create(ApiSettings)

// ============== A P I  I N T E R C E P T O R S =============
httpClient.interceptors.request.use(authInterceptor)
httpClient.interceptors.response.use(
  responseTimeInterceptor,
  logResponseTimeInterceptor
)

export default httpClient
But these interceptors don't work when imported together; it seems they have to stay inline in the httpClient file. Please advise. It only breaks with the response-time logging interceptor; the others work fine.
UPDATE: Following IVO GELOV's suggestion I tried with one argument, but still with no success. I also moved responseTimeInterceptor to the request side; the refactored code looks as follows:
// httpClient.js
httpClient.interceptors.request.use(responseTimeInterceptor)
httpClient.interceptors.response.use(logResponseTimeInterceptor)
httpClient.interceptors.response.use(logResponseErrorTimeInterceptor)
// responseTimeInterceptor.js
export const responseTimeInterceptor = (response) => {
  // to avoid overwriting if another interceptor
  // already defined the same object (meta)
  response.meta = response.meta || {}
  response.meta.requestStartedAt = new Date().getTime()
  return response
}

export const logResponseTimeInterceptor = (response) => {
  // Logs response time - only for successful requests
  console.log(
    `📡 API | Execution time for: ${response.config.url} - ${
      new Date().getTime() - response.config.meta.requestStartedAt
    } ms`
  )
  return response
}

export const logResponseErrorTimeInterceptor = (response) => {
  // Support for failed requests
  // Handle 4xx & 5xx responses
  console.error(
    `📡 API | Execution time for: ${response.config.url} - ${
      new Date().getTime() - response.config.meta.requestStartedAt
    } ms`
  )
  throw response
}
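For reference, a working split can be sketched as follows (this is my sketch, not the asker's final code; function names are mine). The key detail is that a request interceptor receives the *request config*, not a response, so the timestamp should be written to `config.meta` and read back via `response.config.meta` (or `error.config.meta`) on the response side:

```javascript
// Request interceptor: the argument is the axios request config.
const markRequestStart = (config) => {
  // avoid overwriting meta if another interceptor already created it
  config.meta = config.meta || {}
  config.meta.requestStartedAt = new Date().getTime()
  return config
}

// Response success handler: the original config is on response.config.
const logResponseTime = (response) => {
  console.log(
    `📡 API | ${response.config.url} took ${
      new Date().getTime() - response.config.meta.requestStartedAt
    } ms`
  )
  return response
}

// Response error handler: axios passes an error whose .config is the request config.
const logResponseErrorTime = (error) => {
  if (error.config && error.config.meta) {
    console.error(
      `📡 API | ${error.config.url} failed after ${
        new Date().getTime() - error.config.meta.requestStartedAt
      } ms`
    )
  }
  return Promise.reject(error)
}

// Registration on a hypothetical httpClient instance:
// httpClient.interceptors.request.use(markRequestStart)
// httpClient.interceptors.response.use(logResponseTime, logResponseErrorTime)
```

This follows the standard axios interceptor API: one request handler, and a success/error pair registered together on the response side.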

Related

How to debug XHR in an Angular 10 application

By default Angular 10 uses an XHR backend, but there is a JSONP backend available.
I am receiving some odd results from the HttpClient, which differ from the data received when using other tools to interact with the backend API.
I want to be able to see what is happening in the raw XMLHttpRequest traffic. I suspect a HttpInterceptor is doing something funky, and when I add my own HttpInterceptor the payload has already been altered from what I expect to see. Watching the XHR will "prove" this theory.
The approach I have taken to adding some logging is to copy the XHR backend from https://github.com/angular/angular/blob/master/packages/common/http/src/xhr.ts into a subclass. Small changes were needed because not everything is exported from '@angular/common/http'.
(Note: the following is MIT licensed.)
import { HttpXhrBackend, HttpHeaders, HttpRequest, HttpResponse, HttpBackend, HttpEvent } from '@angular/common/http';
import { Injectable } from '@angular/core';
import { Observable, Observer } from 'rxjs';
import { HttpDownloadProgressEvent, HttpErrorResponse, HttpEventType, HttpHeaderResponse, HttpUploadProgressEvent } from '@angular/common/http';

const XSSI_PREFIX = /^\)\]\}',?\n/;

/**
 * Determine an appropriate URL for the response, by checking either
 * XMLHttpRequest.responseURL or the X-Request-URL header.
 */
function getResponseUrl(xhr: any): string|null {
  if ('responseURL' in xhr && xhr.responseURL) {
    return xhr.responseURL;
  }
  if (/^X-Request-URL:/m.test(xhr.getAllResponseHeaders())) {
    return xhr.getResponseHeader('X-Request-URL');
  }
  return null;
}
@Injectable()
export class CustomHttpBackend extends HttpXhrBackend {
  handle(req: HttpRequest<any>): Observable<HttpEvent<any>> {
    console.log(req);
    // Quick check to give a better error message when a user attempts to use
    // HttpClient.jsonp() without installing the HttpClientJsonpModule
    if (req.method === 'JSONP') {
      throw new Error(
          `Attempted to construct Jsonp request without HttpClientJsonpModule installed.`);
    }
    // Everything happens on Observable subscription.
    return new Observable((observer: Observer<HttpEvent<any>>) => {
      // Start by setting up the XHR object with request method, URL, and withCredentials flag.
      const xhr = new XMLHttpRequest();
      xhr.open(req.method, req.urlWithParams);
      if (!!req.withCredentials) {
        xhr.withCredentials = true;
      }
      // Add all the requested headers.
      Array.from(req.headers.keys())
          .forEach(key => xhr.setRequestHeader(key, req.headers.get(key)!));
      // Add an Accept header if one isn't present already.
      if (!req.headers.has('Accept')) {
        xhr.setRequestHeader('Accept', 'application/json, text/plain, */*');
      }
      // Auto-detect the Content-Type header if one isn't present already.
      if (!req.headers.has('Content-Type')) {
        const detectedType = req.detectContentTypeHeader();
        // Sometimes Content-Type detection fails.
        if (detectedType !== null) {
          xhr.setRequestHeader('Content-Type', detectedType);
        }
      }
      // Set the responseType if one was requested.
      if (req.responseType) {
        const responseType = req.responseType.toLowerCase();
        // JSON responses need to be processed as text. This is because if the server
        // returns an XSSI-prefixed JSON response, the browser will fail to parse it,
        // xhr.response will be null, and xhr.responseText cannot be accessed to
        // retrieve the prefixed JSON data in order to strip the prefix. Thus, all JSON
        // is parsed by first requesting text and then applying JSON.parse.
        xhr.responseType = ((responseType !== 'json') ? responseType : 'text') as any;
      }
      // Serialize the request body if one is present. If not, this will be set to null.
      const reqBody = req.serializeBody();
      // If progress events are enabled, response headers will be delivered
      // in two events - the HttpHeaderResponse event and the full HttpResponse
      // event. However, since response headers don't change in between these
      // two events, it doesn't make sense to parse them twice. So headerResponse
      // caches the data extracted from the response whenever it's first parsed,
      // to ensure parsing isn't duplicated.
      let headerResponse: HttpHeaderResponse|null = null;
      // partialFromXhr extracts the HttpHeaderResponse from the current XMLHttpRequest
      // state, and memoizes it into headerResponse.
      const partialFromXhr = (): HttpHeaderResponse => {
        if (headerResponse !== null) {
          return headerResponse;
        }
        // Read status and normalize an IE9 bug (https://bugs.jquery.com/ticket/1450).
        const status: number = xhr.status === 1223 ? 204 : xhr.status;
        const statusText = xhr.statusText || 'OK';
        // Parse headers from XMLHttpRequest - this step is lazy.
        const headers = new HttpHeaders(xhr.getAllResponseHeaders());
        // Read the response URL from the XMLHttpResponse instance and fall back on the
        // request URL.
        const url = getResponseUrl(xhr) || req.url;
        // Construct the HttpHeaderResponse and memoize it.
        headerResponse = new HttpHeaderResponse({headers, status, statusText, url});
        return headerResponse;
      };
      // Next, a few closures are defined for the various events which XMLHttpRequest can
      // emit. This allows them to be unregistered as event listeners later.
      // First up is the load event, which represents a response being fully available.
      const onLoad = () => {
        // Read response state from the memoized partial data.
        let {headers, status, statusText, url} = partialFromXhr();
        // The body will be read out if present.
        let body: any|null = null;
        if (status !== 204) {
          // Use XMLHttpRequest.response if set, responseText otherwise.
          body = (typeof xhr.response === 'undefined') ? xhr.responseText : xhr.response;
        }
        console.log(body);
        // Normalize another potential bug (this one comes from CORS).
        if (status === 0) {
          status = !!body ? 200 : 0;
        }
        // ok determines whether the response will be transmitted on the event or
        // error channel. Unsuccessful status codes (not 2xx) will always be errors,
        // but a successful status code can still result in an error if the user
        // asked for JSON data and the body cannot be parsed as such.
        let ok = status >= 200 && status < 300;
        // Check whether the body needs to be parsed as JSON (in many cases the browser
        // will have done that already).
        if (req.responseType === 'json' && typeof body === 'string') {
          // Save the original body, before attempting XSSI prefix stripping.
          const originalBody = body;
          body = body.replace(XSSI_PREFIX, '');
          try {
            // Attempt the parse. If it fails, a parse error should be delivered to the user.
            body = body !== '' ? JSON.parse(body) : null;
          } catch (error) {
            // Since the JSON.parse failed, it's reasonable to assume this might not have been a
            // JSON response. Restore the original body (including any XSSI prefix) to deliver
            // a better error response.
            body = originalBody;
            // If this was an error request to begin with, leave it as a string, it probably
            // just isn't JSON. Otherwise, deliver the parsing error to the user.
            if (ok) {
              // Even though the response status was 2xx, this is still an error.
              ok = false;
              // The parse error contains the text of the body that failed to parse.
              body = {error, text: body}; // as HttpJsonParseError
            }
          }
        }
        if (ok) {
          // A successful response is delivered on the event stream.
          observer.next(new HttpResponse({
            body,
            headers,
            status,
            statusText,
            url: url || undefined,
          }));
          // The full body has been received and delivered, no further events
          // are possible. This request is complete.
          observer.complete();
        } else {
          // An unsuccessful request is delivered on the error channel.
          observer.error(new HttpErrorResponse({
            // The error in this case is the response body (error from the server).
            error: body,
            headers,
            status,
            statusText,
            url: url || undefined,
          }));
        }
      };
      // The onError callback is called when something goes wrong at the network level.
      // Connection timeout, DNS error, offline, etc. These are actual errors, and are
      // transmitted on the error channel.
      const onError = (error: ProgressEvent) => {
        const {url} = partialFromXhr();
        const res = new HttpErrorResponse({
          error,
          status: xhr.status || 0,
          statusText: xhr.statusText || 'Unknown Error',
          url: url || undefined,
        });
        observer.error(res);
      };
      // The sentHeaders flag tracks whether the HttpResponseHeaders event
      // has been sent on the stream. This is necessary to track if progress
      // is enabled since the event will be sent on only the first download
      // progress event.
      let sentHeaders = false;
      // The download progress event handler, which is only registered if
      // progress events are enabled.
      const onDownProgress = (event: ProgressEvent) => {
        // Send the HttpResponseHeaders event if it hasn't been sent already.
        if (!sentHeaders) {
          observer.next(partialFromXhr());
          sentHeaders = true;
        }
        // Start building the download progress event to deliver on the response
        // event stream.
        let progressEvent: HttpDownloadProgressEvent = {
          type: HttpEventType.DownloadProgress,
          loaded: event.loaded,
        };
        // Set the total number of bytes in the event if it's available.
        if (event.lengthComputable) {
          progressEvent.total = event.total;
        }
        // If the request was for text content and a partial response is
        // available on XMLHttpRequest, include it in the progress event
        // to allow for streaming reads.
        if (req.responseType === 'text' && !!xhr.responseText) {
          progressEvent.partialText = xhr.responseText;
        }
        // Finally, fire the event.
        observer.next(progressEvent);
      };
      // The upload progress event handler, which is only registered if
      // progress events are enabled.
      const onUpProgress = (event: ProgressEvent) => {
        // Upload progress events are simpler. Begin building the progress
        // event.
        let progress: HttpUploadProgressEvent = {
          type: HttpEventType.UploadProgress,
          loaded: event.loaded,
        };
        // If the total number of bytes being uploaded is available, include
        // it.
        if (event.lengthComputable) {
          progress.total = event.total;
        }
        // Send the event.
        observer.next(progress);
      };
      // By default, register for load and error events.
      xhr.addEventListener('load', onLoad);
      xhr.addEventListener('error', onError);
      xhr.addEventListener('timeout', onError);
      xhr.addEventListener('abort', onError);
      // Progress events are only enabled if requested.
      if (req.reportProgress) {
        // Download progress is always enabled if requested.
        xhr.addEventListener('progress', onDownProgress);
        // Upload progress depends on whether there is a body to upload.
        if (reqBody !== null && xhr.upload) {
          xhr.upload.addEventListener('progress', onUpProgress);
        }
      }
      // Fire the request, and notify the event stream that it was fired.
      xhr.send(reqBody!);
      observer.next({type: HttpEventType.Sent});
      // This is the return from the Observable function, which is the
      // request cancellation handler.
      return () => {
        // On a cancellation, remove all registered event listeners.
        xhr.removeEventListener('error', onError);
        xhr.removeEventListener('abort', onError);
        xhr.removeEventListener('load', onLoad);
        xhr.removeEventListener('timeout', onError);
        if (req.reportProgress) {
          xhr.removeEventListener('progress', onDownProgress);
          if (reqBody !== null && xhr.upload) {
            xhr.upload.removeEventListener('progress', onUpProgress);
          }
        }
        // Finally, abort the in-flight request.
        if (xhr.readyState !== xhr.DONE) {
          xhr.abort();
        }
      };
    });
  }
}
Then in app.module.ts, add the custom provider.
import { HttpBackend } from '@angular/common/http';
import { CustomHttpBackend } from './path/to/custom-http-backend';

providers: [
  ...,
  {
    provide: HttpBackend,
    useClass: CustomHttpBackend,
  },
],
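The JSON-handling branch of onLoad (XSSI-prefix stripping followed by a parse fallback) can be exercised in isolation. This is a sketch in plain JavaScript; parseJsonBody is my name, not Angular's:

```javascript
// Mirrors the JSON branch of onLoad: strip an XSSI prefix, try to parse,
// and fall back to the original text when parsing fails.
const XSSI_PREFIX = /^\)\]\}',?\n/;

function parseJsonBody(text, ok) {
  const stripped = text.replace(XSSI_PREFIX, '');
  try {
    return { ok, body: stripped !== '' ? JSON.parse(stripped) : null };
  } catch (error) {
    // Parse failed: keep the original body. A 2xx response with an
    // unparsable body is downgraded to an error, as in onLoad.
    if (ok) {
      return { ok: false, body: { error, text } };
    }
    return { ok, body: text };
  }
}
```

This is the behavior the backend relies on: JSON is always requested as text, so the XSSI prefix can be stripped before JSON.parse runs.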

Mocha not exiting even though I call server.close()

I have a pretty standard Express 4 server.
When I run Mocha, it doesn’t exit when the following middleware is used.
app.use("/v1/healthcheck", routes.healthcheck)
"use strict"

import express, { Request, Response, Router } from "express"
// import bearer from "../models/bearer"

const router: Router = express.Router()

// GET /v1/healthcheck
router.get("", async (req: Request, res: Response) => {
  try {
    // await bearer.findOne()
    res.empty()
  } catch (error) {
    res.internalServerError(error)
  }
})

export default router
res.empty = () => {
  res.sendStatus(204)
}
If I comment out that middleware, the tests fail (404) but Mocha exits.
Does anyone know what is going on here?
Edit
Using why-is-node-running (as suggested here), it apparently has something to do with the underlying MongoDB DNS polling (DNSCHANNEL).
# DNSCHANNEL
/Users/sunknudsen/server/node_modules/mongodb/lib/core/sdam/srv_polling.js:5 - const dns = require('dns');
/Users/sunknudsen/server/node_modules/mongodb/lib/core/sdam/topology.js:22 - const SrvPoller = require('./srv_polling').SrvPoller;
/Users/sunknudsen/server/node_modules/mongodb/lib/core/index.js:36 - Topology: require('./sdam/topology').Topology,
/Users/sunknudsen/server/node_modules/mongodb/index.js:4 - const core = require('./lib/core');
/Users/sunknudsen/server/node_modules/mongoose/lib/drivers/node-mongodb-native/binary.js:8 - const Binary = require('mongodb').Binary;
/Users/sunknudsen/server/node_modules/mongoose/lib/drivers/node-mongodb-native/index.js:7 - exports.Binary = require('./binary');
/Users/sunknudsen/server/node_modules/mongoose/lib/index.js:15 - require('./driver').set(require('./drivers/node-mongodb-native'));
/Users/sunknudsen/server/node_modules/mongoose/index.js:9 - module.exports = require('./lib/');
/Users/sunknudsen/server/build/server.js:7 - const mongoose_1 = __importDefault(require("mongoose"));
/Users/sunknudsen/server/build/test/index.js:11 - const server_1 = __importDefault(require("../server"));
/Users/sunknudsen/server/node_modules/mocha/lib/esm-utils.js:20 - return require(file);
/Users/sunknudsen/server/node_modules/mocha/lib/esm-utils.js:33 - const result = await exports.requireOrImport(path.resolve(file));
/Users/sunknudsen/server/node_modules/mocha/lib/mocha.js:421 - return esmUtils.loadFilesAsync(
/Users/sunknudsen/server/node_modules/mocha/lib/cli/run-helpers.js:156 - await mocha.loadFilesAsync();
/Users/sunknudsen/server/node_modules/mocha/lib/cli/run-helpers.js:225 - return run(mocha, options, fileCollectParams);
/Users/sunknudsen/server/node_modules/mocha/lib/cli/run.js:366 - await runMocha(mocha, argv);
/Users/sunknudsen/server/node_modules/yargs/lib/command.js:241 - ? innerArgv.then(argv => commandHandler.handler(argv))

How to Instrument a React Web App Client using aws xray

I'm trying to instrument my React web app using https://docs.aws.amazon.com/xray/latest/devguide/scorekeep-client.html
I am using an axios interceptor, but I'm unable to instrument any further. Any ideas?
Here's the axios interceptor code you'll need for X-Ray. Axios does not use the base HTTP library from Node, so you'll need to include this patcher.
I recently wrote a sample app to be published, here is the snippet I used.
Hopefully this helps.
const xray = require('aws-xray-sdk-core');
const segmentUtils = xray.SegmentUtils;

let captureAxios = function (axios) {
  // add a request interceptor on POST
  axios.interceptors.request.use(function (config) {
    var parent = xray.getSegment();
    var subsegment = parent.addNewSubsegment(config.baseURL + config.url.substr(1));
    subsegment.namespace = 'remote';
    let root = parent.segment ? parent.segment : parent;
    let header = 'Root=' + root.trace_id + ';Parent=' + subsegment.id + ';Sampled=' + (!root.notTraced ? '1' : '0');
    config.headers.get = { 'x-amzn-trace-id': header };
    config.headers.post = { 'x-amzn-trace-id': header };
    xray.setSegment(subsegment);
    return config;
  }, function (error) {
    var subsegment = xray.getSegment().addNewSubsegment("Intercept request error");
    subsegment.close(error);
    return Promise.reject(error);
  });

  // Add a response interceptor
  axios.interceptors.response.use(function (response) {
    var subsegment = xray.getSegment();
    const res = { statusCode: response.status, headers: response.headers };
    subsegment.addRemoteRequestData(response.request, res, true);
    subsegment.close();
    return response;
  }, function (error) {
    var subsegment = xray.getSegment();
    subsegment.close(error);
    return Promise.reject(error);
  });
};

module.exports = captureAxios;
Usage
Just pass in an initialized instance of Axios.
For React, you'll have to tell me a bit more about what your setup is and what you're trying to accomplish. X-Ray only cares about the routes in your application - typically interceptors are set up on the routes to collect data and create (and close) the root segment (see the X-Ray SDK for Node Express here). For browser-based integration, we're still discussing possible options from the X-Ray end.
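The trace-header string assembled in the request interceptor can be factored into a small pure helper (a sketch; `buildTraceHeader` is my name, but the field names follow the snippet above):

```javascript
// Builds the x-amzn-trace-id header value from a root segment and a
// subsegment, matching the concatenation in the interceptor above.
function buildTraceHeader(root, subsegment) {
  const sampled = !root.notTraced ? '1' : '0';
  return 'Root=' + root.trace_id + ';Parent=' + subsegment.id + ';Sampled=' + sampled;
}
```

Keeping it pure makes the header format easy to verify without an X-Ray daemon running.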

Storing REST response to indexedDB with Cycle.js

I'm in the middle of learning Cycle.js and ran into a challenge. I have a component that gets a result from an HTTP call, and I'd like to persist this response in IndexedDB. However, I feel that the request for persistence is the responsibility of another component.
The questions I have are:
Is this a use case for a custom driver that persists HTTP responses to IndexedDB?
How does another component access the response stream for a request it did not make?
When I try to select the category from the HTTP source, nothing gets logged to the console. I'm using xstream, so the streams should be hot and I expect debug to output. What's going on here?
Below is my component that makes the HTTP call:
import { Feed } from './feed'

export function RssList ({HTTP, props}, feedAdapter = x => x) {
  const request$ = props.url$
    .map(url => ({
      url: url,
      method: 'GET',
      category: 'rss'
    }))

  const response$ = HTTP
    .select('rss')
    .flatten()
    .map(feedAdapter)

  const vDom$ = response$
    .map(Feed)
    .startWith('')

  return {
    DOM: vDom$,
    HTTP: request$
  }
}
Here is my attempt at accessing the response at the app level:
export function main (sources) {
  const urlSource = url$(sources)
  const rssSink = rss$(sources, urlSource.value)
  const vDom$ = xs.combine(urlSource.DOM, rssSink.DOM)
    .map(([urlInput, rssList]) =>
      <div>
        {urlInput}
        {rssList}
      </div>
    )

  sources.HTTP.select('rss').flatten().debug() // nothing happens here

  return {
    DOM: vDom$,
    HTTP: rssSink.HTTP
  }
}
Selecting a category in the main (the parent) component is the correct approach, and is supported.
The only reason why sources.HTTP.select('rss').flatten().debug() doesn't log anything is because that's not how debug works. It doesn't "subscribe" to the stream and create side effects. debug is essentially like a map operator that uses an identity function (always takes x as input and outputs x), but with a logging operation as a side effect. So you either need to replace .debug() with .addListener({next: x => console.log(x)}) or use the stream that .debug() outputs and hook it into the operator pipeline that goes to sinks. In other words, debug is an in-between logging side effect, not a destination logging side effect.
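The point can be demonstrated with a minimal lazy stream (a sketch, not xstream's real implementation): debug is map with an identity function plus a logging side effect, so nothing happens until something subscribes.

```javascript
const logged = [];

// A producer is just a function that feeds a listener; operators wrap it.
function makeStream(producer) {
  return {
    map: (fn) => makeStream(listener => producer(x => listener(fn(x)))),
    // debug is map with the identity function plus a logging side effect
    debug: () => makeStream(listener => producer(x => { logged.push(x); listener(x); })),
    subscribe: (listener) => producer(listener),
  };
}

const stream = makeStream(listener => listener(42));

stream.debug();                     // lazy: nothing runs, nothing is logged
console.log(logged.length);         // 0

stream.debug().subscribe(x => {});  // subscribing runs the whole pipeline
console.log(logged);                // [ 42 ]
```

The first call builds a new stream and discards it; only the subscribed pipeline ever executes the producer, which is why a dangling .debug() in main logs nothing.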
Question #1: Custom HTTP->IDB Driver: It depends on the nature of the project, for a simple example I used a general CycleJS IDB Driver. See example below or codesandbox.io example.
Question #2: Components Sharing Streams: Since components and main share the same source/sink API you can link the output (sink) of one component to the input (source) of another. See example below or codesandbox.io example.
Question #3: debug and Logging: As the authoritative (literally) André Staltz pointed out, debug needs to be inserted into a completed stream cycle, i.e., an already subscribed/listened stream.
In your example you can put debug in your RssList component:
const response$ = HTTP
  .select('rss')
  .flatten()
  .map(feedAdapter)
  .debug()
OR add a listener to your main example:
sources.HTTP.select('rss').flatten().debug()
  .addListener({next: x => console.log(x)})
OR, what I like to do, is include a log driver:
run(main, {
  DOM: makeDOMDriver('#app'),
  HTTP: makeHTTPDriver(),
  log: log$ => log$.addListener({next: log => console.log(log)}),
})
Then I'll just duplicate a stream and send it to the log sink:
const url$ = props.url
const http$ = url$.map(url => ({url: url, method: 'GET', category: 'rss'}))
const log$ = url$

return {
  DOM: vdom$,
  HTTP: http$,
  log: log$,
}
Here's some example code for sending HTTP response to IndexedDB storage, using two components that share the data and a general IndexedDB driver:
function main(sources) {
  const header$ = xs.of(div('RSS Feed:'))
  const rssSink = RssList(sources) // input HTTP select and props
                                   // output VDOM and data for IDB storage
  const vDom$ = xs.combine(header$, rssSink.DOM) // build VDOM
    .map(([header, rssList]) => div([header, rssList]))
  const idbSink = IdbSink(sources, rssSink.IDB) // output store and put HTTP response
  return {
    DOM: vDom$,
    HTTP: rssSink.HTTP, // send HTTP request
    IDB: idbSink.put,   // send response to IDB store
    log: idbSink.get,   // get and log data stored in IDB
  }
}

function RssList({ HTTP, props }, feedAdapter = x => x) {
  const request$ = props.url$
    .map(url => ({url: url, method: 'GET', category: 'rss'}))
  const response$ = HTTP.select('rss').flatten().map(feedAdapter)
  const idb$ = response$
  const vDom$ = response$
    .map(Feed)
    .startWith(div('', '...loading'))
  return {
    DOM: vDom$,
    HTTP: request$,
    IDB: { response: idb$ },
  }
}

function Feed (feed) {
  return div('> ' + feed)
}

function IdbSink(sources, idb) {
  return {
    get: sources.IDB.store('rss').getAll()
      .map(obj => (obj['0'] && obj['0'].feed) || 'unknown'),
    put: idb.response
      .map(feedinfo => $put('rss', { feed: feedinfo }))
  }
}

run(main, {
  props: () => ({ url$: xs.of('http://lorem-rss.herokuapp.com/feed') }),
  DOM: makeDOMDriver('#root'),
  HTTP: makeHTTPDriver(),
  IDB: makeIdbDriver('rss-db', 1, upgradeDb => {
    upgradeDb.createObjectStore('rss', { keyPath: 'feed' })
  }),
  log: log$ => log$.addListener({next: log => console.log(log)}),
})
This is a contrived example, simply to explore the issues raised. Codesandbox.io example.

Angular 2 HTTP Request Issue

Really hoping someone can help me.
What I am trying to do: Call a REST API for a json and resolve a Angular 2 promise.
ServerAPI running Node.js/ExpressJS/Lodash
Server.js file:
var express = require('express');
var app = express();
var bodyParser = require("body-parser");
var data = require('./data.json');
var _ = require('lodash');
var cors = require('cors');

app.use(bodyParser.urlencoded({ extended: false }));
app.use(cors());

app.get('/GetData', function (req, resp) {
  if (req.query.search != null) {
    var result = _.find(data, function (o) {
      return o.value === req.query.search.toLowerCase().trim()
    });
    return resp.send(result)
  }
});

app.listen(1337, function () {
  console.log('Listening at Port 1337');
});
Tested with http://localhost:1337/GetData?search=colorado, which returns a valid JSON object.
ClientAPI
Service file calling HTTP request:
import {Injectable} from "@angular/core";
import {Http} from "@angular/http";
import {Config} from "../config";
import {SearchResult} from "../models/search-result.model";
import {MockSearchData} from "../mock/mock-search-results";
import {Observable} from 'rxjs/Rx';
import 'rxjs/add/operator/map';

@Injectable()
export class ApiDataService {
  constructor(private http: Http) {
  }

  public performSearchRequest(searchTerm: string, queryType: string): Promise<SearchResult[]> {
    return new Promise<SearchResult[]>((resolve, reject) => {
      let url = Config.apiBaseUrl + Config.searchApi;
      url += "?search=" + searchTerm;
      console.log("Your query to be: " + url);
      if (searchTerm != "") {
        if (queryType == 'mock') {
          resolve(MockSearchData);
        } else if (queryType == 'api') {
          let data = [];
          this.http.get(url)
            .map(resp => resp.json())
            .subscribe(getData => data = getData);
          resolve(data);
        } else {
          reject("No query type found.");
        }
      } else {
        reject("Please enter a search term.");
      };
    });
  }
}
The resolve of the mock data, a local JSON file within the ClientAPI, works perfectly. I need the `api` query-type branch to work.
The Angular app starts with no issue and runs http.get without error. In the network tab of the dev tools I can see that an HTTP request was made and its response is the valid JSON object I want resolved, i.e. data is being returned. Yet the table I am resolving this into is blank.
WHAT AM I DOING WRONG!
The issue occurs here:
this.http.get(url)
  .map(resp => resp.json())
  .subscribe(getData => data = getData);
resolve(data);
You subscribe to the observable, but it hasn't emitted yet when you call resolve directly afterwards. That means you're really just calling resolve([]).
Instead, do something like this:
this.http.get(url)./*...*/.subscribe(result => resolve(result));
You also might want to look into the toPromise method on Observables, as well as ways to construct an already-resolved promise directly.
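The race can be reproduced without Angular (a sketch; fakeGet stands in for this.http.get(url).map(...)): resolving inside subscribe waits for the data, while resolving right after subscribe does not.

```javascript
// fakeGet simulates an async observable-like source: the callback fires
// on a later tick, just as an HTTP response would.
function fakeGet(subscribeCb) {
  setTimeout(() => subscribeCb(['colorado']), 0);
}

// Broken: resolve(data) runs before the callback ever assigns anything.
function brokenSearch() {
  return new Promise(resolve => {
    let data = [];
    fakeGet(getData => (data = getData));
    resolve(data); // resolves with the original empty array
  });
}

// Fixed: resolve inside the callback, once the data has arrived.
function fixedSearch() {
  return new Promise(resolve => {
    fakeGet(result => resolve(result));
  });
}

brokenSearch().then(d => console.log(d)); // []
fixedSearch().then(d => console.log(d));  // [ 'colorado' ]
```

The same applies to the service above: move `resolve` into the `subscribe` callback (or use `toPromise`) so the promise settles only after the HTTP response arrives.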