I am developing an app with Flutter and using the http package to call the API I built.
I want to make a multipart request to upload files. The upload itself works, but I can't read the server's response because the object returned by request.send() is a StreamedResponse.
Please tell me how to get the body of the response.
Code Snippet:
import 'package:http/http.dart' as http;
import 'package:http_parser/http_parser.dart'; // for MediaType

var request = http.MultipartRequest('POST', Uri.parse('$SERVER_URL/signup'));
request.files.add(await http.MultipartFile.fromPath(
  'image',
  user.image,
  filename: 'image',
  contentType: MediaType('multipart/form-data', 'multipart/form-data'),
));
http.StreamedResponse x = await request.send();
//get body of the response
Thanks,
Just use http.Response.fromStream()
import 'package:http/http.dart' as http;
var streamedResponse = await request.send();
var response = await http.Response.fromStream(streamedResponse);
StreamedResponse has a stream getter, as you'd expect, which delivers a stream of byte arrays.
Assuming that you want characters instead of bytes, push those through the appropriate character decoder - let's assume UTF-8.
You then probably want all of those chunks joined into a single string, which join does for you. With dart:convert imported for the utf8 decoder, that gives you:
print(await x.stream.transform(utf8.decoder).join());
I'm calling an API to fetch orders for a given user based on an ID; the orders come from a third-party site. They are fetched correctly, as I can console.log them in the Node server. But when I try to send the results back to the client, neither res.send nor res.json results in the data reaching the client. Here is an example of an order from console.log:
{"customer":{"orders":{"edges":[{"node":{"id":"gid://shopify/Order/1234564564","lineItems":{"edges":[{"node":{"title":"Some title here"}}]}}}]}}}
Then on the client in the devtools when I console.log the response I get:
body:ReadableStream
locked:false
[[Prototype]]:ReadableStream
bodyUsed:false
headers:
Headers {}
ok:true
redirected:false
status:200
statusText:"OK"
type:"basic"
url:"http://localhost:9000/api/getOrders?cid=gid://shopify/Customer/1234564564"
[[Prototype]]:Response
Any help is appreciated.
=== UPDATE ===
I've now even tried the most basic of examples and am getting the same response on the client:
app.get('/testExpress', (req, res) => {
  res.send("Hello");
});
Thanks to @laurent, here is how I was able to receive that response on the client:
fetch('/testExpress')
  .then(async (r) => {
    const resp = await r.text();
    console.log(resp);
  });
You should try to get the JSON out of the response with const json = await response.json().
These are the available methods to parse the response body depending on what you want:
Response.arrayBuffer()
Returns a promise that resolves with an ArrayBuffer representation of the response body.
Response.blob()
Returns a promise that resolves with a Blob representation of the response body.
Response.clone()
Creates a clone of a Response object.
Response.formData()
Returns a promise that resolves with a FormData representation of the response body.
Response.json()
Returns a promise that resolves with the result of parsing the response body text as JSON.
Response.text()
Returns a promise that resolves with a text representation of the response body.
https://developer.mozilla.org/en-US/docs/Web/API/Response
The Response object is a representation of the HTTP response. You have to consume the body explicitly, either with one of these methods or by reading the underlying readable stream response.body.
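Putting the two sides together, here is a minimal sketch (the route and customer ID come from the question; fetchOrdersFromThirdParty is a hypothetical stand-in for the actual lookup):
// Server (Express): send the fetched orders as JSON.
app.get('/api/getOrders', async (req, res) => {
  const orders = await fetchOrdersFromThirdParty(req.query.cid); // hypothetical helper
  res.json(orders);
});

// Client: the body is a ReadableStream until you consume it.
fetch('/api/getOrders?cid=gid://shopify/Customer/1234564564')
  .then((r) => r.json())             // parses the stream as JSON
  .then((orders) => console.log(orders));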
I am using Google Apps Script to call the UPS API and generate a shipping label. However, the API response is truncated, and I am unable to decode the base64-encoded image that is part of the JSON response object because it is cut off.
I am also not getting any truncation error messages or responses from the UPS servers, nor is Google Apps Script throwing an error.
I have contacted UPS support with the JSON request and it seems to work fine at their end.
// Here is the code for API call.
function getLabel() {
  var userName = "myUPS_username";
  var password = "*********";
  var accessKey = "my_access_key";
  var transId = "Trans123";
  var transactionSrc = "upstest";
  var url = "https://wwwcie.ups.com/ship/v1807/shipments";

  var header = {
    'AccessLicenseNumber': accessKey,
    'password': password,
    'transId': transId,
    'transactionsrc': transactionSrc,
    'username': userName
  };

  // parameters for url fetch
  var params = {
    'method': 'GET',
    'contentType': 'application/json',
    'headers': header,
    'payload': JSON.stringify(payload) // payload is defined elsewhere (not included here)
  };

  // call the UPS Shipment API
  var response = UrlFetchApp.fetch(url, params);
}
Not including the JSON payload here
Answer:
UrlFetchApp has built-in limitations on both requests and responses, including caps on POST size and response size.
More Information:
As per the Apps Script documentation, URL Fetch has limitations that are enforced in the methods themselves. The limitations are as follows:
URL Fetch response size: 50MB/call
URL Fetch headers: 100/call
URL Fetch header size: 8kB/call
URL Fetch POST size: 50MB/call
URL Fetch URL length: 2kB/call
Unfortunately, there is no way to get around this.
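As a quick sanity check, you can log how large the body you actually received is; if it sits right around the quota, truncation is the likely cause. A minimal sketch (the URL is a placeholder for your real UPS request):
function logResponseSize() {
  // Placeholder URL - substitute your actual UPS request and params here.
  var response = UrlFetchApp.fetch("https://example.com/large-payload");
  var bytes = response.getContent(); // response body as a byte array
  Logger.log("HTTP " + response.getResponseCode() + ", body size: " + bytes.length + " bytes");
}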
Feature Request:
There is already a Feature Request on Google's Issue Tracker asking for an increase to the UrlFetch response size limit. You can find it here and give it a star (☆) in the top left to let Google know more people want it. They have already responded with 'We'll consider raising the quota if there is enough interest from the developer community.', so letting them know this is a wanted feature is a good way to go.
References:
Google's Issue Tracker
Increase the UrlFetch Total Bytes quota Feature Request
Quotas for Google Services
Current Quotas
Current Limitations
I'm building my own WebhookClient for Dialogflow. My code is the following (using Azure Functions, similar to Firebase Functions):
const { WebhookClient } = require("dialogflow-fulfillment");

module.exports = async function (context, req) {
  const agent = new WebhookClient({ request: context.req, response: context.res });

  function welcome(agent) {
    agent.add(`Welcome to my agent!!`);
  }

  let intentMap = new Map();
  intentMap.set("Look up person", welcome);
  agent.handleRequest(intentMap);
};
I tested the query and the response payload looks like this:
{
"fulfillmentText": "Welcome to my agent!!",
"outputContexts": []
}
And the headers in the response look like this:
Transfer-Encoding: chunked
Content-Type: application/json; charset=utf-8
Server: Microsoft-IIS/10.0
X-Powered-By: ASP.NET
Date: Tue, 11 Dec 2018 18:16:06 GMT
But when I test my bot in Dialogflow, it returns the following:
Webhook call failed. Error: Failed to parse webhook JSON response:
Expect message object but got:
"笀ഀ ∀昀甀氀昀椀氀氀洀攀渀琀吀攀砀琀∀㨀 ∀圀攀氀挀漀洀攀 琀漀 洀礀 愀最攀渀琀℀℀∀Ⰰഀ ∀漀甀琀瀀甀琀䌀漀渀琀攀砀琀猀∀㨀 嬀崀ഀ紀".
There are Chinese symbols!? Here's a video of me testing it out in Dialogflow: https://imgur.com/yzcj0Kw
I know this should be a comment (as it isn't really an answer), but it's fairly verbose and I didn't want it to get lost in the noise.
I have the same problem using WebAPI on a local machine (using ngrok to tunnel back to Kestrel). A friend of mine has working code (he's hosting in AWS rather than Azure), so I started examining the differences between our responses. I've noticed the following:
This occurs with Azure Functions and WebAPI (so it's not that)
The JSON payloads are identical (so it's not that)
Working payload isn't chunked
Working payload doesn't have a content type
As an experiment, I added this code to Startup.cs, in the Configure method:
app.Use(async (context, next) =>
{
var original = context.Response.Body;
var memory = new MemoryStream();
context.Response.Body = memory;
await next();
memory.Seek(0, SeekOrigin.Begin);
if (!context.Response.Headers.ContentLength.HasValue)
{
context.Response.Headers.ContentLength = memory.Length;
context.Response.ContentType = null;
}
await memory.CopyToAsync(original);
});
This code disables response chunking, which is now causing a new and slightly more interesting error for me in the Google console:
*Webhook call failed. Error: Failed to parse webhook JSON response: com.google.gson.stream.MalformedJsonException: Unterminated object at line 1 column 94 path $.\u0000\\"\u0000f\u0000u\u0000l\u0000f\u0000i\u0000l\u0000l\u0000m\u0000e\u0000n\u0000t\u0000M\u0000e\u0000s\u0000s\u0000a\u0000g\u0000e\u0000s\u0000\\"\u0000.\
I thought this could be encoding at first, so I stashed my JSON as a string and used the various Encoding classes to convert between them, to no avail.
I fired up Postman and called my endpoint (using the same payload as Google) and I can see the whole response payload correctly - it's almost as if Google's end is terminating the stream part-way through reading...
Hopefully, this additional information will help us figure out what's going on!
Update
After some more digging and various server/lambda configs, I spotted this post here: https://github.com/googleapis/google-cloud-dotnet/issues/2258
It turns out that json.net IS the culprit! I guess it's something to do with the formatters on the way out of the pipeline. In order to prove this, I added this hard-coded response to my POST controller and it worked! :)
return new ContentResult()
{
Content = "{\"fulfillmentText\": null,\"fulfillmentMessages\": [],\"source\": null,\"payload\": {\"google\": {\"expectUserResponse\": false,\"userStorage\": null,\"richResponse\": {\"items\": [{\"simpleResponse\": {\"textToSpeech\": \"Why hello there\",\"ssml\": null,\"displayText\": \"Why hello there\"}}],\"suggestions\": null,\"linkOutSuggestion\": null}}}}",
ContentType = "application/json",
StatusCode = 200
};
Despite the HTTP header saying the charset is utf-8, that response is definitely encoded as UTF-16LE, and the receiving side is then treating it as UTF-16BE. Given you're running on Azure, it sounds like there is some configuration you need to make in Azure Functions so the output is emitted as UTF-8 instead of UTF-16 strings.
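One way to sidestep the runtime's serialization entirely is to build the response body yourself as a UTF-8 buffer. A sketch under that assumption (isRaw asks the Functions JavaScript host to skip its own formatting; verify it against your runtime version):
module.exports = async function (context, req) {
  // Serialize the JSON ourselves into a UTF-8 buffer so no UTF-16 encoding
  // can be introduced on the way out.
  const payload = { fulfillmentText: "Welcome to my agent!!", outputContexts: [] };
  context.res = {
    status: 200,
    headers: { "Content-Type": "application/json; charset=utf-8" },
    body: Buffer.from(JSON.stringify(payload), "utf8"),
    isRaw: true // skip the runtime's response formatting (assumption: supported by your host)
  };
};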
I'm trying to send a PDF for content extraction to a Tika Server but always get the error: "Cannot convert text from stream using the source encoding"
This is how Tika is expecting the files:
"All services that take files use HTTP "PUT" requests. When "PUT" is used, the original file must be sent in request body without any additional encoding (do not use multipart/form-data or other containers)." Source https://wiki.apache.org/tika/TikaJAXRS#Services
What is the correct way of sending the file with XMLHttpRequest()?
Code:
var response, error, file, blob, xhr, url;

file = new File("/PROJECT/web/dateien/ai/pdf.pdf");
blob = file.toBuffer().toBlob("application/pdf");
url = "http://localhost:9998/tika";

// send data
try {
  xhr = new XMLHttpRequest();
  xhr.open("PUT", url);
  xhr.setRequestHeader("Accept", "text/plain");
  xhr.send(blob);
} catch (e) {
  error = e;
}

({
  response: xhr.responseText,
  status: xhr.statusText,
  error: error,
  type: xhr.responseType,
  blob: blob
});
I suspect the PUT request is being converted into a POST request by Wakanda when there is a blob in the XHR body. Can you Wireshark your XHR request and add details? If so, you can probably file an issue with Wakanda (https://github.com/Wakanda/wakanda-issues/issues).
Hope it helps,
Yann
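For comparison, here is a sketch of the raw-body PUT that Tika expects, using a plain browser XMLHttpRequest (assumptions: Tika listening on localhost:9998 and a File/Blob taken from an input element; no FormData or multipart wrapper, per the docs quoted above):
// Sketch only: send the file itself as the request body, unencoded.
// Assumes `file` is a File/Blob, e.g. from <input type="file">.
function extractText(file, onDone) {
  var xhr = new XMLHttpRequest();
  xhr.open("PUT", "http://localhost:9998/tika");
  xhr.setRequestHeader("Accept", "text/plain");
  xhr.setRequestHeader("Content-Type", "application/pdf");
  xhr.onload = function () { onDone(null, xhr.responseText); };
  xhr.onerror = function () { onDone(new Error("Tika request failed")); };
  xhr.send(file); // the File/Blob is sent as the raw body
}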
I'm trying to write a front end to an API service with Node.js.
I'd like users to be able to point their browser at my Node server and make a request. The Node script would modify the input to the request, call the API service, then modify the output and pass it back to the user.
I like the solution here (with Express and node-http-proxy) as it passes the cookies and headers directly from the user through my site to the API server.
proxy request in node.js / express
I see how to modify the input to the request, but I can't figure out how to modify the response. Any suggestions?
transformer-proxy could be useful here. I'm the author of this plugin and I'm answering here because I found this page when looking for the same question and wasn't satisfied with harmon as I don't want to manipulate HTML.
Maybe someone else is looking for this and finds it useful.
Harmon is designed to plug into node-http-proxy https://github.com/No9/harmon
It uses trumpet and so is stream based to work around any buffering problems.
It uses an element and attribute selector to enable manipulation of a response.
This can be used to modify the output response.
See here: https://github.com/nodejitsu/node-http-proxy/issues/382#issuecomment-14895039
http-proxy-interceptor is a middleware I wrote for this very purpose. It allows you to modify the http response using one or more transform streams. There are tons of stream-based packages available (like trumpet, which harmon uses), and by using streams you can avoid buffering the entire response.
var httpProxy = require('http-proxy');
var modifyResponse = require('http-proxy-response-rewrite');

var proxy = httpProxy.createServer({
  target: 'target server IP here',
});
proxy.listen(8001);

proxy.on('error', function (err, req, res) {
  res.writeHead(500, {
    'Content-Type': 'text/plain'
  });
  res.end('Something went wrong. And we are reporting a custom error message.');
});

proxy.on('proxyRes', function (proxyRes, req, res) {
  modifyResponse(res, proxyRes.headers['content-encoding'], function (body) {
    if (body && (body.indexOf("<process-order-response>") !== -1)) {
      var beforeTag = "</receipt-text>"; // tag after which you can add data to the response
      var beforeTagBody = body.substring(0, body.indexOf(beforeTag) + beforeTag.length);
      var requiredXml = " <ga-loyalty-rewards>\n" +
        "<previousBalance>0</previousBalance>\n" +
        "<availableBalance>0</availableBalance>\n" +
        "<accuruedAmount>0</accuruedAmount>\n" +
        "<redeemedAmount>0</redeemedAmount>\n" +
        "</ga-loyalty-rewards>";
      var afterTagBody = body.substring(body.indexOf(beforeTag) + beforeTag.length);
      var parts = [beforeTagBody, requiredXml, afterTagBody];
      console.log(parts.join(""));
      return parts.join("");
    }
    return body;
  });
});
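If you'd rather not pull in a rewrite package, node-http-proxy can also hand the response over to you. A sketch assuming a version that supports the selfHandleResponse option, an uncompressed upstream response, and a placeholder target of http://localhost:3000 (it buffers the whole body, so it only suits reasonably small responses):
var http = require('http');
var httpProxy = require('http-proxy');

// selfHandleResponse leaves writing the client response to us.
var proxy = httpProxy.createProxyServer({ target: 'http://localhost:3000', selfHandleResponse: true });

proxy.on('proxyRes', function (proxyRes, req, res) {
  var chunks = [];
  proxyRes.on('data', function (chunk) { chunks.push(chunk); });
  proxyRes.on('end', function () {
    var body = Buffer.concat(chunks).toString('utf8');
    // Illustrative rewrite - replace with whatever transformation you need.
    var modified = body.replace('</receipt-text>', '</receipt-text><!-- injected -->');
    res.writeHead(proxyRes.statusCode, { 'Content-Type': proxyRes.headers['content-type'] });
    res.end(modified);
  });
});

http.createServer(function (req, res) { proxy.web(req, res); }).listen(8001);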