Angular JS file upload isn't processed properly by connect-multiparty - express

I have been tearing my hair out for the last couple of hours and I hope someone here can help me out. I have an Angular application which uses ng-file-upload (https://github.com/danialfarid/ng-file-upload), and no matter what I try I can't seem to get the files out at the other end.
I do see the correct request payload in the Chrome developer tools:
------WebKitFormBoundaryEwG0XOfjS0IjwRji
Content-Disposition: form-data; name="file"; filename="images.png"
Content-Type: image/png
------WebKitFormBoundaryEwG0XOfjS0IjwRji-
My server-side code looks as follows; I am using a router which uses connect-multiparty.
Router.js:
router.post('/api/v1/uploaddocument', multipartyMiddleware, UserFunctions.saveDocument);
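For reference, multipartyMiddleware is created the usual connect-multiparty way (the uploadDir option here is just an example, not my exact config):
var multiparty = require('connect-multiparty');
// uploadDir is optional and only shown here as an example option
var multipartyMiddleware = multiparty({uploadDir: './uploads'});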
The actual save document function:
saveDocument: function (req, res)
{
    // req.files is where connect-multiparty should put the parsed files
    console.log(req.body, req.files, req.data, req.file);
}
The controller posting this message in angular is:
iSelectClient.controller('PageUploadDocumentCtrl', ['$scope', 'Upload', function ($scope, Upload) {
    $scope.$watch('files', function () {
        $scope.upload($scope.files);
    });

    $scope.upload = function (files) {
        if (files && files.length) {
            for (var i = 0; i < files.length; i++) {
                var file = files[i];
                Upload.upload({
                    url: 'http://localhost:3000/api/v1/uploaddocument',
                    fields: {'username': $scope.username},
                    file: file
                }).progress(function (evt) {
                    var progressPercentage = parseInt(100.0 * evt.loaded / evt.total);
                    console.log('progress: ' + progressPercentage + '% ' + evt.config.file.name);
                }).success(function (data, status, headers, config) {
                    console.log('file ' + config.file.name + ' uploaded. Response: ' + data);
                });
            }
        }
    };
}]);
I am not using app.use(bodyParser()) on the server side; for each individual route I am defining which parser to use, roughly as sketched below.
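A rough sketch of that per-route setup (the '/api/v1/login' route, UserFunctions.login and the body-parser usage are illustrative assumptions, not my real code):
var bodyParser = require('body-parser');

// Only the upload route gets the multipart parser...
router.post('/api/v1/uploaddocument', multipartyMiddleware, UserFunctions.saveDocument);

// ...while JSON routes get bodyParser.json() individually instead of a global app.use(bodyParser.json()).
router.post('/api/v1/login', bodyParser.json(), UserFunctions.login);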
What, oh what, could it be?
EDIT
I had an interceptor on each call which set the content type to application/json, so the upload never reached the other end correctly. Fixed now, using the exact same setup as above.
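For anyone hitting the same thing: the culprit looked roughly like this, and the fix was to leave upload requests alone (the interceptor name and the FormData check are illustrative, not my exact code):
iSelectClient.factory('jsonInterceptor', function () {
    return {
        request: function (config) {
            // Forcing JSON on every request was the bug: it overwrote the
            // multipart/form-data boundary that ng-file-upload relies on.
            // Skip requests whose payload is a file/FormData instead.
            if (!(config.data instanceof FormData)) {
                config.headers['Content-Type'] = 'application/json';
            }
            return config;
        }
    };
});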


Related

Axios xlsx file download issue

I am trying to download an *.xlsx file in Vue using an Axios GET request; however, the response that I get from the GET is not what I expected. What I am trying to do:
On the frontend, in the OnClick method:
const response = await this._fileService.getFileAsBlob(fileName);
const downloadBlob = new Blob([response.data], { type: 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet;' })
virtualLink.href = URL.createObjectURL(downloadBlob);
virtualLink.download = file.fileName?? 'file';
virtualLink.click();
Next, the getFileAsBlob call:
public getFileAsBlob(fileName: string): Promise<AxiosResponse<Blob>> {
    return this._http.get<Blob>(`API_URL`, {
        responseType: 'arraybuffer',
        headers: {
            "content-type": "application/octet-stream"
        }
    });
}
Now my concerns:
First, the original file byte array is byte[11524],
but in the axios response.data the file is ArrayBuffer(15370) (disclaimer here: I've checked the response in the backend, everything is working fine; at the last step the backend returns the proper byte array).
Second, as I debugged this response, I noticed that although I set "content-type": "application/octet-stream", in the response I get "application/json, text/plain, */*". What can be the cause of it?
As a result, the downloaded file is corrupted and cannot be opened by Excel. Can somebody point out where the flaw in my logic is?
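For anyone reproducing this, the two snippets above boil down to roughly the following standalone call (a sketch; the axios import and the URL are placeholders for my service wrapper):
// Standalone reproduction of the calls above; the axios import and URL are placeholders.
import axios from 'axios';

async function downloadXlsx(fileName) {
    // Same options as getFileAsBlob above
    const response = await axios.get('API_URL', {
        responseType: 'arraybuffer',
        headers: { 'content-type': 'application/octet-stream' }
    });

    // Same Blob + virtual link trick as the OnClick handler above
    const blob = new Blob([response.data], {
        type: 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
    });
    const link = document.createElement('a');
    link.href = URL.createObjectURL(blob);
    link.download = fileName || 'file';
    link.click();
}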

hapi 18 eventsourcing not working without stream.end()

What I'm trying to achieve:
I am trying to use the HTML5 EventSource API https://developer.mozilla.org/de/docs/Web/API/EventSource to push events to my client application (JavaScript).
Working example code with plain Node http:
With a plain example Node implementation it works perfectly and as expected. Example code: https://www.html5rocks.com/en/tutorials/eventsource/basics/
Problem:
When I try to integrate EventSource (or SSE) into my API endpoint, which is based on hapi (currently using the latest - 18.1.0), it does not work.
My route handler code, mixed with some code I found:
const Stream = require('stream');

class ResponseStream extends Stream.PassThrough {
    setCompressor (compressor) {
        this._compressor = compressor;
    }
}

const stream = new ResponseStream();
let data = 0;

setInterval(() => {
    data++;
    stream.write('event: message\n');
    stream.write('data:' + data + '\n\n');
    console.log('write data...', data);
    // stream.end();
}, 1000);

return h
    .response(stream)
    .type('text/event-stream')
    .header('Connection', 'keep-alive')
    .header('Cache-Control', 'no-cache')
Findings:
I already searched, and it seems that since hapi 17.x they exposed the flush method for the compressor (https://github.com/hapijs/hapi/issues/3658, section "features").
But it still does not work.
The only way it sends a message is to uncomment the stream.end() line after sending the data. The problem, obviously, is that I can't send further data if I close the stream :/.
If I kill the server (with the stream.end() line commented out), the data gets transmitted to the client in a single transmission. I think the problem is still somewhere in the gzip buffering, even when flushing the stream.
There are some code examples in the hapi GitHub repo, but I got none of them working with hapi 17 or 18 (all examples were for hapi <= 16) :/
Does someone know how to solve the problem, or have a working EventSource example with the latest hapi? I would kindly appreciate any help or suggestions.
Edit - Solution
The solution from the answer below does work, but I also had an nginx reverse proxy in front of my API endpoint. It seems the main problem was not my code; it was nginx, which also buffered the EventSource messages.
To avoid this sort of problem, add X-Accel-Buffering: no to your hapi response headers and it works flawlessly.
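In hapi terms that boils down to one extra header on the response (a minimal sketch, using the same response chain as elsewhere in this post):
// Same kind of response chain as in the handlers in this post, plus the
// X-Accel-Buffering header so nginx does not buffer the SSE stream.
return h.response(stream)
    .type('text/event-stream')
    .header('Cache-Control', 'no-cache')
    .header('X-Accel-Buffering', 'no');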
Well I just tested with Hapi 18.1.0 and managed to create a working example.
This is my handler code:
// Stream here is Node's require('stream'), as in the question above
handler: async (request, h) => {
    class ResponseStream extends Stream.PassThrough {
        setCompressor(compressor) {
            this._compressor = compressor;
        }
    }

    const stream = new ResponseStream();
    let data = 0;

    setInterval(() => {
        data++;
        stream.write('event: message\n');
        stream.write('data:' + data + '\n\n');
        console.log('write data...', data);
        // Flush the compressor hapi attached to the stream so each event is
        // pushed out immediately instead of sitting in the gzip buffer.
        stream._compressor.flush();
    }, 1000);

    return h.response(stream)
        .type('text/event-stream')
}
And this is the client code, just to test:
var evtSource = new EventSource("http://localhost/");

evtSource.onmessage = function (e) {
    console.log("Data", e.data);
};

evtSource.onerror = function (e) {
    console.log("EventSource failed.", e);
};
These are the resources where I found my way to a working example:
https://github.com/hapijs/hapi/blob/70f777bd2fbe6e2462847f05ee10a7206571e280/test/transmit.js#L1816
https://github.com/hapijs/hapi/issues/3599#issuecomment-485190525

Firefox add-on SDK: Get http response headers

I'm new to add-on development and I've been struggling with this issue for a while now. There are some questions here that are somewhat related, but they haven't helped me find a solution yet.
So, I'm developing a Firefox add-on that reads one particular header whenever a web page is loaded in any tab in the browser.
I'm able to observe tab loads, but I don't think there is a way to read HTTP headers inside the following (simple) code, only the URL. Please correct me if I'm wrong.
var tabs = require("sdk/tabs");

tabs.on('open', function (tab) {
    tab.on('ready', function (tab) {
        console.log(tab.url);
    });
});
I'm also able to read response headers by observing http events like this:
var {Cc, Ci} = require("chrome");

var httpRequestObserver = {
    init: function () {
        var observerService = Cc["@mozilla.org/observer-service;1"].getService(Ci.nsIObserverService);
        observerService.addObserver(this, "http-on-examine-response", false);
    },

    observe: function (subject, topic, data) {
        if (topic == "http-on-examine-response") {
            subject.QueryInterface(Ci.nsIHttpChannel);
            this.onExamineResponse(subject);
        }
    },

    onExamineResponse: function (oHttp) {
        try {
            var header_value = oHttp.getResponseHeader("<the_header_that_i_need>"); // Works fine
            console.log(header_value);
        } catch (err) {
            console.log(err);
        }
    }
};
The problem (and a major source of personal confusion) is that when I'm reading the response headers, I don't know which request the response is for. I want to somehow map the request (the request URL especially) to the response header ("the_header_that_i_need").
You're pretty much there; take a look at the sample code here for more things you can do.
onExamineResponse: function (oHttp)
{
    try
    {
        var header_value = oHttp.getResponseHeader("<the_header_that_i_need>");
        // URI is the nsIURI of the response you're looking at
        // and spec gives you the full URL string
        var url = oHttp.URI.spec;
    }
    catch (err)
    {
        console.log(err);
    }
}
Also, people often need to find the related tab, which this answers: Finding the tab that fired an http-on-examine-response event
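For completeness, a rough sketch of registering the observer and cleaning it up when the add-on unloads (the sdk/system/unload usage is an assumption about your setup, not part of the original code):
var {Cc, Ci} = require("chrome");
var unload = require("sdk/system/unload");

// Start listening for responses
httpRequestObserver.init();

// Remove the observer again when the add-on is unloaded
unload.when(function () {
    var observerService = Cc["@mozilla.org/observer-service;1"].getService(Ci.nsIObserverService);
    observerService.removeObserver(httpRequestObserver, "http-on-examine-response");
});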

File upload Http client issue Titanium

I am trying to upload a .mp4 file to some server. I am using the HTTP client provided by Titanium. When I upload the file, the HTTP client adds some headers to the file, due to which the file gets corrupted and cannot be played. When I download the uploaded file and open it in Notepad, I can see the headers which were added to the file.
What should I do so that these headers are not added to the file?
Thanks a lot!
// CODE
var uploadFile = Titanium.Filesystem.getFile(dir, _previewUrl);
var fileUploadUrl = 'Some Url for the server to upload';
var headers = { 'Content-Type': 'multipart/form-data' };
var content = { 'file': uploadFile };

var xhr = Titanium.Network.createHTTPClient();

for (var key in _headers) {
    xhr.setRequestHeader(key, _headers[key]);
}

xhr.onerror = function (e) {
    Ti.UI.createAlertDialog({title: 'Error', message: e.error}).show();
    Ti.API.info('IN ERROR ' + e.error);
};

xhr.setTimeout(20000);

xhr.onload = function (e) {
    Ti.UI.createAlertDialog({title: 'Success', message: 'status code ' + this.status}).show();
    Ti.API.info('IN ONLOAD ' + this.status + ' readyState ' + this.readyState);
};

xhr.onsendstream = function (e) {
    ind.value = e.progress;
    Ti.API.info('ONSENDSTREAM - PROGRESS: ' + e.progress);
};

// open the client
xhr.open('POST', fileUploadUrl);

// send the data
xhr.send(content);
// END
Try setting the headers after you call xhr.open:
// open the client
xhr.open('POST', fileUploadUrl);

for (var key in _headers) {
    xhr.setRequestHeader(key, _headers[key]);
}
Do not add the { 'Content-Type': 'multipart/form-data' } header. This way you should get the file properly, without any headers like the boundary, file name, etc. I could send image and 3gpp files like that successfully. But when I send a video file, my server-side PHP $_FILES is an empty array. Even $_FILES["files"]["error"] has no value. There should be some other trick to send a video file. (Titanium SDK 3.1.1 & Android 4.1.2)
xhr.open("POST", URL);
xhr.send({
files : Titanium.Filesystem.getFile(Titanium.Filesystem.applicationDataDirectory, sourcefilename)
});
}
Try not sending the raw blob itself. Send a base64-encoded string instead.
var uploadFile = Titanium.Filesystem.getFile(dir, _previewUrl);
var base64File = Ti.Utils.base64encode(uploadFile.read()).toString();
And try changing the header to
xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
xhr.send(base64File);
That will solve your problem.
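If the server also needs the original file name, a variation of the same idea is to send the base64 string together with the name as plain form fields (a sketch only; the field names are assumptions, not something Titanium or the server requires):
// Field names ('file_base64', 'file_name') are assumptions; adjust them to whatever the server expects.
var uploadFile = Titanium.Filesystem.getFile(dir, _previewUrl);
var base64File = Ti.Utils.base64encode(uploadFile.read()).toString();

xhr.open('POST', fileUploadUrl);
xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
xhr.send({
    file_base64: base64File,
    file_name: uploadFile.name
});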

dojo jax-rs call issue

I'm trying to call a REST web service using the Dojo toolkit, and it seems that the call is encountering some issues. This is the call with Dojo:
dojo.xhrGet({
    url: 'http://localhost:9080/TestJMSWeb/jaxrs/categories/all',
    handleAs: 'json',
    timeout: 2000,
    load: callback
});

var callback = dojo.hitch(this, function (data) {
    var massagedData = {
        label: 'categorie',
        identifier: 'id',
        items: data
    };
    this.store = new dojo.data.ItemFileReadStore({data: massagedData});
});
The web service code is here:
@GET
@Path("/all")
@Produces("application/json")
public JSONArray getAllCategories() throws IOException {
    final List<Categorie> allCategories = manager.getCategories();
    if (allCategories == null || allCategories.isEmpty())
        throw new WebApplicationException(ErrorUtil.jSONArrayResponse(Status.NO_CONTENT, "No category found"));
    JSONArray jsonArray = jsonCustomerArray(allCategories);
    return jsonArray;
}
When I call the web service I get an error message:
ResourceRegis I org.apache.wink.server.internal.registry.ResourceRegistry filterDispatchMethods The system cannot find any method in the ressources.CategorieRessouce class that supports OPTIONS. Verify that a method exists.
[4/24/12 1:23:41:531 GMT] 0000002f SystemErr R 0 TestJMSWeb INFO [WebContainer : 0] openjpa.Runtime - OpenJPA dynamically loaded a validation provider.
It seems that it is trying to call the resource with the OPTIONS method while I'm using the .xhrGet function. What is the problem?
Here is a link describing the problem: http://engin.bzzzt.biz/2010/01/22/first-dojo-impression/
The guy talks about how, if it is a cross-domain request (which I believe yours is, because of the ports) and the request contains some Access-Control-* HTTP headers, then browsers will send the request as OPTIONS instead of GET.
Dojo adds the Access-Control-* headers when it determines you are making a cross domain request. You can try to fix this yourself by going to dojo/_base/xhr.js and commenting out the following lines (723 to 729):
// FIXME: is this appropriate for all content types?
if(args.contentType !== false){
    xhr.setRequestHeader("Content-Type", args.contentType || _defaultContentType);
}
if(!args.headers || !("X-Requested-With" in args.headers)){
    xhr.setRequestHeader("X-Requested-With", "XMLHttpRequest");
}
I haven't tried this fix yet so please let me know if it works!
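Judging from the quoted source, an alternative that avoids patching Dojo itself might be to pass arguments so those two branches are skipped (a sketch, also untested):
// Sketch only: contentType: false skips the Content-Type branch quoted above.
// Pre-declaring X-Requested-With (with a null value) is meant to skip the second
// branch; whether Dojo then really omits the header depends on the Dojo version.
dojo.xhrGet({
    url: 'http://localhost:9080/TestJMSWeb/jaxrs/categories/all',
    handleAs: 'json',
    timeout: 2000,
    contentType: false,
    headers: {'X-Requested-With': null},
    load: callback
});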