XHR blob size limit - xmlhttprequest

I'm having a little trouble with a blob upload through xhr.send(Blob). I have a website in which a user creates a video via getUserMedia(); the video is placed in the page through a blob, and the user can record multiple times until they are satisfied with the result. I want to give the user the ability to upload the video to a server without first downloading the file and then using a form to upload it. So I managed to do this through xhr.send:
function sendXHR() {
  var xhr = new XMLHttpRequest();
  var video = $("#myexportingvideo");
  xhr.open('GET', video.src, true);
  xhr.responseType = 'blob';
  xhr.onload = function (e) {
    if (this.status == 200) {
      // Note: .response instead of .responseText
      var blob = new Blob([this.response], { type: 'video/webm' });
      console.log(blob.size / 1024);
      console.log(blob.type);
      var form = new FormData();
      var request = new XMLHttpRequest();
      form.append("myblob", blob, "Capture.webm");
      request.open("POST", "../TryExtension/upload_file.php", true);
      //request.send(blob);
      request.send(form);
    }
  };
  xhr.send();
}
The problem is that when a blob is larger than roughly 1.8 MB, the data sent is 0 bytes. So basically my question is: is there a limit on blob size, or do I have to send the blob to the server in chunks?
An example of the issue is the following console log:
frames captured: 32 => 1.077s video vid.js:155
XHR finished loading: "blob:http%3A//localhost/19d162ae-c22e-48af-a83d-2ddd579ffff9". vid.js:240
1199.46484375 vid.js:225
video/webm vid.js:226
XHR finished loading: "http://localhost/TryExtension/upload_file.php". vid.js:237
frames captured: 91 => 3.052s video vid.js:155
XHR finished loading: "blob:http%3A//localhost/c7d4f3c6-88c8-43c5-9adb-b9060c93bfb3". vid.js:240
3402.873046875 vid.js:225
video/webm vid.js:226
XHR finished loading: "http://localhost/TryExtension/upload_file.php".
The first video is uploaded correctly but the next one is uploaded with 0 bytes of information.
Thanks in advance for all your help.
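In case it matters, this is the kind of chunked sending I had in mind (a rough sketch; the `offset` field and part naming are my own invention, and the server would have to reassemble the pieces):

```javascript
// Compute [start, end) byte ranges for fixed-size chunks.
function chunkRanges(totalSize, chunkSize) {
  const ranges = [];
  for (let start = 0; start < totalSize; start += chunkSize) {
    ranges.push([start, Math.min(start + chunkSize, totalSize)]);
  }
  return ranges;
}

// POST each slice of the blob separately; Blob.slice() does not copy the data.
function sendInChunks(blob, url) {
  chunkRanges(blob.size, 1024 * 1024).forEach(function ([start, end], i) {
    const form = new FormData();
    form.append('chunk', blob.slice(start, end), 'Capture.webm.part' + i);
    form.append('offset', String(start));
    const xhr = new XMLHttpRequest();
    xhr.open('POST', url, true);
    xhr.send(form);
  });
}
```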

Related

Asp.net core endpoint "hangs" when sending response as chunked

I'm trying to send a response as Transfer-Encoding: chunked in ASP.NET Core, but it fails. The browser hangs (the request remains pending) and then reports ERR_INCOMPLETE_CHUNKED_ENCODING; curl says the connection was closed. How do I send a chunked body correctly?
This is my endpoint:
endpoints.MapGet("/GetMessage", async context =>
{
    context.Response.Headers["Transfer-Encoding"] = "chunked";
    await context.Response.WriteAsync("Hello World!");
    // send the first part
    await context.Response.Body.FlushAsync();
    // delay to simulate 'work'
    await Task.Delay(2000);
    // send the second part
    await context.Response.WriteAsync("Hello World!");
});
EDIT (my findings):
The reason I need chunked transfer is that I'm using a technique called HTTP streaming: leaving the response open and sending parts of a file as they become available.
What I've learned is that I don't need to set the Transfer-Encoding header, I don't need to 'flush', and I need to use Response.Body.WriteAsync since I'm writing bytes, because the file is audio (not text). The final piece is that I need to give it an audio MIME type, or the browser will wait for the entire body before calling the progress callback. There's some under-the-hood behavior happening that I don't understand, but this is good enough for me.
My server test code:
endpoints.MapGet("/GetMessage", async context =>
{
    context.Response.Headers["Content-Type"] = "audio/aac";
    var buf = System.Text.Encoding.ASCII.GetBytes("Hellow world !");
    await context.Response.Body.WriteAsync(buf, 0, buf.Length);
    await Task.Delay(3000);
    await context.Response.Body.WriteAsync(buf, 0, buf.Length);
});
Html test page:
<html>
<body>
  <h3>Hello word</h3>
  <script>
    let xhr = new XMLHttpRequest()
    xhr.onprogress = e => console.log("progress " + e.currentTarget.response)
    xhr.onload = e => console.log("load " + e.currentTarget.response)
    xhr.open("GET", "/GetMessage");
    xhr.send();
  </script>
</body>
</html>
This works as expected. Console prints first part right away and the entire body at the end.
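For completeness, the same progressive read can be done with fetch() and the Streams API instead of XHR's onprogress (a sketch; `collectChunks` is my own name, not part of the question's code):

```javascript
// Read a Response body chunk by chunk as it arrives on the wire,
// decoding each chunk to text and collecting the pieces.
async function collectChunks(response) {
  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  const parts = [];
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    // { stream: true } handles multi-byte characters split across chunks.
    parts.push(decoder.decode(value, { stream: true }));
  }
  return parts; // one entry per chunk received
}

// Usage: collectChunks(await fetch("/GetMessage"))
```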

Agora Cloud Recording doesn't record mixed in audio files

Hi, I have been successfully recording an Agora audio call in which one person speaks in a broadcast role and, during the call, mixes in a number of audio files.
All the audio was being recorded until we upgraded to Flutter 2 and the associated upgraded packages.
Now all that is recorded is the broadcaster's voice, and none of the mixed-in audio.
The broadcaster and audience members can all hear the mixed-in audio within the call without issue.
The code (Flutter) is similar to this:
Mix in Audio into a valid RTC session, with default settings
final playing = await session.playAudioFile(path, () {
  state = MessagePlayerState.STOPPED;
  if (!disposing) {
    whenFinished();
  }
});
The recording options are as follows (my UID is a hardcoded string that is not the same as any participant UID):
http.Response response = await http.post(
  Uri.https(AGORA_REST_URL, '$AGORA_REST_API_VERSION/$appId/cloud_recording/resourceid/$resourceId/mode/mix/start'),
  headers: <String, String>{
    HttpHeaders.authorizationHeader: 'Basic $basicAuth',
    HttpHeaders.contentTypeHeader: 'application/json; charset=UTF-8',
  },
  body: jsonEncode(<String, dynamic>{
    'cname': channelName,
    'uid': uid,
    'clientRequest': {
      'recordingConfig': {
        'channelType': 0,
        'streamTypes': 2, // TODO: Should be a streamTypes of 0 (audio only), but get failures.
        'audioProfile': 1,
        'videoStreamType': 0,
        'maxIdleTime': 120,
        'transcodingConfig': {
          'width': 360,
          'height': 640,
          'fps': 30,
          'bitrate': 600,
          'maxResolutionUid': '1',
          'mixedVideoLayout': 1
        },
        'recordingFileConfig': {
          'avFileType': ['hls', 'mp4']
        }
      },
      'storageConfig': {
        'vendor': 1,
        'region': 3,
        'bucket': AWS_RECORDING_BUCKET, // TODO: Env Var
        'accessKey': AWS_BUCKET_ACCESS_KEY,
        'secretKey': AWS_BUCKET_SECRET_KEY,
      }
    },
  }),
);
The m3u8 and ts files are present in the S3 bucket.
Adjusting the metadata tags in S3 results in a file that plays fine in Safari, but no mixed in audio is heard.
Converting the file to aac with ffmpeg shows this error
[hls # 0x7fd6cc808200] Opening '2838cfc6254e9fec2e3088976f39d7ce_bip_20210618014151427.ts' for reading
[mpegts # 0x7fd6cc00a600] Packet corrupt (stream = 0, dts = 1437390).
size= 480kB time=00:00:30.69 bitrate= 128.1kbits/s speed=1.49e+03x
video:0kB audio:470kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 2.093976%
And the result is the same as from the S3 bucket.
Any help or hints appreciated.
This can be closed/ignored. It turns out we had an edge condition that did not show up when the app was used normally, but it broke if, for instance, you wanted a very stage-managed recording to show off to others.

Display PDF from azure blob in browsers using Microsoft Azure Storage SDK for Node.js and JavaScript for Browsers

I am trying to use the Microsoft Azure Storage SDK for Node.js and JavaScript for Browsers (https://github.com/Azure/azure-storage-node) to display PDF contents stored in an Azure blob in browsers. So far I couldn't find any examples of how to do it.
I tried to follow the suggestion from https://github.com/Azure/azure-storage-node/issues/440, but couldn't make it work. I am using an Azure Function.
module.exports = async function (context, req) {
  let accessToken = await getAccessToken();
  let container = req.params.container;
  let filename = req.params.filename;
  let tokenCredential = new azure.TokenCredential(accessToken);
  let storageAccountName = process.env.StorageAccountName;
  let blobService = azure.createBlobServiceWithTokenCredential(`https://${storageAccountName}.blob.core.windows.net/`, tokenCredential);
  return new Promise((resolve, reject) => {
    let readStream = blobService.createReadStream(container, filename, function (error, result, response) {
      if (error) {
        context.log(error);
        context.log(response);
        context.res = {
          status: 400,
          body: response
        };
        resolve(context.res);
      }
    });
    let body = '';
    readStream.on('data', (chunk) => {
      body += chunk;
    });
    readStream.on('end', () => {
      context.res = {
        headers: {
          'Content-Type': "application/pdf"
        },
        body: body
      };
      resolve(context.res);
    });
  });
};
But I got a "Couldn't open PDF" error message in the browser, or a timeout error.
For downloading a blob in a browser environment, using a URL with a SAS token is recommended. In the framework you are using, would an accessible URL pointing to the PDF be enough?
Please follow this example:
Download Blob
BlobService provides interfaces for downloading a blob into browser memory. Because of the browser's sandbox limitation, we cannot save the downloaded data chunks to disk until we get all the data chunks of a blob into browser memory. The browser's memory size is also limited, especially when downloading huge blobs, so it's recommended to download a blob in the browser with a SAS-token-authorized link directly.
Shared access signatures (SAS) are a secure way to provide granular access to blobs and containers without providing your storage account name or keys. Shared access signatures are often used to provide limited access to your data, such as allowing a mobile app to access blobs.
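A SAS link also sidesteps proxying the bytes through the Function, where accumulating binary chunks into a string can corrupt the PDF. A minimal sketch of building one with the `azure-storage` package, which is passed in here as `azure` along with an authenticated `blobService` (note this assumes account-key auth rather than the token credential above, since generating an account SAS requires the key; the function names and the 15-minute window are my own choices):

```javascript
// Build a { Start, Expiry } window for a short-lived SAS policy.
function expiryWindow(minutes) {
  const Start = new Date();
  const Expiry = new Date(Start.getTime() + minutes * 60 * 1000);
  return { Start, Expiry };
}

// Return a read-only SAS URL for one blob; the browser can open it directly.
function getPdfSasUrl(azure, blobService, container, filename) {
  const { Start, Expiry } = expiryWindow(15); // link valid for 15 minutes
  const token = blobService.generateSharedAccessSignature(container, filename, {
    AccessPolicy: {
      Permissions: azure.BlobUtilities.SharedAccessPermissions.READ,
      Start: Start,
      Expiry: Expiry
    }
  });
  return blobService.getUrl(container, filename, token);
}
```

The Function can then return this URL (or a 302 redirect to it) instead of the PDF body.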

How to send multiple images in an Expressjs API GET request with sendFile()

I'm looking for a way to send multiple images in one GET request from an Expressjs server through an API.
I want to create an image gallery of each user's uploaded images in a MEAN stack. When images are uploaded using multer, the image information is saved to MongoDB, including the user ID of whoever uploaded it.
On the AngularJS side, I want the user to have access to any of the images they have previously uploaded. Currently I'm sending one file per GET request based on user ID. Is there any way of sending multiple files in one JSON response? I'm currently using Expressjs's res.sendFile, but haven't found any info about sending multiple files back yet.
https://expressjs.com/en/api.html#res.sendFile
Here is my current get request:
exports.getUpload = function(req, res) {
  Upload.find({createdby: req.params.Id}).exec(function(err, upload) {
    errorhandle.errorconsole(err, 'file found');
    console.log(upload[0]);
    var options = {
      root: '/usr/src/app/server/public/uploads/images'
    };
    var name = "" + upload[0].storedname + "";
    console.log(name);
    res.sendFile(name, options, function(err) {
      errorhandle.errorconsole(err, 'file sent');
    });
  });
};
You can't with res.sendFile. In fact I don't think you can at all. Maybe with HTTP/2 Server Push, but I'm not sure.
What you can do is send a JSON response with a link to all the images:
exports.getUpload = async (req, res) => {
  const uploads = await Upload.find({ createdby: req.params.Id }).exec()
  // Wrap the object literal in parentheses so the arrow function returns it
  const response = uploads.map(image => ({ name: `https://example.com/uploads/images/${image.storedname}` }))
  res.json(response)
}
Note error handling omitted.
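On the client, the JSON list can then be rendered as a gallery. A sketch, assuming each entry's `name` field holds the image URL and a hypothetical `#gallery` element and endpoint:

```javascript
// Turn the JSON entries into <img> tag strings.
function toImgTags(images) {
  return images.map(img => `<img src="${img.name}">`);
}

// Fetch the list and render it (endpoint path is illustrative only).
async function loadGallery(endpoint) {
  const res = await fetch(endpoint);
  const images = await res.json();
  document.getElementById('gallery').innerHTML = toImgTags(images).join('\n');
}
```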

File upload HTTP client issue in Titanium

I am trying to upload a .mp4 file to a server using the HTTP client provided by Titanium. When I upload the file, the HTTP client adds some headers to the file, due to which the file gets corrupted and cannot be played. When I download the uploaded file and open it in Notepad, I can see the headers that were added to the file.
What should I do so that these headers are not added to the file?
Thanks a lot!
// CODE
var uploadFile = Titanium.Filesystem.getFile(dir, _previewUrl);
var fileUploadUrl = 'Some Url for the server to upload';
var _headers = { 'Content-Type': 'multipart/form-data' };
var content = { 'file': uploadFile };
var xhr = Titanium.Network.createHTTPClient();
for (var key in _headers) {
  xhr.setRequestHeader(key, _headers[key]);
}
xhr.onerror = function(e) {
  Ti.UI.createAlertDialog({title: 'Error', message: e.error}).show();
  Ti.API.info('IN ERROR ' + e.error);
};
xhr.setTimeout(20000);
xhr.onload = function(e) {
  Ti.UI.createAlertDialog({title: 'Success', message: 'status code ' + this.status}).show();
  Ti.API.info('IN ONLOAD ' + this.status + ' readyState ' + this.readyState);
};
xhr.onsendstream = function(e) {
  ind.value = e.progress;
  Ti.API.info('ONSENDSTREAM - PROGRESS: ' + e.progress);
};
// open the client
xhr.open('POST', fileUploadUrl);
// send the data
xhr.send(content);
// END
Try setting the headers after you call xhr.open:
// open the client
xhr.open('POST', fileUploadUrl);
for (var key in _headers) {
  xhr.setRequestHeader(key, _headers[key]);
}
Do not add the { 'Content-Type' : 'multipart/form-data' } header. This way you should get the file properly, without any extra headers like the boundary and file name. I could send an image and a 3gpp file like that successfully, but when I send a video file, my server's PHP $_FILES is an empty array; even $_FILES["files"]["error"] has no value. There must be some other trick for sending a video file. (Titanium SDK 3.1.1 & Android 4.1.2)
xhr.open("POST", URL);
xhr.send({
  files: Titanium.Filesystem.getFile(Titanium.Filesystem.applicationDataDirectory, sourcefilename)
});
Try not sending the raw blob itself; send a base64-encoded string instead.
var uploadFile = Titanium.Filesystem.getFile(dir, _previewUrl);
var base64File = Ti.Utils.base64encode(uploadFile.read()).toString();
And try changing the header to
xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
xhr.send(base64File);
That will solve your problem.