iPad: Captured video w/ AVCam Sample, but it is 1080x720, how do you compress it? - objective-c

I captured video with the AVCam sample project, but it is huge at 1080x720 resolution. How can I compress it before saving to a web server?
I modified the sample code in "AvCamCaptureManager.m" ("recordingDidFinishToOutputFileURL") so that it does not save the video file to the Assets Library; instead I take that output file URL and send it to my web server using ASIHttp. These video files are huge, and I want to reduce their resolution to 568x320 to shrink the file size.
Given the uncompressed file's URL, how do I compress it to a smaller file format and/or resolution?

I just saw your question; if it's still of any help, try reducing the quality for the whole session:
session.sessionPreset = AVCaptureSessionPresetMedium;
or
session.sessionPreset = AVCaptureSessionPresetLow;
The former will give smaller files while keeping decent quality, and the latter will give a very small file at the lowest quality available on your device.
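For example, here is a minimal sketch of applying the preset in the capture setup code (the session variable and the fallback logic are assumptions about your setup, not lines taken from the AVCam sample):
AVCaptureSession *session = [[AVCaptureSession alloc] init];
// Only lower the preset if the device actually supports it
if ([session canSetSessionPreset:AVCaptureSessionPresetMedium]) {
    session.sessionPreset = AVCaptureSessionPresetMedium; // smaller files, decent quality
} else {
    session.sessionPreset = AVCaptureSessionPresetLow;    // smallest files, lowest quality
}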

Related

Is it possible to slice a video file blob and then re-encode it server side?

I have been absolutely banging my head on this one and would like a sanity check, mainly on whether what I want to do is even possible, as I am severely constrained by react-native, which has pretty dodgy Blob support.
We all know that video encoding is expensive, so instead of forcing the user to encode with ffmpeg I would like to delegate the whole process to the backend. That is all fine, except that sometimes you might want to trim a video down to 30s, and it's pointless to upload 3+ minutes of it.
So I had this idea of slicing the blob of the video file:
const startOffset = (startTime * blobSize) / duration;
const endOffset = (endTime * blobSize) / duration;
const slicedBlob = blob.slice(startOffset, endOffset);
// Setting the type as third option is ignored
Something like this, the problem is that the file becomes totally unreadable once it reaches the backend.
React Native cannot handle Blob uploads, so they are converted to base64, which works fine for the whole video but not for the sliced blob.
This happens even if I keep the beginning intact:
const slicedBlob = blob.slice(0, endOffset);
I feel like the reason is that the file becomes an application/octet-stream, which might impact the decoding?
I am at a bit of a loss here as I cannot understand if this is a react native issue with blobs or if it simply cannot be done.
Thanks for any input.
P.S. I prefer to stick to vanilla Expo without external libraries; I am aware that one exists to handle blobs, but I am not keen on ejecting or relying on external libraries if possible.
You cannot simply cut off chunks of a file and have it readable on the other side. For example, in an mp4 the video resolution is only stored in one place; if those bytes get removed, the decoder has no idea how to decode the video.
Yes, it is possible to repackage the video client side by rewriting the container and dropping full GOPs, but that would be about 1,000 lines of code for you to write and would be limited to certain codecs and containers.

Setting up box metadata for MediaSource to work

Using mp4box to add h264 to a fragmented dash container (mp4)
I am appending the m4s (DASH) media segments, but I don't know how to preserve order in the box metadata, and I am not sure whether it can be edited when using mp4box. Basically, if I look at the segments in this demo:
http://francisshanahan.com/demos/mpeg-dash
there is a bunch of metadata latent in the m4s files that specifies order: the mfhd atom that holds the sequence number, and the sidx that keeps the earliest presentation time (which is identical to the base media decode time in the tfdt). Also, my sampleDuration times are zeroed in the sample entries in the trun (track fragment run).
I have tried to edit my m4s files with mp4parser without luck. Has anybody else taken h264 and built up a MediaSource stream?

NSdata to writeImageDataToSavedPhotosAlbum bytes not exactly same?

I am trying to save my NSData using writeImageDataToSavedPhotosAlbum.
My NSData size is '49894' bytes, and I saved it using writeImageDataToSavedPhotosAlbum. If I then read the saved image's raw data bytes back using ALAssetsLibrary, I get an image size of '52161' bytes.
I expected both to be the same. Can somebody tell me what is going wrong?
The link below also does not provide a proper solution.
saving image using writeImageDataToSavedPhotosAlbum modifies the actual image data
You cannot and should not rely on the size: firstly because you don't know what the private implementation does, and secondly because the image data is stored together with metadata (and if you don't supply metadata then a number of default values will be applied).
You can check what the metadata contains for the resulting asset and see how it differs from the metadata you supplied.
If you need to save exactly the same bytes, and/or you aren't saving valid image files, then you should not use the photo library. Instead, save the data to disk in your app sandbox, either in the Documents or Library directory.
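For instance, a minimal sketch of keeping the bytes untouched by writing them into the Documents directory (assuming imageData holds your original 49894 bytes; the file name is arbitrary):
// imageData is assumed to be the exact bytes you want to preserve
NSString *documentsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
NSString *path = [documentsDir stringByAppendingPathComponent:@"original-image.jpg"];
NSError *error = nil;
if (![imageData writeToFile:path options:NSDataWritingAtomic error:&error]) {
    NSLog(@"Failed to save image data: %@", error);
}
// Reading the file back with [NSData dataWithContentsOfFile:path] returns exactly the same bytes.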

Base64 encode very large files in objective C to upload file in sharepoint

I have a requirement where the user can upload files present in the app to SharePoint from within the same app.
I tried using the http://schemas.microsoft.com/sharepoint/soap/CopyIntoItems method of SharePoint, but it needs the file in base64-encoded format embedded in the body of the SOAP request. My code crashed on the device when I tried to convert even a 30 MB file to a base64-encoded string, yet the same code executed just fine on the simulator.
Is there any alternative way to upload files (like file streaming, etc.) to SharePoint? I may have to upload files up to 500 MB. Is there a more efficient library to convert NSData into a base64-encoded string for large files?
Should I read the file in chunks, convert each chunk into a base64-encoded string, and upload the file once the complete file is converted? Any other approaches?
First off, your code probably crashed because it ran out of memory. I would do a loop where I read chunks, convert them, and push them to an open socket. This probably means you need to go to a lower level than NSURLConnection; I have searched for NSURLConnection and chunked upload without much success.
Some seem to suggest using ASIHttp, but looking at the homepage it seems to have been abandoned by the developer, so I can't recommend that.
AFNetworking looks really good; it has block support, and I can see from the example on the first page how it could work for you. Look at the streaming request example: basically, create an NSInputStream that you push chunked data into and use it in an AFHTTPURLConnectionOperation.
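This doesn't replace the streaming upload itself, but to illustrate the chunked idea, here is a rough sketch of base64-encoding a large file piece by piece without ever holding the whole file in memory (the inputPath/outputPath names are assumptions; base64EncodedStringWithOptions: requires iOS 7+, on older systems substitute a third-party encoder):
NSFileHandle *input = [NSFileHandle fileHandleForReadingAtPath:inputPath];
[[NSFileManager defaultManager] createFileAtPath:outputPath contents:nil attributes:nil];
NSFileHandle *output = [NSFileHandle fileHandleForWritingAtPath:outputPath];
// Chunk size is a multiple of 3 so each chunk encodes to base64 with no internal '=' padding,
// which means the per-chunk strings can simply be concatenated.
NSUInteger chunkSize = 3 * 64 * 1024;
NSData *chunk;
while ((chunk = [input readDataOfLength:chunkSize]).length > 0) {
    NSString *encodedChunk = [chunk base64EncodedStringWithOptions:0];
    [output writeData:[encodedChunk dataUsingEncoding:NSASCIIStringEncoding]];
}
[input closeFile];
[output closeFile];
// The resulting file at outputPath can then be streamed into the SOAP body via an NSInputStream.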

NSURLConnection downloading poor quality JPEGs on 3G

I am downloading a JPEG image from a server and assigning it to an image object in my iPhone app using an NSURLConnection.
Everything works fine when I'm on a Wi-Fi connection, but as soon as I switch to 3G, the quality of the downloaded JPEG drops dramatically.
Has anyone else experienced this?
Does anyone know of a way to force the iPhone to download the full quality JPEG?
Thanks in advance!
Nick.
If it's the 3G provider that compresses data on the fly, I don't think you can do anything about it. Download the image with Safari over 3G: if the image looks bad (and I expect it will), then it's the provider that is compressing it.
To work around this problem, zip the image on the server and unzip it in the application; this should bypass the compression on the 3G side.
A simple trick is to use https instead of http - this appears to work on O2.
I know this question is quite old, but in case this is of any use to anyone...
NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:theURL];
// Add headers to avoid mobile network operator image compression
[request setValue:#"no-cache" forHTTPHeaderField:#"Pragma"];
[request setValue:#"no-cache" forHTTPHeaderField:#"Cache-Control"];
This should stop the compression of images.
The mobile operator compresses images to save bandwidth, but operators tend to respect these header fields and allow you to request the uncompressed image.
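For completeness, a rough sketch of firing that request and decoding the response (the completion handling and the imageView property are illustrative assumptions, not part of the original answer):
[NSURLConnection sendAsynchronousRequest:request
                                   queue:[NSOperationQueue mainQueue]
                       completionHandler:^(NSURLResponse *response, NSData *data, NSError *error) {
    if (data != nil && error == nil) {
        // With the no-cache headers in place, this should be the full-quality JPEG
        UIImage *image = [UIImage imageWithData:data];
        self.imageView.image = image; // hypothetical image view property
    }
}];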