WebRTC local camera preview resolution decreases with network speed - webrtc

I want to scan a text page while a call is in progress. What I do is take frames from the local video preview and send them to a server for processing.
Before the call starts, the preview quality and resolution are at their highest, but once the call starts the capturer's resolution decreases. I can see that the onFrameResolutionChanged event is called on the local renderer. I'm guessing that WebRTC is changing the resolution because of the internet speed.
I don't want to change the local display resolution.
I have this issue with both the iOS and Android WebRTC libraries.
What can I do to prevent local camera preview resolution from decreasing?
I tried the videoSource.adaptOutputFormat function, but it only sets a maximum quality, and the preview still decreases after a while.
Update:
What I was searching for was enableCpuOveruseDetection = false. It has to be set in the RTCConfiguration:
val config = PeerConnection.RTCConfiguration(servers)
config.enableCpuOveruseDetection = false
This works well on Android; it no longer resizes the local preview quality.
But on iOS there is no enableCpuOveruseDetection in the RTCConfiguration class, so on iOS the problem remains.
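For comparison, if you hit the same behaviour in a browser-based WebRTC app, the closest standard knob is degradationPreference on the sender's encoding parameters, which tells the encoder to sacrifice frame rate rather than resolution under load. This is a sketch of the browser API only - it is an assumption that it helps here, and this knob does not exist in the native iOS/Android RTCConfiguration:

```javascript
// Browser WebRTC sketch: ask the encoder to preserve resolution and
// drop frame rate instead when bandwidth or CPU is constrained.
async function keepResolution(sender) {
  const params = sender.getParameters();
  params.degradationPreference = 'maintain-resolution';
  await sender.setParameters(params);
  return params;
}
```

Note that this affects the encoded outgoing stream; whether the local preview follows the adapted track is implementation-specific.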

Related

Different results for the image and a screenshot of the image

I am using an object localizer with react-native-image-picker to get coordinates of objects within an image. When I send an image taken with the Android device's camera, the results I get are not accurate, but when I take a screenshot of the photo and send that, the results are almost perfect. Why might this be the case and how can I fix it?
The interesting thing is that when I use the Android Studio emulator and send photos without taking screenshots of them, the results are correct too. I have read that there are recommended image sizes for these operations; however, I could not find one for the object localizer.
Edit: I have found that when I take a screenshot, the image resolution is equal to my device's width and height, whereas when I take a photo it uses the camera's resolution. To give an example: right now when I take a photo its resolution is 4032x2268, and the resolution of that image's screenshot is 1080x2220, which is my Android device's resolution. Is there any way to set the camera's resolution to the same as the device's resolution?
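One option worth trying is downscaling the photo to the screen size before sending it to the localizer. react-native-image-picker's launchCamera accepts maxWidth/maxHeight options that can do this resize at capture time; alternatively, the aspect-preserving fit can be computed by hand. A minimal sketch, with a hypothetical helper name:

```javascript
// Hypothetical helper: compute the dimensions needed to fit a photo
// inside the device screen while preserving aspect ratio (never upscale).
function fitToScreen(photoW, photoH, screenW, screenH) {
  const scale = Math.min(screenW / photoW, screenH / photoH, 1);
  return {
    width: Math.round(photoW * scale),
    height: Math.round(photoH * scale),
  };
}
// e.g. a 4000x2000 photo on a 1000x3000 screen fits as 1000x500
```

The resulting width/height can then be passed as maxWidth/maxHeight to the picker, or used with an image-resizing step before upload.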

How to get full resolution through Liveview

I am using SDK 2.3 and developing an Android application with AS-15 and AS-20 cameras that deals exclusively with liveview.
I am unable to obtain from Liveview a higher resolution than 640x360 px, while the camera specs mention 1920×1080/30p (HQ).
How can I get the full resolution?
Is this a limitation of the API? Why?
I've found that some (other) cameras implement get/setLiveviewSize, and for the "L" size it says:
XGA size scale (the size varies depending on the camera model, and some camera models change the liveview quality instead of making the size larger).
What are the models with the highest liveview resolution?
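For cameras that do support it, the Camera Remote API call is a JSON-RPC request; the exact endpoint and availability vary by model, so treat this as an assumed sketch rather than a confirmed recipe for the AS-15/AS-20:

```javascript
// Build the JSON-RPC body for setLiveviewSize; "L" requests the large
// liveview size on models that implement it.
function buildLiveviewSizeRequest(size) {
  return {
    method: 'setLiveviewSize',
    params: [size],
    id: 1,
    version: '1.0',
  };
}
// POST JSON.stringify(buildLiveviewSizeRequest('L')) to the camera's
// camera-service endpoint (e.g. with fetch), then restart liveview.
```

Calling getAvailableLiveviewSize first (on models that implement it) tells you which sizes the camera will actually accept.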

UIWebView crashing

I have a problem using UIWebViews in an Objective-C app we're developing. It crashes any Retina iPad, while both iPad 1 and 2 behave fine.
The web view is loaded from local HTML/CSS/JS with dynamic content that contains pricing information, so it needs to be essentially a mirror of a website, pre-downloaded to the device.
The page contains lots of images, so I think this is memory-related. I've tried reducing the payload on the page, which stops the crashes. Obviously the stupidity of Apple choosing to quadruple the resolution whilst only doubling the memory is a root cause of why it works fine on non-Retina devices, but how can I manage memory within the page to prevent iOS from destroying the entire app?
Does iOS automatically store imagery as 2048x1536x32bpp bitmaps (theoretically 12 MB per image?) irrespective of the file format? I've tried converting to JPG/PNG but with no effect on the crashes; only reducing the volume of images present on the page stops them. It's my first foray into iOS development, so please be gentle!
iOS won't change the resolution of the original images that the page loads, but it will of course decode them from .png/.jpg to a bitmap in memory for display, so you should start by reducing the resolution of the .png/.jpg images that the page loads.
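The arithmetic behind the answer is simple and explains why switching between JPG and PNG had no effect - the decoded in-memory size depends only on pixel dimensions:

```javascript
// Decoded bitmap size depends only on pixel dimensions, not on the
// compressed file format (JPG vs PNG): 4 bytes per pixel for 32bpp RGBA.
function decodedBytes(width, height, bytesPerPixel = 4) {
  return width * height * bytesPerPixel;
}
// A full-screen Retina iPad image: 2048 * 1536 * 4 = 12,582,912 bytes (~12 MB)
```

Halving each dimension of the source images therefore cuts the decoded memory per image to a quarter.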

Unity3d external camera frame rate

I am working on a live augmented reality application. So far I have worked on many AR-Applications for mobile devices.
Now I have to get the video signal from a Panasonic P2. The camera is a European version. I capture the signal with an AJA Io HD box, which is connected by FireWire to a Mac Pro. So far everything works great - just not in Unity.
When I start the preview in Unity, the frame buffer of the AJA ControlPanel jumps to a frame rate of 59.94 fps, I guess because of a preference in Unity. Because the camera is a European version, I cannot switch it to 59.94 fps or 29.97 fps.
I checked all settings in Unity, but couldn't find anything...
Is there any way to change the frame rate Unity captures from an external camera?
If you're polling the camera from Unity's Update() function, then you will be under the influence of VSync, which limits frame processing to 60 fps.
You can switch off VSync by going to Edit > Project Settings > Quality and setting the V Sync Count option to "Don't Sync".

How do I process video frames in HTML5 quickly?

I am testing HTML5's video API. The plan is to have a video play with an effect, like making it black and white. I have a video element and a canvas working together using a buffer: I take the current video frame and copy it to a scratch buffer where I can process it. The problem is the rate at which it runs.
HTML5's video API has a 'timeupdate' event. I tried using this to have the handler process frames, once for every frame, but it fires at a slower rate than the video plays.
Any ideas to speed up processing frames?
You can get much more frequent redraws by using requestAnimationFrame to decide when to update your canvas, rather than relying on timeupdate, which only fires every 200-250ms - definitely not enough for frame-accurate animation. requestAnimationFrame fires at most once every 16ms (approximately 60 fps), but the browser will throttle it as necessary and sync it with video buffer draw calls. It's pretty much exactly what you want for this sort of thing.
Even with higher frame rates, processing video frames with a 2D canvas is going to be pretty slow. For one thing, you're processing every pixel sequentially in the CPU, running Javascript. The other problem is that you're copying around a lot of memory. There's no way to directly access pixels in a video element. Instead, you have to copy the whole frame into a canvas first. Then, you have to call getImageData, which not only copies the whole frame a second time, but it also has to allocate the whole block of memory again, since it creates a new ImageData every time. Would be nice if you could copy into an existing buffer, but you can't.
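The copy-process-draw loop described above can be sketched as follows; the element ids are placeholders, and the per-pixel grayscale function is the CPU-bound part the answer is warning about:

```javascript
// Convert an RGBA pixel buffer (as returned by getImageData) to
// grayscale in place, using standard luminance weights.
function toGrayscale(data) {
  for (let i = 0; i < data.length; i += 4) {
    const y = Math.round(
      0.299 * data[i] + 0.587 * data[i + 1] + 0.114 * data[i + 2]
    );
    data[i] = data[i + 1] = data[i + 2] = y; // leave alpha untouched
  }
  return data;
}

function startProcessing() {
  const video = document.getElementById('myvideo');   // placeholder id
  const canvas = document.getElementById('mycanvas'); // placeholder id
  const ctx = canvas.getContext('2d');
  function frame() {
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height);            // copy #1
    const image = ctx.getImageData(0, 0, canvas.width, canvas.height);  // copy #2
    toGrayscale(image.data);
    ctx.putImageData(image, 0, 0);
    requestAnimationFrame(frame); // up to ~60 fps, throttled by the browser
  }
  requestAnimationFrame(frame);
}
```

Both full-frame copies happen on every iteration, which is exactly the overhead that makes the WebGL approach below attractive.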
It turns out you can do extremely fast image processing with WebGL. I've written a library called Seriously.js for exactly this purpose. Check out the wiki for a FAQ and tutorial. There's a Hue/Saturation plugin you can use - just drop the saturation to -1 to get your video to grayscale.
The code will look something like this:
var composition = new Seriously();
var effect = composition.effect('hue-saturation');
var target = composition.target('#mycanvas');
effect.source = '#myvideo';
effect.saturation = -1;
target.source = effect;
composition.go();
The big downside of using WebGL is that not every browser or computer supports it - Internet Explorer is out, as is any machine with old or unusual video drivers, and most mobile browsers don't support it either. You can get good stats on it here and here. But you can get very high frame rates on pretty large videos, even with much more complex effects.
(There is also a small issue with a browser bug that, oddly enough, shows up in both Chrome and Firefox. Your canvas will often be one frame behind the video, which is only an issue if the video is paused, and is most egregious if you're skipping around. The only workaround seems to be to keep forcing updates, even if your video is paused, which is less efficient. Please feel free to vote those tickets up so they get some attention.)