My website lets a user record and upload a file using some basic html
<input type="file" accept="video/*;capture=camcorder">
Everything works; however, the recorded video is blurry compared to a video recorded normally on my iPhone. I've tested on multiple devices, and all of them are blurry. The problem only appears when the camera is opened through this input element. Why would the input element make the camera blurry compared to the native iPhone camera app? Can I add some attributes to the HTML element? Do I need to specify a size?
Here is the difference on my phone. Here is a screenshot when using the HTML input (and yes, I tried tapping to focus):
And here is a screenshot from the normal iPhone camera roll (when not opened through the web input):
How can I make this input element camera look as good as the normal iPhone camera app?
It works as expected on Android, but on iOS it opens the camera full screen, and only after exiting the camera does it leave the snapshot in the canvas.
How do I make iOS display the camera feed inline in HTML?
I'm posting a (non-functional) fiddle, to illustrate the html and the initialization methods.
https://jsfiddle.net/nz41b2kq/
{ some code to appease the SO algorithm }
The solution was to use the "playsinline" attribute on the <video> element.
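For reference, a minimal sketch of that pattern (playsinline and muted are standard attributes; the stream-attachment script assumes a browser with getUserMedia support):

```html
<!-- muted + playsinline lets iOS Safari play the stream inline
     instead of switching to the native full-screen player -->
<video id="preview" autoplay muted playsinline></video>

<script>
  navigator.mediaDevices
    .getUserMedia({ video: { facingMode: "environment" } })
    .then((stream) => {
      // Attach the camera stream directly to the inline video element
      document.getElementById("preview").srcObject = stream;
    })
    .catch((err) => console.error("getUserMedia failed:", err));
</script>
```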
I'm trying to get an image capture from the front or back camera of my smartphone using WebRTC. I used one of the WebRTC samples for that. This code works perfectly in desktop browsers, but on smartphones with different operating systems (iOS, Android) I get a black screen in the <video autoplay></video> tag. I tried various browsers; in none of them did the image capture function work properly, and a black screen was displayed everywhere. What should I do to capture the picture?
I have an HTTP connection, and all my smartphones and cameras work fine, so the problem is definitely in WebRTC (or in how I use it).
For Safari, try adding playsinline to the video element. See this thread for background information.
If that doesn't help, you might want to check the media stream as well as the video element's readyState property.
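As a quick sketch of those two checks (browser-only code; the logging is illustrative, not part of any fix):

```html
<video autoplay muted playsinline></video>

<script>
  const video = document.querySelector("video");

  navigator.mediaDevices.getUserMedia({ video: true }).then((stream) => {
    // Check the track first: "live" means frames are being delivered,
    // "ended" means the camera never started or was stopped
    const [track] = stream.getVideoTracks();
    console.log("track readyState:", track.readyState);

    video.srcObject = stream;

    video.addEventListener("loadeddata", () => {
      // HAVE_CURRENT_DATA (2) or higher means at least one frame decoded;
      // a black screen with readyState 0 points at the stream, not CSS
      console.log("video readyState:", video.readyState);
    });
  });
</script>
```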
I'd like to show a list of image and video files from the camera roll in a React Native app. I accidentally discovered that the Image component can display what appears to be the first frame of a .mov file on iOS. I checked the documentation, but it says nothing about video files.
A React component for displaying different types of images, including network images, static resources, temporary local images, and images from local disk, such as the camera roll.
My question is whether it's good practice to use the Image component to display a "thumbnail" from a video file this way, or whether it's better to store a separate thumbnail (image file) for every video and display those in the list instead.
I also noticed this feature: when we point an Image component at a '.mp4' file from local storage, it works as a thumbnail without any degradation in performance. However, when you try to render a network video file, it doesn't seem to work that way.
So this feature only works when displaying videos from the phone's gallery.
Therefore, it is better to maintain separate thumbnails for video files if your videos come from any other source.
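A sketch of the behaviour described above (the local file URI is hypothetical, used only to illustrate the pattern):

```jsx
import React from "react";
import { Image } from "react-native";

// Pointing Image at a *local* video file renders its first frame on iOS.
// The URI below is hypothetical. A remote video URL would not work this
// way, so for network videos you would display a stored thumbnail image.
const VideoThumb = () => (
  <Image
    style={{ width: 120, height: 90 }}
    source={{ uri: "file:///path/to/IMG_0001.mov" }}
  />
);

export default VideoThumb;
```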
We need to open the iPhone camera to take images that will be saved to the camera roll.
I have read many examples here, and all of them open the UIImagePickerController.
Besides the fact that I can't understand why I have to open the picker view in order to open the camera, I just can't do that. I don't want the picker view, because I have my own custom photo album view that we built, and we just need a little button in it that opens the camera to take an image, without opening any other views above it.
Is it possible to use the camera without this picker view covering my scene?
Or can I send the user to the camera app and then bring them back to my app?
Thanks.
Instead of the high-level classes (where Apple supplies the UI element), you have to go to a more foundational (lower-level) set of APIs, namely AVCaptureDevice and AVCaptureDeviceInput.
And Apple has some nice source code available in their AVCam project.
If you want to display the camera stream in your app without UIImagePickerController, then you should use the AVFoundation framework.
Here are some examples and tutorials:
take-photos-with AVFoundation
Custom camera
Displaying camera
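To sketch the AVFoundation approach both answers point at (assuming iOS 10+; the class and method names are real AVFoundation APIs, but this is an illustrative outline, not production code — error handling is reduced to guards):

```swift
import AVFoundation
import UIKit

// A view controller that embeds a live camera preview in its own view
// hierarchy, so no picker UI covers the custom photo album scene.
class CameraViewController: UIViewController {
    let session = AVCaptureSession()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Wire the default camera into the capture session
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // Photo output for actually taking the picture later
        let output = AVCapturePhotoOutput()
        if session.canAddOutput(output) { session.addOutput(output) }

        // Show the preview as a plain layer inside our own view
        let preview = AVCaptureVideoPreviewLayer(session: session)
        preview.frame = view.bounds
        preview.videoGravity = .resizeAspectFill
        view.layer.addSublayer(preview)

        session.startRunning()
    }
}
```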
I am making a new collage application using Google Chrome's WebRTC features that let me access the camera in JavaScript. I have been able to put the camera feed on a video element, take snapshots of the camera, store them in variables, and draw them onto my canvas.
My new problem is that even when the CSS -webkit-filter is changed on the video element (by clicking the video preview), the copied data is raw and unfiltered. Is there any way to copy and draw the filtered data from the video element, or to draw a filter onto a region of a canvas?
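One possibility worth noting (my assumption, not from the thread): a CSS filter on the video element only changes how it is rendered, never the pixel data that drawImage copies, but the 2D canvas context has its own filter property that can be set before drawing. A browser-only sketch:

```javascript
// CSS filters on <video> don't affect the pixels drawImage copies;
// re-apply the same filter string on the canvas context instead.
const video = document.querySelector("video");
const canvas = document.querySelector("canvas");
const ctx = canvas.getContext("2d");

function snapshotWithFilter(cssFilter) {
  ctx.save();
  ctx.filter = cssFilter;   // e.g. "grayscale(1)" or "sepia(0.6)"
  ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
  ctx.restore();            // subsequent draws are unfiltered again
}

snapshotWithFilter("grayscale(1)");
```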