Change the brightness of a video and save it to the photo library in Objective-C, Xcode

I am developing an app for the iOS platform. The app is about video editing, and I want to give users the option to change the brightness of a video before saving it. I am using the GPUImage library for video editing. Is there any way I can change the brightness of the video and save it to the photo library? Thanks.
P.S. I tried to find a similar issue; sorry if the answer to this question already exists, I could not find it.
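A minimal sketch of how this could be wired up with GPUImage's movie classes, assuming local NSURL values inputURL and outputURL (hypothetical names) and a hypothetical helper method; in real code the movie source and writer should be held in strong properties so ARC does not release them while processing runs:

```objc
#import <GPUImage/GPUImage.h>
#import <AssetsLibrary/AssetsLibrary.h>

// Hypothetical helper: re-encodes the video at inputURL with adjusted
// brightness, writes the result to outputURL, then saves it to the photo
// library. Class and property names come from GPUImage's public headers.
- (void)exportVideoAtURL:(NSURL *)inputURL
              brightness:(CGFloat)brightness   // GPUImage expects -1.0 ... 1.0, 0.0 = unchanged
                   toURL:(NSURL *)outputURL
{
    // NOTE: keep movieFile and movieWriter as strong properties in a real app;
    // locals are used here only to keep the sketch short.
    GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:inputURL];
    GPUImageBrightnessFilter *brightnessFilter = [[GPUImageBrightnessFilter alloc] init];
    brightnessFilter.brightness = brightness;

    GPUImageMovieWriter *movieWriter =
        [[GPUImageMovieWriter alloc] initWithMovieURL:outputURL
                                                 size:CGSizeMake(1280.0, 720.0)]; // assumed output size

    [movieFile addTarget:brightnessFilter];
    [brightnessFilter addTarget:movieWriter];

    // Pass the original audio track through untouched.
    movieWriter.shouldPassthroughAudio = YES;
    movieFile.audioEncodingTarget = movieWriter;
    [movieFile enableSynchronizedEncodingUsingMovieWriter:movieWriter];

    [movieWriter startRecording];
    [movieFile startProcessing];

    __weak GPUImageMovieWriter *weakWriter = movieWriter;
    [movieWriter setCompletionBlock:^{
        [brightnessFilter removeTarget:weakWriter];
        [weakWriter finishRecording];

        // Save the re-encoded file into the photo library (ALAssetsLibrary-era API).
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputURL]) {
            [library writeVideoAtPathToSavedPhotosAlbum:outputURL
                                        completionBlock:^(NSURL *assetURL, NSError *error) {
                NSLog(@"Saved to photo library: %@ (error: %@)", assetURL, error);
            }];
        }
    }];
}
```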

Related

Snapchat style captions on recorded videos

I am using Expo and I am trying to implement a feature similar to Snapchat/Instagram's draw-on-video and add-text/caption-to-video before uploading it. My problem is not the UI part but editing the original video and getting a URI for the new video.
I know that with images you can use libraries like expo-pixi and then take a snapshot of a view, but I am not sure how to go about this for recorded videos specifically.
Would anyone be kind enough to point me in the right direction?

Want to create a custom audio effect in native C/C++ for Android

I am working on a project which deals with audio/video playback on Android KitKat. I am able to play video using VideoView, and its MediaPlayer helps to modify the audio track of the video. I know about PlaybackParams, which can be used for pitch and tempo shifting, but it is only available from API 23.
I have heard about a library called SoundTouch which can be used for pitch and tempo shifting, but the available source code shows that it works on WAV files.
I am interested in using the SoundTouch library as an audio effect library, which I can register in Android's /system/etc/audio_effects.conf and place the .so file in /system/lib.
But I don't know how to adapt the SoundTouch library so that it reads audio from Android's MediaPlayer, as I am planning to call this library in the same way we use PresetReverb/Equalizer.
Kindly help me out if anyone has done such an implementation.

Open the iPhone camera?

We need to open the iPhone camera to take images that will be saved to the camera roll.
I have read many examples here, and all of them open a UIImagePickerController.
Besides the fact that I can't understand why I have to open the picker view in order to open the camera, I just can't do that. I don't want the picker view, because I have a custom photo album that we built, and we just need a little button in it that opens the camera to take an image, without opening any other views above it.
Is it possible to use the camera without this picker view that will cover my scene?
Or can I send the user to the Camera app and then take them back to my app?
Thanks.
Instead of the high-level classes (i.e. where Apple supplies the UI element), you have to go to a more foundational (lower) level of APIs, namely AVCaptureDevice and AVCaptureDeviceInput.
And Apple has some nice source code available in their AVCam project.
If you want to display the camera stream in your app without UIImagePickerController, then you should use the AVFoundation framework.
Here are some examples and tutorials:
take-photos-with AVFoundation
Custom camera
Displaying camera
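Building on the answers above, here is a rough Objective-C sketch of the AVFoundation approach using AVCaptureSession, AVCaptureDeviceInput, and AVCaptureStillImageOutput (the still-capture class of that era). The property names self.session and self.stillOutput, the previewView argument, and the method names are assumptions for the sake of the example:

```objc
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

// Shows a live preview inside your own view and captures a still image
// straight to the camera roll, with no UIImagePickerController involved.
- (void)setUpCameraInView:(UIView *)previewView
{
    self.session = [[AVCaptureSession alloc] init];            // assumed strong property
    self.session.sessionPreset = AVCaptureSessionPresetPhoto;

    AVCaptureDevice *camera =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if (input) {
        [self.session addInput:input];
    }

    self.stillOutput = [[AVCaptureStillImageOutput alloc] init]; // assumed strong property
    [self.session addOutput:self.stillOutput];

    // Live preview layer drawn inside your own UI, e.g. behind your capture button.
    AVCaptureVideoPreviewLayer *previewLayer =
        [AVCaptureVideoPreviewLayer layerWithSession:self.session];
    previewLayer.frame = previewView.bounds;
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [previewView.layer addSublayer:previewLayer];

    [self.session startRunning];
}

- (void)captureButtonTapped
{
    AVCaptureConnection *connection =
        [self.stillOutput connectionWithMediaType:AVMediaTypeVideo];
    [self.stillOutput captureStillImageAsynchronouslyFromConnection:connection
                                                   completionHandler:^(CMSampleBufferRef sampleBuffer, NSError *error) {
        if (!sampleBuffer) {
            return;
        }
        NSData *jpegData =
            [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:sampleBuffer];
        UIImage *image = [UIImage imageWithData:jpegData];
        // Saves the captured image directly to the camera roll.
        UIImageWriteToSavedPhotosAlbum(image, nil, NULL, NULL);
    }];
}
```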

Can an iOS app access the subtitles or closed captions of a video in the device's Videos library?

Is there a way to do this, by using the Media Player Framework or any other means?
I would like to build an app that allows the user to work with the subtitles of a store-bought video.
Starting from iOS 5, UIKit's UIAccessibility API provides the function UIAccessibilityIsClosedCaptioningEnabled() to get the value of the Settings -> Video -> Closed Captioning switch.
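A short Objective-C sketch of reading that switch and observing changes; note that this only reports whether the user has captioning turned on, it does not expose a video's subtitle data:

```objc
#import <UIKit/UIKit.h>

// Reads the user's Closed Captioning preference and listens for changes.
- (void)observeClosedCaptioningSetting
{
    BOOL captionsEnabled = UIAccessibilityIsClosedCaptioningEnabled();
    NSLog(@"Closed captioning enabled: %d", captionsEnabled);

    [[NSNotificationCenter defaultCenter]
        addObserver:self
           selector:@selector(captioningStatusChanged:)
               name:UIAccessibilityClosedCaptioningStatusDidChangeNotification
             object:nil];
}

- (void)captioningStatusChanged:(NSNotification *)note
{
    NSLog(@"Closed captioning is now %@",
          UIAccessibilityIsClosedCaptioningEnabled() ? @"on" : @"off");
}
```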

iPad Gallery Reveal Feature

I've been searching for a library or open source project that does something like the iPad gallery reveal feature: spread to reveal the photos in an album, pinch to close. Does anyone know of such a library?
The reason you are having trouble finding this is that applications have been rejected for implementing this feature. Read this for more details.