How to display new songs along with old songs in UITableView - Objective-C

- (void)mediaPicker:(MPMediaPickerController *)mediaPicker didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection {
    [self dismissViewControllerAnimated:YES completion:nil];
    NSURL *assetUrl = [mediaItemCollection.representativeItem valueForProperty:MPMediaItemPropertyAssetURL];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:assetUrl options:nil];
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
    myPlayer = [AVPlayer playerWithPlayerItem:playerItem];
    collectionMutableCopy = [mediaItemCollection.items mutableCopy];
    [self.myPlaylistTable reloadData];
    //[self updatePlayerQueueWithMediaCollection:mediaItemCollection];
    [myPlayer play];
}
I have a button that lets me select songs from the iPhone library, and another UIButton that shows the list of songs selected from the iPod library.
I display the selected songs in a UITableView called myPlaylistTable.
When I pick, say, two songs from the iPod library, those two songs appear in myPlaylistTable, but when I try to add a few more songs, the two songs that were displayed earlier disappear.
For example, if I select two songs the first time and then add another two, the table should show four songs.

Please consider doing the following; it should help resolve your problem.
self.clearsSelectionOnViewWillAppear = NO;
That will keep the row selected unless you manually deselect it or the user selects another row in the table.
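Separately, if the goal is for previously picked songs to stay in the table, one option (a sketch, assuming collectionMutableCopy is the mutable array backing myPlaylistTable's data source) is to append the picker's new items in the delegate method above instead of replacing the array each time:
if (collectionMutableCopy == nil) {
    collectionMutableCopy = [NSMutableArray array];
}
// Append the newly picked items rather than overwriting the previous selection
[collectionMutableCopy addObjectsFromArray:mediaItemCollection.items];
[self.myPlaylistTable reloadData];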

Related

PHImageManager requestImageForAsset returns a non-nil but incorrect result

I have a page view controller which prepares multiple view controllers (and shows one at a time, of course). Each view controller loads its corresponding asset based on the asset's localIdentifier. It works fine most of the time. But if it tries to load an asset that has been deleted from the Camera Roll or from albums synced from a Mac, the view controller dismisses automatically. The reason the asset ends up in the array of local identifiers in the first place is that, for some reason, it remains in the Moments tab of the Photos app (but again, it doesn't exist in the Camera Roll or the synced albums).
// Get the image
PHCachingImageManager *imageManager = [[PHCachingImageManager alloc] init];
CGSize targetSize = CGSizeZero;
if (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPhone) {
    // iPhone
    targetSize = CGSizeMake(actualSizedImageRect.size.width, actualSizedImageRect.size.height);
} else {
    // iPad
    targetSize = CGSizeMake(2000, 2000);
}

PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
options.deliveryMode = PHImageRequestOptionsDeliveryModeOpportunistic;
options.resizeMode = PHImageRequestOptionsResizeModeNone;
options.version = PHImageRequestOptionsVersionCurrent;
options.progressHandler = ^(double progress, NSError *error, BOOL *stop, NSDictionary *dictionary) {
    dispatch_async(dispatch_get_main_queue(), ^{
    });
};

[imageManager requestImageForAsset:self.asset
                        targetSize:targetSize
                       contentMode:PHImageContentModeAspectFit
                           options:options
                     resultHandler:^(UIImage *result, NSDictionary *info) {
    dispatch_async(dispatch_get_main_queue(), ^{
        // Note: when a photo that no longer exists in albums (but still shows in Moments) is about to be
        // presented, result receives an incorrect image. That is probably what breaks the storyboard
        // constraints, so the view controller dismisses automatically.
        [self.assetImageView setImage:result];
        if (@available(iOS 11.0, *)) {
            if (self.asset.playbackStyle == PHAssetPlaybackStyleLivePhoto) {
                [self.assetImageView setHidden:YES];
            }
        }
    });
}];
I have found so far that the result handler above is called several times, and it eventually returns a result containing a different image, whose dimensions cause a problem with the storyboard, so the view controller simply dismisses.
If the handler returned nil, I could simply decide not to load it into self.assetImageView, but even when the wrong image is returned, the result is not nil. I want to know if there's a way to figure out whether the returned result really belongs to the asset.
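One thing worth inspecting (a sketch based on the documented Photos keys, not a confirmed fix for the deleted-asset case) is the info dictionary passed to the result handler: PHImageResultIsDegradedKey flags the temporary low-quality image that the opportunistic delivery mode sends first, while PHImageErrorKey and PHImageCancelledKey report failed or cancelled requests, so those results can be skipped instead of being assigned to the image view:
// Inside the resultHandler block
NSError *requestError = info[PHImageErrorKey];
BOOL cancelled = [info[PHImageCancelledKey] boolValue];
BOOL degraded = [info[PHImageResultIsDegradedKey] boolValue];
if (requestError != nil || cancelled) {
    return; // the request failed or was cancelled; leave the image view alone
}
if (degraded) {
    return; // low-quality preview pass; wait for the final callback before updating the UI
}
[self.assetImageView setImage:result];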

Is it possible to display a still image in an AVPlayer?

I'm trying to make a program where I need to show images and videos on an external screen. So I have a table view where I can enter names and links to video files (mp4 for now) and image files (jpg for now).
I can't find a way to display still images in AVPlayer.
So to visualize the video files or the images, I created an AVPlayer and a UIImageView of the same size. The AVPlayer is placed above the UIImageView.
If I want to display an image, I hide the AVPlayer.
if ([self selectedVideoURL] != nil) {
    NSString *myString = [[self selectedVideoURL] absoluteString];
    NSString *extension = [myString substringFromIndex:[myString length] - 3];
    if (![extension isEqualToString:@"jpg"]) {
        self.playerView.hidden = false;
        [self playingVideo:[self selectedVideoURL]];
    }
    else {
        self.playerView.hidden = true;
        [self displayingImage:[self selectedVideoURL]];
    }
}
Is there any way to make it simpler?
Thanks...
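One small simplification, as a sketch that reuses the same selectedVideoURL, playerView, and helper methods from the snippet above: let NSURL parse the extension instead of slicing the last three characters of the absolute string, which also copes with extensions of other lengths:
NSURL *url = [self selectedVideoURL];
if (url != nil) {
    NSString *extension = [[url pathExtension] lowercaseString];
    if ([extension isEqualToString:@"jpg"]) {
        self.playerView.hidden = true;
        [self displayingImage:url];
    }
    else {
        self.playerView.hidden = false;
        [self playingVideo:url];
    }
}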

How to switch camera mode to video mode using GPUImage in iOS

I am making an app like Instagram, using the GPUImage framework. In this app I have to take photos and videos and share them. I am able to capture photos using this framework, and now I have to capture video, but I am struggling with how to change the camera mode from photos to video. Any help or tutorial would be very welcome. I used this code for the camera in photo mode.
if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera])
{
    self.imagepicker.sourceType = UIImagePickerControllerSourceTypeCamera;
    [[NSBundle mainBundle] loadNibNamed:@"OverlayView" owner:self options:nil];
    self.overlayView.frame = self.imagepicker.cameraOverlayView.frame;
    self.imagepicker.cameraOverlayView = self.overlayView;
    self.overlayView = nil;
    CGSize result = [[UIScreen mainScreen] bounds].size;
    self.imagepicker.showsCameraControls = NO;
    self.imagepicker.allowsEditing = NO;
    self.imagepicker.wantsFullScreenLayout = NO;
    // self.imagepicker.mediaTypes = [[NSArray alloc] initWithObjects:(NSString *)kUTTypeMovie, nil];
}
else {
    self.imagepicker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
}
In my case, I'm using GPUImage to do both (pictures and videos). Therefore I've created two objects: one of type GPUImageStillCamera (pictures) and the other of type GPUImageVideoCamera (videos).
So whenever you need to switch between cameras you basically stop the GPUImageStillCamera capture and initialize a video camera (note that you have to adapt this snippet to your project):
func initializeVideoCamera() {
    // Stop the capture of GPUImageStillCamera
    stillCamera.stopCameraCapture()

    videoCamera = GPUImageVideoCamera.init(sessionPreset: AVCaptureSessionPreset1920x1080, cameraPosition: .Back)
    videoCamera?.outputImageOrientation = .Portrait
    videoCamera?.addTarget(filter)

    // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
    unlink(pathToMovieFile)

    initializeWriteWithPath(pathToMovieFile)
    videoCamera?.startCameraCapture()
}
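Switching back to photo mode follows the same pattern in reverse. Here is a rough Objective-C sketch of that idea; the stillCamera, videoCamera, and filter names are placeholders for whatever the project already uses:
- (void)switchBackToStillCamera {
    // Stop the video capture and detach it from the filter chain
    [videoCamera stopCameraCapture];
    [videoCamera removeAllTargets];

    // Resume the still camera and route it through the same filter again
    [stillCamera addTarget:filter];
    [stillCamera startCameraCapture];
}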

MPMoviePlayerController not working properly

I am making an application in which I am trying to play a video. The video starts properly, but the screen turns black after about 4 seconds. I don't know what the problem is.
Also, setting player.moviePlayer.shouldAutoplay = NO has no effect; the video still starts automatically.
Here is Code:
NSString *urlString = [[NSBundle mainBundle] pathForResource:@"Movie" ofType:@"m4v"];
NSURL *urlObj = [NSURL fileURLWithPath:urlString];
UIGraphicsBeginImageContext(CGSizeMake(1, 1));
MPMoviePlayerViewController *player = [[MPMoviePlayerViewController alloc] initWithContentURL:urlObj];
UIGraphicsEndImageContext();
[player.view setBounds:self.view.bounds];
// when playing from a server, the source type should be MPMovieSourceTypeStreaming
[player.moviePlayer setMovieSourceType:MPMovieSourceTypeStreaming];
[player.moviePlayer setScalingMode:MPMovieScalingModeAspectFill];
player.moviePlayer.shouldAutoplay = NO;
[self.view addSubview:player.view];
[player.moviePlayer play];
Am I missing something here?
I also tried to get the total duration of the video (using the duration property of MPMoviePlayerController), but it shows 0.0. How do I get the duration of the video?
NSString *urlString = [[NSBundle mainBundle] pathForResource:@"Movie" ofType:@"m4v"];
NSURL *urlObj = [NSURL fileURLWithPath:urlString];
UIGraphicsBeginImageContext(CGSizeMake(1, 1));
MPMoviePlayerViewController *player = [[MPMoviePlayerViewController alloc] initWithContentURL:urlObj];
UIGraphicsEndImageContext();
[player.view setBounds:self.view.bounds];
// when playing from a server, the source type should be MPMovieSourceTypeStreaming
[player.moviePlayer setMovieSourceType:MPMovieSourceTypeStreaming]; // I was missing this line, therefore the video was not playing
[player.moviePlayer setScalingMode:MPMovieScalingModeAspectFill];
[self.view addSubview:player.view];
[player.moviePlayer play];
There are several issues here:
For this type of usage (integrating the player into your view), you should be using MPMoviePlayerController, not MPMoviePlayerViewController. Use MPMoviePlayerViewController when you want to have a self-contained view controller which can be presented using presentMoviePlayerViewControllerAnimated:.
Assuming you are using ARC, the main problem is that nothing is keeping a reference to your player object. As a consequence, the player is disappearing shortly after you create it. You should keep a reference to it by assigning it to a property or instance variable of your view controller.
For a full example of this, see Till's excellent answer to a similar question.
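A minimal sketch of that idea, assuming the switch to MPMoviePlayerController suggested above (the property name is illustrative):
// In the view controller's interface or class extension
@property (nonatomic, strong) MPMoviePlayerController *moviePlayer;

// When setting up playback, assign to the property so the player stays alive
self.moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:urlObj];
[self.view addSubview:self.moviePlayer.view];
[self.moviePlayer play];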
I'm not sure what the intended purpose of the UIGraphicsBeginImageContext and UIGraphicsEndImageContext calls is, but I can't see that they're needed here.
As for shouldAutoplay = NO, the video is still starting because you are calling play immediately afterwards.
The player's duration property only contains a useful value after a MPMovieDurationAvailableNotification has been received. You'll need to do something similar to the following to have access to the actual duration:
__weak MediaPlayerController *weakSelf = self;
[[NSNotificationCenter defaultCenter] addObserverForName:MPMovieDurationAvailableNotification
                                                  object:self.player
                                                   queue:[NSOperationQueue mainQueue]
                                              usingBlock:^(NSNotification *note) {
    NSLog(@"Movie duration: %lf", weakSelf.player.duration);
}];
Use removeObserver:name:object: to remove the observer when you are done.
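Since the block-based API also returns an observer token, another option (a sketch; how you hold onto the token is up to you) is to keep that token and pass it to removeObserver: when you no longer need the notification:
// Keep the token returned by addObserverForName:object:queue:usingBlock:
id durationObserver = [[NSNotificationCenter defaultCenter] addObserverForName:MPMovieDurationAvailableNotification
                                                                         object:self.player
                                                                          queue:[NSOperationQueue mainQueue]
                                                                     usingBlock:^(NSNotification *note) { /* ... */ }];

// Later, when you are done observing
[[NSNotificationCenter defaultCenter] removeObserver:durationObserver];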

Application crashes while adding a photo when adding a new contact

I am using ABNewPersonViewController to add a new contact to the address book. Everything is fine if I do not add a photo from the photo albums. It crashes if I add a photo, and here is the log:
NSInvalidArgumentException', reason: '*** -[NSCFDictionary setObject:forKey:]: attempt to insert nil value (key: UIImagePickerControllerOriginalImage)
What am I doing wrong, and how can I fix this problem?
Thanks
This is the code I use:
mNewPersonViewController = [[[ABNewPersonViewController alloc] init] autorelease];
mNewPersonViewController.hidesBottomBarWhenPushed = YES;
mNewPersonViewController.addressBook = app.addressBook;
mNewPersonViewController.newPersonViewDelegate = self;
UINavigationController *presonNavController = [[UINavigationController alloc] initWithRootViewController:mNewPersonViewController];
self.mPopOverController = [[UIPopoverController alloc] initWithContentViewController:presonNavController];
CGRect frame = [sender frame];
[self.mPopOverController presentPopoverFromRect:frame inView:self.view permittedArrowDirections:UIPopoverArrowDirectionUp animated:YES];
[presonNavController release];
Looks like you are trying to insert a nil value into the dictionary.
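The exception message says that setObject:forKey: was called with a nil object for UIImagePickerControllerOriginalImage. As a general illustration of the pattern behind that error (not a confirmed fix for ABNewPersonViewController's internal photo handling; info and photoDictionary are hypothetical names), guard before inserting:
// info and photoDictionary are placeholder names for illustration
UIImage *pickedImage = info[UIImagePickerControllerOriginalImage];
if (pickedImage != nil) {
    // setObject:forKey: throws if the object is nil, so only insert when it exists
    [photoDictionary setObject:pickedImage forKey:UIImagePickerControllerOriginalImage];
} else {
    NSLog(@"Picker returned no original image; skipping");
}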