Every time the same video gets uploaded + Assets Library - Objective-C

ImgVidData = [[NSData alloc] init];
ALAssetRepresentation *representation = alAsset.defaultRepresentation;
NSURL *movieURL = representation.url;
NSURL *uploadURL = [NSURL fileURLWithPath:[[NSTemporaryDirectory() stringByAppendingPathComponent:@"test"] stringByAppendingString:@".mov"]];
// ImgVidData = [NSData dataWithContentsOfFile:[[NSTemporaryDirectory() stringByAppendingPathComponent:@"test"] stringByAppendingString:@".mp4"]];
AVAsset *asset = [AVURLAsset URLAssetWithURL:movieURL options:nil];
AVAssetExportSession *session =
    [AVAssetExportSession exportSessionWithAsset:asset presetName:AVAssetExportPresetMediumQuality];
session.outputFileType = AVFileTypeQuickTimeMovie;
session.outputURL = uploadURL;
videoURL = uploadURL;
[session exportAsynchronouslyWithCompletionHandler:^{
    if (session.status == AVAssetExportSessionStatusCompleted)
    {
        NSLog(@"output video URL %@", uploadURL);
    }
}];
ImgVidData = [NSData dataWithContentsOfURL:session.outputURL];
Above is the code I have implemented inside didFinishPickingAssets:(NSArray *)assets.
I'm picking a video and uploading it to the server: I fetch the picked video's URL, convert it into NSData, and upload that data to the server.
STRANGE: every time, the same video gets uploaded to the server, the one I chose the first time. I choose a different video each time, but the same video gets uploaded.
Any help!
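One likely cause, judging from the code above (a suggestion, not a confirmed fix): exportAsynchronouslyWithCompletionHandler: returns immediately, so ImgVidData is read before the export has finished, and AVAssetExportSession fails when a file already exists at its outputURL, so the test.mov from the first run never gets replaced. A minimal sketch of the corrected ordering (the upload step is left as a placeholder):
// Delete any leftover file first; AVAssetExportSession will not overwrite an existing one.
[[NSFileManager defaultManager] removeItemAtURL:uploadURL error:nil];
session.outputURL = uploadURL;
session.outputFileType = AVFileTypeQuickTimeMovie;
[session exportAsynchronouslyWithCompletionHandler:^{
    if (session.status == AVAssetExportSessionStatusCompleted) {
        // Read the exported file only after the export has actually completed.
        NSData *exportedData = [NSData dataWithContentsOfURL:uploadURL];
        // ... upload exportedData to the server here ...
    } else {
        NSLog(@"export failed: %@", session.error);
    }
}];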


Cannot load image to UIImageView from absolute path

I use UIImagePickerController to pick an image and load it into my UIImageView. I want to save the user's pick and load it later, and I thought it would be good to save the absolute path in user defaults, but it isn't working.
How I save the path (this part works):
-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    [self.backgroundImage setImage:info[UIImagePickerControllerOriginalImage]];
    NSURL *localUrl = (NSURL *)[info valueForKey:UIImagePickerControllerReferenceURL];
    // in localUrl I see: assets-library://asset/asset.JPG?id=B6C0A21C-07C3-493D-8B44-3BA4C9981C25&ext=JPG
    NSUserDefaults *saves = [NSUserDefaults standardUserDefaults];
    [saves setValue:[localUrl absoluteString] forKey:@"backimage"];
    [saves synchronize];
    [self dismissViewControllerAnimated:YES completion:nil];
}
How I try to load (not working):
NSUserDefaults *saves = [NSUserDefaults standardUserDefaults];
if (![saves objectForKey:@"backimage"]) {
    [self.backgroundImage setImage:[UIImage imageNamed:@"gameBackiPhone"]];
} else {
    NSURL *url = [NSURL URLWithString:[saves objectForKey:@"backimage"]];
    // in url I see: assets-library://asset/asset.JPG?id=B6C0A21C-07C3-493D-8B44-3BA4C9981C25&ext=JPG
    UIImage *bimage = [[UIImage alloc] initWithData:[NSData dataWithContentsOfURL:url]];
    [self.backgroundImage setImage:bimage];
}
I cannot figure out how this should be done.
You cannot load an image directly from an asset URL; you need to use the ALAssetsLibrary class. Use the following snippet to load an image from an asset URL.
// *** It will return the Asset for the URL passed, create an Image from the Asset, and set it into your `UIImageView` ***
ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
{
    ALAssetRepresentation *rep = [myasset defaultRepresentation];
    CGImageRef iref = [rep fullResolutionImage];
    if (iref) {
        UIImage *largeImage = [UIImage imageWithCGImage:iref];
        yourImageView.image = largeImage;
    }
};
// *** If any error occurs while getting the image from the Assets Library, the following block will be invoked ***
ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
{
    NSLog(@"Can't get image - %@", [myerror localizedDescription]);
};
// *** Set the Asset URL to load the image (assets-library://asset/asset.JPG?id=B6C0A21C-07C3-493D-8B44-3BA4C9981C25&ext=JPG) ***
NSURL *asseturl = [NSURL URLWithString:yourURL];
// *** Create an ALAssetsLibrary instance and load the image (MRC-era code; omit autorelease under ARC) ***
ALAssetsLibrary *assetslibrary = [[[ALAssetsLibrary alloc] init] autorelease];
[assetslibrary assetForURL:asseturl
               resultBlock:resultblock
              failureBlock:failureblock];
Don't forget to add the AssetsLibrary framework to your project.
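Wired into the question's loading branch, this might look roughly like the following (a sketch reusing the saves and backgroundImage names from the question):
NSURL *asseturl = [NSURL URLWithString:[saves objectForKey:@"backimage"]];
ALAssetsLibrary *assetslibrary = [[ALAssetsLibrary alloc] init];
[assetslibrary assetForURL:asseturl resultBlock:^(ALAsset *myasset) {
    CGImageRef iref = [[myasset defaultRepresentation] fullResolutionImage];
    if (iref) {
        UIImage *bimage = [UIImage imageWithCGImage:iref];
        // The result block may run off the main thread; hop back before touching UIKit.
        dispatch_async(dispatch_get_main_queue(), ^{
            [self.backgroundImage setImage:bimage];
        });
    }
} failureBlock:^(NSError *myerror) {
    NSLog(@"Can't get image - %@", [myerror localizedDescription]);
}];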

AVFoundation - why can't I get the video orientation right

I am using AVCaptureSession to capture video from a device's camera and then using AVAssetWriterInput and AVAssetTrack to compress/resize the video before uploading it to a server. The final videos will be viewed on the web via an HTML5 video element.
I'm running into multiple issues trying to get the orientation of the video correct. My app only supports landscape orientation and all captured videos should be in landscape orientation. However, I would like to allow the user to hold their device in either landscape direction (i.e. home button on either the left or the right hand side).
I am able to make the video preview show in the correct orientation with the following line of code:
_previewLayer.connection.videoOrientation = UIDevice.currentDevice.orientation;
The problems start when processing the video via AVAssetWriterInput and friends. The result does not seem to account for whether the video was captured in landscape left or landscape right; in other words, sometimes the video comes out upside down. After some googling I found many people suggesting that the following line of code would solve this issue:
writerInput.transform = videoTrack.preferredTransform;
...but this doesn't seem to work. After a bit of debugging I found that videoTrack.preferredTransform is always the same value, regardless of the orientation the video was captured in.
I tried manually tracking what orientation the video was captured in and setting the writerInput.transform to CGAffineTransformMakeRotation(M_PI) as needed. Which solved the problem!!!
...sorta
When I viewed the results on the device, this solution worked as expected: videos were right-side up regardless of left vs. right orientation while recording. Unfortunately, when I viewed the exact same videos in another browser (Chrome on a MacBook) they were all upside down!?
What am I doing wrong?
EDIT
Here's some code, in case it's helpful...
-(void)compressFile:(NSURL*)inUrl;
{
    NSString* fileName = [@"compressed." stringByAppendingString:inUrl.lastPathComponent];
    NSError* error;
    NSURL* outUrl = [PlatformHelper getFilePath:fileName error:&error];
    NSDictionary* compressionSettings = @{ AVVideoProfileLevelKey: AVVideoProfileLevelH264Main31,
                                           AVVideoAverageBitRateKey: [NSNumber numberWithInt:2500000],
                                           AVVideoMaxKeyFrameIntervalKey: [NSNumber numberWithInt:30] };
    NSDictionary* videoSettings = @{ AVVideoCodecKey: AVVideoCodecH264,
                                     AVVideoWidthKey: [NSNumber numberWithInt:1280],
                                     AVVideoHeightKey: [NSNumber numberWithInt:720],
                                     AVVideoScalingModeKey: AVVideoScalingModeResizeAspectFill,
                                     AVVideoCompressionPropertiesKey: compressionSettings };
    NSDictionary* videoOptions = @{ (id)kCVPixelBufferPixelFormatTypeKey: [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] };
    AVAssetWriterInput* writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
    writerInput.expectsMediaDataInRealTime = YES;
    AVAssetWriter* assetWriter = [AVAssetWriter assetWriterWithURL:outUrl fileType:AVFileTypeMPEG4 error:&error];
    assetWriter.shouldOptimizeForNetworkUse = YES;
    [assetWriter addInput:writerInput];
    AVURLAsset* asset = [AVURLAsset URLAssetWithURL:inUrl options:nil];
    AVAssetTrack* videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    // !!! this line does not work as expected and causes all sorts of issues (videos display sideways in some cases) !!!
    //writerInput.transform = videoTrack.preferredTransform;
    AVAssetReaderTrackOutput* readerOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:videoOptions];
    AVAssetReader* assetReader = [AVAssetReader assetReaderWithAsset:asset error:&error];
    [assetReader addOutput:readerOutput];
    [assetWriter startWriting];
    [assetWriter startSessionAtSourceTime:kCMTimeZero];
    [assetReader startReading];
    [writerInput requestMediaDataWhenReadyOnQueue:_processingQueue usingBlock:
    ^{
        /* snip */
    }];
}
The problem is that modifying the writerInput.transform property only adds a tag to the video file's metadata, which instructs the video player to rotate the file during playback. That's why the videos play in the correct orientation on your device (I'm guessing they would also play correctly in QuickTime Player).
The pixel buffers captured by the camera are still laid out in the orientation in which they were captured. Many video players will not check for the preferred-orientation metadata tag and will just play the file in the native pixel orientation.
If you want the user to be able to record video holding the phone in either landscape mode, you need to rectify this at the AVCaptureSession level before compression by performing a transform on the CVPixelBuffer of each video frame. This Apple Q&A covers it (look at the AVCaptureVideoDataOutput documentation as well):
https://developer.apple.com/library/ios/qa/qa1744/_index.html
Investigating the link above is the correct way to solve your problem. An alternate fast n' dirty way to solve the same problem would be to lock the recording UI of your app into only one landscape orientation and then to rotate all of your videos server-side using ffmpeg.
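For reference, the capture-level fix amounts to setting the orientation on the data output's connection before any frames are written, along these lines (a minimal sketch; videoOut stands in for your AVCaptureVideoDataOutput):
AVCaptureConnection *connection = [videoOut connectionWithMediaType:AVMediaTypeVideo];
if (connection.isVideoOrientationSupported) {
    // Bake the rotation into the captured buffers instead of relying on a metadata tag.
    connection.videoOrientation = AVCaptureVideoOrientationLandscapeRight; // or ...LandscapeLeft
}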
In case it's helpful for anyone, here's the code I ended up with. I ended up having to do the work on the video as it was being captured instead of as a post processing step. This is a helper class that manages the capture.
Interface
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>
@interface VideoCaptureManager : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
{
    AVCaptureSession* _captureSession;
    AVCaptureVideoPreviewLayer* _previewLayer;
    AVCaptureVideoDataOutput* _videoOut;
    AVCaptureDevice* _videoDevice;
    AVCaptureDeviceInput* _videoIn;
    dispatch_queue_t _videoProcessingQueue;
    AVAssetWriter* _assetWriter;
    AVAssetWriterInput* _writerInput;
    BOOL _isCapturing;
    NSString* _gameId;
    NSString* _authToken;
}
-(void)setSettings:(NSString*)gameId authToken:(NSString*)authToken;
-(void)setOrientation:(AVCaptureVideoOrientation)orientation;
-(AVCaptureVideoPreviewLayer*)getPreviewLayer;
-(void)startPreview;
-(void)stopPreview;
-(void)startCapture;
-(void)stopCapture;
@end
Implementation (with a bit of editing and a few little TODOs)
@implementation VideoCaptureManager
-(id)init;
{
    self = [super init];
    if (self) {
        NSError* error;
        _videoProcessingQueue = dispatch_queue_create("VideoQueue", DISPATCH_QUEUE_SERIAL);
        _captureSession = [AVCaptureSession new];
        _videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        _previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:_captureSession];
        [_previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
        _videoOut = [AVCaptureVideoDataOutput new];
        _videoOut.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey: [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] };
        _videoOut.alwaysDiscardsLateVideoFrames = YES;
        _videoIn = [AVCaptureDeviceInput deviceInputWithDevice:_videoDevice error:&error];
        // handle errors here
        [_captureSession addInput:_videoIn];
        [_captureSession addOutput:_videoOut];
    }
    return self;
}
-(void)setOrientation:(AVCaptureVideoOrientation)orientation;
{
    _previewLayer.connection.videoOrientation = orientation;
    for (AVCaptureConnection* item in _videoOut.connections) {
        item.videoOrientation = orientation;
    }
}
-(AVCaptureVideoPreviewLayer*)getPreviewLayer;
{
    return _previewLayer;
}
-(void)startPreview;
{
    [_captureSession startRunning];
}
-(void)stopPreview;
{
    [_captureSession stopRunning];
}
-(void)startCapture;
{
    if (_isCapturing) return;
    NSURL* url = nil; // put code here to create your output URL
    NSDictionary* compressionSettings = @{ AVVideoProfileLevelKey: AVVideoProfileLevelH264Main31,
                                           AVVideoAverageBitRateKey: [NSNumber numberWithInt:2500000],
                                           AVVideoMaxKeyFrameIntervalKey: [NSNumber numberWithInt:1],
                                         };
    NSDictionary* videoSettings = @{ AVVideoCodecKey: AVVideoCodecH264,
                                     AVVideoWidthKey: [NSNumber numberWithInt:1280],
                                     AVVideoHeightKey: [NSNumber numberWithInt:720],
                                     AVVideoScalingModeKey: AVVideoScalingModeResizeAspectFill,
                                     AVVideoCompressionPropertiesKey: compressionSettings
                                   };
    _writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
    _writerInput.expectsMediaDataInRealTime = YES;
    NSError* error;
    _assetWriter = [AVAssetWriter assetWriterWithURL:url fileType:AVFileTypeMPEG4 error:&error];
    // handle errors
    _assetWriter.shouldOptimizeForNetworkUse = YES;
    [_assetWriter addInput:_writerInput];
    [_videoOut setSampleBufferDelegate:self queue:_videoProcessingQueue];
    _isCapturing = YES;
}
-(void)stopCapture;
{
    if (!_isCapturing) return;
    // TODO: there seems to be a race condition between this line and the next
    // (we could end up trying to write a buffer after calling writingFinished)
    [_videoOut setSampleBufferDelegate:nil queue:nil];
    dispatch_async(_videoProcessingQueue, ^{
        [_assetWriter finishWritingWithCompletionHandler:^{
            [self writingFinished];
        }];
    });
}
-(void)writingFinished;
{
    // TODO: need to check _assetWriter.status to make sure everything completed successfully
    // do whatever post processing you need here
}
-(void)captureOutput:(AVCaptureOutput*)captureOutput didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection*)connection;
{
    NSLog(@"Video frame was dropped.");
}
-(void)captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    if (_assetWriter.status != AVAssetWriterStatusWriting) {
        CMTime lastSampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        [_assetWriter startWriting]; // TODO: need to check the return value (a BOOL)
        [_assetWriter startSessionAtSourceTime:lastSampleTime];
    }
    if (!_writerInput.readyForMoreMediaData || ![_writerInput appendSampleBuffer:sampleBuffer]) {
        NSLog(@"Failed to write video buffer to output.");
    }
}
@end
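For context, a hypothetical call site for this class might look like the following (cameraView and the orientation value are assumptions, not part of the original):
VideoCaptureManager *captureManager = [VideoCaptureManager new];
AVCaptureVideoPreviewLayer *previewLayer = [captureManager getPreviewLayer];
previewLayer.frame = self.cameraView.bounds; // hypothetical host view
[self.cameraView.layer addSublayer:previewLayer];
[captureManager setOrientation:AVCaptureVideoOrientationLandscapeRight];
[captureManager startPreview];
// ...later, e.g. when the user taps record:
[captureManager startCapture];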
For compressing/resizing the video, we can use AVAssetExportSession.
We can upload a video of up to 3.30 minutes in duration; if the duration is more than 3.30 minutes, it will show a memory warning.
As we are not applying any transform to the video here, the video stays as it was recorded.
Below is sample code for compressing the video.
We can check the video size before and after compression.
-(void)trimVideoWithURL:(NSURL *)inputURL {
    NSString *path1 = [inputURL path];
    NSData *data = [[NSFileManager defaultManager] contentsAtPath:path1];
    NSLog(@"size before compressing video is %lu", (unsigned long)data.length);
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPreset640x480];
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *outputPath = paths[0];
    NSFileManager *manager = [NSFileManager defaultManager];
    [manager createDirectoryAtPath:outputPath withIntermediateDirectories:YES attributes:nil error:nil];
    outputPath = [outputPath stringByAppendingPathComponent:@"output.mp4"];
    fullPath = [NSURL fileURLWithPath:outputPath];
    // Remove existing file
    [manager removeItemAtPath:outputPath error:nil];
    exportSession.outputURL = fullPath;
    exportSession.shouldOptimizeForNetworkUse = YES;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    CMTime start = CMTimeMakeWithSeconds(1.0, 600);
    CMTime duration = CMTimeMakeWithSeconds(1.0, 600);
    CMTimeRange range = CMTimeRangeMake(start, duration);
    exportSession.timeRange = range;
    [exportSession exportAsynchronouslyWithCompletionHandler:^(void)
    {
        switch (exportSession.status) {
            case AVAssetExportSessionStatusCompleted: {
                NSString *path = [fullPath path];
                NSData *data = [[NSFileManager defaultManager] contentsAtPath:path];
                NSLog(@"size after compressing video is %lu", (unsigned long)data.length);
                NSLog(@"Export complete %ld %@", (long)exportSession.status, exportSession.error);
                /*
                 Do your necessary stuff here after compression
                 */
            }
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Failed: %@", exportSession.error);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Canceled: %@", exportSession.error);
                break;
            default:
                break;
        }
    }];
}
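A hypothetical call site, e.g. from a picker delegate (the info dictionary is an assumption about the surrounding code):
NSURL *pickedVideoURL = [info objectForKey:UIImagePickerControllerMediaURL]; // if using UIImagePickerController
[self trimVideoWithURL:pickedVideoURL];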

Cocoa: AVAsset loaded from file has 0 tracks

I'm attempting to concatenate some audio files using the technique shown here. My audio files are .m4a and I can verify that they play fine in QuickTime. Here's the code I'm trying to use to concatenate them:
[currFile.audioContent writeToFile:tempOldFilePath atomically:NO];
AVURLAsset *oldAudioAsset = [AVURLAsset URLAssetWithURL:[NSURL URLWithString:tempOldFilePath] options:nil];
AVURLAsset *newAudioAsset = [AVURLAsset URLAssetWithURL:[NSURL URLWithString:tempInputFilePath] options:nil];
NSLog(@"oldAsset num tracks = %lu", (unsigned long)oldAudioAsset.tracks.count);
NSLog(@"newAsset num tracks = %lu", (unsigned long)newAudioAsset.tracks.count);
AVAssetTrack *oldTrack = [[oldAudioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
AVAssetTrack *newTrack = [[newAudioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
AVMutableComposition *mutableComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *compTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
NSError *error = nil;
[compTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, oldTrack.timeRange.duration) ofTrack:oldTrack atTime:kCMTimeZero error:&error];
if (error) {
    NSLog(@"%@", error.localizedDescription);
    error = nil;
}
[compTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, newTrack.timeRange.duration) ofTrack:newTrack
                    atTime:oldTrack.timeRange.duration error:&error];
if (error) {
    NSLog(@"%@", error.localizedDescription);
    error = nil;
}
exporter = [[AVAssetExportSession alloc] initWithAsset:mutableComposition presetName:AVAssetExportPresetAppleM4A];
exporter.outputURL = [NSURL URLWithString:tempCompFilePath];
exporter.outputFileType = AVFileTypeAppleM4A;
[exporter exportAsynchronouslyWithCompletionHandler:^{
    NSLog(@"handler");
    NSError *error = nil;
    NSData *newData = [NSData dataWithContentsOfFile:tempCompFilePath options:0 error:&error];
    NSLog(@"%lu", (unsigned long)newData.length);
    if (error) {
        NSLog(@"%@", error.localizedDescription);
    }
    currFile.audioContent = newData;
    [[AppDelegate sharedDelegate] saveAction:nil];
}];
The first problem I noticed is that the exporter's completion handler is never called. I'm guessing the reason for this is the other problem I noticed: after creating my AVAssets from URLs, log statements show that they contain 0 tracks. Apple's example doesn't exactly show how the AVAssets are loaded.
Any advice on how to get this working?
As you've already found, you need to use fileURLWithPath:, not URLWithString:, to create your URLs.
URLWithString: expects a string that describes a URL, such as @"file:///path/to/file" or @"http://example.com/". When your string describes a path alone, such as @"/path/to/file", you must use fileURLWithPath:, which will fill in the missing pieces correctly.
More technically, URLWithString: will interpret a path as simply a URL with only a path but no particular scheme, which you could go on to use relative to a base URL in any file-oriented scheme, such as HTTP (GET /path/to/file). fileURLWithPath: will interpret a path as a local file path, and return a file: URL accordingly.
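To make the distinction concrete, a small sketch (the path is a placeholder):
NSString *path = @"/tmp/clip.m4a";
NSURL *schemeless = [NSURL URLWithString:path];   // no scheme: the resulting AVURLAsset reports 0 tracks
NSURL *fileUrl    = [NSURL fileURLWithPath:path]; // file:///tmp/clip.m4a
NSLog(@"%@ vs %@", schemeless, fileUrl);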
I found the reason this error occurred and eventually solved it.
Originally I had set the source file path with a ".mp4" extension.
But the recorded video file's type was MOV, so I changed it to ".mov":
NSString *source_file_path = @"temp_video.mov";
instead of
NSString *source_file_path = @"temp_video.mp4";
The problem was fixed and it is working well now.
Hope this is helpful for all.
Apparently I was using the wrong NSURL method. I changed it to this:
AVURLAsset *oldAudioAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:tempOldFilePath] options:nil];
Now my AVAssets each have 1 track in them. If anyone can provide a good explanation as to why this is necessary, I will accept that answer.

AVFoundation: playing a video in a loop

I need to play a video indefinitely (restarting it when it ends) in my OpenGL application.
To do so I'm trying to use AVFoundation.
I created an AVAssetReader and an AVAssetReaderTrackOutput, and I use the copyNextSampleBuffer method to get a CMSampleBufferRef and create an OpenGL texture for each frame.
NSString *path = [[NSBundle mainBundle] pathForResource:videoFileName ofType:type];
_url = [NSURL fileURLWithPath:path];
// Create the AVAsset
_asset = [AVURLAsset assetWithURL:_url];
// Get the asset's AVAssetTrack
NSArray *arrayAssetTrack = [_asset tracksWithMediaType:AVMediaTypeVideo];
_assetTrackVideo = [arrayAssetTrack objectAtIndex:0];
// Create the AVAssetReaderTrackOutput
NSDictionary *dictCompressionProperty = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
_trackOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:_assetTrackVideo outputSettings:dictCompressionProperty];
// Create the AVAssetReader
NSError *error;
_assetReader = [[AVAssetReader alloc] initWithAsset:_asset error:&error];
if (error) {
    NSLog(@"error in AssetReader %@", error);
}
[_assetReader addOutput:_trackOutput];
//_assetReader.timeRange = CMTimeRangeMake(kCMTimeZero, _asset.duration);
// The asset reader starts reading
[_assetReader startReading];
And in the -update method of my GLKViewController I call the following:
if (_assetReader.status == AVAssetReaderStatusReading) {
    if (_trackOutput) {
        CMSampleBufferRef sampleBuffer = [_trackOutput copyNextSampleBuffer];
        [self createNewTextureVideoFromOutputSampleBuffer:sampleBuffer]; // create the new texture
    }
} else if (_assetReader.status == AVAssetReaderStatusCompleted) {
    NSLog(@"restart");
    [_assetReader startReading];
}
All works fine while the AVAssetReader is in the reading status, but when it finishes reading and I try to restart the reading with a new call to [_assetReader startReading], the application crashes without output.
What am I doing wrong? Is it correct to restart an AVAssetReader when it has completed its reading?
AVAssetReader doesn't support seeking or restarting; it is essentially a sequential decoder. You have to create a new AVAssetReader object to read the same samples again.
Thanks Costique! Your suggestion got me back on track. I finally restarted the reading by creating a new AVAssetReader. However, in order to do that I found that a new AVAssetReaderTrackOutput must also be created and added to the new AVAssetReader,
e.g.
[newAssetReader addOutput:newTrackOutput];
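Putting that together, a restart helper might look roughly like this (a sketch reusing _asset, _assetTrackVideo, and the pixel format settings from the question):
- (void)restartReading {
    NSDictionary *settings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                          forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    // Both the reader and its track output must be fresh instances.
    _trackOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:_assetTrackVideo
                                                              outputSettings:settings];
    NSError *error = nil;
    _assetReader = [[AVAssetReader alloc] initWithAsset:_asset error:&error];
    if (error) {
        NSLog(@"error recreating AVAssetReader %@", error);
        return;
    }
    [_assetReader addOutput:_trackOutput];
    [_assetReader startReading];
}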

SoundCloud API iOS, no sharing - only listening

I have a website where I make playlists with SoundCloud. Now I want to make an iPhone app so that users can listen to the songs there as well.
http://developers.soundcloud.com/docs/api/ios-quickstart
In the example in the link, users have to sign in to listen and share, but I want my users to only listen. Is there a way around this so they don't have to sign in?
Create a page that outputs the playlist in JSON, then in Xcode create a class that downloads the JSON for the track dictionaries and plays the downloaded content using AVPlayer (or AVQueuePlayer if you're playing the whole list).
Here's some abstract code:
playlistDownloader.m
- (void)downloadPlaylist {
    dispatch_queue_t concurrentQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    // dispatch_async so the synchronous download doesn't block the calling thread
    dispatch_async(concurrentQueue, ^{
        NSURL *url = [NSURL URLWithString:@"http://www.yourwebsite.com/playlist.json?id=1"];
        NSData *data = [NSData dataWithContentsOfURL:url];
        NSError *error;
        id trackData = [NSJSONSerialization JSONObjectWithData:data options:NSJSONReadingAllowFragments error:&error];
        if (!error) {
            tempTrackArray = trackData;
        } else {
            NSLog(@"Playlist wasn't able to download");
        }
    });
}
tempTrackArray would be a property declared in the class.
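For reference, tempTrackArray could be declared along these lines (an assumption; the original doesn't show the class interface):
@property (nonatomic, strong) NSArray *tempTrackArray;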
Then in your player, you'd do something like this:
audioPlayer.m
- (void)instantiateAudioPlayer
{
    NSDictionary *trackDictionary = [playListDownloader.tempTrackArray objectAtIndex:0];
    NSString *urlString = [trackDictionary objectForKey:@"stream_url"];
    NSURL *streamURL = [NSURL URLWithString:urlString];
    AVAsset *asset = [AVAsset assetWithURL:streamURL];
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
    avPlayer = [AVPlayer playerWithPlayerItem:playerItem];
}
This is some really rough code, but it's the general gist of what you're trying to do. It should get you going in the right direction.
Trying the SoundCloud quickstart, I realised that your stream URL needs to begin with https rather than http. You also need to add the client ID from your SoundCloud app to the stream URL:
NSDictionary *trackDictionary = [playListDownloader.tempTrackArray objectAtIndex:0];
NSString *streamURL = [trackDictionary objectForKey:@"stream_url"];
streamURL = [streamURL stringByReplacingOccurrencesOfString:@"http" withString:@"https"];
NSString *urlString = [NSString stringWithFormat:@"%@?client_id=%@", streamURL, @"a8e117d3fa2121067e0b29105b0543ef"];
From there you just set up the AVPlayer:
self._avPlayer = [AVPlayer playerWithURL:[NSURL URLWithString:urlString]];
[self setupLayer:clayer];
[self._avPlayer play];
Everything should work fine. Hope it helps.