I want to create a method to cache an image from a URL. I have the code in Swift, since I had used it before. How can I do something similar to this in Objective-C?
import UIKit

let imageCache = NSCache<AnyObject, AnyObject>()

extension UIImageView {
    func loadImageUsingCacheWithUrlString(urlString: String) {
        self.image = nil

        if let cachedImage = imageCache.object(forKey: urlString as AnyObject) as? UIImage {
            self.image = cachedImage
            return
        }

        let url = URL(string: urlString)
        if let data = try? Data(contentsOf: url!) {
            DispatchQueue.main.async(execute: {
                if let downloadedImage = UIImage(data: data) {
                    imageCache.setObject(downloadedImage, forKey: urlString as AnyObject)
                    self.image = downloadedImage
                }
            })
        }
    }
}
Before you convert this, you might consider refactoring to make it asynchronous:
One should never use Data(contentsOf:) for network requests because (a) it is synchronous and blocks the caller (which is a horrible UX, but also, in degenerate cases, can cause the watchdog process to kill your app); (b) if there is a problem, there’s no diagnostic information; and (c) it is not cancelable.
Rather than updating the image property when done, you should consider the completion-handler pattern, so the caller knows when the request is done and the image has been processed. This pattern avoids race conditions and lets you have concurrent image requests.
When you use this asynchronous pattern, URLSession runs its completion handlers on a background queue. You should keep the processing of the image and the updating of the cache on this background queue; only the completion handler should be dispatched back to the main queue.
I infer from your answer that your intent was to use this code in a UIImageView extension. You really should put this code in a separate object (I created an ImageManager singleton) so that this cache is available not only to image views, but anywhere you might need images. You might, for example, do some prefetching of images outside of the UIImageView. If this code is buried in the UIImageView extension, you lose that flexibility.
Thus, perhaps something like:
final class ImageManager {
    static let shared = ImageManager()

    enum ImageFetchError: Error {
        case invalidURL
        case networkError(Data?, URLResponse?)
    }

    private let imageCache = NSCache<NSString, UIImage>()

    private init() { }

    @discardableResult
    func fetchImage(urlString: String, completion: @escaping (Result<UIImage, Error>) -> Void) -> URLSessionTask? {
        if let cachedImage = imageCache.object(forKey: urlString as NSString) {
            completion(.success(cachedImage))
            return nil
        }

        guard let url = URL(string: urlString) else {
            completion(.failure(ImageFetchError.invalidURL))
            return nil
        }

        let task = URLSession.shared.dataTask(with: url) { data, response, error in
            guard
                error == nil,
                let responseData = data,
                let httpUrlResponse = response as? HTTPURLResponse,
                200 ..< 300 ~= httpUrlResponse.statusCode,
                let image = UIImage(data: responseData)
            else {
                DispatchQueue.main.async {
                    completion(.failure(error ?? ImageFetchError.networkError(data, response)))
                }
                return
            }

            self.imageCache.setObject(image, forKey: urlString as NSString)
            DispatchQueue.main.async {
                completion(.success(image))
            }
        }
        task.resume()
        return task
    }
}
And you'd call it like:
ImageManager.shared.fetchImage(urlString: someUrl) { result in
    switch result {
    case .failure(let error):
        print(error)
    case .success(let image):
        break // do something with image
    }
}
// but do not try to use `image` here, as it has not been fetched yet
If you wanted to use this in a UIImageView extension, for example, you could save the URLSessionTask, so that you could cancel it if you requested another image before the prior one finished. (This is a very common scenario if you use this in table views and the user scrolls very quickly, for example. You do not want to get backlogged in a ton of network requests.) We could do something like this:
extension UIImageView {
    private static var taskKey = 0
    private static var urlKey = 0

    private var currentTask: URLSessionTask? {
        get { objc_getAssociatedObject(self, &Self.taskKey) as? URLSessionTask }
        set { objc_setAssociatedObject(self, &Self.taskKey, newValue, .OBJC_ASSOCIATION_RETAIN_NONATOMIC) }
    }

    private var currentURLString: String? {
        get { objc_getAssociatedObject(self, &Self.urlKey) as? String }
        set { objc_setAssociatedObject(self, &Self.urlKey, newValue, .OBJC_ASSOCIATION_RETAIN_NONATOMIC) }
    }

    func setImage(with urlString: String) {
        if let oldTask = currentTask {
            currentTask = nil
            oldTask.cancel()
        }

        image = nil
        currentURLString = urlString

        let task = ImageManager.shared.fetchImage(urlString: urlString) { result in
            // only reset if the current value is for this url
            if urlString == self.currentURLString {
                self.currentTask = nil
                self.currentURLString = nil
            }

            // now use the image
            if case .success(let image) = result {
                self.image = image
            }
        }
        currentTask = task
    }
}
There are tons of other things you might do in this UIImageView extension (e.g. placeholder images or the like), but by separating the UIImageView extension from the network layer, one keeps these different tasks in their own respective classes (in the spirit of the single responsibility principle).
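For example, here is a minimal sketch of placeholder support, layered on the same ImageManager. This is a hypothetical convenience, not part of the code above, and it omits the task/URL cancellation bookkeeping that setImage(with:) does:

extension UIImageView {
    // Show the placeholder immediately, then swap in the fetched image.
    func setImage(with urlString: String, placeholder: UIImage?) {
        image = placeholder
        ImageManager.shared.fetchImage(urlString: urlString) { [weak self] result in
            if case .success(let image) = result {
                self?.image = image // fetchImage calls back on the main queue
            }
        }
    }
}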
OK, with that behind us, let us look at the Objective-C rendition. For example, you might create an ImageManager singleton:
// ImageManager.h

@import UIKit;

NS_ASSUME_NONNULL_BEGIN

typedef NS_ENUM(NSUInteger, ImageManagerError) {
    ImageManagerErrorInvalidURL,
    ImageManagerErrorNetworkError,
    ImageManagerErrorNotValidImage
};

@interface ImageManager : NSObject

// if you make this a singleton, mark normal instantiation methods as unavailable ...

+ (instancetype)alloc __attribute__((unavailable("alloc not available, call sharedImageManager instead")));
- (instancetype)init __attribute__((unavailable("init not available, call sharedImageManager instead")));
+ (instancetype)new __attribute__((unavailable("new not available, call sharedImageManager instead")));
- (instancetype)copy __attribute__((unavailable("copy not available, call sharedImageManager instead")));

// ... and expose the singleton access point

@property (class, nonnull, readonly, strong) ImageManager *sharedImageManager;

// provide the fetch method

- (NSURLSessionTask * _Nullable)fetchImageWithURLString:(NSString *)urlString completion:(void (^)(UIImage * _Nullable image, NSError * _Nullable error))completion;

@end

NS_ASSUME_NONNULL_END
and then implement this singleton:
// ImageManager.m

#import "ImageManager.h"

@interface ImageManager ()
@property (nonatomic, strong) NSCache<NSString *, UIImage *> *imageCache;
@end

@implementation ImageManager

+ (instancetype)sharedImageManager {
    static dispatch_once_t onceToken;
    static ImageManager *shared;
    dispatch_once(&onceToken, ^{
        shared = [[self alloc] initPrivate];
    });
    return shared;
}

- (instancetype)initPrivate {
    self = [super init];
    if (self) {
        _imageCache = [[NSCache alloc] init];
    }
    return self;
}
- (NSURLSessionTask *)fetchImageWithURLString:(NSString *)urlString completion:(void (^)(UIImage *image, NSError *error))completion {
    UIImage *cachedImage = [self.imageCache objectForKey:urlString];
    if (cachedImage) {
        completion(cachedImage, nil);
        return nil;
    }

    NSURL *url = [NSURL URLWithString:urlString];
    if (!url) {
        NSError *error = [NSError errorWithDomain:[[NSBundle mainBundle] bundleIdentifier] code:ImageManagerErrorInvalidURL userInfo:nil];
        completion(nil, error);
        return nil;
    }

    NSURLSessionTask *task = [NSURLSession.sharedSession dataTaskWithURL:url completionHandler:^(NSData * _Nullable data, NSURLResponse * _Nullable response, NSError * _Nullable error) {
        if (error) {
            dispatch_async(dispatch_get_main_queue(), ^{
                completion(nil, error);
            });
            return;
        }

        if (!data) {
            NSError *dataError = [NSError errorWithDomain:[[NSBundle mainBundle] bundleIdentifier] code:ImageManagerErrorNetworkError userInfo:nil];
            dispatch_async(dispatch_get_main_queue(), ^{
                completion(nil, dataError);
            });
            return; // without this, we would fall through and call the completion handler again
        }

        UIImage *image = [UIImage imageWithData:data];
        if (!image) {
            NSDictionary *userInfo = @{
                @"data": data,
                @"response": response ? response : [NSNull null]
            };
            NSError *imageError = [NSError errorWithDomain:[[NSBundle mainBundle] bundleIdentifier] code:ImageManagerErrorNotValidImage userInfo:userInfo];
            dispatch_async(dispatch_get_main_queue(), ^{
                completion(nil, imageError);
            });
            return;
        }

        [self.imageCache setObject:image forKey:urlString];
        dispatch_async(dispatch_get_main_queue(), ^{
            completion(image, nil);
        });
    }];
    [task resume];
    return task;
}

@end
And you'd call it like:
[[ImageManager sharedImageManager] fetchImageWithURLString:urlString completion:^(UIImage * _Nullable image, NSError * _Nullable error) {
    if (error) {
        NSLog(@"%@", error);
        return;
    }
    // do something with `image` here ...
}];
// but not here, because the above runs asynchronously
And, again, you could use this from within a UIImageView category. One caveat: do not name the method setImage:, because that selector is already the setter for UIImageView's image property; give it a distinct name:
#import <objc/runtime.h>

static char taskKey;
static char urlKey;

@implementation UIImageView (Cache)

- (void)setImageWithURLString:(NSString *)urlString {
    NSURLSessionTask *oldTask = objc_getAssociatedObject(self, &taskKey);
    if (oldTask) {
        objc_setAssociatedObject(self, &taskKey, nil, OBJC_ASSOCIATION_RETAIN_NONATOMIC);
        [oldTask cancel];
    }

    self.image = nil;

    objc_setAssociatedObject(self, &urlKey, urlString, OBJC_ASSOCIATION_RETAIN_NONATOMIC);

    NSURLSessionTask *task = [[ImageManager sharedImageManager] fetchImageWithURLString:urlString completion:^(UIImage * _Nullable image, NSError * _Nullable error) {
        // only reset if the current value is for this URL
        NSString *currentURL = objc_getAssociatedObject(self, &urlKey);
        if ([currentURL isEqualToString:urlString]) {
            objc_setAssociatedObject(self, &urlKey, nil, OBJC_ASSOCIATION_RETAIN_NONATOMIC);
            objc_setAssociatedObject(self, &taskKey, nil, OBJC_ASSOCIATION_RETAIN_NONATOMIC);
        }

        // now use the image
        if (image) {
            self.image = image;
        }
    }];
    objc_setAssociatedObject(self, &taskKey, task, OBJC_ASSOCIATION_RETAIN_NONATOMIC);
}

@end
After trial and error this worked:
#import "UIImageView+Cache.h"
#implementation UIImageView (Cache)
NSCache* imageCache;
- (void)loadImageUsingCacheWithUrlString:(NSString*)urlString {
imageCache = [[NSCache alloc] init];
self.image = nil;
UIImage *cachedImage = [imageCache objectForKey:(id)urlString];
if (cachedImage != nil) {
self.image = cachedImage;
return;
}
NSURL *url = [NSURL URLWithString:urlString];
NSData *data = [NSData dataWithContentsOfURL:url];
if (data != nil) {
dispatch_async(dispatch_get_main_queue(), ^{
UIImage *downloadedImage = [UIImage imageWithData:data];
if (downloadedImage != nil) {
[imageCache setObject:downloadedImage forKey:urlString];
self.image = downloadedImage;
}
});
}
}
#end
My requirement in the app is to capture an image without showing a preview via UIImagePickerController, so I have used the following code to capture an image without presenting that controller.
- (void)clickImage
{
    AVCaptureDevice *rearCamera = [self checkIfRearCameraAvailable];
    if (rearCamera != nil)
    {
        photoSession = [[AVCaptureSession alloc] init];

        NSError *error;
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:rearCamera error:&error];
        if (!error && [photoSession canAddInput:input])
        {
            [photoSession addInput:input];

            AVCapturePhotoOutput *output = [[AVCapturePhotoOutput alloc] init];
            if ([photoSession canAddOutput:output])
            {
                [photoSession addOutput:output];

                AVCaptureConnection *videoConnection = nil;
                for (AVCaptureConnection *connection in output.connections)
                {
                    for (AVCaptureInputPort *port in [connection inputPorts])
                    {
                        if ([[port mediaType] isEqual:AVMediaTypeVideo])
                        {
                            videoConnection = connection;
                            break;
                        }
                    }
                    if (videoConnection)
                    {
                        break;
                    }
                }

                if (videoConnection)
                {
                    [photoSession startRunning];
                    [output capturePhotoWithSettings:[AVCapturePhotoSettings photoSettings] delegate:self];
                }
            }
        }
    }
}
- (AVCaptureDevice *)checkIfRearCameraAvailable
{
    AVCaptureDevice *rearCamera;
    AVCaptureDeviceDiscoverySession *captureDeviceDiscoverySession =
        [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera]
                                                                mediaType:AVMediaTypeVideo
                                                                 position:AVCaptureDevicePositionBack];
    NSArray *allCameras = [captureDeviceDiscoverySession devices];
    for (int i = 0; i < allCameras.count; i++)
    {
        AVCaptureDevice *camera = [allCameras objectAtIndex:i];
        if (camera.position == AVCaptureDevicePositionBack)
        {
            rearCamera = camera;
        }
    }
    return rearCamera;
}
- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhotoSampleBuffer:(CMSampleBufferRef)photoSampleBuffer previewPhotoSampleBuffer:(CMSampleBufferRef)previewPhotoSampleBuffer resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings bracketSettings:(AVCaptureBracketedStillImageSettings *)bracketSettings error:(NSError *)error
{
    if (error)
    {
        NSLog(@"error : %@", error.localizedDescription);
    }

    if (photoSampleBuffer)
    {
        NSData *data = [AVCapturePhotoOutput JPEGPhotoDataRepresentationForJPEGSampleBuffer:photoSampleBuffer
                                                                   previewPhotoSampleBuffer:previewPhotoSampleBuffer];
        UIImage *image = [UIImage imageWithData:data];
        _imgView.image = image;
    }
}
I have used the above code to capture an image, but the output image looks as if it were taken in night mode, or like a negative image.
Would you please review the code and tell me what is wrong?
I have found the following sample code on the Apple Developer site:
https://developer.apple.com/library/content/samplecode/AVCam/Introduction/Intro.html#//apple_ref/doc/uid/DTS40010112
The sample code above helped me get a proper image. I have added this answer to help others as well.
I need to create the render graph for basic recording to a file, and playback from that file, using AVAudioFile and AVAudioInputNode.
Below are the main setup methods I started with, but the graph is connected only for the player.
How do I construct the graph so that [Input], [Player], and [Mixers] are connected to achieve record/play?
#pragma mark - Setup

- (void)setupAudioEngine {
    // create
    _audioEngine = [[AVAudioEngine alloc] init];
    _player = [[AVAudioPlayerNode alloc] init];
    _inputMixer = [[AVAudioMixerNode alloc] init];
    _playerMixer = [[AVAudioMixerNode alloc] init];

    // attach nodes to the engine
    [_audioEngine attachNode:_player];
    [_audioEngine attachNode:_inputMixer];
    [_audioEngine attachNode:_playerMixer];

    _input = [_audioEngine inputNode];
    _mainMixer = [_audioEngine mainMixerNode];

    // setup audio file
    NSError *error = nil;

    // connect the render graph
    [_audioEngine connect:_input to:_mainMixer format:[_input inputFormatForBus:0]];
    [_audioEngine connect:_player to:_mainMixer format:_audioFile.processingFormat];

    // start the engine
    [_audioEngine startAndReturnError:&error];
    if (error)
    {
        NSLog(@"error: %@", [error localizedDescription]);
    }
}

- (IBAction)recordAudio:(UIButton *)sender {
    NSError *error = nil;

    // setup for writing to a file
    NSArray *pathComponents = [NSArray arrayWithObjects:
                               [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject],
                               @"MyAudioMemo.m4a",
                               nil];
    NSURL *inputFileURL = [NSURL fileURLWithPathComponents:pathComponents];

    NSDictionary *recordSettings = @{
        AVFormatIDKey            : @(kAudioFormatMPEG4AAC),
        AVSampleRateKey          : @44100.0f,
        AVNumberOfChannelsKey    : @1,
        AVEncoderBitDepthHintKey : @16,
        AVEncoderAudioQualityKey : @(AVAudioQualityMedium)
    };

    _audioFile = [[AVAudioFile alloc] initForWriting:inputFileURL settings:recordSettings error:&error];

    [_mainMixer installTapOnBus:0 bufferSize:4096 format:[_mainMixer outputFormatForBus:0] block:^(AVAudioPCMBuffer *buffer, AVAudioTime *when) {
    }];
}

- (IBAction)stopRec:(id)sender {
    [_audioEngine disconnectNodeOutput:_input];
}

- (IBAction)playAudio:(UIButton *)sender {
    // schedule play
    [_player scheduleFile:_audioFile atTime:nil completionHandler:nil];
    [_player play];
}
Rule #1: not all bitrate / audio format / sample rate combinations work. Most of them will crash, so stick to the ones that work (I discovered a few empirically).
This code records and plays the .m4a format:
func directoryURL() -> NSURL {
    let fileManager = NSFileManager.defaultManager()
    let urls = fileManager.URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)
    filepath = urls[0]
    let documentDirectory = urls[0] as NSURL
    print("STORAGE DIR: " + documentDirectory.description)
    let soundURL = documentDirectory.URLByAppendingPathComponent(m4aName) // .m4a
    print("SAVING FILE: " + soundURL!.description)
    return soundURL!
}

// INIT AUDIO RECORDER
func initializeAudioSession() {
    let recordSettings = [AVSampleRateKey : NSNumber(float: Float(16000.0)),
                          AVFormatIDKey : NSNumber(int: Int32(kAudioFormatMPEG4AAC)), // .m4a !!
                          // AVFormatIDKey : NSNumber(int: Int32(kAudioFileMP3Type)), // mp3 crashes!!
                          AVNumberOfChannelsKey : NSNumber(int: 1),
                          AVEncoderAudioQualityKey : NSNumber(int: Int32(AVAudioQuality.Low.rawValue))]
    let audioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setCategory(AVAudioSessionCategoryPlayAndRecord)
        try audioRecorder = AVAudioRecorder(URL: self.directoryURL(),
                                            settings: recordSettings)
        audioRecorder.delegate = self
        audioRecorder.meteringEnabled = true
        audioRecorder.prepareToRecord()
        print("M4a Recorder Initialized - OK...")
        recordSpeechM4A()
        stopRecordingCapio()
    } catch let error1 as NSError {
        print("Init Error: " + error1.description)
    }
}

// REC SPEECH AUDIO IN .M4A FORMAT
func recordSpeechM4A() {
    if !audioRecorder.recording {
        isRecording = true // optional variable for tracking
        let audioSession = AVAudioSession.sharedInstance()
        do {
            try audioSession.setActive(true)
            print("**** RECORDING ****")
            audioRecorder.record()

            // Animate a microphone waveform while the user speaks
            self.meterTimer = NSTimer.scheduledTimerWithTimeInterval(0.03,
                target: self,
                selector: #selector(ViewController.updateWaveview2(_:)),
                userInfo: nil,
                repeats: false)
        } catch {
            print("RECORDING ERROR")
        }
    }
}

// STOP RECORDING
func stopRecording() {
    audioRecorder.stop()
}

// PLAY
func playRecordedAudio() {
    if !audioRecorder.recording {
        do {
            try audioPlayer = AVAudioPlayer(contentsOfURL: audioRecorder.url)
            audioPlayer.play()
            print("PLAYING AUDIO...: " + audioRecorder.url.description)
            print("Audio duration: " + audioPlayer.duration.description)
        } catch {
            print("AUDIO PLAYBACK ERROR")
        }
    }
}
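For what it is worth, if you want to stay with AVAudioEngine as the question asked, the usual render-graph approach is to install a tap on the input node and write each buffer to an AVAudioFile, then schedule that file on a player node for playback. Here is a minimal sketch in modern Swift syntax (my own illustration, untested; it assumes an AVAudioSession already configured and activated for play-and-record, and the class name is hypothetical):

import AVFoundation

final class TapRecorder {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()

    init() throws {
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: nil)
        _ = engine.inputNode // touching the input node wires it into the graph
        try engine.start()
    }

    func startRecording(to url: URL) throws {
        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)
        // Record in the input's native format; mismatched formats are a
        // common source of the crashes mentioned above.
        let file = try AVAudioFile(forWriting: url, settings: format.settings)
        input.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
            try? file.write(from: buffer) // the closure keeps the file alive
        }
    }

    func stopRecording() {
        engine.inputNode.removeTap(onBus: 0)
    }

    func play(from url: URL) throws {
        // Reopen the file for reading rather than reusing the write handle.
        let file = try AVAudioFile(forReading: url)
        player.scheduleFile(file, at: nil, completionHandler: nil)
        player.play()
    }
}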
So I'm using the following code to upload multiple files to a single row of a class.
for (NSString *currentString in directoryContents) {
    NSLog(@"%@", currentString);
    NSString *temp2 = [temp stringByAppendingPathComponent:currentString];
    PFFile *file = [PFFile fileWithName:currentString contentsAtPath:temp2];
    [file saveInBackground];
    PFObject *obj = [PFObject objectWithClassName:@"DreamBits"];
    if ([currentString isEqualToString:@"index.html"]) {
        [obj setObject:file forKey:@"index"];
    }
    else {
        count += 1;
        NSString *filen = [NSString stringWithFormat:@"file%i", count];
        NSLog(@"%@", filen);
        [obj setObject:file forKey:filen];
    }
    [obj saveInBackground];
}
The issue is I'm getting the files in different rows for some reason. Any idea how I can correct this?
I have modified your code a little bit. I have not run this code, but I hope it helps you.
PFObject *obj = [PFObject objectWithClassName:@"DreamBits"];
for (NSString *currentString in directoryContents) {
    NSLog(@"%@", currentString);
    NSString *temp2 = [temp stringByAppendingPathComponent:currentString];
    PFFile *file = [PFFile fileWithName:currentString contentsAtPath:temp2];
    if ([currentString isEqualToString:@"index.html"]) {
        [obj setObject:file forKey:@"index"];
    }
    else {
        count += 1;
        NSString *filen = [NSString stringWithFormat:@"file%i", count];
        NSLog(@"%@", filen);
        [obj setObject:file forKey:filen];
    }
}
[obj saveInBackground];
Create the PFObject outside the loop. Set all the PFFile objects on the PFObject inside the loop. After the loop, save the PFObject. It is better to use this method:
[obj saveInBackgroundWithBlock:^(BOOL succeeded, NSError *error) {
    // check whether the upload succeeded or not
}];
I have used this method to upload a profile image and a thumb image to a single row in a Parse Server table, using Swift 4 and Parse SDK 1.17.1. Hope this technique will help.
func uploadImage(profileImage: UIImage, profileImageName: String, thumbImage: UIImage, thumbImageName: String) {
    if let profileImageFile = PFFile(name: profileImageName, data: profileImage.jpegData(compressionQuality: 1)!) {
        let fileObject = PFObject(className: "ProfileImage")
        fileObject.setValue(profileImageFile, forKey: "profile_image")
        fileObject.saveInBackground { (success, error) in
            if error == nil {
                print("profile image path: \(profileImageFile.url ?? "")")
                if let thumbImageFile = PFFile(name: thumbImageName, data: thumbImage.jpegData(compressionQuality: 0.5)!) {
                    fileObject.setValue(thumbImageFile, forKey: "thumb_image")
                    fileObject.saveInBackground(block: { (result, fileError) in
                        if fileError == nil {
                            print("thumb image path: \(thumbImageFile.url ?? "")")
                        } else {
                            print("error on thumb upload: \(fileError.debugDescription)")
                        }
                    })
                }
            } else {
                print("error on file upload: \(error.debugDescription)")
            }
        }
    }
}
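You might call it like this (the image values and file names here are placeholders, not from the original code):

uploadImage(profileImage: someProfileImage,
            profileImageName: "profile.jpg",
            thumbImage: someThumbImage,
            thumbImageName: "thumb.jpg")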
I have successfully converted the following method from Objective-C to Swift, after learning how blocks are replaced by closures in Swift.
Obj-C:
- (RACSignal *)fetchCurrentConditionsForLocation:(CLLocationCoordinate2D)coordinate {
    NSString *urlString = [NSString stringWithFormat:@"http://api.openweathermap.org/data/2.5/weather?lat=%f&lon=%f&units=imperial", coordinate.latitude, coordinate.longitude];
    NSURL *url = [NSURL URLWithString:urlString];

    return [[self fetchJSONFromURL:url] map:^(NSDictionary *json) {
        return [MTLJSONAdapter modelOfClass:[WXCondition class] fromJSONDictionary:json error:nil];
    }];
}
Swift:
func fetchCurrentConditionsForLocation(coordinate: CLLocationCoordinate2D) -> RACSignal {
    let urlString = NSString(format: "http://api.openweathermap.org/data/2.5/weather?lat=%f&lon=%f&units=metric", coordinate.latitude, coordinate.longitude)
    let url = NSURL.URLWithString(urlString)

    return fetchJSONFromURL(url).map { json in
        return MTLJSONAdapter.modelOfClass(WXCondition.self, fromJSONDictionary: json as NSDictionary, error: nil)
    }
}
However, I'm having trouble converting the following return map block to Swift:
func fetchHourlyForecastForLocation(coordinate: CLLocationCoordinate2D) -> RACSignal {
    var urlString = NSString(format: "http://api.openweathermap.org/data/2.5/forecast?lat=%f&lon=%f&units=metric&cnt=12", coordinate.latitude, coordinate.longitude)
    let url = NSURL.URLWithString(urlString)

    /* Original Obj-C:
    return [[self fetchJSONFromURL:url] map:^(NSDictionary *json) {
        RACSequence *list = [json[@"list"] rac_sequence];

        return [[list map:^(NSDictionary *item) {
            return [MTLJSONAdapter modelOfClass:[WXCondition class] fromJSONDictionary:item error:nil];
        }] array];
    }];
    }
    */

    // My attempt at conversion to Swift
    // (I can't resolve `rac_sequence` properly). Kind of confused
    // as to how to cast it properly and return
    // it as an "rac_sequence" array.
    return fetchJSONFromURL(url).map { json in
        let list = RACSequence()
        list = [json["list"] rac_sequence]
        return (list).map { item in {
            return MTLJSONAdapter.modelOfClass(WXCondition.self, fromJSONDictionary: item as NSDictionary, error: nil)
        } as NSArray
    }
}
If it helps, this is what rac_sequence is:
- (RACSequence *)rac_sequence {
    return [RACArraySequence sequenceWithArray:self offset:0];
}
RACArraySequence():
+ (instancetype)sequenceWithArray:(NSArray *)array offset:(NSUInteger)offset;
EDIT: The fetch method returns a RACSignal, not an NSArray:
func fetchJSONFromURL(url: NSURL) -> RACSignal {
}
It looks like you simply forgot the return type in the function declaration. The declaration should look something like this:
func fetchHourlyForecastForLocation(coordinate: CLLocationCoordinate2D) -> RACSignal { //...rest of function
Because the return type is now named, you can then remove the `as NSArray` cast in your return statement at the end of the function. Hope this helps!
Try this:
func fetchHourlyForecastForLocation(coordinate: CLLocationCoordinate2D, completion: (AnyObject) -> Void) {
    // after getting the NSArray (mapping)
    completion(_array)
}