How can I create custom Ads in my iPhone app? - cocoa-touch

I am developing an iPhone app for my University radio station, and I would like to insert ads (that look and feel just like iAds) but with my own custom designs/content/links so I can sell this ad space to possible sponsors.
Would anyone know how I can do this or point me in the right direction? I am building a "utility" style app.

I have a JSON file on my server with certain data about the ad (mine happens to be set up for one ad, but you could accommodate multiple the same way).
{"promo":"yes","imageURL":"http://somedomain/testAd.png","image2xURL":"http://somedomain/testAd#2x.png","link":"http://www.whereTheAdShouldDirect.com"}
Then, in the app, I have this amongst the rest of viewWillAppear:
NSURL *url = [NSURL URLWithString:@"http://www.mydomain/promo.php"];
NSString *response = [[NSString alloc] initWithContentsOfURL:url];
const char *convert = [response UTF8String];
NSString *responseString = [NSString stringWithUTF8String:convert];
NSDictionary *promo = [responseString JSONValue];
[response release];
if([[promo objectForKey:@"promo"] isEqualToString:@"yes"]){
self.linkURL = [NSURL URLWithString:[promo objectForKey:@"link"]];
NSURL *picURL = [NSURL URLWithString:[promo objectForKey:@"imageURL"]];
if([[[UIDevice currentDevice] systemVersion]intValue]>=4){
if([[UIScreen mainScreen] scale]==2.0){
picURL = [NSURL URLWithString:[promo objectForKey:@"image2xURL"]];
}
}
CGRect imgFrame = CGRectMake(0, 0, 320, 50);
UIButton *adImage=[[UIButton alloc] initWithFrame:imgFrame];
NSData * imageData = [NSData dataWithContentsOfURL:picURL];
UIImage * image = [UIImage imageWithData:imageData];
[adImage setBackgroundImage:image forState:UIControlStateNormal];
[adImage addTarget:self action:@selector(ad) forControlEvents:UIControlEventTouchUpInside];
[self.view addSubview:adImage];
[adImage release];
}
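As a side note, the JSONValue call above comes from a third-party JSON category (commonly SBJson); if you would rather stay with Foundation only, NSJSONSerialization (available from iOS 5) can parse the same payload. A minimal sketch, assuming the same promo.php response:
NSData *jsonData = [NSData dataWithContentsOfURL:[NSURL URLWithString:@"http://www.mydomain/promo.php"]];
NSError *jsonError = nil;
NSDictionary *promo = [NSJSONSerialization JSONObjectWithData:jsonData options:0 error:&jsonError];
if (jsonError == nil && [[promo objectForKey:@"promo"] isEqualToString:@"yes"]) {
// same ad setup as above
}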
I also have this method in the app:
-(void)ad{
[[UIApplication sharedApplication] openURL:self.linkURL];
}
You may want to change that last method depending on how you want the ad to react (for example, loading a web view right in the app, as sketched below).
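If you would rather keep the listener inside the app than send them out to Safari, here is a minimal sketch of that idea (using the period-appropriate UIWebView and the same manual retain/release style as above) that swaps out the ad method:
-(void)ad{
UIWebView *adWebView = [[UIWebView alloc] initWithFrame:self.view.bounds];
[adWebView loadRequest:[NSURLRequest requestWithURL:self.linkURL]];
[self.view addSubview:adWebView]; // the view hierarchy retains the web view
[adWebView release];
}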

Related

Generating thumbnail from video - ios7

I am using this for reference: Getting thumbnail from a video url or data in IPhone SDK
That approach uses the MPMoviePlayerController class rather than AVFoundation, and I think I want to use it as well, because people there said the MPMoviePlayerController way is faster than the AVFoundation way.
The problem is that the method used to create the thumbnails, [player thumbnailImageAtTime:1.0 timeOption:MPMovieTimeOptionNearestKeyFrame], is deprecated in iOS 7.0.
Looking at the Apple docs, the remaining supported ways to create thumbnails are the methods (void)requestThumbnailImagesAtTimes:(NSArray *)playbackTimes timeOption:(MPMovieTimeOption)option and (void)cancelAllThumbnailImageRequests. But, as the method signatures show, these methods return nothing. So how do I access the UIImage thumbnail created by these methods?
If it helps, this is what I have so far in terms of code:
self.videoURL = info[UIImagePickerControllerMediaURL];
NSData *videoData = [NSData dataWithContentsOfURL:self.videoURL];
//Create thumbnail image
MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:self.videoURL];
[player requestThumbnailImagesAtTimes:@[@1] timeOption:MPMovieTimeOptionNearestKeyFrame];
//UIImage *thumbnail = ???
How do I get a UIImage reference to the thumbnail?
EDIT
I figured out how to create a notification for the thumbnail image request (using this question as reference). However, the request runs asynchronously from the main thread, and my notification handler method never seems to get called.
This is what I have now.
self.videoURL = info[UIImagePickerControllerMediaURL];
NSData *videoData = [NSData dataWithContentsOfURL:self.videoURL];
MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:self.videoURL];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(handleThumbnailImageRequestFinishNotification:) name:MPMoviePlayerThumbnailImageRequestDidFinishNotification object:player];
[player requestThumbnailImagesAtTimes:@[@1] timeOption:MPMovieTimeOptionNearestKeyFrame];
And then my handler method:
-(void)handleThumbnailImageRequestFinishNotification:(NSNotification*)notification
{
NSDictionary *userinfo = [notification userInfo];
NSError* value = [userinfo objectForKey:MPMoviePlayerThumbnailErrorKey];
if (value != nil)
{
NSLog(@"Error creating video thumbnail image. Details: %@", [value debugDescription]);
}
else
{
UIImage *thumbnail = [userinfo valueForKey:MPMoviePlayerThumbnailImageKey];
}
}
But the handler never gets called (or so it appears).
Try this way.
Import the AVFoundation framework.
in *.h
#import <AVFoundation/AVFoundation.h>
in *.m
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:self.urlForConevW options:nil];
AVAssetImageGenerator *generateImg = [[AVAssetImageGenerator alloc] initWithAsset:asset];
NSError *error = NULL;
CMTime time = CMTimeMake(1, 65);
CGImageRef refImg = [generateImg copyCGImageAtTime:time actualTime:NULL error:&error];
NSLog(@"error==%@, Refimage==%@", error, refImg);
UIImage *FrameImage = [[UIImage alloc] initWithCGImage:refImg];
CGImageRelease(refImg); // copyCGImageAtTime returns an owned CGImage, so release it
Here is code to make a thumbnail of a video and save the image to the Documents directory.
//pass the video_path to NSURL
NSURL *videoURL = [NSURL fileURLWithPath:strVideoPath];
AVURLAsset *asset1 = [[AVURLAsset alloc] initWithURL:videoURL options:nil];
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset1];
generator.appliesPreferredTrackTransform = YES;
//Set the time and size of thumbnail for image
NSError *err = NULL;
CMTime thumbTime = CMTimeMakeWithSeconds(0,30);
CGSize maxSize = CGSizeMake(425,355);
generator.maximumSize = maxSize;
CGImageRef imgRef = [generator copyCGImageAtTime:thumbTime actualTime:NULL error:&err];
UIImage *thumbnail = [[UIImage alloc] initWithCGImage:imgRef];
CGImageRelease(imgRef); // release the owned CGImage once the UIImage has been created
//And you can save the image to the DocumentDirectory
NSData *data = UIImagePNGRepresentation(thumbnail);
//Path for the documentDirectory
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
[data writeToFile:[documentsDirectory stringByAppendingPathComponent:currentFileName] atomically:YES];
If your URL is to an HTTP live stream, then it won't return anything, per the docs. For a file URL, I found that I had to start the request after playing the movie, or it would never get called.
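A common reason the handler never fires with a file URL is that the MPMoviePlayerController is a local variable and gets deallocated before the asynchronous request completes. A minimal sketch of the ordering described above, assuming the controller is kept in a player property and the handler from the question is used unchanged:
self.player = [[MPMoviePlayerController alloc] initWithContentURL:self.videoURL]; // keep a strong reference so the request can finish
self.player.shouldAutoplay = NO;
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(handleThumbnailImageRequestFinishNotification:) name:MPMoviePlayerThumbnailImageRequestDidFinishNotification object:self.player];
[self.player prepareToPlay]; // kick the player off before asking for thumbnails, per the note above
[self.player requestThumbnailImagesAtTimes:@[@1] timeOption:MPMovieTimeOptionNearestKeyFrame];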

Showing video and images

I have my videos and images stored as NSData in a cache. I want to retrieve an image and show it in imageController, and also retrieve a video and show it in videoController.
I can distinguish them by looking at the file extension. The problem is that in imageController I can init the image from the NSData:
NSData *data=[NSData dataWithContentsOfURL:[NSURL fileURLWithPath:filePath isDirectory:NO]];
UIImage *image=[[UIImage alloc]initWithData:data];
But how can I use the NSData to play the video?
If you can get their path extensions, you can just use a separate implementation for each type; there's no need to force both through one code path.
-(void)example:(NSString*)filePath {
if ([filePath.pathExtension isEqualToString:@"png"]) {
NSData *data=[NSData dataWithContentsOfURL:[NSURL fileURLWithPath:filePath isDirectory:NO]];
UIImage *image=[[UIImage alloc]initWithData:data];
} else if ([filePath.pathExtension isEqualToString:@"mp4"]) {
MPMoviePlayerController *player =
[[MPMoviePlayerController alloc] initWithContentURL:[NSURL fileURLWithPath:filePath]];
[player prepareToPlay];
[player.view setFrame: myView.bounds]; // player's frame must match parent's
[myView addSubview: player.view];
// ...
[player play];
}
}
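If the video only exists as NSData in your cache (rather than as a file on disk), MPMoviePlayerController has no data-based initializer, so one workable approach is to write the data to a temporary file first and play from that URL. A minimal sketch, assuming cachedVideoData holds the bytes and myView is the container view as above:
// cachedVideoData is assumed to be the NSData pulled from your cache
NSString *tmpPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"cachedVideo.mp4"];
if ([cachedVideoData writeToFile:tmpPath atomically:YES]) {
NSURL *tmpURL = [NSURL fileURLWithPath:tmpPath];
MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:tmpURL];
[player prepareToPlay];
[player.view setFrame:myView.bounds];
[myView addSubview:player.view];
[player play];
}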

MPMoviePlayerController generate thumbnail of local video file and store it

I'm making an app that downloads movies from the server and stores them locally in the NSDocumentDirectory.
This works fine.
I want to add a thumbnail generated from each movie in front of the name in each cell.
My problem:
How can I generate a thumbnail from a movie after it is downloaded (so instantly, without having to play the movie first)? I want to store the thumbnails with the same name of the movie as a jpg in the NSDocumentDirectory.
My guess
-download movie and store it in NSDocumentDirectory (works)
-somehow load the movie in the MPMoviePlayerController's memory (don't know how)
-when loaded in memory, generate thumbnail with thumbnailImageAtTime (MPMovieTimeOptionNearestKeyFrame) (should work)
-store it (should work)
If anyone could help me...
Thanks
#import <MediaPlayer/MediaPlayer.h>
-(UIImage*)getFirstFrameFromVideoFile:(NSString*)sourceFilePath {
NSURL *videoURL = [NSURL fileURLWithPath:sourceFilePath];
MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:videoURL];
UIImage *thumbnail = [player thumbnailImageAtTime:1.0 timeOption:MPMovieTimeOptionNearestKeyFrame];
//Player autoplays audio on init
[player stop];
[player release];
return thumbnail;
}
Other tasks you know already.
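To cover the storing half of the question as well, a minimal sketch that writes the thumbnail as a .jpg with the same base name as the movie into the Documents directory (the 0.8 JPEG quality is just a placeholder):
UIImage *thumbnail = [self getFirstFrameFromVideoFile:sourceFilePath];
NSString *baseName = [[sourceFilePath lastPathComponent] stringByDeletingPathExtension];
NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *thumbPath = [documentsDirectory stringByAppendingPathComponent:[baseName stringByAppendingPathExtension:@"jpg"]];
[UIImageJPEGRepresentation(thumbnail, 0.8) writeToFile:thumbPath atomically:YES];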
Yes, using MPMoviePlayerController works... but you must be sure that you do not have another movie player playing elsewhere in your app (even the UIWebView plug-in...) or you will get into trouble.
I do it this way:
UIImage *thumbnail = nil;
NSURL *url = [NSURL fileURLWithPath:pathname];
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.appliesPreferredTrackTransform = YES;
NSError *error = nil;
CMTime time = CMTimeMake(3, 1); // 3/1 = 3 second(s)
CGImageRef imgRef = [generator copyCGImageAtTime:time actualTime:nil error:&error];
if (error != nil)
NSLog(@"%@: %@", self, error);
thumbnail = [[UIImage alloc] initWithCGImage:imgRef];
CGImageRelease(imgRef);
Hope this might help
This is the code I use, which should generate the thumbnail.
(I added a big UIImageView for testing; it works when I load a local image into it.)
NSString *path;
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
path = [[paths objectAtIndex:0] stringByAppendingPathComponent:@"snijtechniekendir/videos"];
path = [path stringByAppendingPathComponent:[videos objectAtIndex:indexPath.row]];
NSURL *videoURL = [NSURL fileURLWithPath:path];
NSLog(@"video url: %@", videoURL);
MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:videoURL];
UIImage *thumbnail = [player thumbnailImageAtTime:1 timeOption:MPMovieTimeOptionNearestKeyFrame];
UIImageView *cellimage = [[UIImageView alloc] initWithFrame:CGRectMake(2, 2 , 400, 400)];
[cell.contentView addSubview:cellimage];
NSData *imgData = UIImagePNGRepresentation(thumbnail);
NSLog(@"lenght of video thumb: %@", [imgData length]);
[self.view addSubview:cellimage];
[cellimage setImage:thumbnail];
This is the log for one video file
video url: file://localhost/Users/Home/Library/Application%20Support/iPhone%20Simulator/5.1/Applications/78C165BB-75A9-46A2-A257-469F8652A665/Documents/snijtechniekendir/videos/snijtechniek%2520brunoise.mp4
lenght of video thumb: (null)

Having trouble with this NSURL method

Can anybody tell me exactly how I should call this method on an NSURL object?
URLByResolvingSymlinksInPath
The compiler says the method can't be found when I call it.
Thanks a lot.
This is where I am trying to implement the code:
- (void)data
{
NSURL *url = [[NSURL alloc] initWithString: [[user data] objectAtIndex: 5]];
NSURL * url2 = [[NSURL alloc] init];
url2 = [url URLByResolvingSymlinksInPath];
NSData *newImage = [[NSData alloc] initWithContentsOfURL:url2];
NSImage *image = [[NSImage alloc] initWithData:newImage];
imageView.layer.masksToBounds = YES;
imageView.layer.cornerRadius = 3.0;
[imageView setImage:image];
[user autorelease];
}
Assuming that you have an NSURL named something like myURLWithSymlinks, you would do something like:
NSURL * myURLWithoutSymlinks = [myURLWithSymlinks URLByResolvingSymlinksInPath];
Note that, according to the docs, URLByResolvingSymlinksInPath is only available on iOS 4.0 and later and Mac OS X 10.6 and later. If you are targeting an older version of iOS (or Mac OS X), that could be the cause of your problem.

Getting Image from URL Objective C

I'm trying to get an image from a URL and it doesn't seem to be working for me. Can someone point me in the right direction?
Here is my code:
NSURL *url = [NSURL URLWithString:@"http://myurl/mypic.jpg"];
NSString *newIMAGE = [[NSString alloc] initWithContentsOfURL:url
encoding:NSUTF8StringEncoding error:nil];
cell.image = [UIImage imageNamed:newIMAGE];
When I debug, the newIMAGE string is nil, so something isn't working there.
What you want is to get the image data, then initialize a UIImage using that data:
NSData * imageData = [[NSData alloc] initWithContentsOfURL: [NSURL URLWithString: @"http://myurl/mypic.jpg"]];
cell.image = [UIImage imageWithData: imageData];
[imageData release];
As requested, here's an asynchronous version:
dispatch_async(dispatch_get_global_queue(0,0), ^{
NSData * data = [[NSData alloc] initWithContentsOfURL: [NSURL URLWithString: @"http://myurl/mypic.jpg"]];
if ( data == nil )
return;
dispatch_async(dispatch_get_main_queue(), ^{
// WARNING: is the cell still using the same data by this point??
cell.image = [UIImage imageWithData: data];
});
[data release];
});
OK, there are a couple of things wrong here:
The conversion from the URL (url) to the NSString (newIMAGE) is incorrect; what that code actually does is try to load the contents of "http://myurl/mypic.jpg" into the string.
The -imageNamed: method takes the name of an image in your app bundle as its argument, not a URL.
You need to use an NSData object as an intermediary, like in this example:
http://blogs.oreilly.com/digitalmedia/2008/02/creating-an-uiimage-from-a-url.html
The asynchronous version in the accepted answer worked very slowly in my code; an approach using NSOperation was far faster. The code below is from Joe Masilotti's answer to "Objective-C: Loading image from URL?":
-(void) someMethod {
// set placeholder image
UIImage* memberPhoto = [UIImage imageNamed:@"place_holder_image.png"];
// retrieve image for cell in using NSOperation
NSURL *url = [NSURL URLWithString:group.photo_link[indexPath.row]];
[self loadImage:url];
}
- (void)loadImage:(NSURL *)imageURL
{
NSOperationQueue *queue = [NSOperationQueue new];
NSInvocationOperation *operation = [[NSInvocationOperation alloc]
initWithTarget:self
selector:@selector(requestRemoteImage:)
object:imageURL];
[queue addOperation:operation];
}
- (void)requestRemoteImage:(NSURL *)imageURL
{
NSData *imageData = [[NSData alloc] initWithContentsOfURL:imageURL];
UIImage *image = [[UIImage alloc] initWithData:imageData];
[self performSelectorOnMainThread:@selector(placeImageInUI:) withObject:image waitUntilDone:YES];
}
- (void)placeImageInUI:(UIImage *)image
{
[self.memberPhotoImage setImage:image];
}
In Swift 3 and 4
let theURL = URL(string:"https://exampleURL.com")
let imagedData = NSData(contentsOf: theURL!)!
let theImage = UIImage(data: imagedData as Data)
cell.theImageView.image = theImage
This will be done in the main thread.
And to perform the same work on a background thread, dispatching back to the main thread to update the UI:
DispatchQueue.global(qos: .userInitiated).async {
    guard let theURL = URL(string: "https://exampleURL.com"),
          let imageData = try? Data(contentsOf: theURL) else { return }
    DispatchQueue.main.async {
        cell.theImageView.image = UIImage(data: imageData)
    }
}
Updating Jim Dovey's answer: [data release] is no longer required, since memory management is now handled automatically by ARC (Automatic Reference Counting).
Here is the updated asynchronous call:
dispatch_async(dispatch_get_global_queue(0,0), ^{
NSData * data = [[NSData alloc] initWithContentsOfURL: [NSURL URLWithString: @"your_URL"]];
if ( data == nil )
return;
dispatch_async(dispatch_get_main_queue(), ^{
self.your_UIimage.image = [UIImage imageWithData: data];
});
});