I am struggling to display an image.
The following code works fine:
imageView.image = [UIImage imageNamed:#"picture"];
or
imageView.image = [UIImage imageNamed:#"picture.png"];
But the following code does not work:
NSString *theImagePath = [[NSBundle mainBundle] pathForResource:@"picture" ofType:@"png"];
imageView.image = [UIImage imageWithContentsOfFile:theImagePath];
or
imageView.image = [[UIImage alloc] initWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"picture" ofType:@"png"]];
I tried cleaning the project, but that has not fixed it.
Does anyone have the same experience?
imageView.image = [[UIImage alloc] initWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"picture.png" ofType:nil]];
Give it a shot; Xcode probably does not recognize your file as a PNG file.
But why don't you just stay with imageNamed:?
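If you want to narrow down which step fails, a quick diagnostic (assuming the file really is named picture.png) is to log what pathForResource:ofType: returns; nil means the file is not in the main bundle under that exact name, case, and extension, which usually points at target membership or a naming mismatch rather than at UIImage itself.
// Diagnostic sketch: check whether the bundle can find the file at all.
NSString *path = [[NSBundle mainBundle] pathForResource:@"picture" ofType:@"png"];
if (path == nil) {
    NSLog(@"picture.png is not in the main bundle (check target membership and file name case)");
} else {
    NSLog(@"Found at %@, loaded image: %@", path, [UIImage imageWithContentsOfFile:path]);
}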
I have an app which loads an image from the local disk and then displays it in a UIImageView. I want the image to aspect-scale to fit.
The problem I'm having is that the imageOrientation is coming back as UIImageOrientationRight even though it's a portrait image, and that's messing with how the aspect calculations are done.
I've tried a few methods of changing the metadata, but both rotate the image when it gets displayed.
UIImageView *iv = [[UIImageView alloc] initWithFrame:self.view.frame];
NSMutableString *path =[[NSMutableString alloc] initWithString: [[NSBundle mainBundle] resourcePath]];
[path appendString:@"/pic2.jpg"];
NSData *data = [NSData dataWithContentsOfFile:path];
UIImage *img = [UIImage imageWithData:data];
UIImage *fixed1 = [UIImage imageWithCGImage:[img CGImage]
scale:1.0
orientation: UIImageOrientationUp];
UIImage *sourceImage = img;
UIImage *fixed2 = [UIImage
imageWithCGImage:[img imageRotatedByDegrees:90].CGImage
scale:sourceImage.scale
orientation:UIImageOrientationUp];
iv.image = fixed1; // fixed2;
[iv setContentMode:UIViewContentModeScaleAspectFill];
[self.view addSubview:iv];
In the end I took a different approach and made the UIImageView 360 wide and placed it in the centre and this achieved the desired effect.
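For reference, a common way to get an orientation-free copy is to redraw the image into a bitmap context instead of reinterpreting the CGImage; a minimal sketch of that approach, using the img loaded above:
// Render the image so the pixels themselves end up upright and the
// resulting UIImage reports UIImageOrientationUp.
UIGraphicsBeginImageContextWithOptions(img.size, NO, img.scale);
[img drawInRect:CGRectMake(0, 0, img.size.width, img.size.height)];
UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
iv.image = normalized;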
I'm not able to show an image that I load from a JSON file.
I'm parsing my JSON with JSONKit and everything works fine, but I can't load the image into the UIImageView. I hope some of you can help me out.
Below is my code, which I thought was correct, but it isn't:
UIImageView *image = [[UIImageView alloc] initWithFrame:CGRectMake(20, 20, 100, 115)];
image.image = [UIImage imageNamed:[detailView objectForKey:@"thumbnail"]];
[self.view addSubview:image];
[image release];
I think the JSON gives you the URL of the image. Display an image from a URL using the code below:
//NSURL *url = [NSURL URLWithString:@"http://192.168.1.2x0/pic/LC.jpg"];
NSURL *url = [NSURL URLWithString:[detailView objectForKey:@"thumbnail"]];
NSData *data = [NSData dataWithContentsOfURL:url];
UIImageView *subview = [[UIImageView alloc] initWithFrame:CGRectMake(0.0f, 0.0f,320.0f, 460.0f)];
[subview setImage:[UIImage imageWithData:data]];
[self.view addSubview:subview];
[subview release];
The code you posted,
image.image = [UIImage imageNamed:[detailView objectForKey:@"thumbnail"]];
will try to find an image in your bundle whose name is the string stored in [detailView objectForKey:@"thumbnail"].
As you mentioned, your images come from a remote server, so you have to download the image from that server:
UIImageView *image = [[UIImageView alloc] initWithFrame:CGRectMake(20, 20, 100, 115)];
image.image = [UIImage imageWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:[detailView objectForKey:@"thumbnail"]]]];
[self.view addSubview:image];
[image release];
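One caveat with the snippets above: dataWithContentsOfURL: blocks whichever thread it runs on, so on the main thread the UI will freeze while the thumbnail downloads. A rough sketch of doing the download in the background with GCD (thumbView is just a placeholder name; the same detailView key is assumed):
UIImageView *thumbView = [[UIImageView alloc] initWithFrame:CGRectMake(20, 20, 100, 115)];
[self.view addSubview:thumbView];
NSURL *thumbURL = [NSURL URLWithString:[detailView objectForKey:@"thumbnail"]];
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // Download off the main thread...
    NSData *thumbData = [NSData dataWithContentsOfURL:thumbURL];
    dispatch_async(dispatch_get_main_queue(), ^{
        // ...and touch UIKit only on the main queue.
        thumbView.image = [UIImage imageWithData:thumbData];
    });
});
[thumbView release];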
Which call is correct? It seems that both calls give the same result.
UIImage *img = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"image" ofType:@"png"]];
or
UIImage *img = [[UIImage alloc] imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"image" ofType:@"png"]];
The first is correct, as imageWithContentsOfFile: is a class convenience method (a class method), so it is not paired with alloc.
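Roughly, the instance-method counterpart of the class method is initWithContentsOfFile:, so the two equivalent forms would be:
NSString *path = [[NSBundle mainBundle] pathForResource:@"image" ofType:@"png"];
// Class convenience method: returns an autoreleased image.
UIImage *img1 = [UIImage imageWithContentsOfFile:path];
// alloc/init pair: under manual reference counting you own this one and must release it.
UIImage *img2 = [[UIImage alloc] initWithContentsOfFile:path];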
The easiest way to initialize a UIImage is...
UIImage *img = [UIImage imageNamed:@"image.png"];
So, I'm doing a Breakout clone on the iPhone. All elements, except the bricks to hit, are created and working as expected with the NIB file.
However, since I want to create different levels and run collision detection on the bricks, it seems stupid to add them in Interface Builder. How do I add them to the view in code?
I have an image called "brick.png" that I want to use with a UIImageView. Also, I want to have arrays and/or lists of these so I can build cool levels with patterns in the bricks and all :)
How do I do this in code?
@Mark is right; I would just add the frame where the image needs to be displayed:
UIImageView *imgView = [[[UIImageView alloc] initWithFrame:CGRectMake(10, 10, 100, 20)] autorelease];
NSString *imgFilepath = [[NSBundle mainBundle] pathForResource:@"brick" ofType:@"png"];
UIImage *img = [[UIImage alloc] initWithContentsOfFile:imgFilepath];
[imgView setImage:img];
[img release];
[self.view addSubview:imgView];
I tested the code, and for me the image only shows when the coordinates are specified.
It's really pretty easy. Here's an example of how you would create and display a UIImageView programmatically...
UIImageView *imgView = [[[UIImageView alloc] init] autorelease];
NSString *imgFilepath = [[NSBundle mainBundle] pathForResource:@"brick" ofType:@"png"];
UIImage *img = [[UIImage alloc] initWithContentsOfFile:imgFilepath];
[imgView setImage:img];
[img release];
[self.view addSubview:imgView];
That's pretty much all there is to it.
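To build a whole level of bricks and keep references to them for collision detection, a rough sketch along the same lines (the row/column counts, sizes, and the brickViews array are just placeholders for your own level layout):
NSMutableArray *brickViews = [NSMutableArray array];
UIImage *brickImage = [UIImage imageNamed:@"brick.png"];
for (int row = 0; row < 5; row++) {
    for (int col = 0; col < 8; col++) {
        CGRect frame = CGRectMake(10 + col * 38, 40 + row * 14, 36, 12);
        UIImageView *brick = [[[UIImageView alloc] initWithFrame:frame] autorelease];
        brick.image = brickImage;
        [self.view addSubview:brick];
        [brickViews addObject:brick];   // keep a reference for collision checks later
    }
}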
Like the title says, in the iPhone SDK, I want to create an animated UIImageView and use it as a camera overlay. However, nothing appears. I've been using the same setup to display a static image as an overlay, but when trying to use the following code, no overlay appears:
imageView.animationImages = [NSArray arrayWithObjects:
[UIImage imageNamed:#"cameraScreenOverlay1.png"],
[UIImage imageNamed:#"cameraScreenOverlay2.png"],
[UIImage imageNamed:#"cameraScreenOverlay3.png"],
[UIImage imageNamed:#"cameraScreenOverlay4.png"],
[UIImage imageNamed:#"cameraScreenOverlay4.png"],
[UIImage imageNamed:#"cameraScreenOverlay4.png"],
[UIImage imageNamed:#"cameraScreenOverlay3.png"],
[UIImage imageNamed:#"cameraScreenOverlay2.png"],
[UIImage imageNamed:#"cameraScreenOverlay1.png"],
nil];
imageView.animationDuration = 1.0;
imageView.animationRepeatCount = 0;
[imageView startAnimating];
I know the above code works when the imageView is not used as an overlay. Any thoughts? Is this just a limitation of the current SDK?
I was trying to do the same thing. It works when it's just a static overlay image, but once I tried to animate with an array of images, nothing showed.
I was loading the PNGs all wrong with imageWithContentsOfFile:. It won't load the image if you pass just the file name; it needs the actual path.
Try something like this:
NSArray *animationImages = [NSArray arrayWithObjects:
[UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"back1" ofType:@"png"]],
[UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"back2" ofType:@"png"]],
[UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"back3" ofType:@"png"]],
[UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"back4" ofType:@"png"]],
[UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"back5" ofType:@"png"]],
[UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"back6" ofType:@"png"]],
[UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:@"back7" ofType:@"png"]], nil];
imageview = [[UIImageView alloc] initWithFrame:CGRectMake(0,0,320,480)];
imageview.animationImages = animationImages; // the property retains/copies the array; an extra retain would leak
imageview.animationRepeatCount = 0;
imageview.animationDuration = 1;
[imageview startAnimating]; // don't forget to actually start the animation
[overlay.view addSubview:imageview];
[moviePlayerWindow addSubview:overlay.view];