iPhone: horizontal image gallery - Objective-C

I am trying to create a sample gallery before filling in real data.
float xAxis = 0;
float mostHeight = self.galleryView.bounds.size.height;
for (int i = 0; i < 5; i++) {
    // get image from URL
    UIImage *myImage = [UIImage imageWithData:
                        [NSData dataWithContentsOfURL:
                         [NSURL URLWithString:@"http://4.bp.blogspot.com/-M9k3fhlUwmE/TdYgxL97xMI/AAAAAAAAAY8/_r45zbAm1p0/s1600/CARTOON_Cat-full.jpg"]]];
    myImage = [[MyMath sharedMySingleton] imageWithImage:myImage convertToSize:CGSizeMake(mostHeight - 10, mostHeight - 10)];
    // set image
    UIImageView *rview = [[UIImageView alloc] initWithImage:myImage];
    /* adding the view to your scroll view */
    [self.galleryView addSubview:rview];
    xAxis += myImage.size.width + 20;
}
// set scrolling area
self.galleryView.contentSize = CGSizeMake(xAxis, mostHeight);
galleryView is a UIScrollView from a xib. Here I am trying to fill the gallery with five copies of the same picture, but it shows only one image followed by empty space (because of the xAxis value). What did I do wrong?
P.S. I am new to the iPhone SDK, so don't judge me.

You never set the frame for the UIImageView. You need to offset the x coordinate of the origin of the frame (frame.origin.x) by the width of the image view plus padding.

You forgot to set the frame of your image view using the x-offset stored in xAxis.
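For illustration, a minimal sketch of the corrected loop body from the question, positioning each image view at the running xAxis offset before adding it to the scroll view:
// Sketch: give each image view a frame offset by the running xAxis value.
UIImageView *rview = [[UIImageView alloc] initWithImage:myImage];
rview.frame = CGRectMake(xAxis, 0, myImage.size.width, myImage.size.height);
[self.galleryView addSubview:rview];
xAxis += myImage.size.width + 20;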

Related

How to get the image width and height from image view in iOS

I have an image of 640*960, and my image view is 300*300. I set the content mode of the image view to aspect fit, and when I load the image into the imageView it looks as shown below.
After loading the image into the image view, I need the width and height of the image as displayed in the imageView.
Did you try this?
@import AVFoundation;
UIImageView *imageView = [[UIImageView alloc] init];
imageView.image = [UIImage imageNamed:@"paint5.png"];
CGRect yourFrame = AVMakeRectWithAspectRatioInsideRect(imageView.image.size, imageView.bounds);
NSLog(@"%@", NSStringFromCGRect(yourFrame));
I think that's what you are looking for.
AV Foundation Functions Reference
You can get the height and width of the image in the following way:
UIImageView *imageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"paint5.png"]];
int imgHeight = imageView.layer.frame.size.height;
int imgWidth = imageView.layer.frame.size.width;
NSLog(@"height of image %i", imgHeight);
NSLog(@"width of image %i", imgWidth);

Attach UILabel to image thumbnail bottom (varying heights)

I have a view with an image view and a label. The image view displays images of varying sizes and is pinned to the top (see image). I want to dynamically glue the text label to the bottom of every image, without any space between the image thumbnail and the UILabel.
You can do that programmatically. I assume that you have the UILabel in a "label" variable and the image view in an "imageView" variable.
CGRect labelFrame = label.frame;
labelFrame.origin.y = imageView.frame.origin.y + imageView.frame.size.height + any_space_you_want_between_image_and_label;
label.frame = labelFrame;
That will move the label to just below the imageView. I hope this helps.
Because, as you mentioned, you use UIViewContentModeScaleAspectFit for the content mode, the solution is a bit harder.
You have to actually calculate the final height of the image (the actual size of the image inside the UIImageView):
// UIImage *img = ...; UIImageView *imgView = ...;
CGFloat imageWidth = img.size.width;
CGFloat imageHeight = img.size.height;
CGFloat viewWidth = imgView.frame.size.width;
CGFloat viewHeight = imgView.frame.size.height;
CGFloat actualHeight = imageHeight * viewWidth / imageWidth;
// this is the actual height of the UIImage inside the UIImageView
CGRect labelFrame = label.frame;
labelFrame.origin.y = imgView.frame.origin.y + actualHeight + any_space_you_want_between_image_and_label;
label.frame = labelFrame;

AVCaptureSession preset photo and AVCaptureVideoPreviewLayer size

I initialize an AVCaptureSession and preset it like this:
AVCaptureSession *newCaptureSession = [[AVCaptureSession alloc] init];
if ([newCaptureSession canSetSessionPreset:AVCaptureSessionPresetPhoto]) {
    newCaptureSession.sessionPreset = AVCaptureSessionPresetPhoto;
} else {
    // Error management
}
Then I set up an AVCaptureVideoPreviewLayer:
self.preview = [[UIView alloc] initWithFrame:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height/*426*/)];
CALayer *previewLayer = preview.layer;
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
captureVideoPreviewLayer.frame = previewLayer.frame;
[previewLayer addSublayer:captureVideoPreviewLayer];
captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspect;
My question is:
How can I get the exact CGSize needed to display the whole captureVideoPreviewLayer on screen? More precisely, I need the height, since AVLayerVideoGravityResizeAspect makes the AVCaptureVideoPreviewLayer fit preview.size.
I am trying to get an AVCaptureVideoPreviewLayer size that fits correctly.
Thank you very much for your help.
After some research: with AVCaptureSessionPresetPhoto, the AVCaptureVideoPreviewLayer respects the 4:3 aspect ratio of the iPhone camera, so it is easy to get the right height with a simple calculation.
For instance, if the width is 320, the corresponding height is:
320 * 4/3 = 426.6
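For illustration, a minimal sketch of that calculation in code, assuming a portrait layout where the preview fills the screen width (variable names follow the question):
// Derive the preview height from the 4:3 sensor ratio (portrait, width-filling layout).
CGFloat previewWidth = self.view.bounds.size.width;   // e.g. 320
CGFloat previewHeight = previewWidth * 4.0 / 3.0;     // e.g. 426.6
captureVideoPreviewLayer.frame = CGRectMake(0, 0, previewWidth, previewHeight);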
Weischel's code didn't work for me. The idea worked, but the code didn't. Here's the code that did work:
// Get your AVCaptureSession somehow. I'm getting mine out of self.videoCamera, which is a GPUImageVideoCamera.
// Get the appropriate AVCaptureVideoDataOutput out of the capture session. I only have one session, so it's easy.
AVCaptureVideoDataOutput *output = [[[self.videoCamera captureSession] outputs] lastObject];
NSDictionary *outputSettings = [output videoSettings];
// AVVideoWidthKey and AVVideoHeightKey did not work. I had to use these literal keys.
long width = [[outputSettings objectForKey:@"Width"] longValue];
long height = [[outputSettings objectForKey:@"Height"] longValue];
// Video camera output dimensions are always for landscape mode. Transpose if your camera is in portrait mode.
if (UIInterfaceOrientationIsPortrait([self.videoCamera outputImageOrientation])) {
    long buf = width;
    width = height;
    height = buf;
}
CGSize outputSize = CGSizeMake(width, height);
If I understand you correctly, you are trying to get the width and height of the current video session.
You can obtain them from the outputSettings dictionary of your AVCaptureOutput. (Use AVVideoWidthKey & AVVideoHeightKey).
e.g.
NSDictionary *outputSettings = [movieFileOutput outputSettingsForConnection:videoConnection];
CGSize videoSize = CGSizeMake([[outputSettings objectForKey:AVVideoWidthKey] doubleValue],
                              [[outputSettings objectForKey:AVVideoHeightKey] doubleValue]);
Update:
Another idea would be to grab the frame size from the image buffer of the preview session.
Implement the AVCaptureVideoDataOutputSampleBufferDelegate method captureOutput:didOutputSampleBuffer:fromConnection:
(don't forget to set the delegate of your AVCaptureOutput)
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (imageBuffer != NULL)
    {
        CGSize imageSize = CVImageBufferGetDisplaySize(imageBuffer);
        NSLog(@"%@", NSStringFromCGSize(imageSize));
    }
}
Thanks to gsempe for your answer. I had been stuck on the same problem for hours :)
I solved it with this code, which centers the layer on the screen in landscape mode:
CGRect layerRect = [[[self view] layer] bounds];
[PreviewLayer setBounds:CGRectMake(0, 0, 426.6, 320)];
[PreviewLayer setPosition:CGPointMake(CGRectGetMidY(layerRect), CGRectGetMidX(layerRect))];
Note that I had to swap the CGRectGetMidY() and CGRectGetMidX() functions to get the layer centered on my screen.
Thanks,
Julian

background of UIView gets scaled incorrectly and is offset

I saw other posts but still can't get my background image to scale correctly. I use a UIView to set a background image because I need the background image to tile vertically. I set the frame of the UIView to the same size as the pixel width and height of the png, but the image gets offset and scaled incorrectly. Could it be because the image has some transparent pixels? I tried the following:
_view.backgroundColor = [UIColor colorWithPatternImage:[UIImage imageNamed:@"background.png"]];
_view.opaque = NO;
_view.contentMode = UIViewContentModeScaleAspectFit;
//_view.layer.contentsGravity = kCAGravityTop;
You might find it easier to just make a subclass of UIView. In the subclass, override drawRect: to draw your image using drawAsPatternInRect:. That method will fill the view by tiling the image:
- (void)drawRect:(CGRect)dirtyRect {
    [self.tileImage drawAsPatternInRect:self.bounds];
}
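For context, a minimal sketch of such a subclass (the class name and the tileImage property are assumptions, not from the original answer):
// Hypothetical tiling view; set tileImage to your background image.
@interface TiledBackgroundView : UIView
@property (nonatomic, strong) UIImage *tileImage;
@end

@implementation TiledBackgroundView
- (void)drawRect:(CGRect)dirtyRect {
    // Repeat the image across the whole view.
    [self.tileImage drawAsPatternInRect:self.bounds];
}
@end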
I think it is because you are using an image as a background colour.
It would be much better if you used an image view and added that as a subview, like this:
// Edit for vertical tiling
int y = 0;
int imgsize = 100; // tile height in points
for (int i = 0; i < max; i++) { // max is the number of tiles you want
    UIImageView *img = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"background.png"]];
    img.frame = CGRectMake(0, y, 100, 100);
    y += imgsize; // move the starting point down for the next tile
    [_view addSubview:img];
}
Might not be the best way to do it, but it will work.

setting image view frame dynamically

In my app I am sending a certain number of images from my device to another device, and I have that working. Now what I want is this: when I send only one image to the other device, the image view's frame should fill the screen; if I send two images, the two images together should cover the whole screen. So the frame should change dynamically according to the number of images sent. Currently I am using a table view to display the received images. What other option would be best to achieve this? Please help; if anyone has done this type of work before, I need your help.
Thanks,
Daisy
You can do something like this:
CGFloat screenWidth = 320.0;
CGFloat screenHeight = 460.0;
NSArray *imageNames = [NSArray arrayWithObjects:@"Picture1.png", @"Picture2.png", nil];
NSInteger numberOfImages = [imageNames count];
for (NSInteger j = 0; j < numberOfImages; ++j)
{
    UIImageView *image = [[UIImageView alloc] initWithImage:[UIImage imageNamed:[imageNames objectAtIndex:j]]];
    [image setFrame:CGRectMake(0.0, screenHeight / (CGFloat)numberOfImages * (CGFloat)j, screenWidth, screenHeight / (CGFloat)numberOfImages)];
    [self addSubview:image];
    [image release];
}
This example lists the images vertically. If you also want a horizontal arrangement, you have to do a bit more maths; a sketch follows below.
The code is not tested.
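For illustration, a minimal sketch of a simple grid layout (the two-column arrangement is an assumption, not part of the original answer), reusing the variables from the snippet above:
// Sketch: arrange the images in a grid with an assumed number of columns.
NSInteger columns = 2; // assumed column count
NSInteger rows = (numberOfImages + columns - 1) / columns; // round up
CGFloat cellWidth = screenWidth / (CGFloat)columns;
CGFloat cellHeight = screenHeight / (CGFloat)rows;
for (NSInteger j = 0; j < numberOfImages; ++j)
{
    UIImageView *image = [[UIImageView alloc] initWithImage:[UIImage imageNamed:[imageNames objectAtIndex:j]]];
    NSInteger row = j / columns;
    NSInteger column = j % columns;
    [image setFrame:CGRectMake(cellWidth * (CGFloat)column, cellHeight * (CGFloat)row, cellWidth, cellHeight)];
    [self addSubview:image];
    [image release];
}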