In my app I send a number of images from one device to another, and that part already works. What I want now is for the layout to adapt to the number of images received: if I send a single image, its image view should fill the whole screen; if I send two images, the two views together should cover the screen; and so on. So the frames should change dynamically according to the number of images sent. Currently I am using a table view to display the received images. What would be the best alternative to achieve this? Has anyone done this kind of work before? I'd appreciate any help.
Thanks,
Daisy
You can do something like this:

CGFloat screenWidth = 320.0;
CGFloat screenHeight = 460.0;
NSArray *imageNames = [NSArray arrayWithObjects:@"Picture1.png", @"Picture2.png", nil];
NSInteger numberOfImages = [imageNames count];
for (NSInteger j = 0; j < numberOfImages; ++j)
{
    UIImageView *imageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:[imageNames objectAtIndex:j]]];
    // Give each image an equal horizontal slice of the screen.
    [imageView setFrame:CGRectMake(0.0, screenHeight / (CGFloat)numberOfImages * (CGFloat)j, screenWidth, screenHeight / (CGFloat)numberOfImages)];
    [self addSubview:imageView];
    [imageView release];
}

This example stacks the images vertically. If you also want a horizontal arrangement, you have to do a bit more arithmetic; see the grid sketch below.
The code is not tested.
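For the case where you want a grid instead of a single column, a minimal sketch (also untested, reusing screenWidth, screenHeight, and imageNames from the snippet above) could derive rows and columns like this:

NSInteger numberOfImages = [imageNames count];
// Pick a column count near the square root so the grid stays roughly square.
NSInteger columns = (NSInteger)ceil(sqrt((double)numberOfImages));
NSInteger rows = (NSInteger)ceil((double)numberOfImages / (double)columns);
CGFloat cellWidth = screenWidth / (CGFloat)columns;
CGFloat cellHeight = screenHeight / (CGFloat)rows;
for (NSInteger j = 0; j < numberOfImages; ++j)
{
    UIImageView *imageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:[imageNames objectAtIndex:j]]];
    NSInteger row = j / columns;
    NSInteger column = j % columns;
    [imageView setFrame:CGRectMake(cellWidth * (CGFloat)column, cellHeight * (CGFloat)row, cellWidth, cellHeight)];
    [self addSubview:imageView];
    [imageView release];
}

With one image this yields a single full-screen cell; with two images, two views splitting the screen side by side.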
I initialize an AVCaptureSession and preset it like this:

AVCaptureSession *newCaptureSession = [[AVCaptureSession alloc] init];
if ([newCaptureSession canSetSessionPreset:AVCaptureSessionPresetPhoto]) {
    newCaptureSession.sessionPreset = AVCaptureSessionPresetPhoto;
} else {
    // Error management
}
Then I set up an AVCaptureVideoPreviewLayer:
self.preview = [[UIView alloc] initWithFrame:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height/*426*/)];
CALayer *previewLayer = preview.layer;
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
captureVideoPreviewLayer.frame = previewLayer.frame;
[previewLayer addSublayer:captureVideoPreviewLayer];
captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspect;
My question is: how can I get the exact CGSize needed to display the whole captureVideoPreviewLayer on screen? More precisely, I need the height, since AVLayerVideoGravityResizeAspect makes the AVCaptureVideoPreviewLayer fit inside preview.size. In short, I'm trying to find the AVCaptureVideoPreviewLayer size that fits correctly.
Thank you very much for your help.
After some research: with AVCaptureSessionPresetPhoto, the AVCaptureVideoPreviewLayer respects the 3:4 (width to height) ratio of the iPhone camera, so it's easy to get the right height with simple arithmetic. For instance, if the width is 320, the corresponding height is:
320 * 4 / 3 ≈ 426.7
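In code, this amounts to something like the following (a small sketch based on that ratio; captureVideoPreviewLayer is the layer from the question):

CGFloat previewWidth = self.view.bounds.size.width;  // e.g. 320
CGFloat previewHeight = previewWidth * 4.0 / 3.0;    // 3:4 sensor shown in portrait, ~426.7 for a width of 320
captureVideoPreviewLayer.frame = CGRectMake(0.0, 0.0, previewWidth, previewHeight);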
Weischel's code didn't work for me. The idea worked, but the code didn't. Here's the code that did work:

// Get your AVCaptureSession somehow. I'm getting mine out of self.videoCamera, which is a GPUImageVideoCamera.
// Get the appropriate AVCaptureVideoDataOutput out of the capture session. I only have one session, so it's easy.
AVCaptureVideoDataOutput *output = [[[self.videoCamera captureSession] outputs] lastObject];
NSDictionary *outputSettings = [output videoSettings];
// AVVideoWidthKey and AVVideoHeightKey did not work. I had to use these literal keys.
long width = [[outputSettings objectForKey:@"Width"] longValue];
long height = [[outputSettings objectForKey:@"Height"] longValue];
// Video camera output dimensions are always for landscape mode. Transpose if your camera is in portrait mode.
if (UIInterfaceOrientationIsPortrait([self.videoCamera outputImageOrientation])) {
    long buf = width;
    width = height;
    height = buf;
}
CGSize outputSize = CGSizeMake(width, height);
If I understand you correctly, you're trying to get the width & height of the current video session. You can obtain them from the outputSettings dictionary of your AVCaptureOutput (use AVVideoWidthKey & AVVideoHeightKey), e.g.:

NSDictionary *outputSettings = [movieFileOutput outputSettingsForConnection:videoConnection];
CGSize videoSize = CGSizeMake([[outputSettings objectForKey:AVVideoWidthKey] doubleValue], [[outputSettings objectForKey:AVVideoHeightKey] doubleValue]);
Update:
Another idea would be to grab the frame size from the image buffer of the preview session. Implement the AVCaptureVideoDataOutputSampleBufferDelegate method captureOutput:didOutputSampleBuffer:fromConnection: (and don't forget to set the delegate of your AVCaptureVideoDataOutput; see the setup sketch after the method):

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (imageBuffer != NULL)
    {
        CGSize imageSize = CVImageBufferGetDisplaySize(imageBuffer);
        NSLog(@"%@", NSStringFromCGSize(imageSize));
    }
}
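Setting up that delegate looks roughly like this (a sketch, assuming session is your AVCaptureSession and self conforms to AVCaptureVideoDataOutputSampleBufferDelegate):

AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
// Deliver sample buffers on a dedicated serial queue.
dispatch_queue_t videoQueue = dispatch_queue_create("videoQueue", NULL);
[videoOutput setSampleBufferDelegate:self queue:videoQueue];
if ([session canAddOutput:videoOutput]) {
    [session addOutput:videoOutput];
}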
Thanks to gsempe for your answer. I had been stuck on the same problem for hours :)
I solved it with this code, which centers the layer on screen in landscape mode:

CGRect layerRect = [[[self view] layer] bounds];
[PreviewLayer setBounds:CGRectMake(0, 0, 426.6, 320)];
[PreviewLayer setPosition:CGPointMake(CGRectGetMidY(layerRect), CGRectGetMidX(layerRect))];

Note that I had to swap the CGRectGetMidY() and CGRectGetMidX() calls to get the layer centered on my screen.
Thanks,
Julian
I am trying to add an individual UIImageView for each item in an array. This is what I have so far:
_shapeArray = [NSArray arrayWithObjects:@"cloud.png", @"star.png", nil];
for (int i = 0; i < [_shapeArray count]; i++) {
    // I make a UIImage, which I will draw later
    UIImage *image = [UIImage imageNamed:[NSString stringWithFormat:@"%@", [_shapeArray objectAtIndex:i]]];
    UIImageView *blockView = [[UIImageView alloc] initWithImage:image];
    blockView.frame = CGRectMake(arc4random() % 320, arc4random() % 460, image.size.width, image.size.height);
    [self.view addSubview:blockView];
}
But as far as I can tell, it only adds the last image in the array. I can't figure out a way to, say, append the array index to the name of each UIImageView. Maybe I'm going about it the wrong way; if so, what would be the best approach?
This code works, but you need to make sure of a couple of things:
- The file name actually exists in your bundle (check uppercase/lowercase). You will not get an error message if it doesn't; the picture simply won't show.
- The image sizes are not too large and the images don't cover each other.
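A quick way to catch the first, silent failure (a small sketch; imageNamed: simply returns nil for a missing or misnamed file):

UIImage *image = [UIImage imageNamed:[_shapeArray objectAtIndex:i]];
if (image == nil) {
    // The file is missing from the bundle, or its case doesn't match the name.
    NSLog(@"Could not load image named %@", [_shapeArray objectAtIndex:i]);
    continue;
}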
You are adding the images at overlapping frames; the random origins can land the views on top of each other. Offset them explicitly:

blockView.frame = CGRectMake(arc4random() % 320 + SomeXValue, arc4random() % 460 + SomeYValue, image.size.width, image.size.height);
I need to split a big image (about 10000px in height) into a number of smaller images to use as textures for OpenGL. Below is how I'm doing it right now; does anybody have ideas to do it faster? It is taking quite long.

NSMutableArray *images = [[NSMutableArray alloc] init];
for (int i = 0; i < numberOfImages; i++) {
    int t = i * origHeight;
    CGRect fromRect = CGRectMake(0, t, origWidth, origHeight); // or whatever rectangle
    CGImageRef drawImage = CGImageCreateWithImageInRect(sourceImage.CGImage, fromRect);
    UIImage *newImage = [UIImage imageWithData:UIImageJPEGRepresentation([UIImage imageWithCGImage:drawImage], 1.0)];
    [images addObject:newImage];
    CGImageRelease(drawImage);
}
You can pre-split the image ahead of time, e.g. with the convert command from ImageMagick, which you can install with Homebrew:
http://www.imagemagick.org/discourse-server/viewtopic.php?f=1&t=15771
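If the split has to happen in-app, note that most of the time in the loop above goes into the UIImageJPEGRepresentation encode/decode round trip, which isn't needed just to build a UIImage. A sketch of the same loop without it:

NSMutableArray *images = [[NSMutableArray alloc] init];
for (int i = 0; i < numberOfImages; i++) {
    CGRect fromRect = CGRectMake(0, i * origHeight, origWidth, origHeight);
    CGImageRef drawImage = CGImageCreateWithImageInRect(sourceImage.CGImage, fromRect);
    // Wrap the cropped CGImage directly instead of re-encoding it as a JPEG.
    [images addObject:[UIImage imageWithCGImage:drawImage]];
    CGImageRelease(drawImage);
}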
I am trying to create a sample gallery before filling in real data.
float xAxis = 0;
float mostHeight = self.galleryView.bounds.size.height;
for (int i = 0; i < 5; i++) {
    // get image from URL
    UIImage *myImage = [UIImage imageWithData:
                        [NSData dataWithContentsOfURL:
                         [NSURL URLWithString:@"http://4.bp.blogspot.com/-M9k3fhlUwmE/TdYgxL97xMI/AAAAAAAAAY8/_r45zbAm1p0/s1600/CARTOON_Cat-full.jpg"]]];
    myImage = [[MyMath sharedMySingleton] imageWithImage:myImage convertToSize:CGSizeMake(mostHeight - 10, mostHeight - 10)];
    // set image
    UIImageView *rview = [[UIImageView alloc] initWithImage:myImage];
    /* adding the view to your scrollview */
    [self.galleryView addSubview:rview];
    xAxis += myImage.size.width + 20;
}
// set scrolling area
self.galleryView.contentSize = CGSizeMake(xAxis, mostHeight);
galleryView is a UIScrollView from the xib. Here I am trying to fill the gallery with five copies of the same picture, but it shows only one, followed by empty space (because of the xAxis value). What did I do wrong?
P.S. I am new to the iPhone SDK, so don't judge me.
You never set the frame of the UIImageView. You need to offset the x coordinate of the frame's origin (frame.origin.x) by the image view's width plus padding.
You forgot to set the frame of your imageView with the x-offset xAxis.
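Concretely, one extra line inside the loop fixes it (a sketch; the 5pt top margin is an arbitrary choice):

UIImageView *rview = [[UIImageView alloc] initWithImage:myImage];
// Place each image view at the running x offset; without this, every view sits at (0, 0).
rview.frame = CGRectMake(xAxis, 5, myImage.size.width, myImage.size.height);
[self.galleryView addSubview:rview];
xAxis += myImage.size.width + 20;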
I have a problem displaying small (16x16) UIImages: they appear a bit blurry. These UIImages are favicons I download from different websites, and compared with images in other apps, mine are blurrier. I'm displaying them in a custom UITableViewCell like below:
NSData *favicon = [NSData dataWithContentsOfURL:[NSURL URLWithString:[subscription faviconURL]]];
if ([favicon length] > 0) {
    UIImage *img = [[UIImage alloc] initWithData:favicon];
    CGSize size;
    size.height = 16;
    size.width = 16;
    UIGraphicsBeginImageContext(size);
    [img drawInRect:CGRectMake(0, 0, size.width, size.height)];
    UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    cell.mainPicture.image = scaledImage;
}
Is there anything wrong with my custom UITableViewCell or with the way I download the image?
Thank you.
[EDIT 1]: By the way, .ico and .png files look the same.
[EDIT 2]: I'm working on an iPad 2, so no Retina display.
When you display the resulting UIImage to the user, are you aligning the view on pixel boundaries?

theImage.frame = CGRectIntegral(theImage.frame);

Most of your graphics and text will appear blurry if your views are positioned or sized with non-integral values. If you run your app in the Simulator, you can turn on "Color Misaligned Images" to highlight elements with bad offsets.
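Separately, if this code ever runs on a Retina screen, UIGraphicsBeginImageContext renders at a scale of 1.0 and will also produce soft thumbnails. The scale-aware variant of the resize from the question (a sketch) is:

// Passing 0.0 as the scale uses the device's main screen scale (2.0 on Retina).
UIGraphicsBeginImageContextWithOptions(CGSizeMake(16, 16), NO, 0.0);
[img drawInRect:CGRectMake(0, 0, 16, 16)];
UIImage *scaledImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();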