I'm working on a problem where I have to download around 10 different large files in a queue, and I need to display a progress bar indicating the status of the total transfer. I have this working just fine with ASIHTTPRequest in iOS4, but I'm trying to transition to AFNetworking since ASIHTTPRequest has issues in iOS5 and is no longer maintained.
I know you can report progress on individual requests using AFHTTPRequestOperation's downloadProgressBlock, but I can't seem to find a way to report overall progress of multiple requests that would be executed on the same NSOperationQueue.
Any suggestions? Thanks!
[operation setUploadProgressBlock:^(NSInteger bytesWritten, NSInteger totalBytesWritten, NSInteger totalBytesExpectedToWrite) {
NSLog(#"Sent %d of %d bytes", totalBytesWritten, totalBytesExpectedToWrite);
}];
Here, operation is an AFHTTPRequestOperation.
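Since the queue in question is downloading rather than uploading, the equivalent download-side block (AFNetworking 1.x-era API, to the best of my knowledge) has the same shape:

// Per-operation download progress; totalBytesExpectedToRead may be negative
// when the server does not report a Content-Length.
[operation setDownloadProgressBlock:^(NSUInteger bytesRead, long long totalBytesRead, long long totalBytesExpectedToRead) {
    NSLog(@"Received %lld of %lld bytes", totalBytesRead, totalBytesExpectedToRead);
}];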
You can subclass AFURLConnectionOperation to add two new properties, totalBytesSent and totalBytesExpectedToSend (both NSInteger). Set these properties in the NSURLConnection delegate callback like so:
- (void)connection:(NSURLConnection *)__unused connection
didSendBodyData:(NSInteger)bytesWritten
totalBytesWritten:(NSInteger)totalBytesWritten
totalBytesExpectedToWrite:(NSInteger)totalBytesExpectedToWrite
{
[super connection: connection didSendBodyData:bytesWritten totalBytesWritten:totalBytesWritten totalBytesExpectedToWrite:totalBytesExpectedToWrite];
self.totalBytesSent = totalBytesWritten;
self.totalBytesExpectedToSend = totalBytesExpectedToWrite;
}
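For reference, the subclass's interface might be declared along these lines (a sketch; the class name MyProgressOperation is hypothetical):

@interface MyProgressOperation : AFURLConnectionOperation
// Exposed so a queue-wide progress block can sum progress across all operations.
@property (nonatomic, assign) NSInteger totalBytesSent;
@property (nonatomic, assign) NSInteger totalBytesExpectedToSend;
@end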
Your uploadProgress block may look like this:
[operation setUploadProgressBlock:^(NSInteger bytesWritten, NSInteger totalBytesWritten, NSInteger totalBytesExpectedToWrite) {
    NSInteger queueTotalExpected = 0;
    NSInteger queueTotalSent = 0;
    // NSOperationQueue itself is not enumerable; iterate its operations array.
    for (AFURLConnectionOperation *operation in self.operationQueue.operations) {
        queueTotalExpected += operation.totalBytesExpectedToSend;
        queueTotalSent += operation.totalBytesSent;
    }
    self.totalProgress = (double)queueTotalSent / (double)queueTotalExpected;
}];
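Putting it together, the enqueueing code might look roughly like this (a sketch only: MyProgressOperation is the hypothetical subclass above, requests is an assumed array of NSURLRequests, self.operationQueue and self.totalProgress are assumed properties of the caller, and ARC is assumed for the __weak reference):

// Attach the same queue-wide progress block to every operation before enqueueing it.
for (NSURLRequest *request in requests) {
    MyProgressOperation *operation = [[MyProgressOperation alloc] initWithRequest:request];
    __weak typeof(self) weakSelf = self;
    [operation setUploadProgressBlock:^(NSInteger bytesWritten, NSInteger totalBytesWritten, NSInteger totalBytesExpectedToWrite) {
        NSInteger queueTotalExpected = 0;
        NSInteger queueTotalSent = 0;
        for (MyProgressOperation *op in weakSelf.operationQueue.operations) {
            queueTotalExpected += op.totalBytesExpectedToSend;
            queueTotalSent += op.totalBytesSent;
        }
        if (queueTotalExpected > 0) {
            weakSelf.totalProgress = (double)queueTotalSent / (double)queueTotalExpected;
        }
    }];
    [self.operationQueue addOperation:operation];
}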
I would try subclassing UIProgressView with a subclass that keeps track of all the different items you are watching and then has logic that adds the progress of them all together.
With code like this perhaps:
@implementation customUIProgressView

-(void) updateItem:(int) itemNum ToPercent:(NSNumber *) percentDoneOnItem {
    // NSMutableArray has no itemAtIndexPath:; replace the stored percentage for this item.
    [self.progressQueue replaceObjectAtIndex:itemNum withObject:percentDoneOnItem];
    [self updateProgress];
}

-(void) updateProgress {
    float tempProgress = 0;
    for (NSUInteger i = 0; i < [self.progressQueue count]; i++) {
        tempProgress += [[self.progressQueue objectAtIndex:i] floatValue];
    }
    self.progress = tempProgress / [self.progressQueue count];
}

@end
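The matching interface might look like this (a sketch; progressQueue is assumed to be an NSMutableArray of NSNumber values in the 0.0–1.0 range, one slot per download, pre-filled with zero entries):

@interface customUIProgressView : UIProgressView
// One NSNumber (0.0 - 1.0) per tracked download.
@property (nonatomic, strong) NSMutableArray *progressQueue;
- (void)updateItem:(int)itemNum ToPercent:(NSNumber *)percentDoneOnItem;
@end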
I have a Metal-based application that uses AVFoundation for movie playback and seeking. For now I am only processing .mov files and nothing else, and the app in question will not process any other format. While it has worked correctly in the past, I recently received feedback from some M1 users that only black frames show up in the app, regardless of where they set the seek bar.
I have performed the following troubleshooting in my attempt to find the root cause of this black-texture bug:
Verified that the video being processed is .mov movie file type.
Verified that the CVPixelBufferRef object returned from AVPlayerItemVideoOutput's -copyPixelBufferForItemTime: is valid, i.e. not nil.
Verified that the MTLTexture created from the CVPixelBufferRef is also valid, i.e. also not nil.
Converted the MTLTexture to a bitmap and saved it as a .JPEG image to the user's disk.
The last step is probably the most important one here: for users experiencing the bug, the saved images are also all black when viewed in Finder, which led me to suspect that I might be using AVFoundation incorrectly. I hope my opening post is not too heavy on code; below, for reference, are the steps I perform to process videos for rendering with Metal.
Inside my reader class, I have the following properties:
VideoReaderClass.m
@property (retain) AVPlayer *vidPlayer;
@property (retain) AVPlayerItem *vidPlayerItem;
@property (retain) AVPlayerItemVideoOutput *playerItemVideoOutput;
@property (retain) AVMutableComposition *videoMutableComp;
@property (assign) AVPlayerItemStatus playerItemStatus;
// The frame duration of the composition, as well as the media being processed.
@property (assign) CMTime frameDuration;
// Tracks the time at which to insert new footage.
@property (assign) CMTime startInsertTime;
// Weak reference to a delegate responsible for processing seeks.
@property (weak) id<VideoReaderRenderer> vidReaderRenderer;
A method called before reading starts. Handles observer cleanup as well.
- (void)initializeReadMediaPipeline
{
[_lock lock];
_startInsertTime = kCMTimeZero;
_playerItemStatus = AVPlayerItemStatusUnknown;
if(_videoMutableComp)
{
[_videoMutableComp release];
}
_videoMutableComp = [AVMutableComposition composition];
[_videoMutableComp retain];
if(_vidPlayer)
{
[[_vidPlayer currentItem] removeObserver:self
forKeyPath:@"status"
context:MyAddAVPlayerItemKVOContext];
[[_vidPlayer currentItem] removeObserver:self
forKeyPath:@"playbackBufferFull"
context:MyAVPlayerItemBufferFullKVOContext];
[[_vidPlayer currentItem] removeObserver:self
forKeyPath:@"playbackBufferEmpty"
context:MyAVPlayerItemBufferEmptyKVOContext];
[[_vidPlayer currentItem] removeObserver:self
forKeyPath:@"playbackLikelyToKeepUp"
context:MyAVPlayerItemBufferKeepUpKVOContext];
[_vidPlayer release];
_vidPlayer = nil;
}
[_lock unlock];
}
This class also has a public method, meant to be called on a background queue, for when the app should begin processing a video or a set of videos.
- (BOOL)addMediaAtLocation:(NSURL *)location
{
BOOL result = NO;
[_lock lock];
NSDictionary* optsInfo =
@{
AVURLAssetPreferPreciseDurationAndTimingKey : @(YES)
};
AVURLAsset* assetURL = [AVURLAsset URLAssetWithURL:location options:optsInfo];
AVAssetTrack* assetTrack = [assetURL tracksWithMediaType:AVMediaTypeVideo].firstObject;
NSError* error = nil;
[_videoMutableComp insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetTrack.timeRange.duration)
ofAsset:assetURL
atTime:_startInsertTime
error:&error];
if(!error)
{
[_vidPlayerItem addObserver:self
forKeyPath:@"status"
options:NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew
context:MyAddAVPlayerItemKVOContext];
[_vidPlayerItem addObserver:self
forKeyPath:@"playbackBufferFull"
options:NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew
context:MyAVPlayerItemBufferFullKVOContext];
[_vidPlayerItem addObserver:self
forKeyPath:@"playbackBufferEmpty"
options:NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew
context:MyAVPlayerItemBufferEmptyKVOContext];
[_vidPlayerItem addObserver:self
forKeyPath:@"playbackLikelyToKeepUp"
options:NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew
context:MyAVPlayerItemBufferKeepUpKVOContext];
[_vidPlayer replaceCurrentItemWithPlayerItem:_vidPlayerItem];
}
_vidPlayer = [[AVPlayer alloc] init];
if(_playerItemVideoOutput)
{
[_playerItemVideoOutput release];
_playerItemVideoOutput = nil;
}
[_lock unlock];
return result;
}
Called externally by our view controller when seeking is needed.
- (void)seekToFrame:(CMTime)time
{
if(_vidReaderRenderer)
{
if(_vidPlayerItem && _playerItemVideoOutput)
{
[_vidPlayerItem seekToTime:time
toleranceBefore:kCMTimeZero
toleranceAfter:kCMTimeZero
completionHandler:^(BOOL finished){
if(finished)
{
CVPixelBufferRef p_buffer = [_playerItemVideoOutput copyPixelBufferForItemTime:time itemTimeForDisplay:nil];
if(p_buffer)
{
[_vidReaderRenderer seekOperationFinished:p_buffer];
}
}
}];
}
}
}
Lastly, this is where I handle the KVO notification when the AVPlayerItem's status changes to ReadyToPlay.
- (void)observeValueForKeyPath:(NSString *)keyPath
ofObject:(id)object
change:(NSDictionary<NSKeyValueChangeKey,id> *)change
context:(void *)context
{
// ... Checking for right contexts here, omitting for this example.
if([keyPath isEqualToString:@"status"])
{
AVPlayerItemStatus status = AVPlayerItemStatusUnknown;
// Get the status change from the change dictionary
NSNumber *statusNumber = change[NSKeyValueChangeNewKey];
if([statusNumber isKindOfClass:[NSNumber class]])
{
status = statusNumber.integerValue;
}
// Switch over the status
switch(status)
{
case AVPlayerItemStatusReadyToPlay:
{
// Ready to Play
if(_vidPlayerItem)
{
[_lock lock];
NSDictionary* attribs =
@{
(NSString*)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_64RGBAHalf)
};
_playerItemVideoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attribs];
[_vidPlayerItem addOutput:_playerItemVideoOutput];
[_vidPlayer setRate:0.0];
_playerItemStatus = status;
[_lock unlock];
// "Wake up" the AVPlayer/AVPlayerItem here.
[self seekToFrame:kCMTimeZero];
}
break;
}
}
}
}
The code listing below is from the class that also acts as the delegate for a custom protocol called VideoReaderRenderer, which handles the seekToTime: completion block as well as converting the pixel buffer to an MTLTexture:
RendererDelegate.m
At some point in its initialization and before performing any seek operations, I instantiate the CVMetalTextureCacheRef.
CVReturn ret_val = CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, _mtlDevice, nil, &_metalTextureCache);
The method that my VideoReader class calls inside seekToTime:'s completion block:
-(void)seekOperationFinished:(CVPixelBufferRef)pixelBuffer
{
CVMetalTextureRef mtl_tex = NULL;
size_t w = CVPixelBufferGetWidth(pixelBuffer);
size_t h = CVPixelBufferGetHeight(pixelBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
CVReturn ret_val = CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
_metalTextureCache,
pixelBuffer,
nil,
MTLPixelFormatRGBA16Float,
w,
h,
0,
&mtl_tex);
CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
if(ret_val != kCVReturnSuccess)
{
if(mtl_tex != NULL)
{
CVBufferRelease(mtl_tex);
CVPixelBufferRelease(pixelBuffer);
}
return;
}
id<MTLTexture> inputTexture = CVMetalTextureGetTexture(mtl_tex);
if(!inputTexture)
return;
NSSize texSize = NSMakeSize(inputTexture.width, inputTexture.height);
_viewPortSize = (simd_uint2){(uint)texSize.width, (uint)texSize.height};
// Create the texture here.
[_textureLock lock];
if(NSEqualSizes(_projectSize, texSize))
{
if(!_inputFrameTex)
{
MTLTextureDescriptor* texDescriptor = [MTLTextureDescriptor new];
texDescriptor.width = texSize.width;
texDescriptor.height = texSize.height;
texDescriptor.pixelFormat = MTLPixelFormatRGBA16Float;
texDescriptor.usage = MTLTextureUsageShaderWrite | MTLTextureUsageShaderRead;
_inputFrameTex = [_mtlDevice newTextureWithDescriptor:texDescriptor];
}
id<MTLCommandBuffer> commandBuffer = [_drawCopyCommandQueue commandBuffer];
id<MTLBlitCommandEncoder> blitEncoder = [commandBuffer blitCommandEncoder];
[blitEncoder copyFromTexture:inputTexture
sourceSlice:0
sourceLevel:0
sourceOrigin:MTLOriginMake(0, 0, 0)
sourceSize:MTLSizeMake(inputTexture.width, inputTexture.height, 1)
toTexture:_inputFrameTex
destinationSlice:0
destinationLevel:0
destinationOrigin:MTLOriginMake(0, 0, 0)];
[blitEncoder endEncoding];
[commandBuffer commit];
[commandBuffer waitUntilCompleted];
}
[_textureLock unlock];
CVBufferRelease(mtl_tex);
CVPixelBufferRelease(pixelBuffer);
// Added "trouble-shooting" code to save the contents of the MTLTexture as a JPEG onto the user's disk.
if(_inputFrameTex)
{
// ... Save contents of texture to disk as JPEG.
}
}
Once again, my apologies for the rather long post. Additional details: the rendering is displayed on a custom NSView backed by a CAMetalLayer, whose draw calls are driven by a CVDisplayLink for background rendering, though I don't think that is the source of seekToTime: returning black frames. Could anyone shed some light on my situation? Thank you very much in advance.
In my ARC iOS app I am running a for loop that ends up with a large memory allocation overhead. I want to end the loop with minimal or no extra memory still allocated. In this instance I am using the SSKeychain library, which lets me fetch things from the keychain. I usually just use autorelease pools and get my memory released properly, but here I don't know what is wrong, because I end up with 70 MB+ of memory allocated at the end of the loop. I have been told that I should start/end a run loop to properly deal with this. Thoughts?
for (int i = 0; i < 10000; ++i) {
@autoreleasepool {
NSError * error2 = nil;
SSKeychainQuery* query2 = [[SSKeychainQuery alloc] init];
query2.service = @"Eko";
query2.account = @"loginPINForAccountID-2";
query2.password = nil;
[query2 fetch:&error2];
}
}
What are you using to measure memory usage?
Results of a very simple test...
Running in the simulator, measure only resident memory before and after.
Without autoreleasepool...
Started with 27254784, ended with 30212096, used 2957312
With autoreleasepool...
Started with 27316224, ended with 27443200, used 126976
Obviously, the autoreleasepool is keeping memory from growing too much, and I don't see anything close to 70 MB being used under any circumstances.
You should run instruments and get some good readings on the behavior.
Here is the code I hacked and ran...
The memchecker
static NSUInteger available_memory(void) {
NSUInteger result = 0;
struct task_basic_info info;
mach_msg_type_number_t size = sizeof(info);
if (task_info(mach_task_self(), TASK_BASIC_INFO, (task_info_t)&info, &size) == KERN_SUCCESS) {
result = info.resident_size;
}
return result;
}
And the code...
#define USE_AUTORELEASE_POOL 1
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
dispatch_async(dispatch_get_main_queue(), ^{
NSUInteger beginMemory = available_memory();
for (int i = 0; i < 10000; ++i) {
#ifdef USE_AUTORELEASE_POOL
@autoreleasepool
#endif
{
NSError * error2 = nil;
SSKeychainQuery* query2 = [[SSKeychainQuery alloc] init];
query2.service = @"Eko";
query2.account = @"loginPINForAccountID-2";
query2.password = nil;
[query2 fetch:&error2];
}
}
NSUInteger endMemory = available_memory();
NSLog(#"Started with %u, ended with %u, used %u", beginMemory, endMemory, endMemory-beginMemory);
});
return YES;
}
First of all, I'm an Objective-C novice. So I'm not very familiar with OS X or iOS development. My experience is mostly in Java.
I'm creating an agent-based modeling framework. I'd like to display the simulations, and to do that I'm writing a little application. First, a little bit about the framework. The framework has a World class with a start method, which iterates over all agents and has them perform their tasks. At the end of one "step" of the world (i.e., after all the agents have done their thing), the start method calls the intercept method of an object that implements InterceptorProtocol. This object was previously passed in via the constructor. Using the interceptor, anyone can get a hook into the state of the world. This is useful for logging, or in the scenario that I'm trying to accomplish: displaying the information graphically. The call to intercept is synchronous.
Now as far as the GUI app is concerned, it is pretty simple. I have a controller that initializes a custom view. This custom view also implements InterceptorProtocol so that it can listen in, to what happens in the world. I create a World object and pass in the view as an interceptor. The view maintains a reference to the world through a private property and so once I have initialized the world, I set the view's world property to the world I have just created (I realize that this creates a cycle, but I need a reference to the world in the drawRect method of the view and the only way I can have it is if I maintain a reference to it from the class).
Since the world's start method is synchronous, I don't start the world up immediately. In the drawRect method I check to see if the world is running. If it is not, I start it up in a background thread. If it is, I examine the world and display all the graphics that I need to.
In the intercept method (which gets called from start, running on the background thread), I call setNeedsDisplay: with YES. Since the world's start method runs on a separate thread, I also have a lock object that I use to synchronize, so that I'm not reading the World object while it's being mutated (this part is kind of janky and probably not working the way I expect; there are more than a few rough spots, and I'm simply trying to get a little bit working, with cleanup planned for later).
My problem is that the view renders some stuff, and then it pretty much locks up. I can see that the NSLog statements are being called and so the code is running, but nothing is getting updated on the view.
Here's some of the pertinent code:
MasterViewController
#import "MasterViewController.h"
#import "World.h"
#import "InfectableBug.h"
@interface MasterViewController ()
@end
@implementation MasterViewController
- (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil
{
self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil];
if (self) {
_worldView = [[WorldView alloc] init];
World* world = [[World alloc] initWithName: @"Bhumi"
rows: 100
columns: 100
iterations: 2000
snapshotInterval: 1
interceptor: _worldView];
for(int i = 0; i < 999; i++) {
NSMutableString* name = [NSMutableString stringWithString: @"HealthyBug"];
[name appendString: [[NSNumber numberWithInt: i] stringValue]];
[world addBug: [[InfectableBug alloc] initWithWorld: world
name: name
layer: #"FirstLayer"
infected: NO
infectionRadius: 1
incubationPeriod: 10
infectionStartIteration: 0]];
}
NSLog(#"Added all bugs. Going to add infected");
[world addBug: [[InfectableBug alloc] initWithWorld: world
name: #"InfectedBug"
layer: #"FirstLayer"
infected: YES
infectionRadius: 1
incubationPeriod: 10
infectionStartIteration: 0]];
[_worldView setWorld: world];
//[world start];
}
return self;
}
- (NSView*) view {
return self.worldView;
}
@end
WorldView
#import "WorldView.h"
#import "World.h"
#import "InfectableBug.h"
@implementation WorldView
@synthesize world;
- (id) initWithFrame:(NSRect) frame {
self = [super initWithFrame:frame];
if (self) {
// Initialization code here.
}
return self;
}
- (void) drawRect:(NSRect) dirtyRect {
CGContextRef myContext = [[NSGraphicsContext currentContext] graphicsPort];
CGContextClearRect(myContext, CGRectMake(0, 0, 1024, 768));
NSUInteger rows = [world rows];
NSUInteger columns = [world columns];
NSUInteger cellWidth = 1024 / columns;
NSUInteger cellHeight = 768 / rows;
if([world running]) {
@synchronized (_lock) {
//Ideally we would need layers, but for now let's just get this to display
NSArray* bugs = [world bugs];
NSEnumerator* enumerator = [bugs objectEnumerator];
InfectableBug* bug;
while ((bug = [enumerator nextObject])) {
if([bug infected] == YES) {
CGContextSetRGBFillColor(myContext, 128, 0, 0, 1);
} else {
CGContextSetRGBFillColor(myContext, 0, 0, 128, 1);
}
NSLog(#"Drawing bug %# at %lu, %lu with width %lu and height %lu", [bug name], [bug x] * cellWidth, [bug y] * cellHeight, cellWidth, cellHeight);
CGContextFillRect(myContext, CGRectMake([bug x] * cellWidth, [bug y] * cellHeight, cellWidth, cellHeight));
}
}
} else {
[world performSelectorInBackground: @selector(start) withObject: nil];
}
}
- (BOOL) isFlipped {
return YES;
}
- (void) intercept: (World *) aWorld {
struct timespec time;
time.tv_sec = 0;
time.tv_nsec = 500000000L;
//nanosleep(&time, NULL);
@synchronized (_lock) {
[self setNeedsDisplay: YES];
}
}
@end
start method in World.m:
- (void) start {
running = YES;
while(currentIteration < iterations) {
@autoreleasepool {
[bugs shuffle];
NSEnumerator* bugEnumerator = [bugs objectEnumerator];
Bug* bug;
while((bug = [bugEnumerator nextObject])) {
NSString* originalLayer = [bug layer];
NSUInteger originalX = [bug x];
NSUInteger originalY = [bug y];
//NSLog(#"Bug %# is going to act and location %i:%i is %#", [bug name], [bug x], [bug y], [self isOccupied: [bug layer] x: [bug x] y: [bug y]] ? #"occupied" : #"not occupied");
[bug act];
//NSLog(#"Bug has acted");
if(![originalLayer isEqualToString: [bug layer]] || originalX != [bug x] || originalY != [bug y]) {
//NSLog(#"Bug has moved");
[self moveBugFrom: originalLayer atX: originalX atY: originalY toLayer: [bug layer] atX: [bug x] atY: [bug y]];
//NSLog(#"Updated bug position");
}
}
if(currentIteration % snapshotInterval == 0) {
[interceptor intercept: self];
}
currentIteration++;
}
}
//NSLog(#"Done.");
}
Please let me know if you'd like to see any other code. I realize that the code is not pretty; I was just trying to get stuff to work and I plan on cleaning it up later. Also, if I'm violating an Objective-C best practices, please let me know!
Stepping out for a bit; sorry if I don't respond immediately!
Whew, quite a question for probably a simple answer. ;)
UI updates have to be performed on the main thread
If I read your code correctly, you call the start method on a background thread. The start method contains stuff like moveBugFrom:... and also the intercept: method. The intercept method thus calls setNeedsDisplay: on a background thread.
Perform all UI-related work on the main thread. Your best bet is to use Grand Central Dispatch, unless you need to support iOS < 4 or OS X < 10.6 (or was it 10.7?), like this:
dispatch_async(dispatch_get_main_queue(), ^{
// perform UI updates
});
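Applied to the code above, intercept: might end up looking like this (a sketch that keeps the existing _lock but moves the UI call onto the main queue):

- (void) intercept: (World *) aWorld {
    @synchronized (_lock) {
        // Copy or snapshot whatever state drawRect: needs while the world is paused here.
    }
    // setNeedsDisplay: is UI work, so hop over to the main thread for it.
    dispatch_async(dispatch_get_main_queue(), ^{
        [self setNeedsDisplay: YES];
    });
}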
I am developing an application for the iPhone. My question is how to display a label with different text every 0.5 seconds. For example, it would display Blue, Red, Green, Orange and Purple, one right after another. Right now I am doing this:
results = aDictionary;
NSArray *myKeys = [results allKeys];
NSArray *sortedKeys = [myKeys sortedArrayUsingSelector:@selector(caseInsensitiveCompare:)];
int keyCount = [sortedKeys count];
while (flag == NO) {
NSTimeInterval timeMS = [startDate timeIntervalSinceNow] * -10000.0;
if (timeMS >= i) {
ii++;
i += 1000;
NSLog(#"endDate = %f", timeMS);
int randomNumber = rand() % keyCount + 1;
lblResult.text = [results valueForKey:[sortedKeys objectAtIndex:(randomNumber - 1)]];
result = [results valueForKey:[sortedKeys objectAtIndex:(randomNumber - 1)]];
lblResult.text = result;
}
if (ii > 25) {
flag = YES;
}
}
lblResult.text = [results valueForKey:[sortedKeys objectAtIndex:(sortedKeys.count - 1)]];
This code is called from viewDidAppear: and currently isn't displaying the intermediate labels; it only displays the one at the end. Am I doing anything wrong? What would be the best way to approach this?
The problem is that you're not giving the run loop a chance to run (and therefore, drawing to happen). You'll want to use an NSTimer that fires periodically and sets the next text (you could remember in an instance variable where you currently are).
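If you go the NSTimer route, a minimal sketch might look like this (colorTimer, items and currentIndex are hypothetical properties/ivars; schedule the timer from viewDidAppear:):

// Fires every 0.5 s on the main run loop.
self.colorTimer = [NSTimer scheduledTimerWithTimeInterval:0.5
                                                   target:self
                                                 selector:@selector(showNextColor:)
                                                 userInfo:nil
                                                  repeats:YES];

// Advances through the strings and stops at the end.
- (void)showNextColor:(NSTimer *)timer
{
    lblResult.text = [items objectAtIndex:currentIndex];
    currentIndex++;
    if (currentIndex >= [items count]) {
        [timer invalidate];   // or reset currentIndex to 0 to keep looping
    }
}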
Or use something like this (assuming that items is an NSArray holding your strings):
- (void)updateText:(NSNumber *)num
{
NSUInteger index = [num unsignedIntegerValue];
[label setText:[items objectAtIndex:index]];
index++;
// to loop, add
// if (index == [items count]) { index = 0; }
if (index < [items count]) {
[self performSelector:@selector(updateText:) withObject:[NSNumber numberWithUnsignedInteger:index] afterDelay:0.5];
}
}
At the beginning (e.g. in viewDidAppear:), you could then call
[self updateText:[NSNumber numberWithUnsignedInteger:0]];
to trigger the initial update.
You'd of course need to ensure that the performs are not continuing when your view disappears; you could do this by cancelling the performSelector, or, if you're using a timer, by simply invalidating it, or by using a boolean, or ...
And if you want to get really fancy, use GCD :)
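For example, the same half-second delay can be expressed with dispatch_after (a sketch; items and index are the same assumptions as above, with index being the next position to show):

// Schedule the next text update 0.5 seconds out, on the main queue.
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.5 * NSEC_PER_SEC)),
               dispatch_get_main_queue(), ^{
    [self updateText:[NSNumber numberWithUnsignedInteger:index]];
});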
My first question on Stack Overflow.
Let me start with a bit of code. It's a bit repetitive, so I'm going to cut out the parts I repeat for different arrays (feel free to ask for the others). However, please don't focus on the code at the expense of the questions at the bottom. Firstly: thank you to answerers in advance. Secondly: the freeing of data.
@implementation ES1Renderer
GLfloat **helixVertices;
GLushort **helixIndices;
GLubyte **helixColors;
- (void)freeEverything
{
if (helixVertices != NULL)
{
for (int i=0; i < alphasToFree / 30 + 1; i++)
free(helixVertices[i]);
free(helixVertices);
}
if (helixIndices != NULL)
{
for (int i=0; i < alphasToFree / 30 + 1; i++)
free(helixIndices[i]);
free(helixIndices);
}
if (helixColors != NULL)
{
for (int i=0; i < alphasToFree / 30 + 1; i++)
free(helixColors[i]);
free(helixColors);
}
}
(I will get to the calling of this in a moment). Now for where I malloc() the arrays.
- (void)askForVertexInformation
{
int nrows = self.helper.numberOfAtoms / 300;
int mrows = [self.helper.bonds count] / 300;
int alphaCarbonRows = [self.helper.alphaCarbons count] / 30;
helixVertices = malloc(alphaCarbonRows * sizeof(GLfloat *) + 1);
helixIndices = malloc(alphaCarbonRows * sizeof(GLfloat *) + 1);
helixColors = malloc(alphaCarbonRows * sizeof(GLfloat *) + 1);
for (int i=0; i < alphaCarbonRows + 1; i++)
{
helixVertices[i] = malloc(sizeof(helixVertices) * HELIX_VERTEX_COUNT * 3 * 33);
helixIndices[i] = malloc(sizeof(helixIndices) * HELIX_INDEX_COUNT * 2 * 3 * 33);
helixColors[i] = malloc(sizeof(helixColors) * HELIX_VERTEX_COUNT * 4 * 33);
}
[self.helper recolourVerticesInAtomRange:NSMakeRange(0, [self.helper.alphaCarbons count]) withColouringType:CMolColouringTypeCartoonBlue forMasterColorArray:helixColors forNumberOfVertices:HELIX_VERTEX_COUNT difference:30];
self.atomsToFree = self.helper.numberOfAtoms;
self.bondsToFree = [self.helper.bonds count];
self.alphasToFree = [self.helper.alphaCarbons count];
}
Finally, the bit which calls everything (this is a separate class.)
- (void)loadPDB:(NSString *)pdbToLoad
{
if (!self.loading)
{
[self performSelectorOnMainThread:@selector(stopAnimation) withObject:nil waitUntilDone:YES];
[self.renderer freeEverything];
[renderer release];
ES1Renderer *newRenderer = [[ES1Renderer alloc] init];
renderer = [newRenderer retain];
[self performSelectorOnMainThread:@selector(stopAnimation) withObject:nil waitUntilDone:YES]; // need to stop the new renderer animating too!
[self.renderer setDelegate:self];
[self.renderer setupCamera];
self.renderer.pdb = nil;
[renderer resizeFromLayer:(CAEAGLLayer*)self.layer];
[newRenderer release];
NSInvocationOperation *invocationOperation = [[NSInvocationOperation alloc] initWithTarget:self selector:@selector(setup:) object:pdbToLoad];
[self.queue addOperation:invocationOperation];
[invocationOperation release];
}
}
- (void)setup:(NSString *)pdbToLoad
{
self.loading = YES;
[helper release];
[renderer.helper release];
PDBHelper *aHelper = [[PDBHelper alloc] initWithContentsOfFile:pdbToLoad];
helper = [aHelper retain];
renderer.helper = [aHelper retain];
[aHelper release];
if (!resized)
{
[self.helper resizeVertices:11];
resized = YES;
}
self.renderer.helper = self.helper;
[self.helper setUpAtoms];
[self.helper setUpBonds];
if (self.helper.numberOfAtoms > 0)
[self.renderer askForVertexInformation];
else
{
// LOG ME PLEASE.
}
[self performSelectorOnMainThread:@selector(removeProgressBar) withObject:nil waitUntilDone:YES];
[self performSelectorOnMainThread:@selector(startAnimation) withObject:nil waitUntilDone:YES];
self.renderer.pdb = pdbToLoad;
self.loading = NO;
}
What I'm doing here is loading a molecule from a PDB file into memory and displaying it on an OpenGL view window. The second time I load a molecule (which will run loadPDB: above) I get the Giant Triangle Syndrome and Related Effects... I will see large triangles over my molecule.
However, I am releasing and reallocating my PDBHelper and ES1Renderer every time I load a new molecule. Hence I was wondering:
1. whether the helixVertices, helixIndices and helixColors which I have declared as class-wide variables are actually re-used in this instance. Do they point to the same objects?
2. Should I be setting all my variables to NULL after freeing? I plan to do this anyway, to pick up any bugs by getting a segfault, but haven't got round to incorporating it.
3. Am I even right to malloc() a class variable? Is there a better way of achieving this? I have no other known way of giving this information to the renderer otherwise.
I can't answer your general questions. There's too much stuff in there. However, this caught my eye:
[helper release];
[renderer.helper release];
PDBHelper *aHelper = [[PDBHelper alloc] initWithContentsOfFile:pdbToLoad];
helper = [aHelper retain];
renderer.helper = [aHelper retain];
[aHelper release];
I think this stuff possibly leaks. It doesn't make sense anyway.
If renderer.helper is a retain or copy property, do not release it. It already has code that releases old values when it is assigned new values. Also do not retain objects you assign to it.
You have alloc'd aHelper, so there's no need to retain it again. The above code should be rewritten something like:
[helper release];
helper = [[PDBHelper alloc] initWithContentsOfFile:pdbToLoad];
renderer.helper = helper;
Also, I think your helix malloced arrays should probably be instance variables. As things stand, if you have more than one ES1Renderer, they are sharing those variables.
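As a sketch of that last point, the arrays could be moved into the class's @interface so that each renderer instance owns its own copies (the superclass and the exact ivar list here are assumptions based on the code shown in the question):

@interface ES1Renderer : NSObject
{
    // Per-instance geometry buffers instead of file-scope globals.
    GLfloat  **helixVertices;
    GLushort **helixIndices;
    GLubyte  **helixColors;
    NSUInteger alphasToFree;
}
- (void)askForVertexInformation;
- (void)freeEverything;
@end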