Can I get (directly from Game Center) a total (sum) of all scores uploaded to an Apple Game Center leaderboard?

For a single gamecenter leaderboard I want to get a total of all scores uploaded within the last week.
Currently I query the leaderboard with a start of 0 and range of 100, fetch the scores, add those scores to a running total, fetch another 100 with a new start at 100, add those to the running total; keep doing this until gamecenter returns no scores.
This works, but it is not efficient: the multiple fetches to the leaderboard take quite a bit of time. I am hoping there is a total stored somewhere on the Game Center server that I can access directly.
The following is one call to Game Center, totaling all of the returned scores:
NSInteger scorePercent = 0;
localLeaderboard.range = NSMakeRange(rangestart, 100);
[localLeaderboard loadScoresWithCompletionHandler:^(NSArray *scores, NSError *error) {
    if (error != nil) {
        NSLog(@"leaderboard loadScores returned error = %@", error);
        // handle the error
    }
    if (scores != nil) {
        // process the score information
        NSInteger numScores = [scores count]; // number of leaderboard entries returned
        for (NSInteger nscores = 0; nscores < numScores; nscores++) {
            scorePercent += ((GKScore *)scores[nscores]).value;
        }
        scoreTotal += scorePercent; // aggregate for later percentage calculations
    }
}];
Current method works but is not efficient.
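For reference, the paging logic boils down to: fetch fixed-size pages until an empty one comes back, adding each page to a running total. A language-neutral sketch in C — `fetch_page` here is a hypothetical stand-in for the `loadScoresWithCompletionHandler` round trip, not a GameKit API:

```c
#include <assert.h>
#include <stddef.h>

// Hypothetical stand-in for one leaderboard fetch: copies up to `range`
// scores starting at `start` into `out`, returns how many were copied.
static size_t fetch_page(const long long *all, size_t total,
                         size_t start, size_t range, long long *out) {
    size_t n = 0;
    while (n < range && start + n < total) {
        out[n] = all[start + n];
        n++;
    }
    return n;
}

// Page through the whole leaderboard, 100 entries at a time,
// summing every score -- the same loop the question describes.
long long sum_all_scores(const long long *all, size_t total) {
    long long sum = 0;
    long long page[100];
    size_t start = 0, got;
    while ((got = fetch_page(all, total, start, 100, page)) > 0) {
        for (size_t i = 0; i < got; i++) sum += page[i];
        start += got; // next fetch begins where this one ended
    }
    return sum;
}
```

Each iteration is a full network round trip, which is exactly why the approach is slow for large leaderboards.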

Related

SpriteKit - Logging the debug info such as fps, NodeCount

I know how to display that info on the screen, but I would like to log it to a file/console for offline investigation. How could I do that?
You can measure it yourself in the scene's - (void)update method, which is called every frame.
- (void)update
{
    self.frameCount++;                           // increase frame count
    uint64_t currentTime = mach_absolute_time(); // get current time
    // from currentTime and the time of the previous update call
    // you can calculate the fps
    self.lastUpdateTick = currentTime;           // remember for the next call
}
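To make the "calculate the fps from these two values" step concrete: once the tick delta has been converted to seconds (via `mach_timebase_info` on iOS), the fps is just its reciprocal. A minimal C sketch using plain second timestamps to stay platform-neutral:

```c
#include <assert.h>

// Given the timestamps (in seconds) of the previous and current
// frames, the instantaneous fps is 1 / delta.
double fps_from_timestamps(double lastUpdate, double now) {
    double delta = now - lastUpdate;
    if (delta <= 0.0) return 0.0; // guard the first frame / clock glitches
    return 1.0 / delta;
}
```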
To get the node count, you just need to count the children of all the scene's children. It can be done with a simple recursive algorithm (not tested):
- (NSUInteger)childrenOf:(SKNode *)node
{
    NSUInteger count = 0;
    for (SKNode *child in node.children) {
        count += [self childrenOf:child] + 1;
    }
    return count;
}

- (void)calculateSceneChildrenCount
{
    NSUInteger count = [self childrenOf:self];
    NSLog(@"count is %lu", (unsigned long)count);
}

Leaderboards not showing all updated scores when sending scores via endMatchInTurnWithMatchData:scores method

I want to implement the ELO rating system for a game. That means that after a game ends, I have to calculate an increase for the winner and a decrease for the loser from their current scores.
I have leaderboard of type "Most Recent Score" to see just the last sent score.
I use loadScoresWithCompletionHandler to load the scores, then do the calculation (for now just adding different values), and then endMatchInTurnWithMatchData:scores:achievements:completionHandler: to end the match and update the scores.
GKTurnBasedParticipant *player1 = [match.participants firstObject];
GKTurnBasedParticipant *player2 = [match.participants lastObject];
GKLeaderboard *leaderboardRequest = [[GKLeaderboard alloc] initWithPlayerIDs:@[player1.playerID, player2.playerID]];
leaderboardRequest.timeScope = GKLeaderboardTimeScopeAllTime;
leaderboardRequest.identifier = LEADERBOARD_ELO_RATING_ID;
[leaderboardRequest loadScoresWithCompletionHandler:^(NSArray *scores, NSError *error) {
    if (error) {
        NSLog(@"%@", error);
        return;
    }
    GKScore *player1Score = [scores firstObject];
    GKScore *player2Score = [scores lastObject];
    float score1 = ((float)player1Score.value) / 1000.0f;
    float score2 = ((float)player2Score.value) / 1000.0f;
    // calculation of new score
    score1 += 10;
    score2 += 1;
    GKScore *player1NewScore = [[GKScore alloc] initWithLeaderboardIdentifier:LEADERBOARD_ELO_RATING_ID forPlayer:player1Score.playerID];
    GKScore *player2NewScore = [[GKScore alloc] initWithLeaderboardIdentifier:LEADERBOARD_ELO_RATING_ID forPlayer:player2Score.playerID];
    player1NewScore.value = (int64_t)(score1 * 1000.0f);
    player2NewScore.value = (int64_t)(score2 * 1000.0f);
    [match endMatchInTurnWithMatchData:[game.board matchData]
                                scores:@[player1NewScore, player2NewScore]
                          achievements:@[]
                     completionHandler:^(NSError *error) {
                         if (error) {
                             // TODO: handle error
                         }
                     }];
}];
Getting the scores and uploading the new scores works fine, but when I view the leaderboard (using GKGameCenterViewController or the Game Center app), only the score of the local player (the participant who ended the match and sent the final data) appears updated. If I do a request with the loadScoresWithCompletionHandler method, I can see that the scores of both players were updated — yet only the local player's is displayed in the leaderboard controller.
Example:
Match started:
Player A - 10 pts
Player B - 10 pts
Match ended (Player A sent these scores using method endMatchInTurnWithMatchData:scores:achievements:completionHandler:):
Player A - 15 pts
Player B - 8 pts
Match ended - loadScoresWithCompletionHandler result shows scores:
Player A - 15 pts
Player B - 8 pts
Match ended - GKGameCenterViewController or GameCenter app shows scores:
Player A - 15 pts
Player B - 10 pts
Why is this happening; am I doing something wrong? Is it because of using the Game Center sandbox? Otherwise, how exactly should I update the scores of both players via endMatchInTurnWithMatchData:scores:achievements:completionHandler:?
I found out that this is probably just an artifact of using the Game Center sandbox.

bad access playing back user recording synchronised with animation based on recording volume with array

I am trying to store an array based on audio input and then play animation frames corresponding to the input while the recording is played back.
The code works so far, except that after a while it crashes in the simulator and highlights
CCLOG(@"adding image: %@", characterImageString);
with this:
EXC_BAD_ACCESS (code=1, address=0xd686be8)
which I know is memory-management related, but I am absolutely stumped.
if (isRecording) {
    int myInt;
    NSString *characterImageString;
    // get a number based on the volume input
    float f = audioMonitorResults * 200; // convert max (0.06) to 12
    f = ((f / 12) * 10);
    NSNumber *myNumber = [NSNumber numberWithDouble:(f + 0.5)];
    myInt = [myNumber intValue] + 1;
    // create the image file name from the integer we
    // created from the audio monitor results
    if (myInt < 10) {
        characterImageString = [NSString stringWithFormat:@"fungus000%i.png", myInt];
    } else if (myInt == 10) {
        characterImageString = [NSString stringWithFormat:@"fungus00%i.png", myInt];
    }
    CCLOG(@"adding image: %@", characterImageString);
    // add each frame
    [animationSequence addObject:characterImageString];
    // print array contents
    NSLog(@"animationSequence Array: %@", animationSequence);
    // print array size
    NSLog(@"animationSequence Number of Objects in Array: %u", (unsigned)[animationSequence count]);
}
This is the code that plays as the audio is playing back:
- (void)updateAnimation:(ccTime)delta {
    myFrame++;
    NSString *imageToDisplay = animationSequence[myFrame];
    CCTexture2D *currentTextureToDisplay = [[CCTextureCache sharedTextureCache] addImage:imageToDisplay];
    [character setTexture:currentTextureToDisplay];
    CCLOG(@"current texture to display: %@", currentTextureToDisplay);
    if (myFrame >= [animationSequence count]) {
        [self unschedule:@selector(updateAnimation:)];
    }
}
Your characterImageString is never assigned if myInt > 10.
The exception is thrown because you're trying to print a variable that hasn't been initialized.
You could try changing your code to something like this:
if (myInt < 10)
{
    characterImageString = [NSString stringWithFormat:@"fungus000%i.png", myInt];
}
else if (myInt >= 10 && myInt < 100)
{
    characterImageString = [NSString stringWithFormat:@"fungus00%i.png", myInt];
}
else if (myInt >= 100 && myInt < 1000)
{
    characterImageString = [NSString stringWithFormat:@"fungus0%i.png", myInt];
}
else
{
    characterImageString = [NSString stringWithFormat:@"fungus%i.png", myInt];
}
Obviously a little debugging goes a long way. Could you add a control printout of myInt before the line if (myInt < 10) { to see its value before the crash?
If myInt is <= 0, your program has no protection for that case, so the resulting picture will not exist.
And for myInt > 10 the program will crash, since NSString *characterImageString; is an automatic, uninitialized variable holding a random value.
Hmmm... some motherhood and apple pie; this is hard with the info available. Since it's not certain what the initial float value is, declare your min and max image numbers somewhere (say, kMinFrameNumber and kMaxFrameNumber). Because the float value could be anything at the start of your algorithm, add the following 'defensive' lines after computing myInt:
myInt = MAX(kMinFrameNumber, myInt);
myInt = MIN(kMaxFrameNumber, myInt);
then format with:
characterImageString = [NSString stringWithFormat:@"fungus%04i.png", myInt];
Finally, I doubt the exception is actually thrown at the highlighted line; that is just where it is detected.
a. How did you declare the array animationSequence (is it retained)? If not, it could get autoreleased under your feet at some random interval, and you would be trying to send a message to a deallocated instance.
b. You should also check for bounds before addressing animationSequence
if (myFrame < [animationSequence count] - 1) {
    imageToDisplay = animationSequence[myFrame];
} else {
    CCLOGERROR(@"Yelp! addressing out of bounds!");
    // terminate neatly here, as in unschedule and return
}
c. Check whether your texture is nil before setting it on a sprite (cocos2d 2.0 will accept a nil texture), but then you are in the dark about the state of your code.
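The clamp-plus-%04i approach above replaces the whole if/else ladder. A C sketch of the same idea; the frame-number bounds here are assumed values for illustration:

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

#define kMinFrameNumber 1
#define kMaxFrameNumber 10

// Clamp the computed frame index into the valid range, then
// zero-pad to 4 digits -- the C equivalent of the %04i
// stringWithFormat: line above.
void frame_filename(int myInt, char *buf, size_t bufsize) {
    if (myInt < kMinFrameNumber) myInt = kMinFrameNumber;
    if (myInt > kMaxFrameNumber) myInt = kMaxFrameNumber;
    snprintf(buf, bufsize, "fungus%04i.png", myInt);
}
```

With the clamp in place, an out-of-range volume reading can no longer produce a filename for a frame that does not exist.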

Setting Time Range in AVAssetReader causes freeze

So, I'm trying to do a simple calculation over previously recorded audio (from an AVAsset) in order to create a wave form visual. I currently do this by averaging a set of samples, the size of which is determined by dividing the audio file size by the resolution I want for the wave form.
This all works fine, except for one problem... it's too slow. Running on a 3GS, processing an audio file takes about 3% of the time it takes to play it, which is way too slow (for example, a 1-hour audio file takes about 2.5 minutes to process). I've tried to optimize the method as much as possible, but it's not working. I'll post the code I use to process the file. Maybe someone will be able to help with that, but what I'm really looking for is a way to process the file without having to go over every single byte. So, say given a resolution of 2,000, I'd want to access the file and take a sample at each of the 2,000 points. I think this would be a lot quicker, especially if the file is larger. But the only way I know to get the raw data is to access the audio file in a linear manner. Any ideas? Here's the code I use to process the file (note: all class vars begin with '_'):
So I've completely changed this question. I belatedly realized that AVAssetReader has a timeRange property that's used for "seeking", which is exactly what I was looking for (see the original question above). Furthermore, the question has been asked and answered before (I just didn't find it), and I don't want to duplicate questions. However, I'm still having a problem. My app freezes for a while and then eventually crashes whenever I try to copyNextSampleBuffer. I'm not sure what's going on. I don't seem to be in any kind of recursion loop; the call just never returns. Checking the logs gives me this error:
Exception Type: 00000020
Exception Codes: 0x8badf00d
Highlighted Thread: 0
Application Specific Information:
App[10570] has active assertions beyond permitted time:
{(
<SBProcessAssertion: 0xddd9300> identifier: Suspending process: App[10570] permittedBackgroundDuration: 10.000000 reason: suspend owner pid:52 preventSuspend preventThrottleDownCPU preventThrottleDownUI
)}
I ran a time profiler on the app and yep, it just sits there with a minimal amount of processing. I can't quite figure out what's going on. It's important to note that this doesn't occur if I don't set the timeRange property of the AVAssetReader. I've checked, and the values for timeRange are valid, but setting it is causing the problem for some reason. Here's my processing code:
- (void)processSampleData {
    if (!_asset || CMTimeGetSeconds(_asset.duration) <= 0) return;
    NSError *error = nil;
    AVAssetTrack *songTrack = _asset.tracks.firstObject;
    if (!songTrack) return;
    NSDictionary *outputSettingsDict = [[NSDictionary alloc] initWithObjectsAndKeys:
        [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
        [NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,
        [NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey,
        [NSNumber numberWithBool:NO], AVLinearPCMIsFloatKey,
        [NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved,
        nil];
    UInt32 sampleRate = 44100;
    _channelCount = 1;
    NSArray *formatDesc = songTrack.formatDescriptions;
    for (unsigned int i = 0; i < [formatDesc count]; ++i) {
        CMAudioFormatDescriptionRef item = (__bridge_retained CMAudioFormatDescriptionRef)[formatDesc objectAtIndex:i];
        const AudioStreamBasicDescription *fmtDesc = CMAudioFormatDescriptionGetStreamBasicDescription(item);
        if (fmtDesc) {
            sampleRate = fmtDesc->mSampleRate;
            _channelCount = fmtDesc->mChannelsPerFrame;
        }
        CFRelease(item);
    }
    UInt32 bytesPerSample = 2 * _channelCount; // bit depth is hard-coded to 16 by AVLinearPCMBitDepthKey
    _normalizedMax = 0;
    _sampledData = [[NSMutableData alloc] init];
    SInt16 *channels[_channelCount];
    char *sampleRef;
    SInt16 *samples;
    NSInteger sampleTally = 0;
    SInt16 cTotal;
    _sampleCount = DefaultSampleSize * [UIScreen mainScreen].scale;
    NSTimeInterval intervalBetweenSamples = _asset.duration.value / _sampleCount;
    NSTimeInterval sampleSize = fmax(100, intervalBetweenSamples / _sampleCount);
    double assetTimeScale = _asset.duration.timescale;
    CMTimeRange timeRange = CMTimeRangeMake(CMTimeMake(0, assetTimeScale), CMTimeMake(sampleSize, assetTimeScale));
    SInt16 totals[_channelCount];
    @autoreleasepool {
        for (int i = 0; i < _sampleCount; i++) {
            AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:_asset error:&error];
            AVAssetReaderTrackOutput *trackOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:songTrack outputSettings:outputSettingsDict];
            [reader addOutput:trackOutput];
            reader.timeRange = timeRange;
            [reader startReading];
            while (reader.status == AVAssetReaderStatusReading) {
                CMSampleBufferRef sampleBufferRef = [trackOutput copyNextSampleBuffer];
                if (sampleBufferRef) {
                    CMBlockBufferRef blockBufferRef = CMSampleBufferGetDataBuffer(sampleBufferRef);
                    size_t length = CMBlockBufferGetDataLength(blockBufferRef);
                    int sampleCount = length / bytesPerSample;
                    for (int j = 0; j < sampleCount; j += _channelCount) { // j, to avoid shadowing the outer i
                        CMBlockBufferAccessDataBytes(blockBufferRef, j * bytesPerSample, _channelCount, channels, &sampleRef);
                        samples = (SInt16 *)sampleRef;
                        for (int channel = 0; channel < _channelCount; channel++) {
                            totals[channel] += samples[channel];
                        }
                        sampleTally++;
                    }
                    CMSampleBufferInvalidate(sampleBufferRef);
                    CFRelease(sampleBufferRef);
                }
            }
            for (int channel = 0; channel < _channelCount; channel++) {
                cTotal = abs(totals[channel] / sampleTally);
                if (cTotal > _normalizedMax) _normalizedMax = cTotal;
                [_sampledData appendBytes:&cTotal length:sizeof(cTotal)];
                totals[channel] = 0;
            }
            sampleTally = 0;
            timeRange.start = CMTimeMake((intervalBetweenSamples * (i + 1)) - sampleSize, assetTimeScale); // take the sample just before the interval
        }
    }
    _assetNeedsProcessing = NO;
}
I finally figured out why. Apparently there is some sort of 'minimum' duration you can specify for the timeRange of an AVAssetReader. I'm not sure what exactly that minimum is, somewhere above 1,000 but less than 5,000. It's possible that the minimum changes with the duration of the asset...honestly I'm not sure. Instead, I kept the duration (which is infinity) the same and simply changed the start time. Instead of processing the whole sample, I copy only one buffer block, process that and then seek to the next time. I'm still having trouble with the code, but I'll post that as another question if I can't figure it out.
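The arithmetic behind "keep the duration, move only the start time" is simple enough to isolate. A hypothetical helper in C mirroring the timeRange.start line in the code above (the variable names come from the question; the function itself is illustrative):

```c
#include <assert.h>

// Start time (in asset timescale units) for output point i: the
// point just before the end of the i-th interval, mirroring
// timeRange.start = CMTimeMake((intervalBetweenSamples * (i + 1)) - sampleSize, ...)
long long seek_start(long long intervalBetweenSamples,
                     long long sampleSize, int i) {
    long long start = intervalBetweenSamples * (i + 1) - sampleSize;
    return start < 0 ? 0 : start; // never seek before the asset start
}
```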

Having trouble calculating accurate total walking/running distance using CLLocationManager

I'm trying to build an iOS app that displays the total distance travelled when running or walking. I've read and re-read all the documentation I can find, but I'm having trouble coming up with something that gives me an accurate total distance.
When compared with Nike+ GPS or RunKeeper, my app consistently reports a shorter distance. They'll report the same at first, but as I keep moving, the values from my app and the other running apps gradually drift apart.
For example, if I walk 0.3 kilometers (verified by my car's odometer), Nike+ GPS and RunKeeper both report ~0.3 kilometers every time, but my app will report ~0.13 kilometers. newLocation.horizontalAccuracy is consistently 5.0 or 10.0.
Here's the code I'm using. Am I missing something obvious? Any thoughts on how I could improve this to get a more accurate reading?
#define kDistanceCalculationInterval 10 // the interval (seconds) at which we calculate the user's distance
#define kNumLocationHistoriesToKeep 5 // the number of locations to store in history so that we can look back at them and determine which is most accurate
#define kValidLocationHistoryDeltaInterval 3 // the maximum valid age in seconds of a location stored in the location history
#define kMinLocationsNeededToUpdateDistance 3 // the number of locations needed in history before we will even update the current distance
#define kRequiredHorizontalAccuracy 40.0f // the required accuracy in meters for a location. anything above this number will be discarded
- (id)init {
    if ((self = [super init])) {
        if ([CLLocationManager locationServicesEnabled]) {
            self.locationManager = [[CLLocationManager alloc] init];
            self.locationManager.delegate = self;
            self.locationManager.desiredAccuracy = kCLLocationAccuracyBestForNavigation;
            self.locationManager.distanceFilter = 5; // specified in meters
        }
        self.locationHistory = [NSMutableArray arrayWithCapacity:kNumLocationHistoriesToKeep];
    }
    return self;
}

- (void)locationManager:(CLLocationManager *)manager didUpdateToLocation:(CLLocation *)newLocation fromLocation:(CLLocation *)oldLocation {
    // since the oldLocation might be from some previous use of Core Location, make sure we're getting data from this run
    if (oldLocation == nil) return;
    BOOL isStaleLocation = [oldLocation.timestamp compare:self.startTimestamp] == NSOrderedAscending;
    [self.delegate locationManagerDebugText:[NSString stringWithFormat:@"accuracy: %.2f", newLocation.horizontalAccuracy]];
    if (!isStaleLocation && newLocation.horizontalAccuracy >= 0.0f && newLocation.horizontalAccuracy < kRequiredHorizontalAccuracy) {
        [self.locationHistory addObject:newLocation];
        if ([self.locationHistory count] > kNumLocationHistoriesToKeep) {
            [self.locationHistory removeObjectAtIndex:0];
        }
        BOOL canUpdateDistance = NO;
        if ([self.locationHistory count] >= kMinLocationsNeededToUpdateDistance) {
            canUpdateDistance = YES;
        }
        if ([NSDate timeIntervalSinceReferenceDate] - self.lastDistanceCalculation > kDistanceCalculationInterval) {
            self.lastDistanceCalculation = [NSDate timeIntervalSinceReferenceDate];
            CLLocation *lastLocation = (self.lastRecordedLocation != nil) ? self.lastRecordedLocation : oldLocation;
            CLLocation *bestLocation = nil;
            CGFloat bestAccuracy = kRequiredHorizontalAccuracy;
            for (CLLocation *location in self.locationHistory) {
                if ([NSDate timeIntervalSinceReferenceDate] - [location.timestamp timeIntervalSinceReferenceDate] <= kValidLocationHistoryDeltaInterval) {
                    if (location.horizontalAccuracy < bestAccuracy && location != lastLocation) {
                        bestAccuracy = location.horizontalAccuracy;
                        bestLocation = location;
                    }
                }
            }
            if (bestLocation == nil) bestLocation = newLocation;
            CLLocationDistance distance = [bestLocation distanceFromLocation:lastLocation];
            if (canUpdateDistance) self.totalDistance += distance;
            self.lastRecordedLocation = bestLocation;
        }
    }
}
As it turns out, the code I posted above works great. The problem happened to be in a different part of my app. I was accidentally converting the distance from meters to miles, instead of from meters to kilometers. Oops!
Anyway, hopefully my post will still have some merit, since I feel it's a pretty solid example of how to track a user's distance with Core Location.
You probably have set kRequiredHorizontalAccuracy too low. If there is no location in the history that has accuracy < kRequiredHorizontalAccuracy, then you ignore all those points and add 0 to the distance.