Leaderboards not showing all updated scores when sending scores via endMatchInTurnWithMatchData:scores method - objective-c

I want to implement an ELO rating system for a game. That means that after a match ends, I have to calculate an increase for the winner and a decrease for the loser based on their current scores.
I have a leaderboard of type "Most Recent Score" so that only the last submitted score is shown.
I use loadScoresWithCompletionHandler: to load the current scores, then do the calculation (for now I just add different values), and then call endMatchInTurnWithMatchData:scores:achievements:completionHandler: to end the match and update the scores.
GKTurnBasedParticipant *player1 = [match.participants firstObject];
GKTurnBasedParticipant *player2 = [match.participants lastObject];

GKLeaderboard *leaderboardRequest = [[GKLeaderboard alloc] initWithPlayerIDs:@[player1.playerID, player2.playerID]];
leaderboardRequest.timeScope = GKLeaderboardTimeScopeAllTime;
leaderboardRequest.identifier = LEADERBOARD_ELO_RATING_ID;
[leaderboardRequest loadScoresWithCompletionHandler:^(NSArray *scores, NSError *error) {
    if (error) {
        NSLog(@"%@", error);
        return;
    }
    GKScore *player1Score = [scores firstObject];
    GKScore *player2Score = [scores lastObject];
    float score1 = ((float)player1Score.value) / 1000.0f;
    float score2 = ((float)player2Score.value) / 1000.0f;
    // calculation of new score
    score1 += 10;
    score2 += 1;
    GKScore *player1NewScore = [[GKScore alloc] initWithLeaderboardIdentifier:LEADERBOARD_ELO_RATING_ID forPlayer:player1Score.playerID];
    GKScore *player2NewScore = [[GKScore alloc] initWithLeaderboardIdentifier:LEADERBOARD_ELO_RATING_ID forPlayer:player2Score.playerID];
    player1NewScore.value = (int64_t)(score1 * 1000.0f);
    player2NewScore.value = (int64_t)(score2 * 1000.0f);
    [match endMatchInTurnWithMatchData:[game.board matchData]
                                scores:@[player1NewScore, player2NewScore]
                          achievements:@[]
                     completionHandler:^(NSError *error) {
                         if (error) {
                             // TODO: handle error
                         }
                     }];
}];
Getting the scores and uploading the new scores works fine, but when I go to view the leaderboard (using GKGameCenterViewController or the Game Center app), I only see the updated score of the local player (the participant who ended the match and sent the final data). If I make a request with loadScoresWithCompletionHandler:, I can see that the scores of both players were updated, yet only the local player's score is shown in the leaderboard controller.
Example:
Match started:
Player A - 10 pts
Player B - 10 pts
Match ended (Player A sent these scores using method endMatchInTurnWithMatchData:scores:achievements:completionHandler:):
Player A - 15 pts
Player B - 8 pts
Match ended - loadScoresWithCompletionHandler result shows scores:
Player A - 15 pts
Player B - 8 pts
Match ended - GKGameCenterViewController or GameCenter app shows scores:
Player A - 15 pts
Player B - 10 pts
Why is this happening? Am I doing something wrong? Is it because I'm using the Game Center sandbox? If not, how exactly should I update the scores of both players with endMatchInTurnWithMatchData:scores:achievements:completionHandler:?

I found out that it is probably just because of using the Game Center sandbox.

Related

Can I get (directly from Game Center) a total (sum) of all scores uploaded to an Apple Game Center leaderboard

For a single Game Center leaderboard, I want to get a total of all scores uploaded within the last week.
Currently I query the leaderboard with a start of 0 and a range of 100, fetch the scores, add them to a running total, fetch another 100 with a new start of 100, add those to the running total, and keep doing this until Game Center returns no more scores.
This works, but it is not efficient, as the multiple leaderboard fetches take quite a bit of time. I am hoping there is a total stored somewhere on the Game Center server that I can access directly.
The following is one call to Game Center that totals the returned scores:
NSInteger scorePercent = 0;
localLeaderboard.range = NSMakeRange(rangestart, 100);
[localLeaderboard loadScoresWithCompletionHandler:^(NSArray *scores, NSError *error) {
    if (error != nil) {
        NSLog(@"leaderboard loadScores returned error = %@", error);
        // handle the error
    }
    if (scores != nil) {
        // process the score information
        NSInteger numScores = [scores count]; // number of leaderboard entries actually returned
        for (NSInteger nscores = 0; nscores < numScores; nscores++) {
            scorePercent += ((GKScore *)scores[nscores]).value;
        }
        scoreTotal += scorePercent; // aggregate for later percentage calculations
    }
}];
Current method works but is not efficient.
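For reference, here is a minimal sketch of that paging loop, wrapped so that each fetch kicks off the next one. The helper name sumScoresStartingAt:runningTotal:completion: and the leaderboard identifier are illustrative assumptions, not names from the original code:
- (void)sumScoresStartingAt:(NSUInteger)start
               runningTotal:(int64_t)total
                 completion:(void (^)(int64_t grandTotal, NSError *error))completion {
    // Sketch only: page through the leaderboard 100 entries at a time and sum the values.
    GKLeaderboard *leaderboard = [[GKLeaderboard alloc] init];
    leaderboard.identifier = @"your.leaderboard.id";    // assumed identifier
    leaderboard.timeScope = GKLeaderboardTimeScopeWeek; // "within the last week"
    leaderboard.range = NSMakeRange(start, 100);        // GameKit documents positions as 1-based, so begin with start = 1
    [leaderboard loadScoresWithCompletionHandler:^(NSArray *scores, NSError *error) {
        if (error != nil || scores.count == 0) {
            completion(total, error); // no more entries (or a failure): report the grand total
            return;
        }
        int64_t pageTotal = 0;
        for (GKScore *score in scores) {
            pageTotal += score.value;
        }
        // Fetch the next page and keep accumulating.
        [self sumScoresStartingAt:start + scores.count
                     runningTotal:total + pageTotal
                       completion:completion];
    }];
}
This still makes one request per 100 entries, so it does not remove the inefficiency; it only packages the loop.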

CMMotionManager changes from cached reference frame not working as expected

When I call startDeviceMotionUpdatesUsingReferenceFrame:, cache the first attitude as my reference, and then call multiplyByInverseOfAttitude: on every motion update after that, I don't get the change from the reference attitude that I am expecting. Here is a really simple demonstration of what I'm not understanding.
self.motionQueue = [[NSOperationQueue alloc] init];
self.motionManager = [[CMMotionManager alloc] init];
self.motionManager.deviceMotionUpdateInterval = 1.0 / 20.0;
[self.motionManager startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXArbitraryZVertical toQueue:self.motionQueue withHandler:^(CMDeviceMotion *motion, NSError *error) {
    [[NSOperationQueue mainQueue] addOperationWithBlock:^{
        CMAttitude *att = motion.attitude;
        if (self.motionManagerAttitudeRef == nil) {
            self.motionManagerAttitudeRef = att;
            return;
        }
        [att multiplyByInverseOfAttitude:self.motionManagerAttitudeRef];
        NSLog(@"yaw:%+0.1f, pitch:%+0.1f, roll:%+0.1f", att.yaw, att.pitch, att.roll);
    }];
}];
First off, in my application I only really care about pitch and roll. But yaw is in there too to demonstrate my confusion.
Everything works as expected if I put the phone on my flat desk, launch the app, and look at the logs. The yaw, pitch, and roll values are all 0.0, and if I then spin the phone 90 degrees without lifting it off the surface, only the yaw changes. So all good there.
To demonstrate what I think is the problem: now put the phone inside (for example) an empty coffee mug, so that all of the angles are slightly tilted and gravity has some fractional value along every axis. Launch the app, and with the code above you would again think everything is working, because yaw, pitch, and roll all start at 0.0. But now spin the coffee mug 90 degrees without lifting it from the table surface. Why do I see a significant change in all of yaw, pitch, and roll? Since I cached my initial attitude (which is now my reference attitude) and called multiplyByInverseOfAttitude:, shouldn't I be getting a change in yaw only?
I don't really understand why multiplying the attitude by the inverse of a cached reference attitude doesn't work, and I don't think it is a gimbal-lock problem. But here is what gets me exactly what I need. If you try the coffee-mug experiment described above, this gives exactly the expected results (spinning the mug on a flat surface doesn't affect the pitch and roll values, and tilting the mug in any other direction affects only one axis at a time). Also, instead of saving a reference frame, I just save the reference pitch and roll, so when the app starts everything is zeroed out until there is some movement.
So all good now. But still wish I understood why the other method did not work as expected.
self.motionQueue = [[NSOperationQueue alloc] init];
self.motionManager = [[CMMotionManager alloc] init];
self.motionManager.deviceMotionUpdateInterval = 1.0 / 20.0;
[self.motionManager startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXArbitraryZVertical toQueue:self.motionQueue withHandler:^(CMDeviceMotion *motion, NSError *error) {
    [[NSOperationQueue mainQueue] addOperationWithBlock:^{
        if (self.motionManagerAttitude == nil) {
            CGFloat x = motion.gravity.x;
            CGFloat y = motion.gravity.y;
            CGFloat z = motion.gravity.z;
            refRollF = atan2(y, x) + M_PI_2;
            CGFloat r = sqrtf(x*x + y*y + z*z);
            refPitchF = acosf(z/r);
            self.motionManagerAttitude = motion.attitude;
            return;
        }
        CGFloat x = motion.gravity.x;
        CGFloat y = motion.gravity.y;
        CGFloat z = motion.gravity.z;
        CGFloat rollF = refRollF - (atan2(y, x) + M_PI_2);
        CGFloat r = sqrtf(x*x + y*y + z*z);
        CGFloat pitchF = refPitchF - acosf(z/r);
        // I don't care about yaw, so just printing out whatever value is in the attitude
        NSLog(@"yaw: %+0.1f, pitch: %+0.1f, roll: %+0.1f", (180.0f/M_PI)*motion.attitude.yaw, (180.0f/M_PI)*pitchF, (180.0f/M_PI)*rollF);
    }];
}];

I would like the awarded gems (high score) to represent one gem per every ten points scored

This must be very simple compared to the rest of my work.
Essentially, I have a point system where you get a score from the gameplay and also a score in gems. Say you scored 50 points during a round; I want the gems to be 1 for every 10 points scored, so in this case that would be 5 gems.
I am also having trouble figuring out how to add the new gems to the existing ones rather than replacing them the way a high score would. For example, after receiving the 5 gems above, I play another round and score 80 points, which equals 8 gems. I should now have 13 gems (5 + 8), not just 8 as if it were a new high amount.
Thank you for the help!
- (void)incrementPoints {
    if (!gameOverGame) {
        score++;
        [self runAction:scoreSound];
        SKLabelNode *scoreNode = (SKLabelNode *)[self childNodeWithName:kPointsName];
        NSString *scoreString = [NSString stringWithFormat:@"%i", score];
        scoreNode.text = scoreString;
    }
}

- (void)deleteScores {
    [self enumerateChildNodesWithName:kScoreboardRupeeNodeName usingBlock:^(SKNode *node, BOOL *stop) {
        [node removeFromParent];
    }];
    if (score == A * 10) {
        highscore = A;
        [[AppUserDefaults sharedAppUserDefaults] setHihgscore:(int)score + highscore];
        ViewController *viewController = (ViewController *)self.view.window.rootViewController;
        [viewController submitToLeaderboard:(int)score];
    }
    scoreboardNode = [gameObjects scoreboardWithScore:(int)score andHighscore:(int)highscore];
    [self addChild:scoreboardNode];
    [self enumerateChildNodesWithName:kPointsName usingBlock:^(SKNode *node, BOOL *stop) {
        [node removeFromParent];
    }];
    [self die];
}
Simply keep track of the score as points; if 1 gem == 10 points, then that's just a presentation issue, i.e. rather than displaying 85 points as "85" you draw 8 gems:
NSInteger numberOfGemsToDraw = self.pointsScored / 10;
Note that self.pointsScored would be saved using NSUserDefaults and within Game Center. It's the only score-related data you care about.
Also note that any remaining points (i.e. self.pointsScored % 10) could be used to draw semi-complete gems, so the user has an idea of how close they are to their next complete gem.
tl;dr Store points and display gems.
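To illustrate the accumulation part of the question, here is a minimal sketch of that approach; the NSUserDefaults key "totalPoints" and the method name recordRoundScore: are assumptions for the example, not names from the original code:
- (void)recordRoundScore:(NSInteger)roundScore {
    // Sketch only: accumulate lifetime points, then derive gems for display.
    NSUserDefaults *defaults = [NSUserDefaults standardUserDefaults];
    NSInteger totalPoints = [defaults integerForKey:@"totalPoints"] + roundScore; // 50 then 80 -> 130 points
    [defaults setInteger:totalPoints forKey:@"totalPoints"];
    [defaults synchronize];

    NSInteger completeGems = totalPoints / 10;  // one gem per ten points -> 13 gems
    NSInteger partialTenths = totalPoints % 10; // progress toward the next gem
    NSLog(@"gems: %ld (+%ld/10)", (long)completeGems, (long)partialTenths);
}
Storing only the running point total keeps the gem count a pure presentation detail, exactly as described above.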

QTMovie at 29.97 with QTMakeTime

I'm trying to use QTKit to convert a list of images to a quicktime movie. I've figured out how to do everything except get the frame rate to 29.97. Through other forums and resources, the trick seems to be using something like this:
QTTime frameDuration = QTMakeTime(1001, 30000)
However, all my attempts using this method, or even (1000, 29970), still produce a movie at 30 fps. This is the fps that shows up when playing the file in QuickTime Player.
Any ideas? Is there some other way to set the frame rate for the entire movie once it's created?
Here's some sample code:
NSDictionary *outputMovieAttribs = [NSDictionary dictionaryWithObjectsAndKeys:@"jpeg", QTAddImageCodecType, [NSNumber numberWithLong:codecHighQuality], QTAddImageCodecQuality, nil];
QTTime frameDuration = QTMakeTime(1001, 30000);

QTMovie *outputMovie = [[QTMovie alloc] initToWritableFile:@"/tmp/testing.mov" error:nil];
[outputMovie setAttribute:[NSNumber numberWithBool:YES] forKey:QTMovieEditableAttribute];
[outputMovie setAttribute:[NSNumber numberWithLong:30000] forKey:QTMovieTimeScaleAttribute];

if (!outputMovie) {
    printf("ERROR: Chunk: Could not create movie object:\n");
} else {
    int frameID = 0;
    while (frameID < [framePaths count]) {
        NSAutoreleasePool *readPool = [[NSAutoreleasePool alloc] init];
        NSData *currFrameData = [NSData dataWithContentsOfFile:[framePaths objectAtIndex:frameID]];
        NSImage *currFrame = [[NSImage alloc] initWithData:currFrameData];
        if (currFrame) {
            [outputMovie addImage:currFrame forDuration:frameDuration withAttributes:outputMovieAttribs];
            [outputMovie updateMovieFile];
            NSString *newDuration = QTStringFromTime([outputMovie duration]);
            printf("new Duration: %s\n", [newDuration UTF8String]);
            currFrame = nil;
        } else {
            printf("ERROR: Could not add image to movie\n");
        }
        frameID++;
        [readPool drain];
    }
}
NSString *outputDuration = QTStringFromTime([outputMovie duration]);
printf("output Duration: %s\n", [outputDuration UTF8String]);
OK, thanks to your code I could solve the issue. Using the development tool Atom Inspector, I could see that the data structure looked totally different from the movies I am currently working with. As I said, I had never created a movie from images the way you do, but it seems this is not the way to go if you want a normal movie afterwards. QuickTime recognizes the clip as "Photo-JPEG", so not a normal movie file. The reason seems to be that the added pictures are NOT added to a movie track but just placed somewhere in the movie; this can also be seen with Atom Inspector.
With QTMovieTimeScaleAttribute, you set a time scale that is never actually used.
To solve the issue I changed the code just a tiny bit:
NSDictionary *outputMovieAttribs = [NSDictionary dictionaryWithObjectsAndKeys:@"jpeg", QTAddImageCodecType,
    [NSNumber numberWithLong:codecHighQuality], QTAddImageCodecQuality,
    [NSNumber numberWithLong:2997], QTTrackTimeScaleAttribute, nil];
QTTime frameDuration = QTMakeTime(100, 2997);
QTMovie *outputMovie = [[QTMovie alloc] initToWritableFile:@"/Users/flo/Desktop/testing.mov" error:nil];
[outputMovie setAttribute:[NSNumber numberWithBool:YES] forKey:QTMovieEditableAttribute];
[outputMovie setAttribute:[NSNumber numberWithLong:2997] forKey:QTMovieTimeScaleAttribute];
Everything else is unaltered.
Oh, by the way, to print the timeValue and timeScale, you could also do:
NSLog(@"new Duration timeScale: %ld timeValue: %lld\n",
      [outputMovie duration].timeScale, [outputMovie duration].timeValue);
This way you can see better if your code does as desired.
Hope that helps!
Best regards
I have never done what you're trying to do, but I can tell you how to get the desired framerate I guess.
If you "ask" a movie for its current timing information, you always get a QTTime structure, which contains the timeScale and the timeValue.
For a 29.97 fps video, you would get a timeScale of 2997 (for example; see below).
This is the amount of "units" per second.
So, if the playback position of the movie is currently at exactly 2 seconds, you would get a timeValue of 5994.
The frameDuration is therefore 100, because 2997 / 100 = 29.97 fps.
QuickTime cannot handle float values, so you have to convert all the values to a long value by multiplication.
By the way, you don't have to use 100, you could also use 1000 and a timeScale of 29970, or 200 as frame duration and 5994 timeScale. That's all I can tell you from what you get if you read timing information from already existing clips.
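As a quick check of that arithmetic (values taken from the example above, variable names illustrative):
QTTime frameDuration = QTMakeTime(100, 2997); // 2997 units per second, 100 units per frame
double fps = (double)frameDuration.timeScale / (double)frameDuration.timeValue;
NSLog(@"fps: %.2f", fps); // 29.97
// Equivalent pairs: QTMakeTime(1000, 29970) or QTMakeTime(200, 5994) also give 29.97 fps.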
You wrote that this didn't work out for you, but this is how QuickTime works internally.
You should look into it again!
Best regards

Having trouble calculating accurate total walking/running distance using CLLocationManager

I'm trying to build an iOS app that displays the total distance travelled when running or walking. I've read and re-read all the documentation I can find, but I'm having trouble coming up with something that gives me an accurate total distance.
When compared with Nike+ GPS or RunKeeper, my app consistently reports a shorter distance. They'll report the same distance at first, but as I keep moving, the values from my app and the other running apps gradually drift apart.
For example, if I walk 0.3 kilometers (verified by my car's odometer), Nike+ GPS and RunKeeper both report ~0.3 kilometers every time, but my app will report ~0.13 kilometers. newLocation.horizontalAccuracy is consistently 5.0 or 10.0.
Here's the code I'm using. Am I missing something obvious? Any thoughts on how I could improve this to get a more accurate reading?
#define kDistanceCalculationInterval 10 // the interval (seconds) at which we calculate the user's distance
#define kNumLocationHistoriesToKeep 5 // the number of locations to store in history so that we can look back at them and determine which is most accurate
#define kValidLocationHistoryDeltaInterval 3 // the maximum valid age in seconds of a location stored in the location history
#define kMinLocationsNeededToUpdateDistance 3 // the number of locations needed in history before we will even update the current distance
#define kRequiredHorizontalAccuracy 40.0f // the required accuracy in meters for a location. anything above this number will be discarded

- (id)init {
    if ((self = [super init])) {
        if ([CLLocationManager locationServicesEnabled]) {
            self.locationManager = [[CLLocationManager alloc] init];
            self.locationManager.delegate = self;
            self.locationManager.desiredAccuracy = kCLLocationAccuracyBestForNavigation;
            self.locationManager.distanceFilter = 5; // specified in meters
        }
        self.locationHistory = [NSMutableArray arrayWithCapacity:kNumLocationHistoriesToKeep];
    }
    return self;
}

- (void)locationManager:(CLLocationManager *)manager didUpdateToLocation:(CLLocation *)newLocation fromLocation:(CLLocation *)oldLocation {
    // since the oldLocation might be from some previous use of core location, we need to make sure we're getting data from this run
    if (oldLocation == nil) return;
    BOOL isStaleLocation = [oldLocation.timestamp compare:self.startTimestamp] == NSOrderedAscending;

    [self.delegate locationManagerDebugText:[NSString stringWithFormat:@"accuracy: %.2f", newLocation.horizontalAccuracy]];

    if (!isStaleLocation && newLocation.horizontalAccuracy >= 0.0f && newLocation.horizontalAccuracy < kRequiredHorizontalAccuracy) {
        [self.locationHistory addObject:newLocation];
        if ([self.locationHistory count] > kNumLocationHistoriesToKeep) {
            [self.locationHistory removeObjectAtIndex:0];
        }

        BOOL canUpdateDistance = NO;
        if ([self.locationHistory count] >= kMinLocationsNeededToUpdateDistance) {
            canUpdateDistance = YES;
        }

        if ([NSDate timeIntervalSinceReferenceDate] - self.lastDistanceCalculation > kDistanceCalculationInterval) {
            self.lastDistanceCalculation = [NSDate timeIntervalSinceReferenceDate];

            CLLocation *lastLocation = (self.lastRecordedLocation != nil) ? self.lastRecordedLocation : oldLocation;

            CLLocation *bestLocation = nil;
            CGFloat bestAccuracy = kRequiredHorizontalAccuracy;
            for (CLLocation *location in self.locationHistory) {
                if ([NSDate timeIntervalSinceReferenceDate] - [location.timestamp timeIntervalSinceReferenceDate] <= kValidLocationHistoryDeltaInterval) {
                    if (location.horizontalAccuracy < bestAccuracy && location != lastLocation) {
                        bestAccuracy = location.horizontalAccuracy;
                        bestLocation = location;
                    }
                }
            }
            if (bestLocation == nil) bestLocation = newLocation;

            CLLocationDistance distance = [bestLocation distanceFromLocation:lastLocation];
            if (canUpdateDistance) self.totalDistance += distance;
            self.lastRecordedLocation = bestLocation;
        }
    }
}
As it turns out, the code I posted above works great. The problem happened to be in a different part of my app. I was accidentally converting the distance from meters to miles, instead of from meters to kilometers. Oops!
Anyway, hopefully my post will still have some merit, since I feel it's a pretty solid example of how to track a user's distance with Core Location.
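For anyone hitting the same unit mix-up, the difference is just the conversion constant (the variable names here are illustrative):
CLLocationDistance meters = 300.0;   // e.g. the 0.3 km walk above; Core Location distances are in meters
double kilometers = meters / 1000.0;   // 0.3 km
double miles = meters / 1609.344;      // ~0.19 mi, a noticeably smaller number
NSLog(@"%.2f km vs %.2f mi", kilometers, miles);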
You probably have set kRequiredHorizontalAccuracy too low. If there is no location in the history that has accuracy < kRequiredHorizontalAccuracy, then you ignore all those points and add 0 to the distance.