iOS 11 AUGraphInitialize error code 560557684 in the background

On iOS 11, when the application is in the background, AUGraphInitialize() fails with error code 560557684 and no audio can be played.
In the same situation on iOS 10, AUGraphInitialize() succeeded.
Is this a bug in iOS 11? If so, is Apple planning to fix it?
If it is not a bug, what should the implementation do so that AUGraphInitialize() succeeds while the application is in the background on iOS 11?
Any hints would be appreciated!
The source that produces this error case is shown below.
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
if (audioSession == nil) {
LOGE("AVAudioSession : sharedInstance() error.");
}
NSError *audioSessionError = nil;
BOOL ret = [audioSession setActive:NO error:&audioSessionError];
if (ret == false) {
LOGE("AVAudioSession : setActive() error. code = %d", (int)audioSessionError.code);
}
NSTimeInterval bufferDuration = 2048.0 / 44100.0; // use floating-point division; 2048 / 44100 is integer division and yields 0
ret = [audioSession setPreferredIOBufferDuration:bufferDuration error:&audioSessionError];
if (ret == false) {
LOGE("AVAudioSession : setPreferredIOBufferDuration() error. code = %d", (int)audioSessionError.code);
}
ret = [audioSession setCategory:AVAudioSessionCategoryPlayback
error:&audioSessionError];
if (ret == false) {
LOGE("AVAudioSession : setCategory() error. code = %d", (int)audioSessionError.code);
}
ret = [audioSession setActive:YES error:&audioSessionError];
if (ret == false) {
LOGE("AVAudioSession : setActive(YES) error. code = %d", (int)audioSessionError.code);
}
OSStatus err = NewAUGraph(&mGraph);
if ( err != noErr ) {
LOGE("AUGraph : NewAUGraph() error. code = %d", (int)err);
}
err = AUGraphOpen(mGraph);
if ( err != noErr ) {
LOGE("AUGraph : AUGraphOpen() error. code = %d", (int)err);
}
AudioComponentDescription output_desc;
output_desc.componentType = kAudioUnitType_Output;
output_desc.componentSubType = kAudioUnitSubType_RemoteIO;
output_desc.componentManufacturer = kAudioUnitManufacturer_Apple;
output_desc.componentFlags = 0;
output_desc.componentFlagsMask = 0;
AUNode outputNode;
err = AUGraphAddNode(mGraph, &output_desc, &outputNode);
if ( err != noErr ) {
LOGE("AUGraph : AUGraphAddNode() error. code = %d", (int)err);
}
err = AUGraphNodeInfo(mGraph, outputNode, NULL, &mOutUnit);
if ( err != noErr ) {
LOGE("AUGraph : AUGraphNodeInfo() error. code = %d", (int)err);
}
AURenderCallbackStruct callback;
callback.inputProc = _renderCallback;
callback.inputProcRefCon = (__bridge void*)self;
err = AUGraphSetNodeInputCallback(mGraph,
outputNode,
0,
&callback);
if ( err != noErr ) {
LOGE("AUGraph : AUGraphSetNodeInputCallback() error. code = %d", (int)err);
}
AudioStreamBasicDescription outputFormat;
UInt32 size = sizeof(AudioStreamBasicDescription);
err = AudioUnitGetProperty( mOutUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output, 0, &outputFormat, &size );
if ( err != noErr ) {
LOGE("AUGraph : AudioUnitGetProperty() error. code = %d", (int)err);
return nil;
}
outputFormat.mSampleRate = 44100.0f;
err = AudioUnitSetProperty(mOutUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, 0, &outputFormat, size);
if ( err != noErr ) {
LOGE("AUGraph : AudioUnitSetProperty() error. code = %d", (int)err);
}
err = AUGraphConnectNodeInput(mGraph,
outputNode, 0,
outputNode, 1
);
if ( err != noErr ) {
LOGE("AUGraph : AUGraphConnectNodeInput() error. code = %d", (int)err);
}
err = AUGraphInitialize(mGraph);
if ( err != noErr ) {
LOGE("AUGraph : AUGraphInitialize() error. code = %d", (int)err);
}
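For reference, Core Audio OSStatus values are often four-character codes. A minimal sketch for decoding one (the helper is mine, not part of the code above):
#include <ctype.h>
// Sketch: renders an OSStatus as its four-character code when all bytes are printable.
static NSString *FourCCString(OSStatus status) {
    char chars[5] = {0};
    chars[0] = (status >> 24) & 0xFF;
    chars[1] = (status >> 16) & 0xFF;
    chars[2] = (status >> 8) & 0xFF;
    chars[3] = status & 0xFF;
    for (int i = 0; i < 4; i++) {
        if (!isprint((unsigned char)chars[i])) {
            return [NSString stringWithFormat:@"%d", (int)status];
        }
    }
    return [NSString stringWithUTF8String:chars];
}
Decoded this way, 560557684 comes out as '!int', which appears to match AVAudioSessionErrorCodeCannotInterruptOthers, hinting that the session activation in the background, rather than the graph itself, is what fails.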

Related

iOS 13: crash when getting the application statusBar

This only crashes on beta 2 and beta 3, when I call code like this:
[application valueForKeyPath:@"statusBar"]
Can anyone help me? I call this method to get the phone's network status.
The whole code looks like this:
if (![self isIPhoneX]) {
if ([[application valueForKeyPath:@"_statusBar"] isKindOfClass:NSClassFromString(@"UIStatusBar_Modern")]) {
children = [[[[application valueForKeyPath:@"_statusBar"] valueForKeyPath:@"_statusBar"] valueForKeyPath:@"foregroundView"] subviews];
} else {
children = [[[application valueForKeyPath:@"_statusBar"] valueForKeyPath:@"foregroundView"] subviews];
}
Class expectClass = NSClassFromString(@"UIStatusBarDataNetworkItemView");
for (id child in children) {
if ([child isKindOfClass:expectClass]) {
int netType = [[child valueForKeyPath:@"dataNetworkType"] intValue];
switch (netType) {
case 0: state = @"";break;
case 1: state = @"2g";break;
case 2: state = @"3g";break;
case 3: state = @"4g";break;
case 5: state = @"wifi";break;
default: state = @"";break;
} /* switch */
}
}
} else {
id statusBar = [application valueForKeyPath:@"statusBar"];
id statusBarView = [statusBar valueForKeyPath:@"statusBar"];
UIView *foregroundView = [statusBarView valueForKeyPath:@"foregroundView"];
children = [[foregroundView subviews][2] subviews];
for (id child in children) {
if ([child isKindOfClass:NSClassFromString(@"_UIStatusBarWifiSignalView")]) {
state = @"wifi";
}else if ([child isKindOfClass:NSClassFromString(@"_UIStatusBarStringView")]) {
NSString *str = [child valueForKeyPath:@"_originalText"];
if ([str isEqualToString:@"4G"]) {
state = @"4g";
}else if([str isEqualToString:@"3G"]){
state = @"3g";
} else{
state = @"2g";
}
}
}
}
I installed the iOS 13 open beta and everything ran fine except for some labels showing "...", but I get this crash on beta 2 and beta 3.
+ (NSString *)deviceNetworkingType
{
NSString *strNetworkInfo = @"No Network";
struct sockaddr_storage zeroAddress;
bzero(&zeroAddress,sizeof(zeroAddress)); zeroAddress.ss_len = sizeof(zeroAddress);
zeroAddress.ss_family = AF_INET;
// Recover reachability flags
SCNetworkReachabilityRef defaultRouteReachability = SCNetworkReachabilityCreateWithAddress(NULL,(struct sockaddr *)&zeroAddress);
SCNetworkReachabilityFlags flags;
BOOL didRetrieveFlags = SCNetworkReachabilityGetFlags(defaultRouteReachability,&flags);
CFRelease(defaultRouteReachability);
if(!didRetrieveFlags){ return strNetworkInfo;}
BOOL isReachable = ((flags & kSCNetworkFlagsReachable)!=0);
BOOL needsConnection = ((flags & kSCNetworkFlagsConnectionRequired)!=0);
if(!isReachable || needsConnection) {return strNetworkInfo;}
if((flags & kSCNetworkReachabilityFlagsConnectionRequired)== 0){
strNetworkInfo = #"WIFI";
}
if(((flags & kSCNetworkReachabilityFlagsConnectionOnDemand ) != 0) ||(flags & kSCNetworkReachabilityFlagsConnectionOnTraffic) != 0) {
if ((flags & kSCNetworkReachabilityFlagsInterventionRequired) == 0){
strNetworkInfo = #"WIFI";
}
}
if ((flags & kSCNetworkReachabilityFlagsIsWWAN) ==kSCNetworkReachabilityFlagsIsWWAN) {
if ([[[UIDevice currentDevice] systemVersion] floatValue] >= 7.0) {
CTTelephonyNetworkInfo * info = [[CTTelephonyNetworkInfo alloc] init];
NSString *currentRadioAccessTechnology = info.currentRadioAccessTechnology;
if (currentRadioAccessTechnology) {
if ([currentRadioAccessTechnology isEqualToString:CTRadioAccessTechnologyLTE]) {
strNetworkInfo =#"4G";
} else if ([currentRadioAccessTechnology isEqualToString:CTRadioAccessTechnologyEdge] || [currentRadioAccessTechnology isEqualToString:CTRadioAccessTechnologyGPRS]) {
strNetworkInfo =#"2G";
} else {
strNetworkInfo =#"3G";
}
}
} else {
if((flags & kSCNetworkReachabilityFlagsReachable) == kSCNetworkReachabilityFlagsReachable) {
if ((flags & kSCNetworkReachabilityFlagsTransientConnection) == kSCNetworkReachabilityFlagsTransientConnection) {
if((flags & kSCNetworkReachabilityFlagsConnectionRequired) == kSCNetworkReachabilityFlagsConnectionRequired) {
strNetworkInfo =#"2G";
} else {
strNetworkInfo =#"3G";
}
}
}
}
}
return strNetworkInfo;
}
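For comparison, here is a public-API sketch that avoids the private status bar entirely (this is my suggestion, not part of the original code; it only distinguishes cellular generations, so Wi-Fi detection would still come from reachability as in the method above):
#import <CoreTelephony/CTTelephonyNetworkInfo.h>
// Sketch: maps the current radio access technology to a short label.
// serviceCurrentRadioAccessTechnology is available on iOS 12+.
static NSString *RadioTypeLabel(void) {
    CTTelephonyNetworkInfo *info = [[CTTelephonyNetworkInfo alloc] init];
    NSString *radio = nil;
    if (@available(iOS 12.0, *)) {
        radio = info.serviceCurrentRadioAccessTechnology.allValues.firstObject;
    } else {
        radio = info.currentRadioAccessTechnology;
    }
    if (!radio) { return @""; }
    if ([radio isEqualToString:CTRadioAccessTechnologyLTE]) { return @"4g"; }
    if ([radio isEqualToString:CTRadioAccessTechnologyGPRS] ||
        [radio isEqualToString:CTRadioAccessTechnologyEdge]) { return @"2g"; }
    return @"3g";
}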

CMSampleBuffer from ReplayKit after encoding to MPEG-TS: video is normal, audio is corrupted

I am trying to stream AAC-encoded audio data received as a CMSampleBuffer. AudioConverterFillComplexBuffer returns a 0 status code,
but after passing this data to my FFmpeg HLSWriter the audio is not saved correctly (short, truncated signals).
Below is the sample code.
static OSStatus inInputDataProc(AudioConverterRef inAudioConverter,
UInt32 *ioNumberDataPackets,
AudioBufferList *ioData,
AudioStreamPacketDescription **outDataPacketDescription,
void *inUserData)
{
KFAACEncoder *encoder = (__bridge KFAACEncoder *)(inUserData);
UInt32 requestedPackets = *ioNumberDataPackets;
if(requestedPackets > encoder.cycledBuffer.size / 2)
{
//NSLog(#"PCM buffer isn't full enough!");
*ioNumberDataPackets = 0;
return -1;
}
static size_t staticBuffSize = 4096;
static void* staticBuff = nil;
if(!staticBuff)
{
staticBuff = malloc(staticBuffSize);
}
size_t outputBytesSize = requestedPackets * 2;
[encoder.cycledBuffer popToBuffer:staticBuff bytes: outputBytesSize];
ioData->mBuffers[0].mData = staticBuff;
ioData->mBuffers[0].mDataByteSize = (int)outputBytesSize;
*ioNumberDataPackets = ioData->mBuffers[0].mDataByteSize / 2;
return noErr;
}
- (void) encodeSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
CFRetain(sampleBuffer);
dispatch_async(self.encoderQueue,
^{
if (!_audioConverter)
{
[self setupAACEncoderFromSampleBuffer:sampleBuffer];
}
CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
CFRetain(blockBuffer);
size_t pcmBufferSize = 0;
void* pcmBuffer = nil;
OSStatus status = CMBlockBufferGetDataPointer(blockBuffer, 0, NULL, &pcmBufferSize, &pcmBuffer);
[_cycledBuffer push:pcmBuffer size:pcmBufferSize];
NSError *error = nil;
if (status != kCMBlockBufferNoErr)
{
error = [NSError errorWithDomain:NSOSStatusErrorDomain code:status userInfo:nil];
}
memset(_aacBuffer, 0, _aacBufferSize);
AudioBufferList outAudioBufferList = {0};
outAudioBufferList.mNumberBuffers = 1;
outAudioBufferList.mBuffers[0].mNumberChannels = 1;
outAudioBufferList.mBuffers[0].mDataByteSize = _aacBufferSize;
outAudioBufferList.mBuffers[0].mData = _aacBuffer;
AudioStreamPacketDescription *outPacketDescription = NULL;
UInt32 ioOutputDataPacketSize = 1;
status = AudioConverterFillComplexBuffer(_audioConverter,
inInputDataProc,
(__bridge void *)(self),
&ioOutputDataPacketSize,
&outAudioBufferList,
NULL);
NSData *data = nil;
if (status == 0)
{
NSData *rawAAC = [NSData dataWithBytes:outAudioBufferList.mBuffers[0].mData length:outAudioBufferList.mBuffers[0].mDataByteSize];
if (_addADTSHeader) {
NSData *adtsHeader = [self adtsDataForPacketLength:rawAAC.length];
NSMutableData *fullData = [NSMutableData dataWithData:adtsHeader];
[fullData appendData:rawAAC];
data = fullData;
} else {
data = rawAAC;
}
} else {
error = [NSError errorWithDomain:NSOSStatusErrorDomain code:status userInfo:nil];
}
if (self.delegate) {
KFFrame *frame = [[KFFrame alloc] initWithData:data pts:pts];
NSLog(#"Bytes of data %lu", (unsigned long)data.length);
dispatch_async(self.callbackQueue, ^{
[self.delegate encoder:self encodedFrame:frame];
});
}
CFRelease(sampleBuffer);
CFRelease(blockBuffer);
});
}
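The code above calls adtsDataForPacketLength:, which is not shown. For completeness, a minimal sketch of an ADTS header builder, assuming AAC-LC, 44.1 kHz, mono and no CRC; verify the profile, sampling-frequency index and channel configuration against your converter's actual output format:
// Sketch: builds the 7-byte ADTS header for one AAC packet of the given payload length.
- (NSData *)adtsDataForPacketLength:(NSUInteger)packetLength
{
    const int adtsLength = 7;
    const int profile = 2;   // AAC-LC
    const int freqIdx = 4;   // 44.1 kHz
    const int chanCfg = 1;   // mono
    NSUInteger fullLength = adtsLength + packetLength;
    uint8_t *packet = malloc(adtsLength);
    packet[0] = 0xFF;                                               // syncword (high 8 bits)
    packet[1] = 0xF9;                                               // syncword, MPEG-2, no CRC
    packet[2] = ((profile - 1) << 6) + (freqIdx << 2) + (chanCfg >> 2);
    packet[3] = (uint8_t)(((chanCfg & 3) << 6) + (fullLength >> 11));
    packet[4] = (uint8_t)((fullLength & 0x7FF) >> 3);
    packet[5] = (uint8_t)(((fullLength & 7) << 5) + 0x1F);          // buffer fullness (high bits)
    packet[6] = 0xFC;                                               // buffer fullness (low bits)
    return [NSData dataWithBytesNoCopy:packet length:adtsLength freeWhenDone:YES];
}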

FFmpeg jump to most recent frame

I am looking for some help with dropping/skipping FFmpeg frames. The project I am working on streams live video; when the app goes into the background and then returns to an active state, the stream spends a long time catching up by fast-forwarding itself to the current frame. This isn't ideal, and what I am aiming for is to have the app jump immediately to the most recent frame.
What I need to do is drop the frames that are being fast-forwarded through in order to catch up to the most recent frame. Is this possible? Here is my current code which decodes the frames:
- (NSArray *) decodeFrames: (CGFloat) minDuration
{
NSMutableArray *result = [NSMutableArray array];
@synchronized (lock) {
if([_reading integerValue] != 1){
_reading = [NSNumber numberWithInt:1];
@synchronized (_seekPosition) {
if([_seekPosition integerValue] != -1 && _seekPosition){
[self seekDecoder:[_seekPosition longLongValue]];
_seekPosition = [NSNumber numberWithInt:-1];
}
}
if (_videoStream == -1 &&
_audioStream == -1)
return nil;
AVPacket packet;
CGFloat decodedDuration = 0;
CGFloat totalDuration = [TimeHelper calculateTimeDifference];
do {
BOOL finished = NO;
int count = 0;
while (!finished) {
if (av_read_frame(_formatCtx, &packet) < 0) {
_isEOF = YES;
[self endOfFileReached];
break;
}
[self frameRead];
if (packet.stream_index ==_videoStream) {
int pktSize = packet.size;
while (pktSize > 0) {
int gotframe = 0;
int len = avcodec_decode_video2(_videoCodecCtx,
_videoFrame,
&gotframe,
&packet);
if (len < 0) {
LoggerVideo(0, @"decode video error, skip packet");
break;
}
if (gotframe) {
if (!_disableDeinterlacing &&
_videoFrame->interlaced_frame) {
avpicture_deinterlace((AVPicture*)_videoFrame,
(AVPicture*)_videoFrame,
_videoCodecCtx->pix_fmt,
_videoCodecCtx->width,
_videoCodecCtx->height);
}
KxVideoFrame *frame = [self handleVideoFrame];
if (frame) {
[result addObject:frame];
_position = frame.position;
decodedDuration += frame.duration;
if (decodedDuration > minDuration)
finished = YES;
}
} else {
count++;
}
if (0 == len)
break;
pktSize -= len;
}
}
av_free_packet(&packet);
}
} while (totalDuration > 0);
_reading = [NSNumber numberWithInt:0];
return result;
}
}
return result;
}
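One possible approach (a sketch under my own assumptions, not a tested drop-in for the code above): while the player is catching up after returning to the foreground, tell the decoder to skip everything except keyframes so the backlog drains quickly, then flush it and restore normal decoding. _videoCodecCtx is the same field used in decodeFrames:; the catchingUp flag is an assumption about how the caller tracks state.
// Sketch: toggles a fast catch-up mode on the video decoder.
- (void)setCatchUpMode:(BOOL)catchingUp
{
    if (catchingUp) {
        _videoCodecCtx->skip_frame = AVDISCARD_NONKEY;        // decode keyframes only
        _videoCodecCtx->skip_loop_filter = AVDISCARD_NONKEY;  // cheaper decoding while skipping
    } else {
        avcodec_flush_buffers(_videoCodecCtx);                // drop stale reference frames
        _videoCodecCtx->skip_frame = AVDISCARD_DEFAULT;
        _videoCodecCtx->skip_loop_filter = AVDISCARD_DEFAULT;
    }
}
The caller would enable catch-up mode when the app becomes active again and disable it once the decoded frames' positions are close to the live edge; frames decoded during catch-up can simply be discarded instead of being appended to result.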

How to play a PCM audio buffer from a socket server using an audio unit and a circular buffer

I hope someone can help me. I am new to Objective-C and OS X, and I am trying to play audio data that I receive over a socket through my audio unit. I found this link https://stackoverflow.com/a/30318859/4274654 which in a way addresses my issue with a circular buffer.
However, when I try to run my project it returns an error (OSStatus) -10865, which is why the code logs "Error enabling AudioUnit output bus" at this line:
status = AudioUnitSetProperty(_audioUnit, kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Output, kOutputBus, &one, sizeof(one));
Here is my code:
Test.h
#import <Foundation/Foundation.h>
#import <AudioToolbox/AudioToolbox.h>
#import "TPCircularBuffer.h"
@interface Test : Communicator
@property (nonatomic) AudioComponentInstance audioUnit;
@property (nonatomic) TPCircularBuffer circularBuffer;
-(TPCircularBuffer *) outputShouldUseCircularBuffer;
-(void) start;
@end
Test.m
#import "Test.h"
#define kOutputBus 0
#define kInputBus 1
@implementation Test{
BOOL stopped;
}
static OSStatus OutputRenderCallback(void *inRefCon,
AudioUnitRenderActionFlags *ioActionFlags,
const AudioTimeStamp *inTimeStamp,
UInt32 inBusNumber,
UInt32 inNumberFrames,
AudioBufferList *ioData){
Test *output = (__bridge Test*)inRefCon;
TPCircularBuffer *circularBuffer = [output outputShouldUseCircularBuffer];
if( !circularBuffer ){
SInt32 *left = (SInt32*)ioData->mBuffers[0].mData;
for(int i = 0; i < inNumberFrames; i++ ){
left[ i ] = 0.0f;
}
return noErr;
};
int32_t bytesToCopy = ioData->mBuffers[0].mDataByteSize;
SInt16* outputBuffer = ioData->mBuffers[0].mData;
uint32_t availableBytes;
SInt16 *sourceBuffer = TPCircularBufferTail(circularBuffer, &availableBytes);
int32_t amount = MIN(bytesToCopy,availableBytes);
memcpy(outputBuffer, sourceBuffer, amount);
TPCircularBufferConsume(circularBuffer,amount);
return noErr;
}
-(void) start
{
[self circularBuffer:&_circularBuffer withSize:24576*5];
stopped = NO;
[self setupAudioUnit];
// [super setup:@"http://localhost" port:5321];
}
-(void) setupAudioUnit
{
AudioComponentDescription desc;
desc.componentType = kAudioUnitType_Output;
desc.componentSubType = kAudioUnitSubType_VoiceProcessingIO;
desc.componentManufacturer = kAudioUnitManufacturer_Apple;
desc.componentFlags = 0;
desc.componentFlagsMask = 0;
AudioComponent comp = AudioComponentFindNext(NULL, &desc);
OSStatus status;
status = AudioComponentInstanceNew(comp, &_audioUnit);
if(status != noErr)
{
NSLog(#"Error creating AudioUnit instance");
}
// Enable input and output on AURemoteIO
// Input is enabled on the input scope of the input element
// Output is enabled on the output scope of the output element
UInt32 one = 1;
status = AudioUnitSetProperty(_audioUnit, kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Output, kOutputBus, &one, sizeof(one));
if(status != noErr)
{
NSLog(#"Error enableling AudioUnit output bus");
}
// Explicitly set the input and output client formats
// sample rate = 44100, num channels = 1, format = 16 bit int point
AudioStreamBasicDescription audioFormat = [self getAudioDescription];
status = AudioUnitSetProperty(_audioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, kOutputBus, &audioFormat, sizeof(audioFormat));
if(status != noErr)
{
NSLog(#"Error setting audio format");
}
AURenderCallbackStruct renderCallback;
renderCallback.inputProc = OutputRenderCallback;
renderCallback.inputProcRefCon = (__bridge void *)(self);
status = AudioUnitSetProperty(_audioUnit, kAudioUnitProperty_SetRenderCallback, kAudioUnitScope_Global, kOutputBus, &renderCallback, sizeof(renderCallback));
if(status != noErr)
{
NSLog(#"Error setting rendering callback");
}
// Initialize the AURemoteIO instance
status = AudioUnitInitialize(_audioUnit);
if(status != noErr)
{
NSLog(#"Error initializing audio unit");
}
}
- (AudioStreamBasicDescription)getAudioDescription {
AudioStreamBasicDescription audioDescription = {0};
audioDescription.mFormatID = kAudioFormatLinearPCM;
audioDescription.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked | kAudioFormatFlagsNativeEndian;
audioDescription.mChannelsPerFrame = 1;
audioDescription.mBytesPerPacket = sizeof(SInt16)*audioDescription.mChannelsPerFrame;
audioDescription.mFramesPerPacket = 1;
audioDescription.mBytesPerFrame = sizeof(SInt16)*audioDescription.mChannelsPerFrame;
audioDescription.mBitsPerChannel = 8 * sizeof(SInt16);
audioDescription.mSampleRate = 44100.0;
return audioDescription;
}
-(void)circularBuffer:(TPCircularBuffer *)circularBuffer withSize:(int)size {
TPCircularBufferInit(circularBuffer,size);
}
-(void)appendDataToCircularBuffer:(TPCircularBuffer*)circularBuffer
fromAudioBufferList:(AudioBufferList*)audioBufferList {
TPCircularBufferProduceBytes(circularBuffer,
audioBufferList->mBuffers[0].mData,
audioBufferList->mBuffers[0].mDataByteSize);
}
-(void)freeCircularBuffer:(TPCircularBuffer *)circularBuffer {
TPCircularBufferClear(circularBuffer);
TPCircularBufferCleanup(circularBuffer);
}
-(TPCircularBuffer *) outputShouldUseCircularBuffer
{
return &_circularBuffer;
}
-(void) stop
{
OSStatus status = AudioOutputUnitStop(_audioUnit);
if(status != noErr)
{
NSLog(#"Error stopping audio unit");
}
TPCircularBufferClear(&_circularBuffer);
_audioUnit = nil;
stopped = YES;
}
-(void)stream:(NSStream *)stream handleEvent:(NSStreamEvent)event{
switch (event) {
case NSStreamEventOpenCompleted:
NSLog(#"Stream opened");
break;
case NSStreamEventHasBytesAvailable:
if (stream == [super inputStream]) {
NSLog(#"NSStreamEventHasBytesAvailable");
uint8_t buffer[1024];
NSUInteger len;
while ([[super inputStream] hasBytesAvailable]) {
len = [[super inputStream] read:buffer maxLength:sizeof(buffer)];
if (len > 0) {
//converting buffer to byte data
NSString *output = [[NSString alloc] initWithBytes:buffer length:len encoding:NSASCIIStringEncoding];
if (nil != output) {
//NSLog(#"server overideddddd said: %#", output);
}
NSData *data0 = [[NSData alloc] initWithBytes:buffer length:len];
if (nil != data0) {
SInt16* byteData = (SInt16*)malloc(len);
memcpy(byteData, [data0 bytes], len);
double sum = 0.0;
for(int i = 0; i < len/2; i++) {
sum += byteData[i] * byteData[i];
}
Byte* soundData = (Byte*)malloc(len);
memcpy(soundData, [data0 bytes], len);
if(soundData)
{
AudioBufferList *theDataBuffer = (AudioBufferList*) malloc(sizeof(AudioBufferList) *1);
theDataBuffer->mNumberBuffers = 1;
theDataBuffer->mBuffers[0].mDataByteSize = (UInt32)len;
theDataBuffer->mBuffers[0].mNumberChannels = 1;
theDataBuffer->mBuffers[0].mData = (SInt16*)soundData;
NSLog(#"soundData here");
[self appendDataToCircularBuffer:&_circularBuffer fromAudioBufferList:theDataBuffer];
}
}
}
}
}
break;
case NSStreamEventErrorOccurred:
NSLog(#"Can't connect to server");
break;
case NSStreamEventEndEncountered:
[stream close];
[stream removeFromRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
break;
default:
NSLog(#"Unknown event");
}
[super stream:stream handleEvent:event];
}
@end
I would really appreciate an example of playing buffers returned from a socket server through an audio unit so that I can listen to the sound as it comes from the socket server.
Thanks
Your code seems to be asking for a kAudioUnitSubType_VoiceProcessingIO audio unit. But kAudioUnitSubType_RemoteIO would be a more suitable iOS audio unit for just playing buffers of audio samples.
Also, your code does not seem to first select an appropriate audio session category and activate it before playing audio. See Apple's documentation for doing this: https://developer.apple.com/library/content/documentation/Audio/Conceptual/AudioSessionProgrammingGuide/Introduction/Introduction.html
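For example, a minimal sketch of the session setup being recommended (AVAudioSession is the iOS API the linked guide covers; the Playback category is an assumption, so pick whichever category fits your app), run before starting the audio unit:
#import <AVFoundation/AVFoundation.h>
// Sketch: choose and activate an audio session category before playing audio.
AVAudioSession *session = [AVAudioSession sharedInstance];
NSError *sessionError = nil;
if (![session setCategory:AVAudioSessionCategoryPlayback error:&sessionError]) {
    NSLog(@"setCategory failed: %@", sessionError);
}
if (![session setActive:YES error:&sessionError]) {
    NSLog(@"setActive failed: %@", sessionError);
}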

Xcode: How do I end a 2-player turn-based multiplayer match with a winner and loser using Game Center?

I've had a bit of trouble finding any information on this, and all the code samples I come across are based on the match ending in a tie for all players. In my 2-player turn-based game I want to be able to end the match with a winner and a loser. With the code below the match always ends with the same result for both players: if it's a win, both players 1 and 2 win; if it's a loss, both players 1 and 2 lose... any help? Thank you.
if (gameOver == true) {
if (GameWinner == 0) {
GKTurnBasedParticipant *player0 = [currentMatch.participants objectAtIndex:0];
player0.matchOutcome = GKTurnBasedMatchOutcomeWon;
GKTurnBasedParticipant *player1 = [currentMatch.participants objectAtIndex:1];
player1.matchOutcome = GKTurnBasedMatchOutcomeLost;
[currentMatch endMatchInTurnWithMatchData:data completionHandler:^(NSError *error) {
if (error) {
NSLog(#"%#", error);
}
}];
testlabel.text = #"Player 1 Wins!";
} else if (GameWinner == 1) {
GKTurnBasedParticipant *player0 = [currentMatch.participants objectAtIndex:0];
player0.matchOutcome = GKTurnBasedMatchOutcomeLost;
GKTurnBasedParticipant *player1 = [currentMatch.participants objectAtIndex:1];
player1.matchOutcome = GKTurnBasedMatchOutcomeWon;
[currentMatch endMatchInTurnWithMatchData:data completionHandler:^(NSError *error) {
if (error) {
NSLog(#"%#", error);
}
}];
testlabel.text = #"Player 2 Wins!";
} else if (GameWinner == 2) {
for (GKTurnBasedParticipant *part in currentMatch.participants) {
part.matchOutcome = GKTurnBasedMatchOutcomeTied;
}
[currentMatch endMatchInTurnWithMatchData:data completionHandler:^(NSError *error) {
if (error) {
NSLog(#"%#", error);
}
}];
testlabel.text = #"Tie Game!";
} else {
testlabel.text = #"Your turn is over.";
}
This sounds similar to this SO Question, try:
GKTurnBasedParticipant *curr = currentMatch.currentParticipant;
NSUInteger currentIndex = [currentMatch.participants indexOfObject:currentMatch.currentParticipant];
NSUInteger nextIndex = (currentIndex + 1) % [currentMatch.participants count];
GKTurnBasedParticipant *next = [currentMatch.participants objectAtIndex:nextIndex];
if (currScore < otherScore)
{
// Curr player lost
curr.matchOutcome = GKTurnBasedMatchOutcomeLost;
next.matchOutcome = GKTurnBasedMatchOutcomeWon;
}
else if (currScore == otherScore)
{
// Tied
curr.matchOutcome = GKTurnBasedMatchOutcomeTied;
next.matchOutcome = GKTurnBasedMatchOutcomeTied;
}
else
{
// Won
curr.matchOutcome = GKTurnBasedMatchOutcomeWon;
next.matchOutcome = GKTurnBasedMatchOutcomeLost;
}
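After setting both participants' outcomes, the match still has to be ended in the same turn, as in your original code; something like:
[currentMatch endMatchInTurnWithMatchData:data completionHandler:^(NSError *error) {
    if (error) {
        NSLog(@"%@", error);
    }
}];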