GCM Push Notifications are not coming on iPhone when app is not active for one day - google-cloud-messaging

On iPhone/iPad, the app is able to receive the notification the first time after installing. Once the app has been inactive, it stops receiving notifications in both the foreground and the background.
Can someone point out what I am missing? It doesn't look like a message-format issue, since the first notification arrives.
Server Code:
message.put("priority", "high");
message.put("content_available",true);
if (to != null)
{
message.put("to", to.replace("\\", ""));
}
if (messageId != null)
{
message.put("message_id", messageId);
}
JSONObject subobj = new JSONObject();
subobj.put("sound", "default");
message.put("notification", subobj);
message.put("data", payload);
if (timeToLive != null)
{
message.put("time_to_live", timeToLive);
}
if (delayWhileIdle != null && delayWhileIdle)
{
message.put("delay_while_idle", true);
}
if (collapseKey != null)
{
message.put("collapse_key", collapseKey);
}
message.put("delivery_receipt_requested", true);
Client Code:
- (BOOL)application:(UIApplication*)application didFinishLaunchingWithOptions:(NSDictionary*)launchOptions
{
// First run Delete keychain
if (![[NSUserDefaults standardUserDefaults] objectForKey:@"FirstRun"]) {
// Delete values from keychain here
application.applicationIconBadgeNumber = 0;
[self resetKeychain];
[[NSUserDefaults standardUserDefaults] setValue:@"1strun" forKey:@"FirstRun"];
[[NSUserDefaults standardUserDefaults] synchronize];
}
self.viewController = [[MainViewController alloc] init];
CGRect screenBounds = [[UIScreen mainScreen] bounds];
#if __has_feature(objc_arc)
self.window = [[UIWindow alloc] initWithFrame:screenBounds];
#else
self.window = [[[UIWindow alloc] initWithFrame:screenBounds]
autorelease];
#endif
self.window.autoresizesSubviews = YES;
//notification
[self updateDurationLabel];
UIFont *font = [UIFont boldSystemFontOfSize:10.0f];
NSDictionary *attributes = [NSDictionary dictionaryWithObject:font forKey:NSFontAttributeName];
[self.segFromStyle setTitleTextAttributes:attributes forState:UIControlStateNormal];
[self.segToStyle setTitleTextAttributes:attributes forState:UIControlStateNormal];
[self flowManager];
NSDictionary *remoteNotification = [launchOptions objectForKey:UIApplicationLaunchOptionsRemoteNotificationKey];
if (remoteNotification) {
application.applicationIconBadgeNumber = 0;
self.launchNotification = remoteNotification; // the remote-notification launch option is the payload dictionary itself
NSLog(@"NotificationCheck: remoteNotification");
}
UILocalNotification *localNotification = [launchOptions objectForKey:UIApplicationLaunchOptionsLocalNotificationKey];
if (localNotification) {
application.applicationIconBadgeNumber = 0;
NSLog(@"NotificationCheck: localNotification");
self.launchNotification = localNotification.userInfo;
NSData *jsonData = [NSJSONSerialization dataWithJSONObject:localNotification.userInfo options:0 error:nil];
NSString *jsonString = [[NSString alloc] initWithData:jsonData encoding:NSUTF8StringEncoding];
NSLog(@"Dict: %@", jsonString);
}
[self.window makeKeyAndVisible];
return YES;
}

You may refer to this thread, which suggests adding "priority": "high" to the JSON to get notifications in the background:
{
"to" : "token...",
"priority": "high",
"notification" : {
"title": "GCM TITLE",
"body" : "FROM GCM",
"badge": "1",
"sound": "default"
}
}
Additional references:
GCM push notification to iOS with content_available (not working to invoke from inactive state)
GCM support for ios application when application in background or killed
Making GCM work for iOS device in the background
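For reference, the full downstream message can be assembled and posted with a short script. This is a sketch, not the asker's server code: the endpoint and the `YOUR_SERVER_API_KEY` placeholder assume the legacy GCM HTTP API.

```python
import json
import urllib.request

# Assumptions (not from the question): legacy GCM HTTP endpoint and a server API key.
GCM_ENDPOINT = "https://gcm-http.googleapis.com/gcm/send"
SERVER_KEY = "YOUR_SERVER_API_KEY"

def build_message(token, title, body):
    """Build a downstream message intended to wake an iOS app in the background."""
    return {
        "to": token,
        "priority": "high",         # needed for delivery while the app is inactive
        "content_available": True,  # maps to APNs content-available: 1
        "notification": {"title": title, "body": body, "sound": "default"},
    }

def send(message):
    """POST the message to the GCM connection server."""
    req = urllib.request.Request(
        GCM_ENDPOINT,
        data=json.dumps(message).encode("utf-8"),
        headers={"Authorization": "key=" + SERVER_KEY,
                 "Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)

msg = build_message("token...", "GCM TITLE", "FROM GCM")
print(json.dumps(msg, indent=2))
```

If high-priority messages still stop arriving after the device has been idle, it is also worth checking the GCM response for a `NotRegistered` error, which indicates the token has been invalidated.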


Action buttons on remote notification in iOS 10 (Objective-C) not appearing with IIViewDeckController

After a lot of googling and following Apple's docs, I am still not able to get action buttons in remote (push) notifications, but I do get them in local notifications using the same code.
- (void)triggerAndRegisterNotification {
if (SYSTEM_VERSION_GREATER_THAN_OR_EQUAL_TO(@"10.0")) {
#if XCODE_VERSION_GREATER_THAN_OR_EQUAL_TO_8
UNUserNotificationCenter *center = [UNUserNotificationCenter currentNotificationCenter];
// create actions
UNNotificationAction *acceptAction = [UNNotificationAction actionWithIdentifier:@"com.AG.yes"
title:@"Save"
options:UNNotificationActionOptionForeground];
UNNotificationAction *declineAction = [UNNotificationAction actionWithIdentifier:@"com.AG.no"
title:@"Decline"
options:UNNotificationActionOptionDestructive];
UNNotificationAction *snoozeAction = [UNNotificationAction actionWithIdentifier:@"com.AG.snooze"
title:@"Snooze"
options:UNNotificationActionOptionDestructive];
NSArray *notificationActions = @[ acceptAction, declineAction, snoozeAction ];
// create a category
UNNotificationCategory *inviteCategory = [UNNotificationCategory categoryWithIdentifier:CYLInviteCategoryIdentifier actions:notificationActions intentIdentifiers:@[] options:UNNotificationCategoryOptionNone];
NSSet *categories = [NSSet setWithObject:inviteCategory];
// registration
[center setNotificationCategories:categories];
#endif
} else if (SYSTEM_VERSION_GREATER_THAN_OR_EQUAL_TO(@"8.0")) {
// create actions
UIMutableUserNotificationAction *acceptAction = [[UIMutableUserNotificationAction alloc] init];
acceptAction.identifier = @"com.AG.yes";
acceptAction.title = @"Accept";
acceptAction.activationMode = UIUserNotificationActivationModeBackground;
acceptAction.destructive = NO;
acceptAction.authenticationRequired = NO; // if YES, requires the passcode but does not unlock the device
UIMutableUserNotificationAction *declineAction = [[UIMutableUserNotificationAction alloc] init];
declineAction.identifier = @"com.AG.no";
declineAction.title = @"Decline";
declineAction.activationMode = UIUserNotificationActivationModeBackground;
declineAction.destructive = YES;
declineAction.authenticationRequired = NO;
UIMutableUserNotificationAction *snoozeAction = [[UIMutableUserNotificationAction alloc] init];
snoozeAction.identifier = @"com.AG.snooze";
snoozeAction.title = @"Snooze";
snoozeAction.activationMode = UIUserNotificationActivationModeBackground;
snoozeAction.destructive = YES;
snoozeAction.authenticationRequired = NO;
// create a category
UIMutableUserNotificationCategory *inviteCategory = [[UIMutableUserNotificationCategory alloc] init];
inviteCategory.identifier = CYLInviteCategoryIdentifier;
NSArray *notificationActions = @[ acceptAction, declineAction, snoozeAction ];
[inviteCategory setActions:notificationActions forContext:UIUserNotificationActionContextDefault];
[inviteCategory setActions:notificationActions forContext:UIUserNotificationActionContextMinimal];
// registration
NSSet *categories = [NSSet setWithObject:inviteCategory];
UIUserNotificationType types = UIUserNotificationTypeBadge | UIUserNotificationTypeSound | UIUserNotificationTypeAlert;
UIUserNotificationSettings *settings = [UIUserNotificationSettings settingsForTypes:types categories:categories];
[[UIApplication sharedApplication] registerUserNotificationSettings:settings];
}
/// 2. request authorization for localNotification
[self registerNotificationSettingsCompletionHandler:^(BOOL granted, NSError * _Nullable error) {
if (granted) {
NSLog(@"request authorization succeeded!");
[[UIApplication sharedApplication] registerForRemoteNotifications];
}
}];
**COMMENTED CODE FOR LOCAL NOTIFICATION**
// if (SYSTEM_VERSION_GREATER_THAN_OR_EQUAL_TO(@"10.0")) {
// #if XCODE_VERSION_GREATER_THAN_OR_EQUAL_TO_8
// Deliver the notification at 08:30 every day
// NSDateComponents *dateComponents = [[NSDateComponents alloc] init];
// dateComponents.hour = 8;
// dateComponents.minute = 30;
// UNCalendarNotificationTrigger *trigger = [UNCalendarNotificationTrigger triggerWithDateMatchingComponents:dateComponents repeats:YES];
// UNMutableNotificationContent *content = [[UNMutableNotificationContent alloc] init];
// content.title = [NSString localizedUserNotificationStringForKey:@"AG said:" arguments:nil];
// content.body = [NSString localizedUserNotificationStringForKey:@"Hello Tom! Get up, let's play with Jerry!" arguments:nil];
// content.sound = [UNNotificationSound defaultSound];
// content.categoryIdentifier = CYLInviteCategoryIdentifier;
/// 4. update application icon badge number
// content.badge = @([[UIApplication sharedApplication] applicationIconBadgeNumber] + 1);
// content.launchImageName = @"any string is ok";
// Deliver the notification in five seconds.
// *** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'time interval must be at least 60 if repeating'
// UNTimeIntervalNotificationTrigger *trigger = [UNTimeIntervalNotificationTrigger triggerWithTimeInterval:60.0f repeats:YES];
// UNNotificationRequest *request = [UNNotificationRequest requestWithIdentifier:@"FiveSecond" content:content trigger:trigger];
// UNUserNotificationCenter *center = [UNUserNotificationCenter currentNotificationCenter];
/// 3. schedule localNotification; the delegate must be set before the application returns from applicationDidFinishLaunching:.
// center.delegate = self;
// [center addNotificationRequest:request withCompletionHandler:^(NSError * _Nullable error) {
// if (!error) {
// NSLog(@"add NotificationRequest succeeded!");
// }
// }];
// #endif
// } else {
/// 3. schedule localNotification
// UILocalNotification *localNotification = [[UILocalNotification alloc] init];
// localNotification.fireDate = [NSDate dateWithTimeIntervalSinceNow:5.f];
// localNotification.alertTitle = @"AG said:";
// localNotification.alertBody = @"Hello Tom! Get up, let's play with Jerry!";
// localNotification.alertAction = @"play with Jerry";
// Identifies the image used as the launch image when the user taps (or slides) the action button (or slider).
// localNotification.alertLaunchImage = @"LaunchImage.png";
// localNotification.userInfo = @{ @"CategoryIdentifier" : CYLInviteCategoryIdentifier };
// localNotification.timeZone = [NSTimeZone defaultTimeZone];
// repeat every minute; 0 means don't repeat
// localNotification.repeatInterval = NSCalendarUnitMinute;
/// 4. update application icon badge number
// localNotification.applicationIconBadgeNumber = [[UIApplication sharedApplication] applicationIconBadgeNumber] + 1;
// [[UIApplication sharedApplication] scheduleLocalNotification:localNotification];
// }
}
/// 3. THIS IS THE METHOD TO AUTHORIZE THE NOTIFICATION
- (void)registerNotificationSettingsCompletionHandler:(void (^)(BOOL granted, NSError *__nullable error))completionHandler {
/// 2. request authorization for localNotification
if (SYSTEM_VERSION_GREATER_THAN_OR_EQUAL_TO(@"10.0")) {
#if XCODE_VERSION_GREATER_THAN_OR_EQUAL_TO_8
UNUserNotificationCenter *center = [UNUserNotificationCenter currentNotificationCenter];
[center requestAuthorizationWithOptions:(UNAuthorizationOptionBadge | UNAuthorizationOptionSound | UNAuthorizationOptionAlert)
completionHandler:completionHandler];
#endif
} else if (SYSTEM_VERSION_GREATER_THAN_OR_EQUAL_TO(@"8.0"))
{
// UIUserNotificationSettings *userNotificationSettings = [UIUserNotificationSettings settingsForTypes:(UIUserNotificationTypeAlert | UIUserNotificationTypeSound | UIUserNotificationTypeBadge) categories:nil];
// UIApplication *application = [UIApplication sharedApplication];
// [application registerUserNotificationSettings:userNotificationSettings];
// FIXME:
// !completionHandler ?: completionHandler(granted, error);
}
}
**AND IN** AppDelegate.m
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
if (SYSTEM_VERSION_GREATER_THAN_OR_EQUAL_TO(@"10.0")) {
#if XCODE_VERSION_GREATER_THAN_OR_EQUAL_TO_8
/// schedule localNotification; the delegate must be set before
/// the application returns from applicationDidFinishLaunching:.
UNUserNotificationCenter *center = [UNUserNotificationCenter currentNotificationCenter];
center.delegate = self;
#endif
}
[self triggerAndRegisterNotification];
return YES;
}
I am using an iPhone 7 for testing.
Please help me solve this. Thanks in advance.
Payload JSON
aps = {
alert = {
"artist_id" = 16912;
body = "Kurt Rosenwinkel is playing at Joe Henderson Lab at
SFJAZZ Center";
eventid = 149687805;
sound = default;
timestamp = "810b6035-e4d7-4722-81db-7455e81a48fe";
title = "Kurt Rosenwinkel";
tracks = "itunes,spotify";
type = 2;
};
category = "com.wcities.notification";
};
I checked that the category identifier I set in my app is the same as the one in the payload JSON.
UPDATE
While debugging I found that the above code does deliver push notifications with action buttons, but somewhere after didFinishLaunching I change my window's root view controller to a view controller that is a child of IIViewDeckController.
After commenting out that line, the push notification arrives with action buttons. I am totally confused why this happens, because as far as I know, setting, presenting, or pushing any view controller should have no impact on push notifications.
Please let me know if I am making any mistake here. I have shared all the code and the scenario above.
Thanks
Since I am using the dock-view library, the view controllers added to the dock view controller had some code that set the notification categories to nil. That is why I could not get the action buttons in remote notifications. I removed that extra code and everything is working fine now. I am still looking for the person who wrote that code. Appreciate the help.
In iOS 10, notifications work with:
-(void)userNotificationCenter:(UNUserNotificationCenter *)center willPresentNotification:(UNNotification *)notification withCompletionHandler:(void (^)(UNNotificationPresentationOptions options))completionHandler{
//code
completionHandler(UNNotificationPresentationOptionAlert);
}
-(void)userNotificationCenter:(UNUserNotificationCenter *)center didReceiveNotificationResponse:(UNNotificationResponse *)response withCompletionHandler:(void (^)(void))completionHandler{
}

Text To Speech and Speech To Text Recognition --> self-recognition is occurring

I am developing an app that should support both speech-to-text and text-to-speech.
i) Speech to text: I used the Speech framework. Whenever I open the app and start speaking, it recognizes the voice and converts the speech into text. This is working.
ii) Text to speech: I used the AVFoundation and MediaPlayer libraries. When the user presses the play button, it converts the text (i.e. whatever appears on the screen) into speech. This is also working.
Here is the problem I am facing:
While processing text-to-speech, the speech recognizer picks up the synthesized voice and prints the words into the text box again.
Example: if I say "Hello Good Morning", it is printed in the text box. If I then press the play button, it plays the voice "Hello Good Morning", but the speech-to-text recognizer hears that voice (self-recognition) and the text box ends up with "Hello Good Morning Hello Good Morning".
I want to stop the speech-to-text process while text-to-speech is running.
For this, I have stopped the speech recognition request while playing the speech.
Here is the code:
@implementation ViewController
{
SFSpeechAudioBufferRecognitionRequest *recognitionRequest;
SFSpeechRecognitionTask *recognitionTask;
AVAudioEngine *audioEngine;
NSMutableArray *speechStringsArray;
BOOL SpeechToText;
NSString* resultString;
NSString *str ;
NSString *searchString;
NSString *textToSpeak;
}
- (void)viewDidLoad {
[super viewDidLoad];
//Speech To Text ****
speechStringsArray = [[NSMutableArray alloc]init];
// Initialize background audio session
NSError *error = NULL;
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayback error:&error];
if (error) {
NSLog(@"Error: %@", error);
}
[session setActive:YES error:&error];
if (error) {
NSLog(@"Error: %@", error);
}
// Enabled remote controls
[[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
// Voice setup
self.voicePicker.delegate = self;
self.voice = [AVSpeechSynthesisVoice voiceWithLanguage:@"en-us"];
self.voices = [NSMutableArray arrayWithObjects:
@{@"voice" : @"en-us", @"label" : @"American English (Female)"},
@{@"voice" : @"en-au", @"label" : @"Australian English (Female)"},
@{@"voice" : @"en-gb", @"label" : @"British English (Male)"},
@{@"voice" : @"en-ie", @"label" : @"Irish English (Female)"},
@{@"voice" : @"en-za", @"label" : @"South African English (Female)"},
nil];
// Synthesizer setup
self.synthesizer = [[AVSpeechSynthesizer alloc] init];
self.synthesizer.delegate = self;
// UITextView delegate
self.textView.delegate = self;
// This notification is posted from the AppDelegate's applicationDidBecomeActive method to make sure that if the play or pause state changed in the background, the button in the toolbar is updated
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(updateToolbar) name:@"updateToolbar" object:nil];
}
-(void)viewDidAppear:(BOOL)animated
{
self.speechRecognizer = [[SFSpeechRecognizer alloc]initWithLocale:[NSLocale localeWithLocaleIdentifier:@"en-US"]];
self.speechRecognizer.delegate = self;
audioEngine = [[AVAudioEngine alloc]init];
[SFSpeechRecognizer requestAuthorization:^(SFSpeechRecognizerAuthorizationStatus authStatus) {
switch (authStatus) {
case SFSpeechRecognizerAuthorizationStatusAuthorized:
//User gave access to speech recognition
NSLog(@"Authorized");
[self start_record];
break;
case SFSpeechRecognizerAuthorizationStatusDenied:
//User denied access to speech recognition
NSLog(@"AuthorizationStatusDenied");
break;
case SFSpeechRecognizerAuthorizationStatusRestricted:
//Speech recognition restricted on this device
NSLog(@"AuthorizationStatusRestricted");
break;
case SFSpeechRecognizerAuthorizationStatusNotDetermined:
//Speech recognition not yet authorized
break;
default:
NSLog(@"Default");
break;
}
}];
//MARK : Interface Builder Actions
}
**Code for increasing the speed and pitch**
- (IBAction)handleSpeedStepper:(UIStepper *)sender
{
double speedValue = self.speedStepper.value;
[self.speedValueLabel setText:[NSString stringWithFormat:@"%.1f", speedValue]];
}
- (IBAction)handlePitchStepper:(UIStepper *)sender
{
double pitchValue = self.pitchStepper.value;
[self.pitchValueLabel setText:[NSString stringWithFormat:@"%.1f", pitchValue]];
}
//Play button for text to speech
- (IBAction)handlePlayPauseButton:(UIBarButtonItem *)sender
{
if (self.synthesizer.speaking && !self.synthesizer.paused) {
if (self.pauseSettingSegmentedControl.selectedSegmentIndex == 0) {
// Stop immediately
[self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryImmediate];
}
else {
// Stop at end of current word
[self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryWord];
}
[self updateToolbarWithButton:@"play"];
}
else if (self.synthesizer.paused) {
[self.synthesizer continueSpeaking];
[self updateToolbarWithButton:@"pause"];
}
else {
[self speakUtterance];
[self updateToolbarWithButton:@"pause"];
}
}
//method for speech to text
-(void)start_record{
NSError *outError;
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&outError];
[audioSession setMode:AVAudioSessionModeMeasurement error:&outError];
[audioSession setActive:YES withOptions:AVAudioSessionSetActiveOptionNotifyOthersOnDeactivation error:&outError];
recognitionRequest = [[SFSpeechAudioBufferRecognitionRequest alloc]init];
AVAudioInputNode *inputNode = audioEngine.inputNode;
if (recognitionRequest == nil) {
NSLog(@"Unable to create a SFSpeechAudioBufferRecognitionRequest object");
}
if (inputNode == nil) {
NSLog(@"Audio engine has no input node");
}
//configure request so that results are returned before audio recording is finished
[recognitionRequest setShouldReportPartialResults:YES];
// A recognition task represents a speech recognition session.
// We keep a reference to the task so that it can be cancelled.
recognitionTask = [self.speechRecognizer recognitionTaskWithRequest:recognitionRequest resultHandler:^(SFSpeechRecognitionResult *result, NSError *error1) {
BOOL isFinal = false;
if (result != nil) {
NSString *speech = result.bestTranscription.formattedString;
NSLog(@"the speech: %@", speech);
// code for fixing the append-string issue
for (int i = 0; i < speechStringsArray.count; i++)
{
str = [speechStringsArray objectAtIndex:i];
NSRange range = [speech rangeOfString:str options:NSCaseInsensitiveSearch];
NSLog(@"found: %@", (range.location != NSNotFound) ? @"Yes" : @"No");
if (range.location != NSNotFound) {
resultString = [speech stringByReplacingCharactersInRange:range withString:@""];
speech = resultString;
NSLog(@"the result is: %@", resultString);
}
}
//specific function - space for the second word
if (resultString.length > 0) {
self.textView.text = [NSString stringWithFormat:@"%@%@", self.textView.text, resultString];
[speechStringsArray addObject:resultString];
}
//specific function - space for the first word (working fine)
else
{
[speechStringsArray addObject:speech];
self.textView.text = speech;
}
NSLog(@"array %@", speechStringsArray);
isFinal = result.isFinal;
}
if (error1 != nil || isFinal) {
[audioEngine stop];
[inputNode removeTapOnBus:0];
recognitionRequest = nil;
recognitionTask = nil;
[self start_record];
}}];
AVAudioFormat *recordingFormat = [inputNode outputFormatForBus:0];
[inputNode installTapOnBus:0 bufferSize:1024 format:recordingFormat block:^(AVAudioPCMBuffer * _Nonnull buffer, AVAudioTime * _Nonnull when){
[recognitionRequest appendAudioPCMBuffer:buffer];}
];
NSError *error1;
[audioEngine prepare];
[audioEngine startAndReturnError:&error1];}
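The "append string" fix inside the result handler boils down to: strip every previously captured phrase out of each new partial transcript (case-insensitively) before appending it to the text view. A standalone sketch of that logic (illustration only, not the Speech framework API):

```python
def strip_seen(transcript, seen_phrases):
    """Remove each previously seen phrase (case-insensitively) from a new partial transcript."""
    result = transcript
    for phrase in seen_phrases:
        lower = result.lower()
        idx = lower.find(phrase.lower())
        if idx != -1:
            # drop the matched phrase, keep the rest of the transcript
            result = result[:idx] + result[idx + len(phrase):]
    return result

print(strip_seen("Hello Good Morning Hello", ["hello good morning"]))  # → " Hello"
```

This mirrors the Objective-C loop above: `rangeOfString:options:NSCaseInsensitiveSearch` plays the role of the lowercase `find`, and `stringByReplacingCharactersInRange:withString:@""` plays the role of the slice concatenation.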
- (void)speakUtterance
{
NSLog(@"speakUtterance");
didStartSpeaking = NO;
textToSpeak = [NSString stringWithFormat:@"%@", self.textView.text];
AVSpeechUtterance *utterance = [[AVSpeechUtterance alloc] initWithString:textToSpeak];
utterance.rate = self.speedStepper.value;
utterance.pitchMultiplier = self.pitchStepper.value;
utterance.voice = self.voice;
[self.synthesizer speakUtterance:utterance];
[self displayBackgroundMediaFields];
}
- (void)displayBackgroundMediaFields
{
MPMediaItemArtwork *artwork = [[MPMediaItemArtwork alloc] initWithImage:[UIImage imageNamed:@"Play"]];
NSDictionary *info = @{ MPMediaItemPropertyTitle: self.textView.text,
MPMediaItemPropertyAlbumTitle: @"TextToSpeech App",
MPMediaItemPropertyArtwork: artwork};
[MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo = info;
}
- (void)updateToolbar
{
if (self.synthesizer.speaking && !self.synthesizer.paused) {
[self updateToolbarWithButton:@"pause"];
}
else {
[self updateToolbarWithButton:@"play"];
}}
- (void)updateToolbarWithButton:(NSString *)buttonType
{
//stopping the speech to text process
if (audioEngine.isRunning) {
[audioEngine stop];
[recognitionRequest endAudio];
}
NSLog(@"updateToolbarWithButton: %@", buttonType);
UIBarButtonItem *audioControl;
if ([buttonType isEqualToString:@"play"]) {
// Play
audioControl = [[UIBarButtonItem alloc]initWithBarButtonSystemItem:UIBarButtonSystemItemPlay target:self action:@selector(handlePlayPauseButton:)];
}
else {
// Pause
audioControl = [[UIBarButtonItem alloc]initWithBarButtonSystemItem:UIBarButtonSystemItemPause target:self action:@selector(handlePlayPauseButton:)];
}
UIBarButtonItem *flexibleItem = [[UIBarButtonItem alloc] initWithBarButtonSystemItem:UIBarButtonSystemItemFlexibleSpace target:nil action:nil];
[self.toolbar setItems:@[flexibleItem, audioControl, flexibleItem]];
}
- (void)remoteControlReceivedWithEvent:(UIEvent *)receivedEvent
{
NSLog(@"receivedEvent: %@", receivedEvent);
if (receivedEvent.type == UIEventTypeRemoteControl) {
switch (receivedEvent.subtype) {
case UIEventSubtypeRemoteControlPlay:
NSLog(@"UIEventSubtypeRemoteControlPlay");
if (self.synthesizer.speaking) {
[self.synthesizer continueSpeaking];
}
else {
[self speakUtterance];
}
break;
case UIEventSubtypeRemoteControlPause:
NSLog(@"pause - UIEventSubtypeRemoteControlPause");
if (self.pauseSettingSegmentedControl.selectedSegmentIndex == 0) {
// Pause immediately
[self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryImmediate];
}
else {
// Pause at end of current word
[self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryWord];
}
break;
case UIEventSubtypeRemoteControlTogglePlayPause:
if (self.synthesizer.paused) {
NSLog(@"UIEventSubtypeRemoteControlTogglePlayPause");
[self.synthesizer continueSpeaking];
}
else {
NSLog(@"UIEventSubtypeRemoteControlTogglePlayPause");
if (self.pauseSettingSegmentedControl.selectedSegmentIndex == 0) {
// Pause immediately
[self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryImmediate];
}
else {
// Pause at end of current word
[self.synthesizer pauseSpeakingAtBoundary:AVSpeechBoundaryWord];
}
}
break;
case UIEventSubtypeRemoteControlNextTrack:
NSLog(@"UIEventSubtypeRemoteControlNextTrack - appropriate for playlists");
break;
case UIEventSubtypeRemoteControlPreviousTrack:
NSLog(@"UIEventSubtypeRemoteControlPreviousTrack - appropriate for playlists");
break;
default:
break;
}
}
}
#pragma mark UIPickerViewDelegate Methods
- (NSInteger)numberOfComponentsInPickerView:(UIPickerView *)pickerView
{
return 1;
}
- (NSInteger)pickerView:(UIPickerView *)pickerView numberOfRowsInComponent:(NSInteger)component
{
return self.voices.count;
}
- (UIView *)pickerView:(UIPickerView *)pickerView viewForRow:(NSInteger)row forComponent:(NSInteger)component reusingView:(UIView *)view
{
UILabel *rowLabel = [[UILabel alloc] init];
NSDictionary *voice = [self.voices objectAtIndex:row];
rowLabel.text = [voice objectForKey:@"label"];
return rowLabel;
}
- (void)pickerView:(UIPickerView *)pickerView didSelectRow: (NSInteger)row inComponent:(NSInteger)component
{
NSDictionary *voice = [self.voices objectAtIndex:row];
NSLog(@"new picker voice selected with label: %@", [voice objectForKey:@"label"]);
self.voice = [AVSpeechSynthesisVoice voiceWithLanguage:[voice objectForKey:@"voice"]];
}
#pragma mark SpeechSynthesizerDelegate methods
- (void)speechSynthesizer:(AVSpeechSynthesizer *)synthesizer didFinishSpeechUtterance:(AVSpeechUtterance *)utterance
{
// This is a workaround for a bug: the first time we change the voice, the speech utterance fails silently. We check that willSpeakRangeOfSpeechString: is called and set didStartSpeaking to YES there. If that method is not called (silent failure), we simply request to speak again.
if (!didStartSpeaking) {
[self speakUtterance];
}
else {
[self updateToolbarWithButton:@"play"];
NSLog(@"the text is: %@", self.textView.text);
}}
- (void)speechSynthesizer:(AVSpeechSynthesizer *)synthesizer willSpeakRangeOfSpeechString:(NSRange)characterRange utterance:(AVSpeechUtterance *)utterance
{
didStartSpeaking = YES;
//[self setTextViewTextWithColoredCharacterRange:characterRange];
}
#pragma mark UITextViewDelegate Methods
- (BOOL)textView:(UITextView *)textView shouldChangeTextInRange:(NSRange)range replacementText:(NSString *)text {
if([text isEqualToString:@"\n"]) {
[textView resignFirstResponder];
return NO;
}
return YES;
}
Don't initialize everything in viewDidLoad. When you tap the button to convert text to speech, set the speech-to-text conversion object to nil at that moment and also set its delegate to nil. Do the same for the reverse direction.

Twitter number of followers iOS 6

Hi, how can I get the number of followers of the current Twitter user in iOS 6? TWRequest is deprecated, so how can I use the new Social.framework to get the number of followers?
First, you need to authenticate your request (get permission).
Second, follow these steps:
1. Download the FHSTwitterEngine Twitter library.
2. Add the folder "FHSTwitterEngine" to your project and #import "FHSTwitterEngine.h".
3. Add SystemConfiguration.framework to your project.
Usage: 1. In viewDidLoad, add the following code:
UIButton *logIn = [UIButton buttonWithType:UIButtonTypeRoundedRect];
logIn.frame = CGRectMake(100, 100, 100, 100);
[logIn setTitle:@"Login" forState:UIControlStateNormal];
[logIn addTarget:self action:@selector(showLoginWindow:) forControlEvents:UIControlEventTouchUpInside];
[self.view addSubview:logIn];
[[FHSTwitterEngine sharedEngine]permanentlySetConsumerKey:@"<consumer_key>" andSecret:@"<consumer_secret>"];
[[FHSTwitterEngine sharedEngine]setDelegate:self];
Don't forget to adopt the FHSTwitterEngineAccessTokenDelegate protocol.
You need to get permission for your request with the following method, which presents the login window:
- (void)showLoginWindow:(id)sender {
[[FHSTwitterEngine sharedEngine]showOAuthLoginControllerFromViewController:self withCompletion:^(BOOL success) {
NSLog(success?@"L0L success":@"O noes!!! Loggen faylur!!!");
}];
}
When the login window is presented, enter your Twitter username and password to authenticate the request.
Add the following methods to your code:
-(void)viewWillAppear:(BOOL)animated
{
[super viewWillAppear:animated];
[[FHSTwitterEngine sharedEngine]loadAccessToken];
NSString *username = [[FHSTwitterEngine sharedEngine]loggedInUsername];// self.engine.loggedInUsername;
if (username.length > 0) {
lbl.text = [NSString stringWithFormat:@"Logged in as %@",username]; // self.engine.loggedInUsername;
[self listResults];
} else {
lbl.text = @"You are not logged in.";
}
}
- (void)storeAccessToken:(NSString *)accessToken {
[[NSUserDefaults standardUserDefaults]setObject:accessToken forKey:@"SavedAccessHTTPBody"];
}
- (NSString *)loadAccessToken {
return [[NSUserDefaults standardUserDefaults]objectForKey:@"SavedAccessHTTPBody"];
}
4. Now you are ready to make your request. The following method lists your followers' IDs, adds them to an NSArray, and then gets the count from the NSArray:
- (void)listFriends:(id)sender {
NSMutableArray *arr = [[NSMutableArray alloc]init];
[_tweetField resignFirstResponder];
dispatch_async(GCDBackgroundThread, ^{
@autoreleasepool {
[UIApplication sharedApplication].networkActivityIndicatorVisible = YES;
// NSLog(@"Friends' IDs: %@",[[FHSTwitterEngine sharedEngine]getFriendsIDs]);
dict = [[FHSTwitterEngine sharedEngine]getFollowersIDs];
for (id item in [dict objectForKey:@"ids"]) {
[arr addObject:item];
}
dispatch_sync(GCDMainThread, ^{
@autoreleasepool {
UIAlertView *av = [[UIAlertView alloc]initWithTitle:@"Complete!" message:@"Your list of followers has been fetched" delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil];
[av show];
[UIApplication sharedApplication].networkActivityIndicatorVisible = NO;
NSLog(@"====> %lu", (unsigned long)[arr count]);
}
});
}
});
}
I tested this code and it's working perfectly ^_^.
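The counting step itself is trivial once the response is parsed: read the `ids` array from the followers/ids-style response and take its length. A minimal sketch of that logic (the `sample` response below is made up for illustration):

```python
def follower_count(response):
    """Return the number of follower IDs in a followers/ids-style response dict."""
    return len(response.get("ids", []))

# hypothetical response shape: a list of IDs plus a paging cursor
sample = {"ids": [101, 102, 103], "next_cursor": 0}
print(follower_count(sample))  # → 3
```

Note that followers/ids responses are paginated by cursor, so for accounts with many followers you would sum the counts across pages (or read the user object's followers count field directly).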
use This code
- (IBAction)listFriends:(id)sender {
    NSMutableArray *arr = [[NSMutableArray alloc] init];
    dict = [[NSMutableDictionary alloc] init];
    [_tweetField resignFirstResponder];
    dispatch_async(GCDBackgroundThread, ^{
        @autoreleasepool {
            [UIApplication sharedApplication].networkActivityIndicatorVisible = YES;
            // to get friends' IDs...
            NSLog(@"Friends' IDs: %@", [[FHSTwitterEngine sharedEngine] getFriendsIDs]);
            /* dict = [[FHSTwitterEngine sharedEngine] getFollowersIDs];
            for (NSDictionary *item in [dict objectForKey:@"ids"]) {
                [arr addObject:[dict objectForKey:@"ids"]];
            } */
            // to get friends' names...
            // NSLog(@"Friends_Name: %@", [[FHSTwitterEngine sharedEngine] listFriendsForUser:_loggedInUserLabel.text isID:NO withCursor:@"-1"]);
            dict = [[FHSTwitterEngine sharedEngine] listFriendsForUser:_loggedInUserLabel.text isID:NO withCursor:@"-1"];
            NSLog(@"====> %@", [dict objectForKey:@"users"]);
            NSMutableArray *array = [dict objectForKey:@"users"];
            for (int i = 0; i < [array count]; i++) {
                NSLog(@"names: %@", [[array objectAtIndex:i] objectForKey:@"name"]);
            }
            // NSLog(@"Friends_Name: %@", [[FHSTwitterEngine sharedEngine] getMentionsTimelineWithCount:1000 sinceID:nil maxID:nil]);
            dispatch_sync(GCDMainThread, ^{
                @autoreleasepool {
                    UIAlertView *av = [[UIAlertView alloc] initWithTitle:@"Complete!" message:@"Your list of followers has been fetched" delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil];
                    [av show];
                    [UIApplication sharedApplication].networkActivityIndicatorVisible = NO;
                    // NSLog(@"====> %d", [arr count]);
                }
            });
        }
    });
}
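As a readability note, the index-based loop over the `users` array can also be written with fast enumeration. This sketch assumes, as in the code above, that `listFriendsForUser:isID:withCursor:` returns a dictionary whose `users` entry is an array of user dictionaries:

```objectivec
// Sketch: same traversal with fast enumeration instead of objectAtIndex:
NSArray *users = [dict objectForKey:@"users"];
for (NSDictionary *user in users) {
    NSLog(@"name: %@", [user objectForKey:@"name"]);
}
```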

AVCaptureOutput Delegate

I'm creating an application that uses the
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
fromConnection:(AVCaptureConnection *)connection { }
delegate method, but that method is never called. To explain further: the application uses code from this tutorial to create a video-recording app. When I ran the tutorial's code in Xcode, the method above was called, but when I copied it into my application, without modifying it in any way, it was never called.
Here's the code used:
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    NSError *error = nil;
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone) {
        [session setSessionPreset:AVCaptureSessionPreset640x480];
    } else {
        [session setSessionPreset:AVCaptureSessionPresetPhoto];
    }
    // Select a video device, make an input
    AVCaptureDevice *device;
    AVCaptureDevicePosition desiredPosition = AVCaptureDevicePositionFront;
    // find the front facing camera
    for (AVCaptureDevice *d in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if ([d position] == desiredPosition) {
            device = d;
            isUsingFrontFacingCamera = YES;
            break;
        }
    }
    // fall back to the default camera.
    if (nil == device) {
        isUsingFrontFacingCamera = NO;
        device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    }
    // get the input device
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!error) {
        // add the input to the session
        if ([session canAddInput:deviceInput]) {
            [session addInput:deviceInput];
        }
        previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
        previewLayer.backgroundColor = [[UIColor blackColor] CGColor];
        previewLayer.videoGravity = AVLayerVideoGravityResizeAspect;
        CALayer *rootLayer = [previewView layer];
        [rootLayer setMasksToBounds:YES];
        [previewLayer setFrame:[rootLayer bounds]];
        [rootLayer addSublayer:previewLayer];
        [session startRunning];
    }
    session = nil;
    if (error) {
        UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:
                                  [NSString stringWithFormat:@"Failed with error %d", (int)[error code]]
                                                            message:[error localizedDescription]
                                                           delegate:nil
                                                  cancelButtonTitle:@"Dismiss"
                                                  otherButtonTitles:nil];
        [alertView show];
        [self teardownAVCapture];
    }
    NSDictionary *detectorOptions = [[NSDictionary alloc] initWithObjectsAndKeys:CIDetectorAccuracyLow, CIDetectorAccuracy, nil];
    faceDetector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:detectorOptions];
    // Make a video data output
    videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    // we want BGRA, both CoreGraphics and OpenGL work well with 'BGRA'
    NSDictionary *rgbOutputSettings = [NSDictionary dictionaryWithObject:
                                      [NSNumber numberWithInt:kCMPixelFormat_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    [videoDataOutput setVideoSettings:rgbOutputSettings];
    [videoDataOutput setAlwaysDiscardsLateVideoFrames:YES]; // discard if the data output queue is blocked
    // create a serial dispatch queue used for the sample buffer delegate
    // a serial dispatch queue must be used to guarantee that video frames will be delivered in order
    // see the header doc for setSampleBufferDelegate:queue: for more information
    videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
    [videoDataOutput setSampleBufferDelegate:self queue:videoDataOutputQueue];
    if ([session canAddOutput:videoDataOutput]) {
        [session addOutput:videoDataOutput];
    }
    // get the output for doing face detection.
    [[videoDataOutput connectionWithMediaType:AVMediaTypeVideo] setEnabled:YES];
    //[self setupCaptureSession];
}
Okay, I think I know what the problem is: you call [session startRunning] before you set up your videoDataOutput. A session with no video data output... well, will not call the AVCaptureOutput delegate.
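A minimal reordering sketch, assuming the same instance variables as the question's viewDidLoad: configure and add the data output first, then start the session last.

```objectivec
// Sketch: attach the data output BEFORE starting the session.
videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
[videoDataOutput setVideoSettings:@{ (id)kCVPixelBufferPixelFormatTypeKey :
                                     [NSNumber numberWithInt:kCMPixelFormat_32BGRA] }];
[videoDataOutput setAlwaysDiscardsLateVideoFrames:YES];
// serial queue so frames are delivered in order
videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
[videoDataOutput setSampleBufferDelegate:self queue:videoDataOutputQueue];
if ([session canAddOutput:videoDataOutput]) {
    [session addOutput:videoDataOutput];
}
[session startRunning]; // only after input AND output are attached
```

It may also matter that the question's viewDidLoad sets its local `session` to nil right after starting it; under ARC that releases the session, so keeping a strong reference (e.g. an instance variable) is likely needed for capture to continue.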

NSInvocationOperation with bad access crash

I have a big problem with NSInvocationOperation. When I run on my iPod touch 4 (iOS 5.0.1), it's OK, but on my iPhone 4 (iOS 4.1), it crashes. This is my code:
CustomAnnotation *annotation = [[[ModelManager defaultModelManager] getAnnotationDictionaryInMemory] objectForKey:joltId];
if (annotation == nil)
    annotation = [[[ModelManager defaultModelManager] getAnnotationDictionaryInMemory] annotationWithInstagramId:eID];
if (annotation == nil)
    continue;
// THIS ONE CRASHES ON IPHONE 4, iOS 4.1
NSDictionary *param = [[NSDictionary alloc] initWithObjectsAndKeys:@"aKey", @"aValue", nil];
NSInvocationOperation *op = [[NSInvocationOperation alloc] initWithTarget:annotation selector:@selector(updateAnnotation:) object:param];
[_queue addOperation:op];
[op autorelease];
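One manual-retain-release detail worth checking in the snippet above: the operation retains its target and argument via its NSInvocation, so `param` can (and should) be released once the operation owns it; as written, `param` is leaked. A sketch under MRC, using the same hypothetical names:

```objectivec
NSDictionary *param = [[NSDictionary alloc] initWithObjectsAndKeys:@"aKey", @"aValue", nil];
NSInvocationOperation *op = [[NSInvocationOperation alloc] initWithTarget:annotation
                                                                 selector:@selector(updateAnnotation:)
                                                                   object:param];
[param release];          // the operation retains its argument
[_queue addOperation:op]; // the queue retains the operation
[op release];             // balance the alloc
```

The leak itself would not normally crash; the likelier suspect is some object (e.g. the annotation view) being deallocated before the background operation runs, which is exactly what NSZombies can confirm.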
The updateAnnotation: method is defined in the CustomAnnotation class:
- (void)updateAnnotation:(id)sender {
    //NSLog(@"UPDATE ANNOTATION ID: %@", self.annoID);
    NSDictionary *senderDic = (NSDictionary *)sender;
    UzooltAppDelegate *appDelegate = (UzooltAppDelegate *)[UIApplication sharedApplication].delegate;
    if ([senderDic objectForKey:@"nb_rejolt"] != nil) {
        NSInteger new_nb_rejolts = [[senderDic objectForKey:@"nb_rejolt"] intValue];
        //if (new_nb_rejolts != rejolts) {
        //NSLog(@"update nb rejolts: %d", new_nb_rejolts);
        if (self.rejolts != new_nb_rejolts) {
            self.rejolts = new_nb_rejolts;
            //if (self.isGrouped)
            //    [self.masterAnnotation.annotationView performSelectorOnMainThread:@selector(updateAnnotationViewWithRejolts:) withObject:[NSString stringWithFormat:@"%d", rejolts] waitUntilDone:NO];
            //else
            if ([senderDic objectForKey:@"rejolt_fbid"] != nil) {
                NSString *fbRejoltsString = [senderDic objectForKey:@"rejolt_fbid"];
                self.fbRejolts = [[fbRejoltsString componentsSeparatedByString:@","] mutableCopy];
                [self.fbRejolts removeLastObject];
                [self updateNumberOfRejoltsFromFBFrds];
            }
            [self.annotationView performSelectorOnMainThread:@selector(updateAnnotationViewWithRejolts:) withObject:[NSString stringWithFormat:@"%d", rejolts] waitUntilDone:NO];
        }
        if (self.isGrouped) {
            if (self.masterAnnotation.isShowingRadius == NO && appDelegate.uMainViewController.isInGroup == NO && (new_nb_rejolts > self.masterAnnotation.rejolts || (new_nb_rejolts == self.masterAnnotation.rejolts && self.getAge < self.masterAnnotation.getAge))) {
                [self.masterAnnotation removeFromGroupedAnnotations:self];
                self.annotationView.hidden = NO;
                [self.annotationView performSelectorOnMainThread:@selector(showWithAlpha:) withObject:[NSNumber numberWithFloat:annotationAlpha] waitUntilDone:NO];
                [[NSNotificationCenter defaultCenter] postNotificationName:@"need_group_annotations" object:nil];
            }
        }
        //}
    }
    if ([senderDic objectForKey:@"lifetime"] != nil) {
        float new_lifetime = [[senderDic objectForKey:@"lifetime"] floatValue] * 3600;
        //NSLog(@"new lifetime: %.f", new_lifetime);
        if (new_lifetime != lifetime) {
            //NSLog(@"update lifetime");
            self.lifetime = new_lifetime;
        }
    }
    [self updateViewAlpha];
    if ([senderDic objectForKey:@"radius"] != nil) {
        float new_radius = [[senderDic objectForKey:@"radius"] floatValue];
        if (new_radius != radius) {
            //NSLog(@"update lifetime");
            self.radius = new_radius;
        }
    }
    /*
    if ([appDelegate.uMainViewController isMaximumZoomIn])
        [[self annotationView] performSelectorOnMainThread:@selector(setGroupNumberIndicatorVisible:) withObject:[NSNumber numberWithBool:YES] waitUntilDone:NO];
    else
        [[self annotationView] performSelectorOnMainThread:@selector(setGroupNumberIndicatorVisible:) withObject:[NSNumber numberWithBool:NO] waitUntilDone:NO];
    */
    if (isSelected == YES && isShowingRadius == NO) {
        //NSLog(@"update details");
        [self performSelectorOnMainThread:@selector(updateDetailsView) withObject:nil waitUntilDone:NO];
        UzooltAppDelegate *appDelegate = (UzooltAppDelegate *)[UIApplication sharedApplication].delegate;
        if (isGrouped == YES && appDelegate.uMainViewController.isInGroup) {
            NSLog(@"grouped jolt rejolted");
            [self performSelectorOnMainThread:@selector(updateInGroupAnnotationView) withObject:nil waitUntilDone:NO];
        }
    }
}
I don't know where it goes wrong. Please help me. Thanks all!
Try running with NSZombies enabled. It should at least give you a hint as to which object you are trying to access after it has been deallocated.
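In case it helps, zombies are controlled by an environment variable on the app's process; in Xcode this is set in the scheme editor (the exact menu path varies by Xcode version, so treat this as a sketch):

```shell
# Edit Scheme > Run > Arguments > Environment Variables, then add:
NSZombieEnabled=YES    # deallocated objects become "zombies" that log when messaged
MallocStackLogging=1   # optional: record allocation backtraces for later inspection
```

With zombies on, messaging a deallocated object logs its class and the selector instead of crashing with a bare EXC_BAD_ACCESS.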
NSZombies enabled, debug information