Has anyone tried to use the SDK 4 SMS composer?
If anyone has a reference or some sample code, please post it here.
Thanks
If you want to support 3.1 devices, you need to do a few things:
In your target's build settings:
set Base SDK to iPhone Device 4.0
set iPhone OS Deployment Target to iPhone OS 3.x (the lowest OS level you want to support)
In your target's general settings, under Linked Libraries, change the "Type" next to MessageUI.framework to Weak.
Don't import <MessageUI/MFMessageComposeViewController.h> or it will crash on launch on 3.1. Just import <MessageUI/MessageUI.h>
To make sure it doesn't crash on 3.1.x, you need to test for the availability of MFMessageComposeViewController:
// Look the class up at runtime so the app still launches on 3.1.x
Class smsClass = NSClassFromString(@"MFMessageComposeViewController");
if (smsClass != nil && [MFMessageComposeViewController canSendText]) {
    MFMessageComposeViewController *controller = [[MFMessageComposeViewController alloc] init];
    controller.body = text;
    controller.recipients = [NSArray arrayWithObjects:nil]; // or pre-fill one or more phone numbers
    controller.messageComposeDelegate = self;
    [self presentModalViewController:controller animated:YES];
    [controller release];
}
If you've got the 4.0 SDK already, check MFMessageComposeViewController. The usage is similar to MFMailComposeViewController.
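Since the snippet above sets messageComposeDelegate = self, the presenting view controller should also implement the MFMessageComposeViewControllerDelegate callback so the composer gets dismissed. A minimal sketch (the result handling here is only illustrative):
- (void)messageComposeViewController:(MFMessageComposeViewController *)controller didFinishWithResult:(MessageComposeResult)result {
    if (result == MessageComposeResultFailed) {
        NSLog(@"Failed to send the SMS");
    }
    // Dismiss the composer whether the message was sent or cancelled
    [self dismissModalViewControllerAnimated:YES];
}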
I have implemented CallKit for audio and video calls with VoIP PushKit on iOS. It works fine on iOS 12 and earlier, and it also works fine under normal conditions on iOS 13 and 13.1.
But it fails in two scenarios:
1) Our app is in the foreground. When a cellular call is in progress and a VoIP push is received, the CallKit incoming call screen shows for 5-10 seconds, and then both the cellular and VoIP calls fail with the alert "Call Failed".
2) Our app is in the background or killed. When a cellular call is in progress and a VoIP push is received, both the cellular and VoIP calls fail with the alert "Call Failed". No incoming call UI is shown this time.
Here is my code:
- (void)registerAppForVOIPPush {
    PKPushRegistry *pushRegistry = [[PKPushRegistry alloc] initWithQueue:dispatch_get_main_queue()];
    pushRegistry.delegate = self;
    pushRegistry.desiredPushTypes = [NSSet setWithObject:PKPushTypeVoIP];
}
Then the push delegates:
#pragma mark - PKPushRegistryDelegate
- (void)pushRegistry:(PKPushRegistry *)registry didUpdatePushCredentials:(PKPushCredentials *)credentials forType:(NSString *)type {
    NSString *newToken = [self hexadecimalStringFromData:credentials.token];
    // Send this token to the server so it can target VoIP pushes at this particular device
    NSLog(@"VOIP token ::: %@", newToken);
    _voipToken = newToken;
}
- (void)pushRegistry:(PKPushRegistry *)registry didReceiveIncomingPushWithPayload:(PKPushPayload *)payload forType:(PKPushType)type {
    // available(iOS, introduced: 8.0, deprecated: 11.0)
    [self pushRegistryDidReceivedPushWithPayload:payload forType:type withCompletionHandler:NULL];
}
- (void)pushRegistry:(PKPushRegistry *)registry didReceiveIncomingPushWithPayload:(PKPushPayload *)payload forType:(PKPushType)type withCompletionHandler:(void (^)(void))completion {
    // available(iOS 11.0, *)
    [self pushRegistryDidReceivedPushWithPayload:payload forType:type withCompletionHandler:completion];
}
- (void)pushRegistryDidReceivedPushWithPayload:(PKPushPayload *)payload forType:(PKPushType)type withCompletionHandler:(void (^)(void))completion {
    // CallKit configuration
    CXProviderConfiguration *providerConfig = [[CXProviderConfiguration alloc] initWithLocalizedName:@"my app Call"];
    providerConfig.supportsVideo = NO;
    providerConfig.maximumCallGroups = 1;
    providerConfig.maximumCallsPerCallGroup = 1;
    providerConfig.supportedHandleTypes = [[NSSet alloc] initWithObjects:[NSNumber numberWithInteger:CXHandleTypeGeneric], nil];
    providerConfig.iconTemplateImageData = UIImagePNGRepresentation([UIImage imageNamed:@"IconMask"]);
    CXProvider *provider = [[CXProvider alloc] initWithConfiguration:providerConfig];
    [provider setDelegate:self queue:nil];
    // Generate a call UUID
    NSUUID *callbackUUIDToken = [NSUUID UUID];
    // Display the CallKit incoming call UI
    NSString *uniqueIdentifier = @"Max test";
    CXCallUpdate *update = [[CXCallUpdate alloc] init];
    update.remoteHandle = [[CXHandle alloc] initWithType:CXHandleTypeGeneric value:uniqueIdentifier];
    update.supportsGrouping = FALSE;
    update.supportsUngrouping = FALSE;
    update.supportsHolding = FALSE;
    update.localizedCallerName = uniqueIdentifier;
    update.hasVideo = NO;
    [provider reportNewIncomingCallWithUUID:callbackUUIDToken update:update completion:^(NSError * _Nullable error) {
        NSLog(@"reportNewIncomingCallWithUUID error: %@", error);
    }];
    if (completion) {
        dispatch_async(dispatch_get_main_queue(), ^{
            completion();
        });
    }
}
I have implemented the CXProvider delegate methods:
- (void)provider:(CXProvider *)provider performAnswerCallAction:(CXAnswerCallAction *)action {
    [action fulfill];
}
- (void)provider:(CXProvider *)provider performEndCallAction:(CXEndCallAction *)action {
    [action fulfill];
}
I have also implemented the other delegate methods needed to manage the call, and they work correctly in all conditions.
I have checked these two scenarios with other apps like Google Duo, WhatsApp, and FaceTime, and they show CallKit properly without failing, but my app fails. I have no clue where it is going wrong.
So, I have these two issues on iOS 13 and later. Any help will be appreciated.
Thanks.
This is probably an iOS 13 bug and, if you haven't already done it, you should report it to Apple.
I think the reason apps like WhatsApp (and the one I develop) are still working is that we build against the iOS 12 SDK. We do this because of the limitations on VoIP push notifications introduced in iOS 13. So, you can try to work around the issue, at least until April 2020, by building against the iOS 12 SDK. Hopefully, Apple will fix this issue soon.
@Max I have faced the same issue that you describe on iOS 13.0 through 13.2.0.
Many developers have reported this issue to Apple, and the latest iOS release from last week (iOS 13.2.2) resolves this bug. So, instead of building against an older SDK, you can now work with the latest SDK and Xcode 11.2.1.
I want to scan a barcode or QR code image that is in my photo gallery on iOS 7. On iOS 7 we can use the camera to scan a barcode, but I did not find any way to select a barcode image from UIImagePickerController and scan it. Are there any methods available in the AVFoundation framework?
Help me..
I have the same issue. Most of the once perfectly running 32-bit barcode SDKs are broken on 7.1 due to the requirement to support the arm64 architecture. ZBar is affected, ZXing has pulled out of the iOS platform entirely, and what is left are commercial packages. I tried one of them, called Manatee; it works, but it truncates the first character of the barcode from the output. At the moment your best bet is one of the commercial SDKs that work with iOS 7.1, or going back to 7.0 or 6.1 and using ZBar.
The AVFoundation solution put forward by @Stark works well with camera capture (I've tested it with some modifications to recognise PDF417, Aztec codes, and six or so 1D barcodes). However, the code in the sample app cannot process existing images from the media library. I searched intensively, and the nearest bet is Core Image detection, which does facial recognition on images; unfortunately there is no barcode detection option yet.
There are many APIs available for barcode scanning:
Softek Barcode Reader SDK
ZBar bar code reader
shopsavvy
red laser
ZXing
If you want to use only the AVFoundation framework, here is a link to a tutorial:
http://www.appcoda.com/qr-code-ios-programming-tutorial/
Here is the code which starts reading the barcode:
- (BOOL)startReading {
    NSError *error;
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:&error];
    if (!input) {
        NSLog(@"%@", [error localizedDescription]);
        return NO;
    }
    _captureSession = [[AVCaptureSession alloc] init];
    [_captureSession addInput:input];
    AVCaptureMetadataOutput *captureMetadataOutput = [[AVCaptureMetadataOutput alloc] init];
    [_captureSession addOutput:captureMetadataOutput];
    dispatch_queue_t dispatchQueue;
    dispatchQueue = dispatch_queue_create("myQueue", NULL);
    [captureMetadataOutput setMetadataObjectsDelegate:self queue:dispatchQueue];
    [captureMetadataOutput setMetadataObjectTypes:[NSArray arrayWithObject:AVMetadataObjectTypeQRCode]];
    _videoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
    [_videoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    [_videoPreviewLayer setFrame:_viewPreview.layer.bounds];
    [_viewPreview.layer addSublayer:_videoPreviewLayer];
    [_captureSession startRunning];
    return YES;
}
and to stop it:
- (void)stopReading {
    [_captureSession stopRunning];
    _captureSession = nil;
    [_videoPreviewLayer removeFromSuperlayer];
}
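Since startReading sets self as the metadata objects delegate, you also need the AVCaptureMetadataOutputObjectsDelegate callback to actually receive the decoded string. A rough sketch, assuming you want to stop the session after the first hit:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection {
    if ([metadataObjects count] == 0) {
        return;
    }
    AVMetadataMachineReadableCodeObject *code = [metadataObjects objectAtIndex:0];
    if ([code.type isEqualToString:AVMetadataObjectTypeQRCode]) {
        NSString *value = code.stringValue;
        // The delegate is called on the dispatch queue created in startReading,
        // so hop back to the main queue before touching the UI or the session.
        dispatch_async(dispatch_get_main_queue(), ^{
            NSLog(@"Scanned code: %@", value);
            [self stopReading];
        });
    }
}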
I'm very new to Objective-C, and am trying to update some code that's about three years old to work with iOS 7. There are two instances of AudioSessionSetProperty and AudioSessionInitialize appearing in the code:
1:
- (void)applicationDidFinishLaunching:(UIApplication *)application {
    AudioSessionInitialize(NULL, NULL, NULL, NULL);
    [[SCListener sharedListener] listen];
    timer = [NSTimer scheduledTimerWithTimeInterval:0.5 target:self selector:@selector(tick:) userInfo:nil repeats:YES];
    // Override point for customization after app launch
    [window addSubview:viewController.view];
    [window makeKeyAndVisible];
}
And 2:
- (id)init {
    if ([super init] == nil) {
        return nil;
    }
    AudioSessionInitialize(NULL, NULL, NULL, NULL);
    Float64 rate = kSAMPLERATE;
    UInt32 size = sizeof(rate);
    AudioSessionSetProperty(kAudioSessionProperty_PreferredHardwareSampleRate, size, &rate);
    return self;
}
For some reason this code works on iOS 7 in the simulator but not on a device running iOS 7, and I suspect these deprecations are the cause. I've been reading through the docs and related questions on this site, and it appears that I need to use AVAudioSession instead. I've been trying to update the code for a long time now, and I'm unsure of how to properly switch over to AVAudioSession. Does anyone know how these two methods above should look?
Side note: I've managed to hunt down an article that outlines the transition:
https://github.com/software-mariodiana/AudioBufferPlayer/wiki/Replacing-C-functions-deprecated-in-iOS-7
But I can't seem to apply this to the code above.
The code I'm trying to update is a small frequency-detection app from GitHub:
https://github.com/jkells/sc_listener
Alternatively, if someone could point me to a sample demo app that can detect frequencies on iOS devices, that would be awesome.
As you have observed, pretty much all of the old Core Audio AudioSession functions have been deprecated in favour of AVAudioSession.
AVAudioSession is a singleton object which is initialised the first time you call it:
[AVAudioSession sharedInstance]
There is no separate initialize method. But you will want to activate the audio session:
BOOL activated = [[AVAudioSession sharedInstance] setActive:YES error:&error];
As regards setting the hardware sample rate using AVAudioSession, please refer to my answer here:
How can I obtain the native (hardware-supported) audio sampling rates in order to avoid internal sample rate conversion?
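As a rough sketch of what the init method from the question might look like after the migration (kSAMPLERATE is the constant from the question; the error handling is just an assumption):
#import <AVFoundation/AVFoundation.h>

- (id)init {
    if ([super init] == nil) {
        return nil;
    }
    // AVAudioSession replaces AudioSessionInitialize and AudioSessionSetProperty
    NSError *error = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    // Replaces kAudioSessionProperty_PreferredHardwareSampleRate
    [session setPreferredSampleRate:kSAMPLERATE error:&error];
    [session setActive:YES error:&error];
    if (error) {
        NSLog(@"Audio session setup failed: %@", error);
    }
    return self;
}
In applicationDidFinishLaunching: the AudioSessionInitialize(NULL, NULL, NULL, NULL) call can simply be removed, since the shared session is created lazily on first use.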
For other comparisons between the Core Audio audio session functions and AVFoundation's AVAudioSession, here are some of my other answers on the same topic:
How Do I Route Audio to Speaker without using AudioSessionSetProperty?
use rear microphone of iphone 5
Play audio through upper (phone call) speaker
How to control hardware mic input gain/level on iPhone?
I wrote a short tutorial that discusses how to update to the new AVAudioSession objects. I posted it on GitHub: "Replacing C functions deprecated in iOS 7."
Under iOS 7, is the primary ANCS Service meant to be constantly advertised, or does it need to be enabled in obscure settings / implemented using a custom CBPeripheralManager (using the Apple-specified Service and Characteristic UUIDs) for a potential Notification Consumer to successfully discover it and subscribe?
The Apple documentation (both the Core Bluetooth Programming Guide and the ANCS Specification) is surprisingly bereft of any information on this. It seems to hint at requiring a custom implementation, but this is just conjecture on our part.
Given the primary ANCS Service UUID: 7905F431-B5CE-4E99-A40F-4B1E122D00D0, performing a scan yields no hits. Scanning the entire BLE spectrum, as expected, yields hits for other BLE devices, but not a single ANCS device.
EDIT 1:
Defining a custom CBPeripheralManager and manually adding the Apple-specified ANCS Service with its associated Characteristics fails, with the NSError: Error Domain=CBErrorDomain Code=8 "The specified UUID is not allowed for this operation."
Consequently, it appears that the Service UUID is reserved by Apple (as it should be), and we cannot enable it in this manner.
Any insight is greatly appreciated; we've reached out to Apple about this, and will update when we hear from them.
The code below reproduces the NSError mentioned above:
// define the ANCS Characteristics
CBUUID *notificationSourceUUID = [CBUUID UUIDWithString:@"9FBF120D-6301-42D9-8C58-25E699A21DBD"];
CBMutableCharacteristic *notificationSource = [[CBMutableCharacteristic alloc] initWithType:notificationSourceUUID properties:CBCharacteristicPropertyNotifyEncryptionRequired value:nil permissions:CBAttributePermissionsReadEncryptionRequired];
CBUUID *controlPointUUID = [CBUUID UUIDWithString:@"69D1D8F3-45E1-49A8-9821-9BBDFDAAD9D9"];
CBMutableCharacteristic *controlPoint = [[CBMutableCharacteristic alloc] initWithType:controlPointUUID properties:CBCharacteristicPropertyWrite value:nil permissions:CBAttributePermissionsWriteEncryptionRequired];
CBUUID *dataSourceUUID = [CBUUID UUIDWithString:@"22EAC6E9-24D6-4BB5-BE44-B36ACE7C7BFB"];
CBMutableCharacteristic *dataSource = [[CBMutableCharacteristic alloc] initWithType:dataSourceUUID properties:CBCharacteristicPropertyNotifyEncryptionRequired value:nil permissions:CBAttributePermissionsReadEncryptionRequired];
// define the ANCS Service
CBUUID *ANCSUUID = [CBUUID UUIDWithString:@"7905F431-B5CE-4E99-A40F-4B1E122D00D0"];
CBMutableService *ANCS = [[CBMutableService alloc] initWithType:ANCSUUID primary:YES];
ANCS.characteristics = @[notificationSource, controlPoint, dataSource];
// define the Advertisement data
NSMutableDictionary *advertisementData = [NSMutableDictionary dictionary];
[advertisementData setValue:@"CUSTOM_ANCS" forKey:CBAdvertisementDataLocalNameKey];
[advertisementData setValue:@"7905F431-B5CE-4E99-A40F-4B1E122D00D0" forKey:CBAdvertisementDataServiceUUIDsKey];
// publish the ANCS service
[self.peripheralManager addService:ANCS];
As a belated answer to this question, now that Mavericks is out, here is what we've come up with.
Our initial efforts to implement the ANCS specification between two iOS devices, one as Peripheral one as Central, were unsuccessful. Apple responded to us after some time (hat tip to their evangelists) and told us this was impossible.
With the addition of the CBPeripheralManager class and CBPeripheralManagerDelegate protocol to the CoreBluetooth.framework embedded in the IOBluetooth.framework on OSX Mavericks (deep breath), we can now use the BLE radio on an OSX device to implement and advertise ANCS.
Thus, this snippet belongs to a CBPeripheralManager on OSX:
- (void)advertiseANCS
{
    NSLog(@"%s", __FUNCTION__);
    // define the ANCS Characteristics
    CBUUID *notificationSourceUUID = [CBUUID UUIDWithString:@"9FBF120D-6301-42D9-8C58-25E699A21DBD"];
    CBMutableCharacteristic *notificationSource = [[CBMutableCharacteristic alloc] initWithType:notificationSourceUUID properties:CBCharacteristicPropertyNotifyEncryptionRequired value:nil permissions:CBAttributePermissionsReadEncryptionRequired];
    CBUUID *controlPointUUID = [CBUUID UUIDWithString:@"69D1D8F3-45E1-49A8-9821-9BBDFDAAD9D9"];
    CBMutableCharacteristic *controlPoint = [[CBMutableCharacteristic alloc] initWithType:controlPointUUID properties:CBCharacteristicPropertyWrite value:nil permissions:CBAttributePermissionsWriteEncryptionRequired];
    CBUUID *dataSourceUUID = [CBUUID UUIDWithString:@"22EAC6E9-24D6-4BB5-BE44-B36ACE7C7BFB"];
    CBMutableCharacteristic *dataSource = [[CBMutableCharacteristic alloc] initWithType:dataSourceUUID properties:CBCharacteristicPropertyNotifyEncryptionRequired value:nil permissions:CBAttributePermissionsReadEncryptionRequired];
    // define the ANCS Service
    CBUUID *ANCSUUID = [CBUUID UUIDWithString:@"7905F431-B5CE-4E99-A40F-4B1E122D00D0"];
    CBMutableService *ANCS = [[CBMutableService alloc] initWithType:ANCSUUID primary:YES];
    ANCS.characteristics = @[notificationSource, controlPoint, dataSource];
    // define the Advertisement data
    NSMutableDictionary *advertisementData = [NSMutableDictionary dictionary];
    [advertisementData setValue:@"ANCS" forKey:CBAdvertisementDataLocalNameKey];
    [advertisementData setValue:@[ANCSUUID] forKey:CBAdvertisementDataServiceUUIDsKey];
    // publish the ANCS service
    [self.peripheralManager addService:ANCS];
    [self.peripheralManager startAdvertising:advertisementData];
}
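One detail not shown above: addService: and startAdvertising: should only be called once the peripheral manager reports that Bluetooth is powered on. A sketch of the delegate method that could gate advertiseANCS (assuming self is the CBPeripheralManagerDelegate):
- (void)peripheralManagerDidUpdateState:(CBPeripheralManager *)peripheral
{
    // Only start publishing the service once the radio is ready
    if (peripheral.state == CBPeripheralManagerStatePoweredOn) {
        [self advertiseANCS];
    }
}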
Whereas this snippet belongs on a CBCentralManager on an iOS device:
- (void)discoverANCS
{
    NSLog(@"%s", __FUNCTION__);
    NSMutableArray *services = [NSMutableArray array];
    // scanForPeripheralsWithServices: expects CBUUID objects, not strings
    [services addObject:[CBUUID UUIDWithString:@"7905F431-B5CE-4E99-A40F-4B1E122D00D0"]];
    NSMutableDictionary *options = [NSMutableDictionary dictionary];
    [options setValue:[NSNumber numberWithBool:NO] forKey:CBCentralManagerScanOptionAllowDuplicatesKey];
    [self.centralManager scanForPeripheralsWithServices:services options:options];
}
The iOS device can now see and connect to the OSX radio, which implements the ANCS specification as detailed in the Apple documentation.
<CBCentralManager: 0x14e23280> <CBPeripheral: 0x14d27b40 identifier = 7231B80F-874E-DB5F-2AF9-7F376911E2B7, Name = "ANCS", state = disconnected> {
kCBAdvDataChannel = 39;
kCBAdvDataIsConnectable = 1;
kCBAdvDataLocalName = ANCS;
} -60
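For reference, that log line comes from the central's discovery callback. A minimal sketch of the delegate method that would print it (connecting right away is just one option, and you need to hold a strong reference to the peripheral for the connection to survive):
- (void)centralManager:(CBCentralManager *)central didDiscoverPeripheral:(CBPeripheral *)peripheral advertisementData:(NSDictionary *)advertisementData RSSI:(NSNumber *)RSSI
{
    NSLog(@"%@ %@ %@ %@", central, peripheral, advertisementData, RSSI);
    self.discoveredPeripheral = peripheral; // hypothetical strong property to retain the peripheral
    [self.centralManager connectPeripheral:peripheral options:nil];
}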
Happy hunting
Well, the reason is that you are setting the UUID in the advertisement data dictionary as a string and not as a CBUUID; also, that key takes an array of CBUUIDs.
Therefore, this should make it work:
NSDictionary *advertisementData = @{
    CBAdvertisementDataServiceUUIDsKey : @[[CBUUID UUIDWithString:@"7905F431-B5CE-4E99-A40F-4B1E122D00D0"]],
    CBAdvertisementDataLocalNameKey : @"ANCS",
};
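You would then pass this dictionary to the peripheral manager when you start advertising, as in the OS X snippet above:
[self.peripheralManager startAdvertising:advertisementData];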
EDIT: Oh yeah, my bad! I forgot to mention that if you are trying to discover this ANCS service from another iOS device, you won't be able to see it, not under iOS 7. Somehow the OS reserves that service for itself, and it won't show up in your didDiscoverServices callback even though you might see it in the advertisement data. It will, however, work if you have an external device, like a non-iOS device or a Pebble-like device. This is how you expose the ANCS functionality, but the rest of the implementation is up to the consumer of the service.
According to this blog post:
http://blog.punchthrough.com/post/63658238857/the-apple-notification-center-service-or-wtf-is
You can advertise with 'service solicitation' to pair and access the ANCS without writing any code on the iPhone!
I have not tried it, but will soon.
For anyone currently Googling a similar question: ANCS via CoreBluetooth appears to no longer work in iOS 9. Specifically,
This functionality was removed from OS X and iOS.
Neither platform can be used to consume the ANCS service anymore using CoreBluetooth.
Source: https://forums.developer.apple.com/thread/24336
ANCS is not advertised on iOS. I used the following approach to achieve a long-lived connection with ANCS:
My peripheral device uses a dummy service, which is advertised. An iOS application is used to discover a device with this service and to create a connection. You can write your own application, or use one of the freely available ones (LightBlue, for example).
Once a connection is established, the peripheral device enumerates all services present on the connected iOS device. Among others, there are the three mentioned in the ANCS documentation.
If you register for notifications on them, you will get ANCS data (a subscription sketch follows after these steps).
If you bond the devices (iOS and peripheral), iOS will automatically (re)establish the connection any time it finds the bonded device advertising.
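If the consumer side happens to be written against CoreBluetooth (on most accessories the equivalent would be done in the vendor's BLE stack), the "register notifications" step looks roughly like this sketch, using the ANCS characteristic UUIDs from the spec:
- (void)peripheral:(CBPeripheral *)peripheral didDiscoverCharacteristicsForService:(CBService *)service error:(NSError *)error
{
    CBUUID *notificationSourceUUID = [CBUUID UUIDWithString:@"9FBF120D-6301-42D9-8C58-25E699A21DBD"];
    CBUUID *dataSourceUUID = [CBUUID UUIDWithString:@"22EAC6E9-24D6-4BB5-BE44-B36ACE7C7BFB"];
    for (CBCharacteristic *characteristic in service.characteristics) {
        if ([characteristic.UUID isEqual:notificationSourceUUID] || [characteristic.UUID isEqual:dataSourceUUID]) {
            // Subscribing triggers pairing/bonding, since these characteristics require encryption
            [peripheral setNotifyValue:YES forCharacteristic:characteristic];
        }
    }
}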
I use ANCS without any code on the iPhone, using "pebble-like" prototype hardware and the methods documented above in this question.
The video below is from a kind of joke show and meant as a joke (as is the concept of "area-wide notifications" :-) ), and it is not at all technical, but it might be somewhat informative.
http://www.youtube.com/watch?v=O-YWMl7IS-g
I don't make money from my YouTube channel, BTW.
I have not been successful with iOS-to-iOS attempts.
I've been following this great tutorial on how to integrate Twitter into your app. I know there are other ways programmers integrated Twitter before iOS 5, but my question is this:
My app supports iOS 3.0+, so if I integrate Twitter using just the iOS 5 way of doing it, how will this affect users who aren't on iOS 5? Will it even work for them?
Thanks!
If you are OK with making Twitter available only to iOS 5 users, you can check whether Twitter is available like this:
// Don't forget to import Twitter!
#import <Twitter/Twitter.h>
....
if ([TWTweetComposeViewController class] != nil) {
    // your code here
}
Also, make sure that when adding the Twitter framework you set it as optional.
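For completeness, here is a rough sketch of what the "// your code here" part might look like (the initial text is just a placeholder):
if ([TWTweetComposeViewController class] != nil) {
    TWTweetComposeViewController *tweetSheet = [[TWTweetComposeViewController alloc] init];
    [tweetSheet setInitialText:@"Testing the iOS 5 Twitter framework"]; // placeholder text
    tweetSheet.completionHandler = ^(TWTweetComposeViewControllerResult result) {
        [self dismissModalViewControllerAnimated:YES];
    };
    [self presentModalViewController:tweetSheet animated:YES];
}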
Unfortunately, the official Twitter framework won't work, as the Twitter integration is only available on iOS 5.
A good solution is to use ShareKit, a free library that lets you add Twitter, Facebook, and other social network support.
You should look into DETweetComposeViewController. We built it just for this purpose. It is an iOS 4-compatible re-implementation of TWTweetComposeViewController.
Use weak linking and some code like the following:
- (void)tweet
{
    Class tweeterClass = NSClassFromString(@"TWTweetComposeViewController");
    if (tweeterClass != nil) {
        if ([TWTweetComposeViewController canSendTweet]) {
            TWTweetComposeViewController *tweetViewController = [[TWTweetComposeViewController alloc] init];
            tweetViewController.completionHandler = ^(TWTweetComposeViewControllerResult result) {
                if (result == TWTweetComposeViewControllerResultDone) {
                    // the tweet was sent
                }
                [self dismissViewControllerAnimated:YES completion:nil];
            };
            [self presentViewController:tweetViewController animated:YES completion:nil];
        } else {
#if !(TARGET_IPHONE_SIMULATOR)
            [self displayAlert:@"You can't send a tweet right now, make sure your device has an internet connection and you have at least one Twitter account setup."];
#else
            NSString *tweetString = [NSString stringWithFormat:@"http://mobile.twitter.com/home?status=%@%@", [self urlEncode:@"Check out this awesome pic: "], [self urlEncode:[_blobTweet.shortUrl absoluteString]]];
            NSURL *tweetURL = [NSURL URLWithString:tweetString];
            if ([[UIApplication sharedApplication] canOpenURL:tweetURL]) {
                [[UIApplication sharedApplication] openURL:tweetURL];
            }
#endif
        }
    } else {
        // no Twitter integration; could fall back to a third-party Twitter framework
    }
}
@end