Is it possible to send a text message from my Mac programmatically? - objective-c

Since Apple has shut down their developer portal pending security upgrades, they've created a status page so we can monitor which features they've brought back online. I've written a simple program to monitor this status page for changes.
My Mac is set up to receive iMessages sent to my iPhone. Does anyone know if it's possible to have the program I've written send an iMessage to my iPhone when there's been a change to the status page Apple has up?
I usually develop for iPhone, so I appreciate any insight people can offer. The simple program I've written below checks every fifteen minutes whether there's been an update and brings the status page up in Safari if there has been.
#import "AppDelegate.h"
#implementation AppDelegate
- (void)applicationDidFinishLaunching:(NSNotification *)aNotification
{
// Insert code here to initialize your application
NSDateFormatter *format = [NSDateFormatter new];
[format setTimeZone:[NSTimeZone timeZoneForSecondsFromGMT:8*3600]];
[format setDateFormat:#"YYYY-MM-DD HH:mm:ss"];
NSString *text = #"";
bool no_connection;
do {
text = [[NSString alloc]initWithContentsOfURL:[NSURL URLWithString:#"https://developer.apple.com/support/system-status/"] encoding:NSASCIIStringEncoding error:NULL]; // pulls the source html from Apple's update page
NSString *status = #"No change";
if ([text rangeOfString:[self currentStatus]].location == NSNotFound) { // if cannot find the old text, then there has been a change
status = #"Update!";
}
no_connection = text == nil || [text length] == 0; // if no text or nil text then connection issue
if (no_connection) {
status = #"error making connection";
}
NSLog(#"status: %#",status); // report on status
if (no_connection) { // if no connection then try again in a minute
sleep(60);
continue;
}
sleep(900); // wait 15 minutes (60 x 15 = 900) and check again
} while ([text rangeOfString:[self currentStatus]].location != NSNotFound); // continue checking until there has been a change
NSURL *url = [NSURL URLWithString:#"https://developer.apple.com/support/system-status/"]; // bring up the update page in the browser
if( ![[NSWorkspace sharedWorkspace] openURL:url] )
NSLog(#"Failed to open url: %#",[url description]);
}
-(NSString*)currentStatus { /* returns the specific text in the html source that I'm checking for a change
"<span>" will be replaced with a hyperlink tag */
return #"<span>Certificates, Identifiers & Profiles";
}
#end

I tried to do this once and never came up with an Objective-C solution. However, you could have your app run AppleScript. Here's how you'd do it from the command line:
osascript -e 'tell application "Messages" to send "Test Message" to buddy "MyBuddy"'
And the answer here describes how to run AppleScript from Objective C:
Run AppleScript from Cocoa Application
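If you'd rather keep it in-process, here's a minimal sketch using NSAppleScript to run the same script from Objective-C. The buddy name and message text are placeholders, so treat it as a starting point rather than a drop-in solution:
#import <Foundation/Foundation.h>

// Build the same one-line AppleScript shown above and run it in-process.
// "MyBuddy" and the message text are placeholders; error handling is minimal.
void sendIMessage(NSString *buddy, NSString *message) {
    NSString *source = [NSString stringWithFormat:
        @"tell application \"Messages\" to send \"%@\" to buddy \"%@\"", message, buddy];
    NSAppleScript *script = [[NSAppleScript alloc] initWithSource:source];
    NSDictionary *errorInfo = nil;
    if (![script executeAndReturnError:&errorInfo]) {
        NSLog(@"AppleScript error: %@", errorInfo);
    }
}
You could then call something like sendIMessage(@"MyBuddy", @"Update!") from the status-checking loop when a change is detected.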

Related

How to launch Apps, from my App, with a custom parameter so I can check whether the app was launched by me?

I'm working on this app that launches other apps. I'm listening to app launches using:
[[[NSWorkspace sharedWorkspace] notificationCenter] addObserver:self
selector:@selector(appLaunched:) name:NSWorkspaceDidLaunchApplicationNotification
object:nil];
And I launch them using (Mail is just an example):
NSDictionary *dict = [NSDictionary dictionaryWithObjectsAndKeys:[NSArray arrayWithObject:@"lalalala"], NSWorkspaceLaunchConfigurationArguments, nil];
[[NSWorkspace sharedWorkspace] launchApplicationAtURL:[NSURL URLWithString:@"/Applications/Mail.app"] options:NSWorkspaceLaunchWithoutActivation configuration:dict error:nil];
I did some research, and I saw that you can send an argument when you launch an app (that's why I used the var dict in the code above), but I'm having an issue with this: even using NSWorkspaceLaunchWithoutActivation, the Mail.app is launched and becomes focused with a new composing window. I don't know why it's doing that.
Another thing, if I manage to successfully send a custom argument without focusing the app, how can I check if the app was launched by me (check if the argument is there)?
PS: I'm looking for App Store-ready methods.
Send the UTC timestamp together with the name of the app you launched to your server, or to a local file if possible.
Then you can track it.
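A rough sketch of what that could look like, appending to a local log file (the file location and format here are just placeholders):
// Append "<UTC timestamp> <app name>\n" to a log file; path and format are examples only.
void logLaunch(NSString *appName) {
    NSDateFormatter *fmt = [[NSDateFormatter alloc] init];
    fmt.timeZone = [NSTimeZone timeZoneWithAbbreviation:@"UTC"];
    fmt.dateFormat = @"yyyy-MM-dd HH:mm:ss";
    NSString *line = [NSString stringWithFormat:@"%@ %@\n", [fmt stringFromDate:[NSDate date]], appName];
    NSString *path = [NSHomeDirectory() stringByAppendingPathComponent:@"launches.log"];
    if (![[NSFileManager defaultManager] fileExistsAtPath:path]) {
        [[NSFileManager defaultManager] createFileAtPath:path contents:nil attributes:nil];
    }
    NSFileHandle *handle = [NSFileHandle fileHandleForWritingAtPath:path];
    [handle seekToEndOfFile];
    [handle writeData:[line dataUsingEncoding:NSUTF8StringEncoding]];
    [handle closeFile];
}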
First, I'd try NSWorkspaceLaunchAndHide if NSWorkspaceLaunchWithoutActivation isn't working. Not ideal, but it's a workable kludge.
Second, here's a full, running example that does the trick:
#import <Cocoa/Cocoa.h>
NSString *psAUX(NSString*grep) {
FILE *read_f; char buff[BUFSIZ+1]; int char_rd; NSString *res, *cmnd;
memset(buff, '\0', sizeof(buff));
cmnd = [NSString stringWithFormat:@"/bin/ps aux|grep -i %@",grep];
read_f = popen(cmnd.UTF8String, "r");
if (read_f == NULL) return nil;
char_rd = fread(buff, sizeof(char), BUFSIZ, read_f);
if (!char_rd) return nil;
return res = [NSString stringWithUTF8String:buff], pclose(read_f), res;
}
int main(int argc, char *argv[]) { @autoreleasepool {
NSString* secretStr; NSURL *mailURL; NSDictionary *cfg; NSWorkspace *ws; NSApplication.sharedApplication;
secretStr = @"TAMPAX";
mailURL = [NSURL URLWithString:@"file:///Applications/Mail.app"];
cfg = @{NSWorkspaceLaunchConfigurationArguments:@[secretStr]};
ws = NSWorkspace.sharedWorkspace;
[ws launchApplicationAtURL:mailURL options:0 configuration:cfg error:nil];
fprintf(stderr,"%s",
[psAUX(@"Mail.app") containsString:secretStr]
? "You ARE Mail's baby's daddy!"
: "Hands off, she's NOT yours!");
[NSApp run]; } }
NSLog -> You ARE Mail's baby's daddy!
Congratulations!
You can create a new task using NSTask. With NSTask you can pass arguments, as well as environment variables, to the app, so you can check whether it was launched by you or by someone else.
Here is a sample code snippet:
NSTask* taskApp = [[NSTask alloc] init];
[taskApp setLaunchPath:@"App path goes here"];
[taskApp setArguments:[NSArray arrayWithObjects:@"Arg1", @"arg2", nil]];
[taskApp setEnvironment: [[NSProcessInfo processInfo] environment]];
[taskApp launch];
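On the other end, the launched app can check for the marker at startup. A rough sketch (the argument and environment variable names are just examples, not a required convention):
// Call this early, e.g. in applicationDidFinishLaunching:.
BOOL launchedByMyApp(void) {
    NSArray *args = [[NSProcessInfo processInfo] arguments];
    NSDictionary *env = [[NSProcessInfo processInfo] environment];
    return [args containsObject:@"Arg1"] || [env objectForKey:@"LAUNCHED_BY_MY_APP"] != nil;
}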

Stop local notification

I have the following problem:
When the app is minimized to the background by pressing the home button, it creates a local notification that pops up every 5 minutes.
Then the app is removed from the background.
What I expect is that the pop-up only shows while the app exists and is discarded when the app is removed from the background.
My issue is that the local notification is still active and keeps showing the pop-up every 5 minutes after the app is removed from the background.
How can I stop it?
Please help me!
Thanks in advance.
Put this in your application delegate. It will remove all local notifications when the application enters the background.
- (void)applicationDidEnterBackground:(UIApplication *)application
{
[[UIApplication sharedApplication] cancelAllLocalNotifications];
}
If you don't want to cancel all notifications: I've set up a unique identifier stored in the notification's userInfo dictionary. When I want to delete one, I fast-enumerate through all scheduled notifications and pick out the correct one for deletion.
My stumbling blocks here were remembering to store the UUID I'd created for the notification and also remembering to use isEqualToString: in the fast enumeration. I guess I could also have used a specific name string instead of a unique identifier. If anyone can suggest a better method than fast enumeration, please let me know.
@interface myApp () {
NSString *storedUUIDString;
}
- (void)viewDidLoad {
// create a unique identifier - place this anywhere but don't forget it! You need it to identify the local notification later
storedUUIDString = [self createUUID]; // see method lower down
}
// Create the local notification
- (void)createLocalNotification {
UILocalNotification *localNotif = [[UILocalNotification alloc] init];
if (localNotif == nil) return;
localNotif.fireDate = [self.timerPrototype fireDate];
localNotif.timeZone = [NSTimeZone defaultTimeZone];
localNotif.alertBody = @"Hello world";
localNotif.alertAction = @"View"; // Set the action button
localNotif.soundName = UILocalNotificationDefaultSoundName;
NSDictionary *infoDict = [NSDictionary dictionaryWithObject:storedUUIDString forKey:@"UUID"];
localNotif.userInfo = infoDict;
// Schedule the notification and start the timer
[[UIApplication sharedApplication] scheduleLocalNotification:localNotif];
}
// Delete the specific local notification
- (void) deleteLocalNotification {
// Fast enumerate to pick out the local notification with the correct UUID
for (UILocalNotification *localNotification in [[UIApplication sharedApplication] scheduledLocalNotifications]) {
if ([[localNotification.userInfo valueForKey:@"UUID"] isEqualToString: storedUUIDString]) {
[[UIApplication sharedApplication] cancelLocalNotification:localNotification] ; // delete the notification from the system
}
}
}
// Create a unique identifier to allow the local notification to be identified
- (NSString *)createUUID {
CFUUIDRef theUUID = CFUUIDCreate(NULL);
CFStringRef string = CFUUIDCreateString(NULL, theUUID);
CFRelease(theUUID);
return (__bridge_transfer NSString *)string; // transfer ownership so the CFStringRef isn't leaked under ARC
}
Most of the above has probably been lifted from Stack Overflow at some time in the last 6 months. Hope this helps.

Play a paused AVAudioRecorder file

In my program I want the user to be able to:
record his voice,
pause the recording process,
listen to what he recorded
and then continue recording.
I have managed to get to the point where I can record and play the recordings with AVAudioRecorder and AVAudioPlayer. But whenever I try to record, pause recording and then play, the playing part fails with no error.
My guess is that it's not playing because the audio file hasn't been saved yet and is still in memory or something.
Is there a way I can play back a paused recording?
If there is, please tell me how.
I'm using Xcode 4.3.2.
If you want to play the recording, then yes you have to stop recording before you can load the file into the AVAudioPlayer instance.
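For the simple case, something along these lines should work (assuming recorder is your AVAudioRecorder instance):
// Stop (not pause) the recorder so the file is finalized on disk, then load and play it.
[recorder stop];
NSError *err = nil;
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:recorder.url error:&err];
if (player) {
    [player prepareToPlay];
    [player play];
} else {
    NSLog(@"Could not load recording: %@", err);
}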
If you want to be able to play back some of the recording, then add more to the recording after listening to it (or, say, record in the middle), then you're in for some trouble.
You have to create a new audio file and then combine them together.
This was my solution:
// Generate a composition of the two audio assets that will be combined into
// a single track
AVMutableComposition* composition = [AVMutableComposition composition];
AVMutableCompositionTrack* audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
preferredTrackID:kCMPersistentTrackID_Invalid];
// grab the two audio assets as AVURLAssets according to the file paths
AVURLAsset* masterAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:self.masterFile] options:nil];
AVURLAsset* activeAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:self.newRecording] options:nil];
NSError* error = nil;
// grab the portion of interest from the master asset
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, masterAsset.duration)
ofTrack:[[masterAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
atTime:kCMTimeZero
error:&error];
if (error)
{
// report the error
return;
}
// append the entirety of the active recording
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, activeAsset.duration)
ofTrack:[[activeAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
atTime:masterAsset.duration
error:&error];
if (error)
{
// report the error
return;
}
// now export the two files
// create the export session
// no need for a retain here, the session will be retained by the
// completion handler since it is referenced there
AVAssetExportSession* exportSession = [AVAssetExportSession
exportSessionWithAsset:composition
presetName:AVAssetExportPresetAppleM4A];
if (nil == exportSession)
{
// report the error
return;
}
NSString* combined = @"combined file path"; // create a new file for the combined file
// configure export session output with all our parameters
exportSession.outputURL = [NSURL fileURLWithPath:combined]; // output path
exportSession.outputFileType = AVFileTypeAppleM4A; // output file type
[exportSession exportAsynchronouslyWithCompletionHandler:^{
// export status changed, check to see if it's done, errored, waiting, etc
switch (exportSession.status)
{
case AVAssetExportSessionStatusFailed:
break;
case AVAssetExportSessionStatusCompleted:
break;
case AVAssetExportSessionStatusWaiting:
break;
default:
break;
}
NSError* error = nil;
// your code for dealing with the now combined file
}];
I can't take full credit for this work, but it was pieced together from the input of a couple of others:
AVAudioRecorder / AVAudioPlayer - append recording to file
(I can't find the other link at the moment)
We had the same requirements for our app as the OP described, and ran into the same issues (i.e., the recording has to be stopped, instead of paused, if the user wants to listen to what she has recorded up to that point). Our app (project's Github repo) uses AVQueuePlayer for playback and a method similar to kermitology's answer to concatenate the partial recordings, with some notable differences:
implemented in Swift
concatenates multiple recordings into one
no messing with tracks
The rationale behind the last item is that simple recordings with AVAudioRecorder will have one track, and the main reason for this whole workaround is to concatenate those single-track assets (see Addendum 3). So why not use AVMutableComposition's insertTimeRange method instead, which takes an AVAsset instead of an AVAssetTrack?
Relevant parts: (full code)
import UIKit
import AVFoundation
class RecordViewController: UIViewController {
/* App allows volunteers to record newspaper articles for the
blind and print-impaired, hence the name.
*/
var articleChunks = [AVURLAsset]()
func concatChunks() {
let composition = AVMutableComposition()
/* `CMTimeRange` to store total duration and know when to
insert subsequent assets.
*/
var insertAt = CMTimeRange(start: kCMTimeZero, end: kCMTimeZero)
repeat {
let asset = self.articleChunks.removeFirst()
let assetTimeRange =
CMTimeRange(start: kCMTimeZero, end: asset.duration)
do {
try composition.insertTimeRange(assetTimeRange,
of: asset,
at: insertAt.end)
} catch {
NSLog("Unable to compose asset track.")
}
let nextDuration = insertAt.duration + assetTimeRange.duration
insertAt = CMTimeRange(start: kCMTimeZero, duration: nextDuration)
} while self.articleChunks.count != 0
let exportSession =
AVAssetExportSession(
asset: composition,
presetName: AVAssetExportPresetAppleM4A)
exportSession?.outputFileType = AVFileType.m4a
exportSession?.outputURL = /* create URL for output */
// exportSession?.metadata = ...
exportSession?.exportAsynchronously {
switch exportSession?.status {
case .unknown?: break
case .waiting?: break
case .exporting?: break
case .completed?: break
case .failed?: break
case .cancelled?: break
case .none: break
}
}
/* Clean up (delete partial recordings, etc.) */
}
This diagram helped me get my head around what expects what and what inherits from where. (NSObject is implicitly the superclass wherever there is no inheritance arrow.)
Addendum 1: I had my reservations regarding the switch part instead of using KVO on AVAssetExportSessionStatus, but the docs are clear that exportAsynchronously's callback block "is invoked when writing is complete or in the event of writing failure".
Addendum 2: Just in case someone has issues with AVQueuePlayer: 'An AVPlayerItem cannot be associated with more than one instance of AVPlayer'
Addendum 3: Unless you are recording in stereo, but mobile devices have a single input as far as I know. Also, fancy audio mixing would require the use of AVCompositionTrack. A good SO thread: Proper AVAudioRecorder Settings for Recording Voice?
RecordAudioViewController.h
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <CoreAudio/CoreAudioTypes.h>
@interface record_audio_testViewController : UIViewController <AVAudioRecorderDelegate> {
IBOutlet UIButton * btnStart;
IBOutlet UIButton * btnPlay;
IBOutlet UIActivityIndicatorView * actSpinner;
BOOL toggle;
//Variables setup for access in the class:
NSURL * recordedTmpFile;
AVAudioRecorder * recorder;
NSError * error;
}
@property (nonatomic, retain) IBOutlet UIActivityIndicatorView * actSpinner;
@property (nonatomic, retain) IBOutlet UIButton * btnStart;
@property (nonatomic, retain) IBOutlet UIButton * btnPlay;
- (IBAction) start_button_pressed;
- (IBAction) play_button_pressed;
@end
RecordAudioViewController.m
@synthesize actSpinner, btnStart, btnPlay;
- (void)viewDidLoad {
[super viewDidLoad];
//Start the toggle in true mode.
toggle = YES;
btnPlay.hidden = YES;
//Instantiate an instance of the AVAudioSession object.
AVAudioSession * audioSession = [AVAudioSession sharedInstance];
//Setup the audioSession for playback and record.
//We could just use record and then switch it to playback later, but
//since we are going to do both lets set it up once.
[audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error: &error];
//Activate the session
[audioSession setActive:YES error: &error];
}
- (IBAction) start_button_pressed{
if(toggle)
{
toggle = NO;
[actSpinner startAnimating];
[btnStart setTitle:#"Stop Recording" forState: UIControlStateNormal ];
btnPlay.enabled = toggle;
btnPlay.hidden = !toggle;
//Begin the recording session.
//Error handling removed. Please add to your own code.
//Setup the dictionary object with all the recording settings that this
//Recording session will use.
//It's not clear to me which of these are required and which are the bare minimum.
//This is a good resource: http://www.totodotnet.net/tag/avaudiorecorder/
NSMutableDictionary* recordSetting = [[NSMutableDictionary alloc] init];
[recordSetting setValue :[NSNumber numberWithInt:kAudioFormatAppleIMA4] forKey:AVFormatIDKey];
[recordSetting setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
[recordSetting setValue:[NSNumber numberWithInt: 2] forKey:AVNumberOfChannelsKey];
//Now that we have our settings we are going to instantiate our recorder instance.
//Generate a temp file for use by the recording.
//This sample was one I found online and seems to be a good choice for making a tmp file that
//will not overwrite an existing one.
//I know this is a mess of collapsed things into 1 call. I can break it out if need be.
recordedTmpFile = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent: [NSString stringWithFormat: @"%.0f.%@", [NSDate timeIntervalSinceReferenceDate] * 1000.0, @"caf"]]];
NSLog(@"Using File called: %@",recordedTmpFile);
//Setup the recorder to use this file and record to it.
recorder = [[ AVAudioRecorder alloc] initWithURL:recordedTmpFile settings:recordSetting error:&error];
//Use the recorder to start the recording.
//I'm not sure why we set the delegate to self yet.
//Found this in another example, but I'm fuzzy on this still.
[recorder setDelegate:self];
//We call this to start the recording process and initialize
//the subsystems so that when we actually say "record" it starts right away.
[recorder prepareToRecord];
//Start the actual Recording
[recorder record];
//There is an optional method for doing the recording for a limited time see
//[recorder recordForDuration:(NSTimeInterval) 10]
}
else
{
toggle = YES;
[actSpinner stopAnimating];
[btnStart setTitle:#"Start Recording" forState:UIControlStateNormal ];
btnPlay.enabled = toggle;
btnPlay.hidden = !toggle;
NSLog(#"Using File called: %#",recordedTmpFile);
//Stop the recorder.
[recorder stop];
}
}
- (void)didReceiveMemoryWarning {
// Releases the view if it doesn't have a superview.
[super didReceiveMemoryWarning];
// Release any cached data, images, etc that aren't in use.
}
-(IBAction) play_button_pressed{
//The play button was pressed...
//Setup the AVAudioPlayer to play the file that we just recorded.
AVAudioPlayer * avPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:recordedTmpFile error:&error];
[avPlayer prepareToPlay];
[avPlayer play];
}
- (void)viewDidUnload {
// Release any retained subviews of the main view.
// e.g. self.myOutlet = nil;
//Clean up the temp file.
NSFileManager * fm = [NSFileManager defaultManager];
[fm removeItemAtPath:[recordedTmpFile path] error:&error];
//Release the remaining objects (never call -dealloc directly in pre-ARC code; use -release).
[recorder release];
recorder = nil;
recordedTmpFile = nil;
}
- (void)dealloc {
[super dealloc];
}
@end
RecordAudioViewController.xib
Add two buttons: one to begin recording and another to play the recording.

NSSavePanel is not saving a file after sandboxing an app

I'm having a problem saving a string file with NSSavePanel after sandboxing the app for the Mac App Store. I set com.apple.security.files.user-selected.read-write to YES and the NSOpenPanel is working as it should.
When I try to save a new file, though, it seems that everything works fine, but then there is no saved file where it should be.
This is the code I am using to save the file:
NSSavePanel *save = [NSSavePanel savePanel];
long int result = [save runModal];
if (result == NSOKButton)
{
NSString *selectedFile = [save filename];
NSString *fileName = [[NSString alloc] initWithFormat:@"%@.dat", selectedFile];
NSString *arrayCompleto = [[NSString alloc] initWithFormat:@"bla bla bla"];
[arrayCompleto writeToFile:fileName
atomically:NO
encoding:NSUTF8StringEncoding
error:nil];
}
First of all, the -[NSSavePanel filename] selector has been deprecated; use -[NSSavePanel URL] instead. Second, the way -[NSString writeToFile:atomically:encoding:error:] tells you what you're doing wrong is through its error:(NSError **) argument.
You should also handle errors for file I/O in particular, because even if your code is 100% correct, there still might be errors on the user's system (insufficient privileges, etc.) and presenting the error to the user will allow them to see it failed (and have some idea why). Handling the error in code will also allow your app to recover. For instance, if you tried to read in the file below the code you pasted (after writing it to disk), but the user tried writing it to a network share they didn't have access to, your app might crash. If you know the write failed, you can proceed accordingly (perhaps prompting for a different save location).
In this case, though, I believe the following line is your problem:
NSString *fileName = [[NSString alloc] initWithFormat:@"%@.dat", selectedFile];
When your app is sandboxed, the user needs to give you permission for either a specific file or a specific directory through the open/save panels to bring them into your sandbox. What you're doing is taking the file the user gave you permission to write and saying "that's great, but I want to save a different file", which violates the sandbox. What you should do instead is set the extension in the Save Panel. The complete fixed solution would be:
NSSavePanel *save = [NSSavePanel savePanel];
[save setAllowedFileTypes:[NSArray arrayWithObject:@"dat"]];
[save setAllowsOtherFileTypes:NO];
NSInteger result = [save runModal];
if (result == NSOKButton)
{
NSString *selectedFile = [[save URL] path];
NSString *arrayCompleto = @"bla bla bla";
NSError *error = nil;
[arrayCompleto writeToFile:selectedFile
atomically:NO
encoding:NSUTF8StringEncoding
error:&error];
if (error) {
// This is one way to handle the error, as an example
[NSApp presentError:error];
}
}
If in the future something else is wrong, you can check the value of error at runtime. While debugging, set a breakpoint inside the if (error) statement to check error object's value (do a po error in Xcode's debugger). That should help you figure out what's wrong.

Calling -[NSFileManager setUbiquitous:itemAtURL:destinationURL:error:] never returns

I have a straightforward NSDocument-based Mac OS X app in which I am trying to implement iCloud Document storage. I'm building with the 10.7 SDK.
I have provisioned my app for iCloud document storage and have included the necessary entitlements (AFAICT). The app builds, runs, and creates the local ubiquity container Documents directory correctly (this took a while, but that all seems to be working). I am using the NSFileCoordinator API as Apple recommended. I'm fairly certain I am using the correct UbiquityIdentifier as recommended by Apple (it's redacted below tho).
I have followed Apple's iCloud Document storage demo instructions in this WWDC 2011 video closely:
Session 107 AutoSave and Versions in Lion
My code looks almost identical to the code from that demo.
However, when I call my action to move the current document to the cloud, I experience liveness problems when calling the -[NSFileManager setUbiquitous:itemAtURL:destinationURL:error:] method. It never returns.
Here is the relevant code from my NSDocument subclass. It is almost identical to Apple's WWDC demo code. Since this is an action, this is called on the main thread (as Apple's demo code showed). The deadlock occurs toward the end when the -setUbiquitous:itemAtURL:destinationURL:error: method is called. I have tried moving to a background thread, but it still never returns.
It appears that a semaphore is blocking while waiting for a signal that never arrives.
When running this code in the debugger, my source and destination URLs look correct, so I'm fairly certain they are correctly calculated and I have confirmed the directories exist on disk.
Am I doing anything obviously wrong which would lead to -setUbiquitous never returning?
- (IBAction)moveToOrFromCloud:(id)sender {
NSURL *fileURL = [self fileURL];
if (!fileURL) return;
NSString *bundleID = [[[NSBundle mainBundle] infoDictionary] objectForKey:@"CFBundleIdentifier"];
NSString *appID = [NSString stringWithFormat:@"XXXXXXX.%@.macosx", bundleID];
BOOL makeUbiquitous = 1 == [sender tag];
NSURL *destURL = nil;
NSFileManager *mgr = [NSFileManager defaultManager];
if (makeUbiquitous) {
// get path to local ubiquity container Documents dir
NSURL *dirURL = [[mgr URLForUbiquityContainerIdentifier:appID] URLByAppendingPathComponent:@"Documents"];
if (!dirURL) {
NSLog(#"cannot find URLForUbiquityContainerIdentifier %#", appID);
return;
}
// create it if necessary
[mgr createDirectoryAtURL:dirURL withIntermediateDirectories:NO attributes:nil error:nil];
// ensure it exists
BOOL exists, isDir;
exists = [mgr fileExistsAtPath:[dirURL relativePath] isDirectory:&isDir];
if (!(exists && isDir)) {
NSLog(#"can't create local icloud dir");
return;
}
// append this doc's filename
destURL = [dirURL URLByAppendingPathComponent:[fileURL lastPathComponent]];
} else {
// get path to local Documents folder
NSArray *dirs = [mgr URLsForDirectory:NSDocumentDirectory inDomains:NSUserDomainMask];
if (![dirs count]) return;
// append this doc's filename
destURL = [[dirs objectAtIndex:0] URLByAppendingPathComponent:[fileURL lastPathComponent]];
}
NSFileCoordinator *fc = [[[NSFileCoordinator alloc] initWithFilePresenter:self] autorelease];
[fc coordinateWritingItemAtURL:fileURL options:NSFileCoordinatorWritingForMoving writingItemAtURL:destURL options:NSFileCoordinatorWritingForReplacing error:nil byAccessor:^(NSURL *fileURL, NSURL *destURL) {
NSError *err = nil;
if ([mgr setUbiquitous:makeUbiquitous itemAtURL:fileURL destinationURL:destURL error:&err]) {
[self setFileURL:destURL];
[self setFileModificationDate:nil];
[fc itemAtURL:fileURL didMoveToURL:destURL];
} else {
NSWindow *win = ... // get my window
[self presentError:err modalForWindow:win delegate:nil didPresentSelector:nil contextInfo:NULL];
}
}];
}
I don't know if these are the source of your problems, but here are some things I'm seeing:
-[NSFileManager URLForUbiquityContainerIdentifier:] may take a while, so you shouldn't invoke it on the main thread. see the "Locating the Ubiquity Container" section of this blog post
Doing this on the global queue means you should probably use an allocated NSFileManager and not the +defaultManager.
The block passed to the byAccessor portion of the coordinated write is not guaranteed to be called on any particular thread, so you shouldn't be manipulating NSWindows or presenting modal dialogs or anything from within that block (unless you've dispatched it back to the main queue).
I think pretty much all of the iCloud methods on NSFileManager will block until things complete. It's possible that what you're seeing is the method blocking and never returning because things aren't configured properly. I'd double and triple check your settings, maybe try to simplify the reproduction case. If it still isn't working, try filing a bug or contacting DTS.
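A minimal sketch of the first two points (passing nil picks the first container identifier from your entitlements; dispatch back to the main queue before touching any UI):
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // resolve the ubiquity container off the main thread, with an allocated NSFileManager
    NSFileManager *fm = [[NSFileManager alloc] init];
    NSURL *containerURL = [fm URLForUbiquityContainerIdentifier:nil];
    dispatch_async(dispatch_get_main_queue(), ^{
        if (!containerURL) {
            NSLog(@"iCloud unavailable or entitlements misconfigured");
            return;
        }
        // safe to update UI or kick off the coordinated move from here
    });
});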
Just shared this on Twitter with you, but I believe when using NSDocument you don't need to do any of the NSFileCoordinator stuff - just make the document ubiquitous and save.
Hmm, did you try not using a ubiquity container identifier in code? (Sorry, this is ripped out of a project, so I've pseudo-coded some of it.)
NSFileManager *fm = [NSFileManager defaultManager];
NSURL *iCloudDocumentsURL = [[fm URLForUbiquityContainerIdentifier:nil] URLByAppendingPathComponent:@"Documents"];
NSURL *iCloudFileURL = [iCloudDocumentsURL URLByAppendingPathComponent:[doc.fileURL lastPathComponent]];
ok = [fm setUbiquitous:YES itemAtURL:doc.fileURL destinationURL:iCloudFileURL error:&err];
NSLog(@"doc moved to iCloud, result: %d (%@)", ok, doc.fileURL);
And then in your entitlements file:
<key>com.apple.developer.ubiquity-container-identifiers</key>
<array>
<string>[devID].com.yourcompany.appname</string>
</array>
Other than that, your code looks almost identical to mine (which works - except I'm not using NSDocument but rolling it all myself).
If this is the first place in your code that you are accessing iCloud, look in Console.app for a message like this:
taskgated: killed yourAppID [pid 13532] because its use of the com.apple.developer.ubiquity-container-identifiers entitlement is not allowed
Any time you see this message, delete your app's container: ~/Library/Containers/<yourAppID>
There may also be other useful messages in Console.app that will help you solve this issue.
I have found that deleting the app container is the new Clean Project when working with iCloud.
OK, so I was finally able to solve the problem using Dunk's advice. I'm pretty sure the issue I was having was as follows:
Sometime after the WWDC video I was using as a guide was made, Apple completed the ubiquity APIs and removed the need to use an NSFileCoordinator object while saving from within an NSDocument subclass.
So the key was to remove both the creation of the NSFileCoordinator and the call to -[NSFileCoordinator coordinateWritingItemAtURL:options:writingItemAtURL:options:error:byAccessor:]
I also moved this work onto a background thread, although I'm fairly certain that was not absolutely required to fix the issue (although it was certainly a good idea).
I shall now submit my completed code to Google's web crawlers in hopes of assisting future intrepid Xcoders.
Here's my complete solution which works:
- (IBAction)moveToOrFromCloud:(id)sender {
NSURL *fileURL = [self fileURL];
if (!fileURL) {
NSBeep();
return;
}
BOOL makeUbiquitous = 1 == [sender tag];
if (makeUbiquitous) {
[self displayMoveToCloudDialog];
} else {
[self displayMoveFromCloudDialog];
}
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
[self doMoveToOrFromCloud:makeUbiquitous];
});
}
- (void)doMoveToOrFromCloud:(BOOL)makeUbiquitous {
NSURL *fileURL = [self fileURL];
if (!fileURL) return;
NSURL *destURL = nil;
NSFileManager *mgr = [[[NSFileManager alloc] init] autorelease];
if (makeUbiquitous) {
NSURL *dirURL = [[MyDocumentController instance] ubiquitousDocumentsDirURL];
if (!dirURL) return;
destURL = [dirURL URLByAppendingPathComponent:[fileURL lastPathComponent]];
} else {
// move to local Documents folder
NSArray *dirs = [mgr URLsForDirectory:NSDocumentDirectory inDomains:NSUserDomainMask];
if (![dirs count]) return;
destURL = [[dirs firstObject] URLByAppendingPathComponent:[fileURL lastPathComponent]];
}
NSError *err = nil;
void (^completion)(void) = nil;
if ([mgr setUbiquitous:makeUbiquitous itemAtURL:fileURL destinationURL:destURL error:&err]) {
[self setFileURL:destURL];
[self setFileModificationDate:nil];
completion = ^{
[self hideMoveToFromCloudDialog];
};
} else {
completion = ^{
[self hideMoveToFromCloudDialog];
NSWindow *win = [[self canvasWindowController] window];
[self presentError:err modalForWindow:win delegate:nil didPresentSelector:nil contextInfo:NULL];
};
}
dispatch_async(dispatch_get_main_queue(), completion);
}