I'm trying to iterate through a user's photo library on OS X. On iOS, I would use the Photos framework, but apparently that's not available on OS X, and we're supposed to use the Media Library framework instead. However, while I was able to use the code linked above to get access to a MLMediaSource object, I'm having a hard time figuring out how to iterate through the photo assets.
The Media Library Framework documentation makes reference to a mediaObjectForIdentifier: method, which sounds promising, but doesn't indicate what sort of identifiers should be used. I'm fairly experienced with iOS, but a complete n00b on OS X, so I'm feeling a little lost.
If I just wanted to iterate through a user's library, NSLog'ing each photo, how might I go about that? Either using the Media Library Framework, or another framework so long as it works for the current Photos library?
This framework isn't hard to work with, but it is tedious, because it loads its properties asynchronously/lazily, and KVO is the only way to be notified when the async loading completes.
If you want to iterate the photos, you don't need to know the identifiers in advance.
Create a media library:
self.library = [[MLMediaLibrary alloc] initWithOptions:options];
Add a KVO observer for mediaSources, then access the mediaSources property. If it is non-nil, go to the next step immediately; otherwise, go to the next step when your KVO observer fires.
In the next step, iterate the sources, add a KVO observer on rootMediaGroup, and access rootMediaGroup on each source. If it is non-nil, call your iterator now, otherwise call it from the KVO notification handler.
For each rootMediaGroup, follow the same strategy as above, but for mediaObjects. The media objects are the things you are ultimately after.
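Putting the steps above together, here is a minimal sketch of the whole KVO chain. It assumes ARC and strong properties (library, source, rootGroup — names are illustrative) so the lazily loading objects aren't deallocated mid-load; error handling is omitted:

```objc
#import <MediaLibrary/MediaLibrary.h>

- (void)loadLibrary {
    NSDictionary *options = @{
        MLMediaLoadSourceTypesKey: @(MLMediaSourceTypeImage),
        MLMediaLoadIncludeSourcesKey: @[MLMediaSourcePhotosIdentifier]
    };
    self.library = [[MLMediaLibrary alloc] initWithOptions:options];
    // Observing and then reading the property triggers the lazy load.
    [self.library addObserver:self forKeyPath:@"mediaSources"
                      options:0 context:NULL];
    [self.library mediaSources]; // nil on first access; KVO fires when ready
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"mediaSources"]) {
        for (MLMediaSource *source in self.library.mediaSources.allValues) {
            self.source = source; // keep a strong reference
            [source addObserver:self forKeyPath:@"rootMediaGroup"
                        options:0 context:NULL];
            [source rootMediaGroup]; // again nil on first access
        }
    } else if ([keyPath isEqualToString:@"rootMediaGroup"]) {
        MLMediaGroup *group = [(MLMediaSource *)object rootMediaGroup];
        self.rootGroup = group; // keep a strong reference
        [group addObserver:self forKeyPath:@"mediaObjects"
                   options:0 context:NULL];
        [group mediaObjects];
    } else if ([keyPath isEqualToString:@"mediaObjects"]) {
        for (MLMediaObject *photo in [(MLMediaGroup *)object mediaObjects]) {
            NSLog(@"%@", photo.URL); // the media objects you're after
        }
    }
}
```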
This is partly a question regarding the Google Maps SDK in iOS, though perhaps mostly a question about atomic operations in Objective-C.
In my app, I wish to draw a GMSPolyline on a map between the user's current location and a selected destination. I can retrieve the user's current location using the myLocation property on a GMSMapView. According to the documentation for myLocation:
If My Location is enabled, reveals where the user location dot is
being drawn.
If it is disabled, or it is enabled but no location data is available,
this will be nil. This property is observable using KVO.
I have code to draw the polyline which takes the form:
if (myMapView.myLocation) {
// draw polyline between myMapView.myLocation and the selected destination
}
My concern is that, however remote the possibility, between the check if (myMapView.myLocation) and actually drawing the polyline, myMapView.myLocation might become nil if the location is lost at an inopportune moment.
So my question is: in Objective-C, is there a way for me to wrap both the check that myMapView.myLocation is not nil and the drawing of the polyline into an operation that locks access to myMapView.myLocation, so it can't be changed after the check but before the polyline is drawn? And if Objective-C does provide such a mechanism, what effect does it have if the Google library attempts to update myLocation while it is locked? Does the update just get queued until I've finished drawing the polyline and release the lock?
So I guess this is mostly a question about atomic transactions in Objective-C, in the context of the Google Maps library.
I don't know specifically for Objective-C, but based on locking in other languages, I think what you want could only work if it were done in cooperation with the third-party library (i.e. Google Maps).
So for example if you were to somehow lock myMapView.myLocation while using it, this would only work if the Google Maps SDK also promised to lock it while modifying it, with the same lock object that you are using. Since myMapView.myLocation can be nil, this would be unlikely to work since you couldn't use a nil object as the 'key' to lock.
Similarly if you could lock the entire GMSMapView, such that none of its properties could change while the lock was held, this could only work if the Google Maps SDK promised to take the same lock before making any modifications.
In general, it's probably not a good idea to lock arbitrary objects from third-party libraries, as this might interfere with the library's own synchronization (unless the library explicitly offers this as part of its interface). Vice versa, when writing a library for use by third parties, it's best not to implement internal locking on objects the library makes public; instead, locking should usually be implemented on internal objects, just in case a user of the library tries to lock the public object.
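To illustrate why the locking must be cooperative: @synchronized in Objective-C only excludes other code that synchronizes on the same object. A sketch (routeLock is a hypothetical lock object of our own):

```objc
// A lock only excludes parties that take the SAME lock. `routeLock`
// is our own object; the map SDK knows nothing about it, so its
// internal writes are NOT blocked by this @synchronized block.
NSObject *routeLock = [[NSObject alloc] init];

@synchronized (routeLock) {
    // Only other code that also does @synchronized(routeLock)
    // waits here; third-party code proceeds regardless.
}
```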
In your case though, you could avoid the problem by doing this:
CLLocation* location = myMapView.myLocation;
if (location)
{
// draw polyline between location and the selected destination
}
That way even if the map view sets its .myLocation property to nil in between your check and your draw, you will still have a reference to the CLLocation, which can't be set to nil.
Note though that the documentation says:
GMSMapView can only be read and modified from the main thread, similar
to all UIKit objects. Calling these methods from another thread will
result in an exception or undefined behavior.
It seems likely that GMSMapView honours the same requirement and so only modifies its public properties from the main thread. So if your code is running on the main thread (which it has to be), it's unlikely that the map view's properties will change in the middle of your code running.
Is there any way to detect whether QTMovie successfully loaded a movie, i.e. whether a valid component was found?
Or even better enumerate the components on launch and detect then what components have been loaded without having to provide a sample movie?
Thanks.
Have you read the QTMovie API reference? When you try to create a QTMovie object, you'll get either a valid movie object or nil. If you make use of the NSError argument (that's included on all the initializers/factory methods that create movies from files/URLs), you'll even get an explanation of what went wrong if the method returns nil. Also in that reference (handily categorized in the Tasks section), are a number of ways to get pretty much any information about the movie you want.
Regarding the "components" part of your question, I don't think QTKit gives you access to this directly. You might have to use the QuickTime.framework and dive deeper. You might be able to figure out whether your third-party-supported file type is actually supported by using the +[QTMovie movieFileTypes:] method (also found in the API reference I linked) and seeing if your file's extension appears there.
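As a sketch of the nil-plus-NSError check described above (the file path is illustrative):

```objc
#import <QTKit/QTKit.h>

NSError *error = nil;
QTMovie *movie = [QTMovie movieWithFile:@"/path/to/movie.mov" error:&error];
if (movie == nil) {
    // No valid movie could be created; error explains why
    // (e.g. no component could handle the file).
    NSLog(@"Failed to load movie: %@", error);
}

// To check up front whether a file type is even recognized:
NSArray *types = [QTMovie movieFileTypes:QTIncludeCommonTypes];
BOOL supported = [types containsObject:@"mov"];
```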
I am attempting to write an app that reads images from the asset library, modifies the image's GPS data and writes it back to the asset library. I store the assets in a mutableArray via the "enumerating assets" methods. Most of the details on how to do the various steps, I got from searching this forum. Thanks!
I have found that when I write the first asset back via writeImageDataToSavedPhotosAlbum:metadata:completionBlock:, all the elements of the mutableArray associated with the assets' URLs become null. Furthermore, I noticed that writing an image back does not replace the original image; instead, it creates a second instance of the image.
Just thought I'd pass these results along, in case others had questions. And, of course, I'd be interested in other's comments, observations, etc.
This forum has provided me with great information. Thanks again.
Your ALAsset objects are only valid for as long as your ALAssetsLibrary object is around. You either need to do everything you want in the completion block where you get the ALAsset, or store the ALAssetsLibrary in an instance variable so that ARC does not deallocate it.
An ALAsset is essentially like a Core Data object whose properties can be accessed from multiple threads; but just as an NSManagedObject (or a subclass of NSManagedObject) does not make sense without a parent NSManagedObjectContext, an ALAsset doesn't make sense without its ALAssetsLibrary.
It is common practice to store the NSManagedObjectContext on the app delegate; and while I abstract that functionality into a wrapper/singleton, there is a retained reference to the NSManagedObjectContext throughout the app's lifecycle. Apply the same logic to the ALAssetsLibrary and everything will work as expected.
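A minimal sketch of the fix: keep the library in a strong property so the enumerated assets stay valid (class and property names are illustrative):

```objc
#import <AssetsLibrary/AssetsLibrary.h>

@interface PhotoEditor : NSObject
// The strong reference keeps the library (and thus every ALAsset) alive.
@property (nonatomic, strong) ALAssetsLibrary *assetsLibrary;
@property (nonatomic, strong) NSMutableArray *assets;
@end

@implementation PhotoEditor
- (void)loadAssets {
    self.assetsLibrary = [[ALAssetsLibrary alloc] init];
    self.assets = [NSMutableArray array];
    [self.assetsLibrary enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos
        usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
            // group is nil once enumeration is complete
            [group enumerateAssetsUsingBlock:^(ALAsset *asset,
                                               NSUInteger index,
                                               BOOL *innerStop) {
                if (asset) [self.assets addObject:asset];
            }];
        }
        failureBlock:^(NSError *error) {
            NSLog(@"Enumeration failed: %@", error);
        }];
}
@end
```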
I am new to Mac OS X development. I downloaded an open-source Mac application, but I wasn't able to understand the flow of execution of a Cocoa program. Can anyone briefly explain the program flow of a general Cocoa program?
Thanks in advance
Start in main. It's not likely to contain anything interesting, but it's worth checking just in case. Most probably, it will contain only a call to NSApplicationMain, which will create the NSApplication object and send it a run message. That's what gets the application running, and this method will run for the rest of the process's lifetime.
Then look in the MainMenu nib. Loading this is one of the first things the application will do. Any windows here that are set as "Visible at Launch" will come up immediately; more importantly, the application delegate will probably be here. Check the delegate outlet of both the application and the File's Owner (the application is both of them in this nib, so you need to check both); if one of them is connected, follow the connection. See what class that object is an instance of.
Once you've found the application delegate class, open it up in Xcode. Look through the list of application delegate methods and find which ones are implemented, and read the ones that are. The application:…FinishLaunching: twins will be particularly important at the start of the process.
From there, it's all just reading code, seeing what it does, and going where it takes you.
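For reference, the startup path described above usually reduces to something like this (file names are the conventional ones):

```objc
// main.m — usually the least interesting file in the project.
#import <Cocoa/Cocoa.h>

int main(int argc, const char *argv[]) {
    // Creates the shared NSApplication, loads MainMenu.nib,
    // and enters the run loop; does not return in normal use.
    return NSApplicationMain(argc, argv);
}

// AppDelegate.m — where the interesting startup code usually lives.
@implementation AppDelegate
- (void)applicationDidFinishLaunching:(NSNotification *)notification {
    // Called by NSApplication once launching is complete;
    // a common home for an app's setup logic.
}
@end
```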
Peter's answers are good. I'd also say to check for implementations of awakeFromNib, especially for objects loaded from MainMenu.nib. You often find interesting things stashed away in that method, rightly or wrongly.
I'm observing the NSWorkspaceDidDeactivateApplicationNotification notification to get the application that has just lost focus. I end up with an instance of NSRunningApplication, which you get from the notification's userInfo dictionary under the NSWorkspaceApplicationKey key.
I was thinking I'd be able to get the app's main window from the notification, but I'm not sure where to go from here, as NSRunningApplication seems to be pretty limited. Any help would be appreciated.
BTW - I'm using MacRuby but the answer doesn't need to be in MacRuby.
Thanks
Apple has traditionally been pretty locked-down about this sort of thing. NSRunningApplication itself was just introduced in 10.6, and as you said, it's a bit limited. Depending on what you want to do, the answer might be in the Accessibility framework or it might be the CGWindow API. You can use the processIdentifier from the NSRunningApplication to match it up with those APIs.
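For example, here is a sketch of matching up a running app's windows via the CGWindow API. Note this gives you read-only window info dictionaries, not NSWindow objects (runningApp stands for the NSRunningApplication from the notification's userInfo):

```objc
#import <Cocoa/Cocoa.h>

// List the on-screen windows belonging to a given app by
// matching the process identifier.
pid_t pid = runningApp.processIdentifier;
CFArrayRef windows = CGWindowListCopyWindowInfo(
    kCGWindowListOptionOnScreenOnly | kCGWindowListExcludeDesktopElements,
    kCGNullWindowID);
for (NSDictionary *info in (__bridge NSArray *)windows) {
    if ([info[(id)kCGWindowOwnerPID] intValue] == pid) {
        NSLog(@"Window: %@ (layer %@)",
              info[(id)kCGWindowName], info[(id)kCGWindowLayer]);
    }
}
CFRelease(windows);
```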
It's very difficult to get the main window of other apps; they're not even guaranteed to be Cocoa! They can be Carbon, Java, Qt, Mono... So there's no way to get the NSWindow of another app unless you resort to a hack. You can try the Accessibility API to get window info from other apps independently of the framework used, but it's not so easy to use.
Unless the application opts to participate in IAC via AppleScript support or some other means, you simply don't touch its windows or anything else outside of your own heap space.