I need to programmatically minimize my iPad application. exit(0) didn't do what I required. Is this possible and if so how? I don't need to worry about AppStore rejections.
Apple does not recommend (and does not allow; your app will be rejected during review) that an iOS application quit programmatically.
You should read Apple's Technical Q&A at http://developer.apple.com/library/ios/#qa/qa1561/_index.html
And also "Don’t Quit Programmatically" in the User Experience Guidelines
You may argue that minimizing is not quitting, but the principle is that Apple wants the user to have the ultimate decision on leaving an app.
It is true that you can only "minimize" the app by problematic means, because quitting (or backgrounding) on the user programmatically is precisely what Apple considers problematic.
As for sending your app to the background programmatically, this is an old problem, e.g. discussed here.
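Since you say App Store rejection is not a concern: one commonly cited workaround (my sketch below, not an officially supported approach) is to call the private, undocumented suspend selector on UIApplication, which reportedly behaves like a press of the Home button. Because it is private API, it may change or disappear in any iOS release.

// Hedged sketch only: -suspend is a PRIVATE UIApplication selector.
// It is looked up at runtime so this compiles without private headers,
// and guarded in case the selector is ever removed.
#import <UIKit/UIKit.h>

static void sendAppToBackground(void) {
    UIApplication *app = [UIApplication sharedApplication];
    SEL suspendSelector = NSSelectorFromString(@"suspend");
    if ([app respondsToSelector:suspendSelector]) {
        [app performSelector:suspendSelector];
    }
}

Call it from your button handler; unlike exit(0), it reportedly suspends the app rather than terminating the process.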
An interesting feature in the new iTunes is its inability to accept debugger processes that are attached to it (crippling tools like F-Script). Not only would this involve a detection method, but it would require some kind of process that was either checking for the debugger attaching itself mid-run, or an entry-point method that the debugger would hit when it attempts to attach itself. In addition, it would need a way to tell the debugger to go away (as it were) without terminating the process. The question is: how? Clearly, polling for a debugger every X number of seconds is inefficient, and not allowing it to attach to a given process (sans an override like ptrace()) seems intensely private.
iTunes is calling ptrace(PT_DENY_ATTACH), which sets the P_LNOATTACH flag and stops debuggers (and other tools, e.g. F-Script and DTrace) from attaching to the process.
See Is it possible to conceal an OS X app from DTrace? for more information.
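For reference, the call itself is a one-liner; a minimal sketch (my own example, not code taken from iTunes) looks like this:

// Deny future debugger attachment. It must run before a debugger attaches;
// a debugger that is already attached (or that breaks on ptrace) can defeat
// it, so treat it as a speed bump rather than real protection.
#include <sys/types.h>
#include <sys/ptrace.h>

int main(int argc, char *argv[]) {
    ptrace(PT_DENY_ATTACH, 0, NULL, 0);
    // ... normal application startup continues here ...
    return 0;
}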
I wouldn't be surprised if iTunes is also actively using detection methods to identify debuggers. Apple have gone to great lengths to try to protect the DRM in iTunes.
There are a number of books that have methods of securing Cocoa applications, including detecting debuggers. Some potential titles that spring to mind (I haven't double-checked the contents of these, so don't assume they have detection methods): "Mac Hacker's Handbook", "Hacking and Securing iOS Applications", "Professional Cocoa Application Security" and "Secure Programming Cookbook for C & C++".
"Mac OS X Internals" and "Mac OS X and iOS Internals" might have something on PT_DENY_ATTACH.
Is there a way to tell iOS that the Accessibility feature Speak Selection should be turned on?
I thought there might be a way similar to Location Services, where as soon as you start using it, iOS asks the user to turn it on.
I know that Speak Selection is used in a more 'passive' way, but perhaps someone knows the trick, especially now that, since iOS 5.1, you can no longer open the Settings app using "prefs:root..." URLs.
I need to create an application capable of modifying and managing files on iOS.
With iOS 5 it is "easy" to create a Document-Based Application, but I need to support iOS 4 too.
Does anyone know if there is a way to create a Document-Based Application in iOS 4?
Thanks in advance.
Short answer is no, as UIDocument and UIManagedDocument only arrived with iOS 5.
Long answer is yes. There are hundreds of document-based apps for iOS, e.g. Brushes and Sketchbook Pro. My own app is document based; it's not that hard to do.
What UIDocument/UIManagedDocument provides is a canned API for making a generic document. Feed it a URL and it does (most of) the rest of the housekeeping.
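For contrast, here is a rough sketch of the iOS 5 route (TextDocument is my own example name): subclass UIDocument, override the two content methods, and the framework handles opening, saving and autosaving once you hand it a file URL.

// iOS 5+ only: a trivial UIDocument subclass that stores raw NSData.
#import <UIKit/UIKit.h>

@interface TextDocument : UIDocument
@property (nonatomic, retain) NSData *fileData;
@end

@implementation TextDocument
@synthesize fileData;

// Called when the document is opened; for flat files, contents is NSData.
- (BOOL)loadFromContents:(id)contents ofType:(NSString *)typeName error:(NSError **)outError {
    self.fileData = contents;
    return YES;
}

// Called when the document is saved or autosaved.
- (id)contentsForType:(NSString *)typeName error:(NSError **)outError {
    return self.fileData ? self.fileData : [NSData data];
}
@end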
If you wish to do an iOS 4 based app, then the things you will need to pay attention to are (a rough sketch follows this list):
UIApplicationWillTerminateNotification/UIApplicationDidEnterBackgroundNotification
Opening a new document.
Saving a document.
Shutting a document.
Autosave (maybe)
Core Data stack, if you're using Core Data
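A minimal do-it-yourself sketch for iOS 4 might look like this (MyDocument is a hypothetical class of my own, not an Apple API; it just reads and writes NSData at a file URL and saves when the app is backgrounded or terminated):

// Manual retain/release style, since this targets the iOS 4 era.
#import <UIKit/UIKit.h>

@interface MyDocument : NSObject
@property (nonatomic, retain) NSURL *fileURL;
@property (nonatomic, retain) NSData *fileData;
- (id)initWithFileURL:(NSURL *)url;
- (BOOL)open;
- (BOOL)save;
@end

@implementation MyDocument
@synthesize fileURL, fileData;

- (id)initWithFileURL:(NSURL *)url {
    if ((self = [super init])) {
        self.fileURL = url;
        // Cover both cases: multitasking devices background the app,
        // older devices or configurations terminate it.
        [[NSNotificationCenter defaultCenter] addObserver:self
            selector:@selector(appWillGoAway:)
            name:UIApplicationDidEnterBackgroundNotification object:nil];
        [[NSNotificationCenter defaultCenter] addObserver:self
            selector:@selector(appWillGoAway:)
            name:UIApplicationWillTerminateNotification object:nil];
    }
    return self;
}

- (void)appWillGoAway:(NSNotification *)note {
    [self save];   // a crude stand-in for autosave
}

- (BOOL)open {
    self.fileData = [NSData dataWithContentsOfURL:self.fileURL];
    return self.fileData != nil;
}

- (BOOL)save {
    return [self.fileData writeToURL:self.fileURL atomically:YES];
}

- (void)dealloc {
    [[NSNotificationCenter defaultCenter] removeObserver:self];
    [fileURL release];
    [fileData release];
    [super dealloc];
}
@end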
www.raywenderlich.com has some great tutorials, maybe even iOS 4 based ones still.
IMO - Don't bother with iOS 4 support. As the other post states, over 85% of devices use iOS 5, and anyone still on iOS 4 probably isn't in your target market, especially as this is (I assume) a new app and iOS 6 will be around by the time you go to market.
Just going to ask: do you really need to support iOS 4?
The adoption of iOS 5 is over 85% of devices.
What's the basis for the need to continue iOS 4 support?
We're in the process of developing a desktop application which needs to record the user's screen once they click a button. I read a tutorial about Adobe AIR, which says it's easy to do with AIR: http://www.adobe.com/devnet/air/flex/articles/air_screenrecording.html
But our preference is Titanium, as we've explored it a little bit. So I want to know: is that even possible? If yes, how can we get started?
There's also an interesting solution which uses a Java applet for recording, as demonstrated here: http://www.screencast-o-matic.com/create?step=info&sid=default&itype=choose
But again, we're not sure about Java and would like to know how it can be done, or if it's even possible to run a Java applet in Titanium?
When you say "record screen", I'm assuming you mean video. Correct?
The only way to do this in Titanium Desktop right now is to take a bunch of screenshots and string them together (encoding would probably need to be done server-side).
Depending on how long your videos need to be, this probably won't work for you. I'm also not confident in how quickly you could capture screenshots, and if it would have a high enough frame rate to be usable.
Past that, a module could be developed for Desktop to support some native APIs to record video. That's not something I see on the horizon, though.
I hope this helps, albeit a rather dismal answer. -Dawson
I was wondering if, and in how many ways, an app can access specific functions of another app.
For example:
open a URL in Safari/Firefox/Chrome
run JavaScript in the current browser tab
play/pause iTunes
rename selected files in Finder
I am aware of the existence of AppleScript, but I was wondering if that's the only way I have to interact with those apps and others.
thanks
There are three main ways an app exposes its functionality to the outside world.
One is by supporting a URL protocol. To open a URL, just use NSWorkspace. There are many methods; if an app registers a specific protocol like x-my-app://some-work, you can just do
[[NSWorkspace sharedWorkspace] openURL:[NSURL URLWithString:@"x-my-app://some-work"]];
If you want to open a URL whose protocol (say http) is supported by many apps, and you want to specify which app to use, use openURLs:withAppBundleIdentifier:options:additionalEventParamDescriptor:launchIdentifiers:.
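For example, a minimal sketch (com.apple.Safari is just one possible bundle identifier; substitute the browser you want):

// Open an http URL with a specific app, chosen by bundle identifier.
NSArray *urls = [NSArray arrayWithObject:[NSURL URLWithString:@"http://example.com/"]];
[[NSWorkspace sharedWorkspace] openURLs:urls
                withAppBundleIdentifier:@"com.apple.Safari"
                                options:NSWorkspaceLaunchDefault
         additionalEventParamDescriptor:nil
                      launchIdentifiers:NULL];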
Another is System Services. With this, an app can add entries to the Services menu and to the context menus of other apps; you can also invoke a service programmatically.
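Calling another app's service from code can be done with NSPerformService; a hedged sketch ("Make New Sticky Note" is the service that Stickies registers, used here purely as an example):

// Put some text on a private pasteboard and hand it to a named service.
NSPasteboard *pboard = [NSPasteboard pasteboardWithUniqueName];
[pboard declareTypes:[NSArray arrayWithObject:NSStringPboardType] owner:nil];
[pboard setString:@"Remember to buy milk" forType:NSStringPboardType];
BOOL ok = NSPerformService(@"Make New Sticky Note", pboard);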
Otherwise, it's via Apple events. AppleScript is one way to deal with them, but not the only one; it's just a language to issue Apple events. There are many ways to deal with Apple events from Cocoa; see this detailed document by Apple.
Basically, an app can export its internals in an object-oriented manner (which is not just its Objective-C class hierarchy; you can control how much of its internal objects and methods you expose, etc.) via an sdef file. Then, another app can use this object-oriented system via Apple events.
To send and receive Apple events, you can of course construct them by hand, but you can use higher-level objects like
AppleScript via NSAppleScript (see the sketch after this list)
Scripting Bridge
or AppScript.
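For instance, the "play/pause iTunes" example from your question can be done in a couple of lines with NSAppleScript (a minimal sketch, error handling omitted):

// Compile and run a tiny script that sends an Apple event to iTunes.
NSAppleScript *script = [[NSAppleScript alloc]
    initWithSource:@"tell application \"iTunes\" to playpause"];
NSDictionary *errorInfo = nil;
[script executeAndReturnError:&errorInfo];
[script release];   // omit under ARC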
To learn what kinds of things an app exposes, just open the AppleScript Editor, choose the menu File → Open Dictionary, and choose an app.
Now, it's rather hard to use features of an app which the app does not expose via any of these methods. You still have a few workarounds.
UI Scripting. This is done by sending Apple events to a headless app called System Events, which is one of the core programs of OS X. This way, you can programmatically emulate clicking a button, choosing a menu item, etc. in another app. So, almost anything you can do with another app through its GUI can be done programmatically. To see the hierarchy of UI objects accessible from UI scripting, use a utility which comes with the Xcode tools, at
/Developer/Applications/Utilites/Accessibility Tools/Accessibility Inspector.app
This is very rudimentary but does the job; if you regularly use UI scripting, consider obtaining UI Browser, as Zygmunt suggests.
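From Cocoa you can drive System Events the same way, e.g. via NSAppleScript; a rough, read-only sketch that just lists the Finder's window names through the accessibility hierarchy (assumes "Enable access for assistive devices" is turned on):

// UI scripting goes through the System Events accessibility hierarchy.
NSString *source = @"tell application \"System Events\" to get name of every window of process \"Finder\"";
NSAppleScript *uiScript = [[NSAppleScript alloc] initWithSource:source];
NSDictionary *errorInfo = nil;
NSAppleEventDescriptor *result = [uiScript executeAndReturnError:&errorInfo];
NSLog(@"Finder windows: %@", result);
[uiScript release];   // omit under ARC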
Finally, if you want to use a non-GUI, non-exposed feature of another app, you can inject code into that app.
Just expanding on Yuji's answer. If you were forced to go the UI scripting path, there's a nice application to analyze the interface: http://pfiddlesoft.com/uibrowser/. However, the examples you mentioned should expose some APIs.
I might also recommend Sikuli (http://groups.csail.mit.edu/uid/sikuli/) as an IDE to script around the user interface robustly.
For some applications, usually coming from GNU/Linux, there is D-Bus (http://en.wikipedia.org/wiki/D-Bus), although I haven't used it on a Mac myself yet. And let me also quote Wikipedia about Cocoa: "It is one of five major APIs available for Mac OS X; the others are Carbon, POSIX (for the BSD environment), X11 and Java." (http://en.wikipedia.org/wiki/Cocoa_%28API%29) That's just a loose tip for further exploration, as Yuji has already explained the Apple events that are key to your question.