Porting CLI/GUI Windows program to OS X - objective-c

I have a Windows program that has a GUI and also uses a command line interface (a cmd window) as a debugging console. Basically, when it is double clicked, it launches a command line window and then the program creates all the GUI windows. You then have two windows: the main GUI and a debugging console.
I'm trying to port this program to OS X. OS X (and every Unix OS, for that matter) doesn't automatically open a command line window when you run a command line application, so I obviously need another way to port this application.
My initial thought was simply to import the source code into an Xcode project, redirect standard input and output, and then port the GUI. The GUI and console would run side by side just like on Windows. I don't think this is the best solution, since it would mean I'd essentially have to write a terminal emulator.
My other thought was to port the application as a command line application which creates its GUI just like on Windows. The application would then have to be run from Terminal.app, which could handle all the I/O. Unfortunately, I don't think you can use the Cocoa framework without an NSApplication run loop.
Any ideas how I could approach this?

You can of course create a run loop from a terminal-launched app. But that generally isn't what you want to do.
It sounds like on Windows, the CLI is just being used as a shortcut to creating a real debugging console window. So the simplest answer is to create a debugging console window. It's pretty easy to create a window which contains just a multi-line text or list view. (If you want something fancier, consider borrowing code from iTerm2 or other open source projects instead of trying to build a complete terminal.) If your debug information is being printed with some fancy macros, just change the macros to log to your list view.
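For illustration, here is a minimal sketch of such a console window; the DebugConsole class name and the -appendLine: method are made up for this sketch, not an existing API:

// DebugConsole: a window containing a scrolling, read-only text view.
#import <Cocoa/Cocoa.h>

@interface DebugConsole : NSObject {
    NSWindow *window;
    NSTextView *textView;
}
- (void)appendLine:(NSString *)line;
@end

@implementation DebugConsole

- (id)init
{
    if ((self = [super init])) {
        NSRect frame = NSMakeRect(100, 100, 500, 300);
        window = [[NSWindow alloc] initWithContentRect:frame
                                             styleMask:(NSTitledWindowMask |
                                                        NSClosableWindowMask |
                                                        NSResizableWindowMask)
                                               backing:NSBackingStoreBuffered
                                                 defer:NO];
        [window setTitle:@"Debug Console"];

        NSScrollView *scroll = [[NSScrollView alloc] initWithFrame:[[window contentView] bounds]];
        [scroll setHasVerticalScroller:YES];
        [scroll setAutoresizingMask:NSViewWidthSizable | NSViewHeightSizable];

        textView = [[NSTextView alloc] initWithFrame:[[scroll contentView] bounds]];
        [textView setEditable:NO];
        [textView setVerticallyResizable:YES];
        [textView setAutoresizingMask:NSViewWidthSizable];
        [scroll setDocumentView:textView];

        [[window contentView] addSubview:scroll];
        [scroll release];
        [window makeKeyAndOrderFront:nil];
    }
    return self;
}

- (void)appendLine:(NSString *)line
{
    // Append the line to the text storage and scroll so the newest output is visible.
    NSAttributedString *s = [[[NSAttributedString alloc]
        initWithString:[line stringByAppendingString:@"\n"]] autorelease];
    [[textView textStorage] appendAttributedString:s];
    [textView scrollRangeToVisible:NSMakeRange([[textView string] length], 0)];
}

@end

Your debug macro can then format its message and hand it to -appendLine: instead of printing it.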
If you're directly doing something like fprintf or syslog to do your logging, it might be simpler to create a wrapper app that launches the main app, and the wrapper creates the debugging console window and displays the main app's stdout and/or stderr. (This might be as simple as using popen.)
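If you go the wrapper route, the popen idea could look roughly like this. It's only a sketch: the executable path is a placeholder, it reuses the hypothetical DebugConsole class from the sketch above, and you'd run it off the main thread (e.g. via -performSelectorInBackground:withObject:) so the blocking read loop doesn't stall the UI.

#import <Foundation/Foundation.h>
#import "DebugConsole.h"   // the hypothetical console class from the sketch above
#include <stdio.h>
#include <string.h>

static void runAndCapture(DebugConsole *console)
{
    // The path is a placeholder -- point it at the real app's executable.
    // "2>&1" folds stderr into the same pipe as stdout.
    FILE *pipe = popen("/Applications/MyApp.app/Contents/MacOS/MyApp 2>&1", "r");
    if (pipe == NULL)
        return;

    char buffer[1024];
    while (fgets(buffer, sizeof(buffer), pipe) != NULL) {
        NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

        size_t len = strlen(buffer);
        if (len > 0 && buffer[len - 1] == '\n')
            buffer[len - 1] = '\0';             // -appendLine: adds its own newline

        NSString *line = [NSString stringWithUTF8String:buffer];
        if (line != nil)
            [console performSelectorOnMainThread:@selector(appendLine:)
                                      withObject:line
                                   waitUntilDone:NO];

        [pool drain];
    }
    pclose(pipe);
}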

Related

Objective-C equivalent of AppleScript: opening a directory without bringing every Finder window to the front

There is this AppleScript:
tell application "Google Chrome" to set index of window 1 to 1
do shell script "open /Volumes"
which opens a directory in Finder without bringing every other Finder window to the front.
Currently I'm using:
[[NSWorkspace sharedWorkspace] openURL:fileURL];
But it has the flaw of bringing every Finder window in front of the others.
Any idea how I could achieve the same behaviour as the AppleScript?
You can always use NSAppleScript to run applescript code in Objective-C if Cocoa doesn't provide the functionality you want.
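For example, running the snippet above from Objective-C could look like this (a minimal sketch; -executeAndReturnError: runs synchronously on the calling thread):

#import <Foundation/Foundation.h>

void openVolumesViaAppleScript(void)
{
    NSString *source = @"do shell script \"open /Volumes\"";
    NSAppleScript *script = [[[NSAppleScript alloc] initWithSource:source] autorelease];

    NSDictionary *errorInfo = nil;
    NSAppleEventDescriptor *result = [script executeAndReturnError:&errorInfo];
    if (result == nil)
        NSLog(@"AppleScript failed: %@", errorInfo);
}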
At a guess, -[NSWorkspace openURL:] also sends the application an activate event whereas the open process does not.
I'd recommend looking into the LaunchServices API. It's what both NSWorkspace and open use behind the scenes, but gives you more control than NSWorkspace's limited API.
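A rough sketch of the Launch Services route, assuming kLSLaunchDontSwitch gives the behaviour you're after (worth testing against the AppleScript):

#include <CoreServices/CoreServices.h>

// Open an item without bringing its handling app (Finder, here) to the front.
static OSStatus openWithoutActivating(CFURLRef itemURL)
{
    CFArrayRef items = CFArrayCreate(kCFAllocatorDefault,
                                     (const void **)&itemURL, 1,
                                     &kCFTypeArrayCallBacks);

    LSLaunchURLSpec spec = {
        .itemURLs    = items,
        .launchFlags = kLSLaunchDefaults | kLSLaunchDontSwitch  // don't switch to the app
    };

    OSStatus err = LSOpenFromURLSpec(&spec, NULL);
    CFRelease(items);
    return err;
}

Call it with your fileURL cast to CFURLRef (NSURL and CFURLRef are toll-free bridged).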
--
p.s. If you do have to call out to open (or any other command line tool) from ObjC, you should use Cocoa's NSTask. (AppleScript's do shell script command is just its [crappy] equivalent of NSTask.)
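For instance, the NSTask equivalent of the do shell script line above could be written roughly like this:

#import <Foundation/Foundation.h>

void openVolumesWithNSTask(void)
{
    NSTask *task = [[[NSTask alloc] init] autorelease];
    [task setLaunchPath:@"/usr/bin/open"];
    [task setArguments:[NSArray arrayWithObject:@"/Volumes"]];
    [task launch];
    // Waiting is optional; `open` returns almost immediately anyway.
    [task waitUntilExit];
}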

Can you attach a drawer to another application in Cocoa?

Is there a way for one Cocoa application to attach drawer-like windows to another application? We might for example want a terminal drawer that followed around a particular Finder window.
There is a program called DTerm that opens little transparent windows over Finder windows, but one might prefer persistence.
You may want to check out SIMBL. It allows you to write nifty bundles that are loaded into the application you're targeting. If you go along with it, I'd recommend using class-dump to gather more information on the application you're working with (although I'm not sure it would work with Finder).
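A very rough sketch of what a SIMBL plugin's principal class might look like; the class name is invented, and you should check SIMBL's documentation for the exact entry-point convention:

#import <Cocoa/Cocoa.h>

// Hypothetical principal class for a SIMBL bundle; list it under NSPrincipalClass
// in the bundle's Info.plist. SIMBL conventionally sends +install to the principal
// class once the bundle has been loaded into the target app.
@interface MyFinderAddon : NSObject
+ (void)install;
@end

@implementation MyFinderAddon

+ (void)install
{
    // Runs inside the target application's address space.
    NSLog(@"MyFinderAddon loaded into %@", [[NSBundle mainBundle] bundleIdentifier]);

    // From here you could watch the target's windows (class names found via
    // class-dump) and position your own drawer-like panel next to them.
}

@end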

Windows Shell Context Menu recreated with Qt

Is there a way to query from Qt the entries of the shell context menu (name and command)? Only if the application is run on Windows, of course.
This is very similar to this question: basically, use SHParseDisplayName + SHBindToParent to get an IShellFolder, then call GetUIObjectOf on that to get an IContextMenu. That is the "native" way to do it; I'm not sure whether Qt has any wrappers you can use.

Capture text in Xcode's console window

My problem is that I wish to capture the text that is displayed in the console of Xcode when an application is executed and display it in a text box in my app.
If I override NSLog I can only capture the explicit NSLog calls that are issued in the course of the program. However, many statements that are just inserted by the compiler are not captured.
Is there a way to read the Xcode console buffer while the app is running and display it in the app too?
What you see in the log window of Xcode is a composite of the messages that would normally go to standard out and standard error file streams and to the system log. If you want to capture those streams, you need to close them and reopen them as pipes or files.
If you do this, the documentation says that NSLog will log to the redirected standard error as well as to the console, so you don't need to override it.
Redirecting standard error and standard out is a fairly common thing to do in Unix. The basic technique to redirect to a file is to close the file descriptor using close(2) and then reopen it using open(2) or pipe(2).
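Here is a sketch of that redirection using NSPipe and dup2(); the function and selector names are invented for illustration:

#import <Foundation/Foundation.h>
#include <unistd.h>

static NSFileHandle *redirectOutputToPipe(void)
{
    // Keep the pipe alive for the life of the app (intentional retain).
    NSPipe *pipe = [[NSPipe pipe] retain];
    int writeFD = [[pipe fileHandleForWriting] fileDescriptor];

    // dup2() closes the old descriptor and points it at our pipe in one step --
    // the close()/reopen() described above.
    dup2(writeFD, STDOUT_FILENO);
    dup2(writeFD, STDERR_FILENO);

    return [pipe fileHandleForReading];
}

// Usage, e.g. from -applicationDidFinishLaunching::
//   NSFileHandle *reader = redirectOutputToPipe();
//   [[NSNotificationCenter defaultCenter] addObserver:self
//            selector:@selector(outputArrived:)
//                name:NSFileHandleReadCompletionNotification
//              object:reader];
//   [reader readInBackgroundAndNotify];
// In -outputArrived:, pull the NSData out of the notification's userInfo under
// NSFileHandleNotificationDataItem, append it to your text view, and call
// -readInBackgroundAndNotify again to keep reading.

Note that once standard error points at your pipe, the output stops showing up in Xcode's console unless you also write it back out through a saved duplicate of the original descriptor.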
The Xcode Console is just a window that reads errors and such from the Console logs. Try reading from there.

Why do AppleScript "tell" commands run a non-GUI instance of my GUI application in the background?

I'm writing a standard Cocoa application, and I've just started implementing AppleScript support for it. I've defined and implemented a property called testprop for my top-level application scripting class, and accessing it works just fine: if I launch an instance of my app and run the following AppleScript in Script Editor, I get the output I expect:
tell application "MyApp"
testprop
end tell
However if I run this very same AppleScript in the Script Editor when my app is not running, it returns the last known value for this property, and continues to return it for subsequent calls. I don't see an instance of my app getting started anywhere in the GUI.
After I noticed this, I ran "ps xawww | grep MyApp" in the shell, which told me that a process had been created using my app's main executable, with an argument that looks something like this: -psn_0_323663 (the number at the end changes each time this process is started -- I gather that it's the "process serial number" that AppleScript (among others) uses to keep track of and control processes).
What is going on here? How can I prevent this from happening (i.e. launch my app as a full, proper GUI-enabled instance when AppleScript "tell" commands for it are run)?
Edit:
The above seems to occur only on my laptop. When I try exactly the same thing on my Mac Mini (the OS version is the same on both: 10.5.8), I simply get an error message:
$ osascript -e "tell application \"MyApp\"" -e "testprop" -e "end tell"
26:40: execution error: The variable testprop is not defined. (-2753)
I don't think it's running a "non-GUI" instance; the app is just hidden. You could add an "activate" line to the AppleScript to make the app become active, in which case you'll see its windows and menu.