Xcode Cocoa - dynamically change the checkbox bool - Objective-C

I want to write test_application for macOS that shows whether an IM client is running. The test_application should also be able to start and kill IM Messenger (checkbox on/off).
I understand how to run/kill the application with push-buttons, and now I want to show the status of the IM client (running or not) and set the checkbox to ON or OFF accordingly.
I suppose I need to use a system call like "ps -aux processname" and parse its output, or use some API from Cocoa, but I can't figure out how to get that information into test_application, or how to do it outside of any method (I want test_application to load the initial information when it starts, so that when I open it, it checks whether IM Messenger is running and sets the checkbox to ON or OFF without any clicks).

You can have a look at GetBSDProcessList and the Process Manager Reference to get running processes and daemons.
You could also use NSAppleScript with a bit of AppleScript to launch the application, or NSTask along with ps and grep.
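If all you need is to know whether a GUI application is running (rather than enumerating BSD-level processes and daemons), a minimal sketch using NSWorkspace may be enough. Note that runningApplications requires Mac OS X 10.6 or later, and the bundle identifier, class name and outlet name below are placeholders, not anything taken from the question:

#import <Cocoa/Cocoa.h>

@interface AppDelegate : NSObject <NSApplicationDelegate>
@property (assign) IBOutlet NSButton *statusCheckbox; // the on/off checkbox wired up in Interface Builder
@end

@implementation AppDelegate

// Returns YES if an application with the given bundle identifier is currently running.
- (BOOL)isApplicationRunning:(NSString *)bundleIdentifier
{
    for (NSRunningApplication *app in [[NSWorkspace sharedWorkspace] runningApplications]) {
        if ([[app bundleIdentifier] isEqualToString:bundleIdentifier])
            return YES;
    }
    return NO;
}

// Runs once when the app finishes launching, so the checkbox reflects
// the IM client's state without any clicks.
- (void)applicationDidFinishLaunching:(NSNotification *)notification
{
    BOOL running = [self isApplicationRunning:@"com.example.IMMessenger"]; // hypothetical identifier
    [self.statusCheckbox setState:(running ? NSOnState : NSOffState)];
}

@end

Hooking the same check into the checkbox's action method (launch the IM client when it is off, terminate it when it is on) would then cover the start/kill half of the question.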

Related

How can I keep focus on my own application, or regain it when lost, in Delphi? [duplicate]

I need to create a simple Delphi application, kiosk style.
It is a very simple thing, a single form where the user writes some personal info to register to an event. 4 TEdit and a TButton.
What I want to achieve is to prevent the user from doing anything other than typing in the TEdits or clicking the TButton. For example, I don't want him to Alt-Tab (switch applications), press the Windows key, press Ctrl-Alt-Del, and so on.
I can add a password-protected button that enables/disables this "kiosk mode"; that way, when I need to exit kiosk mode, I simply press that button and exit.
How to achieve this "kiosk mode" in Delphi without intercepting all the keystrokes manually? Or did anyone already develop this so it can be shared?
I think you'd better create a new desktop and run your app there. When your app is done, you can bring back the user's desktop. That is how the Windows login screen works; of course, the login screen uses a special secure desktop. Your app in a separate desktop would be isolated: you get a desktop background with no Start menu, taskbar, or desktop icons, because explorer.exe is not started there automatically. A user can of course start a new process using Task Manager, but desktops in Windows are securable objects, so you can add restrictions if you want, provided your app has sufficient permissions.
To create a new desktop you can use the CreateDesktop Windows API, and to switch to the newly created desktop you can use the SwitchDesktop function.
You can try changing the Windows shell.
When Windows starts, instead of executing the default shell (explorer.exe), you can have it execute your application.
On the internet you can find alternative shells (more attractive) to the default Windows one, such as:
BlueBox or
SharpE
This option is used for purposes similar to the application you are developing: kiosks or point-of-sale (POS) terminals.
To change the default application you must modify the shell setting:
In Win3.x and Win9x, SYSTEM.INI file:
[boot]
shell=MiAplicacion.exe
In Win2k and WinXP, use Registry:
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon]
Shell=MiAplicacion.exe
If you test this option, make sure you have a way (a button or an option) to restore the configuration to its original value. You must reboot to test changes.
ADDED: In addition, if you search the web for something like "Delphi change default Windows shell", you can find more code, samples and information about this.
Regards
P.S.: Excuse my mistakes with English.
Well, if someone can open Task Manager he could just create a new task and run explorer.exe from there, so it's not really secure...
OK, Task Manager can be disabled with policies...
And for disabling the CAD (Ctrl+Alt+Del) sequence you can use SASLibEx, which Remko Weijnen created; you can find it here: SASLibEx
Kindest regards,
s!

Cocoa and AppleScript / Apple Events

I'm using Cocoa’s [[NSWorkspace sharedWorkspace] launchApplicationAtURL… with the NSWorkspaceLaunchNewInstance option to spawn new instances of an AppleScriptable application (Adobe Acrobat), and I want to be able to trigger different Apple Events (do script, quit, save etc…) for each instance.
So far I've tried scripting "System Events" with tell commands based on the new process's id, but for some reason the commands aren't executed by the target process.
I'm getting the process id via [NSRunningApplication processIdentifier] and using it to compile an AppleScript with [[[NSAppleScript alloc] initWithSource: AppleScript] executeAndReturnError: nil]. The string representation of the AppleScript is something like the following:
tell application "System Events"
tell (process id [insert process id here]) to do script "this.preflight(Preflight.getProfileByName('Magazine Ads'),false,false);"
end tell
I suspect that the processIdentifier returned by NSRunningApplication is different from the process id used by "System Events", but I'm stuck and don't know where to look to get any further. I need pointers on how to trigger AppleScriptable methods of a specific application process from Cocoa, given that
there can be several instances of the same application running and
each process that I want to communicate with will be created within the scope of my code
(Running new processes of Adobe Acrobat is necessary to allow the user to do other work while a preflight is running.)
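For reference, the launch-and-get-a-pid setup described above can be written roughly as follows; this is a sketch rather than the asker's actual code, and the Acrobat path and variable names are assumptions:

#import <Cocoa/Cocoa.h>

// Path to Acrobat is a guess; adjust to the installed version.
NSURL *acrobatURL = [NSURL fileURLWithPath:@"/Applications/Adobe Acrobat.app"];
NSError *error = nil;

// NSWorkspaceLaunchNewInstance spawns a separate process even if Acrobat is already running.
NSRunningApplication *instance =
    [[NSWorkspace sharedWorkspace] launchApplicationAtURL:acrobatURL
                                                  options:NSWorkspaceLaunchNewInstance
                                            configuration:[NSDictionary dictionary]
                                                    error:&error];

// This is the pid that is later compared against what "System Events" reports.
pid_t pid = [instance processIdentifier];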
Edit: The process ids returned by Cocoa and by AppleScript are different:
tell application "System Events" to set process_id to id of every process whose name contains "AdobeAcrobat"
returned {5584211,…}, while at the same time
[NSRunningApplication processIdentifier]
returned 8722
Edit 2: The AppleEvent object does make it possible to address a process with a certain process id but I haven't been able to figure out how to apply it to an AppleScript object.
pid_t process_id = …;
NSAppleEventDescriptor* appleevent = [[NSAppleEventDescriptor alloc] initWithDescriptorType:typeKernelProcessID bytes:&process_id length:sizeof(process_id)];
I still haven't figured out how NSAppleEventDescriptor can be used to access AppleScriptable methods of a process with the given process_id. Any pointers to resources, and possibly an example of this, would be a perfect answer to my question.
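As an aside, one way the descriptor from Edit 2 could in principle be used is to skip NSAppleScript entirely and send a raw Apple event addressed to the pid with AESendMessage. The sketch below uses the generic 'misc'/'dosc' "do script" event; whether Acrobat actually handles that event class and ID, rather than a class of its own, is an assumption to verify against its scripting dictionary:

#import <Cocoa/Cocoa.h>
#import <Carbon/Carbon.h> // AESendMessage and the kAE... constants come from the AE framework

pid_t process_id = …; // the pid from -[NSRunningApplication processIdentifier]

// Address descriptor that targets the process by its BSD pid.
NSAppleEventDescriptor *target =
    [NSAppleEventDescriptor descriptorWithDescriptorType:typeKernelProcessID
                                                    bytes:&process_id
                                                   length:sizeof(process_id)];

// Build a "do script" event ('misc'/'dosc') aimed at that process.
NSAppleEventDescriptor *event =
    [NSAppleEventDescriptor appleEventWithEventClass:kAEMiscStandards
                                             eventID:kAEDoScript
                                    targetDescriptor:target
                                            returnID:kAutoGenerateReturnID
                                       transactionID:kAnyTransactionID];

// The script text goes in the direct parameter.
[event setParamDescriptor:[NSAppleEventDescriptor descriptorWithString:
        @"this.preflight(Preflight.getProfileByName('Magazine Ads'),false,false);"]
               forKeyword:keyDirectObject];

AppleEvent reply = { typeNull, NULL };
OSStatus err = AESendMessage([event aeDesc], &reply, kAEWaitReply, kAEDefaultTimeout);
// Inspect err and reply here; remember to AEDisposeDesc(&reply) when done.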
The System Events script in the question does not work because do script is an Acrobat command, not a System Events command.
You must use the Processes suite commands ("System Events") when you want to simulate input (clicks or keystrokes), or get or set a value, property or attribute of the GUI elements of a process.
Here is the syntax for using the do script command:
tell application id "process.id.here" to do script "this.preflight(Preflight.getProfileByName('Magazine Ads'),false,false);"
Edit:
I thought it was the bundle identifier.
It's not possible to execute a do script through "System Events".
But it is possible if you have this script in a menu of the application.
You need to find the correct application ID; try the unix ID.
To do what you want, bring the application to the foreground and run the script:
using terms from application "Adobe Acrobat"
tell application (path to frontmost application)
do script "this.preflight(Preflight.getProfileByName('Magazine Ads'),false,false);"
end tell
end using terms from
The Processes suite: here is a sample script that clicks a menu item.
tell application "System Events"
tell (first process whose unix id is (processIdHere as integer))
set frontmost to true
click menu item "Crop to TrimBox*" of menu "Document" of menu bar 1
end tell
end tell
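To tie this back to the Cocoa side, the pid obtained from NSRunningApplication can be spliced into the "unix id" form of the script and run with NSAppleScript. Below is a rough sketch, reusing the menu item names from the sample above and a hypothetical runningAcrobatInstance variable; note that the Processes suite (GUI scripting) generally requires access for assistive devices to be enabled:

pid_t pid = [runningAcrobatInstance processIdentifier];

// Build the System Events script around the unix id at run time.
NSString *source = [NSString stringWithFormat:
    @"tell application \"System Events\"\n"
    @"  tell (first process whose unix id is %d)\n"
    @"    set frontmost to true\n"
    @"    click menu item \"Crop to TrimBox*\" of menu \"Document\" of menu bar 1\n"
    @"  end tell\n"
    @"end tell", pid];

NSDictionary *errorInfo = nil;
NSAppleScript *script = [[NSAppleScript alloc] initWithSource:source];
[script executeAndReturnError:&errorInfo];
if (errorInfo != nil) {
    NSLog(@"AppleScript error: %@", errorInfo);
}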

Can a WinRT app continue to run while Screen Off?

Can a WinRT app continue to run while Screen Off?
I know that a WinRT application can create a background task that periodically executes even when the application is not running. Handy, but not what I am asking. What I am asking is: when the user clicks the power button and invokes Connected Standby, is there anything an app can do to remain active? Can it ask for some special capability?
Example - in Windows Phone there is a handy Running and Walking app that keeps track of "where you are" while it is running - then tallies your distances, etc. Even when the screen is off! Turn the screen on and the "where was I" map is up-to-date. Is this type of application possible in WinRT?
I've been looking into the same thing recently, and unfortunately it seems that what you want to do isn't possible with WinRT.
Why don't you use a background task to approximate what you are trying to achieve? When the user starts the app again, you could populate it with the latest data by reading the store that the background task updated. Just a thought.

Porting CLI/GUI Windows program to OS X

I have a Windows program that has a GUI which also uses a command line interface (a cmd Window) as a debugging console. Basically, when it is double clicked, it launches a command line window and then the program creates all the GUI windows. Then you'd have two Windows: the main GUI and a debugging console.
I'm trying to port this program to OS X, but OS X (and all Unix OSs, for that matter) doesn't automatically launch a command line window when you run a command line application, so I obviously need another way to port this application.
My initial thought was simply to import the source code into an Xcode project, redirect standard input and output, and then port the GUI. The GUI and console would run side by side just like on Windows. I don't think this is the best solution, since it would essentially mean writing a terminal emulator.
My other thought was to port the application as a command line application that creates its GUI just like on Windows. The application would then have to be run from Terminal.app, which could handle all the I/O. Unfortunately, I don't think you can use the Cocoa framework without an NSApplication run loop.
Any ideas how I could approach this?
You can of course create a run loop from a terminal-launched app. But that generally isn't what you want to do.
It sounds like on Windows, the CLI is just being used as a shortcut to creating a real debugging console window. So the simplest answer is to create a debugging console window. It's pretty easy to create a window which contains just a multi-line text or list view. (If you want something fancier, consider borrowing code from iTerm2 or other open source projects instead of trying to build a complete terminal.) If your debug information is being printed with some fancy macros, just change the macros to log to your list view.
If you're directly doing something like fprintf or syslog to do your logging, it might be simpler to create a wrapper app that launches the main app, and the wrapper creates the debugging console window and displays the main app's stdout and/or stderr. (This might be as simple as using popen.)
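As a concrete illustration of the "log to a text/list view" idea, here is a minimal sketch (assuming ARC) that redirects the process's own stderr into an NSTextView through a pipe. The class and outlet names are placeholders, and this is one possible approach rather than the only one; anything written to stderr (fprintf(stderr, ...), NSLog, and so on) ends up in the console window:

#import <Cocoa/Cocoa.h>

@interface DebugConsoleController : NSObject
@property (assign) IBOutlet NSTextView *debugTextView; // text view inside the debugging console window
@property (strong) NSPipe *stderrPipe;
@end

@implementation DebugConsoleController

- (void)startCapturingStderr
{
    self.stderrPipe = [NSPipe pipe];
    // Route everything written to stderr into our pipe instead of the (nonexistent) terminal.
    dup2([[self.stderrPipe fileHandleForWriting] fileDescriptor], STDERR_FILENO);

    NSFileHandle *reader = [self.stderrPipe fileHandleForReading];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(stderrDataAvailable:)
                                                 name:NSFileHandleReadCompletionNotification
                                               object:reader];
    [reader readInBackgroundAndNotify];
}

- (void)stderrDataAvailable:(NSNotification *)note
{
    NSData *data = [[note userInfo] objectForKey:NSFileHandleNotificationDataItem];
    NSString *text = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
    if ([text length] > 0) {
        // Append the captured output to the console window's text view.
        [[[self.debugTextView textStorage] mutableString] appendString:text];
    }
    [[note object] readInBackgroundAndNotify]; // keep listening for more output
}

@end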

Why do AppleScript "tell" commands run a non-GUI instance of my GUI application in the background?

I'm writing a standard Cocoa application, and I've just started implementing AppleScript support for it. I've defined and implemented a property called testprop for my top-level application scripting class, and accessing it works just fine: if I launch an instance of my app and run the following AppleScript in Script Editor, I get the output I expect:
tell application "MyApp"
testprop
end tell
However if I run this very same AppleScript in the Script Editor when my app is not running, it returns the last known value for this property, and continues to return it for subsequent calls. I don't see an instance of my app getting started anywhere in the GUI.
After I noticed this, I ran "ps xawww | grep MyApp" in the shell, which told me that a process had been created using my app's main executable, with an argument that looks something like this: -psn_0_323663 (the number at the end changes each time this process is started -- I gather that it's the "process serial number" that AppleScript (among others) uses to keep track of and control processes).
What is going on here? How can I prevent this from happening (i.e. launch my app as a full, proper GUI-enabled instance when AppleScript "tell" commands for it are run)?
Edit:
The above seems to occur only on my laptop. When I try exactly the same thing on my Mac Mini (the OS version is the same on both: 10.5.8), I simply get an error message:
$ osascript -e "tell application \"MyApp\"" -e "testprop" -e "end tell"
26:40: execution error: The variable testprop is not defined. (-2753)
I don't think it's running a "non-GUI" instance; it just hides the app. You could add an "activate" line to the AppleScript to make the app become active, in which case you'll see its windows and menu.