How to register for remote notifications in iOS 7 using Swift 2?

I have tried this:
UIApplication.sharedApplication().registerForRemoteNotificationTypes(
    UIRemoteNotificationType.Badge |
    UIRemoteNotificationType.Sound |
    UIRemoteNotificationType.Alert)
and also this:
UIApplication.sharedApplication().registerForRemoteNotificationTypes(.Alert | .Badge | .Sound)
But it's not working.

In Swift 2, these values conform to OptionSetType, so you combine them with array literal syntax instead of the | operator:
UIApplication.sharedApplication().registerForRemoteNotificationTypes([.Alert, .Badge, .Sound])

Related

Using hexagonal architecture in embedded systems

I'm trying to work out how the hexagonal (ports and adapters) architecture might be used in the context of an embedded software system.
If I understand it correctly, the architecture looks something like this:
/-----------------\ /-----------------------------\
| | | |
| Application | | Domain |
| | | |
| +----------+ | | +---------+ |
| | +-------------->|interface| | /-------------------\
| +----------+ | | +---------+ | | |
| | | ^ | | Infrastructure |
| | | | | | |
\---------------+-/ | +---+---+ +---------+ | | +----------+ |
| | +---->|interface|<-------------+ | |
Code that allows | +-------+ +---------+ | | +----------+ |
interaction with | | | |
user \--------------------------+--/ \-----------------+-/
Business logic What we (the business)
depend on - persistence,
crypto services etc
Let's take a concrete example of where one of the user interfaces is a touch screen that the main controller talks to over a serial UART. The software sends control commands to draw elements on the screen and the user actions generate text messages.
The bits I see working in this scenario are:
Serial driver
    Sends data over the UART
    Receives data (an ISR is invoked)
Screen command builder
Screen response/event parser
Business logic, such as presenting and responding to menus, widgets, etc.
The bit I'm struggling with is where these pieces should reside. Intuitively, I feel it's as follows:
Infrastructure - UART driver
Domain - Business logic
Application - Message builder/parser
But this arrangement forces a dependency between Application and Infrastructure: the parser needs to retrieve data from the UART, and the builder needs to send data through it.
Moving the message builder and parser into Infrastructure or Domain takes the whole user-interaction concern away from the Application.
Whichever way I look at it, it seems to violate some aspect of the diagram that I drew above. Any suggestions?
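To make the trade-off concrete, here is a Kotlin sketch (all names and the wire format are hypothetical) of the inversion I think the diagram implies: the Domain owns the port interface, and an Infrastructure adapter implements it, taking the screen-command builder with it, so the Application never touches the UART directly.

```kotlin
// Domain owns the port: what the business logic needs from "a display",
// with no knowledge of UARTs or message framing.
interface DisplayPort {
    fun drawMenu(items: List<String>)
}

// Infrastructure adapter: the screen-command builder lives here,
// right next to the serial driver it feeds, behind the domain's port.
class UartDisplayAdapter(private val sendOverUart: (String) -> Unit) : DisplayPort {
    override fun drawMenu(items: List<String>) {
        // Hypothetical wire format -- a stand-in for the real screen protocol.
        sendOverUart("DRAW:" + items.joinToString(";"))
    }
}

fun main() {
    // A test double standing in for the real UART driver.
    val wire = mutableListOf<String>()
    val display: DisplayPort = UartDisplayAdapter { wire.add(it) }
    display.drawMenu(listOf("Settings", "Status"))
    println(wire.single())  // DRAW:Settings;Status
}
```

With this arrangement the Application depends only on DisplayPort, and the builder/parser pair is an adapter detail.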

Application-level events in Ktor not invoked

I am currently experimenting with application-level events in Ktor using Netty. However, the only hook that is being called is ApplicationStarted. What am I doing wrong here?
I set breakpoints in all of the functions; the subscriptions are being made, but not all of the event listeners are invoked.
I also tried to find an explanation in the Ktor docs, without success.
import io.ktor.application.*

fun Application.events() {
    environment.monitor.subscribe(ApplicationStarting, ::onStarting)
    environment.monitor.subscribe(ApplicationStarted, ::onStarted)
    environment.monitor.subscribe(ApplicationStopping, ::onStopping)
    environment.monitor.subscribe(ApplicationStopped, ::onStopped)
    environment.monitor.subscribe(ApplicationStopPreparing, ::onPrepareStop)
}

private fun onStarting(app: Application) {
    app.log.info("Application starting")
}

private fun onStarted(app: Application) {
    app.log.info("Application started")
}

private fun onStopping(app: Application) {
    app.log.info("Application stopping")
}

private fun onStopped(app: Application) {
    app.log.info("Application stopped")
}

private fun onPrepareStop(env: ApplicationEnvironment) {
    env.log.info("Preparing App Stop")
}
"Application started" appears in the log messages but no other output.
What am I doing wrong, or is this a bug?
OK, I've looked into this and found that which application-level events are invoked depends on the server engine you are using. The embedded engines support the following events:
+--------+----------+---------+---------------+----------+---------+
| Engine | Starting | Started | StopPreparing | Stopping | Stopped |
+--------+----------+---------+---------------+----------+---------+
| Netty | NO | YES | NO | NO | NO |
| CIO | NO | YES | YES | YES | YES |
| Tomcat | NO | YES | NO | NO | NO |
| Jetty | NO | YES | NO | NO | NO |
+--------+----------+---------+---------------+----------+---------+
Tested on Ktor version 1.1.2
So it currently seems that if you want to respond to application-stop events, you should use CIO as the server engine.
EDIT:
CIO does not support HTTPS as of now, so if you need HTTPS you must stick to one of the other three engines. However, you can use a JVM shutdown hook to raise the stopped event yourself. Beware that shutdown hooks are invoked on a different thread.
private fun onStarted(app: Application) {
    Runtime.getRuntime().addShutdownHook(Thread {
        app.environment.monitor.raise(ApplicationStopped, app)
    })
    app.log.info("Application started")
}
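The shutdown-hook mechanism itself is plain JVM, so it can be sketched independently of Ktor (the function name and hook name below are illustrative):

```kotlin
// Registers a named JVM shutdown hook; the callback runs on its own
// thread when the JVM exits, so anything it touches must be thread-safe.
fun registerStopHook(onStop: () -> Unit): Thread {
    val hook = Thread({ onStop() }, "app-stop-hook")
    Runtime.getRuntime().addShutdownHook(hook)
    return hook
}

fun main() {
    val hook = registerStopHook { println("raising ApplicationStopped") }
    println(hook.name)  // app-stop-hook
    // removeShutdownHook returns true, confirming the hook was registered.
    println(Runtime.getRuntime().removeShutdownHook(hook))  // true
}
```

Keeping a reference to the hook thread, as above, lets you deregister it again, e.g. when the application is stopped normally and the event has already been raised.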

Issue on connecting to the Image Acquisition Device using HALCON

My setup includes a PoE camera connected directly to my computer, on which I have HDevelop. For the past few days I have been running into a problem where the first attempt to connect to the camera from HDevelop fails.
When using Connect from the Image Acquisition GUI, I get an error stating "HALCON ERROR. Image acquisition: device cannot be initialized"
When using the open_framegrabber() operator from the Program Console, I get the same error, along with the HALCON error code 5312.
After I get this error, attempting the connection again succeeds. This is my workaround at the moment, but it's annoying because it happens quite frequently and I am not sure what causes it. I tried pinging the camera from the command prompt, which showed no packet loss, and when using the camera from Vimba Viewer I do not get such connection issues.
I know this is not a support site where I should be asking such questions, but if anyone can give me some inputs on solving this, it would be of great help.
Regards,
Sanjay
To solve your problem it is important to understand HALCON's framegrabber communication object; I assume you are working in HDevelop code.
To create the communication channel with the camera properly, and to avoid the connection being rejected due to parameter misconfiguration, you should specify the camera's device ID when creating the framegrabber rather than relying on the default options.
To query the devices connected to your board for a given communication protocol, use:
info_framegrabber('GigEVision2', 'info_boards', Information, ValueList)
where the first parameter is the communication protocol, and ValueList returns all the information about the connected devices as token:value pairs separated by '|', e.g.:
| device:ac4ffc00d5db_SVSVISTEKGmbH_eco274MVGE67 | unique_name:ac4ffc00d5db_SVSVISTEKGmbH_eco274MVGE67 | interface:Esen_ITF_78d004253353c0a80364ffffff00 | producer:Esen | vendor:SVS-VISTEK GmbH | model:eco274MVGE67 | tl_type:GEV | device_ip:192.168.3.101/24 | interface_ip:192.168.3.100/24 | status:busy | device:ac4ffc009cae_SVSVISTEKGmbH_eco274MVGE67 | unique_name:ac4ffc009cae_SVSVISTEKGmbH_eco274MVGE67 | interface:Esen_ITF_78d004253354c0a80264ffffff00 | producer:Esen | vendor:SVS-VISTEK GmbH | model:eco274MVGE67 | tl_type:GEV | device_ip:192.168.2.101/24 | interface_ip:192.168.2.100/24 | status:busy | device:ac4ffc009dc6_SVSVISTEKGmbH_eco274MVGE67 | unique_name:ac4ffc009dc6_SVSVISTEKGmbH_eco274MV
... and so on.
This way you can extract the device ID (the device: token) automatically and pass it as a parameter when creating the framegrabber:
open_framegrabber ('GigEVision2', 0, 0, 0, 0, 0, 0, 'default', -1, 'default', -1, 'false', 'put the device ID here', '', -1, -1, AcqHandle)
You will then be able to connect directly, or build an automatic reconnection routine.
I hope this information helps you.

How to get all audio devices for Android APIs < 23

How can I get info about all audio devices without using this API level 23 method?
AudioDeviceInfo[] devices = audioManager.getDevices(AudioManager.GET_DEVICES_ALL); // or GET_DEVICES_OUTPUTS
Is there any way? In this case the target API is 19 (KitKat).
Thank you.

Serialization of a Base64 string in a JSON payload with HessianKit (Objective-C/Cocoa)

I'm trying to connect my iOS-App to an existing Grails backend server. The backend exposes a hessian webservice by using the grails remoting plugin (version 1.3). My Android app successfully calls all webservice methods.
My goal is to transmit a JPEG image from the phone to the server (this works with the Android app). My approach is to create a JSON object with JSONKit and include the image as a Base64-encoded string. I'm using HessianKit in an Xcode 4 project with ARC targeting iOS 4.2, plus Nick Lockwood's NSData+Base64 categories for Base64 encoding (https://github.com/nicklockwood/Base64).
Here's my code:
NSMutableDictionary *jsonPayload = [NSMutableDictionary dictionary];
[jsonPayload setObject:[theImage base64EncodedString] forKey:@"photo"];
NSString *jsonString = [jsonPayload JSONString];
NSURL *url = server_URL;
id<BasicAPI> proxy = (id<BasicAPI>)[CWHessianConnection proxyWithURL:url protocol:@protocol(BasicAPI)];
[proxy addImage:jsonString];
The problem is that the server throws an exception when called by the app:
threw exception [Hessian skeleton invocation failed; nested exception is com.caucho.hessian.io.HessianProtocolException: addImage__1: expected string at 0x7b ({)] with root cause
Message: addImage__1: expected string at 0x7b ({)
Line | Method
->> 1695 | error in com.caucho.hessian.io.HessianInput
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
| 1681 | expect in ''
| 1473 | parseChar in ''
| 792 | readString in ''
| 181 | readObject in com.caucho.hessian.io.BasicDeserializer
| 1078 | readObject in com.caucho.hessian.io.HessianInput
| 300 | invoke . . in com.caucho.hessian.server.HessianSkeleton
| 221 | invoke in ''
| 886 | runTask . in java.util.concurrent.ThreadPoolExecutor$Worker
| 908 | run in ''
^ 680 | run . . . in java.lang.Thread
All other JSON payloads from my app (Strings, dates, numbers, etc.) can be deserialized by the server without any problem and the other way round, i.e. sending a base64 encoded image as json payload to the app from the server as a response also works.
After spending hours reading bug reports and mailing lists, I suspect the problem might be that HessianKit only supports the Hessian 1 protocol, while the Hessian version shipped with remoting 1.3 is 4.0.7, which probably uses the Hessian 2 protocol and isn't backward compatible. But that's just a guess.
EDIT: Actually, the issue has nothing to do with JSON. The same exception is thrown when I just pass the string as a normal string (and not embedded in JSON) to the webservice.
Has anyone experienced a similar problem and found a solution?