UWP access buttons - testing

Hello all
I'm currently developing an app for the Surface Pro 3 that should be capable of:
detecting whether the touchscreen was tapped, and where
getting device information (product ID, amount of RAM, CPU model, etc.); this is done by first launching a console application that collects the information and saves it to a specific folder, from which the UWP app can read the results and log them (see the first sketch after this list)
accessing sensor data: accelerometer, gyroscope and ambient light sensor
testing cameras, so the app can be commanded to take a picture using either the front or the rear camera
testing microphones (both front and back)
testing speakers (I made a synthesizer that can play beeps at a given frequency in a given stereo mode: left, right or both; see the second sketch after this list)
testing Wi-Fi, so it can connect to a desired Wi-Fi network
testing Bluetooth (still working on it...)
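A minimal sketch of the console-app handoff for the device-information item, assuming the console tool writes its results to a fixed folder the app has been granted access to (for an arbitrary path like this, the broadFileSystemAccess capability or a folder stored in the FutureAccessList is needed); the path, file name and DeviceInfoReader class are illustrative, not from the original setup:

```csharp
// Sketch: read the device-information file that the console tool wrote.
using System.Threading.Tasks;
using Windows.Storage;

public static class DeviceInfoReader
{
    public static async Task<string> ReadLatestAsync()
    {
        // Illustrative path; the real folder is whatever the console tool uses.
        StorageFolder folder =
            await StorageFolder.GetFolderFromPathAsync(@"C:\TestResults");
        StorageFile file = await folder.GetFileAsync("deviceinfo.txt");
        return await FileIO.ReadTextAsync(file); // log this wherever needed
    }
}
```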
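And a minimal sketch of the speaker-test beep, following the frame-input-node pattern from Microsoft's AudioCreation sample; the BeepPlayer class, its property names and defaults are my own illustration, and the project must allow unsafe code for the buffer access:

```csharp
// Sketch: generate a sine-wave beep on the left, right or both channels
// with AudioGraph (Windows.Media.Audio).
using System;
using System.Runtime.InteropServices;
using System.Threading.Tasks;
using Windows.Media;
using Windows.Media.Audio;
using Windows.Media.Render;

[ComImport]
[Guid("5B0D3235-4DBA-4D44-865E-8F1D0E4FD04D")]
[InterfaceType(ComInterfaceType.InterfaceIsIUnknown)]
unsafe interface IMemoryBufferByteAccess
{
    void GetBuffer(out byte* buffer, out uint capacity);
}

public sealed class BeepPlayer
{
    AudioGraph graph;
    AudioFrameInputNode input;
    double theta; // running phase of the sine wave

    public double Frequency = 440;             // beep pitch in Hz
    public bool PlayLeft = true, PlayRight = true;

    public async Task StartAsync()
    {
        var result = await AudioGraph.CreateAsync(
            new AudioGraphSettings(AudioRenderCategory.Media));
        graph = result.Graph;
        var output = (await graph.CreateDeviceOutputNodeAsync()).DeviceOutputNode;

        var props = graph.EncodingProperties;
        props.ChannelCount = 2;                // interleaved stereo floats
        input = graph.CreateFrameInputNode(props);
        input.AddOutgoingConnection(output);
        input.QuantumStarted += OnQuantumStarted;
        graph.Start();
    }

    unsafe void OnQuantumStarted(AudioFrameInputNode sender,
                                 FrameInputNodeQuantumStartedEventArgs args)
    {
        uint samples = (uint)args.RequiredSamples;
        if (samples == 0) return;

        // 2 channels * 4 bytes per float sample.
        var frame = new AudioFrame(samples * 2 * sizeof(float));
        using (var buffer = frame.LockBuffer(AudioBufferAccessMode.Write))
        using (var reference = buffer.CreateReference())
        {
            ((IMemoryBufferByteAccess)reference).GetBuffer(
                out byte* data, out uint capacity);
            float* f = (float*)data;
            double increment = 2 * Math.PI * Frequency /
                               graph.EncodingProperties.SampleRate;
            for (uint i = 0; i < samples; i++)
            {
                float v = (float)Math.Sin(theta);
                theta += increment;
                f[2 * i] = PlayLeft ? v : 0f;      // left channel
                f[2 * i + 1] = PlayRight ? v : 0f; // right channel
            }
        }
        sender.AddFrame(frame);
    }
}
```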
I have already figured out quite a lot and put a lot of work into this; all the listed tests are already implemented in the UWP application, so switching to a completely different platform would mean rewriting the whole app, which I no longer have time for.
UWP was chosen because it runs on different Windows 10 devices; after this app is complete, the same app (with minor modifications) will be used on other Windows 10 devices (other Surfaces and various Windows 10 phones).
This app will be automatically installed on a factory-reset Surface Pro 3 that has no special configurations enabled, so tinkering with its settings is resource-intensive and not a recommended process at all.
Now I have another serious issue regarding the device:
How can I test the functionality of all the buttons the Surface Pro 3 has?
It has three buttons: volume up, volume down and power.
But pressing the power button turns the screen black and locks the device.
Can I make the app override the basic functionality of a button, so that when the button is pressed the app detects it and logs the result?
The same question goes for the volume up and down buttons.
The only similar question about this that I could find is here:
Another thread on StackOverflow
I also cannot use the same solution I used for getting device information, because this test must be repeatable while the app is running (and a UWP app cannot launch a console application by itself).
Any help regarding this topic is highly welcome.

First, for the power button behavior, try this:
https://www.windowscentral.com/how-customize-power-button-action-when-pressed-windows-10
I don't have a Surface, so I cannot test it.
Also, I have some input and thoughts about your app:
It seems to me that you are building some sort of sanity-check software for pieces of hardware; I'd suggest looking into the Surface Diagnostic Toolkit:
https://support.microsoft.com/en-my/help/4037239/surface-fix-common-surface-problems-using-surface-diagnostic-toolkit
https://www.lovemysurface.net/surface-diagnostic-toolkit/
Some additional thoughts of mine:
Overriding hardware behavior programmatically could be considered a harmful action, especially when it comes to prebuilt devices such as the Surface, so by extension I don't foresee MS providing APIs for such a capability. Such a button might also communicate with the hardware directly rather than going through the software, runtime or OS at all. Changing it manually using the link I provided might be reflected in some registry settings, but since UWP apps run in containers and cannot directly edit the registry, there is a dirty workaround; look into this:
read/write registry key file in UWP
Hope this helps.

You can use the SystemInformation helper class from the Windows Community Toolkit; it gives you a lot of details about the device:
https://learn.microsoft.com/en-us/windows/communitytoolkit/helpers/systeminformation
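For example (property names as in the linked docs; note that older toolkit versions expose these as static members, while newer ones move them onto SystemInformation.Instance):

```csharp
// Sketch: log a few device details via the toolkit helper.
using Microsoft.Toolkit.Uwp.Helpers;

string details =
    $"Model: {SystemInformation.DeviceModel}, " +
    $"Manufacturer: {SystemInformation.DeviceManufacturer}, " +
    $"Family: {SystemInformation.DeviceFamily}, " +
    $"OS: {SystemInformation.OperatingSystem} {SystemInformation.OperatingSystemVersion}, " +
    $"Available memory: {SystemInformation.AvailableMemory} MB";
```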

Related

Is a Windows 8.1 Emergency App for the Windows Store doable?

At my current level of knowledge, I think it is not possible to develop an emergency or time-critical alarm app for Windows 8.1 as a Windows Store app (not desktop).
Maybe Windows Phone 8 would also fit such an app scenario, since it offers smaller devices to carry around.
For a (fictitious) example: a blind person is walking down a road and unfortunately enters an area where no one should walk around without being able to see; the app should then warn the person with a toast notification (with sound or vibration), or with some UI-related function if the app is on top but the screen is turned off.
As Windows 8.1 sets time limits on background tasks to save battery life, I think it is not possible to build such an app on that platform.
A similar scenario is in this question: Location tracking windows 8, but that one asks about Windows 8; a year has passed since then, and I hope there is something new that I have missed.
Maybe there is a way to hook into the tracking service?
If that interaction works for you, you would want to create a Windows Phone app that supports running in the background, much the way GPS navigation applications work (the application must be actively navigating for it to run in the background).
How to run location-tracking apps in the background for Windows Phone 8
Monitoring geolocation this way will be timely, and you can create toast notifications, but you have to be wary of when the application will be deactivated. Whether the platform will work for you depends on your requirements and centers on deactivation. Check whether the conditions for deactivation meet your standards, and implement a proof of concept.
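A minimal sketch of that continuous-tracking setup with the Geolocator class (the threshold value and handler body are illustrative assumptions):

```csharp
// Sketch: subscribe to position updates; on Windows Phone 8 this is the
// mechanism the background location-tracking guide above builds on.
using Windows.Devices.Geolocation;

var locator = new Geolocator
{
    DesiredAccuracy = PositionAccuracy.High,
    MovementThreshold = 10 // meters of movement between PositionChanged events
};

locator.PositionChanged += (sender, args) =>
{
    Geocoordinate coordinate = args.Position.Coordinate;
    // React here, e.g. raise a toast when the position enters a danger zone.
    System.Diagnostics.Debug.WriteLine(
        coordinate.Latitude + ", " + coordinate.Longitude);
};
```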

Starting an app in snapped view on an embedded system

A colleague recently asked a similar question (How to start a MetroApp directly in Snapped mode?), but this question is not a duplicate...
Programmatically forcing a Windows Store app to open in snapped view does not seem to be possible – by design. But can you do this or something similar on a Windows Embedded 8 machine? Similar things could include:
automatically start an app in snapped view on system start up, or
always start a specific app in snapped view
???
What we are trying to achieve:
The user logs in (on a preconfigured machine, assembled by us, possibly running Windows Embedded 8), starts our app and a snapped communication app (e.g. Skype or Lync) is (A) automatically there alongside our app, or (B) can be opened by pushing a button in our app.
Developers have the API necessary to take their app from SnapView to Fill or FullView. This is usually a less-advertised API because it could easily be abused. To that end, the reverse is not available. There is no API to move to SnapView.
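That unsnap direction is, to my knowledge, ApplicationView.TryUnsnap() in the Windows 8 API (deprecated in 8.1); a minimal sketch:

```csharp
// Sketch: leave snapped view programmatically. Note there is no inverse
// call, and TryUnsnap fails unless the app window currently has focus.
using Windows.UI.ViewManagement;

if (ApplicationView.Value == ApplicationViewState.Snapped)
{
    bool unsnapped = ApplicationView.TryUnsnap();
}
```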
I might also caution you that unless you are sure of the device resolution, starting in snapped view (which is not possible as it stands) would be dangerously unreliable, as many (most?) displays do not support it (they are too small). And since you are talking about embedded hardware, I imagine this could be even more pronounced. Since it sounds like you know the hardware, take that as a general rule rather than one aimed at your circumstances.

Turn iPhone into a server programmatically?

I want to make my iPhone app display on a Mac's screen, kind of like AirPlay does with other machines. The only way I have heard of to do this, although I do not like it, is to turn the iPhone into a server. Unfortunately, I cannot figure out how to do so. I also want to set it up in such a way that my Mac automatically detects it. I have seen a similar setup in the game Chopper 2. My Mac app will have a simple timer that fires every few seconds to look for the iPhone, in the same way that Chopper 2's "Find iPhone" button does.
Is there a simple way to turn the iPhone into a server, or start a "session" like Game Center does?
One last thing: I know it is somehow possible, because another app I have actually gives my iPhone a web address at the click of a button. It is called the Dicenomicon, if you want proof.
First, there is no easy way to redirect your display to a Mac, even if you made the phone a server of some kind.
Second, to discover or publish customized services on WLAN, you may want to refer to the samples on Bonjour:
CocoaHTTPServer: a simple TCP/HTTP server.
WiTap: an app that discovers and connects to services of the same kind on WLAN by Bonjour.
I'm not really sure what you mean by "server", because there is no way to share the screen of an iPhone using the official SDK, although this is possible by jailbreaking.
It would be possible, however, to send data back and forth between the Mac and the iPhone, and to display the iPhone's data on the Mac. Using that data, you could try to recreate the interface on the Mac. All of this could be accomplished using sockets. A class that might help with that is CocoaAsyncSocket, which makes network programming a lot easier.
The auto-discovery of iPhones on the local network is achievable with Bonjour. Without getting into too many details, NSNetService would allow you to publish a service for your app from an iPhone, and NSNetServiceBrowser would allow you to find that service on the local network from the Mac. From the NSNetServiceBrowser, you could establish a socket connection with the iPhone.
Good luck!
You might want to take a look at the GameKit APIs; I know they do something similar between two iOS devices:
http://developer.apple.com/library/ios/#documentation/NetworkingInternet/Conceptual/GameKit_Guide/Introduction/Introduction.html

Windows 8 Preview Samples

I have downloaded the Metro-style application samples that are available on the Microsoft website. There are lots of examples that show how you can interact with hardware devices (sensors, GPS, etc.). I have of course downloaded the Windows 8 Developer Preview to run those examples. My question is: how can I test the samples that use device hardware (GPS, accelerometer) or that access phone features (SMS, etc.) using the emulator?
At the moment there are no devices that support Windows 8 (the first phones will probably come out this autumn)?
I'd like to start developing some Metro-style applications to be ready when the Windows 8 Store comes online, but using just the emulator is a big limitation, isn't it?
Yes and no. There are slate devices that can run the Windows 8 Dev Preview just fine.
If you are unable to get one of these, one option is to create your own interfaces for all the devices. Underneath you can have two implementations.
In the first one, you connect to the actual underlying devices via the Windows 8 APIs. Sure, you won't be able to test these until you have a device, but such is life.
The second implementation can be a dummy one; for example, a thread that publishes a GPS event every 2-3 seconds.
That way you at least have some dummy device data coming in that you can test with for the time being.
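A minimal sketch of that interface-plus-dummy pattern (the interface name and the shape of the reading are illustrative assumptions):

```csharp
// Sketch: one interface, a real implementation later, a dummy one now.
using System;
using System.Threading;

public sealed class GpsReading
{
    public double Latitude { get; set; }
    public double Longitude { get; set; }
}

public interface IGpsDevice
{
    event EventHandler<GpsReading> PositionChanged;
}

// Dummy implementation: publishes a fake reading every 2 seconds so the
// rest of the app can be exercised without real hardware.
public sealed class DummyGpsDevice : IGpsDevice, IDisposable
{
    public event EventHandler<GpsReading> PositionChanged;
    readonly Timer timer;
    readonly Random random = new Random();

    public DummyGpsDevice()
    {
        timer = new Timer(_ => PositionChanged?.Invoke(this, new GpsReading
        {
            Latitude = 47.6 + random.NextDouble() * 0.01,
            Longitude = -122.3 + random.NextDouble() * 0.01
        }), null, TimeSpan.FromSeconds(2), TimeSpan.FromSeconds(2));
    }

    public void Dispose() => timer.Dispose();
}
```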

Robust real-time communication between iOS App and Mac App

We're working on an exhibit (http://pulse.media.mit.edu) and I'm brand new to iOS, Objective-C and Xcode. The exhibit deadline is in one week and I'm stuck.
The problem I'm having seemed simple enough.
Our exhibit has a projector and an iPad. The projector will be hooked up to a mac and be playing a video. The iPad will act as a controller for those videos. More simply:
I have 50 videos on a Mac. I need to develop an application on the Mac, that, when opened will loop one of the videos.
On an iPad, I need to develop an app that can change between the videos on the Mac in real time. The iOS app is already designed; we're just struggling with some code.
On the mac, when the iPad tells the Mac to change video, we'd like it to switch between videos using Core Animation, like this (http://youtu.be/pyd8O-2mkgk?t=1m).
So my question: What is the most robust way to do this? It has to be able to run in a museum, for two months. Some things to consider:
We are 4,000 miles away and can't monitor it all the time. We'll check nightly to see if it's still working, but it should run through the day without breaking.
If people unplug the iPad, it should still work.
It should be as robust as possible.
How can I best do this? Should I write from the iPad to a database running locally on the connected Mac, and then poll that database ten times every second? Are sockets robust enough to use on their own?
If you do suggest a way, can you please point me in the direction of some resources (frameworks, function names, etc) that can help me do this quickly?
Thank you for your time.
I would go for the server-in-the-middle option, because it will be the easiest to debug and requires nothing more than a working internet (Wi-Fi) connection on the client side. When there are connection issues, all you need is someone who knows how to hook an iPad or Mac up to the internet. And you can see on the server side which device is having trouble connecting.
Furthermore, using plain HTTP sounds like the best way to go for communication, and the backend can be written in any server-side scripting language. Both clients should poll the server every X seconds.
If you get this basic setup working well before your deadline, you could try to get the devices to connect to each other directly (for a less sluggish user experience), and leave the server solution in as a fallback method. The Mac app could function as an HTTP server, accepting the very same commands that the iPad normally sends to your server. The challenge will be reliably knowing which IP to connect to. You could hardcode it or use Bonjour.
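The polling loop itself is language-agnostic; here is a minimal sketch in C# for illustration (the endpoint URL, response format and two-second interval are all assumptions, not from the exhibit's actual setup):

```csharp
// Sketch: poll the middle server for the video that should be playing,
// and invoke a callback whenever the answer changes.
using System;
using System.Net.Http;
using System.Threading.Tasks;

public static class VideoCommandPoller
{
    static readonly HttpClient http = new HttpClient();

    public static async Task PollAsync(Action<string> onCommand)
    {
        string last = null;
        while (true)
        {
            try
            {
                // Ask the server which video should currently be playing.
                string current = await http.GetStringAsync(
                    "https://example.com/api/current-video");
                if (current != last) { last = current; onCommand(current); }
            }
            catch (HttpRequestException) { /* tolerate transient outages */ }
            await Task.Delay(TimeSpan.FromSeconds(2));
        }
    }
}
```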