Android Things - Front facing camera on IMX7d

I am currently developing an Android Things application on an IMX7d developer board and need to figure out how to make the default camera, which is configured as BACK, report as FRONT facing, with the correct orientation. This application will make heavy use of the camera, for example for face recognition.
I've noticed that on some other Android devices you can edit /etc/nvcamera.conf to do this, but the Android Things OS image I have doesn't seem to include that file. Is there some other way to do this?
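For reference, this is roughly how I check what facing the HAL reports through the Camera2 API (the class name below is just for illustration; this only reads the value, it does not change it):

import android.content.Context;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;
import android.util.Log;

public class CameraFacingProbe {
    private static final String TAG = "CameraFacingProbe";

    // Log every camera the HAL exposes and whether it reports FRONT, BACK or EXTERNAL.
    public static void logCameraFacings(Context context) {
        CameraManager manager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        try {
            for (String id : manager.getCameraIdList()) {
                CameraCharacteristics chars = manager.getCameraCharacteristics(id);
                Integer facing = chars.get(CameraCharacteristics.LENS_FACING);
                String label = "unknown";
                if (facing != null) {
                    if (facing == CameraCharacteristics.LENS_FACING_FRONT) label = "FRONT";
                    else if (facing == CameraCharacteristics.LENS_FACING_BACK) label = "BACK";
                    else if (facing == CameraCharacteristics.LENS_FACING_EXTERNAL) label = "EXTERNAL";
                }
                Log.i(TAG, "Camera " + id + " reports LENS_FACING=" + label);
            }
        } catch (CameraAccessException e) {
            Log.e(TAG, "Unable to query cameras", e);
        }
    }
}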

Related

Appium test shows "Automation Running" overlay on iOS15 - How can I get rid of it?

I am using a tool to automate tests on iPhones. The tool uses Appium as the test framework. In iOS 15, the screen shows a dark overlay with the text "Automation Running". I am aware that this does not affect the tests themselves.
However, my problem is that I use a camera placed in front of the mobile phone to capture the screen and run some checks on the captured video. I have to use a physical camera because I run tests on OTT applications and there is no way to capture the video in software due to DRM protection. This "Automation Running" overlay interferes with the checks that I run on the video captured through the camera.
Is there any way to get rid of this overlay in iOS 15 when running Appium-based tests?
After extensive exploration, I have concluded that there is no way to remove this overlay unless Apple decides to allow it.

React Native app is recognized as a game on Samsung Note8

I wrote a react native application. The application is simple and mostly informational; it uses Redux, Saga, and several linked npm packages. The app runs in normal mode, not full-screen. The structure of the application is based on Ignite.
The problem is that on the phone (Samsung Note8) the application is recognized as a game. For example, while the app is running there is a message "The game is running" on the lock screen, and there are additional buttons for a gamepad or something similar. In addition, the app has padding at the top and bottom when it runs on the real device (Samsung Note8), the same effect that appears when actual games are running. When the app runs on another device (e.g. a ZTE Blade 610) it runs as usual, without any side effects.
My main theory is that the cause is the Game Tools feature that exists on the Samsung Note8, but other apps show no similar effects and run as expected.
Is there a way to make a react native app behave as a regular app rather than a game? Why does Game Tools recognize my app as a game? Or what is the reason, and how can it be changed?
Thanks.
I think there are a few possibilities.
You (or one of your dependencies) have included the Google Play services API, which contains a module named games that Samsung automatically treats as a game.
You could find which of your dependencies is loading the Google Play services API and add an exclude like:
implementation(project(':your.dependency')) {
    exclude group: 'com.google.android.gms', module: 'play-services-games'
}
Your application id (you can see it in build.gradle) is registered in the Samsung game database. You can check by going into the Play Store and searching for your application id.
This is something that can happen on Samsung phones due to the package name of your app.
We can't change this after the initial release; you must contact Samsung developer support and they can fix it on the fly.
I wrote a gist on GitHub about it:
https://gist.github.com/Adnan-Bacic/718eb3b4e70380696c91dc21c4804112

Android Things - Raspberry Pi - Google Mobile Vision Support

I'm trying to run an Android app on the Android Things OS.
The app uses face detection as a first filtering step before face recognition.
The recognition itself is handled by a third-party (remote) API, so there is nothing to worry about there, but the detection is carried out by the Google Mobile Vision API for Android. The problem I'm facing is that the app crashes every time a camera preview is about to start.
The code of the app is derived from this example: https://github.com/googlesamples/android-vision (Face tracking). So if that code runs, my app runs.
I also know that there is a known issue with the Raspberry Pi and the camera when trying to create more than one output surface.
My questions are:
(1) Is there a way to successfully run the code in the example https://github.com/googlesamples/android-vision (Face tracking)?
(2) When is that known issue going to be resolved?
Thanks in advance.
Sincerely,
Gersain Castañeda.
The latest version of Android Things (DP6), which targets API 27, supports the newer Camera2 API, as explained here.
The Camera2 API supports more than one output surface and works on Android Things.
For more inspiration on how to do this, check this tutorial (how to use Camera2) and this very useful sample (how to use Google Vision).
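As a rough illustration only (not taken from the linked sample; the class and method names and the surface setup are assumptions, and error handling is trimmed), a Camera2 session can target two output surfaces at once, for example an on-screen preview Surface plus an ImageReader whose frames you feed to the face detector:

import android.content.Context;
import android.graphics.ImageFormat;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CaptureRequest;
import android.media.ImageReader;
import android.os.Handler;
import android.view.Surface;

import java.util.Arrays;

public class TwoSurfaceCamera {
    private CameraDevice camera;
    private final ImageReader detectorReader;

    public TwoSurfaceCamera(int width, int height) {
        // A small YUV stream for the face detector, separate from the preview.
        detectorReader = ImageReader.newInstance(width, height, ImageFormat.YUV_420_888, 2);
    }

    // Requires the CAMERA permission; previewSurface usually comes from a TextureView's SurfaceTexture.
    public void open(Context context, final Surface previewSurface, final Handler handler)
            throws CameraAccessException {
        CameraManager manager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        String cameraId = manager.getCameraIdList()[0];
        manager.openCamera(cameraId, new CameraDevice.StateCallback() {
            @Override public void onOpened(CameraDevice device) {
                camera = device;
                createSession(previewSurface, handler);
            }
            @Override public void onDisconnected(CameraDevice device) { device.close(); }
            @Override public void onError(CameraDevice device, int error) { device.close(); }
        }, handler);
    }

    private void createSession(final Surface previewSurface, final Handler handler) {
        try {
            // Both surfaces go into the same capture session.
            camera.createCaptureSession(Arrays.asList(previewSurface, detectorReader.getSurface()),
                    new CameraCaptureSession.StateCallback() {
                        @Override public void onConfigured(CameraCaptureSession session) {
                            try {
                                CaptureRequest.Builder builder =
                                        camera.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
                                builder.addTarget(previewSurface);
                                builder.addTarget(detectorReader.getSurface());
                                session.setRepeatingRequest(builder.build(), null, handler);
                            } catch (CameraAccessException e) {
                                e.printStackTrace();
                            }
                        }
                        @Override public void onConfigureFailed(CameraCaptureSession session) { }
                    }, handler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    public ImageReader getDetectorReader() {
        return detectorReader;
    }
}

Frames from the ImageReader can then be wrapped with the Mobile Vision Frame.Builder and passed to the FaceDetector, while the preview surface keeps drawing to the display.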

This app is not available for your phone because it requires front facing camera

We have developed an app using Xamarin Forms which targets iOS, Android and Windows Phone. We use the camera feature to take/select photos to upload to the app, and the feature works well on all three platforms.
Some users are facing issues on a few Windows devices. When a user tries to download the app from the Windows App Store, they receive the following error:
"This app is not available for your phone because it requires front facing camera"
I have attached a screenshot showing the error for reference. The screenshot was taken on a Nokia Lumia 635. Our finding is that this device has only a back camera and no front camera, and we believe this might be the reason for the issue above.
Ideally, users should not get this error, as the device has a back camera and they can still take photos.
Is this a known issue in windows phone world?
Can this be fixed in code or through device permissions?
Is this a device-specific problem?
Any ideas?

How to use a camera in Corona Simulator?

I'm wondering how to use a camera in the Corona Simulator. I want to test my app (which requires a camera) in the simulator, but I don't know how to enable camera usage. Is there any way to set up my webcam to act as the simulator's camera? I just want to know whether this is possible, or whether I have to put my app onto a device to test it with a camera.
Thanks in advance
To access the device camera or photo library you would call something like:
media.show( mediaSource, listener [, filetable] )
In order to test this, you'd have to load the app onto your phone; Corona doesn't allow you to use a webcam with the simulator. For more information, check out the Corona docs at http://docs.coronalabs.com/api/library/media/show.html