On macOS 10.14, I have this in my Info.plist:
<key>NSMicrophoneUsageDescription</key>
<string>Record audio!</string>
This works in a Swift project:
AVCaptureDevice.requestAccess(for: .audio) { granted in
    if granted {
        //self.setupCaptureSession()
    }
}
But this does not work in an Objective-C project (Thread 8: signal SIGABRT):
[AVCaptureDevice requestAccessForMediaType:AVMediaTypeAudio completionHandler:^(BOOL granted) {
    if (granted) {
        //self.microphoneConsentState = PrivacyConsentStateGranted;
    } else {
        //self.microphoneConsentState = PrivacyConsentStateDenied;
    }
}];
What have I done wrong or missed in the Objective-C project? (I don't want to convert my project to Swift.)
Any help appreciated. Thanks, paul
In order to use AVCaptureDevice you need to add usage descriptions for both keys, i.e. NSCameraUsageDescription and NSMicrophoneUsageDescription.
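For example, in Info.plist (the description strings are placeholders):

<key>NSCameraUsageDescription</key>
<string>Used to capture video.</string>
<key>NSMicrophoneUsageDescription</key>
<string>Used to record audio.</string>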
For reference, please see the Apple documentation.
If you only want to record audio you can also use the AVAudioSession API.
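A minimal sketch of that route (note that AVAudioSession is an iOS API; on macOS you would stay with AVCaptureDevice):

#import <AVFoundation/AVFoundation.h>

// iOS-only sketch: request record permission through AVAudioSession.
[[AVAudioSession sharedInstance] requestRecordPermission:^(BOOL granted) {
    if (granted) {
        // start recording here
    }
}];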
I began a new Objective-C project (macOS 10.14) and copied everything from the old project into it (xibs etc.). It displayed the appropriate access dialogue but didn't actually record. I had to check the Audio Input checkbox under Capabilities, after messing about for an hour. :-)
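For anyone hitting the same wall: for a sandboxed app, that checkbox sets the audio-input entitlement in the target's .entitlements file:

<key>com.apple.security.device.audio-input</key>
<true/>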
Not sure what to even document here. I updated Xcode to 12.0.1 and, out of nowhere, after building and running my application, the images inside the app are not rendering, apart from the splash screen image, which is built natively.
These images, whether they come from Firebase Storage (remote) or are icons (local), are simply not rendering. There were no changes to any image code anywhere in the application; the only changes were the Xcode 12 update and a macOS update to Catalina 10.15.7.
Any ideas on what's going on? Let me know if I can provide additional details.
This behavior is a known issue with iOS 14, as you can see here.
You need to change the following file:
react-native/Libraries/Image/RCTUIImageViewAnimated.m
and add the following line:
- (void)displayLayer:(CALayer *)layer
{
    if (_currentFrame) {
        layer.contentsScale = self.animatedImageScale;
        layer.contents = (__bridge id)_currentFrame.CGImage;
    } else {
        [super displayLayer:layer]; // add this else branch with this line here
    }
}
Source: https://github.com/facebook/react-native/issues/29279#issuecomment-658244428
I'm starting with Kinect SDK 1.7, using KinectRegion and other controls like KinectTileButton and KinectScrollViewer from the toolkit. My questions are:
How do I enable KinectRegion to work with both the left and right hands?
Does SDK 1.7 have something ready-made for zooming?
How do I detect grip and release?
Is any code available on the Internet?
Thank you!
To enable the KinectRegion:
Import the "Microsoft.Kinect.Toolkit.Controls" project into your solution (use Add -> Existing Project).
Add a reference to "Microsoft.Kinect.Toolkit.Controls" in your project.
Add a KinectRegion to your XAML.
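The markup would look roughly like this (a sketch; the explicit clr-namespace mapping is one way to reference the toolkit assembly):

<Window ...
        xmlns:k="clr-namespace:Microsoft.Kinect.Toolkit.Controls;assembly=Microsoft.Kinect.Toolkit.Controls">
    <Grid>
        <k:KinectRegion x:Name="kinectRegion">
            <!-- Kinect-enabled controls (KinectTileButton, KinectScrollViewer, ...) go here -->
        </k:KinectRegion>
    </Grid>
</Window>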
Import/use the toolkit namespaces in your xaml.cs file:
using Microsoft.Kinect.Toolkit;
using Microsoft.Kinect.Toolkit.Controls;
Bind the sensor chooser's current sensor to the KinectRegion:
var regionSensorBinding = new Binding("Kinect") { Source = this.sensorChooser };
BindingOperations.SetBinding(this.kinectRegion, KinectRegion.KinectSensorProperty, regionSensorBinding);
I don't get what you mean by "zooming". Please give more detail.
To detect hand grip and release, you can use the "AddHandPointerGripHandler" and "AddHandPointerGripReleaseHandler" helpers on your KinectRegion. Please take a look at KinectScrollViewer.cs.
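A sketch of wiring those up (the handler names are illustrative):

// Attach grip / grip-release handlers to the region; KinectRegion exposes
// these as static attached-event helpers in the toolkit.
KinectRegion.AddHandPointerGripHandler(this.kinectRegion, this.OnHandPointerGrip);
KinectRegion.AddHandPointerGripReleaseHandler(this.kinectRegion, this.OnHandPointerGripRelease);

private void OnHandPointerGrip(object sender, HandPointerEventArgs e)
{
    // e.HandPointer identifies which hand gripped
}

private void OnHandPointerGripRelease(object sender, HandPointerEventArgs e)
{
    // the grip was released
}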
You can explore the code for the hand pointer and related features in the Kinect Developer Toolkit Browser app.
As far as I can remember, KinectRegion works with both hands and automatically detects which one is the primary one.
The grip and release detection is also automatic on KinectScrollViewer controls.
About the zooming I have no idea.
You'll find a good tutorial on the Kinect SDK 1.7 interaction features at this link.
An absolutely outstanding course is at the links below:
The first part shows the basics of the Kinect SDK.
The second part is similar to the first part, but uses MS Blend.
And the third part is a tutorial for the interaction stream, where you can get information about both hands.
But if you'd like to use both hands in a KinectRegion, you have to edit Microsoft.Kinect.Toolkit.Controls -> KinectRegion.cs -> line 1000 (more info in the MSDN blog question).
It helped me! (I had the same problem.)
Grip detection is available via kinectRegion.HandPointers[index of hand (0 is left, 1 is right)].IsInGripInteraction, which is a bool. I added some code:
private Skeleton[] skeleton;

private void kinect_SkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
{
    using (SkeletonFrame sf = e.OpenSkeletonFrame())
    {
        if (sf != null) // check that a frame is available
        {
            if (this.skeleton == null)
                this.skeleton = new Skeleton[sf.SkeletonArrayLength]; // allocate once
            sf.CopySkeletonDataTo(this.skeleton); // get the skeletal information in this frame
        }
    }
}

// subscribe to skeleton frames, e.g. after starting the sensor:
sensor.SkeletonFrameReady += new EventHandler<SkeletonFrameReadyEventArgs>(kinect_SkeletonFrameReady);

// then, per frame, check the grip state:
foreach (var sk in skeleton)
{
    if (sk.TrackingId == 0) continue; // skip untracked slots

    if (kinectRegion.HandPointers[0].IsInGripInteraction) // 0 = left, 1 = right
    {
        .......
    }
}
I am trying to get my brain around what I can and can't reasonably do UI-wise in a multi-platform app. Initially we are only concerned about iOS and Android, but may need a mobile Windows version eventually.
The specific question is: How do I replicate the Android ExpandableListView functionality in iOS? I've tried a few searches, but haven't found a hint. The key I need is collapsible sections. Is that doable with an iOS listview? If so, do you have/know of an example?
The related non-specific question is: What advice do you have for someone just starting out with multi-platform mobile Mono? I've been working from Greg Shackles' excellent book, "Mobile Development in C#" (which has been wildly helpful!), so I've got some basics. But I'm sure there are some hidden landmines when you get into more complex UI design. Any advice would be greatly appreciated.
Thank you!
You can use UITableView, and merely change the size of your cell to display more content as needed.
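For collapsible sections with a plain UITableView, a minimal sketch (classic MonoTouch-era API; the class and field names are illustrative) just reports zero rows for a collapsed section:

// Sketch: collapsible sections by varying the reported row count.
public class CollapsibleSource : UITableViewSource
{
    bool[] sectionExpanded = { true, false };
    string[][] rows = {
        new[] { "A1", "A2" },
        new[] { "B1", "B2", "B3" },
    };

    public override int NumberOfSections (UITableView tableView)
    {
        return rows.Length;
    }

    public override int RowsInSection (UITableView tableView, int section)
    {
        // a collapsed section simply reports zero rows
        return sectionExpanded[section] ? rows[section].Length : 0;
    }

    public override UITableViewCell GetCell (UITableView tableView, NSIndexPath indexPath)
    {
        var cell = tableView.DequeueReusableCell ("cell")
            ?? new UITableViewCell (UITableViewCellStyle.Default, "cell");
        cell.TextLabel.Text = rows[indexPath.Section][indexPath.Row];
        return cell;
    }
}

Toggle sectionExpanded[i] (for example from a tap on a section header) and call ReloadSections on the table view to animate the collapse.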
Alternatively, consider DialogViewController (part of MonoTouch.Dialog), which simplifies the setup of a UITableView.
What you could do is create a UIView that contains both the content and the expanded content. It would be controlled by some property, for example:
bool expanded;
public bool Expanded {
    get { return expanded; }
    set {
        if (expanded == value)
            return;
        Frame = ComputeSize (value);
        expanded = value;
    }
}
Then, create a UIViewElement:
new RootElement ("My Root") {
    new Section () {
        new UIViewElement ("", new MyView (), true)
    }
}
For your first question, perhaps you could try:
https://github.com/OliverLetterer/SLExpandableTableView
I'm implementing a sample.js file that defines a floating component.
This sample.js file is independent of any particular application: when I add this file to a Sencha Touch project it should show the floating component, and if I add the same sample.js to an Ext JS 4.1 project then it should show the component as well.
For this I want to know whether the application is using the Sencha Touch SDK or the Ext JS 4.1 SDK.
How can I achieve this?
Any help would be highly appreciated.
We can tell whether it's Touch or Ext JS by checking the following condition:
if (Ext.versions.touch) {
    // write your Touch-related code here
} else if (Ext.versions.extjs) {
    // write your Ext JS code here
}
Ext.version is documented for Sencha Touch, while Ext.versions is not. An alternative to the above solution would be:
if (window.Ext) {
    if (Ext.version) {
        // Touch
    } else {
        // Ext JS
    }
}
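As a usage sketch for the original floating-component goal (the component configs below are illustrative):

if (window.Ext) {
    if (Ext.versions && Ext.versions.touch) {
        // Sencha Touch: a floating, centered panel
        Ext.Viewport.add({ xtype: 'panel', centered: true, html: 'Hello' });
    } else {
        // Ext JS 4.1: a floating window
        Ext.create('Ext.window.Window', { title: 'Hello', html: 'Hello' }).show();
    }
}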
Before iOS 6, I was using this URL scheme to open the native maps app and get directions from the user's current location to an address that I created:
http://maps.google.com/maps?daddr=" + address + "&saddr=Current+Location
This was working great, but now that Google Maps is gone in iOS 6, we had to check which iOS version the user is on and send them to the new Apple Maps URL scheme if they are on iOS 6.0 or greater. The new URL scheme we are using is this:
http://maps.apple.com/maps?daddr=" + address + "&saddr=Current+Location
This is based on the new documentation for map URL schemes, which can be found here.
Anyway, I've tested it a bunch, and it boils down to this: the new Apple Maps does not recognize Current Location the way Google Maps did.
Does anyone know how I fix this?
Keep in mind I am building an HTML app with PhoneGap, so using native code to set the starting address to the current location won't help me.
I am having the same problem. I haven't found a solution yet, but if you leave off the saddr:
http://maps.apple.com/maps?daddr=" + address
it will just ask the user where to start, and the first option is "Current Location", so when they tap "Current Location" it shows the map correctly.
If anyone finds a better solution please post it, as I am still looking for one.
You can use my method:
<script type="text/javascript">
var link = "maps:saddr=YPlat,YPlong&daddr=42.118599,-72.625122";
navigator.geolocation.getCurrentPosition(showPosition);

function showPosition(position)
{
    // substitute the real coordinates for the placeholders
    link = link.replace("YPlat", position.coords.latitude);
    link = link.replace("YPlong", position.coords.longitude);
    window.location = link;
}
</script>
Confirmed with iOS 5.1 and iOS 6.
Just pass "Current Location" as the source address:
http://maps.apple.com/maps?saddr=Current%20Location&daddr=Your_Address
You can get the coordinates of the current location using CLLocationManager, or its wrapper DKLocationManager (on GitHub), created by Keith Pitt.
Once you have the coordinates, you can use the following code sample.
+ (void)openDirectionFrom:(CLLocation *)currentLocation To:(NSString *)daddr {
    NSString *urlStr;
    NSString *saddr = @"Current+Location";
    if ([[[UIDevice currentDevice] systemVersion] floatValue] >= 6) {
        // iOS 6+: use maps.apple.com. "Current Location" doesn't work in iOS 6,
        // so we must provide the coordinates.
        if ((currentLocation.coordinate.latitude != kCLLocationCoordinate2DInvalid.latitude) &&
            (currentLocation.coordinate.longitude != kCLLocationCoordinate2DInvalid.longitude)) {
            // Valid location.
            saddr = [NSString stringWithFormat:@"%f,%f", currentLocation.coordinate.latitude, currentLocation.coordinate.longitude];
            urlStr = [NSString stringWithFormat:@"http://maps.apple.com/maps?saddr=%@&daddr=%@", saddr, daddr];
        } else {
            // Invalid location; Location Services are disabled.
            urlStr = [NSString stringWithFormat:@"http://maps.apple.com/maps?daddr=%@", daddr];
        }
    } else {
        // < iOS 6: use maps.google.com.
        urlStr = [NSString stringWithFormat:@"http://maps.google.com/maps?saddr=%@&daddr=%@", saddr, daddr];
    }
    [[UIApplication sharedApplication] openURL:[NSURL URLWithString:urlStr]];
}
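A usage sketch (MapHelper and locationManager are illustrative names; the destination address should be URL-encoded first):

NSString *daddr = [@"1 Infinite Loop, Cupertino, CA"
    stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
[MapHelper openDirectionFrom:locationManager.location To:daddr];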