Can I record device sound with AudioKit? (Swift / Objective-C)

Does AudioKit provide any way to record the internal device audio (not the microphone input, but the device's own audio output) on iOS, using Swift or Objective-C?

See here for an official Apple response:
The simplest solution would be to use ReplayKit to record your app and have ReplayKit deliver sample buffers directly to your app. (You can just ignore the video and mic buffers if you don't need them.)
Take a look at the WWDC video (https://developer.apple.com/videos/play/wwdc2020/10633/) at about the 8:45 mark.
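The ReplayKit approach from the Apple response might be sketched as follows. This is a minimal, untested outline (the `AppAudioRecorder` wrapper and its `handle` method are placeholders, not part of any Apple sample); the key point is filtering for `.audioApp` buffers and dropping the rest:

```swift
import ReplayKit
import AVFoundation

// Sketch: capture only the app's own audio via ReplayKit,
// ignoring the video and microphone sample buffers.
final class AppAudioRecorder {
    private let recorder = RPScreenRecorder.shared()

    func start() {
        recorder.isMicrophoneEnabled = false
        recorder.startCapture(handler: { sampleBuffer, bufferType, error in
            guard error == nil else { return }
            // Only app-audio buffers are of interest here;
            // .video and .audioMic buffers are simply dropped.
            if bufferType == .audioApp {
                self.handle(sampleBuffer)
            }
        }, completionHandler: { error in
            if let error = error { print("Capture failed: \(error)") }
        })
    }

    private func handle(_ buffer: CMSampleBuffer) {
        // Write to an AVAssetWriter, feed an audio engine, etc.
    }

    func stop() {
        recorder.stopCapture { error in
            if let error = error { print("Stop failed: \(error)") }
        }
    }
}
```

Note that ReplayKit only captures audio produced by your own app, not system-wide audio from other apps, which matches the limits of the original question.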

Related

Does MobileVLCKit support the AirPlay option on iOS?

I have implemented VLCKit to play live streaming video, and now I want to play it on Apple TV via the AirPlay option on my streaming screen.
Can you please help me out?
Thanks
From the question it's not clear what you mean by implementing VLC. Apple provides a framework for streaming live video (AVPlayer), which supports Picture in Picture and AirPlay.
To make things clearer, please elaborate on what exactly you are trying to do.
Support for AirPlay will be available in libvlc and VLCKit 4.0 next year. It will allow you to play anything from any source on Apple TV and, if needed, will convert the media on the fly. Thus, AirPlay support will match what's possible with Chromecast right now.
With VLCKit 3, there is no good way to do it. You could do display mirroring, but this would have a bad impact on performance and video quality. Audio-only output via AirPlay / RAOP works just fine, and the quality is good. It even supports multichannel now.

Is there something that iBeacon offers that can't be done on iOS 6 using Core Bluetooth?

I've started reading about the above and about BLE devices in general, trying to find the differences between these two frameworks. Is there something the iBeacon API offers besides the option to use startMonitoringForRegion with a CLBeaconRegion (which will basically "wake up" or notify the app when you're in range)?
To my understanding, on an iPhone 4S and up I can get a list of BLE devices and check their signal strength on iOS 6. Sure, it will not be as simple as the ranging API, but still: am I missing something?
Thanks
Yes, the CoreLocation APIs allow you to see iBeacon devices where CoreBluetooth does not. See my in-depth discussion of this here: http://developer.radiusnetworks.com/2013/10/21/corebluetooth-doesnt-let-you-see-ibeacons.html
This may not matter if you want to roll your own Bluetooth LE devices that are fully visible to CoreBluetooth. But such devices are somewhat more complex and more power-hungry. iBeacons, by contrast, are transmit-only and send a minimal amount of data.
The bottom line is that if you want your app to see standard iBeacons, CoreBluetooth simply will not do the job.
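The CoreLocation side of this answer (monitoring plus ranging with CLBeaconRegion) might look roughly like the sketch below. The UUID and identifier are placeholder values, and the `BeaconScanner` class is illustrative, not an API type; it uses the iOS 7-era ranging API discussed in the answer:

```swift
import CoreLocation

// Sketch: monitoring and ranging standard iBeacons with CoreLocation,
// which is what CoreBluetooth cannot replace per the answer above.
final class BeaconScanner: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    // Placeholder proximity UUID -- use your beacons' actual UUID.
    private let region = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!,
        identifier: "com.example.beacons")

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.startMonitoring(for: region)    // "wakes up" the app on entry/exit
        manager.startRangingBeacons(in: region) // live proximity while in range
    }

    func locationManager(_ manager: CLLocationManager,
                         didRangeBeacons beacons: [CLBeacon],
                         in region: CLBeaconRegion) {
        for beacon in beacons {
            print("major:\(beacon.major) minor:\(beacon.minor) rssi:\(beacon.rssi)")
        }
    }
}
```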

iOS internal mic to AirPlay (streaming)

There are plenty of examples (including source) of apps out there that a) record audio (from the internal mic of an iOS device), and b) play audio (over AirPlay).
I'm having trouble finding any sort of example/documentation that might demonstrate how to do both of these simultaneously (or, within expected network latency). I'm interested in routing audio, in real time, from the internal mic of an iOS device to an AirPlay device (Airport Express, Apple TV, etc.). No recording necessary.
Can anyone either describe one way this might be done, or point me in the direction of some related documentation? Thank you!
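One way this might be sketched (an assumption on my part, not an official recipe) is a mic-to-output passthrough with AVAudioEngine: if the user routes the device's audio output to an AirPlay device, the same graph feeds that route. Expect AirPlay's inherent buffering latency, which is typically on the order of seconds rather than milliseconds:

```swift
import AVFoundation

// Sketch: pass the microphone through to the current audio output in
// near-real time. With the output routed to AirPlay, the mic audio
// follows that route (subject to AirPlay's buffering delay).
func startMicPassthrough() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default,
                            options: [.allowAirPlay])
    try session.setActive(true)

    let engine = AVAudioEngine()
    let input = engine.inputNode
    // Connect mic input straight to the main mixer (and thus the output).
    engine.connect(input, to: engine.mainMixerNode,
                   format: input.outputFormat(forBus: 0))
    try engine.start()
}
```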

How to catch bluetooth peripheral's command

I want to catch commands from bluetooth peripheral in iOS.
Could anyone help me out?
There are a few ways of doing Bluetooth on iOS, and different ones have different methods:
An accessory that's part of the Made for iPhone program (see the answer to this question if you want to know what that entails)
A device that uses Bluetooth 4.0 Low Energy mode
Talking to another iPhone with GameKit
A device using one of the Bluetooth profiles Apple supports natively: the hands-free profile, the headset profile, A2DP, AVRCP, etc.
To answer those in order:
If you're part of the Made for iPhone program, you'll already know where to find this information, and it's not public: ask your contact at Apple for help.
If your device uses Bluetooth 4.0 Low Energy, look into the Core Bluetooth framework.
If you're trying to get two iPhones talking together, look into GameKit.
If you're working with a device that uses an Apple provided profile, you shouldn't have to do anything, it'll just work. For example, an A2DP device will stream audio played from the phone without needing programmer intervention. You can do a few things to control it: there's a Core Audio function somewhere that lets you choose whether to send audio to a Bluetooth device. If you're trying to support AVRCP, look into handling remote control events.
If your device doesn't fall into any of the above categories, you're probably out of luck and can't use it with iOS.
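For option 2 above (a Bluetooth 4.0 Low Energy device), receiving data from the peripheral with Core Bluetooth might be sketched as follows. The `BLEListener` class is illustrative, and the discovery is deliberately unfiltered (`nil` service and characteristic UUIDs); a real app would pass the peripheral's advertised UUIDs instead:

```swift
import CoreBluetooth

// Sketch: subscribe to notifications from a BLE peripheral, which is
// how incoming "commands" typically arrive in Core Bluetooth.
final class BLEListener: NSObject, CBCentralManagerDelegate, CBPeripheralDelegate {
    private var central: CBCentralManager!
    private var peripheral: CBPeripheral?

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        if central.state == .poweredOn {
            central.scanForPeripherals(withServices: nil, options: nil)
        }
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        self.peripheral = peripheral   // retain, or the connection is dropped
        central.stopScan()
        central.connect(peripheral, options: nil)
    }

    func centralManager(_ central: CBCentralManager,
                        didConnect peripheral: CBPeripheral) {
        peripheral.delegate = self
        peripheral.discoverServices(nil)
    }

    func peripheral(_ peripheral: CBPeripheral, didDiscoverServices error: Error?) {
        for service in peripheral.services ?? [] {
            peripheral.discoverCharacteristics(nil, for: service)
        }
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didDiscoverCharacteristicsFor service: CBService,
                    error: Error?) {
        for c in service.characteristics ?? [] where c.properties.contains(.notify) {
            peripheral.setNotifyValue(true, for: c)  // subscribe to incoming data
        }
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didUpdateValueFor characteristic: CBCharacteristic,
                    error: Error?) {
        if let data = characteristic.value {
            print("Received \(data.count) bytes")  // the peripheral's "command"
        }
    }
}
```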

Streaming encoded video in Adobe AIR Application

I am developing a desktop application in Adobe AIR that will be used to stream the user's camera video to a Wowza media server. I want to encode the video on the fly, meaning transmit H.264-encoded video instead of the default Flash Player encoding, for quality purposes. Is there any way to do this?
Waiting for help from the people around,
Rick
H.264 encoding is usually done in native code (C or C++) because it is a CPU-intensive set of algorithms. The source code for x264 can give you an idea of the code required, but it is a tough read if you start from scratch. Here is a book to get you started, or you can read the original AVC standard if you suffer from insomnia.