Sandboxing turned on and is there a more Objective-C way to go about opening a com port?

Right now I have the following class and it's not working; it fails at the open call. I think this is because of sandboxing, but I'm not sure how to move forward. Also, this seems very C-like; is there a better, more Objective-C (IOKit?) way to do this?
The device itself is a USB serial port.
Error given:
Failed to open device: Operation not permitted
Current code:
#include <termios.h>
#import "Com.h"
#define BAUDCOUNT 17
speed_t baud_const[BAUDCOUNT] = { B50, B75, B110, B134, B150, B200,
B300, B600, B1200, B1800, B2400, B4800,
B9600, B19200, B38400, B57600, B115200 };
unsigned long baud_value[BAUDCOUNT] = { 50, 75, 110, 134, 150, 200,
300, 600, 1200, 1800, 2400, 4800,
9600, 19200, 38400, 57600, 115200 };
@interface Com()
@property (assign) int fd;
@property (assign) struct termios oldio;
@end
@implementation Com
@synthesize fd = _fd;
@synthesize oldio = _oldio;
- (id)init
{
self = [super init];
if (self)
{
self.fd = -1;
return self;
}
return nil;
}
- (BOOL)open:(NSString*)device withBaud:(int)baud
{
struct termios oldio;
struct termios newtio;
// Find the correct baud rate
int baudid = -1;
for(int i = 0; i < BAUDCOUNT; i++) {
if (baud_value[i] == baud) {
baudid = i;
break;
}
}
if(baudid == -1)
{
NSLog(@"Invalid baud rate: %d", baud);
return NO;
}
// Open the device
self.fd = open([device UTF8String], O_RDWR | O_NOCTTY | O_NDELAY);
if (self.fd < 0)
{
NSLog(@"Failed to open device: %s", strerror(errno));
return NO;
}
// Save old settings
tcgetattr(self.fd, &oldio);
self.oldio = oldio;
// Init memory
memset(&newtio, 0x00 , sizeof(newtio));
cfmakeraw(&newtio);
// settings flags
newtio.c_cflag |= CS8;
newtio.c_cflag |= CLOCAL;
newtio.c_cflag |= CREAD;
newtio.c_iflag = IGNPAR | IGNBRK;
newtio.c_oflag = 0;
newtio.c_lflag = 0;
// No read timeout (VTIME is in tenths of a second)
newtio.c_cc[VTIME] = 0;
// Return immediately even if no bytes are available (non-blocking read)
newtio.c_cc[VMIN] = 0;
// Setting baudrate
cfsetispeed (&newtio, baud_const[baudid]);
cfsetospeed (&newtio, baud_const[baudid]);
// Flushing buffer
tcflush(self.fd, TCIOFLUSH);
// applying new configuration
tcsetattr(self.fd, TCSANOW, &newtio);
return YES;
}
@end
Current Entitlements:
<key>com.apple.security.app-sandbox</key>
<true/>
<key>com.apple.security.device.usb</key>
<true/>
<key>com.apple.security.files.user-selected.read-write</key>
<true/>

There are a couple of Objective-C frameworks for handling the serial port:
AMSerialPort
SerialConnect
Update: It looks to me like you might need the com.apple.security.device.serial entitlement. Despite the fact that you are using a USB serial adapter, I believe the OS device drivers make all serial ports appear the same, i.e. not as USB devices, so I suspect the com.apple.security.device.usb entitlement that you do have doesn't apply to USB serial devices.
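For reference, with that key added your entitlements would look something like this (a sketch only, alongside what you already have):
<key>com.apple.security.app-sandbox</key>
<true/>
<key>com.apple.security.device.serial</key>
<true/>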
Update 2: I've just found this Open Radar from August last year which reports your problem. So it looks like the com.apple.security.device.serial entitlement doesn't work (or certainly didn't work back in August).
Update 3: If you do need to access the USB device directly (and assuming this works with the com.apple.security.device.usb entitlement) then you probably want to look at the IOSerialFamily source code which is available at http://opensource.apple.com/source/IOSerialFamily/IOSerialFamily-59/.
Source tarball: http://opensource.apple.com/tarballs/IOSerialFamily/IOSerialFamily-59.tar.gz

You are doing some fairly low-level, kernel-adjacent programming, which may be exactly what you want, but if you can use IOKit instead then I do recommend it. In short, I simply don't recommend mixing that kind of kernel-level access with sandboxing, as you have found out.
IOKit is a different type of animal and really isn't too bad. Here is a quick snippet that prevents the system from sleeping the display; it demonstrates how you can use IOKit, and I believe it does work with a sandboxed app.
IOPMAssertionID assertionID;
IOReturn err = IOPMAssertionCreateWithName(kIOPMAssertionTypeNoDisplaySleep, kIOPMAssertionLevelOn, CFSTR("Add Reason Here"), &assertionID);
if ( err == kIOReturnSuccess ) {
// Do stuff here while system is unable to sleep
// Let the system resume ability to sleep
err = IOPMAssertionRelease(assertionID);
}
This documentation should also help: link
Note: "Important: If your application is sandboxed, it must request the com.apple.security.device.usb entitlement in order to access USB devices."
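If you do go the IOKit route for the serial port itself, the usual pattern is to ask the IOSerialBSD layer for the callout device path and then open that path with termios, much as the question already does. A minimal sketch (the function name is mine, and I haven't verified how this behaves under the sandbox restrictions discussed above):
#include <IOKit/IOKitLib.h>
#include <IOKit/serial/IOSerialKeys.h>
#include <CoreFoundation/CoreFoundation.h>
#include <stdio.h>
// Enumerate serial devices registered with IOKit and print their callout
// paths (e.g. /dev/cu.usbserial-XXXX), which could then be handed to the
// open:withBaud: method from the question.
static void ListSerialDevices(void)
{
    CFMutableDictionaryRef match = IOServiceMatching(kIOSerialBSDServiceValue);
    if (match == NULL) return;
    CFDictionarySetValue(match, CFSTR(kIOSerialBSDTypeKey), CFSTR(kIOSerialBSDAllTypes));
    io_iterator_t iter = 0;
    // IOServiceGetMatchingServices consumes one reference to 'match'.
    if (IOServiceGetMatchingServices(kIOMasterPortDefault, match, &iter) != KERN_SUCCESS) return;
    io_object_t service;
    while ((service = IOIteratorNext(iter)) != 0) {
        CFTypeRef path = IORegistryEntryCreateCFProperty(service, CFSTR(kIOCalloutDeviceKey), kCFAllocatorDefault, 0);
        if (path != NULL) {
            char devicePath[1024];
            if (CFStringGetCString((CFStringRef)path, devicePath, sizeof(devicePath), kCFStringEncodingUTF8)) {
                printf("Found serial device: %s\n", devicePath);
            }
            CFRelease(path);
        }
        IOObjectRelease(service);
    }
    IOObjectRelease(iter);
}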

Related

How do I create an Inter App MIDI In port

I want to program an inter-app MIDI In port in my arranger app that can be accessed by other MIDI apps, and I would very much appreciate some sample code. I built a virtual MIDI In port like this, but how do I make it visible to other apps?
MIDIClientRef virtualMidi;
result = MIDIClientCreate(CFSTR("Virtual Client"), MyMIDINotifyProc, NULL, &virtualMidi);
You need to use MIDIDestinationCreate, which will be visible to other MIDI Clients. You need to provide a MIDIReadProc callback that will be notified when a MIDI event arrives to your MIDI Destination. You may create another MIDI Input Port as well, with the same callback, that you can connect yourself from within your own program to an external MIDI Source.
Here is an example (in C++):
void internalCreate(CFStringRef name)
{
OSStatus result = noErr;
result = MIDIClientCreate( name , nullptr, nullptr, &m_client );
if (result != noErr) {
qDebug() << "MIDIClientCreate() err:" << result;
return;
}
result = MIDIDestinationCreate ( m_client, name, MacMIDIReadProc, (void*) this, &m_endpoint );
if (result != noErr) {
qDebug() << "MIDIDestinationCreate() err:" << result;
return;
}
result = MIDIInputPortCreate( m_client, name, MacMIDIReadProc, (void *) this, &m_port );
if (result != noErr) {
qDebug() << "MIDIInputPortCreate() error:" << result;
return;
}
}
Another example, in Objective-C, from SimpleSynth:
- (id)initWithName:(NSString*)newName
{
PYMIDIManager* manager = [PYMIDIManager sharedInstance];
MIDIEndpointRef newEndpoint;
OSStatus error;
SInt32 newUniqueID;
// This makes sure that we don't get notified about this endpoint until after
// we're done creating it.
[manager disableNotifications];
MIDIDestinationCreate ([manager midiClientRef], (CFStringRef)newName, midiReadProc, self, &newEndpoint);
// This code works around a bug in OS X 10.1 that causes
// new sources/destinations to be created without unique IDs.
error = MIDIObjectGetIntegerProperty (newEndpoint, kMIDIPropertyUniqueID, &newUniqueID);
if (error == kMIDIUnknownProperty) {
newUniqueID = PYMIDIAllocateUniqueID();
MIDIObjectSetIntegerProperty (newEndpoint, kMIDIPropertyUniqueID, newUniqueID);
}
MIDIObjectSetIntegerProperty (newEndpoint, CFSTR("PYMIDIOwnerPID"), [[NSProcessInfo processInfo] processIdentifier]);
[manager enableNotifications];
self = [super initWithMIDIEndpointRef:newEndpoint];
ioIsRunning = NO;
return self;
}
Ports can't be discovered from the API, but sources and destinations can. You want to create a MIDISource or MIDIDestination so that MIDI clients can call MIDIGetNumberOfDestinations/MIDIGetDestination or MIDIGetNumberOfSources/MIDIGetSource and discover it.
FYI, there is no need to do what you are planning to do on macOS because the IAC driver already does it. If this is for iOS, these are the steps to follow:
Create at least one MIDI Client.
Create a MIDIInputPort with a read block for I/O.
Use MIDIPortConnectSource to attach the input port to every MIDI Source of interest.
[From now, every MIDI message received by the source will come to your read block.]
If you want to resend this data to a different destination, you'll need to have created a MIDIOutputPort as well. Use MIDISend with that port to the desired MIDI Destination.
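A rough sketch of those steps in plain C (the function names and the forward-everything-to-destination-0 behaviour are mine, purely for illustration):
#include <CoreMIDI/CoreMIDI.h>
static MIDIPortRef gOutPort = 0;
// Read callback: every packet list received from a connected source is
// forwarded, unchanged, to MIDI destination 0 (if one exists).
static void MyReadProc(const MIDIPacketList *pktlist, void *refCon, void *srcConnRefCon)
{
    if (gOutPort != 0 && MIDIGetNumberOfDestinations() > 0) {
        MIDISend(gOutPort, MIDIGetDestination(0), pktlist);
    }
}
void SetUpMIDIThru(void)
{
    MIDIClientRef client = 0;
    MIDIPortRef inPort = 0;
    MIDIClientCreate(CFSTR("Thru Client"), NULL, NULL, &client);
    MIDIInputPortCreate(client, CFSTR("Thru In"), MyReadProc, NULL, &inPort);
    MIDIOutputPortCreate(client, CFSTR("Thru Out"), &gOutPort);
    // Attach the input port to every MIDI source currently present.
    ItemCount n = MIDIGetNumberOfSources();
    for (ItemCount i = 0; i < n; i++) {
        MIDIPortConnectSource(inPort, MIDIGetSource(i), NULL);
    }
}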

How to put BG96 on power save mode between sending messages to Azure IoT Hub over HTTP

I'm using a Nucleo L496ZG, X-NUCLEO-IKS01A2 and the Quectel BG96 module to send sensor data (temperature, humidity etc..) to Azure IoT Central over HTTP.
I've been using the example implementation provided by Avnet here, which works fine, but it's not power optimized, and with a 6700mAh battery pack it only lasts around 30 hours sending telemetry every ~10 seconds. The goal is for it to last around a week. I'm open to increasing the time between messages, but I also want to save power in between sending.
I've gone over the Quectel BG96 manuals and I've tried two things:
1) powering off the device by driving the PWRKEY and turning it back on when I need to send a message
I've gotten this to work, kinda… until I get a HardFault exception, which happens seemingly randomly anywhere from ~5 minutes of running to 2 hours in (messages send successfully prior to the exception). The output of the crash log parser is the same every time:
Crash location = strncmp [0x08038DF8] (based on PC value)
Caller location = _findenv_r [0x0804119D] (based on LR value)
Stack Pointer at the time of crash = [20008128]
Target and Fault Info:
Processor Arch: ARM-V7M or above
Processor Variant: C24
Forced exception, a fault with configurable priority has been escalated to HardFault
A precise data access error has occurred. Faulting address: 03060B30
The caller location traces back to my .map file and I don't know what to make of it.
My code:
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//#define USE_MQTT
#include <stdlib.h>
#include "mbed.h"
#include "iothubtransporthttp.h"
#include "iothub_client_core_common.h"
#include "iothub_client_ll.h"
#include "azure_c_shared_utility/platform.h"
#include "azure_c_shared_utility/agenttime.h"
#include "jsondecoder.h"
#include "bg96gps.hpp"
#include "azure_message_helper.h"
#define IOT_AGENT_OK CODEFIRST_OK
#include "azure_certs.h"
/* initialize the expansion board && sensors */
#include "XNucleoIKS01A2.h"
static HTS221Sensor *hum_temp;
static LSM6DSLSensor *acc_gyro;
static LPS22HBSensor *pressure;
static const char* connectionString = "xxx";
// to report F uncomment this #define CTOF(x) (((double)(x)*9/5)+32)
#define CTOF(x) (x)
Thread azure_client_thread(osPriorityNormal, 10*1024, NULL, "azure_client_thread");
static void azure_task(void);
EventFlags deleteOK;
size_t g_message_count_send_confirmations;
/* create the GPS elements for example program */
BG96Interface* bg96Interface;
//static int tilt_event;
// void mems_int1(void)
// {
// tilt_event++;
// }
void mems_init(void)
{
//acc_gyro->attach_int1_irq(&mems_int1); // Attach callback to LSM6DSL INT1
hum_temp->enable(); // Enable HTS221 environmental sensor
pressure->enable(); // Enable barometric pressure sensor
acc_gyro->enable_x(); // Enable LSM6DSL accelerometer
//acc_gyro->enable_tilt_detection(); // Enable Tilt Detection
}
void powerUp(void) {
if (platform_init() != 0) {
printf("Error initializing the platform\r\n");
return;
}
bg96Interface = (BG96Interface*) easy_get_netif(true);
}
void BG96_Modem_PowerOFF(void)
{
DigitalOut BG96_RESET(D7);
DigitalOut BG96_PWRKEY(D10);
DigitalOut BG97_WAKE(D11);
BG96_RESET = 0;
BG96_PWRKEY = 0;
BG97_WAKE = 0;
wait_ms(300);
}
void powerDown(){
platform_deinit();
BG96_Modem_PowerOFF();
}
//
// The main routine simply prints a banner, initializes the system
// starts the worker threads and waits for a termination (join)
int main(void)
{
//printStartMessage();
XNucleoIKS01A2 *mems_expansion_board = XNucleoIKS01A2::instance(I2C_SDA, I2C_SCL, D4, D5);
hum_temp = mems_expansion_board->ht_sensor;
acc_gyro = mems_expansion_board->acc_gyro;
pressure = mems_expansion_board->pt_sensor;
azure_client_thread.start(azure_task);
azure_client_thread.join();
platform_deinit();
printf(" - - - - - - - ALL DONE - - - - - - - \n");
return 0;
}
static void send_confirm_callback(IOTHUB_CLIENT_CONFIRMATION_RESULT result, void* userContextCallback)
{
//userContextCallback;
// When a message is sent this callback will get invoked
g_message_count_send_confirmations++;
deleteOK.set(0x1);
}
void sendMessage(IOTHUB_CLIENT_LL_HANDLE iotHubClientHandle, char* buffer, size_t size)
{
IOTHUB_MESSAGE_HANDLE messageHandle = IoTHubMessage_CreateFromByteArray((const unsigned char*)buffer, size);
if (messageHandle == NULL) {
printf("unable to create a new IoTHubMessage\r\n");
return;
}
if (IoTHubClient_LL_SendEventAsync(iotHubClientHandle, messageHandle, send_confirm_callback, NULL) != IOTHUB_CLIENT_OK)
printf("FAILED to send! [RSSI=%d]\n", platform_RSSI());
else
printf("OK. [RSSI=%d]\n",platform_RSSI());
IoTHubMessage_Destroy(messageHandle);
}
void azure_task(void)
{
//bool tilt_detection_enabled=true;
float gtemp, ghumid, gpress;
int k;
int msg_sent=1;
while (true) {
powerUp();
mems_init();
/* Setup IoTHub client configuration */
IOTHUB_CLIENT_LL_HANDLE iotHubClientHandle = IoTHubClient_LL_CreateFromConnectionString(connectionString, HTTP_Protocol);
if (iotHubClientHandle == NULL) {
printf("Failed on IoTHubClient_Create\r\n");
return;
}
// add the certificate information
if (IoTHubClient_LL_SetOption(iotHubClientHandle, "TrustedCerts", certificates) != IOTHUB_CLIENT_OK)
printf("failure to set option \"TrustedCerts\"\r\n");
#if MBED_CONF_APP_TELUSKIT == 1
if (IoTHubClient_LL_SetOption(iotHubClientHandle, "product_info", "TELUSIOTKIT") != IOTHUB_CLIENT_OK)
printf("failure to set option \"product_info\"\r\n");
#endif
// polls will happen effectively at ~10 seconds. The default value of minimumPollingTime is 25 minutes.
// For more information, see:
// https://azure.microsoft.com/documentation/articles/iot-hub-devguide/#messaging
unsigned int minimumPollingTime = 9;
if (IoTHubClient_LL_SetOption(iotHubClientHandle, "MinimumPollingTime", &minimumPollingTime) != IOTHUB_CLIENT_OK)
printf("failure to set option \"MinimumPollingTime\"\r\n");
IoTDevice* iotDev = (IoTDevice*)malloc(sizeof(IoTDevice));
if (iotDev == NULL) {
return;
}
setUpIotStruct(iotDev);
char* msg;
size_t msgSize;
hum_temp->get_temperature(&gtemp); // get Temp
hum_temp->get_humidity(&ghumid); // get Humidity
pressure->get_pressure(&gpress); // get pressure
iotDev->Temperature = CTOF(gtemp);
iotDev->Humidity = (int)ghumid;
iotDev->Pressure = (int)gpress;
printf("(%04d)",msg_sent++);
msg = makeMessage(iotDev);
msgSize = strlen(msg);
sendMessage(iotHubClientHandle, msg, msgSize);
free(msg);
iotDev->Tilt &= 0x2;
/* schedule IoTHubClient to send events/receive commands */
IOTHUB_CLIENT_STATUS status;
while ((IoTHubClient_LL_GetSendStatus(iotHubClientHandle, &status) == IOTHUB_CLIENT_OK) && (status == IOTHUB_CLIENT_SEND_STATUS_BUSY))
{
IoTHubClient_LL_DoWork(iotHubClientHandle);
ThisThread::sleep_for(100);
}
deleteOK.wait_all(0x1);
free(iotDev);
IoTHubClient_LL_Destroy(iotHubClientHandle);
powerDown();
ThisThread::sleep_for(300000);
}
return;
}
I know PSM is probably the way to go since powering on/off the device draws a lot of power but it would be useful if someone had an idea of what is happening here.
2) Putting the device into PSM between sending messages
The BG96 library in the example code I'm using doesn't have a method to turn on PSM, so I tried to implement my own. When I try to run it, it basically runs into an exception right away, so I know it's wrong (I'm very new to embedded development and have no prior experience with AT commands).
/** ----------------------------------------------------------
* this is a method provided by current library
* @brief Tx a string to the BG96 and wait for an OK response
* @param none
* @retval true if OK received, false otherwise
*/
bool BG96::tx2bg96(char* cmd) {
bool ok=false;
_bg96_mutex.lock();
ok=_parser.send(cmd) && _parser.recv("OK");
_bg96_mutex.unlock();
return ok;
}
/**
* method I created in an attempt to use PSM
*/
bool BG96::psm(void) {
return tx2bg96((char*)"AT+CPSMS=1,,,”00000100”,”00000001”");
}
Can someone tell me what I'm doing wrong and provide any guidance on how I can achieve my goal of having my device run on battery for longer?
Thank you!!
I got Power Saving Mode working by using Mbed's ATCmdParser and the AT+QPSMS commands as per Quectel's docs. The modem doesn't always go into power saving mode right away so that should be noted. I also found that I have to restart the modem afterwards or else I get weird behaviour. My code looks something like this:
bool BG96::psm(char* T3412, char* T3324) {
_bg96_mutex.lock();
if(_parser.send("AT+QPSMS=1,,,\"%s\",\"%s\"", T3412, T3324) && _parser.recv("OK")) {
_bg96_mutex.unlock();
}else {
_bg96_mutex.unlock();
return false;
}
return BG96Ready(); // restarts the modem
}
To send a message to Azure, the modem needs to be manually woken up by driving the PWRKEY to start bi-directional communication, and a new client handle needs to be created and torn down every time, since the Azure connection uses keepAlive and the modem is unreachable while it's in PSM.
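For what it's worth, calling that method might look like this (modem is just a placeholder for whatever BG96 instance your code holds, and the timer strings are examples whose unit/value encoding you should verify against Quectel's PSM application note):
// Hypothetical usage of the psm() method above. Assuming the standard 3GPP
// 3-bit-unit / 5-bit-value timer encoding, "00100010" requests a periodic
// TAU (T3412) of about 2 hours and "00000001" an active time (T3324) of ~2 s.
if (!modem.psm((char*)"00100010", (char*)"00000001")) {
    printf("Failed to enable PSM\r\n");
}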

CoreAudio AudioQueue callback function never called, no errors reported

I am trying to implement simple playback-from-a-file functionality and it appears that my callback function is never called. It doesn't really make sense, because all of the OSStatuses come back 0 and the other numbers all appear correct as well (like the output packets read pointer from AudioFileReadPackets).
Here is the setup:
OSStatus stat;
stat = AudioFileOpenURL(
(CFURLRef)urlpath, kAudioFileReadPermission, 0, &aStreamData->aFile
);
UInt32 dsze = 0;
stat = AudioFileGetPropertyInfo(
aStreamData->aFile, kAudioFilePropertyDataFormat, &dsze, 0
);
stat = AudioFileGetProperty(
aStreamData->aFile, kAudioFilePropertyDataFormat, &dsze, &aStreamData->aDescription
);
stat = AudioQueueNewOutput(
&aStreamData->aDescription, bufferCallback, aStreamData, NULL, NULL, 0, &aStreamData->aQueue
);
aStreamData->pOffset = 0;
for(int i = 0; i < NUM_BUFFERS; i++) {
stat = AudioQueueAllocateBuffer(
aStreamData->aQueue, aStreamData->aDescription.mBytesPerPacket, &aStreamData->aBuffer[i]
);
bufferCallback(aStreamData, aStreamData->aQueue, aStreamData->aBuffer[i]);
}
stat = AudioQueuePrime(aStreamData->aQueue, 0, NULL);
stat = AudioQueueStart(aStreamData->aQueue, NULL);
(Not shown is where I'm checking the value of stat in between the functions, it just comes back normal.)
And the callback function:
void bufferCallback(void *uData, AudioQueueRef queue, AudioQueueBufferRef buffer) {
UInt32 bread = 0;
UInt32 pread = buffer->mAudioDataBytesCapacity / player->aStreamData->aDescription.mBytesPerPacket;
OSStatus stat;
stat = AudioFileReadPackets(
player->aStreamData->aFile, false, &bread, NULL, player->aStreamData->pOffset, &pread, buffer->mAudioData
);
buffer->mAudioDataByteSize = bread;
stat = AudioQueueEnqueueBuffer(queue, buffer, 0, NULL);
player->aStreamData->pOffset += pread;
}
Where aStreamData is my user data struct (typedefed so I can use it as a class property) and player is a static instance of the controlling Objective-C class. If any other code is wanted please let me know. I am a bit at my wit's end. Printing any of the numbers involved here yields the correct result, including functions in bufferCallback when I call it myself in the allocate loop. It just never gets called thereafter. The start up method returns and nothing happens.
Also anecdotally, I am using a peripheral device (an MBox Pro 3) to play the sound which CoreAudio only boots up when it is about to output. IE if I start iTunes or something, the speakers pop faintly and there is an LED that goes from blinking to solid. The device boots up like it does so CA is definitely doing something. (Also I've of course tried it with the onboard Macbook sound sans the device.)
I've read other solutions to problems that sound similar and they don't work, such as using multiple buffers, which I am doing now and which doesn't appear to make any difference.
I basically assume I am doing something obviously wrong somehow but not sure what it could be. I've read the relevant documentation, looked at the available code examples and scoured the net a bit for answers and it appears that this is all I need to do and it should just go.
At the very least, is there anything else I can do to investigate?
My first answer was not good enough, so I compiled a minimal example that will play a 2 channel, 16 bit wave file.
The main difference from your code is that I made a property listener listening for play start and stop events.
As for your code, it seems legit at first glance. Two things I will point out, though:
1. It seems you are allocating buffers with TOO SMALL a buffer size. I have noticed that AudioQueues won't play if the buffers are too small, which seems to fit your problem.
2. Have you verified the properties returned?
Back to my code example:
Everything is hard coded, so it is not exactly good coding practice, but it shows how you can do it.
AudioStreamTest.h
#import <Foundation/Foundation.h>
#import <AudioToolbox/AudioToolbox.h>
uint32_t bufferSizeInSamples;
AudioFileID file;
UInt32 currentPacket;
AudioQueueRef audioQueue;
AudioQueueBufferRef buffer[3];
AudioStreamBasicDescription audioStreamBasicDescription;
@interface AudioStreamTest : NSObject
- (void)start;
- (void)stop;
@end
AudioStreamTest.m
#import "AudioStreamTest.h"
@implementation AudioStreamTest
- (id)init
{
self = [super init];
if (self) {
bufferSizeInSamples = 441;
file = NULL;
currentPacket = 0;
audioStreamBasicDescription.mBitsPerChannel = 16;
audioStreamBasicDescription.mBytesPerFrame = 4;
audioStreamBasicDescription.mBytesPerPacket = 4;
audioStreamBasicDescription.mChannelsPerFrame = 2;
audioStreamBasicDescription.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
audioStreamBasicDescription.mFormatID = kAudioFormatLinearPCM;
audioStreamBasicDescription.mFramesPerPacket = 1;
audioStreamBasicDescription.mReserved = 0;
audioStreamBasicDescription.mSampleRate = 44100;
}
return self;
}
- (void)start {
AudioQueueNewOutput(&audioStreamBasicDescription, AudioEngineOutputBufferCallback, (__bridge void *)(self), NULL, NULL, 0, &audioQueue);
AudioQueueAddPropertyListener(audioQueue, kAudioQueueProperty_IsRunning, AudioEnginePropertyListenerProc, NULL);
AudioQueueStart(audioQueue, NULL);
}
- (void)stop {
AudioQueueStop(audioQueue, YES);
AudioQueueRemovePropertyListener(audioQueue, kAudioQueueProperty_IsRunning, AudioEnginePropertyListenerProc, NULL);
}
void AudioEngineOutputBufferCallback(void *inUserData, AudioQueueRef inAQ, AudioQueueBufferRef inBuffer) {
if (file == NULL) return;
UInt32 bytesRead = bufferSizeInSamples * 4;
UInt32 packetsRead = bufferSizeInSamples;
AudioFileReadPacketData(file, false, &bytesRead, NULL, currentPacket, &packetsRead, inBuffer->mAudioData);
inBuffer->mAudioDataByteSize = bytesRead;
currentPacket += packetsRead;
if (bytesRead == 0) {
AudioQueueStop(inAQ, false);
}
else {
AudioQueueEnqueueBuffer(inAQ, inBuffer, 0, NULL);
}
}
void AudioEnginePropertyListenerProc (void *inUserData, AudioQueueRef inAQ, AudioQueuePropertyID inID) {
//We are only interested in the property kAudioQueueProperty_IsRunning
if (inID != kAudioQueueProperty_IsRunning) return;
//Get the status of the property
UInt32 isRunning = false;
UInt32 size = sizeof(isRunning);
AudioQueueGetProperty(inAQ, kAudioQueueProperty_IsRunning, &isRunning, &size);
if (isRunning) {
currentPacket = 0;
NSString *fileName = @"/Users/roy/Documents/XCodeProjectsData/FUZZ/03.wav";
NSURL *fileURL = [[NSURL alloc] initFileURLWithPath: fileName];
AudioFileOpenURL((__bridge CFURLRef) fileURL, kAudioFileReadPermission, 0, &file);
for (int i = 0; i < 3; i++){
AudioQueueAllocateBuffer(audioQueue, bufferSizeInSamples * 4, &buffer[i]);
UInt32 bytesRead = bufferSizeInSamples * 4;
UInt32 packetsRead = bufferSizeInSamples;
AudioFileReadPacketData(file, false, &bytesRead, NULL, currentPacket, &packetsRead, buffer[i]->mAudioData);
buffer[i]->mAudioDataByteSize = bytesRead;
currentPacket += packetsRead;
AudioQueueEnqueueBuffer(audioQueue, buffer[i], 0, NULL);
}
}
else {
if (file != NULL) {
AudioFileClose(file);
file = NULL;
for (int i = 0; i < 3; i++) {
AudioQueueFreeBuffer(audioQueue, buffer[i]);
buffer[i] = NULL;
}
}
}
}
-(void)dealloc {
AudioQueueDispose(audioQueue, true);
audioQueue = NULL;
[super dealloc]; // under manual reference counting, [super dealloc] must come last
}
@end
Lastly, I want to include some research I have done today to test the robustness of AudioQueues.
I have noticed that if you make too small AudioQueue buffers, it won't play at all. That made me play around a bit to see why it is not playing.
If I try a buffer size that can hold only 150 samples, I get no sound at all.
If I try a buffer size that can hold 175 samples, it plays the whole song through, but with a lot of distortion. 175 samples amounts to a tad less than 4 ms of audio.
AudioQueue keeps asking for new buffers as long as you keep supplying buffers. That is regardless of AudioQueue actually playing your buffers or not.
If you supply a buffer with size 0, the buffer will be lost and an error kAudioQueueErr_BufferEmpty is returned for that queue enqueue request. You will never see AudioQueue ask you to fill that buffer again. If this happened for the last queue you have posted, AudioQueue will stop asking you to fill any more buffers. In that case you will not hear any more audio for that session.
To see why AudioQueues is not playing anything with smaller buffer sizes, I made a test to see if my callback is called at all even when there is no sound. The answer is that the callback gets called all the time, as long as the AudioQueue is playing and needs data.
So if you keep feeding buffers to the queue, no buffer is ever lost. It doesn't happen. Unless there is an error, of course.
So why is no sound playing?
I tested to see if 'AudioQueueEnqueueBuffer()' returned any errors. It did not. No other errors within my play routine either. The data returned from reading from file is also good.
Everything is normal, buffers are good, data re-enqueued is good, there is just no sound.
So my last test was to slowly increase buffer size till I could hear anything. I finally heard faint and sporadic distortion.
Then it came to me...
It seems that the problem lies with that the system tries to keep the stream in sync with time so if you enqueue audio, and the time for the audio you wanted to play has passed, it will just skip that part of the buffer. If the buffer size becomes too small, more and more data is dropped or skipped until the audio system is in sync again. Which is never if the buffer size is too small. (You can hear this as distortion if you chose a buffer size that is barely large enough to support continuous play.)
If you think about it, it is the only way the audio queue can work, but it is a good realisation when you are clueless like me and "discover" how it really works.
I decided to take a look at this again and was able to solve it by making the buffers larger. I've accepted the answer by @RoyGal since it was their suggestion, but I wanted to provide the actual code that works, since I guess others are having the same problem (the question has a few favorites that aren't me at the moment).
One thing I tried was making the packet size larger:
aData->aDescription.mFramesPerPacket = 512; // or some other number
aData->aDescription.mBytesPerPacket = (
aData->aDescription.mFramesPerPacket * aData->aDescription.mBytesPerFrame
);
This does NOT work: it causes AudioQueuePrime to fail with an AudioConverterNew returned -50 message. I guess it wants mFramesPerPacket to be 1 for PCM.
(I also tried setting the kAudioQueueProperty_DecodeBufferSizeFrames property which didn't seem to do anything. Not sure what it's for.)
The solution seems to be to only allocate the buffer(s) with the specified size:
AudioQueueAllocateBuffer(
aData->aQueue,
aData->aDescription.mBytesPerPacket * N_BUFFER_PACKETS / N_BUFFERS,
&aData->aBuffer[i]
);
And the size has to be sufficiently large. I found the magic number is:
mBytesPerPacket * 1024 / N_BUFFERS
(Where N_BUFFERS is the number of buffers and should be > 1 or playback is choppy.)
Here is an MCVE demonstrating the issue and solution:
#import <Foundation/Foundation.h>
#import <AudioToolbox/AudioToolbox.h>
#import <AudioToolbox/AudioQueue.h>
#import <AudioToolbox/AudioFile.h>
#define N_BUFFERS 2
#define N_BUFFER_PACKETS 1024
typedef struct AStreamData {
AudioFileID aFile;
AudioQueueRef aQueue;
AudioQueueBufferRef aBuffer[N_BUFFERS];
AudioStreamBasicDescription aDescription;
SInt64 pOffset;
volatile BOOL isRunning;
} AStreamData;
void printASBD(AudioStreamBasicDescription* desc) {
printf("mSampleRate = %d\n", (int)desc->mSampleRate);
printf("mBytesPerPacket = %d\n", desc->mBytesPerPacket);
printf("mFramesPerPacket = %d\n", desc->mFramesPerPacket);
printf("mBytesPerFrame = %d\n", desc->mBytesPerFrame);
printf("mChannelsPerFrame = %d\n", desc->mChannelsPerFrame);
printf("mBitsPerChannel = %d\n", desc->mBitsPerChannel);
}
void bufferCallback(
void *vData, AudioQueueRef aQueue, AudioQueueBufferRef aBuffer
) {
AStreamData* aData = (AStreamData*)vData;
UInt32 bRead = 0;
UInt32 pRead = (
aBuffer->mAudioDataBytesCapacity / aData->aDescription.mBytesPerPacket
);
OSStatus stat;
stat = AudioFileReadPackets(
aData->aFile, false, &bRead, NULL, aData->pOffset, &pRead, aBuffer->mAudioData
);
if(stat != 0) {
printf("AudioFileReadPackets returned %d\n", stat);
}
if(pRead == 0) {
aData->isRunning = NO;
return;
}
aBuffer->mAudioDataByteSize = bRead;
stat = AudioQueueEnqueueBuffer(aQueue, aBuffer, 0, NULL);
if(stat != 0) {
printf("AudioQueueEnqueueBuffer returned %d\n", stat);
}
aData->pOffset += pRead;
}
AStreamData* beginPlayback(NSURL* path) {
static AStreamData* aData;
aData = malloc(sizeof(AStreamData));
OSStatus stat;
stat = AudioFileOpenURL(
(CFURLRef)path, kAudioFileReadPermission, 0, &aData->aFile
);
printf("AudioFileOpenURL returned %d\n", stat);
UInt32 dSize = 0;
stat = AudioFileGetPropertyInfo(
aData->aFile, kAudioFilePropertyDataFormat, &dSize, 0
);
printf("AudioFileGetPropertyInfo returned %d\n", stat);
stat = AudioFileGetProperty(
aData->aFile, kAudioFilePropertyDataFormat, &dSize, &aData->aDescription
);
printf("AudioFileGetProperty returned %d\n", stat);
printASBD(&aData->aDescription);
stat = AudioQueueNewOutput(
&aData->aDescription, bufferCallback, aData, NULL, NULL, 0, &aData->aQueue
);
printf("AudioQueueNewOutput returned %d\n", stat);
aData->pOffset = 0;
for(int i = 0; i < N_BUFFERS; i++) {
// change YES to NO for stale playback
if(YES) {
stat = AudioQueueAllocateBuffer(
aData->aQueue,
aData->aDescription.mBytesPerPacket * N_BUFFER_PACKETS / N_BUFFERS,
&aData->aBuffer[i]
);
} else {
stat = AudioQueueAllocateBuffer(
aData->aQueue,
aData->aDescription.mBytesPerPacket,
&aData->aBuffer[i]
);
}
printf(
"AudioQueueAllocateBuffer returned %d for aBuffer[%d] with capacity %d\n",
stat, i, aData->aBuffer[i]->mAudioDataBytesCapacity
);
bufferCallback(aData, aData->aQueue, aData->aBuffer[i]);
}
UInt32 numFramesPrepared = 0;
stat = AudioQueuePrime(aData->aQueue, 0, &numFramesPrepared);
printf("AudioQueuePrime returned %d with %d frames prepared\n", stat, numFramesPrepared);
stat = AudioQueueStart(aData->aQueue, NULL);
printf("AudioQueueStart returned %d\n", stat);
UInt32 pSize = sizeof(UInt32);
UInt32 isRunning;
stat = AudioQueueGetProperty(
aData->aQueue, kAudioQueueProperty_IsRunning, &isRunning, &pSize
);
printf("AudioQueueGetProperty returned %d\n", stat);
aData->isRunning = !!isRunning;
return aData;
}
void endPlayback(AStreamData* aData) {
OSStatus stat = AudioQueueStop(aData->aQueue, NO);
printf("AudioQueueStop returned %d\n", stat);
}
NSString* getPath() {
// change NO to YES and enter path to hard code
if(NO) {
return @"";
}
char input[512];
printf("Enter file path: ");
scanf("%[^\n]", input);
return [[NSString alloc] initWithCString:input encoding:NSASCIIStringEncoding];
}
int main(int argc, const char* argv[]) {
NSAutoreleasePool* pool = [[NSAutoreleasePool alloc] init];
NSURL* path = [NSURL fileURLWithPath:getPath()];
AStreamData* aData = beginPlayback(path);
if(aData->isRunning) {
do {
printf("Queue is running...\n");
[NSThread sleepForTimeInterval:1.0];
} while(aData->isRunning);
endPlayback(aData);
} else {
printf("Playback did not start\n");
}
[pool drain];
return 0;
}

MIDISend: to play a musical note on iPhone

I am trying to generate a musical note that will play through the iPhone speakers using Objective-C and MIDI. I have the code below but it is not doing anything. What am I doing wrong?
MIDIPacketList packetList;
packetList.numPackets = 1;
MIDIPacket* firstPacket = &packetList.packet[0];
firstPacket->timeStamp = 0; // send immediately
firstPacket->length = 3;
firstPacket->data[0] = 0x90;
firstPacket->data[1] = 80;
firstPacket->data[2] = 120;
MIDIPacketList pklt=packetList;
MIDISend(MIDIGetSource(0), MIDIGetDestination(0), &pklt);
You've got three problems:
Declaring a MIDIPacketList doesn't allocate memory or initialize the structure
You're passing the results of MIDIGetSource (which returns a MIDIEndpointRef) as the first parameter to MIDISend where it is expecting a MIDIPortRef instead. (You probably ignored a compiler warning about this. Never ignore compiler warnings.)
Sending a MIDI note in iOS doesn't make any sound. If you don't have an external MIDI device connected to your iOS device, you need to set up something with CoreAudio that will generate sounds. That's beyond the scope of this answer.
So this code will run, but it won't make any sounds unless you've got external hardware:
//Look to see if there's anything that will actually play MIDI notes
NSLog(@"There are %lu destinations", MIDIGetNumberOfDestinations());
// Prepare MIDI Interface Client/Port for writing MIDI data:
MIDIClientRef midiclient = 0;
MIDIPortRef midiout = 0;
OSStatus status;
status = MIDIClientCreate(CFSTR("Test client"), NULL, NULL, &midiclient);
if (status) {
NSLog(@"Error trying to create MIDI Client structure: %d", (int)status);
}
status = MIDIOutputPortCreate(midiclient, CFSTR("Test port"), &midiout);
if (status) {
NSLog(@"Error trying to create MIDI output port: %d", (int)status);
}
Byte buffer[128];
MIDIPacketList *packetlist = (MIDIPacketList *)buffer;
MIDIPacket *currentpacket = MIDIPacketListInit(packetlist);
NSInteger messageSize = 3; //Note On is a three-byte message
Byte msg[3] = {0x90, 80, 120};
MIDITimeStamp timestamp = 0;
currentpacket = MIDIPacketListAdd(packetlist, sizeof(buffer), currentpacket, timestamp, messageSize, msg);
MIDISend(midiout, MIDIGetDestination(0), packetlist);

libusb_open returns 'LIBUSB_ERROR_NOT_SUPPORTED' on Windows 7

I have been developing USB drivers using LibUSB on Linux, but now I want to have one of my drivers compiled for Windows (this is the first time I am doing it).
My environment
I am working on Windows 7 using the MinGW compiler (also using Dev-cpp IDE), and I am using a pre-compiled libusb library downloaded from this link.
My device: It's a HID touch device. So no drivers are required for Windows. I have an additional endpoint to get certain debug data.
My code:
I have compiled code to list all the devices and USB devices connected to my machine, and that code works. Now I add code to open the device so that I get a device handle and can start communication, but the function returns -12, that is, LIBUSB_ERROR_NOT_SUPPORTED.
How can I fix this problem?
I searched the Internet and did not find a definite solution for this problem, while the same code works beautifully on Linux.
P.S.: I have added the whole code below. The DoList(); function works fine, but the GetTRSDevice(); function fails at libusb_open(dev, &handle);.
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <libusb.h>
libusb_device_handle* deviceHandle = NULL;
int DoList();
libusb_device_handle* GetTRSDevice(void);
int main()
{
int ret = libusb_init(NULL);
if (ret < 0) {
printf("Failed to init libusb");
return ret;
}
DoList();
deviceHandle = GetTRSDevice();
if(!deviceHandle) {
printf("Failed to locate device");
goto fail_dev_open;
}
printf("Device opened");
libusb_close(deviceHandle);
fail_dev_open:
libusb_exit(NULL);
return(ret);
}
int DoList()
{
libusb_device **devs;
ssize_t cnt;
cnt = libusb_get_device_list(NULL, &devs);
if (cnt < 0)
return (int) cnt;
libusb_device *dev;
int i = 0;
while ((dev = devs[i++]) != NULL) {
struct libusb_device_descriptor desc;
int r = libusb_get_device_descriptor(dev, &desc);
if (r < 0) {
fprintf(stderr, "failed to get device descriptor");
return(-1);
}
printf("%04x:%04x (bus %d, device %d)\n",
desc.idVendor, desc.idProduct,
libusb_get_bus_number(dev), libusb_get_device_address(dev));
}
libusb_free_device_list(devs, 1);
return 0;
}
libusb_device_handle* GetTRSDevice(void)
{
int i = 0;
ssize_t cnt;
libusb_device *dev;
libusb_device **devs;
libusb_device_handle* handle = NULL;
cnt = libusb_get_device_list(NULL, &devs);
if (cnt < 0) {
printf("Failed libusb_get_device_list");
return(0);
}
while ((dev = devs[i++]) != NULL) {
struct libusb_device_descriptor desc;
int ret = libusb_get_device_descriptor(dev, &desc);
if (ret < 0) {
printf("Failed libusb_get_device_descriptor");
continue;
}
if(desc.idVendor == 0X238f && desc.idProduct == 1) {
int ret = libusb_open(dev, &handle);
if (ret < 0) {
printf("Failed libusb_open: %d\n\r",ret);
break;
}
#ifndef WIN32
libusb_detach_kernel_driver(handle, 0);
#endif
ret = libusb_claim_interface(handle,0);
if (ret < 0) {
libusb_close(handle);
handle=NULL;
break;
}
break;
}
}
libusb_free_device_list(devs, 1);
return(handle);
}
You can easily install the WinUSB driver or the other drivers which libusb supports (libusb-win32 and libusbK) through the use of Zadig, an application that was developed just to solve this problem.
See https://zadig.akeo.ie.
One thing to keep in mind, though, is that if you replace a Mass Storage driver or HID driver (which Windows installs automatically) with WinUSB, you will only be able to access your device through libusb and won't be able to access your device as Mass Storage or HID until you uninstall the WinUSB driver.
Finally, if you have control of the firmware for your device, it is also possible to create devices that will automatically install the WinUSB driver on Vista or later, so that users don't have to go through a manual driver installation (this may require a connection to Windows Update for Windows 7 or earlier, but should work even without an internet connection for Windows 8 or later). See https://github.com/pbatard/libwdi/wiki/WCID-Devices.
[DISCLAIMER] I am the author of Zadig/libwdi and the WCID wiki pages, as well as a contributor to the libusb Windows backend.
It seems you need to install the winusb driver - libusb can get information about devices without this driver, but it cannot open them.
http://libusb.6.n5.nabble.com/LIBUSB-ERROR-NOT-SUPPORTED-td5617169.html:
On Wed, Apr 4, 2012 at 11:52 PM, Quân Phạm Minh <[hidden email]>
wrote:
although I never install winusb driver but I use libusb to get
information of my usb (kingston usb, and already
recognize by system)
Yes that is possible. But you can not open the device and do further
things. That is the confusing part for new users with regard to
libusb Windows backend, and similarly for Mac OS X as well. libusb
can get some basic information for device with a non-proper driver
(e.g.: USB mass storage device), but will not be able to open the
device without changing the driver to a supported one.
-- Xiaofan
I had this same issue and it was not solved by installing WinUSB drivers with Zadig.
Consistently I found that libusb_open() returns LIBUSB_ERROR_NOT_SUPPORTED if and only if I have a Logitech Unifying Receiver plugged into another USB port. This causes the pyusb libusb1 backend to raise an exception like "NotImplementedError: Operation not supported or unimplemented on this platform".
I have removed the Logitech receiver (so I am using a wired keyboard) and the problem is solved for me. I would love to know why or how the Logitech receiver can cause this error on another USB port, but I don't.
I had the same issue with Zadig not working. I fixed it by connecting the device directly to my laptop rather than through a USB-C hub.
I had the same issue. But to my surprise, ignoring the error and continuing the execution of the rest of the code sent the data as normal. So the device was supported and could communicate as normal; for some reason, libusb-1.0 just thought it could not support the device.
My code example:
#include <libusb.h>
#include <stdio.h>
#include <string.h> /* for memset() */
#include <unistd.h>
libusb_context* context = NULL;
int main(void)
{
int kernelDriverDetached = 0; /* Set to 1 if kernel driver detached*/
uint8_t buffer[64]; /* 64 byte transfer buffer */
int numBytes = 0; /* Actual bytes transferred. */
libusb_device_handle* handle = NULL;
int res = libusb_init(&context); //initialize the library using libusb_init
if (res != 0)
{
fprintf(stderr, "Error initialising libusb.\n");
}
/* Get the first device with the matching Vendor ID and Product ID.If
* intending to allow multiple demo boards to be connected at once,you
* will need to use libusb_get_device_list() instead. Refer to the libusb
* documentation for details. */
handle = libusb_open_device_with_vid_pid(context, 0x1cbe, 0x0003); /* use the context initialised above */
if (!handle)
{
fprintf(stderr, "Unable to open device.\n");
}
/* Check whether a kernel driver is attached to interface #0. If so, we'll
* need to detach it.*/
if (libusb_kernel_driver_active(handle, 0))
{
res = libusb_detach_kernel_driver(handle, 0);
if (res == 0)
{
kernelDriverDetached = 1;
}
else
{
fprintf(stderr, "Error detaching kernel driver. %s\n", libusb_error_name(res));
}
}
/* Claim interface #0. */
res = libusb_claim_interface(handle, 0);
if (res != 0)
{
fprintf(stderr, "Error claiming interface.\n");
}
memset(buffer, 0, 12);
buffer[0] = 0x55;
buffer[1] = 0xAA;
buffer[2] = 0x01;
buffer[3] = 0x00;
buffer[4] = 0x00;
buffer[5] = 0x00;
buffer[6] = 0x00;
buffer[7] = 0x00;
buffer[8] = 0x01;
buffer[9] = 0x00;
buffer[10] = 0x01;
buffer[11] = 0x01;
res = libusb_bulk_transfer(handle, 0x01, buffer, 12, &numBytes, 100);
if (res == 0)
{
printf("%d bytes transmitted successfully.\n", numBytes);
}
else
{
fprintf(stderr, "Error during send message: %s\n",libusb_error_name(res));
}
memset(buffer, 0, 12);
res = libusb_bulk_transfer(handle, 0x81, buffer, 12, &numBytes, 100);
if (res == 0)
{
printf("%d bytes received successfully.\n", numBytes);
}
else
{
fprintf(stderr, "Error during receive response: %s\n", libusb_error_name(res));
}
/* Release interface #0. */
res = libusb_release_interface(handle, 0);
if (0 != res)
{
fprintf(stderr, "Error releasing interface.\n");
}
/* If we detached a kernel driver from interface #0 earlier, we'll now
* need to attach it again. */
if (kernelDriverDetached)
{
libusb_attach_kernel_driver(handle, 0);
}
/* Shutdown libusb. */
libusb_exit(context);
system("pause");
return 0;
}