"Could not allocate arena" after adding Bluetooth functions to voice recognition code - tensorflow

I am trying to make a Bluetooth speaker controlled by voice commands. I have successfully written code to connect to the speaker via Bluetooth. Then I have successfully trained a neural network and got the speaker to recognize my commands (Speaker has a microphone). I followed this tutorial and code: https://github.com/atomic14/voice-controlled-robot/tree/main/firmware.
However, when I try to combine the two pieces of code, I get a "could not allocate arena" error and the code doesn't run properly. All I did was add "#include <btAudio.h>", "btAudio audio = btAudio("ESP_Speaker");", and "audio.begin();" to the main file of the voice recognition code. I have attached the code for the main file; all of the other files are pretty much unchanged. I've also attached a picture of the output after the program uploads: Code output. Does anyone have an idea of how I should approach fixing this error? Thank you.
#include <Arduino.h>
#include <WiFi.h>
#include <driver/i2s.h>
#include <esp_task_wdt.h>
#include "I2SMicSampler.h"
#include "ADCSampler.h"
#include "config.h"
#include "CommandDetector.h"
#include "CommandProcessor.h"
#include <btAudio.h>
#include <stdint.h>
#include <esp_avrc_api.h>
btAudio audio = btAudio("ESP_Speaker");
// i2s config for using the internal ADC
i2s_config_t adcI2SConfig = {
    .mode = (i2s_mode_t)(I2S_MODE_MASTER | I2S_MODE_RX | I2S_MODE_ADC_BUILT_IN),
    .sample_rate = 16000,
    .bits_per_sample = I2S_BITS_PER_SAMPLE_16BIT,
    .channel_format = I2S_CHANNEL_FMT_ONLY_LEFT,
    .communication_format = I2S_COMM_FORMAT_I2S_LSB,
    .intr_alloc_flags = ESP_INTR_FLAG_LEVEL1,
    .dma_buf_count = 4,
    .dma_buf_len = 64,
    .use_apll = false,
    .tx_desc_auto_clear = false,
    .fixed_mclk = 0};
// This task does all the heavy lifting for our application
void applicationTask(void *param)
{
    CommandDetector *commandDetector = static_cast<CommandDetector *>(param);
    const TickType_t xMaxBlockTime = pdMS_TO_TICKS(100);
    while (true)
    {
        // wait for some audio samples to arrive
        uint32_t ulNotificationValue = ulTaskNotifyTake(pdTRUE, xMaxBlockTime);
        if (ulNotificationValue > 0)
        {
            commandDetector->run();
        }
    }
}
void setup()
{
    Serial.begin(115200);
    audio.begin();
    delay(1000);
    Serial.println("Starting up");
    // make sure we don't get killed for our long running tasks
    esp_task_wdt_init(10, false);
    // start up the I2S input (from either an I2S microphone or Analogue microphone via the ADC)
#ifdef USE_I2S_MIC_INPUT
    // Direct i2s input from INMP441 or the SPH0645
    I2SSampler *i2s_sampler = new I2SMicSampler(i2s_mic_pins, false);
#else
    // Use the internal ADC
    I2SSampler *i2s_sampler = new ADCSampler(ADC_UNIT_1, ADC_MIC_CHANNEL);
#endif
    // the command processor
    CommandProcessor *command_processor = new CommandProcessor();
    // create our application
    CommandDetector *commandDetector = new CommandDetector(i2s_sampler, command_processor);
    // set up the i2s sample writer task
    TaskHandle_t applicationTaskHandle;
    xTaskCreatePinnedToCore(applicationTask, "Command Detect", 8192, commandDetector, 1, &applicationTaskHandle, 0);
    // start sampling from i2s device - use I2S_NUM_0 as that's the one that supports the internal ADC
#ifdef USE_I2S_MIC_INPUT
    i2s_sampler->start(I2S_NUM_0, i2sMemsConfigBothChannels, applicationTaskHandle);
#else
    i2s_sampler->start(I2S_NUM_0, adcI2SConfig, applicationTaskHandle);
#endif
}

void loop()
{
    vTaskDelay(pdMS_TO_TICKS(1000));
}
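One diagnostic I'm planning to try is logging the free heap before and after audio.begin(), to see how much memory the Bluetooth stack takes away from what the TensorFlow arena needs (just a sketch, using the Arduino-ESP32 ESP.getFreeHeap() call):
void setup()
{
    Serial.begin(115200);
    Serial.printf("Free heap before btAudio: %u\n", ESP.getFreeHeap());
    audio.begin();
    Serial.printf("Free heap after btAudio:  %u\n", ESP.getFreeHeap());
    // ... rest of setup() unchanged
}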

Related

I2C communication between PIC32 and LCD

I am trying to communicate with an LCD2041 over I2C. I am using a PIC32MM Curiosity board. I wrote the following code with MPLAB Code Configurator, but the status of the I2C communication is stuck on I2C2_MESSAGE_PENDING. I need help figuring out what I might have done wrong or what I am missing.
#include <stdint.h>
#include <string.h>
#include <xc.h>
#include "mcc_generated_files/mcc.h"
//#include "lcd_i2c.h"
#define slave_Adress 0b01010000
void ByteDelay(void){
    // Delay between bytes required by LCD2041 spec
    DELAY_microseconds(625);
}
void ReadDelay(void){
    // Delay between read commands required by LCD2041 spec
    DELAY_milliseconds(3);
}
void TransactionDelay(void){
    // Delay between transactions required by LCD2041 spec
    DELAY_microseconds(375);
}
int main(void)
{
    SYSTEM_Initialize();
    uint8_t data = 0xFE; // 0xFE: command prefix the host sends to the LCD over I2C
    uint8_t lcd_clear_display = 0xA4; // command to clear LCD
    TRISBbits.TRISB2 = 1; // set B2 (SCL) as input
    TRISBbits.TRISB3 = 1; // set B3 (SDA) as input
    I2C2_Initialize();
    I2C2_MESSAGE_STATUS status;
    I2C2_MasterWrite(data, 1, slave_Adress, &status);
    ByteDelay();
    if (status == I2C2_MESSAGE_PENDING) { led_3_SetHigh(); }
    return 1;
}
The default slave address for the LCD is 0x50
I had a similar error once with a PIC18F and MCC. I had forgotten to enable the global interrupts, and I don't see a place where you do that in your code.
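On the MCC projects I've worked with (8-bit parts), enabling them is just a couple of generated macros called after SYSTEM_Initialize(); the names below are what MCC generates for a PIC18F, so treat them as an assumption and check what your PIC32MM project actually generated:
SYSTEM_Initialize();
// enable interrupts so the I2C driver's state machine can run
// (macro names as generated by MCC for PIC18F; verify for the PIC32MM)
INTERRUPT_GlobalInterruptEnable();
INTERRUPT_PeripheralInterruptEnable();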

How to put BG96 on power save mode between sending messages to Azure IoT Hub over HTTP

I'm using a Nucleo L496ZG, X-NUCLEO-IKS01A2 and the Quectel BG96 module to send sensor data (temperature, humidity etc..) to Azure IoT Central over HTTP.
I've been using the example implementation provided by Avnet here, which works fine but it's not power optimized, and with a 6700mAh battery pack it only lasts around 30 hours sending telemetry every ~10 seconds. The goal is for it to last around a week. I'm open to increasing the time between messages, but I also want to save power in between sending.
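For context, 6700 mAh lasting ~30 hours works out to an average draw of roughly 220 mA; to last a week (~168 hours) that average needs to come down to around 40 mA.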
I've gone over the Quectel BG96 manuals and I've tried two things:
1) powering off the device by driving the PWRKEY and turning it back on when I need to send a message
I've gotten this to work, kinda… until I get a HardFault exception, which happens seemingly randomly anywhere from ~5 minutes of running to 2 hours (messages send successfully prior to the exception). The output of the crash log parser is the same every time:
Crash location = strncmp [0x08038DF8] (based on PC value)
Caller location = _findenv_r [0x0804119D] (based on LR value)
Stack Pointer at the time of crash = [20008128]
Target and Fault Info:
Processor Arch: ARM-V7M or above
Processor Variant: C24
Forced exception, a fault with configurable priority has been escalated to HardFault
A precise data access error has occurred. Faulting address: 03060B30
The caller location traces back to my .map file and I don't know what to make of it.
My code:
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//#define USE_MQTT
#include <stdlib.h>
#include "mbed.h"
#include "iothubtransporthttp.h"
#include "iothub_client_core_common.h"
#include "iothub_client_ll.h"
#include "azure_c_shared_utility/platform.h"
#include "azure_c_shared_utility/agenttime.h"
#include "jsondecoder.h"
#include "bg96gps.hpp"
#include "azure_message_helper.h"
#define IOT_AGENT_OK CODEFIRST_OK
#include "azure_certs.h"
/* initialize the expansion board && sensors */
#include "XNucleoIKS01A2.h"
static HTS221Sensor *hum_temp;
static LSM6DSLSensor *acc_gyro;
static LPS22HBSensor *pressure;
static const char* connectionString = "xxx";
// to report F uncomment this #define CTOF(x) (((double)(x)*9/5)+32)
#define CTOF(x) (x)
Thread azure_client_thread(osPriorityNormal, 10*1024, NULL, "azure_client_thread");
static void azure_task(void);
EventFlags deleteOK;
size_t g_message_count_send_confirmations;
/* create the GPS elements for example program */
BG96Interface* bg96Interface;
//static int tilt_event;
// void mems_int1(void)
// {
// tilt_event++;
// }
void mems_init(void)
{
//acc_gyro->attach_int1_irq(&mems_int1); // Attach callback to LSM6DSL INT1
hum_temp->enable(); // Enable HTS221 environmental sensor
pressure->enable(); // Enable barometric pressure sensor
acc_gyro->enable_x(); // Enable LSM6DSL accelerometer
//acc_gyro->enable_tilt_detection(); // Enable Tilt Detection
}
void powerUp(void) {
if (platform_init() != 0) {
printf("Error initializing the platform\r\n");
return;
}
bg96Interface = (BG96Interface*) easy_get_netif(true);
}
void BG96_Modem_PowerOFF(void)
{
DigitalOut BG96_RESET(D7);
DigitalOut BG96_PWRKEY(D10);
DigitalOut BG97_WAKE(D11);
BG96_RESET = 0;
BG96_PWRKEY = 0;
BG97_WAKE = 0;
wait_ms(300);
}
void powerDown(){
platform_deinit();
BG96_Modem_PowerOFF();
}
//
// The main routine simply prints a banner, initializes the system
// starts the worker threads and waits for a termination (join)
int main(void)
{
//printStartMessage();
XNucleoIKS01A2 *mems_expansion_board = XNucleoIKS01A2::instance(I2C_SDA, I2C_SCL, D4, D5);
hum_temp = mems_expansion_board->ht_sensor;
acc_gyro = mems_expansion_board->acc_gyro;
pressure = mems_expansion_board->pt_sensor;
azure_client_thread.start(azure_task);
azure_client_thread.join();
platform_deinit();
printf(" - - - - - - - ALL DONE - - - - - - - \n");
return 0;
}
static void send_confirm_callback(IOTHUB_CLIENT_CONFIRMATION_RESULT result, void* userContextCallback)
{
//userContextCallback;
// When a message is sent, this callback will get invoked
g_message_count_send_confirmations++;
deleteOK.set(0x1);
}
void sendMessage(IOTHUB_CLIENT_LL_HANDLE iotHubClientHandle, char* buffer, size_t size)
{
IOTHUB_MESSAGE_HANDLE messageHandle = IoTHubMessage_CreateFromByteArray((const unsigned char*)buffer, size);
if (messageHandle == NULL) {
printf("unable to create a new IoTHubMessage\r\n");
return;
}
if (IoTHubClient_LL_SendEventAsync(iotHubClientHandle, messageHandle, send_confirm_callback, NULL) != IOTHUB_CLIENT_OK)
printf("FAILED to send! [RSSI=%d]\n", platform_RSSI());
else
printf("OK. [RSSI=%d]\n",platform_RSSI());
IoTHubMessage_Destroy(messageHandle);
}
void azure_task(void)
{
//bool tilt_detection_enabled=true;
float gtemp, ghumid, gpress;
int k;
int msg_sent=1;
while (true) {
powerUp();
mems_init();
/* Setup IoTHub client configuration */
IOTHUB_CLIENT_LL_HANDLE iotHubClientHandle = IoTHubClient_LL_CreateFromConnectionString(connectionString, HTTP_Protocol);
if (iotHubClientHandle == NULL) {
printf("Failed on IoTHubClient_Create\r\n");
return;
}
// add the certificate information
if (IoTHubClient_LL_SetOption(iotHubClientHandle, "TrustedCerts", certificates) != IOTHUB_CLIENT_OK)
printf("failure to set option \"TrustedCerts\"\r\n");
#if MBED_CONF_APP_TELUSKIT == 1
if (IoTHubClient_LL_SetOption(iotHubClientHandle, "product_info", "TELUSIOTKIT") != IOTHUB_CLIENT_OK)
printf("failure to set option \"product_info\"\r\n");
#endif
// polls will happen effectively at ~10 seconds. The default value of minimumPollingTime is 25 minutes.
// For more information, see:
// https://azure.microsoft.com/documentation/articles/iot-hub-devguide/#messaging
unsigned int minimumPollingTime = 9;
if (IoTHubClient_LL_SetOption(iotHubClientHandle, "MinimumPollingTime", &minimumPollingTime) != IOTHUB_CLIENT_OK)
printf("failure to set option \"MinimumPollingTime\"\r\n");
IoTDevice* iotDev = (IoTDevice*)malloc(sizeof(IoTDevice));
if (iotDev == NULL) {
return;
}
setUpIotStruct(iotDev);
char* msg;
size_t msgSize;
hum_temp->get_temperature(&gtemp); // get Temp
hum_temp->get_humidity(&ghumid); // get Humidity
pressure->get_pressure(&gpress); // get pressure
iotDev->Temperature = CTOF(gtemp);
iotDev->Humidity = (int)ghumid;
iotDev->Pressure = (int)gpress;
printf("(%04d)",msg_sent++);
msg = makeMessage(iotDev);
msgSize = strlen(msg);
sendMessage(iotHubClientHandle, msg, msgSize);
free(msg);
iotDev->Tilt &= 0x2;
/* schedule IoTHubClient to send events/receive commands */
IOTHUB_CLIENT_STATUS status;
while ((IoTHubClient_LL_GetSendStatus(iotHubClientHandle, &status) == IOTHUB_CLIENT_OK) && (status == IOTHUB_CLIENT_SEND_STATUS_BUSY))
{
IoTHubClient_LL_DoWork(iotHubClientHandle);
ThisThread::sleep_for(100);
}
deleteOK.wait_all(0x1);
free(iotDev);
IoTHubClient_LL_Destroy(iotHubClientHandle);
powerDown();
ThisThread::sleep_for(300000);
}
return;
}
I know PSM is probably the way to go since powering on/off the device draws a lot of power but it would be useful if someone had an idea of what is happening here.
2) putting the device to PSM between sending messages
The BG96 library in the example code I'm using doesn't have a method to turn on PSM, so I tried to implement my own. When I try to run it, it basically runs into an exception right away, so I know it's wrong (I'm very new to embedded development and have no prior experience with AT commands).
/** ----------------------------------------------------------
* this is a method provided by the current library
* @brief Tx a string to the BG96 and wait for an OK response
* @param none
* @retval true if OK received, false otherwise
*/
bool BG96::tx2bg96(char* cmd) {
bool ok=false;
_bg96_mutex.lock();
ok=_parser.send(cmd) && _parser.recv("OK");
_bg96_mutex.unlock();
return ok;
}
/**
* method I created in an attempt to use PSM
*/
bool BG96::psm(void) {
return tx2bg96((char*)"AT+CPSMS=1,,,”00000100”,”00000001”");
}
Can someone tell me what I'm doing wrong and provide any guidance on how I can achieve my goal of having my device run on battery for longer?
Thank you!!
I got Power Saving Mode working by using Mbed's ATCmdParser and the AT+QPSMS commands as per Quectel's docs. The modem doesn't always go into power saving mode right away so that should be noted. I also found that I have to restart the modem afterwards or else I get weird behaviour. My code looks something like this:
bool BG96::psm(char* T3412, char* T3324) {
    _bg96_mutex.lock();
    if (_parser.send("AT+QPSMS=1,,,\"%s\",\"%s\"", T3412, T3324) && _parser.recv("OK")) {
        _bg96_mutex.unlock();
    } else {
        _bg96_mutex.unlock();
        return false;
    }
    return BG96Ready(); // restarts the modem
}
To send a message to Azure, the modem will need to be manually woken up by driving the PWRKEY to start bi-directional communication, and a new client handle needs to be created and torn down every time, since the Azure connection uses keep-alive and the modem will be unreachable while it's in PSM.
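Waking it looks roughly like the mirror image of the BG96_Modem_PowerOFF() in the question (a sketch only; the pin assignments are taken from that function, and the pulse polarity/timing are assumptions, so check them against the Quectel BG96 hardware design guide for your board):
void BG96_Modem_PowerON(void)
{
    DigitalOut BG96_RESET(D7);
    DigitalOut BG96_PWRKEY(D10);
    BG96_RESET = 1;
    BG96_PWRKEY = 1;   // assert PWRKEY...
    wait_ms(600);      // ...for at least ~500 ms
    BG96_PWRKEY = 0;   // release it
    wait_ms(5000);     // give the modem time to boot before sending AT commands
}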

Read status of FT245RL pins

Sorry for my ignorance, but I am very new to FTDI chip software development on Linux.
I have a module based on the FT245RL chip, programmed as a unit with 4 output ports (relays) and 4 opto-isolated input ports.
I found a program in C on the Internet to turn the relays connected to outputs D0 to D3 on and off. After compiling, it works properly. Below is a draft of this working program:
/* switch4.c
* # gcc -o switch4 switch4.c -L. -lftd2xx -Wl,-rpath,/usr/local/lib
* Usage
* # switch4 [0-15], for example # switch4 1
* */
#include <stdio.h>
#include <stdlib.h>
#include "./ftd2xx.h"
int main(int argc, char *argv[])
{
FT_STATUS ftStatus;
FT_HANDLE ftHandle0;
int parametr;
LPVOID pkod;
DWORD nBufferSize = 0x0001;
DWORD dwBytesWritten;
if(argc > 1) {
sscanf(argv[1], "%d", &parametr);
}
else {
parametr = 0;
}
FT_SetVIDPID(0x5555,0x0001); // id from lsusb
FT_Open(0,&ftHandle0);
FT_SetBitMode(ftHandle0,15,1);
pkod=&parametr;
ftStatus = FT_Write(ftHandle0,pkod,nBufferSize,&dwBytesWritten);
ftStatus = FT_Close(ftHandle0);
}
My question is: how can I read, in the same program, the status of pins D4 to D7, which are programmed as inputs? I mean printf-ing to stdout the number representing the status (zero or one) of the input pins (or of all input/output pins).
Can anybody help a newbie?
UPDATE-1
This is my program with FT_GetBitMode
// # gcc -o read5 read5.c -L. -lftd2xx -Wl,-rpath,/usr/local/lib
#include <stdio.h>
#include <stdlib.h>
#include "./ftd2xx.h"
int main(int argc, char *argv[])
{
FT_STATUS ftStatus;
FT_HANDLE ftHandle0;
UCHAR BitMode;
FT_SetVIDPID(0x5555,0x0001); // id from lsusb
ftStatus = FT_Open(0,&ftHandle0);
if(ftStatus != FT_OK) {
printf("FT_Open failed");
return 1;
}
FT_SetBitMode(ftHandle0,15,1);
ftStatus = FT_GetBitMode(ftHandle0, &BitMode);
if (ftStatus == FT_OK) {
printf("BitMode contains - %d",BitMode);
}
else {
printf("FT_GetBitMode FAILED!");
}
ftStatus = FT_Close(ftHandle0);
}
But it returns "FT_GetBitMode FAILED!" instead of the value of BitMode.
FT_GetBitMode returns the instantaneous value of the pins. A single byte will be
returned containing the current values of the pins, both those which are inputs and
those which are outputs.
Source.
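Once FT_GetBitMode succeeds, the upper nibble of the returned byte is D4 to D7, so you can mask those bits out. A small sketch (assuming the same FT_SetBitMode(ftHandle0, 15, 1) call as above, i.e. D0-D3 outputs and D4-D7 inputs):
UCHAR pins;
ftStatus = FT_GetBitMode(ftHandle0, &pins);
if (ftStatus == FT_OK) {
    int bit;
    for (bit = 4; bit <= 7; bit++) {
        printf("D%d = %d\n", bit, (pins >> bit) & 1); // 1 = pin high, 0 = pin low
    }
}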
Finally I found out what was going wrong. I was using an incorrect version of the FTDI library. The correct version, dedicated to the x86_64 platform, is located here:
Link to FTDI library

PIC12LF1822 UART Programming with External Fosc

I am currently working with the PIC12LF1822, a Microchip controller. I need to configure the EUSART in 8-bit asynchronous mode and generate a 1 MHz pulse using PWM...
First I started with UART send and receive of data. But after setting the oscillator selection in the configuration register, my code is not working and fails to send the character via the TX pin (RA0).
I have analysed this deeply, but I am not able to find a solution. Can anybody suggest how to solve this issue?
I have pasted my code below:
#include<htc.h>
#include <PIC12LF1822.H>
#include "BIPOLAR_SERIAL.C"
#define _XTAL_FREQ 12000000
//Configuration word 1 register for oscillator selection
__CONFIG(CP_OFF & BOREN_OFF & WDTE_OFF & IESO_OFF & FCMEN_OFF & PWRTE_ON & CPD_OFF & FOSC_XT);
//Configuration word 2 register for disabling Low-Voltage program
__CONFIG(LVP_OFF);
void PWM_Duty_Cyle(unsigned int duty_cyc)
{
CCPR1L = duty_cyc>>2;
CCP1CON &= 0xcf;
CCP1CON |= ((duty_cyc & 0x03)<<4);
}
void PWM_Init()
{
TRISA2 = 0; //PWM Pin as output
CCP1CON = 0x0c; //PWM mode: P1A, P1C active-high; P1B, P1D active-high
PR2 = 0x00; //Timer Period Configuration
T2CON = 0x00; //Timer2 Prescale as 1
PWM_Duty_Cyle(0); //Initialize PWM to 0 percent duty cycle
T2CON |= 0x04; //Enable Timer2
}
void main()
{
APFCON = 0;
ANSA0=0;ANSA1=0;ANSA2=0;
serial_init();
serial_tx('S');
// PWM_Init();
// serial_str("PIC12LF1822_TEST");
while(1)
{
serial_tx('A');
serial_tx(' ');
}
}
With the above code I am not able to get the characters 'A' or 'S'.
Note: I am using 12 MHz as Fosc and I have set the baud rate accordingly (SPBRG = 77 for baud = 9600, BRGH = 1).
What is your "SerialInit()" and "SerialTx('S')" functions doing?
Post the code of these functions, I'd like to see how you're setting up your serial port
You probably haven't set all the required bits (RCSTA, TXSTA, SPBRG). To set these bits up properly, be sure to know the appropriate baud rate you want; if you need high speed, be sure to set up SPBRGH correctly.
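For your 12 MHz crystal and 9600 baud, that would be something like the following (only a sketch; register and bit names are taken from the XC8 device header, so double-check them against your compiler and the datasheet):
void serial_init(void)
{
    TXSTAbits.SYNC    = 0;   // asynchronous mode
    TXSTAbits.BRGH    = 1;   // high-speed baud rate generator
    BAUDCONbits.BRG16 = 0;   // 8-bit baud rate generator
    SPBRGL = 77;             // 12 MHz / (16 * (77 + 1)) = ~9615 baud
    SPBRGH = 0;
    RCSTAbits.SPEN    = 1;   // enable the serial port (TX on RA0 with APFCON = 0)
    TXSTAbits.TXEN    = 1;   // enable the transmitter
    RCSTAbits.CREN    = 1;   // enable the receiver
}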
My two functions to transmit via UART when properly initialized:
void vTx232 (UC ucSend)
{
STATUS.RP0 = PIR1; //Sure we're in PIR1
while (PIR1.TXIF == 0);//While last TX not done
TXREG = ucSend; //Input param into TXREG
}
void vTxString(UC *ucpString)
{
while (*ucpString!= 0x00) //While string not at its end
{
vTx232(*ucpString); //Send string character
ucpString++; //Increm string pointer
}
}
Datasheet: PIC12LF1822

Arduino : time scheduler is not working

I am trying to use the Arduino time scheduler. I copied the code from here and I imported the library, but it didn't compile. Here's the compilation error and the code itself:
Error:
In file included from time2.ino:1:
C:\arduino\arduino-1.0.3\libraries\Scheduler/Scheduler.h:62: error: 'byte' does not name a type
C:\arduino\arduino-1.0.3\libraries\Scheduler/Scheduler.h:64: error: 'NUMBER_OF_SCHEDULED_ACTIONS' was not declared in this scope
C:\arduino\arduino-1.0.3\libraries\Scheduler/Scheduler.h:65: error: 'byte' does not name a type
Code:
#include <Scheduler.h> // http://playground.arduino.cc/uploads/Code/Scheduler.zip
Scheduler scheduler = Scheduler(); //create a scheduler
const byte ledPin = 13; //LED on pin 13
void setup(){
    Serial.begin(9600); //Initialize the UART
    pinMode(ledPin,OUTPUT); //set pin 13 to OUTPUT
}
void loop(){
    scheduler.update(); //update the scheduler, maybe it is time to execute a function?
    if (Serial.available()){ //if we have received anything on the Serial
        scheduler.schedule(setHigh,500); //schedule a setHigh call in 500 milliseconds
        Serial.flush(); //flush Serial so we do not schedule multiple setHigh calls
    }
}
void setHigh(){
    digitalWrite(ledPin,HIGH); //set ledPin HIGH
    scheduler.schedule(setLow,500); //schedule setLow to execute in 500 milliseconds
}
void setLow(){
    digitalWrite(ledPin,LOW); //set ledPin LOW
}
How can I fix this?
The library is not Arduino 1.0+ compliant.
Replace the following in .\Scheduler\Scheduler.h:
< #include <WProgram.h>
---
> #if defined(ARDUINO) && ARDUINO >= 100
> #include "Arduino.h"
> #else
> #include "WProgram.h"
> #endif
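For reference, after that change the include at the top of Scheduler.h reads:
#if defined(ARDUINO) && ARDUINO >= 100
#include "Arduino.h"
#else
#include "WProgram.h"
#endif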
Then it will compile, for me at least.