Hello, I am using the AM1808 ARM9 microprocessor.
I have a GSM dongle interfaced to it, and I want to handle the dongle connection as well as all data transmission and reception in the background using pthreads.
When I try to connect the dongle in the background, the dongle keeps blinking its green light and I cannot connect to the server.
I cannot find the problem.
Here is my code for the datacard initialisation and the communication routine.
I am creating the worker thread from the main thread.
int InitializeDataCard(void)
{
    static int connect_ret = 0;
    pthread_t processing_th;

    pthread_create(&processing_th, NULL, Datacard_Connection_Thread, &db_mc_object);
    pthread_detach(processing_th);
    ShowMessageBox(msgbox_text[136], MB_TASKMODAL);
    if(connect_ret)
    {
        ShowMessageBox(msgbox_text[163], MB_ICONERROR);
        return -1;
    }
    else
    {
        return 0;
    }
}
int ConnectToServer(void)
{
    int connect_ret = 0;

    Dprintf(__func__,"Trying to connect ....");
    DPUUtilsLib_RetrieveParameter(PID_DATACARD_INFO,(UCHAR *)&datacard_info,sizeof(datacard_info));
    connect_ret = DataCardConnect(datacard_info);
    sleep(3);
    //g_do_process = 0;
    Dprintf(__func__,"DataCardConnect ret=%d",connect_ret);
    return connect_ret;
}
void * Datacard_Connection_Thread(void *tempdata)
{
    int ret=0,response = -1,enc_ret=0;
    static int g_gsm_response = 0;
    dpu_csv_file_param_t fileparam;
    dpu_db_export_search_params_t tempparams;
    dpu_db_milk_collection_t *livedata,*updatedata;
    dpu_db_user_master_t CreatedBy,UpdatedBy;
    dpu_db_society_master_t soc_info;
    char filename[50]={0};

    livedata = (dpu_db_milk_collection_t *)tempdata;

    //Connect to the network
create_connection:
    ret = ConnectToServer();
    //connected successfully
    if(!ret)
    {
        //Get the SocietyCode from the database
        if(g_data_available)
        {
            g_data_available=0;
            soc_info.SocietyId = g_society_list[g_selected_society].SocietyId;
            DPUDBLib_search_society_master(&soc_info);
            strncpy(livedata->SocietyCode,soc_info.SocietyCode,5);
            Dprintf(__func__,"%04d\n %5.2f\n %5.2f\n %5.2f\n %5.2f\n %5.2f\n %5.2f\n %7.2f\n %7.2f\n %03d\n %c\n %d\n %5.2d\n %s\n %d",
                    livedata->MemberCode,livedata->Fat,livedata->LRCLR,livedata->SNF,livedata->Solid,
                    livedata->FatKG,livedata->SNFKG,livedata->Rate,livedata->Amount,
                    livedata->CanNumber,livedata->EntryMode,livedata->MC_RateChartId,livedata->Water,livedata->SocietyCode,__LINE__);
            sprintf(tempparams.FileName,"%s/%s",DATA_TEMP,MT_MILKLIVEFILE);
            memcpy(fileparam.FilePath,tempparams.FileName,sizeof(tempparams.FileName));
            fileparam.Type = DPU_CSV_EXPORT;
            fileparam.FileType = DPU_CSV_MILK_LIVE_DATA_FILE;
            //open a csv file
            DPUCSVLib_open_csv_file(&fileparam);
            memset(&CreatedBy,0,sizeof(dpu_db_user_master_t));
            memset(&UpdatedBy,0,sizeof(dpu_db_user_master_t));
            strncpy(CreatedBy.Username,TempUser,5);
            //write the live data into the file
            DPUCSVLib_write_csv_file(&fileparam,livedata,&CreatedBy,&UpdatedBy);
            //encrypt the file
            enc_ret = DPUEncryptFile(tempparams.FileName,filename);
            if(!enc_ret)
            {
                //send a file request to the server for accepting the data
                response = SendFileRequest(g_gsm_response,filename,9);
                if(!response)
                {
                    //receive the response of the server
                    response = GetResponceFromServer(g_gsm_response,&response,9);
                    if(response || g_gsm_response) response = -1;
                    else
                    {
                        //If the record is successfully sent to the server then set the FlagGSM flag of the record
                        //and update the database
                        g_update_record = 1;
                        livedata->FlagGSM = 1;
                        updatedata->MilkCollectionId = livedata->MilkCollectionId;
                        DPUDBLib_search_milk_collection_entry_by_record(&updatedata);
                        DPUDBLib_edit_milk_collection_entry(&updatedata,&livedata);
                        g_update_record = 0;
                    }
                }
                //remove the temp file generated
                remove(filename);
            }
        }
    }
    else
    {
        //if the connection did not succeed then try to reconnect to the server
        ShowMessageBox(msgbox_text[156], MB_ICONERROR);
        goto create_connection;
    }
}
I think there is a basic mistake here. Declaring the static variable connect_ret in InitializeDataCard does not in any way make it the same variable as the connect_ret declared in ConnectToServer. They are two separate local variables, so the one in InitializeDataCard is never updated and the first function will always behave the same way...
I think you'll need a global variable (i.e. one not defined inside a function) and possibly some kind of synchronization, because the thread you create probably won't be executed immediately; so even if you have a global variable, you can't check it until you know it has actually been set. A minimal sketch of that idea is below.
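For example, a minimal sketch of that approach (not your exact code: ConnectToServer() stands in for your existing routine, and WaitForConnectionResult() is a hypothetical helper the main thread would call after creating the thread):

#include <pthread.h>

/* Shared between the main thread and the connection thread. */
static pthread_mutex_t g_connect_lock = PTHREAD_MUTEX_INITIALIZER;
static pthread_cond_t  g_connect_cond = PTHREAD_COND_INITIALIZER;
static int g_connect_done = 0;   /* becomes 1 once the thread has a result */
static int g_connect_ret  = -1;  /* the value returned by ConnectToServer() */

void *Datacard_Connection_Thread(void *tempdata)
{
    int ret = ConnectToServer();            /* your existing routine */

    pthread_mutex_lock(&g_connect_lock);
    g_connect_ret  = ret;
    g_connect_done = 1;
    pthread_cond_signal(&g_connect_cond);   /* wake anyone waiting for the result */
    pthread_mutex_unlock(&g_connect_lock);

    /* ... continue with the CSV export / upload work ... */
    return NULL;
}

int WaitForConnectionResult(void)
{
    int ret;
    pthread_mutex_lock(&g_connect_lock);
    while (!g_connect_done)                 /* blocks until the thread signals */
        pthread_cond_wait(&g_connect_cond, &g_connect_lock);
    ret = g_connect_ret;
    pthread_mutex_unlock(&g_connect_lock);
    return ret;
}

If the UI must stay responsive you can poll g_connect_done from a timer instead of blocking in pthread_cond_wait(), but either way the point is the same: the result lives in one shared, properly synchronized variable, not in two unrelated locals.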
I'm using a Nucleo L496ZG, an X-NUCLEO-IKS01A2 and the Quectel BG96 module to send sensor data (temperature, humidity, etc.) to Azure IoT Central over HTTP.
I've been using the example implementation provided by Avnet here, which works fine, but it's not power optimized, and with a 6700mAh battery pack it only lasts around 30 hours sending telemetry every ~10 seconds. The goal is for it to last around a week. I'm open to increasing the time between messages, but I also want to save power in between sending.
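Rough numbers behind that goal (treating the 6700 mAh rating as fully usable, which it won't quite be in practice):

6700 mAh / 30 h   ≈ 223 mA average draw today
6700 mAh / 168 h  ≈  40 mA average draw needed for a one-week runtime

So the average current has to drop by roughly a factor of 5-6, which is why what the modem does between messages matters far more than the few seconds spent transmitting.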
I've gone over the Quectel BG96 manuals and I've tried two things:
1) powering off the device by driving the PWRKEY and turning it back on when I need to send a message
I've gotten this to work, kinda… until I get a hard fault exception, which happens seemingly randomly anywhere from ~5 minutes of running to 2 hours in (messages send successfully prior to the exception). The output of the crash log parser is the same every time:
Crash location = strncmp [0x08038DF8] (based on PC value)
Caller location = _findenv_r [0x0804119D] (based on LR value)
Stack Pointer at the time of crash = [20008128]
Target and Fault Info:
Processor Arch: ARM-V7M or above
Processor Variant: C24
Forced exception, a fault with configurable priority has been escalated to HardFault
A precise data access error has occurred. Faulting address: 03060B30
The caller location traces back to my .map file and I don't know what to make of it.
My code:
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
//#define USE_MQTT
#include <stdlib.h>
#include "mbed.h"
#include "iothubtransporthttp.h"
#include "iothub_client_core_common.h"
#include "iothub_client_ll.h"
#include "azure_c_shared_utility/platform.h"
#include "azure_c_shared_utility/agenttime.h"
#include "jsondecoder.h"
#include "bg96gps.hpp"
#include "azure_message_helper.h"
#define IOT_AGENT_OK CODEFIRST_OK
#include "azure_certs.h"
/* initialize the expansion board && sensors */
#include "XNucleoIKS01A2.h"
static HTS221Sensor *hum_temp;
static LSM6DSLSensor *acc_gyro;
static LPS22HBSensor *pressure;
static const char* connectionString = "xxx";
// to report F uncomment this #define CTOF(x) (((double)(x)*9/5)+32)
#define CTOF(x) (x)
Thread azure_client_thread(osPriorityNormal, 10*1024, NULL, "azure_client_thread");
static void azure_task(void);
EventFlags deleteOK;
size_t g_message_count_send_confirmations;
/* create the GPS elements for example program */
BG96Interface* bg96Interface;
//static int tilt_event;
// void mems_int1(void)
// {
// tilt_event++;
// }
void mems_init(void)
{
//acc_gyro->attach_int1_irq(&mems_int1); // Attach callback to LSM6DSL INT1
hum_temp->enable(); // Enable HTS221 environmental sensor
pressure->enable(); // Enable barometric pressure sensor
acc_gyro->enable_x(); // Enable LSM6DSL accelerometer
//acc_gyro->enable_tilt_detection(); // Enable Tilt Detection
}
void powerUp(void) {
if (platform_init() != 0) {
printf("Error initializing the platform\r\n");
return;
}
bg96Interface = (BG96Interface*) easy_get_netif(true);
}
void BG96_Modem_PowerOFF(void)
{
DigitalOut BG96_RESET(D7);
DigitalOut BG96_PWRKEY(D10);
DigitalOut BG97_WAKE(D11);
BG96_RESET = 0;
BG96_PWRKEY = 0;
BG97_WAKE = 0;
wait_ms(300);
}
void powerDown(){
platform_deinit();
BG96_Modem_PowerOFF();
}
//
// The main routine simply prints a banner, initializes the system
// starts the worker threads and waits for a termination (join)
int main(void)
{
//printStartMessage();
XNucleoIKS01A2 *mems_expansion_board = XNucleoIKS01A2::instance(I2C_SDA, I2C_SCL, D4, D5);
hum_temp = mems_expansion_board->ht_sensor;
acc_gyro = mems_expansion_board->acc_gyro;
pressure = mems_expansion_board->pt_sensor;
azure_client_thread.start(azure_task);
azure_client_thread.join();
platform_deinit();
printf(" - - - - - - - ALL DONE - - - - - - - \n");
return 0;
}
static void send_confirm_callback(IOTHUB_CLIENT_CONFIRMATION_RESULT result, void* userContextCallback)
{
//userContextCallback;
// When a message is sent this callback will get invoked
g_message_count_send_confirmations++;
deleteOK.set(0x1);
}
void sendMessage(IOTHUB_CLIENT_LL_HANDLE iotHubClientHandle, char* buffer, size_t size)
{
IOTHUB_MESSAGE_HANDLE messageHandle = IoTHubMessage_CreateFromByteArray((const unsigned char*)buffer, size);
if (messageHandle == NULL) {
printf("unable to create a new IoTHubMessage\r\n");
return;
}
if (IoTHubClient_LL_SendEventAsync(iotHubClientHandle, messageHandle, send_confirm_callback, NULL) != IOTHUB_CLIENT_OK)
printf("FAILED to send! [RSSI=%d]\n", platform_RSSI());
else
printf("OK. [RSSI=%d]\n",platform_RSSI());
IoTHubMessage_Destroy(messageHandle);
}
void azure_task(void)
{
//bool tilt_detection_enabled=true;
float gtemp, ghumid, gpress;
int k;
int msg_sent=1;
while (true) {
powerUp();
mems_init();
/* Setup IoTHub client configuration */
IOTHUB_CLIENT_LL_HANDLE iotHubClientHandle = IoTHubClient_LL_CreateFromConnectionString(connectionString, HTTP_Protocol);
if (iotHubClientHandle == NULL) {
printf("Failed on IoTHubClient_Create\r\n");
return;
}
// add the certificate information
if (IoTHubClient_LL_SetOption(iotHubClientHandle, "TrustedCerts", certificates) != IOTHUB_CLIENT_OK)
printf("failure to set option \"TrustedCerts\"\r\n");
#if MBED_CONF_APP_TELUSKIT == 1
if (IoTHubClient_LL_SetOption(iotHubClientHandle, "product_info", "TELUSIOTKIT") != IOTHUB_CLIENT_OK)
printf("failure to set option \"product_info\"\r\n");
#endif
// polls will happen effectively at ~10 seconds. The default value of minimumPollingTime is 25 minutes.
// For more information, see:
// https://azure.microsoft.com/documentation/articles/iot-hub-devguide/#messaging
unsigned int minimumPollingTime = 9;
if (IoTHubClient_LL_SetOption(iotHubClientHandle, "MinimumPollingTime", &minimumPollingTime) != IOTHUB_CLIENT_OK)
printf("failure to set option \"MinimumPollingTime\"\r\n");
IoTDevice* iotDev = (IoTDevice*)malloc(sizeof(IoTDevice));
if (iotDev == NULL) {
return;
}
setUpIotStruct(iotDev);
char* msg;
size_t msgSize;
hum_temp->get_temperature(&gtemp); // get Temp
hum_temp->get_humidity(&ghumid); // get Humidity
pressure->get_pressure(&gpress); // get pressure
iotDev->Temperature = CTOF(gtemp);
iotDev->Humidity = (int)ghumid;
iotDev->Pressure = (int)gpress;
printf("(%04d)",msg_sent++);
msg = makeMessage(iotDev);
msgSize = strlen(msg);
sendMessage(iotHubClientHandle, msg, msgSize);
free(msg);
iotDev->Tilt &= 0x2;
/* schedule IoTHubClient to send events/receive commands */
IOTHUB_CLIENT_STATUS status;
while ((IoTHubClient_LL_GetSendStatus(iotHubClientHandle, &status) == IOTHUB_CLIENT_OK) && (status == IOTHUB_CLIENT_SEND_STATUS_BUSY))
{
IoTHubClient_LL_DoWork(iotHubClientHandle);
ThisThread::sleep_for(100);
}
deleteOK.wait_all(0x1);
free(iotDev);
IoTHubClient_LL_Destroy(iotHubClientHandle);
powerDown();
ThisThread::sleep_for(300000);
}
return;
}
I know PSM is probably the way to go since powering on/off the device draws a lot of power but it would be useful if someone had an idea of what is happening here.
2) putting the device to PSM between sending messages
The BG96 library in the example code I'm using doesn't have a method to turn on PSM so I tried to implement my own. When I tried to run it, it basically runs into an exception right away so I know it's wrong (I'm very new to embedded development and have no prior experience with AT commands).
/** ----------------------------------------------------------
* this is a method provided by the current library
* @brief Tx a string to the BG96 and wait for an OK response
* @param none
* @retval true if OK received, false otherwise
*/
bool BG96::tx2bg96(char* cmd) {
bool ok=false;
_bg96_mutex.lock();
ok=_parser.send(cmd) && _parser.recv("OK");
_bg96_mutex.unlock();
return ok;
}
/**
* method I created in an attempt to use PSM
*/
bool BG96::psm(void) {
return tx2bg96((char*)"AT+CPSMS=1,,,”00000100”,”00000001”");
}
Can someone tell me what I'm doing wrong and provide any guidance on how I can achieve my goal of having my device run on battery for longer?
Thank you!!
I got Power Saving Mode working by using Mbed's ATCmdParser and the AT+QPSMS command, as per Quectel's docs. The modem doesn't always go into power saving mode right away, so that should be noted. I also found that I have to restart the modem afterwards or else I get weird behaviour. My code looks something like this:
bool BG96::psm(char* T3412, char* T3324) {
    _bg96_mutex.lock();
    if (_parser.send("AT+QPSMS=1,,,\"%s\",\"%s\"", T3412, T3324) && _parser.recv("OK")) {
        _bg96_mutex.unlock();
    } else {
        _bg96_mutex.unlock();
        return false;
    }
    return BG96Ready();   // restarts the modem
}
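For reference, the two arguments are raw 3GPP GPRS timer octets: the top three bits select the unit and the bottom five bits the multiplier. If I'm reading TS 24.008 correctly (double-check against the spec; this is my interpretation rather than something from the Quectel docs), the values from the question decode as:

"00000100"  (T3412, periodic TAU) : unit 000 = 10 minutes, value 00100 = 4  ->  ~40 minutes
"00000001"  (T3324, active time)  : unit 000 = 2 seconds,  value 00001 = 1  ->  ~2 seconds

In other words, after each registration/send the modem stays reachable for only a couple of seconds before dropping into PSM, which is why it has to be woken explicitly before the next message, as described below.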
To send a message to Azure, the modem will need to be manually woken up by driving the PWRKEY to start bi-directional communication, and a new client handle needs to be created and torn down every time, since the Azure connection uses keepAlive and the modem will be unreachable while it's in PSM. The overall loop ends up looking something like the sketch below.
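A simplified sketch of that cycle (reusing sendMessage() and connectionString from the question's code; powerUp()/powerDown() here stand for whatever wakes the modem via PWRKEY and later lets it go back to sleep, so this is the shape of the loop rather than my exact implementation):

// wake -> create handle -> send -> destroy handle -> sleep, repeated forever
void azure_task(void)
{
    while (true) {
        powerUp();   // wake the modem (PWRKEY) and bring the network stack back up

        // The handle does not survive PSM, so create a fresh one each cycle...
        IOTHUB_CLIENT_LL_HANDLE h =
            IoTHubClient_LL_CreateFromConnectionString(connectionString, HTTP_Protocol);
        if (h != NULL) {
            // ... read sensors, makeMessage(), sendMessage(h, msg, strlen(msg)) ...
            IoTHubClient_LL_Destroy(h);   // ...and tear it down before sleeping
        }

        powerDown();                      // power off, or let the PSM active timer expire
        ThisThread::sleep_for(300000);    // MCU-side wait until the next report
    }
}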
I have a redirected printer port that uses RedMon (Redirect Port Monitor) with a PostScript printer driver to convert PostScript to PDF and apply some other effects like watermarks, overlays, etc.
In Windows 7 everything works fine, but in Windows 10 the process runs under the SYSTEM user account.
In the configuration window of the printer port there is a flag called "Run as user", and in Windows 7 checking this flag makes the job run under the user's account.
In Windows 10 it does not seem to work.
Any suggestion will be very much appreciated.
Thank you.
Roy
I had a similar problem. I needed the user who printed the document to select the document type and a patient ID, then print the document to our EHR system as a PDF. It works in Windows 7 when "Run as User" is checked, but not on Windows 10: RedMon always runs the program as "SYSTEM". So I added a bit of code to the beginning of the program to check the user name. If it is "SYSTEM", the program looks for an interactive user on the system by finding an instance of explorer.exe. If more than one interactive user is logged onto the system this will fail, but that is not a problem for my task. The program then starts another instance of itself running as the same user as explorer.exe, passing it the same command line. A pipe is used so that stdin from the first instance can be forwarded to stdin of the second instance. Another limitation is that on a 64-bit OS a 64-bit version of the program must be used, otherwise explorer.exe may not be found.
The following code is what I placed at the beginning of my program. Don't be fooled by the program starting at main(): I am using a GUI toolkit that has WinMain() in it and then calls main(). I have only tested the code with ANSI builds; I tried to use the ANSI ("A") versions of the calls so that it would also work in non-ANSI builds, but I am not sure I got all of them.
The LogInfoSys("Hello World"); function just writes to a log file.
Good luck.
#include <Windows.h>
#include <stdlib.h>
#include <stdio.h>
#include <malloc.h>
#include <time.h>
#include <direct.h>
#include <process.h>
#include <sqlext.h>
#include <Psapi.h>
#include <tlhelp32.h>
int main(int argc, char *argv[])
{
int error;
char msg[1024];
DWORD *processIDs;
int processCount;
HANDLE hProcess = NULL;
HANDLE hToken;
char userName[64];
char progName[1024];
int i, j;
char nameMe[256];
char domainMe[256];
PTOKEN_USER ptuMe = NULL;
PROCESS_INFORMATION procInfo;
STARTUPINFO startUpInfo;
HMODULE *hMod;
DWORD cbNeeded;
SECURITY_ATTRIBUTES saAttr;
HANDLE hChildStd_IN_Rd = NULL;
HANDLE hChildStd_IN_Wr = NULL;
i = 64; // Get user name, if it is "SYSTEM" redirect input to output to a new instance of the program
GetUserNameA(userName, &i);
if (_stricmp(userName, "system") == 0)
{
LogInfoSys("Running as SYSTEM");
processIDs = (DWORD *)calloc(16384, sizeof(DWORD)); // Look for explorer.exe running. If found that should be the user we want to run as.
EnumProcesses(processIDs, sizeof(DWORD) * 16384, &i); // If there is more than one that is OK as long as they are both being run by the same
processCount = i / sizeof(DWORD); // user. If more than one user is logged on, this will be a problem.
hMod = (HMODULE *)calloc(4096, sizeof(HMODULE));
hProcess = NULL;
for (i = 0; (i < processCount) && (hProcess == NULL); i++)
{
if (processIDs[i] == 11276)
Sleep(0);
if (processIDs[i] != 0)
{
hProcess = OpenProcess(PROCESS_QUERY_INFORMATION | PROCESS_VM_READ, FALSE, processIDs[i]);
if (hProcess != NULL)
{
cbNeeded = 0;
error = EnumProcessModules(hProcess, hMod, sizeof(HMODULE) * 4096, &cbNeeded);
if (error == 0)
{
error = GetLastError();
Sleep(0);
}
progName[0] = 0;
error = GetModuleBaseNameA(hProcess, hMod[0], progName, 1024);
if (error == 0)
{
error = GetLastError();
Sleep(0);
}
if (_stricmp(progName, "explorer.exe") != 0)
{
CloseHandle(hProcess);
hProcess = NULL;
}
else
{
LogInfoSys("Found explorer.exe");
}
}
}
}
LogInfoSys("After looking for processes.");
nameMe[0] = domainMe[0] = 0;
if (hProcess != NULL)
{
saAttr.nLength = sizeof(SECURITY_ATTRIBUTES);
saAttr.bInheritHandle = TRUE;
saAttr.lpSecurityDescriptor = NULL;
if (!CreatePipe(&hChildStd_IN_Rd, &hChildStd_IN_Wr, &saAttr, 0)) // Create a pipe for the child process's STDIN.
LogInfoSys("Stdin CreatePipe error");
if (!SetHandleInformation(hChildStd_IN_Wr, HANDLE_FLAG_INHERIT, 0)) // Ensure the write handle to the pipe for STDIN is not inherited.
LogInfoSys("Stdin SetHandleInformation errir");
if (OpenProcessToken(hProcess, TOKEN_ALL_ACCESS, &hToken) != 0)
{
GetStartupInfo(&startUpInfo);
startUpInfo.cb = sizeof(STARTUPINFO);
startUpInfo.lpReserved = NULL;
startUpInfo.lpDesktop = NULL;
startUpInfo.lpTitle = NULL;
startUpInfo.dwX = startUpInfo.dwY = 0;
startUpInfo.dwXSize = 0;
startUpInfo.dwYSize = 0;
startUpInfo.dwXCountChars = 0;
startUpInfo.dwYCountChars = 0;
startUpInfo.dwFillAttribute = 0;
startUpInfo.dwFlags |= STARTF_USESTDHANDLES;
startUpInfo.wShowWindow = 0;
startUpInfo.cbReserved2 = 0;
startUpInfo.lpReserved = NULL;
startUpInfo.hStdInput = hChildStd_IN_Rd;
startUpInfo.hStdOutput = NULL;
startUpInfo.hStdError = NULL;
GetModuleFileName(NULL, progName, 1024);
i = CreateProcessAsUserA(hToken, progName, GetCommandLine(), NULL, NULL, TRUE, NORMAL_PRIORITY_CLASS, NULL, NULL, &startUpInfo, &procInfo);
if (i == 0)
{
i = GetLastError();
}
do
{
i = (int)fread(msg, 1, 1024, stdin);
if (i > 0)
WriteFile(hChildStd_IN_Wr, msg, i, &j, NULL);
} while (i > 0);
}
}
LogInfoSys("End of running as SYSTEM.");
exit(0);
}
/**********************************************************************************************************
*
* End of running as SYSTEM and start of running as the user that printed the document (I hope).
*
**********************************************************************************************************/
exit(0);
}
For example, when the program runs I enter 1 as input, and I want to get the MAC address of that interface. How can I do that?
I did a lot of work trying to figure out how to get the MAC address for an arbitrary interface under Windows and match it to the device info you get back from WinPcap.
A lot of posts say you should use GetAdaptersInfo or similar functions to get the MAC address, but unfortunately those functions only work with an interface that has IPv4 or IPv6 bound to it. I had a network card that purposely wasn't bound to anything, so that Windows wouldn't send any data over it.
GetIfTable(), however, seems to actually get you every interface on the system (there were forty-some on mine). It has the hardware MAC address for each interface, so you just need to match it to the corresponding WinPcap device. The device name in WinPcap contains a long GUID, which is also present in the name field of the interface table entries you get from GetIfTable. The names don't match exactly, though, so you have to extract the GUID from each name and compare those. An additional complication is that the name field in the WinPcap device is a regular character string, but the name in the data you get from GetIfTable is a wide-character string. The code below worked for me on two different systems, one Windows 7 and the other Windows 10. I used Microsoft's GetIfTable example code as a starting point:
#include <winsock2.h>
#include <iphlpapi.h>
#include <stdio.h>
#include <stdlib.h>
#pragma comment(lib, "IPHLPAPI.lib")
#include <pcap.h>
// Compare the guid parts of both names and see if they match
int compare_guid(wchar_t *wszPcapName, wchar_t *wszIfName)
{
wchar_t *pc, *ic;
// Find first { char in device name from pcap
for (pc = wszPcapName; ; ++pc)
{
if (!*pc)
return -1;
if (*pc == L'{'){
pc++;
break;
}
}
// Find first { char in interface name from windows
for (ic = wszIfName; ; ++ic)
{
if (!*ic)
return 1;
if (*ic == L'{'){
ic++;
break;
}
}
// See if the rest of the GUID string matches
for (;; ++pc,++ic)
{
if (!*pc)   // reached the end of the pcap name before a closing brace
return -1;
if (!*ic)   // reached the end of the interface name before a closing brace
return 1;
if ((*pc == L'}') && (*ic == L'}'))
return 0;
if (*pc != *ic)
return *ic - *pc;
}
}
// Find the MAC address using GetIfTable, since the GetAdaptersAddresses etc. functions
// only work with adapters that have an IP address
int get_mac_address(pcap_if_t *d, u_char mac_addr[6])
{
// Declare and initialize variables.
wchar_t* wszWideName = NULL;
DWORD dwSize = 0;
DWORD dwRetVal = 0;
int nRVal = 0;
unsigned int i;
/* variables used for GetIfTable and GetIfEntry */
MIB_IFTABLE *pIfTable;
MIB_IFROW *pIfRow;
// Allocate memory for our pointers.
pIfTable = (MIB_IFTABLE *)malloc(sizeof(MIB_IFTABLE));
if (pIfTable == NULL) {
return 0;
}
// Make an initial call to GetIfTable to get the
// necessary size into dwSize
dwSize = sizeof(MIB_IFTABLE);
dwRetVal = GetIfTable(pIfTable, &dwSize, FALSE);
if (dwRetVal == ERROR_INSUFFICIENT_BUFFER) {
free(pIfTable);
pIfTable = (MIB_IFTABLE *)malloc(dwSize);
if (pIfTable == NULL) {
return 0;
}
dwRetVal = GetIfTable(pIfTable, &dwSize, FALSE);
}
if (dwRetVal != NO_ERROR)
goto done;
// Convert input pcap device name to a wide string for compare
{
size_t stISize,stOSize;
stISize = strlen(d->name) + 1;
wszWideName = malloc(stISize * sizeof(wchar_t));
if (!wszWideName)
goto done;
mbstowcs_s(&stOSize,wszWideName,stISize, d->name, stISize);
}
for (i = 0; i < pIfTable->dwNumEntries; i++) {
pIfRow = (MIB_IFROW *)& pIfTable->table[i];
if (!compare_guid(wszWideName, pIfRow->wszName)){
if (pIfRow->dwPhysAddrLen != 6)
continue;
memcpy(mac_addr, pIfRow->bPhysAddr, 6);
nRVal = 1;
break;
}
}
done:
if (pIfTable != NULL)
free(pIfTable);
pIfTable = NULL;
if (wszWideName != NULL)
free(wszWideName);
wszWideName = NULL;
return nRVal;
}
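To tie it together, here is a usage sketch (a hypothetical main(), error handling trimmed) that lists the WinPcap devices, reads a number like the "1" from the question, and prints the MAC of the chosen interface via get_mac_address() above:

#include <pcap.h>
#include <stdio.h>

int main(void)
{
    pcap_if_t *alldevs, *d;
    char errbuf[PCAP_ERRBUF_SIZE];
    u_char mac[6];
    int i = 0, choice = 0;

    if (pcap_findalldevs(&alldevs, errbuf) == -1)
        return 1;

    for (d = alldevs; d; d = d->next)              /* list the devices 1..N */
        printf("%d. %s\n", ++i, d->description ? d->description : d->name);

    if (scanf("%d", &choice) != 1)                 /* e.g. the "1" from the question */
        choice = 1;
    for (d = alldevs, i = 1; d && i < choice; d = d->next, ++i)
        ;                                          /* walk to the chosen device */

    if (d && get_mac_address(d, mac))
        printf("MAC: %02X-%02X-%02X-%02X-%02X-%02X\n",
               mac[0], mac[1], mac[2], mac[3], mac[4], mac[5]);

    pcap_freealldevs(alldevs);
    return 0;
}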
I am trying to develop a WebRTC simulator in C/C++. For media handling, I plan to use libav. I am thinking of the steps below to realize media exchange between two WebRTC simulators. Say I have two simulators, A and B.
1. Read media at A from an input webm file using the av_read_frame API. I assume I will get the encoded media (audio/video) data here; am I correct?
2. Send the encoded media data to simulator B over a UDP socket.
3. Simulator B receives the media data on the UDP socket as RTP packets.
4. Simulator B extracts the audio/video data from the just-received RTP packets.
I assume the extracted media data at simulator B is still encoded data only (am I correct here?). I do not want to decode it; I want to write it to a file. Later I will play the file to check whether I have done everything right.
To simplify the problem, let's take out the UDP socket part. Then my question reduces to: read data from a webm input file, get the encoded media, prepare the packet and write it to an output file using av_interleaved_write_frame or any other appropriate API. All of this I want to do using libav.
Is there any example code I can refer to, or can somebody please guide me in developing it?
I am trying with a test program. As a first step, my aim is to read from a file and write to an output file. I have the code below, but it is not working properly.
//#define _AUDIO_WRITE_ENABLED_
#include "libavutil/imgutils.h"
#include "libavutil/samplefmt.h"
#include "libavformat/avformat.h"
static AVPacket pkt;
static AVFormatContext *fmt_ctx = NULL;
static AVFormatContext *av_format_context = NULL;
static AVOutputFormat *av_output_format = NULL;
static AVCodec *video_codec = NULL;
static AVStream *video_stream = NULL;
static AVCodec *audio_codec = NULL;
static AVStream *audio_stream = NULL;
static const char *src_filename = NULL;
static const char *dst_filename = NULL;
int main (int argc, char **argv)
{
int ret = 0;
int index = 0;
if (argc != 3)
{
printf("Usage: ./webm input_video_file output_video_file \n");
exit(0);
}
src_filename = argv[1];
dst_filename = argv[2];
printf("Source file = %s , Destination file = %s\n", src_filename, dst_filename);
av_register_all();
/* open input file, and allocate format context */
if (avformat_open_input(&fmt_ctx, src_filename, NULL, NULL) < 0)
{
fprintf(stderr, "Could not open source file %s\n", src_filename);
exit(1);
}
/* retrieve stream information */
if (avformat_find_stream_info(fmt_ctx, NULL) < 0)
{
fprintf(stderr, "Could not find stream information\n");
exit(2);
}
av_output_format = av_guess_format(NULL, dst_filename, NULL);
if(!av_output_format)
{
fprintf(stderr, "Could not guess output file format\n");
exit(3);
}
av_output_format->audio_codec = AV_CODEC_ID_VORBIS;
av_output_format->video_codec = AV_CODEC_ID_VP8;
av_format_context = avformat_alloc_context();
if(!av_format_context)
{
fprintf(stderr, "Could not allocation av format context\n");
exit(4);
}
av_format_context->oformat = av_output_format;
strcpy(av_format_context->filename, dst_filename);
video_codec = avcodec_find_encoder(av_output_format->video_codec);
if (!video_codec)
{
fprintf(stderr, "Codec not found\n");
exit(5);
}
video_stream = avformat_new_stream(av_format_context, video_codec);
if (!video_stream)
{
fprintf(stderr, "Could not alloc stream\n");
exit(6);
}
avcodec_get_context_defaults3(video_stream->codec, video_codec);
video_stream->codec->codec_id = AV_CODEC_ID_VP8;
video_stream->codec->codec_type = AVMEDIA_TYPE_VIDEO;
video_stream->time_base = (AVRational) {1, 30};
video_stream->codec->width = 640;
video_stream->codec->height = 480;
video_stream->codec->pix_fmt = PIX_FMT_YUV420P;
video_stream->codec->flags |= CODEC_FLAG_GLOBAL_HEADER;
video_stream->codec->bit_rate = 400000;
video_stream->codec->gop_size = 10;
video_stream->codec->max_b_frames=1;
#ifdef _AUDIO_WRITE_ENABLED_
audio_codec = avcodec_find_encoder(av_output_format->audio_codec);
if (!audio_codec)
{
fprintf(stderr, "Codec not found audio codec\n");
exit(5);
}
audio_stream = avformat_new_stream(av_format_context, audio_codec);
if (!audio_stream)
{
fprintf(stderr, "Could not alloc stream for audio\n");
exit(6);
}
avcodec_get_context_defaults3(audio_stream->codec, audio_codec);
audio_stream->codec->codec_id = AV_CODEC_ID_VORBIS;
audio_stream->codec->codec_type = AVMEDIA_TYPE_AUDIO;
audio_stream->time_base = (AVRational) {1, 30};
audio_stream->codec->sample_rate = 8000;
audio_stream->codec->flags |= CODEC_FLAG_GLOBAL_HEADER;
#endif
if(!(av_output_format->flags & AVFMT_NOFILE))
{
if (avio_open(&av_format_context->pb, dst_filename, AVIO_FLAG_WRITE) < 0)
{
fprintf(stderr, "Could not open '%s'\n", dst_filename);
}
}
/* Before avformat_write_header set the stream */
avformat_write_header(av_format_context, NULL);
/* initialize packet, set data to NULL, let the demuxer fill it */
av_init_packet(&pkt);
pkt.data = NULL;
pkt.size = 0;
pkt.stream_index = video_stream->index;
ret = av_read_frame(fmt_ctx, &pkt);
while (ret >= 0)
{
index++;
//pkt.stream_index = video_avstream->index;
if(pkt.stream_index == video_stream->index)
{
printf("Video: Read cycle %d, bytes read = %d, pkt stream index=%d\n", index, pkt.size, pkt.stream_index);
av_write_frame(av_format_context, &pkt);
}
#ifdef _AUDIO_WRITE_ENABLED_
else if(pkt.stream_index == audio_stream->index)
{
printf("Audio: Read cycle %d, bytes read = %d, pkt stream index=%d\n", index, pkt.size, pkt.stream_index);
av_write_frame(av_format_context, &pkt);
}
#endif
av_free_packet(&pkt);
ret = av_read_frame(fmt_ctx, &pkt);
}
av_write_trailer(av_format_context);
/** Exit procedure starts */
avformat_close_input(&fmt_ctx);
avformat_free_context(av_format_context);
return 0;
}
When I execute this program, it outputs "Codec not found". Not sure what's going wrong; can somebody help, please?
The "Codec not found" issue is resolved by separately building libvpx 1.4. I am still struggling with reading from the source file and writing to a destination file.
EDIT 1: After modifying the code, I am able to write only the video to a file, though some more errors are still present.
EDIT 2: With the modified code (2nd round), I see the video frames are written properly. For the audio frames I added the code under the macro _AUDIO_WRITE_ENABLED_, but if I enable this macro the program crashes. Can somebody tell me what is wrong in the audio write part (the code under the macro _AUDIO_WRITE_ENABLED_)?
I am not fully answering your question, but I hope we will get to the final solution eventually. When I tried to run your code, I got the error "time base not set".
The time base and other header specs are part of the codec context. This is how I have it specified for writing into a file (vStream is an AVStream*):
#if LIBAVCODEC_VER_AT_LEAST(53, 21)
avcodec_get_context_defaults3(rc->vStream->codec, AVMEDIA_TYPE_VIDEO);
#else
avcodec_get_context_defaults2(rc->vStream->codec, AVMEDIA_TYPE_VIDEO);
#endif
#if LIBAVCODEC_VER_AT_LEAST(54, 25)
vStream->codec->codec_id = AV_CODEC_ID_VP8;
#else
vStream->codec->codec_id = CODEC_ID_VP8;
#endif
vStream->codec->codec_type = AVMEDIA_TYPE_VIDEO;
vStream->codec->time_base = (AVRational) {1, 30};
vStream->codec->width = 640;
vStream->codec->height = 480;
vStream->codec->pix_fmt = PIX_FMT_YUV420P;
EDIT: I ran your program in Valgrind and it segfaults on av_write_frame. It looks like the time_base and other specs for the output are not set properly.
Add the specs before calling avformat_write_header(), before it is too late. A minimal stream-copy (remux) sketch is below for comparison.
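For comparison, here is a minimal stream-copy (remux) sketch against the same old-style API your code uses (AVStream->codec, av_register_all; newer FFmpeg/libav versions do this through AVCodecParameters instead). It copies each input stream's codec settings into the output instead of filling width/height/bit_rate by hand, which also takes care of the time-base problem. Treat it as a starting point, not something tested against your exact libav build:

#include "libavformat/avformat.h"
#include "libavutil/mathematics.h"

int remux(const char *src, const char *dst)
{
    AVFormatContext *in_ctx = NULL, *out_ctx = NULL;
    AVPacket pkt;
    unsigned int i;

    av_register_all();

    if (avformat_open_input(&in_ctx, src, NULL, NULL) < 0)
        return -1;
    if (avformat_find_stream_info(in_ctx, NULL) < 0)
        return -1;

    avformat_alloc_output_context2(&out_ctx, NULL, NULL, dst);
    if (!out_ctx)
        return -1;

    /* Create one output stream per input stream and copy the codec settings. */
    for (i = 0; i < in_ctx->nb_streams; i++) {
        AVStream *in_st  = in_ctx->streams[i];
        AVStream *out_st = avformat_new_stream(out_ctx, NULL);
        if (!out_st)
            return -1;
        avcodec_copy_context(out_st->codec, in_st->codec);
        out_st->codec->codec_tag = 0;
        out_st->time_base = in_st->time_base;
    }

    if (!(out_ctx->oformat->flags & AVFMT_NOFILE))
        if (avio_open(&out_ctx->pb, dst, AVIO_FLAG_WRITE) < 0)
            return -1;

    if (avformat_write_header(out_ctx, NULL) < 0)
        return -1;

    while (av_read_frame(in_ctx, &pkt) >= 0) {
        /* Rescale timestamps from the input stream's time base to the output's.
         * (A robust version should check for AV_NOPTS_VALUE before rescaling.) */
        AVStream *in_st  = in_ctx->streams[pkt.stream_index];
        AVStream *out_st = out_ctx->streams[pkt.stream_index];
        pkt.pts = av_rescale_q(pkt.pts, in_st->time_base, out_st->time_base);
        pkt.dts = av_rescale_q(pkt.dts, in_st->time_base, out_st->time_base);
        pkt.duration = av_rescale_q(pkt.duration, in_st->time_base, out_st->time_base);
        av_interleaved_write_frame(out_ctx, &pkt);
        av_free_packet(&pkt);
    }

    av_write_trailer(out_ctx);
    avformat_close_input(&in_ctx);
    if (!(out_ctx->oformat->flags & AVFMT_NOFILE))
        avio_close(out_ctx->pb);
    avformat_free_context(out_ctx);
    return 0;
}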
I have a Multichannel Mixer audio unit playing back audio files in an iOS app, and I need to figure out how to update the app's UI and perform a reset when the render callback hits the end of the longest audio file (which is set up to run on bus 0). As my code below shows, I am trying to use KVO to achieve this (using the boolean variable tapesUnderway; the NSAutoreleasePool is necessary because this Objective-C code is running outside of its normal domain, see http://www.cocoabuilder.com/archive/cocoa/57412-nscfnumber-no-pool-in-place-just-leaking.html).
static OSStatus tapesRenderInput(void *inRefCon, AudioUnitRenderActionFlags *ioActionFlags, const AudioTimeStamp *inTimeStamp, UInt32 inBusNumber, UInt32 inNumberFrames, AudioBufferList *ioData)
{
SoundBufferPtr sndbuf = (SoundBufferPtr)inRefCon;
UInt32 bufferFrames = sndbuf[inBusNumber].numFrames;
AudioUnitSampleType *in = sndbuf[inBusNumber].data;
// These mBuffers are the output buffers and are empty; these two lines are just setting the references to them (via outA and outB)
AudioUnitSampleType *outA = (AudioUnitSampleType *)ioData->mBuffers[0].mData;
AudioUnitSampleType *outB = (AudioUnitSampleType *)ioData->mBuffers[1].mData;
UInt32 sample = sndbuf[inBusNumber].sampleNum;
// --------------------------------------------------------------
// Set the start time here
if(inBusNumber == 0 && !tapesFirstRenderPast)
{
printf("Tapes first render past\n");
tapesStartSample = inTimeStamp->mSampleTime;
tapesFirstRenderPast = YES; // MAKE SURE TO RESET THIS ON SONG RESTART
firstPauseSample = tapesStartSample;
}
// --------------------------------------------------------------
// Now process the samples
for(UInt32 i = 0; i < inNumberFrames; ++i)
{
if(inBusNumber == 0)
{
// ------------------------------------------------------
// Bus 0 is the backing track, and is always playing back
outA[i] = in[sample++];
outB[i] = in[sample++]; // For stereo set desc.SetAUCanonical to (2, true) and increment samples in both output calls
lastSample = inTimeStamp->mSampleTime + (Float64)i; // Set the last played sample in order to compensate for pauses
// ------------------------------------------------------
// Use this logic to mark end of tune
if(sample >= (bufferFrames * 2) && !tapesEndPast)
{
// USE KVO TO NOTIFY METHOD OF VALUE CHANGE
NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
FuturesEPMedia *futuresMedia = [FuturesEPMedia sharedFuturesEPMedia];
NSNumber *boolNo = [[NSNumber alloc] initWithBool: NO];
[futuresMedia setValue: boolNo forKey: @"tapesUnderway"];
[boolNo release];
[pool release];
tapesEndPast = YES;
}
}
else
{
// ------------------------------------------------------
// The other buses are the open sections, and are synched through the tapesSectionsTimes array
Float64 sectionTime = tapesSectionTimes[inBusNumber] * kGraphSampleRate; // Section time in samples
Float64 currentSample = inTimeStamp->mSampleTime + (Float64)i;
if(!isPaused && !playFirstRenderPast)
{
pauseGap += currentSample - firstPauseSample;
playFirstRenderPast = YES;
pauseFirstRenderPast = NO;
}
if(currentSample > (tapesStartSample + sectionTime + pauseGap) && sample < (bufferFrames * 2))
{
outA[i] = in[sample++];
outB[i] = in[sample++];
}
else
{
outA[i] = 0;
outB[i] = 0;
}
}
}
sndbuf[inBusNumber].sampleNum = sample;
return noErr;
}
At the moment when this variable is changed it triggers a method in self, but this leads to an unacceptable delay (20-30 seconds) when executed from this render callback (I am thinking because it is Objective-C code running in the high priority audio thread?). How do I effectively trigger such a change without the delay? (The trigger will change a pause button to a play button and call a reset method to prepare for the next play.)
Thanks
Yes. Don't use Objective-C code in the render thread, since it runs at high priority. Instead, store the state in plain memory (a pointer or a struct) and have a timer on the main thread poll (check) the value(s) in memory. The timer need not be anywhere near as fast as the render thread and will still be plenty accurate. A sketch of that pattern is below.
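For example, a minimal sketch of that pattern (the names g_tapesFinished, checkTapesFinished: and tapesDidFinish, and the 0.1 s interval, are just illustrative):

// Shared flag in plain C, so the render callback never touches Objective-C.
static volatile int g_tapesFinished = 0;

// Inside the render callback, instead of the KVO / autorelease-pool block:
if(sample >= (bufferFrames * 2) && !tapesEndPast)
{
    g_tapesFinished = 1;   // just flip the flag; no ObjC, no allocation
    tapesEndPast = YES;
}

// On the main thread (e.g. in viewDidLoad), poll the flag with a timer:
[NSTimer scheduledTimerWithTimeInterval:0.1
                                 target:self
                               selector:@selector(checkTapesFinished:)
                               userInfo:nil
                                repeats:YES];

- (void)checkTapesFinished:(NSTimer *)timer
{
    if (g_tapesFinished)
    {
        g_tapesFinished = 0;
        [self tapesDidFinish];   // update the pause/play button, reset state, etc.
    }
}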
Try this.
Global :
BOOL FlgTotalSampleTimeCollected = FALSE;
Float64 HigestSampleTime = 0 ;
Float64 TotalSampleTime = 0;
in -(OSStatus) setUpAUFilePlayer:
AudioStreamBasicDescription fileASBD;
// get the audio data format from the file
UInt32 propSize = sizeof(fileASBD);
CheckError(AudioFileGetProperty(inputFile, kAudioFilePropertyDataFormat,
&propSize, &fileASBD),
"couldn't get file's data format");
UInt64 nPackets;
UInt32 propsize = sizeof(nPackets);
CheckError(AudioFileGetProperty(inputFile, kAudioFilePropertyAudioDataPacketCount,
&propsize, &nPackets),
"AudioFileGetProperty[kAudioFilePropertyAudioDataPacketCount] failed");
Float64 sTime = nPackets * fileASBD.mFramesPerPacket;
if (HigestSampleTime < sTime)
{
HigestSampleTime = sTime;
}
In RenderCallBack :
if (*actionFlags & kAudioUnitRenderAction_PreRender)
{
if (!THIS->FlgTotalSampleTimeCollected)
{
[THIS setFlgTotalSampleTimeCollected:TRUE];
[THIS setTotalSampleTime:(inTimeStamp->mSampleTime + THIS->HigestSampleTime)];
}
}
else if (*actionFlags & kAudioUnitRenderAction_PostRender)
{
if (inTimeStamp->mSampleTime > THIS->TotalSampleTime)
{
NSLog(@"inTimeStamp->mSampleTime :%f",inTimeStamp->mSampleTime);
NSLog(@"audio completed");
[THIS callAudioCompletedMethodHere];
}
}
This worked for me.
Test it on a device.