ESP32-CAM: how to publish a large binary payload to an AWS IoT MQTT topic over SSL? Tested many libs without success :-(

I'm currently working on an ESP32-CAM project to publish captures from the camera in high resolution (UXGA) to an AWS IoT topic.
I've managed to publish some short JSON payloads with attributes to different certificate-protected AWS IoT topics, but I'm facing an annoying issue doing the same with a large payload such as the capture's binary data.
I've browsed many sites and forums and tested different libs (MQTT, PubSubClient, AsyncMQTTClient...), but I've not found a truly working solution for large payloads of around 100 KB.
For example, with the PubSubClient lib, I try to fragment my binary payload with the beginPublish / write / endPublish scheme as below:
bool publishBinary(const uint8_t *buffer, size_t len, const char *topicPubName)
{
  Serial.print("publishing binary ["+(String)len+"] ...");
  if (len == 0) {
    // Empty file
    Serial.println("Error : binary payload is empty!");
    return(false);
  }
  if (!client.beginPublish(topicPubName, len, false)) {
    Serial.println("MQTT beginPublish failed.");
    return(false);
  }
  size_t max_transfer_size = 80;
  size_t n = 0;
  size_t size_send;
  size_t offset = 0;
  while ((len - offset) > 0) {
    n = (len - offset);
    if (n > max_transfer_size)
      n = max_transfer_size;
    size_send = client.write((const uint8_t *)(buffer + offset), n);
    Serial.printf("%d/%d : %.02f %%\n", offset, len, (double)((100*offset)/len));
    //Serial.println("n: "+(String)n+" - send: "+(String)size_send);
    if (size_send != n) {
      // error handling. this is triggered on write fail.
      Serial.println("Error during publishing..."+(String)size_send+" instead of "+(String)n);
      client.endPublish();
      return(false);
    } else {
      offset += size_send;
    }
  }
  client.endPublish();
  Serial.println("ok");
  return(true);
}
client is defined as PubSubClient client(net), where net is a WiFiClientSecure object with a validated CA cert, client cert and private key.
The MQTT connection works well, but when I try to publish the large binary payload, the function fragments the buffer into chunks until the end, yet there is almost always an error like UNKNOWN ERROR CODE (0050), or, when it does succeed, only a small part of the payload is published to the destination. In that case my JPEG file is truncated on the S3 bucket where the payload lands.
I have to say that sometimes I managed to publish a 65 KB payload, but it felt like a stroke of luck... :-)
I've looked for examples on the web, but very often they are for small payloads. As mentioned in another post, I've tested publish_P(...) from PubSubClient... but same result, it aborts during the transfer.
I'm beginning to ask myself whether this is really possible over an MQTT topic, or whether I have to create an API Gateway with a Lambda to handle such a large payload. Tell me I'm wrong :-)
If you know a good, truly working solution for publishing large payloads, I would be delighted to discuss it with you :-)
Thanks!

#include <PubSubClient.h>
void setup() {
  ...
  boolean res = mqttClient.setBufferSize(50*1024); // ok for 640*480
  if (res) Serial.println("Buffer resized."); else Serial.println("Buffer resizing failed");
  ...
}
I'm working with a 50 kB buffer and it works well; I haven't tried beyond that, but it should work with 100 kB as well.
After you've resized the buffer, publish as you normally would.
BTW, the setBufferSize function was only recently added, IIRC.
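To make "publish as you normally would" concrete, here is a minimal sketch of my own (not from the original answer) that grabs one JPEG frame with the esp32-camera driver and sends it in a single publish() call; it assumes the client is connected and that setBufferSize() succeeded with a value at least as large as the frame:
#include "esp_camera.h"
#include <PubSubClient.h>

// Hedged sketch: publishFrame is an illustrative helper, not part of any library.
bool publishFrame(PubSubClient &client, const char *topic) {
  camera_fb_t *fb = esp_camera_fb_get();   // one JPEG frame from the camera driver
  if (!fb) return false;
  // Single-call publish: requires the frame to fit in the buffer set by
  // setBufferSize() and to stay under AWS IoT's 128 KB per-message limit.
  bool ok = client.publish(topic, fb->buf, fb->len);
  esp_camera_fb_return(fb);                // always hand the frame buffer back
  return ok;
}
With PubSubClient 2.8, publish() refuses payloads that do not fit in the client buffer, so an insufficient resize should show up as a false return here rather than as a truncated file on S3.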

Hey, I had problems with the PubSubClient library and large files too. In the end I figured out that I have to update PubSubClient.h as follows:
//128000 = 128 kB is the maximum size for AWS I think..
#define MQTT_MAX_PACKET_SIZE 100000
// It takes a long time to transmit the large files
// maybe even more than 200 seconds...
#define MQTT_KEEPALIVE 200
#define MQTT_SOCKET_TIMEOUT 200

I had the same problem, and I found that PubSubClient's bufferSize is defined as uint16_t.
https://github.com/knolleary/pubsubclient/blob/v2.8/src/PubSubClient.h#L92
So, we can't extend the buffer size over 64 kB, and can't publish a large payload.
Michael's comment might help you.
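To make the 16-bit limit concrete: 100 * 1024 = 102400 narrows to 102400 % 65536 = 36864 when passed as a uint16_t, so the resize "succeeds" but allocates far less than intended. Here is a small guard of my own (illustrative name, not part of PubSubClient) that you could call instead:
// Hedged sketch: refuse sizes that would wrap in setBufferSize(uint16_t).
bool resizeMqttBuffer(PubSubClient &client, size_t wanted) {
  if (wanted > UINT16_MAX) {
    Serial.println("Requested MQTT buffer exceeds PubSubClient's 16-bit limit");
    return false;
  }
  return client.setBufferSize((uint16_t)wanted);
}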

Related

I have a problem with the YouTube API on an ESP8266

I'm trying to build a YouTube subscriber counter, but there is a problem with the YouTube API library. Here is the error message:
Arduino: 1.8.12 (Windows 10), Board: "NodeMCU 1.0 (ESP-12E Module), 80 MHz, Flash, Legacy (new can return nullptr), All SSL ciphers (most compatible), 4MB (FS:2MB OTA:~1019KB), 2, v2 Lower Memory, Disabled, None, Only Sketch, 115200"
The sketch name had to be modified.
Sketch names must start with a letter or number, followed by letters,
numbers, dashes, dots and underscores. Maximum length is 63 characters.
C:\Users\Um Sythat\Documents\Arduino\libraries\arduino-youtube-api-master\src\YoutubeApi.cpp:95:11: error: DynamicJsonBuffer is a class from ArduinoJson 5. Please see arduinojson.org/upgrade to learn how to upgrade your program to ArduinoJson version 6
DynamicJsonBuffer jsonBuffer;
^
C:\Users\Um Sythat\Documents\Arduino\libraries\arduino-youtube-api-master\src\YoutubeApi.cpp: In member function 'bool YoutubeApi::getChannelStatistics(String)':
C:\Users\Um Sythat\Documents\Arduino\libraries\arduino-youtube-api-master\src\YoutubeApi.cpp:95:20: error: 'jsonBuffer' was not declared in this scope
DynamicJsonBuffer jsonBuffer;
^
C:\Users\Um Sythat\Documents\Arduino\libraries\arduino-youtube-api-master\src\YoutubeApi.cpp:97:10: error: 'ArduinoJson::JsonObject' has no member named 'success'
if(root.success()) {
^
exit status 1
Error compiling for board NodeMCU 1.0 (ESP-12E Module).
Invalid library found in C:\Program Files (x86)\Arduino\libraries\libraries: no headers files (.h) found in C:\Program Files (x86)\Arduino\libraries\libraries
Invalid library found in C:\Program Files (x86)\Arduino\libraries\youtube_control_arduino: no headers files (.h) found in C:\Program Files (x86)\Arduino\libraries\youtube_control_arduino
Invalid library found in C:\Program Files (x86)\Arduino\libraries\libraries: no headers files (.h) found in C:\Program Files (x86)\Arduino\libraries\libraries
Invalid library found in C:\Program Files (x86)\Arduino\libraries\youtube_control_arduino: no headers files (.h) found in C:\Program Files (x86)\Arduino\libraries\youtube_control_arduino
This report would have more information with
"Show verbose output during compilation"
option enabled in File -> Preferences.
I already downloaded the YouTube API library and the ArduinoJson library and imported them into the Arduino IDE, but I always get this error and I don't know why. If someone knows, please help me; I'd like to hear from you.
And here is my code:
/*******************************************************************
* Read YouTube Channel statistics from the YouTube API *
* *
* By Brian Lough *
* https://www.youtube.com/channel/UCezJOfu7OtqGzd5xrP3q6WA *
*******************************************************************/
#include <YoutubeApi.h>
#include <ESP8266WiFi.h>
#include <WiFiClientSecure.h>
#include <ArduinoJson.h> // This Sketch doesn't technically need this, but the library does so it must be installed.
//------- Replace the following! ------
char ssid[] = "xxx"; // your network SSID (name)
char password[] = "yyyy"; // your network key
#define API_KEY "zzzz" // your google apps API Token
#define CHANNEL_ID "UCezJOfu7OtqGzd5xrP3q6WA" // makes up the url of channel
WiFiClientSecure client;
YoutubeApi api(API_KEY, client);
unsigned long api_mtbs = 60000; //mean time between api requests
unsigned long api_lasttime; //last time api request has been done
long subs = 0;
void setup() {
Serial.begin(115200);
// Set WiFi to station mode and disconnect from an AP if it was Previously
// connected
WiFi.mode(WIFI_STA);
WiFi.disconnect();
delay(100);
// Attempt to connect to Wifi network:
Serial.print("Connecting Wifi: ");
Serial.println(ssid);
WiFi.begin(ssid, password);
while (WiFi.status() != WL_CONNECTED) {
Serial.print(".");
delay(500);
}
Serial.println("");
Serial.println("WiFi connected");
Serial.println("IP address: ");
IPAddress ip = WiFi.localIP();
Serial.println(ip);
}
void loop() {
if (millis() - api_lasttime > api_mtbs) {
if(api.getChannelStatistics(CHANNEL_ID))
{
Serial.println("---------Stats---------");
Serial.print("Subscriber Count: ");
Serial.println(api.channelStats.subscriberCount);
Serial.print("View Count: ");
Serial.println(api.channelStats.viewCount);
Serial.print("Comment Count: ");
Serial.println(api.channelStats.commentCount);
Serial.print("Video Count: ");
Serial.println(api.channelStats.videoCount);
// Probably not needed :)
//Serial.print("hiddenSubscriberCount: ");
//Serial.println(api.channelStats.hiddenSubscriberCount);
Serial.println("------------------------");
}
api_lasttime = millis();
}
}
I would ditch the library - it uses
#include <ArduinoJson.h>
and the error tells us what's wrong:
error: DynamicJsonBuffer is a class from ArduinoJson 5 <=== you probably have version 6.x.x installed
ArduinoJson is a memory hog; often a library uses only one of its functions and the rest is useless. To solve your problem, you have to
downgrade ArduinoJson to version 5.13.5.
The other reason I don't like it: breaking changes in nearly all major releases (really big breaking changes). So another option you have (if you're skilled enough) is to replace the ArduinoJson calls with
a lighter JSON library,
or replace them with self-written JSON functions - often a simple buffer and some char handling does the trick.
Read the issues and PRs on GitHub to inform yourself about missing updates and other problems. Development of the YouTube API library stopped in March 2018; since then there has been no adaptation to the (breaking) changes in the YouTube API.
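If you would rather keep ArduinoJson 6 and patch YoutubeApi.cpp instead of downgrading, here is a minimal sketch of the 5 -> 6 change the compiler is complaining about (capacity and field path are illustrative guesses, not taken from the library):
#include <ArduinoJson.h>

// ArduinoJson 5 pattern (as in the error messages above):
//   DynamicJsonBuffer jsonBuffer;
//   JsonObject& root = jsonBuffer.parseObject(payload);
//   if (root.success()) { ... }
//
// ArduinoJson 6 equivalent:
bool parseSubscriberCount(const String &payload, long &subs) {
  DynamicJsonDocument doc(2048);                        // capacity in bytes
  DeserializationError err = deserializeJson(doc, payload);
  if (err) return false;                                // replaces root.success()
  subs = doc["items"][0]["statistics"]["subscriberCount"] | 0L;  // illustrative path
  return true;
}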

Communication between the board and the GPS module

I'm currently having trouble with communication between the dev board (STM32L476RG) and the GPS module (GP-207U). What my code does now is print the very first packet received from the GPS to PuTTY, and then it keeps printing that same packet; even if I unplug the Tx wire from the dev board, PuTTY still keeps printing. I suspect that either the buffer storing the received data is not getting updated (flushed), or the HAL_UART_Receive() function only runs once. (The receive function is inside while(1) in main, so I'm confused.)
(I unplugged the GPS and PuTTY still prints, so the receive function isn't doing anything after it received the very first packet from the GPS.)
/* retrieve data from GPS */
char UARTRxBuffer[1024] = "";
char RxBuffer[1024] = "";

void GetGPS(void) {
    HAL_UART_Receive(&huart3, (uint8_t *)UARTRxBuffer, 1024, 1000);
    HAL_Delay(100);
    sprintf(RxBuffer, "%s\r\n\r\n", UARTRxBuffer);
    HAL_UART_Transmit(&huart2, (uint8_t *)RxBuffer, strlen(RxBuffer), 5000);
    HAL_Delay(100);
}
GetGPS() is put into while(1) in main().
I tried everything based on my guesses, but none worked.
Thanks ahead for any sort of help!
I suspect the call to HAL_UART_Receive is timing out (1000 ms in your code) during the second and subsequent attempts to read the GPS. If so, the buffer contents wouldn't get cleared or overwritten, resulting in the same data being printed over and over. It might help to read the GPS datasheet/manual to find out the maximum polling speed (here it appears to be ~200 ms, considering the 2x 100 ms delays) and adjust the delay if the GPS device cannot keep up.
Try this to confirm:
HAL_StatusTypeDef status = HAL_UART_Receive(/* same as above */);
if (status == HAL_OK) {
    // got valid data
    sprintf(RxBuffer, "%s\r\n\r\n", UARTRxBuffer);
    HAL_UART_Transmit(&huart2, (uint8_t *)RxBuffer, strlen(RxBuffer), 5000);
}
else {
    sprintf(RxBuffer, "read timeout.\r\n\r\n");
    HAL_UART_Transmit(&huart2, (uint8_t *)RxBuffer, strlen(RxBuffer), 5000);
}
API reference docs here page 1037/2232
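A hedged variant of the sketch above (same huart handles assumed, with <string.h> and <stdio.h> included) that also clears the receive buffer before each poll, so a timed-out read can never re-print the previous packet:
void GetGPS(void) {
    memset(UARTRxBuffer, 0, sizeof(UARTRxBuffer));          /* drop stale data */
    HAL_StatusTypeDef status = HAL_UART_Receive(&huart3, (uint8_t *)UARTRxBuffer,
                                                sizeof(UARTRxBuffer) - 1, 1000);
    if (status == HAL_OK) {
        snprintf(RxBuffer, sizeof(RxBuffer), "%s\r\n\r\n", UARTRxBuffer);
    } else {
        snprintf(RxBuffer, sizeof(RxBuffer), "read status %d\r\n\r\n", (int)status);
    }
    HAL_UART_Transmit(&huart2, (uint8_t *)RxBuffer, strlen(RxBuffer), 5000);
    HAL_Delay(100);
}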

How to test Golang channels / goroutines

I have a type that contains a byte of data, and takes a channel to post new data there. Other code can read the last written byte of data using a Read function.
Edit: for actual, runnable code, see https://github.com/ariejan/i6502/pull/3 especially files acia6551.go and acia6551_test.go. Tests results can be viewed here: https://travis-ci.org/ariejan/i6502/jobs/32862705
I have the following:
// Emulates a serial interface chip of some kind.
type Unit struct {
    // Channel for others to use; bytes written here will be placed in rxChar
    Rx chan byte
    // Internal store of the last byte written.
    rxChar byte // Internal storage
}

// Used internally to read the data stored in rxChar
func (u *Unit) Read() byte {
    return u.rxChar
}

// Create a new Unit and a goroutine to listen for Rx bytes
func NewUnit(rx chan byte) *Unit {
    unit := &Unit{Rx: rx}
    go func() {
        for {
            select {
            case data := <-unit.Rx:
                unit.rxChar = data
                fmt.Printf("Posted 0x%02X\n", data)
            }
        }
    }()
    return unit
}
My test looks like this:
func TestUnitRx(t *testing.T) {
    rx := make(chan byte)
    u := NewUnit(rx)

    // Post a byte to the Rx channel
    // This prints "Posted 0x42", as you'd expect
    rx <- 0x42

    // Using testing
    // Should read the last byte, 0x42, but fails.
    fmt.Println("Reading value...")
    assert.Equal(t, 0x42, u.Read())
}
At first I figured the "Reading value" happened before the goroutine got around to writing the data. But the "Posted" message is always printed before "Reading".
So, two questions remain:
Is this the best way to handle an incoming stream of bytes (at 9600 baud ;-))?
If this is the right way, how can I properly test it, or what is wrong with my code?
Guessing by the pieces posted here, it doesn't look like you have anything guaranteeing the order of operations when accessing the stored data. You can use a mutex around any data shared between goroutines.
A better option here is to use buffered channels of length 1 to write, store, and read the bytes.
It's always a good idea to test your program with -race to use the race detector.
Since this looks very "stream" like, you very well may want some buffering, and to look at some examples of how the io.Reader and io.Writer interfaces are often used.

Some basic HTTP protocol questions (programming and theory)

Before you decide it's a tl;dr (too long, didn't read) post, try to read at least some of it, since it's a question broken down into a lot of small pieces, some of which you can probably answer.
Please try to help me as much as you can. These kinds of problems are very common on the internet, and I think you will help me and many more people after me.
I am currently researching HTTP services and the protocol itself so that I can find out whether it is useful to me.
I have some basic questions as well as some code that needs to be discussed.
First, I would like to know how the communication starts. I have discovered that the client sends a message in which it requests a resource (is this correct?). Then what happens? What do I (as the server) have to reply with?
Do I need to append a carriage return and a line feed after every response? Somewhere it says there even need to be two (\r\n\r\n).
How can asynchronous writing be established? (I hope this question is understandable.) My primary goal is to establish a connection between a client and a server and then a continuous data stream from the server to the client. Does the client need to reply to every message it gets?
I hope I made my questions clear, since I'm not an expert in these things (yet; I am very interested in them).
And for the programming part of my problem.
I have managed to put together a simple program in Qt/C++ (server side) and a simple client in Objective-C (iOS). The client connects and I can read the request header. It looks like this:
Data available, incoming: "GET / HTTP/1.1
Host: localhost:9990
Connection: close
User-Agent: CFStream%20test/1.0 CFNetwork/609 Darwin/12.2.0
Should I reply to this header manually? And if so, with what?
The client-side code looks like this (I know it's not pseudocode, but I think it's pretty self-explanatory):
- (void)setupStream
{
NSURL *url = [NSURL URLWithString:@"http://localhost:9990"];
CFHTTPMessageRef message = CFHTTPMessageCreateRequest(NULL, (CFStringRef)@"GET", (CFURLRef)url, kCFHTTPVersion1_1);
stream = CFReadStreamCreateForHTTPRequest(NULL, message);
CFRelease(message);
if (!CFReadStreamSetProperty(stream, kCFStreamPropertyHTTPShouldAutoredirect, kCFBooleanTrue))
{
NSLog(#"Some error.");
}
CFDictionaryRef proxySettings = CFNetworkCopySystemProxySettings();
CFReadStreamSetProperty(stream, kCFStreamPropertyHTTPProxy, proxySettings);
CFRelease(proxySettings);
if (!CFReadStreamOpen(stream))
{
CFRelease(stream);
NSLog(#"Error opening stream.");
}
CFStreamClientContext context = {0, self, NULL, NULL, NULL};
CFReadStreamSetClient(stream, kCFStreamEventHasBytesAvailable | kCFStreamEventErrorOccurred, readStreamCallback, &context);
CFReadStreamScheduleWithRunLoop(stream, CFRunLoopGetCurrent(), kCFRunLoopCommonModes);
NSLog(#"Done");
}
This is the setup stream method. The stream variable is a class variable of type CFReadStreamRef.
The callback looks like this:
static void readStreamCallback(CFReadStreamRef aStream, CFStreamEventType event, void *client)
{
ViewController *controller = (ViewController*)client;
[controller handleEvent:event forStream:aStream];
}
And the handle event like this:
- (void)handleEvent:(CFStreamEventType)event forStream:(CFReadStreamRef)aStream
{
if (aStream != stream)
{
return;
}
NSLog(#"Handle event callback");
switch (event)
{
case kCFStreamEventHasBytesAvailable:
NSLog(#"Work log");
UInt8 bytes[11];
CFIndex length;
length = CFReadStreamRead(stream, bytes, 11); //I know 11 bytes is hard coded, its in testing stage now. Feel free to suggest me how to do it better.
if (length == -1)
{
NSLog(#"Error, data length = -1");
return;
}
NSLog(#"Len: %li, data: %s", length, bytes);
break;
default:
NSLog(#"Other event");
break;
}
}
And that's practically all the client code that is worth mentioning. The Qt server part (I will only post the important parts) is a subclassed QTcpServer class. First, startServer() is called:
bool Server::startServer()
{
if (!this->listen(QHostAddress::Any, 9990))
return false;
return true;
}
When there is a connection incoming the incomingConnection is fired off with the socket descriptor as a parameter:
void Server::incomingConnection(int handle)
{
qDebug("New client connected");
ServerClient *client = new ServerClient(handle, this); //The constructor takes in the socket descriptor needed to set up the socket and the parent (this)
client->setVectorLocation(clients.count()); //This is a int from a Qvector in which i append the clients, its not important for understanding right now.
connect(client, SIGNAL(clientDisconnected(int)), this, SLOT(clientDisconnected(int)), Qt::QueuedConnection); //When the client socket emits a disconnected signal the ServerClient class emits a client disconnected signal which the server uses to delete that client from the vector (thats why I use "setVectorLocation(int)") - not important right now
clients.push_back(client); //And then I append the client to the QVector - not important right now
}
The ClientServer class constructor just creates a new socket and connects the required methods:
ServerClient::ServerClient(int handle, QObject *parent) :
QObject(parent)
{
socket = new QTcpSocket(this); //Socket is a class variable
connect(socket, SIGNAL(disconnected()), this, SLOT(disconnected()));
connect(socket, SIGNAL(readyRead()), this, SLOT(readyRead()));
socket->setSocketDescriptor(handle);
}
readyRead just prints the incoming data for me (it won't be much use later, I think):
void ServerClient::readyRead()
{
qDebug() << "Data available, incoming: " << socket->readAll();
}
And finally the write data:
void ServerClient::writeData(QByteArray *data)
{
data->append("\r\n\r\n"); //I have read this must be appended to all outgoing data from a HTTP server
socket->write(*data);
socket->flush();
qDebug() << "Written data to client: " << *data;
}
This code, however, does not always work. Sometimes when I write a message like "Message" the client receives all the data plus some things that shouldn't be there (a new line and a weird symbol - can NSLog cause this?). Sometimes when I send "Hellow" the client only gets "Hel" and some other funky stuff.
What are the problems? What should I pay more attention to? Anything that helps will be MUCH appreciated. And please don't paste in links to a book with a few hundred pages; I'm sure this can be solved just by explaining things to me.
THANKS A LOT!
Jan.
You asked many questions ... and that's a perfectly legitimate thing to do :)
I confess - it was too long, I didn't read :(
BUT ...
1) Yes, the HTTP protocol does expect a "CRLF" ("\r\n"). Many servers and many clients are "forgiving", but strictly speaking - yes, you need them (see the response sketch after this answer).
REFERENCE: RFC 2616
2) Wanting to understand HTTP "internals" is also perfectly legitimate - I applaud you.
One good way is to read the RFC(s).
Another is to use a "telnet" client: http://blog.tonycode.com/tech-stuff/http-notes/making-http-requests-via-telnet
Yet another is to study requests and responses in FF Firebug
3) Socket programming is another issue - which explains why sometimes you might read "hello world", and other times you might just get "hel".
Strong recommendation: Beej's Guide to Network Programming
4) Finally, no way would I actually write a server in Qt with C++ (except maybe as a toy "science experiment", or for some really off-the-wall requirement)
I would definitely write server code in C# (for Windows servers), Java (for everything else) or a scripting language I felt comfortable with (Perl, Ruby/RoR, Python and Lua all come to mind).
IMHO .. and hope that helps!
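To make point 1) concrete, here is a minimal sketch (my own illustrative helper, not code from the question) of a complete HTTP/1.1 response the Qt ServerClient could write back, instead of appending \r\n\r\n to arbitrary data:
// Hedged sketch: writeHttpResponse is a hypothetical helper for the asker's
// ServerClient class; header fields and body are illustrative.
void ServerClient::writeHttpResponse(const QByteArray &body)
{
    QByteArray response;
    response.append("HTTP/1.1 200 OK\r\n");
    response.append("Content-Type: text/plain\r\n");
    response.append("Content-Length: " + QByteArray::number(body.size()) + "\r\n");
    response.append("Connection: close\r\n");
    response.append("\r\n");           // the blank line ends the header block
    response.append(body);             // exactly Content-Length bytes follow
    socket->write(response);
    socket->flush();
}
The blank line after the headers is the \r\n\r\n the question asks about; the body itself does not need a trailing CRLF.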
Your questions pretty much amount to "how does HTTP work", and the full answer lies in the specification.

Objective-C - Passing Streamed Data to Audio Queue

I am currently developing an app on iOS that reads IMA-ADPCM audio data in over through a TCP socket and converts it to PCM and then plays the stream. At this stage, I have completed the class that pulls (or should I say reacts to pushes) in the data from the stream and decoded it to PCM. I have also setup the Audio Queue class and have it playing a test tone. Where I need assistance is the best way to pass the data into the Audio Queue.
The audio data comes out of the ADPCM decoder as 8 kHz 16-bit LPCM in 640-byte chunks (it originates as 160 bytes of ADPCM data but decompresses to 640). It comes into the function as a uint8_t array and passes out an NSData object. The stream is a 'push' stream, so every time audio is sent, it will create/flush the object.
-(NSData*)convertADPCM:(uint8_t[]) adpcmdata {
The Audio Queue callback of course is a pull function that goes looking for data on each pass of the run loop, on each pass it runs:
-(OSStatus) fillBuffer: (AudioQueueBufferRef) buffer {
I've been working on this for a few days; the PCM conversion was quite taxing, and I am having a little trouble assembling in my head the best way to bridge the data between the two. It's not as if I am creating the data (in that case I could simply incorporate data creation into the fillBuffer routine); rather, the data is being pushed.
I did set up a circular buffer of 0.5 seconds in a uint16_t[], but I think I have worn my brain out and couldn't work out a neat way to push and pull from the buffer, so I ended up with snap, crackle, pop.
I have completed the project mostly on Android, but found AudioTrack a very different beast to Core-Audio Queues.
At this stage I will also say I picked up a copy of Learning Core Audio by Adamson and Avila and found this an excellent resource for anyone looking to demystify core audio.
UPDATE:
Here is the buffer management code:
-(OSStatus) fillBuffer: (AudioQueueBufferRef) buffer {
int frame = 0;
double frameCount = bufferSize / self.streamFormat.mBytesPerFrame;
// buffersize = framecount = 8000 / 2 = 4000
//
// incoming buffer uint16_t[] convAudio holds 64400 bytes (big I know - 100 x 644 bytes)
// playedHead is set by the function to say where in the buffer the
// next starting point should be
if (playHead > 99) {
playHead = 0;
}
// Playstep factors playhead to get starting position
int playStep = playHead * 644;
// filling the buffer
for (frame = 0; frame < frameCount; ++frame)
// framecount = 4000
{
// pointer to buffer
SInt16 *data = (SInt16*)buffer->mAudioData;
// load data from uint16_t[] convAudio array into frame
(data)[frame] = convAudio[(frame + playStep)];
}
// set buffersize
buffer->mAudioDataByteSize = bufferSize;
// return no Error - Osstatus will return an error otherwise if there is one. (I think)
return noErr;
}
As I said, my brain was fuzzy when I wrote this, and there's probably something glaringly obvious I am missing.
Above code is called by the callback:
static void MyAQOutputCallback(void *inUserData, AudioQueueRef inAQ, AudioQueueBufferRef inCompleteAQBuffer)
{
soundHandler *sHandler = (__bridge soundHandler*)inUserData;
CheckError([sHandler fillBuffer: inCompleteAQBuffer],
"can't refill buffer",
"buffer refilled");
CheckError(AudioQueueEnqueueBuffer(inAQ,
inCompleteAQBuffer,
0,
NULL),
"Couldn't enqueue buffer (refill)",
"buffer enqued (refill)");
}
On the convAudio array side of things, I have dumped it to the log and it is getting filled and refilled in a circular fashion, so I know at least that bit is working.
The hard part is managing rates, and deciding what to do if they don't match. At first, try using a huge circular buffer (many, many seconds) and mostly fill it before starting the Audio Queue that pulls from it. Then monitor the buffer level to see how big a rate-matching problem you have.
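As a starting point for that circular buffer, here is a minimal plain-C sketch (names like RingBuffer, ring_write and ring_read are mine, not from the post) sized for a few seconds of 8 kHz 16-bit PCM; the decoder pushes into it and fillBuffer pulls from it:
#include <stdint.h>
#include <string.h>

#define RING_CAPACITY (8000 * 2 * 5)   /* 5 seconds of 8 kHz, 16-bit mono PCM */

typedef struct {
    uint8_t data[RING_CAPACITY];
    size_t  writePos;                  /* advanced by the decoder's push */
    size_t  readPos;                   /* advanced by the Audio Queue callback */
} RingBuffer;

/* Producer side: call each time a 640-byte decoded chunk arrives. */
static void ring_write(RingBuffer *rb, const uint8_t *src, size_t len) {
    for (size_t i = 0; i < len; i++) {
        rb->data[rb->writePos % RING_CAPACITY] = src[i];
        rb->writePos++;
    }
}

/* Consumer side: call from fillBuffer; pads with silence on underrun. */
static size_t ring_read(RingBuffer *rb, uint8_t *dst, size_t len) {
    size_t available = rb->writePos - rb->readPos;
    size_t n = (available < len) ? available : len;
    for (size_t i = 0; i < n; i++) {
        dst[i] = rb->data[rb->readPos % RING_CAPACITY];
        rb->readPos++;
    }
    if (n < len) memset(dst + n, 0, len - n);   /* zero-fill = silence */
    return n;
}
Because the push comes from the network/decoder thread and the pull from the Audio Queue's callback thread, the read/write positions need protection (a lock or atomic operations) in a real implementation; the sketch leaves that out for brevity.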