Execute function before application quits - objective-c

I'd like to call a function before processing the stop/kill signal.
Is there an easy way to do this?

You can handle SIGTERM and SIGINT by setting up a signal handler (see signal(3)); however, you cannot handle SIGKILL, which is why SIGKILL should be the last resort to use against a program.
If you always want to do something before the process exits, see atexit(3).
$ cat sig.c
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <signal.h>
static void closedown(void) {
    printf("running closedown\n");
}

/* turn SIGTERM/SIGINT into a normal exit so the atexit handler runs */
static void sighandler(int signo) {
    (void)signo;
    exit(1);
}

int main(int argc, const char **argv) {
    signal(SIGTERM, sighandler);
    signal(SIGINT, sighandler);
    atexit(closedown);   /* closedown runs on any normal exit */
    while (1) {
        sleep(1);
        printf("tick\n");
    }
    return 0;
}
$ clang -o sig sig.c
$ ./sig
tick
tick
^Crunning closedown
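
A note on portability: the exact semantics of signal(3) (handler reset, syscall restart) vary between systems, so sigaction(2) is usually preferred for anything beyond a quick test. A minimal sketch of the same handler installation, reusing the sighandler above:

#include <string.h>

static void install_handlers(void) {
    struct sigaction sa;
    memset(&sa, 0, sizeof sa);
    sa.sa_handler = sighandler;  /* same handler as above */
    sigemptyset(&sa.sa_mask);    /* block no extra signals while handling */
    sa.sa_flags = 0;             /* no SA_RESTART: an interrupted sleep() returns early */
    sigaction(SIGTERM, &sa, NULL);
    sigaction(SIGINT, &sa, NULL);
}

Also worth noting: exit() is not async-signal-safe, so in production code the handler would more safely just set a flag that the main loop checks before exiting.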

Related

Boost asio crashes

I have a program that uses cpprestsdk for HTTP querying and websocketpp for subscribing to a data stream. The program crashes immediately (it says Process finished with exit code 139 (interrupted by signal 11: SIGSEGV)). But if I comment out either the HTTP query or the data-stream subscription, the program won't crash.
#include <websocketpp/config/asio_client.hpp>
#include <websocketpp/client.hpp>
#include "json.hpp"
#include <iostream>
#include <ctime>
#include <thread>
#include <chrono>
#include <cpprest/http_client.h>
#include <cpprest/filestream.h>
#include <vector>
#include <string>
using std::string;
using namespace web;
using std::cout, std::endl;
using std::vector;
using websocketpp::lib::placeholders::_1;
using websocketpp::lib::placeholders::_2;
using websocketpp::lib::bind;
typedef websocketpp::client<websocketpp::config::asio_tls_client> client;
typedef websocketpp::config::asio_client::message_type::ptr message_ptr;
void on_stream_data(websocketpp::connection_hdl hdl, message_ptr msg) {
}

class OrderBook {
public:
    void initialize() {
        web::http::client::http_client_config cfg;
        std::string uri = string("https://fapi.binance.com/fapi/v1/depth?symbol=btcusdt&limit=1000");
        web::http::client::http_client client(U(uri), cfg);
        web::http::http_request request(web::http::methods::GET);
        request.headers().add("Content-Type", "application/x-www-form-urlencoded");
        web::http::http_response response = client.request(request).get();
    }

    int start_stream() {
        client c;
        std::string uri = string("wss://fstream.binance.com/ws/btcusdt#depth#100ms");
        try {
            c.set_access_channels(websocketpp::log::alevel::all);
            c.clear_access_channels(websocketpp::log::alevel::frame_payload);
            c.init_asio();
            c.set_message_handler(bind(on_stream_data, ::_1, ::_2));
            websocketpp::lib::error_code ec;
            client::connection_ptr con = c.get_connection(uri, ec);
            if (ec) {
                std::cout << "could not create connection because: " << ec.message() << std::endl;
                return 0;
            }
            c.connect(con);
            c.run();
        } catch (websocketpp::exception const &e) {
            std::cout << e.what() << std::endl;
        }
        return 0;
    }
};

int main(int argc, char *argv[]) {
    OrderBook ob;
    ob.initialize();   // commenting out either this call or start_stream() avoids the crash
    std::this_thread::sleep_for(std::chrono::milliseconds(10000000));
    ob.start_stream(); // with both calls active the program crashes once started
}
When I run this program in CLion's debug mode, CLion shows that the crash comes from this function in /opt/homebrew/Cellar/boost/1.76.0/include/boost/asio/ssl/detail/impl/engine.ipp:
int engine::do_connect(void*, std::size_t)
{
    return ::SSL_connect(ssl_);
}
It says Exception: EXC_BAD_ACCESS (code=1, address=0xf000000000).
What's wrong with it? Is it because I run two pieces of code that both use boost::asio, and something is being initialized twice that shouldn't be?
I can compile this and run it fine.
My best bet is that you might be mixing versions, particularly Boost versions. A common failure mode is ODR violations leading to Undefined Behaviour.
Note that these header-only libraries depend on a number of Boost libraries that are not header-only (e.g. Boost System, Thread and/or Chrono). You need to compile against the same version as the libraries you link.
If you use distribution-packaged versions of any library (cpprestsdk, websocketpp or whatever JSON library it is you're using), then you'd be safest also using the distribution-packaged version of Boost.
I'd personally simplify the situation by just using Boost (Beast for HTTP/websockets, Boost.JSON for, you guessed it, JSON).
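For a flavour of that, here is a minimal synchronous Beast GET, sketched against plain HTTP on a placeholder host for brevity (the Binance endpoint would additionally need an ssl_stream and a TLS handshake, and beast::tcp_stream needs Boost 1.70 or later):

#include <boost/beast/core.hpp>
#include <boost/beast/http.hpp>
#include <boost/asio/connect.hpp>
#include <boost/asio/ip/tcp.hpp>
#include <iostream>

namespace beast = boost::beast;
namespace http  = beast::http;
namespace net   = boost::asio;
using     tcp   = net::ip::tcp;

int main() {
    net::io_context ioc;
    tcp::resolver resolver(ioc);
    beast::tcp_stream stream(ioc);

    // example.com is a placeholder host
    auto const results = resolver.resolve("example.com", "80");
    stream.connect(results);

    http::request<http::string_body> req{http::verb::get, "/", 11};
    req.set(http::field::host, "example.com");
    req.set(http::field::user_agent, "beast-sketch");
    http::write(stream, req);

    beast::flat_buffer buffer;
    http::response<http::dynamic_body> res;
    http::read(stream, buffer, res);
    std::cout << res << std::endl;

    beast::error_code ec;
    stream.socket().shutdown(tcp::socket::shutdown_both, ec);
}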
Running it all on a test Ubuntu 18.04 box with the OS-packaged Boost 1.65, the start_stream sequence triggers this informative error:
[2022-05-22 13:42:11] [fatal] Required tls_init handler not present.
could not create connection because: Connection creation attempt failed
All while being UBSAN/ASAN clean. Perhaps that error helps you once you figure out the configuration problems that made your program crash.
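That message points at the missing piece: with the asio_tls_client config, websocketpp requires a tls_init handler to be registered before connecting, so it can obtain an SSL context. A minimal sketch (the context options shown are common defaults, not anything verified against this particular service):

typedef websocketpp::lib::shared_ptr<boost::asio::ssl::context> context_ptr;

// called by websocketpp once per connection to obtain the SSL context
static context_ptr on_tls_init(websocketpp::connection_hdl) {
    auto ctx = websocketpp::lib::make_shared<boost::asio::ssl::context>(
        boost::asio::ssl::context::sslv23);
    ctx->set_options(boost::asio::ssl::context::default_workarounds |
                     boost::asio::ssl::context::no_sslv2 |
                     boost::asio::ssl::context::no_sslv3);
    return ctx;
}

// then, next to the other handlers in start_stream():
//     c.set_tls_init_handler(&on_tls_init);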

QTcpSocket does not emit connected()

I'm using Qt 5 and starting out with a basic server/client setup. I'm going single-threaded for both apps, as there is no heavy processing of network data. From everything I've read and researched here, when using the asynchronous approach you don't use waitForXXXX(), otherwise it interferes with the signals and slots.
The problem: on the client end, the connected() signal is either never emitted or never processed, even though the server console tells me that a new client has connected. I've been working on this same issue for two weeks now and couldn't find it reported anywhere. I've stripped both apps back to the minimum and still no luck - I've also stripped out the UI part now; I just want to see the console working. I have also tried switching to public slots and changing the signal/slot connection type, and I still have the same problem.
If you need code from the server, please let me know, but here are the basics of the client:
main.cpp
#include "QGameSocket.h"
#include <QtWidgets/QApplication>
#include <windows.h>
int main(int argc, char *argv[])
{
    AllocConsole();
    freopen("conin$", "r", stdin);
    freopen("conout$", "w", stdout);
    freopen("conout$", "w", stderr);

    QApplication a(argc, argv);
    QGameSocket* pSocket = new QGameSocket();
    return a.exec();
}
QGameSocket.h
#ifndef _QGAMESOCKET_H
#define _QGAMESOCKET_H

#include <QtNetwork/qtcpsocket.h>

#pragma comment ( lib, "Qt5Network.lib" )

class QGameSocket : public QObject
{
    Q_OBJECT

public:
    explicit QGameSocket( QObject* pParent = 0 );
    ~QGameSocket();

private slots:
    void __OnConnected();
    void __OnReadyRead();

private:
    QTcpSocket* m_pSocket;
};

#endif
QGameSocket.cpp
#include "QGameSocket.h"
#include <qdatastream.h>
QGameSocket::QGameSocket( QObject* pParent ) :
    QObject( pParent )
{
    m_pSocket = new QTcpSocket();
    connect( m_pSocket, SIGNAL( connected() ), this, SLOT( __OnConnected() ) );
    connect( m_pSocket, SIGNAL( readyRead() ), this, SLOT( __OnReadyRead() ) );

    const QString strHost = "127.0.0.1";
    qDebug() << "Connecting to host ...";
    m_pSocket->connectToHost( strHost, 27015 );
}

QGameSocket::~QGameSocket()
{
    m_pSocket->deleteLater();
}

void QGameSocket::__OnConnected()
{
    qDebug() << "Successfully connected to host!";
}

void QGameSocket::__OnReadyRead()
{
    //handle messages
}
Any help would be much appreciated, thank you!
I finally figured it out - I was using the release network library, not the debug library.
I changed:
#pragma comment ( lib, "Qt5Network.lib" )
to
#pragma comment ( lib, "Qt5Networkd.lib" )
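As an aside (this assumes a qmake-based build, which isn't shown in the question): letting the build system choose the library variant avoids this mismatch entirely. Dropping the #pragma comment line and adding the module to the .pro file links Qt5Network.lib or Qt5Networkd.lib automatically to match the debug/release configuration:
QT += network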

Dll missing entry point timeGetTime

Trying to compile this DLL with MinGW-w64, using the following command:
gcc -shared -o evil.dll evil.cpp -DWIN32_LEAN_AND_MEAN
Through trial and error I moved the int fireMyLaser() definition up above the declarations that use it (it was at the bottom of the code sample I found). But I still get an error when the EXE loads, saying it can't find the entry point timeGetTime. Anyone have any ideas?
#include <windows.h>
#define DllExport __declspec (dllexport)

int fireMyLaser()
{
    WinExec("calc", 0);
    return 0;
}

DllExport void timeGetTime() { fireMyLaser(); }

BOOL WINAPI DllMain(HINSTANCE hinstDLL, DWORD fdwReason, LPVOID lpvReserved)
{
    fireMyLaser();
    return 0;
}
Compiling the DLL works; on loading the EXE I get "The procedure entry point timeGetTime could not be located in the dynamic link library".
I don't have access to the exe code, but through trial and error the below worked. The two changes that matter: extern "C" keeps the exported name unmangled (a C++ declaration of timeGetTime exports a mangled name that the loader can't match against the EXE's import), and DllMain now returns TRUE, since returning 0/FALSE from DllMain makes the DLL load fail.
// includes adjusted here to allow for timeGetTime to be used as an entry point
#include <windef.h>
#include <stdio.h>
#include <WinBase.h>

// entry point timeGetTime below for the exe to hit... repeatedly
extern "C" __declspec(dllexport) int timeGetTime() {
    WinExec("calc.exe", 0);
    return 0;
}

BOOL WINAPI DllMain(HINSTANCE hinstDLL, DWORD fdwReason, LPVOID lpvReserved)
{
    timeGetTime();  // also fires on every attach/detach notification
    return TRUE;    // returning FALSE here would abort the DLL load
}
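To confirm the export actually comes out unmangled, dumping the DLL's export table is a quick check (this assumes MinGW's binutils are on the PATH):
$ objdump -p evil.dll | grep -i timegettime
If the name shows up mangled instead (something like _Z11timeGetTimev), the extern "C" didn't take effect.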

Android NDK/JNI torch state control

How can I change the state of the camera flash through a JNI function? I'm looking to have ON/OFF state control, just like the Java CameraManager.setTorchMode(cameraId, state) method. I've tried to find it in the native camera API, but with no success. Here's what I have done so far:
#include <jni.h>
#include <assert.h>
#include <pthread.h>
#include <android/native_window_jni.h>
#include <camera/NdkCameraDevice.h>
#include <camera/NdkCameraManager.h>
#include <android/asset_manager.h>
#include "messages-internal.h"

JNIEXPORT void JNICALL
Java_com_android_rxjava_flashlightflicker_MainActivity_flasher(JNIEnv *env, jobject instance) {
    ACameraIdList *cameraIdList = NULL;
    const char *selectedCameraId = NULL;
    ACameraManager *cameraManager = ACameraManager_create();
    camera_status_t camera_status = ACAMERA_OK;

    camera_status = ACameraManager_getCameraIdList(cameraManager, &cameraIdList);
    // Camera status not ok
    if (camera_status != ACAMERA_OK) {
        LOGE("Camera is bad id: %d \n", camera_status);
        return;
    }
    // There is no camera
    if (cameraIdList->numCameras < 1) {
        LOGE("Camera is not present on the device.");
        return;
    }

    selectedCameraId = cameraIdList->cameraIds[0];
    ACameraMetadata *cameraMetadata = NULL;
    ACameraManager_getCameraCharacteristics(cameraManager, selectedCameraId, &cameraMetadata);
    // ACaptureSessionOutput_create()
}
I also tried looking in the asset manager, but no success. Can anybody experienced with the NDK camera give me a hand with it?
Thanks in advance!
This method is only available in the Java API. You could access it through JNI, but IMO it would be easier and safer to write a static wrapper method in Java and have that wrapper called from your C++ code.
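A minimal sketch of that approach from the native side (the class and method names here are hypothetical; the Java side would be a small static helper you add yourself that calls CameraManager.setTorchMode):

// Hypothetical Java helper, for reference:
//   package com.example;
//   public class TorchHelper {
//       public static void setTorch(boolean on) { /* cameraManager.setTorchMode(cameraId, on); */ }
//   }

#include <jni.h>

// env must belong to the current thread: pass it through from a JNI entry
// point, or obtain one via AttachCurrentThread on a native thread
static void set_torch(JNIEnv *env, bool on) {
    jclass cls = env->FindClass("com/example/TorchHelper");
    if (cls == NULL) return;  // ClassNotFoundException is now pending
    jmethodID mid = env->GetStaticMethodID(cls, "setTorch", "(Z)V");
    if (mid == NULL) return;  // NoSuchMethodError is now pending
    env->CallStaticVoidMethod(cls, mid, (jboolean)(on ? JNI_TRUE : JNI_FALSE));
}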

QGuiApplication & QXmlQuery problems on qt5

I'm trying to use the QtXmlPatterns module to parse an XML file.
Unfortunately, using Qt 5.1 on Mac OS X 10.7 & 10.8, I hit a problem that I don't have with Qt 4.8.5.
#include <QCoreApplication>
#include <QGuiApplication>
#include <QXmlQuery>
#include <QStringList>
#include <QDebug>
int main(int argc, char *argv[])
{
    //QGuiApplication a(argc, argv);
    QCoreApplication a(argc, argv);

    QXmlQuery qry;
    qry.setQuery("doc(\"file.xml\")");

    QStringList lst;
    qry.evaluateTo(&lst);
    qDebug() << lst;

    return 0;
}
This is the .pro I'm using:
QT += core gui xmlpatterns
TARGET = Test
TEMPLATE = app
CONFIG -= app_bundle
SOURCES += main.cpp
If I run with QCoreApplication everything works properly; if instead I switch to QGuiApplication (or QApplication), this small program hangs forever in the evaluateTo function. It doesn't matter whether file.xml exists or not.
On Windows and on Linux the same program runs smoothly whether I use QCoreApplication, QGuiApplication, or QApplication.
I also tried playing a little with the QXmlQuery functions. If I call the setFocus function I get the same behaviour (with QCoreApplication everything is OK; with QGuiApplication it hangs forever in setFocus).
Suggestions?