Libcurl hangs on curl_easy_perform, or curl_multi_perform never decrements its second parameter (DLL)

I have an issue. There is an application (P) which creates an instance of a COM component (F), which in turn makes calls into a separate DLL (U):
P (app) --CoCreateInstance()--> F (COM .dll) --(call)--> U (MFC extension .dll) --(call)--> libcurl.dll
I created a small test console application:
#include "stdafx.h"
#include <iostream>
#include "curl/curl.h"
static std::string readBuffer;
static size_t WriteCallback(void *contents, size_t size, size_t nmemb, void *userp)
{
size_t realsize = size * nmemb;
readBuffer.append((const char*)contents, realsize);
return realsize;
}
int blocking_curl()
{
const std::string endPointUrl = "https://somecoolserver.com/resource";
const std::string urlparam = "param1=pValue1&param2=pValue2& param3=pValue3 ";
const std::string cookie = "";
const std::string httpHeadAccept = "application/xml";
const std::string httpContentType = "Content-Type: application/x-www-form-urlencoded";
const std::string AuthCertficate = "";
const std::string OAuthToken = "";
CURL *curl = NULL;
CURLcode res = CURLE_FAILED_INIT;
struct curl_slist *headers = NULL;
curl = curl_easy_init();
if(!curl)
{
return res;
}
curl_easy_setopt(curl, CURLOPT_CUSTOMREQUEST, "POST");
curl_easy_setopt(curl, CURLOPT_URL, endPointUrl.c_str());
curl_easy_setopt(curl, CURLOPT_POSTFIELDS, urlparam.c_str());
curl_easy_setopt(curl, CURLOPT_DEFAULT_PROTOCOL, "https");
headers = curl_slist_append(headers, httpHeadAccept.c_str());
headers = curl_slist_append(headers, httpContentType.c_str());
curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headers);
curl_easy_setopt(curl, CURLOPT_ERRORBUFFER, errbuf);
headers = curl_slist_append(headers, OAuthToken.c_str());
readBuffer.clear();
curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, WriteCallback);
if (!cookie.empty())
{
headers = curl_slist_append(headers, AuthCertficate.c_str());
std::string pCookie = "somestring=";
pCookie += cookie;
curl_easy_setopt(curl, CURLOPT_COOKIE, pCookie.c_str());
}
else
{
headers = curl_slist_append(headers, OAuthToken.c_str());
}
res = curl_easy_perform(curl);
curl_easy_cleanup(curl);
std::cout << "Response: " << readBuffer.c_str() << std::endl;
return res;
}
int _tmain(int argc, _TCHAR* argv[])
{
blocking_curl();
return 0;
}
This code works fine, and readBuffer contains exactly the result I expect.
However, if I copy this code directly into my U DLL (more exactly, replace the body of the called function with it), it hangs on curl_easy_perform. The libcurl version is 7.54.0. In the debugger I see the whole flow and call stack is correct, but in the U DLL it hangs on curl_easy_perform. I have no idea how that is even possible! In the debugger I see that libcurl.dll is loaded and its version (7.54.0) is correct. All parameters are hardcoded (URL, URL parameters and so on), yet in one case it works and in the other it doesn't! Additionally, I wrote another function using the multi interface, but it doesn't work either.
I have two ideas:
1. The COM component loads the U DLL, and the U DLL loads a libcurl.dll from somewhere else in the system. Searching the file system shows several versions of libcurl.dll: one shipped with McAfee antivirus and one in Microsoft Office. However, if I call curl_version() right before curl_easy_perform, I get the correct version. I can't be 100% sure this proves anything, because that string might be determined at compile time while a different libcurl version is actually loaded (one way to check is the loader query sketched below).
2. The COM component may have a different address space, and the U DLL actually can't find the library. I doubt this one, though, because all the other calls succeed, including curl_easy_init.
But what else could it be? If anybody has ideas, please share.
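One way to verify which libcurl.dll is actually mapped into the process, independent of what curl_version() reports, is to ask the Windows loader directly; a minimal sketch:

#include <windows.h>
#include <iostream>

// Print the full path of the libcurl.dll instance mapped into this process.
void print_loaded_libcurl_path()
{
    HMODULE h = GetModuleHandleA("libcurl.dll");
    if (h == NULL)
    {
        std::cout << "libcurl.dll is not loaded in this process" << std::endl;
        return;
    }
    char path[MAX_PATH] = {0};
    if (GetModuleFileNameA(h, path, MAX_PATH) > 0)
        std::cout << "libcurl.dll loaded from: " << path << std::endl;
}

Calling this from the U DLL right before curl_easy_perform shows the exact path of the library the process is using.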

The issue was caused by calling the function of DLL (U) containing the libcurl code from the InitInstance method of DLL F (the COM component). InitInstance of an MFC DLL runs in the context of DllMain, i.e. under the Windows loader lock, where loading libraries or synchronizing with other threads can deadlock. After moving all libcurl routines into the other, regular worker methods, everything worked fine.
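The underlying rule of thumb is to keep DllMain/InitInstance trivial and defer any libcurl work to an explicit function that the host calls after the DLL is loaded; a minimal sketch of that pattern (the exported function names are hypothetical):

// In the U DLL: do NOT call these from DllMain/InitInstance.
// The host (or the COM object's regular methods) calls them after loading.
#include "curl/curl.h"

static bool g_curlReady = false;

extern "C" __declspec(dllexport) bool InitNetworking()
{
    // Safe here: we are outside the loader lock.
    g_curlReady = (curl_global_init(CURL_GLOBAL_DEFAULT) == CURLE_OK);
    return g_curlReady;
}

extern "C" __declspec(dllexport) void ShutdownNetworking()
{
    if (g_curlReady)
    {
        curl_global_cleanup();
        g_curlReady = false;
    }
}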

Related

C++ Builder Function error [bcc32 - Ambiguity error] inside dll file

I am creating a currency converter Win32 program in Embarcadero C++Builder. I wrote a function for transforming a date from the format specified on the user's PC to the YYYY-MM-DD format. I need that part because of the API settings.
When I have this function inside my project it works fine, but I need to have it inside a DLL.
This is what my code looks like:
#pragma hdrstop
#pragma argsused

#include <SysUtils.hpp>

extern DELPHI_PACKAGE void __fastcall DecodeDate(const System::TDateTime DateTime, System::Word &Year, System::Word &Month, System::Word &Day);

extern "C" UnicodeString __declspec(dllexport) __stdcall datum(TDateTime dat)
{
    Word dan, mjesec, godina;
    UnicodeString datum, datum_dan, datum_mjesec, datum_godina;
    DecodeDate(dat, godina, mjesec, dan);
    if (dan <= 9 && mjesec <= 9) {
        datum_dan = "0" + IntToStr(dan);
        datum_mjesec = "0" + IntToStr(mjesec);
    }
    if (dan <= 9 && mjesec > 9) {
        datum_dan = "0" + IntToStr(dan);
        datum_mjesec = IntToStr(mjesec);
    }
    if (dan > 9 && mjesec <= 9) {
        datum_dan = IntToStr(dan);
        datum_mjesec = "0" + IntToStr(mjesec);
    }
    if (dan > 9 && mjesec > 9) {
        datum_dan = IntToStr(dan);
        datum_mjesec = IntToStr(mjesec);
    }
    datum_godina = IntToStr(godina);
    return datum_godina + "-" + datum_mjesec + "-" + datum_dan;
}

extern "C" int _libmain(unsigned long reason)
{
    return 1;
}
I've included SysUtils.hpp and declared the DecodeDate() function; without those lines I get a million errors. But with the code looking like this, I get the following error, which I can't get rid of:
[bcc32 Error] File1.cpp(30): E2015 Ambiguity between '_fastcall System::Sysutils::DecodeDate(const System::TDateTime,unsigned short &,unsigned short &,unsigned short &) at c:\program files (x86)\embarcadero\studio\19.0\include\windows\rtl\System.SysUtils.hpp:3466' and '_fastcall DecodeDate(const System::TDateTime,unsigned short &,unsigned short &,unsigned short &) at File1.cpp:25'
Full parser context
File1.cpp(27): parsing: System::UnicodeString __stdcall datum(System::TDateTime)
Can you help me to get rid of that error?
The error message is self-explanatory. You have two functions with the same name in scope, and the compiler doesn't know which one you want to use on line 30 because the parameters you are passing in satisfy both function declarations.
To fix the error, you can change this line:
DecodeDate(dat, godina, mjesec, dan);
To either this:
System::Sysutils::DecodeDate(dat, godina, mjesec, dan);
Or this:
dat.DecodeDate(&godina, &mjesec, &dan);
However, either way, you should get rid of your extern declaration for DecodeDate(), as it doesn't belong in this code at all. You are not implementing DecodeDate() yourself, you are just using the one provided by the RTL. There is already a declaration for DecodeDate() in SysUtils.hpp, which you are #include'ing in your code. That is all the compiler needs.
Just make sure you are linking to the RTL/VCL libraries to resolve the function during the linker stage after compiling. You should have enabled VCL support when you created the DLL project. If you didn't, recreate your project and enable it.
BTW, there is a MUCH easier way to implement your function logic - instead of manually pulling apart the TDateTime and reconstituting its components, just use the SysUtils::FormatDateTime() function or the TDateTime::FormatString() method, eg:
UnicodeString __stdcall datum(TDateTime dat)
{
return FormatDateTime(_D("yyyy'-'mm'-'dd"), dat);
}
or:
UnicodeString __stdcall datum(TDateTime dat)
{
return dat.FormatString(_D("yyyy'-'mm'-'dd"));
}
That being said, this code is still wrong, because it is not safe to pass non-POD types, like UnicodeString, over the DLL boundary like you are doing. You need to re-think your DLL function design to use only interop-safe POD types. In this case, change your function to either:
take a wchar_t* as input from the caller, and just fill in the memory block with the desired characters. Let the caller allocate the actual buffer and pass it in to your DLL for populating:
#pragma hdrstop
#pragma argsused

#include <SysUtils.hpp>

extern "C" __declspec(dllexport) int __stdcall datum(double dat, wchar_t *buffer, int buflen)
{
    UnicodeString s = FormatDateTime(_D("yyyy'-'mm'-'dd"), dat);
    if (!buffer) return s.Length() + 1;
    StrLCopy(buffer, s.c_str(), buflen - 1);
    return StrLen(buffer);
}

extern "C" int _libmain(unsigned long reason)
{
    return 1;
}
// Option 1: caller uses a fixed-size buffer:
wchar_t buffer[12] = {};
datum(SomeDateValueHere, buffer, 12);
// use buffer as needed...

// Option 2: caller queries the required length first:
int len = datum(SomeDateValueHere, NULL, 0);
wchar_t *buffer = new wchar_t[len];
len = datum(SomeDateValueHere, buffer, len);
// use buffer as needed...
delete[] buffer;
allocate a wchar_t[] buffer to hold the desired characters, and then return a wchar_t* pointer to that buffer to the caller. Then export a second function that the caller can use to pass the returned wchar_t* back to you so you can free it correctly:
#pragma hdrstop
#pragma argsused

#include <SysUtils.hpp>

extern "C" __declspec(dllexport) wchar_t* __stdcall datum(double dat)
{
    UnicodeString s = FormatDateTime("yyyy'-'mm'-'dd", dat);
    wchar_t* buffer = new wchar_t[s.Length() + 1];
    StrLCopy(buffer, s.c_str(), s.Length());
    return buffer;
}

extern "C" __declspec(dllexport) void __stdcall free_datum(wchar_t *dat)
{
    delete[] dat;
}

extern "C" int _libmain(unsigned long reason)
{
    return 1;
}
wchar_t *buffer = datum(SomeDateValueHere);
// use buffer as needed...
free_datum(buffer);

How to use http websocket on Mongoose embedded web server with SSL?

I'm trying to use an HTTP WebSocket on the Mongoose embedded web server with SSL.
I tried the Mongoose example called "simplest_web_server_ssl".
But when I executed the program, it printed the message below:
"Failed to create listener: Invalid SSL cert"
I think it's because the program doesn't know where the "server.pem" file is.
I put the "server.pem" and "server.key" files from the example folder into the "release" folder where the .exe file is created and runs.
Actually, I'm quite new to Mongoose and SSL.
Could anybody help me?
Thanks, regards.
/*
 * Copyright (c) 2016 Cesanta Software Limited
 * All rights reserved
 */

/*
 * This example starts an SSL web server on https://localhost:8443/
 *
 * Please note that the certificate used is a self-signed one and will not be
 * recognised as valid. You should expect an SSL error and will need to
 * explicitly allow the browser to proceed.
 */
#include "mongoose.h"

static const char *s_http_port = "8443";
static const char *s_ssl_cert = "server.pem";
static const char *s_ssl_key = "server.key";
static struct mg_serve_http_opts s_http_server_opts;

static void ev_handler(struct mg_connection *nc, int ev, void *p) {
    if (ev == MG_EV_HTTP_REQUEST) {
        mg_serve_http(nc, (struct http_message *) p, s_http_server_opts);
    }
}

int main(void) {
    struct mg_mgr mgr;
    struct mg_connection *nc;
    struct mg_bind_opts bind_opts;
    const char *err;

    mg_mgr_init(&mgr, NULL);

    memset(&bind_opts, 0, sizeof(bind_opts));
    bind_opts.ssl_cert = s_ssl_cert;
    bind_opts.ssl_key = s_ssl_key;
    bind_opts.error_string = &err;

    printf("Starting SSL server on port %s, cert from %s, key from %s\n",
           s_http_port, bind_opts.ssl_cert, bind_opts.ssl_key);
    nc = mg_bind_opt(&mgr, s_http_port, ev_handler, bind_opts);
    if (nc == NULL) {
        printf("Failed to create listener: %s\n", err);
        return 1;
    }

    // Set up HTTP server parameters
    mg_set_protocol_http_websocket(nc);
    s_http_server_opts.document_root = ".";  // Serve current directory
    s_http_server_opts.enable_directory_listing = "yes";

    for (;;) {
        mg_mgr_poll(&mgr, 1000);
    }
    mg_mgr_free(&mgr);

    return 0;
}
I've had the same issue. You can step into Mongoose using gdb or a similar tool to find the actual reason for that error message. If you think Mongoose isn't finding your files, try using absolute paths. If it does find the files, you might need to regenerate (and/or register) the cert files on your computer.
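If you suspect the working directory, a quick sanity check is to verify from code that the process can actually open the cert files before calling mg_bind_opt; a minimal sketch (plain C, no Mongoose APIs involved):

#include <stdio.h>

/* Returns 1 if the file can be opened relative to the current working directory. */
static int file_readable(const char *path) {
    FILE *f = fopen(path, "rb");
    if (f == NULL) return 0;
    fclose(f);
    return 1;
}

/* In main(), before mg_bind_opt(): */
if (!file_readable(s_ssl_cert) || !file_readable(s_ssl_key)) {
    printf("cert/key not found relative to the working directory; "
           "try absolute paths for s_ssl_cert and s_ssl_key\n");
}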

How to load a graph with tensorflow.so and c_api.h in c++ language?

I am not able to find any examples of how to load a graph with tensorflow.so and c_api.h in C++. I read c_api.h; however, the ReadBinaryProto function is not in it. How can I load a graph without the ReadBinaryProto function?
If you're using C++, you might want to use the C++ API instead. The label_image example would probably be a good sample to help you get started.
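For reference, the C++ API route is roughly what label_image does: ReadBinaryProto loads the GraphDef and a Session runs it. A minimal sketch (the .pb path and the commented-out node names are placeholders):

#include <memory>
#include "tensorflow/core/public/session.h"
#include "tensorflow/core/platform/env.h"

int main() {
    tensorflow::GraphDef graph_def;
    // Load the serialized GraphDef from disk (path is a placeholder).
    tensorflow::Status s = tensorflow::ReadBinaryProto(
        tensorflow::Env::Default(), "tensorflow_inception_graph.pb", &graph_def);
    if (!s.ok()) { /* handle error */ return 1; }

    std::unique_ptr<tensorflow::Session> session(
        tensorflow::NewSession(tensorflow::SessionOptions()));
    s = session->Create(graph_def);
    if (!s.ok()) { /* handle error */ return 1; }

    // Feed and fetch by node name, e.g.:
    // session->Run({{"input", input_tensor}}, {"output"}, {}, &outputs);
    return 0;
}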
If you really want to use just the C API, use TF_GraphImportGraphDef to load a graph. Note that the C API isn't particularly convenient to use (it is intended for building bindings in other languages such as Go, Java, Rust, Haskell, etc.). For example:
#include <stdio.h>
#include <stdlib.h>
#include <tensorflow/c/c_api.h>

TF_Buffer* read_file(const char* file);

void free_buffer(void* data, size_t length) {
    free(data);
}

int main() {
    // Graph definition from unzipped https://storage.googleapis.com/download.tensorflow.org/models/inception5h.zip
    // which is used in the Go, Java and Android examples
    TF_Buffer* graph_def = read_file("tensorflow_inception_graph.pb");
    TF_Graph* graph = TF_NewGraph();

    // Import graph_def into graph
    TF_Status* status = TF_NewStatus();
    TF_ImportGraphDefOptions* opts = TF_NewImportGraphDefOptions();
    TF_GraphImportGraphDef(graph, graph_def, opts, status);
    TF_DeleteImportGraphDefOptions(opts);
    if (TF_GetCode(status) != TF_OK) {
        fprintf(stderr, "ERROR: Unable to import graph %s", TF_Message(status));
        return 1;
    }
    fprintf(stdout, "Successfully imported graph");
    TF_DeleteStatus(status);
    TF_DeleteBuffer(graph_def);

    // Use the graph
    TF_DeleteGraph(graph);
    return 0;
}

TF_Buffer* read_file(const char* file) {
    FILE *f = fopen(file, "rb");
    fseek(f, 0, SEEK_END);
    long fsize = ftell(f);
    fseek(f, 0, SEEK_SET);  // same as rewind(f);

    void* data = malloc(fsize);
    fread(data, fsize, 1, f);
    fclose(f);

    TF_Buffer* buf = TF_NewBuffer();
    buf->data = data;
    buf->length = fsize;
    buf->data_deallocator = free_buffer;
    return buf;
}
The previous answer is your main option if you want to use TensorFlow outside of the TensorFlow project (and consequently not build with Bazel). You need to load the graph through c_api.h with TF_GraphImportGraphDef. I recommend training and testing in Python, and then exporting the model/graph for use with the C++/C API when you have finished.
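Once the graph is imported, executing it goes through TF_NewSession and TF_SessionRun. A minimal sketch continuing from the example above (the operation names "input" and "output" are placeholders for whatever your exported graph actually contains, and the input tensor still has to be built with TF_NewTensor):

// Continuing from the import example above: run one step of the graph.
TF_SessionOptions* sess_opts = TF_NewSessionOptions();
TF_Session* session = TF_NewSession(graph, sess_opts, status);
TF_DeleteSessionOptions(sess_opts);
if (TF_GetCode(status) != TF_OK) { /* handle error */ }

// Look up input/output operations by name ("input"/"output" are placeholders).
TF_Output input  = { TF_GraphOperationByName(graph, "input"),  0 };
TF_Output output = { TF_GraphOperationByName(graph, "output"), 0 };

TF_Tensor* in_tensor = NULL;  /* build with TF_NewTensor(...) to match the input shape */
TF_Tensor* out_tensor = NULL;

TF_SessionRun(session,
              NULL,                     /* run options */
              &input, &in_tensor, 1,    /* inputs */
              &output, &out_tensor, 1,  /* outputs */
              NULL, 0,                  /* target operations */
              NULL,                     /* run metadata */
              status);

TF_CloseSession(session, status);
TF_DeleteSession(session, status);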

Webm (VP8 / Opus) file read and write back

I am trying to develop a WebRTC simulator in C/C++. For media handling, I plan to use libav. I am thinking of the steps below to realize media exchange between two WebRTC simulators, A and B:
1. Read media at A from an input webm file using the av_read_frame API. I assume I will get the encoded media (audio/video) data; am I correct here?
2. Send the encoded media data to simulator B over a UDP socket.
3. Simulator B receives the media data on the UDP socket as RTP packets.
4. Simulator B extracts the audio/video data from the just-received RTP packet.
I assume the extracted media data at simulator B is the encoded data only (am I correct here?). I do not want to decode it. I want to write it to a file; later I will play the file to check whether I have done everything right.
To simplify the problem, let's take the UDP socket part out. Then my question reduces to: read data from a webm input file, get the encoded media, prepare packets, and write them to an output file using av_interleaved_write_frame or any other appropriate API. All of this I want to do using libav.
Is there any example code I can refer to? Or can somebody please guide me in developing it?
I am trying with a test program. As a first step, my aim is to read from a file and write to an output file. I have the code below, but it is not working properly.
//#define _AUDIO_WRITE_ENABLED_

#include "libavutil/imgutils.h"
#include "libavutil/samplefmt.h"
#include "libavformat/avformat.h"

static AVPacket pkt;
static AVFormatContext *fmt_ctx = NULL;
static AVFormatContext *av_format_context = NULL;
static AVOutputFormat *av_output_format = NULL;
static AVCodec *video_codec = NULL;
static AVStream *video_stream = NULL;
static AVCodec *audio_codec = NULL;
static AVStream *audio_stream = NULL;
static const char *src_filename = NULL;
static const char *dst_filename = NULL;

int main (int argc, char **argv)
{
    int ret = 0;
    int index = 0;

    if (argc != 3)
    {
        printf("Usage: ./webm input_video_file output_video_file \n");
        exit(0);
    }

    src_filename = argv[1];
    dst_filename = argv[2];
    printf("Source file = %s , Destination file = %s\n", src_filename, dst_filename);

    av_register_all();

    /* open input file, and allocate format context */
    if (avformat_open_input(&fmt_ctx, src_filename, NULL, NULL) < 0)
    {
        fprintf(stderr, "Could not open source file %s\n", src_filename);
        exit(1);
    }

    /* retrieve stream information */
    if (avformat_find_stream_info(fmt_ctx, NULL) < 0)
    {
        fprintf(stderr, "Could not find stream information\n");
        exit(2);
    }

    av_output_format = av_guess_format(NULL, dst_filename, NULL);
    if (!av_output_format)
    {
        fprintf(stderr, "Could not guess output file format\n");
        exit(3);
    }
    av_output_format->audio_codec = AV_CODEC_ID_VORBIS;
    av_output_format->video_codec = AV_CODEC_ID_VP8;

    av_format_context = avformat_alloc_context();
    if (!av_format_context)
    {
        fprintf(stderr, "Could not allocate av format context\n");
        exit(4);
    }
    av_format_context->oformat = av_output_format;
    strcpy(av_format_context->filename, dst_filename);

    video_codec = avcodec_find_encoder(av_output_format->video_codec);
    if (!video_codec)
    {
        fprintf(stderr, "Codec not found\n");
        exit(5);
    }

    video_stream = avformat_new_stream(av_format_context, video_codec);
    if (!video_stream)
    {
        fprintf(stderr, "Could not alloc stream\n");
        exit(6);
    }
    avcodec_get_context_defaults3(video_stream->codec, video_codec);
    video_stream->codec->codec_id = AV_CODEC_ID_VP8;
    video_stream->codec->codec_type = AVMEDIA_TYPE_VIDEO;
    video_stream->time_base = (AVRational) {1, 30};
    video_stream->codec->width = 640;
    video_stream->codec->height = 480;
    video_stream->codec->pix_fmt = PIX_FMT_YUV420P;
    video_stream->codec->flags |= CODEC_FLAG_GLOBAL_HEADER;
    video_stream->codec->bit_rate = 400000;
    video_stream->codec->gop_size = 10;
    video_stream->codec->max_b_frames = 1;

#ifdef _AUDIO_WRITE_ENABLED_
    audio_codec = avcodec_find_encoder(av_output_format->audio_codec);
    if (!audio_codec)
    {
        fprintf(stderr, "Codec not found audio codec\n");
        exit(5);
    }
    audio_stream = avformat_new_stream(av_format_context, audio_codec);
    if (!audio_stream)
    {
        fprintf(stderr, "Could not alloc stream for audio\n");
        exit(6);
    }
    avcodec_get_context_defaults3(audio_stream->codec, audio_codec);
    audio_stream->codec->codec_id = AV_CODEC_ID_VORBIS;
    audio_stream->codec->codec_type = AVMEDIA_TYPE_AUDIO;
    audio_stream->time_base = (AVRational) {1, 30};
    audio_stream->codec->sample_rate = 8000;
    audio_stream->codec->flags |= CODEC_FLAG_GLOBAL_HEADER;
#endif

    if (!(av_output_format->flags & AVFMT_NOFILE))
    {
        if (avio_open(&av_format_context->pb, dst_filename, AVIO_FLAG_WRITE) < 0)
        {
            fprintf(stderr, "Could not open '%s'\n", dst_filename);
        }
    }

    /* Before avformat_write_header set the stream */
    avformat_write_header(av_format_context, NULL);

    /* initialize packet, set data to NULL, let the demuxer fill it */
    av_init_packet(&pkt);
    pkt.data = NULL;
    pkt.size = 0;
    pkt.stream_index = video_stream->index;

    ret = av_read_frame(fmt_ctx, &pkt);
    while (ret >= 0)
    {
        index++;
        //pkt.stream_index = video_avstream->index;
        if (pkt.stream_index == video_stream->index)
        {
            printf("Video: Read cycle %d, bytes read = %d, pkt stream index=%d\n", index, pkt.size, pkt.stream_index);
            av_write_frame(av_format_context, &pkt);
        }
#ifdef _AUDIO_WRITE_ENABLED_
        else if (pkt.stream_index == audio_stream->index)
        {
            printf("Audio: Read cycle %d, bytes read = %d, pkt stream index=%d\n", index, pkt.size, pkt.stream_index);
            av_write_frame(av_format_context, &pkt);
        }
#endif
        av_free_packet(&pkt);
        ret = av_read_frame(fmt_ctx, &pkt);
    }

    av_write_trailer(av_format_context);

    /** Exit procedure starts */
    avformat_close_input(&fmt_ctx);
    avformat_free_context(av_format_context);
    return 0;
}
When I execute this program, it outputs "Codec not found". Not sure what's going wrong; can somebody help please?
The "Codec not found" issue is resolved by building libvpx 1.4 separately. I am still struggling to read from the source file and write to a destination file.
EDIT 1: After code modification, I am able to write only the video stuff to a file, though some more errors are still present.
EDIT 2: With the modified code (2nd round), I see that video frames are written properly. For audio frames I added the code under a macro _AUDIO_WRITE_ENABLED_, but if I enable this macro the program crashes. Can somebody tell me what's wrong in the audio write part (the code under the macro _AUDIO_WRITE_ENABLED_)?
I am not fully answering your question, but I hope we will get to the final solution eventually. When I tried to run your code, I got the error "time base not set".
The time base and other header specs are part of the codec context. This is how I specify them for writing to a file (vStream is an AVStream*):
#if LIBAVCODEC_VER_AT_LEAST(53, 21)
    avcodec_get_context_defaults3(rc->vStream->codec, AVMEDIA_TYPE_VIDEO);
#else
    avcodec_get_context_defaults2(rc->vStream->codec, AVMEDIA_TYPE_VIDEO);
#endif
#if LIBAVCODEC_VER_AT_LEAST(54, 25)
    vStream->codec->codec_id = AV_CODEC_ID_VP8;
#else
    vStream->codec->codec_id = CODEC_ID_VP8;
#endif
    vStream->codec->codec_type = AVMEDIA_TYPE_VIDEO;
    vStream->codec->time_base = (AVRational) {1, 30};
    vStream->codec->width = 640;
    vStream->codec->height = 480;
    vStream->codec->pix_fmt = PIX_FMT_YUV420P;
EDIT: I ran your program in Valgrind and it segfaults in av_write_frame. It looks like the time_base and other specs for the output are not set properly.
Add the specs before avformat_write_header(), before it is too late.
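Since the stated goal is to write the encoded packets back out without decoding them, a plain stream copy (remux) sidesteps encoder setup entirely. A minimal sketch against the same era of the libav API as the question's code (error handling trimmed; a robust version must also handle AV_NOPTS_VALUE timestamps):

#include "libavformat/avformat.h"

/* Remux src -> dst without touching the encoded data. */
int remux(const char *src, const char *dst)
{
    AVFormatContext *in_ctx = NULL, *out_ctx = NULL;
    AVPacket pkt;
    unsigned int i;

    av_register_all();
    if (avformat_open_input(&in_ctx, src, NULL, NULL) < 0) return -1;
    if (avformat_find_stream_info(in_ctx, NULL) < 0) return -1;

    avformat_alloc_output_context2(&out_ctx, NULL, NULL, dst);
    if (!out_ctx) return -1;

    /* Mirror every input stream and copy its codec parameters:
     * no encoder is ever opened. */
    for (i = 0; i < in_ctx->nb_streams; i++)
    {
        AVStream *in_st = in_ctx->streams[i];
        AVStream *out_st = avformat_new_stream(out_ctx, NULL);
        if (!out_st) return -1;
        avcodec_copy_context(out_st->codec, in_st->codec);
        out_st->codec->codec_tag = 0;
        out_st->time_base = in_st->time_base;
    }

    if (!(out_ctx->oformat->flags & AVFMT_NOFILE))
        if (avio_open(&out_ctx->pb, dst, AVIO_FLAG_WRITE) < 0) return -1;

    if (avformat_write_header(out_ctx, NULL) < 0) return -1;

    while (av_read_frame(in_ctx, &pkt) >= 0)
    {
        AVStream *in_st = in_ctx->streams[pkt.stream_index];
        AVStream *out_st = out_ctx->streams[pkt.stream_index];
        /* Rescale timestamps from the input to the output stream time base. */
        pkt.pts = av_rescale_q(pkt.pts, in_st->time_base, out_st->time_base);
        pkt.dts = av_rescale_q(pkt.dts, in_st->time_base, out_st->time_base);
        pkt.duration = av_rescale_q(pkt.duration, in_st->time_base, out_st->time_base);
        av_interleaved_write_frame(out_ctx, &pkt);
        av_free_packet(&pkt);
    }

    av_write_trailer(out_ctx);
    avformat_close_input(&in_ctx);
    if (!(out_ctx->oformat->flags & AVFMT_NOFILE))
        avio_close(out_ctx->pb);
    avformat_free_context(out_ctx);
    return 0;
}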

transform javascript to opcode using spidermonkey

I am new to SpiderMonkey and want to use it to transform a JavaScript file into a sequence of bytecodes.
I got SpiderMonkey and built it in debug mode.
I want to use the JS_CompileScript function in jsapi.h to compile JavaScript code and analyze it to get the bytecode, but when I compile the code below and run it, I get a runtime error:
"Unhandled exception at 0x0f55c020 (mozjs185-1.0.dll) in spiderMonkeyTest.exe: 0xC0000005: Access violation reading location 0x00000d4c."
I cannot resolve it. Can anybody help me resolve this, or suggest another way to get bytecode from JavaScript code using SpiderMonkey?
// spiderMonkeyTest.cpp : Defines the entry point for the console application.
//
#define XP_WIN
#include <iostream>
#include <fstream>
#include "stdafx.h"
#include "jsapi.h"
#include "jsanalyze.h"

using namespace std;
using namespace js;

static JSClass global_class = {
    "global",
    JSCLASS_NEW_RESOLVE | JSCLASS_GLOBAL_FLAGS,
    JS_PropertyStub,
    NULL,
    JS_PropertyStub,
    JS_StrictPropertyStub,
    JS_EnumerateStub,
    JS_ResolveStub,
    JS_ConvertStub,
    NULL,
    JSCLASS_NO_OPTIONAL_MEMBERS
};

int _tmain(int argc, _TCHAR* argv[]) {
    /* Create a JS runtime. */
    JSRuntime *rt = JS_NewRuntime(16L * 1024L * 1024L);
    if (rt == NULL)
        return 1;

    /* Create a context. */
    JSContext *cx = JS_NewContext(rt, 8192);
    if (cx == NULL)
        return 1;
    JS_SetOptions(cx, JSOPTION_VAROBJFIX);

    JSScript *script;
    JSObject *obj;
    const char *js = "function a() { var tmp; tmp = 1 + 2; temp = temp * 2; alert(tmp); return 1; }";
    obj = JS_CompileScript(cx, JS_GetGlobalObject(cx), js, strlen(js), "code.js", NULL);
    script = obj->getScript();
    if (script == NULL)
        return JS_FALSE; /* compilation error */

    js::analyze::Script *sc = new js::analyze::Script();
    sc->analyze(cx, script);

    JS_DestroyContext(cx);
    JS_DestroyRuntime(rt);
    /* Shut down the JS engine. */
    JS_ShutDown();
    return 1;
}
Which version of SpiderMonkey are you using? I am using the one that comes with Firefox 10, so the API may be different.
You should create a new global object and initialize it by calling JS_NewCompartmentAndGlobalObject() and JS_InitStandardClasses() before compiling your script:
.....
/*
 * Create the global object in a new compartment.
 * You always need a global object per context.
 */
global = JS_NewCompartmentAndGlobalObject(cx, &global_class, NULL);
if (global == NULL)
    return 1;

/*
 * Populate the global object with the standard JavaScript
 * function and object classes, such as Object, Array, Date.
 */
if (!JS_InitStandardClasses(cx, global))
    return 1;
......
Note that the function JS_NewCompartmentAndGlobalObject() is obsolete now; check the latest JSAPI documentation for the version you are using. Your JS_CompileScript() call just tries to retrieve a global object which has not been created, and that is probably what causes the exception.
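Putting that together with the question's code: create and initialize the global first, then pass it to JS_CompileScript instead of JS_GetGlobalObject(cx). A sketch against the SpiderMonkey 1.8.5 JSAPI that the question appears to use:

/* After JS_NewContext()/JS_SetOptions(): */
JSObject *global = JS_NewCompartmentAndGlobalObject(cx, &global_class, NULL);
if (global == NULL || !JS_InitStandardClasses(cx, global))
    return 1;

/* Compile against the new global; JS_GetGlobalObject(cx) returns NULL
 * before any global has been installed, which explains the crash. */
JSObject *obj = JS_CompileScript(cx, global, js, strlen(js), "code.js", 1);
if (obj == NULL)
    return 1; /* compilation error */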
How about using the function "SaveCompiled"? It will save the object/opcode (compiled JavaScript) to a file.