Correct usage of SetRootType in FlatBuffers

I'm trying to set a new custom root before parsing JSON into a structure via FlatBuffers.
The corresponding .fbs already has a root_type, and I want to override it just once so that I can parse the JSON into a different struct.
The call SetRootType("NonRootStructInFbsT") fails.
The documentation of the API says this can be used to override the current root, which is exactly what I want to do.
std::string schemaText;
std::string schemaFile("MySchema.fbs");
if(not flatbuffers::FileExists(schemaFile.c_str())) {
    error("Schema file inaccessible: ", schemaFile);
    return nullptr;
}
if(not flatbuffers::LoadFile(schemaFile.c_str(), false, &schemaText)) {
    error(TAG, "Failed to load schema file: ", schemaFile);
    return nullptr;
}
info("Read schema file: ", schemaText.size(), schemaText);
flatbuffers::Parser parser;
if(not parser.SetRootType("NonRootStructInFbsT")) {
    error("Unable to set root type: ", customRoot);
    return nullptr;
}
info("Set the root type: ", customRoot);
I always get the error message
Unable to set root type: NonRootStructInFbsT

The root of a FlatBuffer can only be a table, so root_type and SetRootType will reject the name of anything else, such as a struct or a union.
Furthermore, the fact that the name ends in T suggests it refers to an "object API" type. Those names exist purely in the generated code; you need to supply names exactly as they appear in the schema.

The posted code never calls flatbuffers::Parser::Parse with the schema contents. Your code:
std::string schemaText;
std::string schemaFile("MySchema.fbs");
if(not flatbuffers::FileExists(schemaFile.c_str())) {
    error("Schema file inaccessible: ", schemaFile);
    return nullptr;
}
if(not flatbuffers::LoadFile(schemaFile.c_str(), false, &schemaText)) {
    error(TAG, "Failed to load schema file: ", schemaFile);
    return nullptr;
}
info("Read schema file: ", schemaText.size(), schemaText);
flatbuffers::Parser parser;
This is all fine, but at this point we only have the contents of the schema file in schemaText and an empty parser with default options. It is too early to set a root type. We first need to feed the schema into the parser with something like:
if(not parser.Parse(schemaText.c_str())) {
    error(TAG, "Defective schema: ", parser.error_);
    return nullptr;
}
If we reach this point, the parser holds the schema, and within that schema we can now also choose the root type:
if(not parser.SetRootType("NonRootStructInFbsT")) {
    error("Unable to set root type: ", customRoot);
    return nullptr;
}
info("Set the root type: ", customRoot);
Note that parser.error_ is quite informative about any errors you encounter while using the parser.
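Putting it all together, a minimal sketch (assuming a hypothetical table named NonRootTableInFbs declared in MySchema.fbs and a data.json file holding the JSON; both names are placeholders, not from the question):
#include "flatbuffers/idl.h"
#include "flatbuffers/util.h"

std::string schemaText, jsonText;
// Load schema and JSON text from disk (error handling trimmed for brevity).
flatbuffers::LoadFile("MySchema.fbs", false, &schemaText);
flatbuffers::LoadFile("data.json", false, &jsonText);

flatbuffers::Parser parser;
// 1. Parse the schema so the parser knows all declared types.
if (not parser.Parse(schemaText.c_str())) { /* inspect parser.error_ */ }
// 2. Override root_type; the name must be a table, spelled as in the schema.
if (not parser.SetRootType("NonRootTableInFbs")) { /* handle error */ }
// 3. Parse the JSON against the chosen root; the binary ends up in parser.builder_.
if (not parser.Parse(jsonText.c_str())) { /* inspect parser.error_ */ }
const uint8_t *buffer = parser.builder_.GetBufferPointer();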

Related

How should I bundle a library of text files with my module?

I have the following structure in the resources directory in a module I'm building:
resources
|-- examples
    |-- Arrays
    |   |-- file
    |-- Lists
        |-- file1
        |-- file2
I have the following code to collect and process these files:
use v6.d;
unit module Doc::Examples::Resources;
class Resource {
    has Str $.name;
    has Resource @.resources;
    has Resource %.resource-index;

    method resource-names() {
        @.resources>>.name.sort
    }

    method list-resources() {
        self.resource-names>>.say;
    }

    method is-resource(Str:D $lesson) {
        $lesson ~~ any self.resource-names;
    }

    method get-resource(Str:D $lesson) {
        if !self.is-resource($lesson) {
            say "Sorry, that lesson does not exist.";
            return;
        }
        return %.resource-index{$lesson};
    }
}

class Lesson is Resource {
    use Doc::Parser;
    use Doc::Subroutines;
    has IO $.file;

    method new(IO:D :$file) {
        my $name = $file.basename;
        self.bless(:$name, :$file)
    }

    method parse() {
        my @parsed = parse-file $.file.path;
        die "Failed parse examples from $.file" if @parsed.^name eq 'Any';
        for @parsed -> $section {
            my $heading = $section<meta>[0] || '';
            my $intro = $section<meta>[1] || '';
            say $heading.uc ~ "\n" if $heading && !$intro;
            say $heading.uc if $heading && $intro;
            say $intro ~ "\n" if $intro;
            for $section<code>.Array {
                die "Failed parse examples from $.file, check it's syntax." if $_.^name eq 'Any';
                das |$_>>.trim;
            }
        }
    }
}

class Topic is Resource {
    method new(IO:D :$dir) {
        my $files = dir $?DISTRIBUTION.content("$dir");
        my @lessons;
        my $name = $dir.basename;
        my %lesson-index;
        for $files.Array -> $file {
            my $lesson = Lesson.new(:$file);
            push @lessons, $lesson;
            %lesson-index{$lesson.name} = $lesson;
        }
        self.bless(:$name, resources => @lessons, resource-index => %lesson-index);
    }
}

class LocalResources is Resource is export {
    method new() {
        my $dirs = dir $?DISTRIBUTION.content('resources/examples');
        my @resources;
        my %resource-index;
        for $dirs.Array -> $dir {
            my $t = Topic.new(:$dir);
            push @resources, $t;
            %resource-index{$t.name} = $t;
        }
        self.bless(:@resources, :%resource-index)
    }

    method list-lessons(Str:D $topic) {
        self.get-resource($topic).list-lessons;
    }

    method parse-lesson(Str:D $topic, Str:D $lesson) {
        self.get-resource($topic).get-resource($lesson).parse;
    }
}
It works. However, I'm told that this is not reliable and that there is no guarantee that lines like my $files = dir $?DISTRIBUTION.content("$dir"); will work after the module is installed, or will continue to work in the future.
So what are better options for bundling a library of text files with my module, so that they can be found and accessed by the module?
Files under the resources directory will always be available as keys to the %?RESOURCES compile-time variable if you declare them in the META6.json file this way:
"resources": [
"examples/Array/file",
]
and so on.
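For example, a minimal sketch of reading one of those declared files from module code (assuming the META6.json entry above; the key is the path relative to resources/, and .slurp on a %?RESOURCES entry reads the bundled file):
unit module Doc::Examples::Resources;

# %?RESOURCES maps the keys declared in META6.json to the bundled files,
# both when running from the source tree and after installation.
sub example-file-text(--> Str) is export {
    %?RESOURCES<examples/Array/file>.slurp;
}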
I've settled on a solution. As pointed out by jjmerelo, the META6.json file contains a list of resources and, if you use the Comma IDE, the list of resources is automatically generated for you.
From within the module's code, the list of resources can be accessed via the $?DISTRIBUTION variable like so:
my @resources = $?DISTRIBUTION.meta<resources>;
From here, I can build up my list of resources.
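As a rough sketch (the starts-with filter is only an assumption about how the bundled files are named):
# Keep only the bundled example files from the distribution metadata.
my @resources = $?DISTRIBUTION.meta<resources>.grep(*.starts-with('examples/'));

# Each entry is a path relative to resources/ and can be used as a %?RESOURCES key.
say "Bundled resource: $_" for @resources;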
One note on something I discovered: the $?DISTRIBUTION variable is not accessible from a test script. It has to be placed inside a module in the lib directory of the distribution and exported.

How to override the NQPMatch.Str function

... Or: how to change the $<sigil>.Str value from token sigil { ... } independently of the matched text. Yes, I'm asking how to cheat the grammars above (i.e. calling) me.
I am trying to write a slang for Raku without sigils.
So I want the nogil token, which matches anything via <?>, to return an NQPMatch whose $<sigil>.Str stringifies to '$'.
Currently, my token sigil looks like this:
token sigil {
    | <[$#%&]>
    | <nogil> { say "Nogil returned: ", lk($/, 'nogil').Str;  # Here it should print "$"
      }
}
token nogil-proxy {
    | '€'
    | <?>
      { log "No sigil:", get-stack; }
}
And the method that should return an NQPMatch with the Str method overridden:
method nogil {
    my $cursor := self.nogil-proxy;
    # .. This is where NQP expertise would be nice
    say "string is:", $cursor.Str;  # here also it should print "$"
    return $cursor;
}
Failed try:
$cursor.^cache_add('Str', sub { return '$'; } );
$cursor.^publish_method_cache;
for $cursor.^attributes { .name.say };
for $cursor.^methods { .name.say };
say $cursor.WHAT.Str;
nqp::setmethcacheauth($cursor, 0);
Currently, most of my tests work, but I have problems with declarations without my (with no strict), like my-var = 42;, because they are treated as method calls.
@Arne-Sommer already made a post and an article, which are closely related. But this question asks:
How can we customize the return value of a compile-time token, not how to declare it?
Intro: the answer, pointed out by @JonathanWorthington:
Brief: use the ^mixin meta-method (and NOT but, which requires a compose method).
Demo:
Create an NQPMatch object by calling another token: here the token sigil-my, invoked as self.sigil-my.
Use ^mixin with a role:
method sigil { return self.sigil-my.^mixin(Nogil::StrGil); }
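In plain Raku, the same pattern can be illustrated on an ordinary Match object (hypothetical role name; a sketch, not part of the slang code):
role LoudStr {
    method Str() { callsame().uc }   # defer to the original Str, then upper-case it
}

my $match = 'hi' ~~ /\w+/;
$match.^mixin(LoudStr);   # rebless just this object into a type that also does the role
say $match.Str;           # HI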
Context: full reproducible code:
So you can see what types sigil-my and Nogil::StrGil are. But as I said: a token (rather than a method) and a role (an uninstantiable class).
role Nogil::StrGil {
    method Str() {
        return sigilize(callsame);
    }
}

sub EXPORT(|) {
    # Save: main Raku grammar and actions
    my $main-grammar = $*LANG.slang_grammar('MAIN');
    my $main-actions = $*LANG.slang_actions('MAIN');

    role Nogil::NogilGrammar {
        token sigil-my { | <[$#%&]> | <?> }
        method sigil {
            return self.sigil-my.^mixin(Nogil::StrGil);
        }
    }

    # Mix
    my $grammar = $main-grammar.^mixin(Nogil::NogilGrammar);
    my $actions = $main-actions.^mixin(Nogil::NogilActions);
    $*LANG.define_slang('MAIN', $grammar, $actions);

    # Return an empty hash -> specify that we're not exporting anything extra
    return {};
}
Note: this opens the door to much more trouble (as also pointed out by jnthn in the question comments) -> fun!

Test file structure in Groovy (Spock)

How can I compare a created file tree against an expected one in Groovy (Spock)?
Right now I'm using a Set where I specify the paths I expect to get, and I collect the actual paths this way:
Set<String> getCreatedFilePaths(String root) {
    Set<String> createFilePaths = new HashSet<>()
    new File(root).eachFileRecurse {
        createFilePaths << it.absolutePath
    }
    return createFilePaths
}
But the readability of the test isn't great.
Is it possible in Groovy to write the expected paths as a tree, and then compare them with the actual ones?
For example, expected:
region:
    usa:
        new_york.json
        california.json
    europe:
        spain.json
        italy.json
And the actual files would be converted to this kind of tree.
I'm not sure you can do it with the built-in recursive methods. There certainly are powerful ones, but here is standard recursion code you can use:
def path = new File("/Users/me/Downloads")

def printTree(File file, Integer level) {
    println " " * level + "${file.name}:"
    // only print plain files here; directories are handled by the recursion below
    file.eachFile(groovy.io.FileType.FILES) {
        println " " * (level + 1) + it.name
    }
    file.eachDir {
        printTree(it, level + 1)
    }
}

printTree(path, 1)
That prints the format you describe.
You can either build your own parser or use Groovy's built-in JSON parser:
package de.scrum_master.stackoverflow

import groovy.json.JsonParserType
import groovy.json.JsonSlurper
import spock.lang.Specification

class FileRecursionTest extends Specification {
  def jsonDirectoryTree = """{
    com : {
      na : {
        tests : [
          MyBaseIT.groovy
        ]
      },
      twg : {
        sample : {
          model : [
            PrimeNumberCalculatorSpec.groovy
          ]
        }
      }
    },
    de : {
      scrum_master : {
        stackoverflow : [
          AllowedPasswordsTest.groovy,
          CarTest.groovy,
          FileRecursionTest.groovy,
          {
            foo : [
              LoginIT.groovy,
              LoginModule.groovy,
              LoginPage.groovy,
              LoginValidationPage.groovy,
              User.groovy
            ]
          },
          LuceneTest.groovy
        ],
        testing : [
          GebTestHelper.groovy,
          RestartBrowserIT.groovy,
          SampleGebIT.groovy
        ]
      }
    }
  }"""

  def "Parse directory tree JSON representation"() {
    given:
    def jsonSlurper = new JsonSlurper(type: JsonParserType.LAX)
    def rootDirectory = jsonSlurper.parseText(jsonDirectoryTree)

    expect:
    rootDirectory.de.scrum_master.stackoverflow.contains("CarTest.groovy")
    rootDirectory.com.twg.sample.model.contains("PrimeNumberCalculatorSpec.groovy")

    when:
    def fileList = objectGraphToFileList("src/test/groovy", rootDirectory)
    fileList.each { println it }

    then:
    fileList.size() == 14
    fileList.contains("src/test/groovy/de/scrum_master/stackoverflow/CarTest.groovy")
    fileList.contains("src/test/groovy/com/twg/sample/model/PrimeNumberCalculatorSpec.groovy")
  }

  List<File> objectGraphToFileList(String directoryPath, Object directoryContent) {
    List<File> files = []
    directoryContent.each {
      switch (it) {
        case String:
          files << directoryPath + "/" + it
          break
        case Map:
          files += objectGraphToFileList(directoryPath, it)
          break
        case Map.Entry:
          files += objectGraphToFileList(directoryPath + "/" + (it as Map.Entry).key, (it as Map.Entry).value)
          break
        default:
          throw new IllegalArgumentException("unexpected directory content value $it")
      }
    }
    files
  }
}
Please note:
I used new JsonSlurper(type: JsonParserType.LAX) in order to avoid having to quote each single String in the JSON structure. If your file names contain spaces or other special characters, you will have to quote them like "my file name", though.
In rootDirectory.de.scrum_master.stackoverflow.contains("CarTest.groovy") you can see how nicely you can interact with the parsed JSON object graph via .property syntax. You might like it or not, need it or not.
The recursive method objectGraphToFileList converts the parsed object graph into a list of files (if you prefer a set, change it, but File.eachFileRecurse(..) should not yield any duplicates, so the set is not needed).
If you do not like the braces and brackets in the JSON, you can still build your own parser.
You might want to add another utility method that creates a JSON string like the given one from a validated directory structure, so you have less work when writing similar tests; a sketch of such a helper follows.
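A rough sketch of such a helper (hypothetical name directoryToObjectGraph; it builds the nested map/list shape used above, which you can then serialize with groovy.json.JsonOutput — note the output has quoted names, unlike the LAX input):
// Converts an existing directory into the nested map/list shape used above:
// plain files become list entries, subdirectories become map entries.
Object directoryToObjectGraph(File dir) {
    def files = []
    def subdirs = [:]
    dir.listFiles().sort { it.name }.each {
        if (it.directory)
            subdirs[it.name] = directoryToObjectGraph(it)
        else
            files << it.name
    }
    // A pure-file directory is a list, a pure-directory one is a map,
    // a mixed one is a list with a nested map element (as in the JSON above).
    subdirs.isEmpty() ? files : (files ? files + [subdirs] : subdirs)
}

// Usage, e.g. in a throwaway test:
// println groovy.json.JsonOutput.prettyPrint(
//     groovy.json.JsonOutput.toJson(directoryToObjectGraph(new File("src/test/groovy"))))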
I modified Bavo Bruylandt's answer to collect the file tree paths, and sorted them so that the order of files does not matter.
def "check directory structure"() {
expect:
String created = getCreatedFilePaths(new File("/tmp/region"))
String expected = new File("expected.txt").text
created == expected
}
private String getCreatedFilePaths(File root) {
List paths = new ArrayList()
printTree(root, 0, paths)
return paths.join("\n")
}
private void printTree(File file, Integer level, List paths) {
paths << ("\t" * level + "${file.name}:")
file.listFiles().sort{it.name}.each {
if (it.isFile()) {
paths << ("\t" * (level + 1) + it.name)
}
if (it.isDirectory()) {
collectFileTree(it, level + 1, paths)
}
}
}
And the expected file paths are put into the expected.txt file, indented with tabs (\t), like this:
region:
	usa:
		new_york.json
		california.json
	europe:
		spain.json
		italy.json

Need to list all files in a directory on AIX

I have a program that needs to list all the files in a directory on AIX.
I have successfully done this on Windows:
hFind = FindFirstFile(szDir, &ffd);
if (hFind == INVALID_HANDLE_VALUE)
{
    fprintf(stderr, "Can not scan for files.\n");
    goto MOD_EXIT;
}
do
{
    if (!(ffd.dwFileAttributes & FILE_ATTRIBUTE_DIRECTORY))
    {
        printf("File:%s\n", ffd.cFileName);
    }
}
while (FindNextFile(hFind, &ffd) != 0);
and on Linux:
d = opendir(szDir);
if (!d)
{
    fprintf(stderr, "Can not open directory '%s'.\n", szDir);
    goto MOD_EXIT;
}
while ((dir = readdir(d)))
{
    if (dir->d_type != DT_DIR)
    {
        printf("File:%s\n", dir->d_name);
    }
}
closedir(d);
readdir appears to exist on AIX, but from the manual it would appear that it only returns directories, not files. The d_type field does not exist in the dirent structure.
When readdir() refers to directory entries, it means entries in the directory, not subdirectories of the directory. So you can get all the names from there.
To discover whether they are files or directories, the portable/reliable way is to stat() the result. There are standard macros to test the st_mode returned in the stat buffer (e.g. S_ISDIR).
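A sketch of the Linux loop reworked to use stat() instead of d_type, which should also work on AIX (error handling kept minimal):
#include <stdio.h>
#include <dirent.h>
#include <sys/stat.h>

static void list_files(const char *szDir)
{
    DIR *d = opendir(szDir);
    struct dirent *dir;
    if (!d)
    {
        fprintf(stderr, "Can not open directory '%s'.\n", szDir);
        return;
    }
    while ((dir = readdir(d)) != NULL)
    {
        char path[4096];
        struct stat st;
        /* Build the full path, then ask stat() what kind of entry it is. */
        snprintf(path, sizeof(path), "%s/%s", szDir, dir->d_name);
        if (stat(path, &st) == 0 && !S_ISDIR(st.st_mode))
            printf("File:%s\n", dir->d_name);
    }
    closedir(d);
}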

Sense/Net: using a content query in a console application

I'm trying to use a content query in a console application, but it throws the exception "Object reference not set to an instance of an object".
Please help me resolve this problem.
var startSettings = new RepositoryStartSettings
{
    Console = Console.Out,
    StartLuceneManager = false,
    IsWebContext = false,
    PluginsPath = AppDomain.CurrentDomain.BaseDirectory,
};
using (Repository.Start(startSettings))
{
    try
    {
        string path = "/Root/Sites/Default_Site/workspaces/Document/HACCP/Document_Library/SanXuat/ChonLocChuanBiDiaDiemSXRau";
        string fieldName1 = "Name";
        var content = Content.Load(path);
        int count = ContentQuery.Query(".AUTOFILTERS:OFF .COUNTONLY Infolder:" + path).Count;
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
    }
}
If you want to execute a content query, you have to enable LuceneManager when you start the repository, because that component is responsible for querying.
new RepositoryStartSettings
{
    Console = Console.Out,
    StartLuceneManager = true, // <-- this is necessary
    IsWebContext = false,
    PluginsPath = AppDomain.CurrentDomain.BaseDirectory,
}
Please make sure that all the config values are in place (e.g. index directory path, enable outer search engine). You can copy them from the Export or Import tool's config file.
A few more notes:
In a content query, please always enclose path expressions in quotes, because a space in the path causes a query error that is hard to find (it would simply return a different result set). For example:
InTree:'/Root/My Folder'
Or you can use the built-in parameter feature, which takes care of the same thing:
// note the #0 parameter, which is a 0-based index
ContentQuery.Query("+TypeIs:Article +InTree:#0", null, containerPath);
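Putting these notes together, a sketch of the original snippet with the search engine enabled and the path passed as a query parameter (assuming the rest of the configuration is already in place):
var startSettings = new RepositoryStartSettings
{
    Console = Console.Out,
    StartLuceneManager = true,   // required for executing content queries
    IsWebContext = false,
    PluginsPath = AppDomain.CurrentDomain.BaseDirectory,
};
using (Repository.Start(startSettings))
{
    string path = "/Root/Sites/Default_Site/workspaces/Document/HACCP/Document_Library/SanXuat/ChonLocChuanBiDiaDiemSXRau";
    // The #0 parameter quotes the path for us, so spaces in it cannot break the query.
    int count = ContentQuery.Query(".AUTOFILTERS:OFF .COUNTONLY +InFolder:#0", null, path).Count;
    Console.WriteLine(count);
}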