How to have multiple Doxyfiles?

I am working on a project that includes several third-party libraries. Each of these libraries comes with its own Doxyfile, and each has its own 'Image' folder. Unfortunately, some of the libraries use the same file names for their images.
Because of this, if I generate documentation for the entire source tree (including the third-party libraries), the clashing image names become ambiguous and Doxygen pulls in images from the wrong folder.
How can I use multiple Doxyfiles, so that each library has its own sandbox, while still ending up with documentation output that is interlinked?

Check out the doxygen manual chapter about Linking to external documentation. It is fairly easy to achieve what you want using tag files.
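For example, a minimal sketch (the paths and tag-file names below are placeholders for your actual layout): each library's Doxyfile exports a tag file, and the top-level Doxyfile imports those tag files together with the location of the corresponding HTML output:

    # In each third-party library's Doxyfile: write a tag file next to its HTML output
    GENERATE_TAGFILE = docs/libfoo.tag

    # In the top-level Doxyfile: import each tag file and map it to the location of
    # that library's generated HTML, so cross-references turn into hyperlinks
    TAGFILES = ../libfoo/docs/libfoo.tag=../../libfoo/docs/html \
               ../libbar/docs/libbar.tag=../../libbar/docs/html

Since each library is then documented by its own Doxygen run with its own IMAGE_PATH, the clashing image names can no longer shadow one another.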

Related

<Gameplay Class>.h vs <Gameplay Class>.generated.h

In Unreal Engine, what is the difference between the two?
I could not find it in the API, just this: https://docs.unrealengine.com/5.0/en-US/gameplay-classes-in-unreal-engine/
I suspect that the .generated header gets added if you create the class from the Unreal editor, but I do not understand whether it makes any difference with or without it.
Ah, so the .generated header is required inside the actual header file of the class (specifically as the last include).
https://forums.unrealengine.com/t/creating-classes-in-visual-studio/282386/4
Unreal has a code-generation tool called the Unreal Header Tool, or UHT for short. During the build process of the project, it runs right before the actual compiler and generates the reflection code based on the UPROPERTY(), UFUNCTION(), etc. macros in your code.
All that information is stored in two files: <Class>.generated.h and <Class>.generated.cpp
The generated header needs to be the last include in your class's header so that everything the generated code refers to has already been declared in the file. Everything within the generated header file can be accessed via the UClass reflection system.
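For illustration, a minimal sketch of such a header might look like this (the class name and property are invented; what matters is the macros and the position of the generated include):

    // MyActor.h: example only; class and property names are made up
    #pragma once

    #include "CoreMinimal.h"
    #include "GameFramework/Actor.h"
    #include "MyActor.generated.h"   // must be the last #include in this header

    UCLASS()
    class AMyActor : public AActor
    {
        GENERATED_BODY()

    public:
        // UHT picks up this macro and emits reflection data into MyActor.generated.h
        UPROPERTY(EditAnywhere, Category = "Example")
        float Health = 100.0f;
    };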
You can find the generated files in the "Intermediate/Build" directory of your project.
You can find the implementation of the UHT in the project on GitHub and a little more info about it in the docs.

Adding two new phases to an Xcode framework project

I am building a project hosted on GitHub, written in Objective-C, that resolves MAC addresses down to manufacturer details. The lookup table is currently stored as a text file, manuf.txt (from the Wireshark project), which is parsed at run time; this parsing is costly. I would prefer to compile it down to archived objects at build time, and load those instead.
I would like to amend the build phases such that I:
Build a simple compiler
Run the compiler, parsing manuf.txt and outputting archived objects
Build the framework
Copy the archived objects into the framework
I am looking for wisdom on how to achieve steps 1 and 2 using Xcode v7.3 as Xcode provides only a Copy Files phase or a Run Script phase. An example of other projects achieving similar goals would be inspiring.
I suspect that what you are asking is possible, but tricky. The reason is that you will need to write a bunch of class files and then dynamically add them to the project.
Firstly, you will need to employ a Run Script phase to run various tools from the command line that parse your file and generate a number of class files from it. I would suggest looking into templating engines; for example, appledoc uses Mustache templates to generate its API documentation files. You could use the same technique to generate header and implementation files.
Next, rather than generating archived objects and trying to import them into a framework, I think you may be better off generating raw source code, adding it to a project, and compiling it into the framework. That is probably simpler in the long run.
To automatically include the generated code, I would look into adding a folder reference to the project rather than an Xcode group (which means I haven't actually tried this :-). Folder references are an option in the 'Add files to ...' dialog.
Folder references refer to a directory and automatically add the entire contents of that directory to a project. So you can use one to point to the directory where you have generated the source code. This is a much better option than trying to manipulate the project or injecting things into an established framework.
I would prefer to parse the file at run time: after launch, look for already-existing output and only parse the text file if it is not there yet.
That said, I had to do something similar at Objective-Cloud, and I simply added a Run Script build phase and put the compiler call into it.
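For illustration, such a Run Script phase might look roughly like this (the compiler path, flags, and output name are placeholders; SRCROOT and DERIVED_FILE_DIR are standard Xcode build variables):

    # Run Script build phase: rebuild the archived table only when manuf.txt changed
    COMPILER="${SRCROOT}/tools/manuf_compiler"      # step 1: your simple compiler
    INPUT="${SRCROOT}/Resources/manuf.txt"
    OUTPUT="${DERIVED_FILE_DIR}/manuf.archive"

    if [ ! -f "${OUTPUT}" ] || [ "${INPUT}" -nt "${OUTPUT}" ]; then
        "${COMPILER}" "${INPUT}" -o "${OUTPUT}"     # step 2: parse and archive
    fi

A Copy Files phase after the framework target builds can then place the generated archive into the framework's resources, which covers step 4.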

IntelliJ fails to recognize stanford-corenlp-3.5.2-models.jar file

I'm using Stanford NLP to do sentiment analysis. I only need the sentiment score, so these are the libraries I'm adding to my project:
1) ejml-0.23.jar
2) stanford-corenlp-3.5.2.jar
3) stanford-corenlp-3.5.2-models.jar
When I add the dependencies in IntelliJ's Project Structure dialog, the first two work fine: they are imported correctly and shown in the External Libraries tab. But for the models jar, IntelliJ throws an error saying that "IDEA cannot determine what kind of files the chosen item contains". When I go ahead and add it anyway, the models jar does not appear in the External Libraries section. See the attached screenshots:
Libraries present in my Project Structure:
External Libraries: models.jar not included
stanford-corenlp-3.5.2-models.jar is not a library of Java classes; instead, it contains the various NLP models that the algorithms in the core package use.
IntelliJ therefore fails to recognize its contents, which is fine. As long as the jar is on the classpath at run time, you should not run into any issues. Judging by your project structure that is the case, so I would not worry about this particular error.
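As a quick smoke test (a sketch, assuming all three jars are on the run configuration's classpath), constructing a pipeline with the sentiment annotator will load its models straight out of the models jar:

    import java.util.Properties;
    import edu.stanford.nlp.pipeline.Annotation;
    import edu.stanford.nlp.pipeline.StanfordCoreNLP;

    public class SentimentSmokeTest {
        public static void main(String[] args) {
            Properties props = new Properties();
            // The sentiment annotator needs the parser, hence "parse" before "sentiment"
            props.setProperty("annotators", "tokenize, ssplit, parse, sentiment");
            // Model files are resolved from the classpath, i.e. from
            // stanford-corenlp-3.5.2-models.jar; no IDE "library" entry is required
            StanfordCoreNLP pipeline = new StanfordCoreNLP(props);
            pipeline.annotate(new Annotation("IntelliJ can still build and run this just fine."));
            System.out.println("Pipeline and models loaded successfully.");
        }
    }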

How do I link multiple libraries in a Firebreath plugin?

Does anyone know where I can find a Firebreath sample (either Mac OS X or Windows) that illustrates how to create a plugin that includes 1 or more other libraries (.DLLs or .SOs) that each rely on other sub-projects built as static libraries (LIBs)?
For example, let's say that the Firebreath plugin is called PluginA, and that PluginA calls methods from DLL_B and DLL_C. DLL_B and DLL_C are C++ projects. DLL_B calls methods from another project called LIB_D, and DLL_C calls methods from a project called DLL_E.
Therefore, the final package should contain the following files:
PluginA.dll
DLL_B.dll (which also incorporates LIB_D)
DLL_C.dll
DLL_E.dll
I am currently forced to dump all the source files into the PluginA solution, but this is a bottleneck (for example, I cannot call libraries written in other languages, such as Objective-C on Mac OS X).
I tried following the samples on the FireBreath site but couldn't get them to work, and I found no examples from other users who claimed to have gotten this working. I tried using CMake, and also building the solutions directly from Xcode, but the end result was the same (linker errors, and after deployment DLL_C couldn't find DLL_E, etc.).
Any help would be appreciated - thank you,
Mihnea
You're way overthinking this.
On Windows:
DLLs don't have a run-time dependency on a static library: if they depended on one, that library's code was already compiled in when the DLL was built.
DLLs that depend on another DLL generally just need that other DLL to be present in the same location or otherwise in the DLL search path.
Taking those two things into consideration, all you need to do is locate the .lib file that either is the static library itself or is the import library that goes with the .dll, and add a target_link_libraries() call for each one. There is a page on firebreath.org that explains how to do this.
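In CMake terms that boils down to something like the following in the plugin project's CMake file (a sketch; the paths are placeholders, and on Windows the .lib files here are the import libraries shipped with DLL_B and DLL_C):

    # After the plugin target is defined, link the import libraries of its DLL
    # dependencies. LIB_D is not listed because it was already compiled into DLL_B.
    target_link_libraries(${PROJECT_NAME}
        ${CMAKE_CURRENT_SOURCE_DIR}/libs/DLL_B.lib
        ${CMAKE_CURRENT_SOURCE_DIR}/libs/DLL_C.lib
    )

At run time, DLL_B.dll, DLL_C.dll and DLL_E.dll still have to sit next to PluginA.dll or somewhere else on the DLL search path.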
On Linux it's about the same, but using the normal rules for finding .so files.

Is it worth it to create static libraries for iOS?

There is code that I want to include in most of my projects. Things like AFNetworking, categories for CoreData and unit testing, etc.
It seems logical to include all of these in a static library and then use it in each project. I've noticed, though, that many third-party libraries (like AFNetworking, and its predecessor ASIHTTPRequest) are included in projects by copying over all of their source files and then manually linking the necessary libraries to the project target.
This seems to me like the easiest way. It took a fair amount of time to figure out how to include an existing static library in a project, and even after I knew how, it still seems like a pain to do it for every new project. Also, the header search paths you specify point to a local directory containing the static library's files. Wouldn't it be easier, and is there a way, to copy the static library's files into the project? This is the same idea as including the class files directly, as most libraries seem to do already, but it would be more organized, because everything would be lumped into one library project instead of having class files everywhere and having to include every one of them.
Static libraries feel like they should be the right way to go: make one library, containing the classes every project will need, that can be used with all projects. Makes sense. I am just conflicted because the common practice seems to be to leave everything out of a 'formal' library and simply copy over all of the class files instead.
I guess I am just looking for what experienced developers find to be the best option.
I would be among the first to admit that the process of referencing a static library in Xcode is not entirely intuitive. However, using a static library is the best option, without a doubt.
The main reason is maintainability: when you copy source code of a library to many places, you must remember to update all of them to the latest code when you upgrade to the next version of the library. This may be a rather error-prone process, especially when the underlying library source changes significantly (e.g. new files are added, old files are renamed, etc.)
There's a halfway solution: make an Xcode project that builds your static library from source and put it into a shared repository (i.e. a git submodule, etc.) that is included from each project's main repository.
Each of your projects would include this submodule and project, so they get the latest source code each time they pull that submodule. If you set this up as a build dependency, it will build the static library the first time you build, and Xcode is smart enough to just reuse it on each subsequent build, so you get the benefit of fast build times.
You also get the advantage of having the source right there for stepping through / debugging.
If it's in a separate Xcode project and a new version of a library adds or removes a source file, you would only need to change that shared project; your individual projects wouldn't change at all.
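Wiring up the shared repository is then just the usual submodule dance (the URL and path are placeholders):

    # Pull the shared library project into each app's repository
    git submodule add https://example.com/you/SharedKit.git Vendor/SharedKit
    git submodule update --init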
What about using CocoaPods? This tool does exactly what you want in a declarative way: you have a file (Podfile) where you declare your dependencies, and the tool downloads all the dependencies and builds a static library that gets added to your project.
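A minimal Podfile sketch (the target name and version constraint are placeholders):

    platform :ios, '8.0'

    target 'MyApp' do
      pod 'AFNetworking', '~> 2.0'   # one line per dependency
    end

Running pod install then generates a Pods static library target plus a workspace that links it into your app, so you never manage the library sources by hand.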
I would agree that static libraries feel like they might be the correct way to go for a number of reasons, but they can also introduce some issues.
The positives would be creating an easy way to add a library to a project. Although not completely intuitive, it is rather trivial to add a static library to a project after one does it a few times. Add the files, add the search path, done. This could also be useful in certain source control situations. Also, updating a library may be easier.
I think the real problem here is for the open source community. By including, say, AFNetworking as a static library, you lose all access to the implementation files. Having that access is a great feature of including the source rather than a library: it lets you change the code as you see fit and, hopefully, give back.