I've been reading the D Cookbook and near the beginning there's the following sentence:
"D is binary compatible with C, but not source compatible."
SAS allows users to define and call C functions from within SAS. I'm wondering: would it also be possible to do this from D?
I found Adam Ruppe's answer about creating a DLL here, and I tried using it to build the DLL example from the SAS documentation; however, whenever I go to call it, the DLL gets loaded and then SAS crashes (without any crash log that I can find).
Yes, you can write DLLs in D which use or implement a C API.
You have to make sure that the function signatures and calling conventions match. On the page you linked, the calling convention is given as stdcall, so your D functions need to be annotated with extern(Windows) or extern(System).
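As a minimal sketch, a D DLL exposing a stdcall function for a C host might look like this. The function name and signature here are made up for illustration, not taken from the SAS documentation:

```d
// dll.d -- build with: dmd -shared -ofmydll.dll dll.d
import core.sys.windows.dll;

mixin SimpleDllMain;  // druntime helper that supplies DllMain
                      // and initializes the D runtime for the DLL

// extern(Windows) gives the function the stdcall calling convention;
// export makes it visible in the DLL's export table.
extern (Windows) export double addTwo(double a, double b)
{
    return a + b;
}
```

Note that extern(System) is equivalent to extern(Windows) on Windows but maps to extern(C) (cdecl) on other platforms, which is why it's often preferred in portable code.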
How can one programmatically determine which type libraries (GUID and version) a given native, VB6-generated DLL/OCX depends on?
For background: the VB6 IDE chokes when opening a project where one of the referenced type libraries can't load one of its dependencies, but it's not so helpful as to say which dependency can't be met--or even which reference has the unmet dependency. This is a common occurrence at my company, so I'm trying to supplement the VB6 IDE's poor troubleshooting information.
Relevant details/attempts:
I do have the VB source code. That tells me the GUIDs and versions as of a particular revision in the repo, but when analyzing a DLL/OCX/TLB file I don't know which revision of the repo (if any--it could be from a branch, or might never have been committed at all) a given DLL/OCX corresponds to.
I've tried using tlbinf32.dll, but it doesn't appear to be able to list imports.
I don't know much about PE, but I popped open one of the DLLs in a PE viewer and it only shows MSVBVM60.dll in the imports section. This appears to be a special quirk of VB6-produced type libraries: they link only to MSVBVM60 but have some sort of delay-loading mechanism for the rest of the dependencies.
Most of the existing tools I've tried don't give this information either--e.g., depends.exe finds only MSVBVM60.dll.
However: OLEView, a utility that used to ship with Visual Studio, somehow produces an IDL file, which includes the importlib directives. Given that VB doesn't use IDL files, it's clearly generating the information somehow. So it's possible--I just have no idea how.
Really, if OLEView didn't do it I'd have given it up by now as impossible. Any thoughts on how to accomplish this?
It turns out that I was conflating basic DLL functionality and COM. (Not all DLLs are COM DLLs.)
For basic DLLs, the Portable Executable format includes a section describing the file's imports: data directory entry 1 of the Optional Header points to the import table, whose structure is given by IMAGE_IMPORT_DESCRIPTOR. This is a starting point for learning about that.
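To make that concrete, here's a hedged sketch (in D, Windows-only) of walking that import directory for an already-loaded module and printing the names of the DLLs it imports; the same structures are available from any language with Windows headers:

```d
// Sketch: print which DLLs a loaded module imports, by walking
// data directory entry 1 (the import table). For a loaded module,
// RVAs are valid offsets from the module base.
import core.sys.windows.windows;
import core.stdc.stdio : printf;

void listImports(HMODULE mod)
{
    auto base = cast(ubyte*) mod;
    auto dos  = cast(IMAGE_DOS_HEADER*) base;
    auto nt   = cast(IMAGE_NT_HEADERS*)(base + dos.e_lfanew);
    auto dir  = nt.OptionalHeader.DataDirectory[IMAGE_DIRECTORY_ENTRY_IMPORT];
    if (dir.VirtualAddress == 0)
        return;  // module has no import table

    auto desc = cast(IMAGE_IMPORT_DESCRIPTOR*)(base + dir.VirtualAddress);
    // The descriptor array is terminated by an all-zero entry.
    for (; desc.Name != 0; ++desc)
        printf("%s\n", cast(char*)(base + desc.Name));
}
```

Run against a VB6-built OCX, this would print only MSVBVM60.dll, as noted above--the COM dependencies aren't recorded here.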
COM DLLs don't seem to have an equivalent as such, but you can discover which other COM components a DLL's public interface needs: for each exposed interface, list the types of its properties and method arguments, then use the Registry to look up where those types come from. tlbinf32.dll provides some of the basic functionality for listing members, etc. Here's an intro to that.
Here's my question:
I have Project A which is referenced in Project B, but the problem is I also need to reference Project B in Project A. However, every time I try to do it there is an error which states that it cannot reference Project B to Project A because it will cause a circular dependency.
So can anyone suggest a workaround for my problem?
This is generally a bad idea. If there are common things that need to be shared by both assemblies, move them to a third assembly and have both projects reference that instead.
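In outline, the refactoring looks like this (sketched here in D modules for concreteness, though the shape is the same in any language or project system): the shared types move into a third module, and both sides depend only on it.

```d
// common.d -- the shared pieces both A and B need live here.
module common;

interface ILogger
{
    void log(string message);
}

// --- a.d -- A depends only on common, never on B.
// module a;
// import common;
//
// void runA(ILogger logger) { logger.log("A is running"); }

// --- b.d -- B also depends only on common, and can supply
// the implementation A consumes.
// module b;
// import common;
//
// class ConsoleLogger : ILogger
// {
//     void log(string message)
//     {
//         import std.stdio : writeln;
//         writeln(message);
//     }
// }
```

The dependency arrows now all point one way (A → common ← B), so no cycle can arise.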
First, let me say that yes, this question may be subjective. However, I believe that there is probably a 'best' answer, if all the relevant factors are taken into consideration. In any case, it's worth giving it a shot and asking :)
Let's say that I've three libraries, A, B, and C.
Library B uses library A.
Library C uses library A.
I want people to be able to use A, B, and C together, or to just take any combination of A, B, and C if they wish.
I want to be able to distribute the libraries with source code, so that people can build them themselves if they wish, or just grab and use individual files.
I don't really want to distribute them together in one large monolithic lump.
Apart from the sheer issue of bulk, there's a good reason that I don't want to do this. Let's say that B has an external dependency on some other library that it's designed to work with. I don't want to force someone who just wants to use C to have to link in that other library, just because B uses it. So lumping together A, B and C in one package wouldn't be good.
I want to make it easy for someone who just wants C, to grab C and know that they've got everything they need to work with it.
What are the best ways of dealing with this, given:
the language in question is Objective-c
my preferred delivery mechanism is one or more frameworks (but I'll consider other options)
my preferred hosting mechanism is git / github
I'd rather not require a package manager
This seems like a relatively straightforward question, but before you dive in and say so, can I suggest that it's actually quite subtle. To illustrate, here are some possible, and possibly flawed, solutions.
CONTAINMENT / SUBMODULES
The fact that B and C use A suggests that they should probably contain A. That's easy enough to achieve with git submodules. But then of course the person using both B and C in their own project ends up with two copies of A. If their code wants to use A as well, which one does it use? What if B and C contain slightly different revisions of A?
RELATIVE LOCATION
An alternative is to set up B and C so that they expect a copy of A to exist in some known location relative to them--for example, in the same containing folder as B and C.
Like this:
libs/
libA/
libB/ -- expects A to live in ../
libC/ -- expects A to live in ../
This sounds good, but it fails the "let people grab C and have everything" test. Grabbing C by itself isn't sufficient; you also have to grab A and arrange for it to be in the correct place.
This is a pain--you even have to do it yourself if you want to set up automated tests, for example--but worse than that: which version of A? You can only test C against a given version of A, so when you release it into the wild, how do you ensure that other people can get that version? What if B and C need different versions?
IMPLICIT REQUIREMENT
This is a variation on the above "relative location" - the only difference being that you don't set C's project up to expect A to be in a given relative location, you just set it up to expect it to be in the search paths somewhere.
This is possible, particularly using workspaces in Xcode. If your project for C expects to be added to a workspace that also has A added to it, you can arrange things so that C can find A.
This doesn't address any of the problems of the "relative location" solution though. You can't even ship C with an example workspace, unless that workspace makes an assumption about the relative location of A!
LAYERED SUBMODULES
A variation on the solutions above is as follows:
A, B and C all live in their own repos
you make public "integration" repos (let's call them BI and CI) which arrange things nicely so that you can build and test (or use) B or C.
So CI might contain:
- C.xcworkspace
- modules/
- A (submodule)
- C (submodule)
This is looking a bit better. If someone just wants to use C, they can grab CI and have everything.
They will get the correct versions, thanks to them being submodules. When you publish a new version of CI you'll implicitly be saying "this version of C works with this version of A". Well, hopefully, assuming you've tested it.
The person using CI will get a workspace to build/test with. The CI repo can even contain sample code, example projects, and so on.
However, someone wanting to use B and C together still has a problem. If they just take BI and CI they'll end up with two copies of A. Which might clash.
LAYERED SUBMODULES IN VARIOUS COMBINATIONS
The problem above isn't insurmountable though.
You could provide a BCI repo which looks like this:
- BC.xcworkspace
- modules/
- A (submodule)
- B (submodule)
- C (submodule)
Now you're saying: "if you want to use B and C together, here's a distribution that I know works."
This is all sounding good, but it's getting a bit hard to maintain. I'm now potentially having to maintain, and push, various combinations of the following repos: A, B, C, BI, CI, BCI.
We're only talking about three libraries so far. This is a real problem for me; in the real world I potentially have about ten. That's gotta hurt.
So, my question to you is:
What would you do?
Is there a better way?
Do I just have to accept that the choice between small modules and a big monolithic framework is a tradeoff between better flexibility for the users of the module, and more work for the maintainer?
Libraries are like an onion, lots of layers. And layer violations make for a nasty onion; an inner layer cannot contain an outer layer.
create 3 separate static library projects (assuming you may be targeting iOS): A, B, and C
B can include headers from A, C can include headers from A
B and C cannot include headers from each other. A cannot include headers from B or C
Create a Workspace for each combination of libraries you want to support
Add appropriate projects to workspace
Create a new project in each workspace to contain test app and/or unit tests for that combination
The key is the workspace. With the workspace, you can combine an arbitrary set of projects and, as long as their configurations are the same (Debug vs. Release), build/run/analyze/profile will properly determine dependencies (and you can set them up manually), build everything into a single derived data / products folder, and it'll just work.
If you want to build, say, the C project standalone, then A will need to be installed as expected (typically into /usr/local/, but into ~/ works, too) and exactly as it would be on a customer's system (if you were to support binary library installs).
This is exactly how many of us manage our projects at Apple and it works quite well. In terms of management, keep it as simple as possible. You are unlikely to have an entire team devoted to build & configuration and, thus, your configurations should be simple.
If you were to honestly assess the situation and conclude that A will never be used without B, then fold B into A and be done with it. Writing re-usable APIs is incredibly difficult to do well. I've seen many a project get bogged down trying to create a fully generalized solution for what should be just one specific use, wasting huge amounts of time in the process (and sometimes failing).
While you note
I'd rather not require a package manager
I'd still suggest CocoaPods to you. It does all the other things, like deep dependency management, is very friendly to git and is overall pretty simple to install and use.
Still, this is not an exact answer to the requirements you've set.
I'm trying to create a D application which uses a (third-party) COM .dll, so I can scrape a text box of another application and sound an alert when a certain string shows up.
However, the third party doesn't provide the .lib, .def, or .h files that go with the DLL (at least not with the free trial version). I can create a .lib file with the implib tool, but I don't see any of the library's functions in the created .lib.
Their (Visual C++) samples use the #import directive to link it in, but that is of no use to me.
On a side note: how can I get the proper interfaces (in a .di file, with boilerplate that does the linking) from the DLL automatically? I ask so that the correctness of the linkage doesn't depend on my (likely incorrect) translation of the functions. They do have a webpage which lists all the functions, but the object model is a bit chaotic, to say the least.
From what I know, COM libraries only export a few functions: those required to (un)register the library (DllRegisterServer, DllUnregisterServer) and to create objects (DllGetClassObject, DllCanUnloadNow).
You can however view the interfaces and functions in a COM .dll using the OLE/COM Object Viewer. It seems it might be able to output header files (.h). Afterwards, maybe you could use htod as a starting point to converting everything to D interfaces.
The DMD distribution seems to include a COM sample (chello.d, dclient.d, dserver.d), and at first glance it doesn't look like it requires any .lib files explicitly.
Unfortunately, I've never actually used COM in D, so I can't advise any further. I hope this helps in some way.
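For flavor, a COM client in D generally has this shape. Everything specific below--the interface name, its method, and the GUIDs--is a placeholder; the real values would come from the vendor's documentation or from OLEView's IDL output:

```d
// Sketch: a minimal COM client in D (Windows-only, untested against
// any particular vendor DLL).
import core.sys.windows.windows;
import core.sys.windows.objbase;
import core.sys.windows.unknwn;

// D interfaces deriving from IUnknown are COM-compatible;
// the methods must use the Windows (stdcall) convention.
interface IExample : IUnknown
{
extern (Windows):
    HRESULT doSomething(int value);
}

void main()
{
    CoInitialize(null);
    scope (exit) CoUninitialize();

    GUID clsid;  // placeholder: fill in the coclass CLSID
    GUID iid;    // placeholder: fill in IExample's IID

    IExample obj;
    auto hr = CoCreateInstance(&clsid, null, CLSCTX_INPROC_SERVER,
                               &iid, cast(void**) &obj);
    if (hr == S_OK)
    {
        obj.doSomething(42);
        obj.Release();  // COM objects are reference-counted
    }
}
```

Because the object is reached through CoCreateInstance rather than a linked import, no .lib for the vendor DLL is needed--which matches what the chello/dclient/dserver sample does.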
While I have yet to actually do COM work myself, I am trying to revive Juno over on Github/he-the-great. Part of the project is tlbimpd, which outputs a D file from a DLL's type library.
I've tested the examples and successfully run tlbimpd. Please do try things out for your use and submit any issues.