How to instruct linker to consider strong IRQ definition present in static library instead of weak definition - usb

We have a problem linking a strong USB_IRQ handler.
The real USB IRQ handler definition is present in a static library.
We fill the vector table in the application startup file (*.s) with the handler name, and the __weak default definition of the handler is also defined in the same startup file.
While linking, the linker always picks up the weak IRQ definition from the startup file instead of the strong definition from the library (*.a).
If we remove the weak definition from the startup file, the strong definition is used and everything works.
The problem, as we see it, is that the library object file containing the strong definition is not referenced in any way from our application; we do not use any functions or structures from that file. Only the IRQ handler is used, and it is triggered only when there is a hardware event.
We use the ARM GNU toolchain and have tried multiple options, but nothing helps.
Searching online we found a few suggestions, such as the --no_remove and --keep linker options, but these flags do not seem to be supported.
Please suggest if you have any input on this.

I think you need to make sure that at least one symbol defined in the same source file as the strong definition gets referenced from your program. Otherwise, the linker has no reason to pull that object file out of the archive, and the strong IRQ definition never enters the link. For example, you could define an init function in that source file and call it from your program.
Make sure the IRQ function is also declared with __attribute__((used)).
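A minimal sketch of that approach, assuming a hypothetical library source file usb_driver.c and a handler named USB_IRQHandler (substitute your actual names):

/* usb_driver.c -- built into the static library */

/* Strong definition that should override the weak one in the startup file.
   __attribute__((used)) stops the compiler from discarding it even though
   nothing in this file calls it directly. */
__attribute__((used)) void USB_IRQHandler(void)
{
    /* real interrupt handling here */
}

/* Dummy hook whose only job is to give the application something to
   reference, so the linker pulls this object file out of the .a */
void usb_driver_init(void)
{
}

/* main.c -- application */
extern void usb_driver_init(void);
int main(void)
{
    usb_driver_init();   /* forces usb_driver.o to be loaded from the archive */
    for (;;) { }
}

Once usb_driver.o is part of the link, its strong USB_IRQHandler overrides the __weak one from the startup file.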

Related

How does -dead_strip work on Objective-C code?

In https://opensource.apple.com/source/cctools/cctools-836/RelNotes/CompilerTools.html:

The static linker has an option to do dead code stripping
The static link editor now can do dead code stripping. The new ld(1) option to do this is -dead_strip. There is also an additional option -no_dead_strip_inits_and_terms that can be used when -dead_strip is specified to cause all constructors and destructors to never be dead code stripped. The load map printed with the ld(1) option -M notes what was dead stripped from the input files.
However, due to the dynamic nature of the Objective-C runtime, -dead_strip does not work well on Objective-C code.
So, in my understanding, it does not work at the level of individual functions.
Can it work at the level of classes, i.e. if a class is never used, can it be removed effectively?
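For comparison, this is what dead stripping looks like for plain C, where the linker can reason purely about symbol references (the file and symbol names below are made up):

/* strip_demo.c -- illustrative only */
#include <stdio.h>

void used_function(void)   { printf("kept\n"); }
void unused_function(void) { printf("stripped\n"); }

int main(void)
{
    used_function();
    return 0;
}

/* Hypothetical build with Apple's linker:
   clang -Wl,-dead_strip strip_demo.c -o strip_demo
   nm strip_demo   # unused_function should no longer appear */

Objective-C classes and methods can also be reached through the runtime (e.g. by name), which is why the linker cannot apply the same reasoning as confidently there.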

Is all of a static library included in a final product after linking?

Suppose I create an iOS application and include a static library. I create an object of a class that is defined and implemented in the static library, and this object doesn't use any other classes defined in the library. Will all of the static library be present in the application I build? The idea is that much of the static library contains unused code and wouldn't need to be present.
I believe there are flags that help determine the behavior -- if someone can spell out how this works, I sure would appreciate it.
A static library is an archive of object files. If you link against a static library libfoo.a, then the linker by default will link into your final executable all and only those object files in libfoo.a that are required to provide definitions for the public symbols that are referenced by the program.
More specifically, if the linker finds the library requested (via the option -lfoo) at a given point in the command-line sequence of object files and libraries to be linked, then it will extract from the archive and link into the executable each object file in the archive that provides a definition for any symbol that remains undefined up to that point in the linkage.
In so doing, definitions of unused public symbols may be redundantly linked into your program, but only if they are found in an object file (whether free-standing or a member of a library) that is not completely redundant.
If you do not want to tolerate even those potential redundancies, then a combination of compiler and linker options can eliminate them: see this answer.
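A tiny illustration of that rule (all file and function names here are made up):

/* foo.c -> foo.o, archived into libfoo.a */
int foo_used(void)   { return 1; }

/* bar.c -> bar.o, also archived into libfoo.a */
int bar_unused(void) { return 2; }

/* main.c */
extern int foo_used(void);
int main(void) { return foo_used(); }

/* Hypothetical build:
   cc -c foo.c bar.c main.c
   ar rcs libfoo.a foo.o bar.o
   cc main.o -L. -lfoo -o app
   Only foo.o is extracted from the archive, because it defines the one symbol
   (foo_used) still undefined when the linker reaches -lfoo; bar.o never
   enters the link at all. */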

Compile-time warning about missing category method implementation

In our Xcode project we have multiple targets which share some common code. Each target includes only the sources that it actually uses. So when we use category methods inside classes that are shared between targets, we need to make sure the category implementation is also included in all of those targets. Xcode doesn't show any warnings at compile time or link time if we forget to include a category implementation in one of the targets, and it is troublesome to check this by hand.
Is there any automated way to ensure that category implementations are included in the targets that use them?
Categories are not automatically linked into the final binary.
They are linked only if the linker decides that the file in which they are defined is used (which was a source of frequent bugs some time ago).
What you can do is pass special flags to the linker: -all_load and -ObjC in Build Settings / Linking / Other Linker Flags.
-ObjC Loads all members of static archive libraries that implement an Objective-C class or category.
And from this discussion:
-all_load and -force_load tell the linker to link the entire static archive in the final executable, even if the linker thinks that parts
of the archive are unused.
Another way I use to force link the module is to put a C function in the file:
void _linkWithNBLogClass(void)
{
NSLog(@"%s", __FUNCTION__);
}
and call it at the start of my application:
_linkWithNBLogClass();
This way, by the console feedback, I'm sure my module is loaded and ready to be used.
The described behavior is intended, and much existing code would break if it were changed.
Before formal protocols existed, there was a need to declare methods without defining them. This was used for optional methods, i.e. for declaring a delegate API. The usual technique was to declare a so-called informal protocol, consisting of a category on NSObject that is never implemented.
But if you do have a category implementation, its completeness is of course checked against the category interface. (Otherwise you get a "Method definition for X is not found" error.) So you do not have a missing method in the category implementation, but a missing category implementation.
I do not think that this is a big deal. You will get a runtime error instead of a compile-time error, and can simply add the category implementation to the target.

What's the principle of LOADDLL.EXE?

It can be used to run an arbitrary Dynamic Link Library on Windows;
how can it possibly know the entry point of an arbitrary DLL?
The answer depends on how much detail you need. Basically, it comes down to this:
A DLL can optionally specify an entry-point function. If present, the system calls the entry-point function whenever a process or thread loads or unloads the DLL.
[...] If you are providing your own entry-point, see the DllMain function. The name DllMain is a placeholder for a user-defined function. You must specify the actual name you use when you build your DLL.
(Taken from the MSDN article Dynamic-Link Library Entry-Point Function.)
So basically, the entry point can be specified inside the DLL, and the operating system's DLL loader knows how to look this up.
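A skeletal entry point, following the DllMain pattern that the MSDN article describes (the bodies here are placeholders):

#include <windows.h>

/* The loader calls this when the DLL is attached to or detached from a
   process or thread; keep the work done here minimal. */
BOOL WINAPI DllMain(HINSTANCE hinstDLL, DWORD fdwReason, LPVOID lpvReserved)
{
    switch (fdwReason)
    {
    case DLL_PROCESS_ATTACH:
        /* one-time per-process initialization */
        break;
    case DLL_THREAD_ATTACH:
    case DLL_THREAD_DETACH:
        break;
    case DLL_PROCESS_DETACH:
        /* per-process cleanup */
        break;
    }
    return TRUE;   /* returning FALSE from DLL_PROCESS_ATTACH fails the load */
}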
The IMAGE_OPTIONAL_HEADER (part of the Portable Executable header on Windows) contains an AddressOfEntryPoint field, an RVA that anything looking for the entry point to call (e.g., the loader) uses to find it.
More information on the IMAGE_OPTIONAL_HEADER can be found here. And this paper is good for just general PE knowledge.
What do you mean by "run a DLL"? DLLs aren't normal programs; they are just collections of functions. The entry point itself usually doesn't do much apart from initializing things required by the other functions in the DLL. The entry point is called automatically when the DLL is loaded (you can use LoadLibrary to do this).
If you want to call a specific function after loading the DLL, you can use GetProcAddress to get a pointer to the function you want.
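For example (the DLL name "example.dll" and the exported function "SomeFunction" below are placeholders):

#include <windows.h>
#include <stdio.h>

typedef int (WINAPI *SomeFuncPtr)(int);

int main(void)
{
    /* LoadLibrary maps the DLL and runs its entry point (DllMain), if any */
    HMODULE lib = LoadLibraryA("example.dll");
    if (lib == NULL) {
        printf("LoadLibrary failed: %lu\n", GetLastError());
        return 1;
    }

    /* Look up a specific exported function by name and call it */
    SomeFuncPtr fn = (SomeFuncPtr)GetProcAddress(lib, "SomeFunction");
    if (fn != NULL) {
        printf("result: %d\n", fn(42));
    }

    FreeLibrary(lib);
    return 0;
}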

How to update a C++ dll without needing to relink the exe with the lib file?

First off, I'm referring to a Windows environment and the VC++ compiler.
What I want to be able to do is rebuild a VC++ DLL and maintain compatibility with an EXE that has already been linked against the import library, without rebuilding the EXE or loading the DLL dynamically using LoadLibrary. In other words, is there a way to add classes and methods to a DLL (but not remove any) and ensure the existing entry points remain the same?
If you export the functions using a DEF file and manually specify the ordinals, you should be able to accomplish this.
Reference
http://msdn.microsoft.com/en-us/library/d91k01sh(VS.80).aspx
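A minimal sketch of that technique; the library and function names are invented for illustration:

/* engine.c -- functions exported from a hypothetical engine.dll */
int EngineInit(void)  { return 0; }
int EngineRun(void)   { return 0; }
int EngineExtra(void) { return 0; }   /* added in a later release */

/* engine.def -- exports pinned to explicit ordinals:

   LIBRARY engine
   EXPORTS
       EngineInit   @1
       EngineRun    @2
       EngineExtra  @3   ; new exports always get new, higher ordinals

   Because EngineInit and EngineRun keep ordinals 1 and 2, an EXE that was
   linked against the original import library keeps resolving them without
   being relinked; only new functionality uses the new ordinals. */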
It depends on how your EXE uses the classes from the DLL. Adding new classes should not affect existing entry points. Aside from that, however, any of the following will affect object size and/or layout, and as such will be a client-breaking change (note that this is technically VC-specific, but most of these apply to any sane implementation):
Removing fields (even private) from classes
Adding new fields (even private) to classes
Adding new base classes to existing classes
Removing base classes from existing classes
Adding a new virtual method before an existing virtual method (adding new virtual methods after existing ones is okay, except for the case described in the next point)
Adding a new virtual method to a class that is used as a base class by another class in the same DLL that also has virtual methods
Changing type of existing fields
Changing signature of existing methods
Making a virtual method non-virtual, and vice versa
As long as you don't add any exported symbols, the ordinals won't change. If you add exported symbols through the standard dllexport mechanism, then that's going to be difficult to control. If you use the old style .xpf symbol file you might be able to control the ordering of the symbols in the lib (although I don't know this for sure - it might still reorder them however it likes), but it's tricky to do C++ symbols this way.
I think that ordinals are rarely used to resolve DLL imports anymore - I think that you have to use .def files to get the linker to use them. So as long as you don't change names or signatures of the exported functions, the .exe should work just fine.