Some valid IDL files build as IDL Projects and install, but some or all of their contents do not appear in the SCA Explorer/Target SDR/IDL Repository. I believe the IDL parser used to build the tree in the IDL Repository is different from the one used by the omniORB idl2cpp compiler (omniidl) during the build, and it rejects some valid IDL. The cases I have found all use the value of a previously defined const, e.g.:
const Algorithm ALG_NONE = 0;
const Algorithm ALG_LPC = ALG_NONE + 3;
The second line can occur in the same file or in a file that includes the file containing the first line.
The file containing the first line is accepted if the second line is not in the same file, but the file containing the second line is rejected and none of its contents appear in the IDL Repository tree. The parser appears to reject a const used as a value on the right-hand side of another const declaration. These files are nonetheless valid and are accepted by omniidl, but they cannot be used in RedHawk because they cannot be selected for a component interface.
I am not very familiar with RedHawk IDE sources or Eclipse plugins and so have not been able to find where the syntax for the parser is specified. I see "eclipsecorba" appearing in plugin lists so I assume that RedHawk is using the Eclipse CORBA Plugin (aka ECP) and that its parser is the one being used to build the tree. So I suspect that the parser error is in that package rather than in code added by RedHawk.
Can anyone confirm this and suggest where I might look in the ECP code for this? Should I report this as an ECP bug to the ECP group on SourceForge? I am not sure how active that project is, since the latest version appears to be from 2008.
This seems to be a bug in the IDL editor; the IDL you have is legal. I would recommend reporting it to ECP, but given that project's long inactivity it probably will not be fixed soon. I do know that one of our Remedy IT engineers has created a more modern IDL editor for Eclipse, but due to lack of funding that work is not publicly available.
In Sublime Text, I'm used to accessing function names through the # symbol list. However, when using a project established from vue-templates, all the function names and data attributes in .vue files do not appear in this list.
This makes navigating .vue files tedious. I have installed all vue-related Sublime packages but none of them seem to fix this.
How can I get symbol indexing working properly with Vue files? Or, do you have any experience with other text editors that do this properly?
The symbol list in Sublime (visible via Goto > Goto Symbol... or Goto > Goto Symbol in Project...) is controlled primarily by the syntax definition for the language in question and secondarily by configuration metadata that tells Sublime what parts of the syntax are actually symbols that should be displayed in the symbol list.
In general:
Sublime runs an indexer over all of the files that are currently in your project
The indexer uses the rules in the syntax definition to break up the text into various scopes that describe the purpose of each bit of text (e.g. "This is a string", "this is a method call", etc.)
A preferences file contains rules that indicate what scopes are considered symbols, both for the current file as well as project wide
The two parts of this need to work hand in hand in order for the symbol lists to populate correctly (as Sublime can't guess on its own), and both parts should be provided by the package or packages that are providing Vue support to Sublime.
The best course of action would be to raise an issue with the developers for the Vue package that's providing the Syntax definition. It's possible that the simple inclusion of an appropriate Symbol List.tmPreferences file by the syntax author would be enough to fix the issue.
It's also possible that the symbol list is not fully populated because Sublime is still indexing all of the files in the project, so the data is not available yet.
You can check the status of the indexer in recent builds of Sublime by selecting Help > Indexing Status... from the menu to see if that's the issue. However, unless you have an extremely large set of files, this is likely not the issue.
package com.yada.yada
What happens in IntelliJ when I create a Java class 'ss' in com.yada.yada without a package statement? - a RED "Missing Package Statement" error.
What happens when I create a Kotlin file in com.yada.yada without a package statement? - "Keep going bro, until your DI framework fails to scan your deps at runtime."
Why is IntelliJ's package validation non-mandatory for Kotlin? I just wasted an hour trying to figure out what was wrong with package scanning, only to realise this was the show stopper. Would Java 9's Jigsaw quadruple the chaos for Kotlin sparked by such malformed files/classes with no warning messages? Well, you bet it will!
Please bring back the "warning" statement for Kotlin. P.S. The registration/login methods are not sufficient for me to access the IntelliJ bug tracker (and I am genuinely pissed off with one-time access password resets - 1000 resources and 980 passwords I don't remember or care to), so I am making this public on Stack Overflow.
If anybody is going to defend this behaviour, please explain why. Maybe I am missing something; otherwise please reply with an open bug (preferably somebody from JetBrains) and I will accept it.
The missing inspection warning for files with no package statement is a bug; the corresponding YouTrack issue is here.
I'm not sure this post is appropriate here, but to answer the actual question, the official documentation states that:
If the package is not specified, the contents of such a file belong to
"default" package that has no name.
Since Kotlin files don't have to be in folders that match their packages, not having a package declaration has to be an option: you can keep files organized in folders without putting them in packages, so they end up in the same (default) package as if they weren't nested in any folders and were just in the root of the project.
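To illustrate (a hypothetical file, not taken from the question): the snippet below compiles without complaint, but because there is no package declaration its class lands in the default package rather than in com.yada.yada, which is exactly why a DI framework scanning com.yada.yada at runtime never finds it.

// src/com/yada/yada/Service.kt -- note: no "package com.yada.yada" line.
// The compiler puts Service into the default (unnamed) package, so its fully
// qualified name is just "Service" and a classpath scan of "com.yada.yada"
// never sees it. The fix is to add
//     package com.yada.yada
// as the first statement of the file.
class Service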
I do concede that it's a bit odd there is a warning when your package declaration doesn't match the folder your file is in, but not when you just omit the package declaration altogether. I suppose omitting it entirely is assumed to be intentional.
This shouldn't generally be a problem because IDEs will generate the appropriate package declaration for the folder you've created your file in by default. I'm not sure how you created a file without a package declaration if you're using IntelliJ, unless you did it in the root src folder instead of inside a package folder.
How can one programmatically determine which type libraries (GUID and version) a given native, VB6-generated DLL/OCX depends on?
For background: The VB6 IDE chokes when opening a project where one of the referenced type libraries can't load one of its dependencies, but it's not so helpful as to say which dependency can't be met--or even which reference has the dependency that can't be met. This is a common occurrence at my company, so I'm trying to supplement the VB6 IDE's poor troubleshooting information.
Relevant details/attempts:
I do have the VB source code. That tells me the GUIDs and versions as of a particular revision in the repo, but when analyzing a DLL/OCX/TLB file I don't know which version of the repo (if any--could be from a branch or might never have been committed to a branch) a given DLL/OCX corresponds to.
I've tried using tlbinf32.dll, but it doesn't appear to be able to list imports.
I don't know much about PE, but I popped open one of the DLLs in a PE viewer and it only shows MSVBVM60.dll in the imports section. This appears to be a special quirk of VB6-produced type libraries: they link only to MSVBVM60 but have some sort of delay-loading mechanism for the rest of the dependencies.
Even most of the existing tools I've tried don't give the information--e.g., depends.exe only finds MSVBVM60.dll.
However: OLEView, a utility that used to ship with Visual Studio, somehow produces an IDL file, which includes the importlib directives. Given that VB doesn't use IDL files, it's clearly generating the information somehow. So it's possible--I just have no idea how.
Really, if OLEView didn't do it I'd have given it up by now as impossible. Any thoughts on how to accomplish this?
It turns out that I was conflating basic DLL functionality and COM. (Not all DLLs are COM DLLs.)
For basic DLLs, the Portable Executable format includes a section describing the DLL's imports: data directory entry 1 of the Optional Header points to the import table, whose structure is given by IMAGE_IMPORT_DESCRIPTOR. This is a starting point for learning about that.
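As a rough illustration of what walking that directory looks like (my own sketch, written in Kotlin/JVM, with only minimal validation of the file), the routine below reads a DLL and returns the names of the DLLs it imports; for a VB6-built binary it will typically report only MSVBVM60.DLL, as noted above.

import java.nio.ByteBuffer
import java.nio.ByteOrder
import java.nio.file.Files
import java.nio.file.Paths

// List the DLL names referenced by a PE file's import directory by walking
// its IMAGE_IMPORT_DESCRIPTOR entries. Assumes a well-formed 32- or 64-bit PE.
fun listImports(path: String): List<String> {
    val buf = ByteBuffer.wrap(Files.readAllBytes(Paths.get(path))).order(ByteOrder.LITTLE_ENDIAN)
    val pe = buf.getInt(0x3C)                                // e_lfanew -> "PE\0\0" signature
    val coff = pe + 4                                        // COFF file header
    val numSections = buf.getShort(coff + 2).toInt() and 0xFFFF
    val optSize = buf.getShort(coff + 16).toInt() and 0xFFFF
    val opt = coff + 20                                      // Optional Header
    val pe32Plus = (buf.getShort(opt).toInt() and 0xFFFF) == 0x20B
    val dataDirs = opt + if (pe32Plus) 112 else 96           // data directory array
    val importRva = buf.getInt(dataDirs + 8)                 // entry 1 = import table
    if (importRva == 0) return emptyList()

    val sections = opt + optSize                             // section headers follow the Optional Header
    fun rvaToOffset(rva: Int): Int {
        for (i in 0 until numSections) {
            val s = sections + i * 40
            val va = buf.getInt(s + 12)
            val rawSize = buf.getInt(s + 16)
            val rawPtr = buf.getInt(s + 20)
            if (rva >= va && rva < va + rawSize) return rva - va + rawPtr
        }
        error("RVA 0x${rva.toString(16)} not found in any section")
    }
    fun asciiz(off: Int): String {
        val sb = StringBuilder()
        var i = off
        while (buf.get(i).toInt() != 0) sb.append(buf.get(i++).toInt().toChar())
        return sb.toString()
    }

    val names = mutableListOf<String>()
    var desc = rvaToOffset(importRva)
    while (buf.getInt(desc + 12) != 0) {                     // Name field of IMAGE_IMPORT_DESCRIPTOR
        names.add(asciiz(rvaToOffset(buf.getInt(desc + 12))))
        desc += 20                                           // descriptors are 20 bytes, zero-terminated
    }
    return names
}

This only answers the "which DLLs" half of the problem; the typelib-level dependencies the question is really after are covered in the COM paragraph below.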
COM DLLs don't seem to have an equivalent as such, but you can discover which other COM components a DLL's public interface needs: for each exposed interface, list out the types of its properties and its method arguments, and then use the Registry to look up where those types come from. tlbinf32.dll provides some of the basic functionality for listing members, etc. Here's an intro to that.
I want my plugin (an automated termination analysis tool) to run on code the user selects inside Eclipse. Naturally, the user selects source code (a .java file, a method in the outline, ...). However, my program needs the compiled .class file(s) as input.
How can I get the .class files for selected source items? Related to this, how can I get a bytecode descriptor to the selected source method? In case of generics and varargs transforming a (Eclipse) source descriptor to the corresponding bytecode descriptor seems nontrivial to me.
I do not want to run javac on my own and I do not want to guess how the .class file is named (this is nasty for inner classes) and then try to find it on the disk (if it exists? maybe I can force Eclipse to compile?).
The Bytecode Outline plugin uses the following solution (see JdtUtils.getByteCodePath):
Based on the source element, find the output location, e.g. /home/user/workspace/project/build/
Use the package information to find the right directory inside build/, e.g. /home/user/workspace/project/build/some/package/
Find the "outermost" class definition (important for inner classes), use this name as the file name of the .class file, e.g. /home/user/workspace/project/build/some/package/Foo.class
In case of an inner class, do weird magic (JdtUtils.getClassName) and modify the name of the resulting class file accordingly (maybe resulting in Foo$1.class)
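As a rough illustration of those steps (my own sketch against the JDT API, not the plugin's actual code, and written in Kotlin for brevity even though the plugin itself is Java), the following assumes the project's single default output folder and a named top-level or nested type; the anonymous/local "Foo$1" handling is exactly what it leaves out.

import org.eclipse.core.runtime.IPath
import org.eclipse.jdt.core.IType

// Compute the workspace-relative path of the .class file for a selected IType.
fun classFilePathFor(type: IType): IPath {
    val outputFolder = type.javaProject.outputLocation              // e.g. /project/build
    val packagePath = type.packageFragment.elementName.replace('.', '/')
    // getFullyQualifiedName('$') yields "some.pkg.Foo" or "some.pkg.Foo$Inner"
    val binaryName = type.getFullyQualifiedName('$').substringAfterLast('.')
    return outputFolder.append(packagePath).append("$binaryName.class")
}

// To reach the file on disk, resolve the workspace path, e.g.:
//   ResourcesPlugin.getWorkspace().root.getFile(classFilePathFor(type)).location.toFile()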
So the problem of this question is solved, although the translation of inner classes to the corresponding file names could be improved. According to the author, though, the current approach (using "magic") works for "95% of the cases" and he does not know of any related bugs from the past few years.
I want to remove AssemblyInfo.cpp, because of some metadata errors that sometimes come up.
Is AssemblyInfo.cpp useful for anything? Or can it be removed without any problem?
I've discovered one distinction for this file: it has to do with values reported under calls to Assembly.GetReferencedAssemblies. I was working on tracking version numbers of our binaries from our SVN repository by embedding the revision numbers into them. Initially I too was updating AssemblyInfo.cpp and found nothing reported in the file property details tab for the binary. It seemed this file did nothing for me in terms of updating those details, which was not the case with similar updates to a csproj's AssemblyInfo.cs. Why the difference right?
Now in one such csproj we happen to reference a vcxproj, and that csproj dumps to a log the versions of all its referenced assemblies using the .NET Assembly.GetReferencedAssemblies method. What I discovered was that the number being reported in that log was not the vcxproj's version as given by the VS_VERSIONINFO resource I added (which does get the version details into the file properties details tab). Instead, the number reported matched the one defined in AssemblyInfo.cpp.
So for vcxproj files it looks like VS_VERSIONINFO is capable of updating the contents you find under the file properties details tab, while AssemblyInfo.cpp is capable of exposing the version to GetReferencedAssemblies. In C# these two areas of reporting seem to be unified. Maybe there's a way to direct AssemblyInfo.cpp to propagate into the file details in some fashion, but what I'm going to wind up doing is duplicating the build info to both locations in a prebuild step. Maybe someone can find a better approach.
So far I have never had an AssemblyInfo.cpp in my managed C++ DLLs, so I don't think it is necessary.
(I just added the file to have version information for my C++ DLLs.)
Why not just fix the errors? On that note, what errors are you getting?
This file provides information, such as a version number, that is needed in order to use the assembly you have built.