Is it possible to load a specific package at runtime?
I want to have a kind of plugin system where each plugin has the same functions as the others but different behaviour, and to load one or another depending on the configuration file.
No, Go doesn't support dynamically loaded libraries.
Your best bet is to start the plugin as its own executable and communicate with it through sockets or via stdin/stdout.
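For illustration, here is a rough sketch of that approach, assuming a hypothetical plugin binary ./plugin-a that reads one request line on stdin and writes one reply line on stdout (the binary name and the line-based protocol are made up):

package main

import (
    "bufio"
    "fmt"
    "log"
    "os/exec"
)

func main() {
    // Start the plugin chosen from the configuration file as a child process.
    cmd := exec.Command("./plugin-a")
    stdin, err := cmd.StdinPipe()
    if err != nil {
        log.Fatal(err)
    }
    stdout, err := cmd.StdoutPipe()
    if err != nil {
        log.Fatal(err)
    }
    if err := cmd.Start(); err != nil {
        log.Fatal(err)
    }

    // Send a request line and read a single reply line.
    fmt.Fprintln(stdin, "compute 42")
    reply, err := bufio.NewReader(stdout).ReadString('\n')
    if err != nil {
        log.Fatal(err)
    }
    fmt.Print(reply)

    stdin.Close()
    cmd.Wait()
}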
2017 update
This answer is no longer true: Go now supports plugins (on Linux and macOS only, as of June 2021).
There is support for this as of Go 1.8:
https://golang.org/pkg/plugin/
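A minimal sketch of how the plugin package is used (Linux/macOS only); the file name greeter.so and the exported Greet function are assumptions for the example:

// greeter.go, compiled separately with:
//   go build -buildmode=plugin -o greeter.so greeter.go
// package main
// func Greet() string { return "hello from the plugin" }

// Host program: loads the shared object at runtime and calls the symbol.
package main

import (
    "fmt"
    "log"
    "plugin"
)

func main() {
    p, err := plugin.Open("greeter.so") // the path could come from a config file
    if err != nil {
        log.Fatal(err)
    }
    sym, err := p.Lookup("Greet") // look up the exported symbol by name
    if err != nil {
        log.Fatal(err)
    }
    greet, ok := sym.(func() string) // assert the expected signature
    if !ok {
        log.Fatal("Greet has an unexpected type")
    }
    fmt.Println(greet())
}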
You might consider executing the 'plugin' packages at runtime by writing out a new program (say, to a temporary directory) and executing it via exec.Command, something along the lines of exec.Command("go", "run", files…).Run().
You’ll see some similar code here.
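Here is a rough, self-contained sketch of that idea; the generated source is hard-coded for illustration, whereas in practice it would be produced from the configuration file:

package main

import (
    "log"
    "os"
    "os/exec"
    "path/filepath"
)

func main() {
    // Hypothetical generated program; in a real plugin system this source
    // would be assembled from the configuration file.
    src := []byte(`package main

import "fmt"

func main() { fmt.Println("hello from generated code") }
`)

    dir, err := os.MkdirTemp("", "plugin")
    if err != nil {
        log.Fatal(err)
    }
    defer os.RemoveAll(dir)

    file := filepath.Join(dir, "main.go")
    if err := os.WriteFile(file, src, 0o644); err != nil {
        log.Fatal(err)
    }

    // Compile and run the generated program in one step.
    cmd := exec.Command("go", "run", file)
    cmd.Stdout = os.Stdout
    cmd.Stderr = os.Stderr
    if err := cmd.Run(); err != nil {
        log.Fatal(err)
    }
}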
Just do this: create a code generator that reads the configuration, generates a basic Go file with the packages loaded in the right order, and then executes it. Compiled languages generally don't provide dynamic loading; even Dart suffers here in a way. Simply read your configuration file, then create a temporary file with the necessary code to load everything up and communicate over sockets or HTTP.
I think what you are looking for is the special function init.
If you add a
func init() {
}
inside a package, it will run the first time the package is imported.
This happens only within the same binary. As others have already said, Go does not support dynamically loaded libraries.
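A small self-contained sketch of that pattern: each init function registers a named behaviour in a map, and the configuration then decides which one runs (the names "a" and "b" are made up for the example):

package main

import "fmt"

// handlers maps a plugin name to its behaviour.
var handlers = map[string]func(){}

// register is called from init functions to make each "plugin" available by name.
func register(name string, fn func()) { handlers[name] = fn }

// In a real program each of these init functions would live in its own
// package, and importing that package would register its implementation.
func init() { register("a", func() { fmt.Println("behaviour A") }) }
func init() { register("b", func() { fmt.Println("behaviour B") }) }

func main() {
    // The name would normally come from the configuration file.
    handlers["a"]()
}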
I got a chance to migrate a Flex application to Apache Royale and was able to run hello-world applications. I started migrating the application and am getting a couple of exceptions; below is one of them.
We are using these .swc libraries:
AdobeSpelling.swc
AlivePDF.swc
Cairngorm.swc
flexmdi.swc
FlexUnit.swc
spcairngorm.swc
How can I import these, or are there similar libraries available as Royale-compatible files?
I found the asconfig.json file with external-library-path, but I am compiling my application with a Maven pom.xml.
Please help me with these basic migration steps.
Error Log:
Warning: Definition com.model.ModelLocator could not be found.
import com.model.ModelLocator;
Warning: Definition com.util.customComponents.CustomMenuBarEvent could not be found.
import com.util.customComponents.CustomMenuBarEvent;
There are two paths you can take when migrating.
Emulation Components. However, there is a chance that some of the components haven't been added to the emulation yet, so you may get exceptions; this would be the place where you can add them and make pull requests to Royale. In the best case those components let you build your application successfully without drastically changing the UI part, but you may not see anything on the screen, or it may be messed up, because there hasn't been a volunteer to work on displaying them better.
Another path is to separate your pure ActionScript code (with no dependency on Flash) from the UI part - pure AS3 code should port without any problem - and rewrite the UI from scratch using the Basic module or Jewel.
All the libraries you have mentioned have strong dependencies on Flash, so my recommendation is to find JS replacements for them and use those in your port. There is also PureMVC, which works pretty well with Royale - it has already been tested in several applications.
I installed Perl 6 with rakudobrew and wanted to browse the installed files, only to see a list of hex filenames in ~/.rakudobrew/moar-2018.08/install/share/perl6/site/sources as well as ~/.rakudobrew/moar-2018.08/install/share/perl6/sources/.
E.g.
> ls ~/.rakudobrew/moar-2018.08/install/share/perl6/sources/
09A0291155A88760B69483D7F27D1FBD8A131A35 AAC61C0EC6F88780427830443A057030CAA33846
24DD121B5B4774C04A7084827BFAD92199756E03 C57EBB9F7A3922A4DA48EE8FCF34A4DC55942942
2ACCA56EF5582D3ED623105F00BD76D7449263F7 C712FE6969F786C9380D643DF17E85D06868219E
51E302443A2C8FF185ABC10CA1E5520EFEE885A1 FBA542C3C62C08EB82C1F4D25BE7B4696F41B923
522BE83A1D821D8844E8579B32BA04966BAB7B87 FE7156F9200E802D3DB8FA628CF91AD6B020539B
5DD1D8B49C838828E13504545C427D3D157E56EC
The files contain the source of the packages, but this does not feel very accessible. What is the rationale for that?
In Perl 6, the mechanism for loading modules and caching their compilations is pluggable. Rakudo Perl 6 comes with two main mechanisms for this.
One is a file-system based repository, and it's used with things like -Ilib. This resolves modules simply using paths on disk. Whenever a module is loaded, it first has to check that the module's sources have not changed, in order to re-compile them if so. This is ideal for development, but such checks take time. Furthermore, this doesn't allow for having multiple versions of the same module available and picking the one matching the specification in the use statement. Again, that's ideal for development, when you just want it to use your latest changes, but less so for installing modules from the ecosystem.
The other is an installation repository. Here, specific versions of modules are installed and precompiled. It is expected that all interactions with such a repository will be done through the API or tools using the API (for example, zef locate Some::Module). It's assumed that once a specific version of a module has been installed, it is immutable. Thus, no checks need to be done against the source, and loading can go straight to the compiled version of the module.
Thus, the installation repository is not intended for direct human consumption. The SHA-1s are primarily an implementation convenience; an alternative scheme could have been used in return for a bit more effort (and may well be used in the future). However, the SHA-1s do also create the appearance of something that wasn't intended for direct manipulation - which is indeed the case: editing a source file in there will have no immediate effect, and will probably have confusing effects the next time the compiler is upgraded to a new version.
I have a solution that uses a hybrid .csproj and project.json combination (for NuGet management purposes). So basically the project.json file works as a packages.config file with a floating-version capability.
This solution uses a custom ruleset that is distributed via a package and is imported automatically. On the dev machine it works without a problem.
On the build machine (that is, working inside the machine itself as a user), the solution also compiles without a problem.
However, when a vNext build (is this the name for the new build system?) is queued, it completely ignores the custom ruleset and just uses the StyleCop one (which is also included), which produces a bunch of warnings. Those warnings should not appear, as the custom ruleset suppresses them (e.g. Warning SA1404: Code analysis suppression must have justification, Warning SA1124: Do not use regions, etc.).
As far as I have checked, there is no setting to specify the ruleset, and this works with XAML builds. What is different in this new build system that is causing this? Is there a way to force or specify the Code Analysis ruleset from the build definition?
Thanks in advance for any help or advice on the matter.
Update/Edit
After debugging back and forth with the wonderful help of jessehouwing, I must add the following detail to my initial report (which I left out because I did not know it was relevant):
I am using SonarQube Analysis on my build definition.
I initially did not mention it because I did not know that it replaces Code Analysis at build time (and not only when it "analyzes", as I thought).
If you are using the SonarQube tasks
The SonarQube tasks generate a new Code Analysis Ruleset file on the fly and will overwrite the one configured for the projects. These rulesets will be used regardless of what you've previously specified.
There is a trick to the naming of the rulesets through which you can include your own overrides.
More information on the structure can be found in the blog post from the SonarQube/Visual Studio team. Basically, when you bind your solution to SonarQube, it will generate two ruleset files: one that will be overwritten during the build, and another containing your customizations.
There is a toolkit/SDK for generating a SonarQube plugin for custom analyzers, which allows you to import your rules into SonarQube so it knows which rules to activate for your project(s).
If you're not using SonarQube
Yes, you can specify the ruleset you want to use and force Code Analysis to run. It requires a couple of MSBuild arguments:
/p:RunCodeAnalysis=true /p:CodeAnalysisRuleset="PathToRuleset"
Or you can use my MSBuild helper extension to configure these settings with the help of a UI template.
Does anyone have experience loading JARs dynamically for an XPages application?
We would like to keep some calculation code, which is going to change quite often, in external JAR files and load them dynamically when they are needed. Does anyone know if that's possible with Domino?
You could create a tasklet which you can roll out like an OSGi plugin. This way you can execute the calculations in this tasklet, which you can update independently of your application. That way you only need to update your update site, and all applications that use that code will have the latest version installed.
You can find more info about it here: http://xpag.es/?1926
Another solution would be to put the JAR file on your server in the java/ext/lib directory. Every time a new release is created, you can update that file on the server. A server or HTTP task restart would be necessary, of course.
I want to switch my Python IDE from IDLE to PyDev (Eclipse). I am using a couple of modules which I have only as compiled bytecode (*.pyc). In IDLE that was never a problem, and it even offers code completion for those compiled modules. But PyDev gives me a lot of "undefined variable" errors, even though the code is interpreted correctly.
Is there a way PyDev can handle bytecode modules the way IDLE does? Perhaps without decompiling the files?
Try adding the modules as forced builtins.
To do that, go into Settings → PyDev → Interpreter - (Python/Jython/IronPython as appropriate), select the interpreter you're using, and add the module to the list on the Forced Builtins tab (look here for more details).
(Note that you may or may not have to add multiple entries for subpackages and modules; for example, to get Fabric working properly one needs to add both fabric and fabric.api.)
That makes PyDev load those modules into an interpreter to get code-completion and error checking data, rather than just analysing source code.
I've not tried it for .pyc files, but it works for other things, like importing something that's generated dynamically by a script's __init__.py (i.e. fabric), so it might work for you.
(see also this FAQ and that one on the PyDev site)