How to obtain a list of installed VS Code extensions using code - vscode-extensions

I want to get the list of installed extensions for VS Code in code.
Not from the CLI; I want it in code so I can write it to the console for diagnostic purposes in the middle of a unit test that's behaving as if things aren't installed.
I already know how to get a list from the CLI, as detailed in "How to show the extensions installed in Visual Studio Code?".
Probably there's some command I can use with executeCommand, but I can't find it.

const extensions = vscode.extensions.all; // returns an array
will give you all installed extensions. Note that this includes the built-in extensions, such as vscode.xml and the other pre-installed language extensions, not just the extensions you may have installed manually.
You could filter those out by their id if you wanted, for example to remove the ones starting with vscode.:
let extensions = vscode.extensions.all;
extensions = extensions.filter(extension => !extension.id.startsWith('vscode.'));
That'll get rid of ~80 of the built-ins, but there are more; there are also a few starting with 'ms-code' that you might not be interested in.
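For the diagnostic use case in the question, a minimal sketch (the function name is illustrative) that logs the remaining extension ids and their versions to the console could look like this:

// Log non-built-in extensions for diagnostics, e.g. from inside a test.
const vscode = require('vscode');

function logInstalledExtensions() {
    const installed = vscode.extensions.all
        .filter(extension => !extension.id.startsWith('vscode.'));
    for (const extension of installed) {
        // packageJSON is the extension's manifest, which includes its version.
        console.log(`${extension.id}@${extension.packageJSON.version}`);
    }
}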

Related

Install JPEG 2000 on Windows 10

I want to investigate a new application for JPEG 2000 encoding and decoding. I downloaded openjpeg-master and managed to cobble together the ability to cmake the files. After a bunch of grinding, this resulted in the following output:
"Build files have been written to: C: openjpeg-master/build
\build> "
Any "normal" Unix installations have a multi-step installation like this:
"UNIX/LINUX - MacOS (terminal) - WINDOWS (cygwin, MinGW)
To build the library, type from source tree directory:
mkdir build
cd build
cmake .. -DCMAKE_BUILD_TYPE=Release
make
Binaries are then located in the 'bin' directory.
To install the library, type with root privileges:
make install
make clean
To build the html documentation, you need doxygen to be installed on your system. It will create an "html" directory in TOP_LEVEL/build/doc.
make doc"
But the Windows 10 equivalent is unclear, to put the most charitable spin on it. You can find it here: "https://github.com/uclouvain/openjpeg/blob/master/INSTALL.md"
Some questions arise:
Is there a better starting place for installing JPEG 2000 that actually shows me how to install it and run the tests?
If not, how do I get from the build files to installing the libraries and making the test programs?
Is there more information I can dig out that would help to answer these questions?
Since I'm allergic to Visual Studio, I had overlooked a nice tutorial showing how to install something as complex as openjpeg directly from a GitHub clone. In desperation, however, I found it and it worked. I used Visual Studio Community 2019, Version 16.8.3. I needed only to use -DTHIRDPARTY to get the third-party libraries installed. There is a drop-down menu to build and install OPENJPEG. All I need to do now is figure out how to compile and run the utilities that invoke the installed libraries ...
Actually, the complete option to add was -DBUILD_THIRDPARTY:bool=true.
Somewhere in my frantic random search for a way forward, I remember seeing the suggestion that to make the tests work I merely need to find files like *.vsproj and run them as separate VS solutions. Some random guesswork with the .vdproj files in src/bin/... hasn't produced anything good. Is there a document somewhere showing how to run the tests?
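For reference, here is a rough command-line sketch of the Windows equivalent of the Unix steps quoted in the question, assuming CMake picks up the installed Visual Studio as its generator. Only -DBUILD_THIRDPARTY comes from the answer above; everything else is a generic CMake workflow, not a verified openjpeg recipe:

REM from the openjpeg source tree
mkdir build
cd build
cmake .. -DBUILD_THIRDPARTY:bool=true
REM Visual Studio generators are multi-config, so pass the configuration explicitly
cmake --build . --config Release
REM installing may need an elevated prompt if the prefix is under Program Files
cmake --build . --config Release --target INSTALL
REM run whatever tests were registered with CTest (the suite may need extra configuration and test data)
ctest -C Release

The same build and INSTALL targets also appear in the drop-down menu inside Visual Studio after opening the generated solution, which matches what the answer describes.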

Enable rendering of the AsciiDoc operation macro in IntelliJ IDEA

Documenting a REST API using spring-restdocs in IntelliJ IDEA is fine, but I am missing the rendered "include" snippets of the operation macro.
== Get Comments sorted
To get comments sorted according to a single attribute and with no specific ordering (ascending is the default), you can refer to this example:
operation::comments/getSortedDescending[snippets='http-request,path-parameters,http-response']
I would expect that I can somehow enable the spring-restdocs-asciidoctor artifact to be used when rendering things in IntelliJ IDEA with the AsciiDoc plugin.
Associated issue: https://github.com/asciidoctor/asciidoctor-intellij-plugin/issues/310
I've had a look at spring-restdocs-asciidoctor. The operation::[] macro is an extension that renders the content. It relies on an attribute, snippets, that needs to be set.
The IntelliJ plugin for AsciiDoc supports both Ruby extensions and attributes for previews as experimental options.
To make it work I did the following:
check out the project
run gradlew asciidoctor to generate the snippets
add a .asciidoctorconfig file to set the path to the generated snippets (see the sketch at the end of this answer)
add a .asciidoctor directory and place the extension in this directory
confirm the warning message "This project contains Asciidoctor Extensions..." in the IDE
You need to confirm the warning message every time you restart your IDE. Since this runs Ruby code locally, it is a security concern. Maybe we'll enhance it in the future so that you only need to reconfirm once the extension's code changes.
The changes are on the following branch: https://github.com/ahus1/spring-restdocs/tree/poc_extension_intellij
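As a sketch of the .asciidoctorconfig step above: the file sits next to the .adoc sources and only sets attributes. The snippets path below is an assumption and depends on your build tool (build/generated-snippets for Gradle, target/generated-snippets for Maven), and {asciidoctorconfigdir} is the plugin-provided attribute pointing at the directory containing the config file.

// .asciidoctorconfig (path is an assumption; adjust to where your snippets are generated)
:snippets: {asciidoctorconfigdir}/build/generated-snippets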

HaxeDevelop: cross-platform compilation via default project templates

Trying to investigate how to create a "Hello world" in different languages via HaxeDevelop. I'm a newbie and may be inaccurate with terminology.
1) C# project. Pressing F8 gives me an error:
haxe -cp src -cs D:/Programs/Projects/CsTestHaxe/bin/ -main Main
Unix.Unix_error(8, "mkdir", "D:/Programs/Projects/CsTestHaxe/bin/")
Build halted with errors (haxe.exe).
Via googling through pretty much outdated info, I at least found a solution:
haxe -main Main -cs out
And it works, but the output goes to the "src" location, which is bad. Further googling led me to "Custom build" and using an .hxml file with a pre-build command in the project settings.
But why doesn't the default template/settings work for something as simple as "Hello world" (I used cs.system.Console)?
How can the default build be fixed? Or have I perhaps installed or set up something wrong during the HaxeDevelop installation?
2) C++ project. Pressing F8 gives me an error:
Warning: Could not find environment variables for Visual Studio
Missing HXCPP_VARS
Error: Could not automatically setup MSVC
Error: Build failed
Build halted with errors (haxe.exe).
Using the command line (similar to C# above) I can execute the C++ sources, but I can't compile them.
I installed Visual Studio Community 2017. Nothing changed, same error. VS offers various optional components during installation. Should I install any specific ones?
I also found many threads about an OpenFL workaround for C++ compilation. But I don't need OpenFL and want to use the default Haxe API and tools.
Also, OpenFL and C++ are always mentioned together with Lime. Do I need it too? I installed Lime via the command line, but nothing seems to have changed.
3) Am I right that HaxeDevelop does not yet support HashLink?
And, if possible, a couple of words about why HashLink appeared when there is already Neko affiliated with Haxe?
As a result, here is an additional question: is it right that during compilation to a target platform Haxe only "converts" the .hx sources to the target language and then compiles using third-party (target platform) tools?
1) C# project. Pressing F8 gives me an error.
This appears to be a known Haxe issue. Since it's been fixed on the dev branch, you could try a nightly build from build.haxe.org. Alternatively, you could also try manually creating the bin directory, since that seems to be what the error is about.
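As a sketch of that manual workaround, run from the project root (this assumes the default template layout from the question, with sources in src and haxe on the PATH):

REM create the output directory the compiler failed to create, then build as the IDE does
mkdir bin
haxe -cp src -cs bin -main Main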
2) C++ project. Pressing F8 gives me an error:
The latest Haxelib release of hxcpp (3.4.64) does not support Visual Studio 2017 yet. You could use a development version by installing hxcpp from GitHub, since again, it should be fixed there:
haxelib git hxcpp https://github.com/HaxeFoundation/hxcpp
The alternative is to downgrade Visual Studio.
Also, OpenFL and C++ are always mentioned together with Lime. Do I need it too?
Yes, if you want to use OpenFL, you also need Lime, as OpenFL depends on it.
3) Am I right that HaxeDevelop does not yet support HashLink?
Actually, a HashLink project template was added. But to follow the general theme of this answer, it seems it hasn't made it into an official release yet. You can get a nightly build from here.
And, if possible, a couple of words about why HashLink appeared when there is already Neko affiliated with Haxe?
There is a two-part blog series on haxe.org by HashLink's author: part 1, part 2. The first part has a paragraph talking about this exact topic. Here's an excerpt:
First, let me explain the reasons for writing another virtual machine in replacement of Neko.
[...]
Back then, the Neko virtual machine was not especially designed to run Haxe and suffered from some limitations, the main one being performance.
[...]
And to your final question:
is it right that during compilation to a target platform Haxe only "converts" the .hx sources to the target language and then compiles using third-party (target platform) tools?
That is true for some targets, but it depends. For C++, C# and Java, Haxe indeed generates source code for the target language and then invokes the target-native compiler after doing its own compilation (this step is usually called "native compilation").
However, some targets produce bytecode directly (SWF and Neko), so there is no native compilation step there. Other target languages are interpreted (JS, PHP, Python and Lua), so there's no native compilation step there either. For HL it actually depends: there is HL/JIT (bytecode) and HL/C, which is compiled to native C code.
You can find a comprehensive list of Haxe targets and their characteristics here.
Phew, that was a lot of questions in one. ;)

TFS Build ignores configured Code Analysis ruleset

I have a solution that is using a hybrid .csproj and project.json combination (for NuGet management purposes). So basically the "project.json" file is working as a "packages.config" file with floating-version capability.
This solution uses a custom ruleset that is distributed via a package and imported automatically. On the dev machine it works without a problem.
On the build machine (that is, inside the machine itself, working as a user) the solution also compiles without a problem.
However, when a vNext build (is this the name for the new build system?) is queued, it completely ignores the custom ruleset and just uses the StyleCop one (which is also included), which gives a bunch of warnings. Those warnings should not appear, as the custom ruleset basically suppresses them (e.g. Warning SA1404: Code analysis suppression must have justification, Warning SA1124: Do not use regions, etc.).
As far as I have checked, there is no setting to specify the ruleset, and this works with XAML Builds. What is different in this new build system that is causing this? Is there a way to force/specify the Code Analysis Rule Set from the definition?
Thanks in advance for any help or advice on the matter.
Update/Edit
After debugging back and forth with the wonderful help of jessehouwing, I must add the following detail to my initial report (which I left out because I did not know it was relevant):
I am using SonarQube analysis in my build definition.
I initially did not mention it because I did not know that it replaces the code analysis ruleset at build time (and not only when it "analyzes", as I thought).
If you are using the SonarQube tasks
The SonarQube tasks generate a new Code Analysis Ruleset file on the fly and will overwrite the one configured for the projects. These rulesets will be used regardless of what you've previously specified.
There is a trick to the naming of the rulesets through which you can include your own overrides.
More information on the structure can be found in the blog post from the SonarQube/Visual Studio team. Basically, when you bind your solution to SonarQube, it will generate two ruleset files: one that will be overwritten during the build, and another containing your customizations.
There is a toolkit/SDK to generate a SonarQube plugin for custom analyzers, which allows you to import your rules into SonarQube so it knows which rules to activate for your project(s).
If you're not using SonarQube
Yes, you can specify the ruleset you want to use and force Code Analysis to run. It requires a couple of MSBuild arguments:
/p:RunCodeAnalysis=true /p:CodeAnalysisRuleset="PathToRuleset"
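For example, when invoking MSBuild directly (the solution name here is illustrative); in a vNext build definition the same switches can go into the MSBuild Arguments field of the Visual Studio Build step:

msbuild MySolution.sln /p:RunCodeAnalysis=true /p:CodeAnalysisRuleset="PathToRuleset"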
Or you can use my MsBuild helper extension to configure these settings through a UI template.

pydev: undefined variable error when importing compiled modules

I want to switch my Python IDE from IDLE to PyDev (Eclipse). I am using a couple of modules that I only have as compiled bytecode (*.pyc). In IDLE that was never a problem, and it even offers code completion for those compiled modules. But PyDev gives me a lot of "undefined variable" errors, even though the code runs correctly.
Is there a way PyDev can handle bytecode modules the way IDLE does, perhaps without decompiling the files?
Try adding the modules as forced builtins.
To do that, go into Settings → PyDev → Interpreter - (Python/Jython/IronPython as appropriate), select the interpreter you're using, and add the module to the list on the Forced Builtins tab (look here for more details).
(Note that you may or may not have to add multiple entries for subpackages and modules; for example, to get Fabric working properly one needs to add both fabric and fabric.api.)
That makes PyDev load those modules into an interpreter to get code-completion and error checking data, rather than just analysing source code.
I've not tried it for .pyc files, but it works for other things, like importing something that's generated dynamically by a package's __init__.py (e.g. fabric), so it might work for you.
(see also this FAQ and that one on the PyDev site)