Bundling versus debugging in a Visual Studio Code extension - vscode-extensions

I have written an extension for VS Code and now am creating a package for it.
It is then recommended to bundle the files, which I do with esbuild. Packaging works, but it leaves me with a dilemma. In package.json, I can either write
"main": "./out/main.js",
which lets VS Code use the bundled code; this results in a usable package but I cannot debug the code.
Or I can write
"main": "./out/extension.js",
which directs VS Code to the original code; then I can debug but I do not generate a usable package.
Surely I must misunderstand something, but what is it?

Update yo code and generate a new extension with it. The generated sample is already configured to bundle the files, so you don't need to do anything yourself.
But if you want to use esbuild for bundling, you are probably on your own. Moving to esbuild was mentioned last year, but there has been no progress since: https://github.com/microsoft/vscode/issues/115023#issuecomment-771692495
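For reference, refreshing the generator and scaffolding a new sample is only a couple of commands (a minimal sketch; package names as published on npm):
npm install -g yo generator-code   # install or update Yeoman and the VS Code extension generator
yo code                            # scaffold a new extension from the current sample templates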

If you run "npm run esbuild" in a PowerShell terminal, it runs the "esbuild" script (which I copied into package.json from https://code.visualstudio.com/api/working-with-extensions/bundling-extension ), which uses --sourcemap, and that seems to make debugging possible. (You can set breakpoints in extension.ts even though it's running main.js.)
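For reference, the scripts from that page look roughly like this (a sketch; check the linked page for the exact, current version):
"scripts": {
  "vscode:prepublish": "npm run esbuild-base -- --minify",
  "esbuild-base": "esbuild ./src/extension.ts --bundle --outfile=out/main.js --external:vscode --format=cjs --platform=node",
  "esbuild": "npm run esbuild-base -- --sourcemap",
  "esbuild-watch": "npm run esbuild-base -- --sourcemap --watch"
}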
Then, you can replace the "watch" script names in tasks.json with "esbuild-watch", and it will automatically run esbuild when you make changes.
I haven't tested this thoroughly (I'm about as lost as you are with this stuff) but I thought this might help anyway.

Related

VB.net .exe cannot be run from another computer. Missing assemblies for ClosedXML

I must put this program into production today, and I can't get it to run independently.
In the program, I have installed the ClosedXML NuGet package (referenced with "Imports ClosedXML.Excel") and use it to create spreadsheets. When I build my executable and try to run it from another computer, it cannot find the ClosedXML and DocumentFormat.OpenXml assemblies.
I checked in References that "Copy Local" was set to True for ClosedXML and DocumentFormat.OpenXml, but it's not working. I found another website that mentioned the Global Assembly Cache, and that if a dependency is in there, it will not be copied alongside the built .exe.
I am running Visual Studio Professional 2017. I am in over my head on this one, so if you have answer (and I hope you do), please try to provide it in elementary terms I can understand.
Sometimes the issue is solved by individually adding the application files in the following menu:
Go to Publish --> Application Files
Select "Show all files"
Under Publish Status, set the files you need to Include [not Include (Auto)]
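Alternatively, if the "Copy Local" setting from References isn't sticking, it can be forced directly in the project file; a minimal sketch (reference names copied from the question, version/hint attributes omitted):
<Reference Include="ClosedXML">
  <Private>True</Private> <!-- Private is the MSBuild name for "Copy Local" -->
</Reference>
<Reference Include="DocumentFormat.OpenXml">
  <Private>True</Private>
</Reference>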

How can I disable TrackFileAccess via CMake?

My company uses CMake to manage their code. Some of my colleagues are on Linux, and I'm on Windows, using Visual Studio. Our code is organised into a number of libraries, which translates into a number of Visual Studio projects under one solution.
To speed up compilation, I'm trying to integrate clcache with my setup. To do this, I need to disable TrackFileAccess for every project in the solution as noted here.
So, to my understanding, I have to modify the CMake files to either inject some XML into each library's .vcproj file or to modify the parameters passed to msbuild.exe itself. I'm having a lot of trouble figuring out how to do either of these things.
To try invoking msbuild.exe with specific command line parameters, I found the variable CMAKE_MAKE_PROGRAM. I tried using it with SET(CMAKE_MAKE_PROGRAM "${CMAKE_MAKE_PROGRAM} /p:TrackFileAccess=false" CACHE INTERNAL ""), but I can see from Process Explorer that msbuild.exe was not getting invoked with that argument.
I couldn't work out how I'd go about injecting XML into the .vcproj files, or if it can even be done with CMake. Is there actually a way to do it? Or would I instead need to perhaps write a script to run after CMake runs, to edit its output?
While we're at it, do I really need to edit every single .vcproj file, or could I perhaps edit something that each .vcproj will inherit?
Aha!
I did more digging, and I think I'm barking up the wrong tree with CMake. It turns out, I could edit C:\Users\me\AppData\Local\Microsoft\MSBuild\v4.0\Microsoft.Cpp.x64.user.props and add in
<PropertyGroup Label="Globals">
<TrackFileAccess>false</TrackFileAccess>
</PropertyGroup>
and it works!
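For completeness, if you would rather keep this in CMake than in a per-user props file, CMake can write arbitrary globals into each generated project via the VS_GLOBAL_<name> target property; a minimal sketch (target names are placeholders):
# set the MSBuild global on each library/executable target the project defines
foreach(tgt mylib1 mylib2 myapp)
  set_target_properties(${tgt} PROPERTIES VS_GLOBAL_TrackFileAccess "false")
endforeach()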

What exactly does "Clean and Build" do?

Maybe this seems like a naive question.
I have some functionality in my project that suddenly doesn't work, and when I "Clean and Build", it works again.
What might be the problem that is being solved by performing this Clean and Build?
Your input is highly appreciated.
The actions for make and make clean are determined by the makefile; you can look at the makefile itself to see what is there.
make by itself corresponds to the very first rule in the makefile.
make clean corresponds to the clean rule in the makefile.
Also, you can use
make -n
and
make -n clean
These will show you the sequence of command(s) that are actually executed when make or make clean are run.
As a general convention, make by itself is the rule for compiling, so it executes the compile commands necessary to build the project.
make clean is used for cleaning the project. It deletes the object files, the executable, and any other generated files, so that the next make produces a clean build of the project.
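A minimal makefile following that convention might look like this (a sketch, not your project's actual makefile):
# `make` runs the first rule (all); `make clean` removes generated files
all: app

app: main.o util.o
	$(CC) -o app main.o util.o

clean:
	rm -f app *.o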
Why does doing make clean followed by make help?
It can be due to a number of reasons, and it's hard for me to tell you what may be going on. Can you give some more details on the failure mode?
With the help of "Clean and Build" you remove the old build files and generate new ones, so that the updates are effectively shown in the views.

MSBuild overwriting dependencies

Ok, so I've got a somewhat complicated problem with my build environment that I'm trying to deal with.
I have a solution file that contains multiple C# projects which is built by a NAnt script calling MSBuild - passing MSBuild the name of the solution file and a path to copy the binaries to. This is because I want my automated build environment (CruiseControl.Net) to create a folder named after the revision of each build - this way I can easily go back to previous binaries for any reason.
So ideally I have a folder layout like this
c:\build\nightly\rev1
c:\build\nightly\rev2
c:\build\nightly\rev3
...
c:\build\nightly\rev10
etc.
The problem that's arisen is I recently added the latest version of the Unity IoC container to my project, checking it directly out of MS's online SVN repository. What's happening is I have a Silverlight 3 project that references the Silverlight version of Unity but I also have other projects (namely my Unit testing project) that reference the standard (non-Silverlight) version of Unity.
So what happens is since MSBuild is dumping everything into one single folder the Silverlight version of the Unity assembly is overwriting the non-Silverlight version because they have the exact same assembly file name.
Then when CruiseControl runs my unit tests, they fail because they don't have the proper dependencies available anymore (they try to load the Silverlight-specific Unity assembly, which obviously doesn't work).
So what I want to do is:
keep my desired output directory structure (folder\revision)
I don't want to have to manually edit every single proj file I have, as this is error prone when adding new projects to the solution
Ideally I would like MSBuild to put everything into a folder structure similar to this:
nightly\revision1\project1
nightly\revision1\project2
nightly\revision1\project3
...
nightly\revision2\project1
nightly\revision2\project2
nightly\revision2\project3
etc
I can't modify the Unity project to give it a different file name because it comes from another SVN repository that I cannot commit changes to. I found a similar question posted here; the suggested solution was to use a "master" MSBuild file with a custom task that extracts all the project file names out of the solution and then loops over each one, building them. I tried that, but it doesn't build them in the order of their dependencies, so it fails for my project.
Help?
Firstly I would always have the build server delete the old working copy and check out a fresh copy to avoid any problems with stale artifacts from the previous build.
Next I would have nant or msbuild build the solutions as before with the artifacts from each build going to their local working output folders.
After that I'd move the artifacts from their working paths to their output paths, this shouldn't require digging through the project files since you can just tell msbuild/nant to copy working\project1\bin\release\**\*.* to artifacts\project1\.
The script that does this should ideally be stored along with the source with the main file, e.g. build.nant or build.proj in top level of the trunk.
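For example, the copy step could be a small MSBuild target along these lines (a sketch; the Revision property and the paths are placeholders for whatever CC.Net passes in):
<Target Name="CollectArtifacts">
  <ItemGroup>
    <Project1Bin Include="working\project1\bin\Release\**\*.*" />
  </ItemGroup>
  <!-- %(RecursiveDir) preserves the sub-folder structure under the destination -->
  <Copy SourceFiles="@(Project1Bin)"
        DestinationFolder="c:\build\nightly\rev$(Revision)\project1\%(RecursiveDir)" />
</Target>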
For third-party libraries I would simply include the DLLs directory in your repository. Nothing is worse than writing some code and having a third-party dependency break your build because of changes on their end.
Simply document the versions of the libraries you are using, and if you must update them, you'll have a better sense of what breaks the build before you even check it in.
Also, doesn't CC.Net automatically handle providing releases based on revision? I'm using TeamCity, and it keeps a copy of the artifacts of every build.
I highly recommend reading JP Boodhoo's Automating Builds with NAnt blog series. That's been my starting point and have made lots of changes for my own taste. I also highly recommend checking out the builds of many open sources projects for examples. I've learned a lot from the builds of the Castle/Nhibernate/Rhino-Tools stack.

Pre-Pre-build Steps in Hudson

I'm in a bit of a pickle. I'm trying to run some environment scripts before I run the build in an m2 project, but it seems that no matter how hard I try, the 'pre' build scripts are never run early enough.
Before the 'pre-build' scripts are run, the project checks to see if the correct files are in the workspace - files that won't be there until the scripts I've written are executed.
To make them 'pre-build', I'm using the M2 Extra Steps plugin - but it's not 'pre' enough.
Has anyone got any suggestions as to how I can carry out what I want to do?
Cheers.
Have you considered breaking it up into two projects, and setting the pre-build project to be upstream of the build project?
e.g.,
Foo Pre-build
Foo Build
After Foo Pre-build runs, cause "Foo Build" to run.
I have used this, admittedly in different scenarios than yours, quite successfully. This has the added benefit (if you need it) of allowing you to manually run a build without going through the pre-build steps, if you know they aren't necessary.
You should use the free form project type and not the maven project type.
If this is a problem (i.e., there are projects that expect to be triggered by it, or that it is expected to trigger), consider using a custom workspace location and having a free form project execute in this workspace before the maven project runs. The free form project can be used as the trigger for the maven project.
Does adding another build step as a shell script work?
My problem stemmed from the fact I wanted to set-up my workspace before I ran anything due to an issue with Dynamic Views (ClearCase) not being accessible from the workspace - I wanted to add a symlink to fix this.
However, Andrew Bayer has made a change to the plugin that I'm currently testing that should fix this...so the question is probably invalid in its current form.
Will edit it once we come to a conclusion.