How do you modify an npm library you are using in your project?

I'm using ng-bootstrap in my Angular project.
The problem is that ng-bootstrap is still in its early stages and missing lots of functionality. I have added a simple feature within the code in my node_modules/@ng-bootstrap directory.
The trouble is that I worry that if/when there is an update to ng-bootstrap and I update my project with it, my local changes in the functionality will be overwritten and lost.
What are some techniques to deal with this problem?

You've effectively just created your own "branch" of that package. You could submit a pull request if the functionality is something that should be there for everyone. Since you have custom changes, you're responsible for making sure updates don't overwrite them.
If I needed to do something like this, I'd see if there was a way to implement the changes without modifying the ng-bootstrap files themselves. Without knowing what the change is, I can't say how that might be accomplished. One option is to not use a package manager for that framework, or to let the package manager fetch the "official" files and then copy them somewhere else that you actually use. You're still responsible for merging changes in when the framework updates, but at least your work won't be automatically overwritten.
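A minimal sketch of that copy-out approach, assuming a vendor/ folder (the folder name is illustrative); npm supports installing from a local path, so the copied package still resolves like a normal dependency:
npm install @ng-bootstrap/ng-bootstrap
cp -r node_modules/@ng-bootstrap/ng-bootstrap vendor/ng-bootstrap
# apply your local changes under vendor/ng-bootstrap, then depend on the copy:
npm install ./vendor/ng-bootstrap
When the upstream package updates, you diff a fresh node_modules copy against vendor/ and merge by hand.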

Related

How to organize code (component) sharing in multiple vue applications WITHOUT a monorepo

We're planning three similar vue projects. We already know that we will be able to reuse a lot of code (especially vue SFCs and simple js helper functions) in all of them and we're looking for a proper way to share the code between them.
Unfortunately the scope of the projects is rather different and a monorepo is not an option due to its limitations in read / write permission and visibility management. Therefore we're planning to handle the reusable parts as separate repos (and most likely private npm packages) which seems to be a straightforward approach. However, the question is: How can we create a convenient setup in which we are able to work on the shared components from within the scope of one of the parent projects?
Project A [project-repo-a]
- project-specific stuff for A
- private package A [package-repo-a] (conveniently editable from within project A)
- private package B [package-repo-b] (conveniently editable from within project A)
Project B [project-repo-b]
- project-specific stuff for B
- private package B [package-repo-b] (conveniently editable from within project B)
- private package C [package-repo-c] (conveniently editable from within project B)
In our PHP projects there is a simple solution: we just require the reusable parts via Composer with the prefer-source option, which provides the full git repository that can be worked on right from within the parent application. However, as far as we understand, there is no prefer-source equivalent in npm or yarn. So how can we achieve the desired setup? (Or are we overlooking a major downside of this setup in general?)
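For reference, the Composer behaviour described above; both forms are standard Composer usage:
composer install --prefer-source
# or persistently, in composer.json:
# "config": { "preferred-install": "source" }
Composer then clones each package's git repository into vendor/ instead of extracting a dist archive.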
We already looked into / considered the following (without finding a suitable approach):
yarn / npm link: We understood that we could use linking in general, but this seems to be a very inconvenient approach while constantly developing the shared components (and always having to publish them to reflect the latest changes).
yarn workspaces / lerna: Seem to be closest to what we want; however, they seem to be (or are explicitly) designed for a monorepo approach. In the end they don't provide a solution for actually getting the git source of a package (in a separate repo) into the parent project (since there is no --prefer-source equivalent) - do they?
using composer additionally: Just pulling the git sources down with Composer and creating yarn workspaces from the Composer vendor folder. However, this is obviously a hacky way and sounds quite error-prone as far as the overall dependency management is concerned.
using a yarn post-install script to pull down the git source of the required private packages. But like the Composer way, this seems to be rather unpredictable in terms of module resolution, dependency management and so on.
using git submodules and yarn workspaces: Could be a solution (a rough sketch of this combination follows right after this list). To be honest, we're completely inexperienced with git submodules, and at first glance it didn't look very intuitive. If there is no other way, we'll consider this approach anyway.
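For what it's worth, a rough sketch of the submodules-plus-workspaces combination, assuming the shared package lives at a hypothetical URL and the parent project keeps shared code under packages/:
git submodule add git@example.com:org/package-repo-b.git packages/package-b
git submodule update --init
Then, in the parent project's root package.json (yarn classic also requires "private": true there):
"workspaces": ["packages/*"]
yarn install then resolves package-b from the checked-out submodule - a full git clone you can commit to and push from independently.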
To be clear about this: we're not asking the matter-of-taste question of whether one or another of those approaches would be "best". We feel like none of them is the right one. The question is: are we overlooking a technically clean and proven approach to our scenario, using npm, yarn or another package manager / dependency management solution?
Git X-Modules is a tool designed to do exactly what you were asking about. Here's a video that explains it. However, it's very new and therefore can't really be considered "proven" :-)
Yet, if you consider trying it, we would love to hear your feedback!
(As you may guess from the previous sentence, I am part of the development team.)
You probably have already figured this out, but have you looked into https://bit.dev/ ?
I'm currently considering it for a task similar to yours, and it looks like it could do the job. Here's an article explaining how to use it: https://blog.bitsrc.io/how-to-easily-share-vue-components-between-applications-1d30a1ad4e4d

Optionally leave old version of component on upgrade

I've been trying to set up a WiX component such that the user can specify that the installer should not upgrade that component on a MajorUpgrade. I had the following code, but with it, if the condition is met the new version is not installed and the old version is also removed.
<Component Id="ExampleComponent" Guid="{GUID here}">
  <Condition>NOT (KEEPOLDFILE="TRUE")</Condition>
  <File Id="ExampleFile" Name="File.txt" KeyPath="yes" Source="File.txt"/>
</Component>
Ideally, if the user specifies "KEEPOLDFILE=TRUE", then the existing version of "File.txt" should be kept. I've looked into using the Permanent attribute, but this doesn't look relevant.
Is this possible to achieve without using CustomActions?
A bit more background information would be useful, however:
If your major upgrade is sequenced early (e.g. afterInstallInitialize), the upgrade is an uninstall followed by a fresh install, so saving the file is a tricky proposition: you'd save it, then do the new install, then restore it.
If the upgrade is late, then file overwrite rules apply during the upgrade, so the file won't be replaced anyway. To force an overwrite you'd need to do something such as make the creation and modified timestamps identical so that Windows Installer will overwrite it with the new one. The solution in this case would be to run a custom action, conditioned on "keep old file", that does the reverse of this:
https://blogs.msdn.microsoft.com/astebner/2013/05/23/updating-the-last-modified-time-to-prevent-windows-installer-from-updating-an-unversioned-file/
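For orientation, this is what late scheduling looks like with the WiX v3 MajorUpgrade element (the message text is illustrative):
<!-- late RemoveExistingProducts: file overwrite rules apply, so a modified unversioned file survives -->
<MajorUpgrade Schedule="afterInstallExecute"
              DowngradeErrorMessage="A newer version of [ProductName] is already installed." />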
And it's also not clear if that file is ALWAYS updated - if in fact it has not been updated, then why bother asking the client whether to keep it?
It might be simpler to ignore the Windows Installer behavior by setting its component id to null, as documented here:
https://msdn.microsoft.com/en-us/library/windows/desktop/aa368007(v=vs.85).aspx
Then you can do what you want with the file. If you've already installed it with a component guid it's too late for this solution.
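A minimal sketch of such a null-component-id file, assuming WiX v3 syntax (an empty Guid tells Windows Installer to copy the file but never track, repair or remove it; identifiers here are illustrative):
<Component Id="UntrackedFile" Guid="">
  <File Id="UntrackedFile" Name="File.txt" Source="File.txt" />
</Component>
Note the warning above: if the file has already shipped under a real component GUID, switching to this is too late.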
There are better solutions that require the app to get involved, where you install a template version of this file. The app makes a copy of it that it always uses. At upgrade time the template file is always replaced, and when the app first runs after the upgrade it asks whether to use the new file (copying it over the one it was using) or continue with the existing file. In my opinion, delegating these issues to the install is not often an optimal solution.
Setting attributes like Permanent is typically not a good idea because they are not project attributes you can turn on and off on a whim - they apply to that component id on the system, and permanent means permanent.
I tried to make this a comment, but it became too long. I prefer option 4 that Phil describes. Data files should not be meddled with by the setup, but managed by your application exe (if there is one) during its launch sequence. I don't know about others, but I feel like a broken record repeating this advice. Still, hear us out...
There is a description of a way to manage your data file's overwriting or preservation here. Essentially you update your exe to be "aware" of how your data file should be managed - whether it should be preserved or overwritten - and you can change this behavior per version of your application exe if you like. The linked thread describes registry keys, but the concept can be used for files as well.
So essentially:
Template: Install your file per-machine as a read-only template
Launch Sequence: Copy it in place with application.exe launch sequence magic
Complex File Revision: Update the logic for file overwrite or preservation for every release, as you see fit, along the lines the linked thread proposes
Your setup will "never know" about your data file, only the template file. It will leave your data file alone in all cases and deal only with the template file.
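As a sketch, the template component might look like this in WiX v3 (file names are illustrative; File/@ReadOnly just marks the installed copy read-only):
<Component Id="SettingsTemplate" Guid="*">
  <File Id="SettingsTemplate" Name="settings.template.xml" Source="settings.template.xml" ReadOnly="yes" KeyPath="yes" />
</Component>
The application then copies settings.template.xml to its real settings file during its launch sequence.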
Liberating your data files from the setup has many advantages:
Setup.exe bugs: No unintended, accidental file overwrites or file-reset problems from a problematic major upgrade, etc. This is a very common problem with MSI.
Setup bugs are hard to reproduce and debug, since the conditions found on the target systems generally cannot be replicated, and debugging involves a lot of unusual technical complexity.
This is not great - it is messy - but here is a list of common MSI problems: How do I avoid common design flaws in my WiX / MSI deployment solution? - a "best effort in the interest of helping" sort of thing. Let's be honest, it is a mess, but maybe it is helpful.
Application.exe bugs: Keep in mind that you can introduce new bugs in your application.exe file, so you can still see errors - obviously. Bad ones too, if you are not careful - but you can easily implement a backup feature as well, one that always runs in a predictable context.
You avoid the complicated sequencing, conditioning and impersonation concerns that make custom actions and modern setups so complicated to do right and make reliable.
Following from that and other technical and practical reasons: it is much easier to debug problems in the application launch sequence than bugs in your setup.
You can easily set up test conditions and test them interactively. In other words you can re-create problem conditions easily and test them in seconds. It could take you hours to do so with a setup.
Error messages can be interactive and meaningful and be shown to the user.
QA people are more familiar with testing application functionality than setup functionality.
And I repeat it: you are always in the same impersonation context (user context) and you have no installation sequence to worry about.

Any reason to version control Elm's `build-artifacts`?

Is there any reason to keep elm-stuff/build-artifacts under version control? I was thinking I'd add it to my .gitignore, since it seems to change every time my .elm file changes.
(This project ignores the whole elm-stuff folder, but that seems wrong to me because exact-dependencies.json is in there.)
The content of elm-stuff gets generated from your source code, so it's generally safe to ignore its content.
As for whether to commit exact-dependencies.json, I think you should look at what the Rust community suggests for their equivalent, Cargo.lock.
If you're building an application and you care about deterministic builds, then it's better to commit it. If you are writing a library, it's better to leave dependency resolution to the application that is using the library.
Note that Elm enforces semantic versioning, so you have the guarantee that a package upgrade is not going to break your build. That doesn't prevent changes in behaviour, which is why you should commit your exact-dependencies.json if you want deterministic builds.
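A minimal .gitignore sketch for this policy - note that git requires ignoring the folder's contents with a wildcard, rather than the folder itself, for the negation to work:
elm-stuff/*
!elm-stuff/exact-dependencies.json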

Source control in SSIS and Concurrent work on dtsx file

I am working on building a new SSIS project from scratch. I want to work with a couple of my teammates. I was hoping to get a suggestion on how we can set up source control so that a few of us can work concurrently on the same SSIS project (same .dtsx files, building new packages).
Version:
SQL Server Integration Services v11
Microsoft Visual Studio 2010
It is my experience that there are two opportunities for any source control system and SSIS projects to get out of whack: adding new items to the project and concurrent changes to an existing package.
Adding new items
An SSIS project has the .dtproj extension. Inside there, it's "just" XML defining what all belongs to the project. At least for 2005/2008 and 2012+ on the package deployment model. The 2012+ project deployment model carries a good bit more information about the state of the packages in the project.
When you add new packages (or project-level connection managers or .biml files), the internal structure of the .dtproj file is going to change. Diff tools generally don't handle merging XML well. Or at all, really. So, to prevent the need for merging the project definition, you need to find a strategy that works for your team.
I've seen two approaches work well. The first is to define up front all the packages you think you'll need: DimFoo, DimDate, DimBar, FactBlee. Check that project and the associated empty packages in, and everyone works on what is out there. When the initial cut of packages is complete, you ensure everyone is synced up and then add more empty packages to the project. The idea here is that there is one person, usually the lead, who is responsible for changing the "master" project definition, and everyone consumes from their change.
The other approach requires communication between team members. If you discover a package needs to be added, communicate with your mates: "I need to add a new package - has anyone modified the project?" The answer should be no. Once you've given notice that a change to the project definition is coming, make it and immediately commit it. The idea here is that people commit and sync (check in, whatever your terminology) with great frequency. If you as a developer don't keep your local repository up to date, you're going to be in for a bad time.
Concurrent edits
Don't. Really, that's about it. The general problem with concurrent changes to an SSIS package is that, in addition to the XML diff issue above, SSIS also stores layout data alongside the tasks, so I can invert the layout and make things flow from bottom to top or right to left with no material change to the SSIS package - but, as Siyual notes, "Merging changes in SSIS is nightmare fuel".
If you find your packages are so large and that developers need to make concurrent edits, I would propose that you are doing too much in there. Decompose your packages into smaller, more tightly focused units of work and then control their execution through a parent package. That would allow a better level of granularity to your development and debugging process in addition to avoiding the concurrent edit issue.
A dtsx file is basically just an XML file. Compare it to a bunch of people trying to write the same book. The solution I suggest is to use Team Foundation Server as source control. That way everyone can check packages in and out and merge them. If you really don't have that option, try to split your ETL process into logical parts and, at the end, create a master package that calls each sub-package in the right order.
An example: let's say you need to import stock data from one source, branches and other company information from an internal server, and sale amounts from different external sources. After you have gathered all the information, you want to connect it and run some analyses.
You first design the target database entities that you need and their relations. One of your team members creates a package that does all the import to staging tables. Another maybe handles the external sources and parallelizes / optimizes the loading. You would build a package that merges your staging and production tables, maybe historicizing, and so on.
At the end, you have a master package that calls each of the mentioned packages and maybe does some additional logging and such.
In our multi-developer operation, we follow this rough plan:
Each dev has their own branch, separate from master branch
Once a week, devs push all their changes to remote
One of us pulls all changes, and merges all branches into master, manually resolving .dtproj conflicts as we go
Merge master in all dev branches - now all branches agree
Test in VS
Push all branches to remote, other devs can now pull and keep working
It's not a perfect solution, but it helps quarantine the amount of merge pain we have to experience.
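In plain git commands, that weekly ritual looks roughly like this (branch names are hypothetical):
git checkout master
git merge dev-alice        # resolve .dtproj conflicts by hand
git merge dev-bob
git push origin master
git checkout dev-alice && git merge master   # repeat for each dev branch
git push origin dev-alice dev-bob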
We have large SSIS solutions with 20+ packages in one solution, with TFS Git. One project required adding a bunch of new packages to the existing solution. We thought we were smart and knew to assign only one person to work on each new package - 2 people working on the same package would be suicide. That wasn't good enough. When 2 people tried to add differently named, new packages at the same time, each showed the .dtproj as a file that had changed and needed to be checked in, and suddenly I found myself looking at the XML of the .dtproj, trying to figure out which lines to keep (Microsoft should never ask end users to manually edit its internal files, which only it wrote and understands). Billinkc's solutions here are very good, and the problem is very real. You may think that Microsoft is the great Wise One, and that your team can always add new packages to an existing solution without conflicts, but you'd be wrong. It also doesn't work to put the .dtproj in .gitignore. If you do that, you won't see other people's new packages (the .dtsx file will actually come down in git, but you won't see that package in Solution Explorer, because the .dtproj is what feeds Solution Explorer). This is a current problem (2021), and we are using Visual Studio 2017 Enterprise with SSDT.
To explain this problem: git can obviously handle a group of independent, individual files in a directory (say, .bat files) and can add, change, and delete those files easily. The problem comes in when you have a file that names, describes, and counts all the files in a directory, which is what the .dtproj does. With a file like that, 2 people adding a new package at the same time creates a conflict on the .dtproj itself: your .dtproj has a line showing the package you added, mine shows the package I added, and TFS/git sees that as a conflict.
Some are suggesting ways to deal with this if you have to add a lot of new packages; my idea is a little different. The people who have to add new packages shouldn't work in the primary solution where this problem lives - work somewhere else, probably in the "Projects" directory you get when you install Visual Studio, outside of TFS/Git. Obviously follow all the standards, variable naming, and package configuration conventions of the target solution. Then, when the new packages are ready, give the .dtsx files to your solution gatekeeper to check in. Only the gatekeeper checks in new packages, using Add From Existing, which avoids the conflicts. Once a package is checked in, developers can work on it in the main solution.

WiX: Paraffin and repository/build server integration

Short version: How can I make sure that my component GUIDs remain stable using Paraffin on a build server?
I am currently working on a project that should be deployed via WiX. As this is a web project, it contains many files (it is still at an early stage and already has almost 200). Also, during development files are constantly added and deleted, so maintaining the WiX component lists manually is simply not an option.
Since I read a lot about component rules and that people breaking them go to hell, I decided to go with Paraffin as a harvester. This tool is capable of updating an existing component list, thus not re-creating new GUIDs for existing components.
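For illustration, the two Paraffin invocations involved might look like this (switch names as I recall them from Paraffin's documentation - verify against your version; directory, group, and file names are hypothetical):
rem initial harvest
Paraffin.exe -dir ..\WebApp -groupname WebAppFiles Components.wxs
rem every subsequent build: update in place, keeping existing GUIDs
Paraffin.exe -update Components.wxs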
However, when a new component is created, the tool assigns a new GUID. Even if the component files are identical, the initial GUIDs will differ between machines, or even between runs at different times.
So, obviously, I need a central authority for fixing the initial GUIDs. My idea was to commit empty component lists, which are then filled by the build server calling Paraffin on build. So when I only distribute the MSIs created by the build server, I can be sure that component rules are being followed.
However, the problem with this approach is, that I have no means of tracking my GUIDs, should the build server crash or empty its local repository. I was thinking about having the build server commit the generated component list to my repository, but that doesn't seem like a clean idea.
Another solution I thought of was having all developers build (and thus call Paraffin) before committing. Thus, each developer would create the initial GUIDs for their newly added files and commit them to the component list.
The obvious problem with this approach: People (e.g. developer A) will forget to build before they commit. So in these cases the build server will create the initial GUIDs for the new files, but those will also only be stored locally. A few commits later, developer B will come along and build the solution, creating a new GUID for the files created by developer A. He will then commit the component list containing this GUID and the build server will check it out. Now the build server has obtained a GUID (created by developer A) for a package, for which it had previously used a different (self-created) GUID, even though the files didn't change in the meantime.
So, how can I make sure, that my GUIDs remain stable between builds without relying on developers to build their solution before they commit? The approaches outlined above both seem unsatisfying to me, but are all I can think of right now.
As far as I am concerned, component rules only really come into play when you have multiple installers that share components with the same GUIDs (which should then be exactly the same resource(s)), or you are using a wixlib or a merge module which is then included as part of different installers.
From what you have said above, it doesn't sound to me like you will, so there is no harm in having different component GUIDs for each build. It will just mean that when you upgrade the website, files that have not changed will be removed and re-installed under a different component GUID. IMHO that doesn't really matter, as long as the installer correctly installs all files that are required for the site to function and doesn't remove components from other products.
If you use the MajorUpgrade element, the old product will be completely removed before the new one is installed, so any component GUIDs that are shared between the two versions will be removed and then re-installed anyway.
I always just leave my Guid attributes as Guid='*'; that way I know there will never^ be any GUID clashes in any of my components across my multiple products.
^ I know this is not theoretically true but in this use case it is.
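A minimal sketch of that pattern (WiX v3; with Guid='*' the GUID is generated at build time from the component's installation path, so it stays stable as long as the file's target path does):
<Component Id="SomeLibrary" Guid="*">
  <File Id="SomeLibrary" Source="$(var.PublishDir)\SomeLibrary.dll" KeyPath="yes" />
</Component>
Here $(var.PublishDir) is a hypothetical preprocessor variable.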
Not entirely true. Changing your component GUIDs from build to build is fine if, and only if, you schedule RemoveExistingProducts early, so that the files are off the system before you reinstall under the new GUIDs. This approach works well for smallish installers without many files, but as your installer grows you will feel the pinch of having twice as much I/O to do, as you remove and then reinstall rather than just overwrite your files. In short, it's up to you, but you should think carefully about how large your application is likely to get before jumping in with the suggested approach.
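A sketch of the early scheduling this relies on, for contrast with the late variant shown earlier (WiX v3 MajorUpgrade element; the message text is illustrative):
<!-- early RemoveExistingProducts: the old product is removed in full before the new files are installed -->
<MajorUpgrade Schedule="afterInstallInitialize"
              DowngradeErrorMessage="A newer version of [ProductName] is already installed." />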