How to correctly fork CDT Plugin so that the fork works alongside the original plugin?

I want to fork org.eclipse.cdt.core to adapt some of its functionality to my own plugin. There are actually only a few packages that I want to fork, most of which are inside org.eclipse.cdt.core.parser (and subpackages). So my own plugin depends only on the fork, not on the original CDT packages.
I noticed that I at least need to fork org.eclipse.cdt.ui and org.eclipse.cdt.core.native as well. I have removed all views and wizards from ui. My goal was to fork as little as possible and to reuse original plugins wherever possible.
I did: git clone http://git.eclipse.org/gitroot/cdt/org.eclipse.cdt.git and created a new branch fork based on master.
Then I replaced every string org.eclipse.cdt.<> by my.url.cdt.<>. I renamed the import statements, the dependencies, the directory names and every occurrence of org.eclipse.cdt. Then I started to import my.url.cdt.core, then ui and native.
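For reference, the bulk of such a rename can be scripted; here is a rough sketch with GNU tools (the include patterns and directory layout are illustrative, and my.url.cdt is the placeholder namespace from above):
grep -rlZ 'org\.eclipse\.cdt' --include='*.java' --include='MANIFEST.MF' --include='plugin.xml' . | xargs -0 sed -i 's/org\.eclipse\.cdt/my.url.cdt/g'
# rename each plugin directory to match the new namespace, e.g.:
git mv core/org.eclipse.cdt.core core/my.url.cdt.core
# the Java source directories (org/eclipse/cdt/...) need the same treatment (my/url/cdt/...)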
I need to avoid conflicts with the original CDT plugin, so that CDT and my plugin can be installed next to each other. Is that possible without forking everything? I even tried to fork a whole bunch of packages (almost everything), which is not a nice solution.
So my main question is: What is the best way to fork CDT core when I only want to change functionality in a few packages?

Related

How can I use a fork of react-admin (or one of its packages) in my production environment?

I need a few changes to the ra-core package of the react-admin monorepo in my production environment.
Can I tell a lerna-published module to use my own published module as a submodule instead of its own?
It's about this package:
https://github.com/marmelab/react-admin
https://www.npmjs.com/package/react-admin
I'm about to create a PR to maybe have these changes merged into the project itself, but I can't know when it will be merged or whether it will even be accepted.
But unfortunately I need these changes immediately and it's okay for me to use a fork of mine until it's clear what happens to the PR.
I tried to fork a new ra-core package and use this in my package.json, but this doesn't work. Linking locally is no problem, as I can link it directly in the react-admin module, but I need it in my production build process.
So I'm wondering if there is some way to tell the module in my package.json (react-admin), which comes with its own subpackages, to use a package that I provide in the package.json instead of its subpackage.
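One possible approach (a sketch, not from the original thread): modern package managers can force exactly this kind of substitution. npm (8.3+) has an overrides field, and Yarn classic has the equivalent resolutions field, which remaps a nested dependency to a package you publish yourself. Assuming you publish your patched ra-core build under a hypothetical scoped name @myuser/ra-core, the package.json could look like:
{
  "dependencies": {
    "react-admin": "^3.0.0"
  },
  "overrides": {
    "ra-core": "npm:@myuser/ra-core@3.0.0"
  }
}
The scoped name and version numbers here are placeholders; every copy of ra-core in the dependency tree would then resolve to your fork until the upstream PR lands.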

React native update template

I started my project with a template:
react-native init myApp --template ez-devs
The template has been updated and I would like to know if there is a way to upgrade my project without doing it manually.
tl;dr
In a word, no. You'll have to do it manually.
Templates
The templating system is quite dumb: it basically creates a new react-native project, copies the files that are included in the template, and then installs the dependencies that have been listed.
As you will undoubtedly have updated files that were included in the original template, you wouldn't want to just install it over your existing project and hope for the best; that would cause you lots of problems. You may also have installed dependencies that require linking with native code, and the template wouldn't know about these changes.
Ways to upgrade
So how can you update to the new template? Well, it really depends on what you have done to the project. Unfortunately there is not going to be an easy way to do it.
To see what the major changes are between the templates, I would look at the files included in the release that you are currently using and the release that you plan to use, then create a diff. This can be done using the following command:
diff -ur b a > ba.diff
where a and b are the directories that you are comparing.
Unfortunately the template that you are using doesn't create releases on its GitHub https://github.com/maykonmichel/react-native-template-ez-devs/releases
You could instead download the published versions from its npm page https://www.npmjs.com/package/react-native-template-ez-devs
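For instance, npm pack downloads the published tarballs, which you can then unpack and compare with the diff command above (the version numbers here are hypothetical):
npm pack react-native-template-ez-devs@1.0.0
npm pack react-native-template-ez-devs@1.1.0
mkdir a b
tar -xzf react-native-template-ez-devs-1.0.0.tgz -C a
tar -xzf react-native-template-ez-devs-1.1.0.tgz -C b
diff -ur b a > ba.diff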
Ultimately you can compare the changes on their GitHub by looking at the commit history. You could look at the changelog if it existed, and you can also check whether the dependency versions they are using have diverged from the ones you have used.
You can also use GitHub to do the comparison.
Here is an example of the comparison between the most recent commit and one from a few days before.
https://github.com/maykonmichel/react-native-template-ez-devs/compare/f4ffa06..04a1b8c

Is nuget appropriate for daily development workflow?

I am looking at NuGet to improve the automatic handling of dependencies (both internal and third-party) during development.
As long as you develop through the CI Build Server, all is good:
1. get latest source for A and B, where B depends on A
2. fix bug in A
3. build A
4. check into source control
5. CI Build Server initiated
6. new nuget package is created and placed in corporate repository
7. build B (which will get the updated A package)
8. run B to verify that the bug in A was fixed
n. repeat n times
However, I'm wondering if it is possible to work locally as a single developer, without having to wait for the CI Build Server to produce a new package?
NuGet has a feature, Package Restore, which will download all dependencies automatically on build. You can also list the order of repositories in which Package Restore should look for packages.
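The feed order lives in a nuget.config file; here is a minimal sketch with a local folder feed listed ahead of the corporate one (names, paths and URLs are illustrative):
<configuration>
  <packageSources>
    <!-- checked in order: the local folder feed first, then the corporate feed -->
    <add key="LocalDev" value="C:\dev\LocalPackages" />
    <add key="Corporate" value="https://nuget.example.com/feed" />
  </packageSources>
</configuration>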
If the workflow could become:
1. get latest source for A and B, where B depends on A
2. fix bug in A
3. build A (building creates a local nuget package)
4. run B to test the (resolved) bug in A (B should now use our local nuget package, not the one from the corporate repository)
5. ...repeat n times
6. check into source control
7. CI Build Server initiated
8. new nuget package created in corporate repository
Is this possible using Visual Studio, MSBuild, a CI Build Server and NuGet? I'm especially interested in creating local packages while developing locally.
Note that I have native projects as well; apart from the post-build generation of the NuGet package, this is a workflow that I hope should work for both C# and C++ projects.
The solution I have now, though far from ideal, is the best I could figure out. Oh, and it is a work in progress, so it WILL change in the coming weeks/months as I figure out how to get around the kinks.
I mostly have to deal with managed DLLs right now, but I do have some native code and, worse, multi-platform native code to deal with eventually.
Create a local repository, basically just a folder, and configure it in your list of NuGet feeds.
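Registering such a folder feed can also be done from the command line:
# LocalDev and the path are placeholders for your feed name and folder
nuget sources add -Name LocalDev -Source C:\dev\LocalPackages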
Then I created an MSBuild task that packages the project and outputs it to the local repository's root folder. Make sure the version of your package is always increasing; presently I do this manually by editing the assembly version.
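The packaging step that the MSBuild task runs boils down to something like this:
# project name, version and output path are placeholders
nuget pack MyLibrary.csproj -OutputDirectory C:\dev\LocalPackages -Version 1.0.1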
Once built, update the other projects that reference it; I usually do this through the Package Manager Console (Update-Package).
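In the Package Manager Console that looks roughly like this:
# MyLibrary and LocalDev are placeholders for your package id and feed name
Update-Package MyLibrary -Source LocalDev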
For each project that was updated, bump up its version; rinse, lather, and repeat until you get to your top-most project (the actual program).
Once everything is nice and good and you are ready to commit, the build system should do its own packaging and send it to your official repository.
The Good
No clogging of the repository and build system with intermediary development versions; that garbage remains (as it should) local.
Local repos are super easy to set up; it can even be done without changes to VS, through the global NuGet config.
This is friendly to both paradigms of package restore and checking in packages with the project. That said, I would recommend not checking in the packages you built locally, but rather ones that were committed to your official repository, ideally through the build system. What's built locally should remain local.
The Bad
Still much more complicated than just adding projects to a solution.
The deeper (or wider) your dependency tree, the bigger the pain.
The Ugly
Makes some native NuGet behaviors quite quirky and annoying:
The update operation takes forever if your VS is connected to a version control system (Perforce for me). I hear they "solved" the problem; I would hate to see how it was before, if it was worse than it is now!
Having NuGet change non-code references back to "never copy" is a major pain.
If Only
Configure the desired state of a content dependency (copy always, never, or newer) directly from the nuspec and be done with it! (Oh, and the same story with ClickOnce content status: include, exclude, etc.)
Make the update operation quick; 2 minutes for a dozen projects is just insane, especially if the ultimate goal is to manage 500+.
Perhaps a hybrid mode where locally we work with project inclusion, but the build system works with NuGet dependencies (and builds them if necessary).
If you are going to parse the project files, do follow MSBuild parsing rules and honor the conditional statements.
There are still issues I have yet to figure out, like how to manage multiple branches of the code in the repository, and how to handle version conflicts further up the food chain. In a large project (ultimately we have to bring 500+ separate projects together into a single application executable), conflicts are expected.
I would love to bring all the goodness of sane dependency management à la Maven, but thus far I have not found NuGet mature enough to even think of proposing it to the dev team.
Certainly. In our solutions, NuGet parks the libraries in the "packages" directory of the solution's hierarchy, which is ultimately kept in TFS. This allows complete solution check-outs that include the required libraries. If it's your intention to update the libraries normally provided by NuGet, you'll need to update the dependent projects' references to point to the project containing the updated code normally provided by the NuGet process.
Prior to checking in your regular solution work (not the NuGet-related libs), make sure the solution's NuGet libs are up to date and the references in the solution point back to the NuGet-installed libs. Of course, you'll check in and fetch the NuGet-related libs beforehand.

Third-Party Code and Git

When developing iOS applications, I frequently use third-party code from GitHub as well as reusable classes I created myself. What I have been doing is cloning the source code into a specific folder somewhere in ~/Documents, where I kept all the library code. Then I would drag the source files into the Xcode project and code away, with a local Git repository keeping track of the changes in my own source code. So far so good, but I recently found a severe problem: I wanted to switch back to an older version of my Xcode project and found that it did not compile anymore because it used an older version of the third-party code, and nowhere had I stored which version it used!
How is this problem usually solved? I have looked briefly into Git submodules, but I'm not sure if it's the right thing. I also briefly read about CocoaPods, but could I also use that for libraries I created myself?
It is actually solved with git submodule: the idea is to reference an exact commit for each submodule you need, allowing you to go back in the history and find the coherent set of commits your project needs to compile.
However, that does require a slight change in your working tree structure, since each submodule becomes a sub-directory of the parent repo which represents your project.
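A minimal sketch of that workflow (repository URL and path are illustrative):
# record the library as a submodule pinned to its current commit
git submodule add https://github.com/example/SomeLibrary.git Libraries/SomeLibrary
git commit -m "Add SomeLibrary submodule pinned to a known-good commit"
# later, after checking out an older revision of your project:
git submodule update --init --recursive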
Note also that git submodule is useful for source dependencies.
CocoaPods would be more for building the binaries you depend on (binary dependencies).

Alternatives to Git Submodules?

I feel that using Git submodules is somehow troublesome for my development workflow. I've heard about Git subtree and Gitslave.
Are there more tools out there for multiple-repository projects, and how do they compare?
Can these tools run on Windows?
Which is best for you depends on your needs, desires, and workflow. They are in some sense semi-isomorphic; it is just that some are a lot easier to use than others for specific tasks.
gitslave is useful when you control and develop the subprojects at more or less the same time as the superproject, and furthermore when you typically want to tag, branch, push, pull, etc. all repositories at the same time. gitslave has never been tested on Windows, as far as I know. It requires Perl.
git-submodule is better when you do not control the subprojects or, more specifically, wish to fix a subproject at a specific revision even as the subproject changes. git-submodule is a standard part of git and thus works on Windows.
git-subtree provides a front end to git's built-in subtree merge strategy. It is better when you prefer to have a single-repository, "unified" git history. Unlike the plain subtree merge strategy, it makes it easier to export changes to the different (directory) trees back out to the original project, but this is not as automatic as it is with gitslave or even git-submodule.
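A sketch of the git-subtree commands (URL, prefix and branch names are illustrative):
# import a library under vendor/lib, squashing its history into one commit
git subtree add --prefix=vendor/lib https://github.com/example/lib.git master --squash
# later, pull upstream changes into the subtree
git subtree pull --prefix=vendor/lib https://github.com/example/lib.git master --squash
# export local changes under vendor/lib back to a branch of the library's repo
git subtree push --prefix=vendor/lib https://github.com/example/lib.git my-fixes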
repo is in theory similar to gitslave, but not as well documented for non-Android operations, as far as I have found. It is fairly dedicated to the Google Android development model and only natively supports a handful of git commands (though you can run arbitrary commands). The limited native support doesn't cover, for example, a centralized repository to push to, and checking out a branch seems fairly difficult.
kitenet's mr is what you would want to use if you have multiple version control systems in use, but it is mostly limiting for git-only superprojects due to its lowest-common-denominator approach. There are ways to run arbitrary commands, but they are not as well integrated.
For some use cases, I have liked each of the following two simple approaches:
Nested repositories. If your software project has a plugin mechanism, with each plugin in its own sub-directory, it can make sense to git-ignore these plugin directories and, in your local filesystem, to make each of them into its own git repository. This way, all your files form a single directory tree, but are managed in different git repositories. It will not confuse git.
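A sketch of that setup (directory and plugin names are illustrative):
# in the parent project: keep the plugin directories out of its history
echo 'plugins/' >> .gitignore
# each plugin then becomes its own repository
cd plugins/my-plugin
git init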
Per-package repositories. For software projects where you use some kind of source code package management system (gem/bundler, npm, pear or the like), it can make sense to put your re-used code into separate git repositories, then to make source packages from them, and then to install them with the package management tool into the parent project. Your parent project's git repository would only contain a reference to the required packages and their versions, while the actual code of these packages will be git-ignored, as done with all other packages and external libraries as well. Compared to the nested repositories proposed above, this is a more elaborate approach, as it allows you to specify which package version is to be installed.
I currently use submodules for development, not just for tracking third-party libraries. There are some ways you can make life easier with submodules, especially when they are the source of merge or rebase conflicts. Look to ls-tree to get the two commits involved in a conflict on a submodule. This is probably the most difficult part of submodules for people to deal with. For now, scripting will make this much easier to work with. Future versions of Git should have better native support for dealing with them.
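For example, during a merge with a conflicting submodule (the submodule path is illustrative):
# show the submodule commit recorded on each side of the merge
git ls-tree HEAD -- plugins/somelib
git ls-tree MERGE_HEAD -- plugins/somelib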
Hope this helps.
We encountered a similar issue when using Git submodules in projects where we had dependencies in a variety of languages. To deal with them, we built and open-sourced a tool called MDLR ("Modular") that gives you declarative, version-controlled Git dependencies with similar functionality to Git submodules, but without the annoying workflow. You can install it and manage your dependencies with the instructions/downloads on the GitHub repo.