I am building an Eclipse RCP application. For the drag-and-drop functionality of resources I need an extra set of validations. Eclipse does not recommend subclassing ResourceDropAdapterAssistant, so I modified the code of ResourceDropAdapterAssistant, which lives in the package org.eclipse.ui.navigator.resources.
Now what is the best way to put this feature back into my Eclipse? I have gone through this link: http://eclipsesource.com/blogs/2012/07/30/patching-your-own-eclipse-ide/
It has helped me create a feature patch. But I am not clear on one thing: when I create the feature patch, should I include only the modified class, or all classes and packages under the plug-in?
Regards,
Priyank Thakkar
It should be enough to copy only the .class files for the modified Java files. However, I have created several feature patches and find it easier and less error-prone to include all .class files, so that is what I would recommend doing.
Remember, though, that by default a feature patch applies to exactly one version of the released feature. You can, however, modify the generated p2 metadata to expand that version range. See these excellent blog posts for more information:
http://aniefer.blogspot.com/2009/06/patching-features-with-p2.html
http://aniefer.blogspot.com/2009/06/patching-features-part-2.html
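For orientation, the skeleton of a patch feature's feature.xml looks roughly like the sketch below. All ids and version numbers are placeholders, and I am assuming org.eclipse.platform is the feature that contains the plug-in you patched -- check that against your target platform. The patch="true" import is what ties the patch to exactly one version of that feature:

    <!-- feature.xml of the patch feature (ids and versions are placeholders) -->
    <feature id="com.example.navigator.patch"
             label="Patched Common Navigator Resources"
             version="1.0.0.qualifier">
       <requires>
          <!-- patch="true" marks this as a feature patch; the version must match the
               installed version of the patched feature exactly (hence the default
               "exactly one version" behaviour) -->
          <import feature="org.eclipse.platform" version="1.2.3.v20140101-0000" patch="true"/>
       </requires>
       <!-- the rebuilt plug-in carrying your modified .class files;
            0.0.0 is replaced with the real version when PDE exports the feature -->
       <plugin id="org.eclipse.ui.navigator.resources"
               version="0.0.0"
               unpack="false"/>
    </feature>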
I just upgraded to the newest version of Pharo Smalltalk. Before doing so, I "filed out" a package from my old version called My-Pharo - a package I use for various configurations and customizations of Pharo itself, most notably a class to put "Workspace" back in the main menu. I then "filed in"/installed the file into my new version.
When I checked the SystemBrowser, I had correctly gotten the My-Pharo package, but I'd also picked up a package called My-Pharo-Manifest... I see My-Pharo-Manifest actually is part of my File-Out, and seems to contain the package-comment for My-Pharo .
What is this manifest, what is its purpose, and how should it be used? Is there something I can/should do to "merge" the manifest (i.e. the comment) back into the My-Pharo class? Should I move the content of My-Pharo-Manifest somewhere else? ...Or is my best bet to simply delete the manifest package and re-write the package comment for My-Pharo?
I'm not a seasoned Pharo developer; I use it just from time to time. I'll try to answer your question from the source code. For a more detailed answer you would have to ask the people who actually do the development of Pharo.
What is the manifest?
The manifest contains package metadata.
What is its purpose?
The purpose is to make life easier for SmallLint (the Smalltalk Code Critics). It is there as a speedup: without the manifest, SmallLint would have to re-check the rule results all the time. The package metadata also helps in managing false positives and/or TODOs.
packages: if you check where #hasPackageNamed: is used, you will find that it is in SmallLintManifestChecker>>manifestBuilderOfPackage:.
methods: if you search for #hasManifestFor:, you will find it in SmallLintManifestChecker>>manifestBuilderOfMethod:.
Is there something I can/should do to "merge" the manifest (i.e. the comment) back into the My-Pharo class? Should I move the content of My-Pharo-Manifest somewhere else?
I would just leave it be. It helps SmallLint do its job.
I have this project that I made to experiment with Qt and shared libraries. It is basically a couple of Qt widgets from the Qt tutorials plus what I think is the right CMakeLists configuration, so that a MylibConfig.cmake is automatically generated from a MylibConfig.cmake.in in order to share the library. The problem is that I don't want the end user to have to add my library's dependencies to their own CMakeLists.txt. In my case the library depends on Qt4, but I don't want the end user to have to call find_package(Qt4 REQUIRED). Imagine that I want to provide self-contained functionality to someone who does not need or want to know what my library is built on. Is there a way for the automatically generated MylibConfig.cmake to find all the necessary packages itself, or is the only option to add the find_package() calls manually in MylibConfig.cmake.in?
Thank you very much.
In fact, both of the projects mentioned do the finding of dependencies from their *Config.cmake files. And nowadays that is the only option -- CMake can't do it for you "automatically".
So, some way or another, your config module should do the same.
The easy way is to add find_dependency() calls yourself, since you know exactly which other packages your project is based on (see the sketch below).
A little bit harder is to do it "automatically" (by writing your own helper function) -- for example by inspecting the properties of your target(s), working out where all those libraries come from, and finally generating the find_dependency() calls anyway.
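To make the first option concrete, a minimal MylibConfig.cmake.in could look something like this. The MylibTargets.cmake name is an assumption -- it is whatever file your install(EXPORT ...) call generates:

    # MylibConfig.cmake.in -- minimal sketch; the targets file name is a placeholder
    include(CMakeFindDependencyMacro)

    # Resolve Qt4 on behalf of the consumer, so they don't need their own find_package(Qt4).
    # (On CMake < 3.0, where CMakeFindDependencyMacro is not available, a plain
    # find_package(Qt4 REQUIRED) here does the same job.)
    find_dependency(Qt4)

    # Import the library's own exported targets.
    include("${CMAKE_CURRENT_LIST_DIR}/MylibTargets.cmake")

With that in place, a consumer only writes find_package(Mylib REQUIRED) and links against the imported target; Qt4 is found behind the scenes.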
IntelliJ allows you to configure the "File and Code Templates" in Settings.
This is a global setting, however I want different templates depending on which project I am working on (for example, there will be different @author tags if it's commercial / open source work, and version information varies by project).
Eclipse manages this on a per-workspace basis; how can I achieve the same thing in IntelliJ IDEA?
Unfortunately per project templates are not supported in IntelliJ IDEA. I recommend you comment-on/vote-for/track the feature request Make file templates per-project. (See UPDATE about this feature request below)
A few workarounds you can try...
Create a File Template for each project. Then when you create a new class, use that project's template rather than the standard "Java class" template. It will clutter up your template list a bit, and you have to remember to change from the default template when creating a new class (remember that inline search is available in the new class dialog when setting the type). But it is workable.
The copyright settings are done on a per-project basis. Sometimes a need for a specific header can be met using the copyright utility (even if it is not an actual copyright statement). The options are pretty good for determining where it gets placed. The one shortcoming is that while you can configure it to be a comment just before the class declaration, you can only configure it to be a block comment or an inline comment, not a Javadoc comment.
Finally, a last option would be to write a live template for each project with the header information. Then after you create a class use the proper one to place the header information.
Hopefully those things will help while we wait for the feature to get implemented.
UPDATE
The above mentioned feature request to allow for file templates to be saved on a per project basis has been implemented in IDEA v14.1. It is currently (Feb 2015) available as an EAP (i.e. beta). It is scheduled for release at the end of Q1 2015.
I'm working on Eclipse RCP. I have explored the few concepts required for my project, and I know how to export an RCP product (which is portable).
My development approach was: for each Java file change, I delete the previously exported product and export it again. I think my approach is dumb; there might be better ways.
Exporting each time for a fix in a Java file is time consuming. As a workaround I thought of replacing the class file generated in bin inside my plug-in jar, but for my Java file there are multiple class files generated, such as classname$1.class, etc. It was difficult to replace all these class files in my plugin.jar.
What is the better practice in such a situation? What do expert RCP developers do so that a Java change is reflected in an exported product without deleting the product or creating a new one? Isn't there any hot-deployment kind of thing, in the way that, as an analogy, a JSP change in an application server is a hot deployment?
Looking forward to suggestions.
Day to day I generally just run my product in the debugger - code changes are reflected immediately.
However you can use p2 to update a previously exported product - although this requires exporting a new version of the product first to generate a compatible p2 repository. An alternative is to push your changes to a build server and have it build the new product and p2 repository for you. I find Tycho is a good choice to help automate my builds.
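For reference, a Tycho build is driven by ordinary Maven POMs; a minimal parent POM looks roughly like this (ids, module names and the Tycho version are placeholders, and a real build also needs a target-platform configuration):

    <!-- parent pom.xml -- minimal Tycho sketch; ids, module names and versions are placeholders -->
    <project xmlns="http://maven.apache.org/POM/4.0.0">
      <modelVersion>4.0.0</modelVersion>
      <groupId>com.example.rcp</groupId>
      <artifactId>com.example.rcp.parent</artifactId>
      <version>1.0.0-SNAPSHOT</version>
      <packaging>pom</packaging>

      <modules>
        <!-- built with packaging eclipse-plugin, eclipse-feature and eclipse-repository -->
        <module>com.example.rcp.plugin</module>
        <module>com.example.rcp.feature</module>
        <module>com.example.rcp.product</module>
      </modules>

      <build>
        <plugins>
          <plugin>
            <!-- enables the Tycho packaging types for the whole reactor -->
            <groupId>org.eclipse.tycho</groupId>
            <artifactId>tycho-maven-plugin</artifactId>
            <version>1.7.0</version>
            <extensions>true</extensions>
          </plugin>
        </plugins>
      </build>
    </project>

The eclipse-repository module is the one that produces both the exported product and the p2 repository that an already installed product can update from.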
I know there are many questions out there regarding this same information. I have read them all, but my brain is all turned around and I don't know which way to go. Plus the lack of documentation really hurts.
Here is my scenario. We are trying to use WiX to create an installer for our application, which goes out to our dealers with our product information. The app includes about 2000 images and documents describing our products, plus a SQL CE database, which are updated via Microsoft Sync Framework. The data changes so often that keeping these 2000 files as content files in the app's project is very undesirable. The app relies on .NET Framework 3.5 SP1, SQL Server CE 3.5, Microsoft Sync Framework 1.0 and ADO.NET Sync Services 2.0.
Here are the requirements for the app:
The dealers will be given the app on a CD every year for any updates (app or data updates).
The app must update itself from the internet to get any new images, documents or data.
The prerequisites must be installed if they do not exist on the client machine.
The complete installer should be generated from an MSBuild script with as little human interaction as possible (we don't want to be manually updating the 2000+ file list).
What we have accomplished so far: we have a Votive project in our solution. We have manually specified the binaries in a .wxs file. We have modified the .wixproj file to use the HeatDirectory task to gather our data (images, documents and the database) from a specified location (this is broken and giving an ICE38 error). This seems all right, but it is still a lot of work. We have to manually update our data by running the program in release mode and copying it to the specified directory.
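For reference, the harvesting step we added to the .wixproj looks roughly like this (the directory, component group and directory-ref names here are simplified placeholders rather than our real ones):

    <!-- in the .wixproj: harvest the data directory before each build -->
    <Target Name="BeforeBuild">
      <HeatDirectory Directory="..\Data"
                     DirectoryRefId="INSTALLLOCATION"
                     ComponentGroupName="DataFilesGroup"
                     AutogenerateGuids="true"
                     SuppressRootDirectory="true"
                     OutputFile="DataFiles.wxs"
                     ToolPath="$(WixToolPath)" />
    </Target>

The generated DataFiles.wxs is then listed as a <Compile> item so it gets built into the MSI.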
I am looking to see what other people would do in this situation.
How would you arrange your solution with regards to the 2000+ data files? Would you create a custom build script that gets the current data from the server or would you include them as content files in the main project?
How would you get WIX to include all of the project output (including the referenced assemblies) and all of the data files? If you have any complete samples, that would be great. All I have found are little clips here and there and not an entire example from start to finish.
How would you deal with the version numbers? Would you put them as a constant in the build script and reference them through $(var.VersionNumberName)? Would you have the version number automatically picked up from the project being deployed? If so, how?
If there is any better information than what I am finding, please include it. I have read numerous articles, blogs, Stack Overflow questions, the tutorial, the wiki, etc. Everything seems to be in bits and pieces. The tutorial is nice, but doesn't explain anything about MSBuild and Votive. I would like to see a start-to-finish tutorial on using MSBuild and Votive and all the WiX MSBuild targets. If no one knows of a tutorial like this, I may put one together. I have already spent the entire week gathering info and reading. I'm new to MSBuild as well, so if anyone has any great articles on MSBuild, please include them.
The key is to isolate the different types of complexity into separate merge modules and put them all together into an MSI as part of the build. That way, things that change often can change without impacting things that hardly change at all.
1) For the data files:
We use Paraffin to generate the WiX and hence the merge modules for an html + Flash based help system consisting of thousands of files (I can't convince the customer to go to CHM).
Compile these into a merge module all by themselves.
2) Assemblies: assuming that this is a set that changes less often just make a merge module by hand or with WixEdit with the correct files and dependencies.
3) For the version number, there are a lot of ways to manage this depending on your build system. The AssemblyInfoTask is a pretty straightforward way to make sure all your assemblies are versioned appropriately. The MSBuild Extension Pack has some versioning stuff if you are using TFS.
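If you also want to push the number through the WiX preprocessor as $(var....), as the question mentions, the wiring is just a DefineConstants entry in the .wixproj plus a preprocessor reference in the .wxs, roughly like this (property and product names are placeholders):

    <!-- .wixproj: hand the build's version number to the WiX preprocessor -->
    <PropertyGroup>
      <ProductVersion Condition=" '$(ProductVersion)' == '' ">1.0.0</ProductVersion>
      <DefineConstants>ProductVersion=$(ProductVersion)</DefineConstants>
    </PropertyGroup>

    <!-- Product.wxs: reference it wherever a version is needed -->
    <!-- <Product Id="*" Name="DealerApp" Language="1033"
                  Version="$(var.ProductVersion)" Manufacturer="Example"
                  UpgradeCode="PUT-GUID-HERE"> ... </Product> -->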
I had a similar scenario and was unable to find a drop-in solution, so I ended up with the following:
I wrote a custom command line program called wixgen.exe for generating wxs manifest files. It is pretty specific to our implementation in that it only knows how to create 2 types of wxs files. One for IIS Website/Virtual Directory deployments and another for Windows Service deployments.
Each time a build is triggered by our continuous integration server a post-build task runs wixgen with the right args to generate a new manifest.wxs for the project being changed. It automatically includes all the files needed for the deployment. These builds also version the dlls using a variation of the technique at: http://richardsbraindump.blogspot.com/2007/07/versioning-builds-with-tfs-and-msbuild.html
A separate build, which is manually triggered, is then used to build the wixproj projects containing the generated wxs files and produce the MSIs.
I would ditch the CD delivery (so 90's) and go with ClickOnce. This solution seems to fit well since you already use the .NET Framework. With ClickOnce you should be able to just keep updating the content of your solution and make updates available to your heart's content. Let me know if you need sample ClickOnce deployment code.
You can find more ClickOnce information here.
Similar to dkackman's answer, you should separate your build into several components, isolating build components to be built separately.
I come from a mainly Java background; however, for building MSIs and .NET executables we use Maven, with the maven-wix-plugin for building the installers and the NMaven plugin for compiling any .NET code. However, as we're only performing very basic development in .NET, with most development in Java, we don't need too much complexity from the NMaven plugin (which is probably a 'good thing' (TM), as it's only at version 0.17).
If you're a purely .NET house, you could also look into Byldan (http://www.codeplex.com/byldan), which seems to be the focus of their development at the moment (it's the same team behind NMaven and Byldan).
If you do want more information on NMaven or Byldan, raise another question and I'll give as much info as I can (which is not a huge amount; as stated, I only do very limited .NET development).