I'm working on a program that should have an online "updates" module, and I can't figure out how to do it. Initially I'm trying with an SVN repository. Any better idea? How is this normally done?
(I'm not asking about a concrete language, I only want a general idea of the process.)
Thank you.
What we do (in an intranet environment) is roughly:
We have an application that (instead of starting directly) points to a little script that fetches the latest 'publicized' version from a known location using rsync.
Then the script simply bootstraps the application itself.
This way:
Everyone always works with the same version of the software.
New builds are easy to deploy: just copy them over to the known 'sync' location.
Using rsync or similar minimizes overhead, since it works incrementally and only transfers what changed.
We force the upgrade on our users, but this mechanism could also be adapted for online (on-demand) updates.
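To make that concrete, here is a minimal sketch of such a bootstrap script (all names and paths are made up for illustration; the real ones depend on your setup):

#!/bin/sh
# Pull the latest published build into the local install directory,
# then hand control over to the application itself.
SYNC_SOURCE="buildserver::published/myapp/"   # the known 'sync' location
INSTALL_DIR="$HOME/.local/myapp"

# rsync only transfers files that changed, keeping update overhead low
rsync -a --delete "$SYNC_SOURCE" "$INSTALL_DIR" ||
    echo "update failed, starting the existing version" >&2

exec "$INSTALL_DIR/bin/myapp" "$@"

The same idea works with any incremental transfer tool; rsync is just a convenient default.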
One thing I really like about later versions of Worklight/MobileFirst Studio is the faster edit/test cycle when working in the Mobile Browser Simulator: just edit, save, and click Go/Refresh; no need to build/deploy.
When using the CLI (6.3.0.00.20141111-1216) this does not seem to be the case. It seems I need to run
mfp build; mfp deploy;
after every edit. Am I missing a trick?
Right now I'm thinking I need to revert to my old practice of setting up a web server to serve directly from my product folder, which is not ideal because I then need to mock the WL.* APIs I use.
This information is from Karl Bishop:
At the current time, this is a limitation of the CLI, based on the use of a standalone MFP Server. Within Studio, some special tricks are being played to update just the modified files. We are working to resolve this in the CLI and perform similar per-file deployments, but we're not there yet. In the interim, I encourage you to view Justin Berstler's video on using the CLI with Grunt.
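Until per-file deployment lands in the CLI, one crude stopgap is to watch the source tree and rebuild on every save. A sketch, assuming Linux with the inotify-tools package installed and that the app source lives under ./apps (adjust the path for your project):

#!/bin/sh
# Re-run build and deploy whenever anything under ./apps changes.
# inotifywait exits after reporting one event, so the loop re-arms it.
while inotifywait -r -e modify,create,delete ./apps; do
    mfp build && mfp deploy
done

It is still a full build/deploy each time, so it only hides the typing, not the wait.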
I am using Eclipse and have just set up a Dropwizard server. On the command prompt I typed java -jar target/hello-world-0.0.1-SNAPSHOT.jar server hello-world.yml and it is running. Yet whenever I make a change in Eclipse, like editing the yml file, it doesn't update. I have to Ctrl+C and re-run the command above. My question is: is there a faster way of testing, so that the server picks up changes, or do I just have to live with this? Thanks.
Run from within the IDE
Different Java IDEs permit more efficient workflows. For example, in an IDE you can start your application using a run configuration that executes your Service.main() method with the parameters server hello-world.yml. This will save you endless Maven builds.
Unfortunately, with Eclipse the hot swapping of code changes is often cumbersome, so I would recommend that you consider IntelliJ, which is more reliable when it comes to hot swapping code. Even then, hot swapping can be risky.
Sometimes a restart is unavoidable
That being said, in your situation hot swapping won't help. You are changing the startup configuration file which is only read at startup. You will have to restart to see the changes unless you create your own dynamic-refresh-on-file-hash-change mechanism (not advised).
One alternative is to put much of your configuration testing in unit tests and verify that your code is responding as expected.
Static assets give an optimal workflow (no restarts)
You may encounter a situation where you only want to change static assets (like JavaScript files), in which case IntelliJ will allow you to simply recompile on the fly; it will copy the changed assets into the /target directory, where they are immediately picked up by Dropwizard without a restart.
If you wanted to go one step further, you could enlist the services of Grunt.js so that it continuously monitors src/main/resources/assets (or similar) for changes and automatically updates your /target for you. Again, IntelliJ will autosave on focus change, so this leads to an optimal workflow: change the asset, wait one second, refresh the browser, and see the immediate result.
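If you'd rather not bring in Grunt, a crude shell polling loop can approximate the same asset sync. A sketch, assuming you run from the IDE (so target/classes is on the classpath) and that the paths match your layout:

#!/bin/sh
# Mirror static assets into the build output every second so the
# running (IDE-launched) Dropwizard app picks up changes immediately.
SRC="src/main/resources/assets"
DEST="target/classes/assets"

while true; do
    rsync -a --delete "$SRC/" "$DEST/"
    sleep 1
done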
I wrote a lengthy blog article covering Dropwizard and Ember Data a while ago if you want more details on this approach (and single page web application development in general).
I am to migrate a WebSphere machine (including the applications that run on it) to a new machine. They wanted a clean install of the OS and WebSphere, so I did that. I also took a full file backup of all of the applications on the old server. The problem is that to re-install them on the new server, the WebSphere dialog asks me for the JAR/EAR/WAR file, which I don't have.
Is there any reasonably easy way to simply extract the backup of the WebSphere application files I took from the old machine and configure the new machine to use them? WAR etc. is a nice feature to have, but being forced to use it seems silly.
Edit: The existing WebSphere server is still up and running in production.
Edit: The old server is WAS 3.5, which means it doesn't even have an export function, sadly. Also, the directory it actually runs the content from has a completely different structure (consisting of %/Web and %/Servlet directories, where % is the context path of the application). In the "Install" section, it doesn't even mention EAR or WAR, only JAR. I am currently thinking that perhaps the best thing to do might be to just copy the directory over to another WAS 3.5 system and then upgrade that system (and hope it converts the folder structure and updates the config as part of the upgrade).
Edit: The closest thing I have found to a solution so far is this link:
http://www.javazoom.net/services/newsletter/was4.html (though I am not sure if that tool is available or relevant for WAS 7.x).
This has to be a problem other people have run into before, but I can't find a solution anywhere on the web.
Thank you!
Here they have sample Jacl scripts one can use to export/import an appserver's configuration, so that is what you can start with. If your new box uses the same version of WAS (and the same topology, if it is not a standalone box) as the old one, it might be a (relatively) safe process.
Migration between different versions of WebSphere might be somewhat more tricky, but I'm sure IBM has published at least one Redbook on that topic.
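For a standalone box on the same WAS version, the bundled backupConfig/restoreConfig scripts (available in WAS 5.x and later, so not on the old 3.5 box) are the simplest starting point. A sketch; the install path is a placeholder:

# On the old box: snapshot the entire profile configuration.
/opt/WebSphere/AppServer/bin/backupConfig.sh /tmp/wasconfig.zip -nostop

# On the new box (same version/topology): restore the snapshot.
/opt/WebSphere/AppServer/bin/restoreConfig.sh /tmp/wasconfig.zip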
If you still have the old server running, then just export the apps and you have the WAR/EAR files. If you don't know the configuration for the apps, though, you are in trouble. I am sure IBM has tools that you can use; some of the paid tools even look nice and user-friendly (at least according to their sales demos). I can't tell you exactly what you need, since I don't know what documentation you have for your apps. It looks like there is not much there; otherwise you would just install the applications the same way they were installed on your old server and use the binaries (WAR, EAR, JAR) that are archived somewhere.
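On WAS 5.x and later, the export can be scripted through wsadmin (sadly not an option on the WAS 3.5 box mentioned in the edit). A sketch; the application name and paths are placeholders:

# List the installed applications, then export one back to an EAR.
/opt/WebSphere/AppServer/bin/wsadmin.sh -lang jacl -c '$AdminApp list'
/opt/WebSphere/AppServer/bin/wsadmin.sh -lang jacl -c '$AdminApp export MyApp /tmp/MyApp.ear'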
I'm wondering how software development teams distribute their standard IDE(s).
E.g. developing with Eclipse: custom code formatter, SVN repository, copyright header...
At the moment my team has a standard zip file which is then distributed among the developers.
Problem:
If one file, a plugin, or the IDE itself changes (e.g. new coding guidelines, or upgrading to Eclipse 3.5.1), the whole distribution has to be done again and every developer needs to unzip the bundle again. Imagine you're working with different workspaces (Jetty, different Tomcat versions, WTP) due to project history. That doesn't scale.
I know that there are some related articles:
A new version of Eclipse just came out. Is there anything I can do to avoid having to manually hunt down my plugins again?
Manage Your Eclipse Install With A Local Git Repository
And some commercial programs.
Eclipse also has a newer update/installer approach.
But I don't see the killer app. How does your team solve this? Is there a best practice?
I guess the best thing would be a program that lets you choose your current project, then downloads the configured IDE from a server and lets you know when project config files are updated.
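For what it's worth, the newer Eclipse update mechanism (Equinox p2) can at least be driven headlessly, which would let such a program script plugin installs. A rough sketch; the repository URL and feature ID are examples only:

# Install a feature into an existing Eclipse via the p2 director.
eclipse -nosplash \
    -application org.eclipse.equinox.p2.director \
    -repository http://download.eclipse.org/releases/galileo \
    -installIU org.eclipse.wst.xml_ui.feature.feature.group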
For Eclipse, look at Buckminster; it targets exactly your use case, I suppose. I didn't use it personally, though.
At my previous company they wrote a custom update agent that pulled from a centrally configured server which was updated by the team leaders. It worked well, until people wanted to install their own plugins.
Basically, a developer wanted a plugin, fought in futility to get it included in the default (managed) repo, and installed it himself; then updates broke on his machine when the team lead had a sudden stroke of common sense and included it.
They never did come up with a 'good' way to manage it. But, at least they didn't put us all on terminal servers with thin clients.
Currently I'm tasked with doing the daily build. We have an ASP.NET 2005 website with a SQL Server 2005 backend. Our current source control is Visual SourceSafe 2005.
At this point, I use the brute-force method of daily builds.
Get Latest version of source code
Get Latest version of Database release script
Backup old website files to a directory
Publish new code to my local machine
Run on my server to keep the test/stage site working
Push newly created files to the website
Run SQL Script on test database (assuming updates, otherwise I don't bother)
Test website on the Test Server.
Looking at the idea of automated builds intrigues me since it means that I do less each morning. How would you recommend I proceed? I want to have a fully fleshed out idea before I present it to my boss.
Ditch VSS, move to Subversion, and check out CruiseControl.NET. Alternatively, if you have an MSDN developer license, you can run TFS Workgroup Edition and set up a build server on any old XP box. It's what we do at our shop.
As Assaf noted, you can use CC.NET with VSS directly. Nice.
TeamCity has worked well for me. It has a very simple setup. Combine it with an MSBuild script for your operations and you're automatic.
For build management I wholeheartedly recommend TeamCity. It doesn't require IIS6 (like CC.NET does) since it runs on its own copy of Tomcat, and the setup is all done through various forms. This is a big deal to me since the build server is just an XP Pro box. It integrates well with SVN, and there is no crazy XML file manipulation like I had to do with CruiseControl.NET. Big win for me.
For a build runner we use NAnt to send emails to various people, copy the packaged builds where they're supposed to go, run NUnit and NCover, and deploy the software to our web farm.
For automated testing we use WatiN.
http://www.nunit.org/index.php
http://www.jetbrains.com/teamcity
http://ncover.sourceforge.net/
http://subversion.tigris.org/
http://nant.sourceforge.net/
http://watin.sourceforge.net/
Try CruiseControl.Net. It's free, and whatever customized daily/continuous routine you want it to perform you can always add with scripts.
Remember, it's not just about daily (nightly) builds, but also about catching build errors in time (since it continuously builds after every source commit/check-in). You don't necessarily test every code change on every possible platform and build configuration, but CC can do exactly that for you (in the background).
http://confluence.public.thoughtworks.org/display/CCNET/Visual+Source+Safe+Source+Control+Block
All of what you are doing can be performed by a set of batch files, depending on how automated your test environment is. The main batch file can be started as a 'scheduled task' at midnight or whatever. That's how we 'do it cheap' here and at other places I've worked. If you need help with a particular batch, I can provide a sample.
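To give a flavor of what such a script looks like, here is a skeleton mirroring the steps in the question (shown as a POSIX shell sketch; on Windows the same steps map onto a .bat file, and every path and server name below is a placeholder):

#!/bin/sh
set -e  # stop on the first failing step

# 1. Get the latest source and DB release script
svn update /builds/website-src        # with VSS: ss Get $/Website -R

# 2. Back up the old site before overwriting it
cp -r /webroot/site "/backups/site-$(date +%Y%m%d)"

# 3. Build the new code and push it to the web root
msbuild /builds/website-src/Site.sln /p:Configuration=Release
cp -r /builds/website-src/output/. /webroot/site/

# 4. Apply DB updates only when a release script exists
if [ -f /builds/db/release.sql ]; then
    sqlcmd -S testdb -i /builds/db/release.sql
fi

# 5. Smoke-test that the test site answers at all
curl -fsS http://test-server/ > /dev/null && echo "Build OK"

Schedule it nightly with the Windows Task Scheduler (or cron) and you have the 'cheap' version of an automated build.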
I second (or third) the recommendation for Subversion/CruiseControl.NET. Also, if it is appropriate, check out hosted SVN services like CVSDude. You'll probably become well versed with MSBuild in the process too. Once you get it set up, it is great.
The cost doesn't necessarily come from licensing the tools or even the hardware, but from your time building and maintaining the system, and depending on what you are doing, that could become significant.
Start with the basics and incrementally improve it over time. Like anything else, if you try to come out of the gate with lots of automation and functionality, you could find yourself mired in it full-time for weeks.
Whatever tools you use, house them in a virtual machine (e.g., VMware).
When the hardware inevitably goes south, you can copy the image onto any machine and not miss a beat when your build server decides to take the day off; assuming, of course, that you back up the image.