Studying web servers such as Apache httpd and Tomcat

I would like to see how everything is handled behind the scenes in web servers such as Apache httpd and Tomcat. How does one go about stepping through these applications, making changes, and then viewing the results? Applications this complex use build scripts, and I presume they take a while to compile, so it seems to me there would be more to it than simply downloading the source code and importing it into Eclipse. Or is it actually that simple?
And how do developers who want to work on the code of these projects get around the fact that it takes a fair amount of time to compile these applications (and other non-trivial applications such as web browsers)? When I am working on smaller projects I am constantly compiling and then debugging. I imagine that is not feasible when a build can take several minutes?

Easy: just read.
http://tomcat.apache.org/tomcat-7.0-doc/building.html
Also, http://wiki.apache.org/tomcat/FAQ/Developing
The current Tomcat 7.0.x trunk takes about 17 seconds to build on my MacBook Pro, and that included downloading a few dependencies that I didn't already have lying around. If you want to re-compile a single .java file, you can re-run the entire build and the toolchain (really just Apache Ant) will figure out which files actually need to be recompiled.
You only modified one source file? Only one source file will be re-compiled when you run ant deploy (you don't even need the "deploy": it's the default target). If you use Eclipse or a similar IDE, it will recompile on the fly and you don't need to worry about the command line at all.
If you have further questions, please join the Tomcat users' mailing list (or the developers' list) and join the community.

Related

Is hot reload possible in Glassfish at least for development purposes?

I have had this question ever since I learnt about Erlang and its ability to hot-swap individual modules and even functions on a production server.
We are developing a project in Java for Glassfish 2.1. It's basically an .ear file consisting of a bunch of .war modules.
So every time I make a minor change in the code of one of the modules, in order to check that change I have to redeploy the whole ear to my development server. Is that really how it's supposed to work?
Basically we have several levels of replacement - are they even possible?
The war module. This changes frequently, with a large number of commits by several developers. Is it possible to replace just this module on the running server without redeploying the whole ear file?
The Java classes (action handlers, etc.). Is it possible just to replace the bytecode of the classes that were changed?
The jsp pages. I think they are compiled and run independently by Glassfish's JSP engine, so this one at least should definitely support hot reload of a single changed page.
I know I can freely replace parts of the HTML design, such as CSS, JS, or image files. But what about the bytecode described above?

DropWizard testing

I am using Eclipse and just set up a Dropwizard server. At the command prompt I typed java -jar target/hello-world-0.0.1-SNAPSHOT.jar server hello-world.yml and it is running. Yet whenever I make a change in Eclipse, like editing the yml file for example, the running server doesn't pick it up. I have to Ctrl+C and re-run the command above. My question is: is there a faster way of testing so that it updates every time I change something, or do I just have to live with it? Thanks.
Run from within the IDE
Different Java IDEs permit more efficient workflows. For example, in an IDE you can run your application using a Run Configuration that executes your Service.main() method with the parameters server hello-world.yml. This will save you endless Maven builds.
Unfortunately, hot swapping of code changes is often cumbersome with Eclipse, so I would recommend that you consider IntelliJ IDEA, which is more reliable when it comes to hot swapping code. Even then, hot swapping can be risky.
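A minimal sketch of that entry point, assuming the getting-started HelloWorldService from the Dropwizard 0.6.x line (newer releases use Application instead of Service); adjust the names to your own project:

    // Sketch only: HelloWorldService is the (assumed) service class from the
    // Dropwizard getting-started guide; rename to whatever your Service subclass is called.
    public final class DevLauncher {
        public static void main(String[] args) throws Exception {
            // Equivalent to: java -jar target/hello-world-0.0.1-SNAPSHOT.jar server hello-world.yml
            // Point an IDE run configuration at this main() (or at your service's own main)
            // with the program arguments "server hello-world.yml".
            new HelloWorldService().run(new String[] { "server", "hello-world.yml" });
        }
    }

Launching this in the IDE's debug mode also means breakpoints and (limited) hot swapping work out of the box.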
Sometimes a restart is unavoidable
That being said, in your situation hot swapping won't help. You are changing the startup configuration file which is only read at startup. You will have to restart to see the changes unless you create your own dynamic-refresh-on-file-hash-change mechanism (not advised).
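For completeness, and again not something I would advise, here is roughly what such a watcher could look like in Java, using WatchService and reacting to modification events rather than hashing the file:

    import java.nio.file.FileSystems;
    import java.nio.file.Path;
    import java.nio.file.StandardWatchEventKinds;
    import java.nio.file.WatchEvent;
    import java.nio.file.WatchKey;
    import java.nio.file.WatchService;

    // Sketch only: Dropwizard reads its configuration once at startup, so "onChange"
    // would still have to rebuild whatever was derived from the file.
    public final class ConfigWatcher {
        public static void watch(Path configFile, Runnable onChange) throws Exception {
            WatchService watcher = FileSystems.getDefault().newWatchService();
            configFile.getParent().register(watcher, StandardWatchEventKinds.ENTRY_MODIFY);
            while (true) {
                WatchKey key = watcher.take(); // blocks until something in the directory changes
                for (WatchEvent<?> event : key.pollEvents()) {
                    if (configFile.getFileName().equals(event.context())) {
                        onChange.run(); // e.g. re-read the file and rewire the affected components
                    }
                }
                key.reset();
            }
        }
    }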
One alternative is to put much of your configuration testing in unit tests and verify that your code is responding as expected.
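For instance, a minimal sketch of such a test, assuming JUnit and SnakeYAML on the test classpath and a hypothetical http/port entry in hello-world.yml:

    import static org.junit.Assert.assertEquals;

    import java.io.FileInputStream;
    import java.io.InputStream;
    import java.util.Map;

    import org.junit.Test;
    import org.yaml.snakeyaml.Yaml;

    public class HelloWorldConfigurationTest {

        // Sketch only: loads the YAML directly and checks an (assumed) value,
        // so a bad edit to the config file fails the build rather than a deployment.
        @Test
        @SuppressWarnings("unchecked")
        public void configurationContainsExpectedHttpPort() throws Exception {
            try (InputStream in = new FileInputStream("hello-world.yml")) {
                Map<String, Object> config = (Map<String, Object>) new Yaml().load(in);
                Map<String, Object> http = (Map<String, Object>) config.get("http");
                assertEquals(8080, http.get("port")); // assumed value; adjust to your file
            }
        }
    }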
Static assets give an optimal workflow (no restarts)
You may encounter a situation where you only want to change static assets (like JavaScript files), in which case IntelliJ will let you recompile on the fly, copy the changed assets into the /target directory, and have them immediately picked up by Dropwizard without a restart.
If you wanted to go one step further, you could enlist the services of Grunt.js so that it continuously monitored src/main/resources/assets (or similar) for changes and automatically updated your /target for you. Again, IntelliJ autosaves on focus change, so this would lead to an optimal workflow where you change the asset, wait a second, refresh the browser and see the immediate result.
I wrote a lengthy blog article covering Dropwizard and Ember Data a while ago if you want more details on this approach (and single page web application development in general).

Can a web server dynamically generate an executable on the fly?

Ninite.com seems to be doing it currently. I'm wondering how.
While it's possible for them to have every combination of app pre-generated, it seems unlikely/hacky.
[EDIT]
Is compiling a Windows executable using this method resource-intensive? Can it be done ~100k times a day without exorbitant cost? I'm asking because Ninite announced that they're going paid-only... can it be costing them that much?
[EDIT2]
The downloads aren't huge; it's just a small, hundred-KB web-based downloader+installer app that knows which apps to install.
Regarding this, the EXE file served up by the webapp is named something like Ninite AIMP Audacity Chrome Digsby FastStone Installer.exe when given 20+ apps to install. It seems likely that the server is serving up the same file under different filenames, and that the app then configures itself based on the filename, no?
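A rough sketch of what I imagine the server side could look like (class names and paths are entirely made up, and this is obviously not Ninite's actual code):

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;

    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Hypothetical "same binary, different filename" servlet.
    public class InstallerDownloadServlet extends HttpServlet {

        private static final Path STUB = Paths.get("/data/installer-stub.exe"); // assumed location

        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
            // e.g. "Ninite AIMP Audacity Chrome Installer.exe", built from the user's selection
            String filename = "Ninite " + req.getParameter("apps") + " Installer.exe";

            resp.setContentType("application/octet-stream");
            resp.setHeader("Content-Disposition", "attachment; filename=\"" + filename + "\"");

            // Always stream the same pre-built stub; the stub then decides what to install,
            // for example by parsing its own filename or calling back to the server.
            Files.copy(STUB, resp.getOutputStream());
        }
    }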
The site doesn't seem to create executables on the fly but just provides them for download.
[EDIT] Creating those huge downloads on the fly would put a heavy burden on the server. Moreover, it could produce buggy software. So my guess is, if these people know what they're doing, they have a server which prepackages everything, tests it, and then dumps it in the download directory of the web server.
But of course, nothing stops a server from invoking any kind of program (with maybe the exception of the patience of the surfer). So they can run compilers, archivers, whatever.
Why would a web server not be able to dynamically generate an executable?
Sure, just run a compiler on the server with exec().
I do something similar with generating PDF files from LaTeX sources, since that is basically compiling as well...
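Roughly like this, as a sketch; the tool (pdflatex here) and the paths are just placeholders:

    import java.io.File;
    import java.io.IOException;

    public final class OnTheFlyBuilder {

        // Sketch only: runs an external compiler in a working directory
        // and returns the artifact it produced.
        public static File compile(File workDir, String sourceName) throws IOException, InterruptedException {
            Process process = new ProcessBuilder("pdflatex", "-interaction=nonstopmode", sourceName)
                    .directory(workDir)
                    .redirectErrorStream(true)   // merge stdout and stderr into one stream
                    .start();

            // Drain the output so the child process can never block on a full pipe.
            process.getInputStream().transferTo(System.out);

            int exitCode = process.waitFor();
            if (exitCode != 0) {
                throw new IOException("Compilation failed with exit code " + exitCode);
            }
            return new File(workDir, sourceName.replaceAll("\\.tex$", ".pdf"));
        }
    }

The same pattern works for any compiler or archiver; the request handler just calls it and streams the resulting file back to the browser.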

How to migrate WebSphere app with no WAR/EAR file

I am to migrate a WebSphere machine (including the applications which run on it) to a new machine. They wanted a clean install of the OS and WebSphere, so I did that. I also took a full file backup of all of the applications they had on the old server. The problem is that to re-install them on the new server, the WebSphere dialog asks me for the JAR/EAR/WAR file, which I don't have.
Is there any reasonably easy way to simply extract the backup of the WebSphere application files I took from the old machine, and configure the new machine to use them? WAR, etc. is a nice feature to have, but being forced to use it seems silly.
Edit: The existing WebSphere server is still up and running in production.
Edit: The old server is WAS 3.5, which means it doesn't even have an export function, sadly. Also, the directory it actually runs the content from has a completely different structure (consisting of directories like %/Web and %/Servlet, where % is the context path of the application). In the "Install" section, it doesn't even mention EAR or WAR, only JAR. I am currently thinking that perhaps the best thing to do might be to copy the directory over to another WAS 3.5 system and then upgrade that system (and hope it converts the folder structure and updates the config as part of the upgrade).
Edit: The closest thing I have found to a solution so far is this link:
http://www.javazoom.net/services/newsletter/was4.html (though I am not sure if that tool is available or relevant for WAS 7.x).
This has to be a problem other people have run into before, but I can't find a solution anywhere on the web.
Thank you!
Here they have sample Jacl scripts you can use to export/import an appserver's configuration, so that is what you can start with. If your new box uses the same version of WAS (and the same topology, if it is not a standalone box) as the old one, it might be a (relatively) safe process.
Migration between different versions of WebSphere might be somewhat more tricky, but I'm sure IBM has published at least one Redbook on that topic.
If you still have the old server running, then just export the apps and you have the war/ear files. However, if you don't know the configuration for the apps, you are screwed. That said, I am sure IBM has tools that you can use; some of the paid tools even look nice and user-friendly (at least according to their sales demos). I can't tell you exactly what you need, since I don't know what documentation you have for your apps. But it sounds like there is not much there; otherwise you would just install the applications the same way they were installed on your old server, using the binaries (war, ear, jar) that are archived somewhere.

How do I set up a build server on the cheap/free?

Currently I'm tasked with doing the daily build. We have an ASP.NET 2005 website with a SQL Server 2005 backend. Our current source control is Visual Source Safe 2005.
At this point, I use the brute-force method of daily builds.
Get Latest version of source code
Get Latest version of Database release script
Backup old website files to a directory
Publish new code to my local machine
Run on my server to keep the test/stage site working
Push newly created files to the website
Run SQL Script on test database (assuming updates, otherwise I don't bother)
Test website on the Test Server.
Looking at the idea of automated builds intrigues me since it means that I do less each morning. How would you recommend I proceed? I want to have a fully fleshed out idea before I present it to my boss.
Ditch VSS, move to Subversion, and check out CruiseControl.NET. Alternatively, if you have an MSDN developer license, you can run TFS Workgroup Edition and set up a build server on any old XP box. It's what we do at our shop.
As Assaf noted, you can use CC.NET with VSS directly. Nice.
TeamCity has worked well for me. It has a very simple setup. Combine it with an MSBuild script for your operations and you're automated.
For build management I wholeheartedly recommend TeamCity. It doesn't require IIS6 (like CC.NET does) since it runs on its own copy of Tomcat, and the setup is all done through web forms. This is a big deal to me since the build server is just an XP Pro box. It integrates well with SVN and there is no crazy XML file manipulation like I had to do with CruiseControl.NET. Big win for me.
For a build runner we use NAnt to send emails to various people, copy the packaged builds where they're supposed to go, run NUnit and NCover, and deploy the software to our web farm.
For automated testing we use Watin.
http://www.nunit.org/index.php
http://www.jetbrains.com/teamcity
http://ncover.sourceforge.net/
http://subversion.tigris.org/
http://nant.sourceforge.net/
http://watin.sourceforge.net/
Try CruiseControl.Net. It's free, and whatever customized daily/continuous routine you want it to perform you can always add with scripts.
Remember, it's not just about daily (nightly) builds, but also about catching build errors in time (since it continuously builds after every source commit/check-in). You don't necessarily test every code change on every possible platform and build configuration, but CC can do exactly that for you (in the background).
http://confluence.public.thoughtworks.org/display/CCNET/Visual+Source+Safe+Source+Control+Block
All of what you are doing can be performed by a set of batch files, depending on how automated your test environment is. The main batch file can be started as a 'scheduled task' at midnight or whatever. That's how we 'do it cheap' here and at other places I've worked. If you need help with a particular batch, I can provide a sample.
I second (or third) the recommendation for Subversion/CruiseControl.NET. Also, if it is appropriate, check out hosted services for SVN like CVSDude. You'll probably become well versed in MSBuild in the process too. Once you get it set up, it is great.
The cost doesn't come from licensing of the tools or even hardware necessarily, but from your time building and maintaining the system - and depending on what you are doing, that could become significant.
Start with the basics and incrementally improve it over time. Like anything else, if you try to come out of the gate with lots of automation and functionality you could find yourself mired in it fulltime for weeks.
Whatever tools you use, house them in a virtual machine (e.g., VMware).
When the hardware inevitably goes south, you can copy the image onto any other machine and not miss a beat just because your build server decided to take the day off, assuming, of course, that you back the image up.