I'm using Eclipse and have just set up a Dropwizard server. At the command prompt I typed java -jar target/hello-world-0.0.1-SNAPSHOT.jar server hello-world.yml and it runs. Yet whenever I make a change in Eclipse, like editing the yml file, nothing updates; I have to Ctrl+C and re-run the command above. My question: is there a faster way of testing so that it picks up my changes, or do I just have to live with restarting? Thanks.
Run from within the IDE
Java IDEs permit more efficient workflows. For example, you can launch your application using a Run Configuration that executes your Service.main() method with the program arguments server hello-world.yml. This will save you endless Maven builds.
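For reference, the Run Configuration just needs an entry point to invoke. A minimal sketch of what that entry point typically looks like (class names here are illustrative, and recent Dropwizard versions use Application where older ones used Service):

    import io.dropwizard.Application;
    import io.dropwizard.Configuration;
    import io.dropwizard.setup.Environment;

    public class HelloWorldApplication extends Application<HelloWorldConfiguration> {

        // The IDE Run Configuration invokes this main() with the
        // program arguments: server hello-world.yml
        public static void main(String[] args) throws Exception {
            new HelloWorldApplication().run(args);
        }

        @Override
        public void run(HelloWorldConfiguration configuration, Environment environment) {
            // Register resources, health checks, etc. here.
        }
    }

    // Illustrative configuration class, mapped from hello-world.yml by Dropwizard.
    class HelloWorldConfiguration extends Configuration {
    }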
Unfortunately, hot swapping of code changes in Eclipse is often cumbersome, so I would recommend that you consider IntelliJ, which is more reliable when it comes to hot swapping code. Even then, hot swapping can be risky.
Sometimes a restart is unavoidable
That being said, in your situation hot swapping won't help. You are changing the startup configuration file, which is only read at startup. You will have to restart to see the changes, unless you create your own dynamic refresh-on-file-hash-change mechanism (not advised).
One alternative is to put much of your configuration testing in unit tests and verify that your code is responding as expected.
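To make that concrete, here is a minimal sketch of such a test, assuming jackson-dataformat-yaml and JUnit are on the test classpath (the configuration class and its getTemplate() field are illustrative). Since Dropwizard configuration classes are plain Jackson-mapped objects, a plain YAML ObjectMapper can exercise them:

    import java.io.File;

    import com.fasterxml.jackson.databind.DeserializationFeature;
    import com.fasterxml.jackson.databind.ObjectMapper;
    import com.fasterxml.jackson.dataformat.yaml.YAMLFactory;
    import org.junit.Test;

    import static org.junit.Assert.assertEquals;

    public class HelloWorldConfigurationTest {

        @Test
        public void configurationFileParsesAsExpected() throws Exception {
            // Read the same YAML file the server loads at startup.
            ObjectMapper yaml = new ObjectMapper(new YAMLFactory());
            // Skip sections (server:, logging:, ...) this sketch doesn't map.
            yaml.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);

            HelloWorldConfiguration config = yaml.readValue(
                    new File("hello-world.yml"), HelloWorldConfiguration.class);

            // Assert whatever settings matter to you (field name is illustrative).
            assertEquals("Hello, %s!", config.getTemplate());
        }
    }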
Static assets give an optimal workflow (no restarts)
You may encounter a situation where you only want to change static assets (like JavaScript files), in which case IntelliJ will allow you to simply recompile on the fly: it copies the changed assets into the /target directory, where they are immediately picked up by Dropwizard without a restart.
If you want to go one step further, you could enlist Grunt.js to continuously monitor src/main/resources/assets (or similar) for changes and automatically update /target for you. Again, IntelliJ autosaves on focus change, so this leads to an optimal workflow: change the asset, wait a second, refresh the browser, and see the result immediately.
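If you would rather not bring Grunt into the toolchain, a few lines of Java can do the same mirroring with the JDK's WatchService. A rough sketch (paths are illustrative, and it only watches the top-level assets directory, not subdirectories):

    import java.io.IOException;
    import java.nio.file.*;

    public class AssetSync {
        public static void main(String[] args) throws IOException, InterruptedException {
            Path source = Paths.get("src/main/resources/assets");
            Path target = Paths.get("target/classes/assets");

            WatchService watcher = FileSystems.getDefault().newWatchService();
            source.register(watcher,
                    StandardWatchEventKinds.ENTRY_CREATE,
                    StandardWatchEventKinds.ENTRY_MODIFY);

            while (true) {
                WatchKey key = watcher.take();
                for (WatchEvent<?> event : key.pollEvents()) {
                    if (event.kind() == StandardWatchEventKinds.OVERFLOW) {
                        continue;
                    }
                    // Copy the changed asset into /target so the running
                    // server picks it up without a restart.
                    Path changed = source.resolve((Path) event.context());
                    Path destination = target.resolve(source.relativize(changed));
                    Files.createDirectories(destination.getParent());
                    Files.copy(changed, destination, StandardCopyOption.REPLACE_EXISTING);
                    System.out.println("Synced " + changed);
                }
                key.reset();
            }
        }
    }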
I wrote a lengthy blog article a while ago covering Dropwizard and Ember Data, if you want more details on this approach (and single-page web application development in general).
Related
I haven't done any cshtml front-end development for a few years.
What's the current, generally accepted way for ASP.NET Core front-end developers to work across a range of tools on Windows?
By that, I mean a way to have the front-end JS build and the .NET project(s) also build and to work rapidly in the browser and code.
My thinking is:
We have a much better command-line story around dotnet today.
Some folk like VS Code.
Some folk prefer VS 2019, and some like either, depending.
We need to work on UI aspects sometimes.
But we also need to attach a debugger and debug the server logic sometimes.
The build server should have no problems; it should be simple and rely mostly on build logic held in the repo.
Tooling, and kicking off the whole build and serve process should be understandable and familiar.
It should be pretty simple to get going after a team noob clones the repo.
My initial thought would be to set up NPM, then use something like Gulp to kick off everything, including running dotnet run.
Then when running under the Visual Studio 2019 debugger, use the Task Runner Explorer to kick off the Gulp stuff but skip the dotnet run part.
(It's a shame there doesn't seem to be a command line for starting VS (Code or 2019) and attaching the debugger.)
Now I'm expecting to get a "primarily opinion based" SO beating, but there are general trends and ideas that went into designing all these tools: how they can all play ball together and what the dev story looks like.
You've pretty much already described the process. However, I'll add a few things:
You don't need the dotnet run bit. Visual Studio and VS Code are both capable of debugging directly.
You can assign the gulp tasks to build tasks in Task Runner Explorer, so you really don't even need to think about running those directly. I'm not as sure about this aspect of VS Code, but there's probably some extension to handle it, if it's not already built in.
If you want true ease of development, the best thing you can do is use Docker. Just add a Dockerfile to each project that actually runs (i.e. not a class library) and set up the steps to build and run it there. In Visual Studio, you can right-click the project and choose Add > Docker Support, and it will actually generate a ready-made Dockerfile, though you may need to add a step or two to handle the client-side build steps. In any case, this then becomes truly click and run, with nothing to worry about. The story is even better when you use docker-compose, as then Visual Studio and VS Code can spin up your entire application stack all at once, including external dependencies such as a database, Redis instance, etc. If you haven't used Docker before, start now. It's absolutely revolutionary for development.
One note for CI/CD: as much as possible, you should add a YAML file to describe your CI/CD pipeline. Depending on the actual provider you're using for build/release, there might be some differences, so consult the relevant documentation. (Azure DevOps, for example, doesn't currently support describing release pipelines in YAML, though you can still do your build that way.) In any case, this allows you to configure all of this in code and have it committed to source control.
You might consider the same for your infrastructure. Azure has ARM templates, AWS has CloudFormation, GCP has Deployment Manager. There are also third-party tools like Terraform or Ansible. All of these, in some form or fashion (usually JSON or YAML), allow you to define all the characteristics of the infrastructure you're going to deploy to and commit that to source control. This makes deployment, and things like creating new environments, a breeze.
I've had this question ever since I learned about Erlang and its ability to hot-replace separate modules and even functions on a production server.
We are developing a project in Java for Glassfish 2.1. It's basically an .ear file consisting of a bunch of .war modules.
So every time I make a minor change to the code of one of the modules, I have to redeploy the whole ear to my development server in order to check that change. Is that really how it's supposed to work?
Basically we have several levels of replacement - are they even possible?
The war module. It changes with a large number of commits by several developers. Is it possible to replace just this module on the working server without redeploying the whole ear file?
The Java classes (action handlers, etc.). Is it possible to replace just the bytecode of the classes that were changed?
The jsp pages. I think they are launched independently by the special JSP interpreter in Glassfish, so this one should definitely support hot reloading of a single changed page.
I know I can freely replace the different parts of the HTML design, such as CSS, JS, or image files. But what about the bytecode described above?
I would like to see how everything is handled behind the scenes in web servers such as Apache httpd and Tomcat. How does one go about stepping through these applications, making changes, and then viewing those changes? Applications this complex use scripts for building, and I presume they take a while to compile; it seems to me that there would be more to it than simply downloading the source code and importing it into Eclipse. Or is it actually that simple?
And how do developers who want to work on the code of these projects get around the fact that it takes a fair amount of time to compile these applications (and other non-trivial applications such as web browsers)? When I am working on smaller things I am constantly compiling and then debugging. I imagine that is not feasible when it can take several minutes to compile?
Easy: just read.
http://tomcat.apache.org/tomcat-7.0-doc/building.html
Also, http://wiki.apache.org/tomcat/FAQ/Developing
The current Tomcat 7.0.x trunk takes about 17 seconds to build on my MacBook Pro, and that included downloading a few dependencies that I didn't already have lying around. If you want to re-compile a single .java file, you can re-run the entire build and the toolchain (really just Apache Ant) will figure out which files actually need to be recompiled.
You only modified one source file? Only one source file will be re-compiled when you run ant deploy (you don't even need the "deploy": it's the default). If you use Eclipse or some other similar IDE, it will recompile on the fly and you don't need to worry about the command line or any of that.
If you have further questions, please join the Tomcat users' mailing list (or the developers' list) and join the community.
I'm working on a program that shall have an "updates" module (online). I can't figure out how to do this. Initially I'm trying with an SVN repository. Any better ideas? How is this normally done?
(I'm not asking about a concrete language, I only want a general idea of the process.)
Thank you.
What we do (in an intranet environment) is roughly:
We have an application that (instead of directly starting) points to a little script that fetches the latest 'publicized' version from a known location using rsync.
Then the script simply bootstraps the application itself (a rough sketch follows the list below).
This way:
Everyone always works with the same version of the software.
New builds are easy to deploy: just copy them over to the known 'sync' location.
Using rsync or similar allows you to minimize overhead, since it works incrementally.
We force the upgrade upon our users, but this mechanism could also be adapted for online (on-demand) updates.
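For illustration, the bootstrap can be as small as this Java sketch (the rsync source, install path, and jar name are made up, and it assumes rsync is available on the PATH):

    import java.io.IOException;

    public class UpdaterBootstrap {
        public static void main(String[] args) throws IOException, InterruptedException {
            // 1. Fetch the latest publicized version from the known location.
            //    rsync only transfers what actually changed.
            int synced = new ProcessBuilder(
                    "rsync", "-az", "deploy-server::releases/myapp/", "/opt/myapp/")
                    .inheritIO().start().waitFor();
            if (synced != 0) {
                System.err.println("Sync failed; starting the existing version.");
            }

            // 2. Bootstrap the application itself.
            new ProcessBuilder("java", "-jar", "/opt/myapp/myapp.jar")
                    .inheritIO().start().waitFor();
        }
    }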
Currently I'm tasked with doing the daily build. We have an ASP.NET 2005 website with a SQL Server 2005 backend. Our current source control is Visual SourceSafe 2005.
At this point, I use the brute-force method of daily builds.
Get Latest version of source code
Get Latest version of Database release script
Backup old website files to a directory
Publish new code to my local machine
Run on my server to keep the test/stage site working
Push newly created files to the website
Run SQL Script on test database (assuming updates, otherwise I don't bother)
Test website on the Test Server.
Looking at the idea of automated builds intrigues me since it means that I do less each morning. How would you recommend I proceed? I want to have a fully fleshed out idea before I present it to my boss.
Ditch VSS, move to Subversion, and check out CruiseControl.NET. Alternatively, if you have an MSDN developer license, you can run TFS Workgroup Edition and set up a build server on any old XP box. It's what we do at our shop.
As Assaf noted, you can use CC.NET with VSS directly. Nice.
TeamCity has worked well for me. It has a very simple setup. Combine it with an MSBuild script for your operations and you're automatic.
For build management I wholeheartedly recommend TeamCity. It doesn't require IIS6 (like CC.NET does) since it runs on its own copy of Tomcat, and the setup is all done through various forms. This is a big deal to me since the build server is just an XP Pro box. It integrates well with SVN, and there is no crazy XML file manipulation like I had to do with CruiseControl.NET. Big win for me.
For a build runner we use NAnt to send emails to various people, copy the packaged builds where they're supposed to go, run NUnit and NCover, and deploy the software to our web farm.
For automated testing we use WatiN.
http://www.nunit.org/index.php
http://www.jetbrains.com/teamcity
http://ncover.sourceforge.net/
http://subversion.tigris.org/
http://nant.sourceforge.net/
http://watin.sourceforge.net/
Try CruiseControl.NET. It's free, and whatever customized daily/continuous routine you want it to perform, you can always add it with scripts.
Remember, it's not just about daily (nightly) builds, but also about catching build errors in time (since it continuously builds after every source commit/check-in). You don't necessarily test every code change on every possible platform and build configuration, but CC can do exactly that for you (in the background).
http://confluence.public.thoughtworks.org/display/CCNET/Visual+Source+Safe+Source+Control+Block
All of what you are doing can be performed by a set of batch files, depending on how automated your test environment is. The main batch file can be started as a 'scheduled task' at midnight or whatever. That's how we 'do it cheap' here and at other places I've worked. If you need help with a particular batch, I can provide a sample.
I second (or third) the recommendation for Subversion/CruiseControl.NET. Also, if it is appropriate, check out hosted services for SVN like CVSDude. You'll probably become well versed with MSBuild in the process, too. Once you get it set up, it is great.
The cost doesn't come from licensing of the tools or even hardware necessarily, but from your time building and maintaining the system - and depending on what you are doing, that could become significant.
Start with the basics and incrementally improve it over time. Like anything else, if you try to come out of the gate with lots of automation and functionality, you could find yourself mired in it full-time for weeks.
Whatever tools you use, house them in a virtual machine (i.e., VMware).
When the equipment inevitably goes south, you can copy the image onto any machine and not miss a beat when your build server decides to take the day off (assuming, of course, that you back up).