I have a custom RavenDB analyzer and I need to upgrade it to a newer version.
When I try to copy the new version into the analyzers folder I get an OS warning saying the file is in use.
I suppose this means I need to stop Raven, copy the file, and restart it. But that would result in downtime, as my DB will obviously be offline during that time.
Is there another way of doing it?
Derek,
No, you have to do it this way.
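For anyone who ends up scripting it, here is a minimal sketch of that stop / copy / restart cycle, assuming RavenDB runs as a Windows service - the service name and both paths below are hypothetical, so check what they actually are on your machine:
# Stop the service so the analyzer DLL is no longer locked
Stop-Service -Name "RavenDB"
# Copy the new analyzer build into the analyzers folder
Copy-Item -Path "C:\build\MyAnalyzer.dll" -Destination "C:\RavenDB\Analyzers\MyAnalyzer.dll" -Force
# Bring the database back up
Start-Service -Name "RavenDB"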
Problem
I want to use both stable versions of the KRE and the bleeding-edge nightly builds. One ASP.NET 5 application may be on beta2, while another I may want on beta4. So what I did was install both in PowerShell as found here.
What happened is that the stable KVM installed to C:/Users/derp/.kre and the nightly-build KVM installed to C:/Users/derp/.k
Worse yet, I can only see this now
Attempts
I tried kvm install KRE-CLR-x86.1.0.0-beta2 and it failed
Shall I try moving the packages from the .kre folder to the .k folder? This seems hacky and like a really bad idea.
RTFM - I tried to use the install feature, including the -a option, but it failed.
I'm doing something the hard way and can't see the obvious.
I searched on here.
I feel if there is an answer to what I am trying to do above, it is worth being on here for others to find as well. Thank you all for your patience.
ASP.NET 5 is under development and there is no guarantee that changes between different pre-release versions are backward compatible (sorry!).
The .kre -> .k rename is not backward compatible and you cannot have both the old and the new kvm on the PATH simultaneously. However, you can have two versions of kvm on your machine, but you will have to use the full path for at least one of them.
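For example, a rough sketch of running the two side by side, assuming the stable kvm is the one already on your PATH and the nightly kvm.ps1 ended up under %USERPROFILE%\.k\bin - adjust both paths to whatever the installers actually created for you, and the version string is hypothetical:
# Stable kvm, resolved through PATH
kvm list
# Nightly kvm, invoked by its full path so the two don't fight over PATH
& "$env:USERPROFILE\.k\bin\kvm.ps1" list
& "$env:USERPROFILE\.k\bin\kvm.ps1" use 1.0.0-beta4-11111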
I think the key is the PATH environment variable of your system. You have to use two sets of kvm, one for the nightly builds and one for the public beta, to download the runtimes and set the correct PATH environment variable.
For instance, I got one kvm from the Entity Framework 7 repository, which can download and use beta 4 builds, and another kvm from the Home repository, which can download and use the public beta builds.
You can use either kvm with the "upgrade" or "use" command to set the correct PATH environment variable, then run your application on the runtime you need. Even Visual Studio 2015 CTP seems to run your projects against the runtime specified in your PATH environment variable. For the time being, only beta 3 runtimes show up in the project property dialog of VS 2015 CTP, but when I hit Ctrl+F5 my website loads the beta 4 runtime and assemblies - I can see the loading in the Output window. I think this is because the .k folder comes before the .kre folder in my PATH environment variable.
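A quick way (my addition, not part of the answer above) to sanity-check which runtime folders are on the PATH, and to switch runtimes for the current session - the version strings below are hypothetical:
# Show any .k / .kre entries in PATH, in the order they are searched
$env:Path -split ';' | Select-String '\\\.k'
# Activate a stable runtime for this session with the stable kvm
kvm use 1.0.0-beta2
# Or a nightly runtime with the nightly kvm
& "$env:USERPROFILE\.k\bin\kvm.ps1" use 1.0.0-beta4-11111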
Can you try the following?
$cmd-prompt>kpm Install KRE-CLR-x86
It worked for me.
Just getting ready to upgrade from 5.1 to 6.3. We have never performed an upgrade before.
About the upgrade path: When installing the updates, do I need to install the hotfixes, or just the major releases? (My gut says only major releases).
I found the documentation here:
http://www.sitefinity.com/documentation/documentationarticles/upgrading-you-sitefinity-5.1-project-to-the-latest-version
Is this documentation enough to make a smooth upgrade?
Yeah, just follow the documentation in the link you posted.
My process is to take full backups of the site files and database, then perform the upgrade locally. Do the first step in the upgrade path, then run through the site to test it (back end and front end), then run the next step in the upgrade, and so on. I suppose if you want to be extra careful you could take additional backups between each upgrade step, but that's probably overkill.
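Purely as an illustration of that first backup step (my sketch, not part of the original write-up), assuming the site lives on disk and the database is on a local SQL Server instance - the folder, instance, and database names are hypothetical, and Compress-Archive needs PowerShell 5+:
$stamp = Get-Date -Format "yyyyMMdd-HHmm"
# Back up the site files
Compress-Archive -Path "C:\inetpub\SitefinitySite\*" -DestinationPath "C:\Backups\site-$stamp.zip"
# Back up the database (requires sqlcmd and the right permissions)
sqlcmd -S .\SQLEXPRESS -Q "BACKUP DATABASE [SitefinityDb] TO DISK = N'C:\Backups\SitefinityDb-$stamp.bak'"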
When making the web.config changes, there is an option to have Project Manager merge them for you, but I end up just using Beyond Compare to compare the _EmptyProject folder in the extracted Project Manager files to my local files and make the web.config changes through a file compare. It cuts down on the differences in files from upgrade to upgrade and shows you what's been changed. The _EmptyProject folder is essentially the vanilla Sitefinity site files for that version.
Once the site is fully upgraded locally, I just publish the site in Visual Studio, copy the files over to the live site and overwrite the live database with a backup of my locally upgraded database.
Hope that helps.
I have upgraded Sitefinity 5.1 to 6.0 on a website that is in production (which included going through a couple of steps for the versions in between).
I just followed the guidelines, and it went fine.
Now there are a couple of things you need to be aware of:
Source control
If your Sitefinity solution is under source control, create a duplicate of your solution and disconnect the newly created copy from source control before starting the upgrade; then, of course, perform the upgrade on the copy that is not under source control. The reason is that you will probably have a lot of DLLs to integrate, and if you use the Project Manager, your Sitefinity project will run correctly even though the new DLLs aren't properly integrated into your solution and, possibly, source control.
Unexpected behaviours of previously working elements
Secondly, I didn't test the front end and back end during the intermediate steps (the Sitefinity versions within the upgrade path); I only tested everything once my solution had reached the final Sitefinity version. I thought I had checked everything, but that wasn't the case, and some of my custom widgets didn't work properly on the latest version of Sitefinity. Next time I'll go into more detail on all the custom parts, since starting from a working version of Sitefinity you can still end up with a newer version that breaks some behaviours. If you notice this, it might be better to wait a bit for a fix, or for the next release, which might solve the problems.
Outside access to the website during the upgrade
Furthermore, once you need to run the upgrade on the production database/website, the website shouldn't be accessible to visitors, since the upgrade of the database might take some time.
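One common way to keep visitors out of an ASP.NET site (which Sitefinity is) while the upgrade runs - my suggestion, not part of the original advice - is to drop an app_offline.htm file in the site root; ASP.NET serves that file for every request until it is deleted. The site path below is hypothetical:
$siteRoot = "C:\inetpub\SitefinitySite"
# Take the site offline
Set-Content -Path (Join-Path $siteRoot "app_offline.htm") -Value "<html><body>Down for maintenance - back shortly.</body></html>"
# ... run the database/website upgrade ...
# Bring the site back
Remove-Item -Path (Join-Path $siteRoot "app_offline.htm")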
Time needed for upgrading everything
One more thing I would like to add: it takes time to perform an upgrade across several versions.
The first time I upgraded (I needed to go through two versions), I had to upgrade locally against a development database, deploy the website to the development environment, and then do it all again on test. It took about four hours before everything was fully working. Make sure you have enough time, because it gets trickier if you need to stop everything and come back to it later.
Where can I download VSTO Office Runtime version 10.0.40820?
I need a link for this specific version, NOT for the newest one (http://go.microsoft.com/fwlink/?LinkId=158918).
Why do I need this specific version? I created a custom InstallShield .PRQ for VSTO that downloads the file from the web and uses an MD5 hash to check that the file is not corrupted.
If I use the generic download link and Microsoft deploys a new version of VSTO, the setup will complain about a corrupted file.
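For reference, a small sketch (my addition) of how the hash that goes into the .PRQ can be produced, assuming you keep a copy of the exact installer you want to pin - the file name is hypothetical, and Get-FileHash needs PowerShell 4 or later:
# Compute the MD5 of the installer you intend to pin in the .PRQ
$installer = "C:\prereqs\vstor40_x86_10.0.40820.exe"
(Get-FileHash -Path $installer -Algorithm MD5).Hash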
InstallShield has abstractions in its PRQ files that are meant to help but that I have found problematic.
First, the PRQ itself can have a URL. This means that at runtime the PRQ embedded into the setup.exe will try to download a newer version of itself from this URL. If this happens, the new XML is used and yours is ignored. This sounds like a good idea for keeping things fresh and up to date, but the CM Nazi in me sees it as a man-in-the-middle vulnerability that compromises the integrity of the build.
The second is that your XML or the downloaded XML can both have URL attributes on the individual files. Again, the CM Nazi in me says that while this seems like a good idea, it really inserts an external dependency that isn't under my control and again violates the integrity of the build.
If it were me, I'd never use InstallShield and/or Microsoft URLs in my PRQ files. Host the content yourself and do change management of that, so you have complete control. If longevity of the build is desired, don't use web downloads in the first place - bake it all into the EXE.
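Not InstallShield-specific, but a sketch of the "host it yourself and keep the checksum under your own control" idea - the URL, hash value, and file name below are all hypothetical:
$url      = "https://downloads.example.com/prereqs/vstor40_10.0.40820.exe"
$expected = "0123456789ABCDEF0123456789ABCDEF"   # the MD5 you pinned at build time
$dest     = Join-Path $env:TEMP "vstor40_10.0.40820.exe"
# Download from a host you control, then verify before installing
Invoke-WebRequest -Uri $url -OutFile $dest
if ((Get-FileHash -Path $dest -Algorithm MD5).Hash -ne $expected) { throw "Checksum mismatch - refusing to install" }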
As for the exact question you asked, I'll have to google for it. But really I'd probably just move onto the latest version and then implement the above advice starting there.
I am to migrate a WebSphere machine (including the applications which run on it) to a new machine. They wanted a clean install of the OS and WebSphere, so I did that. I also took a full file backup of all of the applications they had on the old server. The problem is that to re-install them on the new server, the WebSphere dialog asks me for the JAR/EAR/WAR file, which I don't have.
Is there any reasonably easy way to simply extract the backup of the WebSphere application files I have taken from the old machine, and configure the new machine to use them? WAR files and the like are a nice feature to have, but being forced to use them seems silly.
Edit: The existing WebSphere server is still up and running in production.
Edit: The old server is WAS 3.5, which means it doesn't even have an export function, sadly. Also, the directory it actually runs the content from has a completely different structure (consisting of something like %/Web and %/Servlet, where % is the context path of the application). In the "Install" section, it doesn't even mention EAR or WAR, only JAR. I am currently thinking that perhaps the best thing to do might be to just copy the directory over to another WAS 3.5 system and then upgrade that system (and hope it converts the folder structure and updates the config as part of the upgrade).
Edit: The closest thing I have found to a solution so far is this link:
http://www.javazoom.net/services/newsletter/was4.html (though I am not sure if that tool is available or relevant for WAS 7.x).
This has to be a problem other people have run into before, but I can't find a solution anywhere on the web.
Thank you!
Here they have sample Jacl scripts one can use to export/import an app server's configuration, so that is what you can start with. If your new box uses the same version of WAS (and the same topology, if it is not a standalone box) as the old one, it might be a (relatively) safe process.
Migration between different versions of WebSphere might be somewhat more tricky, but I'm sure IBM has published at least one Redbook on that topic.
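To make the export/import idea concrete, here is a hedged sketch of what exporting installed applications can look like via wsadmin on WAS versions that ship it (5.x and later - so it only helps on the newer side, not on the OP's WAS 3.5). The install path and application name are hypothetical:
$wsadmin = "C:\IBM\WebSphere\AppServer\bin\wsadmin.bat"
# List the installed applications
& $wsadmin -lang jacl -c '$AdminApp list'
# Export one of them to an EAR you can install on the new machine
& $wsadmin -lang jacl -c '$AdminApp export MyApp C:/backup/MyApp.ear'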
If you still have the old server running, then just export the apps and you have the WAR/EAR files. However, if you don't know the configuration for the apps, you are screwed. I am sure IBM has tools that you can use; some of the paid tools even look nice and user friendly (at least according to their sales demos). I can't tell you exactly what you need, since I don't know what documentation you have for your apps. But it looks like there is not much there, otherwise you would just install the applications the same way they were installed on your old server and use the binaries (WAR, EAR, JAR) that are archived somewhere.
I have a program that I intend to install on Linux and Windows machines. I have it cross-compiling fine (with autotools), but at some point I would like the program to be able to update its binaries. The only ways I can think of doing this are:
Give users write access to "C:\Program Files\Foo Program" or "/usr/bin/foo_program".
or
Install the program to the user's profile/home directory.
Neither of these seems like a good idea. What would you do?
You need to give us more details on what you are trying to do - I don't understand the link between cross-platform support, patching, and your question.
If you need the program to auto-update, on Linux at least the best solution is to provide a binary package (rpm, deb, whatever, depending on your target) that is updated regularly, so that new versions are picked up by the package manager. On Windows and Mac OS X, things are usually more decentralized: each program has its own update manager. The best technical solution depends on the technology (C/C++/Python/whatever). One exception I can think of on Linux is vmplayer, which tells you when there is a new version - but you still have to install the new version yourself.
If the program binary is writable, you could download the patch or the new bits to %TEMP% or /tmp and then apply them to the binary. I don't think you need to be able to create new files in the directory. But you're going to run into problems on Windows with the file being in use while you try to patch it.
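One common workaround for that Windows "file in use" problem (my addition, not something the answer prescribes): a running .exe usually cannot be overwritten, but it can be renamed, so an updater can move the old binary aside, drop the new one in its place, and clean up on the next start. Names and paths below are hypothetical:
$target = "C:\Program Files\Foo Program\foo_program.exe"
$new    = Join-Path $env:TEMP "foo_program.exe"   # the freshly downloaded build
# Renaming works even while foo_program.exe is running
Move-Item -Path $target -Destination "$target.old" -Force
Copy-Item -Path $new -Destination $target
# Delete the .old file on the next start, once the old process has exited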