Move Domino mailfiles to a new directory location on server

Currently we have a lot of mailfiles in different directories on the same server: some are located in data\mail and others in data\mail\DK... or data\mail\USA...
These mailfiles are also replicated to other servers, and we have noticed that on the other servers the mailfiles have a different directory structure.
This makes administration very difficult, so we would like to move all our mailfiles into the data\mail directory on all servers.
(Some clients have local replicas.)
What is the best practice for doing this?
Can the admin process do this: move the file, update the person record, and update the clients?

AdminPs "Move to Another Server" functionality works fine for that job (watch out for the delete requests, though).

My guess is that the original administrator set up the system so that each user's mailfile on the home mail server is in the root of the \mail directory, and that the subdirectories contain replicas of mailfiles from other servers as a means of cheap backup.
I'd suggest looking at the NAB to see if this is indeed the case; if it is, then you are in luck. All you will need to do is bring the server down, move all the mailfiles in the subdirectories into the main mail directory, and restart the server. Once the server comes back up it will continue to replicate these mailfiles with the other servers.
I would check the replication connection documents to see if any special replication schedules have been set up for those subdirectories; if so, you'll have to adjust them to ensure proper replication.
If the users' home mail server is not using the root mail directory as their mailfile storage area, then it is a longer process. You can use AdminP to do it, but it COULD cause you issues if you accidentally come back at a later date and approve the deletion requests, or if the server doesn't have enough disk space to double all the mailfiles; having two replicas of a mailfile on a single server is not a good idea either.
If you need to do the long process, I'd look at doing it manually. Down the server, move the mailfiles, bring the server up, edit each person doc to set the correct location for the mailfile, and then visit each user machine to edit their location document to point to the correct location. It is the only safe way to do it.
The last option is to buy a new server and then use AdminP to move all the users to that server, making sure the mailfiles are stored in the \mail directory. There is no risk of duplicate replicas on a single server, AdminP looks after adjusting the settings on all the users' machines, and you end up with a nice, clean, new server (on which you could implement things like transaction logging and DAOS).

As for the safe way to go on this one:
Correcting the physical mailfile location should be quite easy (bring the server down, move the mailfiles, start the server again), but modifying all those person documents could be quite tedious if you go one by one.
I would encourage you to use Ytria's ScanEZ software, so that you can mass-modify all person docs in the NAB with a simple Replace Substring formula that corrects the MailFile path information all at once.
This is an incredibly fast process; it should not take more than 10-20 seconds.
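For illustration, a formula along these lines would do it against the selected person documents (a sketch only: the DK and USA subdirectory names are taken from the question, so adjust them to your actual paths first; MailFile is the item in the person document that holds the path):

REM {Hypothetical sketch: collapse mail\DK\... and mail\USA\... down to mail\...};
FIELD MailFile := @ReplaceSubstring(
    @ReplaceSubstring(MailFile; "mail\\DK\\"; "mail\\");
    "mail\\USA\\"; "mail\\");

(Remember that formula-language strings escape the backslash, hence the doubled \\.)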

Related

Copying database to cloned server and keeping both copies?

I'm in a rather odd situation. At my work, we have two MSSQL 2012 servers, one physically hosted here, one virtual. Through a long, frustrating set of circumstances, our migration plans fell apart and we now have both servers with different data on each. I have to take a database, let's call it cms1, from the physical server and move it to the virtual server. However, I have to also make sure the virtual server's copy of cms1 remains intact, then run a script to move the changed tables from one to the other.
What I've tried so far is:
Make a full back up of the physical server's copy into cms1.bak, then copy that bak file over to the virtual server.
Rename the virtual server's version of the database with "alter database cms1 modify name = cms1_old". Good so far.
Take the newly renamed cms1_old db offline, then restore from my bak file. I get an error that the file for cms1 (NOT cms1_old) is in use.
I went to the actual location on disk and renamed the two files associated with cms1 to be cms1_old. I closed SSMS and re-opened it, and tried the restore again. I got the same error, that the file for cms1 (again, NOT the file for cms1_old) was in use.
(update) I have since discovered detaching databases and tried to use that. When re-attaching after renaming the files for cms1_old, though, SSMS says that the files cannot be found. I believe I've gotten every path correct, so I'm not sure why this is happening.
Was my mistake in not taking the cms1 database offline BEFORE renaming it? If so, is there a way to fix this, or should I start again? This cms1 database is a test, not the real thing, but I want to get the procedure nailed down before working on our live database. How would I move a copy of cms1 from physical to virtual, keeping cms1 on the virtual server, so both can exist side by side while I move data from certain tables of one to the other? I really hope I'm making sense--I've been fighting with this for two hours straight. Thanks for any suggestions. I'm not too experienced in this sort of thing; I know SQL reasonably well, but dealing with physical DB files, backups, etc is new to me.
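For reference, the way this is usually done is to restore the backup under a new database name with RESTORE ... WITH MOVE, relocating the physical files; the "file in use" error appears because the backup remembers cms1's original file paths and the restore tries to reuse them while the live cms1 still owns them. A hedged sketch (the logical file names and paths below are examples; check the real ones with FILELISTONLY first):

-- List the logical file names recorded inside the backup
RESTORE FILELISTONLY FROM DISK = N'C:\Backups\cms1.bak';

-- Restore under a new name, moving the physical files so they do not
-- collide with the live copy of cms1
RESTORE DATABASE cms1_from_physical
FROM DISK = N'C:\Backups\cms1.bak'
WITH MOVE 'cms1'     TO N'D:\Data\cms1_from_physical.mdf',
     MOVE 'cms1_log' TO N'D:\Data\cms1_from_physical.ldf';

With that approach the virtual server's cms1 never has to be renamed or taken offline at all.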

MS Access database interface update from local

I am extremely new to MS Access. I have a central back-end Access database on a server computer, and all the users have the front-end user interface installed on their systems.
Now, whenever I make any changes to the interface on my local copy, I need to re-install the updated interface on each of their systems. Is there any way I can make the changes only on my local copy and have them automatically reflected on all the users' systems?
Thank you.
OK, there are a couple of options that will either fully or partially automate this process.
Partial Automation
If you don't have a lot of users and you don't want to do a great deal of coding, you can write a simple batch file or .vbs file which you set up on the users' desktops as an icon. The batch file would look something like this:
@echo off
REM Copy your file from the server location to the local user machine
REM ("echo F |" answers xcopy's file-or-directory prompt on the first copy)
echo F | xcopy "F:\ServerDirectory\databasename.mdb" "C:\ClientDirectory\databasename.mdb" /Y /R
Set this up on the user's machine as an icon, and whenever you want them to update their front end, ask them to double-click the icon. This will overwrite their client with whatever you place in the location on the server. It is also advisable to create all table links to the back end using UNC paths.
I have used this successfully for various applications: I make changes to the front end, place it in the appropriate location on the server, and then send a quick e-mail asking people to double-click the .bat file icon.
Full Automation
Programmatically set up version control using Visual Basic, so that the client checks its version number against a server version number and, if it is not the latest, downloads a new version.
This is more involved; full instructions are available here.
Front End Auto Update
When you deploy an MS Access solution like this, you need to decide whether to share the client MDB file between all users or distribute copies to each user. It sounds like you have taken the second option. Each choice has merits and disadvantages. If you stay with the current approach, you might look at a scripting option to deploy updated client MDB files to users.

How does one test if a backup of an `.mdb` file is good in an automated manner?

I make a copy of an .mdb database (and its other partition) every night, and test it by opening it up to see if it works.
By "make a copy" I mean:
I kick all the users out of the database who are connected via RDP (not automated...)
Rename both backend files...and then proceed to make a copy of the files. (automated...)
And by "see if it works" I mean:
Relink a frontend file (.mde) to both files (this is automated)
Open it (and its other partition) with a frontend (.mde) and workgroup security file (.mdw) on my local machine to see if it works. (This is not automated, and it is the part I am focusing on here...)
There are only two tables in the other partition, so I run the part of the frontend file I know uses that partition to test if the backup is going to work.
Would connecting to the backup of the files and doing a query on some table in both partitions be enough to prove that the backup is good without actually looking at it with human eyes?
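Something along these lines would be the scripted version of that check (a sketch only: the paths, table name, and workgroup credentials are placeholders, and the second partition would get the same treatment):

' Open a backup partition through Jet with the workgroup file and run
' a trivial query; an error raised here means the backup copy is suspect.
Dim conn, rs
Set conn = CreateObject("ADODB.Connection")
conn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
          "Data Source=C:\Backups\backend1.mdb;" & _
          "Jet OLEDB:System Database=C:\Backups\security.mdw;" & _
          "User ID=tester;Password=secret;"
Set rs = conn.Execute("SELECT COUNT(*) FROM SomeTable")
WScript.Echo "backend1 rows in SomeTable: " & rs.Fields(0).Value
conn.Close

A query like this proves Jet can open the file, read its system tables, and scan a user table, which catches gross corruption, though it is not as thorough as reading every table in full.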
I have also automated the process of compacting the live database, but I don't feel safe automating this part until I have verified that the backups indeed work.
Also before I get any posts along the lines of "Why are you still using access?", let me just state that I don't get to make those decisions and this database was here a long time before I got here.
(Please Note: if you feel I have posted this on SO in error please feel free to migrate to the DBA SE or to Serverfault)

Why use sysprep for SharePoint 2010 Developer VMs?

I have read several articles about creating a SharePoint developer VM. They all say to "sysprep" them. Why (exactly) must the sysprep be done? What kind of problems (and why) will we run into if we don't sysprep them?
(I suppose what I am asking is, what would be the difference in doing "sysprep" and just bringing up the VM, changing its Name/IP, reboot then install SP?)
I've had success in the past with just copying Hyper-V VHDs as a method of cloning VMs; however, I now use sysprep when cloning any of my machines, as it's been mentioned as a best practice in many places. And it does some nice things, like letting me clean up a bunch of stuff that I don't want to duplicate and choose a new name for the machine on boot. From the MS Sysprep Technical Reference:
Sysprep prepares a computer for disk imaging or delivery to a customer by configuring the computer to create a new computer security identifier (SID) when the computer is restarted. In addition, Sysprep cleans up user- and computer-specific settings and data that must not be copied to a destination computer.
And you may want to read Russinovich's post on The Machine SID Duplication Myth (and Why Sysprep Matters) for a good explanation of how SIDs work; the very last paragraph has another reason for going this route:
Note that Sysprep resets other machine-specific state that, if duplicated, can cause problems for certain applications like Windows Server Update Services (WSUS), so Microsoft's support policy will still require cloned systems to be made unique with Sysprep.
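For reference, the usual invocation before shutting a VM down for cloning is the generalize pass (this is the standard path on Vista/Server 2008 and later; run it inside the VM you are about to clone):

C:\Windows\System32\Sysprep\sysprep.exe /generalize /oobe /shutdown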
Good luck!

How do I distribute updates to an Access database front end?

I've got an Access 2007 database that I developed which connects to SQL Server for the actual data storage. I used the Package Solution Wizard to create a distributable installer which included the Access runtime (with an ACCDE file), which I went around and installed on 15 or so PCs. Anyway, my question is, what is the best way to distribute updates to this database? Right now I'd need to go around and remove and reinstall. That's not a problem... I was just wondering if there was another way.
I've tried leaving the front end on a network share, but it seems that most people suggest storing the front end on the local machine, which makes sense. The problems I've run into when I leave it on a network share (at least with Access 2003 MDBs) are that I find myself needing to compact and repair often, and I also have to kill the open sessions (users who have the file open) when upgrading. I would imagine it could also hypothetically create an unnecessary bottleneck if a user was not on the local network.
Automating front-end distribution is trivial. It's a problem that has been solved repeatedly. Tony Toews's http://autofeupdater.com is one such solution that is extremely easy to implement and completely transparent to the end user.
We developed a VBScript 'launcher' for our Access apps. That is what the icon on the start menu of users' PCs points to, and it does the following:
It checks a version.txt file located on a network share to see whether it contains different text than a locally stored copy.
If the text is different, it copies the Access MDB and the new version.txt to the user's hard drive.
Finally, it runs the MDB in Access.
In order to distribute an update to users' PCs, all that is required is to change the text in version.txt on the network share. A minimal sketch of such a launcher is below.
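This sketch captures the idea (a hypothetical reconstruction, not our production script; the share, folder, and file names are placeholders):

' Launcher sketch: refresh the local front end when the server's
' version.txt differs from the local copy, then start the app.
Const SERVER = "\\fileserver\apps\myapp\"   ' assumed network share
Const LOCALDIR = "C:\Apps\MyApp\"           ' assumed local folder

Dim fso, sh, serverVer, localVer
Set fso = CreateObject("Scripting.FileSystemObject")
Set sh = CreateObject("WScript.Shell")

serverVer = fso.OpenTextFile(SERVER & "version.txt").ReadAll
localVer = ""
If fso.FileExists(LOCALDIR & "version.txt") Then
    localVer = fso.OpenTextFile(LOCALDIR & "version.txt").ReadAll
End If

If serverVer <> localVer Then
    ' New version on the server: pull down the front end and version file
    fso.CopyFile SERVER & "frontend.mdb", LOCALDIR & "frontend.mdb", True
    fso.CopyFile SERVER & "version.txt", LOCALDIR & "version.txt", True
End If

' Launch the local copy (Access opens via the .mdb file association)
sh.Run """" & LOCALDIR & "frontend.mdb" & """"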
Perhaps you can implement something similar to this:
Make a batch file on the server (network drive).
Create a shortcut link to that batch file.
Copy the shortcut to the user's desktop.
When the user double-clicks the shortcut, it will copy a fresh copy from the network to the local machine.
Replace the old database.adp on the server drive when you release a new version.
Each user gets a copy of database.adp on their machine.
Instructions to remove the security warning when opening a file from a network share are here.
Batch File
@ECHO OFF
REM Copy from the network drive to the local machine
REM ("echo F |" answers xcopy's file-or-directory prompt on the first copy)
echo F | xcopy "Your_Network_Drive\database.adp" "C:\User\database.adp" /Y /R /F
REM Call your database file - Access 2007
"C:\Program Files\Microsoft Office\Office12\MSAccess.EXE" "C:\User\database.adp"
This is a very old post, but I used the autofeupdater until it stopped working, so I wrote one of my own; it has evolved over the last few years into something that I have used with many clients. It's very simple to use and there is no interface, just an EXE and a very simple config file.
Please check it out here; I can also help with custom solutions if none of the configurations work for your needs. http://www.dafran.ca/MS-Access-Front-End-Loader.aspx
After trying all of the solutions above (not exactly these solutions, but these are the common suggestions in the Access community), I developed a system entirely within Access, using VBA, that allows an admin DB to create and publish objects to client DBs without the need for user intervention or management of multiple DB files.
This approach has several benefits:
1. It simplifies the development process by having a dedicated environment (admin DB) for development and testing totally separate from the client DBs.
2. It simplifies the update/distribution process by allowing a developer to push out updates in real time that client DBs can implement in the background, without involving users. It can also allow devs to roll back to previous versions if desired.
3. It could be used as a kind of change management system within Access for developers who want to commit multiple changes to objects and modules and retain past changes.
4. It allows for easier user access control by allowing an admin to easily assign certain objects to specific users/roles without needing to maintain multiple versions of the DB.
I will hopefully post the code to GitHub soon; I just have to get clearance from my workplace to release it. I will edit this post to include the link when I have.
We have usually kept the Access front ends on network drives and just put up with the need to compact and repair on a regular basis. You will probably find you need to do that even when they are installed locally, anyway.
If you must have it installed locally, there are various tools which will enable you to "push out" software updates, and the guys over on Server Fault would have more information on those. Assuming such tools aren't available, the only other option I can think of is to write a small loader program that checks the local .MDB against a master copy on the server and re-copies it across if they are different, before launching the MDB, which is essentially the launcher approach sketched above.