Remote Differential Compression with Diff Tools like Beyond Compare

Should I turn off Remote Differential Compression (RDC) on Windows 11 when using diff tools like Beyond Compare to sync files between two computers?
Sync between two Windows 10 computers is much faster than sync between Windows 10 and Windows 11, and I was wondering if this is related to RDC.

Beyond Compare doesn't do anything to specifically take advantage of Remote Differential Compression (RDC).
Scooter Software hasn't tested copy performance with RDC on vs. RDC off; for internal testing we generally use the default settings for each version of Windows.

Related

Migrate 100+ virtual machines from on-prem to Azure

Apologies if this is the wrong platform for this question.
If I want to migrate 100+ VMs onto Azure VMs, what do I need to consider, and how can I migrate them?
This is not a comprehensive answer but some things to consider are:
- Start with a thorough inventory of the VMs to migrate. Issues to watch out for include:
  - Any unsupported OS versions, including 32-bit.
  - Large numbers of attached drives.
  - Disk drives larger than 1 TB.
  - Gen 2 VHDs.
  - Application and network interdependencies which need to be maintained.
  - Specific performance requirements (e.g., any VMs that would need Azure premium storage, SSD drives, etc.).
In developing a migration strategy, some important considerations are:
- How much downtime can you tolerate? To minimize downtime, look at solutions like Azure Site Recovery, which supports rapid switchover. If downtime is more flexible, there are more offline migration tools and scripts available.
- Understand whether to move to the new Azure Resource Manager or the Service Management deployment model. See https://azure.microsoft.com/en-us/documentation/articles/resource-group-overview/.
- Which machines to move first (pick the simplest, with the fewest dependencies).
- Consider cases where it may be easier to migrate the data or application to a new VM rather than migrate the VM itself.
A good forum to ask specific migration questions is the Microsoft Azure Site Recovery forum.
Appending to sendmarsh's reply
The things you will have to consider are:
Version of the virtual environment, i.e., VMware or Hyper-V.
OS version, RAM size, OS disk size, OS disk count, number of disks, capacity of each disk, hard disk format, number of processor cores, number of NICs, processor architecture, network configuration such as IP addresses, and generation type if the environment is Hyper-V.
I may have missed a few more things, like checking whether the VMware tools are installed. Some configurations are not supported; for example, an iSCSI disk will not be migrated. Microsoft does not support all naming conventions for the machines, so be careful in setting the name, as that might affect things later.
A full list of prerequisites is over at:
https://azure.microsoft.com/en-us/documentation/articles/site-recovery-best-practices/#azure-virtual-machine-requirements
Update: Using PowerShell to automate the migration would make your life easier.

DotNetZip performance issue, but only on one specific server

I'm having a weird performance problem with the DotNetZip library.
In the application (which runs under ASP.NET), I read a set of files from the database and pack them on the fly into a zip file for the user to download.
Everything works fine on my development laptop. A zip file of about 10 MB at the default compression level takes around 5 seconds to finish. However, on the customer's dev server, the same set of files takes around 1-2 minutes to compress, and I've experienced even longer times, up to several minutes. CPU utilization is 100% while the zipping is running, but it stays around 0% otherwise, so it's not due to overload.
What's even more interesting is that on the production server, it takes about 20 seconds to finish.
Where should I start looking?
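For reference, the zipping code follows roughly this pattern (a simplified sketch, not the exact production code; the class and method names here are illustrative):

    using System.Collections.Generic;
    using System.IO;
    using Ionic.Zip; // DotNetZip

    public static class ZipDownload
    {
        // Streams a zip of the in-memory file contents straight to the
        // response output stream, so no temporary file is written to disk.
        public static void WriteZipTo(Stream output, IEnumerable<KeyValuePair<string, byte[]>> files)
        {
            using (var zip = new ZipFile())
            {
                foreach (var file in files)
                {
                    // Each blob read from the database becomes one zip entry.
                    zip.AddEntry(file.Key, file.Value);
                }
                zip.Save(output); // default compression level
            }
        }
    }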
Some hardware specs:
My Laptop
Development environment running in a VirtualBox VM with 2 cores and 4 GB RAM dedicated.
Core i5 M540 2.5 GHz
8 GB RAM
Win7
Dev Server
According to the My Computer properties dialog (probably virtualized)
Intel Xeon 5160 3 GHz
540 MB RAM
Windows 2003 Server
Task Manager Reports Single Core
Production Server
According to the My Computer properties dialog (probably virtualized)
Xeon 5160 3 GHz
512 MB RAM
Windows 2003 Server
Task Manager Reports Dual Core
Update
The servers are running on a VMware host. Found the VMware icon hiding in the taskbar.
As Mitch said, the virus scanner would probably be your best bet. That, combined with the dev server being a single-core machine and the production server being dual-core (and probably without a virus scanner), may explain the delay. It would also be valuable to know the type of disk in those machines: if the production server and your laptop have SSDs while the dev server has a very old standard hard disk with low RPM, for example, that would also explain a delay. Try getting a view of the I/O reads/writes for the zip folder on the dev server and the production server; you can use the SysInternals tools for that, and if a virus scanner or any other unexpected process is running, you'll probably see a difference there. The SysInternals tools can help you find the culprit quickly.
UPDATE: Since you commented that the zip is created in memory, I'd add that you can also use those tools to get a better understanding of what happens in memory. A delay of several minutes where you'd expect almost equal results (because the dev and production servers are a lot alike) makes me think of the page file. See whether other processes on the dev server have claimed a lot of memory; if there isn't enough left for the zip operation, the dev server will start using the page file, which is very expensive.
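If you also want hard numbers from inside the process, a minimal timing/memory sketch (assuming the zip is built in memory as you described; the payload and names here are illustrative):

    using System;
    using System.Diagnostics;
    using System.IO;
    using Ionic.Zip; // DotNetZip

    class ZipTiming
    {
        static void Main()
        {
            var sw = Stopwatch.StartNew();
            long workingSetBefore = Environment.WorkingSet;

            using (var output = new MemoryStream())
            using (var zip = new ZipFile())
            {
                // Dummy 10 MB payload standing in for the database files.
                zip.AddEntry("sample.bin", new byte[10 * 1024 * 1024]);
                zip.Save(output);
            }

            sw.Stop();
            long workingSetDelta = Environment.WorkingSet - workingSetBefore;
            Console.WriteLine("Elapsed: {0}, working set delta: {1} bytes",
                              sw.Elapsed, workingSetDelta);
        }
    }

A long elapsed time together with a large working-set delta (and heavy page-file activity in Task Manager) points at memory pressure rather than raw CPU speed.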
The hardware seemed to be the problem here.
The customer's IT guys have now upgraded the server hardware on which the virtualized dev server runs, and I now see compression times of about 6 seconds for the same package size and number of files as on my local computer.
The specs now found in the My Computer properties window:
AMD Phenom II X6 1100T
3.83 GHz, 1.99 GB RAM

Multi-platform development from one computer

I am planning to build a new development computer for both Windows & Linux platforms. On Windows, my development would be primarily in .NET/C#/IIS/MSSQL Server. On Linux—preferably Ubuntu—my development would be in Ruby and Python.
I am thinking of buying a laptop with Windows 7 pre-installed with 4GB RAM, Intel Core 2 Duo, and 320 GB HD; running 2 VMs for both Windows and Linux development with the host OS as my work station. Of course, I would be running DBs and web servers on the respective platforms.
Is this a typical setup? My only concern is running two VMs side by side. Not sure if this configuration would be optimal. Alternative would be to do my Windows development on the host Windows 7 OS. What are your thoughts?
I really like using VMs for development, because it makes it really easy to maintain different configurations, make backups, test comms between machines, experiment, and so forth.
Linux VMs work pretty well. Windows in a VM on Windows, however, can be a resource hog. You probably want more than 4 GB on the laptop.
If you're not going to be switching between the two platforms frequently, I would recommend repartitioning your hard drive after you get your machine, and installing Windows in one partition and Linux in the other. Doing things that way is usually simpler, in that you don't need the overhead of the VMs.
Sounds like you will get 15 minutes out of this laptop's battery, maybe 20.
Speaking from experience, you will prefer a desktop plus a "more mobile" laptop. You can do this without spending more than you had budgeted (remember you can skip the monitor on the desktop), though that will probably get you slightly lower specs in exchange for the flexibility and a laptop you really can take with you. But I recommend spending slightly more than you would on the single laptop, and remember you do get two machines out of it.
You can network between them (e.g. use remote desktop programs from the laptop to connect to VMs on the desktop).
In my particular case, about 6 years ago I needed a new machine that could be used for on-location photography and had the power, screen-size, disk space, etc. to run Photoshop and other tools (e.g. batch processing ~900 largish images was one use case). I got a beefy laptop, which worked great for that, but the battery quickly died and never had much lifetime in the first place. The system has always been more of a "slightly-easier-to-move desktop" than a laptop, and it sounds like you'd rather have a real laptop.

What do I need to know about running my own dedicated server (with Windows 2008)

I'm thinking of getting my own dedicated server with the following stats:
Processor: Celeron 440 2.0 GHz
Memory: 1 GB
Primary Hard Drive : 160 GB SATA II
This will be running Windows. I have some experience with my local IIS and playing around with servers, but I have never set one up (at least a Windows one) and I've never dealt with DNS/backup/security issues.
My question has two parts:
Will this server be able to run Windows 2008, SQL Server, and possibly Exchange without trouble? I'm worried about the processor and RAM.
Are there any guides/tutorials that cover how to administer a Windows server from start to finish? (I'm looking for something like the FAQs Slicehost has for *nix-based servers.)
You WILL run into problems with RAM. Refer to the MS documentation on minimum requirements (SQL Server and Exchange). Also bear in mind that new releases of Exchange run only on 64-bit systems.
Personally, I would recommend installing the Core version of W2K8 if you plan to go with your described configuration.
It depends on user load. If you have about 1k unique users per month, that probably means about 30 users per day - roughly 1 per hour. I think you will use more CPU working on this computer yourself. So it really depends on user load.
If I were you, I would add more RAM to get to about 4 GB. RAM is the cheapest upgrade available.
You state "I've never dealt with DNS/backup/security issues."
I would suggest to you that these are the most important issues. You need to stay on top of security: applying security patches, ensuring firewalls are properly configured, etc.
Having been called after the fact for websites that have been hacked, I can tell you it is not pretty. Learn all you can before you stand up this server on the internet.

What are the key use cases for use of virtualization in software development?

What are the key use cases for the use of virtualization -- that is, running one or more "virtual PCs" using software such as VMware and Microsoft Virtual PC -- for software development?
Also -- are there other instances/uses of virtualization that aren't covered by my definition above (use of a tool like MS Virtual PC or VMWare), and that are useful to developers?
My impetus for asking is this StackOverflow comment by Metro Smurf asserting "You'll wonder how you ever developed without it!", regarding use of virtualization.
(Please include just one use case per response. Thanks!)
Application testing in multiple environments is one obvious use of virtualization that I'm aware of. Testing your application on other operating systems (without requiring additional physical computers to do so), and testing that involves software which only allows you to install a single version on a given machine (such as the Internet Explorer browser; running both IE6 and IE7 on the same machine is not an officially supported configuration), are good candidates for virtual machine usage.
If your build server is running in a VM, you can make a snapshot of it for every software release in order to be 100% sure that you can recreate the build environment (in case you want to make patches to old releases, for example).
If you set up snapshots of your development environment (and back them up), it can be very easy to resume productivity if your computer breaks down. When your machine dies right before your release, being able to resume immediately with all your tools installed and configured can be a lifesaver.
The simplest case, which applies to my current situation, is that we have a complex client-server environment, and with virtualization every developer can very quickly get a baseline set of operating systems to deploy their local build to and verify end-to-end functionality.
Locally you have your dev box, and N client boxes which get re-initialized as fresh OSes each time you want to try a build. Essentially it's the test environment equivalent of a 'make clean' where even the client workstation gets replaced with a new OS.
Quickly distributing environments between team members is a very nice use case for virtualization, especially if you have a lot of various components, tools, etc. This can save you a ton of time with new hires, contractors, or other individuals who need an environment quickly.
Many presenters use a VM for presentations - it allows them to revert immediately to reset the presentation for the next day, transfer all presentation materials quickly between computers, and avoid showing attendees their messy My Documents folder.
Using virtualization for sales activities is also a great use case. You can take a snapshot at a particular time and save it as your demo baseline. Then, once you have run through the demonstration and changed the data, etc., you can restore to your previous baseline for future demonstrations. You can also capture multiple baselines and pick whichever baseline best fits the upcoming demo.
Test environments. If you have more than one setup that a system needs to be targeted for (e.g. Windows & Linux, XP & Vista), then a machine with lots of RAM and VMware (or one of the others) is a good way to manage the environments.
Another is developing on one system and targeting another. For example, at one point I did some J2EE work on a workstation running Linux where the client was IE 5.5. A VM with Windows 2000 and IE 5.5 let me test the application.
Reasons I use virtual machines for development:
- Isolating different development environments.
- Testing environments.
- Easy recovery after computer hardware failure/upgrade.
- The ability to "roll back" changes to my development environment if something corrupts it.
Currently, I am using VirtualBox for my VM setup. I used to use VirtualPC, but I REALLY hated not having any type of "snapshot" feature (like VMware and VirtualBox have).
We develop software for use in our SaaS application. Our production environment has a large number of servers, and their software environment needs to be absolutely predictable; we can't have ANYTHING extra installed on, or absent from, our development machines.
Moreover, our application requires a number of different server types in order to function properly (at least 7 last time I counted); mostly they can't be installed on the same (virtual) machine - at least, not without violating the "same software as production" requirement.
In order to have a consistent environment, it's necessary to use VMs. I don't know how anyone ever manages without them.
Snapshots and rollbacks are nice too, but I use them only occasionally (really useful during installation / upgrade tests).
Suppose you're developing a new version of your software and checking that the upgrade from the previous version works correctly... how long does it take to do a test cycle without being able to roll back the box? Do you have to reinstall the OS and then the old version? Can you guarantee that the uninstall really uninstalls everything?
Being able to test and retest your deployment process is a huge time savings.
Developing Add-Ins for different versions of Microsoft Office (using Visual Studio Tools for Office).
My main work machine has Office 2007. When I work with Add-Ins for Office 2003 I use a virtual machine with Visual Studio and Office 2003.
I'm surprised that nobody has mentioned the VMware record/replay feature (awesome video demo), which is great for debugging.
I have a headless server running ESXi which runs various machines for building installers (so I don't have to give up processing power on my desktop), automated testing (the server is faster than any desktop), and various test environments (about 20 different configurations) so that the support team can easily jump onto a configuration that closely matches a customer's system.
When you have one really beefy server running VMs that can be shared between the support, test, and dev teams, you introduce huge cost savings. In all, we're running ~25 VMs on ESXi (dual quad-core Xeon 2.5 GHz + 8 GB RAM) shared between 5-10 people. Some of the developers use Virtual PC, and I use VMware Workstation on my desktop. All of the Mac users here use VMware Fusion as well.
I am surprised that no one has mentioned the benefit of increased security by isolating, for example, the database server and web server in different VMs.
Server applications can benefit from VMs too: when one VM is not heavily used, the host can allocate its resources to other VMs.
Another sort of test environment: if you are working on malware (either writing it or developing a remedy against it), it is not wise to use your real OS. The only real disadvantage is that viruses can detect that they are running under virtualization. :( One reason they can do this is that VM engines emulate only a finite set of hardware.