We are distributing 230K smallish JPG files (873 MB total) on DVD. The install program will place these files in an Apache virtual folder.
Setup(.exe) is taking too long for our customers. Our initial approach was to create a ZIP, copy it from the DVD, and unzip it to the client's hard disk.
I just tried RoboCopy on a Windows 7 (64-bit) 4-core computer, with 16 threads. Pretty poor: over five hours.
Options : *.* /V /S /COPY:DAT /NP /MT:16 /R:5 /W:30
Copied:
   Dirs  :        6
   Files :   230236
   Bytes : 873.80 m
   Times :  5:28:56
The DVD needs to be discarded after use, so the files need to end up on the target machine. We also tried an ISO image. Not bad: it takes about 10 minutes to copy, and there is software for mounting the ISO as a drive letter, which can then be exposed as virtual folders in Apache, but the performance with Apache is not good (we used http://www.magiciso.com/ to mount it). Besides, an ISO is limited in size and read-only.
Now we are considering a Virtual Hard Disk (VHD): http://technet.microsoft.com/en-us/magazine/ee872416.aspx
But I have not given up on RoboCopy. Should I be using different switches? Or is a VHD the best way to go?
Target machines are Windows Server 2008 boxes with 4+ cores, 24 GB RAM, and 10 TB of disk.
I got the answer from a different thread. Basically, we are creating a Microsoft VHD (Virtual Hard Disk), filling it with the files using RoboCopy, and shipping the VHD.
See: Unzip too slow for transfer of many files
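For reference, a rough sketch of how the VHD can be created and filled from the command line; the path, size (in MB), drive letter, and source folder below are just examples rather than our actual setup. The first block runs inside diskpart (launched from an elevated prompt); the robocopy line runs afterwards from a normal prompt, and the vdisk gets detached in diskpart again before the VHD is shipped.
rem Inside diskpart: create, attach, partition, and mount the VHD
create vdisk file="D:\images.vhd" maximum=2048 type=expandable
select vdisk file="D:\images.vhd"
attach vdisk
create partition primary
format fs=ntfs quick label=Images
assign letter=V
exit
rem Back at the command prompt: fill the mounted VHD
robocopy C:\staging\images V:\ *.* /E /COPY:DAT /MT:16 /NP /R:5 /W:30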
I am using Windows 11 Pro (since late November '21), before that Win 10 Pro.
For many years I have been happily backing up my files with ROBOCOPY to a hidden NAS share, but recently I retired the NAS and moved to a USB solution (USB docking station, 2 TB drive), removing the drives between backups. This is where things started to go a bit weird, because some of the folders didn't appear on the backup drive. Using TreeSizeFree I can see that the folders and their sub-folders/files have copied across (and are all entirely visible there); I just can't see the folders in question in Explorer.
Here are a couple of sample lines from my script:
robocopy C:\Games E:\Games /v /e /xo /w:5 /r:2 /log+:E:\bak-Games.log /tee /copy:dat /dcopy:t
robocopy S:\ E:\Shared /v /e /xo /w:5 /r:2 /log+:E:\bak-Shared.log /tee /copy:dat /dcopy:d
The top line works as expected but the bottom one doesn't.
I've checked all the switches I'm using and none of them appear to do anything that might account for this behaviour.
I've tried a number of solutions, including the "/a-:sh" switch, making sure the files aren't hidden, and pre-creating the folder in question, but the folder still ends up invisible.
The only common factor I can see between the lines of the script that work and those that don't is that the ones that work copy from folders (into appropriately named folders on the backup drive), while the ones that don't copy from the root of a partition, i.e. (in the sample script lines above):
robocopy c:\games to e:\games. [WORKS AOK]
robocopy s:\ to e:\shared. [WORKS, ROOT FOLDER NOT VISIBLE]
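(In case it helps to see it spelled out, checking and clearing the attributes on the destination root from a command prompt looks like this; the drive letters match the script above, and this is only a sketch of the check, not a confirmed fix:)
rem List everything in the root of the backup drive, including hidden/system entries
dir /a E:\
rem Clear the System and Hidden attributes on the destination folder if it picked them up from the source drive's root
attrib -s -h E:\Shared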
Any advice appreciated :)
Thanks
James
My goal is to make an Electron application which synchronizes a client's folder with a server. To explain it more clearly:
If the client doesn't have the files present on the host server, the application downloads all of the files from the server to the client.
If the client has the files, but some files have been updated on the server, the application deletes ONLY the outdated files (leaving the unmodified ones) and downloads the updated files.
If a file has been removed from the host server but is present in the client's folder, the application deletes the file.
Put simply, the application has to make sure that the client has an EXACT copy of the host server's folder.
So far I have done this via wget -m; however, wget frequently did not recognize that some files had changed, and left clients with outdated files.
Recently I've heard of zsync-windows and the webtorrent npm package, but I am not sure which approach is right or how to actually accomplish my goal. Thanks for any help.
rsync is a good approach, but you will need to access it from Node.js.
An npm package like this may help you:
https://github.com/mattijs/node-rsync
But things will get slightly more difficult on Windows systems:
How to get rsync command on windows?
If you have SSH access to the server, an approach could be to use rsync through a Node.js package.
There's a good article here on how to implement this.
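Under the hood, such a wrapper just builds and runs an ordinary rsync command line; roughly something like this (the user, host, and paths are placeholders):
# Mirror a remote folder to the local client over SSH, removing anything that no longer exists on the server
rsync -az --delete -e ssh user@example.com:/srv/app/data/ /home/client/data/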
You can use rsync, which is widely used for backups and mirroring and as an improved copy command for everyday use. It offers a large number of options that control every aspect of its behaviour and permit very flexible specification of the set of files to be copied.
It is famous for its delta-transfer algorithm, which reduces the amount of data sent over the network by sending only the differences between the source files and the existing files in the destination.
For your use case:
If the client doesn't have the files present on the host server, the application downloads all of the files from the server to the client. This can be achieved with a simple rsync.
If the client has the files, but some files have been updated on the server, the application deletes ONLY the outdated files (leaving the unmodified ones) and downloads the updated files. Use --remove-source-files or --delete, depending on whether you want to delete the outdated files from the source or from the destination.
If a file has been removed from the host server but is present in the client's folder, the application deletes the file. Use the --delete option of rsync.
rsync -a --delete source destination
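Since --delete removes files on the destination, it can be worth previewing what a run would do first. A minimal sketch (the trailing slash on the source means "the contents of" the folder, and -n / --dry-run only reports what would happen without copying or deleting anything):
rsync -avn --delete source/ destination/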
Given it's a folder list (and therefore has simple filenames without spaces, etc.), you can pick the filenames with the code below:
# Get last item from each line of FILELIST
awk '{print $NF}' FILELIST | sort >weblist
# Generate a list of your files
find . -type f -print | sort >mylist
# Compare results
comm -23 mylist weblist >diffs
# Remove old files
xargs -r echo rm -fv <diffs
You'll need to remove the echo for rm to actually delete anything.
Next time you want to update your mirror, you can modify the comm line (by swapping the two file arguments) to find the set of files you don't have, and feed those to wget.
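A sketch of that reversed comparison, reusing the placeholder mirror URL from the rsync example below (wget -i - reads URLs from standard input, and -x/-nH recreate the directory structure without the hostname; depending on the path depth you may also want --cut-dirs):
# Files that exist on the server but are missing locally (arguments swapped)
comm -23 weblist mylist >missing
# Turn the names into URLs and fetch them
sed 's|^|https://mirror.abcd.org/xyz/xyz-folder/|' missing | wget -x -nH -i -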
or
rsync -av --delete rsync://mirror.abcd.org/xyz/xyz-folder/ my-client-xyz-directory/
(Note that rsync cannot fetch over http/https; the source needs to be an rsync:// module or an SSH path.)
As everyone knows, OVF (Open Virtualization Format) is a format for exporting virtual appliances; it helps in many respects and is reliable. I got to know about OVF from the wiki article Open Virtualization Format. Hypervisors like the VMware bare-metal hypervisor, VirtualBox, and Hyper-V provide their own tools for converting a VM to OVF/OVA formats; I got to know this from these helpful links: VMware, Hyper-V, VirtualBox.
But how can I create a custom OVF if I have only the VHD, VHDX, VDI, or VMDK files of some virtual machine?
Is there any difference between a VMDK and the VMDK from an exported OVF?
Is there any programmatic approach with which I can do this easily?
Thanks
A VMware OVF package consists of a sparse disk. I did it the simple way with the help of VirtualBox: VirtualBox provides a command-line option for disk conversion, so you can get your disk into the target format and then create the package. The package consists of an .OVF file and an .MF file, along with the disks, in one folder.
The .MF file contains SHA-1 checksums of all the files in the package.
The .OVF file contains the deployment configuration, i.e. controllers, disks, RAM, CPU, etc.
There's no need to study everything: just export some VM in OVF format, use that .OVF as a reference, make the changes you want, and update the checksums in the .MF file.
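A rough sketch of the VirtualBox side of that, with example file names; the .MF entries have the form SHA1(filename)= <hash>, so the recomputed hashes just need to be pasted back in that format:
# Convert whatever disk you have (VDI/VHD/VMDK/...) into a stream-optimized VMDK for the package
VBoxManage clonemedium disk source.vdi disk1.vmdk --format VMDK --variant Stream
# Recompute the checksums that go into the .MF file
sha1sum my-appliance.ovf disk1.vmdk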
How can I create a custom OVF?
A VMware OVF file is just an XML file. It contains information about resources like the disk file (.vmdk), the CD/DVD image (.iso), memory, vCPUs, network adapters, virtual disk size, and host configuration parameters.
For reference, you can export an OVF file from any VM that is already created/running on a host.
Is there any difference between a VMDK and the VMDK from an exported OVF?
We export the VMDK from the host, not from the OVF (the .ovf is just a file). I think the exported VMDK and the original VMDK are the same, because the exported VMDK can also be used to bring up a VM on the host.
Is there any programmatic approach with which I can do this easily?
You can update the OVF file using any programming language, but I prefer Python with an XML library.
I prefer to use an .OVA instead of an .OVF file.
Basically, an .OVA is a tar of the .VMDK, .OVF, .MF (checksum file covering all the files in the .OVA tar; optional), .iso (optional), etc.
If you use an .OVF file to bring up instances, you need to keep all the referenced files (.VMDK, .iso, etc.) in the same directory; otherwise there is a chance of files going missing or ending up in a different directory.
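For completeness, building the .OVA is just a matter of tarring those files together; a minimal sketch with example names (per the OVF spec the .ovf descriptor should be the first entry in the archive, and tar keeps the order you list the files in):
# Package the appliance as a single .OVA archive
tar -cvf my-appliance.ova my-appliance.ovf my-appliance.mf disk1.vmdk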
I have two computers that aren't networked. I need to replicate the folder structure of a drive on one computer and put it on the other computer. Both are Windows 7 machines. I don't need the files, just the folders/directories. The drive letter is the same on both computers (Y:). The computers are miles apart physically, but I do have access to the computer I'm trying to get the folder structure from via LogMeIn.
I am thinking I need to save the folder structure to a file using some process, move that file to my computer (via email or LogMeIn File Manager), and then run some process to put it on my computer.
Is there a better solution? If not, is there code out there to do this via VBA, a Cmd window, a .bat script, VB.NET, or VBS? I know I can write it in VBA, but I'd rather not reinvent the wheel if I don't have to.
I don't have a 'process'/program that does this, and LogMeIn File Manager doesn't do it either (I asked). There are lots of paths on this drive that I need, so creating them by hand would take a long time. I searched and found a lot of solutions that work with networked computers, but these computers are not networked, which is why I think I need to save the structure to a file. Again, I don't want all the files on the drive (it's huge and most of the files are unnecessary), just the folders.
Thanks.
Create a directory listing of the source computer and redirect the output to a text file:
dir /ad /b /s >> dirlist.txt
The switches to dir are (more info at MS TechNet):
/ad List only files with attribute of directory
/b Bare format (no heading information or summary)
/s Displays files in specified directory and all subdirectories
Transfer dirlist.txt from the source computer to the destination computer.
On the destination computer, use a for /F with that text file from the command prompt to create the directory structure:
for /F "delims=" %i in (dirlist.txt) do md "%i"
The switches to for are documented at MS TechNet
You have many options:
Windows command xcopy source destination /T /E. The /T option creates the directory structure, and the /E option includes empty directories.
Bundle the empty directory structure into an installer (perhaps as a zip file).
If the structure is relatively small and not likely to change, you could just put a bunch of md commands into a batch file.
Combination - xcopy the structure locally, zip it, transfer it, unzip it.
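A sketch of that combination approach with example paths (since the two machines aren't networked, the xcopy runs against a local staging folder on the source machine):
rem On the source computer: copy only the directory tree of Y:\ (no files) into a staging folder
md C:\staging\YTree
xcopy Y:\ C:\staging\YTree /T /E
rem Zip C:\staging\YTree with any tool, move the zip across (email or LogMeIn File Manager), then unzip it into Y:\ on the destination machine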
Just a conceptual question. A program file is compiled and linked with the required libraries into an exe file. Now, this file I understand is machine code, which the processor can understand. What I am curious about is how the OS plays a role. I would have thought the OS actually interprets the exe file? I mean, if I write a program in assembly, I could modify memory blocks anywhere; does the OS protect against this?
Yes the OS (specifically the loader) parses the executable file format. On Windows this is a PE (Portable Executable). On Linux, this is typically an ELF (Executable and Linkable Format) file. Generally, the OS doesn't concern itself with the actual code, however.
The loader determines which chunks of the program on disk go where in memory. Then it allocates those virtual address ranges, and copies the relevant parts of the file into place. Then it does any relocations required, and finally jumps to the entry point (also specified in the file format.)
The thing to remember is that almost all modern OSes protect processes from one another by means of virtual memory. That means every process runs isolated in its own virtual address space. So if Notepad writes to address 0x700000, it's not going to affect a variable that Word has at 0x700000. Because of the way virtual addresses work, those two addresses actually map to totally different addresses in RAM.
On x86 platforms, this security is provided by the Protected Mode and Paging features of the processor.
The key is that it is the hardware that prevents you from doing anything "bad".
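If you want to see the information the loader reads, the headers can be dumped with standard tools (dumpbin ships with Visual Studio, readelf with GNU binutils; the file names are just examples):
dumpbin /headers myprogram.exe     (Windows: PE headers - entry point, image base, sections)
readelf -h -l ./myprogram          (Linux: ELF header and the program headers the loader maps)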
Peering Inside the PE: A Tour of the Win32 Portable Executable File Format
Microsoft PE and COFF Specification
ELF Specification