How to transfer data from a Windows 7 machine to a Windows 2003 server using an Ant script or batch script?

I am using a Windows 7 machine. I would like to know how to transfer data from my local machine to a Windows 2003 server and create a directory on the target machine through an Ant script or a batch script.

Most systems have an admin share defined. Your C: drive is located at \\localhost\C$. Replace localhost with the name of your target system.
You should run net use n: \\servername\c$ to establish a connection. If you are not in a domain, you will need to specify a username and a password for the connection.
Once you map it, you can treat it like a local drive in your scripts in most situations. Then use whatever tool you are comfortable with to move the files; robocopy is a good one for this.
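For example, a minimal batch sketch of those steps (the server name, share paths, and account below are placeholders you would replace with your own):

rem map the admin share on the target server; the * prompts for the password
net use N: \\servername\c$ /user:servername\Administrator *
rem create the target directory and copy the data across
mkdir N:\data\incoming
robocopy C:\data\outgoing N:\data\incoming /E /Z
rem drop the mapping when finished
net use N: /delete

Once the share is mapped (or reachable by UNC path), an Ant build can do the same copy with its ordinary mkdir and copy tasks pointed at N:\ or \\servername\c$.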

Related

Is it possible to edit code on my own machine and save it to an account I've ssh'd into?

Scenario:
I'm using ssh to connect to a remote machine. I use the command line and run ssh <pathname>, which connects me to that machine. I want to edit and run code on that remote machine. So far the only way I know is to create, edit, and run the files in vi in the command window, because my only connection to that machine is that command window.
My Question is:
I'd love to be able to edit my code in VSCode on my own machine, and then use the command line to save that file to the remote machine. Does anyone know if this is possible? I'm using OS X and ssh'ing into a Linux Fedora machine.
Thanks!
Sounds like you're looking for a command like scp. SCP stands for secure copy protocol, and it builds on top of SSH to copy files from one machine to another. So to upload your code to your server, all you'd have to do is run
scp path/to/source.file username@host:path/to/destination.file
EDIT: As @Pam Stums mentioned in a comment below the question, rsync is also a valid solution, and is definitely less tedious if you would like to automatically sync client and server directories.
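For instance, a minimal rsync invocation over SSH (the paths and host are placeholders) would be:

rsync -avz path/to/project/ username@host:path/to/project/

The trailing slashes matter: with them, rsync syncs the contents of the local directory into the remote one rather than nesting a copy of the directory inside it.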
You could export the directory on the remote machine using NFS or Samba, mount it as a share on your local machine, and then edit the files locally.
If you're happy using vim, check out netrw (it comes with most vim distributions; :help netrw for details) to let you use macvim locally to edit the remote files.
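For example, netrw can open a remote file over scp straight from your local vim (username, host, and path are placeholders):

vim scp://username@host//path/to/source.file

Note the double slash after the host, which makes the remote path absolute; saving the buffer writes it back to the remote machine.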

Accessing external hard drive after logging into a remote machine using ssh command

I am doing an intensive computing project with a very old C program. The program requires a library called the Sun Performance Library, which is commercial software. Instead of purchasing the library myself, I am running the program by logging onto a Solaris machine in our computer lab with the ssh command, while the working directory that stores the output data is still on my local Mac.
Now a problem has occurred: the program uses a large amount of disk space to save intermediate results, and the space on my local Mac fills up quickly (the administrator allots 50 GB per user). These results are needed for the next stage of the computation, and I cannot delete any of them before the program finally produces its output. Therefore, I have to move the working directory to an external hard drive in order to continue. Obviously,
cd /Volumes/VOLNAME
is not the correct way to do it, because the remote machine will give me an error saying
/Volumes/VOLNAME: No such file or directory.
So, what is the correct way to do it?
sshfs recently added support for "slave mode", which allows you to do this. Assuming you have sshfs on Solaris (I'm not sure about this), the following command (run from your Mac) will do what you want: dpipe /usr/lib/openssh/sftp-server = ssh SOLARISHOSTNAME sshfs MACHOSTNAME:/Volumes/VOLNAME MOUNTPOINT -o slave
This will result in the MOUNTPOINT directory on the server being mounted to your local external drive. Note that I'm not sure whether macOS has dpipe. If it doesn't, you can replace it with one of the equivalent solutions at How to make bidirectional pipe between two programs?. Also, if your SFTP server binary is somewhere else, substitute its path.
The common way to mount a remote volume in Solaris is via NFS, but that usually requires root permissions.
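If you do have root on the Solaris machine and can export the volume from the Mac over NFS, the mount step itself is short; a sketch, where the Mac's hostname and the mount point below are placeholders:

# on the Solaris machine, as root, after the Mac exports /Volumes/VOLNAME over NFS
mkdir -p /mnt/volname
mount -F nfs machostname:/Volumes/VOLNAME /mnt/volname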
Another approach would be to make your application read its data from stdin and output its results to stdout, without using the file system directly. Then you could just redirect the data from/to your local machine through ssh. For instance:
ssh user@host </Volumes/VOLNAME/input.data >/Volumes/VOLNAME/output.data

Copying a file over ssh

I have connected to an ESXi server that I set up, using PuTTY on Windows. I want to capture the esxtop output in a CSV file to open with Excel, but I can't find the capture.csv file. I looked around and found that I might have to use the scp command.
I found this is the general syntax
scp local_file(s) user@hostname:destination_directory
Now, what will the hostname be here if I just want to copy capture.csv to a Windows drive?
I think your question is about copying a file from Linux/UNIX to Windows. To achieve the functionality of scp there is a freeware tool called WinSCP, which you can use for copying files. Or create a share on the Linux/UNIX side using Samba so the shared folder can be accessed from Windows.
When I output files on an ESX host and need to get them into my Windows environment, I just create them on the host itself in a directory on the same datastore as its drives, then connect from Windows via WinSCP and transfer them to my local Windows desktop.
Sadly, I don't think there is really a better way at this time.
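Alternatively, if PuTTY is already installed on the Windows side, its companion command-line tool pscp can pull the file directly; a sketch, assuming SSH access as root and that capture.csv was written to /tmp on the ESXi host (the hostname and paths are placeholders, and the hostname is simply the ESXi host's name or IP address):

pscp root@esxi-hostname:/tmp/capture.csv C:\exports\capture.csv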

Windows Service File I/O Exception

I am trying to write a Windows service on a PC running 64-bit Windows using Visual Studio 2008. In this service, I am trying to read a control file from an external drive located on a different machine on the same LAN. The path to the file from the reading machine goes through a mapped network drive (T:). I am using a TextFieldParser from the Microsoft.VisualBasic.FileIO namespace to read the file at T:\filename. I'm getting a file-not-found exception; however, the path to the drive works perfectly if I copy and paste it into Windows Explorer on the same machine.
Anyone know if there are any issues connecting in this manner and/or what I am doing wrong?
Thanks for your help.
You will need to make sure that the account under which the service runs has a drive mapping to T:, or, better, try using a UNC path (e.g. \\server1\someshare\filename). And you will still need to make sure that the account has access to the file. Try to use an account with its access rights limited to only what it needs, so not the Administrator account.

Multiple Website Backup

Does anyone know of a script or program that can be used for backing up multiple websites?
Ideally, I would like to have it set up on a server where the backups will be stored.
I would like to be able to add a website's login info, have the tool connect and create a zip file (or similar), and then have that archive sent back to the backup server to be saved.
It would also need to be able to run as a cron job so everything is backed up at least daily.
I can find PC-to-server backup tools that are similar, but no server-to-server remote backup scripts.
It would be heavily used, and it needs a GUI so less technical people can use it too.
Does anyone know of anything similar to what we need?
HTTrack website mirroring utility.
Wget and scripts
RSync and FTP login (or SFTP for security)
Git can be used for backup and has security features and networking ability.
7Zip can be called from the command line to create a zip file.
In any case you will need to implement either secure FTP (SSH secured) OR a password-secured upload form. If you feel clever you might use WebDAV.
Here's what I would do:
Put a backup generator script on each website (outputting a ZIP)
Protect its access with a .htpasswd file
On the backup server, have a cron script download all the backups and store them (see the sketch below)
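As a sketch of that last step (the URL, credentials, and paths are placeholders), a crontab entry on the backup server could fetch each site's ZIP nightly with wget, using the .htpasswd credentials:

# fetch site1's backup every night at 03:00, naming the archive by date
0 3 * * * wget --user=backupuser --password=secret -O /backups/site1-$(date +\%F).zip https://site1.example.com/backup.zip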