I'm new to the scripting world and need help. I have been using RoboCopy to copy a file from one server to another.
With RoboCopy, I used /XO to copy only newer files. Now I need a script that copies a new file from source to destination and then runs CrcTool.exe against the file that was just copied.
The problem is that I don't know the name of the file that was copied, so I don't know how to invoke CrcTool.exe against it.
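A possible approach (a hedged sketch, not the only way: the share paths and CrcTool.exe location/arguments below are assumptions) is to do the "copy only newer, then verify" loop in PowerShell yourself, so each copied file's name is in hand when CrcTool.exe runs:

$src = '\\server1\share\source'
$dst = '\\server2\share\dest'

Get-ChildItem -Path $src -File | ForEach-Object {
    $target = Join-Path $dst $_.Name
    # Mimic robocopy /XO: copy only when the destination is missing or older.
    if (-not (Test-Path $target) -or ($_.LastWriteTime -gt (Get-Item $target).LastWriteTime)) {
        Copy-Item -Path $_.FullName -Destination $target
        & 'C:\Tools\CrcTool.exe' $target   # verify the file we just copied (tool path/args assumed)
    }
}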
I am trying to copy files from one folder path to another folder path in the TFVC repository using a PowerShell script in CI, as shown below.
Copy-Item -Path $/SCSM/CMC/1.1.0.0/Dev/1.1.1.0/TSC/ServiceRuleScript -Destination $/SCSM/CCB/1.0.0.0/Dev/1.0.1.0/TSC/RulesEngine
When running the above script, I get the following error:
Copy-Item : Cannot find path 'C:\privateagent\_work\66\s\$/SCSM/CMC/1.1.0.0/Dev/1.1.1.0/TSC/ServiceRuleScript
We also tried the Copy Files task in CI; it executes, but we are unable to see the file in the destination folder path.
Please help me out with this task.
The problem is that your Source Folder is wrong.
Use this predefined variable:
$(Build.SourcesDirectory)\{your folder want to be copied}
It works on my side.
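For reference, the same location is available to an inline PowerShell step as an environment variable; a minimal sketch (the subfolders below just mirror the paths from the question and are assumptions):

# In a PowerShell build step, $(Build.SourcesDirectory) surfaces as BUILD_SOURCESDIRECTORY.
$src = Join-Path $env:BUILD_SOURCESDIRECTORY 'SCSM\CMC\1.1.0.0\Dev\1.1.1.0\TSC\ServiceRuleScript'
$dst = Join-Path $env:BUILD_SOURCESDIRECTORY 'SCSM\CCB\1.0.0.0\Dev\1.0.1.0\TSC\RulesEngine'
Copy-Item -Path $src -Destination $dst -Recurse -Force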
We are trying to copy a file from one path, "Masters\DbProject\V1.0.0.1\PowershellScripts",
to another path, "CCM/1.0.0.0/1.1.0.0/Dev/1.1.1.0", using the Copy Files task. Both are different paths. When we try to copy the file using the Copy Files task, we get the below error.
##[error]Unhandled: Not found SourceFolder: C:\privateagent\_work\66\s\Masters\DbProject\V1.0.0.1\PowershellScripts
But the file which we are trying to copy is in the below path: "C:\privateagent\_work\79\s\Masters\DbProject\V1.0.0.1\PowershellScripts"
We have specified the below path in our copy task:
Source folder in copy task - $(Build.SourcesDirectory)\Masters\DbProject\V1.0.0.1\PowershellScripts
The problem here in the copy task is that it should pick up the file from "C:\privateagent\_work\79\s\Masters\DbProject\V1.0.0.1\PowershellScripts", but it is trying to read it from "C:\privateagent\_work\66\s\Masters\DbProject\V1.0.0.1\PowershellScripts".
Please share your inputs.
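Each build definition gets its own numbered work folder on the agent, so "66" and "79" belong to different pipelines: the pipeline you are running resolves $(Build.SourcesDirectory) to folder 66, while the file was fetched by a different pipeline into folder 79. A quick diagnostic sketch to see what the current build actually has on disk:

# Print the resolved sources directory and its contents for this build.
Write-Host "Sources directory: $env:BUILD_SOURCESDIRECTORY"
Get-ChildItem -Path $env:BUILD_SOURCESDIRECTORY -Recurse -Directory |
    Select-Object -ExpandProperty FullName

If the Masters\DbProject folder is not listed, the current pipeline's repository mappings do not include it, and it needs to be added to the sources the build checks out.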
I am importing data from a CSV file using COPY FROM in PostgreSQL. It works flawlessly on my machine; however, if I were to clone the repository onto a new machine, the code would cease to function due to the file path being hardcoded starting at the Users directory of the computer.
In other languages, I would be able to use something like ./ or ~/ to start somewhere other than the absolute root of the file system, but I haven't found that functionality in T-SQL or Postgres.
What I have
COPY persons(name,address,email,phone)
FROM '/Users/admin/Development/practice/data/persons.csv';
How can I make that file path function on any machine the project gets cloned to?
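One option, as a hedged sketch (the database name and repo-root variable are hypothetical), is to load the file with psql's client-side \copy, which resolves the path relative to where psql is launched, so a repo-relative path works on any clone:

# Run from the repository root so 'data/persons.csv' resolves there.
# 'practice' as the database name is an assumption.
Set-Location -Path $repoRoot    # $repoRoot: wherever the clone lives on this machine
psql -d practice -c "\copy persons(name,address,email,phone) FROM 'data/persons.csv' WITH (FORMAT csv)"

Server-side COPY always resolves paths on the database server, which is why a relative path doesn't help there.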
I put my LINQPAD.config in the same folder as LINQPad.exe and it works well for the LINQPad GUI.
When I run the same .linq program via lprun.exe, it does not use the same LINQPAD.config file as LINQPad.exe; instead it expects a LINQPAD.config file in the same directory as the .linq file that is passed as an argument to lprun.exe.
This is quite inconvenient. Now I have two options: either copy LINQPAD.config to where my .linq script is, or copy my .linq script to where lprun.exe is.
lprun.exe's help doesn't mention anything about specifying a path to LINQPAD.config.
This is a bug and will be fixed in the next LINQPad 5 release (5.0.10). It should pick up the config file from the query folder only if one exists there; otherwise it should fall back to the lprun.exe folder.
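Until 5.0.10 ships, one workaround (a sketch; both paths below are hypothetical) is to stage the config next to the query just before invoking lprun:

# Copy the GUI's config next to the query, then run it (paths are assumptions).
$query  = 'C:\Scripts\MyQuery.linq'
$config = 'C:\Program Files (x86)\LINQPad5\LINQPAD.config'
Copy-Item -Path $config -Destination (Split-Path -Path $query -Parent)
& 'C:\Program Files (x86)\LINQPad5\lprun.exe' $query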
I've searched all over and can't seem to find a script that will work. All I want to do is have a "failover script" that will copy files from a directory on my server and paste them into a different folder on a different server on the network.
Then just use the Copy method:
Microsoft Copy method
Tutorial: copy from FTP to server
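If PowerShell is acceptable for the failover script, a minimal sketch (server and share names are hypothetical; assumes the destination folder exists and the account running the script has rights on both shares):

# Copy everything from the primary share to the failover share (names are assumptions).
Copy-Item -Path '\\server1\data\*' -Destination '\\server2\failover\' -Recurse -Force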
I think we have a problem in our FTP scripts that pull files from a remote server to a local machine. I couldn't find an answer in the product's knowledge base or its scripting documentation.
We are doing an MGET *.* and then an MDELETE *.* immediately after it. I think what is happening is that, while we are copying files down from the server, additional files are copied into the same directory, and then the delete command deletes everything from the server. So we end up deleting files we never copied down.
Is there a straightforward way to delete only the files that were copied, or is it going to be some sort of hack job where we generate a dynamic delete script based on what we actually copied down?
Answers that are product-specific would be much appreciated!
Here are the options I came up with, and what I ended up doing.
Rename the extension on the server, copy the renamed files, and then delete the renamed files. This could not work because there is no FTP rename command that works with wildcards (the Windows rename command does, by the way).
Move the files to a subdirectory on the server, copy the files from that location, and then delete them from the remote location. This could not work because there is no FTP command to move files on the remote server.
Copy the files down in one script and SHELL a batch file on the local side that dynamically builds a script to connect to the server and delete the files that were copied down. This is the solution I ended up using; a sketch of the same idea follows below.
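For reference, here is the "delete only what you downloaded" idea as a PowerShell sketch using .NET's FtpWebRequest instead of a generated batch script. The host, credentials, and local folder are hypothetical; the point is to snapshot the listing once and delete each file by name only after it has been downloaded.

# Hedged sketch: snapshot the listing, then download and delete each file by name.
# Host, credentials, and C:\inbound are hypothetical; assumes plain FTP.
$server = 'ftp://ftp.example.com/incoming/'
$cred   = New-Object System.Net.NetworkCredential('user', 'pass')

# Take the directory listing ONCE, so files arriving later are left alone.
$list = [System.Net.FtpWebRequest]::Create($server)
$list.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectory
$list.Credentials = $cred
$reader = New-Object System.IO.StreamReader ($list.GetResponse().GetResponseStream())
$files  = $reader.ReadToEnd() -split "`r?`n" | Where-Object { $_ }
$reader.Close()

foreach ($name in $files) {
    # Download this exact file.
    $wc = New-Object System.Net.WebClient
    $wc.Credentials = $cred
    $wc.DownloadFile("$server$name", "C:\inbound\$name")

    # Delete only the file we just copied down.
    $del = [System.Net.FtpWebRequest]::Create("$server$name")
    $del.Method = [System.Net.WebRequestMethods+Ftp]::DeleteFile
    $del.Credentials = $cred
    $del.GetResponse().Close()
}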