Space in Directory Parameter of svcutil.exe - wcf

I'm attempting to download metadata for a WCF service using svcutil but I'm running into issues with the /directory:<> parameter. The directory I want to save to has a space in it:
C:\Service References\Logging
so when I execute /t:metadata I receive the following error:
Error: The directory 'C:\Program Files (x86)\Microsoft SDKs\Windows\v8.0A\bin\NETFX 4.0 Tools\References\Logging' could not be found. Verify that the directory exists and that you have the appropriate permissions to read it.
It looks to me like the space in "Service References" is causing the issue. From my (very limited) understanding of the command shell, spaces act as argument delimiters. So I tried escaping the space with a caret
Service^ References
and surrounding the path in double quotes
"C:\Service References\Logging"
but neither of those seems to work, as the /directory: parameter doesn't accept them as valid characters in the value. I haven't been able to find any guidance on this for svcutil, so I'm at a loss right now.
I could download the files to a temp folder and then move them, but I would prefer not to take that approach.
I would appreciate any direction that could be given on trying to resolve this.
Thanks in advance.
-- EDIT --
This is the full command that I'm trying to run. If you try it yourself, you'll have to substitute your own WCF service, as this one is on an internal IP address:
svcutil /t:metadata http://dev.taskservices.noelnet.com/LoggingService/LoggingService.svc /d:C:\Service References\Logging\

According to the documentation for svcutil
/directory: - Directory to create files in (default: current directory) (Short Form: /d)
Since the default is to use the current directory, let us change the current directory for the command.
pushd "C:\Service References\Logging\"
svcutil /t:metadata http://dev.taskservices.noelnet.com/LoggingService/LoggingService.svc
popd
If you do not need to revert to the original directory, you can just use cd "C:\Service References\Logging\".
Note: in order for this to work, svcutil must either be called using its entire path or have its path listed in the PATH environment variable. This is what I mean by calling it using its entire path:
cd "C:\Service References\Logging\"
"C:\Program Files (x86)\Microsoft SDKs\Windows\v8.0A\bin\NETFX 4.0 Tools\svcutil.exe" /t:metadata http://dev.taskservices.noelnet.com/LoggingService/LoggingService.svc

Related

asp.net core gulp path too long

The gulp files installed in an ASP.NET 5 web project use up most of the maximum path length. If your project path is more than a few characters long, the folders cannot be deleted.
This post refers to how to build using a short temp directory:
"Path too long" when publishing asp.net 5 from Visual Studio 2015
The question is, how do you easily remove these files when you need to clean up, restore, or archive a project?
The simple answer is file system basics. Create a directory in the root of the same drive as your project and give it a really short name (like "c:\t"). Then move all the files in node_modules there, and delete them from the short path.
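A quick cmd sketch of those steps (the project path and the short folder name are just examples):
mkdir c:\t
move "C:\Projects\MyWebApp\node_modules" c:\t
rem the move shortens the path prefix, so the nested folders can now be deleted
rmdir /s /q c:\t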
I hear MS is working on a more workable gulp folder structure.
The reason you are hitting the Windows path length limit (MAX_PATH, 260 characters) is NPM's nesting of package dependencies, which is a known Node issue on the Windows stack. You should update NPM to the latest version, 3.0 or greater, which uses a flat layout for package dependencies. This will help you avoid the "unable to delete" problem, because you will never end up with paths beyond the limit.
Perform the following:
1) Update NPM on your machine, by updating to the latest version of Node (download from https://nodejs.org/download).
2) Update Visual Studio 2015 External Web Tools to point to the folder with the new tools (Tools > Options > Projects and Solutions > External Web Tools).
Usually this is:
C:\Program Files\nodejs
or
C:\Program Files (x86)\nodejs
Make sure this is the top option in the list of paths.
3) (On automated build) Make sure that Visual Studio does not use the packaged NodeJS version when building your project by passing in the following parameter to MSBuild.
/p:ExternalToolsPath="C:\Program Files\nodejs"
or (x86) if applies:
/p:ExternalToolsPath="C:\Program Files (x86)\nodejs"
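For example, a full invocation might look like this (the solution name is hypothetical):
msbuild MySolution.sln /p:ExternalToolsPath="C:\Program Files\nodejs"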
After a lot of hunting around, I found out about robocopy, and this command has been my friend ever since. I use the following steps to remove a file or folder when the Windows path is too long:
1) Create a folder anywhere in your system to use as a source (leave it empty).
2) Back up the folder you want to delete (if it contains anything important).
3) Open a command prompt.
4) Type the following command, modifying the placeholders to suit your needs:
robocopy C:\path-to-source-empty-folder E:\path-to-folder-you-cant-delete /purge
Note: If there are spaces in source or destination path in Step 4, the path must be enclosed by quotation marks.
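For example, with made-up paths that contain spaces:
robocopy "C:\temp\empty folder" "E:\projects\folder you cant delete" /purge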
After the command executes successfully, robocopy prints an execution report.
Everything inside the destination folder will be deleted forever.
You can also type robocopy at the command prompt to see its other options.
I hope this helps.

Using SSIS package to zip all the txt files and move to related folder [duplicate]

I am trying to zip the contents of a folder in SSIS. There are files and folders in the source folder, and I need to zip them all individually. I can get the files to zip fine; my problem is the folders.
I have to use 7-Zip to create the zipped packages.
Can anyone point me to a good tutorial? I haven't been able to implement any of the samples that I have found.
Thanks
This is how I have configured it.
It's easy to configure, but the trick is in constructing the Arguments. Though you see the Arguments as static in the screenshot, they actually come from a variable, and that variable is set in the Arguments expression of the Execute Process Task.
I presume you will have this Execute Process Task in a Foreach File Enumerator with Traverse Subfolders checked.
Once you have this basic setup in place, all you need to do is work on building the arguments to do the zipping, how you want them. A good place to find all the command line arguments is here.
Finally, the only issue I ran into was not providing a working directory in the command-line arguments for 7-Zip. The package used to run fine in my dev environment but failed when running on the server via a SQL job. This was because 7-Zip didn't have access to the 'Temp' folder on the SQL Server, which it uses by default as the working directory. I got around this problem by specifying the working directory at the end of the command-line arguments, using the -w switch.
For example:
a -t7z DestinationFile.7z SourceFile -wS:YourTempDirectoryToWhichTheSQLAgentHasRights
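As a fuller illustration, the complete command line might look something like this (the 7z.exe location, archive and source names, and the working directory are hypothetical; only the switches come from the 7-Zip documentation):
"C:\Program Files\7-Zip\7z.exe" a -t7z "D:\Archives\Output.7z" "D:\Source\FolderToZip" -wD:\Temp7z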

Nuget: Is there a transformation token available to get the location of the package tools folder?

I am trying to use NuGet to distribute an MSBuild .targets file. I need to modify some elements of the file to include the installed path of a few assemblies. For that I would like to use the tools folder. I am having a hard time finding the token (if it exists) to do the replacement. Has anyone encountered this problem or know of a workaround?
http://docs.nuget.org/docs/creating-packages/configuration-file-and-source-code-transformations
You'll have to go the PowerShell route to get this done, as no transform exists AFAIK. The init.ps1 file can process some parameters provided by the NuGet VSIX.
Simply add the following to the top of the init.ps1 file and use the $installPath variable in your scripts that modify the file content.
param($installPath, $toolsPath, $package, $project)
Check here for an example usage.
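For illustration only, here is a minimal init.ps1 sketch along those lines; the .targets file name and the $ToolsPath$ placeholder are made-up examples, not a NuGet convention:
param($installPath, $toolsPath, $package, $project)

# Replace a placeholder token in the packaged .targets file with the
# actual tools folder of this package installation.
$targetsFile = Join-Path $installPath "build\MyPackage.targets"
(Get-Content $targetsFile) -replace [regex]::Escape('$ToolsPath$'), $toolsPath |
    Set-Content $targetsFile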

MSBuild Package Location

When I run MSBuild with the /t:Package parameter, I want to be able to specify where the folder that contains the *.cmd and *.zip files gets output. Specifying _PackageTempDir outputs the entire application without the deploy files (*.cmd and *.zip). Is there any way to specify this on the command line?
UPDATE:
The OutDir param outputs more than I need or want.
I've found that setting /p:DesktopBuildPackageLocation=some\package.zip for MSBuild doesn't work (it works when specified in pubxml though).
However, it turns out that setting /p:PackageFileName=some\package.zip works fine. Furthermore, you can use it along with /p:PublishProfile parameter.
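For example (the project name and paths are hypothetical):
msbuild MyProject.csproj /t:Package /p:PackageFileName=C:\foo\MyProject.zip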
If you set
<DesktopBuildPackageLocation>c:\foo\MyProject.zip</DesktopBuildPackageLocation>
you'll get the .zip file, the .cmd file, and the other related output files in c:\foo.

How can I prevent Visual Studio from locking the xml documentation files in the bin directory?

My Visual Studio solution includes a web application and a unit test application. My web application uses log4net. I want to be able to use MSBuild from the command line to build my solution. However, whenever I build the solution from the command line, I get build errors because it can't copy log4net.xml to the test project's bin directory.
The error message is:
"Unable to copy file '\bin\log4net.xml' to 'bin\Debug\log4net.xml'. Access to the path '\bin\log4net.xml' is denied."
It looks like Visual Studio is locking this file, but I can't figure out why it would need to. Is there a way to prevent VS from locking the XML documentation files in a project that it has loaded?
I've found the following solution:
In a VS post-build event, or in a NAnt/MSBuild script, execute the following cmd script:
handle.exe -p devenv [Path to the folder with locked files] > handles.txt
FOR /F "skip=5 tokens=3,4 delims=: " %%i IN (handles.txt) DO handle -p %%i -c %%j -y
handle.exe is available here http://technet.microsoft.com/en-us/sysinternals/bb896655.aspx
The first line of the script dumps to handles.txt all handles for files locked by VS.
The second line reads the handle IDs from the file and closes those handles.
After the script has executed, the files can be removed/replaced/moved, etc.
If you're fine with omitting the xml & pdb files altogether from the output, you can pass /p:AllowedReferenceRelatedFileExtensions=none to msbuild on the command line.
(Thanks to related answer https://stackoverflow.com/a/8757941/251011 )
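For example (the solution name is hypothetical):
msbuild MySolution.sln /p:AllowedReferenceRelatedFileExtensions=none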
EDIT: If you also have problems with dll files having this error, I recently discovered an environment variable solution: https://stackoverflow.com/a/23069603/251011
I've had this problem with Visual Studio, too. We use NAnt instead of MSBuild, but the problem is the same. I was able to work around it by modifying the build file to ignore failures when copying xml documentation.
Note that this doesn't actually solve the original problem since the xml files are still locked, but this workaround was good enough for us since the actual content of our xml documentation doesn't change very often.
Krystan wrote:
You could drop this file into another directory and reference it from there or place code that uses it into a library and have the post build event on that copy it to its bin directory and then reference.
Our xml file locking problem is not in the project's bin directory, but rather in an external reference directory. We hit it when performing TortoiseSVN->Update where a new version is available. I assume it's because VS is using the file for IntelliSense.
For those who hit this locking issue due to TortoiseSVN->Update: I'm currently experimenting with a pre-update hook that deletes the offending file(s) before updating (they will be restored if no update is needed). So far this seems to work (which is weird), but I haven't tested it thoroughly enough to say for sure. I will update this answer if it proves reliable.
Here's hoping MS fixes it in VS 2010.
Basically, don't check files into the bin folder; it's a bad idea.
You could drop this file into another directory and reference it from there or place code that uses it into a library and have the post build event on that copy it to its bin directory and then reference.
MSBuild will then copy that to the web project's bin directory for you :)
We have this exact issue with people checking stuff into the bin directory. Unless you absolutely have to, bin directories should either not be checked in at all or should contain only .refresh files, to avoid these sorts of locking issues.
Bit late on the reply, sorry :)