I'm trying to dig into the depths of TeamCity to get a better understanding of what it's doing under the hood (and to improve my own build knowledge). I noticed that when I run a build step it executes its own .cmd file, which I presume wraps up the MSBuild scripts. The problem is that whenever I look in the directory specified, the file doesn't exist; I'm guessing it is created, executed and deleted almost instantly. Any suggestions on how to access the file, or on what's inside it?
Starting: D:\TeamCity\buildAgent\temp\agentTmp\custom_script5990675507156014131.cmd
A temporary file like this is created by TeamCity when you add a Command Line build step with "Custom script" as the runner.
The content of the file is the custom script you specified in the user interface.
For a custom script that just runs echo Hello World, the produced output looks like this:
Step 1/1: Command Line (1s)
Starting: D:\TeamCity\buildAgent\temp\agentTmp\custom_script2362934300799611461.cmd
in directory: D:\TeamCity\buildAgent\work\c72dca7a7355b5de
Hello World
Process exited with code 0
In case anyone is still wondering about this, you can force echo back on.
Put this as the first line of the custom script:
@echo on
This undoes the echo suppression that TeamCity defaults to.
I looked around for a while, but there seems to be no configuration setting in TeamCity that allows keeping the generated files. If the commands being executed take some time, e.g. more than a couple of seconds, you can simply open the temp directory in Explorer and keep hitting F5 (refresh) from the moment a build starts until the .cmd file appears, then quickly right-click it and select 'Edit' to open it in a text editor. If that is too hard, you can try the solution presented here: create a PowerShell script with code like this:
# Watch the TeamCity agent temp directory for generated .cmd files
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = "D:\TeamCity\buildAgent\temp\agentTmp"
$watcher.Filter = "*.cmd"
$watcher.IncludeSubdirectories = $false
$watcher.EnableRaisingEvents = $true

# Append the content of any new or changed .cmd file to D:\log.txt
$action = {
    $path = $Event.SourceEventArgs.FullPath
    Add-Content "D:\log.txt" -Value (Get-Content $path)
}

Register-ObjectEvent $watcher "Created" -Action $action
Register-ObjectEvent $watcher "Changed" -Action $action

# Keep the script alive so the event subscriptions keep firing
while ($true) { Start-Sleep 1 }
and run it. When the build starts and creates a .cmd file, the PowerShell script will copy its content to D:\log.txt. This still won't work for very short-lived scripts, though. In that case I'd just make the script last longer by adding something like
ping 127.0.0.1 -n 5 -w 1000 > NUL
which will keep the script busy for roughly five seconds.
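If you would rather keep an exact copy of each generated .cmd file (name included) instead of appending its text to a log, the action block can copy the file instead; a small variation on the script above, where D:\captured is a made-up destination folder that must already exist:

# Alternative action: copy each new .cmd file to a folder, preserving its name
$action = {
    $path = $Event.SourceEventArgs.FullPath
    Copy-Item -Path $path -Destination "D:\captured" -Force
}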
I have a PowerShell script that gets all the computers from WSUS using PoshWSUS. I manually execute the script after opening PowerShell in admin mode.
I now have to execute the script from SSIS. I have inserted an Execute Process Task in the Control Flow. The executable is set to C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
This is the argument: -NoProfile -ExecutionPolicy ByPass -command ". c:\mypath\GetWSUSList.ps1" -verb runAs
I've tried many others, mostly those listed on this page: PowerShell: Running a command as Administrator
But none of them worked, and I am still getting an authorization error. Any help would be appreciated.
I found this link: Automate Running PowerShell Scripts that Require Admin elevation via SSIS. It seems similar to the issue you have, so you might be able to use it as a reference.
Here is the solution:
Step 1: Create a powershell script file. My script.ps1 is:
import-module poshwsus
ForEach ($Server in Get-Content $WSUSServers)
{
& connect-poshwsusserver $Server -port $WSUSPort | out-file $ProcessLog -append
& Get-PoshWSUSUpdateSummaryPerClient -UpdateScope (new-poshwsusupdatescope) -ComputerScope (new-poshwsuscomputerscope) | Select Computer, LastUpdated | export-csv -NoTypeInformation -append $FileOutput
}
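The script refers to a few variables ($WSUSServers, $WSUSPort, $ProcessLog, $FileOutput) that are presumably defined elsewhere in the original answer. If you need to fill them in yourself, something along these lines should work; all of the values below are made-up placeholders:

# Hypothetical values for the variables used above; adjust to your environment
$WSUSServers = "C:\Scripts\WSUSReport\servers.txt"   # text file with one WSUS server name per line
$WSUSPort    = 8530                                  # default WSUS HTTP port
$ProcessLog  = "C:\Scripts\WSUSReport\process.log"
$FileOutput  = "C:\Scripts\WSUSReport\WSUSReport.csv"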
Step 2: Create a .bat file, let's say it is called RunMyPS1.bat, like below.
@ECHO OFF
PowerShell -NoProfile -ExecutionPolicy Bypass -Command "& {Start-Process PowerShell -ArgumentList '-NoProfile -ExecutionPolicy Bypass -File ""C:\Scripts\WSUSReport\script.ps1""' -Verb RunAs}"
PAUSE
Note that using -Verb RunAs in the Start-Process call is what makes the script run elevated, so it is very important here.
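For reference, the same elevated launch can also be expressed directly in PowerShell without the .bat wrapper; a sketch, assuming the same script path as above (running it interactively will still trigger a UAC prompt):

# Launch an elevated PowerShell instance that runs the script
Start-Process PowerShell -Verb RunAs -ArgumentList '-NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\WSUSReport\script.ps1"'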
Step 3: Create a scheduled task to run your .bat file, named "RunMyBat" for example.
Open Task Scheduler and click "Create Task" in the right-hand menu. Under General, make sure the checkbox "Run with highest privileges" is checked; this is very important. Then go to the Actions section and add a new action by browsing to your .bat file.
Step 4: Run your scheduled task via SSIS
Add an "Execute Process Task" to your Control Flow. Make sure the executable of the task is set to C:\Windows\System32\schtasks.exe and the arguments to /run /TN "RunMyBat".
Step 5: Run your SSIS package.
Important: once the Execute Process Task has triggered the scheduled task, SSIS moves straight on to the next step (if any) without waiting for the scheduled task to finish. Therefore, if any later task uses the output or the data updated by your PowerShell script, insert a "Script Task" and add a sleep to make sure the PowerShell script has completed, for example:
System.Threading.Thread.Sleep(120000);
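If a fixed sleep feels too fragile, a hedged alternative is to poll for the file the elevated script produces (for example via another Execute Process Task running PowerShell); a sketch, assuming the CSV from step 1 lands at a known path:

# Wait up to 10 minutes for the report written by the elevated script, instead of sleeping a fixed time
$report   = "C:\Scripts\WSUSReport\WSUSReport.csv"   # hypothetical output path
$deadline = (Get-Date).AddMinutes(10)
while (-not (Test-Path $report) -and (Get-Date) -lt $deadline) {
    Start-Sleep -Seconds 15
}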
I'm using WSL and ConEmu build 180506. I'm trying to set up a task in ConEmu that uses the current directory of the active tab when opening a new console, but I cannot get it to work.
What I did was set up the task {Bash: bash} using the instructions on this page,
setting the task command to:
set "PATH=%ConEmuBaseDirShort%\wsl;%PATH%" & %ConEmuBaseDirShort%\conemu-cyg-64.exe --wsl -C~ -cur_console:pm:/mnt
Then, following the instructions on this page, I added this to my .bashrc:
if [[ -n "${ConEmuPID}" ]]; then
PS1="$PS1\[\e]9;9;\"\w\"\007\e]9;12\007\]"
fi
and finally set up a shortcut using the macro:
Shell("new_console", "{bash}", "", "%CD%")
But it always opens the new console in the default directory ('/home/[username]').
I don't understand what I'm doing wrong.
I also noticed that a lot of environment variables listed here are not set. Basically, only $ConEmuPID and $ConEmuBuild seem to be set.
Any help would be appreciated.
The Shell GuiMacro was intended to run certain commands, not tasks.
You may try running the macro Task("{bash}","%CD%") instead.
Or set your {bash} task parameters to -dir %CD% and just set a hotkey for your task.
Of course, both methods require working CD acquisition from the shell. That seems to be OK in your case - %d shows the proper folder.
I found the answer:
Shell("new_console:I", "bash.exe", "", "%CD%")
The readme is actually pretty good: https://github.com/cmderdev/cmder/blob/master/README.md
The question is all in the title: how do you use LuaDoc with Lua for Windows?
In my Lua installation I have a luadoc_start.bat, but the command window closes as soon as it opens.
From there I don't know what else I can do.
Any help ?
Thanks
To use LuaDoc in Lua for Windows, the command is like this:
luadoc_start.bat path\to\lua\file\name.lua
which is to be run in either a Command Prompt window or PowerShell.
I can get a doc file to generate properly, but how do I do it for the whole project? I understand there is a -d argument, but I am not sure how to use it; none of my attempts were successful.
For this task you'll need to write a small shell script. Here's a short PowerShell one:
# Run LuaDoc on every .lua file in the current directory
$files = Get-ChildItem .\ -Filter *.lua
foreach ($file in $files) {
    luadoc_start.bat "$file"
}
To use it, cd to the path\to\lua\file directory and run this .ps1 file there.
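If the goal is a single documentation tree for the whole project, the -d option mentioned in the question takes an output directory; a sketch that combines it with the loop above, assuming luadoc_start.bat simply forwards its arguments to luadoc:

# Generate docs for every .lua file under the current tree into a .\doc folder
$files = Get-ChildItem -Recurse -Filter *.lua
foreach ($file in $files) {
    luadoc_start.bat -d ".\doc" "$($file.FullName)"
}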
Just append a pause line to luadoc_start.bat and you will see the help screen.
I'm currently using CruiseControl.NET to automate my build. We have two different ways to build a solution in my build environment - one is through the Keil uVision4 IDE, the other is through Visual Studio 2008.
I've successfully gotten the Visual Studio solution to build properly using CruiseControl.NET and have created a batch file which properly uses Keil's uVision command line interface to compile my uvproj Project (compilation details here).
Problem Description
1) I can successfully execute the build script on my Windows 2008 server and build the project if I open a command prompt with administrator privileges (I'm doing this manually: Start -> Run -> cmd, with Ctrl-Shift-Enter to run as admin).
2) However, if I open a command prompt without administrator privileges and attempt to execute the batch file, it won't work unless I accept the prompt asking me to confirm that admin rights are required to run the script.
How do I automatically execute a batch file as an administrator through CruiseControl?
Is this something that could be automated using the RunAs command?
Technical details
1) The batch file being executed is pretty simple - it deletes the old output, re-creates the output directories, and writes a build log file to the location below.
set BuildLogLocation=BuildLog\BuildLog.txt
echo on
cd ../..
cd PTM
rmdir /s /q output
mkdir output
mkdir BuildLog
C:\Keil\UV4\UV4.exe -r myProj.uvproj -o %BuildLogLocation%
echo ErrorLevel of build is %ERRORLEVEL%
echo build complete, see %BuildLogLocation%
2) Currently I'm looking to use the Exec functionality to run the Keil build script above:
<Exec>
  <Command>C:\myProject\Build\KeilBuild\BuildScript.bat</Command>
  <buildTimeoutSeconds>600</buildTimeoutSeconds>
  <!-- Details about error codes can be found here:
       http://www.keil.com/support/man/docs/uv4/uv4_commandline.htm -->
  <successExitCodes>0,1</successExitCodes>
</Exec>
Related questions:
How can I use a build server with Keil uVision4 (MDK-ARM), script a build, use a makefile? (Electrical Engineering)
Execute a command-line command from CruiseControl.NET (Stack Overflow)
Can you run CCService, the CruiseControl.NET Windows Service, as a user who has administrative permissions? I'd try that first.
If that doesn't work, I would use runas to run your script. You'll have to embed the administrative user's password in the script calling runas.
I know this is old, but did you ever find an official way to do it via CruiseControl?
Normally I create the following and use it to call other processes "as admin".
Make a .vbs script with this as its contents:
Dim strBatchPath
strBatchPath = "PATH-TO-FILE.EXE"
' ShellExecute with the "runas" verb launches the target elevated (a UAC prompt appears if required)
Set runBatch = CreateObject("shell.application")
runBatch.shellexecute strBatchPath,,,"runas",1
That could be an option for people who can't find an official way.
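For completeness, the same "runas" launch can be done from PowerShell instead of VBScript, which may be easier to call from CruiseControl.NET; a sketch using the same placeholder path as above:

# PowerShell counterpart of the VBScript above: ShellExecute with the "runas" verb
Start-Process -FilePath "PATH-TO-FILE.EXE" -Verb RunAs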
You could try PsExec from Sysinternals. If you don't need to run as an NT AUTHORITY account, you should be able to use it in the same way as runas.
It allows you to pass in the username and password as switches (if memory serves).
I have discovered that when using PsExec with the -h switch, the command then "runs as admin" on the destination,
e.g.
psexec -h \\ServerToRunOn /accepteula -u DOMAIN\USER -p PASSWORD "PATH-TO-FILE"
I am using CC.Net to call a batch file containing the above, and it runs that file as admin.
I'm currently writing a library in C# and have been using PowerShell to quickly test it on occasion. However, this prevents me from rebuilding the project, as PowerShell obviously still has the DLL open.
Is there a way to unload the DLL again after adding it with Add-Type? The documentation doesn't seem to have any clues on that, and the obvious candidate would be Remove-Type (which doesn't exist – there is only one command with Type as its noun anyway). It gets cumbersome to close PowerShell, navigate back to the build directory and add the type again every time I want to rebuild.
Like the others say, this is .NET behavior. Assemblies loaded into an AppDomain cannot be unloaded; only the AppDomain itself can be unloaded, and PowerShell uses a single AppDomain. I blogged a bit about this some years ago:
https://web.archive.org/web/20170707034334/http://www.nivot.org/blog/post/2007/12/07/WhyAppDomainsAreNotAMagicBullet
When I test like this, I usually keep a shell open and use a nested shell to do the tests: start PowerShell, cd to the bin location, then run "powershell" to start a nested shell (a new process). Type "exit" to start over, and run "powershell" again.
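In practice that workflow is just a couple of commands; a rough sketch, where the bin path and MyLibrary.dll are placeholders:

# The outer shell stays open; the nested shell is a fresh process that owns the DLL
Set-Location C:\src\MyLibrary\bin\Debug   # placeholder build output path
powershell                                # start the nested shell
# inside the nested shell:
#   Add-Type -Path .\MyLibrary.dll
#   ...run the tests...
#   exit                                  # ends the nested process and releases the DLL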
I find the simplest way around this problem is to wrap the Add-Type call and the test code inside a Start-Job. Start-Job creates a background process, and the type is loaded there. Once you are done, the process goes away and you're free to retry.
Here's an example of how it looks:
$job = Start-Job -ScriptBlock {
    # The DLL is loaded only in this background process; consider a full path,
    # since jobs do not start in the caller's working directory
    Add-Type -Path 'my.dll'
    $myObj = New-Object My.MyTestClassName
    $result = $myObj.TestMethod()   # parentheses needed to actually invoke the method
    $result
}
Wait-Job $job
Receive-Job $job
The output from the test method will be echoed to the console.
If your assembly doesn't require a binding context you can do this:
# Loading from a byte array does not lock the DLL on disk, so the project can be rebuilt;
# the copy already loaded into the session still cannot be unloaded, though.
$bytes = [System.IO.File]::ReadAllBytes("Path_To_Your_Dll.dll")
[System.Reflection.Assembly]::Load($bytes)
Here is a complete example that runs the Add-Type command as a background job, so the assembly is unloaded once the job finishes:
# Start-Job will not preserve the working directory, so do it manually
# Other arguments can also be passed to the job this way
$cd = Split-Path $MyInvocation.MyCommand.Path
$jobParams = @{
'cd' = $cd
}
Start-Job -InputObject $jobParams -ScriptBlock {
cd $Input.cd
Add-Type -Path assembly.dll
} | Receive-Job -Wait -AutoRemoveJob
Receive-Job -Wait makes sure the output of the job is received; otherwise it would be lost.
I have been facing a similar problem. It is not possible to unload a type/assembly; the limitation comes from the .NET Framework itself.
In .NET you can work around it by creating a new application domain (System.AppDomain) and loading the assembly into that domain. It is possible to unload the app domain, and that unloads all of its DLLs as well.
I haven't tried it yet, because for me it is much simpler to close a tab in the console and open a new one.
Visual Studio Code:
Settings -> Extensions -> PowerShell Configuration -> Debugging: Create Temporary Integrated Console
Check the checkbox: "Determines whether a temporary PowerShell Integrated Console is created for each debugging session, useful for debugging PowerShell classes and binary modules."