I'm working on an Inno Setup script, and during uninstall I call a custom DLL to do a Revert operation. Unfortunately, after the uninstall completes, the DLL and its dependencies are not removed, despite the fact that I call UnloadDLL and DeleteFile (which returns False).
Why does UnloadDLL fail?
Is there a way to load the DLL dynamically with LoadLibrary? I have seen some functions for this, but they are all deprecated. The DLL is built with Visual Studio and has a C interface.
Here's the code:
function Revert(param: String): cardinal;
  external 'Revert@{app}\Revert.dll cdecl delayload uninstallonly';
procedure RevertAll();
var
  param: String;
  dataDirectory: String;
  temp: String;
  i: Integer;
begin
  dataDirectory := ExpandConstant('{commonappdata}\MyAppData');
  StringChangeEx(dataDirectory, '\', '\\', True);
  param := '{"dataDirectory": "' + dataDirectory + '", "registryPath" : "SOFTWARE\\MyReg\\Key"}';
  Revert(param);
  temp := ExpandConstant('{app}\Revert.dll');
  for i := 0 to 10 do
  begin
    UnloadDLL(temp);
    Sleep(500);
    if DeleteFile(temp) then
      break;
  end;
end;
procedure CurUninstallStepChanged(CurUninstallStep: TUninstallStep);
begin
  if (CurUninstallStep = usUninstall) then
  begin
    RevertAll();
  end;
end;
I would like to add a modest contribution given how many hairs I have lost myself trying to remove a DLL upon software removal.
I tried a few things and ended up doing the following:
case CurUninstallStep of
  usUninstall:
    begin
      Exec('powershell.exe', ExpandConstant('-NoExit -ExecutionPolicy Bypass Start-Job -FilePath {%TEMP}\mySoft\CleanUtils.ps1 -ArgumentList {%TEMP}\mySoft\'), '', SW_HIDE, ewNoWait, ErrorCode);
      Exec('powershell.exe', ExpandConstant('-NoExit -ExecutionPolicy Bypass -Command "Start-Job -ScriptBlock {{ Remove-Item -Recurse -Force {%TEMP}\mySoft }"'), '', SW_HIDE, ewNoWait, ErrorCode);
    end;
end;
A few things to notice:
UnloadDLL didn't work for me either, despite trying everything I could; the DLL would simply not get removed by Inno Setup.
The first Exec invokes a PowerShell shell. Notice the -NoExit. Without that, the job would simply exit without running.
Start-Job creates a detached background process that keeps running after the uninstaller quits.
The first instruction executes the following PowerShell script:
# Input directory
$temp_dir = $args[0]
Get-ChildItem "$temp_dir" -Force -Filter "*.dll" |
ForEach-Object {
    $file = $_.FullName
    while (Test-Path $file)
    {
        try
        {
            Remove-Item -Recurse -Force $file
        }
        catch
        {
            Start-Sleep -Seconds 5
        }
    }
}
That script iterates over the DLLs in the folder and removes them all.
The second Exec deletes the parent folder. I could have put that instruction into the PowerShell script, but I suspect I would have ended up with the same problem, that is, PowerShell itself needing that folder and thus never finishing.
Regarding that second Exec, the other difference is that the job is invoked via a -Command option. Without that, I would encounter the following error:
Start-Job : Cannot bind parameter 'ScriptBlock'. Cannot convert the "-encodedCommand" value of type "System.String" to type "System.Management.Automation.ScriptBlock".
At line:1 char:11
+ Start-Job -ScriptBlock -encodedCommand IABHAGUAdAAtAFAAcgBvAGMAZQBzAH ...
+ ~~~~~~~~~~~~
+ CategoryInfo : InvalidArgument: (:) [Start-Job], ParameterBindingException
+ FullyQualifiedErrorId : CannotConvertArgumentNoMessage,Microsoft.PowerShell.Commands.StartJobCommand
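For reference, once Inno Setup expands the {%TEMP} constant and the doubled braces, the command that second Exec hands to PowerShell boils down to roughly the following ($env:TEMP stands in for the expanded constant, and the folder name is just my example):
# Detached job that keeps running after the uninstaller process has exited
Start-Job -ScriptBlock { Remove-Item -Recurse -Force "$env:TEMP\mySoft" }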
This is far from perfect, but this is the best I could come up with after almost one week(!) of trying a bunch of different things.
Good luck to all the Inno fellows!
Not sure what the real problem is, but unloading the DLL manually with the Windows API works:
function GetModuleHandle(moduleName: String): LongWord;
  external 'GetModuleHandleW@kernel32.dll stdcall';
function FreeLibrary(module: LongWord): Integer;
  external 'FreeLibrary@kernel32.dll stdcall';
var
  lib: LongWord;
  res: Integer;

  { Keep releasing references until the DLL is unloaded; once it is gone, }
  { GetModuleHandle returns 0 and FreeLibrary(0) returns 0, ending the loop. }
  repeat
    lib := GetModuleHandle('Revert.dll');
    res := FreeLibrary(lib);
  until res = 0;
I'm trying to read a gz file line by line in Perl6, but I'm getting stuck.
I found How to read gz file line by line in Perl6, however that method, which reads everything into :out, uses far too much RAM to be usable except on very small files.
I don't understand how to use Perl6's Compress::Zlib to get everything line by line, although I opened an issue on their GitHub: https://github.com/retupmoca/P6-Compress-Zlib/issues/17
I'm trying to use Perl5's Compress::Zlib to translate this code, which works perfectly in Perl 5:
use Compress::Zlib;
my $file = "data.txt.gz";
my $gz = gzopen($file, "rb") or die "Error reading $file: $gzerrno";
while ($gz->gzreadline($_) > 0) {
# Process the line read in $_
}
die "Error reading $file: $gzerrno" if $gzerrno != Z_STREAM_END ;
$gz->gzclose() ;
to something like this using Inline::Perl5 in Perl6:
use Compress::Zlib:from<Perl5>;
my $file = 'chrMT.1.vcf.gz';
my $gz = Compress::Zlib::new(gzopen($file, 'r');
while ($gz.gzreadline($_) > 0) {
print $_;
}
$gz.gzclose();
but I can't see how to translate this :(
I'm also confused by the Archive::Libarchive example https://github.com/frithnanth/perl6-Archive-Libarchive/blob/master/examples/readfile.p6; I don't see how I can get something like item 3 here.
There should be something like
for $file.IO.lines(gz) -> $line { or something like that in Perl6; if it exists, I can't find it.
How can I read a large file line by line without reading everything into RAM in Perl6?
Update: Now tested, which revealed an error that is now fixed.
Solution #2
use Compress::Zlib;
my $file = "data.txt.gz" ;
my $handle = try open $file or die "Error reading $file: $!" ;
my $zwrap = zwrap($handle, :gzip) ;
for $zwrap.lines {
.print
}
CATCH { default { die "Error reading $file: $_" } }
$handle.close ;
I've tested this with a small gzipped text file.
I don't know much about gzip etc. but figured this out based on:
Knowing P6;
Reading Compress::Zlib's README and choosing the zwrap routine;
Looking at the module's source code, in particular the signature of the zwrap routine our sub zwrap ($thing, :$zlib, :$deflate, :$gzip);
And trial and error, mainly to guess that I needed to pass the :gzip adverb.
Please comment on whether my code works for you. I'm guessing the main thing is whether it's fast enough for the large files you have.
A failed attempt at solution #5
With solution #2 working I would have expected to be able to write just:
use Compress::Zlib ;
.print for "data.txt.gz".&zwrap(:gzip).lines ;
But that fails with:
No such method 'eof' for invocant of type 'IO::Path'
This is presumably because this module was written before the reorganization of the IO classes.
That led me to @MattOates' IO::Handle like object with .lines ? issue. I note no response and I saw no related repo at https://github.com/MattOates?tab=repositories.
I am focusing on the Inline::Perl5 solution that you tried.
For the call to $gz.gzreadline($_): it seems like gzreadline tries to return the line read from the zip file by modifying its input argument $_ (treated as an output argument, but it is not a true Perl 5 reference variable[1]), but the modified value is not returned to the Perl 6 script.
Here is a possible workaround:
Create a wrapper module in the current directory, e.g. ./MyZlibWrapper.pm:
package MyZlibWrapper;

use strict;
use warnings;
use Compress::Zlib ();
use Exporter qw(import);
our @EXPORT = qw(gzopen);
our $VERSION = 0.01;

sub gzopen {
    my ( $fn, $mode ) = @_;
    my $gz = Compress::Zlib::gzopen( $fn, $mode );
    my $self = {gz => $gz};
    return bless $self, __PACKAGE__;
}

sub gzreadline {
    my ( $self ) = @_;
    my $line = "";
    my $res = $self->{gz}->gzreadline($line);
    return [$res, $line];
}

sub gzclose {
    my ( $self ) = @_;
    $self->{gz}->gzclose();
}

1;
Then use Inline::Perl5 on this wrapper module instead of Compress::Zlib. For example ./p.p6:
use v6;
use lib:from<Perl5> '.';
use MyZlibWrapper:from<Perl5>;

my $file = 'data.txt.gz';
my $mode = 'rb';
my $gz = gzopen($file, $mode);
loop {
    my ($res, $line) = $gz.gzreadline();
    last if $res == 0;
    print $line;
}
$gz.gzclose();
[1]
In Perl 5 you can modify an input argument that is not a reference, and the change will be reflected in the caller. This is done by modifying entries in the special @_ array variable. For example: sub quote { $_[0] = "'$_[0]'" } $str = "Hello"; quote($str) will quote $str even if $str is not passed by reference.
I'm extending my Inno-Setup script with code that I can best implement in C# in a managed DLL. I already know how to export methods from a managed DLL as functions for use in an unmanaged process. It can be done by IL weaving, and there are tools to automate this:
NetDllExport (written by me)
UnmanagedExports
So after exporting, I can call my functions from Pascal script in an Inno-Setup installer. But then there's one issue: The DLL can't seem to be unloaded anymore. Using Inno-Setup's UnloadDLL(...) has no effect and the file remains locked until the installer exits. Because of this, the setup waits for 2 seconds and then fails to delete my DLL file from the temp directory (or install directory). In fact, it really stays there until somebody cleans up the drive.
I know that managed assemblies cannot be unloaded from an AppDomain anymore, unless the entire AppDomain is shut down (the process exits). But what does it mean to the unmanaged host process?
Is there a better way to allow Inno-Setup to unload or delete my DLL file after loading and using it?
As suggested in other answers, you can launch a separate process at the end of the installation that will take care of the cleanup, after the installation processes finishes.
A simple solution is creating an ad-hoc batch file that loops until the DLL file can be deleted and then also deletes the (now empty) temporary folder and itself.
procedure DeinitializeSetup();
var
  FilePath: string;
  BatchPath: string;
  S: TArrayOfString;
  ResultCode: Integer;
begin
  FilePath := ExpandConstant('{tmp}\MyAssembly.dll');
  if not FileExists(FilePath) then
  begin
    Log(Format('File %s does not exist', [FilePath]));
  end
  else
  begin
    BatchPath :=
      ExpandConstant('{%TEMP}\') +
      'delete_' + ExtractFileName(ExpandConstant('{tmp}')) + '.bat';
    SetArrayLength(S, 7);
    S[0] := ':loop';
    S[1] := 'del "' + FilePath + '"';
    S[2] := 'if not exist "' + FilePath + '" goto end';
    S[3] := 'goto loop';
    S[4] := ':end';
    S[5] := 'rd "' + ExpandConstant('{tmp}') + '"';
    S[6] := 'del "' + BatchPath + '"';
    if not SaveStringsToFile(BatchPath, S, False) then
    begin
      Log(Format('Error creating batch file %s to delete %s', [BatchPath, FilePath]));
    end
    else if not Exec(BatchPath, '', '', SW_HIDE, ewNoWait, ResultCode) then
    begin
      Log(Format('Error executing batch file %s to delete %s', [BatchPath, FilePath]));
    end
    else
    begin
      Log(Format('Executed batch file %s to delete %s', [BatchPath, FilePath]));
    end;
  end;
end;
You could add a batch script (run via cmd /c) to be executed at the end of setup that waits for the file to be deletable and deletes it (just make sure to tell Inno Setup not to wait for the cmd process to complete).
You could also make your installed program detect and delete it on first execution.
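If you go the second route, the idea is simply: on first start, look for the leftover file and remove it once the installer has exited and released the lock. A rough PowerShell sketch of that check (the path and file name are placeholders; in practice this logic would live in your application's startup code):
# Hypothetical first-run cleanup of the DLL the installer could not delete
$leftover = Join-Path $env:TEMP 'MyAssembly.dll'
if (Test-Path $leftover) {
    # By now the installer process has exited, so the lock should be gone
    Remove-Item $leftover -Force -ErrorAction SilentlyContinue
}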
As suggested in this Code Project article: https://www.codeproject.com/kb/threads/howtodeletecurrentprocess.aspx
call cmd with arguments as shown below.
Process.Start("cmd.exe", "/C ping 1.1.1.1 -n 1 -w 3000 > Nul & Del " + Application.ExecutablePath);
But basically, as @Sean suggested, make sure you don't wait for cmd.exe to exit in your script.
While not exactly an answer to your question, can't you just mark the DLL to be deleted next time the computer is restarted?
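For completeness, the "delete on next restart" trick is the MoveFileEx API with the MOVEFILE_DELAY_UNTIL_REBOOT flag. A minimal PowerShell sketch of that call (the DLL path is illustrative, and it needs administrative rights because the pending operation is recorded under HKLM):
# Schedule a locked file for deletion at the next reboot via MoveFileEx
Add-Type -Namespace Win32 -Name PendingOps -MemberDefinition @'
[DllImport("kernel32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
public static extern bool MoveFileEx(string lpExistingFileName, string lpNewFileName, int dwFlags);
'@
$MOVEFILE_DELAY_UNTIL_REBOOT = 0x4
[Win32.PendingOps]::MoveFileEx('C:\Program Files\MyApp\MyAssembly.dll', $null, $MOVEFILE_DELAY_UNTIL_REBOOT)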
Here's what I did, adapted from Martin's great answer. Notice the Start-Sleep; this did the trick for me. Because the command runs in a detached background process, it does not block, and it leaves enough time for Inno Setup to free up the resources.
After doing that, I was able to clean the temporary folder.
// Gets invoked at the end of the installation
procedure DeinitializeSetup();
var
  BatchPath: String;
  S: TArrayOfString;
  FilesPath: TStringList;
  ResultCode, I, ErrorCode: Integer;
begin
  I := 0;
  FilesPath := TStringList.Create;
  FilesPath.Add(ExpandConstant('{tmp}\DLL1.dll'));
  FilesPath.Add(ExpandConstant('{tmp}\DLL2.dll'));
  FilesPath.Add(ExpandConstant('{tmp}\DLLX.dll'));
  while I < FilesPath.Count do
  begin
    if not FileExists(FilesPath[I]) then
    begin
      Log(Format('File %s does not exist', [FilesPath[I]]));
    end
    else
    begin
      UnloadDLL(FilesPath[I]);
      if Exec('powershell.exe',
           FmtMessage('-NoExit -ExecutionPolicy Bypass -Command "Start-Sleep -Seconds 5; Remove-Item -Recurse -Force -Path %1"', [FilesPath[I]]),
           '', SW_HIDE, ewNoWait, ErrorCode) then
      begin
        Log(Format('Temporary file %s scheduled for deletion', [FilesPath[I]]));
      end
      else
      begin
        Log(Format('Error while deleting temporary file: %d', [ErrorCode]));
      end;
    end;
    { Advance in both branches, otherwise a missing file would loop forever }
    Inc(I);
  end;
  Exec('powershell.exe',
    FmtMessage('-NoExit -ExecutionPolicy Bypass -Command "Start-Sleep -Seconds 5; Remove-Item -Recurse -Force -Path %1"', [ExpandConstant('{tmp}')]),
    '', SW_HIDE, ewNoWait, ErrorCode);
  Log(Format('Temporary folder %s scheduled for deletion', [ExpandConstant('{tmp}')]));
  FilesPath.Free;
end;
The easy way to do what you want is through an AppDomain. You can unload an AppDomain, just not the initial one. So the solution is to create a new AppDomain, load your managed DLL in that and then unload the AppDomain.
AppDomain ad = AppDomain.CreateDomain("Isolate DLL");
Assembly a = ad.Load(new AssemblyName("MyManagedDll"));
object d = a.CreateInstance("MyManagedDll.MyManagedClass");
Type t = d.GetType();
double result = (double)t.InvokeMember("Calculate", BindingFlags.InvokeMethod, null, d, new object[] { 1.0, 2.0 });
AppDomain.Unload(ad);
Here is what the DLL code looks like...
namespace MyManagedDll
{
    public class MyManagedClass
    {
        public double Calculate(double a, double b)
        {
            return a + b;
        }
    }
}
Question)
How do I get a DSC script resource to wait until the code has completed before moving on?
(The code is invoke-expression "path\file.exe")
Details)
I am using PowerShell version 5
and am trying to get DSC set up to handle our SQL Server installations.
My manager has asked me to use the out-of-the-box DSC components,
i.e. no downloading of custom modules that might help.
I have built up the config file that handles the base server build - everything is good.
The script resource that installs SQL Server is good.
It executes, and waits until it has installed completely, before moving on.
When I get up to the script resource that installs the SQL Server cumulative update, I have issues.
The executable gets called and it starts installing (it should take 10-15 minutes), but the DSC configuration doesn't wait for it to finish, and moves on after a second.
This means that the DependsOn for future steps gets called before the installation is complete.
How can I make the script resource wait until it has finished?
Have you tried the keyword "DependsOn", like this?
Script MyNewSvc
{
    GetScript = {
        $SvcName = 'MyNewSvc'
        $Results = @{}
        $Results['svc'] = Get-Service $SvcName
        $Results
    }
    SetScript = {
        $SvcName = 'MyNewSvc'
        setup.exe /param
        while ((Get-Service $SvcName).Status -ne "Running") { Start-Sleep 10 }
    }
    TestScript = {
        $SvcName = 'MyNewSvc'
        $SvcLog = 'c:\svc.log'
        If (condition) { # like a running svc or a log file
            $True
        }
        Else {
            $False
        }
    }
}

WindowsFeature Feature
{
    Name = "Web-Server"
    Ensure = "Present"
    DependsOn = "[Script]MyNewSvc"
}
Invoke-Expression doesn't seem to wait until the process has finished - try this in a generic PowerShell console and you'll see the command returns before you close notepad:
Invoke-Expression -Command "notepad.exe";
You can use Start-Process instead:
Start-Process -FilePath "notepad.exe" -Wait -NoNewWindow;
And if you want to check the exit code you can do this:
$process = Start-Process -FilePath "notepad.exe" -Wait -NoNewWindow -PassThru;
$exitcode = $process.ExitCode;
if( $exitcode -ne 0 )
{
# handle errors here
}
Finally, to use command line arguments:
$process = Start-Process -FilePath "setup.exe" -ArgumentList #("/param1", "/param2") -Wait -PassThru;
$exitcode = $process.ExitCode;
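Putting that together with the Script resource from the question, the SetScript block would look roughly like this (installer path, arguments, and the test check are placeholders, not your real values):
Configuration SqlCuExample
{
    Node 'localhost'
    {
        Script InstallSqlCU
        {
            GetScript  = { @{ Result = 'SQL Server cumulative update' } }
            # Placeholder check; replace with a real detection of the installed CU
            TestScript = { Test-Path 'C:\placeholder\cu-installed.flag' }
            SetScript  = {
                # Start-Process -Wait blocks until the installer exits, unlike Invoke-Expression
                $p = Start-Process -FilePath 'C:\placeholder\SQLCU.exe' -ArgumentList @('/quiet') -Wait -PassThru
                if ($p.ExitCode -ne 0) { throw "CU installer exited with code $($p.ExitCode)" }
            }
        }
    }
}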
I've been struggling for a couple of days to upload files to SharePoint 2010 with PowerShell.
I'm on a Windows 7 machine with PowerShell v2 trying to upload to a SP 2010 site.
I'm having two major issues:
The $Context.Web value is always empty even after ExecuteQuery() and no
error is shown. My $Context variable gets the server version (14.x.x.x.x) but nothing more.
$Context.Load($variable) always returns the error Cannot find an overload for "Load" and the argument count: "1".
I copied the SharePoint DLLs to my Windows 7 machine and I import them in my script.
The script below is a mix of many parts I took from the net.
I've already tried, unsuccessfully, to add an overload on the ClientContext defining a Load method without the type parameter, as suggested in the following post:
http://soerennielsen.wordpress.com/2013/08/25/use-csom-from-powershell/
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client")
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client.Runtime")
$site = "https://Root-of-my-site"
$listname = "My-folder"
$context = New-Object Microsoft.SharePoint.Client.ClientContext($site)
[Microsoft.SharePoint.Client.Web]$web = $context.Web
[Microsoft.SharePoint.Client.List]$list = $web.Lists.GetByTitle($listName)
$Folder = "C:\temp\Certificates"
$List = $Context.Web.Lists.GetByTitle($listname)
Foreach ($File in (dir $Folder))
{
    $FileCreationInfo = New-Object Microsoft.SharePoint.Client.FileCreationInformation
    $FileCreationInfo.Overwrite = $true
    $FileCreationInfo.Content = get-content -encoding byte -path $File.Fullname
    $FileCreationInfo.URL = $File
    $Upload = $List.RootFolder.Files.Add($FileCreationInfo)
    $Context.Load($Upload)
    $Context.ExecuteQuery()
}
The error is
Cannot find an overload for "Load" and the argument count: "1".
At C:\temp\uploadCertToSharepoint.ps1:48 char:14
+ $Context.Load <<<< ($Upload)
+ CategoryInfo : NotSpecified: (:) [], MethodException
+ FullyQualifiedErrorId : MethodCountCouldNotFindBest
Can someone please help me sort out this issue?
I'll need to upload around 400 files with ad-hoc fields to a SharePoint site in a couple of weeks, and at the moment I'm completely stuck. Running the script server-side is unfortunately not possible.
Thanks,
Marco
This error occurs because ClientRuntimeContext.Load is a generic method:
public void Load<T>(
    T clientObject,
    params Expression<Func<T, Object>>[] retrievals
)
where T : ClientObject
and generic methods are not natively supported in PowerShell (v1, v2), AFAIK.
The workaround is to invoke the generic method using MethodInfo.MakeGenericMethod, as described in the article Invoking Generic Methods on Non-Generic Classes in PowerShell.
In the case of the ClientRuntimeContext.Load method, the following PS function could be used:
Function Invoke-LoadMethod() {
    param(
        $clientObjectInstance = $(throw "Please provide a Client Object instance on which to invoke the generic method")
    )
    $ctx = $clientObjectInstance.Context
    $load = [Microsoft.SharePoint.Client.ClientContext].GetMethod("Load")
    $type = $clientObjectInstance.GetType()
    $clientObjectLoad = $load.MakeGenericMethod($type)
    $clientObjectLoad.Invoke($ctx, @($clientObjectInstance, $null))
}
Then, in your example the line:
$Context.Load($Upload)
could be replaced with this one:
Invoke-LoadMethod -clientObjectInstance $Upload
References
Invoking Generic Methods on Non-Generic Classes in PowerShell
Some tips and tricks of using SharePoint Client Object Model in PowerShell. Part 1
It throws the error because in PowerShell 2.0 you cannot call a generic method directly.
You need to create a closed method using MakeGenericMethod. Try the code below.
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client")
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client.Runtime")
$site = "http://server"
$listname = "listName"
$Folder = "C:\PS\Test"
$context = New-Object Microsoft.SharePoint.Client.ClientContext($site)
[Microsoft.SharePoint.Client.Web]$web = $context.Web
[Microsoft.SharePoint.Client.List]$list = $web.Lists.GetByTitle($listName)
$method = $Context.GetType().GetMethod("Load")
$closedMethod = $method.MakeGenericMethod([Microsoft.SharePoint.Client.File])
Foreach ($File in (dir $Folder))
{
    $FileCreationInfo = New-Object Microsoft.SharePoint.Client.FileCreationInformation
    $FileCreationInfo.Overwrite = $true
    $FileCreationInfo.Content = (get-content -encoding byte -path $File.Fullname)
    $FileCreationInfo.URL = $File
    $Upload = $List.RootFolder.Files.Add($FileCreationInfo)
    $closedMethod.Invoke($Context, @($Upload, $null))
    $Context.ExecuteQuery()
}
Hi, I am not exactly sure if my wording is right, but I need a variable that contains the current date/time whenever I write data to a log; how can I do that without initializing it every time? Currently, every time I need an update, I use both of these statements together. Is there another way of doing this?
$DateTime = get-date | select datetime
Add-Content $LogFile -Value "$DateTime.DateTime: XXXXX"
Please do let me know if you have any questions or need clarification regarding my question.
This script makes a real dynamic variable in PowerShell (thanks to Lee Holmes and his Windows PowerShell Cookbook: The Complete Guide to Scripting Microsoft's Command Shell, 3rd Edition).
##############################################################################
##
## New-DynamicVariable
##
## From Windows PowerShell Cookbook (O'Reilly)
## by Lee Holmes (http://www.leeholmes.com/guide)
##
##############################################################################
<#
.SYNOPSIS
Creates a variable that supports scripted actions for its getter and setter
.EXAMPLE
PS > .\New-DynamicVariable GLOBAL:WindowTitle `
-Getter { $host.UI.RawUI.WindowTitle } `
-Setter { $host.UI.RawUI.WindowTitle = $args[0] }
PS > $windowTitle
Administrator: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
PS > $windowTitle = "Test"
PS > $windowTitle
Test
#>
param(
## The name for the dynamic variable
[Parameter(Mandatory = $true)]
$Name,
## The scriptblock to invoke when getting the value of the variable
[Parameter(Mandatory = $true)]
[ScriptBlock] $Getter,
## The scriptblock to invoke when setting the value of the variable
[ScriptBlock] $Setter
)
Set-StrictMode -Version 3
Add-Type @"
using System;
using System.Collections.ObjectModel;
using System.Management.Automation;

namespace Lee.Holmes
{
    public class DynamicVariable : PSVariable
    {
        public DynamicVariable(
            string name,
            ScriptBlock scriptGetter,
            ScriptBlock scriptSetter)
            : base(name, null, ScopedItemOptions.AllScope)
        {
            getter = scriptGetter;
            setter = scriptSetter;
        }
        private ScriptBlock getter;
        private ScriptBlock setter;

        public override object Value
        {
            get
            {
                if(getter != null)
                {
                    Collection<PSObject> results = getter.Invoke();
                    if(results.Count == 1)
                    {
                        return results[0];
                    }
                    else
                    {
                        PSObject[] returnResults =
                            new PSObject[results.Count];
                        results.CopyTo(returnResults, 0);
                        return returnResults;
                    }
                }
                else { return null; }
            }
            set
            {
                if(setter != null) { setter.Invoke(value); }
            }
        }
    }
}
"@
## If we've already defined the variable, remove it.
if(Test-Path variable:\$name)
{
Remove-Item variable:\$name -Force
}
## Set the new variable, along with its getter and setter.
$executioncontext.SessionState.PSVariable.Set(
(New-Object Lee.Holmes.DynamicVariable $name,$getter,$setter))
There's a Set-StrictMode -Version 3, but you can set it to -Version 2 if you can load .NET Framework 4.0 in your PowerShell 2.0 session, as explained here.
The use for the OP is:
New-DynamicVariable -Name GLOBAL:now -Getter { (get-date).datetime }
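Every read of $now then re-runs the getter, so the value is always current; for example (assuming $LogFile is already set as in the question):
$now
Start-Sleep -Seconds 2
$now                                          # a later timestamp than the first read
Add-Content $LogFile -Value "${now}: XXXXX"   # the logging line, without re-initializing anything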
Here is Lee Holmes's evaluation (which makes the real flaw clear) of the method I used in my other answer:
Note
There are innovative solutions on the Internet that use PowerShell's debugging facilities to create a breakpoint that changes a variable's value whenever you attempt to read from it. While unique, this solution causes PowerShell to think that any scripts that rely on the variable are in debugging mode. This, unfortunately, prevents PowerShell from enabling some important performance optimizations in those scripts.
Why not use:
Add-Content $LogFile -Value "$((Get-Date).DateTime): XXXXX"
This gets the current datetime every time. Notice that it's inside $( ), which makes PowerShell evaluate the expression (getting the datetime) before inserting it into the string.
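This is also why the original "$DateTime.DateTime: XXXXX" string never expanded the property; a quick comparison:
$d = Get-Date
"No subexpression: $d.DateTime"     # expands $d, then appends the literal text ".DateTime"
"Subexpression:    $($d.DateTime)"  # evaluates the property access inside the string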
Wrap your two commands in a function so you have just one call:
function add-log {
    param($txt)
    $DateTime = get-date | select -expand datetime
    Add-Content $LogFile -Value "${DateTime}: $txt"
}
Besides these other ways (which frankly I would probably use instead - except the breakpoint approach), you can create a custom object with a ScriptProperty that you can provide the implementation for:
$obj = new-object pscustomobject
$obj | Add-Member ScriptProperty Now -Value { Get-Date }
$obj.now
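The script block runs on every property access, so the OP's log line simply becomes (assuming $LogFile as in the question):
Add-Content $LogFile -Value "$($obj.Now): XXXXX"   # Get-Date is re-evaluated on each read of .Now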
Using Set-PSBreakpoint:
$act = @'
$global:now = (get-date).datetime
'@
$global:sb = [scriptblock]::Create($act)
$now = Set-PSBreakpoint -Variable now -Mode Read -Action $global:sb
Calling $now returns the current, updated datetime value.
One liner:
$now = Set-PSBreakpoint -Variable now -Mode Read -Action { $global:now = (get-date).datetime }