Is it possible to call a COM method from PowerShell using named parameters? The COM object method I am working with has dozens of parameters:
object.GridData( DataFile, xCol, yCol, zCol, ExclusionFilter, DupMethod, xDupTol,
yDupTol, NumCols, NumRows, xMin, xMax, yMin, yMax, Algorithm, ShowReport,
SearchEnable, SearchNumSectors, SearchRad1, SearchRad2, SearchAngle,
SearchMinData, SearchDataPerSect, SearchMaxEmpty, FaultFileName, BreakFileName,
AnisotropyRatio, AnisotropyAngle, IDPower, IDSmoothing, KrigType, KrigDriftType,
KrigStdDevGrid, KrigVariogram, MCMaxResidual, MCMaxIterations, MCInternalTension,
MCBoundaryTension, MCRelaxationFactor, ShepSmoothFactor, ShepQuadraticNeighbors,
ShepWeightingNeighbors, ShepRange1, ShepRange2, RegrMaxXOrder, RegrMaxYOrder,
RegrMaxTotalOrder, RBBasisType, RBRSquared, OutGrid, OutFmt, SearchMaxData,
KrigStdDevFormat, DataMetric, LocalPolyOrder, LocalPolyPower, TriangleFileName )
Most of those parameters are optional and some of them are mutually exclusive. In Visual Basic, or in Python using the win32com module, you can use named parameters to specify only the subset of options you need. For example (in Python):
Surfer.GridData(DataFile=InFile,
xCol=Options.xCol,
yCol=Options.yCol,
zCol=Options.zCol,
DupMethod=win32com.client.constants.srfDupMedZ,
xDupTol=Options.GridSpacing,
yDupTol=Options.GridSpacing,
NumCols=NumCols,
NumRows=NumRows,
xMin=xMin,
xMax=xMax,
yMin=yMin,
yMax=yMax,
Algorithm=win32com.client.constants.srfMovingAverage,
ShowReport=False,
SearchEnable=True,
SearchRad1=Options.SearchRadius,
SearchRad2=Options.SearchRadius,
SearchMinData=5,
OutGrid=OutGrid)
I can't figure out how to call this object from PowerShell in the same way.
This problem interested me, so I did some real digging and I have found a solution (though I have only tested it on some simple cases)!
Concept
The key is Type.InvokeMember, which has an overload that lets you pass parameter names.
Here is the basic concept.
$Object.GetType().InvokeMember($Method, [System.Reflection.BindingFlags]::InvokeMethod,
    $null,                           ## Binder
    $Object,                         ## Target
    ([Object[]]$Args),               ## Args
    $null,                           ## Modifiers
    $null,                           ## Culture
    ([String[]]$NamedParameters)     ## NamedParameters
)
Solution
Here is a reusable solution for calling methods with named parameters. It should work on any object, not just COM objects. I made a hashtable one of the parameters so that specifying the named parameters is more natural and, hopefully, less error prone. You can also call a method without parameter names by using the -Argument parameter.
Function Invoke-NamedParameter {
    [CmdletBinding(DefaultParameterSetName = "Named")]
    param(
        [Parameter(ParameterSetName = "Named", Position = 0, Mandatory = $true)]
        [Parameter(ParameterSetName = "Positional", Position = 0, Mandatory = $true)]
        [ValidateNotNull()]
        [System.Object]$Object
        ,
        [Parameter(ParameterSetName = "Named", Position = 1, Mandatory = $true)]
        [Parameter(ParameterSetName = "Positional", Position = 1, Mandatory = $true)]
        [ValidateNotNullOrEmpty()]
        [String]$Method
        ,
        [Parameter(ParameterSetName = "Named", Position = 2, Mandatory = $true)]
        [ValidateNotNull()]
        [Hashtable]$Parameter
        ,
        [Parameter(ParameterSetName = "Positional")]
        [Object[]]$Argument
    )
    end { ## Just being explicit that this does not support pipelines
        if ($PSCmdlet.ParameterSetName -eq "Named") {
            ## Invoke method with parameter names
            ## Note: It is ok to use a hashtable here because the keys (parameter names) and values (args)
            ## will be output in the same order. We don't need to worry about the order so long as
            ## all parameters have names
            $Object.GetType().InvokeMember($Method, [System.Reflection.BindingFlags]::InvokeMethod,
                $null,                              ## Binder
                $Object,                            ## Target
                ([Object[]]($Parameter.Values)),    ## Args
                $null,                              ## Modifiers
                $null,                              ## Culture
                ([String[]]($Parameter.Keys))       ## NamedParameters
            )
        } else {
            ## Invoke method without parameter names
            $Object.GetType().InvokeMember($Method, [System.Reflection.BindingFlags]::InvokeMethod,
                $null,          ## Binder
                $Object,        ## Target
                $Argument,      ## Args
                $null,          ## Modifiers
                $null,          ## Culture
                $null           ## NamedParameters
            )
        }
    }
}
Examples
Calling a method with named parameters.
$shell = New-Object -ComObject Shell.Application
Invoke-NamedParameter $Shell "Explore" @{"vDir"="$pwd"}
## the syntax for more than one would be @{"First"="foo";"Second"="bar"}
Calling a method that takes no parameters (you can also use -Argument with $null).
$shell = New-Object -ComObject Shell.Application
Invoke-NamedParameter $Shell "MinimizeAll" @{}
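Applied to the original GridData call, the usage might look something like this (a hypothetical sketch: the Surfer.Application ProgID, the file paths, and the parameter values are assumptions to be checked against the Surfer automation documentation):
$Surfer = New-Object -ComObject Surfer.Application   ## ProgID assumed
Invoke-NamedParameter $Surfer "GridData" @{
    "DataFile"     = "C:\data\input.dat"
    "xCol"         = 1
    "yCol"         = 2
    "zCol"         = 3
    "ShowReport"   = $false
    "SearchEnable" = $true
    "OutGrid"      = "C:\data\output.grd"
}
## Enum-valued parameters such as DupMethod or Algorithm would need their numeric constant
## values (or constants read from an interop assembly), since there is no direct equivalent
## of Python's win32com.client.constants here.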
Using the Invoke-NamedParameter function did not work for me, but I found an interesting solution at https://community.idera.com/database-tools/powershell/ask_the_experts/f/powershell_for_windows-12/6361/excel-spreadsheet-export which did work for me.
$excel = New-Object -ComObject excel.application
$objMissingValue = [System.Reflection.Missing]::Value
$Workbook = $excel.Workbooks.Open($datafile,$objMissingValue,$False,$objMissingValue,$objMissingValue,$objMissingValue,$true,$objMissingValue)
For any parameter I did not use, I passed the missing value.
Related
I am trying to load a 160 GB CSV file into SQL Server using a PowerShell script I got from GitHub, and I get this error:
Exception calling "Add" with "1" argument(s): "Input array is longer than the number of columns in this table."
At C:\b.ps1:54 char:26
+ [void]$datatable.Rows.Add <<<< ($line.Split($delimiter))
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : DotNetMethodException
So I checked the same code with a small 3-line CSV: all of the columns match, the first row is a header, and there are no extra delimiters, so I'm not sure why I am getting this error.
The code is below
<# 8-faster-runspaces.ps1 #>
# Set CSV attributes
$csv = "M:\d\s.txt"
$delimiter = "`t"
# Set connstring
$connstring = "Data Source=.;Integrated Security=true;Initial Catalog=PresentationOptimized;PACKET SIZE=32767;"
# Set batchsize to 2000
$batchsize = 2000
# Create the datatable
$datatable = New-Object System.Data.DataTable
# Add generic columns
$columns = (Get-Content $csv -First 1).Split($delimiter)
foreach ($column in $columns) {
    [void]$datatable.Columns.Add()
}
# Setup runspace pool and the scriptblock that runs inside each runspace
$pool = [RunspaceFactory]::CreateRunspacePool(1,5)
$pool.ApartmentState = "MTA"
$pool.Open()
$runspaces = @()
# Setup scriptblock. This is the workhorse. Think of it as a function.
$scriptblock = {
    Param (
        [string]$connstring,
        [object]$dtbatch,
        [int]$batchsize
    )
    $bulkcopy = New-Object Data.SqlClient.SqlBulkCopy($connstring,"TableLock")
    $bulkcopy.DestinationTableName = "abc"
    $bulkcopy.BatchSize = $batchsize
    $bulkcopy.WriteToServer($dtbatch)
    $bulkcopy.Close()
    $dtbatch.Clear()
    $bulkcopy.Dispose()
    $dtbatch.Dispose()
}
# Start timer
$time = [System.Diagnostics.Stopwatch]::StartNew()
# Open the text file from disk and process.
$reader = New-Object System.IO.StreamReader($csv)
Write-Output "Starting insert.."
while ((($line = $reader.ReadLine()) -ne $null))
{
    [void]$datatable.Rows.Add($line.Split($delimiter))
    if ($datatable.rows.count % $batchsize -eq 0)
    {
        $runspace = [PowerShell]::Create()
        [void]$runspace.AddScript($scriptblock)
        [void]$runspace.AddArgument($connstring)
        [void]$runspace.AddArgument($datatable) # <-- Send datatable
        [void]$runspace.AddArgument($batchsize)
        $runspace.RunspacePool = $pool
        $runspaces += [PSCustomObject]@{ Pipe = $runspace; Status = $runspace.BeginInvoke() }
        # Overwrite object with a shell of itself
        $datatable = $datatable.Clone() # <-- Create new datatable object
    }
}
# Close the file
$reader.Close()
# Wait for runspaces to complete
while ($runspaces.Status.IsCompleted -notcontains $true) {}
# End timer
$secs = $time.Elapsed.TotalSeconds
# Cleanup runspaces
foreach ($runspace in $runspaces) {
    [void]$runspace.Pipe.EndInvoke($runspace.Status) # EndInvoke method retrieves the results of the asynchronous call
    $runspace.Pipe.Dispose()
}
# Cleanup runspace pool
$pool.Close()
$pool.Dispose()
# Cleanup SQL Connections
[System.Data.SqlClient.SqlConnection]::ClearAllPools()
# Done! Format output then display
$totalrows = 1000000
$rs = "{0:N0}" -f [int]($totalrows / $secs)
$rm = "{0:N0}" -f [int]($totalrows / $secs * 60)
$mill = "{0:N0}" -f $totalrows
Write-Output "$mill rows imported in $([math]::round($secs,2)) seconds ($rs rows/sec and $rm rows/min)"
Working with a 160 GB input file is going to be a pain. You can't really load it into any kind of editor, and you can't analyze that much data without some serious automation.
As per the comments, it seems the data has some quality issues. To find the offending data, you could try a binary search; this approach shrinks the data fast (a splitting sketch follows the list below). Like so:
1) Split the file in about two equal chunks.
2) Try and load first chunk.
3) If successful, process the second chunk. If not, see 6).
4) Try and load second chunk.
5) If successful, the files are valid, but you have another data quality issue. Start looking into other causes. If not, see 6).
6) If either load failed, start from the beginning and use the failed file as the input file.
7) Repeat until you narrow down the offending row(s).
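A rough sketch of step 1, splitting the file into two halves by line count without loading it all into memory (the output paths and the halfway line count are placeholders; note that the second chunk will not contain the header row):
$csv = "M:\d\s.txt"
$half = 50000000   # placeholder: roughly half the total number of lines
$reader = New-Object System.IO.StreamReader($csv)
$writer1 = New-Object System.IO.StreamWriter("M:\d\s_part1.txt")
$writer2 = New-Object System.IO.StreamWriter("M:\d\s_part2.txt")
$i = 0
while (($line = $reader.ReadLine()) -ne $null) {
    # first $half lines go to part 1, the rest to part 2
    if ($i -lt $half) { $writer1.WriteLine($line) } else { $writer2.WriteLine($line) }
    $i++
}
$reader.Close(); $writer1.Close(); $writer2.Close()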
Another method would be to use an ETL tool like SSIS. Configure the package to redirect invalid rows into an error log to see which data is not working properly.
In .Net we can get the datasource from a connectionstring using below mechanism:
System.Data.SqlClient.SqlConnectionStringBuilder builder = new System.Data.SqlClient.SqlConnectionStringBuilder(connectionString);
string server = builder.DataSource;
I was trying to do that in PowerShell but I am getting the following exception:
$ConstringObj = New-Object System.Data.SqlClient.SqlConnectionStringBuilder($conString)
New-Object : Exception calling ".ctor" with "1" argument(s): "Keyword
not supported: 'metadata'." At line:1 char:17
+ $ConstringObj = New-Object System.Data.SqlClient.SqlConnectionStringBuilder($con ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (:) [New-Object], MethodInvocationException
+ FullyQualifiedErrorId : ConstructorInvokedThrowException,Microsoft.PowerShell.Commands.NewObjectCommand
How to do that in PowerShell?
Problem
There's some weird behavior when using SqlConnectionStringBuilder in PowerShell, so let me explain.
Since it's a .NET class, you'd expect all of the same properties and methods available in C# to be there.
For example, this works fine in C#:
var cnnBuilder = new SqlConnectionStringBuilder();
cnnBuilder.DataSource = "server_name";
cnnBuilder.InitialCatalog = "db_name";
So the equivalent code in PS should work:
$cnnBuilder = New-Object System.Data.SqlClient.SqlConnectionStringBuilder
$cnnBuilder.DataSource = "server_name"
$cnnBuilder.InitialCatalog = "db_name"
However, SqlConnectionStringBuilder is built on top of DbConnectionStringBuilder, which implements IDictionary, so fundamentally we're working with a dictionary object that has some syntactic-sugar wrappers.
.NET resolves this with an override of the dictionary accessor and setter, like this (simplified here):
public override object this[string keyword] {
    get {
        Keywords index = GetIndex(keyword);
        return GetAt(index);
    }
    set {
        Keywords index = GetIndex(keyword);
        switch(index) {
            case Keywords.DataSource: DataSource = ConvertToString(value); break;
            case Keywords.InitialCatalog: InitialCatalog = ConvertToString(value); break;
            // ***
        }
    }
}
So really, it's taking the DataSource property and mapping it to the "Data Source" key (with a space).
Whenever PS assigns or retrieves a value, it has to decide whether to use the underlying dictionary implementation or the property, and when you look for DataSource in the dictionary (without the space), that SQL connection keyword doesn't exist.
Solutions
Opt 1 - Use Dictionary Names
You can use bracket or dot notation with the actual SQL key to access the entry in the dictionary:
$cnnBuilder = New-Object System.Data.SqlClient.SqlConnectionStringBuilder
$cnnBuilder["Data Source"] = "server_name"
$cnnBuilder."Initial Catalog" = "db_name"
Opt 2 - Use PSBase
PSBase returns the "raw view of the object" and gives us the default .NET behavior.
$cnnBuilder = New-Object System.Data.SqlClient.SqlConnectionStringBuilder
$cnnBuilder.PSBase.DataSource = "server_name"
$cnnBuilder.PSBase.InitialCatalog = "db_name"
Opt 3 - Use -Property Parameter
During construction, you can set the -Property parameter on New-Object, which "sets property values and invokes methods of the new object."
$cnnBuilder = New-Object System.Data.SqlClient.SqlConnectionStringBuilder `
    -Property @{
        DataSource     = "server_name"
        InitialCatalog = "db_name"
    }
Additional Reading
Using SQLConnection object in PowerShell
Your example should work. However, you could also grab the datasource using a regex:
[regex]::Match($ConstringObj, 'Data Source=([^;]+)').Groups[1].Value
First time user, looking for help with a script that's been driving me crazy.
Basically, I need to create a set number of files of an exact size (512KB, 2MB, 1GB) to test a SAN. These files need to be filled with random text so that the SAN doesn't catch the NULs and does actually allocate the blocks; that's also the reason I couldn't just use fsutil.
Now, I've been messing with the New-BigRandomFile script by Verboon and tweaking it to my needs.
However I'm getting the error:
You cannot call a method on a null-valued expression.
At L:\random5.ps1:34 char:9
+ $stream.Write($longstring)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (:) [], RuntimeException
+ FullyQualifiedErrorId : InvokeMethodOnNull
This is the bit of code I've come up with so far; I'll add a loop at the end to copy the file I just created N times to fill up the LUN.
Set-Strictmode -Version 2.0
#temp file
$file = "c:\temp\temp.rnd"
#charset size
$charset = 64
#Block Size
$blocksize = 512
#page size
$Pagesize = 512KB
#Number of blocks in a page
$blocknum = $Pagesize / $blocksize
#Resulting/desired test file size
$filesize = 1GB
#number of pages in a file
$pagenum = $filesize / $Pagesize
# create the stream writer
$stream = System.IO.StreamWriter $file
# get a 64 element Char[]; I added the - and _ to have 64 chars
[char[]]$chars = 'azertyuiopqsdfghjklmwxcvbnAZERTYUIOPQSDFGHJKLMWXCVBN0123456789-_'
1..$Pagenum | ForEach-Object {
    # get a page's worth of blocks
    1..$blocknum | ForEach-Object {
        # randomize all chars and...
        $rndChars = $chars | Get-Random -Count $chars.Count
        # ...join them in a string
        $string = -join $rndChars
        # repeat random string N times to get a full block string length
        $longstring = $string * ($blocksize / $charset)
        # write 1 block to file
        $stream.Write($longstring)
        # release resources by clearing string variables
        Clear-Variable string, longstring
    }
}
$stream.Close()
$stream.Dispose()
# release resources through garbage collection
[GC]::Collect()
$file.Close()
I've tried a gazillion variants like:
$stream = [System.IO.StreamWriter] $file
$stream = System.IO.StreamWriter $file
$stream = NewObject System.IO.StreamWriter $file
Of course, being a total noob at PowerShell, I've tried using quotes, brackets, providing the full path instead of the variable, etc. All (or most) of these seem to be valid syntax variants, according to a ton of examples I found online, but the output is still the same.
If you have any improvements to suggest or an alternative way to perform this task, I'm all ears.
Edit: I changed the script above: just a couple of quotes around $file made the error disappear (thanks LinuxDisciple); however, the file gets created but stays at 0 bytes, and the script gets stuck in a loop.
Fix your instantiation of StreamWriter to any of these correct variants:
$stream = [System.IO.StreamWriter]::new($file)
$stream = [IO.StreamWriter]::new($file) # the default namespace may be omitted
$stream = New-Object System.IO.StreamWriter $file
You can specify encoding:
$stream = [IO.StreamWriter]::new(
$file,
$false, # don't append
[Text.Encoding]::ASCII
)
See StreamWriter on MSDN for available constructors and parameters.
PowerShell ISE offers autocomplete with tooltips:
type [streamw and press Ctrl-Space to autocomplete the full .NET class name
type ]:: to see the available methods and properties
type new and press Ctrl-Space to see the constructor overrides
whenever needed, put the caret at the method name and press Ctrl-Space for the tooltip
I know nothing about PowerShell, but a few things:
Are you sure $longstring has a value before you call $stream.Write()? It sounds like it's null, and that's why you get the error. If you can somehow output the value of $longstring to the console, it would help you make sure it has a value.
Also, troubleshoot with a simplified version of your code so that you can pinpoint what's going on, for example:
$file = 'c:\temp\temp.rnd'
$stream = [System.IO.StreamWriter] $file
$longstring = 'whatever'
$stream.Write($longstring)
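To act on the first point, a quick check of $longstring before the write could look like this (a minimal sketch):
if ([string]::IsNullOrEmpty($longstring)) {
    Write-Warning "longstring is null or empty"
} else {
    Write-Host "longstring length: $($longstring.Length)"
}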
I've been struggling for a couple of days to upload files to SharePoint 2010 with PowerShell.
I'm on a Win7 machine with PowerShell v2, trying to upload to an SP 2010 site.
I'm having 2 major issues:
$Context.Web is always empty, even after ExecuteQuery(), and no error is shown. My $Context variable gets the server version (14.x.x.x.x) but nothing more.
$Context.Load($variable) always returns the error Cannot find an overload for "Load" and the argument count: "1".
I copied the SharePoint DLLs to my Win7 machine and I import the references in my script.
The script below is a mix of many parts I took from the net.
I've already tried, unsuccessfully, to add an overload on the ClientContext defining a Load method without the Type parameter, as suggested in the following post:
http://soerennielsen.wordpress.com/2013/08/25/use-csom-from-powershell/
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client")
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client.Runtime")
$site = "https://Root-of-my-site"
$listname = "My-folder"
$context = New-Object Microsoft.SharePoint.Client.ClientContext($site)
[Microsoft.SharePoint.Client.Web]$web = $context.Web
[Microsoft.SharePoint.Client.List]$list = $web.Lists.GetByTitle($listName)
$Folder = "C:\temp\Certificates"
$List = $Context.Web.Lists.GetByTitle($listname)
Foreach ($File in (dir $Folder))
{
    $FileCreationInfo = New-Object Microsoft.SharePoint.Client.FileCreationInformation
    $FileCreationInfo.Overwrite = $true
    $FileCreationInfo.Content = get-content -encoding byte -path $File.Fullname
    $FileCreationInfo.URL = $File
    $Upload = $List.RootFolder.Files.Add($FileCreationInfo)
    $Context.Load($Upload)
    $Context.ExecuteQuery()
}
The error is
Cannot find an overload for "Load" and the argument count: "1".
At C:\temp\uploadCertToSharepoint.ps1:48 char:14
+ $Context.Load <<<< ($Upload)
+ CategoryInfo : NotSpecified: (:) [], MethodException
+ FullyQualifiedErrorId : MethodCountCouldNotFindBest
Can someone please help me sort out this issue?
I'll need to upload around 400 files with ad hoc fields to a SharePoint site in a couple of weeks, and at the moment I'm completely stuck. Running the script server-side is unfortunately not possible.
Thanks,
Marco
This error occurs because ClientRuntimeContext.Load is a generic method:
public void Load<T>(
    T clientObject,
    params Expression<Func<T, Object>>[] retrievals
)
where T : ClientObject
and generic methods are not natively supported in PowerShell (v1, v2), AFAIK.
The workaround is to invoke the generic method using MethodInfo.MakeGenericMethod, as described in the article Invoking Generic Methods on Non-Generic Classes in PowerShell.
In the case of the ClientRuntimeContext.Load method, the following PS function could be used:
Function Invoke-LoadMethod() {
    param(
        $clientObjectInstance = $(throw "Please provide a client object instance on which to invoke the generic method")
    )
    $ctx = $clientObjectInstance.Context
    $load = [Microsoft.SharePoint.Client.ClientContext].GetMethod("Load")
    $type = $clientObjectInstance.GetType()
    $clientObjectLoad = $load.MakeGenericMethod($type)
    $clientObjectLoad.Invoke($ctx, @($clientObjectInstance, $null))
}
Then, in your example the line:
$Context.Load($Upload)
could be replaced with this one:
Invoke-LoadMethod -clientObjectInstance $Upload
References
Invoking Generic Methods on Non-Generic Classes in PowerShell
Some tips and tricks of using SharePoint Client Object Model in PowerShell. Part 1
It throws the error because in PowerShell 2.0 you cannot call a generic method directly.
You need to create a closed method using MakeGenericMethod. Try the code below.
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client")
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client.Runtime")
$site = "http://server"
$listname = "listName"
$Folder = "C:\PS\Test"
$context = New-Object Microsoft.SharePoint.Client.ClientContext($site)
[Microsoft.SharePoint.Client.Web]$web = $context.Web
[Microsoft.SharePoint.Client.List]$list = $web.Lists.GetByTitle($listName)
$method = $Context.GetType().GetMethod("Load")
$closedMethod = $method.MakeGenericMethod([Microsoft.SharePoint.Client.File])
Foreach ($File in (dir $Folder))
{
    $FileCreationInfo = New-Object Microsoft.SharePoint.Client.FileCreationInformation
    $FileCreationInfo.Overwrite = $true
    $FileCreationInfo.Content = (get-content -encoding byte -path $File.Fullname)
    $FileCreationInfo.URL = $File
    $Upload = $List.RootFolder.Files.Add($FileCreationInfo)
    $closedMethod.Invoke($Context, @($Upload, $null))
    $Context.ExecuteQuery()
}
I am not exactly sure if my wording is right, but I need a variable that contains the current date/time whenever I write data to a log. How can I do that without initializing it every time? Currently, every time I need an update I use both of these statements together. Is there another way of doing this?
$DateTime = get-date | select datetime
Add-Content $LogFile -Value "$DateTime.DateTime: XXXXX"
Please let me know if you have any questions or need clarification regarding my question.
This script creates a real dynamic variable in PowerShell (thanks to Lee Holmes and his Windows PowerShell Cookbook: The Complete Guide to Scripting Microsoft's Command Shell, 3rd Edition).
##############################################################################
##
## New-DynamicVariable
##
## From Windows PowerShell Cookbook (O'Reilly)
## by Lee Holmes (http://www.leeholmes.com/guide)
##
##############################################################################
<#
.SYNOPSIS
Creates a variable that supports scripted actions for its getter and setter
.EXAMPLE
PS > .\New-DynamicVariable GLOBAL:WindowTitle `
-Getter { $host.UI.RawUI.WindowTitle } `
-Setter { $host.UI.RawUI.WindowTitle = $args[0] }
PS > $windowTitle
Administrator: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
PS > $windowTitle = "Test"
PS > $windowTitle
Test
#>
param(
## The name for the dynamic variable
[Parameter(Mandatory = $true)]
$Name,
## The scriptblock to invoke when getting the value of the variable
[Parameter(Mandatory = $true)]
[ScriptBlock] $Getter,
## The scriptblock to invoke when setting the value of the variable
[ScriptBlock] $Setter
)
Set-StrictMode -Version 3
Add-Type @"
using System;
using System.Collections.ObjectModel;
using System.Management.Automation;

namespace Lee.Holmes
{
    public class DynamicVariable : PSVariable
    {
        public DynamicVariable(
            string name,
            ScriptBlock scriptGetter,
            ScriptBlock scriptSetter)
            : base(name, null, ScopedItemOptions.AllScope)
        {
            getter = scriptGetter;
            setter = scriptSetter;
        }

        private ScriptBlock getter;
        private ScriptBlock setter;

        public override object Value
        {
            get
            {
                if(getter != null)
                {
                    Collection<PSObject> results = getter.Invoke();
                    if(results.Count == 1)
                    {
                        return results[0];
                    }
                    else
                    {
                        PSObject[] returnResults =
                            new PSObject[results.Count];
                        results.CopyTo(returnResults, 0);
                        return returnResults;
                    }
                }
                else { return null; }
            }
            set
            {
                if(setter != null) { setter.Invoke(value); }
            }
        }
    }
}
"@
## If we've already defined the variable, remove it.
if(Test-Path variable:\$name)
{
    Remove-Item variable:\$name -Force
}
## Set the new variable, along with its getter and setter.
$executioncontext.SessionState.PSVariable.Set(
    (New-Object Lee.Holmes.DynamicVariable $name,$getter,$setter))
There's a Set-StrictMode -Version 3, but you can set it to -Version 2 if you can load .NET Framework 4.0 in your PowerShell v2.0 session, as explained here.
The use for the OP is:
New-DynamicVariable -Name GLOBAL:now -Getter { (get-date).datetime }
Here is Lee Holmes's evaluation (which makes clear what the real flaw is) of the method I used in my other answer:
Note
There are innovative solutions on the Internet that use PowerShell's debugging facilities to create a breakpoint that changes a variable's value whenever you attempt to read from it. While unique, this solution causes PowerShell to think that any scripts that rely on the variable are in debugging mode. This, unfortunately, prevents PowerShell from enabling some important performance optimizations in those scripts.
Why not use:
Add-Content $LogFile -Value "$((Get-Date).DateTime): XXXXX"
This gets the current datetime every time. Notice that it's inside $( ), which makes PowerShell evaluate the expression (get the datetime) before inserting it into the string.
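For comparison, without the subexpression PowerShell expands only the variable itself and treats the rest as literal text (a small illustration):
$DateTime = Get-Date | Select-Object DateTime
"$DateTime.DateTime: XXXXX"       # expands $DateTime, then appends the literal text ".DateTime:"
"$($DateTime.DateTime): XXXXX"    # the subexpression evaluates the property first
"$((Get-Date).DateTime): XXXXX"   # same idea without the intermediate variable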
Wrap your two commands in a function so you will have just one call:
function add-log {
    param($txt)
    $DateTime = get-date | select -expand datetime
    Add-Content $LogFile -Value "${DateTime}: $txt"
}
Besides these other ways (which, frankly, I would probably use instead, except the breakpoint approach), you can create a custom object with a ScriptProperty for which you provide the implementation:
$obj = new-object pscustomobject
$obj | Add-Member ScriptProperty Now -Value { Get-Date }
$obj.now
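Tied back to the logging scenario, the property can then be dropped straight into the log line (reusing the $LogFile variable from the question):
Add-Content $LogFile -Value "$($obj.Now): XXXXX"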
Using PsBreakPoint:
$act = @'
$global:now = (get-date).datetime
'@
$global:sb = [scriptblock]::Create($act)
$now = Set-PSBreakpoint -Variable now -Mode Read -Action $global:sb
Reading $now then returns the current, updated datetime value.
One liner:
$now = Set-PSBreakpoint -Variable now -Mode Read -Action { $global:now = (get-date).datetime }