Using SQL 2017, version 14.0.1.439. I need to change the data source path of the connections in a Tabular database with PowerShell.
Here is my code:
$ServerName="localhost\tabular"
$loadInfo =
[Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices")
$server = New-Object Microsoft.AnalysisServices.Server
$server.connect($ServerName)
if ($server.name -eq $null) {
Write-Output ("Server '{0}' not found" -f $ServerName)
break
}
foreach ($d in $server.Databases )
{
Write-Output ( "Database: {0}; Status: {1}; Size: {2}MB; Data Sources: {3} " -f $d.Name, $d.State, ($d.EstimatedSize/1024/1024).ToString("#,##0"), $d.DataSources.Count )
}
My problem is that $d.DataSources.Count is always 0.
I am looking for some way to edit it with PS.
You should be accessing the data sources through Model.DataSources.
I can confirm that the following code returns the number of data sources for my Tabular models.
$ServerName="localhost\tabular"
$loadInfo =
[Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices")
$server = New-Object Microsoft.AnalysisServices.Server
$server.connect($ServerName)
if ($server.name -eq $null) {
Write-Output ("Server '{0}' not found" -f $ServerName)
break
}
foreach ($d in $server.Databases )
{
Write-output ( "Database: {0}; Status: {1}; Size: {2}MB; Data Sources: {3} " -f $d.Name, $d.State, ($d.EstimatedSize/1024/1024).ToString("#,##0"), $d.Model.DataSources.Count )
}
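Building on that, changing the data source itself should then just be a matter of updating the connection string and saving the model. Here is a rough, untested sketch, assuming a 1200+ compatibility level model with legacy (provider) data sources; 'OldServer' and 'NewServer' are placeholders, and structured (M) data sources expose ConnectionDetails instead of ConnectionString:
foreach ($d in $server.Databases)
{
    foreach ($ds in $d.Model.DataSources)
    {
        # Legacy/provider data sources carry the classic connection string
        if ($ds.Type -eq 'Provider')
        {
            $ds.ConnectionString = $ds.ConnectionString -replace 'OldServer', 'NewServer'
        }
    }
    # Persist the metadata change back to the server
    $d.Model.SaveChanges()
}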
Can anyone help me get a script to remove backup files older than 45 days from a storage blob container?
I actually want to run the script on a Managed Instance (PaaS) through a SQL Server Agent job.
Please help me with this.
According to this post, I think you can modify the PS script: change $FromDate = ((Get-Date).AddDays(-6)).Date to $FromDate = ((Get-Date).AddDays(-45)).Date. The script is as follows:
#PowerShell Script to delete System Databases .bak files from Azure Block Blob
#-eq = equals
#-ne = not equals
#-lt = less than
#-gt = greater than
#-le = less than or equals
#-ge = greater than or equals
#Set All Variables
$YesterdayDate = ((Get-Date).AddDays(-1)).Date #Get Yesterday date
$FromDate = ((Get-Date).AddDays(-45)).Date #Get the date 45 days back from today
$BlobType = "Pageblob"
$bacs = Get-ChildItem $location -Filter *.bak
$container="dwhdatabasesbackup"
$StorageAccountName="analytics"
$StorageAccountKey="xxxxxx"
$context = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey
$filelist = Get-AzureStorageBlob -Container $container -Context $context
#Foreach Loop With a Condition $_.LastModified.Date -eq $YesterdayDate to make Sure there was a File Uploaded Yesterday
foreach ($file in ($filelist | Where-Object {$_.LastModified.Date -eq $YesterdayDate -and $_.BlobType -eq $BlobType -and ($_.Name -like "*.bak")}))
{
$Yesterdayfile = $file.Name
if ($Yesterdayfile -ne $null)
{
$FileFullLength = $Yesterdayfile.Length
$FileNameWithoutDatePart = $Yesterdayfile.SubString(0, $FileFullLength-30)
Write-Output ("File Name Without Date Part: " +$FileNameWithoutDatePart)
#Foreach Loop With a Condition $_.LastModified.Date -lt $FromDate to Remove Files that are Older Than 45 Days
foreach ($file in ($filelist | Where-Object {$_.LastModified.Date -lt $FromDate -and $_.BlobType -eq $BlobType -and ($_.Name -like "$FileNameWithoutDatePart*.bak")}))
{
$removefile = $file.Name
$RemoveFileFullLength = $removefile.Length
$ModifiedDate = $file.LastModified.Date
if (($removefile -ne $null) -and ($RemoveFileFullLength -eq $FileFullLength))
{
Write-Output ("Remove File Name: ("+$removefile +") as Modified Date: ("+ $ModifiedDate +") of File is Older Than Date: ("+ $FromDate + ")")
#Delete the qualifying blob (assumed from the script's stated purpose; the posted snippet only logged the name)
Remove-AzureStorageBlob -Blob $removefile -Container $container -Context $context
}
}
}
}
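As a side note, if the newer Az PowerShell module is available where the job runs, the same cleanup can be expressed much more compactly. A sketch, reusing the storage account details from above:
# Delete *.bak blobs older than 45 days using the Az.Storage cmdlets
$cutoff  = (Get-Date).AddDays(-45)
$context = New-AzStorageContext -StorageAccountName "analytics" -StorageAccountKey "xxxxxx"
Get-AzStorageBlob -Container "dwhdatabasesbackup" -Context $context |
    Where-Object { $_.Name -like "*.bak" -and $_.LastModified.DateTime -lt $cutoff } |
    Remove-AzStorageBlob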
So I have a list of site collections and, for each of them, everyone who has access to the site collection: a semicolon-separated CSV file, formatted like /sites1/sites;AD\User1.
We need to do a cleanup on ~400 sites where we want to remove users belonging to a certain company or to a certain AD.
I have this information, so that's not a problem.
Is there a way to make a PowerShell script that removes these users from these site collections? In the other output we don't have any list of where exactly they have permissions, only that they have them somewhere on the site.
I would also need the output in a good format so we can restore the users in case that's needed.
Can this be done?
We used the script below to get the list of users:
$ver = $host | select version
if($Ver.version.major -gt 1) {$Host.Runspace.ThreadOptions = "ReuseThread"}
if(!(Get-PSSnapin Microsoft.SharePoint.PowerShell -ea 0))
{
Add-PSSnapin Microsoft.SharePoint.PowerShell
}
$pathSave = "D:\Script\Output.csv"
$pathRead = "D:\Script\\sites.txt"
write-output = "======================================================================="
[System.Collections.ArrayList]$objectCollection = #();
[System.Collections.ArrayList]$allUsersArray = #();
foreach($line in Get-Content $pathRead)
{
$site = Get-SPSite($line)
$sitecntr = 0;
#write-output " ";
write-output $site.Url;
$allusers = $site.RootWeb.AllUsers;
foreach ($user in $allusers)
{
if (!$user.IsDomainGroup)
{
$sitecntr++;
$outstring = $site.Url + ";" + $user.UserLogin + ";" + $user.Email
$a = $allUsersArray.Add($outstring);
}
}
$outString = "Totalt: " + $sitecntr
if ($sitecntr -ne 0) {
write-output $outstring;
write-output " ";
}
$site.Dispose();
}
write-output ""
write-output "Exporting to csv..."
Out-File -FilePath $pathSave -InputObject $allUsersArray
write-output "Finished! "
write-output ""
This script grants Reader permissions to the specified user across the site collection.
**Keep in mind:** This is not the right way to deal with permissions issues!
$site = Get-SPSite -Identity "http://mysite/"
$user = Get-SPUser -Identity "mydomain\myuser" -Web $site.RootWeb
$assignment = New-Object Microsoft.SharePoint.SPRoleAssignment($user)
$role = $site.RootWeb.RoleDefinitions[[Microsoft.SharePoint.SPRoleType]::Reader]
$assignment.RoleDefinitionBindings.Add($role);
foreach ($web in $site.AllWebs) {
if ($web.HasUniquePerm) {
$web.RoleAssignments.Add($assignment)
}
}
Note: You need to be Site Collection Admin to use this script.
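For the cleanup part of the question, a minimal sketch of the removal itself could look like the following. It assumes the ;-separated format /sites1/sites;AD\User1, that the input file has already been filtered down to the users you actually want to remove, and that the web application URL and file paths are placeholders. It logs every removed login so the accounts can be re-added later, but it does not capture their individual permission levels:
$removedLog = "D:\Script\RemovedUsers.csv"
foreach ($line in Get-Content "D:\Script\UsersToRemove.csv")
{
    $siteUrl, $login = $line -split ';'
    $site = Get-SPSite ("http://myserver" + $siteUrl)
    try
    {
        # Remove the user from the whole site collection (root web)
        Remove-SPUser -Identity $login -Web $site.RootWeb -Confirm:$false
        # Keep a record of what was removed so it can be restored by hand
        ($site.Url + ";" + $login) | Out-File $removedLog -Append -Encoding utf8
    }
    finally
    {
        $site.Dispose()
    }
}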
I just learned the benefit of using Write-Verbose & Write-Debug over my own Write-Log function, which you can find below:
Function Write-Log
{
param($logType, $logString, $logFile, [switch]$newLine)
$time = get-date -Format "HH:mm:ss"
$date = get-date -Format "yyyy-MM-dd"
$line = "[${date}][${time}][$logType] ${logString}"
if ($logFile)
{
$retryDelay = 0.5;
$maxRetries = 10;
$retries = 0;
while($retries -lt $maxRetries)
{
try
{
$line | out-file -Encoding utf8 -Append $logFile
break;
}
catch
{
++$retries;
Sleep $retryDelay;
}
}
}
if ($logType -eq 'INFO')
{
write-host -ForegroundColor Green $line
}
elseif ($logType -eq 'WARN')
{
write-host -ForegroundColor Yellow $line
}
elseif ($logType -eq 'ERROR')
{
write-host -ForegroundColor Red $line
}
if ($newLine -eq $true)
{
write-host
}
}
This helps me keep my scripts' output as uncluttered as possible, and it includes a timestamp, which is handy when it comes to debugging.
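For reference, a typical call looks like this (the log path here is just an example):
Write-Log -logType INFO -logString "Backup started" -logFile "C:\temp\script.log"
Write-Log -logType ERROR -logString "Backup failed" -logFile "C:\temp\script.log" -newLine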
Question
Is there a way to overload Write-Verbose so it behaves in the following way?
PS > Write-Verbose -Message 'I am a verbose message!'
[2016-02-25][07:44:36] VERBOSE: I am a verbose message!
Edit
I have found the following, which unfortunately isn't honoring the $VerbosePreference variable:
$VerbosePreference = "SilentlyContinue"
Function Private:Write-Verbose ($Message)
{
$time = get-date -Format "HH:mm:ss"
$date = get-date -Format "yyyy-MM-dd"
$line = "[${date}][${time}] "
Write-Host $line -NoNewline
&{Write-Verbose -Message $Message}
}
Write-Verbose -Message "Test"
The above will just output the date and timestamp, without the message.
Write-Verbose resides within Microsoft.PowerShell.Utility, so this is not possible as far as I know without manipulating and changing built-in behavior in PowerShell (which should be avoided).
You could either create your own "Write-Verbose" function in your script/session scope, which would output the desired result (using [CmdletBinding()]), or live with an output message such as "VERBOSE: [2016-02-25][07:44:36] Your log message" (i.e. rely on the default behavior of Write-Verbose).
I'd recommend the latter unless you have some funky output requirements for your host.
If you do go on and create your own Write-Verbose function, you should put [CmdletBinding()] before your params, as this enables the common parameters/switches (such as -Verbose and -Debug) to be passed to your function.
For more information about cmdletbinding and parameter-binding see:
https://blogs.technet.microsoft.com/heyscriptingguy/2012/07/07/weekend-scripter-cmdletbinding-attribute-simplifies-powershell-functions/
https://posh2scripting.wordpress.com/2013/06/05/what-is-cmdletbinding/
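For completeness, a rough sketch of such a wrapper (the function name is my own): with [CmdletBinding()] it accepts -Verbose and still honors $VerbosePreference, because it simply delegates to the real cmdlet. The VERBOSE: prefix then comes before the timestamp, as described above.
Function Write-TimestampedVerbose
{
    # [CmdletBinding()] makes this an advanced function, so -Verbose is accepted
    # and sets the verbose preference for the inner Write-Verbose call
    [CmdletBinding()]
    param([Parameter(Mandatory = $true)][string]$Message)
    $stamp = Get-Date -Format "[yyyy-MM-dd][HH:mm:ss]"
    Write-Verbose -Message "$stamp $Message"
}
Write-TimestampedVerbose -Message "Test" -Verbose
# VERBOSE: [2016-02-25][07:44:36] Test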
One last thing: it's not recommended to use Write-Host directly in your scripts, as this messes with default stream redirects (etc.). I would highly suggest using the Write-Verbose, Write-Debug, Write-Information, and Write-Output cmdlets if you're printing information to streams.
More information on why not to use Write-Host can be found at:
http://www.jsnover.com/blog/2013/12/07/write-host-considered-harmful/
http://powershell.com/cs/blogs/donjones/archive/2012/04/06/2012-scripting-games-commentary-stop-using-write-host.aspx
Hope this answers your question.
Agree with @CmdrTchort on the answer to this question.
This answer is provided to give a custom implementation of Write-Verbose which could be used instead (i.e. by making calls to Write-CustomVerbose instead of Write-Verbose). Obviously this won't affect any existing code or code in referenced libraries which still use Write-Verbose.
function Write-CustomVerbose {
[CmdletBinding(DefaultParameterSetName='UseTimestamp')]
param (
[Parameter(Mandatory = $true, ValueFromPipeline = $true)]
[AllowEmptyString()]
[string]$Message
,
[Parameter(Mandatory = $false, ParameterSetName='UseTimestamp')]
[string]$TimestampFormat = ((Get-Culture).DateTimeFormat.UniversalSortableDateTimePattern)
,
[Parameter(Mandatory = $false, ParameterSetName='UseTimestamp')]
[switch]$UseLocalTime #defaults to UTC
,
[Parameter(Mandatory = $false, ParameterSetName='DoNotUseTimestamp')]
[switch]$ExcludeTimestamp #defaults to include the timestamp (as that's why we're using this function over the standard write-verbose)
)
begin {
[string]$FormattedMessage = '{0}'
if(-not $ExcludeTimestamp.IsPresent) {
$FormattedMessage = "{1:$TimestampFormat}: $FormattedMessage"
}
}
process {
#get the time here rather than in begin as we want it to be accurate per message from pipeline
[DateTime]$Now = Get-Date
if(-not $UseLocalTime.IsPresent){$Now = $Now.ToUniversalTime()}
#output the results
write-verbose ($FormattedMessage -f $Message, $Now)
}
}
Example Usage: 1..1000 | Write-CustomVerbose -Verbose -UseLocalTime -TimestampFormat 'HH:mm'
Update
Here's a slightly more advanced version allowing you to hijack all streams at once:
function Write-Custom {
[CmdletBinding(DefaultParameterSetName='UseTimestamp')]
param (
[Parameter(Mandatory = $true, ValueFromPipeline = $true)]
[PSCustomObject]$InputObject
,
[Parameter(Mandatory = $false, ParameterSetName='UseTimestamp')]
[string]$TimestampFormat = ((Get-Culture).DateTimeFormat.UniversalSortableDateTimePattern)
,
[Parameter(Mandatory = $false, ParameterSetName='UseTimestamp')]
[switch]$UseLocalTime #defaults to UTC
,
[Parameter(Mandatory = $false, ParameterSetName='DoNotUseTimestamp')]
[switch]$ExcludeTimestamp #defaults to include the timestamp (as that's why we're using this function over the standard write-verbose)
)
begin {
[string]$FormattedMessage = '{0}'
if(-not $ExcludeTimestamp.IsPresent) {
$FormattedMessage = "{1:$TimestampFormat}: $FormattedMessage"
}
}
process {
#get the time here rather than in begin as we want it to be accurate per message from pipeline
[DateTime]$Now = Get-Date
if(-not $UseLocalTime.IsPresent){$Now = $Now.ToUniversalTime()}
#determine output back to original stream
[bool]$outputStream = $true
if($InputObject.WriteErrorStream) {$outputStream=$false;write-error ($FormattedMessage -f $InputObject, $Now)}
if($InputObject.WriteWarningStream){$outputStream=$false;write-warning ($FormattedMessage -f $InputObject, $Now)}
if($InputObject.WriteVerboseStream){$outputStream=$false;write-verbose ($FormattedMessage -f $InputObject, $Now) -Verbose}
if($InputObject.WriteDebugStream) {$outputStream=$false;write-debug ($FormattedMessage -f $InputObject, $Now) -Debug}
if($outputStream){$InputObject}
}
}
#demo
1..20 | %{
if($_ % 2 -eq 0) {Write-Output $_}
if($_ -eq 11) {Write-Error $_ -ErrorAction Continue 2>&1} #bit of a hack required to get error output to flow further along the pipeline.
if($_ -eq 13) {Write-Warning $_}
if($_ -eq 15) {Write-Verbose $_ -Verbose}
if($_ -eq 17) {Write-Debug $_ -Debug}
} *>&1 | Write-Custom
I'm writing some code to compare installed versions of software on some test computers. Currently I'm using a PowerShell .ps1 script to create a text file and compare it with a previously created baseline text file.
For the ease of end users I would like to automate this in an Excel file: press a button and you get a result showing what does not match the baseline.
My current .ps1 scripts are:
CreateDefaultApps.ps1 - creates the default baseline DefApp32.txt or DefApp64.txt in the My Documents folder
cls
$CurDir = $myinvocation.InvocationName | split-path -parent
$DOCDIR = [Environment]::GetFolderPath("MyDocuments")
$TARGETDIR = "$DOCDIR\AppLog"
$COMPNAME = $env:computername
if(!(Test-Path -Path $TARGETDIR ))
{
New-Item -ItemType directory -Path $TARGETDIR
}
cls
if ([Environment]::Is64BitOperatingSystem)
{
Write-Host "64bit Windows Detected"
Write-Host "Collecting Product Information for"$COMPNAME
If (Test-Path $TARGETDIR\DefApp64.txt)
{
Remove-Item $TARGETDIR\DefApp64.txt
}
#Recreate the baseline file whether or not an old one existed
Get-WmiObject -Class Win32_Product | Select-Object -Property Name, Version > $TARGETDIR\DefApp64.txt
write-host "File"$TARGETDIR"\DefApp64.txt Created"
}
else
{
Write-Host "32bit Windows Detected"
Write-Host "Collecting Product Information for"$COMPNAME
If (Test-Path $TARGETDIR\DefApp32.txt)
{
Remove-Item $TARGETDIR\DefApp32.txt
}
#Recreate the baseline file whether or not an old one existed
Get-WmiObject -Class Win32_Product | Select-Object -Property Name, Version > $TARGETDIR\DefApp32.txt
write-host "File " $TARGETDIR"\DefApp32.txt Created"
}
CompareApps.ps1 - creates the current list of applications and compares it with the baseline
cls
$CurDir = $myinvocation.InvocationName | split-path -parent
Function Abort
{
Write-Host "Script Aborted"
#Stop the script here instead of letting it continue without the target folder
exit 1
}
$DOCDIR = [Environment]::GetFolderPath("MyDocuments")
$TARGETDIR = "$DOCDIR\AppLog"
$COMPNAME = $env:computername
if(!(Test-Path -Path $TARGETDIR ))
{
Write-Host $TARGETDIR" Folder does not Exist"
Abort
}
Function Finish
{
write-host "Comparing Products Completed"
}
Function CreateCompNameFile
{
Get-WmiObject -Class Win32_Product | Select-Object -Property Name, Version > $TARGETDIR\$COMPNAME.txt
write-host "File"$TARGETDIR\$COMPNAME".txt Created"
}
Function Compare64
{
write-host "Comparing Products with file"$TARGETDIR\$COMPNAME".txt"
Compare-Object -ReferenceObject (Get-Content $TARGETDIR\DefApp64.txt) -DifferenceObject (Get-Content $TARGETDIR\$COMPNAME.txt)
Finish
}
Function Compare32
{
write-host "Comparing Products with file"$TARGETDIR\$COMPNAME".txt"
Compare-Object -ReferenceObject (Get-Content $TARGETDIR\DefApp32.txt) -DifferenceObject (Get-Content $TARGETDIR\$COMPNAME.txt)
Finish
}
cls
if ([environment]::Is64BitOperatingSystem)
{
Write-Host "64bit Windows Detected"
Write-Host "Collecting Product Information for"$COMPNAME
If (Test-Path $TARGETDIR\$COMPNAME.txt)
{
Remove-Item $TARGETDIR\$COMPNAME.txt
CreateCompNameFile
}
else
{
CreateCompNameFile
}
Compare64
}
else
{
Write-Host "32bit Windows Detected"
Write-Host "Collecting Product Information for"$COMPNAME
If (Test-Path $TARGETDIR\$COMPNAME.txt)
{
Remove-Item $TARGETDIR\$COMPNAME.txt
CreateCompNameFile
}
else
{
CreateCompNameFile
}
Compare32
}
I would prefer to run this code via Excel and capture the list in a worksheet, then compare it with the baseline which is already in the worksheet. How do I execute or use PowerShell commands from VBA?
Previously I have used the following (see the post How to list all installed applications into Excel):
Set objWMIService = GetObject("winmgmts:{impersonationLevel=impersonate}!\\" & StrComputer & "\root\cimv2")
Set objAllSoftwares = objWMIService.ExecQuery("Select * from Win32_Product")
For some reason it does not retrieve all of the installed software (which you can see in Add/Remove Programs), e.g. 7-Zip (64-bit) and GPL Ghostscript, to name a few.
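From what I understand, Win32_Product only returns MSI-installed products, which would explain why EXE-based installers such as 7-Zip are missing. A rough sketch of the usual workaround is to read the uninstall registry keys (both the 64-bit and 32-bit hives) instead:
# Enumerate installed applications from the uninstall registry keys;
# unlike Win32_Product this also catches non-MSI installers
$keys = @(
    'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\*',
    'HKLM:\SOFTWARE\Wow6432Node\Microsoft\Windows\CurrentVersion\Uninstall\*'
)
Get-ItemProperty $keys -ErrorAction SilentlyContinue |
    Where-Object { $_.DisplayName } |
    Select-Object DisplayName, DisplayVersion |
    Sort-Object DisplayName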
Any help would be much appreciated.
Thanks
Dumidu
I have a very simple question. My purpose here is to retrieve login names from a txt file into a variable in SQL and query the SQL table, filtering on that same variable.
So for example:
the txt file would have:
forde
blain
martin
Alex
so the idea is to feed each name into the variable and output the designated computer name.
Declare @loginName varchar (25)
--open the file
--while the end of the file has not been reached, read each line and place the name into the @loginName variable
select *
from computerinfo
where loginname = @loginName
I don't necessarily need to bulk import to a temp table at this point.
Thanks.
I had to do this some weeks ago and the simplest way I found was PowerShell.
I had no SSIS, otherwise that would be the best option of course.
# You may want to adjust these
function Invoke-Sqlcmd2
{
[CmdletBinding()]
param(
[Parameter(Position=0, Mandatory=$true)] [string]$ServerInstance,
[Parameter(Position=1, Mandatory=$false)] [string]$Database,
[Parameter(Position=2, Mandatory=$false)] [string]$Query,
[Parameter(Position=3, Mandatory=$false)] [string]$Username,
[Parameter(Position=4, Mandatory=$false)] [string]$Password,
[Parameter(Position=5, Mandatory=$false)] [Int32]$QueryTimeout=600,
[Parameter(Position=6, Mandatory=$false)] [Int32]$ConnectionTimeout=15,
[Parameter(Position=7, Mandatory=$false)] [ValidateScript({test-path $_})] [string]$InputFile,
[Parameter(Position=8, Mandatory=$false)] [ValidateSet("DataSet", "DataTable", "DataRow")] [string]$As="DataRow"
)
if ($InputFile)
{
$filePath = $(resolve-path $InputFile).path
$Query = [System.IO.File]::ReadAllText("$filePath")
}
$conn=new-object System.Data.SqlClient.SQLConnection
if ($Username)
{ $ConnectionString = "Server={0};Database={1};User ID={2};Password={3};Trusted_Connection=False;Connect Timeout={4}" -f $ServerInstance,$Database,$Username,$Password,$ConnectionTimeout }
else
{ $ConnectionString = "Server={0};Database={1};Integrated Security=True;Connect Timeout={2}" -f $ServerInstance,$Database,$ConnectionTimeout }
$conn.ConnectionString=$ConnectionString
#Following EventHandler is used for PRINT and RAISERROR T-SQL statements. Executed when -Verbose parameter specified by caller
if ($PSBoundParameters.Verbose)
{
$conn.FireInfoMessageEventOnUserErrors=$true
$handler = [System.Data.SqlClient.SqlInfoMessageEventHandler] {Write-Verbose "$($_)"}
$conn.add_InfoMessage($handler)
}
$conn.Open()
$cmd=new-object system.Data.SqlClient.SqlCommand($Query,$conn)
$cmd.CommandTimeout=$QueryTimeout
$ds=New-Object system.Data.DataSet
$da=New-Object system.Data.SqlClient.SqlDataAdapter($cmd)
[void]$da.fill($ds)
$conn.Close()
switch ($As)
{
'DataSet' { Write-Output ($ds) }
'DataTable' { Write-Output ($ds.Tables) }
'DataRow' { Write-Output ($ds.Tables[0]) }
}
} #Invoke-Sqlcmd2
$files = @(get-childitem "filelocationformultiplefile" -include *.txt -exclude *.bak -recurse | where-object {($_.LastWriteTime -le (Get-Date).AddDays(-0))-and ($_.psIsContainer -eq $false)})
if ($files -ne $NULL)
{
for ($idx = 0; $idx -lt $files.Length; $idx++)
{
$file = $files[$idx]
$Query = @"
Bulk INSERT db.dbo.tbl from '$file' with (FirstRow = 1, FieldTerminator ='";', RowTerminator = '\n')
"@
Invoke-sqlcmd2 -ServerInstance "servername" -Database "db" -Query $Query
}
}
And I'm not crazy: except for the last 10 lines, everything else comes from an official Microsoft blog.
You don't need everything; PowerShell is present on every computer. You just have to save this in a file with the .ps1 extension and configure the 4 variables ($fileout2 -> $tablename).
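If you only need to run the query once per name rather than bulk-load the whole file, a smaller sketch along the same lines (reusing Invoke-Sqlcmd2 from above; the file path, server and database names are placeholders) would be:
# Read each login name from the txt file and query computerinfo for it
foreach ($loginName in Get-Content "C:\temp\logins.txt")
{
    $query = "select * from computerinfo where loginname = '$($loginName.Trim())'"
    Invoke-Sqlcmd2 -ServerInstance "servername" -Database "db" -Query $query
}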