How to use SQL Server Agent to run a PowerShell command to generate a CSV file?

I have a PowerShell command like this:
Search-ADAccount -AccountDisabled -UsersOnly | FT Name > C:\Users\hou\Downloads\DisabledAccount.csv
This command grabs all disabled account names from AD and puts them into a .CSV file.
I want to set up a job in SQL Server Agent so it will run the command whenever I need it.
But the Agent keeps giving me an error when I try to run the job.
Can anyone let me know the right command for this when running it in SQL Server Agent?

You might have other issues, but the glaring first one is that you do not have a CSV file. If you opened the file, you would see that it does not follow any CSV standard; it is just a formatted table dumped to text.
Search-ADAccount -AccountDisabled -UsersOnly |
Export-CSV -notypeinformation C:\Users\hou\Downloads\DisabledAccount.csv
Format-Table is just for showing information on the screen, not for generating file output. If you want to make a CSV, that is what Export-CSV is there to do. If you only want the Name column, you can add a Select-Object to the pipeline.
Search-ADAccount -AccountDisabled -UsersOnly |
Select-Object Name |
Export-CSV -notypeinformation C:\Users\hou\Downloads\DisabledAccount.csv
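As for the SQL Server Agent side: one option could be a job step of type Operating system (CmdExec) that calls powershell.exe directly, since the Agent's built-in PowerShell subsystem may not load the ActiveDirectory module. A minimal sketch, assuming the RSAT ActiveDirectory module is installed on the SQL box and the Agent service account can query AD and write to that path:
powershell.exe -NoProfile -ExecutionPolicy Bypass -Command "Import-Module ActiveDirectory; Search-ADAccount -AccountDisabled -UsersOnly | Select-Object Name | Export-Csv -NoTypeInformation C:\Users\hou\Downloads\DisabledAccount.csv"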

Related

What's the best approach to analyse a data source (CSV) and then extract only the deltas (changes) to a destination source?

For my example, I'm looking to compare a source file against a version with some changes made to the attributes in a table, let's say in the form of another source file.
What I want to achieve is:
Sourcefile.csv
Newfile.csv
Deltafile.csv
(this will export only the changed rows, the deltas, between the two files)
I would like the whole row containing the change to be exported as the delta, not just the changed column attribute.
All other rows that match do not need to be updated.
100,Renie,Stav,Renie.Stav#yopmail.com,Renie.Stav#gmail.com,CHANGE
101,Neila,Germann,CHANGE,Neila.Germann#gmail.com,developer
I've looked at PowerShell, FC, and SSIS incremental loading to see if they will work for my needs, but I need some guidance in the right direction. Any help is greatly appreciated! :)
**Current Method**
Looking in depth at the PowerShell I tried from https://www.reddit.com/r/PowerShell/comments/cea8ax/compare_2_csv_files_and_export_the_rows_that_do/, which is:
# Compare work
$csv1 = Import-Csv -Path C:\Users\G23\Documents\Hackingfolder\source.csv
$csv2 = Import-Csv -Path C:\Users\G23\Documents\Hackingfolder\new.csv
$head = (Get-Content -Path C:\Users\G23\Documents\Hackingfolder\source.csv | Select-Object -First 1) -split ","
Compare-Object $csv1 $csv2 -Property $head -PassThru| Export-Csv C:\Users\G23\Documents\Hackingfolder\TheDiff.csv -NoTypeInformation
# Remove the side indicator if you don't care to know which file the diff came from
Compare-Object $csv1 $csv2 -Property $head | Select-Object -Property $head
Ignoring the side indicators, I would get the rows that do not match, both of them; it's not smart enough to know which one is the updated one. I want only the changed (delta) rows exported.
The PowerShell output shows both versions of each changed row, instead of the desired CSV below containing only the changes:
100,Renie,Stav,Renie.Stav#yopmail.com,Renie.Stav#gmail.com,CHANGE
101,Neila,Germann,CHANGE,Neila.Germann#gmail.com,developer
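A possible direction (an untested sketch, not from the original post): Compare-Object marks rows that exist only in the second file with the side indicator '=>', so filtering on that, assuming new.csv holds the updated rows, might keep only the changed rows coming from the new file (DeltaOnly.csv is a made-up output name):
$csv1 = Import-Csv -Path C:\Users\G23\Documents\Hackingfolder\source.csv
$csv2 = Import-Csv -Path C:\Users\G23\Documents\Hackingfolder\new.csv
$head = (Get-Content -Path C:\Users\G23\Documents\Hackingfolder\source.csv | Select-Object -First 1) -split ","
# Keep only rows present in new.csv but not in source.csv (SideIndicator '=>'),
# then drop the SideIndicator column before exporting
Compare-Object $csv1 $csv2 -Property $head -PassThru |
    Where-Object { $_.SideIndicator -eq '=>' } |
    Select-Object -Property $head |
    Export-Csv C:\Users\G23\Documents\Hackingfolder\DeltaOnly.csv -NoTypeInformation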
Thanks!

PowerShell - Create Text File from SQL

I'm creating a PowerShell script to generate .sql files based on some stored procedures in a SQL Server database.
On the advice of this article, I'm using .NET's System.Data.SqlClient.SqlConnection class to create my connection and run stored procedures. Per the article, I can store that data in a System.Data.DataTable object.
The trouble I'm running into is how to best build the file. I'm new to PowerShell, so I've been getting away with something like this:
$SqlFile = "$CurrentDirectory\$CurrentDate`_$Library`_$File`_Table_CREATE.sql$ReviewFlag"
"USE [$TargetDB]`n`nCREATE TABLE [$Library].[$File] (`n" | Out-File $SqlFile
$dt.DataTypeString | Out-File -Append $SqlFile
")`n" | Out-File -Append $SqlFile
But the trouble now is that I need to generate the file extension based on another query, whose results will also be part of the file, and I can't do that in sequence as above; I would need to somehow save that output and write it to the file later.
Do I want to keep invoking Out-File, or would it be better to somehow convert the DataTable object to a string or list of strings that I could then concatenate into one large string to be written to the file all at once? This doesn't have to be perfect, but I would like to write something reasonable and maintainable.
Edit: I decided to go the route of using an ArrayList to store the values from the DataTable, and from there it's simple string concatenation:
$ColumnStringList = New-Object -TypeName 'System.Collections.ArrayList'
# [void] discards the index that ArrayList.Add() would otherwise send to the output stream
foreach($val in $dt.DataTypeString) {[void]$ColumnStringList.Add("`n`t"+$val)}
And then
$SqlFileString = $($SqlFileHeader+$ColumnStringList+$RRNCol+$SqlFileFooter)
$SqlFileString | Out-File $SqlFile
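If the pieces eventually need to be assembled out of order (for example, an extension that depends on a later query), another option might be to collect everything as strings first and write the file once at the end; a minimal sketch, assuming the same variables ($TargetDB, $Library, $File, $dt, $SqlFile) are already populated:
$lines = @()
$lines += "USE [$TargetDB]"
$lines += ""
$lines += "CREATE TABLE [$Library].[$File] ("
$lines += $dt.DataTypeString | ForEach-Object { "`t$_" }
$lines += ")"
# A single write instead of repeated Out-File -Append calls
($lines -join "`n") | Set-Content -Path $SqlFile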

BCP update database table based on output from PowerShell

I have 4 files with the same CSV header, as follows:
Column1,Column2,Column3,Column4
But I only require the data from Column2, Column3, and Column4 to import into the SQL database using BCP. I am using PowerShell to select the columns I want and import the required data using BCP, but my PowerShell script runs with no error and no data is loaded into my database table. How do I set up BCP to import the output from PowerShell into the database table? Here is my PowerShell script:
$filePath = Get-ChildItem -Path 'D:\test\*' -Include $filename
$desiredColumn = 'Column2','Column3','Column4'
foreach($file in $filePath)
{
write-host $file
$test = import-csv $file | select $desiredColumn
write-host $test
$action = bcp <myDatabaseTableName> in $test -T -c -t";" -r"\n" -F2 -S <MyDatabase>
}
This is the output from the PowerShell script:
D:\test\sample1.csv
#{column2=111;column3=222;column4=333} #{column2=444;column3=555;column4=666}
D:\test\sample2.csv
#{column2=777;column3=888;column4=999} #{column2=aaa;column3=bbb;column4=ccc}
First off, you can't update a table with bcp. It is used to bulk load data; that is, it will either insert new rows or export existing data into a flat file. Changing existing rows, usually called updating, is out of scope for bcp. If that's what you need, you need another tool. Sqlcmd works fine, and PowerShell's got Invoke-Sqlcmd for running arbitrary T-SQL statements.
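If updates really are the goal, a hedged sketch of what that could look like with Invoke-Sqlcmd (the server, database, table, and column names here are hypothetical):
# Requires the SqlServer (or older SQLPS) module for Invoke-Sqlcmd
Invoke-Sqlcmd -ServerInstance 'MyServer' -Database 'MyDatabase' -Query @"
UPDATE dbo.MyTable
SET    Column3 = 'newvalue'
WHERE  Column2 = 'keyvalue';
"@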
Anyway, the BCP utility has notoriously tricky syntax. As far as I know, one cannot bulk load data by passing it as a parameter to bcp; a source file must be used. Thus you need to save the filtered data to a file and pass that file's name to bcp.
Exporting a filtered CSV is easy enough; just remember to use the -NoTypeInformation switch, lest you get #TYPE Selected.System.Management.Automation.PSCustomObject as your first row of data. This assumes the bcp arguments are well and good (why -F2, though? And Unix newlines?).
Stripping the double quotes requires another edit to the file. The Scripting Guy has a solution.
foreach($file in $filePath){
write-host $file
$test = import-csv $file | select $desiredColumn
# Overwrite filtereddata.csv, should one exist, with filtered data
$test | export-csv -path .\filtereddata.csv -NoTypeInformation
# Remove double quotes
(gc filtereddata.csv) | % {$_ -replace '"', ''} | out-file filtereddata.csv -Fo -En ascii
$action = bcp <myDatabaseTableName> in filtereddata.csv -T -c -t";" -r"\n" -F2 -S <MyDatabase>
}
Depending on your locale, the column separator might be a semicolon, a comma, or something else. Use Export-Csv's -Delimiter '<character>' switch to pass whatever you need, or change bcp's -t argument.
Erland's got a helpful page about bulk operations. Also, see Redgate's advice.
Without needing to modify the file first, there is an answer here about how bcp can handle quoted data:
BCP in with quoted fields in source file
Essentially, you need to use the -f option and create/use a format file to tell SQL Server your custom field delimiter (in short, it is no longer a lone comma (,) but is now (","), a comma with two double quotes). You need to escape the double quotes, and a small trick handles the first double quote on a line, but it works like a charm.
Also, you need the format file to ignore column(s): just set the destination column number to zero. All with no need to modify the file before the load. Good luck!
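A minimal sketch of what such a non-XML format file might look like; the 14.0 version line, field lengths, collations, and the assumption that the destination table's first three columns map to Column2, Column3, and Column4 are all placeholders. The dummy first field consumes the leading quote on each line, and the destination column 0 on Column1 skips it:
14.0
5
1   SQLCHAR   0   0     "\""       0   FIRST_QUOTE   ""
2   SQLCHAR   0   100   "\",\""    0   Column1       ""
3   SQLCHAR   0   100   "\",\""    1   Column2       SQL_Latin1_General_CP1_CI_AS
4   SQLCHAR   0   100   "\",\""    2   Column3       SQL_Latin1_General_CP1_CI_AS
5   SQLCHAR   0   100   "\"\r\n"   3   Column4       SQL_Latin1_General_CP1_CI_AS
It could then be referenced with something like (server and table names are placeholders): bcp MyDatabase.dbo.MyTable in D:\test\sample1.csv -f D:\test\columns.fmt -T -S MyServer -F 2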

PowerShell script as a step in a SQL job giving an error

I am trying to create a SQL job which syncs users from a CSV file to an AD group.
My PowerShell script is one of the steps of this job. The issue is that my script is supposed to run on another server which has Active Directory, but I keep getting an error when I run this step.
My script is the following:
invoke-Command -Session Server-Name
Import-Module activedirectory
$ADUsers = Import-csv \\Server-Name\folder\file.csv
foreach ($User in $ADUsers)
{
$Username = $User.sAMAccountName
$group=$user.adgroup
if (Get-ADUser -F {SamAccountName -eq $Username})
{
foreach($group in $groups){Add-ADGroupMember -identity $group -Members $Username}
Write-Output "$username has beeen added to group $group"
}
}
The error I am getting is:
Executed as user: Username. A job step received an error at line 2 in a PowerShell script. The corresponding line is 'Invoke-Command -Session Server-Name. Correct the script and reschedule the job. The error information returned by PowerShell is: 'Cannot bind parameter 'Session'. Cannot convert the "Server-Name" value of type "System.String" to type "System.Management.Automation.Runspaces.PSSession". '. Process Exit Code -1. The step failed.
The server name has a '-' in it, so I need to know if that is causing the issue,
or if I am using the wrong way to run this script on a different server from a SQL job.
Any help would be appreciated!
Jaspreet, I am not an expert on PowerShell, but it seems like you are passing the wrong parameters. Just referring to the Microsoft docs, it seems you need to pass a computer name rather than a -Session.
Try with this line of code at the start:
Invoke-Command -ComputerName Server-Name
For more, please refer to the Microsoft docs:
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.core/invoke-command?view=powershell-6#examples
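Note that Invoke-Command also needs something to run remotely (a -ScriptBlock or -FilePath); fixing only the first line would still leave the rest of the script running locally. A minimal sketch of the whole step wrapped in a remote script block, assuming PowerShell remoting is enabled on Server-Name and the Agent service account has the required AD and file-share permissions:
Invoke-Command -ComputerName Server-Name -ScriptBlock {
    Import-Module ActiveDirectory
    $ADUsers = Import-Csv \\Server-Name\folder\file.csv
    foreach ($User in $ADUsers)
    {
        $Username = $User.sAMAccountName
        $Group    = $User.adgroup
        # Only add the user if the account actually exists
        if (Get-ADUser -Filter {SamAccountName -eq $Username})
        {
            Add-ADGroupMember -Identity $Group -Members $Username
            Write-Output "$Username has been added to group $Group"
        }
    }
}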

PowerShell 4.0 - plink and table-like data

I am running PS 4.0 and the following command in interaction with a Veritas Netbackup master server on a Unix host via plink:
PS C:\batch> $testtest = c:\batch\plink blah#blersniggity -pw "blurble" "/usr/openv/netbackup/bin/admincmd/nbpemreq -due -date 01/17/2014" | Format-Table -property Status
As you can see, I attempted a "Format-Table" call at the end of this.
The resulting value of the variable ($testtest) is a string that is laid out exactly like the table in the Unix console, with Status, Job Code, Servername, Policy, and so on listed in order. But it populates the variable in PowerShell as just that: a vanilla string.
I want to use this in conjunction with a stored procedure on a SQL box, which would be TONS easier if I could format it into a table. How do I use PowerShell to tabulate it exactly as it is laid out when extracted from the Unix prompt via plink?
You'll need to parse it and create PS Objects to be able to use the format-* cmdlets. I do enough of it that I wrote this to help:
http://gallery.technet.microsoft.com/scriptcenter/New-PSObjectFromMatches-87d8ce87
You'll need to be able to isolate the data and write a regex to capture the bits you want.
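As an illustration only (the real nbpemreq layout may differ, so the column names, header skip, and regex below are assumptions), the raw plink lines could be turned into objects roughly like this:
# Capture the raw text without Format-Table, so the data is not flattened for display
$raw = c:\batch\plink blah#blersniggity -pw "blurble" "/usr/openv/netbackup/bin/admincmd/nbpemreq -due -date 01/17/2014"
# Turn each data line into an object; skip the header lines and adjust the regex to the real output
$rows = $raw | Select-Object -Skip 2 | ForEach-Object {
    if ($_ -match '^\s*(?<Status>\S+)\s+(?<JobCode>\S+)\s+(?<ServerName>\S+)\s+(?<Policy>\S+)') {
        [pscustomobject]@{
            Status     = $Matches.Status
            JobCode    = $Matches.JobCode
            ServerName = $Matches.ServerName
            Policy     = $Matches.Policy
        }
    }
}
# Now Format-Table (or later SQL work) has real properties to operate on
$rows | Format-Table -Property Status, ServerName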