PowerShell - Create Text File from SQL

I'm creating a PowerShell script to generate .sql files based on some stored procedures in a SQL Server database.
On the advice of this article, I'm using .NET's System.Data.SqlClient.SqlConnection class to create my connection and run stored procedures. Per the article, I can store that data in a System.Data.DataTable object.
The trouble I'm running into is how to best build the file. I'm new to PowerShell, so I've been getting away with something like this:
$SqlFile = "$CurrentDirectory\$CurrentDate`_$Library`_$File`_Table_CREATE.sql$ReviewFlag"
"USE [$TargetDB]`n`nCREATE TABLE [$Library].[$File] (`n" | Out-File $SqlFile
$dt.DataTypeString | Out-File -Append $SqlFile
")`n" | Out-File -Append $SqlFile
But the trouble now is that I need to generate the file extension based on another query, whose results will also be part of the file, and I can't do that in sequence as above; I would need to somehow save that output and write it to the file later.
Do I want to keep invoking Out-File, or would it be better to somehow convert the DataTable object to a string or list of strings that I could then concatenate into one large string to be written to the file all at once? This doesn't have to be perfect, but I would like to write something reasonable and maintainable.
Edit: I decided to go the route of using an ArrayList to store the values from the DataTable, and from there it's simple string concatenation:
$ColumnStringList = New-Object -TypeName 'System.Collections.ArrayList'
foreach($val in $dt.DataTypeString) { [void]$ColumnStringList.Add("`n`t" + $val) }  # [void] discards the index that Add() returns
And then
$SqlFileString = $($SqlFileHeader+$ColumnStringList+$RRNCol+$SqlFileFooter)
$SqlFileString | Out-File $SqlFile
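For reference, a rough sketch of the build-everything-then-write-once approach discussed above, reusing the variable names from the snippets (the column lines still come straight from $dt.DataTypeString; this is untested illustration, not the exact script):
# Collect every line first, then write the file in a single call.
$lines = @("USE [$TargetDB]", "", "CREATE TABLE [$Library].[$File] (")
$lines += @($dt.DataTypeString | ForEach-Object { "`t$_" })
$lines += ")"
# Join with newlines and write once; Set-Content avoids reopening the file per line.
($lines -join "`n") | Set-Content -Path $SqlFile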

Related

BCP update database table based on output from PowerShell

I have 4 files with the same CSV header, as follows:
Column1,Column2,Column3,Column4
But I only need the data from Column2, Column3 and Column4 to import into the SQL database using BCP. I am using PowerShell to select the columns I want and to import the required data using BCP, but my PowerShell runs with no errors and no data is updated in my database table. How do I set up BCP to import the output from PowerShell into the database table? Here is my PowerShell script:
$filePath = Get-ChildItem -Path 'D:\test\*' -Include $filename
$desiredColumn = 'Column2','Column3','Column4'
foreach($file in $filePath)
{
write-host $file
$test = import-csv $file | select $desiredColumn
write-host $test
$action = bcp <myDatabaseTableName> in $test -T -c -t";" -r"\n" -F2 -S <MyDatabase>
}
These are the output from the powershell script
D:\test\sample1.csv
#{column2=111;column3=222;column4=333} #{column2=444;column3=555;column4=666}
D:\test\sample2.csv
#{column2=777;column3=888;column4=999} #{column2=aaa;column3=bbb;column4=ccc}
First off, you can't update a table with bcp. It is used to bulk load data: it will either insert new rows or export existing data into a flat file. Changing existing rows, usually called updating, is out of scope for bcp. If that's what you need, you need another tool. Sqlcmd works fine, and PowerShell has Invoke-Sqlcmd for running arbitrary T-SQL statements.
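If updating really is the goal, a minimal hedged sketch with Invoke-Sqlcmd could look like this; the server, database, table and column names below are made up for illustration:
# Hypothetical example only: run an UPDATE through Invoke-Sqlcmd (SqlServer module).
Invoke-Sqlcmd -ServerInstance 'MyServer' -Database 'MyDatabase' -Query @"
UPDATE dbo.MyTable
SET    Column3 = 'newValue'
WHERE  Column2 = 'someKey';
"@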
Anyway, the bcp utility has notoriously tricky syntax. As far as I know, one cannot bulk load data by passing it as a parameter to bcp; a source file must be used. Thus you need to save the filtered data to a file and pass its name to bcp.
Exporting a filtered CSV is easy enough, just remember to use the -NoTypeInformation switch, lest you get #TYPE Selected.System.Management.Automation.PSCustomObject as your first row of data. This assumes the bcp arguments are otherwise well and good (why -F2 though? And Unix newlines?).
Stripping the double quotes requires another edit to the file. The Scripting Guy has a solution.
foreach($file in $filePath){
    write-host $file
    $test = import-csv $file | select $desiredColumn
    # Overwrite filtereddata.csv, should one exist, with filtered data
    $test | export-csv -path .\filtereddata.csv -NoTypeInformation
    # Remove double quotes
    (gc filtereddata.csv) | % {$_ -replace '"', ''} | out-file filtereddata.csv -Fo -En ascii
    $action = bcp <myDatabaseTableName> in filtereddata.csv -T -c -t";" -r"\n" -F2 -S <MyDatabase>
}
Depending on your locale, the column separator might be a semicolon, a colon or something else. Use the -Delimiter '<character>' switch to pass whatever you need, or change bcp's argument.
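For example, to make the exported file match the semicolon that the bcp call above passes as -t, the export line could take a -Delimiter argument (an illustrative tweak, not part of the original answer):
$test | export-csv -path .\filtereddata.csv -NoTypeInformation -Delimiter ';'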
Erland's got a helpful page about bulk operations. Also, see Redgate's advice.
If you would rather not modify the file first, there is an answer here about how bcp can handle quoted data.
BCP in with quoted fields in source file
Essentially, you need to use the -f option and create/use a format file to tell SQL Server your custom field delimiter (in short, it is no longer a lone comma (,) but now (","), a comma flanked by two double quotes). You need to escape the double quotes, and there is a small trick to handle the first double quote on a line. But it works like a charm.
You also need the format file to ignore column(s): just set the destination column number to zero. All with no need to modify the file before the load. Good luck!
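For illustration only, a hedged sketch of what such a non-XML format file might look like, written out from PowerShell; the version number, field lengths, collation, destination column ordinals and file names are all assumptions and would have to match the real table:
# Hypothetical sketch: a bcp format file for lines like "val1","val2","val3","val4".
# The dummy first field consumes the leading quote on each line; a destination
# column of 0 (sixth value) tells bcp to skip that source field entirely.
$formatFile = @'
14.0
5
1   SQLCHAR   0   0     "\""       0   FIRST_QUOTE   ""
2   SQLCHAR   0   100   "\",\""    0   Column1       ""
3   SQLCHAR   0   100   "\",\""    1   Column2       SQL_Latin1_General_CP1_CI_AS
4   SQLCHAR   0   100   "\",\""    2   Column3       SQL_Latin1_General_CP1_CI_AS
5   SQLCHAR   0   100   "\"\r\n"   3   Column4       SQL_Latin1_General_CP1_CI_AS
'@
Set-Content -Path .\quoted.fmt -Value $formatFile -Encoding Ascii
# bcp then takes -f instead of -c/-t/-r, e.g.:
# bcp <myDatabaseTableName> in .\sourcedata.csv -T -f .\quoted.fmt -S <MyDatabase>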

Run PowerShell from VBA in Access

I need to change a string in multiple text files.
I have written the script below in Access VBA, but I get a "Type mismatch" error.
Dim str As String
str = "N=maher"
Call Shell("c:\windows\system32\powershell.exe" - Command("get-content -Path e:\temptest.txt") - Replace(str, "maher", "ali"))
The syntax for calling PowerShell is way off. Suggestion: get it working from the command line yourself first, and then run from Access (an odd choice: it just makes this more complicated).
A PowerShell script to do this (.ps1 file) would need to contain something like:
Get-Content -Path "E:\temptest.txt" | ForEach-Object { $_ -Replace 'maher', 'ali' } | do-something-with-the-updated-content
You need to define:
What you are replacing (you pass N=maher in, but then hard-code two strings for Replace).
What to do with the strings after doing the replacement (Get-Content only reads files).
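Putting that together, a hedged sketch of what the .ps1 could contain; the file path and replacement pair come from the question, while writing the result back to the same file is an assumption about what "do something" should be:
# Hypothetical replace.ps1: read the file, swap the text, write it back in place.
$path = 'E:\temptest.txt'
(Get-Content -Path $path) -replace 'maher', 'ali' | Set-Content -Path $path
From VBA, something along the lines of Shell("powershell.exe -NoProfile -ExecutionPolicy Bypass -File ""C:\scripts\replace.ps1""") should then launch it; the script location is a placeholder.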

Powershell: SQL Server Management Studio Script Generator

I use the Script Generator that is integrated into Microsoft SQL Server Management Studio to generate an import script for a whole database.
I have to do some replacements in the script, which I do with PowerShell. Now I want to automate the generation. Is there a way to execute exactly this Script Generator tool (and set some options as on the screenshot; in my case 'Data only')? Or, if this isn't possible, can I open this tool window automatically from a PS script so I don't have to open Management Studio, select the DB, and so on?
I found some scripts which 'manually' build the script file in PowerShell, but that's not exactly what I'm looking for.
Thanks!
This question's been here a while and you've probably found your answer by now, but for those looking for a simple way to do this: current versions of the SQL Server PowerShell module have native commands and methods that support this functionality through SMO.
You can use Get-SqlDatabase and methods such as .Script() and .EnumScript().
For example, this will generate CREATE scripts for user defined functions and save it to file:
$Database = Get-SqlDatabase -ServerInstance $YourSqlServer -Name $YourDatabaseName
$MyFuncs = $Database.UserDefinedFunctions | Where Schema -eq "dbo"
$MyFuncs.Script() | Out-File -FilePath ".\SqlScripts\MyFunctions.sql"
If you want to script data and elements like indexes, keys, triggers, etc. you will have to specify the scripting options, like this:
$scriptOptions = New-Object -TypeName Microsoft.SqlServer.Management.Smo.ScriptingOptions
$scriptOptions.NoCollation = $True
$scriptOptions.Indexes = $True
$scriptOptions.Triggers = $True
$scriptOptions.DriAll = $True
$scriptOptions.ScriptData = $True
$Database.Tables.EnumScript($scriptOptions) | Out-File -FilePath ".\AllMyTables.sql"
Note that the Script() method doesn't support scripting data. Use EnumScript() for tables.
If you want to script data only, as asked, you can try $scriptOptions.ScriptData = $True and $scriptOptions.ScriptSchema = $False.
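For instance, a minimal data-only variant of the snippet above, reusing $Database from earlier (the output path is arbitrary), might look like this:
# Script INSERT statements only, no CREATE TABLE, for every table in the database
$dataOnly = New-Object -TypeName Microsoft.SqlServer.Management.Smo.ScriptingOptions
$dataOnly.ScriptData = $True
$dataOnly.ScriptSchema = $False
$Database.Tables.EnumScript($dataOnly) | Out-File -FilePath ".\DataOnly.sql"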

How to use SQL Server Agent to run a PowerShell command to generate a CSV file?

I have a PowerShell command like this:
Search-ADAccount -AccountDisabled -UsersOnly | FT Name > C:\Users\hou\Downloads\DisabledAccount.csv
This command grabs all disabled account names from AD and puts them into a .csv file.
I want to set up a job in SQL Server Agent so it will run the command whenever I need it.
But the Agent keeps giving me an error when I try to run the job.
Can anyone let me know the right command for this when running in SQL Server Agent?
You might have other issues, but the glaring first one is that you do not have a CSV file. If you opened the file, you would see that it does not follow any CSV standard; it is just a formatted table.
Search-ADAccount -AccountDisabled -UsersOnly |
Export-CSV -notypeinformation C:\Users\hou\Downloads\DisabledAccount.csv
Format-Table is just for showing information on the screen, not for generating output. If you want to make a CSV, Export-Csv is the cmdlet for the job. If you only want the Name column, you can add a Select-Object to the pipeline.
Search-ADAccount -AccountDisabled -UsersOnly |
Select-Object Name |
Export-CSV -notypeinformation C:\Users\hou\Downloads\DisabledAccount.csv
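As for running it from the Agent, one hedged option is an "Operating system (CmdExec)" job step that calls the full PowerShell host directly; this assumes the Agent service account is allowed to query AD, the ActiveDirectory module is installed on the server, and the account can write to the output path:
powershell.exe -NoProfile -ExecutionPolicy Bypass -Command "Search-ADAccount -AccountDisabled -UsersOnly | Select-Object Name | Export-Csv -NoTypeInformation C:\Users\hou\Downloads\DisabledAccount.csv"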

Exporting data to a CSV file or output via email using an array

I have a PowerShell script which I have written, and it works well. The problem I now have is that the original requirement was to save the results to a database; now I want to email the results as well. I thought about a couple of options but am finding it difficult. An easy way out that I thought of was to export the result to CSV, then attach that CSV to the email.
The code below is inside my loop.
$sql = " INSERT INTO dbo.tb_checks ([ServerName],[Directory],[DirectoryFile] ,[FileCreationDate]) SELECT '$ServerName', '$Filepath', '$fileName', '$FileDate'"
Invoke-Sqlcmd2 -serverinstance $DBServer -database $Database -query $sql
SELECT '$ServerName', '$Filepath', '$fileName', '$FileDate' | Export-csv $Outfilename -append
The CSV file gets generated, but with no data.
Another idea which i thought of was to have the data stored in an array, then loop through/spit out the entire content of the array in an email.
Can someone help, please?
The reason your CSV is empty is that you aren't feeding Export-Csv an object it can work with. What headers would it use in your script? It has no idea; it's just having random stuff thrown at it. Change that last line to this:
New-Object PSObject -Property @{Server=$ServerName;FilePath=$Filepath;FileName=$fileName;FileDate=$FileDate} | Export-Csv $Outfilename -Append -NoTypeInformation
Assuming that your variables are set right it should output the file you want.
If you want to make it a table and put it in an email, make an empty array before your loop, then do something like:
$LoopArray = @()
<start of loop>
$LoopArray += New-Object PSObject -Property @{Server=$ServerName;FilePath=$Filepath;FileName=$fileName;FileDate=$FileDate}
<end of loop>
# Export once, after the loop, so rows are not appended again on every iteration
$LoopArray | Export-Csv $Outfilename -NoTypeInformation
Then afterwards you have the array to work with; it holds all the data that went into the CSV and can be injected into an email.
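For the email itself, one hedged option is to attach the CSV and/or render the array as an HTML body; the SMTP server, addresses and subject below are placeholders:
# Hypothetical sketch: send the collected results after the loop finishes.
$body = $LoopArray | ConvertTo-Html -Property Server, FilePath, FileName, FileDate | Out-String
Send-MailMessage -SmtpServer 'smtp.example.com' -From 'checks@example.com' -To 'dba@example.com' `
    -Subject 'File check results' -Body $body -BodyAsHtml -Attachments $Outfilename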