What's wrong with this PowerShell script? - sql

I am trying to export SQL FILESTREAM data using a PowerShell script, but it exports only up to 8 KB; if the file is larger than 8 KB the export fails, though it still creates the file partially. I don't know what is missing.
$Server = "(local)"; # SQL Server Instance.
$Database = "AdventureWorks";
$Dest = "D:\Export\"; # Path to export to.
$bufferSize = 8192; # Stream buffer size in bytes.
$con = New-Object Data.SqlClient.SqlConnection;
$con.ConnectionString = "Data Source=$Server;" +
"Integrated Security=True;" +
"Initial Catalog=$Database";
$con.Open();
[System.Data.SqlClient.SqlTransaction]$tran = $con.BeginTransaction("fs");
$Sql = "SELECT GET_FILESTREAM_TRANSACTION_CONTEXT()";
$ctx = [array]::CreateInstance('Byte', 16);
$cmdct = New-Object Data.SqlClient.SqlCommand($Sql, $con, $tran);
$ctx = $cmdct.ExecuteScalar();
$cmdct.Dispose();
$Sql = "SELECT [FileName]
,[FileStreamData].PathName()
FROM dbo.FileStreamStorage ";
$out = [array]::CreateInstance('Byte', $bufferSize);
$cmd = New-Object Data.SqlClient.SqlCommand($Sql, $con, $tran);
$rd = $cmd.ExecuteReader();
While ($rd.Read())
{
    Write-Output ("Exporting: {0}" -f $rd.GetString(0));
    $fs = New-Object System.IO.FileStream ($Dest + $rd.GetString(0)), Create, Write;
    $bw = New-Object System.IO.BinaryWriter($fs);
    $sfs = New-Object System.Data.SqlTypes.SqlFileStream $rd.GetString(1), $ctx, Read, None, 0;
    $start = 0;
    While (1 -eq 1)
    {
        $received = $sfs.Read($out, $start, $bufferSize - 1);
        $bw.Write($out, 0, $received);
        $bw.Flush();
        $start += $received;
        If ($received -lt $bufferSize)
        { break; }
    }
    $bw.Close();
    $fs.Close();
    $sfs.Close();
}
$fs.Dispose();
$sfs.Dispose();
$rd.Close();
$rd.Dispose();
$tran.Commit();
$cmd.Dispose();
$tran.Dispose();
$con.Close();
$con.Dispose();
Write-Output ("Finished");
Any help will be really appreciated.

I think you are reading at most 8191 bytes, which means that $received will be 8191 if there's more data than this:
$received = $sfs.Read($out, $start, $bufferSize - 1);
Then you compare $received (8191) to $bufferSize (8192), so the loop breaks after the first read even though more data remains:
If ($received -lt $bufferSize)
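There's a second problem: the offset argument passed to Read is an offset into the $out buffer, not into the file, so the running $start total will eventually push offset + count past the buffer length and throw. A corrected inner loop might look like this (untested sketch); Stream.Read tracks its own file position, so $start isn't needed at all, and the loop simply runs until Read returns 0:
$received = $sfs.Read($out, 0, $bufferSize);
While ($received -gt 0)
{
    $bw.Write($out, 0, $received);
    $received = $sfs.Read($out, 0, $bufferSize);
}
$bw.Flush();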

I've done some work with FILESTREAM and PowerShell and I would agree with the other comments: it appears there's an issue in how you are handling the read length. Here's some code from my blog http://sev17.com/2010/05/11/t-sql-tuesday-006-blobs-filestream-and-powershell/ which demonstrates retrieving FILESTREAM data to a file:
$server = "Z002sql2k8"
$database = "AdventureWorks2008"
$query = "SELECT TOP(10) Document.PathName(), GET_FILESTREAM_TRANSACTION_CONTEXT(), Title + FileExtension AS FileName FROM Production.Document WHERE FileExtension = '.doc'"
$dirPath = "C:\Users\u00\"
$connection=new-object System.Data.SqlClient.SQLConnection
$connection.ConnectionString="Server={0};Database={1};Integrated Security=True" -f $server,$database
$connection.Open()
$command=new-object system.Data.SqlClient.SqlCommand("",$connection)
$command.CommandTimeout=120
$tran = $connection.BeginTransaction([System.Data.IsolationLevel]'ReadCommitted')
$command.Transaction = $tran
$command.CommandText = $query
$reader = $command.ExecuteReader()
while ($reader.Read())
{
$path = $reader.GetString(0)
[byte[]]$transactionContext = $reader.GetSqlBytes(1).Buffer
$filepath = "$dirPath{0}" -f $reader.GetValue(2)
$fileStream = new-object System.Data.SqlTypes.SqlFileStream($path, $transactionContext, [System.IO.FileAccess]'Read', [System.IO.FileOptions]'SequentialScan', 0)
$buffer = new-object byte[] $fileStream.Length
$fileStream.Read($buffer,0,$fileStream.Length)
$fileStream.Close()
[System.IO.File]::WriteAllBytes($filepath,$buffer)
}
$reader.Close()
$tran.Commit()
$connection.Close()
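One caveat I'd add (my note, not from the blog post): this reads the whole file into a single buffer, which can hurt for very large blobs. On .NET 4.0+ a streamed copy avoids that; a minimal sketch reusing the same $path, $transactionContext and $filepath variables:
$fileStream = new-object System.Data.SqlTypes.SqlFileStream($path, $transactionContext, [System.IO.FileAccess]'Read', [System.IO.FileOptions]'SequentialScan', 0)
$outStream = [System.IO.File]::Create($filepath)
$fileStream.CopyTo($outStream) # Stream.CopyTo handles the read/write loop internally
$outStream.Close()
$fileStream.Close()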

Related

Trying to extract data from SQL using PS script

I have been trying to get a PS script to work for extracting files (pdf, word, etc.) from a SQL Server database. I came across the PowerShell script below. The script runs and populates the destination folder, but all files are 0 bytes, and during execution it throws the error:
"Exporting Objects from FILESTREAM container: .docx
Exception calling "GetBytes" with "5" argument(s): "Invalid attempt to GetBytes on column 'extension'. The GetBytes function can only be used on columns of type Text, NText, or Image.""
Can anyone point out what I am doing wrong and how to fix it, please? Much appreciated.
$Server = ".\xxxxxx";
$Database = "xxxxxx";
$Dest = "C:\DATA\";
$bufferSize = 8192;
$Sql = "
SELECT
[extension]
FROM [XXXXXXXX].[dbo].[XXXXXXdocuments]
";
$con = New-Object Data.SqlClient.SqlConnection;
$con.ConnectionString = "Data Source=$Server;" +
"Integrated Security=True;" +
"Initial Catalog=$Database";
$con.Open();
Write-Output ((Get-Date -format yyyy-MM-dd-HH:mm:ss) + ": Started ...");
$cmd = New-Object Data.SqlClient.SqlCommand $Sql, $con;
$cmd.CommandTimeout = 120
$rd = $cmd.ExecuteReader();
$out = [array]::CreateInstance('Byte', $bufferSize)
While ($rd.Read())
{
    try
    {
        Write-Output ("Exporting Objects from FILESTREAM container: {0}" -f $rd.GetString(0));
        $fs = New-Object System.IO.FileStream ($Dest + $rd.GetString(0)), Create, Write;
        $bw = New-Object System.IO.BinaryWriter $fs;
        $start = 0;
        $received = $rd.Getbytes(0, $start, $out, 0, $bufferSize - 1);
        While ($received -gt 0)
        {
            $bw.Write($out, 0, $received);
            $bw.Flush();
            $start += $received;
            $received = $rd.Getbytes(0, $start, $out, 0, $bufferSize - 1);
        }
        $bw.Close();
        $fs.Close();
    }
    catch
    {
        Write-Output ($_.Exception.Message)
    }
    finally
    {
        $fs.Dispose();
    }
}
$rd.Close();
$cmd.Dispose();
$con.Close();
Write-Output ("Finished");
Read-Host -Prompt "Press Enter to exit"
BinaryWriter is unnecessary. It's for writing primitive types to a Stream.
And there's no need to muck around with buffers; you can simply use SqlDataReader.GetStream(int).CopyTo(Stream), e.g.:
$Server = "localhost";
$Database = "adventureworks2017";
$Dest = "C:\temp\";
$Sql = "
SELECT concat('photo', ProductPhotoID, '.jpg') name, LargePhoto from Production.ProductPhoto
";
$con = New-Object Data.SqlClient.SqlConnection;
$con.ConnectionString = "Data Source=$Server;Integrated Security=True;Initial Catalog=$Database;TrustServerCertificate=true";
$con.Open();
Write-Output ((Get-Date -format yyyy-MM-dd-HH:mm:ss) + ": Started ...");
$cmd = New-Object Data.SqlClient.SqlCommand $Sql, $con;
$cmd.CommandTimeout = 120
$rd = $cmd.ExecuteReader();
While ($rd.Read())
{
    try
    {
        Write-Output ("Exporting: {0}" -f $rd.GetString(0));
        $fs = New-Object System.IO.FileStream ($Dest + $rd.GetString(0)), Create, Write;
        $rd.GetStream(1).CopyTo($fs)
        $fs.Close()
    }
    catch
    {
        Write-Output ($_.Exception.Message)
    }
    finally
    {
        $fs.Dispose();
    }
}
$rd.Close();
$cmd.Dispose();
$con.Close();
Write-Output ("Finished");

Sending the results from a ForEach loop containing the Same SQL Query to 2 separate variables via PowerShell

Running this query in SQL Server returns 3 rows of data. Running the script with write-host $1_resultsDataTable active (and the other variable, $2_resultsDataTable, commented out) returns only one row of the data array. If I reverse the comments so $2_resultsDataTable is active for the write-host, it returns 6 rows of data.
How do I set this up so that the same 3 rows are assigned to both $1_resultsDataTable and $2_resultsDataTable when I dump these variables to view the results?
[string] $Server= "SERVER"
[string] $Database = "mvTest"
[string] $UserSqlQuery= $("select m.created_date, m.additional_data as ReasonDown from aeroscout.mv_audit m where m.created_date >= '2020-01-18' and m.additional_data like '%query-text%'")
#
$1_resultsDataTable, $2_resultsDataTable = foreach ($x in 1..2) {
    $resultsDataTable = New-Object System.Data.DataTable
    $resultsDataTable = ExecuteSqlQuery $Server $Database $UserSqlQuery
    $resultsDataTable # first loop sends output to $1_resultsDataTable, second loop sends to $2_resultsDataTable
    Start-Sleep 3
}
# executes a query and populates the $datatable with the data
function ExecuteSqlQuery ($Server, $Database, $SQLQuery) {
    $Datatable = New-Object System.Data.DataTable
    $Connection = New-Object System.Data.SQLClient.SQLConnection
    $Connection.ConnectionString = "server='$Server';database='$Database';Integrated Security=True;"
    $Connection.Open()
    $Command = New-Object System.Data.SQLClient.SQLCommand
    $Command.Connection = $Connection
    $Command.CommandText = $SQLQuery
    $Reader = $Command.ExecuteReader()
    If ($Reader.HasRows) {
        while($Reader.Read()) {
            $props = @{}
            for($i = 0; $i -lt $Reader.FieldCount; $i+=1) {
                $name = $Reader.GetName($i)
                $value = $Reader.item($i)
                $props.Add($name, $value)
            }
            $obj = new-object PSObject -Property $props
            Write-Output $obj
        }
    }
    return $obj
    $SqlConnection.Close()
}
#validate we got data
write-host $1_resultsDataTable
Start-Sleep 3
write-host $2_resultsDataTable
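What's likely happening (a sketch of the cause, not tested against your database): the foreach writes all of its output into one flattened stream, and the multiple assignment $1_resultsDataTable, $2_resultsDataTable = ... puts the first object into the first variable and everything else into the second. On top of that, the stray return $obj after the while loop emits the last row an extra time, and ExecuteSqlQuery is defined after the line that first calls it. One way to get the same rows in both variables is to wrap each iteration's result set in an array so each loop pass yields exactly one element:
# sketch: define ExecuteSqlQuery earlier in the script and remove the 'return $obj' line first
$1_resultsDataTable, $2_resultsDataTable = foreach ($x in 1..2) {
    , @(ExecuteSqlQuery $Server $Database $UserSqlQuery) # the leading comma emits the whole result set as one array element
    Start-Sleep 3
}
Or, if both variables are meant to hold identical data, simply run the query once and assign the result to both variables.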

Following SQL Query, help needed to return records out of ForEach

[string] $Server= 'NERD\PAULSDB'
[string] $Database = "myPhotos"
$ShootDate= Get-Content -Path W:\W-SQL\PS\ShootDate.txt
ForEach ($S_Date in $ShootDate){
    $SqlQuery=@"
SELECT * FROM Target.U_PhotoYears where StrmydateTaken= '$S_Date'
"@
    $resultsDataTable = ExecuteSqlQuery $Server $Database $SqlQuery
    function ExecuteSqlQuery ($Server, $Database, $SQLQuery) {
        $Datatable = New-Object System.Data.DataTable
        $Connection = New-Object System.Data.SQLClient.SQLConnection
        $Connection.ConnectionString = "server='$Server';database='$Database';Integrated Security=true;"
        $Connection.Open()
        $Command = New-Object System.Data.SQLClient.SQLCommand
        $Command.Connection = $Connection
        $Command.CommandText = $SQLQuery
        $Reader = $Command.ExecuteReader()
        $Datatable.Load($Reader)
        $Connection.Close()
        return $Datatable
    }
    $data=$(foreach($a in $resultsDataTable)
    {
        $a.P_Filename+$a.myYearTaken
    })
    $data
}
The text file has a distinct list of dates from all photo shoots. The first ForEach appears to be retrieving data OK: the PowerShell script queries the myPhotos db based on values from the text file, and the query returns the correct records for a given date.
However, I can't get the records out of the second foreach to Export-Csv.
Any suggestions would be really appreciated.
I think you should use SqlBulkCopy to send your data into a temporary table; it's much better for performance (you do only 3 actions against the database instead of X, where X is the number of dates in your file).
Try this (not tested):
$Connection=$null
$dttemporary=$null
$sqlBulkCopy=$null
$FinalData=$null
$Command=$null
$Reader=$null
$ErrorActionPreference = "Stop" # stop the script on the first error
try
{
    # create a datatable to hold the dates being sent
    $dttemporary = New-Object System.Data.Datatable
    [void]$dttemporary.Columns.Add("mydate")
    # add the data to the datatable
    Get-Content -Path W:\W-SQL\PS\ShootDate.txt | %{
        $NewRow=$dttemporary.NewRow()
        $NewRow["mydate"]=$_
        $dttemporary.Rows.Add($NewRow)
    }
    # create the connection
    $Connection = New-Object System.Data.SQLClient.SQLConnection
    $Connection.ConnectionString = "server='$Server';database='$Database';Integrated Security=true;"
    # create the bulk copy structure used to send all the data
    $sqlBulkCopy = New-Object ("Data.SqlClient.SqlBulkCopy") -ArgumentList $Connection
    $sqlBulkCopy.DestinationTableName = "#TmpTable"
    # create a datatable for the final data
    $FinalData = New-Object System.Data.DataTable
    # create the command
    $Command = $Connection.CreateCommand()
    $Connection.Open()
    # create the temporary table (it exists while the connection stays open)
    $Command.CommandText = "CREATE TABLE #TmpTable(mydate varchar(30) NOT NULL)"
    $Command.ExecuteNonQuery()
    # send the data to the temporary table with bulk copy
    $sqlBulkCopy.WriteToServer($dttemporary)
    # query to get the final data
    $Command.CommandText = "SELECT * FROM Target.U_PhotoYears f1 inner join #TmpTable f2 on f1.StrmydateTaken=f2.mydate"
    $Reader = $Command.ExecuteReader()
    $FinalData.Load($Reader)
    $Connection.Close()
    # loop over the data for processing
    $FinalData | %{
        $_["P_Filename"].ToString() + $_["StrmydateTaken"].ToString()
    }
}
catch
{
    $_.Exception.Message
}
finally
{
    if ($Connection)
    {
        $Connection.Close()
        $Connection.Dispose()
    }
    if ($dttemporary) {$dttemporary.Dispose()}
    if ($sqlBulkCopy) {$sqlBulkCopy.Dispose()}
    if ($FinalData) {$FinalData.Dispose()}
    if ($Command) {$Command.Dispose()}
    if ($Reader) {$Reader.Dispose()}
}
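To get the results into a CSV rather than concatenated strings, the final loop could emit objects and pipe them to Export-Csv. A small sketch; the output path is made up, and it assumes the joined rows expose P_Filename and myYearTaken columns:
$FinalData | %{
    [pscustomobject]@{
        FileName  = $_["P_Filename"].ToString()
        YearTaken = $_["myYearTaken"].ToString()
    }
} | Export-Csv -Path "W:\W-SQL\PS\photos.csv" -NoTypeInformation # hypothetical output path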

Building batch insert statement powershell to sql

I have a powershell script that writes every file and its attributes recursively, starting from a specific directory. This works, but the directories could have as many as 1,000,000 files. What I want to do is batch them at 1000 inserts per transaction. Here is the original PS:
$server = ""
$Database = ""
$Path = "C:\Test"
$Connection = New-Object System.Data.SQLClient.SQLConnection
$Connection.ConnectionString = "server='$Server';database='$Database';trusted_connection=true;"
$Connection.Open()
$Command = New-Object System.Data.SQLClient.SQLCommand
$Command.Connection = $Connection
foreach($file in Get-ChildItem -Verbose -Recurse -Path $Path | Select-Object Name,Length,Mode, Directory,CreationTime, LastAccessTime, LastWriteTime) {
    $fileName = $file.Name
    $fileSize = ([int]$file.Length)
    $fileMode = $file.Mode
    $fileDirectory = $file.Directory
    $fileCreationTime = [datetime]$file.CreationTime
    $fileLastAccessTime = [datetime]$file.LastAccessTime
    $fileLastWriteTime = [datetime]$file.LastWriteTime
    $sql = "
begin
insert TestPowerShell
select '$fileName', '$fileSize', '$fileMode', '$fileDirectory', '$fileCreationTime', '$fileLastAccessTime', '$fileLastWriteTime'
end
"
    $Command.CommandText = $sql
    echo $sql
    $Command.ExecuteNonQuery()
}
$Connection.Close()
My thoughts are to implement some sort of counter that will keep appending the insert until it reaches 1000 and then jump out of the loop and execute. I cannot figure out with this current setup how to batch at 1000, execute and then pick back up with the get-childitem loop.
Something like this should do:
function Execute-SqlQuery($query){
    Write-Host "Executing query:"
    Write-Host $query;
}
$data = @(1,2,3,4,5,6,7,8,9,10,11);
$batchSize = 2;
$counter = 0;
$sql = "";
foreach($item in $data){
    if($counter -eq $batchSize){
        Execute-SqlQuery $sql;
        $counter = 0;
        $sql = "";
    }
    $sql += "insert into myTable(id) values($item) `n";
    $counter += 1;
}
if($sql -ne ""){ Execute-SqlQuery $sql; } # guard so an empty batch isn't executed
$server = ""
$Database = ""
$Path = "C:\Test"
$Connection = New-Object System.Data.SQLClient.SQLConnection
$Connection.ConnectionString = "server='$Server';database='$Database';trusted_connection=true;"
$Connection.Open()
$Command = New-Object System.Data.SQLClient.SQLCommand
$Command.Connection = $Connection
# new variables to handle batching
$batchcounter=0
$batchsize=1000
$sqlValues = New-Object Collections.ArrayList
foreach($file in Get-ChildItem -Verbose -Recurse -Path $Path | Select-Object Name,Length,Mode, Directory,CreationTime, LastAccessTime, LastWriteTime) {
    $fileName = $file.Name
    $fileSize = ([int]$file.Length)
    $fileMode = $file.Mode
    $fileDirectory = $file.Directory
    $fileCreationTime = [datetime]$file.CreationTime
    $fileLastAccessTime = [datetime]$file.LastAccessTime
    $fileLastWriteTime = [datetime]$file.LastWriteTime
    $sqlValues.Add("('$fileName', '$fileSize', '$fileMode', '$fileDirectory', '$fileCreationTime', '$fileLastAccessTime', '$fileLastWriteTime')")
    $batchcounter++
    # if the counter hits batchsize, run the insert, using lots of:
    # insert into table
    # values (1,2,3)
    # , (4,5,6)
    # , (7,8,9)
    if ($batchcounter % $batchsize -eq 0) {
        $sql = "insert TestPowerShell values {0}" -f ($sqlValues.ToArray() -join "`r`n,")
        $Command.CommandText = $sql
        Write-Host $sql
        $Command.ExecuteNonQuery()
        $sqlValues.Clear()
    }
}
# catch any remaining files (check $sqlValues rather than $batchcounter, so an
# exact multiple of $batchsize doesn't produce an empty INSERT)
if ($sqlValues.Count -gt 0) {
    $sql = "insert TestPowerShell values {0}" -f ($sqlValues.ToArray() -join "`r`n,")
    $Command.CommandText = $sql
    Write-Host $sql
    $Command.ExecuteNonQuery()
    $sqlValues.Clear()
}
$Connection.Close()
For anyone interested - this is one way to do it:
function WriteBatch {
    echo $sql
    $Command.CommandText = $sql
    $Command.ExecuteNonQuery()
}
$server = ""
$Database = ""
$Path = ""
$Counter = 0
$Connection = New-Object System.Data.SQLClient.SQLConnection
$Connection.ConnectionString = "server='$Server';database='$Database';trusted_connection=true;"
$Connection.Open()
$Command = New-Object System.Data.SQLClient.SQLCommand
$Command.Connection = $Connection
[string]$sql = "
begin
insert into TestPowerShell(NameString, FileSize, Mode, Directory, CreationTime, LastAccessTime, LastWriteTime)
values "
foreach($file in Get-ChildItem -Verbose -Recurse -Path $Path | Select-Object Name, Length, Mode, Directory, CreationTime, LastAccessTime, LastWriteTime) {
    $fileName = $file.Name
    $fileSize = ([int]$file.Length)
    $fileMode = $file.Mode
    $fileDirectory = $file.Directory
    $fileCreationTime = [datetime]$file.CreationTime
    $fileLastAccessTime = [datetime]$file.LastAccessTime
    $fileLastWriteTime = [datetime]$file.LastWriteTime
    $sql = $sql + "('$fileName', '$fileSize', '$fileMode', '$fileDirectory', '$fileCreationTime', '$fileLastAccessTime', '$fileLastWriteTime'),"
    $sql += "`n"
    $Counter++
    If($Counter -eq 900) {
        $sql = $sql.Trim().Trim(',')
        $sql = $sql + " End"
        WriteBatch
        $Counter = 0
        $sql = "
begin
insert into TestPowerShell(NameString, FileSize, Mode, Directory, CreationTime, LastAccessTime, LastWriteTime)
values "
    }
}
if ($Counter -gt 0){
    $sql = $sql.Trim().Trim(',')
    $sql = $sql + " End"
    WriteBatch
}
$Connection.Close()
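A side note: batched INSERT statements still ship a lot of SQL text to the server. At the 1,000,000-file scale, System.Data.SqlClient.SqlBulkCopy (used above for the temp-table answer) is usually faster. A rough sketch, assuming the TestPowerShell columns line up ordinally and that only files (not directories) should be recorded:
# build a DataTable mirroring the columns used above
$dt = New-Object System.Data.DataTable
[void]$dt.Columns.Add("NameString")
[void]$dt.Columns.Add("FileSize", [int])
[void]$dt.Columns.Add("Mode")
[void]$dt.Columns.Add("Directory")
[void]$dt.Columns.Add("CreationTime", [datetime])
[void]$dt.Columns.Add("LastAccessTime", [datetime])
[void]$dt.Columns.Add("LastWriteTime", [datetime])
Get-ChildItem -Recurse -File -Path $Path | %{
    [void]$dt.Rows.Add($_.Name, [int]$_.Length, $_.Mode, $_.DirectoryName, $_.CreationTime, $_.LastAccessTime, $_.LastWriteTime)
}
$bulk = New-Object Data.SqlClient.SqlBulkCopy($Connection) # $Connection must already be open
$bulk.DestinationTableName = "TestPowerShell"
$bulk.BatchSize = 1000
$bulk.WriteToServer($dt)
$bulk.Dispose()
Building the whole DataTable does hold every row in memory; WriteToServer also accepts an IDataReader if that becomes a problem.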

Formatting a text file when importing into SQL

I can't seem to get a txt file to import correctly to a sql table. Here is a sample from my txt file:
Split/Skill:;File
;Agent Name;Login ID;Extn;AUX Reason;State;Split/Skill;Time;VDN Name
2;Smith, Joe;13429;64629;;AVAIL;0;93;
2;Gates, Bill;13458;64658;;AVAIL;0;85;
First I need to ignore the first line; the second line will be the column names. Then I would like it to treat the line breaks as new rows and the semicolons as new columns.
Here is as close as I could get:
$location = "path"
$file = "file"
$extension = ".txt"
$full = $location + $file + $extension
$all = Get-Content $full
$columns = Get-Content $full
$columns = $columns.Replace("`n`r",",")
$table = "CREATE TABLE " + $file + "([" + $columns + "] VARCHAR(255))"
Write-Host $table
$Connection = New-Object System.Data.SqlClient.SqlConnection
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$Connection.ConnectionString = "Server=server;Database=db;Integrated Security=True"
$SqlCmd.CommandText = $table
$SqlCmd.Connection = $connection
$Connection.Open()
$sqlCmd.ExecuteNonQuery()
$Connection.Close()
Basically, the final output should have one table row per line of the file, with one column per semicolon-delimited field.
Any advice is appreciated!
Lots of assumptions here, including assuming the file is trusted. Otherwise, there is a HUGE SQL injection vulnerability...
$location = "path"
$file = "file"
$extension = ".txt"
$full = $location + $file + $extension
$contents = (cat $full) -split "`r`n" | select -Skip 1
$columns = $contents | select -First 1 | % { $_ -split ";" } | % { if($_-eq ''){' '}else{$_} } | % { "[$_] VARCHAR(255)" }
$columns = $columns -join ","
$create = "CREATE TABLE [$file] ($columns)"
$rows = $contents | select -Skip 1 | % { ($_ -split ";" | % { "'$_'" }) -join "," }
$insert = $rows | % { "INSERT INTO [$file] VALUES($_)" }
$command = (#($create) + $insert) -join [Environment]::NewLine
$Connection = New-Object System.Data.SqlClient.SqlConnection
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$Connection.ConnectionString = "Server=server;Database=db;Integrated Security=True"
$SqlCmd.CommandText = $command
$SqlCmd.Connection = $connection
$Connection.Open()
$sqlCmd.ExecuteNonQuery()
$Connection.Close()
It first reads the file (skipping the first line) into $contents. Then, it gets the column definitions from the second line, splitting on ;. It replaces any empty string with a single space. It then takes the remaining lines as the insert statements, again splitting on ;, and joins them to the create statement as a single command.
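If the file can't be trusted, a parameterized variant removes the injection risk from the row values (the column names still come from the file, so they would need separate validation). A rough sketch under the same assumptions, reusing $contents and $file from above and assuming the CREATE TABLE statement has already run:
$header = $contents | select -First 1 | % { $_ -split ";" }
$paramNames = 0..($header.Count - 1) | % { "@p$_" }
$Connection = New-Object System.Data.SqlClient.SqlConnection
$Connection.ConnectionString = "Server=server;Database=db;Integrated Security=True"
$Connection.Open()
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$SqlCmd.Connection = $Connection
$SqlCmd.CommandText = "INSERT INTO [$file] VALUES (" + ($paramNames -join ",") + ")"
$paramNames | % { [void]$SqlCmd.Parameters.Add($_, [System.Data.SqlDbType]::VarChar, 255) }
$contents | select -Skip 1 | % {
    $values = $_ -split ";" # assumes each data line has as many fields as the header
    for ($i = 0; $i -lt $paramNames.Count; $i++) { $SqlCmd.Parameters[$i].Value = $values[$i] }
    [void]$SqlCmd.ExecuteNonQuery()
}
$Connection.Close()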