I have multiple text files which need to be imported into SQL Server (via SQL Server Management Studio). I have written PowerShell code to import certain files, but one file ends with a row containing only '---------------------------------------'.
For example
test|test2|test3
A|B|C
Q|W|E
'-----------'
Is there a way to ignore or skip this row? I already tried Select-Object -SkipLast 1:
Function AutoImportCommaFlatFiles($location, $file, $extension, $server, $database)
{
$full = $location + $file + $extension
## $columns = Get-Content $full | Select-Object -skip 1 | set-Content $full
$all = Get-Content $full | select-Object -skip 3
$columns = $all[0]
$columns = $columns.Replace(" ","")
$columns = $columns.Replace("||","Column Empty|Column Empty 2|")
## a leading '|' means the first column name is empty; note .Replace("", ...) would throw an ArgumentException
if ($columns.StartsWith("|")) { $columns = "Column Empty 3" + $columns }
$columns = $columns.TrimEnd('|')
$columns = $columns.Replace("|","] VARCHAR(100), [")
$table = "CREATE TABLE " + $file + "([" + $columns + "] VARCHAR(100))"
$connection = New-Object System.Data.SqlClient.SqlConnection
$buildTable = New-Object System.Data.SqlClient.SqlCommand
$insertData = New-Object System.Data.SqlClient.SqlCommand
$connection.ConnectionString = "Data Source=" + $server + ";Database=" + $database + ";integrated security=true"
$buildTable.CommandText = $table
$buildTable.Connection = $connection
## Added to function;
$x = 0
$insertData.CommandText = "EXECUTE stp_CommaBulkInsert @1, @2"
$insertData.Parameters.Add("@1", $full)
$insertData.Parameters.Add("@2", $file)
$insertData.Connection = $connection
$connection.Open()
$buildTable.ExecuteNonQuery()
$connection.Close()
## Added to function
$x = 1
if ($x -eq 1)
{
$connection.Open()
$insertData.ExecuteNonQuery()
$connection.Close()
}
}
and also like this:
(Get-Content $full | Select-Object -skiplast 1) | set-Content $full
$all = Get-Content $full
Why don't you use the Import-Csv cmdlet? Something like this:
$all = Import-Csv c:\temp\result.csv -Delimiter '|' -Header col1, col2, col3 | Where-Object col2 -ne $null
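Alternatively, a regex filter drops the dashed trailer row wherever it appears; a sketch against the sample lines from the question:

```powershell
# Sample lines standing in for Get-Content $full
$lines = @(
    'test|test2|test3',
    'A|B|C',
    'Q|W|E',
    "'-----------'"
)

# Drop any row that is only dashes (optionally wrapped in quotes);
# unlike Select-Object -SkipLast, this doesn't require PowerShell 5.0+
$clean = $lines | Where-Object { $_ -notmatch "^'?-+'?$" }
```

This also survives files where the dashed row is missing entirely, which -SkipLast 1 would not.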
Running this query in SQL Server returns 3 rows of data. Running the script with write-host $1_resultsDataTable (and the $2_resultsDataTable line commented out) returns only one row of the data array. If I reverse the comments so $2_resultsDataTable is the one printed, it returns 6 rows of data.
How do I set this up so that the same 3 rows are assigned to both $1_resultsDataTable and $2_resultsDataTable when I dump these variables to view the data?
[string] $Server= "SERVER"
[string] $Database = "mvTest"
[string] $UserSqlQuery= $("select m.created_date, m.additional_data as ReasonDown from aeroscout.mv_audit m where m.created_date >= '2020-01-18' and m.additional_data like '%query-text%'")
#
$1_resultsDataTable, $2_resultsDataTable = foreach ($x in 1..2) {
$resultsDataTable = New-Object System.Data.DataTable
$resultsDataTable = ExecuteSqlQuery $Server $Database $UserSqlQuery
$resultsDataTable # first loop sends output to $1_resultsDataTable, second loop send to $2_resultsDataTable
Start-Sleep 3
}
# executes a query and emits one object per row
# (note: in a script, this function must be defined before the loop above calls it)
function ExecuteSqlQuery ($Server, $Database, $SQLQuery) {
$Datatable = New-Object System.Data.DataTable
$Connection = New-Object System.Data.SQLClient.SQLConnection
$Connection.ConnectionString = "server='$Server';database='$Database';Integrated Security=True;"
$Connection.Open()
$Command = New-Object System.Data.SQLClient.SQLCommand
$Command.Connection = $Connection
$Command.CommandText = $SQLQuery
$Reader = $Command.ExecuteReader()
If ($Reader.HasRows) {
while($Reader.Read()) {
$props = @{}
for($i = 0; $i -lt $Reader.FieldCount; $i+=1) {
$name = $Reader.GetName($i)
$value = $Reader.item($i)
$props.Add($name, $value)
}
$obj = new-object PSObject -Property $props
Write-Output $obj
}
}
## rows were already emitted via Write-Output; a 'return $obj' here would append only the last row again
$Reader.Close()
$Connection.Close()
}
#validate we got data
write-host $1_resultsDataTable
Start-Sleep 3
write-host $2_resultsDataTable
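If the goal is simply the same 3 rows in both variables, one approach is to execute the query once and assign the result to both, instead of looping twice; a sketch, with a hypothetical stub standing in for ExecuteSqlQuery:

```powershell
# Hypothetical stub so the pattern can be shown without a database;
# the real script would call its ExecuteSqlQuery function here
function ExecuteSqlQuery { 1..3 | ForEach-Object { [pscustomobject]@{ Row = $_ } } }

# Execute once; wrap in @() so a single row still becomes an array
$results = @(ExecuteSqlQuery)
$1_resultsDataTable = $results
$2_resultsDataTable = $results
```

Both variables now reference the identical result set, so dumping either shows the same rows.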
I have a powershell script that writes every file and its attributes recursively starting from a specific directory. This works but the directories could have as many as 1,000,000 files. What I want to do is batch them at 1000 inserts per transaction. Here is the original PS:
$server = ""
$Database = ""
$Path = "C:\Test"
$Connection = New-Object System.Data.SQLClient.SQLConnection
$Connection.ConnectionString = "server='$Server';database='$Database';trusted_connection=true;"
$Connection.Open()
$Command = New-Object System.Data.SQLClient.SQLCommand
$Command.Connection = $Connection
foreach($file in Get-ChildItem -Verbose -Recurse -Path $Path | Select-Object Name,Length,Mode, Directory,CreationTime, LastAccessTime, LastWriteTime) {
$fileName = $file.Name
$fileSize = ([int]$file.Length)
$fileMode = $file.Mode
$fileDirectory = $file.Directory
$fileCreationTime = [datetime]$file.CreationTime
$fileLastAccessTime = [datetime]$file.LastAccessTime
$fileLastWriteTime = [datetime]$file.LastWriteTime
$sql = "
begin
insert TestPowerShell
select '$fileName', '$fileSize', '$fileMode', '$fileDirectory', '$fileCreationTime', '$fileLastAccessTime', '$fileLastWriteTime'
end
"
$Command.CommandText = $sql
echo $sql
$Command.ExecuteNonQuery()
}
$Connection.Close()
My thoughts are to implement some sort of counter that will keep appending the insert until it reaches 1000 and then jump out of the loop and execute. I cannot figure out with this current setup how to batch at 1000, execute and then pick back up with the get-childitem loop.
Something like this should do:
function Execute-SqlQuery($query){
Write-Host "Executing query:"
Write-Host $query;
}
$data = @(1,2,3,4,5,6,7,8,9,10,11);
$batchSize = 2;
$counter = 0;
$sql = "";
foreach($item in $data){
if($counter -eq $batchSize){
Execute-SqlQuery $sql;
$counter = 0;
$sql = "";
}
$sql += "insert into myTable(id) values($item) `n";
$counter += 1;
}
if ($sql -ne "") { Execute-SqlQuery $sql; }  # guard against an empty final batch when the item count is an exact multiple of the batch size
$server = ""
$Database = ""
$Path = "C:\Test"
$Connection = New-Object System.Data.SQLClient.SQLConnection
$Connection.ConnectionString = "server='$Server';database='$Database';trusted_connection=true;"
$Connection.Open()
$Command = New-Object System.Data.SQLClient.SQLCommand
$Command.Connection = $Connection
# new variables to handle batching
$batchcounter=0
$batchsize=1000
$sqlValues = New-Object Collections.ArrayList
foreach($file in Get-ChildItem -Verbose -Recurse -Path $Path | Select-Object Name,Length,Mode, Directory,CreationTime, LastAccessTime, LastWriteTime) {
$fileName = $file.Name
$fileSize = ([int]$file.Length)
$fileMode = $file.Mode
$fileDirectory = $file.Directory
$fileCreationTime = [datetime]$file.CreationTime
$fileLastAccessTime = [datetime]$file.LastAccessTime
$fileLastWriteTime = [datetime]$file.LastWriteTime
# [void] stops ArrayList.Add's return value (the index) leaking into the output
[void]$sqlValues.Add("('$fileName', '$fileSize', '$fileMode', '$fileDirectory', '$fileCreationTime', '$fileLastAccessTime', '$fileLastWriteTime')")
$batchcounter++
# if the counter hits batchsize, run the insert, using lots of:
# insert into table
# values (1,2,3)
# , (4,5,6)
# , (7,8,9)
if ($batchcounter % $batchsize -eq 0) {
$sql = "insert TestPowerShell values {0}" -f ($sqlValues.ToArray() -join "`r`n,")
$Command.CommandText = $sql
Write-Host $sql
$Command.ExecuteNonQuery()
$sqlValues.Clear()
}
}
# catch any remaining files (checking the list, not the counter, avoids an
# empty INSERT when the file count is an exact multiple of the batch size)
if ($sqlValues.Count -gt 0) {
$sql = "insert TestPowerShell values {0}" -f ($sqlValues.ToArray() -join "`r`n,")
$Command.CommandText = $sql
Write-Host $sql
$Command.ExecuteNonQuery()
$sqlValues.Clear()
}
$Connection.Close()
For anyone interested - this is one way to do it:
function WriteBatch {
echo $sql
$Command.CommandText = $sql
$Command.ExecuteNonQuery()
}
$server = ""
$Database = ""
$Path = ""
$Counter = 0
$Connection = New-Object System.Data.SQLClient.SQLConnection
$Connection.ConnectionString = "server='$Server';database='$Database';trusted_connection=true;"
$Connection.Open()
$Command = New-Object System.Data.SQLClient.SQLCommand
$Command.Connection = $Connection
[string]$sql = "
begin
insert into TestPowerShell(NameString, FileSize, Mode, Directory, CreationTime, LastAccessTime, LastWriteTime)
values "
foreach($file in Get-ChildItem -Verbose -Recurse -Path $Path | Select-Object Name, Length, Mode, Directory, CreationTime, LastAccessTime, LastWriteTime) {
$fileName = $file.Name
$fileSize = ([int]$file.Length)
$fileMode = $file.Mode
$fileDirectory = $file.Directory
$fileCreationTime = [datetime]$file.CreationTime
$fileLastAccessTime = [datetime]$file.LastAccessTime
$fileLastWriteTime = [datetime]$file.LastWriteTime
$sql = $sql + "('$fileName', '$fileSize', '$fileMode', '$fileDirectory', '$fileCreationTime', '$fileLastAccessTime', '$fileLastWriteTime'),"
$sql += "`n"
$Counter++
# flush at 900 rows: SQL Server caps a single VALUES list at 1000 rows
If($Counter -eq 900) {
$sql = $sql.Trim().Trim(',')
$sql = $sql + " End"
WriteBatch
$Counter = 0
$sql = "
begin
insert into TestPowerShell(NameString, FileSize, Mode, Directory, CreationTime, LastAccessTime, LastWriteTime)
values "
}
}
if ($Counter -gt 0){
$sql = $sql.Trim().Trim(',')
$sql = $sql + " End"
WriteBatch
}
$Connection.Close()
I can't seem to get a txt file to import correctly into a SQL table. Here is a sample from my txt file:
Split/Skill:;File
;Agent Name;Login ID;Extn;AUX Reason;State;Split/Skill;Time;VDN Name
2;Smith, Joe;13429;64629;;AVAIL;0;93;
2;Gates, Bill;13458;64658;;AVAIL;0;85;
First I need to ignore the first line; the second line holds the column names. Then I would like line breaks treated as new rows and the semicolons as column separators.
Here is as close as I could get:
$location = "path"
$file = "file"
$extension = ".txt"
$full = $location + $file + $extension
$all = Get-Content $full
$columns = Get-Content $full
## Get-Content returns one string per line, so join the lines instead of replacing line breaks
$columns = $columns -join ","
$table = "CREATE TABLE " + $file + "([" + $columns + "] VARCHAR(255))"
Write-Host $table
$Connection = New-Object System.Data.SqlClient.SqlConnection
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$Connection.ConnectionString = "Server=server;Database=db;Integrated Security=True"
$SqlCmd.CommandText = $table
$SqlCmd.Connection = $connection
$Connection.Open()
$sqlCmd.ExecuteNonQuery()
$Connection.Close()
Basically, I want the final output to look like this:
Any advice is appreciated!
Lots of assumptions here, including assuming the file is trusted. Otherwise, there is a HUGE SQL injection vulnerability...
$location = "path"
$file = "file"
$extension = ".txt"
$full = $location + $file + $extension
$contents = (cat $full) -split "`r`n" | select -Skip 1
$columns = $contents | select -First 1 | % { $_ -split ";" } | % { if ($_ -eq '') { ' ' } else { $_ } } | % { "[$_] VARCHAR(255)" }
$columns = $columns -join ","
$create = "CREATE TABLE [$file] ($columns)"
$rows = $contents | select -Skip 1 | % { ($_ -split ";" | % { "'$_'" }) -join "," }
$insert = $rows | % { "INSERT INTO [$file] VALUES($_)" }
$command = (@($create) + $insert) -join [Environment]::NewLine
$Connection = New-Object System.Data.SqlClient.SqlConnection
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$Connection.ConnectionString = "Server=server;Database=db;Integrated Security=True"
$SqlCmd.CommandText = $command
$SqlCmd.Connection = $connection
$Connection.Open()
$sqlCmd.ExecuteNonQuery()
$Connection.Close()
It first reads the file (skipping the first line) into $contents. Then, it gets the column definitions from the second line, splitting on ;. It replaces any empty string with a single space. It then takes the remaining lines as the insert statements, again splitting on ;, and joins them to the create statement as a single command.
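If the manual splitting feels brittle, ConvertFrom-Csv can do the same parsing; a sketch against the sample above (the 'Blank' name for the empty leading column is an arbitrary placeholder):

```powershell
# Sample lines standing in for Get-Content on the txt file
$raw = @(
    'Split/Skill:;File',
    ';Agent Name;Login ID;Extn;AUX Reason;State;Split/Skill;Time;VDN Name',
    '2;Smith, Joe;13429;64629;;AVAIL;0;93;',
    '2;Gates, Bill;13458;64658;;AVAIL;0;85;'
)

# Take the header from the second line, giving the empty leading column a placeholder name
$header = ($raw[1] -split ';') | ForEach-Object { if ($_ -eq '') { 'Blank' } else { $_ } }

# Parse everything after the two header lines, with ';' as the delimiter
$rows = $raw | Select-Object -Skip 2 | ConvertFrom-Csv -Delimiter ';' -Header $header
```

Each row then exposes the columns as properties, e.g. $rows[0].'Agent Name', which makes building the INSERT statements less error-prone.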
I have a small PowerShell script that is meant to get the column ServerName from a remote SQL database called Hal0Test, from the table ServerList. However, I cannot figure out this PowerShell error.
Newest Code:
Write-Output " `n Start of Hal0 `n";
$connectionString = "Server=QAUTILITYDB01;Database=Hal0Test;Integrated Security=True;"
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$connection.Open()
$command = $connection.CreateCommand()
$ServerArray = [System.Collections.ArrayList]@()
$query = "SELECT ServerName FROM ServerList"
$command.CommandText = $query
$ServerNames = $command.ExecuteReader()
$table = new-object "System.Data.DataTable"
$table.Load($ServerNames)
$ServerArray = $table | select -Expand ServerName
$ServerArray | ForEach-Object {
# $Server returns each server name
$os = Get-WmiObject -Class Win32_OperatingSystem -Computer $_
$disks = Get-WmiObject -Class Win32_LogicalDisk -Computer $_ |
Where-Object {$_.DriveType -eq 3} |
ForEach-Object {
'{0} {1:D} MB Free/{2:D} MB Used' -f $_.DeviceID,
[int]($_.FreeSpace/1MB), [int]($_.Size/1MB)
}
New-Object -Type PSCustomObject -Property @{
'FQDN' = $_
'ServerName' = $os.PSComputerName
'OperatingSystem' = $os.Caption
'Disks' = $disks -join ' | '
}
$command.CommandText = "UPDATE ServerList SET FQDN = '$_', OS = '$($os.Caption)' WHERE ServerName = '$($os.PSComputerName)';"
$result = $command.ExecuteNonQuery()
} | Export-Csv 'C:\Users\king\Desktop\HalO\output.csv' -Delimiter '|' -NoType
Write-Output "`n End of Hal0";
SQL Table:
You changed my ForEach-Object loop to a foreach loop. If you want to use the latter you need to change the current object variable $_ to your loop variable $Server:
foreach ($Server in $ServerArray) {
$os = Get-WmiObject -Class Win32_OperatingSystem -Computer $Server
$disks = Get-WmiObject -Class Win32_LogicalDisk -Computer $Server | ...
...
}
otherwise you need to change the loop back to a ForEach-Object loop:
$ServerArray | ForEach-Object {
$os = Get-WmiObject -Class Win32_OperatingSystem -Computer $_
$disks = Get-WmiObject -Class Win32_LogicalDisk -Computer $_ | ...
...
}
Also, there's no pipe between } and Export-Csv:
$result = $command.ExecuteNonQuery()
} Export-Csv 'C:\Users\mdaraghmeh\Desktop\HalO\output.csv' -Delimiter '|' -NoType
^
here
And even if there were, a foreach statement still couldn't feed its output directly into the pipeline. If you want to use foreach with a pipeline you must assign the output to a variable:
$output = foreach ($Server in $ServerArray) { ... }
$output | Export-Csv ...
or run it in a subexpression:
$(foreach ($Server in $ServerArray) { ... }) | Export-Csv ...
For direct pipeline processing you need a ForEach-Object loop:
$ServerArray | ForEach-Object { ... } | Export-Csv ...
You need to use a [datatable] to store the result of your select:
$connectionString = "Server=QAUTILITYDB01;Database=Hal0Test;Integrated Security=True;"
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$connection.Open()
$command = $connection.CreateCommand()
$ServerArray = [System.Collections.ArrayList]@()
$query = "SELECT ServerName FROM ServerList"
$command.CommandText = $query
$ServerNames = $command.ExecuteReader()
$table = new-object "System.Data.DataTable"
$table.Load($ServerNames)
Now $table has your server name list.
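The extraction step can be exercised offline by building a small DataTable by hand (the server names here are made up):

```powershell
# Hypothetical stand-in for $table.Load($ServerNames): build the DataTable manually
$table = New-Object System.Data.DataTable
[void]$table.Columns.Add('ServerName')
'web01','web02' | ForEach-Object { [void]$table.Rows.Add($_) }

# Same extraction the script above would use to get a plain string array
$ServerArray = $table | Select-Object -ExpandProperty ServerName
```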
I am trying to export SQL FILESTREAM data using a PowerShell script, and it only exports up to 8 KB; it fails to export if the file is larger than 8 KB, but still creates the file partially. I don't know what is missing.
$Server = "(local)"; # SQL Server Instance.
$Database = "AdventureWorks";
$Dest = "D:\Export\"; # Path to export to.
$bufferSize = 8192; # Stream buffer size in bytes.
$con = New-Object Data.SqlClient.SqlConnection;
$con.ConnectionString = "Data Source=$Server;" +
"Integrated Security=True;" +
"Initial Catalog=$Database";
$con.Open();
[System.Data.SqlClient.SqlTransaction]$tran = $con.BeginTransaction("fs");
$Sql = "SELECT GET_FILESTREAM_TRANSACTION_CONTEXT()";
$ctx = [array]::CreateInstance('Byte', 16);
$cmdct = New-Object Data.SqlClient.SqlCommand($Sql, $con, $tran);
$ctx = $cmdct.ExecuteScalar();
$cmdct.Dispose();
$Sql = "SELECT [FileName]
,[FileStreamData].PathName()
FROM dbo.FileStreamStorage ";
$out = [array]::CreateInstance('Byte', $bufferSize);
$cmd = New-Object Data.SqlClient.SqlCommand($Sql, $con, $tran);
$rd = $cmd.ExecuteReader();
While ($rd.Read())
{
Write-Output ("Exporting: {0}" -f $rd.GetString(0));
$fs = New-Object System.IO.FileStream ($Dest + $rd.GetString(0)), Create, Write;
$bw = New-Object System.IO.BinaryWriter($fs);
$sfs = New-Object System.Data.SqlTypes.SqlFileStream $rd.GetString(1), $ctx, Read, None, 0;
$start = 0;
While (1 -eq 1)
{
$received = $sfs.Read($out, $start, $bufferSize - 1);
$bw.Write($out, 0, $received);
$bw.Flush();
$start += $received;
If ($received -lt $bufferSize)
{ break; }
}
$bw.Close();
$fs.Close();
$sfs.Close();
}
$fs.Dispose();
$sfs.Dispose();
$rd.Close();
$rd.Dispose();
$tran.Commit();
$cmd.Dispose();
$tran.Dispose();
$con.Close();
$con.Dispose();
Write-Output ("Finished");
Any help will be really appreciated.
I think you are reading at most $bufferSize - 1 (8191) bytes, which means $received will be 8191 whenever there's more data than that:
$received = $sfs.Read($out, $start, $bufferSize - 1);
Then you compare $received (8191) against $bufferSize (8192), so the loop breaks after the first read:
If ($received -lt $bufferSize)
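A fixed read loop (a sketch) passes the full buffer size, keeps the buffer offset at 0, and stops when Read returns 0; shown against a MemoryStream so it runs stand-alone:

```powershell
$bufferSize = 8192
$out = New-Object byte[] $bufferSize

# Stand-in source stream of 20000 bytes; in the real script this is the SqlFileStream
$data = New-Object byte[] 20000
$src  = New-Object System.IO.MemoryStream(,$data)
$dst  = New-Object System.IO.MemoryStream

# Read until the stream reports end-of-data (Read returns 0);
# the offsets index into $out, so they stay 0 and only the count varies
while (($received = $src.Read($out, 0, $bufferSize)) -gt 0) {
    $dst.Write($out, 0, $received)
}
```

This copies all 20000 bytes rather than stopping after the first short read, because end-of-stream is detected by a zero-byte read instead of by comparing the count to the buffer size.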
I've done some work with filestreams and PowerShell, and I would agree with the other comments: it appears there's an issue in how you are handling the length. Here's some code from my blog http://sev17.com/2010/05/11/t-sql-tuesday-006-blobs-filestream-and-powershell/ which demonstrates retrieving filestream data to a file:
$server = "Z002sql2k8"
$database = "AdventureWorks2008"
$query = "SELECT TOP(10) Document.PathName(), GET_FILESTREAM_TRANSACTION_CONTEXT(), Title + FileExtension AS FileName FROM Production.Document WHERE FileExtension = '.doc'"
$dirPath = "C:\Users\u00"
$connection=new-object System.Data.SqlClient.SQLConnection
$connection.ConnectionString="Server={0};Database={1};Integrated Security=True" -f $server,$database
$connection.Open()
$command=new-object system.Data.SqlClient.SqlCommand("",$connection)
$command.CommandTimeout=120
$tran = $connection.BeginTransaction([System.Data.IsolationLevel]'ReadCommitted')
$command.Transaction = $tran
$command.CommandText = $query
$reader = $command.ExecuteReader()
while ($reader.Read())
{
$path = $reader.GetString(0)
[byte[]]$transactionContext = $reader.GetSqlBytes(1).Buffer
$filepath = "$dirPath\{0}" -f $reader.GetValue(2)
$fileStream = new-object System.Data.SqlTypes.SqlFileStream($path,[byte[]]$reader.GetValue(1), [System.IO.FileAccess]'Read', [System.IO.FileOptions]'SequentialScan', 0)
$buffer = new-object byte[] $fileStream.Length
$fileStream.Read($buffer,0,$fileStream.Length)
$fileStream.Close()
[System.IO.File]::WriteAllBytes($filepath,$buffer)
}
$reader.Close()
$tran.Commit()
$connection.Close()