I want to extract data from SQL Server to a new Excel file using PowerShell. For small data sets my code works, but some tables have more than 100,000 rows and that takes ages. The reason I don't use the export utility in SQL Server is that I want to extract multiple tables.
Is there a way to optimize my script to export big tables to Excel, or is there another way to do this?
I'm using the following script:
## ---------- Working with SQL Server ---------- ##
## - Get SQL Server Table data:
$SQLServer = 'server';
$Database = 'database';
$SqlQuery = 'SELECT TOP 10 * FROM database.dbo.table';
## - Connect to SQL Server using non-SMO class 'System.Data':
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection;
$SqlConnection.ConnectionString = `
"Server = $SQLServer; Database = $Database; Integrated Security = True";
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand;
$SqlCmd.CommandText = $SqlQuery;
$SqlCmd.Connection = $SqlConnection;
## - Extract and build the SQL data object '$DataSetTable':
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter;
$SqlAdapter.SelectCommand = $SqlCmd;
$DataSet = New-Object System.Data.DataSet;
[void]$SqlAdapter.Fill($DataSet);  ## - [void] suppresses the row-count output
$DataSetTable = $DataSet.Tables["Table"];
## ---------- Working with Excel ---------- ##
## - Create an Excel Application instance:
$xlsObj = New-Object -ComObject Excel.Application;
## - Create new Workbook and Sheet (Visible = 1 / 0 not visible)
$xlsObj.Visible = 0;
$xlsWb = $xlsobj.Workbooks.Add();
$xlsSh = $xlsWb.Worksheets.item(1);
## - Build the Excel column heading:
[Array] $getColumnNames = $DataSetTable.Columns | Select ColumnName;
## - Build column header:
[Int] $RowHeader = 1;
foreach ($ColH in $getColumnNames)
{
$xlsSh.Cells.item(1, $RowHeader).font.bold = $true;
$xlsSh.Cells.item(1, $RowHeader) = $ColH.ColumnName;
$RowHeader++;
};
## - Adding the data start in row 2 column 1:
[Int] $rowData = 2;
[Int] $colData = 1;
foreach ($rec in $DataSetTable.Rows)
{
foreach ($Coln in $getColumnNames)
{
## - Next line convert cell to be text only:
$xlsSh.Cells.NumberFormat = "#";
## - Populating columns:
$xlsSh.Cells.Item($rowData, $colData) = `
$rec.$($Coln.ColumnName).ToString();
$ColData++;
};
$rowData++; $ColData = 1;
};
## - Adjusting columns in the Excel sheet:
$xlsRng = $xlsSH.usedRange;
$xlsRng.EntireColumn.AutoFit();
## ---------- Saving file and Terminating Excel Application ---------- ##
## - Saving Excel file - if the file exist do delete then save
$xlsFile = `
"C:\path\file.xls";
if (Test-Path $xlsFile)
{
Remove-Item $xlsFile
$xlsObj.ActiveWorkbook.SaveAs($xlsFile);
}
else
{
$xlsObj.ActiveWorkbook.SaveAs($xlsFile);
};
## Quit Excel and Terminate Excel Application process:
$xlsObj.Quit(); Get-Process Excel* | ForEach-Object { $_.Kill() };
## - End of Script - ##
There's some simple magic to make this a lot easier, and that's copy/paste. What you can do is convert your DataTable to tab-delimited CSV, copy that to the clipboard, and paste it into Excel. I'll ignore your SQL part, since you seem to have that well in hand.
## ---------- Working with Excel ---------- ##
## - Create an Excel Application instance:
$xlsObj = New-Object -ComObject Excel.Application;
## - Create new Workbook and Sheet (Visible = 1 / 0 not visible)
$xlsObj.Visible = 0;
$xlsWb = $xlsobj.Workbooks.Add();
$xlsSh = $xlsWb.Worksheets.item(1);
## - Copy entire table to the clipboard as tab delimited CSV
$DataSetTable | ConvertTo-Csv -NoType -Del "`t" | Clip
## - Paste table to Excel
$xlsObj.ActiveCell.PasteSpecial() | Out-Null
## - Set columns to auto-fit width
$xlsObj.ActiveSheet.UsedRange.Columns|%{$_.AutoFit()|Out-Null}
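Note the pasted block includes the CSV header row as the first line. The snippet above doesn't show saving; a minimal sketch to finish up, reusing $xlsWb and $xlsObj from above (the output path is an example):
## - Save the workbook and shut Excel down cleanly (path is an example)
$xlsFile = 'C:\path\file.xlsx'
$xlsWb.SaveAs($xlsFile)
$xlsObj.Quit()
## - Release the COM reference so no orphaned Excel process is left behind
[void][System.Runtime.InteropServices.Marshal]::ReleaseComObject($xlsObj)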
How to retrieve an image from a table and save it to a folder on a different server using a SQL query
The image is stored in the DB as a BLOB.
You can do it with PowerShell; here is an example:
$connectionString = "Data Source=SERVER;Initial Catalog=DATABASE;pwd=PASSWORD;User ID=USER;"
$sqlCommandText = "SELECT id, Photo, Photo_TypeMime FROM MYTABLE" #query
$saveToDir = "D:\" # output directory
$connection = new-object System.Data.SqlClient.SQLConnection($connectionString)
$command = new-object System.Data.sqlclient.sqlcommand($sqlCommandText,$connection)
$connection.Open()
$bufferSize = 8192 #default value
$buffer = [array]::CreateInstance('Byte', $bufferSize)
$dr = $command.ExecuteReader()
While ($dr.Read())
{
$ext = GetExtFromMimeType($dr.GetString(2)) # create a function to return the extension from the MIME type if you don't have the file name saved in the database (a sketch follows after this code)
$fs = New-Object System.IO.FileStream(($saveToDir + $dr.GetDecimal(0) + $ext), [System.IO.FileMode]::Create, [System.IO.FileAccess]::Write) # my example id is decimal but you can change it
$bw = New-Object System.IO.BinaryWriter $fs
$start = 0
$received = $dr.GetBytes(1, $start, $buffer, 0, $bufferSize - 1)
While ($received -gt 0)
{
$bw.Write($buffer, 0, $received)
$bw.Flush()
$start += $received
# Read next byte stream
$received = $dr.GetBytes(1, $start, $buffer, 0, $bufferSize - 1)
}
$bw.Close()
$fs.Close()
}
$fs.Dispose()
$dr.Close()
$command.Dispose()
$connection.Close()
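The loop calls GetExtFromMimeType, which you have to supply yourself; a minimal sketch of such a helper, covering a few common image MIME types (extend the switch as needed):
function GetExtFromMimeType([string]$mimeType)
{
    ## - Map a handful of common image MIME types to file extensions
    switch ($mimeType)
    {
        'image/jpeg' { '.jpg' }
        'image/png'  { '.png' }
        'image/gif'  { '.gif' }
        'image/bmp'  { '.bmp' }
        default      { '.bin' }  ## - Unknown type: fall back to a generic extension
    }
}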
Also you can find a more detailed example here: https://social.technet.microsoft.com/wiki/contents/articles/890.export-sql-server-blob-data-with-powershell.aspx
You can't perform a file system operation using SQL Server alone. So if you want to save an image stored in a SQL Server database as a file in your file system, try using C# or a similar front-end language to retrieve the record from the database and then save it to the desired location.
You can use something similar to this.
-- This will create a format file; replace [TABLE_NAME_WITH_DATABASE] with your table and [SERVERNAME] with your server name
EXEC xp_cmdshell 'bcp [TABLE_NAME_WITH_DATABASE] format null -S [SERVERNAME] -T -n -f c:\Test\PP.fmt'
-- After this step you will see a format file, in which you have to delete all columns except your image column,
-- and then run the query below.
EXEC xp_cmdshell 'bcp "SELECT Photo FROM Server.Db.Table WHERE PK = 1" queryout C:\Test\ProductPhotoID_69.[IMAGE_EXTENSION] -S [SERVERNAME] -T -f C:\Test\PP.fmt'
I have a .xlsx file which was read into a DataTable via the OLEDB provider. Now I want to add values to that .xlsx based on the SQL table data I have
(which is also converted into a CSV file, Book1.csv).
The SQL table consists of name and notes...
The name column is the same in both the .xlsx file and the SQL variable $sql.
I want to add the Clo_notes value to column F of the .xlsx file wherever the name matches the value in column "A" of the SQL table. What I wrote below is very slow and not effective.
Any help would be highly appreciated.
$Excel = New-Object -ComObject Excel.Application
$Workbook = $Excel.Workbooks.Open('C:\Users\VIKRAM\Documents\Sample - Superstore.xlsx')
$workSheet = $Workbook.Sheets.Item(1)
$WorkSheet.Name
$Found = $WorkSheet.Cells.Find('$Data.number')
$Found.row
$Found.text
$Excel1 = New-Object -ComObject Excel.Application
$file = $Excel1.Workbooks.Open('C:\Users\VIKRAM\Documents\Book1.xlsx')
$ff=$file.Sheets.Item(1)
$ff.Name
$ff1=$ff.Range("A1").entirecolumn
$ff1.Value2
foreach ($line in $ff1.value2){
if( $found.text -eq $line)
{
Write-Host "success"
$fff=$ff1.Row
$WorkSheet.Cells.item($fff,20) =$ff.cells.item($fff,2)
}
}
Data in .xlsx file
Number Priority Comment
612721 4 - High
Data in Book1.csv
Number Clo_notes
612721 Order has been closed
I need to write the Clo_notes value into the Comment column of the .xlsx file: whenever the "Number" column in the two files matches, update the corresponding row's Comment column with Clo_notes.
Make sure to release any COM objects, or you'll have orphaned Excel processes.
You might try something like this. I was able to write the Clo_notes value into column 6 as you were requesting:
## function to close all com objects
function Release-Ref ($ref) {
([System.Runtime.InteropServices.Marshal]::ReleaseComObject([System.__ComObject]$ref) -gt 0)
[System.GC]::Collect()
[System.GC]::WaitForPendingFinalizers()
}
## open Excel data
$Excel = New-Object -ComObject Excel.Application
$Workbook = $Excel.Workbooks.Open('C:\Users\51290\Documents\_temp\StackOverflowAnswers\Excel.xlsx')
$workSheet = $Workbook.Sheets.Item(1)
$WorkSheet.Name
## open SQL data
$Excel1 = New-Object -ComObject Excel.Application
$file = $Excel1.Workbooks.Open('C:\Users\51290\Documents\_temp\StackOverflowAnswers\SQL.xlsx')
$sheetSQL = $file.Sheets.Item(1)
$dataSQL = $sheetSQL.Range("A1").currentregion
$foundNumber = 0
$row_idx = 1
foreach ($row in $WorkSheet.Rows) {
"row_idx = " + $row_idx
if ($row_idx -gt 1) {
$foundNumber = $row.Cells.Item(1,1).Value2
"foundNumber = " + $foundNumber
if ($foundNumber -eq "" -or $foundNumber -eq $null) {
Break
}
foreach ($cell in $dataSQL.Cells) {
if ($cell.Row -gt 1) {
if ($cell.Column -eq 1 -and $cell.Value2 -eq $foundNumber) {
$clo_notes = $sheetSQL.Cells.Item($cell.Row, 2).Value2
Write-Host "success"
$WorkSheet.Cells.item($row_idx, 6).Value2 = $clo_notes
}
}
}
}
$row_idx++
}
$Excel.Quit()
$Excel1.Quit()
## close all object references
Release-Ref($WorkSheet)
Release-Ref($WorkBook)
Release-Ref($Excel)
Release-Ref($Excel1)
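If the sheets grow large, the nested loop above rescans the whole SQL sheet for every target row. A hashtable keyed on the Number column turns that into a single pass over each sheet; a sketch of the replacement loop, assuming the same $sheetSQL and $WorkSheet objects (run it before the Quit/Release calls):
## - Build a Number -> Clo_notes lookup once (row 1 is the header)
$lookup = @{}
$lastSqlRow = $sheetSQL.UsedRange.Rows.Count
for ($r = 2; $r -le $lastSqlRow; $r++) {
    $key = [string]$sheetSQL.Cells.Item($r, 1).Value2
    if ($key) { $lookup[$key] = $sheetSQL.Cells.Item($r, 2).Value2 }
}
## - Single pass over the target sheet, writing Clo_notes into column 6
$lastDataRow = $WorkSheet.UsedRange.Rows.Count
for ($r = 2; $r -le $lastDataRow; $r++) {
    $number = [string]$WorkSheet.Cells.Item($r, 1).Value2
    if ($number -and $lookup.ContainsKey($number)) {
        $WorkSheet.Cells.Item($r, 6).Value2 = $lookup[$number]
    }
}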
I am trying to load a 160 GB CSV file into SQL Server using a PowerShell script I got from GitHub, and I get this error:
Exception calling "Add" with "1" argument(s): "Input array is longer than the number of columns in this table."
At C:\b.ps1:54 char:26
+ [void]$datatable.Rows.Add <<<< ($line.Split($delimiter))
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : DotNetMethodException
So I checked the same code with a small 3-line CSV: all of the columns match, there is a header in the first row, and there are no extra delimiters, so I'm not sure why I am getting this error.
The code is below.
<# 8-faster-runspaces.ps1 #>
# Set CSV attributes
$csv = "M:\d\s.txt"
$delimiter = "`t"
# Set connstring
$connstring = "Data Source=.;Integrated Security=true;Initial Catalog=PresentationOptimized;PACKET SIZE=32767;"
# Set batchsize to 2000
$batchsize = 2000
# Create the datatable
$datatable = New-Object System.Data.DataTable
# Add generic columns
$columns = (Get-Content $csv -First 1).Split($delimiter)
foreach ($column in $columns) {
[void]$datatable.Columns.Add()
}
# Setup runspace pool and the scriptblock that runs inside each runspace
$pool = [RunspaceFactory]::CreateRunspacePool(1,5)
$pool.ApartmentState = "MTA"
$pool.Open()
$runspaces = @()
# Setup scriptblock. This is the workhorse. Think of it as a function.
$scriptblock = {
Param (
[string]$connstring,
[object]$dtbatch,
[int]$batchsize
)
$bulkcopy = New-Object Data.SqlClient.SqlBulkCopy($connstring,"TableLock")
$bulkcopy.DestinationTableName = "abc"
$bulkcopy.BatchSize = $batchsize
$bulkcopy.WriteToServer($dtbatch)
$bulkcopy.Close()
$dtbatch.Clear()
$bulkcopy.Dispose()
$dtbatch.Dispose()
}
# Start timer
$time = [System.Diagnostics.Stopwatch]::StartNew()
# Open the text file from disk and process.
$reader = New-Object System.IO.StreamReader($csv)
Write-Output "Starting insert.."
while ((($line = $reader.ReadLine()) -ne $null))
{
[void]$datatable.Rows.Add($line.Split($delimiter))
if ($datatable.rows.count % $batchsize -eq 0)
{
$runspace = [PowerShell]::Create()
[void]$runspace.AddScript($scriptblock)
[void]$runspace.AddArgument($connstring)
[void]$runspace.AddArgument($datatable) # <-- Send datatable
[void]$runspace.AddArgument($batchsize)
$runspace.RunspacePool = $pool
$runspaces += [PSCustomObject]@{ Pipe = $runspace; Status = $runspace.BeginInvoke() }
# Overwrite object with a shell of itself
$datatable = $datatable.Clone() # <-- Create new datatable object
}
}
# Close the file
$reader.Close()
# Wait for runspaces to complete
while ($runspaces.Status.IsCompleted -notcontains $true) {}
# End timer
$secs = $time.Elapsed.TotalSeconds
# Cleanup runspaces
foreach ($runspace in $runspaces ) {
[void]$runspace.Pipe.EndInvoke($runspace.Status) # EndInvoke method retrieves the results of the asynchronous call
$runspace.Pipe.Dispose()
}
# Cleanup runspace pool
$pool.Close()
$pool.Dispose()
# Cleanup SQL Connections
[System.Data.SqlClient.SqlConnection]::ClearAllPools()
# Done! Format output then display
$totalrows = 1000000
$rs = "{0:N0}" -f [int]($totalrows / $secs)
$rm = "{0:N0}" -f [int]($totalrows / $secs * 60)
$mill = "{0:N0}" -f $totalrows
Write-Output "$mill rows imported in $([math]::round($secs,2)) seconds ($rs rows/sec and $rm rows/min)"
Working with a 160 GB input file is going to be a pain. You can't really load it into any kind of editor, or at least you can't analyze such a mass of data without some serious automation.
As per the comments, it seems that the data has some quality issues. In order to find the offending data, you could try binary searching. This approach shrinks the data fast. Like so (a sketch of step 1 follows the list):
1) Split the file into two roughly equal chunks.
2) Try to load the first chunk.
3) If successful, process the second chunk. If not, see 6).
4) Try to load the second chunk.
5) If successful, the files are valid, but you have some other data quality issue. Start looking into other causes. If not, see 6).
6) If either load failed, start from the beginning and use the failed chunk as the input file.
7) Repeat until you narrow down the offending row(s).
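For step 1, a minimal sketch of the split using streaming reads, so the 160 GB file is never held in memory (a first pass just counts lines; paths are examples):
## - Split a big text file into two halves by line count, streaming throughout
$inFile = 'M:\d\s.txt'
$reader = New-Object System.IO.StreamReader($inFile)
$total = 0
while ($reader.ReadLine() -ne $null) { $total++ }   # first pass: count lines
$reader.Close()
$reader = New-Object System.IO.StreamReader($inFile)
$writer = New-Object System.IO.StreamWriter('M:\d\s_part1.txt')
$header = $reader.ReadLine()          # keep the header row for both halves
$writer.WriteLine($header)
$firstHalf = [int](($total - 1) / 2)  # number of data lines for the first chunk
for ($i = 0; $i -lt $firstHalf; $i++) { $writer.WriteLine($reader.ReadLine()) }
$writer.Close()
$writer = New-Object System.IO.StreamWriter('M:\d\s_part2.txt')
$writer.WriteLine($header)
while (($line = $reader.ReadLine()) -ne $null) { $writer.WriteLine($line) }
$writer.Close()
$reader.Close()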
Another method would be to use an ETL tool like SSIS. Configure the package to redirect invalid rows into an error log to see what data is not working properly.
So I've been at this for a while now and I've got various methods (some VBA, others PowerShell) semi-working, so to speak...
Quick overview of what I'm trying to accomplish: importing two CSVs weekly (erasing the old data; the headers always remain the same but differ between the two sheets) into two specific Excel worksheets within the same workbook (e.g. Sheet1, Sheet2, calculation sheet), which has another sheet that then calculates the data. Finally I'd like to export it as a PDF.
Full Explanation:
Every Monday two queries are exported as .CSV to let’s say C:\Users\Me\Desktop\Data1.CSV & C:\Users\Me\Desktop\Data2.CSV
I would then like to have the CSV’s data input into their respective worksheets (Data1, Data2) within the workbook C:\Users\Me\Desktop\Calculation.xlsx
Data1 would look like this:
COUNT STATUS OPERATOR PRODUCT WEEK
1 CANCEL BOB Product 1 10
65 CLEAR JIM Product 2 10
20 SEND BOB Product 1 10
58 CC KRIS Product 4 10
3 CLEAR BOB Product 1 10
11 SEND SMIT Product 6 10
6 CANCEL JASON Product 7 10
Data2 would look like this:
OPERATOR CLEARS SENDS TOTAL CR WEEK
BOB 11 1 12 0.916667 10
JIM 17 2 19 0.894737 10
KRIS 9 1 10 0.9 10
SMITH 22 5 27 0.814815 10
JASON 25 7 32 0.78125 10
The calculation sheet will then recognize the data and process it accordingly, then export as a PDF. The following Monday, the Windows timer service calls a .bat file which runs this script (VBA or PowerShell), which erases the previous week's data within the workbook and inputs the new data from the queries.
I'm very open as to which language this is written in; I have basic knowledge and understanding of both PowerShell and VBA. I have not included the code I currently have, as I've butchered it trying to get it to work for my needs, mixing together various methods from researching how to do this.
Hopefully I’ve provided enough information so that someone can point me in the right direction…
Thanks
EDIT
As per Chris, here's some code I've been trying to utilize. It's probably extremely confusing, as I was trying to modify it for my needs versus what it was initially made for:
Get-service bits | Select-Object COUNT, STATUS, OPERATOR, PRODUCT, WEEK | Export-Csv 'C:\Users\Me\Desktop\Data1.csv' -NoTypeInformation
Get-service bits | Select-Object COUNT, STATUS, OPERATOR, PRODUCT, WEEK | Export-Excel 'C:\Users\Me\Desktop\Calculations.xlsx'
$Results = @()
Import-Excel -Path 'C:\Users\Me\Desktop\Calculations.xlsx' | foreach {
$Properties = @{
COUNT = $PSItem.COUNT
STATUS = $PSItem.STATUS
OPERATOR = $PSItem.OPERATOR
PRODUCT = $PSItem.PRODUCT
WEEK = $PSItem.WEEK
}
$Results += New-Object -TypeName psobject -Property $Properties
}
Import-Csv -Path 'C:\Users\Me\Desktop\Data1.csv' | foreach {
$Properties = @{
COUNT = $PSItem.COUNT
STATUS = $PSItem.STATUS
OPERATOR = $PSItem.OPERATOR
PRODUCT = $PSItem.PRODUCT
WEEK = $PSItem.WEEK
}
$Results += New-Object -TypeName psobject -Property $Properties
}
$Results | Export-Excel -Path 'C:\Users\Me\Desktop\Calculations.xlsx'
Another example I tried:
$Excel = New-Object -ComObject Excel.Application
$XLSFile = 'C:\Users\me\Desktop\Calculations.xlsx'
$csvFile = 'C:\Users\me\Desktop\Data\Data1.csv'
$Excel.Visible = $true
$ExcelWorkBook = $Excel.Workbooks.Open($XLSFile)
$ExcelWorkSheet = $ExcelWorkBook.sheets.item('Data')
$ExcelWorkSheet.Activate()
# Go to the first empty row
$LastRow = $ExcelWorkSheet.UsedRange.rows.count + 1
Import-Csv -Path $csvFile | ForEach {
$ExcelWorkSheet.cells.Item($lastRow,1) = $psitem.COUNT
$ExcelWorkSheet.cells.Item($lastRow,2) = $psitem.STATUS
$ExcelWorkSheet.cells.Item($lastRow,3) = $psitem.OPERATOR
$ExcelWorkSheet.cells.Item($lastRow,4) = $psitem.PRODUCT
$ExcelWorkSheet.cells.Item($lastRow,5) = $psitem.WEEK
$LastRow = $LastRow + 1
}
$ExcelWorkBook.Save()
$ExcelWorkBook.Close()
$Excel.Quit()
$path = "C:\Users\me\Desktop\"
$xlFixedFormat = "Microsoft.Office.Interop.Excel.xlFixedFormatType" -as [type]
$excelFiles = Get-ChildItem -Path $path -include *.xls, *.xlsx -recurse
$objExcel = New-Object -ComObject excel.application
$objExcel.visible = $false
foreach($wb in $excelFiles)
{
$filepath = Join-Path -Path $path -ChildPath ($wb.BaseName + " Weekending " + (Get-Date).AddDays(-1).ToString('MMM-dd-yyyy') + ".pdf")
$workbook = $objExcel.workbooks.open($wb.fullname, 3)
$workbook.Saved = $true
"saving $filepath"
$workbook.ExportAsFixedFormat($xlFixedFormat::xlTypePDF, $filepath, 1, 2)
$objExcel.Workbooks.close()
}
$objExcel.Quit()
Final example
$Excel = New-Object -ComObject Excel.Application
$XLSFile = 'C:\Users\me\Desktop\Calculations.xlsx'
$csvFile = 'C:\Users\me\Desktop\Data1.csv'
$Excel.Visible = $true
$ExcelWorkBook = $Excel.Workbooks.Open($XLSFile)
$ExcelWorkBook.worksheets.item("Data1").Delete()
# Create a new worksheet
$ExcelWorkSheet = $ExcelWorkBook.Worksheets.Add()
# Set the name for the worksheet
$ExcelWorkSheet.Name = "Data1"
$ExcelWorkSheet = $ExcelWorkBook.sheets.item('Data1')
$ExcelWorkSheet.Activate()
# Go to the first empty row (initialize the row counter)
$LastRow = $ExcelWorkSheet.UsedRange.Rows.Count + 1
Import-Csv -Path $csvFile | ForEach {
$ExcelWorkSheet.cells.Item($lastRow,1) = $psitem.COUNT
$ExcelWorkSheet.cells.Item($lastRow,2) = $psitem.STATUS
$ExcelWorkSheet.cells.Item($lastRow,3) = $psitem.OPERATOR
$ExcelWorkSheet.cells.Item($lastRow,4) = $psitem.PRODUCT
$ExcelWorkSheet.cells.Item($lastRow,5) = $psitem.WEEK
$LastRow = $LastRow + 1
}
$ExcelWorkBook.Save()
$ExcelWorkBook.Close()
$Excel.Quit()
Filling the target cells one by one through the Excel COM interface is very slow. Using Range.Copy and Sheet.Paste is much faster:
step 1: "ExcelApp.Workbooks.OpenText" to open data.csv file
step 2: "ExcelApp.Windows("data").ActiveSheet" to target the data sheet just opened
step 3: "ActiveSheet.Range" to target the data area
step 4: "Range.Copy" to copy the source data to the clipboard
step 5: "Sheet.Paste" to move the data to the destination
C++ sample code (suppose we already have "pSheetCsvDest", which is our .csv data destination sheet):
//import Excel library
//....
//declaration
Excel::_ApplicationPtr pAppExcel;
Excel::_WorksheetPtr pSheetCsvSource;
Excel::RangePtr pRangeCsvSource;
//initialize excel app and open the source csv file
HRESULT hApp = pAppExcel.CreateInstance(__uuidof(Excel::Application));
pAppExcel->Workbooks->OpenText(_bstr_t("data.csv"), 936, 1, xlDelimited, xlTextQualifierDoubleQuote, vtMissing, vtMissing, vtMissing, VARIANT_TRUE, vtMissing, vtMissing, vtMissing, vtMissing, vtMissing, vtMissing, vtMissing, vtMissing, vtMissing);
//target the data range
pSheetCsvSource= pAppExcel->Windows->GetItem(_bstr_t("data"))->ActiveSheet;
pRangeCsvSource= pSheetCsvSource->GetRange(_variant_t("Cell1:Cell2"));
//copy the source data
pRangeCsvSource->Copy();
//paste to destination sheet
pSheetCsvDest->Paste();
//close csv source child window
pAppExcel->Windows->GetItem(_bstr_t("data"))->Close();
For my job I often have to script out a table with all its keys, constraints and triggers (basically a full script to recreate the table) from a Microsoft SQL Server 2008 instance. I also have to do this for procedures and triggers.
What I do now is open SSMS, right-click the object, and script it out to a file. So if I have 3 procedures, 10 tables and 1 trigger, I end up doing this 14 times.
What I would like is a PowerShell script that I could feed a list of objects, and it would then use SMO to script each one out to an individual file.
Thanks for the help
Here is a PowerShell function I use whenever I have to script a database. It should be easy to modify to script just the objects you need.
function SQL-Script-Database
{
<#
.SYNOPSIS
Script all database objects for the given database.
.DESCRIPTION
This function scripts all database objects (i.e.: tables, views, stored
procedures, and user defined functions) for the specified database on the
the given server\instance. It creates a subdirectory per object type under
the path specified.
.PARAMETER savePath
The root path where to save object definitions.
.PARAMETER database
The database to script (default = $global:DatabaseName)
.PARAMETER DatabaseServer
The database server to be used (default: $global:DatabaseServer).
.PARAMETER InstanceName
The instance name to be used (default: $global:InstanceName).
.EXAMPLE
SQL-Script-Database c:\temp AOIDB
#>
param (
[parameter(Mandatory = $true)][string] $savePath,
[parameter(Mandatory = $false)][string] $database = $global:DatabaseName,
[parameter(Mandatory = $false)][string] $DatabaseServer = $global:DatabaseServer,
[parameter(Mandatory = $false)][string] $InstanceName = $global:InstanceName
)
try
{
if (!$DatabaseServer -or !$InstanceName)
{ throw "`$DatabaseServer or `$InstanceName variable is not properly initialized" }
$ServerInstance = SQL-Get-Server-Instance $DatabaseServer $InstanceName
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
$s = New-Object Microsoft.SqlServer.Management.Smo.Server($ServerInstance)
$db = $s.databases[$database]
$objects = $db.Tables
$objects += $db.Views
$objects += $db.StoredProcedures
$objects += $db.UserDefinedFunctions
$scripter = New-Object ('Microsoft.SqlServer.Management.Smo.Scripter') ($s)
$scripter.Options.AnsiFile = $true
$scripter.Options.IncludeHeaders = $false
$scripter.Options.ScriptOwner = $false
$scripter.Options.AppendToFile = $false
$scripter.Options.AllowSystemobjects = $false
$scripter.Options.ScriptDrops = $false
$scripter.Options.WithDependencies = $false
$scripter.Options.SchemaQualify = $false
$scripter.Options.SchemaQualifyForeignKeysReferences = $false
$scripter.Options.ScriptBatchTerminator = $false
$scripter.Options.Indexes = $true
$scripter.Options.ClusteredIndexes = $true
$scripter.Options.NonClusteredIndexes = $true
$scripter.Options.NoCollation = $true
$scripter.Options.DriAll = $true
$scripter.Options.DriIncludeSystemNames = $false
$scripter.Options.ToFileOnly = $true
$scripter.Options.Permissions = $true
foreach ($o in $objects | where {!($_.IsSystemObject)})
{
$typeFolder=$o.GetType().Name
if (!(Test-Path -Path "$savepath\$typeFolder"))
{ New-Item -Type Directory -Name "$typeFolder" -Path "$savePath" | Out-Null }
$file = $o -replace "\[|\]"
$file = $file.Replace("dbo.", "")
$scripter.Options.FileName = "$savePath\$typeFolder\$file.sql"
$scripter.Script($o)
}
}
catch
{
Util-Log-Error "`t`t$($MyInvocation.InvocationName): $_"
}
}
Here's a script to back up an individual object. Simply pass the object name to the function:
http://sev17.com/2012/04/backup-database-object/
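In case that link goes stale, the core is small; a minimal sketch that scripts a single table with its indexes, keys and triggers (server, database and table names are placeholders):
## - Script one table (indexes, keys/constraints, triggers) out to a .sql file
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
$server = New-Object Microsoft.SqlServer.Management.Smo.Server("SERVERNAME")
$table = $server.Databases["DATABASE"].Tables["TABLENAME", "dbo"]
$scripter = New-Object Microsoft.SqlServer.Management.Smo.Scripter($server)
$scripter.Options.Indexes = $true    # include indexes
$scripter.Options.DriAll = $true     # include keys and constraints
$scripter.Options.Triggers = $true   # include triggers
$scripter.Options.ToFileOnly = $true
$scripter.Options.FileName = "C:\temp\TABLENAME.sql"
$scripter.Script($table)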