Hello everyone, I am currently trying to write the results of a query to a tab-delimited CSV file automatically on a schedule. Currently I am using PowerShell to do this:
$results = Invoke-SQLCmd -ServerInstance $Server -Database $Database -query "select distinct * from WHIProducts"
$results | ConvertTo-Csv -NoTypeInformation -Delimiter "`t" | Out-File "$inventorypath\inventory_$date\$filename" -Force -Encoding ascii
The problem with this is that the results are so big I am getting a System.OutOfMemoryException error. I have tried increasing MaxMemoryPerShell but I still get the same error. I need to do this automatically, so going into SSMS and doing it manually is not enough. Any ideas?
I am trying to print ~170k rows. There will be more eventually, probably up to about 300k. Here is the PowerShell error:
ConvertTo-csv : Exception of type 'System.OutOfMemoryException' was thrown.
At C:\Users\pmaho\Dropbox\MASSFILEStest\scripts\daily_inventory.ps1:59 char:12
+ $results | ConvertTo-csv -NoTypeInformation -Delimiter "`t" | Out-Fil ...
+            ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [ConvertTo-Csv], OutOfMemoryException
    + FullyQualifiedErrorId : System.OutOfMemoryException,Microsoft.PowerShell.Commands.ConvertToCsvCommand
I am using SQL Server Express Edition.
Try piping the output of Invoke-Sqlcmd straight to the Export-Csv cmdlet, so rows are written out as they come through the pipeline instead of being collected in a variable first. Here is the script:
Push-Location; Import-Module SQLPS -DisableNameChecking; Pop-Location

$SQLServer = "localhost\inst1"
$DBName = "ExportCSVTesting"
$ExportFile = "C:\Users\BIGRED-7\Documents\Git\csvfiles\addresses.csv"
$Counter = 0

while ( $true )
{
    # Remove the export file
    if (Test-Path -Path $ExportFile -PathType Leaf) {
        Remove-Item $ExportFile -Force
    }

    # Clear the buffer cache to make sure each test is done the same way
    $ClearCacheSQL = "DBCC DROPCLEANBUFFERS"
    Invoke-Sqlcmd -ServerInstance $SQLServer -Query $ClearCacheSQL

    # Export the table through the pipeline and capture the run time. Only the export is included in the run time.
    $ExportSQL = "SELECT * FROM [addresses] ;"
    $sw = [Diagnostics.Stopwatch]::StartNew()
    Invoke-Sqlcmd -ServerInstance $SQLServer -Database $DBName -Query $ExportSQL | Export-Csv -Path $ExportFile -NoTypeInformation
    $sw.Stop()
    $sw.Elapsed
    $Milliseconds = $sw.ElapsedMilliseconds

    # Get a row count for display
    $RowCountSQL = "SELECT COUNT(0) AS [Count] FROM [addresses] ;"
    $RowCount = Invoke-Sqlcmd -ServerInstance $SQLServer -Database $DBName -Query $RowCountSQL
    $RowCount = $RowCount.Count

    $Counter++
    Write-Output ("Run $Counter of RowCount: $RowCount")

    # Log the run statistics
    $StatsSQL = "INSERT INTO [RunStats] (Counter,Milliseconds,Notes) VALUES ($RowCount,$Milliseconds,'Pipeline')"
    Invoke-Sqlcmd -ServerInstance $SQLServer -Database $DBName -Query $StatsSQL
}
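Applied to the original tab-delimited export, a minimal sketch along the same lines (reusing the $Server, $Database, and path variables from the question, and assuming your PowerShell version's Export-Csv supports -Delimiter and -Encoding) would stream rows straight to the file with no intermediate $results variable:

# Stream rows from Invoke-Sqlcmd directly into Export-Csv so the full result set is never buffered in memory
Invoke-Sqlcmd -ServerInstance $Server -Database $Database -Query "SELECT DISTINCT * FROM WHIProducts" |
    Export-Csv -Path "$inventorypath\inventory_$date\$filename" -NoTypeInformation -Delimiter "`t" -Encoding ASCII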
Hi, I am running the following query in PowerShell:
Import-Module Hall.psm1
$Database = 'Report'
$Server = '192.168.1.2'
$Query = 'SELECT all * FROM [Report].[dbo].[TestView]'
$LogLocation = "\\Report\LogFile.csv"
$DynamicYear = (Get-Date).Year
$DynamicMonth = (Get-Culture).DateTimeFormat.GetMonthName((Get-Date).Month)
$FileDestination = "\\Report\MONTHLY REPORTS\"+$DynamicYear+"\"+$DynamicMonth+"\"
$Outputfilename='TestView-'+(Get-Date).ToString('MM-dd-yyyy')+'.csv'
$LocalCreate = 'C:\Scripts\LocalCreate\'
$FolderPathExtension = "Microsoft.PowerShell.Core\FileSystem::"
$CodeDestination = $FolderPathExtension+$FileDestination
$filedest=$LocalCreate+$outputfilename
$Logfile = $FolderPathExtension+$LogLocation
Invoke-Sqlcmd -QueryTimeout 120 -Query "
$Query
" -Database $Database -ServerInstance $Server |
ConvertTo-Csv -NoTypeInformation | # Convert to CSV string data without the type metadata
Select-Object -Skip 0 |            # Skip 0 rows, so the header row is kept
% {$_ -replace '"',''} |           # Remove all quote marks
Set-Content -Path $filedest
(gc $filedest) | ? {$_.trim() -ne "" } | set-content $filedest
if(Test-Path ($filedest)) {
Move-Item -Path $filedest -Destination $CodeDestination -Force
$LogType = 'INFO'
$LogEntry = "$filedest MovedTo $CodeDestination"
Write-Log -Message $LogEntry -Level $LogType -Logfile $Logfile
}
This works fine without any issue if the query has data.
However, if the query does not return any data it does not create a .csv file. How can I get it to create a blank .csv, or a .csv with headers only?
Use New-Item -ItemType File -Path $filedest before your Invoke-Sqlcmd or ConvertTo-Csv call, so the file exists even when no rows come back.
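A minimal sketch of that suggestion, reusing the $filedest variable from the question; the header row and column names are hypothetical, since they depend on the actual columns of [TestView]:

# Option 1: create an empty placeholder file so a .csv exists even when the query returns no rows
New-Item -ItemType File -Path $filedest -Force | Out-Null

# Option 2: write a header-only row instead (column names below are placeholders)
Set-Content -Path $filedest -Value 'Column1,Column2,Column3'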
I am not an expert in PowerShell. I need a script or approach which handles the requirement below.
I have a list of files in a folder, with file names like these:
001_File.sql
002_File.sql
003_File.sql
004_File.sql
Also, I have a table in SQL Server which holds the file name information.
TableName: Executedfile, with a column FileName. It currently contains:
002_File.sql
004_File.sql
My requirement is to read the files which are available in the folder but not in the table.
So I have to read only these files:
001_File.sql
003_File.sql
Now I need to execute these two files in sequential order under the same transaction on SQL Server, as I need to roll back the whole transaction if any error occurs.
As of now I have written the PowerShell below.
$QueryResult = Invoke-Sqlcmd -ServerInstance 'MyServer' -Database 'MyDb' -Query "SELECT DISTINCT FNames from TableName"
Get-ChildItem "E:\Testing\" -Filter *.sql | Sort-Object $_.Name |
Foreach-Object {
    $FileFullpath = $_.FullName
    Write-Host $FileFullpath
    $FileName = $_.Name
    Write-Host $FileName
    if (!$QueryResult.FName.Contains($FileName))
    {
        Invoke-Sqlcmd -InputFile $FileFullpath -ServerInstance "servername\serverinstance" -Database "mydatabase"
    }
}
Please suggest a script.
Challenges:
How do I read the files in sequential order, given the leading zeros? Will the above `Sort-Object $_.Name` sort them correctly?
How do I execute the whole list of files under one transaction?
Thanks
Finally I did something like this.
$QueryResult = Invoke-Sqlcmd -ServerInstance 'MyServer' -Database 'MyDb' -Query "SELECT DISTINCT FNames from TableName"
$FullScript = @()
$FullScript += "BEGIN TRANSACTION;"
Get-ChildItem "E:\Testing\" -Filter *.sql | Sort-Object Name |
Foreach-Object {
    if (!$QueryResult.FName.Contains($_.Name))
    {
        $FullScript += Get-Content $_.FullName
    }
}
$FullScript += "COMMIT TRANSACTION;"
sqlcmd -S localhost -d test -Q "$FullScript"
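One caveat with the last line (an observation, not part of the original post): when the $FullScript array is expanded inside double quotes, PowerShell joins its elements with spaces, and sqlcmd -Q is also subject to command-line length limits, so for larger scripts it may be safer to write the combined text to a temporary file and pass it with -i. A sketch, assuming $FullScript from above:

# Write the combined script to a temp file and run it with sqlcmd's input-file switch
$tempScript = Join-Path $env:TEMP 'combined_deploy.sql'
$FullScript | Set-Content -Path $tempScript
sqlcmd -S localhost -d test -i $tempScript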
Try this...
#get list of filenames from database...
$QueryResult = Invoke-Sqlcmd -ServerInstance 'MyServer' -Database 'MyDB' -Query "SELECT DISTINCT FNames from TableName" | Select-Object -ExpandProperty FNames
#get files from folder whose names are not in $QueryResult...
$files = Get-ChildItem -Path E:\Testing -Filter *.sql | ? { !($QueryResult.Contains($_.Name)) } | Sort-Object Name
#get the content of each $file and replace "GO" with an empty string, etc...
$queries = @()
foreach ($file in $files) {
$queries += (Get-Content $file.FullName).replace("GO","")
}
#join each query into a single T-SQL statement...
$singleTransaction = $queries -join ";"
#execute statement...
Invoke-Sqlcmd -ServerInstance 'SERVER' -Database 'DB' -Query $singleTransaction
To really achieve a 'single transaction', you may need consistent input that you can modify and combine into one statement. I am not sure exactly how you will need to do that.
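One possible way to finish that thought (a sketch, not part of the original answer, assuming the $singleTransaction variable from the block above): wrap the combined text in SET XACT_ABORT ON and an explicit transaction so any runtime error rolls the whole batch back.

# Wrap the combined statement in a single explicit transaction before executing it
$wrapped = @"
SET XACT_ABORT ON;
BEGIN TRANSACTION;
$singleTransaction
COMMIT TRANSACTION;
"@
Invoke-Sqlcmd -ServerInstance 'SERVER' -Database 'DB' -Query $wrapped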
Finally, I wrote the script using SMO objects to handle the GO statements and transactions.
$SqlFilePath = "D:\Roshan\Testing\SQL\"
$serverName = "MyServer"
$databaseName = "MyDB"
$QueryResult = Invoke-Sqlcmd -ServerInstance $serverName -Database $databaseName -Query "SELECT DISTINCT FName from dbo.TableName" -AS DataRows
$connection = New-Object System.Data.SqlClient.SQLConnection("Data Source=$serverName;Integrated Security=SSPI;Initial Catalog=$databaseName;Connection Timeout=600;Max Pool Size=10");
$Server = New-Object Microsoft.SqlServer.Management.Smo.Server(New-Object Microsoft.SqlServer.Management.Common.ServerConnection($connection))
$script_contents = "SET XACT_ABORT ON
GO
BEGIN TRANSACTION
GO"
Get-ChildItem $SqlFilePath -Filter *.sql | Sort-Object Name |
ForEach-Object {
    if (!$QueryResult.FName.Contains($_.Name))
    {
        Write-Host $_.Name -ForegroundColor Magenta
        #[string]$script_contents = Get-Content $_.FullName
        # Append each file's text on its own line so batches do not run together
        $script_contents += "`r`n" + [IO.File]::ReadAllText($_.FullName)
        #Write-Host $script_contents
        #$Server.ConnectionContext.ExecuteNonQuery($script_contents)
    }
}
$script_contents += "`r`nCOMMIT TRANSACTION;"
$Server.ConnectionContext.ExecuteNonQuery($script_contents)
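As a side note (not part of the original post): the Microsoft.SqlServer.Management.Smo types used above are only available once the SMO assemblies have been loaded. One common way to get them, assuming the SqlServer PowerShell module is installed in your environment, is to import it at the top of the script:

# Assumption: the SqlServer module is installed; importing it loads the SMO assemblies used above
Import-Module SqlServer -ErrorAction Stop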
You can write something like this in your shell script:
select filename from tablename; >> file.out
--->002_File.sql
grep -v 'file.out' * >> excludedfile.out
My PowerShell script keeps failing when trying to import a CSV file.
The error message reads: "Invoke-Sqlcmd : Conversion failed when converting date and/or time from character string."
In my CSV file, I have a column called "LastWriteTime".
Here is my PowerShell script:
$database = 'test'
$server = 'leasesql'
$table = 'dbo.ssis'
Import-CSV C:\temp\text.csv | ForEach-Object {
Invoke-Sqlcmd -Database $database -ServerInstance $server -Query "INSERT INTO $table (PSComputerName, FullName, Extension, LastWriteTime)
VALUES ('$_.PSComputerName','$_.FullName','$_.Extension','$_.LastWriteTime,')"
}
I was able to find out that I need to wrap the columns in $($_.columnname):
Import-CSV C:\temp\ssis.csv | ForEach-Object {
    Invoke-Sqlcmd -Database $database -ServerInstance $server -Query "INSERT INTO $table VALUES ('$($_.PSComputerName)','$($_.FullName)','$($_.Extension)','$($_.LastWriteTime)')"
}
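If the date conversion error still appears after that change, one further tweak (an assumption, since the CSV contents are not shown) is to reformat LastWriteTime into an unambiguous format before building the INSERT:

Import-CSV C:\temp\ssis.csv | ForEach-Object {
    # Hypothetical: normalize the date so SQL Server parses it unambiguously
    $lastWrite = ([datetime]$_.LastWriteTime).ToString('yyyy-MM-dd HH:mm:ss')
    Invoke-Sqlcmd -Database $database -ServerInstance $server -Query "INSERT INTO $table VALUES ('$($_.PSComputerName)','$($_.FullName)','$($_.Extension)','$lastWrite')"
}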
I want to read a file that contains a line-separated list of *.sql file names (all located in the same directory), so I execute the following:
$reader = [System.IO.File]::OpenText("_Scripts.txt")
try {
    for(;;) {
        $line = $reader.ReadLine()
        if ($null -eq $line) { break }
        # output file: same base name as the .sql file, with a .txt extension
        $out = $line.split(".")[0] + ".txt"
        # change the -ServerInstance value to match your environment
        Invoke-Sqlcmd -InputFile $line -ServerInstance "." | Format-Table | Out-File -FilePath $out
        $line
    }
}
finally {
    $reader.Close()
}
I'm trying to execute this script file by using a batch file containing the command:
powershell.exe -ExecutionPolicy Bypass -Command "_scripts.ps1"
but I get an error. Can anyone help me fix my .ps1 script, please?
This works for me:
$lines = Get-Content C:\Temp\TEST\_Scripts.txt
ForEach ($line in $lines)
{
    $out = $line.split(".")[0] + ".txt"
    Invoke-Sqlcmd -InputFile $line -ServerInstance "localhost" -Database "master" | Format-Table | Out-File -FilePath $out
}
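Since the original error text was not included, one other common culprit (an assumption on my part) is that the relative paths in the batch command are resolved against whatever directory the batch file happens to run from. Launching the script with -File and a full path (the path below is a placeholder) avoids that for the script itself:

powershell.exe -ExecutionPolicy Bypass -File "C:\Temp\TEST\_scripts.ps1"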
I am using this forum post as a baseline for a PowerCLI script I am trying to write:
Powershell Get-QADUser results to SQL table
My modified script looks like this:
Add-PSSnapin VMware.VimAutomation.Core
Add-PSSnapin SqlServerCmdletSnapin100
Add-PSSnapin SqlServerProviderSnapin100

$db_server = "127.0.0.1\VIM_SQLEXP"
$db = "Billing"
$table = "vmdata"
$username = "import"
$pwd = "myPassWord"

# First, clear existing table
$sql_query_del = "DELETE FROM $table"
Invoke-Sqlcmd -ServerInstance $db_server -Database $db -Username $username -Password $pwd -Query $sql_query_del

# Get VMs and resources, add to DB
Get-ResourcePool -Name finance | Get-VM | ForEach-Object {
    $name = $_.Name
    $PowerState = $_.PowerState
    $NumCPUs = $_.'Num CPUs'
    $MemoryGB = $_.MemoryGB
    Write-Host " Name : $Name PowerState : $PowerState MemoryGB : $MemoryGB Num CPUs : $NumCPUs"
    $sql_query = "INSERT INTO $table (Name, PowerState, MemoryGB, NumCPUs) VALUES ('$Name', '$PowerState', '$MemoryGB', '$NumCPUs')"
    Invoke-Sqlcmd -ServerInstance $db_server -Database $db -Username $username -Password $pwd -Query $sql_query
}
My problem is the 'Num CPUs' column, which has a space in it. When just executing this command:
Get-resourcepool -name finance | get-vm
Name PowerState Num CPUs MemoryGB
---- ---------- -------- --------
GreatPlains01 PoweredOn 4 8.000
What is the appropriate way to reference this property so the result displays?
When executing this via PowerShell I get the following result:
Name : GreatPlains01 PowerState : PoweredOn MemoryGB : 8 Num CPUs :
So the property reference I am using isn't correct. I have tried [], "", and '' around Num CPUs with no success.
Although the column header in the output of the Get-VM cmdlet is 'Num CPUs', the property name is NumCPU. So you should use:
$NumCPUs = $_.NumCPU
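If in doubt, a quick way to see the real property names behind the formatted column headers is Get-Member, reusing the same pipeline from the question:

# List the actual property names of the VM objects; the table headers shown by Get-VM do not always match them
Get-ResourcePool -Name finance | Get-VM | Select-Object -First 1 | Get-Member -MemberType Properties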