I need to load a data table over an ODBC connection with PowerShell.
With OLE DB and SQL Server we can use bulk copy and insert data quickly.
Is there a similar possibility with ODBC?
I'm using PowerShell because it should have the best support for this kind of operation,
but my current code doesn't make use of any of those DLLs.
Instead, my code builds INSERT statements with two loops: it iterates over every row, holds the values in memory,
constructs an INSERT INTO with 1000 rows, and then repeats the same thing.
Am I doomed to something like this?
# Load the source table into memory
$Datatable = New-Object System.Data.DataTable
$tabledump = $src_cmd.ExecuteReader()
$Datatable.Load($tabledump)

$f = 0
$vals = @()
foreach ($item in $Datatable.Rows) {
    $f += 1
    # Escape and quote every column value of this row
    $val = ""
    for ($i = 0; $i -lt $item.ItemArray.Length; $i++) {
        $items = $item[$i] -replace "'", "''"
        $val += "'" + $items + "',"
    }
    # Each row becomes one parenthesised value list
    $vals += "(" + $val.TrimEnd(",") + "),"
    # Flush a batch of 1000 rows (and the remainder at the end)
    if ($f % 1000 -eq 0 -or $f -eq $row_cnt) {
        $values = [System.String]::Join(" ", $vals)
        $values = $values.TrimEnd(",")
        $cols = [System.String]::Join(",", $columns)
        $postgresCommand = "INSERT INTO $dst_schema.$dst_table ($cols) VALUES $values"
        $dest_cmd_.CommandText = $postgresCommand
        $dest_cmd_.ExecuteNonQuery()
        $vals = @()
    }
}
Bad code, I admit; any advice on code composition is welcome.
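For reference, this is the kind of bulk load I mean on the SQL Server side: a minimal SqlBulkCopy sketch, where the destination connection string and table name are placeholders, and which only works against SQL Server rather than a generic ODBC target:

# Minimal SqlBulkCopy sketch (SQL Server destination only; names are placeholders)
Add-Type -AssemblyName System.Data

$srcTable = New-Object System.Data.DataTable
$srcTable.Load($src_cmd.ExecuteReader())          # reuse the DataTable loaded from the source

$destConnString = "Server=myserver;Database=mydb;Integrated Security=SSPI;"
$bulk = New-Object System.Data.SqlClient.SqlBulkCopy($destConnString)
$bulk.DestinationTableName = "dbo.MyTable"
$bulk.BatchSize = 1000
$bulk.WriteToServer($srcTable)                    # streams all rows to the server in bulk
$bulk.Close()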
You can use the Get-OdbcDsn cmdlet to retrieve the ODBC DSNs configured on the machine and use one of them with a query:
$conn = New-Object System.Data.Odbc.OdbcConnection
$conn.ConnectionString = "DSN=$dsn;"
$cmd = New-Object System.Data.Odbc.OdbcCommand($query, $conn)
$conn.Open()
$cmd.ExecuteNonQuery()
$conn.Close()
https://www.andersrodland.com/working-with-odbc-connections-in-powershell/
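For illustration (the DSN name below is hypothetical), you can pick the DSN up from Get-OdbcDsn and drop it into the connection string above:

# Enumerate the DSNs defined on this machine, then pick one by name (the name is hypothetical)
Get-OdbcDsn | Select-Object Name, DriverName, Platform
$dsn = (Get-OdbcDsn -Name "MyPostgresDsn").Name   # plugs into "DSN=$dsn;" above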
But the ODBC provider doesn't do bulk copy:
https://learn.microsoft.com/en-us/sql/relational-databases/native-client-odbc-bulk-copy-operations/performing-bulk-copy-operations-odbc?view=sql-server-ver15
I know this post is not new, but I've been fiddling around looking for a solution and also found nothing; however, this post gave me a couple of insights.
First: there is no such thing as 'bad code'. If it works, it's not bad; heck, even if it didn't work, it may have helped with something...
Alright, what I did is not the best solution, but I'm trying to import Active Directory data into PostgreSQL, so...
I noticed that you're targeting pgsql as well, so you can use the COPY statement.
https://www.postgresql.org/docs/9.2/sql-copy.html
https://www.postgresqltutorial.com/import-csv-file-into-posgresql-table/
In my case I used it with a CSV file:
*Assuming you have the PostgreSQL ODBC driver installed
$DBConn = New-Object System.Data.Odbc.OdbcConnection
# Note: ConvertFrom-SecureString emits an encrypted standard string, not the plain-text password
# (PowerShell 7+ has -AsPlainText); adjust this to however you want to pass the password.
$DBConnectionString = "Driver={PostgreSQL UNICODE(x64)};Server=$ServerInstance;Port=$Port;Database=$Database;Uid=$Username;Pwd=$(ConvertFrom-SecureString -SecureString $Password);"
$DBConn.ConnectionString = $DBConnectionString
try
{
    # Collect the AD computer objects to import
    $ADFObject = @()
    $ADComputers = Get-ADComputer -Filter * -SearchBase "OU=Some,OU=OrgU,OU=On,DC=Domain,DC=com" -Properties Description,DistinguishedName,Enabled,LastLogonTimestamp,modifyTimestamp,Name,ObjectGUID | Select-Object Description,DistinguishedName,Enabled,LastLogonTimestamp,modifyTimestamp,Name,ObjectGUID
    foreach ($ADComputer in $ADComputers) {
        # Map the boolean Enabled flag to 1/0 for the database
        switch ($ADComputer.Enabled) {
            $true {
                $ADEnabled = 1
            }
            $false {
                $ADEnabled = 0
            }
        }
        $ADFObject += [PSCustomObject] @{
            ADName               = $ADComputer.Name
            ADInsert_Time        = Get-Date
            ADEnabled            = $ADEnabled
            ADDistinguishedName  = $ADComputer.DistinguishedName
            ADObjectGUID         = $ADComputer.ObjectGUID
            ADLastLogonTimestamp = [datetime]::FromFileTime($ADComputer.LastLogonTimestamp)
            ADModifyTimestamp    = $ADComputer.modifyTimestamp
            ADDescription        = $ADComputer.Description
        }
    }
    # Write the objects to a temporary CSV and copy it into the PostgreSQL container
    $ADFObject | Export-Csv $Env:TEMP\TempPsAd.csv -Delimiter ',' -NoTypeInformation
    docker cp $Env:TEMP\TempPsAd.csv postgres_docker:/media/TempPsAd.csv
    # Bulk-load the CSV with COPY
    $DBConn.Open()
    $DBCmd = $DBConn.CreateCommand()
    $DBCmd.CommandText = @"
COPY AD_Devices (ADName,ADInsert_Time,ADEnabled,ADDistinguishedName,ADObjectGUID,ADLastLogonTimestamp,ADModifyTimestamp,ADDescription)
FROM '/media/TempPsAd.csv'
DELIMITER ','
CSV HEADER
"@
    $DBCmd.ExecuteReader()
    $DBConn.Close()
    # Clean up the temporary file on both sides
    docker exec postgres_docker rm -rf /media/TempPsAd.csv
    Remove-Item $Env:TEMP\TempPsAd.csv -Force
}
catch
{
    Write-Error "$($_.Exception.Message)"
    continue
}
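A possible alternative I have not tried here (assuming psql is installed on the host and the container's port is published as $Port): client-side \copy can read the CSV straight from the host, so the docker cp step is not needed:

# Hypothetical sketch: psql's client-side \copy reads the CSV from the host machine
psql "host=localhost port=$Port dbname=$Database user=$Username" `
    -c "\copy AD_Devices (ADName,ADInsert_Time,ADEnabled,ADDistinguishedName,ADObjectGUID,ADLastLogonTimestamp,ADModifyTimestamp,ADDescription) FROM '$Env:TEMP\TempPsAd.csv' DELIMITER ',' CSV HEADER"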
Hope it helps!
Cheers!
I'm struggling to use the Power BI API to programmatically download multiple reports in PDF format. I've written a short PowerShell script that connects to my account, produces a list of all available reports, and exports the list as a CSV. As there is no 'Export-PDF' option, is there a workaround? Here is my script below.
Connect-PowerBIServiceAccount
$myWorkspaces = Get-PowerBIWorkspace
Write-Host "The current user has" $myWorkspaces.Count "workspaces."
$Reports = ForEach ($workspace in $myWorkspaces)
{
    Write-Host $workspace.Name
    ForEach ($report in (Get-PowerBIReport -Scope Individual -WorkspaceId $workspace.Id))
    {
        [pscustomobject]@{
            WorkspaceID     = $workspace.Id
            WorkspaceName   = $workspace.Name
            ReportID        = $report.Id
            ReportName      = $report.Name
            ReportURL       = $report.WebUrl
            ReportDatasetID = $report.DatasetId
        }
    }
}
Write-Host "The current user has" $Reports.Count "reports."
$Reports | Export-Csv -Path C:\Users\jackbolshaw\Desktop\eTech\Test\report.csv -NoTypeInformation
Disconnect-PowerBIServiceAccount
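One possible workaround is the REST 'Export To File' API, reached through Invoke-PowerBIRestMethod. The sketch below is a rough, unverified outline: the endpoint paths, the status values, and the Premium/Embedded capacity requirement are all assumptions that should be checked against the current Power BI documentation.

# Rough, unverified sketch: export one report to PDF via the "Export To File" REST API
# (requires a capacity-backed workspace; endpoint paths and status values are assumptions)
$groupId  = $workspace.Id
$reportId = $report.Id

# Start the export job
$export = Invoke-PowerBIRestMethod -Method Post -Body '{"format":"PDF"}' `
    -Url "groups/$groupId/reports/$reportId/ExportTo" | ConvertFrom-Json

# Poll until the job completes
do {
    Start-Sleep -Seconds 5
    $status = Invoke-PowerBIRestMethod -Method Get `
        -Url "groups/$groupId/reports/$reportId/exports/$($export.id)" | ConvertFrom-Json
} while ($status.status -in 'NotStarted','Running')

# Download the finished file to disk
Invoke-PowerBIRestMethod -Method Get `
    -Url "groups/$groupId/reports/$reportId/exports/$($export.id)/file" `
    -OutFile "$env:TEMP\$($report.Name).pdf"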
I've built a PowerShell script that lets me run and email the results of SQL queries read from a SQL file:
function global:readscript($path)
{
    # Using UTF7 encoding to allow SELECTs with accented/French/Russian/Chinese/etc. characters
    $inenc = [System.Text.Encoding]::UTF7
    $reader = New-Object System.IO.StreamReader($path, $inenc)
    $finalquery = ""
    # Note: lines are concatenated without newlines, so the file must not contain "--" line comments
    while ($null -ne ($line = $reader.ReadLine()))
    {
        $finalquery += $line
    }
    $reader.Close()
    return $finalquery
}
function global:get-result($query)
{
    $oracleconnection = New-Object Oracle.ManagedDataAccess.Client.OracleConnection
    $oracleconnection.ConnectionString = $connectionstring
    $oracleconnection.Open()
    $oraclecommand = $oracleconnection.CreateCommand()
    $oraclecommand.CommandText = $query
    $reader = $oraclecommand.ExecuteReader()
    #...etc
}
$scriptquery = readscript "d:\mysqlquery.sql"
get-result $scriptquery
Everything is working fine so far, except for this one SQL script that contains the "+" sign for a calculation.
Let's say the file mysqlquery.sql contains a line such as:
(SELECT COUNT(a.ID)) + (SELECT COUNT(b.ID))
I can see in the console it's being translated to
(SELECT COUNT(a.ID)) (SELECT COUNT(b.ID))
and of course it throws the annoying exception "missing right parenthesis".
How do I escape this plus sign when reading it from a text file?
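One possible cause to check (an assumption, not something established in this post): UTF-7 uses '+' as the shift character that starts an encoded sequence, so decoding an ordinary text file with [System.Text.Encoding]::UTF7 can silently swallow a bare '+'. A minimal sketch of the reader using UTF-8 instead, which still handles accented/Cyrillic/Chinese text and leaves '+' untouched:

function global:readscript($path)
{
    # UTF-8 covers the same accented/Cyrillic/Chinese characters without treating '+' specially
    $inenc = [System.Text.Encoding]::UTF8
    return [System.IO.File]::ReadAllText($path, $inenc)
}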
Good morning, Stack Overflow. I have a PowerShell script that executes a SQL query against an Oracle database and then passes the results to a local shell command. It works, mostly. What is happening is that some of the results are being dropped, and the only significance I can see about these is that they have a couple of columns with null values (but only 2 of the 8 columns being returned). When the query is executed in SQL Developer I get all expected results. This issue applies to the $eventcheck switch; the $statuscheck switch works fine. The PowerShell script is below:
param(
    [parameter(mandatory=$True)]$username,
    [parameter(mandatory=$True)]$password,
    $paramreport,
    $paramsite,
    [switch]$eventcheck,
    [switch]$statuscheck
)

$qry1 = Get-Content .\vantageplus_processing.sql

$qry2 = @"
select max(TO_CHAR(VP_ACTUAL_RPT_DETAILS.ETLLOADER_OUT,'YYYYMMDDHH24MISS')) Completed
from MONITOR.VP_ACTUAL_RPT_DETAILS
where VP_ACTUAL_RPT_DETAILS.REPORTNUMBER = '$($paramreport)' and VP_ACTUAL_RPT_DETAILS.SITE_NAME = '$($paramsite)'
order by completed desc
"@

$connString = @"
Provider=OraOLEDB.Oracle;Data Source=(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST="HOST")(PORT="1521"))
(CONNECT_DATA=(SERVICE_NAME="SERVICE")));User ID="$username";Password="$password"
"@

function Get-OLEDBData ($connectstring, $sql) {
    $OLEDBConn = New-Object System.Data.OleDb.OleDbConnection($connectstring)
    $OLEDBConn.Open()
    $readcmd = New-Object System.Data.OleDb.OleDbCommand($sql, $OLEDBConn)
    $readcmd.CommandTimeout = '300'
    $da = New-Object System.Data.OleDb.OleDbDataAdapter($readcmd)
    $dt = New-Object System.Data.DataTable
    [void]$da.Fill($dt)
    $OLEDBConn.Close()
    return $dt
}

if ($eventcheck)
{
    $output = Get-OLEDBData $connString $qry1
    ForEach ($lines in $output)
    {
        start-process -NoNewWindow -FilePath msend.exe -ArgumentList @"
-n bem_snmp01 -r CRITICAL -a CSG_VANTAGE_PLUS -m "The report $($lines.RPT) for site $($lines.SITE) has not been loaded by $($lines.EXPDTE)" -b "vp_reportnumber='$($lines.RPT)'; vp_sitename='$($lines.SITE)'; vp_expectedcomplete='$($lines.SIMEXPECTED)'; csg_environment='Production';"
"@ # KEEP THIS TERMINATOR AT THE BEGINNING OF LINE
    }
}

if ($statuscheck)
{
    $output = Get-OLEDBData $connString $qry2
    # $output | foreach {$_.completed}
    write-host -nonewline $output.completed
}
So that you can see the data, below is a CSV output from Oracle SQL Developer with the ACTUAL results of the query referenced by my script. Of these results, lines 4, 5, 6, 7, 8, and 10 are the only ones being passed along in the ForEach loop; the others are not even captured in the $output array. If anyone can advise a method for getting all of the results passed along, I would appreciate it.
"SITE","RPT","LSDTE","EXPDTE","CIMEXPECTED","EXPECTED_FREQUENCY","DATE_TIMING","ETME"
"chrcse","CPHM-054","","2014/09/21 12:00:00","20140921120000","MONTHLY","1",
"chrcse","CPSM-226","","2014/09/21 12:00:00","20140921120000","MONTHLY","1",
"dsh","CPSD-176","2014/09/28 23:20:04","2014/09/30 04:00:00","20140930040000","DAILY","1",1.41637731481481481481481481481481481481
"dsh","CPSD-178","2014/09/28 23:20:11","2014/09/30 04:00:00","20140930040000","DAILY","1",1.4162962962962962962962962962962962963
"exp","CPSM-610","2014/08/22 06:42:10","2014/09/21 09:00:00","20140921090000","MONTHLY","1",39.10936342592592592592592592592592592593
"mdc","CPKD-264","2014/09/24 00:44:32","2014/09/30 04:00:00","20140930040000","DAILY","1",6.35771990740740740740740740740740740741
"nea","CPKD-264","2014/09/24 01:00:31","2014/09/30 03:00:00","20140930030000","DAILY","1",6.34662037037037037037037037037037037037
"twtla","CPOD-034","","2014/09/29 23:00:00","20140929230000","DAILY","0",
"twtla","CPPE-002","2014/09/29 02:40:35","2014/09/30 06:00:00","20140930060000","DAILY","1",1.27712962962962962962962962962962962963
"twtla","CPXX-004","","2014/09/29 23:00:00","20140929230000","DAILY","0",
It appears, actually, that a comment in the query was somehow causing this issue. I removed it and the results went back to normal. I have no idea why this would be the case, unless it has to do with the way the import works (does it import everything as one line?). Either way, the results are normal now.
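That would be consistent with how the script reads the file (my assumption from the code above, not something verified in your environment): Get-Content returns an array of lines, and when that array is bound to the OleDbCommand constructor's string parameter, PowerShell joins the elements with spaces, so the whole file does become one line and a "--" comment comments out everything after it. A small sketch that preserves the line breaks:

# Read the script as a single string that keeps its newlines,
# so a "--" comment only comments out its own line
$qry1 = (Get-Content .\vantageplus_processing.sql) -join "`n"

# PowerShell 3.0+ equivalent:
# $qry1 = Get-Content .\vantageplus_processing.sql -Raw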
Objective: robocopy from multiple machines on the network to a network share, using variables for both the machine name and the currently logged-on user.
What I have: a txt file with a list of computer names.
Issue: I cannot get the foreach to work with the .split("\")[1] that I use on the username variable to remove the domain prefix, so that I can use the output in the robocopy path,
something like
robocopy "\\$computername\c$\documents and settings\$username\backup" "\\networkshare\backup\$username\backup"
gives me the error
You cannot call a method on a null-valued expression.
At C:\Scripts\Test\backup.ps1:13 char:2
Here's what I have so far. Can somebody help please?
function Get-LoggedIn {
[CmdletBinding()]
param (
[Parameter(Mandatory=$True)]
[string[]]$computername
)
foreach ($pc in $computername){
$logged_in = (gwmi win32_computersystem -COMPUTER $pc).username
$name = $logged_in.split("\")[1]
"{1}" -f $pc,$name
}
}
$computers = Get-Content "C:\Scripts\testcomputers.txt"
foreach ($computer in $computers) {
$users = Get-LoggedIn $computer
}
$SourceFolder = "\\$computer\c$\users\$users\desktop"
$DestinationFolder = "\\networkshare\backups\$users\backup\desktop"
$Logfile = "\\networkshare\backups\$users\backup\backuplog.txt"
Robocopy $SourceFolder $DestinationFolder /E /R:1 /W:1 /LOG:$Logfile
I see multiple errors here. You're not running the copy commands inside the foreach loop. Also, the username property received from WMI can often be in the following format:
domain\computer\username (or computer\domain\username; I'm unsure since I'm on a non-domain workstation right now)
Anyway, the username is always the last part, so get it by using index [-1] instead.
Updated script (with indents!):
function Get-LoggedIn {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory=$True)]
        [string[]]$computername
    )
    foreach ($pc in $computername){
        $logged_in = (gwmi win32_computersystem -COMPUTER $pc).username
        $name = $logged_in.split("\")[-1]
        "{1}" -f $pc,$name
    }
}

$computers = Get-Content "C:\Scripts\testcomputers.txt"

foreach ($computer in $computers) {
    $users = Get-LoggedIn $computer
    $SourceFolder = "\\$computer\c$\users\$users\desktop"
    $DestinationFolder = "\\networkshare\backups\$users\backup\desktop"
    $Logfile = "\\networkshare\backups\$users\backup\backuplog.txt"
    & Robocopy $SourceFolder $DestinationFolder /E /R:1 /W:1 /LOG:$Logfile
}