PowerShell | How to make an if statement that compares a date from a SQL table with a date given in the script

Background information: I am writing a PowerShell script to clean up a SQL Server database, and I have a question about how to write an if statement in this specific case.
The script works as follows:
The script connects to the SQL Server database
It gets the current date minus 5 days and converts it to Unix epoch time
It takes that epoch date and cuts off the decimals
Then it adds three zeros, because the database requires 13 digits (not sure why whoever set up the database decided on this, but it is what it is)
Then comes the if statement <-- my question
Define the SQL query
Run the SQL query
The question: I want an if statement that compares the converted Unix epoch date with the epoch date from the SQL Server database table, and if the date is less than or equal, the entries are deleted.
My code is currently as follows:
#### Variables ####
# Sorry for having to censor the variable inputs, wouldn't wanna leak this sensitive information everywhere.
$UID = 'uid'
$PASS = 'pass'
$SI = 'serverinstance'
$DB = 'database'
$Table = 'table'
$Table_age = 'date_column'
$Age = 5 # Maximum age of an entry
$DateDecimal = ',' # Depends on the PC; this might have to be changed to a period (.) instead
$EQLcompat = '000' # EQL Database compatibility for the Epoch time
#### Get date and convert to Epoch ####
$Date = Get-Date $((Get-Date).AddDays(-$Age).ToUniversalTime()) -UFormat +%s
#Substring the decimal values
$DateCut = $Date.Substring(0, $Date.IndexOf($DateDecimal))
#Make Epoch date EQL Database compatible
$unixTime = $DateCut + $EQLcompat
#### if statement in Question ####
if ($Table -le $unixTime){
# SQL Query
$SQLquery = "
SELECT TOP (1) *
FROM [$Table]
"}
else{
Write-Host 'Nope, not this way'
}
#### Run SQL Query ####
Invoke-Sqlcmd -serverinstance $SI -Database $DB -username $UID -password $PASS -query $SQLquery
Does anyone have an idea how to write this if statement so it works? Thank you very much!
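As an editorial aside: the decimal-separator juggling in the script above can be skipped entirely on PowerShell 5.1 or later (a sketch, assuming .NET 4.6+ is available), since [DateTimeOffset] can emit a 13-digit millisecond epoch directly:

```powershell
# 13-digit Unix epoch (milliseconds), $Age days in the past, in UTC
$Age = 5
$unixTime = [DateTimeOffset]::UtcNow.AddDays(-$Age).ToUnixTimeMilliseconds()
```

Note this yields real milliseconds, while the original code pads whole seconds with three literal zeros; for a less-than-or-equal cutoff comparison the difference in the last three digits is irrelevant.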
EDIT 1:
I've tried if (+ $Table -le $unixTime){, however this gives me the error:
Cannot convert value "databasename" to type "System.Int32". Error: "Input string was not in a correct format."
At C:\Users\fabstefanm.GEMEENTENET\Documents\Code\SQL Table Cleaner.ps1:35 char:5
+ if (+ $Table -le $unixTime){
+ ~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidArgument: (:) [], RuntimeException
+ FullyQualifiedErrorId : InvalidCastFromStringToInteger
EDIT 2:
I meant SQL Server, not MySQL; my bad, I got the two names crossed.
EDIT 3:
I've changed the if statement to this:
#### if statement ####
if ($DB.output -le $unixTime){
# SQL Query
$SQLquery = "SELECT TOP (1) * FROM [$Table] WHERE [$Table_age] <= [$unixTime]"
Write-Host 'Yes, this way'
}
Now this does feel like a step closer to solving the issue; however, the error I now get is Invoke-Sqlcmd : Invalid column name '1659532862000'. So it seems I am not yet in the right place. Does anyone know how to fix this?

I changed the if statement around slightly and now it all works; the key change is that $unixTime is no longer wrapped in square brackets, so SQL Server compares it as a value instead of treating it as a column name. Of course it's just a select query and not yet the delete query, but that's not a big change.
The new if statement + SQL query is this:
#### if statement ####
if ($DB.output -le $unixTime){
# SQL Query
$SQLquery = "SELECT TOP (10) [$Table_age] FROM [$DB].[dbo].[$Table] WHERE [$Table_age] >= $unixTime"
Write-Host 'Yes, this way'
}
So if anyone else ever comes across the need to use PowerShell to run a task and then a SQL query gated by an if statement, I hope this can be of some assistance.
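For completeness, the eventual delete would presumably look like the sketch below (untested, reusing the variables defined earlier in the script; note it uses <= so that entries at or below the cutoff are removed, per the original goal):

```powershell
# Delete every row whose epoch timestamp is at or below the cutoff
$SQLquery = "DELETE FROM [$DB].[dbo].[$Table] WHERE [$Table_age] <= $unixTime"
Invoke-Sqlcmd -ServerInstance $SI -Database $DB -Username $UID -Password $PASS -Query $SQLquery
```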

Related

Trouble converting file stored in SQL table to file stored on disk (missing content)

I have a database table with following columns (among others):
emailAttachment
emailAttachmentFileName
I am using the following PowerShell to convert a .csv file to binary, to store in emailAttachment (defined as varbinary(max)):
$("0x$(([Byte[]](Get-Content -Encoding Byte -Path c:\temp\test.csv) | ForEach-Object ToString X2) -join '')")
As far as I can tell, that works fine.
To retrieve the data, I run this SQL query:
$query = @'
SELECT
*
FROM
[dbo].[mydatabase].[mytable]
WHERE
id = 1
'@
$file = Invoke-Sqlcmd -Query $query -ServerInstance server.domain.local -Credential (Get-Credential)
Now that I have the bytes from SQL, I want to write it to a file:
[IO.File]::WriteAllBytes($file.emailAttachmentFileName, $file.emailAttachment)
That creates the file with the correct name and some of the rows of the .csv, but not all of them (63 instead of 464).
I also tried both, but they have the same result:
[IO.File]::WriteAllLines($file.emailAttachmentFileName, [System.Text.Encoding]::ASCII.GetString($file.emailAttachment))
[IO.File]::WriteAllText($file.emailAttachmentFileName, [System.Text.Encoding]::ASCII.GetString($file.emailAttachment))
What am I doing wrong here?
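One avenue worth checking (an educated guess, not verified against this setup): Invoke-Sqlcmd truncates binary columns to -MaxBinaryLength bytes, and I believe the default is quite small (1024), which would explain getting only the first chunk of the file. Re-running the retrieval with an explicit limit might return the full attachment:

```powershell
# Raise the binary truncation limit before writing the bytes out
$file = Invoke-Sqlcmd -Query $query -ServerInstance server.domain.local `
    -Credential (Get-Credential) -MaxBinaryLength ([int]::MaxValue)
[IO.File]::WriteAllBytes($file.emailAttachmentFileName, $file.emailAttachment)
```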

Find a character in a string using Powershell?

I know I could use Contains to find it, but it doesn't work.
Full story:
I have to get the PartNo, Ver, and Rev from a SQL db and check whether they occur in the first line of a text file. I get the first line of the file and store it in $EiaContent.
The PartNo is associated with MAFN, as in $partNo = Select PartNo Where MAFN=xxx. Most of the time the MAFN returns one PartNo, but in some cases one MAFN can have multiple PartNo values. So the query returns multiple PartNo values (PartNo_1, PartNo_2, PartNo_3, and PartNo_4), but only one of them will be in the text file.
The issue is that each of these PartNo values is treated as a single element in PowerShell; $partNo.Length is 4. Therefore my check If ($EiaContent.Contains("*$partNo*")) fails, and it shouldn't in this case, because I can see that one of the PartNo values is mentioned in the file. Also, Contains wouldn't work even if there were only one PartNo. I used -like, as in If ($EiaContent -like "*$partNo*"), to match the PartNo and it worked, but it doesn't work when there are multiple PartNo values.
The data type of $partNo is string, and so is $EiaContent. The data type of PartNo in SQL is varchar(50); the collation is SQL_Latin1_General_CP1_CI_AS.
I am using PowerShell Core 7.2 and SQL Server 2005.
Code:
$EiaContent = (Get-Content $aidLibPathFolder\$folderName\$fileName -TotalCount 1)
Write-host $EiaContent
#Sql query to get the Part Number
$partNoQuery = "SELECT PartNo FROM [NML_Sidney].[dbo].[vMADL_EngParts] Where MAFN = $firstPartTrimmed"
$partNoSql = Invoke-Sqlcmd -ServerInstance $server -Database $database -Query $partNoQuery
#Eliminate trailing spaces
$partNo = $partNoSql.PartNo.Trim()
If ($EiaContent.Contains("*$partNo*")) {
Write-Host "Part Matches"
}
Else {
#Send an email stating the PartNo discrepancy
}
Thank you in advance to those who try to help.
EDIT
Screenshot
[1]: https://i.stack.imgur.com/hIqJB.png
A1023 A1023MD C0400 C0400MD is the output of the variable $partNo and O40033( C0400 REV N VER 004, 37 DIA 4.5 BRAKE DRUM OP3 ) is the output of the variable $EiaContent
So you first have to split $partNo and then for each sub string of $partNo, search for it in $EiaContent:
If ($partNo -split ' ' | Where-Object { $EiaContent.Contains( $_ ) }) {
Write-Host "Part Matches"
}
This is the generic form that most people are used to. We can simplify the query using the unary form of -split (as we split on the default separator) and use the intrinsic array method .Where() which is faster as it does not involve pipeline overhead.
If ((-split $partNo).Where{ $EiaContent.Contains( $_ ) }) {
Write-Host "Part Matches"
}
As correctly noted in comments, wildcards are not supported by the .Contains() string method.
Wildcards are supported only by the PowerShell -like operator. The following example is just for educational purposes, I wouldn't use it in your case as .Contains() string method is simpler and faster.
If ((-split $partNo).Where{ $EiaContent -like "*$_*" }) {
Write-Host "Part Matches"
}
Note that -contains would not be suitable here. A common misconception is that -contains does a substring search, when the LHS operand is a string. It doesn't! The operator tests whether a collection (such as an array) on the LHS contains the value given on the RHS.
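A minimal illustration of the difference, with values chosen to mirror the screenshot above:

```powershell
$parts = 'A1023', 'A1023MD', 'C0400', 'C0400MD'   # an array of part numbers
$line  = 'O40033( C0400 REV N VER 004 )'          # a single string

$parts -contains 'C0400'    # True  - membership test against a collection
$line  -contains 'C0400'    # False - a scalar LHS is treated as a one-element collection
$line.Contains('C0400')     # True  - actual substring search
```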

ROracle Errors When Trying to Use Bound Parameters

I'm using ROracle on a Win7 machine running the following R version:
platform x86_64-w64-mingw32
arch x86_64
os mingw32
system x86_64, mingw32
status
major 3
minor 1.1
year 2014
month 07
day 10
svn rev 66115
language R
version.string R version 3.1.1 (2014-07-10)
nickname Sock it to Me
Eventually, I'm going to move the script to a *nix machine, cron it, and run it with RScript.
I want to do something similar to:
select * from tablename where 'thingy' in ('string1','string2')
This would return two rows with all columns in SQLDeveloper (or Toad, etc).
(Ultimately, I want to pull results from one DB into a single column in a data.frame then use those results to loop through
and pull results from a second db, but I also need to be able to do just this function as well.)
I'm following the documentation for ROracle from here.
I've also looked at this (which didn't get an answer):
Bound parameters in ROracle SELECT statements
When I attempt the query from ROracle, I get two different errors, depending on whether I try a dbGetQuery() or dbSendQuery().
As background, here are the versions, queries and data I'm using:
Driver name: Oracle (OCI)
Driver version: 1.1-11
Client version: 11.2.0.3.0
The connection information is standard:
library(ROracle)
ora <- dbDriver("Oracle")
dbcon <- dbConnect(ora, username = "username", password = "password", dbname = "dbnamefromTNS")
These two queries return the expected results:
rs_send <- dbSendQuery(dbcon, "select * from tablename where columname_A = 'thingy' and rownum <= 1000")
rs_get <- dbGetQuery(dbcon, "select * from tablename where columname_A = 'thingy' and rownum <= 1000")
That is to say, 1000 rows from tablename where 'thingy' exists in columnname_A.
I have a data.frame of one column, with two rows.
my.data = data.frame(RANDOM_STRING = as.character(c('string1', 'string2')))
and str(my.data) returns this:
str(my.data)
'data.frame': 2 obs. of 1 variable:
$ RANDOM_STRING: chr "string1" "string2"
my attempted queries are:
nope <- dbSendQuery(dbcon, "select * from tablename where column_A = 'thingy' and widget_name =:1", data = data.frame(widget_name =my.data$RANDOM_STRING))
which gives me an error of:
Error in .oci.SendQuery(conn, statement, data = data, prefetch = prefetch, :
bind data does not match bind specification
and
not_this_either <- dbGetQuery(dbcon, "select * from tablename where column_A = 'thingy' and widget_name =:1", data = data.frame(widget_name =my.data$RANDOM_STRING))
which gives me an error of:
Error in .oci.GetQuery(conn, statement, data = data, prefetch = prefetch, :
bind data has too many rows
I'm guessing that my problem is in the data=(widget_name=my.data$RANDOM_STRING) part of the queries, but haven't been able to rubber duck my way through it.
Also, I'm very curious as to why I get two separate and different errors depending on whether the queries use the send (and fetch later) format or the get format.
If you like the tidyverse, there's a slightly more compact way to achieve the above using purrr:
library(ROracle)
library(purrr)
ora <- dbDriver("Oracle")
con <- dbConnect(ora, username = "username", password = "password", dbname = "yourdbnamefromTNSlist")
yourdatalist <- c(12345, 23456, 34567)
output <- map_df(yourdatalist, ~ dbGetQuery(con, "select * from YourTableNameHere where YOURCOLUMNNAME = :d", .x))
Figured it out.
It wasn't a problem with Oracle or ROracle (I'd suspected this) but with my R code.
I stumbled over the answer trying to solve another problem.
This answer about "dynamic strings" was the thing that got me moving towards a solution.
It doesn't fit exactly, but close enough to rubberduck my way to an answer from there.
The trick is to wrap the whole thing in a function and run an ldply (from the plyr package) on it:
library(ROracle)
library(plyr)
ora <- dbDriver("Oracle")
con <- dbConnect(ora, username = "username", password = "password", dbname = "yourdbnamefromTNSlist")
yourdatalist <- c(12345, 23456, 34567)
thisfinallyworks <- function(x) {
dbGetQuery(con, "select * from YourTableNameHere where YOURCOLUMNNAME = :d", data = x)
}
ldply(yourdatalist, thisfinallyworks)
row1 of results where datapoint in YOURCOLUMNNAME = 12345
row2 of results where datapoint in YOURCOLUMNNAME = 23456
row3 of results where datapoint in YOURCOLUMNNAME = 34567
etc

Yii: Get Column names from dynamic query

Is it possible to execute a query and just get the column names of the returned result set.
I need the column names since the query is dynamic and I don't know the names of the columns.
I will use these column names for sorting when executing the query again.
You could refer to my Previous question to get the idea why I need it.
Thanks.
Depending on which PDO driver is in use, you can get column names from PDOStatement::getColumnMeta, once the statement has been executed.
Here is one way it can be done in Yii 1.1:
$command = Yii::app()->db
->createCommand('SELECT "." `Stop!`, current_time `Hammer Time`');
$reader = $command->query();
$sth = $command->getPdoStatement();
for ($i = 0; $i < $sth->columnCount(); $i++) {
$col = $sth->getColumnMeta($i);
print $col['name'].' ';
}

Multi-table SQL query: MAX() on a new table field hangs the query

I am using BIDS to connect to a Progress DB through an ODBC connection:
This query works fine
SELECT
PUB."master"."app-number",
...
PUB."property"."prop-id",
FROM
PUB."master" master JOIN PUB."property" property ON
master."lt-acnt" = property."lt-acnt"
...
LEFT OUTER JOIN PUB."arm" arm ON
master."lt-acnt" = arm."lt-acnt"
WHERE
...
However, I need to add some additional fields from another table. The problem is that I only need the information from the last time these new fields were updated.
I have tried:
SELECT
yt."app-number"
...
yt."disc-adj-tot",
yt."rt-adj-nbr",
yt."base-disc-per"
FROM (
SELECT PUB."master"."app-number",
...
PUB."lt-rt-adj-hdr"."disc-adj-tot",
PUB."lt-rt-adj-hdr"."rt-adj-nbr",
PUB."lt-rt-adj-hdr"."base-disc-per"
FROM PUB."master" master JOIN PUB."property" property ON
master."lt-acnt" = property."lt-acnt"
...
JOIN PUB."lt-rt-adj-hdr" lt_rt_adj_hdr ON
lt_master."lt-acnt" = lt_rt_adj_hdr."lt-acnt") yt
INNER JOIN(
SELECT "app-number",
MAX("rt-adj-nbr") "rt-adj-nbr"
FROM ( PUB."lt-master" lt_master JOIN
PUB."lt-rt-adj-hdr" lt_rt_adj_hdr ON
lt_master."lt-acnt" = lt_rt_adj_hdr."lt-acnt")
GROUP BY "app-number") ss on yt."app-number" = ss."app-number" and
yt."rt-adj-nbr" = ss."rt-adj-nbr"
WHERE ...
This query just hangs and will not return results unless a very simple WHERE clause like "WHERE yt."app-number" = 123456" is used. I am completely stuck.
Has the owner of the Progress DB ever run "update statistics"? The Progress SQL query optimizer needs good statistics in order to execute efficiently. Progress applications usually use the 4GL engine rather than SQL, so in many cases the administrator is not keeping the SQL statistics updated, which often leads to very poor SQL query performance.
From the 4GL side the admin can use this script to generate a program that will do the job:
/* genUpdateSQL.p
*
* mpro dbName -p util/genUpdateSQL.p -param "tmp/updSQLstats.sql"
*
* sqlexp -user userName -password passWord -db dbName -S servicePort -infile tmp/updSQLstats.sql -outfile tmp/updSQLstats.log
*
*/
output to value( ( if session:parameter <> "" then session:parameter else "updSQLstats.sql" )).
for each _file no-lock where _hidden = no:
put unformatted
"UPDATE TABLE STATISTICS AND INDEX STATISTICS AND ALL COLUMN STATISTICS FOR PUB."
'"' _file._file-name '"' ";"
skip
.
put unformatted "commit work;" skip.
end.
output close.
return.
Or, you could do it if you have sufficient privileges (just plug in your table name for _file._file-name).