Timestamp issues with PowerShell - SQL

I have a small PowerShell script that pulls the last hour of punch data from a SQL DB and then outputs that data to a .csv file. The script is working, but the timestamp comes out as
hh:mm:ss.xxx, and I need it to be only hh:mm. Any help would be greatly appreciated!
Below is the script and a snippet of the output:
sqlcmd -h-1 -S ZARIRIS\IRIS -d IA3000SDB -Q "SET NOCOUNT ON; Select Distinct TTransactionLog_1.DecisionTimeInterval,
TTransactionLog_1.UserID, TTransactionLog_1.OccurDateTime, TTransactionLog_1.StableTimeInterval
From TTransactionLog_1
Inner join TSystemLog1 On TTransactionLog_1.NodeID=TSystemLog1.NodeID
Inner join TUser On TTransactionLog_1.UserID=Tuser.UserID
where TSystemLog1.NodeID = 3 and TTransactionLog_1.OccurDateTime >= dateadd(HOUR, -1, getdate())" -s "," -W -o "C:\atr\karen\adminreport3.csv"
Get-Content "C:\ATR\Karen\adminreport3.csv" | ForEach-Object {$_ -replace "44444444","IN PUNCH"} | ForEach-Object {$_ -replace "11111111","OUT PUNCH"} | Set-Content "C:\ATR\Karen\punchreport1.csv" -Force
Output: (where I need the hh:mm format; it needs to read 12:08, not 12:08:19.000)
112213,2022-10-31 12:08:19.000,OUT PUNCH

It would probably be best if your script wrote out a date formatted the way you want in the first place,
but if that's not an option, you really should consider using Import-Csv and Export-Csv to manipulate the data instead.
If you don't want the standard quoted CSV output, please see this code to safely remove the quotes where possible.
Having said that, here's one way of doing it in a line-by-line fashion:
Get-Content "C:\ATR\Karen\adminreport3.csv" | ForEach-Object {
$line = $_ -replace "44444444","IN PUNCH" -replace "11111111","OUT PUNCH"
$fields = $line -split ','
# reformat the date by first parsing it out as DateTime object
$fields[1] = '{0:yyyy-MM-dd HH:mm}' -f [datetime]::ParseExact($fields[1], 'yyyy-MM-dd HH:mm:ss.fff',$null)
# or use regex on the date and time string as alternative
# $fields[1] = $fields[1] -replace '^(\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}).*', '$1'
# rejoin the fields with a comma
$fields -join ','
} | Set-Content "C:\ATR\Karen\punchreport1.csv" -Force
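For comparison, here is the same parse-then-reformat step as a minimal Python sketch (illustration only, not part of the PowerShell pipeline; the sample line is taken from the output above):

```python
from datetime import datetime

def trim_timestamp(field: str) -> str:
    """Parse 'yyyy-MM-dd HH:mm:ss.fff' and drop the seconds/milliseconds."""
    dt = datetime.strptime(field, "%Y-%m-%d %H:%M:%S.%f")
    return dt.strftime("%Y-%m-%d %H:%M")

line = "112213,2022-10-31 12:08:19.000,OUT PUNCH"
fields = line.split(",")
fields[1] = trim_timestamp(fields[1])
print(",".join(fields))  # 112213,2022-10-31 12:08,OUT PUNCH
```

Parsing to a real date value first (rather than chopping the string) has the side benefit of rejecting malformed timestamps instead of silently passing them through.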

Related

Trouble converting file stored in SQL table to file stored on disk (missing content)

I have a database table with following columns (among others):
emailAttachment
emailAttachmentFileName
I am using the following PowerShell to convert a .csv file to binary, to store in emailAttachment (defined as varbinary(max)):
$("0x$(([Byte[]](Get-Content -Encoding Byte -Path c:\temp\test.csv) | ForEach-Object ToString X2) -join '')")
As far as I can tell, that works fine.
To retrieve the data, I run this SQL query:
$query = @'
SELECT
*
FROM
[dbo].[mydatabase].[mytable]
WHERE
id = 1
'@
$file = Invoke-Sqlcmd -Query $query -ServerInstance server.domain.local -Credential (Get-Credential)
Now that I have the bytes from SQL, I want to write it to a file:
[IO.File]::WriteAllBytes($file.emailAttachmentFileName, $file.emailAttachment)
That creates the file with the correct name and some of the rows of the .csv, but not all of them (63 instead of 464).
I also tried both of the following, but they have the same result:
[IO.File]::WriteAllLines($file.emailAttachmentFileName, [System.Text.Encoding]::ASCII.GetString($file.emailAttachment))
[IO.File]::WriteAllText($file.emailAttachmentFileName, [System.Text.Encoding]::ASCII.GetString($file.emailAttachment))
What am I doing wrong here?
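As a sanity check on the conversion step described above: the byte-to-hex round trip itself is lossless, which a small Python sketch can demonstrate (illustration only; the sample bytes are made up, and this is not the PowerShell pipeline itself). If the round trip preserves every byte, the truncation must be happening at retrieval or write time, not in the hex encoding:

```python
data = b"col1,col2\r\n1,hello\r\n2,world\r\n"

# Encode as a SQL-style hex literal, the same shape the one-liner above builds
hex_literal = "0x" + "".join(f"{b:02X}" for b in data)

# Decode back to bytes
round_trip = bytes.fromhex(hex_literal[2:])

print(round_trip == data)  # True: nothing is lost in the hex encoding itself
```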

PowerShell - Can I scrape a SQL file and compare results?

I am fairly new to PowerShell and have run into a bit of a pickle.
I am trying to scrape a SQL script in txt file format. My goal is to check if every created (volatile) table is also dropped in the same file. And if not, output the name of the table which is not dropped in the SQL script.
On top of that I would like to check that the "drop" of the table comes AFTER "create volatile table" and not BEFORE, because that would be wrong syntax. Do you think it's somehow possible?
I tried to do it by extracting the lines of code where the create table string is, then using regex to get the name (string) of the table and saving it to a variable. I did the same with the "drop table...". Now I am trying to compare those two (string) variables by converting them to a list, object or whatever. I feel like I am in a dead end.
$vtMatch = Get-ChildItem "$packagepath\Scripts\" -Recurse | Select-String -Pattern "create multiset volatile table VT_","create multiset volatile table vt_"
$vtMatch = $vtMatch.line
$vt = $vtMatch | Select-String 'vt_(.*?)(?=,)' -AllMatches | Select-Object -Expand matches | ForEach-Object {$_.value}
$dropMatch = Get-ChildItem "$packagepath\Scripts\" -Recurse | Select-String -Pattern "drop table VT_","drop table vt_"
$dropMatch = $dropMatch.line
$drop = $dropMatch | Select-String 'vt_(.*?)(?=;)' -AllMatches | Select-Object -Expand matches | ForEach-Object {$_.value}
$missing = Compare-Object -ReferenceObject $vt -DifferenceObject $drop -Property item
$missing
The variable $vt contains these strings:
vt_R_transaction_Dm
vt_h_bank_account
vt_DD_Agreement_Detail_1
vt_DD_Agreement_Detail_2
vt_DM_Agreement_Detail_Prebase
vt_DM_Agreement_Detail_RWA
vt_DM_Agreement_Detail_1
vt_posted_transaction
vt_DM_Corp_Objective_Specific_MAT
vt_DM_Party_Detail
And variable $drop contains these strings:
vt_DD_Agreement_Detail_1
vt_DD_Agreement_Detail_2
vt_DM_Agreement_Detail_Prebase
vt_DM_Agreement_Detail_RWA
vt_DM_Agreement_Detail_1
vt_DM_Corp_Objective_Specific_MAT
vt_posted_transaction
vt_DM_Party_Detail
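The comparison itself boils down to a case-insensitive set difference over the two lists above. Note that plain strings have no `item` property, so `-Property item` in the Compare-Object call likely isn't doing what you expect. A minimal Python sketch of the intended check (illustration only, using the lists shown above):

```python
vt = ["vt_R_transaction_Dm", "vt_h_bank_account", "vt_DD_Agreement_Detail_1",
      "vt_DD_Agreement_Detail_2", "vt_DM_Agreement_Detail_Prebase",
      "vt_DM_Agreement_Detail_RWA", "vt_DM_Agreement_Detail_1",
      "vt_posted_transaction", "vt_DM_Corp_Objective_Specific_MAT",
      "vt_DM_Party_Detail"]
drop = ["vt_DD_Agreement_Detail_1", "vt_DD_Agreement_Detail_2",
        "vt_DM_Agreement_Detail_Prebase", "vt_DM_Agreement_Detail_RWA",
        "vt_DM_Agreement_Detail_1", "vt_DM_Corp_Objective_Specific_MAT",
        "vt_posted_transaction", "vt_DM_Party_Detail"]

# Case-insensitive set difference: tables created but never dropped
missing = {t.lower() for t in vt} - {t.lower() for t in drop}
print(sorted(missing))  # ['vt_h_bank_account', 'vt_r_transaction_dm']
```

For the create-before-drop ordering check, you could compare the line numbers of the two matches per table name rather than just their presence.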

When I convert to upper case, the azure-powershell command is not working as expected

In an Azure pipeline I have to give $env = prd in small letters, as it's being used by many other tasks. But the actual resource group names look like rg-e2-PRD-703. So I have used the commands below, but they're not giving me output:
## $env =prd is given in pipeline
$environment = "$(env)".ToUpper()
write-host $environment ## its printing PRD as expected
$getNIClist = Get-AzNetworkInterface | Where-Object {$_.ResourceGroupName -clike 'rg-*-$environment-*'} | Select-Object
You can use the command as below:
$aa = 'rg-*-' + $environment + '-*'
$getNIClist = Get-AzNetworkInterface | Where-Object {$_.ResourceGroupName -clike $aa} | Select-Object
or you can also change the single quotes to double quotes:
$getNIClist = Get-AzNetworkInterface | Where-Object {$_.ResourceGroupName -clike "rg-*-$environment-*"} | Select-Object
because variable expansion is not done inside single quotes.
Hope it helps~
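The difference comes down to whether $environment is expanded before the wildcard match runs. Once it is expanded, the pattern is an ordinary case-sensitive wildcard match, which can be sketched in Python with fnmatch standing in for -clike (illustration only; names taken from the question):

```python
from fnmatch import fnmatchcase

environment = "prd".upper()        # "PRD", as in the pipeline variable
pattern = f"rg-*-{environment}-*"  # expanded pattern: "rg-*-PRD-*"

# The expanded pattern matches the real resource group name
print(fnmatchcase("rg-e2-PRD-703", pattern))  # True

# The unexpanded literal never matches anything real
print(fnmatchcase("rg-e2-PRD-703", "rg-*-$environment-*"))  # False
```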

How to pass parameter into SQL file from UNIX script?

I'm looking to pass a parameter into a SQL file from my UNIX script. Unfortunately I'm having problems with it.
Please see UNIX script below:
#!/bin/ksh
############
# Functions
_usage() {
SCRIPT_NAME=XXX
if [ "$1" -ne 1 -o "$1" = "" -o "$1" = help -o "$1" = Help -o "$1" = HELP ]; then
echo "Usage: $SCRIPT_NAME [ cCode ]"
echo " - For example : $SCRIPT_NAME GH\n"
exit 1
fi
}
_initialise() {
cCode=$1
echo $cCode
}
# Set Variables
_usage $#
_initialise $1
# Main Processing
sql $DBNAME < test.sql $cCode > $PVNUM_LOGFILE
RETCODE=$?
# Check for errors within log file
if [[ $RETCODE != 0 ]] || grep 'E_' $PVNUM_LOGFILE
then
echo "Error - 50 - running test.sql. Please see $PVNUM_LOGFILE"
exit 50
fi
Please see SQL script (test.sql):
SELECT DISTINCT v1.*
FROM data_latest v1
JOIN temp_table t
ON v1.number = t.id
WHERE v1.code = '&1'
The error I am receiving when running my UNIX script is:
INGRES TERMINAL MONITOR Copyright 2008 Ingres Corporation
E_US0022 Either the flag format or one of the flags is incorrect,
or the parameters are not in proper order.
Anyone have any idea what I'm doing wrong?
Thanks!
NOTE: While I don't work with the sql command, I do routinely pass UNIX parameters into SQL template/script files when using the isql command line tool, so fwiw ...
The first thing you'll want to do is replace the &1 string with the value in the cCode variable; one typical method is to use sed to do a global search and replace of &1 with ${cCode}, eg:
$ cCode=XYZ
$ sed "s/\&1/${cCode}/g" test.sql
SELECT DISTINCT v1.*
FROM data_latest v1
JOIN temp_table t
ON v1.number = t.id
WHERE v1.code = 'XYZ' <=== &1 replaced with XYZ
NOTE: You'll need to wrap the sed code in double quotes so that the value of the cCode variable can be referenced.
Now, to get this passed into sql there are a couple options ... capture the sed output to a new file and submit that file to sql or ... [and I'm guessing this is doable with sql], pipe the sed output into sql, eg:
sed "s/\&1/${cCode}/g" test.sql | sql $DBNAME > $PVNUM_LOGFILE
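The substitution sed performs here is just a literal replace of &1 with the variable's value before the text reaches the database. The same templating step in a minimal Python sketch (illustration only; the query is the one from the question):

```python
c_code = "XYZ"
template = """SELECT DISTINCT v1.*
FROM data_latest v1
JOIN temp_table t
ON v1.number = t.id
WHERE v1.code = '&1'"""

# Literal replacement of the &1 placeholder, like the sed command above
query = template.replace("&1", c_code)
print(query.splitlines()[-1])  # WHERE v1.code = 'XYZ'
```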
You may need '\p\g' around your SQL in the text file?
I personally tend to code in the SQL to the script itself, as in
#!/bin/ksh
var=01.01.2018
db=database_name
OUTLOG=/path/log.txt
sql $db <<_END_ > $OUTLOG
set autocommit on;
\p\g
set lockmode session where readlock = nolock;
\p\g
SELECT *
FROM table
WHERE date > '${var}' ;
\p\g
_END_
exit 0

SQL: how to fix these errors?

So I have to loop through a folder of .dat files, extract the data and use INSERT INTO to insert the data into a database.
Here is a pastebin of one of the files to see the data I am working with:
http://pastebin.com/dn4wQjjE
To run the script I just call:
populate_database.sh directoryWithDatFiles
And the contents of the populate_database.sh script:
rm test.sql;
sqlite3 test.sql "CREATE TABLE HotelReviews (HotelID SMALLINT, ReviewID SMALLINT, Author CHAR, Content CHAR, Date CHAR, Readers SMALLINT, HelpfulReviews SMALLINT, Over$
IFS=$'\n'
for file in $1/*;
do
author=($(grep "<Author>" $file | sed 's/<Author>//g'));
content=($(grep "<Content>" $file | sed 's/<Content>//g'));
date=($(grep "<Date>" $file | sed 's/<Date>//g'));
readers=($(grep "<No. Reader>" $file | sed 's/<No. Reader>//g'));
helpful=($(grep "<No. Helpful>" $file | sed 's/<No. Helpful>//g'));
overall=($(grep "<Overall>" $file | sed 's/<Overall>//g'));
value=($(grep "<Values>" $file | sed 's/<Value>//g'));
rooms=($(grep "<Room>" $file | sed 's/<Room>//g'));
location=($(grep "<Location>" $file | sed 's/<Location>//g'));
cleanliness=($(grep "<Cleanliness>" $file | sed 's/<Cleanliness>//g'));
receptionarea=($(grep "<Check in / front desk>" $file | sed 's/<Check in \/ front desk>//g'));
service=($(grep "<Service>" $file | sed 's/<Service>//g'));
businessservice=($(grep "<Business service>" $file | sed 's/<Business service>//g'));
length=${#author[@]}
hotelID="$(echo $file | sed 's/.dat//g' | sed 's/[^0-9]*//g')";
for((i = 0; i < length; i++)); do
sqlite3 test.sql "INSERT INTO HotelReviews VALUES($hotelID, $i, 'author', 'content', 'date', ${readers[i]}, ${helpful[i]}, ${overall[i]}, 9, 10, ${location[i]}, ${cleanliness[i]}, ${receptionarea[i]}, ${service[i]}, ${businessservice[i]})";
done
done
sqlite3 test.sql "SELECT * FROM HotelReviews;"
The problem I have, though, is that although much of the script is working, there are still 5 of the 15 columns that I can't get working. I'll just screenshot the errors I get when trying to change the code from:
'author' --> ${author[i]}: http://i.imgur.com/zKQLSqT.jpg
'content' --> ${content[i]}: http://i.imgur.com/pnirIo3.jpg
'date' --> ${date[i]}: http://i.imgur.com/urF5DTa.jpg
9 --> ${value[i]}: http://i.imgur.com/AnBFSWp.jpg
10 --> ${rooms[i]}: same errors as above
Anyway, if anyone could help me out on this, I'd be massively grateful.
Cheers!
If you deal with a lot of XML, I recommend getting to know a SAX parser, such as the one in the Python standard library. Anyone willing to write a shell script like that has the chops to learn it, and the result will be easier to read and at least have a prayer at being correct.
If you want to stick with regex hacking, turn to awk. Using ">" as your field separator, your script could be simplified with awk lines like
/<Author>/ { gsub(/'/, "''", $2); author=$2 }
/<Content>/ { gsub(/'/, "''", $2); content=$2 }
...
END { print author, content, ... }
The gsub takes care of your SQL quoting problem by doubling any single quotes in the data.
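That quote-doubling rule is easy to check in isolation; a minimal Python sketch of the same escaping (illustration only):

```python
def sql_quote(value: str) -> str:
    """Escape a value for a single-quoted SQL literal by doubling quotes."""
    return "'" + value.replace("'", "''") + "'"

print(sql_quote("O'Brien's room"))  # 'O''Brien''s room'
```

This only guards against stray quotes in the data; for anything beyond a one-off import, parameterized statements are the safer route.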