How to conditionally send an email alert in Perl - SQL

I have a Perl script which prints the contents of a table from an Oracle DB in HTML format.
The script runs on a daily basis and simply emails the output of a simple SQL query (a SELECT query).
Now I want the script to skip the email alert whenever the record count of the table is zero, i.e. there are no records in the table.
Here is my partial script:
$retCode = executeSQL("select firstname,lastname,employee_id from employee");
if ($retCode) {
push(#HTML, "<tr><td> </td><td></td><td>");
push(#HTML, "<td></td><td></td></tr>\12");
}
push(#HTML, "</table>\12\12");
push(#HTML, "COUNT : $count\12");
&sendMail;
sub sendMail {
$sub = "sample data";
$from = 'xyz@abc.com';
$to = 'xys@abc.com';
open(MAIL, "|/usr/lib/sendmail -t");
print MAIL "From: $from \12"; print MAIL "To: $to \12";print MAIL "Cc: $Cc \12";
print MAIL "Subject: $sub \12";
print MAIL "Content-Type: text/html \12";
print MAIL "Content-Disposition:inline \12";
print MAIL @HTML;
close(MAIL);
}
sub executeSQL {
my $SQL = $_[0];
chomp($SQL);
print "$SQL\12";
my $hostname = $ENV{"ORACLE_DB"};
my $dbh = CommonFunctions::connect_DBI( $hostname, "USERNAME", "PASSWORD" )|| die "ERROR : Unable to connect to $hostname: $DBI::errstr\n\n";
my $sth = $dbh->prepare($SQL);
$sth->execute or die "EXEC ERROR $sth->errstr";
$count = 0;
while (@ary = $sth->fetchrow_array) {
$count++;
push(@HTML, "<tr>");
foreach (@ary) {
chomp($_);
push(@HTML, "<td>$_</td>");
print "$_,";
}
push(#HTML, "</tr>\12");
}
}

The solution is already there in the code. The program doesn't add table rows to the HTML body of the email if the DB query returns no rows, so you need to move the send command inside that condition.
if($retCode) {
push(#HTML,"<tr><td> </td><td></td><td>");
push(#HTML,"<td></td><td></td></tr>\12");
push(#HTML,"</table>\12\12");
push(#HTML, "COUNT : $count\12");
&sendMail;
}

I think the big miss here is at the end of executeSQL, where you failed to return a value indicating whether or not the query found any rows.
if (executeSQL("select firstname,lastname,employee_id from employee"))
{
push(#HTML, "<tr><td> </td><td></td><td>");
push(#HTML, "<td></td><td></td></tr>\12");
push(#HTML, "</table>\12\12");
push(#HTML, "COUNT : $count\12");
&sendMail;
}
sub sendMail {
# no changes
}
sub executeSQL {
my $SQL = shift;
print "$SQL\12";
my $hostname = $ENV{"ORACLE_DB"};
my $dbh = CommonFunctions::connect_DBI( $hostname, "USERNAME", "PASSWORD" ) ||
die "ERROR : Unable to connect to $hostname: $DBI::errstr\n\n";
my $sth = $dbh->prepare($SQL);
$sth->execute or die "EXEC ERROR $sth->errstr";
my $count = 0;
while (@ary = $sth->fetchrow_array) {
# no changes
}
$sth->finish;
$dbh->disconnect;
return $count; # this is what I think you're missing
}
That said, there is some other room for improvement, some of which has already been mentioned:
Consider passing a reference to @HTML instead of using it as a global -- loose coupling (see the sketch after these points)
Probably should close out your SQL -- I added the $sth->finish and $dbh->disconnect as examples
Have you looked into HTML::Table? I use it a lot, and it's a real time-saver. Creating HTML on the fly is always a last resort for me.
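To illustrate the first and third points, here is a rough sketch; the row data is hard-coded for brevity (in the real script it would come from fetchrow_array), and the column names are just the ones from the question's query:

#!/usr/bin/perl
use strict;
use warnings;
use HTML::Table;

# Build the report into a lexical array and pass a reference around,
# instead of letting executeSQL and sendMail share a global @HTML.
sub add_row {
    my ($html_ref, @cells) = @_;
    push @$html_ref, "<tr>", (map { "<td>$_</td>" } @cells), "</tr>\n";
}

my @HTML = ("<table>\n");
add_row(\@HTML, 'John', 'Doe', 1001);   # hard-coded sample row
push @HTML, "</table>\n";
print @HTML;

# Or let HTML::Table generate the markup for you.
my $table = HTML::Table->new(-head => ['firstname', 'lastname', 'employee_id']);
$table->addRow('John', 'Doe', 1001);
print $table->getTable, "\n";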

Related

Inserting data into an ODBC destination with PowerShell

I need to load a data table into an ODBC connection with PowerShell.
With OLEDB and SQL Server we can use bulk copy and insert data quickly.
Is there such a possibility with ODBC?
I'm using PowerShell because it should have the best support for this kind of operation,
but my current code doesn't utilise any of the DLLs.
So my code first needs to build the INSERT statements with two for loops, iterating over every row and holding it in memory,
and then construct an INSERT INTO with 1000 rows, and then repeat the same thing.
Am I doomed to something like this?
$Datatable = New-Object System.Data.DataTable
$tabledump= $src_cmd.ExecuteReader()
$Datatable.Load($tabledump)
foreach ($item in $Datatable.Rows) {
$f +=1
for ($i = 0; $i -lt $item.ItemArray.Length; $i++) {
$items = $item[$i] -replace "'" , "''"
$val +="'"+ $items + "',"
}
$vals += $val
if ($f % 1000 -eq 0 -or $f -eq $row_cnt) {
$values = [system.String]::Join(" ", $vals)
$values = $values.TrimEnd(",")
$cols = [system.String]::Join(",", $columns)
$postgresCommand = "Insert Into $dst_schema.$dst_table ($cols) values $values"
$dest_cmd_.CommandText = $postgresCommand
$dest_cmd_.ExecuteNonQuery()
}
}
Bad code, I admit; any advice on code composition is welcome.
You can use the Get-ODBCDSN cmdlet to retrieve the values of the ODBC connections and use them with a query:
$conn = New-Object System.Data.Odbc.OdbcConnection
$conn.ConnectionString = "DSN=$dsn;"
$cmd = new-object System.Data.Odbc.OdbcCommand($query,$conn)
$conn.open()
$cmd.ExecuteNonQuery()
$conn.close()
https://www.andersrodland.com/working-with-odbc-connections-in-powershell/
But the ODBC provider doesn't do bulk copy:
https://learn.microsoft.com/en-us/sql/relational-databases/native-client-odbc-bulk-copy-operations/performing-bulk-copy-operations-odbc?view=sql-server-ver15
I know this post is not new, but I've been fiddling around looking for a solution and also found nothing; however, this post gave me a couple of insights.
First: there is no such thing as "bad code". If it works, it's not bad; and even if it didn't work, it may have helped with something.
Alright, what I did is not the best solution, but I'm trying to import Active Directory data into PostgreSQL, so...
I noticed that you're trying with pgsql as well, so you can use the COPY statement.
https://www.postgresql.org/docs/9.2/sql-copy.html
https://www.postgresqltutorial.com/import-csv-file-into-posgresql-table/
In my case I used it with a CSV file (assuming you have the PostgreSQL ODBC driver installed):
$DBConn = New-Object System.Data.Odbc.OdbcConnection
$DBConnectionString = "Driver={PostgreSQL UNICODE(x64)};Server=$ServerInstance;Port=$Port;Database=$Database;Uid=$Username;Pwd=$(ConvertFrom-SecureString -SecureString $Password);"
$DBConn.ConnectionString = $DBConnectionString
try
{
$ADFObject = @()
$ADComputers = Get-ADComputer -Filter * -SearchBase "OU=Some,OU=OrgU,OU=On,DC=Domain,DC=com" -Properties Description,DistinguishedName,Enabled,LastLogonTimestamp,modifyTimestamp,Name,ObjectGUID | Select-Object Description,DistinguishedName,Enabled,LastLogonTimestamp,modifyTimestamp,Name,ObjectGUID
foreach ($ADComputer in $ADComputers) {
switch ($ADComputer.Enabled) {
$true {
$ADEnabled = 1
}
$false {
$ADEnabled = 0
}
}
$ADFObject += [PSCustomObject] @{
ADName = $ADComputer.Name
ADInsert_Time = Get-Date
ADEnabled = $ADEnabled
ADDistinguishedName = $ADComputer.DistinguishedName
ADObjectGUID = $ADComputer.ObjectGUID
ADLastLogonTimestamp = [datetime]::FromFileTime($ADComputer.LastLogonTimestamp)
ADModifyTimestamp = $ADComputer.modifyTimestamp
ADDescription = $ADComputer.Description
}
}
$ADFObject | Export-Csv $Env:TEMP\TempPsAd.csv -Delimiter ',' -NoTypeInformation
docker cp $Env:TEMP\TempPsAd.csv postgres_docker:/media/TempPsAd.csv
$DBConn.Open()
$DBCmd = $DBConn.CreateCommand()
$DBCmd.CommandText = #"
COPY AD_Devices (ADName,ADInsert_Time,ADEnabled,ADDistinguishedName,ADObjectGUID,ADLastLogonTimestamp,ADModifyTimestamp,ADDescription)
FROM '/media/TempPsAd.csv'
DELIMITER ','
CSV HEADER
"#
$DBCmd.ExecuteReader()
$DBConn.Close()
docker exec postgres_docker rm -rf /media/TempPsAd.csv
Remove-Item $Env:TEMP\TempPsAd.csv -Force
}
catch
{
Write-Error "$($_.Exception.Message)"
continue
}
Hope it helps!
Cheers!

Update two different SNMP OID values through PowerShell

I'm trying to update info from 4 UPSs with two different OID values through PowerShell. I can update one, but when I try to update both values I receive an error. I figured out why it's not updating the values by inserting the values into a new table: when it inserts/updates the values, the script enters both values into each table column instead of having one value for temp and one value for battery load. My question is how I can update both values, if there is a way. Below is the loop I am running.
# If success go call func SNMP
if($ping_reply.status -eq "Success"){
try {
$frm_snmp = Invoke-SNMPget $ups_ip $oidTemp, $oidBatload "public"
} catch {
Write-Host "$ups_ip SNMP Get error: $_"
Return null
}
# if the data doesn't match record update ups_data
if([String]::IsNullOrWhiteSpace($frm_snmp.Data)){
Write-Host "Given string is NULL"
}else{
if(($ups_temp -and $battery_load -ne $frm_snmp.Data)) {
Write-Output "database update needed"
Write-Output $ups_ip, $ups_upsname $frm_snmp.Data
$new_temp = $frm_snmp.Data
$new_battery_load = $frm_snmp.Data
$update_con = New-Object System.Data.SqlClient.SqlConnection
$update_con.ConnectionString = "connection info"
$update_con.Open()
$SQLstmt = "update ups_data set temp = '$new_temp', batteryload = '$new_battery_load' where ip_address = '$ups_ip'"
$up_cmd = $update_con.CreateCommand()
$up_cmd.CommandText = $SQLstmt
$up_cmd.ExecuteNonQuery()
$update_con.Close()
}
}
}
This is the working code below
# If success go call func SNMP
if($ping_reply.status -eq "Success"){
try {
$frm_snmp = Invoke-SNMPget $ups_ip $oidTemp, $oidBatload "public"
} catch {
Write-Host "$ups_ip SNMP Get error: $_"
Return null
}
# if the data doesn't match record update ups_data
if([String]::IsNullOrWhiteSpace($frm_snmp.Data)){
Write-Host "Given string is NULL"
}else{
if(($ups_temp -and $battery_load -ne $frm_snmp.Data)) {
Write-Output "database update needed"
Write-Output $ups_ip, $ups_upsname $frm_snmp.Data
$new_temp = $frm_snmp.Data
$new_battery_load = $frm_snmp.Data
$update_con = New-Object System.Data.SqlClient.SqlConnection
$update_con.ConnectionString = "connection info"
$update_con.Open()
$SQLstmt = "update ups_data set temp = '$($new_temp[0])', batteryload = '$($new_battery_load[1])' where ip_address = '$ups_ip'"
$up_cmd = $update_con.CreateCommand()
$up_cmd.CommandText = $SQLstmt
$up_cmd.ExecuteNonQuery()
$update_con.Close()
}
}
}

How to increase the efficiency of a Perl script which uses sqlplus

I have this Perl script which takes data from an Oracle database via sqlplus. The database adds a new entry every time there is a change in the value of state for a particular serial number. Now we need to pick the entries at every state change and prepare a CSV file with the old state, the new state and other fields. Sample DB table:
SERIALNUMBER STATE AT OPERATORID SUBSCRIBERID TRANSACTIONID
51223344558899 Available 20081008T10:15:47 vsuser
51223344558857 Available 20081008T10:15:49 vsowner
51223344558899 Used 20081008T10:20:25 vsuser
51223344558860 Stolen 20081008T10:15:49 vsanyone
51223344558857 Damaged 20081008T10:50:49 vsowner
51223344558899 Damaged 20081008T10:50:25 vsuser
51343253335355 Available 20081008T11:15:47 vsindian
my script:
#! /usr/bin/perl
#use warnings;
use strict;
#my $circle =
#my $schema =
my $basePath = "/scripts/Voucher-State-Change";
#my ($sec, $min, $hr, $day, $month, $years) = localtime(time);
#$years_+=1900;$mont_+=1;
#my $timestamp=sprintf("%d%02d%02d",$years,$mont,$moday);
sub getDate {
my $daysago=shift;
$daysago=0 unless ($daysago);
#my @months=qw(Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec);
my ($sec,$min,$hour,$mday,$mon,$year,$wday,$yday,$isdst) = localtime(time-(86400*$daysago));
# YYYYMMDD, e.g. 20060126
return sprintf("%d%02d%02d",$year+1900,$mon+1,$mday);
}
my $filedate=getDate(1);
#my $startdate="${filedate}T__:__:__";
my $startdate="20081008T__:__:__";
print "$startdate\n";
##### Generating output file---
my $outputFile = "${basePath}/VoucherStateChangeReport.$filedate.csv";
open (WFH, ">", "$outputFile") or die "Can't open output file $outputFile for writing: $!\n";
print WFH "VoucherSerialNumber,Date,Time,OldState,NewState,UserId\n";
##### Generating log file---
my $logfile = "${basePath}/VoucherStateChange.$filedate.log";
open (STDOUT, ">>", "$logfile") or die "Can't open logfile $logfile for writing: $!\n";
open (STDERR, ">>", "$logfile") or die "Can't open logfile $logfile for writing: $!\n";
print "$logfile\n";
##### Now login to sqlplus-----
my $SQLPLUS='/opt/oracle/product/11g/db_1/bin/sqlplus -S system/coolman7@vsdb';
`$SQLPLUS \@${basePath}/VoucherQuery1.sql $startdate> ${basePath}/QueryResult1.txt`;
open (FH1, "${basePath}/QueryResult1.txt");
while (my $serial = <FH1>) {
chomp ($serial);
my $count = `$SQLPLUS \@${basePath}/VoucherQuery2.sql $serial $startdate`;
chomp ($count);
$count =~ s/\s+//g;
#print "$count\n";
next if $count == 1;
`$SQLPLUS \@${basePath}/VoucherQuery3.sql $serial $startdate> ${basePath}/QueryResult3.txt`;
# print "select * from sample where SERIALNUMBER = $serial----\n";
open (FH3, "${basePath}/QueryResult3.txt");
my ($serial_number, $state, $at, $operator_id);
my $count1 = 0;
my $old_state;
while (my $data = <FH3>) {
chomp ($data);
#print $data."\n";
my @data = split (/\s+/, $data);
my ($serial_number, $state, $at, $operator_id) = @data[0..3];
#my $serial_number = $data[0];
#my $state = $data[1];
#my $at = $data[2];
#my $operator_id = $data[3];
$count1++;
if ($count1 == 1) {
$old_state = $data[1];
next;
}
my ($date, $time) = split (/T/, $at);
$date =~ s/(\d{4})(\d{2})(\d{2})/$1-$2-$3/;
print WFH "$serial_number,$date,$time,$old_state,$state,$operator_id\n";
$old_state = $data[1];
}
}
close(WFH);
query in VoucherQuery1.sql:
select distinct SERIALNUMBER from sample where AT like '&1';
query in VoucherQuery2.sql:
select count(*) from sample where SERIALNUMBER = '&1' and AT like '&2';
query in VoucherQuery3.sql:
select * from sample where SERIALNUMBER = '&1' and AT like '&2';
and my sample output:
VoucherSerialNumber,Date,Time,OldState,NewState,UserId
51223344558857,2008-10-08,10:50:49,Available,Damaged,vsowner
51223344558899,2008-10-08,10:20:25,Available,Used,vsuser
51223344558899,2008-10-08,10:50:25,Used,Damaged,vsuser
The script is working pretty well. The problem is that the actual DB table has millions of records for a specific day, and therefore the script is running into performance issues. Could you please advise how we can improve the efficiency of this script in terms of time and load? The only restriction is that I can't use the DBI module for this.
Also, in case of any error in the SQL queries, the error message ends up in the QueryResult?.txt files. I want to handle and receive these errors in my log file. How can this be accomplished? Thanks.
I think you need to tune your query. A good starting point is to use the EXPLAIN PLAN, if it is an Oracle database.
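Since you can't use DBI, you can still get the plan through the same sqlplus pipe you already use. A minimal sketch, assuming the same sqlplus binary, credentials and table as in the question (the serial number and date are just sample values):

#!/usr/bin/perl
use strict;
use warnings;

# Ask Oracle for the execution plan of one of the per-serial queries,
# using a shell here-document fed to sqlplus from Perl backticks.
my $SQLPLUS = '/opt/oracle/product/11g/db_1/bin/sqlplus -S system/coolman7@vsdb';

my $plan = `$SQLPLUS <<'EOF'
EXPLAIN PLAN FOR
select count(*) from sample where SERIALNUMBER = '51223344558899' and AT like '20081008T__:__:__';
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
EOF
`;
print $plan;

If the plan shows a full table scan for every serial number, an index on (SERIALNUMBER, AT) is the usual next step; it would also help to fetch everything in one query ordered by SERIALNUMBER and AT rather than launching one sqlplus process per serial.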

Perl script to export SQL query results to CSV

The code below works, but all of the data displays in one row (though in different columns) when opened in Excel. The query SHOULD display the data headings, row 1, and row 2. Also, when I open the file, I get a warning that says "The file you are trying to open, 'xxxx.csv', is in a different format than specified by the file extension. Verify that the file is not corrupted... etc. Do you want to open the file now?" Any way to fix that? That may be the cause too.
tl;dr: export to CSV with multiple rows, not just one, and fix the Excel error. Thanks!
#!/usr/bin/perl
use warnings;
use DBI;
use Text::CSV;
# local time variables
($sec,$min,$hr,$mday,$mon,$year) = localtime(time);
$mon++;
$year += 1900;
# set name of database to connect to
$database=MDLSDB1;
# connection to the database
my $dbh = DBI->connect("dbi:Oracle:$database", "", "")
or die "Can't make database connect: $DBI::errstr\n";
# some settings that you usually want for oracle 10
$dbh->{LongReadLen} = 65535;
$dbh->{PrintError} = 0;
# sql statement to run
$sql="select * from eg.well where rownum < 3";
my $sth = $dbh->prepare($sql);
$sth->execute();
my $csv = Text::CSV->new ( { binary => 1 } )
or die "Cannot use CSV: ".Text::CSV->error_diag ();
open my $fh, ">:raw", "results-$year-$mon-$mday-$hr.$min.$sec.csv";
$csv->print($fh, $sth->{NAME});
while(my $row = $sth->fetchrow_arrayref){
$csv->print($fh, $row);
}
close $fh or die "Failed to write CSV: $!";
while(my $row = $sth->fetchrow_arrayref){
$csv->print($fh, $row);
print $fh "\n";
}
CSV rows are delimited by newlines, so simply print a newline after each row.
Another solution is to pass the desired line termination when instantiating the Text::CSV object, so that
my $csv = Text::CSV->new ( { binary => 1 } )
or die "Cannot use CSV: " . Text::CSV->error_diag();
becomes:
my $csv = Text::CSV->new({ binary => 1, eol => "\r\n" })
or die "Cannot use CSV: " . Text::CSV->error_diag();

Passing an SQL query result as a parameter for sending email using Perl

I want to retrieve the email address from the table and use it to send email from a Perl script.
How do I use the query result in the mail?
I am new to Perl scripting; please help.
I have updated the script as suggested, but there are still some issues.
Please tell me where I am going wrong.
Thanks in advance.
#!/usr/bin/perl
# $Id: outofstockmail.pl,v 1.0 2012/03/01 21:35:24 isha Exp $
require '/usr/home/fnmugly/main.cfg';
use DBI;
my $dbh = DBI->connect($CFG{'mysql_dsn'},$CFG{'mysql_user'},
$CFG{'mysql_password'})
or &PrintError("Could not connect to the MySQL Database.\nFile could not be made!\n");
$dbh->{RaiseError} = 1; # save having to check each method call
print "<H1>Hello World</H1>\n";
$sql = "Select OS.name, OS.customer_email, OS.product, OS.salesperson,
OS.salesperson_email
from products AS P
LEFT JOIN outofstock_sku AS OS ON OS.product = P.sku
LEFT JOIN tech4less.inventory AS I ON (I.sku = P.sku AND I.status = 'A')
WHERE mail_sent='0'
GROUP BY OS.product";
#$sth = $dbh->do($sql);
my $sth1 = $dbh->prepare($sql);
$sth1->execute;
while ( my @row = $sth1->fetchrow_array ) {
# email
open MAIL, "| $mail_prog -t" || die "Could not connect to sendmail.";
print MAIL "To: $row[1]";
print MAIL "From: $row[4]";
print MAIL "Reply-To:$row[4]";
print MAIL "CC: $row[4]";
print MAIL "Subject: Product requested is back in inventory\n";
print MAIL "\n";
print MAIL "Hi $row[0] , The product $row[2] is available in the stock.\n";
print MAIL "\n";
close MAIL;
$sql = "Update outofstock_sku SET mail_sent='0' WHERE mail_sent='1'";
$sth2 = $dbh->do($sql);
}
$sth = $dbh->do($sql);
$dbh->disconnect();
exit;
This script has too many problems to address the question specifically:
1) Use use strict; use warnings;. It will help you to be more accurate.
2) DBI->connect() takes options as last argument, so you can set RaiseError there:
my $dbh = DBI->connect($dsn, $user, $pwd, { RaiseError => 1 });
3) $dbh->do doesn't return sth object. You need prepare and execute:
my $sth = $dbh->prepare($sql);
$sth->execute;
while ( my @row = $sth->fetchrow_array ) {
...
print MAIL "Hi $row[0]. We're happy to ... $row[1]...\n";
...
}
4) To send mail use a module, for example Email::Sender::Simple.
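For example, a rough sketch (the addresses and body here are placeholders, not taken from your data):

use strict;
use warnings;
use Email::Simple;
use Email::Sender::Simple qw(sendmail);

# Build the message with Email::Simple and hand it to Email::Sender::Simple,
# which picks a suitable transport (sendmail, SMTP, ...) for you.
my $email = Email::Simple->create(
    header => [
        To      => 'customer@example.com',      # e.g. $row[1]
        From    => 'salesperson@example.com',   # e.g. $row[4]
        Subject => 'Product requested is back in inventory',
    ],
    body => "Hi, the product you asked about is available in stock.\n",
);
sendmail($email);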
Scary to think how much old code like this is still in use out there.
You don't actually say what your question is. Just saying that the code has "some issues" doesn't really help us to help you.
Some suggestions...
Use strict and warnings
Pass RaiseError as an argument to the connect call
Print out the contents of @row so you can be sure that the SQL is correct
Use Email::Simple to create your email and Email::Sender to send it
(Far less important) Consider using DBIx::Class to talk to the database
#!/usr/bin/perl
require '/main.cfg';
# use DBI interface for MySQL
use DBI;
# connect to MySQL
$dbh = DBI->connect($CFG{'mysql_dsn'},$CFG{'mysql_user'},$CFG{'mysql_password'}) or
&PrintError("Could not connect to the MySQL Database.\nFile could not be made!\n");
$dbh->{RaiseError} = 1; # save having to check each method call
$mailprog = $CFG{'mail_prog'};
$sql = "Select OS.name,OS.customer_email,OS.product,OS.salesperson_email from products AS P LEFT JOIN outofstock_sku AS OS ON OS.product = P.sku LEFT JOIN inventory AS I ON (I.sku = P.sku AND I.status = 'A') WHERE mail_sent='0' GROUP BY OS.product";
$sth = $dbh->prepare($sql);
$sth->execute();
while($ref = $sth->fetchrow_hashref())
{
open MAIL, "| $mailprog -f sssss\#gmail.com -t" || die "Could not connect to sendmail.";
print "Content-Type: text/html\n\n";
print MAIL "To: $ref->{'customer_email'}\n";
print MAIL "From: \"\" <wwwwwwwwww\n>";
# print MAIL "From: \"\" <$ref->{'salesperson_email'}\n>";
print MAIL "Reply-To: ";
# print MAIL "CC: $ref->{'salesperson_email'}\n";
print MAIL "Subject:#$ref->{'product'}\n";
print MAIL "Hi $ref->{'name'},\nThe product $ref->{'product'} is available in the stock.\n";
close MAIL;
}
$sth->finish();
# Close MySQL connection
$dbh->disconnect();
exit;