Here is my sample HTML code:
When I send this as an email using mutt to my Outlook account, the output is as follows:
The command I used is:
mutt -e "my_hdr Content-Type: text/html" -s "Subject: my subject" mymail#outlookmail.com < sampletable.html
The second table has different formatting, and the first table does not look right either.
If there are more than two tables, every alternate table ends up with the same formatting as the second one.
How can I make my email look good?
PS: I am generating this sample html table code using awk in a bash script.
Thanks in advance.
Replace each ending <tr> with a proper closing tag: </tr>.
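For example, each row should be opened with <tr> and closed with </tr>; a minimal sketch only, since the original HTML is not shown and the cell contents here are placeholders:

<table border="1">
<tr><td>cell 1</td><td>cell 2</td></tr>
<tr><td>cell 3</td><td>cell 4</td></tr>
</table>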
I'm hoping to create a script which looks up a date in a text file and then sends an email to the corresponding email address. The text file would be structured something like this:
25/02/2018, blah#email.com
26/02/2018, blue#email.com
etc
This will be run on a Windows 2012 box, so ideally I'd like to use PowerShell. I don't want any reliance on a database, so I thought a simple text file that could be searched might be the easiest way. Essentially this will be used to send email alerts for certain events, with the date being used to send the email to the correct night duty staff. I need to use a script because some of the apps I hope to link it to have little to no notification options, but they can invoke external scripts.
Any guidance gratefully received.
Thanks,
A
Your question is quite broad; here's the outline of a solution (run as a script file, PSv3+):
# Declare a parameter for the target date.
param(
    [datetime] $TargetDate = [datetime]::Today
)

# Format the target date as a string to match the input file's date format.
$targetDateFormatted = (Get-Date -Date $TargetDate -Format 'dd/MM/yyyy')

# Specify the input file (CSV format, as implied by your sample data).
$inputFile = 'DateEmail.csv'

Import-Csv -Header Date, Email $inputFile |
    Where-Object Date -eq $targetDateFormatted |
    ForEach-Object {
        Write-Verbose -Verbose "Date is $($_.Date); email address is $($_.Email)"
        # Invoke the Send-MailMessage cmdlet here.
    }
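The ForEach-Object body could then send the mail along these lines; this is a sketch only, where the SMTP server and From address are placeholders to replace with your own values:

# Sketch: goes inside the ForEach-Object block above; adjust server and sender.
Send-MailMessage -SmtpServer 'smtp.example.com' -From 'alerts@example.com' -To $_.Email -Subject "Night duty alert for $($_.Date)" -Body 'Event details go here.'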
I have a table called query_master which has 4 columns, and the 4th column contains an SQL query as its value. In total there are 5 entries in the query table.
Table structure:
S.No    Key    Title    Query
1       100    EG       select * from dual
Now my objective is to fetch the SQL queries from query_master using a shell script and execute them. The output of each SQL query should be written to a separate log file, and the log filename should be the same as the title.
Can you please help me achieve this? Stored procedures or stored functions would also be helpful for me, but I need to achieve this using shell scripting.
Try this, assuming you're using mysql:
awk -F'\t' 'NR!=1 {system("mysql -u user -p -e \"" $4 "\" database")}' file
Where file is the file containing the table, user is the user and database is the database. Alternatively, set these as variables instead of hard-coding them, like this:
awk -F'\t' -v db="database" -v user="user" 'NR!=1 {system("mysql -u " user " -p -e \"" $4 "\" " db)}' file
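To also write each query's output to a log file named after the title, as the question asks, something along these lines should work (assuming the title is in the third column and contains no spaces or shell metacharacters):

awk -F'\t' 'NR!=1 {system("mysql -u user -p -e \"" $4 "\" database > " $3 ".log")}' file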
Make a shell script that accepts a SQL statement from the command line (or an input file, or stdin) and does everything for you: exporting ORACLE_HOME, pointing at tnsnames, supplying the username and password, redirecting output, calling sqlplus, formatting the output, deleting column headers, and other sqlplus settings.
With your magicsql.sh (after testing), aim for a solution like:
magicsql.sh "select key, query from query_master order by key" | while read key query; do
    magicsql.sh "${query}" > /tmp/${key}.out
done
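A minimal sketch of what such a magicsql.sh wrapper might look like, assuming the credentials come from the illustrative variables DB_USER, DB_PASS and DB_TNS, and that the statement passed in does not already end with a semicolon:

#!/bin/sh
# magicsql.sh -- run the SQL statement given as $1 through sqlplus,
# producing quiet, header-free output suitable for piping.
export ORACLE_HOME="${ORACLE_HOME:?ORACLE_HOME must be set}"
export PATH="$ORACLE_HOME/bin:$PATH"

sqlplus -s "${DB_USER}/${DB_PASS}@${DB_TNS}" <<EOF
set pagesize 0 feedback off heading off verify off
$1;
exit
EOF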
I need to send a weekly email containing some dynamic information collected in a txt file. What I can't accomplish is adding the content of that file to the subject of my email.
My file is named bkp.txt and has one line: "weekly backups; total: 10; size: 3gb".
Is there any way I can send an email using this format,
mailx -s "weekly backups; total: 10; size: 3gb" "email.address#domain.com" < another.txt, keeping in mind that the content of bkp.txt can change every week?
Thanks!
Use bash command substitution:
mailx -s "$(<bkp.txt)" ......
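Putting it together with the command from the question, the full invocation would be:

mailx -s "$(<bkp.txt)" "email.address#domain.com" < another.txt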
I am trying to delete the last row in the file generated by nzsql. Please find the query below:
nzsql -A -c "SELECT * FROM AM_MAS_DIVISION_DIM" > abc.out
When I execute this query the output is generated and stored in abc.out. This includes both the header columns and some timing information at the bottom. I don't need the bottom metadata; I only want to keep my header columns and data. How can I do this using only nzsql? Please help me. Thanks in advance.
Use the -r flag in the nzsql command to avoid getting that row (assuming the metadata referred to in the question is the row count summary line, e.g. "(3 rows)").
-r Suppresses the row count that is displayed at the end of the SQL output.
reference: http://pic.dhe.ibm.com/infocenter/ntz/v7r0m3/index.jsp?topic=%2Fcom.ibm.nz.adm.doc%2Fr_sysadm_nzsql_command.html
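Applied to the command from the question:

nzsql -A -r -c "SELECT * FROM AM_MAS_DIVISION_DIM" > abc.out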
Why don't you just pipe the output to a unix command to remove it? I think something like this will work:
nzsql -A -c "SELECT * FROM AM_MAS_DIVISION_DIM" | sed '$d' > abc.out
sed '$d' is a commonly recommended way of getting rid of the last line (although ed, gawk, and other tools can handle it too).
Long story short: I'm capturing the SQL sent from a vendor tool to an Oracle database by using Wireshark. It already has a decoder for the TNS protocol (which is great), and I can access the text of the SQL via
Right Click->Copy->Bytes(Printable Text Only).
The problem is that there are tons of packets, and right-clicking on each of them would take ages. I was wondering if there is any way to export 'Printable Text Only' directly from Wireshark. Ideally I want to end up with a text file containing the statements.
Any help will be highly appreciated.
Finally found a way to do this. First, use tshark to capture the TNS packets:
tshark -R tcp.port==1521 -T fields -e data.data -d tcp.port==1521,tns > input.txt
Then you can use the home-brewed Ruby script below to transform the bytes back into text:
# Usage: encode.rb <input file> [all]
# Converts the hex byte values exported by tshark's data.data field back to text.
file = ARGV[0]
print_all = ARGV[1]

File.open(file, "r").each { |line|
  line.gsub(",", ":").split(':').each { |byte|
    chr = Integer('0x' + byte).chr
    # Print printable ASCII and newlines; print everything when 'all' is passed.
    print chr if ((' '..'~').include?(chr) or chr == "\n") or (print_all.downcase == 'all' if print_all)
  } if !line.chomp.empty?
}
Examples are:
encode.rb input.txt > output.txt
will export printable text only from input to output
encode.rb input.txt all > output.txt
will export all text from input to output
An easy way of looking at them all that has worked for me is just Right Click -> Follow TCP Stream.
A note: unprintable characters are displayed as .s. If there are a bunch of these interspersed between the text you want to extract (as there were for me), switch the view to ASCII, save it, open it in your favourite text editor (vim for me), and run a search and replace similar to :%s/\.//g.
I don't know how to do it with TNS, but you can do something similar with tshark; for example, to look at HTTP request URIs:
tshark -T fields -e http.request.uri
So if you look at the fields the TNS decoder exposes, you should be able to grab the one you need and redirect the output to a file.
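For example, to discover which fields the TNS dissector exposes (field names vary by Wireshark version, so treat this as a starting point rather than a guaranteed list):

tshark -G fields | grep -i tns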