SSIS - Send Mail Task - Attach multiple newly generated log files

I am executing an SSIS package via batch file execution. At the end it generates multiple log files with predefined names. I want to attach those log files in a Send Mail Task.
The log files are located at paths such as D:\Folder1\Folder2\Folder3\ABC.log and D:\Folder 1\Folder 2\Folder 3\XYZ.log.
I am using the following expression in the Send Mail Task's file attachment: "D:\FOLDER 1\FOLDER 2\FOLDER 3\*.log", but it does not recognize the log files.
(There are "2 slashes" in the entire path.)
Please help me attach the log files.

Try creating two variables.
vPath = D:\FOLDER 1\FOLDER 2\FOLDER 3\
vLogFileMask = *.log
Then combine the two variables in an expression: vPath + vLogFileMask.
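The FileAttachments expression on the Send Mail Task would then be the two variables concatenated, e.g. (assuming both variables live in the default User namespace):
@[User::vPath] + @[User::vLogFileMask]
Note that a backslash only needs to be written as "\\" when the path is typed as a string literal inside an expression; values stored in variables are used as-is.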

Related

Using PDFtk to Update Web Server Files in Many Directories

Long-time reader, first-time poster. I am trying to automate a process that takes many .PDF floorplan files and combines them into a single .PDF floorplan which will be referenced by a website.
To cut down on the manual cut-and-paste from network shares to a web server that is the current practice, I've written a PowerShell command as follows:
$SourcePath = '\\network\share\location\CAD Miniatures'
$DestinationPath = 'C:\inetpub\wwwroot\floorplans'
$LogFile = 'C:\Floorplan Transfer Logs\TransferLog.txt'
Robocopy $SourcePath $DestinationPath *.pdf /E /MIR /ZB /DCOPY:DAT /R:5 /W:10 /LOG+:$LogFile
My plan is to have this script run every hour as a Scheduled Task to mirror our local files and web files to ensure they remain up-to-date automatically.
The curveball is that the files being copied are individual files within directories. I would like to take all of the .pdf files in a given folder and combine them into a single .pdf.
File structure is as such:
/floorplans
    /ABC
        /ABC-01.pdf
        /ABC-02.pdf
        /ABC-03.pdf
    /XYZ
        /XYZ-01.pdf
        /XYZ-02.pdf
        /XYZ-03.pdf
        /XYZ-04.pdf
        /XYZ-05.pdf
        /XYZ-06.pdf
Within each directory (or in a subdirectory), I would like the combined output file to be simply abc.pdf or xyz.pdf, as per the examples above.
The file naming always follows the same format, but the number of files varies from a single file to over a dozen.
I would like to run the Robocopy and PDFtk tasks in the same script if possible (the idea being to update all files and then combine them). There would also be no need to re-merge folders in which no updates have been detected.
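One way this could hang together, as a rough sketch only: keep the Robocopy call above as-is, then loop over each folder under $DestinationPath and merge its PDFs with PDFtk, re-merging only when a source file is newer than the existing merged output. The $PdftkPath below is a hypothetical install location; adjust it to wherever pdftk.exe lives.
$PdftkPath = 'C:\Program Files (x86)\PDFtk Server\bin\pdftk.exe'   # assumed install path

foreach ($folder in Get-ChildItem -Path $DestinationPath -Directory) {
    # Target name, e.g. ...\floorplans\ABC\abc.pdf
    $merged = Join-Path $folder.FullName ($folder.Name.ToLower() + '.pdf')

    # All source PDFs in this folder, excluding a previously merged output
    $pdfs = Get-ChildItem -Path $folder.FullName -Filter '*.pdf' |
            Where-Object { $_.FullName -ne $merged } |
            Sort-Object Name
    if (-not $pdfs) { continue }

    # Skip folders where nothing has changed since the last merge
    $newest = ($pdfs | Sort-Object LastWriteTime | Select-Object -Last 1).LastWriteTime
    if (-not (Test-Path $merged) -or $newest -gt (Get-Item $merged).LastWriteTime) {
        # pdftk <inputs> cat output <merged>
        & $PdftkPath @($pdfs.FullName) cat output $merged
    }
}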

Copy multiple files from multiple folders to a single folder in Azure Data Factory

I have a folder structure like this as a source:
Source/2021/01/01/*.xlsx files
Source/2021/03/02/*.xlsx files
Source/2021/04/03/*.xlsx files
Source/2021/05/04/*.xlsx files
I want to drop all these Excel files into a different folder called Output.
Method 1:
When I tried this with a Copy activity, I got the files along with their folder structure (not a requirement) in the Output folder. I used the Binary file format.
Method 2:
I was also able to get the files, but named with some random ID as .xlsx, in my Output folder. I used Flatten hierarchy.
My requirement is to get the files with the same names as in the source.
This is what I suggest; I have implemented something similar in the past and I am pretty confident this should work.
Steps
Use a Get Metadata activity and loop through all the folders inside Source/2021/.
Use a ForEach (FE) loop and filter on the item type "Folder" (so that you get folders only and NO files; I know at this point you don't have files there).
Inside the If, add an Execute Pipeline activity; it should point to a new pipeline which takes a parameter like
Source/2021/01/01/
Source/2021/03/02/
The new pipeline should have a Get Metadata activity and a ForEach loop, and this time we will look for files only.
Inside the ForEach loop, add a Copy activity, and now you will have to use the full file name as the source name.
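As a rough sketch of the expressions this pattern relies on (the activity names "Get Folders" and "Get Files" are placeholders, not part of the original pipeline):
Outer ForEach items:    @activity('Get Folders').output.childItems
Folder-only check:      @equals(item().type, 'Folder')
Inner ForEach items:    @activity('Get Files').output.childItems
Copy source/sink name:  @item().name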

How to read (downloaded / result) filename from job

In a Pentaho 9.1 Job, using the "Get File(s) from SFTP" step, I download a CSV file. I would like to use that downloaded file name in the subject line of the email message in the "Mail" step.
I have tried calling it as a variable, but it is not really a variable, it is a "result" value. An example of what I tried is below...
Downloaded file name = "somefile.csv"
Syntax in the "Mail" step for "Subject" = "File Processing Complete: ${short_filename}"
When the email sends, the subject is exactly "File Processing Complete: ${short_filename}", when I need it to be "File Processing Complete: somefile.csv".
We can get file information from the result, but unfortunately that step is available only in a transformation. Thus, we need the help of a transformation to read the file name. I have prepared a SOLUTION for you. You need to supply the right information for the SFTP & MAIL configuration. Then run the job "getFromSFTP".
[getFromSFTP.kjb] Here, I download the CSV file from SFTP and send the file information to a transformation.
[getFileName.ktr] Here, I read the file information and send the filename to another job for mail sending.
[sendMail.kjb] This job is only used to send the mail, with filename = ${filename1}.
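Once the transformation has copied the result filename into a variable, the subject in the Mail job entry can be written the same way as in the question, only with the variable that actually gets populated, e.g. "File Processing Complete: ${filename1}".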

The process cannot access the file because it is being used by another process in SSIS Script Task. Any solution?

I am trying to attach all the .txt files from a directory as email attachments in an SSIS Script Task. I have the following code:
String[] File1 = Directory.GetFiles("D:\\Output\\", "*.txt");
foreach (var File in File1)
{
    Attachment Attach = new Attachment(File);
    email.Attachments.Add(Attach);
    FileNames = Path.GetFileName(File);
}
My Script Task executes properly. The next step is batch file processing: the batch file copies the files in the output folder to SFTP and then moves them into an archive folder. While executing the batch file task (i.e., an Execute Process Task in SSIS), I get the error below.
The process cannot access the file because it is being used by another process.
When I run the Script Task and the Execute Process Task individually, both tasks run properly: the email is triggered and the files are moved. But when I execute the package as a SQL job, I get the above error. I understand that the Script Task is holding the files. I even tried the following code in the foreach loop:
Attach.Dispose()
But when I add the above code, my SQL job runs successfully, but the email is not triggered.
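As a rough sketch of the usual fix (the SMTP host and addresses below are placeholders): keep the MailMessage, and with it every Attachment's open file handle, alive until Send has completed, and dispose it afterwards so the batch file can move the files.
// Extra namespace needed at the top of the Script Task: using System.Net.Mail;

string[] files = Directory.GetFiles("D:\\Output\\", "*.txt");

using (var email = new MailMessage("from@example.com", "to@example.com"))
using (var smtp = new SmtpClient("smtp.example.com"))        // placeholder SMTP host
{
    email.Subject = "Output files";
    foreach (string file in files)
    {
        email.Attachments.Add(new Attachment(file));         // each attachment keeps its file open
    }
    smtp.Send(email);
}   // disposing the MailMessage here closes every attachment's stream,
    // so the following Execute Process Task can move the files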

RAR a folder without persisting the full path

1) I have a folder called CCBuilds containing a couple of files at this path: E:\Testing\Builds\CCBuilds.
2) I have written C# code (Process.Start) to RAR this folder and save it as E:\Testing\Builds\CCBuilds.rar using the following command:
"C:\program files\winrar\rar.exe a E:\Testing\Builds\CCBuilds.rar E:\Testing\Builds\CCBuilds"
3) The problem is that, though the rar file gets created properly, when I unrar it to a CCBuilds2 folder (either through code using the rar.exe x command or via Extract in the context menu), the extracted folder contains the full path, i.e. extracting E:\Testing\Builds\CCBuilds.rar ->
E:\Testing\Builds\CCBuilds2\Testing\Builds\CCBuilds\<<my files>>
Whereas I want it to be something like this: E:\Testing\Builds\CCBuilds2\CCBuilds\<<my files>>
How can I avoid this full-path persistence when adding to the rar / extracting back from it? Any help is appreciated.
Use the -ep1 switch.
More info:
-ep = Files are added to an archive without including the path information. Could result in multiple files existing in the archive with the same name.
-ep1 = Do not store the path entered at the command line in archive. Exclude base folder from names.
-ep2 = Expand paths to full. Store full file paths (except drive letter and leading backslash) when archiving.
(source: http://www.qa.downappz.com/questions/winrar-command-line-to-add-files-with-relative-path-only.html)
Just in case this helps: I am currently working on an MS Access database project (customer relationship management for a small company), and one of the tasks there is to zip docx files to be sent to customers, using password encryption.
In the VBA procedure that triggers the zip-packaging of the docx-files, I call WinRAR as follows:
c:\Programme\WinRAR\winrar.exe a -afzip -ep -pThisIsThePassword "OutputFullName" "InputFullName"
-afzip says: "Create a zip file (as opposed to a rar file)."
-ep says: "Do not include the paths of the source files, i.e. put the files directly into the zip folder."
A full list of such switches is available in the WinRAR Help, section "Command line".
x extracts it as E:\Testing\Builds\CCBuilds2\Testing\Builds\CCBuilds\ because you're using the full path when declaring the source. Either use -ep1 or set the default working directory to E:\Testing\Builds.
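For example, the working-directory variant (paths as in the question) would be:
cd /d E:\Testing\Builds
"C:\program files\winrar\rar.exe" a CCBuilds.rar CCBuilds
so only CCBuilds\... is stored inside the archive.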
Use of -ep1 is needed but it's a bit tricky.
If you use:
Winrar.exe a output.rar inputpath
Winrar.exe a E:\Testing\Builds\CCBuilds.rar E:\Testing\Builds\CCBuilds
it will include the input path declared:
E:\Testing\Builds\CCBuilds -> E:\Testing\Builds\CCBuilds.rar:
Testing\Builds\CCBuilds\file1
Testing\Builds\CCBuilds\file2
Testing\Builds\CCBuilds\folder1\file3
...
which will end up unpacked as you've mentioned:
E:\Testing\Builds\CCBuilds2\Testing\Builds\CCBuilds\
There are two ways of using -ep1.
If you want the simple path:
E:\Testing\Builds\CCBuilds\
to be extracted as:
E:\Testing\Builds\CCBuilds2\CCBuilds\file1
E:\Testing\Builds\CCBuilds2\CCBuilds\file2
E:\Testing\Builds\CCBuilds2\CCBuilds\path1\file3
...
use
Winrar.exe a -ep1 E:\Testing\Builds\CCBuilds.rar E:\Testing\Builds\CCBuilds
the files inside the archive will look like:
CCBuilds\file1
CCBuilds\file2
CCBuilds\folder1\file3
...
Or you could use -ep1 to add just the files and folder structure, sans the base folder, with the help of recursion and by defining the base path as the inner path of the structure:
Winrar.exe a -ep1 -r E:\Testing\Builds\CCBuilds.rar E:\Testing\Builds\CCBuilds\*
The files:
E:\Testing\Builds\CCBuilds\file1
E:\Testing\Builds\CCBuilds\file2
E:\Testing\Builds\CCBuilds\folder1\file3
...
inside the archive will look like:
file1
file2
folder1\file3
...
which, when extracted, will look like:
E:\Testing\Builds\CCBuilds2\file1
E:\Testing\Builds\CCBuilds2\file2
E:\Testing\Builds\CCBuilds2\folder1\file3
...
Anyway, these are the two ways -ep1 can be used to exclude the base path, with or without the folder containing the files (the base folder / base path).
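Since the original command is launched from C# via Process.Start, the -ep1 form can be dropped in roughly like this (a sketch; only the added -ep1 switch differs from the command in the question):
using System.Diagnostics;

var rarInfo = new ProcessStartInfo
{
    FileName  = @"C:\program files\winrar\rar.exe",
    Arguments = @"a -ep1 E:\Testing\Builds\CCBuilds.rar E:\Testing\Builds\CCBuilds",
    UseShellExecute = false,
    CreateNoWindow  = true
};

using (var rar = Process.Start(rarInfo))
{
    rar.WaitForExit();   // wait until the archive is fully written
}

// The archive now stores CCBuilds\..., so extracting into CCBuilds2 gives
// E:\Testing\Builds\CCBuilds2\CCBuilds\<my files>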