How to use 7z.dll?

I now have a script that downloads a file and copies it to a directory. But how could I make it so that I compress a folder into a zip file, and then extract it once that zipped folder is downloaded? It takes too much time to write the lines for every file separately. I know that I could use 7z.dll to decompress, but I don't know how to put that into code.
[Code]
procedure InitializeWizard;
begin
  idpDownloadAfter(wpReady);
end;

procedure CurPageChanged(CurPageID: Integer);
begin
  if CurPageID = wpReady then
  begin
    idpClearFiles;
    if IsComponentSelected('IGR') then
      idpAddFile('http://www.mediafire.com/download/f9hnlkt1t75ykjk/waterfall_IGR.model', ExpandConstant('{tmp}\waterfall_IGR.model'));
  end;
end;

procedure CurStepChanged(CurStep: TSetupStep);
begin
  if CurStep = ssPostInstall then
  begin
    // Copy downloaded files to the application directory
    FileCopy(ExpandConstant('{tmp}\waterfall_IGR.model'), ExpandConstant('{app}\res_mods\0.8.10\content\Environment\env_waterfall\waterfall_IGR.model'), False);
  end;
end;

I don't know if 7z.dll will work directly, but what you can do is download 7-Zip portable, include its folder in your package, and pass the unzipping command to 7za.exe.
E.g.:
7za.exe x <path to>\in.zip -oc:\pathToOutFolder
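If you want to sanity-check that command outside the installer first, here is a minimal PowerShell sketch; the paths to 7za.exe, the archive, and the output folder are all placeholders you would adjust:

# Hypothetical paths - point these at your bundled 7za.exe and the downloaded archive
& 'C:\tools\7za.exe' x 'C:\downloads\in.zip' -oC:\pathToOutFolder -y
if ($LASTEXITCODE -ne 0) { Write-Warning "7za.exe exited with code $LASTEXITCODE" }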

I had the same problem when creating a 7zip file and splitting it into several small files using the -v option. The way I fixed it was with PowerShell: I get the list of files and then generate the Inno project dynamically. It looks something like
$Files = Get-Item "$zipFilesLocation\*.*"
$files | Select-Object @{Name="Address"; Expression={"idpAddFile('<webaddress>" + $_.Name + "' , ExpandConstant('{tmp}\58-Formulary_201311.7z.001'));"}}
and then just write each object into the .iss file, like
foreach ($elem in $files)
{
    $e = "idpAddFile( WebWrlString + '" + $elem.Name + "', ExpandConstant('{tmp}\" + $elem.Name + "'));"
    $e | Out-File "Innopackage.iss" -Encoding ASCII -Append
}
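With the file name from the Select-Object example above, each appended line comes out looking like this (WebWrlString is assumed to be defined elsewhere in the generated script):

idpAddFile( WebWrlString + '58-Formulary_201311.7z.001', ExpandConstant('{tmp}\58-Formulary_201311.7z.001'));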
I hope this helps

Related

How to remove the special character in an SQL file

I am facing a very big issue; please help me. I have a "FileName.sql" file, and when I try to deploy it to the server using a PowerShell script (that's our company's deployment tool), it always complains that there is a special character inside the file, so the deployment fails.
But it never says which line the special character is on. Could you please help me identify the special character in the SQL file?
When I execute this FileName.sql file manually in SQL Server Management Studio, it executes without error, but through the PowerShell script I am not able to deploy, so please help me.
How can I identify the special character inside my SQL file?
SELECT * FROM TableName
SELECT REPLACE(REPLACE(LTRIM(RTRIM([ColumnName])), CHAR(1), ''), CHAR(9), '') AS [NewColumnName] FROM TableName
I have used the REPLACE() function to remove the special characters:
LTRIM() & RTRIM() - remove the leading and trailing space characters
CHAR(1) & CHAR(9) - ASCII control characters (SOH and the tab character)
If you are going to use this on multiple columns, you can create a function. Let me know if you need help with creating a function. Thank you.
Open the file with Notepad++, find the character, and save/encode as ASCII/UTF-8. Possibly someone edited the file in Word and you have special/fancy quotes or hyphens.
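In the same spirit, a PowerShell one-liner can point you at the offending lines directly. A minimal sketch (the file name is a placeholder) that flags every line containing a non-ASCII character:

Select-String -Path 'FileName.sql' -Pattern '[^\x00-\x7F]' | Select-Object LineNumber, Line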
No clue if this helps, but this may...
Find-FancyCharacters.ps1
[CmdletBinding()]
param (
    [string[]]
    $Files
)

# Characters that typically sneak in from Word: fancy quotes and an en dash
$badCharacters = @"
‘
’
–
"@
$badCharactersList = $badCharacters -split "`r`n"

if (-not $Files)
{
    $Files = @('D:\test\badchars.txt')
}
foreach ($file in $Files)
{
    Write-Progress -Activity "Checking file: $file..."
    $contents = Get-Content $file -Encoding UTF8
    $lineNumber = 1
    foreach ($line in $contents)
    {
        foreach ($invalidChar in $badCharactersList)
        {
            if ($line -match $invalidChar)
            {
                Write-Output "Invalid Character found: f=$file, l=$lineNumber, c=$invalidChar"
                break
            }
        }
        $lineNumber++
    }
}
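To run it against, say, every .sql file in the current folder, the invocation would look something like this (hypothetical, assuming the script is saved under the name above):

.\Find-FancyCharacters.ps1 -Files (Get-ChildItem *.sql | Select-Object -ExpandProperty FullName)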

Cronjob does not execute command line in Perl script

I am unfamiliar with Linux and the Linux environment, so do pardon me if I make any mistakes; do comment to clarify.
I have created a simple Perl script. This script creates an SQL file and, as shown, executes the lines in the file so they are inserted into the database.
#!/usr/bin/perl
use strict;
use warnings;
use POSIX 'strftime';

my $SQL_COMMAND;
my $HOST     = "i";
my $USERNAME = "need";
my $PASSWORD = "help";
my $NOW_TIMESTAMP = strftime '%Y-%m-%d_%H-%M-%S', localtime;

open my $out_fh, '>>', "$NOW_TIMESTAMP.sql" or die 'Unable to create sql file';
printf {$out_fh} "INSERT INTO BOL_LOCK.test(name) VALUES ('wow');";

sub insert()
{
    my $SQL_COMMAND = "mysql -u $USERNAME -p'$PASSWORD' ";
    while ( my $sql_file = glob '*.sql' )
    {
        my $status = system("$SQL_COMMAND < $sql_file");
        if ( $status == 0 )
        {
            print "pass";
        }
        else
        {
            print "fail";
        }
    }
}
insert();
This works if I execute it while I am logged in as a user (I do not have access to an admin account). However, when I set a cronjob to run this file, let's say at 10.08 am, by using the following line (in crontab -e):
08 10 * * * perl /opt/lampp/htdocs/otpms/Data_Tsunami/scripts/test.pl > /dev/null 2>&1
I know the script is being executed, as the SQL file is created, but no new rows are inserted into the database after 10.08 am. I've searched for solutions and some have suggested using the DBI module, but it's not available on the server.
EDIT: Didn't manage to solve it in the end. A root/admin account was used to execute the script, so that "solved" the problem.
First things first, get rid of the > /dev/null 2>&1 at the end of your crontab entry (at least temporarily) so you can actually see any errors that may be occurring.
In other words, change it temporarily to something like:
08 10 * * * perl /opt/lampp/htdocs/otpms/Data_Tsunami/scripts/test.pl >/tmp/myfile 2>&1
Then you can examine the /tmp/myfile file to see what's being output.
The most likely case is that mysql is not actually on the path in your cron job, because cron itself gives a rather minimal environment.
To fix that problem (assuming that's what it is), see this answer, which gives some guidelines on how best to expand the cron environment to give you what you need. That will probably just involve adding the MySQL executable directory to your PATH variable.
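For example, cron lets you set environment variables at the top of the crontab, so a PATH line above the entry is usually enough (the MySQL bin directory shown here is an assumption; adjust it to wherever mysql actually lives on your server):

PATH=/usr/local/mysql/bin:/usr/bin:/bin
08 10 * * * perl /opt/lampp/htdocs/otpms/Data_Tsunami/scripts/test.pl >/tmp/myfile 2>&1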
The other thing you may want to consider is closing the out_fh file before trying to pass it to mysql - if the buffers haven't been flushed, it may still be an empty file as far as other processes are concerned.
"The expression glob(".* *") matches all files in the current working directory." - http://perldoc.perl.org/functions/glob.html
You should not rely on the working directory in a cron job. If you want to use a glob (or any file operation) with a relative path, set the working directory with chdir first.
(source: http://www.perlmonks.org/bare/?node_id=395387)
So if your working directory is, for example, /home/user, you should insert
chdir('/home/user/');
before the while loop, i.e.:
sub insert()
{
    my $SQL_COMMAND = "mysql -u $USERNAME -p'$PASSWORD' ";
    chdir('/home/user/');
    while ( my $sql_file = glob '*.sql' )
    {
    ...
Replace /home/user with wherever your SQL files are being created.
It's better to do as much processing within Perl as possible. It avoids the overhead of spawning a separate shell process and leaves everything under the control of the program, so that you can handle any errors much more simply.
Database access from Perl is done using the DBI module. This program demonstrates how to achieve what your code does with the mysql utility, and as you can see it's also much more concise.
#!/usr/bin/perl
use strict;
use warnings;
use DBI;
my $host = "i";
my $username = "need";
my $password = "help";
my $dbh = DBI->connect("DBI:mysql:database=test;host=$host", $username, $password);
my $insert = $dbh->prepare('INSERT INTO BOL_LOCK.test(name) VALUES (?)');
my $rv = $insert->execute('wow');
print $rv ? "pass\n" : "fail\n";

Powershell script to convert PDF to TIFF with Ghostscript

I have been asked to write a script that automatically converts PDF files to TIFF files so they can be processed further. With a lot of help from Google and this site (I never studied any programming language) I created the code below.
Even though it's working now, it is not quite what I was hoping for, since it creates 13 files every time it runs where it should create only 1.
Could someone be kind enough to take a look at the script and tell me where I went wrong?
Thank you in advance!
EDIT:
In this (test) case there's only one PDF in the folder, named test.pdf; however, the idea is that the script looks through all the PDFs in the given folder, since it's unknown how many PDFs will be in the folder at any given time. Let it run as a service in the background(?)
I'll edit the post with the error code/description once I find out how to get them; I can't keep up with the command line.
#Path to your Ghostscript EXE
$tool = 'C:\Program Files\gs\gs9.10\bin\gswin64c.exe'
#Directory containing the PDF files that will be converted
$inputDir = 'C:\test\'
#Output path where converted PDF files will be stored
$outputDirPDF = 'C:\test\oud\'
#Output path where the TIF files will be saved
$outputDir = 'C:\test\TIFF'

$pdfs = Get-ChildItem $inputDir -Recurse | Where-Object { $_.Extension -match "pdf" }
foreach ($pdf in $pdfs)
{
    $tif = $outputDir + $pdf.BaseName + ".tif"
    if (Test-Path $tif)
    {
        "tif file already exists " + $tif
    }
    else
    {
        'Processing ' + $pdf.Name
        $param = "-sOutputFile=$tif"
        & $tool -q -dNOPAUSE -sDEVICE=tiffg4 $param -r300 $pdf.FullName -c quit
    }
    Move-Item $pdf $outputDirPDF
}
It's working now; apparently I was missing an "EXIT" at the end of the code. It might not be the most beautiful piece of code, but it seems to do the job, so I'm happy with it.
Below is the piece of code that actually works:
#Path to your Ghostscript EXE
$tool = 'C:\Program Files\gs\gs9.10\bin\gswin64c.exe'
#Directory containing the PDF files that will be converted
$inputDir = 'C:\test\'
#Output path where converted PDF files will be stored
$outputDirPDF = 'C:\test\oud\'
#Output path where the TIF files will be saved
$outputDir = 'C:\test\TIFF\'

$pdfs = Get-ChildItem $inputDir -Recurse | Where-Object { $_.Extension -match "pdf" }
foreach ($pdf in $pdfs)
{
    $tif = $outputDir + $pdf.BaseName + ".tif"
    if (Test-Path $tif)
    {
        "tif file already exists " + $tif
    }
    else
    {
        'Processing ' + $pdf.Name
        $param = "-sOutputFile=$tif"
        & $tool -q -dNOPAUSE -sDEVICE=tiffg4 $param -r300 $pdf.FullName -c quit
    }
    Move-Item $pdf $outputDirPDF
}
EXIT
It appears to be creating one TIFF file for each PDF file in the source directory. How many PDF files are in the directory (and any sub-directories)? How many pages are in the input PDF file?
I note that you move the original PDF from 'InputDir' to 'OutputDirPDF' when completed, but 'OutputDirPDF' is a child of 'InputDir', so if you recurse child directories when looking for input files, you may find files you have already processed. NB: I know nothing about PowerShell, so this may be just fine.
I'd suggest making 'InputDir' and 'OutputDirPDF' siblings at the same level, e.g. "c:\temp\input" and "c:\temp\outputPDF".
That's about all I can say on the information here; you could state what the input PDF filename(s) and output filename(s) are, and what the processing messages say.
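Along those lines, a minimal PowerShell sketch of that layout (the c:\temp paths are the hypothetical ones from the suggestion above; dropping -Recurse keeps already-processed files out of later runs):

$inputDir = 'C:\temp\input'
$outputDirPDF = 'C:\temp\outputPDF'
# Without -Recurse, only PDFs sitting directly in the input folder are picked up,
# so files already moved to the sibling output folder are never processed twice.
$pdfs = Get-ChildItem $inputDir -Filter *.pdf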

Running shell commands in background, in a tcl proc

I'm trying to create a Tcl proc which is passed a shell command as an argument, opens a temporary file, writes a formatted string to the temporary file, then runs the shell command in the background, storing its output in the temp file as well.
Running the command in the background is so that the proc can be called again immediately afterwards with another argument passed to it, writing to another file. Running a hundred such commands should then not take as long as running them serially would. The multiple temp files can finally be concatenated into a single file.
This is the pseudocode of what I'm trying to do.
proc runthis { args } {
    set date_str [exec date {+%Y%m%d-%H%M%S}]
    set tempFile ${date_str}.txt
    set output [open $tempFile a+]
    set command [concat exec $args]
    puts $output "### Running $args ... ###"
    << Run the command in background and store output to tempFile >>
}
But how do I ensure the backgrounding of the task is done properly? What would need to be done to ensure that the multiple temp files get closed properly?
Any help would be welcome. I'm new to Tcl and finding it hard to get my mind around this. I read about using threads in Tcl, but I'm working with an older version of Tcl which doesn't support threading.
How about:
proc runthis { args } {
    # Build a timestamped file name, e.g. 20140101-120000.txt
    set date_str [clock format [clock seconds] -format {%Y%m%d-%H%M%S}]
    set tempFile ${date_str}.txt
    set output [open $tempFile a+]
    puts $output "### Running $args ... ###"
    close $output
    exec {*}$args >> $tempFile &
}
See http://tcl.tk/man/tcl8.5/TclCmd/exec.htm
Since you seem to have an older Tcl, replace
exec {*}$args >> $tempFile &
with
eval [linsert $args 0 exec] >> $tempFile &

Renaming a list of files and creating a folder in PowerShell

I'm in need of a script, in PowerShell or batch script, that will do the following:
1. Rename a file to append its creation date minus 1 day to the filename.
For example:
foo.xlsx (created 7/27/2011)
foo-2011-07-26.xlsx --note, it's yesterday's date.
The date format isn't too important as long as it's there. There will be 10 files (all with the same creation date), so either I can copy and paste the same renaming line for the different files (just changing the filename) or just have the script affect all *.xlsx files in the existing folder.
2. Create a new folder where those files are and name it 'fooFolder-2011-07-26' (yesterday's date).
3. Move those renamed files to that folder.
I only have limited experience with PowerShell. It's on my todo list of languages to learn.
Here you go. It could be shortened up a lot using aliases, piping, and whatnot, but since you're still unfamiliar with PowerShell, I decided to write it in a more procedural style for your reading:
function MoveFilesAndRenameWithDate([string]$folderPrefix, [string]$filePattern) {
    $files = Get-ChildItem .\* -Include $filePattern
    foreach ($file in $files) {
        $yesterDate = $file.CreationTime.AddDays(-1).ToString('yyyy-MM-dd')
        $newSubFolderName = '{0}-{1}' -f $folderPrefix, $yesterDate
        if (!(Test-Path $newSubFolderName)) {
            mkdir $newSubFolderName
        }
        $newFileName = '{0}-{1}{2}' -f $file.BaseName, $yesterDate, $file.Extension
        Move-Item $file (Join-Path $newSubFolderName $newFileName)
    }
}
You would paste the above into your PowerShell session (or place it in your profile). Then you call the function like this:
MoveFilesAndRenameWithDate 'fooFolder' '*.xlsx'
I tend to use more aliases and piping than the above function does. The first version I wrote was this, and then I separated parts of it to make it more comprehensible to a PowerShell newcomer:
function MoveFilesAndRenameWithDate([string]$folderPrefix, [string]$filePattern) {
    gci .\* -include $filePattern |
        % { $date = $_.CreationTime.AddDays(-1).ToString('yyyy-MM-dd')
            mkdir "$folderPrefix-$date" 2>$null
            mv $_ (join-path "$folderPrefix-$date" ('{0}-{1}{2}' -f $_.BaseName, $date, $_.Extension)) }
}
Edit: Modified both functions to create a dated folder for the files that match each date. I considered making a temporary directory, grabbing a single date from the files moved into it, and finally renaming the directory after the loop. However, if a day should be missed and files for 2 (or more) days get processed together, this way there would still be a folder for each day, which is more consistent.
OK, I've made it:
function NameOfFunction([string]$folderpath)
{
    foreach ($filepath in [System.IO.Directory]::GetFiles($folderpath))
    {
        $file = New-Object System.IO.FileInfo($filepath);
        $date = $file.CreationTime.AddDays(-1).ToString('yyyy-MM-dd');
        if (![System.IO.Directory]::Exists("$folderpath\foo-$date"))
        {
            [System.IO.Directory]::CreateDirectory("$folderpath\foo-$date");
        }
        $filename = $file.Name.Remove($file.Name.LastIndexOf('.'));
        $fileext = $file.Name.SubString($file.Name.LastIndexOf('.'));
        $targetpath = "$folderpath\foo-$date" + '\' + $filename + '-' + $date + $fileext;
        [System.IO.File]::Move($filepath, $targetpath);
    }
}
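For example, you would call it on the folder holding the spreadsheets like this (the path is a placeholder):

NameOfFunction 'C:\test'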
Explanation:
First, get all files in the root folder.
For each file, we create a FileInfo object and get the CreationTime minus 1 day.
Then we check whether the directory that should be created exists, and create it if not.
Then we get the filename and the extension.
At the end, we move each file to the new directory, under the new filename.
Hope that helps you.