Reading an XML file on the local machine (on which ActiveTcl and the Expect package are installed) and writing it to a spawned process - activetcl

I am trying to read an XML file that is on my local machine and write it to a spawned process. When I execute the code using puts and gets, the XML file gets printed at the Tcl prompt, but not at the spawned process's prompt. After googling a bit, I found that the send command is used to send data to a spawned process, but reading the file with gets and then writing it with send doesn't help either. I am attaching my code below. Please help. Thanks in advance.
package require Expect
package require tdom
log_file -noappend myscript.log
set spawned_machine (some IP here)
set username un
set password pw
set timeout 10
spawn plink.exe -ssh $username@$spawned_machine -pw $password
expect "*myspawnedmachine>"
exp_send "loadlicense\r"
expect "*Press CTRL-D on a blank line when done."
set filename "C:/newfolder/pqr.xml"
set size [file size $filename]
set fd [open $filename]
fconfigure $fd -buffering line
gets $fd data
while {$data != ""} {
set data1 [puts $data]
gets $fd data
send $data1
}
close $fd
exp_send "\004"
expect "*myspawnedmachine>"
If I change the above logic to read the whole pqr.xml file in one go, i.e.
set filename "C:/newfolder/pqr.xml"
set size [file size $filename]
set fd [open $filename r]
set xml [read $fd]
set data [split $xml "\n"]
send $xml
close $fd
exp_send "\r"
exp_send "\004"
expect "*myspawnedmachine>"
With send it does reach the spawned machine, but the XML file is not transferred as expected: instead of being sent properly line after line, all the lines are printed on a single line on the spawned process. Consider this XML file:
<?xml version="1.0"?>
<catalog>
<book id="bk101">
<author>Gambardella, Matthew</author>
<title>XML Developer's Guide</title>
<genre>Computer</genre>
<price>44.95</price>
<publish_date>2000-10-01</publish_date>
<description>An in-depth look at creating applications
with XML.</description>
</book>
On the spawned machine, this XML file is printed as:
<?xml version="1.0"?><catalog> <book id="bk101"><author>Gambardella, Matthew</author> <title>XML Developer's Guide</title> <genre>Computer</genre><price>44.95</price><publish_date>2000-10-01</publish_date><description>An in-depth look at creating applications with XML.</description> </book>
That is, all on the same line.
while it is expected to be sent as,
<?xml version="1.0"?>
<catalog>
<book id="bk101">
<author>Gambardella, Matthew</author>
<title>XML Developer's Guide</title>
<genre>Computer</genre>
<price>44.95</price>
<publish_date>2000-10-01</publish_date>
<description>An in-depth look at creating applications
with XML.</description>
</book>
I know it's a lengthy question, but any help would be greatly appreciated. Thanks in advance.
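For what it's worth, the usual cause of this symptom is the line terminator: the file's lines end in "\n", while the spawned terminal expects each line to be terminated with "\r" (a carriage return, i.e. pressing Enter), so the lines need to be re-terminated before being sent, e.g. by looping over [split $xml "\n"] and calling send -- "$line\r" for each line. The transformation itself, sketched in Python because that is easy to check (the xml value here is a shortened stand-in for the real file contents):

```python
# Sketch of the terminator fix (assumption: the spawned terminal
# treats "\r" as Enter, while the file separates lines with "\n").
xml = '<?xml version="1.0"?>\n<catalog>\n</catalog>\n'

# Split into lines and re-terminate each non-empty line with "\r",
# which is what the Tcl loop would do one send at a time.
payload = "".join(line + "\r" for line in xml.split("\n") if line != "")
print(repr(payload))
```

In Tcl, the equivalent per-line send keeps the remote side from seeing one long run-together line.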

Related

Read all lines at the same time individually - Solaris ksh

I need some help with a script, on Solaris 10 with ksh.
I have a file called /temp.list with this content:
192.168.0.1
192.168.0.2
192.168.0.3
So, I have a script which reads this list and executes some commands using the lines values:
FILE_TMP="/temp.list"
while IFS= read line
do
ping $line
done < "$FILE_TMP"
It works, but it executes the command for line 1; when that finishes, it moves on to line 2, and so on until the end. I would like to find a way to execute the ping command for each line of the list at the same time. Is there a way to do it?
Thank you in advance!
Marcus Quintella
As Ari suggested, googling "ksh multithreading" will produce a lot of ideas/solutions.
A simple example:
FILE_TMP="/temp.list"
while IFS= read line
do
ping $line &
done < "$FILE_TMP"
The trailing '&' says to kick the ping command off in the background, allowing loop processing to continue while the ping command is running in the background.
'course, this is just the tip of the proverbial iceberg as you now need to consider:
multiple ping commands are going to be dumping output to stdout (ie, you're going to get a mish-mash of ping output in your console), so you'll need to give some thought as to what to do with multiple streams of output (eg, redirect to a common file? redirect to separate files?)
you need to have some idea as to how you want to go about managing and (possibly) terminating commands running in the background [ see jobs, ps, fg, bg, kill ]
if running in a shell script you'll likely find yourself wanting to suspend the main shell script processing until all background jobs have completed [ see wait ]

Get output from EXE launched in Tcl and pause further processing until EXE finishes

I'm launching a single EXE from a Tcl script, and would like to get the output from the EXE and display it using a simple PUTS command to provide user feedback. At the moment, I am launching the EXE in a CMD window where the user can see the progress, and waiting for the EXE to create a file. The first script here works whenever the output file LUC.CSV is created.
file delete -force luc.csv
set cmdStatus [open "| cmd.exe /c start /wait uc.exe"]
while {![file exists "luc.csv"]} {
}
# continue after output file is created
However, sometimes the file is not created, so I can't rely on this method.
I've been trying to get my head around the use of fileevent and pipes, and have tried several incarnations of the script below, but I'm obviously either missing the point or just not getting the syntax right.
puts "starting"
set fifo [open "| cmd.exe /c start uc.exe" r]
fconfigure $fifo -blocking 0
proc read_fifo {fifo} {
puts "calling read_fifo"
if {[gets $fifo x] < 0} {
if {[eof $fifo]} {
close $fifo
}
}
puts "x is $x"
}
fileevent $fifo readable [list read_fifo $fifo]
vwait forever
puts"finished"
Any help would be greatly appreciated!
If you just want to launch a subprocess and do nothing else until it finishes, Tcl's exec command is perfect.
exec cmd.exe /c start /wait uc.exe
(Since you're launching a GUI application via start, there won't be any meaningful result unless there's an error in launching. And in that case you'll get a catchable error.) Things only get complicated when you want to do several things at once.
To make your original code work, you have to detect that the subprocess has finished. Tcl's just vwaiting forever because your code says to do that. We need to put something in to make the wait finish. A good way is to make the wait be for something to happen to the fifo variable, which can be unset after the pipe is closed as it no longer contains anything useful. (vwait will become eligible to return once the variable it is told about is either written to or destroyed; it uses a variable trace under the covers. It also won't actually return until the event handlers it is currently processing return.)
puts "starting"
# ***Assume*** that this code is in the global scope
set fifo [open "| cmd.exe /c start uc.exe" r]
fconfigure $fifo -blocking 0
proc read_fifo {} {
global fifo
puts "calling read_fifo"
if {[gets $fifo x] < 0} {
if {[eof $fifo]} {
close $fifo
unset fifo
}
}
puts "x is $x"
}
fileevent $fifo readable read_fifo
vwait fifo
puts "finished"
That ought to work. The lines that were changed were the declaration of read_fifo (no variable passed in), the adding of global fifo just below (because we want to work with that instead), the adding of unset fifo just after close $fifo, the setting up of the fileevent (don't pass an extra argument to the callback), and the vwait (because we want to wait for fifo, not forever).

Powershell script to convert PDF to TIFF with Ghostscript

I have been asked to write a script that automatically converts PDF files to TIFF files so they can be processed further. With a lot of help from Google and this site (I never studied any programming language), I created the code below.
Even though it's working now, it is not quite what I was hoping for, since it creates 13 files every time it runs where it should create only 1.
Could someone be kind enough to take a look at the script and tell me where I went wrong?
Thank you in advance!
EDIT:
In this (test) case there's only one PDF in the folder, named test.pdf; however, the idea is that the script looks through all the PDFs in the given folder, since it's unknown how many PDFs are in the folder at any given time. Let it run as a service in the background(?)
I'll edit the post with the error code/description once I find out how to get it; I can't keep up with the command line.
#Path to your Ghostscript EXE
#Path to your Ghostscript EXE
$tool = 'C:\Program Files\gs\gs9.10\bin\gswin64c.exe'
#Directory containing the PDF files that will be converted
$inputDir = 'C:\test\'
#Output path where converted PDF files will be stored
$outputDirPDF = 'C:\test\oud\'
#Output path where the TIF files will be saved
$outputDir = 'C:\test\TIFF'
$pdfs = get-childitem $inputDir -recurse | where {$_.Extension -match "pdf"}
foreach($pdf in $pdfs)
{
$tif = $outputDir + $pdf.BaseName + ".tif"
if(test-path $tif)
{
"tif file already exists " + $tif
}
else
{
'Processing ' + $pdf.Name
$param = "-sOutputFile=$tif"
& $tool -q -dNOPAUSE -sDEVICE=tiffg4 $param -r300 $pdf.FullName -c quit
}
Move-Item $pdf $outputDirPDF
}
It's working now, apparently I was missing an "exit" at the end of the code. It might not be the most beautiful piece of code, but it seems to do the job so I'm happy with it.
Below the piece of code that actually works;
#Path to your Ghostscript EXE
#Path to your Ghostscript EXE
$tool = 'C:\Program Files\gs\gs9.10\bin\gswin64c.exe'
#Directory containing the PDF files that will be converted
$inputDir = 'C:\test\'
#Output path where converted PDF files will be stored
$outputDirPDF = 'C:\test\oud\'
#Output path where the TIF files will be saved
$outputDir = 'C:\test\TIFF\'
$pdfs = get-childitem $inputDir -recurse | where {$_.Extension -match "pdf"}
foreach($pdf in $pdfs)
{
$tif = $outputDir + $pdf.BaseName + ".tif"
if(test-path $tif)
{
"tif file already exists " + $tif
}
else
{
'Processing ' + $pdf.Name
$param = "-sOutputFile=$tif"
& $tool -q -dNOPAUSE -sDEVICE=tiffg4 $param -r300 $pdf.FullName -c quit
}
Move-Item $pdf $outputDirPDF
}
EXIT
It appears to be creating one TIFF file for each PDF file in the source directory. How many PDF files are in the directory (and any sub-directories)? How many pages are in the input PDF file?
I note that you move the original PDF from 'InputDir' to 'OutputDirPDF' when completed, but 'OutputDirPDF' is a child of 'InputDir', so if you recurse child directories when looking for input files you may find files you have already processed. NB I know nothing about Powershell so this may be just fine.
I'd suggest making 'InputDir' and 'OutputDirPDF' at the same level, eg "c:\temp\input" and "c:\temp\outputPDF".
That's about all I can say on the information here, you could state what the input PDF filename(s) and output Filename(s) are, and what the processing messages say.

Reading files using Contiki on MSP430F5438A

I want to read a file from my computer and display it on the board, but we are facing a problem while reading the file, as the board is constantly resetting. (The watchdog is off.)
Can anyone help?
If you want to read a file from your local drive, the only way to do it is to upload the file into the Coffee file system (CFS) first and then read it using the CFS library calls such as cfs_open, cfs_seek, and cfs_read; as a reference, have a look at this link:
https://github.com/contiki-os/contiki/wiki/Coffee-filesystem-guide
Here are the steps to upload a file:
Modify the ".c" program file you are working on to initialize the base64 and coffee commands in the shell by adding:
shell_base64_init();
shell_coffee_init();
Compile and upload via the command (with TARGET set to the platform you are using now):
make TARGET=<your-platform> example.upload
To read/upload a .txt file, you modify some Makefile code. To do so, add the following lines:
%.shell-upload: %.txt
	(echo; sleep 4; echo "~K"; sleep 4; \
	echo "dec64 | write $*.txt | null"; sleep 4; \
	../../tools/base64-encode < $<; sleep 4; \
	echo ""; echo "~K"; echo "read $*.txt | size"; sleep 4) | make login
Now you can upload any .txt file to the coffee filesystem of the currently connected mote node using the command:
make testfile.shell-upload
Hope that will solve your problem.

How to write a ping to a text file using VBS

If I am using VBS to run some CMD commands (in this example, ping), how could I write the command's output to a text file using VBS, not DOS?
Set objCmdTest = WScript.CreateObject ("WScript.Shell")
Set Output = CreateObject("Scripting.FileSystemObject").OpenTextFile("C:\vbs\test.txt",8,true)
Output.WriteLine (objCmdTest.run ("ping failboat"))
Output.WriteLine (objCmdTest.run ("ping 8.8.8.8"))
So this is what I'm working with. However, what happens is: the script runs, the file is made, two command prompts open to run the pings, and finally the text inside the file reads:
0
0
whereas I'd much prefer it to contain the ping output.
FYI: Please don't offer suggestions that require me to use DOS for the writing, I'd like to see how VBS can do what I need for multiple reasons, thanks!
The instruction Output.WriteLine (objCmdTest.run ("ping failboat")) will write the return value of the Run method to the output file. If you want to append the command output to an output file you have to either redirect the output in the command:
objCmdTest.run "%COMSPEC% /c ping failboat >>C:\vbs\test.txt", 0, True
or use Exec instead of Run:
Set ping = objCmdTest.Exec("ping failboat")
Do While ping.Status = 0
WScript.Sleep 100
Loop
Output.WriteLine ping.StdOut.ReadAll
WScript.Shell's run method returns the process's exit code. In order to get access to an application's output, you need to use the exec method instead, and use the object that returns to get access to the process's standard output through its StdOut property.
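The Run-vs-Exec distinction (an exit code versus the captured output) is not specific to VBScript; as an illustration, here is the same distinction in Python's subprocess module (the child command is just a stand-in that prints "pong"):

```python
import subprocess
import sys

# run() always gives you the exit code (the analogue of WScript.Shell.Run);
# capture_output=True additionally collects stdout, the analogue of
# Exec plus StdOut.ReadAll.
proc = subprocess.run(
    [sys.executable, "-c", "print('pong')"],
    capture_output=True, text=True,
)
print("exit code:", proc.returncode)    # what Run would return
print("output:", proc.stdout.strip())   # what Exec's StdOut provides
```

If you only write the return value to a file, you get the 0 exit status, which is exactly the behavior described in the question.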