Trace32 continuously read file - scripting

I'm trying to write a script in the Trace32 PRACTICE language that reads some data from a .txt file and then passes the values to a variable that I need to debug.
My problem is that I cannot make it return to the top of the file and start over once it reaches the end of the .txt. Any ideas?
Here is what I have at the moment; it enters the IF branch right from the start:
OPEN #1 C:/Sandboxes/JLR_ADAS_DC/Trace32Scripts/ListCam_new.txt /Read
PRIVATE &value // declare macro
WHILE TRUE()
(
if FILE.EOFLASTREAD()
(
CLOSE #1
OPEN #1 C:/Sandboxes/JLR_ADAS_DC/Trace32Scripts/ListCam_new.txt /Read
)
else
(
READ #1 %LINE &line
&value=STRing.TRIM(STRing.SPLIT("&line",",",0))
PRINT "&value"
V canConfigInitFlag = &value
)
)
CLOSE #1
ENDDO
Thanks!

Closing and re-opening the file to return to the top of the file is the correct approach. (There is no seek command.)
However, there is a bug in your script:
The function FILE.EOFLASTREAD() returns TRUE() if the last READ command only returned empty data because you had already reached the end of the file. In your code, you no longer execute the READ command once you have reached the end of the file, so FILE.EOFLASTREAD() will keep returning TRUE().
The following will work:
OPEN #1 C:/Sandboxes/JLR_ADAS_DC/Trace32Scripts/ListCam_new.txt /Read
PRIVATE &value &line // declare macros
WHILE TRUE()
(
READ #1 %LINE &line
IF FILE.EOFLASTREAD() // Last READ was after the end of the file
(
CLOSE #1
OPEN #1 C:/Sandboxes/JLR_ADAS_DC/Trace32Scripts/ListCam_new.txt /Read
READ #1 %LINE &line // Read 1st line
)
&value=STRing.TRIM(STRing.SPLIT("&line",",",0))
PRINT "&value"
Var.set canConfigInitFlag=&value
)
CLOSE #1
ENDDO

Related

QB64 Open Com port causes data read error

There seems to be a fundamental problem with the QB64 OPEN COM statement, at least with my compiler. When I open the com port with OPEN "Com3: 9600,n,8,1,ds0,cs0,rs" FOR RANDOM AS #1, knowing that there is data in the buffer, and print EOF, LOC, and LOF, it shows EOF=0 (fine), but LOC and LOF both show 0. If you then execute a GET statement you get a "bad record length" error because LOF=0. If I use OPEN FOR INPUT I immediately get EOF=-1, with LOF and LOC=0. If I then use INPUT# I get an "input past end of file" error because EOF was already -1.
I know that the buffer contains "Voltage = 1.2*" (no quotes). If I continue past the "input past end of file" error, I actually get part of the message.
Is there a fix for this com port problem?
If you test the com port you will find Windows nicely blocks those ports:
REM test com port keyboard i/o
DIM x AS _UNSIGNED _BYTE ' assumed: read/write one byte at a time so CHR$/ASC see single characters
OPEN "COM3:9600,N,8,1,BIN,CS0,DS0" FOR RANDOM AS #1
DO
    IF LOC(1) THEN          ' data waiting in the receive buffer
        GET 1, , x
        PRINT CHR$(x);
    END IF
    x$ = INKEY$
    IF LEN(x$) THEN         ' a key was pressed
        IF x$ = CHR$(27) THEN END
        x = ASC(x$)
        PUT 1, , x          ' send the keystroke out the port
    END IF
LOOP
END

Why can't I read the whole file?

I'm trying to do some image processing with an FPGA, and my supervisor wants us to show some simulation results with ModelSim.
So basically we try to read an image file in a testbench and write it to another file, but it stops reading about halfway through the file. Here is my source code:
module fileio1;
integer in,out,r;
reg [31:0]temp;
reg clk;
initial
begin
r=0;
temp =0;
clk = 0;
in = $fopen("test120.bmp","r");
out = $fopen("result.bmp","w");
end
always #1 clk = ~clk;
always @(negedge clk)
begin
r = $fscanf(in,"%c",temp);
end
always @(posedge clk)
begin
if(~r) $fwrite(out,"%c",temp);
end
endmodule
That is the source code. Our input file is a 120x180 bitmap file (64 KB),
but the output file is 38 KB, almost half the size. I tried it with a 480x720 bitmap file (1013 KB); its output file is almost half the size of the original too.
With a very small input file, we do get the correct output file.
Why does this happen? Is there a better function for file input/output?
You should not use $fscanf in that situation. $fscanf skips over whitespace, including blank lines (just like the fscanf() function in C).
You should use the $fread function instead:
always @(negedge clk)
begin
r = $fread(temp,in);
end

Only reading in one input from a file in Ada

I am having some trouble reading input in from a file, so I have made a proof-of-concept program. It is a piece of my main program, which does much more, but reading the input is the only part I am having trouble with.
Here is my proof of concept program:
WITH Ada.Text_IO; USE Ada.Text_IO;
with ada.Integer_Text_IO; use ada.Integer_Text_IO;
PROCEDURE Open_File IS
subtype idnum is string(1 ..7);
-- Make short names so that we can show where things come from
My_File : File_Type; -- Name for file in this program
Os_Name : String := "My_Data.txt"; -- OS name for the file
N : idnum; -- Temporary for reading and printing file contents
EOL : boolean;
C : character;
BEGIN
-- Open will raise an ADA.IO_EXCEPTIONS.NAME_ERROR exception
-- if the file does not exist.
Open (File => My_File, Mode => In_File, Name => Os_Name);
LOOP
EXIT WHEN End_Of_File (My_File);
Look_Ahead(My_File, C, EOL);
IF EOL THEN
Skip_Line;
ELSE
IF C = ' ' THEN
Get(My_File, C);
ELSE
Get (My_File, N);
Put_Line(N);
END IF;
END IF;
END LOOP;
Close (My_File);
END open_file;
My data file looks like this (including the spaces, with no new line after the last id):
1234567
456784a
6758abc
When I compile and run my program, only the first id number gets printed to the screen. I have no clue where to check my code, because it should keep getting id numbers until the end of the file.
Any help would be greatly appreciated. Thanks!
When you Get the second (and, for that matter, the third) line, a Data_Error exception will be raised, because 456784a is not a number: 'a' is not a numeric character. If you want it to be read as a hexadecimal number, the input should be written as 16#456784a# (by default).
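For illustration, a minimal sketch of that idea, assuming the ids are meant to be read as integers with Ada.Integer_Text_IO (rather than as the fixed-width strings of the original program) and that each line of the file holds exactly one Ada integer literal such as 1234567 or 16#456784A#:
with Ada.Text_IO;         use Ada.Text_IO;
with Ada.Integer_Text_IO; use Ada.Integer_Text_IO;

procedure Read_Ids is
   My_File : File_Type;
   N       : Integer;
begin
   Open (File => My_File, Mode => In_File, Name => "My_Data.txt");
   while not End_Of_File (My_File) loop
      -- Integer Get skips blanks and line terminators, and accepts
      -- based literals such as 16#456784A# as well as plain 1234567.
      Get (My_File, N);
      Put_Line (Integer'Image (N));
   end loop;
   Close (My_File);
end Read_Ids;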

How to run the same syntax on multiple SPSS files

I have 24 SPSS files in .sav format in a single folder. All of these files have the same structure. I want to run the same syntax on all of them. Is it possible to write code in SPSS for this?
You can use the SPSSINC PROCESS FILES user-submitted command to do this, or write your own macro. So first let's create some very simple fake data to work with.
*FILE HANDLE save /NAME = "Your Handle Here!".
*Creating some fake data.
DATA LIST FREE / X Y.
BEGIN DATA
1 2
3 4
END DATA.
DATASET NAME Test.
SAVE OUTFILE = "save\X1.sav".
SAVE OUTFILE = "save\X2.sav".
SAVE OUTFILE = "save\X3.sav".
EXECUTE.
*Creating a syntax file to call.
DO IF $casenum = 1.
PRINT OUTFILE = "save\TestProcess_SHOWN.sps" /"FREQ X Y.".
END IF.
EXECUTE.
Now we can use the SPSSINC PROCESS FILES command to specify the sav files in the folder and apply the TestProcess_SHOWN.sps syntax to each of those files.
*Now example calling the syntax.
SPSSINC PROCESS FILES INPUTDATA="save\X*.sav"
SYNTAX="save\TestProcess_SHOWN.sps"
OUTPUTDATADIR="save" CONTINUEONERROR=YES
VIEWERFILE= "save\Results.spv" CLOSEDATA=NO
MACRONAME="!JOB"
/MACRODEFS ITEMS.
Another (less advanced) way is to use the INSERT command. To do so, repeatedly GET each sav file, run the syntax with INSERT, and save the file. Probably something like this:
get 'file1.sav'.
insert file='syntax.sps'.
save outf='file1_v2.sav'.
dataset close all.
get 'file2.sav'.
insert file='syntax.sps'.
save outf='file2_v2.sav'.
etc etc.
Good luck!
If the syntax you need to run is completely independent of the files, then you can either use INSERT FILE = 'Syntax.sps' or put the code in a macro, e.g.
Define !Syntax ()
* Put Syntax here
!EndDefine.
You can then run either of these 'manually':
get file = 'file1.sav'.
insert file='syntax.sps'.
save outfile ='file1_v2.sav'.
Or
get file = 'file1.sav'.
!Syntax.
save outfile ='file1_v2.sav'.
Or, if the files follow a reasonably strict naming structure, you can embed either of the above in a simple bit of Python:
Begin Program.
import spss
for i in range(1, 24 + 1):
    syntax = "get file = 'file" + str(i) + ".sav'.\n"
    syntax += "insert file = 'syntax.sps'.\n"
    syntax += "save outfile = 'file" + str(i) + "_v2.sav'.\n"
    print(syntax)
    spss.Submit(syntax)
End Program.

SAS - Reading a File Backwards?

I need SAS to read many large log files, which are set up to have the most recent activities at the bottom. All I need is the most recent time a particular activity occurred, and I was wondering if it's possible for SAS to skip parsing the (long) beginning parts of the file.
I looked online and found how to read a dataset backwards, but that would require SAS to parse everything in the .log file into a dataset first. Is it possible to read the file directly, starting from the very end, so that I can stop the data step as soon as I find the most recent activity of a particular type?
I read up on infile as well, and the firstobs option, but I have no idea how long these log files are until they are parsed, right? Sounds like a catch-22 to me. So is what I'm describing doable?
I'd probably set up a FILENAME PIPE statement to use an operating system command like tail -r or tac to present the file in reverse order to SAS. That way SAS can read the file normally and you don't have to worry about how long the file is.
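A minimal sketch of that approach (the log path, the search string, and the use of tac are assumptions; on systems without tac, tail -r plays the same role):
/* Present the log in reverse line order via an OS command,         */
/* so the most recent entries are read first.                       */
filename revlog pipe "tac /path/to/mylog.log";

data last_activity;
   length line $200;
   infile revlog truncover;
   input line $char200.;
   /* The first match seen here is the last occurrence in the file. */
   if index(line, "PARTICULAR ACTIVITY") then do;
      put "most recent occurrence: " line;
      output;
      stop;               /* no need to read any further            */
   end;
run;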
If you mean parsing a SAS log file, I am not sure reading the log file backwards is worth the trouble in practice. For instance, the following code executes in less than a tenth of a second on my PC, and it is writing and then reading a 10,000-line log file. How big are your log files, and how many are there? Also, as shown below, you don't have to "parse" everything on every line. You can selectively read some parts of the line and, if it is not what you are looking for, just go on to the next line.
%let pwd = %sysfunc(pathname(WORK));
%put pwd=&pwd;
x cd &pwd;
/* test file. more than 10,000 line log file */
data _null_;
file "test.log";
do i = 1 to 1e4;
r = ranuni(0);
put r binary64.;
if r < 0.001 then put "NOTE: not me!";
end;
put "NOTE: find me!";
do until (r<0.1);
r = ranuni(0);
put r binary64.;
end;
stop;
run;
/* find the last line that starts with
NOTE: and get the rest of the line. */
data _null_;
length msg $80;
retain msg;
infile "test.log" lrecl=80 eof=eof truncover;
input head $char5. @;
if head = "NOTE:" then input @6 msg $char80.;
else input;
return;
eof:
put "last note was on line: " _n_ ;
put "and msg was: " msg $80.;
run;
/* on log
last note was on line: 10013
and msg was: find me!
*/