I have a FORTRAN code which performs a numerical integration. All the computed data are written to an external file (data.out). Here is a simple sketch of the code:
OPEN(UNIT=10,FILE='data.out')
DO i=1,n
........
WRITE(10,'(7(E22.16,1x))')a,b,c,d,e,f,g
........
ENDDO
CLOSE(10)
The program keeps running for a long time (about 1.5 h) until the numerical integration is finished. During the execution I would like to see the results in the .out file. However, when I try to open the .out file while the .exe is running I get the following message:
"The document data.out is used by another application and cannot be accessed." So, is there a way to open the .out file during the execution? It is important for me to observe the output values (there are more than the seven shown in the example above), so it is not convenient to send them to screen output (it significantly reduces the speed of the code).
Many thanks in advance.
* EDIT *
This is another scenario quite similar to the case mentioned above. Here, the integration routine reads the initial conditions from an input file and writes the outputs to another external file. Below I present the corresponding skeleton of the code:
OPEN(UNIT=10,FILE='input.par',STATUS='UNKNOWN')
2 READ(10,*,END=1) x_0,y_0
! INTEGRATION LOOP
DO i=1,n
........
ENDDO
OPEN(UNIT=12,FILE='data.out')
WRITE(12,'(7(E22.16,1x))')a,b,c,d,e,f,g
GOTO 2
1 CLOSE(10)
CLOSE(12)
So, the routine opens UNIT 10, reads the initial conditions, performs the integration and, at the end of the integration, writes the outputs to UNIT 12. Then it takes another set of initial conditions and repeats the same procedure until there are no more initial conditions in UNIT 10. Again, I want to be able to open and monitor UNIT 12. I tried your approach but it does not work properly in this case. I can open UNIT 12 any time I want, but the routine does not write all the outputs there. In fact, it writes only the outputs of the last set of initial conditions. Any ideas? I strongly believe that a minor modification to your approach could do the job.
As an alternative to printing the results to the screen, you could simply close the file after each write operation and reopen it in the next integration cycle to append the new data to it.
program test
   implicit none
   integer :: ii

   ! Create empty file
   open(10, file='data.out', status='REPLACE', action='WRITE')
   close(10)

   do ii = 1, 100
      !...
      ! Reopen the file, append the new information and close it again.
      open(10, file='data.out', status='OLD', action='WRITE', position='APPEND')
      write(10,'(7(E23.16,1x))') 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0
      close(10)
   end do
end program test
EDIT: The main idea is that, before you start any kind of looping, you create an empty file with the first open statement:
open(10, file='data.out', status='REPLACE', action='WRITE')
close(10)
Then, inside the loop, you just append to that file, making sure you do not replace content that is already there:
open(10, file='data.out', status='OLD', action='WRITE', position='APPEND')
write(10,'(7(E23.16,1x))') 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0
close(10)
Please note the differences in the arguments passed to the open statement.
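For the scenario in the question's EDIT, the same idea should carry over: create (or replace) the output file once before the reading loop starts, then open it with position='APPEND', write, and close inside the loop. Below is a minimal sketch under that assumption; the variable names (x_0, y_0, a to g) are taken from the question, double precision is assumed, and the dummy assignments merely stand in for the integration loop.
program append_sketch
   implicit none
   double precision :: x_0, y_0, a, b, c, d, e, f, g
   integer :: ios

   ! Create an empty output file once, before any looping
   open(12, file='data.out', status='REPLACE', action='WRITE')
   close(12)

   open(10, file='input.par', status='OLD', action='READ')
   do
      read(10, *, iostat=ios) x_0, y_0
      if (ios /= 0) exit            ! no more initial conditions

      ! ... the integration loop would go here; dummy values for the sketch ...
      a = x_0; b = y_0; c = 0d0; d = 0d0; e = 0d0; f = 0d0; g = 0d0

      ! Append this set of results and release the file, so it can be
      ! inspected externally while the program keeps running
      open(12, file='data.out', status='OLD', action='WRITE', position='APPEND')
      write(12, '(7(E23.16,1x))') a, b, c, d, e, f, g
      close(12)
   end do
   close(10)
end program append_sketch
Because unit 12 is created once with STATUS='REPLACE' and afterwards only ever opened with POSITION='APPEND', every set of initial conditions adds its own line instead of overwriting the previous ones, and the file is released between writes.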
The logical way to do this would be to echo the output to the terminal. By default, this is unit 6. So I would change your output to:
WRITE(6,'(7(E22.16,1x))')a,b,c,d,e,f,g
WRITE(10,'(7(E22.16,1x))')a,b,c,d,e,f,g
Try it and see. It may even work.
Related
TL;DR: can I do an incremental read of binary files with Red or Rebol?
I would like to use Red to process some large (13MB to 2GB) structured binary files (Kurzweil synthesizer files). I've used other languages (C, Go, Tcl, Ruby, Dart) to walk through these files, and now I'd like to do the same with Red or Rebol.
Is there a way to incrementally read binary files, byte by byte? All I see is read/binary which seems to slurp the entire file at once (or a part of a file).
I'll need to jump around a little bit, too (either peek at the next byte, or skip to the end of a section, or skip past variable length strings to the start of data).
(Yes, I could make some helpers that tracked the position and used read/part/seek.)
I would like to make a call to the low level OS read/seek if that is possible - something new to learn.
This is on macOS, but a portable solution would be great.
Thanks!
PS: "open/read %abc" gives an error "*** Script Error: open does not allow file! for its port argument", even though the help message say the port argument is "port [port! file! url! block!]"
Rebol has ports for that, which are planned for the 0.7.0 release of Red. So for now I/O is very basic and buffer-only, and open is a preliminary stub.
I would like to make a call to the low level OS read/seek if that is possible - something new to learn.
You can leverage the Rebol or Red/System FFI as a learning exercise.
Here is how you would do it in Rebol:
>> file: open/direct/binary %file.dat
>> until [none? probe copy/part file 20]
>> close file
#{732F7072696E74657253657474696E6773312E62}
#{696E504B01022D00140006000800000021006149}
#{0910890100001103000010000000000000000000}
...
#{000000006A290000646F6350726F70732F617070}
#{2E786D6C504B0506000000000D000D0068030000}
#{292C00000000}
none
first file or pick file 1 will return the next byte value (integer!)
This even works with text files: open/lines/direct, in that case copy/part file 20 will return 20 lines, or you can use pick file 1 or first file to get the next line.
Soon this will be available on Red too.
I'm attempting to zip a folder containing subfolders and items, using the Windows shell CopyHere command:
https://msdn.microsoft.com/en-us/library/windows/desktop/bb787866(v=vs.85).aspx
https://msdn.microsoft.com/en-us/library/windows/desktop/ms723207(v=vs.85).aspx
Update: Note, I prefer a native solution-- this is for a distributed Excel VBA tool, so bundling 3rd-party files is not ideal. And I need synchronous compression.
I can easily add a folder and its contents to the zip:
oShell.Namespace(sZipPath).CopyHere "C:\My Folder"
So we know CopyHere can process multiple objects inside a folder in 1 statement.
The problem is, the above command puts the containing folder at the root of the zip, with its contents inside it. But I don't want the containing folder-- just its contents.
The doc mentions a wildcard (option 128), but when I use a wildcard, I get an error:
oShell.Namespace(sZipPath).CopyHere "C:\My Folder\*"
The file name you specified is not valid or too long.
Perhaps there's a way to use my 1st command above, and then move the items in the zip to the root of the zip?
It would be acceptable to loop through each item in the source folder, adding one at a time to the zip. But, because CopyHere is asynchronous, each subsequent CopyHere fails if the previous CopyHere is not finished. None of the fixes work for this issue:
Comparing the number of items in the source folder and the destination zip fails, because if the zip contains a folder, that counts as only 1 item (the items it contains are not counted). https://stackoverflow.com/a/16603850/209942
Waiting a while between each item works, but a timer is unacceptable: it's arbitrary. I cannot guess in advance the size or compress-time of each object.
Checking to see if the zip is locked for access failed for me. If I block my loop until the file is not locked, I still get a file-access error. https://stackoverflow.com/a/6666663/209942
Function FileIsOpen(sPathname As String) As Boolean ' true if file is open
    Dim lFileNum As Long
    lFileNum = FreeFile
    Dim lErr As Long
    On Error Resume Next
    Open sPathname For Binary Access Read Write Lock Read Write As #lFileNum
    lErr = Err
    Close #lFileNum
    On Error GoTo 0
    FileIsOpen = (lErr <> 0)
End Function
Update: VBA can call shell commands synchronously (instead of creating a shell32.shell object in VBA), so if CopyHere works on command-line or PowerShell, that could be the solution. Investigating...
Automating Shell objects really isn't a viable approach as you have already discovered. The Explorer Shell doesn't really expose this capability in any other manner though, at least not before Windows Vista and then not in any fashion easily used from VB6 programs or VBA macros.
Your best bet is a 3rd party ActiveX library, but be careful about 64-bit VBA hosts where you'll need a 64-bit version of such a library.
Another option is to acquire a later copy of the zlibwapi.dll and use some VB6 wrapper code with it. This is also a 32-bit solution.
That's what Zipper & ZipWriter, Zipping from VB programs does. Considering your requirements (which for some reason include a fear of the Timer control) you could use the synchronous ZipperSync Class. See post #4 there. That code includes a simple AddFolderToZipperSync bundling up the logic to add a folder instead of just a single file.
The downside of the synchronous class is that a large archival operation freezes your program UI until it completes. If you don't want that use the Zipper UserControl instead.
You could also take the ideas from that to write your own wrapper class.
Solution:
Windows contains another native compression facility: the .NET method ZipFile.CreateFromDirectory, callable at a PowerShell prompt.
https://msdn.microsoft.com/en-us/library/system.io.compression.zipfile.createfromdirectory(v=vs.110).aspx
https://blogs.technet.microsoft.com/heyscriptingguy/2015/03/09/use-powershell-to-create-zip-archive-of-folder/
This requires .Net 4.0 or later:
> Add-Type -AssemblyName System.IO.Compression
> $src = "C:\Users\v1453957\documents\Experiment\rezip\aFolder"
> $zip="C:\Users\v1453957\Documents\Experiment\rezip\my.zip"
> [io.compression.zipfile]::CreateFromDirectory($src, $zip)
Note, you may have to provide the complete pathnames-- the current working directory was not assumed on my machine.
The above compression is synchronous at the PowerShell prompt, as the OP requests.
Next step is executing it synchronously from VBA. The solution there is the .Run method in the Windows Script Host Object Model. In VBA, set a reference to that library and do the following, setting the 3rd parameter of .Run, bWaitOnReturn, to True:
Function SynchronousShell(sCmd As String) As Long
    Dim oWSH As New IWshRuntimeLibrary.WshShell
    SynchronousShell = oWSH.Run(sCmd, 3, True)
    Set oWSH = Nothing
End Function
Now call SynchronousShell, and pass it the entire compression script.
I believe the only way for this process to work is if CreateFromDirectory is executed in the same session as Add-Type.
So, we must pass the whole thing as 1 string. That is, load all 4 commands into a single sCmd variable, so that Add-Type remains associated with the subsequent CreateFromDirectory. In PowerShell syntax, you can separate them with ;
https://thomas.vanhoutte.be/miniblog/execute-multiple-powershell-commands-on-one-line/
Also, you'll want to use single-quotes instead of double-quotes, else double quotes around the strings are removed when the daisy-chained commands are passed to powershell.exe
https://stackoverflow.com/a/39801732/209942
sCmd = "ps4 Add-Type -AssemblyName System.IO.Compression; $src = 'C:\Users\v1453957\documents\Experiment\rezip\aFolder'; $zip='C:\Users\v1453957\Documents\Experiment\rezip\my.zip'; [io.compression.zipfile]::CreateFromDirectory($src, $zip)"
Solved. The above constitutes the complete solution.
Extra info: the additional comments below cover special circumstances:
Multi-version .Net environments
If a .NET < 4.0 is the active environment on your OS, then System.IO.Compression does not exist-- the Add-Type command will fail. But if your machine has the .NET 4 assemblies available, you can still do this:
Create a batch file which runs PowerShell with .Net 4. See https://stackoverflow.com/a/31279372
In your Add-Type command above, use the exact path to the .Net 4 Compression assembly. On my Win Server 2008:
Add-Type -Path "C:\Windows\Microsoft.NET\assembly\GAC_MSIL\System.IO.Compression.FileSystem\v4.0_4.0.0.0__b77a5c561934e089\System.IO.Compression.FileSystem.dll"
Portability
Turns out, on my machine, I can copy the compression dll to any folder, and make calls to the copy and it works:
Add-Type -Path "C:\MyFunnyFolder\System.IO.Compression.FileSystem.dll"
I don't know what's required to ensure this works-- it might require the full .Net 4.0 or 2.0 files to be located in their expected directories. I assume the dll makes calls to other .Net assemblies. Maybe we just got lucky with this one :)
Character Limit
Depending on the depth of our paths and filenames, character-count may be a concern. PowerShell may have a 260-character limit (not sure).
https://support.microsoft.com/en-us/kb/830473
https://social.technet.microsoft.com/Forums/windowsserver/en-US/f895d766-5ffb-483f-97bc-19ac446da9f8/powershell-command-size-limit?forum=winserverpowershell
Since .Run goes through the Windows shell, you also have to worry about that character limit, but at 8k+, it's a bit roomier:
https://blogs.msdn.microsoft.com/oldnewthing/20031210-00/?p=41553
https://stackoverflow.com/a/3205048/209942
The site below offers a 24k+ character method, but I've not studied it yet:
http://itproctology.blogspot.com/2013/06/handling-freakishly-long-strings-from.html
At minimum, since we can put the dll wherever we like, we can put it in a folder near C: root-- keeping our character-count down.
Update: This post shows how we can put the whole thing in a script-file, and call it with ps4.cmd. This may become my preferred answer:
.\ps4.cmd GC .\zipper.ps1 | IEX
-- depending on answer here.
CopyHere:
Re the question: can the CopyHere command execute on the command line?
CopyHere can be executed directly at a PowerShell prompt (code below). However, even in PowerShell it is asynchronous-- control returns to the PowerShell prompt before the process is finished. Therefore it is no solution for the OP. Here's how it's done:
> $shellapp=new-object -com shell.application
> $zippath="test.zip"
> $zipobj=$shellapp.namespace((Get-Location).Path + "\$zippath")
> $srcpath="src"
> $srcobj=$shellapp.namespace((Get-Location).Path + "\$srcpath")
> $zipobj.Copyhere($srcobj.items())
I experienced an error in SAP ABAP which says DATASET_CANT_CLOSE with error number 32 (Broken Pipe). The question is: what procedure triggers this kind of error?
As far as I know, this error was triggered by:
CLOSE DATASET dset
But I can't reproduce the error since I don't know which procedure triggers this kind of error.
This is the code I use:
method GENERATE_TXT_FILE.
  DATA: lwa_data TYPE t_line,
        lv_param TYPE sxpgcolist-parameters.

  "Upload file to server
  "Open dataset
  OPEN DATASET im_file_name FILTER 'dos2ux'
    FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.

  CLEAR lwa_data.
  LOOP AT it_data INTO lwa_data.
    CATCH SYSTEM-EXCEPTIONS file_access_errors = 4
                            OTHERS = 8.
      TRANSFER lwa_data-lines TO im_file_name.
    ENDCATCH.
    IF sy-subrc <> 0.
      CLEAR lwa_data.
      EXIT.
    ENDIF.
    CLEAR lwa_data.
  ENDLOOP.

  "Close dataset
  CLOSE DATASET im_file_name.
Investigating the background job log, it seems that the server which ran the background job had not yet been mapped to the text file folder. The solution is to re-map the server to the text file folder.
You are using the FILTER extension to OPEN DATASET - which can be a HUGE security issue as well as raise loads of portability issues unless you know what you're doing, but that's not what the question is about. From the documentation:
When the statement OPEN DATASET is executed, a process is started in the operating system for the specified statement. When the file is opened for reading, a channel (pipe) is linked with STDOUT of the process, from which the data is read during file reading. The file itself is linked with STDIN of the process. When the file is opened for writing, a channel (pipe) is linked to STDIN of the process, to which data is passed when writing. The output of the process is diverted to this file.
In your case, the filter command probably decided to bail out - see this answer among many. Why it did is hard to investigate - you may have to go through various system logs to find out. If the problem really is some unmapped network folder, you could try switching to UNC paths.
I have a model of a heating process in Ansys Multiphysics V11.
After running the simulation, I have a script to plot a temperature profile:
!---------------- POST PROCESSING -----------------------
/post1                   ! enter the database postprocessor
!--- define temperature profile
path,s_temp1,2,,100      ! define a path
ppath,1,,dop/2,0,0       ! create a path point
ppath,2,,dop/2,1.5,0     ! create a path point
PDEF,surf_t1,TEMP, ,noav ! map TEMP onto the path
plpath,surf_t1           ! plot the path item
What I now need is to save the resulting path data to a text file. I have already looked online for a solution and found the following code to do it, which I appended after the lines above:
/OUTPUT,filename,extension
PRPATH,surf_t1
/OUTPUT
Ansys generates the file filename.extension but it is empty. I tried placing the /OUTPUT command in a few locations in the script, but without any success.
I suspect I need to define something else, but I have no idea where to look, as the Ansys documentation online is terribly chaotic, and none of the internet pages I opened before writing this question were any better.
A final note: Ansys V11 is an old version of the software, but I don't want to upgrade it and fit the old model to the new software.
For the output of the simulation (which includes all calculation steps, sub-step descriptions and node-by-node results), the output file must be declared at the beginning of the code, and not in the postprocessing phase.
Declaring
/OUTPUT,filename,extension
in the preamble of the main script ensures that the output is stored in the right location, with the desired extension. At the end of the script, you must then declare
/OUTPUT
to reset the output file location for ANSYS.
The output of the PATH commands made in the postprocessing script is, however, not printed to that file.
It is convenient to use
*CFOPEN,file,ext
*VWRITE,Vector(1,1),Vector(1,2)
(2F12.6)
*CFCLOSE
where Vector is a two-column array created by *DIM that stores the data you want to output to the file.
As *VWRITE is a special command, run it from a file, i.e. a macro such as macro_output.mac.
I have a program which performs a useful task. Now I want to produce the plain-text source code when the compiled executable runs, in addition to performing the original task. This is not a quine, but is probably related.
This capability would be useful in general, but my specific program is written in Fortran 90 and uses Mako Templates. When compiled it has access to the original source code files, but I want to be able to ensure that the source exists when a user runs the executable.
Is this possible to accomplish?
Here is an example of a simple Fortran 90 program which does a simple task:
program exampl
   implicit none
   write(*,*) 'this is my useful output'
end program exampl
Can this program be modified such that it performs the same task (outputs the string when the compiled executable runs) and also outputs a Fortran 90 text file containing its own source?
Thanks in advance
It's been so long since I have touched Fortran (and I've never dealt with Fortran 90) that I'm not certain but I see a basic approach that should work so long as the language supports string literals in the code.
Include your entire program inside itself as a block of literals. Obviously you can't include the literals within themselves; instead you need some sort of token that tells your program where the block of literals goes.
Obviously this means you have two copies of the source, one inside the other. As this is ugly, I wouldn't do it that way, but rather store your source with the include_me token in it and run it through a program that builds the nested file before you compile it. Note that this program will share a decent amount of code with the routine that recreates the code from the block of literals. If you're going to go this route, I would also make the program spit out the source for this program, so whoever is trying to modify the files doesn't need to deal with the two copies.
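To make the literal-block idea concrete, here is a minimal Fortran 90 sketch of the mechanism. The embedded lines are placeholders rather than a full copy of the program's own source; producing the complete block (and the include_me substitution) would be the job of the helper program described above.
program self_dump
   implicit none
   integer :: i
   ! Placeholder block of literals; the helper program would generate the
   ! full source of this very program here, one line per array element.
   character(len=80) :: src(3)

   src(1) = "program exampl"
   src(2) = "   write(*,*) 'this is my useful output'"
   src(3) = "end program exampl"

   ! Perform the original useful task...
   write(*,*) 'this is my useful output'

   ! ...and also write out the embedded source
   open(unit=2, file='exampl_out.f90')
   do i = 1, size(src)
      write(2,'(A)') trim(src(i))
   end do
   close(2)
end program self_dump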
My original program (see the question) is edited to add an include statement.
Call this file "exampl.f90"
program exampl
   implicit none
   write(*,*) "this is my useful output"
   open(unit=2,file="exampl_out.f90")
   include "exampl_source.f90"
   close(2)
end program exampl
Then another program (written in Python in this case) reads that source and replaces each line with a write statement:
import os

f = open('exampl.f90')              # read in exampl.f90
g = open('exampl_source.f90', 'w')  # and replace each line with write(2,*) 'line'
for line in f:
    g.write("write(2,*) '" + line.rstrip() + "'\n")
f.close()
g.close()

# then compile exampl.f90 (which includes exampl_source.f90)
os.system('gfortran exampl.f90')
os.system('/bin/rm exampl_source.f90')
Running this Python script produces an executable. When the executable is run, it performs the original task AND writes out the source code.