DATASET_CANT_CLOSE error number 32 "Broken Pipe" - ABAP

I experienced an error in SAP ABAP: DATASET_CANT_CLOSE with error number 32 (Broken Pipe). My question is: what triggers this kind of error?
As far as I know, the error is raised by:
CLOSE DATASET dset
but I can't reproduce it, since I don't know what conditions actually trigger it.
This is the code I use:
method GENERATE_TXT_FILE.
  DATA:
    lwa_data TYPE t_line,
    lv_param TYPE sxpgcolist-parameters.

  "Upload File to Server
* Open Dataset
  OPEN DATASET im_file_name FILTER 'dos2ux'
       FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.

  CLEAR lwa_data.
  LOOP AT it_data INTO lwa_data.
    CATCH SYSTEM-EXCEPTIONS file_access_errors = 4
                            OTHERS             = 8.
      TRANSFER lwa_data-lines TO im_file_name.
    ENDCATCH.
    IF sy-subrc <> 0.
      CLEAR lwa_data.
      EXIT.
    ENDIF.
    CLEAR lwa_data.
  ENDLOOP.

* Close Dataset
  CLOSE DATASET im_file_name.
endmethod.

Investigating the background job log, it seems that the application server that ran the background job had not yet been mapped to the text file folder. The solution is to re-map the server to the text file folder.

You are using the FILTER extension of OPEN DATASET, which can be a HUGE security issue and can raise loads of portability issues unless you know what you're doing - but that's not what the question is about. From the documentation:
When the statement OPEN DATASET is executed, a process is started in
the operating system for the specified statement. When the file is
opened for reading, a channel (pipe) is linked with STDOUT of the
process, from which the data is read during file reading. The file
itself is linked with STDIN of the process. When the file is opened
for writing, a channel (pipe) is linked to STDIN of the process, to
which data is passed when writing. The output of the process is
diverted to this file.
In your case, the filter command probably decided to bail out - see this answer among many. Why it did is hard to investigate - you may have to go through various system logs to find out. If the problem really is an unmapped network folder, you could try switching to UNC paths.
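If you want the job to report the problem instead of terminating with DATASET_CANT_CLOSE, here is a minimal sketch - not the poster's original code. It assumes a release where the class-based file exceptions are available (cx_sy_file_access_error being, as far as I recall, the common superclass of the OPEN/TRANSFER/CLOSE errors) and reuses im_file_name, it_data and lwa_data from the question:
DATA: lv_msg  TYPE string,
      lx_file TYPE REF TO cx_sy_file_access_error.

TRY.
    OPEN DATASET im_file_name FOR OUTPUT IN TEXT MODE
         ENCODING DEFAULT FILTER 'dos2ux'.
    IF sy-subrc <> 0.
      " e.g. the target directory is not mapped / not reachable from this server
      MESSAGE 'Could not open file for output' TYPE 'E'.
    ENDIF.

    LOOP AT it_data INTO lwa_data.
      TRANSFER lwa_data-lines TO im_file_name.
    ENDLOOP.

    " a dying filter process typically surfaces here as the broken pipe
    CLOSE DATASET im_file_name.

  CATCH cx_sy_file_access_error INTO lx_file.
    " put the cause into the job log instead of dumping
    lv_msg = lx_file->get_text( ).
    MESSAGE lv_msg TYPE 'E'.
ENDTRY.
In a background job the MESSAGE then shows up in the job log, which is usually easier to act on than a short dump.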

Related

dbt Error: Encountered an error: 'utf-8' codec can't decode byte 0xa0 in position 441: invalid start byte

I upgraded my dbt version to 1.0.0 last night and ran a few connection tests; they went well. Now when I run my first dbt example model, I get the error below, even though I have not changed any code in this default example model.
I get the same error when running the dbt seed command for a CSV dataset. The CSV is UTF-8 encoded and has no special characters in it.
I am using Python 3.9.
Could anyone suggest what the issue is?
Below is my first dbt model SQL.
After lots of back and forth, I figured out the issue. It is really a fundamental-concept issue.
Every time you execute dbt run, dbt scans the entire project directory, including the seeds directory, even when it is not materializing the seeds (screenshot attached below).
If it finds any CSV, it parses that as well.
In the case of the error above, I had a CSV file that looked as follows:
The highlighted line contains a symbol character that dbt (i.e. Python) was not able to parse, causing the error above.
This symbol was not visible earlier in Excel or Notepad++.
It could be the issue with the Snowflake Python connector that @PeterH pointed out.
As a temporary solution, we are manually removing these characters from the data file for now.
I’d leave this as a comment but I don’t have the rep yet…
This appears to be related to a recently opened issue:
https://github.com/dbt-labs/dbt-snowflake/issues/66
Apparently it’s something to do with the Snowflake Python adapter.
Since you’re seeing the error in a different context, it might be helpful to post in that issue that you’re seeing this outside of query preview.

GrADS script sometimes throws an error while opening a grib ctl file

I am getting this error from a GrADS script run under Sun Grid Engine. However, the grib file and its associated ctl file exist and are valid. If I re-run the GrADS script job, it succeeds. I just don't understand why it sometimes fails.
opening ctl file
/data/myprogram/20211027/gribs/mygribname.grb.ctl
Open Error: Can't open binary data file
File name = /data/myprogram/20211027/gribs/mygribname.grb
The funny thing is... the code is trying to open the ctl file, not the grib file. I'm not sure why it even tried to open the grib file instead.
I figured out the answer to my question. I had to study a little GrADS so I could interpret the logs better. The error above is thrown when the file does not exist. (Opening a ctl descriptor also makes GrADS open the binary data file named in it, which is why the grib file appears in the error message.) In my case the file was there, but it was being generated after the GrADS code first accessed it, and sometimes it was still being written while GrADS was trying to read it. I added timestamps to the operations in full ISO mode; the operations were happening within milliseconds of each other.

Where does gui_upload upload the files?

Simply put: where does the gui_upload function upload files on the application server? How can I find their location, and is there any tcode where I can find the last uploaded files or search for them by name?
Thanks.
The gui_upload function does not upload a file to the application server; it just reads a file from the presentation layer into an internal table. You will then need to use some other statement to write this internal table to a file on the application server.
Hope this helps.
I can confirm both previous answers.
If you'd like to upload a text/CSV file to an application server, for example, you can use the following code.
Using GUI_UPLOAD to read the provided input file into an internal table:
lv_filename = p_filebp.

CLEAR lt_data_tab.
IF NOT lv_filename IS INITIAL.
  CALL FUNCTION 'GUI_UPLOAD'
    EXPORTING
      filename = lv_filename
    TABLES
      data_tab = lt_data_tab
    EXCEPTIONS
      file_open_error = 1
      OTHERS          = 17.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.
ENDIF.
Check whether there was data in the text file and whether it has been transferred into the internal table correctly.
If so, write it to a file on the application server. You can provide the path where you want the file to be created yourself:
READ TABLE lt_data_tab INDEX 1.
IF sy-subrc <> 0.
  WRITE: / 'No data in input file.'.
ELSE.
  CONCATENATE '/interfaces/' sy-sysid '/CRM_ACTIVITIES/' sy-datum '_INPUT.CSV' INTO p_serinp.

  OPEN DATASET p_serinp FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
  IF sy-subrc NE 0.
    EXIT.
  ENDIF.

  LOOP AT lt_data_tab.
    TRANSFER lt_data_tab TO p_serinp.
    CLEAR lt_data_tab.
  ENDLOOP.

  CLOSE DATASET p_serinp.
  IF sy-subrc = 0.
    CLEAR gd_error_text.
    CONCATENATE 'Download to file: ' p_serinp ' is finished.'
                INTO gd_error_text SEPARATED BY space.
    WRITE: / gd_error_text.
  ENDIF.
ENDIF.
Keep in mind that, using the OPEN DATASET statement, you can write files anywhere on your network, provided you have the required authorizations.
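If you want to test those authorizations in code before opening the file, here is a minimal sketch using the standard function module AUTHORITY_CHECK_DATASET (the sabc_act_write constant comes from type pool SABC; verify the exact interface in SE37 on your release):
TYPE-POOLS sabc.

CALL FUNCTION 'AUTHORITY_CHECK_DATASET'
  EXPORTING
    activity         = sabc_act_write
    filename         = p_serinp
  EXCEPTIONS
    no_authority     = 1
    activity_unknown = 2
    OTHERS           = 3.
" only proceed with OPEN DATASET when the check passed
IF sy-subrc <> 0.
  WRITE: / 'No authorization to write file', p_serinp.
  EXIT.
ENDIF.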
Transaction AL11 can be used to explore the existing folders and files on your SAP system.
I don't have an ABAP stack on hand, but I believe what you're looking for is the OPEN DATASET statement, or something along those lines. It handles reading and writing files on the application server. The files that are processed via OPEN DATASET can be found with transaction AL11.
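For the reading direction, a minimal sketch (the path is a made-up example; browse AL11 for directories that actually exist on your system):
DATA: lv_line  TYPE string,
      lt_lines TYPE TABLE OF string.

OPEN DATASET '/tmp/example.txt' FOR INPUT IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc = 0.
  DO.
    " read the server-side file line by line until end of file
    READ DATASET '/tmp/example.txt' INTO lv_line.
    IF sy-subrc <> 0.
      EXIT.
    ENDIF.
    APPEND lv_line TO lt_lines.
  ENDDO.
  CLOSE DATASET '/tmp/example.txt'.
ENDIF.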

Exporting Mathematica Print[] Output to a .txt file

I have a large Mathematica notebook that uses Print[] commands periodically to output runtime messages. This is the only output (aside from exported files) that this notebook generates. Is there any way I can automate the export of this output to a .txt file without having to re-write the Print[] commands?
According to the documentation, Print outputs to the $Output channel, which is a list of streams. So, at the beginning of the notebook:
strm = OpenWrite["output.log"];
AppendTo[ $Output, strm ];
and at the end of the notebook
Close[strm];
Note: if execution is interrupted prior to closing the stream, you'll have to close it manually. Also, the above code will overwrite prior data in "output.log", so you may wish to use OpenAppend instead.
Edit: to guarantee that the stream is closed even if evaluation is aborted, consider using one of the techniques outlined here.
You want the PutAppend command: PutAppend[expr, "output.log"] appends an expression to a file, and expr >>> "output.log" is its shorthand form.

COBOL OPEN I-O: create file if it doesn't exist

Does anyone have an idea how to catch the exception COBOL throws when you try to open a file for I-O that doesn't exist, and then create a new file?
The OPTIONAL phrase on the SELECT clause will do this:
SELECT OPTIONAL FILE-A
    ASSIGN TO "INFILE"
    ORGANIZATION INDEXED.
With OPEN I-O, the file will be created if necessary. With OPEN INPUT, the file will not be created but will be treated as being at EOF, and all random reads will raise "INVALID KEY".
I'm pretty sure this is an ANSI standard clause, but I can't remember when it showed up.
I don't know what version of COBOL you use or what platform you use it on. My program first checks whether the file exists before it tries to open it. I use Unisys COBOL-85 on the MCP mainframe platform. The messages are lame, but who cares?
Here is a snippet from a job that runs daily:
IF ATTRIBUTE RESIDENT OF OU3-WORK-LIST-FILE = VALUE TRUE
    DISPLAY "PROGRAM SHOWS ATTRIBUTE TRUE"
    OPEN EXTEND OU3-WORK-LIST-FILE
ELSE
    DISPLAY "PROGRAM SHOWS FALSE"
    OPEN OUTPUT OU3-WORK-LIST-FILE
END-IF.
Cathy