Load xls with cell values of 80+ characters from frontend - abap

I need to read excel files via SAPGUI (not in batch, not from server).
Only one sheet/file, not a csv file.
I am aware of a few function modules that do that, but they are restricted to 32, 40, or 50 characters per cell.
Are there function modules or classes/methods that allow me to read Excel files with longer cells?
Longer means: either string, or defined by the caller, or at least 80 characters.
Edit
I used ALSM_EXCEL_TO_INTERNAL_TABLE successfully in other projects where cell size is not that important. This module reads into a structure ALSMEX_TABLINE that restricts data to 50 characters.
KCD_EXCEL_OLE_TO_INT_CONVERT reads into a table with 32 characters / cell.

You are right, function module 'ALSM_EXCEL_TO_INTERNAL_TABLE' can manage only 50 characters. One standard alternative is function module 'GUI_UPLOAD', which I have used, but in that case you must first convert the Excel file to a CSV file, which is not what you really want.
The other alternative, according to this link, is to create a copy of 'ALSM_EXCEL_TO_INTERNAL_TABLE' and a copy of the structure 'ALSMEX_TABLINE'.
The field 'VALUE' of the new structure must be changed to the length you need, and the copy of 'ALSM_EXCEL_TO_INTERNAL_TABLE' would then read into the new copy of 'ALSMEX_TABLINE'.
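For illustration, a call to such a widened copy might look like the sketch below; the names Z_ALSM_EXCEL_TO_INTERNAL_TABLE and ZALSMEX_TABLINE (with VALUE widened to, say, 256 characters) are assumed copies, and the parameters simply mirror the original module:
DATA lt_cells TYPE STANDARD TABLE OF zalsmex_tabline.  " hypothetical widened copy of ALSMEX_TABLINE

CALL FUNCTION 'Z_ALSM_EXCEL_TO_INTERNAL_TABLE'  " hypothetical copy of the original module
  EXPORTING
    filename                = 'C:\test.xls'
    i_begin_col             = 1
    i_begin_row             = 1
    i_end_col               = 20
    i_end_row               = 1000
  TABLES
    intern                  = lt_cells
  EXCEPTIONS
    inconsistent_parameters = 1
    upload_ole              = 2
    OTHERS                  = 3.
IF sy-subrc <> 0.
* error handling
ENDIF.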
I haven't tried this solution myself, but it might work for you.
Hope it helps.

You can use FILE_READ_AND_CONVERT_SAP_DATA for that purpose. Its output table cells are limited to 256 characters, which should be quite sufficient for you. A code sample is given below:
TYPES: tv_data(256) TYPE c,
       BEGIN OF ts_data,
         value_0001 TYPE tv_data,
         ...
         value_0020 TYPE tv_data,
       END OF ts_data,
       tt_data TYPE TABLE OF ts_data.

DATA: lv_fname TYPE filename-fileintern,
      pt_data  TYPE tt_data.

lv_fname = 'C:\test.xls'.

CALL FUNCTION 'FILE_READ_AND_CONVERT_SAP_DATA'
  EXPORTING
    i_filename           = lv_fname
    i_servertyp          = 'OLE2'
    i_fileformat         = 'XLS'
  TABLES
    i_tab_receiver       = pt_data
  EXCEPTIONS
    file_not_found       = 1
    close_failed         = 2
    authorization_failed = 3
    open_failed          = 4
    conversion_failed    = 5
    OTHERS               = 6.

IF sy-subrc <> 0.
* error handling
ENDIF.

How to call BAPI_MATERIAL_SAVEDATA with custom fields from NCo?

In our current project we are using SAPNCO3 with RFC calls. The requirement is to create a material with the function "BAPI_MATERIAL_SAVEDATA" and some custom fields (via EXTENSIONIN). The problem now is how to extend the needed structures "BAPI_TE_MARA/X" so that they can carry the custom fields? I cannot find any function for this.
Please have a look at the Code snippet at the bottom.
Thank you!
Tobias
var BAPI_TE_MARA = repo.GetStructureMetadata("BAPI_TE_MARA");
IRfcStructure structure = BAPI_TE_MARA.CreateStructure();
structure.SetValue("MATERIAL", material.Number);
//structure.SetValue("ZMM_JOB_REFERENCE", "f");
BAPI_MATERIAL_SAVEDATA has two table parameters EXTENSIONIN and EXTENSIONINX to which you pass lines with the values of your custom fields.
These table parameters have to indicate what extension structures you want to use and their values.
As these custom fields may extend different tables of the material, you have to indicate different extension structures depending on which table these fields belong to:
For the table MARA, the extension structures are BAPI_TE_MARA and BAPI_TE_MARAX.
For the table MARC, the extension structures are BAPI_TE_MARC and BAPI_TE_MARCX.
These extension structures should preferably have character-like fields to simplify the programming (and to support IDocs, as a rule of thumb).
For instance, if you have the custom fields ZZCNAME (7 characters) and ZZCTEXT (50 characters) in the table MARA, they will also be defined in BAPI_TE_MARA and have the same names and types. In BAPI_TE_MARAX, you also have two fields with the same names, but always of length 1 character and their values must be 'X' to indicate that a value is passed in BAPI_TE_MARA (useful in case a blank value is passed that must not be ignored). The X extension structures are essential especially in "change" BAPIs.
If you want to pass values to the BAPI, you must first initialize these structures:
BAPI_TE_MARA:
MATERIAL     ZZCNAME ZZCTEXT
------------ ------- -------
000000012661 NAME    TEXT
BAPI_TE_MARAX:
MATERIAL     ZZCNAME ZZCTEXT
------------ ------- -------
000000012661 X       X
Then, you must initialize the two parameters of the BAPI:
EXTENSIONIN (notice the 3 spaces between NAME and TEXT, because the technical length of ZZCNAME is 7 characters and its value "NAME" occupies only 4 characters):
STRUCTURE    VALUEPART1 (240 Char)   VALUEPART2 (240) VALUEPART3 (240) VALUEPART4 (240)
------------ ----------------------- ---------------- ---------------- ----------------
BAPI_TE_MARA 000000012661NAME   TEXT
EXTENSIONINX:
STRUCTURE     VALUEPART1 (240 Char)  VALUEPART2 (240) VALUEPART3 (240) VALUEPART4 (240)
------------- ---------------------- ---------------- ---------------- ----------------
BAPI_TE_MARAX 000000012661XX
Consequently, your program must:
concatenate all BAPI_TE_MARA fields together and copy the resulting string into the fields VALUEPART1 to VALUEPART4 of EXTENSIONIN, as if they formed one 960-character field
concatenate all BAPI_TE_MARAX fields together and copy the resulting string into the fields VALUEPART1 to VALUEPART4 of EXTENSIONINX
I guess you may use ToString() to get one concatenated string of characters of all fields of a structure, and to set the value of VALUEPART1, VALUEPART2, etc., you'll probably need to initialize them individually from the string of characters with Substring.
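To make the flattening concrete, here is a minimal ABAP-side sketch of the same idea (it assumes the custom fields are character-like, as recommended above; on the NCo side you would reproduce this with ToString()/Substring):
DATA: ls_te_mara   TYPE bapi_te_mara,
      ls_extin     TYPE bapiparex,
      lv_flat(960) TYPE c.

ls_extin-structure = 'BAPI_TE_MARA'.
lv_flat = ls_te_mara.                    " a purely character-like structure flattens into one long field
ls_extin-valuepart1 = lv_flat(240).
ls_extin-valuepart2 = lv_flat+240(240).
ls_extin-valuepart3 = lv_flat+480(240).
ls_extin-valuepart4 = lv_flat+720(240).
" EXTENSIONINX is filled the same way from a BAPI_TE_MARAX work area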
My comment was half correct and half incorrect: I wasn't aware of the extension technique in this BAPI, so I didn't know that this structure is actually used in this BAPI.
You asked
The problem now is how to extend the needed structures "BAPI_TE_MARA/X" so that they can carry the custom fields?
and what I said indeed stays valid: you can't extend the interface from NCo, only on the backend.
You wrote:
At this If I load BAPI_TE_MARA there aren't any custom fields but the material
and this gives me the idea that your ABAP developers did only half of the work. The things to be done on the SAP backend:
Extend the MARA table with custom Z fields (in SAP this is called an append structure)
Extend the interface structure BAPI_TE_MARA with fields that exactly correspond to the MARA fields
This is how it must look on the backend.
If you don't see any custom fields in BAPI_TE_MARA except MATERIAL, then step 2 is probably missing on the SAP side. From what I gathered from your comments, they created the structure ZMM_S_MATMAS_ADDITION but appended it only to MARA, not to BAPI_TE_MARA.
What is missing from Sandra's excellent holistic answer is step 3: for all this construction to work, some customizing needs to be done.
The T130F table must contain your custom fields. To maintain the entry for T130F, go to transaction SPRO or directly to the maintenance view V_130F.
SPRO way: go to SPRO -> Logistics - General -> Material Master -> Field Selection -> Assign Fields to Field Selection Groups and maintain the entry in the table.
Sample ABAP code that does this:
DATA: ls_headdata      TYPE bapimathead,
      lt_extensionin   TYPE STANDARD TABLE OF bapiparex,
      ls_extensionin   LIKE LINE OF lt_extensionin,
      lt_extensioninx  TYPE STANDARD TABLE OF bapiparexx,
      ls_extensioninx  LIKE LINE OF lt_extensioninx,
      lt_messages      TYPE bapiret2_t,
      ls_bapi_te_mara  TYPE bapi_te_mara,
      ls_bapi_te_marax TYPE bapi_te_marax.

" convert the material number to its internal 18-character format
ls_headdata-material   = |{ ls_headdata-material ALPHA = IN }|.
ls_headdata-basic_view = 'X'.

ls_bapi_te_mara-material  = ls_headdata-material.
ls_bapi_te_mara-zztest1   = '322223'.
ls_bapi_te_marax-material = ls_headdata-material.
ls_bapi_te_marax-zztest1  = 'X'.

ls_extensionin-structure  = 'BAPI_TE_MARA'.
ls_extensionin-valuepart1 = ls_bapi_te_mara.
APPEND ls_extensionin TO lt_extensionin.

ls_extensioninx-structure  = 'BAPI_TE_MARAX'.
ls_extensioninx-valuepart1 = ls_bapi_te_marax.
APPEND ls_extensioninx TO lt_extensioninx.

CALL FUNCTION 'BAPI_MATERIAL_SAVEDATA'
  EXPORTING
    headdata       = ls_headdata
  TABLES
    returnmessages = lt_messages
    extensionin    = lt_extensionin
    extensioninx   = lt_extensioninx.

CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
  EXPORTING
    wait = 'X'.
Based on this, you can model your .NET code for calling the BAPI.
P.S. Pay attention to the first line with ALPHA = IN. The input to the material number field must be in fully qualified 18-character format with leading zeroes, e.g. 000000000000000323, otherwise the update will fail.
Always extend structure EMARA and not MARA, BAPI_TE_MARA, ... directly.

AS400 RPGLE/free dynamic variables in operations

I'm fairly certain after years of searching that this is not possible, but I'll ask anyway.
The question is whether it's possible to use a dynamic variable in an operation when you don't know the field name. For example, I have a data structure that contains a few hundred fields. The operator selects one of those fields, and the program needs to know what data resides in that field of the data structure passed in. So we'll say that there are 100 fields, and field50 is what the operator chose to operate on. The program would be passed the field name (i.e. field50) in the FLDNAM variable. The program would read something like this the normal way:
/free
  if field50 = 'XXX';
    // do something
  endif;
/end-free
The problem is that I would have to code this 100 times for every operation. For example:
/free
  if fldnam = 'field1';
    // do something
  elseif fldnam = 'field2';
    // do something
  ..
  elseif fldnam = 'field50';
    // do something
  endif;
Is there any possible way of performing an operation on a field not yet known? (i.e. IF FLDNAM(pointer data) = 'XXX' then do something)
If the data structure is externally described and you know which file it comes from, you could use the QUSLFLD API to find out the offset, length, and type of the field in the data structure, then use a substring to extract the data, and finally apply whatever conversion the data type requires to get the value.
Simple answer: no.
RPG simply isn't designed for that. Few languages are.
You may want to look at scripting languages. Perl, for instance, can evaluate code on the fly. REXX, which comes installed on the IBM i, has an INTERPRET keyword.
REXX Reference manual

Lossless assignment between Field-Symbols

I'm currently trying to perform a dynamic lossless assignment in an ABAP 7.0 SP26 environment.
Background:
I want to read in a CSV file and move it into an internal structure without any data loss. Therefore, I declared the field-symbols:
<lfs_field> TYPE any which represents a structure component
<lfs_element> TYPE string which holds a csv value
Approach:
My current "solution" is this (lo_field is an element description of <lfs_field>):
IF STRLEN( <lfs_element> ) > lo_field->output_length.
  RAISE EXCEPTION TYPE cx_sy_conversion_data_loss.
ENDIF.
I don't know precisely how it works, but it seems to catch the most obvious cases.
Attempts:
MOVE EXACT <lfs_field> TO <lfs_element>.
...gives me...
Unable to interpret "EXACT". Possible causes: Incorrect spelling or comma error
...while...
COMPUTE EXACT <lfs_field> = <lfs_element>.
...results in...
Incorrect statement: "=" missing .
As the ABAP version is too old, I also cannot use EXACT #( ... ).
Example:
In this case I'm using normal variables. Let's just pretend they are field-symbols:
DATA: lw_element TYPE string VALUE '10121212212.1256',
      lw_field   TYPE p DECIMALS 2.

lw_field = lw_element.
* lw_field now contains 10121212212.13 without any notice about the precision loss
So, how would I do a perfect valid lossless assignment with field-symbols?
Don't see an easy way around that. Guess that's why they introduced MOVE EXACT in the first place.
Note that output_length is not a clean solution. For example, string always has output_length 0, but will of course be able to hold a CHAR3 with output_length 3.
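As a quick illustration of that caveat (just a small sketch):
DATA lv_string TYPE string.
DATA(lo_descr) = CAST cl_abap_elemdescr( cl_abap_elemdescr=>describe_by_data( lv_string ) ).
WRITE lo_descr->output_length.  " prints 0, although the string could hold any CHAR3 value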
Three ideas how you could go about your question:
Parse and compare types. Parse the source field to detect format and length, e.g. "character-like", "60 places". Then get an element descriptor for the target field and check whether the source fits into the target. I don't think it makes sense to start collecting the possibly large CASEs for this here. If you have access to a newer ABAP, you could try generating a large test data set there and use it to reverse-engineer the compatibility rules from MOVE EXACT.
Back-and-forth conversion. Move the value from source to target and back, and see whether it changes. If it changes, the fields aren't compatible. This is imprecise, as some formats will change although the values remain the same; for example, -42 could change to 42-, although this is the same in ABAP. (A small sketch of this idea follows after the code below.)
To-longer conversion. Move the field from source to target. Then construct a slightly longer version of the target, and move the source there as well. If the two targets are identical, the fields are compatible. This fails at the boundaries, i.e. if it's not possible to construct a slightly longer version, e.g. because the maximum number of decimal places of a P field is already reached. The following code sketches this third idea:
DATA target TYPE char3.
DATA source TYPE string VALUE `123.5`.

DATA(lo_target) = CAST cl_abap_elemdescr( cl_abap_elemdescr=>describe_by_data( target ) ).
DATA(lo_longer) = cl_abap_elemdescr=>get_by_kind(
    p_type_kind = lo_target->type_kind
    p_length    = lo_target->length + 1
    p_decimals  = lo_target->decimals + 1 ).

DATA lv_longer TYPE REF TO data.
CREATE DATA lv_longer TYPE HANDLE lo_longer.
ASSIGN lv_longer->* TO FIELD-SYMBOL(<longer>).

<longer> = source.
target = source.

IF <longer> = target.
  WRITE `Fits`.
ELSE.
  WRITE `Doesn't fit, ` && target && ` is different from ` && <longer>.
ENDIF.
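And here is a minimal sketch of the second idea (back-and-forth conversion), with the imprecision caveat from above; the CONDENSE only ignores alignment differences and is not a full format normalization:
DATA target    TYPE char3.
DATA source    TYPE string VALUE `123.5`.
DATA roundtrip TYPE string.

target = source.       " convert forward (possible truncation)
roundtrip = target.    " and back again
CONDENSE roundtrip.    " ignore padding/alignment differences

IF roundtrip = source.
  WRITE `No change detected after the round trip`.
ELSE.
  WRITE `Possible data loss: ` && source && ` came back as ` && roundtrip.
ENDIF.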

ABAP Report Logic

I am new to ABAP.
I have a requirement in ABAP. On my presentation server there is a header text file, and I want to upload the data from that text file into a header table. But the custom table has a different structure than the text file.
It includes 4 extra fields: PO_CREATED_DATE, PO_CREATED_BY, PO_CHANGED_DATE, PO_CHANGED_BY.
These fields have to be populated by our report program using sy-datum and sy-uname.
In this scenario, we have to check: if the data already exists, populate PO_CHANGED_DATE and PO_CHANGED_BY; if the data is not there yet, populate PO_CREATED_DATE and PO_CREATED_BY.
Please let me know the logic...
First load the file into an internal table with only one very long field (long enough to contain at least the longest possible line in the file). Then loop over that itab and split the individual lines using the separator that is used in the file. You split the contents into a work area that contains all your fields, including the 4 extra fields that may or may not be included in the file. Make sure to clear the work area before splitting the line into the WA. Append the work area to an itab with the same structure as the WA, then continue with the next line.
After that, loop over that second itab and check for lines where your 4 extra fields are initial. Those are the lines where you need to add the data by code. After that, do whatever you need to do with the data in the itab.
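A minimal sketch of that approach, purely for illustration — the file path, the tab separator, and the two example file fields (ebeln, short_text) are assumptions to adapt to your actual layout, and the created/changed decision from your requirement would replace the simple initial-check here:
TYPES: BEGIN OF ty_header,
         ebeln           TYPE char10,  " example field from the file
         short_text      TYPE char40,  " example field from the file
         po_created_date TYPE sy-datum,
         po_created_by   TYPE sy-uname,
         po_changed_date TYPE sy-datum,
         po_changed_by   TYPE sy-uname,
       END OF ty_header.

DATA: lt_raw    TYPE STANDARD TABLE OF string,   " one long field per file line
      lt_header TYPE STANDARD TABLE OF ty_header,
      ls_header TYPE ty_header.

FIELD-SYMBOLS: <lv_line>   TYPE string,
               <ls_header> TYPE ty_header.

CALL FUNCTION 'GUI_UPLOAD'
  EXPORTING
    filename = 'C:\header.txt'        " assumed path
    filetype = 'ASC'
  TABLES
    data_tab = lt_raw
  EXCEPTIONS
    OTHERS   = 1.

LOOP AT lt_raw ASSIGNING <lv_line>.
  CLEAR ls_header.
  SPLIT <lv_line> AT cl_abap_char_utilities=>horizontal_tab
        INTO ls_header-ebeln ls_header-short_text.
  APPEND ls_header TO lt_header.
ENDLOOP.

" lines whose extra fields are still initial get them filled by the program
LOOP AT lt_header ASSIGNING <ls_header>
     WHERE po_created_date IS INITIAL AND po_changed_date IS INITIAL.
  <ls_header>-po_created_date = sy-datum.
  <ls_header>-po_created_by   = sy-uname.
ENDLOOP.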
I uploaded the text file header data into it_input1 using gui_upload, but it_input1 does not have the 4 extra fields. I declared another internal table, it_header, which has the same structure as the header custom table. Now I want to check whether the data in it_input1 already exists or not. If it exists, populate it_header-po_changed_date and it_header-po_changed_by; otherwise, it_header-po_created_date and it_header-po_created_by.
Take a look at the "Pattern" button at the top. Select ABAP Objects and press Enter.
Now you can supply the class and method you want to call:
CL_GUI_FRONTEND_SERVICES=>GUI_UPLOAD
GUI_UPLOAD is a static method. If you are new, this is the easiest way to see which parameters must be supplied. With forward navigation (double-click) you can check the signature for typing the parameter variables.
Then you just need to convert your data (e.g. with SPLIT). I can only recommend using the F1 help.
Kind regards!

How to comment on MATLAB variables

When I'm using MATLAB, sometimes I feel the need to make comments on some variables, and I would like to save these comments inside those variables. That way, when I have to work with many variables in the workspace and I forget the context of some of them, I could read the comments I put in each one. So I would like to comment variables and keep the comments inside of them.
While I'm of the opinion that the best (and easiest) approach would be to make your variables self-documenting by giving them descriptive names, there is actually a way for you to do what you want using the object-oriented aspects of MATLAB. Specifically, you can create a new class which subclasses a built-in class so that it has an additional property describing the variable.
In fact, there is an example in the documentation that does exactly what you want. It creates a new class ExtendDouble that behaves just like a double except that it has a DataString property attached to it which describes the data in the variable. Using this subclass, you can do things like the following:
N = ExtendDouble(10,'The number of data points')
N =
The number of data points
10
and N could be used in expressions just as any double value would. Using this example subclass as a template, you could create "commented" versions of other built-in numeric classes, with the exception of those you are not allowed to subclass (char, cell, struct, and function_handle).
Of course, it should be noted that instead of using the ExtendDouble class like I did in the above example, I could instead define my variable like so:
nDataPoints = 10;
which makes the variable self-documenting, albeit with a little more typing needed. ;)
How about declaring another variable for your comments?
example:
>> num = 5;
>> numc = 'This is a number that contains 5';
>> whos
...
This is my first post in StackOverflow. Thanks.
A convenient way to solve this is to have a function that does the storing and displaying of comments for you, i.e. something like the function below that will pop open a dialog box if you call it with comments('myVar') to allow you to enter new (or read/update previous) comments to variable (or function, or co-worker) labeled myVar.
Note that the comments will not be available in your next MATLAB session. To make this happen, you have to add save/load functionality to comments (i.e. every time you change anything, you write to a file, and any time you start the function and the database is empty, you load the file if possible).
function comments(name)
%COMMENTS stores comments for a matlab session
%
% comments(name) adds or updates a comment stored with the label "name"
%
% comments prints all the current comments

%# database is a n-by-2 cell array with {label, comment}
persistent database

%# check input and decide what to do
if nargin < 1 || isempty(name)
    printDatabase;
else
    updateDatabase;
end

    function printDatabase
        %# prints the database
        if isempty(database)
            fprintf('no comments stored yet\n')
        else
            for i=1:size(database,1)
                fprintf('%20s : %s\n',database{i,1},database{i,2});
            end
        end
    end

    function updateDatabase
        %# updates the database
        %# check whether there is already a comment
        if size(database,1) > 0 && any(strcmp(name,database(:,1)))
            idx = strcmp(name,database(:,1));
            comment = database(idx,2);
        else
            idx = size(database,1)+1;
            comment = {''};
        end
        %# ask for new/updated comment
        comment = inputdlg(sprintf('please enter comment for %s',name),'add comment',...
            5,comment);
        if ~isempty(comment)
            database{idx,1} = name;
            database(idx,2) = comment;
        end
    end
end
Always always always keep the Matlab editor open with a script documenting what you do. That is, variable assignments and calculations.
Only exceptions are very short sessions where you want to experiment. Once you have something -- add it to the file (It's also easier to cut and paste when you can see your entire history).
This way you can always start over. Just clear all and rerun the script. You never have random temporaries floating around in your workspace.
Eventually, when you are finished, you will also have something that is close to 'deliverable'.
Have you thought of using structures (or cells, although structures would require extra memory use)?
>> dataset1.numerical=5;
>> dataset1.comment='This is the dataset that contains 5';
dataset1 =
numerical: 5
comment: 'This is the dataset that contains 5'