I use the function module SAP_CONVERT_TO_CSV_FORMAT, but I get the runtime error UC_OBJECTS_NOT_CHARLIKE.
The field that causes the error is of type I.
What should I do to avoid the short dump?
CALL FUNCTION 'SAP_CONVERT_TO_CSV_FORMAT'
  EXPORTING
    i_field_seperator    = ';'
    i_line_header        = 'X'
  TABLES
    i_tab_sap_data       = p_csv1
  CHANGING
    i_tab_converted_data = gt_row
  EXCEPTIONS
    conversion_failed    = 1
    OTHERS               = 2.
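A minimal sketch of one common workaround (not taken from the original thread; the structure and field names below are invented for illustration): copy the data into a table whose components are all character-like, write the type I field into a CHAR field, and pass that copy to the function module.

* Hedged sketch: ty_csv_char mirrors the original row type, but the
* type I component (here called "count", a hypothetical name) is
* replaced by a character field.
TYPES: BEGIN OF ty_csv_char,
         matnr TYPE c LENGTH 18, "hypothetical character-like component
         count TYPE c LENGTH 11, "character version of the type I field
       END OF ty_csv_char.

DATA: lt_csv_char TYPE STANDARD TABLE OF ty_csv_char,
      ls_csv_char TYPE ty_csv_char.

LOOP AT p_csv1 INTO DATA(ls_csv1).
  MOVE-CORRESPONDING ls_csv1 TO ls_csv_char.
  WRITE ls_csv1-count TO ls_csv_char-count. "integer -> character
  APPEND ls_csv_char TO lt_csv_char.
ENDLOOP.

CALL FUNCTION 'SAP_CONVERT_TO_CSV_FORMAT'
  EXPORTING
    i_field_seperator    = ';'
    i_line_header        = 'X'
  TABLES
    i_tab_sap_data       = lt_csv_char
  CHANGING
    i_tab_converted_data = gt_row
  EXCEPTIONS
    conversion_failed    = 1
    OTHERS               = 2.

With only character-like components in the table passed to I_TAB_SAP_DATA, the UC_OBJECTS_NOT_CHARLIKE dump should no longer be triggered by that field.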
lv_objectkey2 = ls_mseg-matnr.

"Transport Category
CALL FUNCTION 'BAPI_OBJCL_GETDETAIL'
  EXPORTING
    objectkey       = lv_objectkey2
    objecttable     = 'MARA'
    classnum        = 'Z_MATERIAL_CLASS'
    classtype       = '001'
  TABLES
    allocvaluesnum  = lt_allocvaluesnum2
    allocvalueschar = lt_allocvalueschar2
    allocvaluescurr = lt_allocvaluescurr2
    return          = lt_return2.

READ TABLE lt_allocvaluesnum2 INTO ls_valnum2 WITH KEY charact = 'Z_ADR_QUANTITY'.
IF sy-subrc = 0.
  lv_adr_quan = ls_valnum2-value_from + lv_adr_quan.
  WRITE: lv_adr_quan TO ls_item-ADR_QUAN EXPONENT 0 DECIMALS 2.
* CONDENSE ls_item-ADR_QUAN.
ENDIF.
Here is my problem: the program gives me the error "ADR_QUAN" must be a character-type field (data type C, N, D or T).
I need your opinions or solutions to fix this issue.
The CONDENSE statement works only on character-like variables. Here, the ls_item-ADR_QUAN field is of type float, which is why you're getting that error.
You can go through the attached documentation link for the CONDENSE statement.
CONDENSE Documentation
Even if you want to perform a CONDENSE, you first have to move the value of ls_item-ADR_QUAN into a character-like variable.
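A minimal sketch of that step (assuming, as in the snippet above, that lv_adr_quan holds a floating-point value; lv_adr_quan_c is just an auxiliary name used here):

* Write the float into a character-like helper field first;
* CONDENSE is then allowed on that field.
DATA lv_adr_quan_c TYPE c LENGTH 20.

WRITE lv_adr_quan TO lv_adr_quan_c EXPONENT 0 DECIMALS 2.
CONDENSE lv_adr_quan_c NO-GAPS.

If ls_item-ADR_QUAN itself really is a float, assign the numeric value directly (ls_item-adr_quan = lv_adr_quan) and keep the formatted character version only where you need it for output.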
Automatic Documentary Batch handling through custom ABAP code
Summary of the problem
My employer wishes to perform automatic documentary batch handling on some products from external vendors, and I'm trying to figure out how to set this up through Customizing and ABAP.
It seems to me that Documentary Batches are only meant to be used through MIGO. In any case, I'm unable to find a proper solution to assign them programmatically, and any hacked-together solution I can come up with seems insufficient and unstable.
What avenues do I have to solve this issue?
Enhancing BAPI_GOODSMVT_CREATE?
Can I somehow do it through stuff like BAPI_GOODSMVT_CREATE?
Enhancing PPPI Message Destinations?
I also specifically need it to work for consumption messaging through PPPI, and I thought to build on top of the standard Message Destination PI04, FM COCI_CONFIRM_MATERIAL_CONS.
This FM creates a Material Document but does not go through the BAPI_GOODSMVT_CREATE FM.
It does however use MB_CREATE_GOODS_MOVEMENT.
What I've already tried
MIGO Snapshot based Single Use hack solution
I made a hack solution for one area, where I watched which table updates MIGO performed and with which data (through the FMs VB_INSERT_BATCH and VB_BATCH_WHERE_USED_LIST), and then filled out these structures manually.
However, providing all the needed info is not feasible for other implementation areas, as they do not have all the necessary values available, and it doesn't cover unforeseen situations where other parameters might be required.
Reading through BAPI_GOODSMVT_CREATE code
I've tried spying on whether BAPI_GOODSMVT_CREATE calls the same FMs, but only found it accessing VB_BATCH_WHERE_USED_LIST.
It seems to be possible to activate this functionality by controlling the memory IDs Documentary Batch #1, Documentary Batch #2, Documentary Batch #3 and Documentary Batch #5 (see FM VBDBDM_DATA_POST_IM), but this requires filling out a lot of data, including the structure named DOCUBATCH_SCREEN_FIELDS, which again makes it seem like this might not be the correct avenue of approach.
Regardless, this still doesn't allow me to maintain the batch in tables MCHA and MCH1.
Hacked-together solution based on MIGO snapshot
Here is how my hacked solution looks. Again, this is not a feasible way to go about the problem, as other implementation areas do not have the resulting Material Document immediately available:
FUNCTION zproxy_mdr_goodsreceipt.
*"----------------------------------------------------------------------
*"*"Local Interface:
*" IMPORTING
*" VALUE(IS_GOODSRECEIPT_HEAD) TYPE ZPROXY_GOODSREC_HEAD
*" VALUE(IT_GOODSRECEIPT_ITEM) TYPE ZPROXY_GOODSREC_ITEM_T
*" REFERENCE(I_CREATE_TO_FROM_REQUIREMENTS) TYPE FLAG DEFAULT '-'
*" EXPORTING
*" REFERENCE(E_GOODSMVT_MSG_IDNO) TYPE CHAR23
*" REFERENCE(E_MBLNR) TYPE MBLNR
*" REFERENCE(E_TO_CREATION_SUBRC) TYPE SY-SUBRC
*" REFERENCE(E_LGNUM_ERROR) TYPE LGNUM
*" REFERENCE(E_TBNUM_ERROR) TYPE TBNUM
*" REFERENCE(E_DOCBATCH_SUBRC) TYPE SY-SUBRC
*" REFERENCE(E_DOCBATCH_MSG_IDNO) TYPE CHAR23
*" REFERENCE(E_CLASSNUM) TYPE BAPI1003_KEY-CLASSNUM
*" REFERENCE(E_OBJKEY) TYPE BAPI1003_KEY-OBJECT
*" EXCEPTIONS
*" GOODSMVT_FAILED
*" NO_TRANSFER_REQUIREMENTS
*" TRANSFER_ORDER_CREATION_ERROR
*"----------------------------------------------------------------------
FIELD-SYMBOLS: <return>         TYPE bapiret2,
               <goods_rec_item> TYPE zproxy_goodsrec_item,
               <mseg>           TYPE mseg,
               <char_char>      TYPE bapi1003_alloc_values_char,
               <ltap_creat>     TYPE ltap_creat.
DATA: ls_header   TYPE bapi2017_gm_head_01,
      ls_code     TYPE bapi2017_gm_code,
      ls_item     TYPE bapi2017_gm_item_create,
      lt_item     TYPE STANDARD TABLE OF bapi2017_gm_item_create,
      lt_return   TYPE STANDARD TABLE OF bapiret2,
      ls_headret  TYPE bapi2017_gm_head_ret,
      l_mblnr     LIKE bapi2017_gm_head_ret-mat_doc,
      l_docubatch TYPE charg_d,
      l_subrc     TYPE sy-subrc,
      lt_mseg     TYPE STANDARD TABLE OF mseg.
CLEAR l_subrc.
* ############################## Create goods movement ##############################
* Build structures
MOVE-CORRESPONDING is_goodsreceipt_head TO ls_header.
ls_code-gm_code = '01'.
LOOP AT it_goodsreceipt_item ASSIGNING <goods_rec_item>.
  MOVE-CORRESPONDING <goods_rec_item> TO ls_item.
  APPEND ls_item TO lt_item.
ENDLOOP.
* BAPI call
CALL FUNCTION 'BAPI_GOODSMVT_CREATE'
  EXPORTING
    goodsmvt_header  = ls_header
    goodsmvt_code    = ls_code
  IMPORTING
    goodsmvt_headret = ls_headret
    materialdocument = l_mblnr
  TABLES
    goodsmvt_item    = lt_item
    return           = lt_return.
* Check errors
READ TABLE lt_return ASSIGNING <return> WITH KEY type = 'E'.
IF sy-subrc = 0.
  e_goodsmvt_msg_idno = <return>-id && <return>-number.
  ROLLBACK WORK.
  RAISE goodsmvt_failed.
ELSE.
  e_mblnr = l_mblnr.
  COMMIT WORK AND WAIT. "Wait for TO requirements to be created
ENDIF.
* Only proceed if Material Document has been successfully posted
CHECK l_subrc = 0 AND l_mblnr IS NOT INITIAL.
* ############################## Update with Documentary Batch ###################################
DATA: lt_chvw      TYPE STANDARD TABLE OF chvw,
      ls_chvw      TYPE chvw,
      lt_mch1      TYPE STANDARD TABLE OF mch1,
      ls_mch1      TYPE mch1,
      lt_mcha      TYPE STANDARD TABLE OF mcha,
      ls_mcha      TYPE mcha,
      lt_mchb      TYPE STANDARD TABLE OF mchb,
      lt_mska      TYPE STANDARD TABLE OF mska,
      lt_mspr      TYPE STANDARD TABLE OF mspr,
      lt_char_num  TYPE STANDARD TABLE OF bapi1003_alloc_values_num,
      lt_char_char TYPE STANDARD TABLE OF bapi1003_alloc_values_char,
      lt_char_curr TYPE STANDARD TABLE OF bapi1003_alloc_values_curr,
      l_objkey     TYPE bapi1003_key-object,
      l_classnum   TYPE bapi1003_key-classnum,
      l_atnam      TYPE atnam.
REFRESH lt_chvw.

* Get material document items
SELECT *
  FROM mseg
  INTO TABLE lt_mseg
  WHERE mblnr = l_mblnr.

* Prepare docubatch registration data
LOOP AT it_goodsreceipt_item ASSIGNING <goods_rec_item>.
* Generate class num and atnam from plant
  CONCATENATE 'PI_' <goods_rec_item>-plant INTO l_classnum.
  CONCATENATE 'Z_DOC_BATCH_' <goods_rec_item>-plant INTO l_atnam.
* Get material docubatch usage characteristic
  REFRESH: lt_return,
           lt_char_num,
           lt_char_char,
           lt_char_curr.
  l_objkey(18) = <goods_rec_item>-material.

  CALL FUNCTION 'BAPI_OBJCL_GETDETAIL'
    EXPORTING
      objectkey       = l_objkey
      objecttable     = 'MARA'
      classnum        = l_classnum
      classtype       = '001'
    TABLES
      allocvaluesnum  = lt_char_num
      allocvalueschar = lt_char_char
      allocvaluescurr = lt_char_curr
      return          = lt_return.

  LOOP AT lt_return ASSIGNING <return> WHERE type = 'E'. "Check for errors
* Couldn't read characteristic, assume no docubatch handling
    e_docbatch_subrc = '1'.
    e_docbatch_msg_idno = <return>-id && <return>-number.
    e_classnum = l_classnum.
    e_objkey = l_objkey.
    CONTINUE.
  ENDLOOP.

  READ TABLE lt_char_char ASSIGNING <char_char> WITH KEY charact = l_atnam.
  IF sy-subrc <> 0 OR <char_char>-value_neutral = 0.
* No docubatch value
    CONTINUE.
  ENDIF.

* Get associated material document item
  READ TABLE lt_mseg ASSIGNING <mseg>
    WITH KEY mblnr = ls_headret-mat_doc
             mjahr = ls_headret-doc_year
             bwart = <goods_rec_item>-move_type
             matnr = <goods_rec_item>-material
             werks = <goods_rec_item>-plant
             menge = <goods_rec_item>-entry_qnt
             meins = <goods_rec_item>-entry_uom
             hsdat = <goods_rec_item>-prod_date
             kzbew = <goods_rec_item>-mvt_ind
             lgort = <goods_rec_item>-stge_loc.
  IF sy-subrc <> 0.
* No associated material document item
    CONTINUE.
  ENDIF.

* Check docubatch type
  IF <char_char>-value_neutral <> 0.
* Perform basic docubatch actions (MCHA and MCH1)
* Verify that docubatch nr is assigned
    IF <goods_rec_item>-vendrbatch IS INITIAL.
* !!!!!!!!!!!!! Vendor batch not filled even though material is docubatch managed, what to do? !!!!!!!!!!!!!!!
      CONTINUE.
    ENDIF.
* Prepare data for docubatch registration
    CLEAR: ls_mch1,
           ls_mcha.
    ls_mch1-matnr = <goods_rec_item>-material.
    ls_mch1-charg = <goods_rec_item>-vendrbatch.
    ls_mch1-ersda = sy-datum.
    ls_mch1-ernam = sy-uname.
    ls_mch1-ersda_tmstp = sy-datum && sy-uzeit.
    ls_mch1-ersda_tz_sys = sy-tzone.
    ls_mch1-ersda_tz_usr = sy-zonlo.
    MOVE-CORRESPONDING ls_mch1 TO ls_mcha. "Same fields from MCH1 are included in MCHA
    ls_mcha-werks = <goods_rec_item>-plant.
    APPEND: ls_mch1 TO lt_mch1,
            ls_mcha TO lt_mcha.
  ENDIF.

  IF <char_char>-value_neutral = 2. "Also include batch where-used
* Prepare data for batch where-used registration
    CLEAR ls_chvw.
    ls_chvw-matnr = <goods_rec_item>-material.
    ls_chvw-werks = <goods_rec_item>-plant.
    ls_chvw-charg = <goods_rec_item>-vendrbatch.
    ls_chvw-ebeln = <goods_rec_item>-po_number.
    ls_chvw-ebelp = <goods_rec_item>-po_item.
    ls_chvw-mblnr = ls_headret-mat_doc.
    ls_chvw-mjahr = ls_headret-doc_year.
    ls_chvw-zeile = <mseg>-zeile.
    ls_chvw-budat = is_goodsreceipt_head-pstng_date.
    ls_chvw-shkzg = 'S'. "??? VALUE ???
    ls_chvw-bwart = <goods_rec_item>-move_type.
    ls_chvw-kzbew = <goods_rec_item>-mvt_ind. "Goods Movement for Purchase Order
    ls_chvw-menge = <goods_rec_item>-entry_qnt.
    ls_chvw-meins = <goods_rec_item>-entry_uom.
    APPEND ls_chvw TO lt_chvw.
  ENDIF.
ENDLOOP.
* Perform batch registration
CALL FUNCTION 'VB_INSERT_BATCH'
  TABLES
    zmch1 = lt_mch1
    zmcha = lt_mcha
    zmchb = lt_mchb
    zmska = lt_mska
    zmspr = lt_mspr.

* Perform batch where-used registration
CALL FUNCTION 'VB_BATCH_WHERE_USED_LIST'
  TABLES
    xchvw = lt_chvw.
Why this isn't good enough, and what I need
This acts as a snapshot of MIGO configured with documentary batch handling, but it doesn't necessarily cover all cases.
It only works in the context of a Purchase Document, and doesn't cover other cases such as Orders and Sales Orders.
Additionally, I only have the necessary data because the material document was created immediately above, which is not possible for all implementation cases.
I would like to know if there is an intended way to perform Documentary Batch handling from custom code.
Quoting from the documentation:
If you work with RFID or TRM functions, or call IDocs/BAPIs, you can only book in documentary
batches by calling up the RFC-capable function module VBDBDM_DATA_MAINTAIN_RFC
beforehand or incorporating it into the process.
So maybe this function module is the key?
However, it seems you may not be the first to experience this pain. A comment on that documentation reads:
Documentary Batch has a lot of constraints and it seems to be a semifinished product of SAP, since is missing a lot of features of real batches.
Be prepared to make a lot of custom enhancements...
ADDENDUM from the community: below is the solution posted by the Original Poster two days after this answer, moved here from the question.
Solution
Example call for Purchase Order Goods Receipt
LOOP AT it_goodsreceipt_item ASSIGNING <goods_rec_item>.
  CALL FUNCTION 'VBDBDM_DATA_MAINTAIN_RFC'
    EXPORTING
      i_matnr           = <goods_rec_item>-material
      i_werks           = <goods_rec_item>-plant
      i_quantity        = <goods_rec_item>-entry_qnt
      i_uom             = <goods_rec_item>-entry_uom
      i_docubatch_charg = <goods_rec_item>-vendrbatch
*     IT_DOCUBATCHES =
      i_process_id      = '01' "Goods Receipt for External Procurement
*     I_REPLACE_EXISTING_DATA =
      i_ebeln           = <goods_rec_item>-po_number
      i_ebelp           = <goods_rec_item>-po_item
*     I_AUFNR =
*     I_AUFPS =
*     I_RSNUM =
*     I_RSPOS =
*     I_RSART =
*     I_VBELN =
*     I_POSNR =
*     IS_DOCUBATCH_COM =
*     I_LINE_ID =
*     I_LGNUM =
*     I_TANUM =
*     I_TAPOS =
    EXCEPTIONS
      parameter_error    = 1
      process_not_active = 2.
ENDLOOP.
* Follow up by creating Material Document, for example through BAPI_GOODSMVT_CREATE
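For completeness, a minimal sketch of that follow-up step, reusing the structures from the BAPI_GOODSMVT_CREATE call shown earlier in the question (this outline is not part of the original answer):

* Hedged sketch: post the material document after the documentary batch
* data has been registered via VBDBDM_DATA_MAINTAIN_RFC above; ls_header,
* ls_code, ls_headret, lt_item and lt_return come from the earlier snippet.
CALL FUNCTION 'BAPI_GOODSMVT_CREATE'
  EXPORTING
    goodsmvt_header  = ls_header
    goodsmvt_code    = ls_code
  IMPORTING
    goodsmvt_headret = ls_headret
  TABLES
    goodsmvt_item    = lt_item
    return           = lt_return.

READ TABLE lt_return TRANSPORTING NO FIELDS WITH KEY type = 'E'.
IF sy-subrc = 0.
  CALL FUNCTION 'BAPI_TRANSACTION_ROLLBACK'.
ELSE.
  CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
    EXPORTING
      wait = abap_true.
ENDIF.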
I'm trying to use the class /ui5/cl_json_parser for parsing a JSON string.
The following code snippet reproduces the problem:
REPORT ztest_json_parse.
DATA: input       TYPE string,
      output      TYPE string,
      json_parser TYPE REF TO /ui5/cl_json_parser.
input = '{"address":[{"street":"Road","number":"545"},{"street":"Avenue","number":"15"}]}'.
CREATE OBJECT json_parser.
json_parser->parse( input ).
json_parser->print( ).
output = json_parser->value( path = '/address/1/street' ).
WRITE output.
The print method shows the correct parsed JSON string, but the output variable is always empty.
I have traced the code down to the method VALUE of the class /UI5/CL_JSON_PARSER, at line 15, which contains:
read table m_entries into l_entry with table key parent = l_parent name = l_name.
In the debugger, I can see that l_parent = '/address/1' and l_name = 'street', and that the internal table m_entries contains a record with parent = '/address/1' and name = 'street'. Nevertheless the READ statement always returns sy-subrc = 4 and does not find anything.
Can anyone help?
First: do not use the /ui5/cl_json_parser class. It is intended for internal use ONLY and has no reliable documentation.
Secondly, here is a sample of how you can fetch the street value from the first element of your JSON:
DATA(o_json) = cl_abap_codepage=>convert_to( '{"address":[{"street":"Road","number":"545"},{"street":"Avenue","number":"15"}]}' ).
DATA(o_reader) = cl_sxml_string_reader=>create( o_json ).
TRY.
    DATA(o_node) = o_reader->read_next_node( ).
    WHILE o_node IS BOUND.
      DATA(op) = CAST if_sxml_open_element( o_node ).
      LOOP AT op->get_attributes( ) ASSIGNING FIELD-SYMBOL(<a>).
        DATA(attr) = <a>->get_value( ).
      ENDLOOP.
      IF attr <> 'street'.
        o_node = o_reader->read_next_node( ).
      ELSE.
        DATA(val) = CAST if_sxml_value_node( o_reader->read_next_node( ) ).
        WRITE: '/address/1/street =>', val->get_value( ).
        EXIT.
      ENDIF.
    ENDWHILE.
  CATCH cx_root INTO DATA(e_txt).
ENDTRY.
As far as I know, there is no class in ABAP that allows fetching single JSON attributes like XPath.
I certainly agree with Suncatcher on avoiding the UI5 JSON parser.
If you don't control or know the structure of the source data, Suncatcher's answer is good.
However, if you know the basic structure of the source JSON (and you must, if you plan to access the first address row, field name street) AND you can have the source provided with uppercase variable names, then you can use the so-called identity transformation.
TYPES: BEGIN OF ty_addr,
         street TYPE string,
         number TYPE string,
       END OF ty_addr.
TYPES ty_addr_t TYPE STANDARD TABLE OF ty_addr.

DATA: input   TYPE string,
      ls_addr TYPE ty_addr,
      lt_addr TYPE ty_addr_t.

input = '{"ADDRESS":[{"STREET":"Road","NUMBER":"545"},{"STREET":"Avenue","NUMBER":"15"}]}'.

CALL TRANSFORMATION id SOURCE XML input
                       RESULT address = lt_addr.

READ TABLE lt_addr INDEX 1 INTO ls_addr.
WRITE ls_addr-street.
We ran into difficulties maintaining the ITXEX field (long text indication) of an infotype record.
Say we have an existing record in an infotype database table with a long text filled (the ITXEX field value in that record is set to 'X').
Some process updates the record through HR_CONTROL_INFTY_OPERATION like this:
CALL FUNCTION 'HR_CONTROL_INFTY_OPERATION'
  EXPORTING
    infty         = '0081'
    number        = '12345678'
    subtype       = '01'
    validityend   = '31.12.9999'
    validitybegin = '19.05.2019'
    record        = ls_0081 " ( ITXEX = 'X' )
    operation     = 'MOD'
    tclas         = 'A'
    nocommit      = abap_true
  IMPORTING
    return        = ls_return.
This call does update the record, but it clears its ITXEX field.
It's important to say that performing the same action through PA30 updates the record and keeps the ITXEX field as it was.
The described problem seems similar to that question, but trying the solutions given there didn't solve the problem.
Why don't the two approaches (PA30 and the function module) behave the same? How can this be fixed?
First of all, the FM parameters you use are incorrect. How do you expect the infotype to be updated if you set nocommit = abap_true?
Also, you are missing the correct sequence that must be used for the update procedure:
Lock the Employee
Read the infotype
Update the infotype
Unlock the Employee
The correct snippet for your task would be:
DATA: ls_return   TYPE bapireturn1.
DATA: l_infty_tab TYPE TABLE OF p0002.

CALL FUNCTION 'HR_READ_INFOTYPE'
  EXPORTING
    pernr     = '00000302'
    infty     = '0002'
  TABLES
    infty_tab = l_infty_tab.

READ TABLE l_infty_tab ASSIGNING FIELD-SYMBOL(<infotype>) INDEX 1.
<infotype>-midnm = 'Shicklgruber'. " updating the field of infotype

CALL FUNCTION 'ENQUEUE_EPPRELE'
  EXPORTING
    pernr = '00000302'
    infty = '0002'.

CALL FUNCTION 'HR_CONTROL_INFTY_OPERATION'
  EXPORTING
    infty         = <infotype>-infty
    number        = <infotype>-pernr
    subtype       = <infotype>-subty
    validityend   = <infotype>-endda
    validitybegin = <infotype>-begda
    record        = <infotype>
    operation     = 'MOD'
    tclas         = 'A'
  IMPORTING
    return        = ls_return.

CALL FUNCTION 'DEQUEUE_EPPRELE'
  EXPORTING
    pernr = '00000302'
    infty = '0002'.
This way the ITXEX field is treated correctly: if it was set on that record, it will remain intact. However, this method will not work for updating the long text itself; for that you must use the object-oriented way, i.e. the methods of class CL_HRPA_INFOTYPE_CONTAINER.
I need to get the particulars/long text of FI held documents. I tried the READ_TEXT function module but had no luck, since a held document only has a temporary document number.
I looked for the data in the STXL and STXH tables, and I also tried the function modules in function groups FTXT and STXD, but had no luck.
Is there any other method to achieve that goal?
First of all, you need the temporary document number, which can be obtained either from F-43 itself or from the RFDT table.
In the SRTFD field, you need to separate it from the username.
Then run the READ_TEMP_DOCUMENT FM; after running it you should have your texts in ABAP memory.
To get them, use GET_TEXT_MEMORY.
ls_uf05a-tempd = '0012312356'. "doc number
ls_uf05a-unamd = 'JOHNDOE'. "username

CALL FUNCTION 'READ_TEMP_DOCUMENT'
  EXPORTING
    I_UF05A = ls_uf05a
  TABLES
    T_BKPF  = lt_bkpf
    T_BSEC  = lt_bsec
    T_BSED  = lt_bsed
    T_BSEG  = lt_bseg
    T_BSET  = lt_bset
    T_BSEZ  = lt_bsez.

DATA: lt_texts      TYPE TABLE OF TCATALOG,
      t_tline       TYPE STANDARD TABLE OF tline,
      memory_id(30) VALUE 'SAPLSTXD'.

CALL FUNCTION 'GET_TEXT_MEMORY'
  TABLES
    TEXT_MEMORY = lt_texts.

READ TABLE lt_texts ASSIGNING FIELD-SYMBOL(<cat>)
  WITH KEY tdobject = 'BELEG'
           tdid     = '0001'
           tdspras  = 'E'
  BINARY SEARCH.
IF sy-subrc = 0.
  memory_id+8(6) = <cat>-id.
ENDIF.

IMPORT tline = t_tline FROM MEMORY ID memory_id.

LOOP AT t_tline ASSIGNING FIELD-SYMBOL(<tline>).
  WRITE: <tline>-tdline. "showing the texts
ENDLOOP.