How to get data from another system - ABAP

I need to get some data from one system to another.
So far I have used the function module (FM) RFC_READ_TABLE: I filled in all the required fields and got the data I needed from the other system's table.
I can't use RFC_READ_TABLE in the task I'm working on right now because of an interface agreement. I need to get the posting status of an invoice. I also found the FM BAPI_BILLINGDOC_GETDETAIL, but this FM doesn't exist on the development system I'm working on; it only exists on the system where the confidential data is stored. I tried to google the topic, but I couldn't find a good example of getting data from another system.
My question is: how do I get data from another system with the FM BAPI_BILLINGDOC_GETDETAIL?

OK, I found out what the problem was: there are no BAPI FMs on our development system. The way to get data from another system is to add the DESTINATION addition to the function call:
CALL FUNCTION 'BAPI_BILLINGDOC_GETDETAIL'
  DESTINATION lv_destination  "<====== added this line
  EXPORTING
    ...
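To flesh that out, here is a minimal sketch of the remote call. The destination name is a placeholder for an RFC destination maintained in SM59, and the parameter name BILLINGDOCUMENT is from memory, so verify the FM's actual interface in SE37 on the remote system:
DATA lv_destination TYPE rfcdest VALUE 'PRD_CLNT100'. "placeholder destination (SM59)

CALL FUNCTION 'BAPI_BILLINGDOC_GETDETAIL'
  DESTINATION lv_destination
  EXPORTING
    billingdocument       = '0090000001'  "illustrative invoice number
  EXCEPTIONS
    system_failure        = 1  "remote system raised an error
    communication_failure = 2  "RFC connection problem
    OTHERS                = 3.
IF sy-subrc <> 0.
  "Connection or remote error: log it or raise your own message here
ENDIF.
The SYSTEM_FAILURE and COMMUNICATION_FAILURE exceptions can be added to any remote function call; without them, a connection problem terminates the calling program with a runtime error.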

Best way to export data from other company's SAP

I need to extract some data from my client's SAP ECC (transaction SUIM -> Users by Complex Selection Criteria, program RSUSR002).
Normally I give them a table of values, and they have to fill in some fields to extract what I need.
They have to make 63 different extractions from their SAP system (with different object values, for example, but all within the same transaction) and then send me all the extracted files.
Do you know if there is an automated way to extract that, so they don't have to make 63 extractions?
My biggest problem is that they make mistakes every time; there are a lot of fields to fill in.
Can I create a variant and send it to them? Is it possible to export my variant so they can import it without having to fill in 63 different sets of data?
Thank you.
When a task takes considerable effort from multiple people each year, it might be worth automating.
First you need to find out where that transaction gets its data from. If you spend some time analyzing and debugging the program behind the transaction, you will surely find which SELECTs on which database table(s) provide that data. If you are lucky, there might even be a function module for it.
Then you just need to write your own ABAP program which performs the same selections.
Now about the interesting part: How to get that data to you. There are several approaches here. The best one depends on your requirements and your technical infrastructure. Some possibilities are:
Let users run the program in the foreground, use the method cl_gui_frontend_services=>gui_download to save the data to a file on the user's PC (see the sketch after this list), and ask them to send it to you via email
Run the program in background and save the file on the application server. Then ask your sysadmins how to get that file from their application server to you. The simplest way would be to just map a network fileserver so they all write to the same place, but there might be some organizational hurdles in the way which prevent that. (Our security people would call me crazy if I proposed to allow access to SMB shares from outside of our network, but your mileage may vary)
Have the program send the data to you directly via email. You can send emails from an SAP system using the function module SO_NEW_DOCUMENT_ATT_SEND_API1. This of course requires that the system was configured to be able to send emails (which you can do with transaction code SCOT). Again, security considerations apply. When it's PII or other confidential data, then you should not send it in an unencrypted email.
Use an RFC call to send the data to your own SAP system which aggregates the data
Use a webservice call to send the data to your own non-SAP system which aggregates the data
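To make the first option concrete, here is a minimal sketch of the gui_download call; the row type is an assumption, use whatever fields your own selection program actually reads:
TYPES: BEGIN OF ty_row,
         bname TYPE xubname,      "user name (assumed field)
         text  TYPE c LENGTH 80,  "illustrative payload column
       END OF ty_row.
DATA: lt_result   TYPE STANDARD TABLE OF ty_row,
      lv_filename TYPE string VALUE 'C:\temp\extraction.txt'.

"... fill lt_result with the same SELECTs the transaction performs ...

cl_gui_frontend_services=>gui_download(
  EXPORTING
    filename              = lv_filename
    filetype              = 'ASC'      "plain text
    write_field_separator = abap_true  "tab-separated columns
  CHANGING
    data_tab              = lt_result
  EXCEPTIONS
    OTHERS                = 1 ).
IF sy-subrc <> 0.
  MESSAGE 'Download failed' TYPE 'E'.
ENDIF.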
You can create a recording in transaction SM35.
There you enter a tcode (SUIM), start recording, make some input in transaction SUIM and press 'Execute'. Then you go back to the recording (F3 multiple times) and the system will generate a table of commands (the structure is BDCDATA). You can delete the unnecessary parts (e.g. a BACK button click) and save it to use as a 'macro'. Then you can replay this recording and it will do exactly what you did.
It's also possible to export/import the recording to a text file, so you can explore its structure, write some VBA script to create such a recording from your parameters, and send it to the users. But keep in mind that blanks are meaningful.
These are standard tools, so there's no coding in the system.
You can save the selection as a variant.
Fill in the selection criteria and press Save.
It can be reused.
You can also transport variants if they have a special name (system variants whose names start with CUS& or SAP& can be transported).

Make SAP Report available via RFC

One customer wants to access a SAP report via RFC.
Steps:
Third party application connects to SAP via RFC
RFC call gets transmitted
SAP runs the report
SAP returns the report.
How can the part inside SAP be implemented?
I am using PyRFC as client library. But AFAIK this does not matter at all for this question. This question is only about the server part inside SAP.
In this case it is the report RM07MLBS which should be made available via RFC.
You need an ABAPer to create a function module for you; I don't think there's a way around it.
If you have an ABAPer, just do something like this:
SUBMIT <REPORT_NAME> ... EXPORTING LIST TO MEMORY AND RETURN.

CALL FUNCTION 'LIST_FROM_MEMORY'
  TABLES
    listobject = t_listobj.

CALL FUNCTION 'LIST_TO_ASCI'
  TABLES
    listasci   = t_ascilist
    listobject = t_listobj.
Now you have the list in ASCII format; you can convert it to whatever you want.
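Putting it together, an RFC-enabled wrapper around RM07MLBS could look roughly like this sketch. The function module name and the variant are invented for illustration; the FM would be created in SE37 with processing type "Remote-Enabled Module", with et_list as a TABLES parameter in its interface:
FUNCTION z_run_rm07mlbs_rfc.
*"------------------------------------------------------------------
*" Hypothetical remote-enabled FM. Interface (defined in SE37):
*"   TABLES et_list STRUCTURE soli   "report output as ASCII lines
*"------------------------------------------------------------------
  DATA lt_listobj TYPE TABLE OF abaplist.

  "Run the report and capture its list output in memory;
  "pass selection criteria WITH ... or via a variant as shown here.
  SUBMIT rm07mlbs USING SELECTION-SET 'ZRFC_DEFAULT'  "assumed variant
         EXPORTING LIST TO MEMORY AND RETURN.

  CALL FUNCTION 'LIST_FROM_MEMORY'
    TABLES
      listobject = lt_listobj
    EXCEPTIONS
      not_found  = 1
      OTHERS     = 2.
  IF sy-subrc <> 0.
    RETURN.
  ENDIF.

  "Convert the binary list format to plain ASCII lines for the caller
  CALL FUNCTION 'LIST_TO_ASCI'
    TABLES
      listasci   = et_list
      listobject = lt_listobj.
ENDFUNCTION.
PyRFC could then invoke Z_RUN_RM07MLBS_RFC like any other remote-enabled function module and receive ET_LIST as a table of text lines to parse on the client side.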
Another way is to send the report result to the spool, then fetch the spool output and convert it to HTML/PDF.
To convert the spool output you can use these function modules:
RSPO_RETURN_ABAP_SPOOLJOB
RSPO_RETURN_SPOOLJOB_DAT
RSPO_RETURN_SPOOLJOB_HTML
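For the spool route, here is a minimal sketch of the read/convert step, assuming you already have the spool request number; the parameter names RQIDENT and BUFFER are from memory, so check the interface in SE37:
DATA: lv_rqident TYPE tsp01-rqident VALUE '0000012345', "illustrative spool number
      lt_buffer  TYPE TABLE OF soli.

"Return the ABAP list stored in the spool request as plain text lines
CALL FUNCTION 'RSPO_RETURN_ABAP_SPOOLJOB'
  EXPORTING
    rqident = lv_rqident
  TABLES
    buffer  = lt_buffer
  EXCEPTIONS
    OTHERS  = 1.
IF sy-subrc <> 0.
  "Spool request missing or not an ABAP list
ENDIF.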
In a perfect world, you'd have the report logic encapsulated in an ABAP class or a dedicated function module and use that as the basis for both your report and the RFC calls. But if this is a standard SAP report and SAP themselves weren't nice enough to provide said function module, you may not have this option.
I don't think this is the best solution for your request, but just to add another option to the ones already mentioned in other answers: the commercial product Theobald Xtract Universal can execute reports and return the results using several available destination types. Xtract is a Windows service that offers connectivity to several target database types as well as an HTTP-based result stream. It isn't cheap though, and it can essentially only connect to SAP NetWeaver-based systems as its data source (at least S/4 is already supported). Target destinations have to be purchased separately, but at least not per system, only per destination type (Oracle, MySQL, MSSQL...).
https://theobald-software.com/en/xtract-universal/
Xtract Universal uses a number of customer function modules to execute the report in the target system, catch the output and return it, essentially as a wall of text. You'll have to parse that result yourself, you won't get a nice pre-parsed table with data in it.
Just to make sure there's no misunderstanding about a possible conflict of interest: I don't work for Theobald, but we are a paying customer and use Xtract for our own data extractions. It is very simple to use, can be executed in scripts, but as said, just does that one job.
As far as I know, there is a limited possibility to trigger a report via an RFC-enabled function module. Otherwise, try triggering a transaction (based on your report) via a function module.
Also check: https://archive.sap.com/discussions/thread/811196
I don't think you are able to return the report result to your third-party system.
From a consulting perspective, I would recommend running the report periodically, writing the results to a table, and fetching the data from that table (from the third-party system) via an RFC-enabled function module, as sketched below.
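A minimal sketch of that staging-table approach; Z_GET_REPORT_RESULTS and ZREPORT_RESULT are invented names for illustration:
FUNCTION z_get_report_results.
*"------------------------------------------------------------------
*" Hypothetical remote-enabled FM (SE37, "Remote-Enabled Module").
*" ZREPORT_RESULT is an assumed custom table that a periodic
*" background job fills with the report's results.
*" Interface: TABLES et_results STRUCTURE zreport_result
*"------------------------------------------------------------------
  "Hand the pre-computed results to the remote caller
  SELECT * FROM zreport_result INTO TABLE et_results.
ENDFUNCTION.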
The best way to do this would be to wrap your report in a bespoke RFC-enabled function module, then give them access to run that function module.

How to transfer data from SQL Server to mongodb (using mongoose schema for validation)

Goal
We have a MEAN stack application that implements a strict mongoose schema. The MEAN stack app needs to be seeded with data that originates from a SQL Server database. The app should function as expected as long as the seeded data complies with the mongoose schema.
Problem
Currently, the data transfer job is being done through the mongo CLI which does not perform validation. Issues that have come up have been Date objects being saved as strings, missing keys that are required on our schema, entire documents missing, etc. The dev team has lost hours of development time debugging the app and discovering these data issues.
Solution we are looking for
How can we validate data so it:
Throws errors
Fails and halts the transfer
Or gives some other indication that the data is not clean
Disclaimer
I was not part of the data transfer process so I don't have more detail on the specifics of that process.
This is a general problem of what you might call "batch import", "extract-transform-load (ETL)", or "data store migration", disconnected from any particular tech. I'd approach it by:
Export the data into some portable format (e.g. CSV or JSON)
Push the data into the new system through the same validation logic that will handle new data on an ongoing basis.
It's often necessary to modify that logic a bit. For example, maybe your API will autogenerate time stamps for normal operation, but for data import, you want to explicitly set them from the old data source. A more complicated situation would be when there are constraints that cross your models/entities that need to be suspended until all the data is present.
Typically, you write your import script or system to generate a summary of how many records were processed, which ones failed, and why. Then you fix the issues and rerun it on the remaining records. Repeat until you're happy.
P.S. It's a good idea to version control your import script.
Export to CSV and write a small script using Node; that will solve your problem. You can use the fast-csv npm package.

SAP BAPI for vendor creation and editing

Has anyone ever created a BAPI to create or edit a vendor in SAP R/3 4.6c in the background? I found two BAPIs: BAPI_VENDOR_CREATE and BAPI_VENDOR_EDIT, but both only work online (they call transactions XK01 and XK02).
Basically I need a way to call a function module that does the same work as transactions XK01 and XK02 but doesn't need to be called online.
Looking on the SAP community forums, I found a lot of people with the same needs as me, but the answer was never complete.
Could someone give me a suggestion?
As you have discovered, SAP doesn't provide BAPIs for vendor creation/change that can be called in the background (this is, unfortunately, still the case in newer releases).
You have a few possible options:
Create your own BAPI, using (unreleased) SAP function modules for the vendor update.
Create your own BAPI, creating/changing the vendor via a BDC session.
I'd go with option 2. No, BDCs are never ideal and they have a lot of downsides but even a lot of SAP standard function modules for vendor creation seem to go that route and you'll at least be certain that the data in your system is consistent, unlike if you use something like function module VENDOR_INSERT, which does direct table updates without application validation.
Check if the standard vendor data transfer program (RFBIKR00) is in your system - it uses batch input so could be a very useful starting point for your BDC.
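For reference, a minimal sketch of the BDC route. The program/screen/field names below (SAPMF02K, 0100, RF02K-LIFNR) are what a typical XK01 recording starts with, but record your own sequence with transaction SHDB first, since the screen flow depends on release, account group and field status configuration:
DATA: lt_bdcdata TYPE TABLE OF bdcdata,
      ls_bdcdata TYPE bdcdata,
      lt_msgs    TYPE TABLE OF bdcmsgcoll.

"First screen of XK01, as captured by an SHDB recording
ls_bdcdata-program  = 'SAPMF02K'.
ls_bdcdata-dynpro   = '0100'.
ls_bdcdata-dynbegin = 'X'.
APPEND ls_bdcdata TO lt_bdcdata.

"One field value on that screen (illustrative vendor number)
CLEAR ls_bdcdata.
ls_bdcdata-fnam = 'RF02K-LIFNR'.
ls_bdcdata-fval = 'VENDOR001'.
APPEND ls_bdcdata TO lt_bdcdata.

"... further screens and fields from the recording ...

CALL TRANSACTION 'XK01' USING lt_bdcdata
                        MODE 'N'    "no screen display
                        UPDATE 'S'  "synchronous update
                        MESSAGES INTO lt_msgs.
"Check lt_msgs for type 'E'/'A' messages to see whether the vendor was created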

Abap change pointers send complete material master data idoc

I have change pointers set up for material master data, and they are already working. Now I have the requirement to always send the whole material information to the external system. When doing it manually in BD10, I can select the option to send the full material. However, when using change pointers and the program RBDMIDOC, there does not seem to be a proper way to do it.
I have searched around, and the solutions seem to involve creating a custom program (a copy of RBDMIDOC) which manipulates the table BDCP to fool the system into sending the full material information.
This does not look like a good solution.
Any advice on how this requirement can be realized "properly"? I don't think I am the first to have this requirement...
Thanks a lot for your answer(s)!
My solution was to set the fldname in the change pointers to the special value "ALELISTING". This causes the system to send the whole material master data via the IDoc. Note that this might only work for material master data and not for other IDocs.
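For reference, writing such a change pointer programmatically could look roughly like the sketch below. CHANGE_POINTERS_CREATE_DIRECT is the standard FM for creating change pointers directly, but the exact field names of the BDCP-style structure are from memory, so verify them in SE11/SE37 before using this:
DATA: lt_cp TYPE TABLE OF bdcp,
      ls_cp TYPE bdcp.

ls_cp-tabname = 'MARA'.
ls_cp-fldname = 'ALELISTING'.          "special value: send the full material
ls_cp-cdobjcl = 'MATERIAL'.
ls_cp-cdobjid = '000000000000000001'.  "illustrative material number
ls_cp-tabkey  = ls_cp-cdobjid.
APPEND ls_cp TO lt_cp.

"Create the change pointer; RBDMIDOC will pick it up on its next run
CALL FUNCTION 'CHANGE_POINTERS_CREATE_DIRECT'
  EXPORTING
    message_type = 'MATMAS'
  TABLES
    t_cp_data    = lt_cp.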