I'm migrating from ECC to S/4HANA. I'm trying to fix ATC message of a program.
But I'm stuck with the following error message:
Functionality not available: functional equivalent available
It's relevant to foreign trade which is not supported in S/4HANA.
I have searched on the Internet, and it says EIPO/EIKP are no longer used. So which tables hold the EIPO/EIKP data in the new system?
For example, cudaMemcpy and cuMemcpy? I can see that the function definitions are different, but I mean the APIs in general. Why is there an API starting with cu... and one starting with cuda...? When should each API be used?
The API whose method names start with cu... is the so-called Driver API. The API whose method names start with cuda... is the Runtime API.
Originally (up to CUDA 3.0), the two APIs were completely separate. A rough classification: the Runtime API is simpler and more convenient, while the Driver API is intended for more complex, "low-level" programming (and perhaps library development).
Since CUDA 3.0, both APIs are interoperable. That means that, for example, when you allocate memory with the Driver API using cuMemAlloc, then you can also use the same memory in Runtime API calls, like cudaMemcpy.
The major practical difference was that in the Runtime API, you could use the special kernel<<<...>>> launching syntax, whereas in the Driver API, you could load your CUDA programs as "modules" (with methods like cuModuleLoad), given in form of CUBIN files or PTX files, and launch these kernels programmatically using cuLaunchKernel.
In fact, I think that for the largest part of a CUDA program, the differences are negligible: nearly every other functionality (except for kernel/module handling) is available in both APIs, and is nearly identical in both. This applies to functions (cuMemcpy and cudaMemcpy, etc.) as well as to structures (CUevent and cudaEvent_t, etc.).
Further information can be found with websearches involving the keywords "CUDA Runtime Driver API", for example, at https://devtalk.nvidia.com/default/topic/522598/what-is-the-difference-between-runtime-and-driver-api-/
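To make the interoperability concrete, here is a minimal sketch (all error checking omitted; it assumes a CUDA 4.0+ toolkit and an installed device) that allocates memory with the Driver API and then uses that same allocation from the Runtime API, including a kernel launch with the <<<...>>> syntax:

```cuda
#include <cuda.h>          // Driver API  (cu... functions, CUdeviceptr, CUcontext)
#include <cuda_runtime.h>  // Runtime API (cuda... functions)
#include <cstdio>

// A trivial kernel, launched below with the Runtime API's <<<...>>> syntax.
__global__ void scale(float *data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

int main()
{
    const int n = 256;
    float host[n];
    for (int i = 0; i < n; ++i) host[i] = 1.0f;

    // Driver API: explicit initialization, device/context handling,
    // and allocation via cuMemAlloc.
    cuInit(0);
    CUdevice dev;
    cuDeviceGet(&dev, 0);
    CUcontext ctx;
    cuCtxCreate(&ctx, 0, dev);

    CUdeviceptr dptr;
    cuMemAlloc(&dptr, n * sizeof(float));

    // Interoperability: the very same allocation is usable from the
    // Runtime API -- a CUdeviceptr is just a device address.
    cudaMemcpy((void *)dptr, host, n * sizeof(float), cudaMemcpyHostToDevice);

    scale<<<(n + 127) / 128, 128>>>((float *)dptr, 2.0f, n);

    cudaMemcpy(host, (void *)dptr, n * sizeof(float), cudaMemcpyDeviceToHost);
    printf("host[0] = %.1f\n", host[0]); // 2.0 if a CUDA device is present

    cuMemFree(dptr);
    cuCtxDestroy(ctx);
    return 0;
}
```

The same buffer could equally well have been filled with cuMemcpyHtoD and launched with cuLaunchKernel after a cuModuleLoad; since the interop was introduced, the choice is mostly one of convenience.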
I am trying to evaluate the capabilities of the below IBM Rational tools to handle functional testing automation and performance testing for the "Finacle - Infosys" core banking system.
IBM Rational Functional Tester (RFT)
IBM Rational Performance Tester (RPT)
Also, I would like to confirm whether any additional components need to be added to either tool to support my testing needs (e.g. a plugin to test web services, handling Oracle NCA JInitiator, etc.).
Appreciate your kind help and support.
Note: I apologize for posting the question twice on the "Software Quality Assurance and Testing" and the main "StackOverFlow" sites.
Rational Functional Tester (RFT) is a functional/GUI testing and automation tool.
I am not sure about the application you are using; however, RFT provides very good support for HTML-, Java-, .NET-, and Windows-based applications (to name a few).
I am giving a brief description on what RFT can do and then you can decide if it meets your requirement for functional testing or automation.
RFT can be installed into the Eclipse IDE, the Visual Studio IDE, or both, and you can use whichever you are comfortable with.
RFT gives you the ability to create your test scripts using a recorder, which simply records the actions you perform on your application, capturing the objects you interact with along the way and saving them to an object map. You can create verification points (data verification points, property verification points, etc.) or use a datapool to run tests with different sets of data.
The generated script can also be fine-tuned after recording, if required.
Once you have a script ready, you can play it back; RFT then executes the actions the script contains, using the object map to determine the properties and hierarchy of each object in order to find the controls and perform actions on them. It may also use any saved verification points to verify data or properties, or use the datapool to data-drive the script.
Another way you could use RFT is by harnessing the power of descriptive programming (you would require some basic Java/VB skills).
Using descriptive programming, you essentially create the script on your own without using the recorder. You can use the find() API provided by RFT to locate objects and then perform actions, verification tests, etc. With descriptive scripting we usually unbind ourselves from the object map, since we are defining what to find and where to find it.
A combination of the two can also be used, where some objects come from the object map and some are found using the find() API. This depends entirely on the kind of test case you want to write and the type of application you are automating/testing.
You could also use simplified scripting, which records a script in a plain-English-like format and also captures application visuals, enabling a user to modify a script from the captured visuals even when the application is offline.
As checked and confirmed with IBM representatives, and according to the IBM portal (Link Here), both Rational Performance Tester (RPT) and Rational Functional Tester (RFT) should be supported.
We will have a project running to conduct both functional test automation and performance testing for the "Finacle - Infosys" core banking system during the coming 12 months, after which I can post any updates on this question of mine.
Is anybody aware of SAP modules being built using PowerBuilder, or of any roadmap in place for such development in the future?
There are some modules built in PowerBuilder. As we become more familiar with our SAP side of the business, we hope to learn about more. We've met with the teams that build these modules, we understand their roadmap, and we've discussed the PowerBuilder roadmap with them as well.
Are you using PB to customize SAP? I'd be interested in discussing more of what prompted your question - please email me! I am out of the office today and tomorrow but will be back next week.
Sue Dunnell
PowerBuilder Product Manager
dunnell#sybase.com
If I'm understanding correctly, you want to call existing BAPIs from PowerBuilder; for example, creating a Purchase Requisition using the built-in BAPI method CreateFromData? I created a couple back in version 8 or 9 of PB, and while it was quite a challenge, it is totally doable. I have code for creating Purchase Requisitions, and code which calls a custom in-house-written BAPI.
I actually thought about building an API for all the SAP BAPIs because of how difficult it was; I can usually find the answer to anything on the web, but not this — I had to trial-and-error it.
Two hints. First, study the BAPI in the SAP Object Explorer (or maybe it was called the BAPI Browser; I can't remember), and note that SAP shows the parameters in a different order depending on which way you look at them. To make them work in PB you have to pass the parameters in exactly that order, unlike the Microsoft languages, where you can use named arguments and such. Second, be sure to make all REF types REF. Get any one item wrong and you get the dreaded system crash, which tells you nothing. If PowerBuilder had not started losing popularity, I would have kept writing APIs for various BAPIs.
Contact me if you want any sample code. I can't give any exact code from one of my corporate clients but would be happy to get you going in the right direction.
Sincerely,
R
I have some questions from customers about NF mode for DB2. Google turned up very little information about it.
I've been able to infer the following but I don't completely trust it...
NF and CM (compatibility mode) are settings on DB2 v8 on mainframe.
DB2 v8 on z/OS in CM is designed to allow DB2 v8 to be used as a drop-in replacement for DB2 v7.
NF makes numerous changes that break backward compatibility.
In particular the "data dictionary" system tables are completely reworked.
I suspect the following, but I'm even less sure of it...
The reworked data dictionary and system tables are similar to those used by DB2 v8 on Linux/Unix/Windows.
DB2 v8 NF is largely compatible with DB2 v9.
DB2 v9 is much more consistent across platforms (Linux/Unix/Windows and mainframe).
Code written to work with DB2 v8 NF will generally work properly with DB2 v8 on LUW and DB2 v9 anywhere.
Can anyone tell me if I am right? Or add more detail?
These functional modes are basically just ways to do your planned upgrades. Both DB2/z v8 and v9 (and probably all the ones to come) have three modes:
compatibility (CM).
enable new function (ENFM).
new function (NFM).
It's used to ensure a smooth transition between versions. For example, all DBMSs in a group must be upgraded to v8 CM before any of them can be switched to v8 ENFM.
Compatibility mode for vX means you're using vX but with none of the new function (in other words, it is equivalent to v(X-1)). If you're running v7 at the moment, you're unsupported; you should really be running v8 in CM (if you care about support — and believe me, if you're paying IBM those huge license fees, you should care :-).
Enable new function mode is when the database system tables have been updated to use new function but not user tables.
New function mode means that new functionality has been enabled for both system and user tables.
The actual content of the new function depends on the version.
See here for the v8 What's New information. Basically chapter 2 lists all the new functionality:
More SQL goodies.
More security.
Better compatibility with DB2/z's smaller siblings (LUW), including Unicode.
Scalability/Performance.
Availability (very important in the mainframe world).
By the way, IBM makes all its documentation available on the web for perusal, the top level of the public libraries being found here.
Our product currently spans a large number of technologies, including Java, PL/SQL, VB.NET, and ABAP. We have a fairly mature source control and build system set up for all of the languages except ABAP, which is still in the Stone Age. Since SAP has a build system set up within it, our engineers do all of their development in an SAP environment, export transports, and check those into source control. Since we support a number of SAP versions, it becomes very difficult to track versions and migrate code across 4.6, 4.7, 5.0, etc.
My ideal process would be to check the ABAP code into source control in text files, and then load it into SAP and generate the transports as part of the build process. The SAP engineers don't think there are tools to support this model.
If you are managing ABAP code in a source control system, what does your process look like? Are there tools available (preferably command-line) for loading ABAP code into SAP? How do your engineers manage the code/test/debug cycle? Do they code in SAP and then export the code when finished, or edit in an external editor?
I used SAPlink (mentioned in a previous answer) for that purpose. There is also a related project called ZAKE that can supposedly automate some of the tasks, but I never used it. I simply exported my code manually to so-called slinkees (these contain single objects like function groups; nuggets, on the other hand, contain several objects).
Reasons to use some external source control system:
correlation with non-ABAP source code (our software consisted of .NET and ABAP code)
hosting/maintaining SAP was not something we were exactly good at, so it was good to know your code was in a safe place
One thing though: you need at least WAS 6.20 in order to use SAPlink.
I'm interested in what the benefit would be of version control outside the ABAP stack of the SAP system.
I've never seen anyone use external source code control for ABAP, as it's built right in. I've never seen anyone code ABAP outside the SAP system either. It really doesn't fit the model.
SAP's ABAP stack is a single-development system environment. All the developers log on to the one system and develop there. The system records versions automatically, and groups changed objects into transports. A transport is just a list of changed objects. Once you export the transport, the version numbers are incremented for each object and you get the package for the other systems.
The ABAP stack also doesn't really have a "build" concept as such. Everything you do is a patch.
Also check out SAP CTS+, which is used for managing transports and version control of ABAP- and Java-based components.
https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/e0249083-c0ab-2a10-78b8-b7a7854b1070
At the very least, modifications should be made and tested in an SAP development system. Nobody uses an external editor with ABAP. (SAP Java, on the other hand...) There is no reason why you can't keep backups of the SAP code, either directly as text files or (preferably) with SAPlink or transport dump files. (Ask your Basis people about the transport files.) Realize that if you go the text-file route, you might miss out on things like field texts, etc., which are stored elsewhere in the database.
Hi,
As Dom told you, SAP has its own version management. However, in order to make regular saves between transport releases, you might use tools like:
SAPlink (as Wili aus Rohr said)
ZAPLink
These tools can be used to extract ABAP components into XML. I really do not advise making automatic imports into SAP, for several reasons:
these tools come with no guarantees
not all ABAP components can be handled this way
you will lose SAP's guarantee if you do this on a productive SAP system
But it might be interesting to use tools like (Google code) to display software changes in detail, which can be more complicated with ABAP Objects.
I developed this on the ZAPLink framework: the ZAPLINK_EXTRACTOR program exports SAP components into XML only when they have changed. This prevents an XML file from being rewritten with identical content (a new file, but the same content) and being detected as a change by tools such as Mercurial.
Hope it helps.
Keep in mind that you should use SAP tools to change SAP components. An SAP consultant can explain this to you in detail.
Taryck.
Steria (France), http://www.steria.com
The 2020 answer to the question is simply: Use AbapGit.
It gives you all the advantages of modern version control, is fully documented, open source and works like a charm.