Is it possible to determine via code if the current system is S/4HANA Cloud Essentials?
I have a requirement in which I must only enable a field if the system is not an S/4HANA Cloud Essentials system.
I have found this question: How to programmatically tell if system is R/3 or S/4, but with class cl_cos_utilities I can only check whether the system is S/4HANA Cloud.
Thanks in advance!
As far as I know, you can't do custom ABAP development in an S/4HANA Cloud Essentials system anyway. If you want to check it manually, maybe you can check the system's full domain name (profile parameter SAPLOCALHOSTFULL) in the RZ11 t-code. I am not sure, but for cloud systems the SAP domain should be visible in SAPLOCALHOSTFULL.
I am working on a project related to Network Function Virtualization. To virtualize my network function, I am planning to deploy it on Oracle VirtualBox. If I deploy my network function on Oracle VirtualBox, does that mean I am complying with the ETSI NFV architecture standards?
If yes, how does Oracle VirtualBox implement the NFV architecture? Any source documentation would be useful.
If no, how much does the Oracle VirtualBox implementation differ from the ETSI standards, and what open-source architecture would be better for implementing my project? Any source documentation on how much Oracle VirtualBox deviates from the ETSI standard would be useful.
Yes. The ETSI NFV Standards do not mandate a specific hypervisor.
As you can see in the Architectural Framework specification, the MANO (Management and Orchestration) component interacts with the virtualization layer by means of a Virtual Infrastructure Manager (VIM), which decouples the orchestration from the specific hypervisor.
The requirements for the MANO - VIM interface are specified here.
If you are starting from scratch, you may want to check out OpenStack as a VIM and OSM as the MANO. Other open-source implementations of both components are also available.
Have a look at the OpenStack VirtualBox drivers to get more information.
Installation information for OSM Release 5 is available here.
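To make the VIM's role concrete, here is a minimal sketch using the openstacksdk Python client to bring up a VNF as a plain VM through OpenStack, without ever referencing the underlying hypervisor; the cloud name and the image/flavor/network names are assumptions for illustration only.

```python
# Minimal sketch: instantiating a VNF as a VM through the VIM (OpenStack),
# with no reference to the underlying hypervisor.
# Requires `pip install openstacksdk`; "my-cloud" and the image/flavor/network
# names are illustrative assumptions, not real resources.
import openstack

conn = openstack.connect(cloud="my-cloud")  # credentials come from clouds.yaml

image = conn.compute.find_image("vnf-base-image")
flavor = conn.compute.find_flavor("m1.small")
network = conn.network.find_network("vnf-mgmt-net")

server = conn.compute.create_server(
    name="my-vnf-instance",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
# Block until the VM is ACTIVE; which hypervisor runs it is the VIM's concern.
server = conn.compute.wait_for_server(server)
print(f"VNF instance up: {server.name} ({server.status})")
```

The point of the sketch is exactly the decoupling described above: the same calls work whether the compute nodes run KVM, Xen, or a VirtualBox driver underneath.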
We are in the process of procuring the hardware for a Worklight installation with Oracle DB. I looked up the prerequisites for the Oracle DB (tablespace/temp) space requirements, but did not find any information on the IBM website. Can you help me determine what would be a good Oracle database (tablespace/temp) space configuration for the Worklight installation?
The database requirements, as well as all other hardware and system requirements, will depend heavily on your mobile requirements. The type of app you plan to develop (consumer and/or enterprise), total number of users, mobile device platforms supported and so forth will all affect your requirements.
Specifically for the database, Worklight Foundation requires one database schema per runtime environment (set of apps, adapters and project-specific configuration) and one database schema for all runtime environments (administration data). If you plan to deploy Application Center you will need an additional database schema, and you might need one extra database schema for reports, which is an optional feature.
The total database size requirement depends on all these factors, with the reporting schema (optional) being by far the most demanding one, requiring daily purges and a robust infrastructure.
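To make the schema accounting concrete, here is a toy sketch of the bookkeeping described above. The counting rules come straight from the text (one schema per runtime environment, one shared administration schema, plus the two optional ones); the schema names are made up for illustration, and actual sizes must come from the sizing resources below.

```python
# Toy sketch of the Worklight schema accounting described above.
# Rules from the text: one schema per runtime environment, one shared
# administration schema, plus optional Application Center and reports schemas.
# The schema names are hypothetical placeholders.
def required_schemas(runtime_environments: int,
                     application_center: bool = False,
                     reports: bool = False) -> list:
    schemas = [f"WLRT{i:02d}" for i in range(1, runtime_environments + 1)]
    schemas.append("WLADMIN")      # administration data, shared by all runtimes
    if application_center:
        schemas.append("APPCNTR")  # Application Center (optional)
    if reports:
        schemas.append("WLREPORT") # reports (optional, by far the most demanding)
    return schemas

# Example: two runtime environments with Application Center, no reports.
print(required_schemas(2, application_center=True))
# ['WLRT01', 'WLRT02', 'WLADMIN', 'APPCNTR']
```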
You can find more information at:
Worklight 6.2 Knowledge Center
Worklight Foundation Hardware Calculator spreadsheet
Worklight Foundation Scalability and Hardware Sizing PDF
I would strongly suggest that you contact your IBM sales representative, who can help you assess your requirements and define the best topology and system/hardware requirements for your need.
I am working in Ektron 8.6.
I am trying to connect a third-party enterprise system to Ektron.
DxH has a pre-built marketing automation connector, Marketo.
My third-party enterprise connector is also a marketing automation system.
I am following the developer webinar to get this done.
Now, my question: even though I have established a connection to the third-party system by means of the ContextBus API, I still need to make some changes in the Workarea files to implement certain functionality.
So is it necessary to have the connection established via DxH?
What is the significance of this compared to handling all of these functionalities from the Ektron site?
Can anyone provide me some insight on this?
If you're asking "why use DxH?" the main answer is its tight integration with Ektron. You can continue using your marketing automation system and it will share relevant data with Ektron. If you are instead looking to perform a wholesale migration away from a particular system, you can certainly do a one-time import of data into Ektron using the APIs. The difference there is that you would be moving away from the 3rd party system, whereas the DxH and its ContextBus allows both systems to work together.
You mentioned a webinar that you are following, so you may have already seen this, but there's a good webinar on the DxH here: http://www.ektron.com/Webinars/Details/Digital-Experience-Hub---Developer-Webinar/
Does anybody know if BLToolkit has been tested and certified for use with Azure SQL, and whether it supports dropped-connection retry functionality? And if not, are there any plans to get it tested and passed?
At the time of writing, there is nothing official on the BL Toolkit website, and no issues listed in their issue tracker for Azure.
There are a few other requests (e.g. here and here) that are requesting the same details. At the moment they are unanswered but you could add your weight to them.
Based on this evidence from the official sources, I would say that the Toolkit is not tested or certified specifically for Azure use.
However, apart from the transient nature of Azure connections, which may require handling database reconnections, there doesn't seem to be anything obvious that would prevent you from using the Toolkit.
I would recommend you raise an issue with the developers regarding Azure Testing and Certification while performing a proof of concept test on Azure to determine how to best handle reconnections on Azure for your specific application.
There is recent activity on the project, so I'd be quite hopeful of a response.
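BLToolkit itself is .NET, but the transient-fault handling you would validate in that proof of concept is language-agnostic. Here is a minimal Python sketch of the retry-with-backoff pattern, with the transient-error test left as a placeholder you would adapt to your driver's actual error codes:

```python
# Language-agnostic sketch of retry-with-backoff for Azure's transient
# connection drops. BLToolkit is .NET; this only illustrates the pattern
# you would reproduce there. is_transient() is a placeholder: real code
# would inspect driver-specific error codes (e.g. Azure throttling errors).
import random
import time

def is_transient(exc: Exception) -> bool:
    return isinstance(exc, ConnectionError)  # placeholder classification

def with_retries(operation, attempts: int = 5, base_delay: float = 0.5):
    for attempt in range(1, attempts + 1):
        try:
            return operation()
        except Exception as exc:
            if attempt == attempts or not is_transient(exc):
                raise
            # Exponential backoff with jitter before retrying.
            delay = base_delay * (2 ** (attempt - 1)) + random.uniform(0, 0.1)
            time.sleep(delay)

# Usage: wrap each unit of database work, opening the connection inside
# the callable so a retry gets a fresh connection:
# result = with_retries(lambda: run_query("SELECT 1"))
```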
I wrote an Azure SQL data provider for BLToolkit. You can find its sources on GitHub.
It's also available via NuGet. Please see how to install and configure it here.
I got the following from the web but don't know how to set up the network printer/scanner/fax/copier (a Ricoh Aficio MP 6001) for KnowledgeLake. The Capture Server has been set up at \\srvcapture\cache.
Capture for MFP
Create batches from multifunction peripherals, fax servers or any other interface.
Enable any capture device to integrate with SharePoint
Batch import documents from multi-function devices
Watch network directories for new documents
Distribute MFPs for decentralized scanning
Support for custom Process Activities
Enable off-hours batch processing by scheduling imports
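For context, here is the kind of watched-folder flow I am hoping to set up. This is only my own sketch with made-up paths (the MFP scanning to a share that gets swept into the Capture Server cache), not anything from the KnowledgeLake documentation:

```python
# My own sketch of a watched-folder flow, NOT KnowledgeLake's actual
# mechanism: poll the share the Ricoh scans into and sweep new files
# to the Capture Server cache. The scan-share path is made up.
import shutil
import time
from pathlib import Path

SCAN_SHARE = Path(r"\\ricoh-mp6001\scans")   # where the MFP drops scans (assumed)
CAPTURE_CACHE = Path(r"\\srvcapture\cache")  # Capture Server cache share

def sweep_once() -> None:
    for doc in SCAN_SHARE.glob("*.tif"):     # assuming the MFP writes TIFFs
        shutil.move(str(doc), str(CAPTURE_CACHE / doc.name))
        print(f"moved {doc.name} to capture cache")

if __name__ == "__main__":
    while True:                              # simple polling loop
        sweep_once()
        time.sleep(30)
```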
The particular information you are looking for can be found in the KnowledgeLake Capture and Capture Server documentation provided on the KnowledgeLake Support Portal at http://support.knowledgelake.com.
KnowledgeLake also has a Professional Services team that specializes in assisting in the implementation and setup of these products in customer environments.
The KnowledgeLake Technical Support team would also be more than happy to assist with any questions you may have if you put in a support ticket. This option is also available through the support portal at http://support.knowledgelake.com.
If you currently do not have an account with access to the KnowledgeLake Support Portal, you can select the "Request a New Account" button at the bottom of the previously referenced link.