I may be completely off-track, but is it possible to use a BO universe as a source for SSIS/SSRS?
We've recently been given a task where we have to tap into BO to build SSRS reports. Can we do that?
And, if need be, can we use it as a source for SSIS for further transformation?
Also, please help by providing a link on how to do it. Google only gave me links where SQL Server is the source rather than the destination.
Thank you for your help.
You can, if:
you can query the universe through the REST interface (i.e. a web service)
you can consume the OData feed that is returned as the result set (see the sketch below)
For more information, have a look at the SAP BusinessObjects BI Universe Connector Technical Guide.
There are some requirements regarding the version of BusinessObjects and the type of universe you're using:
SAP BusinessObjects BI 4.1 SP2 or later is supported
the universe has to be created with the Information Design Tool (i.e. have a .UNX extension)
The current limitations are described in the aforementioned article.
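To make the REST/OData part more concrete, here is a minimal Python sketch of pulling an OData-style result set over HTTP so it can be staged for SSIS/SSRS. The host, port, logon token, and query path are placeholders, not the connector's actual API; check the RESTful SDK documentation of your BI 4.1 deployment for the real endpoints.

```python
# Minimal sketch: fetch an OData/JSON result set over REST and print a few rows.
# BASE_URL, the logon token, and the query path below are hypothetical placeholders.
import requests

BASE_URL = "http://bo-server:6405/biprws"  # placeholder host/port
HEADERS = {
    "Accept": "application/json",
    "X-SAP-LogonToken": "<logon token>",   # placeholder; obtain via the logon endpoint
}

def fetch_result_set(query_path):
    """Fetch an OData-style result set and return its rows as a list of dicts."""
    response = requests.get(f"{BASE_URL}/{query_path}", headers=HEADERS, timeout=60)
    response.raise_for_status()
    payload = response.json()
    # OData v2 feeds typically wrap rows in d/results; adjust to the shape
    # your connector actually returns.
    return payload.get("d", {}).get("results", [])

if __name__ == "__main__":
    for row in fetch_result_set("raylight/v1/queries/123/data")[:5]:  # hypothetical path
        print(row)
```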
You cannot use Business Objects Universes as a data source in SSIS/SSRS. Universes do not contain data, they are simply an abstraction layer that generates SQL for reporting.
You also cannot link SSRS/SSIS to Business Objects to use universes to generate SQL for downstream use. The links simply do not exist.
Following the Pentaho guide (https://help.pentaho.com/Documentation/8.2/Setup/Installation/Archive/MySQL_Repository), I successfully converted the Pentaho file-based repository to a MySQL database repository.
Now, does anyone have any idea how the MySQL repository stores its data in the database? That is, if I create a new folder, a new dashboard, or a new connection, how does Pentaho store this data in the MySQL database? I also need to know which tables are used for which purpose.
The default schemas and tables were created based on the MySQL Pentaho repository setup.
Please provide any input or reference material on this.
Pentaho's repository comprises three third party technologies: Jackrabbit, Hibernate, and Quartz. Reports/Jobs/Transformations and any other artifacts stored inside the Pentaho Server are generally stored in Jackrabbit. Scheduling info and triggers are stored in Quartz. And diagnostic info is stored in Hibernate (such as who accessed what reports, how long a report took to run, etc.).
None of this information is designed to be human-readable directly from the database tables; these are "black box" third-party technologies that Pentaho simply leverages for its repository functions. If you have additional questions, I'd recommend checking out the technologies themselves on their project pages.
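If you just want to see which subsystem owns which tables, a quick sketch like the one below can help. It assumes the default schema names from Pentaho's MySQL setup scripts (jackrabbit, quartz, hibernate); adjust the names and credentials to your installation.

```python
# List the tables in each repository schema to see which subsystem owns what.
# Assumes the default schema names from Pentaho's MySQL setup scripts
# (jackrabbit, quartz, hibernate); adjust to your installation.
import mysql.connector  # pip install mysql-connector-python

SCHEMAS = ["jackrabbit", "quartz", "hibernate"]

conn = mysql.connector.connect(host="localhost", user="root", password="secret")
cursor = conn.cursor()

for schema in SCHEMAS:
    cursor.execute(
        "SELECT table_name FROM information_schema.tables "
        "WHERE table_schema = %s ORDER BY table_name",
        (schema,),
    )
    tables = [row[0] for row in cursor.fetchall()]
    print(f"{schema}: {len(tables)} tables")
    for table in tables:
        print(f"  {table}")

cursor.close()
conn.close()
```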
Is there a standard SAP function module that transfers data for new items from the ERP to the CRM system?
It depends on what kind of items you want to replicate:
For business partners there are standard functions which are adjustable from SPRO
For vendor data you can utilize the FSSC_BP_REPLICATION or CRM_VENMAP_TO_CRMM_BUT_VENDNO reports
Order replication is triggered automatically and you do not have to do anything, just ensure that the BAdI CHANGE_BEFORE_UPDATE is active
Also check the most common problems that are faced during distribution and their solutions: TIPS to check the distribution of documents between CRM and ECC
Manual ERP->CRM table replication is not recommended by SAP; however, there are also special tools for this, for example SAP Data Services and SLT:
https://www.guru99.com/sap-ds-sap-data-services-in-sap-hana.html
https://blogs.sap.com/2018/10/30/slt-configuration-for-data-replication-from-s4hana-fashion-to-sap-car-system/
https://blogs.sap.com/2015/05/15/transformation-capabilities-of-sap-slt-vs-data-services/
Is there a good data integration tool available for SAP HANA that can accomplish the following:
Consuming data periodically (at a user-defined interval) from a REST-based web service (a simple URL returning XML)
Parsing the XML and extracting the data
Populating the associated table
I am aware that SQL Server Integration Services is one such tool available for Microsoft SQL Server, which does the above. I would like to know the equivalent in HANA. I did explore the SAP Cloud Integration service and the BusinessObjects Data Services tool, but would like a first-hand opinion on them.
SAP HANA Smart Data Integration (SDI) is a standard product option that you can use for this. As it is a feature of SAP HANA, no additional server is required for this solution. This blog gives a good overview: SAP SDI BLOG.
Data Services can of course also be used for that, but it would probably be the 'Ferrari for driving to the bakery'.
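For comparison with what an SDI flowgraph would do, here is a hand-rolled sketch of the same flow in Python, using requests, the standard-library XML parser, and SAP's hdbcli driver. The feed URL, XML element names, and target table are hypothetical; running it at the desired interval would be handled externally (cron, an XS job, etc.).

```python
# Hand-rolled sketch of the REST -> XML -> HANA table flow.
# FEED_URL, the XML element names, and the target table are hypothetical.
import xml.etree.ElementTree as ET

import requests
from hdbcli import dbapi  # SAP HANA Python client

FEED_URL = "https://example.com/feed.xml"  # placeholder endpoint

def load_feed_into_hana():
    xml_text = requests.get(FEED_URL, timeout=60).text
    root = ET.fromstring(xml_text)

    # Assumed XML shape: <items><item><id>..</id><value>..</value></item>...</items>
    rows = [(item.findtext("id"), item.findtext("value"))
            for item in root.findall("item")]

    conn = dbapi.connect(address="hana-host", port=30015,
                         user="LOADER", password="secret")
    try:
        cursor = conn.cursor()
        cursor.executemany(
            'INSERT INTO "MY_SCHEMA"."FEED_DATA" (ID, VALUE) VALUES (?, ?)',
            rows,
        )
        conn.commit()
    finally:
        conn.close()

if __name__ == "__main__":
    load_feed_into_hana()
```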
I have been working with Pentaho for the last few days. I have been able to set up Pentaho Report Designer to generate a sample report by following their documentation. Then I followed this article http://www.robertomarchetto.com/www/how_to_use_pentaho_report_designer_tutorial and managed to export the report to the Pentaho BI Server.
What I don't understand is the Pentaho workflow. What process should I follow, i.e. what is the purpose of exporting the report to the Pentaho BI Server? Why is there a Data Integration tool? Why is there a BI Server when I can export the report from the Designer tool?
Requirement
All I want to do is retrieve data from the MySQL DB, put it into a data mart, and then generate a report from the data mart. (According to what I have read, creating a data mart is the efficient way.)
How can I get it done?
Pentaho Data Integration can be used to automate this report generation.
In Report Designer you pass a parameter or a set of parameters to generate a single report output.
With Data Integration you can generate the reports for different sets of parameters. For example, if reports are generated on a daily basis, you can automate this for the whole month, so that there is no need to generate reports manually every day.
And using the Pentaho Business Intelligence Server, all of these operations can be scheduled.
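To make the "whole month" example concrete, here is a rough sketch that calls PDI's kitchen command-line runner once per report date. The job file name and the REPORT_DATE parameter are assumptions; the job itself would contain the report-generation steps.

```python
# Rough sketch: run a PDI job once per report date for a whole month.
# The job file (daily_report.kjb) and parameter name (REPORT_DATE) are
# assumptions -- use whatever your own job actually defines.
import subprocess
from datetime import date, timedelta

KITCHEN = "/opt/pentaho/data-integration/kitchen.sh"  # adjust to your install
JOB_FILE = "/jobs/daily_report.kjb"                   # hypothetical job

start = date(2023, 1, 1)
for offset in range(31):
    report_date = start + timedelta(days=offset)
    subprocess.run(
        [KITCHEN, f"-file={JOB_FILE}", f"-param:REPORT_DATE={report_date.isoformat()}"],
        check=True,  # stop if a run fails
    )
```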
To generate data/tables (fact tables/dimension tables) in the MySQL DB from different sources like files or other DBs - the Data Integration tool comes into the picture.
To create a schema on top of the fact tables - the Mondrian tool
To handle users/roles on top of the created cubes - the Metadata Editor
To create simple reports on top of small tables - Report Designer
For sequential execution (in one go) of DI jobs/transformations, reports, and JavaScript - Design Studio
thanks to user surya.thanuri # forums.pentaho.com
The Data Integration tool is mostly for ETL; it's a separate tool and you can ignore it unless you are doing complex analysis of data from multiple dissimilar data sources. You don't need to 'export' reports to the Pentaho server, you can write them directly to a directory and then refresh the repository from inside the Pentaho web application. Exporting them is just one workflow technique.
You're going to find that there are about a dozen ways to do any one thing with Pentaho. For instance, I use CDA data sources with my reports instead of placing the SQL code inside my report. Alternatively, you can link up to a Data Integration server to execute the Data Integration scripts and view a result set.
Just to answer your data mart question: in general, a data mart should probably be supported by either the Data Integration tool (depending on your situation I don't exactly recommend this) or database functions/replication streams (recommended).
Just to hazard a guess, it sounds like someone tossed you a project saying: We need a BI system, here's the database where the data is stored, here are the reports we're already getting. X looked at Pentaho and liked it. You should use that.
The first thing you need to do is understand the shape of the data: volume, tables, interrelations. Figure out what the real questions they want to answer are. Determine whether they need real-time reporting, etc. Just getting the data mart together itself, if you even need one, can take quite a while. I think you may have jumped the gun on Pentaho itself.
thanks to user flamierd # forums.pentaho.com
Hello, I would like some advice, links, etc. on importing legacy data from an external system into SAP CRM 7.0.
We are currently using a different SQL-based CRM system and are moving to SAP CRM 7.0. I need to export all the SQL database data and feed it into SAP CRM 7.0. What tools are available to me, or what can I do to accomplish this task?
I am very new to SAP CRM 7 and SAP in general. I guess you can assume that the data will be a text file (maybe a CSV). How will I handle 1->many relationships, etc.?
Jon
I assume the data will be several CSV files - one per source table. I don't know a lot about CRM, but you might want to take a look at the Legacy System Migration Workbench (LSMW) - a great tool for importing data into SAP R/3-based systems. I assume you already know the docs at http://help.sap.com/crm. There are also some best-practice documents available (navigate to Implementation Project --> Data Migration) - but you'll need a SAPnet Service Marketplace user ID (S.....) for this.
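On the 1->many question: if the legacy export comes as one denormalized file, a common preparation step is to split it into a header file and an item file linked by a key before loading, since LSMW-style loads usually expect one file per structure. A small sketch, with made-up column names:

```python
# Hedged sketch: split a denormalized legacy export into a header CSV and an
# item CSV linked by a key. Column names (customer_id, name, item_no, product)
# are made up -- map them to your real legacy fields.
import csv

with open("legacy_export.csv", newline="", encoding="utf-8") as src, \
     open("headers.csv", "w", newline="", encoding="utf-8") as hdr_out, \
     open("items.csv", "w", newline="", encoding="utf-8") as item_out:
    reader = csv.DictReader(src)
    headers = csv.writer(hdr_out)
    items = csv.writer(item_out)
    headers.writerow(["customer_id", "name"])
    items.writerow(["customer_id", "item_no", "product"])

    seen = set()
    for row in reader:
        key = row["customer_id"]
        if key not in seen:  # write one header record per customer
            seen.add(key)
            headers.writerow([key, row["name"]])
        items.writerow([key, row["item_no"], row["product"]])
```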
Your options may vary depending on what licenses or tools you have at your fingertips and what backend is behind your legacy CRM, but for interchanging data between CRM and external systems SAP generally recommends the SAP CRM Integration Services, which have several adapters for different data types.
SAP MDG is a more full-fledged solution, equipped with more tools. CRM also has industry-specific tools, for example for utilities.
Also check this answer of mine, which was written for an MSSQL database but is fully relevant for any DB.