Automated mapping in Groovy for ODI with a dynamic expression

This is my first question and I hope you can help me. I wrote a script in Groovy (in Oracle Data Integrator 12c) to automate mappings. Here is a description of my procedure:
Step 1: remove the old mapping if it exists.
Step 2: look up the project and the folder (if they don't exist, create new ones).
Step 3: create the new mapping.
Step 4: add the source and target tables.
Step 5: create the expression.
Step 6: link every column.
Now my question: can someone help me make this script build the expression dynamically? Like this:
Step 1: get the data types of the target columns.
Step 2: put the right data types into the expression.
Step 3: change the wrong types (always VARCHAR) into the right ones (NUMBER, DATE, or still VARCHAR).
Step 4: link every column.
My handicap: I have never done anything with Groovy, and I'm not very good at Java, so I can't make this dynamic myself. Almost everything in my script is pieced together from various internet sites. It would be great to find someone who knows about this problem, and I think this script would be useful for everyone who is moving from OWB to ODI.
Thanks!
//Created by ODI Studio
//
//name of the project
projectName = "SRC_TO_TRG"
//name of the folder
ordnerName = "FEN_TEST"
//name of the mapping
mappingName = "MAP1_FF_TO_TRG"
//name of the model
modelName = "DB_FEN"
//name of the source datastore
sourceDatastoreName = "SRC_TEST_FEN"
//name of the target datastore
targetDatastoreName = "TRG_TEST_FEN"
import oracle.odi.domain.project.finder.IOdiProjectFinder
import oracle.odi.domain.model.finder.IOdiDataStoreFinder
import oracle.odi.domain.project.finder.IOdiFolderFinder
import oracle.odi.domain.project.finder.IOdiKMFinder
import oracle.odi.domain.mapping.finder.IMappingFinder
import oracle.odi.domain.adapter.project.IKnowledgeModule.ProcessingType
import oracle.odi.domain.model.OdiDataStore
import oracle.odi.core.persistence.transaction.support.DefaultTransactionDefinition
//imports for the classes used below
import oracle.odi.domain.project.OdiProject
import oracle.odi.domain.project.OdiFolder
import oracle.odi.domain.mapping.Mapping
import oracle.odi.domain.mapping.component.DatastoreComponent
import oracle.odi.domain.mapping.component.ExpressionComponent
//set expression to the component
def createExp(comp, tgtTable, propertyName, expressionText) {
DatastoreComponent.findAttributeForColumn(comp, tgtTable.getColumn(propertyName)).setExpressionText(expressionText)
}
//delete mapping with the same name
def removeMapping(folder, map_name) {
txnDef = new DefaultTransactionDefinition()
tm = odiInstance.getTransactionManager()
tme = odiInstance.getTransactionalEntityManager()
txnStatus = tm.getTransaction(txnDef)
try {
Mapping map = ((IMappingFinder) tme.getFinder(Mapping.class)).findByName(folder, map_name)
if (map != null) {
odiInstance.getTransactionalEntityManager().remove(map);
}
} catch (Exception e) {e.printStackTrace();}
tm.commit(txnStatus)
}
//looking for a project and folder
def find_folder(project_code, folder_name) {
txnDef = new DefaultTransactionDefinition()
tm = odiInstance.getTransactionManager()
tme = odiInstance.getTransactionalEntityManager()
txnStatus = tm.getTransaction(txnDef)
pf = (IOdiProjectFinder)tme.getFinder(OdiProject.class)
ff = (IOdiFolderFinder)tme.getFinder(OdiFolder.class)
project = pf.findByCode(project_code)
//if there is no project, create new one
if (project == null) {
project = new OdiProject(project_code, project_code)
tme.persist(project)
}
//if there is no folder, create new one
folderColl = ff.findByName(folder_name, project_code)
OdiFolder folder = null
if (folderColl.size() == 1)
folder = folderColl.iterator().next()
if (folder == null) {
folder = new OdiFolder(project, folder_name)
tme.persist(folder)
}
tm.commit(txnStatus)
return folder
}
//name of the project and the folder
folder = find_folder(projectName,ordnerName)
//delete old mapping
removeMapping(folder, mappingName)
txnDef = new DefaultTransactionDefinition()
tm = odiInstance.getTransactionManager()
tme = odiInstance.getTransactionalEntityManager()
txnStatus = tm.getTransaction(txnDef)
dsf = (IOdiDataStoreFinder)tme.getFinder(OdiDataStore.class)
mapf = (IMappingFinder) tme.getFinder(Mapping.class)
//create new mapping
map = new Mapping(mappingName, folder);
tme.persist(map)
//insert source table
boundTo_emp = dsf.findByName(sourceDatastoreName, modelName)
comp_emp = new DatastoreComponent(map, boundTo_emp)
//insert target table
boundTo_tgtemp = dsf.findByName(targetDatastoreName, modelName)
comp_tgtemp = new DatastoreComponent(map, boundTo_tgtemp)
//create expression-operator
comp_expression = new ExpressionComponent(map, "EXPRESSION")
// define expression
comp_expression.addExpression("LAND_KM", "TO_NUMBER(SRC_TEST_FEN.LAND_KM)", null,null,null);
comp_expression.addExpression("DATE_OF_ELECTION", "TO_DATE(SRC_TEST_FEN.DATE_OF_ELECTION, 'DD.MM.YYYY')", null,null,null);
//further transformations can be appended here
//link source table with expression
comp_emp.connectTo(comp_expression)
//link expression with target table
comp_expression.connectTo(comp_tgtemp)
createExp(comp_tgtemp, boundTo_tgtemp, "ABBR", "SRC_TEST_FEN.ABBR")
createExp(comp_tgtemp, boundTo_tgtemp, "NAME", "SRC_TEST_FEN.NAME")
createExp(comp_tgtemp, boundTo_tgtemp, "LAND_KM", "EXPRESSION.LAND_KM")
createExp(comp_tgtemp, boundTo_tgtemp, "DATE_OF_ELECTION", "EXPRESSION.DATE_OF_ELECTION")
tme.persist(map)
tm.commit(txnStatus)

You can pass the datatype as the third argument of the method addExpression.
You can also pass the size and the scale as the fourth and fifth arguments.
For instance, for the LAND_KM expression, replace your line with this:
MapAttribute map_attr = DatastoreComponent.findAttributeForColumn(comp_tgtemp,boundTo_tgtemp.getColumn("LAND_KM"))
comp_expression.addExpression("LAND_KM", "TO_NUMBER(SRC_TEST_FEN.LAND_KM)", map_attr.getDataType(),map_attr.getSize(),map_attr.getScale());
It retrieves the target column for LAND_KM thanks to findAttributeForColumn, then retrieves the datatype, the size and the scale, and uses them when adding the new expression to the Expression component.
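To make the whole thing dynamic (steps 1-3 of the question), the same idea can be applied in a loop over the target datastore's columns. Here is a hedged Groovy sketch reusing the objects from the script above; the type-name checks and the TO_DATE format mask are assumptions about your model, and it routes every column (including plain VARCHAR passthroughs) through the EXPRESSION component:
// Sketch: build one expression per target column, typed from the target column.
boundTo_tgtemp.getColumns().each { col ->
    def attr = DatastoreComponent.findAttributeForColumn(comp_tgtemp, col)
    def typeName = col.getDataType()?.getName()?.toUpperCase() ?: ""
    def src = "SRC_TEST_FEN." + col.getName()
    def exprText
    if (typeName.contains("NUMBER")) exprText = "TO_NUMBER(" + src + ")"
    else if (typeName.contains("DATE")) exprText = "TO_DATE(" + src + ", 'DD.MM.YYYY')"
    else exprText = src // keep VARCHAR columns as they are
    comp_expression.addExpression(col.getName(), exprText, attr.getDataType(), attr.getSize(), attr.getScale())
    createExp(comp_tgtemp, boundTo_tgtemp, col.getName(), "EXPRESSION." + col.getName())
}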
If you want to auto map it based on the name, David Allan wrote a post on the official Oracle blog about how to do it and he provides his code : https://blogs.oracle.com/dataintegration/entry/odi_12c_mapping_sdk_auto

Related

SSRS/ASP.NET Setting all required report input parameters but still getting: One or more parameters required to run the report have not been specified

Although this has been asked before, none of the answers provided solve my issue. I AM passing in all of the required report parameters, and all have HasValidValue set in the State property of the parameter array of ExecutionInfo. I am using ReportExecution2005.asmx in my ASP.NET web application. There is no GetParameters() method on the ReportExecution2005.ReportExecutionService object.
My report calls a subreport, so the input parameters are identical for the main report and the subreport. There are 4 single-value inputs (two integer, two text) and then one multivalue text input parameter which I supply multiple values for. I have double- and triple-checked that the values I'm setting in the multivalue input parameter exactly match one of the default parameters.
Here is the code that sets up the report and renders/writes it out to a PDF file:
protected void btnPrintReq_Tests_Click(object sender, EventArgs e)
{
string query = string.Format("SELECT PatientFName, PatientLName, Address1 + ' ' + Address2 AS Address, City, Region AS State, PostalCode, Email, Phone, Gender, DOB, CONVERT(VARCHAR, ClinicID) AS ClinicID FROM Tbl_ShippingDropShip WHERE RequestID = {0}", giRequestID.ToString());
string sClinicID = string.Empty;
try
{
//get the drop ship test order details
SqlDataReader reader = default(SqlDataReader);
reader = o_cSQL.RunSQLReturnDataReader(query);
REService.ReportExecutionService rs_ext = new REService.ReportExecutionService();
rs_ext.Credentials = System.Net.CredentialCache.DefaultCredentials;
rs_ext.Url = ConfigurationManager.AppSettings["ReportExecutionServer"];
List<string> ordereditems = new List<string> { };
ordereditems = GetOrderedItems(); //panels and profiles
int numordereditems = ordereditems.Count;
List<String> comments = new List<String>();
if (cbIce.Checked)
comments.Add("Ice");
if (cbKit.Checked)
comments.Add("Kit");
if (cbSwabs.Checked)
comments.Add("Swabs");
if (cbGoldTop.Checked)
comments.Add("Gold Top");
if (cbNeedlePack.Checked)
comments.Add("Needle Pack");
if (cbButterflyPack.Checked)
comments.Add("Butterfly Pack");
if (comments.Count == 0)
comments.Add("");
byte[] result = null;
string historyID = null;
string devInfo = null;
string encoding;
string mimeType;
string extension;
REService.Warning[] warnings = null;
string[] streamIDs = null;
//define the size of the parameter array to pass into the REService
REService.ParameterValue[] rValues = new REService.ParameterValue[4 + numordereditems];
while (reader.Read())
{
sClinicID = reader["ClinicID"].ToString();
rValues[0] = new REService.ParameterValue();
rValues[0].Name = "ClinicID";
rValues[0].Value = sClinicID;
rValues[1] = new REService.ParameterValue();
rValues[1].Name = "ReqFormID";
rValues[1].Value = giRequestID.ToString();
rValues[2] = new REService.ParameterValue();
rValues[2].Name = "DropShipID";
rValues[2].Value = giDropShipID.ToString();
rValues[3] = new REService.ParameterValue();
rValues[3].Name = "Comments";
rValues[3].Value = string.Join(",",comments.ToArray());
int j = 0;
for (int i = 4; i < rValues.Length; i++)
{
rValues[i] = new REService.ParameterValue();
rValues[i].Name = "PanelsNProfiles";
rValues[i].Label = "Panels and Profiles :";
rValues[i].Value = ordereditems[j++];
}
}
REService.ExecutionInfo res_info = new REService.ExecutionInfo();
res_info = rs_ext.LoadReport("/zReportSandbox/RequisitionDS", historyID);
res_info = rs_ext.SetExecutionParameters(rValues, "en-us");
res_info = rs_ext.GetExecutionInfo();
result = rs_ext.Render("PDF", devInfo, out extension, out encoding, out mimeType, out warnings, out streamIDs);
string pdfFile = "Requisition_" + giRequestID.ToString() + "_"+ txtPatientName_Edit.Text + ".pdf";
string sPDFSavePath = ConfigurationManager.AppSettings["Shipping_Requisitions"] + sClinicID;
//add the date pathing
sPDFSavePath = ReportHelper.CreateDateDirectories(sPDFSavePath);
//make sure the file doesn't already exist and if it does, delete it so we create a new one
ReportHelper.checkPDFfilesNDirectory(sPDFSavePath + @"\", pdfFile);
FileStream stream = File.Create(sPDFSavePath + @"\" + pdfFile, result.Length);
stream.Write(result, 0, result.Length);
stream.Close();
}
catch(Exception ex)
{
Common.CommonUtilities.FatalError(sPageName, "btnPrintReq_Tests_Click", query, ex.Message.ToString(), ex.ToString());
}
}
I have noticed that the ExecutionInfo item shows me all parameters in the array of the report, meaning the required INPUT parameters AND the INTERNAL parameters, but also noticed that for the INTERNAL parameters, the default value that was set in the report is satisfied and ExecutionInfo shows them with the State of HasValidValue.
The following is what I know:
the report works on its own, so the main report that calls the subreport has no issues creating the PDF
the report, when run in SSRS, does prompt for the 4 input parameters
I have tried removing the report in SSRS and redeploying it, because many suggested this somehow fixes issues like this
I have tried the trick of inserting a table, deleting its columns, and putting the subreport in one cell of the table, and I still get the same error
it would be nice if the error stated which parameter it sees as not being satisfied; even just listing the first one before erroring out would help a developer work through the issues
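For reference, a small diagnostic that at least shows which parameter the server considers unsatisfied: after calling SetExecutionParameters, loop over res_info.Parameters (the same proxy objects as in the code above) and print each parameter's state. This is a hedged sketch; Debug.WriteLine is just an example sink, log wherever suits you.
// Dump every parameter the server sees and its state; the one not at
// HasValidValue identifies itself.
foreach (REService.ReportParameter p in res_info.Parameters)
{
    System.Diagnostics.Debug.WriteLine(
        p.Name + ": State=" + p.State + ", Nullable=" + p.Nullable + ", MultiValue=" + p.MultiValue);
}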
The attached picture shows the values I'm passing in for the multivalue parameters (PanelsNProfiles). I'm also attaching my RDL files for the main report and the Subreport, but had to add a (.txt) file extension to them to be able to upload them.
Here is the error in the SSRS execution log:
library!ReportServer_0-57!5250!03/23/2022-09:06:27:: i INFO: RenderForNewSession('/zReportSandbox/RequisitionDS')
processing!ReportServer_0-57!5250!03/23/2022-09:06:27:: e ERROR: Throwing Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: , Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: One or more parameters required to run the report have not been specified.;
reportrendering!ReportServer_0-57!5250!03/23/2022-09:06:27:: e ERROR: Throwing Microsoft.ReportingServices.ReportProcessing.RenderingObjectModelException: , Microsoft.ReportingServices.ReportProcessing.RenderingObjectModelException: One or more parameters were not specified for the subreport, 'RequisitionTopDS', located at: /zReportSandbox/RequisitionTopDS.;
processing!ReportServer_0-57!5250!03/23/2022-09:06:27:: e ERROR: An error has occurred while processing a sub-report. Details: One or more parameters were not specified for the subreport, 'RequisitionTopDS', located at: /zReportSandbox/RequisitionTopDS. Stack trace:
at Microsoft.ReportingServices.OnDemandReportRendering.SubReport.FinalizeErrorMessageAndThrow()
at Microsoft.ReportingServices.OnDemandReportRendering.SubReport.RetrieveSubreport()

Converting StructType to Avro Schema returns type as Union when using databricks spark-avro

I am using databricks spark-avro to convert a dataframe schema into an Avro schema. The returned Avro schema fails to have a default value. This is causing issues when I am trying to create a GenericRecord out of the schema. Can anyone help with the right way of using this function?
Dataset<Row> sellableDs = sparkSession.sql("sql query");
SchemaBuilder.RecordBuilder<Schema> rb = SchemaBuilder.record("testrecord").namespace("test_namespace");
Schema sc = SchemaConverters.convertStructToAvro(sellableDs.schema(), rb, "test_namespace");
System.out.println(sc.toString());
System.out.println(sc.getFields().get(0).toString());
String schemaString = sc.toString();
sellableDs.foreach(
(ForeachFunction<Row>) row -> {
Schema scEx = new Schema.Parser().parse(schemaString);
GenericRecord gr;
gr = new GenericData.Record(scEx);
System.out.println("Generic record Created");
int fieldSize = scEx.getFields().size();
for (int i = 0; i < fieldSize; i++ ) {
// System.out.println( row.get(i).toString());
System.out.println("field: " + scEx.getFields().get(i).toString() + "::" + "value:" + row.get(i));
gr.put(scEx.getFields().get(i).name(), row.get(i)); // key by field name, not Field.toString()
//i++;
}
}
);
This is the df schema:
StructType(StructField(key,IntegerType,true), StructField(value,DoubleType,true))
This is the avro converted schema:
{"type":"record","name":"testrecord","namespace":"test_namespace","fields":[{"name":"key","type":["int","null"]},{"name":"value","type":["double","null"]}]}
The problem is that the SchemaConverters class does not include default values as part of the schema creation. You have two options: modify the schema to add default values before record creation, or fill the record with some value before building it (it could actually be values from your row), for example null. Here is an example of how to create a record using your schema:
import org.apache.avro.generic.GenericRecordBuilder
import org.apache.avro.Schema
var schema = new Schema.Parser().parse("{\"type\":\"record\",\"name\":\"testrecord\",\"namespace\":\"test_namespace\",\"fields\":[{\"name\":\"key\",\"type\":[\"int\",\"null\"]},{\"name\":\"value\",\"type\":[\"double\",\"null\"]}]}")
var builder = new GenericRecordBuilder(schema);
for (i <- 0 to schema.getFields().size() - 1 ) {
builder.set(schema.getFields().get(i).name(), null)
}
var record = builder.build();
print(record.toString())
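The other option, baking the defaults into the schema itself, can be sketched as follows. Avro only accepts a null default if "null" is the first branch of the union, so the unions are reordered. This sketch assumes Avro 1.9+ (the Object-based Schema.Field constructor and JsonProperties.NULL_VALUE; on older versions the default has to be a Jackson NullNode instead) and assumes every field is a nullable union, as in the converted schema above:
import org.apache.avro.{JsonProperties, Schema}
import scala.collection.JavaConverters._

// Rebuild the record so every field's union starts with "null" and
// carries an explicit default of null.
def withNullDefaults(in: Schema): Schema = {
  val fields = in.getFields.asScala.map { f =>
    val branches = f.schema.getTypes.asScala
    val reordered = Schema.createUnion(
      (branches.filter(_.getType == Schema.Type.NULL) ++
        branches.filterNot(_.getType == Schema.Type.NULL)).asJava)
    new Schema.Field(f.name, reordered, f.doc, JsonProperties.NULL_VALUE)
  }
  val out = Schema.createRecord(in.getName, in.getDoc, in.getNamespace, in.isError)
  out.setFields(fields.asJava)
  out
}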

SSIS import a Flat File to SQL with the first row as header and last row as a total

I receive a text file that I have to import into a SQL table. I have to come up with an SSIS package because I will receive the flat file every day, with the first row as the Customer_ID, then the invoice details, and then the total of the invoice.
Example :
30303
0000109291700080190432737000005Name of the product
0000210291700080190432737000010Name of the product
0000309291700080190432737000000Name of the product
003 000145
So let me explain:
First, 30303 is the customer #.
The other rows are invoice details:
00001 -> ROWID, 092917 -> DATE, 000801904327 -> PROD, 370 -> TRANS, 00010 -> AMOUNT,
then the name of the product.
Last row:
003 -> total rows, 000145 -> total of the invoice.
Any clue?
I would use a Script Component as a source in a Data Flow Task. You can then use C# or VB.net to read the file, e.g., by using System.IO.StreamReader, in any way you wish. You can read a line at a time, store values in variables to write to every row (e.g., the customer number), etc. It's extremely flexible for complex files.
Here is an example script (C#) based on your data:
public override void CreateNewOutputRows()
{
System.IO.StreamReader reader = null;
try
{
bool line1Read = false;
int customerNumber = 0;
reader = new System.IO.StreamReader(Variables.FilePath); // this refers to a package variable that contains the file path
while (!reader.EndOfStream)
{
string line = reader.ReadLine();
if (!line1Read)
{
customerNumber = Convert.ToInt32(line);
line1Read = true;
}
else if (!reader.EndOfStream)
{
Output0Buffer.AddRow();
Output0Buffer.CustomerNumber = customerNumber;
Output0Buffer.RowID = Convert.ToInt32(line.Substring(0, 5));
Output0Buffer.Date = DateTime.ParseExact(line.Substring(5, 6), "MMddyy", System.Globalization.CultureInfo.CurrentCulture);
Output0Buffer.Prod = line.Substring(11, 12);
Output0Buffer.Trans = Convert.ToInt32(line.Substring(23, 3));
Output0Buffer.Amount = Convert.ToInt32(line.Substring(26, 5));
Output0Buffer.ProductName = line.Substring(31);
}
}
}
finally
{
// close the reader on success as well as on failure
if (reader != null)
{
reader.Close();
reader.Dispose();
}
}
}
The columns in 'Output 0' of the Script Component are configured as follows:
Name DataType Length
==== ======== ======
CustomerNumber four-byte signed integer [DT_I4]
RowID four-byte signed integer [DT_I4]
Date database date [DT_DBDATE]
Prod string [DT_STR] 12
Trans four-byte signed integer [DT_I4]
Amount four-byte signed integer [DT_I4]
ProductName string [DT_STR] 255
To implement this:
Create a string variable called 'FilePath' with your file path in it for the script to reference.
Create a Data Flow Task.
Add a Script Component to the Data Flow Task - you'll be asked what type it should be, select 'Source'.
Right-click the Script Component, click 'Edit'.
On the 'Script' pane, add the 'FilePath' variable to the 'ReadOnlyVariables' section.
On the 'Inputs and Outputs' pane, expand 'Output 0' and add columns to the 'Output Columns' section as per the above table.
On the 'Script' pane, click 'Edit Script', and then paste my code over the public override void CreateNewOutputRows() method (replacing it).
Your Script Component source is now configured, and you'll be able to use it like any other data source component. To write this data to a SQL Server table, add an OLEDB Destination to the Data Flow Task, and link the Script Component to that, configuring the columns appropriately.
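One more hedged note: the script above deliberately skips the trailer row (that's the else if (!reader.EndOfStream) guard). If you want to use the trailer as a sanity check instead, a variation is to count the detail rows and parse the last line once the loop finishes. The parsing below is guessed from the sample ("003 000145" = 3-digit row count, then the invoice total), so adjust it to your real spec:
// Inside CreateNewOutputRows, replacing the while loop shown above.
int detailRows = 0;
string lastLine = null;
while (!reader.EndOfStream)
{
    string line = reader.ReadLine();
    if (!line1Read)
    {
        customerNumber = Convert.ToInt32(line);
        line1Read = true;
    }
    else if (!reader.EndOfStream)
    {
        // ... emit the detail row exactly as in the script above ...
        detailRows++;
    }
    else
    {
        lastLine = line; // the trailer row
    }
}
if (lastLine != null)
{
    string[] parts = lastLine.Split(new[] { ' ' }, StringSplitOptions.RemoveEmptyEntries);
    int expectedRows = Convert.ToInt32(parts[0]); // e.g. "003"
    int invoiceTotal = Convert.ToInt32(parts[1]); // e.g. "000145"
    if (expectedRows != detailRows)
        throw new InvalidOperationException(
            "Trailer says " + expectedRows + " rows, but " + detailRows + " detail rows were read.");
}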

Setting Custom Menu Field values in Rightnow API of Oracle

How do I set custom menu field values in the Oracle RightNow API?
I have a Custom field of data type Menu like :
Custom field Name : user type
Data Type : Menu
Value can be : Free, Paid or Premium
Can anyone send me Java code solving this problem?
Thanks in advance.
The following link is from the Oracle Service Cloud developer documentation. It has an example of setting a contact custom field using Java and Axis2, which would likely give you most of the information that you need in order to set your custom field.
At a high level, you must create an Incident object and specify the ID of the incident that you want to update. Then, you must create the custom field object structure using generic objects (because each site can have its own unique custom fields). Ultimately, your SOAP envelope will contain the node structure that you build through your Java code. Since you're trying to set a menu, the end result is that your custom field is a NamedID object. You'll set the lookup name of the menu to one of the three values that you give above.
I'm a C# guy myself, so my example is in C#, but it should be easy to port to Java using the link above as an example too.
public static void SetMenuTest()
{
Incident incident = new Incident();
incident.ID = new ID();
incident.ID.id = 1234;
incident.ID.idSpecified = true;
GenericField customField = new GenericField();
customField.name = "user_type";
customField.dataType = DataTypeEnum.NAMED_ID;
customField.dataTypeSpecified = true;
customField.DataValue = new DataValue();
customField.DataValue.Items = new object[1];
customField.DataValue.ItemsElementName = new ItemsChoiceType[18]; //18 is a named ID value. Inspect ItemChoiceTypes for values.
customField.DataValue.Items[0] = "Free"; //Or Paid, or Premium
customField.DataValue.ItemsElementName[0] = ItemsChoiceType.NamedIDValue;
GenericObject customFieldsc = new GenericObject();
customFieldsc.GenericFields = new GenericField[1];
customFieldsc.GenericFields[0] = customField;
customFieldsc.ObjectType = new RNObjectType();
customFieldsc.ObjectType.TypeName = "IncidentCustomFieldsc";
GenericField cField = new GenericField();
cField.name = "c";
cField.dataType = DataTypeEnum.OBJECT;
cField.dataTypeSpecified = true;
cField.DataValue = new DataValue();
cField.DataValue.Items = new object[1];
cField.DataValue.Items[0] = customFieldsc;
cField.DataValue.ItemsElementName = new ItemsChoiceType[1];
cField.DataValue.ItemsElementName[0] = ItemsChoiceType.ObjectValue;
incident.CustomFields = new GenericObject();
incident.CustomFields.GenericFields = new GenericField[1];
incident.CustomFields.GenericFields[0] = cField;
incident.CustomFields.ObjectType = new RNObjectType();
incident.CustomFields.ObjectType.TypeName = "IncidentCustomFields";
}
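Since the question asked for Java, here is a hedged sketch of the same structure, assuming Axis2-generated stubs from the RightNow WSDL. The class names come from the WSDL; the bean-style setters are what Axis2 ADB typically generates, so verify them against your generated code:
// Hypothetical Java port of the C# above (Axis2 ADB-style stubs assumed).
Incident incident = new Incident();
ID incidentId = new ID();
incidentId.setId(1234L); // the incident to update
incident.setID(incidentId);

// The menu value itself: a NamedID carrying the lookup name.
NamedID menuValue = new NamedID();
menuValue.setName("Free"); // or "Paid" / "Premium"

GenericField customField = new GenericField();
customField.setName("user_type");
customField.setDataType(DataTypeEnum.NAMED_ID);
DataValue dv = new DataValue();
dv.setNamedIDValue(menuValue); // ADB generates one setter per choice branch
customField.setDataValue(dv);

// Wrap it in the IncidentCustomFieldsc -> "c" -> IncidentCustomFields structure.
GenericObject customFieldsc = new GenericObject();
customFieldsc.setGenericFields(new GenericField[] { customField });
RNObjectType scType = new RNObjectType();
scType.setTypeName("IncidentCustomFieldsc");
customFieldsc.setObjectType(scType);

GenericField cField = new GenericField();
cField.setName("c");
cField.setDataType(DataTypeEnum.OBJECT);
DataValue cValue = new DataValue();
cValue.setObjectValue(customFieldsc);
cField.setDataValue(cValue);

GenericObject customFields = new GenericObject();
customFields.setGenericFields(new GenericField[] { cField });
RNObjectType cfType = new RNObjectType();
cfType.setTypeName("IncidentCustomFields");
customFields.setObjectType(cfType);
incident.setCustomFields(customFields);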

JSON Grails Groovy update SQL

Using a Groovy script with Grails, I want to update a record in the database. I do the basics: get the object from JSON, convert it to the domain class, and then call save() on it. From what I understand (I am new to Groovy and Grails), the save should update if the "id" is already there. But I don't get that; I get the standard SQL error "Duplicate entry '1' for key 'PRIMARY'". How do I fix this?
def input = request.JSON
def instance = new Recorders(input)
instance.id = input.getAt("id")
instance.save()
and my domain is:
class Recorders {
Integer sdMode
Integer gsmMode
static mapping = {
id generator: "assigned"
}
static constraints = {
sdMode nullable: true
gsmMode nullable: true
}
}
Instead of doing a new Recorders(input), you probably ought to get it:
def input = request.JSON
def instance = Recorders.get(input.getAt('id'))
instance.properties = input
instance.save()
Edit
(From your comment) If it doesn't exist and you want to insert it:
def input = request.JSON
def id = input.getAt('id')
def instance = Recorders.get(id)
if(!instance) {
instance = new Recorders(id: id)
}
instance.properties = input
instance.save()
I don't use assigned id generators much, so I'm not sure if Grails will bind the id automatically (since it's expecting it to be assigned). If it does, you can probably remove the id: id from the Recorders() constructor.
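And if it turns out the binding does overwrite the assigned id, one hedged workaround is to exclude it when copying the properties (request.JSON is a Map, so findAll works on it):
// Bind everything except id, so the assigned identifier is not overwritten.
instance.properties = input.findAll { key, value -> key != 'id' }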