Salesforce Apex: Error ORA-01460

I've developed an Apex API on Salesforce which performs a SOQL query on a list of CSV data. It had been working smoothly until yesterday: after making a few changes to the code that follows the SOQL query, I started getting a strange 500 error:
[{"errorCode":"APEX_ERROR","message":"System.UnexpectedException: common.exception.SfdcSqlException: ORA-01460: unimplemented or unreasonable conversion requested

select /*SampledPrequery*/
sum(term0) "cnt0",
sum(term1) "cnt1",
count(*) "totalcount",
sum(term0 * term1) "combined"
from (select /*+ ordered use_nl(t_c1) */
(case when (t_c1.deleted = '0') then 1 else 0 end) term0,
(case when (upper(t_c1.val18) = ?) then 1 else 0 end) term1
from (select /*+ index(sampleTab AKENTITY_SAMPLE) */
entity_id
from core.entity_sample sampleTab
where organization_id = '00Dq0000000AMfz'
and key_prefix = ?
and rownum <= ?) sampleTab,
core.custom_entity_data t_c1
where t_c1.organization_id = '00Dq0000000AMfz'
and t_c1.key_prefix = ?
and sampleTab.entity_id = t_c1.custom_entity_data_id)

Class.labFlows.queryContacts: line 13, column 1
Class.labFlows.fhaQuery: line 6, column 1
Class.zAPI.doPost: line 10, column 1"}]
zAPI.doPost() is simply our router method: it takes in the POST payload as well as the requested operation, then calls whatever function the operation requests. In this case, the call is to labFlows.queryContacts():
public static Map<String, List<String>> queryContacts(String[] stringArray) {
    // First get the id to get to the associative entity, Contact_Deals__c id
    List<Contact_Deals__c> dealQuery = [SELECT Id, Deal__r.Id, Deal__r.FHA_Number__c, Deal__r.Name, Deal__r.Owner.Name
                                        FROM Contact_Deals__c
                                        WHERE Deal__r.FHA_Number__c IN :stringArray];
    // Using the id in the associative entity, grab the contact information
    List<Contact_Deals__c> contactQuery = [SELECT Contact__r.Name, Contact__r.Id, Contact__r.Owner.Name, Contact__r.Owner.Id, Contact__r.Rule_Class__c, Contact__r.Primary_Borrower_Y_N__c
                                           FROM Contact_Deals__c
                                           WHERE Id IN :dealQuery];
    // Grab all deal ids
    Map<String, List<String>> result = new Map<String, List<String>>();
    for (Contact_Deals__c i : dealQuery) {
        List<String> temp = new List<String>();
        temp.add(i.Deal__r.Id);
        temp.add(i.Deal__r.Owner.Name);
        temp.add(i.Deal__r.FHA_Number__c);
        temp.add(i.Deal__r.Name);
        for (Contact_Deals__c j : contactQuery) {
            if (j.Id == i.Id) {
                // This doesn't really help if there are multiple primary borrowers
                // on a deal - but that should be a SF workflow rule IMO
                if (j.Contact__r.Primary_Borrower_Y_N__c == 'Yes') {
                    temp.add(j.Contact__r.Owner.Id);
                    temp.add(j.Contact__r.Id);
                    temp.add(j.Contact__r.Name);
                    temp.add(j.Contact__r.Owner.Name);
                    temp.add(j.Contact__r.Rule_Class__c);
                    break;
                }
            }
        }
        result.put(i.Deal__r.Id, temp);
    }
    return result;
}
The only thing I've changed is moving the temp list's first few add calls to before the inner loop (previously temp only captured values from the inner loop). The error above references line 13, which is exactly the first SOQL call:
List<Contact_Deals__c> dealQuery = [SELECT Id, Deal__r.Id, Deal__r.FHA_Number__c, Deal__r.Name, Deal__r.Owner.Name
                                    FROM Contact_Deals__c
                                    WHERE Deal__r.FHA_Number__c IN :stringArray];
I've tested this function in the Apex anonymous window, and it worked perfectly:
string a = '00035398,00035401';
string result = zAPI.doPost(a, 'fhaQuery');
system.debug(result);
Results:
13:36:54:947 USER_DEBUG [5]|DEBUG|{"a09d000000HRvBAD":["a09d000000HRvBAD","Contacta","11111111","Plaza Center Apts"],"a09d000000HsVAD":["a09d000000HsVAD","Contactb","22222222","The Garden"]}
So this is working. The next place to look is perhaps the Python script that calls the API:
def origQuery(file_name, operation):
    csv_text = ""
    with open(file_name) as csvfile:
        reader = csv.reader(csvfile, dialect='excel')
        for row in reader:
            csv_text += row[0]+','
            csv_text = csv_text[:-1]
    data = json.dumps({
        'data' : csv_text,
        'operation' : operation
    })
    results = requests.post(url, headers=headers, data=data)
    print results.text

origQuery('myfile.csv', 'fhaQuery')
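For reference, with the two test values from the anonymous-Apex run above, the POST body this script is meant to produce looks like:
{"data": "00035398,00035401", "operation": "fhaQuery"}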
I've tried looking up this ORA-01460 apex error, but I can't find anything that will help me fix this issue.
Can anyone shed more light on what this error is telling me?
Thank you all so much!

It turns out the error was in the Python script. The following code wasn't functioning as it was supposed to:
with open(file_name) as csvfile:
    reader = csv.reader(csvfile, dialect='excel')
    for row in reader:
        csv_text += row[0]+','
        csv_text = csv_text[:-1]
This was returning one very long string with zero delimiters: because the final line sat inside the loop, it cut the delimiter off each value as soon as it was added. What I needed instead was:
with open(file_name) as csvfile:
    reader = csv.reader(csvfile, dialect='excel')
    for row in reader:
        csv_text += row[0]+','
csv_text = csv_text[:-1]
which only cuts off the final ',' once the loop has finished. (With the two test values above, the buggy version produced "0003539800035401" where the fixed version produces "00035398,00035401".)
The error was occurring because that single undelimited string was over 4,000 characters long, and Oracle raises ORA-01460 when asked to bind a VARCHAR value larger than that.
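For what it's worth, the append-then-trim bookkeeping can be avoided entirely with ','.join(), and if the input file can grow, sending the values in batches keeps each bound string safely under that 4,000-character limit. A minimal sketch (url and headers are assumed to be defined as in the original script, and the batch size is an arbitrary choice):

import csv
import json
import requests

def origQuery(file_name, operation, batch_size=100):
    # Collect the first column of every row, then join once -
    # no trailing-comma bookkeeping needed.
    with open(file_name) as csvfile:
        values = [row[0] for row in csv.reader(csvfile, dialect='excel')]
    responses = []
    # Post the values in batches so no single bound string can
    # approach Oracle's 4,000-character limit.
    for i in range(0, len(values), batch_size):
        csv_text = ','.join(values[i:i + batch_size])
        data = json.dumps({'data': csv_text, 'operation': operation})
        responses.append(requests.post(url, headers=headers, data=data).text)
    return responses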


Groovy parallel SQL queries

I'm trying to run parallel SQL queries using GPars, but somehow it isn't working as I expected. Since I'm relatively new to Groovy/Java concurrency, I'm not sure how to solve my issue.
I have the following code:
def rows = this.sql.rows(
    "SELECT a_id, b_id FROM data_ids LIMIT 10 OFFSET 10"
)
With this code I get a list of ids. Now I want to load the data for each id, and this should happen in parallel to improve performance, because I have a large database.
To get the detail data I use the following code:
GParsPool.withPool() {
    result = rows.collectParallel {
        // 2. Get the data for each source and save it in an array.
        def tmpData = [:]
        def row = it
        sql.withTransaction {
            if (row.a_id != null) {
                tmpData.a = sql.firstRow("SELECT * FROM data_a WHERE id = '" + row.a_id + "'")
            }
            if (row.b_id != null) {
                tmpData.b = sql.firstRow("SELECT * FROM data_b WHERE id = '" + row.b_id + "'")
            }
        }
        return tmpData
    }
    // 3. Return the loaded data.
    return result
}
Now when I run the code, everything works fine except that it isn't executed in parallel. Using JProfiler, I can see blocked threads and waiting threads, but 0 runnable threads.
Thanks for any help. If you need more information, I will provide it :)
Daniel
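A likely culprit, for what it's worth: groovy.sql.Sql created via Sql.newInstance wraps a single java.sql.Connection, and JDBC connections are not safe for concurrent use - the driver serializes access to them - so every collectParallel worker ends up queueing on the same shared connection. That would explain the blocked and waiting threads in JProfiler. The usual fix is to give each worker its own connection, e.g. by creating the Sql instance inside the closure or by drawing from a connection pool. Below is a minimal sketch of that idea in plain Java rather than GPars (a javax.sql.DataSource-backed pool is assumed, the ids are assumed numeric, and the table/column names are taken from the question):

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import javax.sql.DataSource;

public class ParallelDetailLoader {

    // Load the detail rows for each (a_id, b_id) pair in parallel.
    // Each task borrows its OWN connection from the pool, so the
    // queries really run concurrently instead of queueing on one
    // shared connection.
    public static List<Map<String, Object>> loadDetails(DataSource pool, List<Long[]> idPairs)
            throws InterruptedException, ExecutionException {
        ExecutorService executor = Executors.newFixedThreadPool(8);
        try {
            List<Future<Map<String, Object>>> futures = new ArrayList<>();
            for (Long[] ids : idPairs) {
                futures.add(executor.submit(() -> {
                    Map<String, Object> tmpData = new HashMap<>();
                    try (Connection con = pool.getConnection()) {
                        if (ids[0] != null) {
                            tmpData.put("a", firstRow(con, "SELECT * FROM data_a WHERE id = ?", ids[0]));
                        }
                        if (ids[1] != null) {
                            tmpData.put("b", firstRow(con, "SELECT * FROM data_b WHERE id = ?", ids[1]));
                        }
                    }
                    return tmpData;
                }));
            }
            List<Map<String, Object>> result = new ArrayList<>();
            for (Future<Map<String, Object>> future : futures) {
                result.add(future.get()); // rethrows any SQLException from the task
            }
            return result;
        } finally {
            executor.shutdown();
        }
    }

    // Tiny stand-in for groovy.sql.Sql#firstRow: returns the first
    // row as a column-name -> value map, or null if there is no row.
    private static Map<String, Object> firstRow(Connection con, String query, long id)
            throws SQLException {
        try (PreparedStatement ps = con.prepareStatement(query)) {
            ps.setLong(1, id);
            try (ResultSet rs = ps.executeQuery()) {
                if (!rs.next()) return null;
                Map<String, Object> row = new HashMap<>();
                for (int col = 1; col <= rs.getMetaData().getColumnCount(); col++) {
                    row.put(rs.getMetaData().getColumnLabel(col), rs.getObject(col));
                }
                return row;
            }
        }
    }
}

In Groovy itself the equivalent change is simply to build the Sql instance (or borrow a pooled connection) inside the collectParallel closure instead of sharing this.sql across workers. As a side benefit, the prepared statements above avoid the string concatenation in the original firstRow calls.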

Getting missing row while using insertAll - BigQuery

I am getting a "Missing row." error with reason "invalid" in the response while inserting data into BigQuery:
{"errors":[{"message":"Missing row.","reason":"invalid"}],"index":0}, {"errors":[{"message":"Missing row.","reason":"invalid"}]
Below is the code which I am executing:
// The lines below call the DFP API to get all ad units
AdUnitPage page = inventoryService.getAdUnitsByStatement(statementBuilder.toStatement());
List dfpadunits = new ArrayList();
if (page.getResults() != null) {
    totalResultSetSize = page.getTotalResultSetSize();
    int i = page.getStartIndex();
    for (AdUnit adUnit : page.getResults()) {
        /*System.out.printf(
            "%d) Ad unit with ID '%s' and name '%s' was found.%n", i++,
            adUnit.getId(), adUnit.getName());*/
        //Map<String,Object> dfpadunitrow = new HashMap<String,Object>();
        //dfpadunitrow.put(adUnit.getId(), adUnit.getName());
        Rows dfpadunit = new TableDataInsertAllRequest.Rows();
        dfpadunit.setInsertId(adUnit.getId());
        dfpadunit.set("id", adUnit.getId());
        dfpadunit.set("name", adUnit.getName());
        dfpadunits.add(dfpadunit);
    }
}
TableDataInsertAllRequest content = new TableDataInsertAllRequest();
content.setRows(dfpadunits);
content.setSkipInvalidRows(true);
content.setIgnoreUnknownValues(true);
System.out.println(dfpadunits.get(0));
Bigquery.Tabledata.InsertAll request = bigqueryService.tabledata().insertAll(projectId, datasetId, tableId, content);
TableDataInsertAllResponse response = request.execute();
System.out.println(response.getInsertErrors());
I added loggers to check that my data is populated correctly, but when I try to insert the records into BigQuery using insertAll, I get "Missing row." in the response with reason "invalid".
Thanks,
Kapil
You need to use a TableRow object and attach it with setJson(). TableDataInsertAllRequest.Rows extends GenericJson, so calling set("id", ...) directly on it just adds top-level keys to the request entry instead of populating the row payload - BigQuery then sees no row content, which is why it reports "Missing row." This works (I tested):
TableDataInsertAllRequest.Rows dfpadunit = new TableDataInsertAllRequest.Rows();
TableRow row = new TableRow();
row.set("id",adUnit.getId());
row.set("name",adUnit.getName());
dfpadunit.setInsertId(adUnit.getId());
dfpadunit.setJson(row);
dfpadunits.add(dfpadunit);

How do I loop through each DB field to see if the range is correct

I have this response in SoapUI:
<pointsCriteria>
    <calculatorLabel>Have you registered for inContact, signed up for marketing news from FNB/RMB Private Bank, updated your contact details and chosen to receive your statements</calculatorLabel>
    <description>Be registered for inContact, allow us to communicate with you (i.e. update your marketing consent to 'Yes'), receive your statements via email and keep your contact information up to date</description>
    <grades>
        <points>0</points>
        <value>No</value>
    </grades>
    <grades>
        <points>1000</points>
        <value>Yes</value>
    </grades>
    <label>Marketing consent given and Online Contact details updated in last 12 months</label>
    <name>c21_mrktng_cnsnt_cntct_cmb_point</name>
</pointsCriteria>
There are many, many pointsCriteria elements, and I use the XQuery below to give me the DB field and the range of values that field is meant to hold:
<return>
{
    for $x in //pointsCriteria
    return <DBRange>
        <db>{data($x/name/text())}</db>
        <points>{data($x//points/text())}</points>
    </DBRange>
}
</return>
And I get the response below:
<return><DBRange><db>c21_mrktng_cnsnt_cntct_cmb_point</db><points>0 1000</points></DBRange>
That last bit sits in a property transfer. I need SQL to bring back all rows where that DB field is not in that points range (the field can only be 0 or 1000 in this case). My problem is that I don't know how to loop through each DBRange in this manner. Please help.
I'm not sure that I really understand your question; however, I think you want to query a specific table in your DB using the column name defined in the <db> field of your XML, with the values defined in the <points> field of the same XML.
So you can try using a Groovy TestStep: first parse your XML to get back your column name and your points. Since the points values are separated by a blank space, you can split(" ") to get a list and then use each() to iterate over the points in that list. Then, using groovy.sql.Sql, you can perform the queries against your DB.
One more thing: you need to put the JDBC drivers for your DB vendor in $SOAPUI_HOME/bin/ext and then restart SoapUI so it can load the necessary driver classes.
So the following code can achieve your goal:
import groovy.sql.Sql
import groovy.util.XmlSlurper

// soapui groovy testStep requires that you first register your
// db vendor drivers; as an example I use oracle drivers...
com.eviware.soapui.support.GroovyUtils.registerJdbcDriver("oracle.jdbc.driver.OracleDriver")

// connection properties db (example for oracle data base)
def db = [
    url : 'jdbc:oracle:thin:@db_host:db_port/db_name',
    username : 'yourUser',
    password : '********',
    driver : 'oracle.jdbc.driver.OracleDriver'
]

// create the db instance
def sql = Sql.newInstance("${db.url}", "${db.username}", "${db.password}", "${db.driver}")

def result = '''<return>
    <DBRange>
        <db>c21_mrktng_cnsnt_cntct_cmb_point</db>
        <points>0 1000</points>
    </DBRange>
</return>'''

def resXml = new XmlSlurper().parseText(result)
// get the field
def field = resXml.DBRange.db.text()
// get the points
def points = resXml.DBRange.points.text()
// points are separated by blank space,
// so split to get an array with the points
def pointList = points.split(" ")
// for each point make your query
pointList.each {
    def sqlResult = sql.rows "select * from your_table where ${field} = ?", [it]
    log.info sqlResult
}
sql.close();
Hope this helps,
Thanks again for your help @albciff. I had to adapt this into a multidimensional array (I renamed field to column, and result is the large return from the XQuery above):
def resXml = new XmlSlurper().parseText(result)
// get the columns and points ranges
def Column = resXml.DBRange.db*.text()
def Points = resXml.DBRange.points*.text()
// sorting it all out into a multidimensional array (index per index)
count = 0
bigList = Column.collect {
    [it, Points[count++]]
}
// iterating through the array
bigList.each {
    // creating two smaller lists and making it readable for the sql part later
    def column = it[0]
    def points = it[1]
    // further splitting the points to test each
    pointList = points.split(" ")
    pointList.each {
        // test each points range per column
        def sqlResult = sql.rows "select * from my_table where ${column} <> ?", [it]
        log.info sqlResult
    }
}
sql.close();
return;

How do I correct the "Object Expected" error when it involves var queryStringVals = $().SPServices.SPGetQueryString();?

I have been getting an error stating "Object Expected" for these lines of code:
function AsyncSave(send) {
    //alert('In CustomSave');
    var drp = document.getElementById("Sample_sample_DropDownChoice");
    var drpValue = drp.options[drp.selectedIndex].value;
    var varAnalysis = getTagFromIdentifierAndTitle("textarea", "TextField", "Principal Comments");
    var varAnalysisTextBoxID = RTE_GetEditorDocument(varAnalysis.id);
    var varAnalysisText = varAnalysisTextBoxID.body.innerText;
    alert('Save N Send');
    alert(drpValue);
    alert(varAnalysisText);
The error happens when it gets to the lines below:
    var queryStringVals = $().SPServices.SPGetQueryString();
    var itemID = queryStringVals["itemID"];
What could be the problem? Should I be running a different, up-to-date version of SPServices? This is SharePoint 2010, btw.
The goal is to take the values entered, save them, and update them (send) to another form/List.
SharePoint 2010 has a built-in JavaScript method for retrieving values from the query string. Try using the following:
var itemID = GetUrlKeyValue("itemID");
That's assuming the URL of the page on which the script is running actually has a query string parameter of "itemID". Also worth checking: an "Object Expected" error on a line that calls $() usually means jQuery itself never loaded on the page, so verify that your jQuery and SPServices script references resolve.

BigQuery Data.ErrorProto.Reason "stopped"

I'm inserting data with insertAll(), but TableDataInsertAllResponse.InsertErrors returns the same error for each row I have inserted.
The errors only give me the field Data.ErrorProto.Reason, which contains "stopped".
This is the method that calls insertAll():
public bool InsertAll(BigqueryService s, String datasetId, String tableId, List<TableDataInsertAllRequest.RowsData> data)
{
    TabledataResource t = s.Tabledata;
    TableDataInsertAllRequest req = new TableDataInsertAllRequest()
    {
        Kind = "bigquery#tableDataInsertAllRequest",
        Rows = data /* Put the rows to upload to BigQuery here */
    };
    TableDataInsertAllResponse response = t.InsertAll(req, projectId, datasetId, tableId).Execute();
    if (response.InsertErrors != null) return true;
    return false;
}
What's happening? Why can't I upload the data?
EDIT: I realized that if I upload fewer than 6 rows it works correctly, but each row is only about 1.6 KB and the maximum row size is 20 KB.
Thanks,
Roger
Well, a few days ago I found the solution. When you stream data into BigQuery using the insertAll() method, you can stream multiple rows at once. If one of those rows is wrong, its Data.ErrorProto.Reason contains the message for that error (for example, "Can't convert value to string."), and all the other rows contain "stopped" in Data.ErrorProto.Reason.
If you ever see this error, you probably have inconsistencies in the format of your rows.
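In other words, "stopped" just means a row was rejected because some other row in the same batch was bad; to find the real offender, filter the insert errors for reasons other than "stopped". Here is a minimal sketch of that scan using the Java API client from the earlier BigQuery question (the .NET client used above exposes the same fields; the helper name is mine):

import java.util.List;

import com.google.api.services.bigquery.model.ErrorProto;
import com.google.api.services.bigquery.model.TableDataInsertAllResponse;

public class InsertErrorScanner {

    // Print only the rows that actually failed. Rows whose reason is
    // "stopped" were rejected only because another row in the batch
    // was bad, so they are noise when you are hunting the root cause.
    public static void printRealErrors(TableDataInsertAllResponse response) {
        List<TableDataInsertAllResponse.InsertErrors> insertErrors = response.getInsertErrors();
        if (insertErrors == null) return;
        for (TableDataInsertAllResponse.InsertErrors rowErrors : insertErrors) {
            for (ErrorProto error : rowErrors.getErrors()) {
                if (!"stopped".equals(error.getReason())) {
                    System.out.println("Row " + rowErrors.getIndex()
                            + " failed: " + error.getReason()
                            + " - " + error.getMessage());
                }
            }
        }
    }
}

Once the truly failing row is fixed (or dropped), the rows that were marked "stopped" can simply be retried; they were never inserted.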