Example program to insert a row using a BAPI with JCo 3 (SAP)

I am trying to "insert" (or) "add a row" to Purchase Requisition using standard BAPI (PurchaseRequisition.CreateFromData).
I am using JCo3. The example in JCo3 indicates that we should use table.appendRow() OR table.insertRow() methods. I am trying with table.appendRow() & table.appendRows(1). When i try to insert a row, i dont get any error and the row is not inserted.
Below is the program i am trying to execute.
/** Below are the inputs required for this program to run **/
/** Step 1 **/
String BAPI_NAME = "BAPI_REQUISITION_CREATE";
/** Step 2 **/
String query_input_column1 = "DOCUMENTY_TYPE";
String query_input_column1_value = "NB";
String query_input_column2 = "PREQ_NAME";
String query_input_column2_value = "Name";
String query_input_column3 = "ACCTASSCAT";
String query_input_column3_value = "U";
String query_input_column4 = "DELIV_DATE";
String query_input_column4_value = "20131101";
String query_input_column5 = "MATERIAL";
String query_input_column5_value = "DELL-RQ2013";
String query_input_column6 = "QUANITY";
int query_input_column6_value = 10100;
/** Step 3 **/
String targetTableUnderBAPI = "REQUISITION_ITEMS";
/** Step 4 **/
/** For the confirmation read the value from export parameter after insertion execution **/
String result_column1 = "NUMBER";
JCoDestination destination = null;
try {
    destination = JCoDestinationManager.getDestination(DestinationManager.DESTINATION_NAME1);
    JCoRepository repository = destination.getRepository();
    JCoContext.begin(destination);
    JCoFunction function = repository.getFunction(BAPI_NAME);
    if (function == null)
        throw new RuntimeException(BAPI_NAME + " not found in SAP.");
    System.out.println("BAPI Name from function object: " + function.getName());
    //function.getImportParameterList().setValue(query_input_column1, query_input_column1_value);
    JCoTable table = function.getTableParameterList().getTable(targetTableUnderBAPI); //it is taken from the response value of metadata
    //System.out.println("No of Columns: " + table.getNumColumns());
    System.out.println("Trying to execute append row");
    table.appendRow();
    table.setValue(query_input_column1, query_input_column1_value);
    table.setValue(query_input_column2, query_input_column2_value);
    table.setValue(query_input_column3, query_input_column3_value);
    //table.setValue(query_input_column4, new java.util.Date(query_input_column4_value));
    //skipped other columns' related code
    try {
        function.execute(destination);
    }
    catch (AbapException e) {
        System.out.println(e.toString());
        return;
    }
    System.out.println("Let us check the result from export parameter");
    String exportParamStructure = (String) function.getExportParameterList().getValue(result_column1); //getStructure(result_column1); // getValue(result_column1);
    System.out.println("Resulting PR#: " + exportParamStructure);
} catch (JCoException e) {
    e.printStackTrace();
}
finally
{
    try {
        JCoContext.end(destination);
    } catch (JCoException e) {
        e.printStackTrace();
    }
}
I did not understand how to read the response and am trying to fetch it from the export parameters.
Can anybody share a piece of code for inserting a row and getting the confirmation response (do we get the PREQ_NO in the response?)?
I am passing the date field value as "20131101", but I am not sure whether the format and approach are right.
When I try to add the quantity column value, I get an error message complaining that this column is not part of BAPIEBANC, even though the column is visible in the BAPIEBANC type.
Is there any configuration on the SAP side to be checked?
Should I activate any fields on the JCo side? If so, how?
Please note that my knowledge of SAP is very limited.
Waiting for an expert's response.
Thanks.

First, you should take a look at the SAP JCo documentation, e.g.
http://help.sap.com/saphelp_nw04/helpdata/en/6f/1bd5c6a85b11d6b28500508b5d5211/content.htm
Regarding your code:
Adding (one) row to the table looks right at first sight.
Your code says QUANITY instead of QUANTITY.
You should set date values as java.util.Date; if creating a Date from a String, use java.text.DateFormat.parse(). See http://docs.oracle.com/javase/6/docs/api/java/util/Date.html (this is, however, Java-specific and has nothing to do with JCo).
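For example, a minimal sketch (assuming the DELIV_DATE column of REQUISITION_ITEMS accepts a java.util.Date, and reusing the table variable from your snippet):
// Parse the "20131101" string into a java.util.Date before handing it to JCo.
// SimpleDateFormat.parse() throws java.text.ParseException on malformed input.
java.text.SimpleDateFormat dateFormat = new java.text.SimpleDateFormat("yyyyMMdd");
java.util.Date deliveryDate = dateFormat.parse("20131101");
table.setValue("DELIV_DATE", deliveryDate);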
If you change anything in SAP, never forget to call BAPI_TRANSACTION_COMMIT at the end to finish the logical unit of work (a.k.a. transaction), or nothing will actually be changed.
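A minimal sketch of that follow-up call (assuming the requisition call above succeeded and the stateful JCoContext from your snippet is still open on the same destination):
// Commit the logical unit of work so the requisition is actually posted.
JCoFunction commit = destination.getRepository().getFunction("BAPI_TRANSACTION_COMMIT");
commit.getImportParameterList().setValue("WAIT", "X"); // wait for the update task to finish
commit.execute(destination);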
If you don't want to fiddle with the more or less complicated and verbose JCo API, try Hibersap, which gives you a much nicer programming model for calling functions in SAP ERP: http://hibersap.org.
However, you will still need a basic understanding of how SAP function modules work technically (such as parameter types or data types) as well as of the domain-specific model behind them (in your case, creating a requisition). I.e. you may need to talk to your SAP experts.
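For instance, to double-check which columns (and data types) the REQUISITION_ITEMS table actually exposes on your system, you can dump the table's metadata; a rough sketch, reusing the function variable from the question:
// List the fields of the REQUISITION_ITEMS table parameter (structure BAPIEBANC)
// as JCo sees them, to verify field names such as QUANTITY.
JCoTable items = function.getTableParameterList().getTable("REQUISITION_ITEMS");
JCoMetaData meta = items.getMetaData();
for (int i = 0; i < meta.getFieldCount(); i++) {
    System.out.println(meta.getName(i) + " : " + meta.getTypeAsString(i));
}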

Here I have added two types of insertion:
insertval() calls a user-defined function module that resides in SAP, created with the help of an ABAP programmer.
insertticket() calls a standard module for inserting a ticket into a SOLMAN system using JCo. First you have to analyse the import, export, table and structure parameters, and accordingly pass values and retrieve the response. The second function returns the ticket number after a successful insertion of the ticket in SOLMAN.
I hope this sample code helps you; it worked for me.
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.Properties;

import com.sap.conn.jco.JCoDestination;
import com.sap.conn.jco.JCoDestinationManager;
import com.sap.conn.jco.JCoException;
import com.sap.conn.jco.JCoFunction;
import com.sap.conn.jco.JCoStructure;
import com.sap.conn.jco.JCoTable;
import com.sap.conn.jco.ext.DestinationDataProvider;

public class jco
{
    static String DESTINATION_NAME1 = "ABAP_AS_WITHOUT_POOL";
    static String DESTINATION_NAME2 = "ABAP_AS_WITH_POOL";
    static
    {
        Properties connectProperties = new Properties();
        connectProperties.setProperty(DestinationDataProvider.JCO_ASHOST, "192.1.1.1");
        connectProperties.setProperty(DestinationDataProvider.JCO_SYSNR, "01");
        connectProperties.setProperty(DestinationDataProvider.JCO_CLIENT, "500");
        connectProperties.setProperty(DestinationDataProvider.JCO_USER, "uname");
        connectProperties.setProperty(DestinationDataProvider.JCO_PASSWD, "pwd");
        connectProperties.setProperty(DestinationDataProvider.JCO_LANG, "en");
        createDestinationDataFile(DESTINATION_NAME1, connectProperties);
        connectProperties.setProperty(DestinationDataProvider.JCO_POOL_CAPACITY, "3");
        connectProperties.setProperty(DestinationDataProvider.JCO_PEAK_LIMIT, "10");
        createDestinationDataFile(DESTINATION_NAME2, connectProperties);
        System.err.println("hai");
    }
    static void createDestinationDataFile(String destinationName, Properties connectProperties)
    {
        File destCfg = new File(destinationName + ".jcoDestination");
        try
        {
            try (FileOutputStream fos = new FileOutputStream(destCfg, false)) {
                connectProperties.store(fos, "for tests only !");
            }
        }
        catch (IOException e)
        {
            throw new RuntimeException("Unable to create the destination files", e);
        }
    }
    public void insertval() throws JCoException
    {
        JCoDestination destination = JCoDestinationManager.getDestination(DESTINATION_NAME1);
        // Custom function module created in SAP by an ABAP developer
        JCoFunction jf = destination.getRepository().getFunction("ZUSER_DET");
        jf.getImportParameterList().setValue("FIRST_NAME", "member");
        jf.getImportParameterList().setValue("LAST_NAME", "c");
        jf.getImportParameterList().setValue("USER_NO", "1000");
        jf.execute(destination);
        System.out.println(jf);
    }
    public void insertticket() throws JCoException
    {
        JCoDestination destination = JCoDestinationManager.getDestination(DESTINATION_NAME2);
        System.out.println("test" + "\n");
        JCoFunction jf = destination.getRepository().getFunction("BAPI_NOTIFICATION_CREATE");
        // Table parameters of the BAPI
        JCoTable jt1 = jf.getTableParameterList().getTable("APPX_HEADERS");
        JCoTable jt2 = jf.getTableParameterList().getTable("APPX_LINES");
        JCoTable jt3 = jf.getTableParameterList().getTable("APPX_LINES_BIN");
        JCoTable jt4 = jf.getTableParameterList().getTable("NOTIF_NOTES");
        JCoTable jt5 = jf.getTableParameterList().getTable("NOTIF_PARTNERS");
        JCoTable jt6 = jf.getTableParameterList().getTable("NOTIF_SAP_DATA");
        JCoTable jt7 = jf.getTableParameterList().getTable("NOTIF_TEXT_HEADERS");
        JCoTable jt8 = jf.getTableParameterList().getTable("NOTIF_TEXT_LINES");
        // Structure import parameters of the BAPI
        JCoStructure jfn1 = jf.getImportParameterList().getStructure("NOTIF_EXT");
        JCoStructure jfn2 = jf.getImportParameterList().getStructure("NOTIF_CRM");
        JCoStructure jfn3 = jf.getImportParameterList().getStructure("IBASE_DATA");
        jfn1.setValue("NUMB", "1234");
        jfn1.setValue("REFNUM", "123");
        jfn1.setValue("TYPE_NOTIF", "SLFN");
        jfn1.setValue("SUBJECT", "tl");
        jfn1.setValue("PRIORITY", "2");
        jfn1.setValue("LANGUAGE", "EN");
        jfn1.setValue("CATEGORY", "Z01");
        jfn2.setValue("CODE", "0011");
        jfn2.setValue("CODEGROUP", "0011");
        jfn2.setValue("CATEGORY", "Z01");
        jfn3.setValue("INSTANCE", "489");
        jfn3.setValue("IBASE", "500");
        // Append one row to each table parameter and fill its fields
        jt1.appendRow();
        jt1.setValue("DESCR", "practise");
        jt2.appendRow();
        jt2.setValue("LINE", "CVXCVXCV");
        jt3.appendRow();
        jt3.setValue("LINE", "second text line");
        jt4.appendRow();
        jt4.setValue("TYPE_NOTE", "my");
        jt4.setValue("IDENT", "hoe twwrtgw");
        jt4.setValue("DESCRIPTION", "its description ");
        jt5.appendRow();
        jt5.setValue("PARNR", "new ");
        jt5.setValue("TYPE_PAR", "FN");
        jt5.setValue("FUNC_PAR", "EN");
        jt5.setValue("PAR_ACTIVE", "1");
        jt6.appendRow();
        jt6.setValue("INSTN", "0020214076");
        jt6.setValue("COMP", "FI-AA");
        jt6.setValue("SYSTYPE", "P");
        jt6.setValue("SYSID", "PRD");
        jt6.setValue("MANDT", "900");
        jt8.appendRow();
        jt8.setValue("TXT_NUM", "1");
        jt8.setValue("TDFORMAT", ">X");
        jt8.setValue("TDLINE", "/(performing all test)");
        jf.execute(destination);
        // The ticket number is returned in the REFNUM export parameter
        String jfex = jf.getExportParameterList().getString("REFNUM");
        System.out.println("hi " + jfex);
    }
}
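A short usage sketch (a hypothetical driver class, assuming the jco class above compiles and the generated .jcoDestination files are in the working directory):
import com.sap.conn.jco.JCoException;

public class JcoDemo {
    public static void main(String[] args) throws JCoException {
        jco client = new jco();
        client.insertval();    // calls the custom ZUSER_DET module
        client.insertticket(); // creates a SOLMAN ticket and prints its REFNUM
    }
}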

How can I log something in a U-SQL UDO?

I have a custom extractor, and I'm trying to log some messages from it.
I've tried obvious things like Console.WriteLine, but I cannot find where the output goes. However, I found some system logs in adl://<my_DLS>.azuredatalakestore.net/system/jobservice/jobs/Usql/.../<my_job_id>/.
How can I log something? Is it possible to specify a log file somewhere on the Data Lake Store or a Blob Storage account?
A recent release of U-SQL has added diagnostic logging for UDOs. See the release notes here.
// Enable the diagnostics preview feature
SET @@FeaturePreviews = "DIAGNOSTICS:ON";
// Extract as one column
@input =
    EXTRACT col string
    FROM "/input/input42.txt"
    USING new Utilities.MyExtractor();
@output =
    SELECT *
    FROM @input;
// Output the file
OUTPUT @output
TO "/output/output.txt"
USING Outputters.Tsv(quoting : false);
This was my diagnostic line from the UDO:
Microsoft.Analytics.Diagnostics.DiagnosticStream.WriteLine(System.String.Format("Concatenations done: {0}", i));
This is the whole UDO:
using System.Collections.Generic;
using System.IO;
using System.Text;
using Microsoft.Analytics.Interfaces;

namespace Utilities
{
    [SqlUserDefinedExtractor(AtomicFileProcessing = true)]
    public class MyExtractor : IExtractor
    {
        //Contains the row
        private readonly Encoding _encoding;
        private readonly byte[] _row_delim;
        private readonly char _col_delim;

        public MyExtractor()
        {
            _encoding = Encoding.UTF8;
            _row_delim = _encoding.GetBytes("\n\n");
            _col_delim = '|';
        }

        public override IEnumerable<IRow> Extract(IUnstructuredReader input, IUpdatableRow output)
        {
            string s = string.Empty;
            string x = string.Empty;
            int i = 0;
            foreach (var current in input.Split(_row_delim))
            {
                using (System.IO.StreamReader streamReader = new StreamReader(current, this._encoding))
                {
                    while ((s = streamReader.ReadLine()) != null)
                    {
                        //Strip any line feeds
                        //s = s.Replace("/n", "");
                        // Concatenate the lines
                        x += s;
                        i += 1;
                    }
                    Microsoft.Analytics.Diagnostics.DiagnosticStream.WriteLine(System.String.Format("Concatenations done: {0}", i));
                    //Create the output
                    output.Set<string>(0, x);
                    yield return output.AsReadOnly();
                    // Reset
                    x = string.Empty;
                }
            }
        }
    }
}
And these were my results found in the following directory:
/system/jobservice/jobs/Usql/2017/10/20.../diagnosticstreams
Good question. I have been asking myself the same thing. This is theoretical, but I think it would work (I'll update this if I find differently).
One very hacky way is to insert rows into a table with your log messages as a string column. Then you can select those out and filter based on some log_producer_id column. You also get the benefit of logging when part of the script works but later parts do not, assuming the failure does not roll back. The table can be dumped to a file at the end as well.
For the error cases, you can use the Job Manager in ADLA to open the job graph and then view the job output. The errors often have detailed information for data-related errors (e.g. the row number in the file with the error and an octal/hex/ASCII dump of the row, with the issue marked with ###).
Hope this helps,
J
P.S. This isn't really a comment or an answer, since I don't have working code. Please provide feedback if the above ideas are wrong.

ImmutablePropertyException periodically when changing enum field's value via Rational Team Concert API

I'm hitting this issue when changing certain enumeration-based fields in my new RTC work item, for an RTC API tool I'm working on.
Basically, I get an ImmutablePropertyException the first time I change the field, but the next time it works without an exception.
I want to get rid of the exceptions. I'm using a value RTC is actually returning to me as a valid enum value for the field.
Assigning RTC work item field: odc.impact a field value of -> Integrity [odc.impact.literal.l4]
EXCEPTION: Could not assign value, even though it was found in the enumeration list:
    [Unassigned, Installability, Standards, Integrity]
com.ibm.team.repository.common.internal.ImmutablePropertyException
    at com.ibm.team.repository.common.internal.util.ItemUtil$ProtectAdapter.notifyChanged(ItemUtil.java:2070)
    at org.eclipse.emf.common.notify.impl.BasicNotifierImpl.eNotify(BasicNotifierImpl.java:380)
    at com.ibm.team.repository.common.model.impl.StringExtensionEntryImpl.setTypedValue(StringExtensionEntryImpl.java:178)
    at com.ibm.team.repository.common.model.impl.StringExtensionEntryImpl.setValue(StringExtensionEntryImpl.java:360)
    at org.eclipse.emf.common.util.BasicEMap.putEntry(BasicEMap.java:303)
    at org.eclipse.emf.common.util.BasicEMap.put(BasicEMap.java:584)
    at org.eclipse.emf.common.util.BasicEMap$DelegatingMap.put(BasicEMap.java:799)
    at com.ibm.team.repository.common.model.impl.ItemImpl.setStringExtension(ItemImpl.java:1228)
    at com.ibm.team.workitem.common.internal.model.impl.WorkItemImpl.setEnumeration(WorkItemImpl.java:3779)
    at com.ibm.team.workitem.common.internal.model.impl.WorkItemImpl.setValue(WorkItemImpl.java:2915)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:95)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:56)
    at java.lang.reflect.Method.invoke(Method.java:620)
    at com.ibm.team.repository.common.internal.util.ItemStore$ItemInvocationHandler.invoke(ItemStore.java:597)
    at com.sun.proxy.$Proxy18.setValue(Unknown Source)
    at com.rtc.vda.WorkItemInitialization.setAttributeValueEx(WorkItemInitialization.java:237)
    at com.rtc.vda.WorkItemInitialization.setAttributeValue(WorkItemInitialization.java:210)
    at com.rtc.vda.WorkItemInitialization.execute(WorkItemInitialization.java:186)
    at com.ibm.team.workitem.client.WorkItemOperation.execute(WorkItemOperation.java:85)
    at com.ibm.team.workitem.client.WorkItemOperation.doRun(WorkItemOperation.java:272)
    at com.ibm.team.workitem.client.WorkItemOperation.run(WorkItemOperation.java:242)
    at com.ibm.team.workitem.client.WorkItemOperation.run(WorkItemOperation.java:189)
    at com.rtc.vda.RTCUtilities.createWorkItem(RTCUtilities.java:191)
    at com.rtc.vda.RTCMain.main(RTCMain.java:178)
Assigning: odc.impact -> Integrity [odc.impact.literal.l4]
This is the code snippet to set the enum value:
public boolean setAttributeValueEx(IWorkItem w, String attributeKey, String valueName) {
    // (REO) Get the attribute
    IAttribute a = customAttributesMap.get(attributeKey);
    // (REO) Buffer of valid values for error reporting
    StringBuffer b = new StringBuffer();
    try {
        // (REO) Get the enumeration for this attribute from the repository (DO NOT CACHE IT OR YOU WILL HAVE PROBLEMS)
        IWorkItemClient workItemClient = (IWorkItemClient) rtcParameters.getTeamRepository().getClientLibrary(IWorkItemClient.class);
        IEnumeration<? extends ILiteral> rtcAttrEnumeration = workItemClient.resolveEnumeration(a, curMonitor);
        // (REO) Find an enum value that matches this string and assign it
        for (ILiteral literal : rtcAttrEnumeration.getEnumerationLiterals()) {
            String vName = literal.getName();
            String vId = literal.getIdentifier2().getStringIdentifier();
            b.append(",");
            b.append(vName);
            if (valueName.equalsIgnoreCase(vName)) {
                String msg2 = "Assigning: " + a.getIdentifier() + " -> " + vName + " [" + vId + "]";
                RTCMain.out(msg2);
                w.setValue(a, literal.getIdentifier2()); // (REO) SOURCE OF PERIODIC EXCEPTION
                return true;
            }
        }
    } catch (Exception e) {
        RTCMain.out("EXCEPTION: Could not assign value, even though it was found in the enumeration list:\n\t[" + b + "]");
        e.printStackTrace();
        RTCMain.out("");
        return false;
    }
    RTCMain.out("VALUE NOT FOUND: Valid values are:" + b);
    return false;
}
Anyone know why I'm getting the periodic ImmutablePropertyException for only some of the fields, and why it goes away on the second call?
Thanks!
You just need to use the workingCopy.getWorkItem() object passed in to the execute() call rather than a cached version in a member variable. The attributes on the workingCopy object are not immutable and work fine.
public class WorkItemCreator extends WorkItemOperation {
    ...
    @Override
    protected void execute(WorkItemWorkingCopy workingCopy, IProgressMonitor monitor) throws TeamRepositoryException {
        IWorkItem newWorkItem = workingCopy.getWorkItem();
        // Set attribute values on newWorkItem to avoid ImmutablePropertyExceptions
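        // One possible way the method could continue (sketch only; it assumes the
        // setAttributeValueEx() helper from the question is available in this class,
        // and the attribute key/value below are just the question's example):
        setAttributeValueEx(newWorkItem, "odc.impact", "Integrity");
        // ... set any further attributes the same way ...
        // The WorkItemOperation framework saves the working copy after execute() returns.
    }
}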

Registering a plugin on Quick Find in Dynamics CRM 2013

I have to register a plugin on the Quick Find search for the "Article" entity. When a user enters anything in the Quick Find text box on the Article entity, my plugin should execute and return data filtered according to our business logic.
1. What event is fired when we search using Quick Find?
2. What message is passed when this event is fired?
I have tried registering the plugin on the RetrieveMultiple message, but it is not triggered when we click search in Quick Find.
Please help.
We have a plugin registered on RetrieveMultiple. We had a business requirement to search for records using a wildcard by default.
Plugin registration details:
Message: RetrieveMultiple
Primary Entity: None
Secondary Entity: None
Pre-Operation
Code:
public const String QueryLiteral = "Query";
public const String LIKE = "%";

public void Execute(IServiceProvider serviceProvider)
{
    String ParentEntity = String.Empty;
    String OriginalSearch = String.Empty;

    // Obtain the execution context from the service provider.
    var ContextInstance = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

    // Get a reference to the Organization service.
    IOrganizationService ServiceInstance =
        ((IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory))).
        CreateOrganizationService(ContextInstance.InitiatingUserId);

    // Critical Point here - NOTICE that the InputParameters Contains the word Query
    if (ContextInstance.Depth < 2 && ContextInstance.InputParameters.Contains(QueryLiteral) &&
        ContextInstance.InputParameters[QueryLiteral] is QueryExpression)
    {
        QueryExpression QueryPointer = (ContextInstance.InputParameters[QueryLiteral] as QueryExpression);

        //Verify the conversion worked as expected - if not, everything else is useless
        if (null != QueryPointer)
        {
            // Check if the request is coming from any Search View
            // We know this b/c Criteria isn't null and the Filters Count > 1
            if (null != QueryPointer.Criteria && QueryPointer.Criteria.Filters.Count > 1)
            {
                ParentEntity = ContextInstance.PrimaryEntityName;
                OriginalSearch = QueryPointer.Criteria.Filters[1].Conditions[0].Values[0].ToString();
                OriginalSearch = String.Format(CultureInfo.CurrentCulture, "{0}{1}", LIKE, OriginalSearch);
            }

            ConditionExpression NewCondition = null;
            FilterExpression NewFilter = null;

            if (null != QueryPointer.Criteria)
            {
                //Change the default 'BeginsWith' operator to the 'Contains/Like' operator in the basic search query
                foreach (FilterExpression FilterSet in QueryPointer.Criteria.Filters)
                {
                    foreach (ConditionExpression ConditionSet in FilterSet.Conditions)
                    {
                        if (ConditionSet.Operator == ConditionOperator.Like)
                        {
                            if (OriginalSearch != "")
                            {
                                ConditionSet.Values[0] = OriginalSearch;
                            }
                            else
                            {
                                OriginalSearch = QueryPointer.Criteria.Filters[0].Conditions[0].Values[0].ToString();
                                OriginalSearch = String.Format(CultureInfo.CurrentCulture, "{0}{1}", LIKE, OriginalSearch);
                                ConditionSet.Values[0] = OriginalSearch;
                            }
                        }
                    }
                }
            }
        }
        ContextInstance.InputParameters[QueryLiteral] = QueryPointer;
    }
}
Check the details in this post: http://www.williamgryan.mobi/?p=596
We raised a ticket with Microsoft to address this situation.
The solution they provided was to modify the database to make the SearchByTitleKbArticleRequest message available in the plugin registration tool.
I don't currently remember the table in which we updated a flag against these messages.
After updating the table we were able to register the plugin against the SearchByTitleKbArticleRequest message.
The plugin was then triggered, and we modified the entity collection returned from there.

Dapper.Net and the DataReader

I have a very strange error with Dapper:
there is already an open DataReader associated with this Command which must be closed first
But I don't use a DataReader! I just run a select query in my server application and take the first result:
//How I run the query:
public static T SelectVersion(IDbTransaction transaction = null)
{
    return DbHelper.DataBase.Connection.Query<T>("SELECT * FROM [VersionLog] WHERE [Version] = (SELECT MAX([Version]) FROM [VersionLog])", null, transaction, commandTimeout: DbHelper.CommandTimeout).FirstOrDefault();
}

//And how I call this method:
public Response Upload(CommitRequest message) //It is called on the server from the client
{
    //Preparing data from CommitRequest
    using (var tr = DbHelper.DataBase.Connection.BeginTransaction(IsolationLevel.Serializable))
    {
        int v = SelectQueries<VersionLog>.SelectVersion(tr) != null ? SelectQueries<VersionLog>.SelectVersion(tr).Version : 0; //Call my query here
        int newVersion = v + 1; //update version
        //Saving changes from CommitRequest to db
        //The updated version is saved to the base too, maybe that is the problem?
        return new Response
        {
            Message = String.Empty,
            ServerBaseVersion = versionLog.Version,
        };
    }
}
}
And most annoyingly, this exception appears at random times. I think the problem is concurrent access to the server from two clients.
Please help.
This sometimes happens if the model and the database schema do not match and an exception is being raised inside Dapper.
If you really want to get into this, the best way is to include the Dapper source in your project and debug.

Create table out of a query result?

I'm using BigQuery's Java API. I'm running a select query and want the result saved to a destination table.
I've set loadConfig.setDestinationTable(), but I am getting "Load configuration must specify at least one source URI".
Could you please explain what I am doing wrong?
You don't want to set the loadConfig destination table, but queryConfig.setDestinationTable() instead (since this isn't a load job; it is a query job). As Fh said, if you share the code you're using, we can give more detailed help.
This is the code I am using to do this:
public static String copyTable(String project, String dataSet, String table) {
    String newTableName = table + "_copy_" + System.currentTimeMillis();
    try {
        Job copyJob = new Job();
        TableReference source = new TableReference();
        source.setProjectId(project);
        source.setDatasetId(dataSet);
        source.setTableId(table);
        TableReference destination = new TableReference();
        destination.setProjectId(project);
        destination.setDatasetId(dataSet);
        destination.setTableId(newTableName);
        JobConfiguration configuration = new JobConfiguration();
        JobConfigurationTableCopy copyConf = new JobConfigurationTableCopy();
        copyConf.setSourceTable(source);
        copyConf.setDestinationTable(destination);
        configuration.setCopy(copyConf);
        copyJob.setConfiguration(configuration);
        bigquery.jobs().insert(project, copyJob).execute();
        return newTableName;
    } catch (Exception e) {
        e.printStackTrace();
        logger.warn("unable to copy table :" + project + "." + dataSet + "." + table, e);
        throw new RuntimeException(e);
    }
}
Please contact me if you have any more questions.
Assuming you are running an interactive asynchronous query, you essentially want to pass the query, destination projectId, destination dataSetId and destination tableId in one request body. Refer to the Java API example here: https://developers.google.com/bigquery/querying-data#asyncqueries
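A rough sketch of such a request, using the same com.google.api.services.bigquery classes as the copyTable() snippet above (the project, dataset, table and query below are placeholders):
// Run a query job whose result is written to a destination table.
Job queryJob = new Job();
JobConfiguration configuration = new JobConfiguration();
JobConfigurationQuery queryConf = new JobConfigurationQuery();
queryConf.setQuery("SELECT word, COUNT(*) AS c FROM [publicdata:samples.shakespeare] GROUP BY word");
TableReference destination = new TableReference();
destination.setProjectId("my-project");
destination.setDatasetId("my_dataset");
destination.setTableId("my_query_result");
queryConf.setDestinationTable(destination);
queryConf.setAllowLargeResults(true);            // needed if the result may be large
queryConf.setWriteDisposition("WRITE_TRUNCATE"); // overwrite the destination table if it exists
configuration.setQuery(queryConf);
queryJob.setConfiguration(configuration);
bigquery.jobs().insert("my-project", queryJob).execute();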