Below is the code in question. Please check it and help me solve my issue.
holder.outer_qty.setOnLongClickListener(new View.OnLongClickListener() {
private Handler mHandler = new Handler();
private Runnable incrementRunnable = new Runnable() {
@Override public void run() {
mHandler.removeCallbacks(incrementRunnable);
if (holder.outer_qty.isPressed()) {
if (holder.outer_count > 1) {
Fragment_cataloguesall.mselected_update = 1;
holder.outer_count = holder.outer_count - 1;
holder.Total_outerQTY = holder.outerqntyValue * holder.outer_count;
holder.Total_qty.setText(Integer.toString(holder.Total_outerQTY));
Fragment_cataloguesall.dbcustom_list_vertical.notifyDataSetChanged();
helper.update_allpro(holder.productCode.getText().toString(), holder.plus_sign.getText().toString(), holder.outer_qty.getText().toString(), holder.Total_qty.getText().toString(), String.valueOf(holder.outer_count), "0");
holder.outer_count++;
dbList.clear();
mFilteredList.clear();
dbList = helper.GetProductData(Fragment_cataloguesall.CatCode, Fragment_cataloguesall.SubcatCode, "", Filter_screen);
mFilteredList = dbList;
selectedCard = dbList;
} else {}
mHandler.postDelayed(incrementRunnable, 1);
}
}
};
@Override
public boolean onLongClick(View view) {
mHandler.postDelayed(incrementRunnable, 1);
return true;
}
});
How can I solve this issue? Please explain it or help me fix it.
I am new to unit testing and I am trying to test the method below, but I have not managed to stub the query inside it; I only managed to make the call throw an exception, not to return a result from the query.
Is there a way to have the unit test return "result.getResult().get(0)"?
Thanks
@Override
public HouseModel findByCode(String code) {
var sQuery = "SELECT {h:pk} FROM {House as h} WHERE {h:id} = ?id ";
var query = new FlexibleSearchQuery(sQuery);
query.addQueryParameter("id", Objects.requireNonNullElse(code, ""));
SearchResult<HouseModel> result = flexibleSearchService.search(query);
return result.getResult().get(0);
}
Test code:
@Test
public void testFindByCode() {
when(flexibleSearchService.search((FlexibleSearchQuery) any())).thenThrow(new RuntimeException("test"));
RuntimeException exception = new RuntimeException();
try {
var result2 = houseDAOImpl.findByCode("testcode");
} catch (RuntimeException e) {
e.printStackTrace();
exception = e;
}
boolean shouldtrue = exception.getMessage().equalsIgnoreCase("test");
System.out.println(exception.getMessage());
System.out.println(shouldtrue);
assertTrue(shouldtrue);
}
Hybris supports transactional tests (HybrisJUnit4TransactionalTest) in case of interaction with the database.
public class HouseDAOImpTest extends HybrisJUnit4TransactionalTest
{
private TypeService typeService;
private ModelService modelService;
private DeeplinkUrlDao dao;
private HouseFinderDao houseDAOImpl; // the DAO under test, resolved in setUp()
private List<HouseModel> createdRules;
/**
* @throws java.lang.Exception
*/
@Before
public void setUp() throws Exception
{
createdRules = createHouses();
houseDAOImpl = (HouseFinderDao) Registry.getApplicationContext().getBean("houseFinderDao");
}
@Test
public void testFindByCode()
{
final HouseModel hm = houseDAOImpl.findByCode("testcode");
assertThat(hm.getCode(), is(equalTo(""testcode""));
}
private ModelService getModelService()
{
if (modelService == null)
{
modelService = (ModelService) Registry.getApplicationContext().getBean("modelService");
}
return modelService;
}
private TypeService getTypeService()
{
if (typeService == null)
{
typeService = (TypeService) Registry.getApplicationContext().getBean("typeService");
}
return typeService;
}
/**
* Creates the Houses.
*/
private List<HouseModel> createHouses()
{
final List<HouseModel> result = new ArrayList<HouseModel>();
final HouseModel houseModel1 = getModelService().create(HouseModel.class);
houseModel1.setCode("testcode");
modelService.save(houseModel1);
// create other houses model and follow previous steps
result.add(houseModel1);
result.add(houseModel2);
result.add(houseModel3);
return result;
}
}
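If you would rather keep a plain Mockito unit test and actually have findByCode return "result.getResult().get(0)", a minimal sketch of stubbing the search result could look like the following. This assumes flexibleSearchService is a Mockito mock injected into houseDAOImpl and that SearchResult and HouseModel can be mocked; static imports of Mockito and JUnit assertions are implied, and imports are elided as in the snippets above.
@Test
public void testFindByCodeReturnsFirstResult() {
    // the model we expect findByCode to hand back
    HouseModel expected = mock(HouseModel.class);
    when(expected.getCode()).thenReturn("testcode");

    // stub the flexible search to return a one-element result list
    @SuppressWarnings("unchecked")
    SearchResult<HouseModel> searchResult = mock(SearchResult.class);
    when(searchResult.getResult()).thenReturn(Collections.singletonList(expected));
    when(flexibleSearchService.search(any(FlexibleSearchQuery.class))).thenReturn(searchResult);

    // findByCode should now return result.getResult().get(0), i.e. the stubbed model
    HouseModel actual = houseDAOImpl.findByCode("testcode");
    assertSame(expected, actual);
}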
Before adding ViewModel & LiveData, I had successfully implemented infinite scroll with Retrofit. But after adding ViewModel & LiveData with Retrofit, I can't update the RecyclerView with the new data call; the ViewModel observer does not update the list.
I simply want infinite scrolling, as my code did before. I added a global variable to reuse the next page token. Am I missing anything? Any sample implementing an infinite RecyclerView with ViewModel & Retrofit would be awesome.
public static String NEXT_PAGE_URL = null;
Here is how I coded it.
My Activity -> PlaceListActivity
placeRecyclerView.addOnScrollListener(new RecyclerView.OnScrollListener() {
@Override
public void onScrollStateChanged(RecyclerView recyclerView, int newState) {
super.onScrollStateChanged(recyclerView, newState);
LogMe.d(tag, "onScrollStateChanged:: " + "called");
// check scrolling started or not
if (newState == AbsListView.OnScrollListener.SCROLL_STATE_TOUCH_SCROLL) {
isScrolling = true;
}
}
@Override
public void onScrolled(RecyclerView recyclerView, int dx, int dy) {
LogMe.d(tag, "onScrolled:: " + "called");
super.onScrolled(recyclerView, dx, dy);
currentItem = layoutManager.getChildCount();
totalItems = layoutManager.getItemCount();
scrolledOutItems = ((LinearLayoutManager) recyclerView.getLayoutManager()).findFirstVisibleItemPosition();
LogMe.d(tag, "currentItem:: " + currentItem);
LogMe.d(tag, "totalItems:: " + totalItems);
LogMe.d(tag, "scrolledOutItems:: " + scrolledOutItems);
if (isScrolling && (currentItem + scrolledOutItems == totalItems)) {
LogMe.d(tag, "view:: " + "finished");
isScrolling = false;
if (ApplicationData.NEXT_PAGE_URL != null) {
LogMe.d(tag, "place adding:: " + " onScrolled called");
ll_loading_more.setVisibility(View.VISIBLE);
// todo: call web api here
callDataFromLocationAPi(type, ApplicationData.NEXT_PAGE_URL, currentLatLng);
} else {
LogMe.d(tag, "next_page_url:: " + " is null");
}
}
}
});
private void callDataFromLocationAPi(String type, String next_page_url, LatLng latLng) {
if (Connectivity.isConnected(activity)) {
showProgressDialog();
model.getNearestPlaces(type, next_page_url, latLng).
observe(activity, new Observer<List<PlaceDetails>>() {
@Override
public void onChanged(@Nullable List<PlaceDetails> placeDetails) {
ll_loading_more.setVisibility(View.GONE);
LogMe.i(tag, "callDataFromLocationAPi: onChanged called !");
hideProgressDialog();
if (placeDetails != null) {
placeDetailsList = placeDetails;
placeListAdapter.setPlaceList(placeDetails);
}
}
});
} else {
showAlertForInternet(activity);
}
}
In PlaceViewModel
public class PlaceViewModel extends AndroidViewModel {
//this is the data that we will fetch asynchronously
private MutableLiveData<List<PlaceDetails>> placeList;
private PlaceRepository placeRepository;
private String tag = getClass().getName();
public PlaceViewModel(Application application) {
super(application);
placeRepository = new PlaceRepository(application);
}
//we will call this method to get the data
public MutableLiveData<List<PlaceDetails>> getNearestPlaces(String type,
String next_page_token,
LatLng latLng) {
//if the list is null
if (placeList == null) {
placeList = new MutableLiveData<>();
//we will load it asynchronously from server in this method
//loadPlaces(type, next_page_token, latLng);
placeList = placeRepository.getNearestPlacesFromAPI(type, next_page_token, latLng);
}
//finally we will return the list
return placeList;
}
}
My PlaceRepository.java looks like this:
public class PlaceRepository {
private static final Migration MIGRATION_1_2 = new Migration(1, 2) {
@Override
public void migrate(SupportSQLiteDatabase database) {
// Since we didn't alter the table, there's nothing else to do here.
}
};
private PlaceDatabase placeDatabase;
private CurrentLocation currentLocation = null;
private String tag = getClass().getName();
//this is the data that we will fetch asynchronously
private MutableLiveData<List<PlaceDetails>> placeList;
public PlaceRepository(Context context) {
placeDatabase = PlaceDatabase.getDatabase(context);
//addMigrations(MIGRATION_1_2)
placeList =
new MutableLiveData<>();
}
public MutableLiveData<List<PlaceDetails>> getNearestPlacesFromAPI(String type, final String next_page_token, LatLng latLng) {
List<PlaceDetails> placeDetailsList = new ArrayList<>();
try {
ApiInterface apiService = ApiClient.getClient().create(ApiInterface.class);
Call<Example> call = apiService.getNearbyPlaces(type,
latLng.latitude + "," +
latLng.longitude, ApplicationData.PROXIMITY_RADIUS,
ApplicationData.PLACE_API_KEY, next_page_token);
call.enqueue(new Callback<Example>() {
@Override
public void onResponse(Call<Example> call, Response<Example> response) {
try {
Example example = response.body();
ApplicationData.NEXT_PAGE_URL = example.getNextPageToken();
// next_page_url = example.getNextPageToken();
LogMe.i(tag, "next_page_url:" + ApplicationData.NEXT_PAGE_URL);
if (example.getStatus().equals("OK")) {
LogMe.i("getNearbyPlaces::", " --- " + response.toString() +
response.message() + response.body().toString());
// This loop will go through all the results and add marker on each location.
for (int i = 0; i < example.getResults().size(); i++) {
Double lat = example.getResults().get(i).getGeometry().getLocation().getLat();
Double lng = example.getResults().get(i).getGeometry().getLocation().getLng();
String placeName = example.getResults().get(i).getName();
String vicinity = example.getResults().get(i).getVicinity();
String icon = example.getResults().get(i).getIcon();
String place_id = example.getResults().get(i).getPlaceId();
PlaceDetails placeDetails = new PlaceDetails();
if (example.getResults().get(i).getRating() != null) {
Double rating = example.getResults().get(i).getRating();
placeDetails.setRating(rating);
}
//List<Photo> photoReference = example.getResults().
// get(i).getPhotos();
placeDetails.setName(placeName);
placeDetails.setAddress(vicinity);
placeDetails.setLatitude(lat);
placeDetails.setLongitude(lng);
placeDetails.setIcon(icon);
placeDetails.setPlace_id(place_id);
//placeDetails.setPlace_type(place_type_title);
double value = ApplicationData.
DISTANCE_OF_TWO_LOCATION_IN_KM(latLng.latitude, latLng.longitude, lat, lng);
//new DecimalFormat("##.##").format(value);
placeDetails.setDistance(new DecimalFormat("##.##").format(value));
String ph = "";
if (example.getResults().
get(i).getPhotos() != null) {
try {
List<Photo> photos = example.getResults().
get(i).getPhotos();
//JSONArray array = new JSONArray(example.getResults().
//get(i).getPhotos());
//JSONObject jsonObj = new JSONObject(array.toString());
//ph = jsonObj.getString("photo_reference");
ph = photos.get(0).getPhotoReference();
//LogMe.i(tag, "\n" + ph);
} catch (Exception e) {
e.printStackTrace();
//placeDetails.setPicture_reference(ph);
//PLACE_DETAILS_LIST.add(placeDetails);
//LogMe.i(tag, "#### Exception Occureed ####");
ph = "";
//continue;
}
}
placeDetails.setPicture_reference(ph);
placeDetailsList.add(placeDetails);
placeList.postValue(placeDetailsList);
}
} else {
}
} catch (Exception e) {
e.printStackTrace();
}
}
@Override
public void onFailure(Call<Example> call, Throwable t) {
Log.e("onFailure", t.toString());
}
});
} catch (RuntimeException e) {
//hideProgressDialog();
Log.d("onResponse", "RuntimeException is an error");
e.printStackTrace();
} catch (Exception e) {
Log.d("onResponse", "Exception is an error");
}
return placeList;
}
}
I have trimmed the code to keep the question simple.
Since you already use Android Jetpack, take a look at the Paging library. It's specifically designed for building infinite lists with RecyclerView.
Based on your source code, I'd say that you need a PageKeyedDataSource; here is an example which includes info about how to implement PageKeyedDataSource -
7 steps to implement Paging library in Android
As for the advantages of this approach:
You no longer need to observe list scrolling yourself (the library does it for you); you just need to specify your page size like this:
PagedList.Config myPagingConfig = new PagedList.Config.Builder()
.setPageSize(50)
.build();
From documentation:
Page size: The number of items in each page.
Your code will be cleaner, and you'll get rid of your RecyclerView.OnScrollListener.
The ViewModel code will be shorter; it will provide only a PagedList:
@NonNull
LiveData<PagedList<ReviewSection>> getReviewsLiveData() {
return reviewsLiveData;
}
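For illustration, here is a rough sketch of a PageKeyedDataSource keyed by the Places next_page_token, reusing the Retrofit call from PlaceRepository above. ApiInterface, ApiClient, Example, PlaceDetails and ApplicationData are the types from your code; mapResults(...) is a placeholder for the Example -> PlaceDetails mapping you already have, and imports are elided as in the snippets above.
public class PlaceDataSource extends PageKeyedDataSource<String, PlaceDetails> {
    private final ApiInterface apiService = ApiClient.getClient().create(ApiInterface.class);
    private final String type;
    private final LatLng latLng;

    public PlaceDataSource(String type, LatLng latLng) {
        this.type = type;
        this.latLng = latLng;
    }

    @Override
    public void loadInitial(@NonNull LoadInitialParams<String> params,
                            @NonNull final LoadInitialCallback<String, PlaceDetails> callback) {
        // first page: no page token yet
        newCall(null).enqueue(new Callback<Example>() {
            @Override
            public void onResponse(Call<Example> call, Response<Example> response) {
                Example example = response.body();
                callback.onResult(mapResults(example), null, example.getNextPageToken());
            }

            @Override
            public void onFailure(Call<Example> call, Throwable t) {
                Log.e("PlaceDataSource", t.toString());
            }
        });
    }

    @Override
    public void loadAfter(@NonNull LoadParams<String> params,
                          @NonNull final LoadCallback<String, PlaceDetails> callback) {
        // params.key is the next_page_token returned with the previous page
        newCall(params.key).enqueue(new Callback<Example>() {
            @Override
            public void onResponse(Call<Example> call, Response<Example> response) {
                Example example = response.body();
                callback.onResult(mapResults(example), example.getNextPageToken());
            }

            @Override
            public void onFailure(Call<Example> call, Throwable t) {
                Log.e("PlaceDataSource", t.toString());
            }
        });
    }

    @Override
    public void loadBefore(@NonNull LoadParams<String> params,
                           @NonNull LoadCallback<String, PlaceDetails> callback) {
        // not needed for forward-only infinite scrolling
    }

    private Call<Example> newCall(String pageToken) {
        return apiService.getNearbyPlaces(type,
                latLng.latitude + "," + latLng.longitude,
                ApplicationData.PROXIMITY_RADIUS,
                ApplicationData.PLACE_API_KEY,
                pageToken);
    }

    private List<PlaceDetails> mapResults(Example example) {
        // same Example -> PlaceDetails conversion as in PlaceRepository.getNearestPlacesFromAPI (elided here)
        return new ArrayList<>();
    }
}
A DataSource.Factory that creates PlaceDataSource instances can then be handed to a LivePagedListBuilder together with myPagingConfig to build the LiveData<PagedList<PlaceDetails>> that the ViewModel exposes, as in getReviewsLiveData() above.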
JPanel p2 = new JPanel(new GridLayout(6,1));
ButtonGroup tubtype = new ButtonGroup();
JRadioButton roundrButton = new JRadioButton("Round", true);
tubtype.add(roundrButton);
JRadioButton ovalrButton = new JRadioButton("Oval", false);
tubtype.add(ovalrButton);
calcButton = new JButton("Calculate Volume");
exitButton = new JButton("Exit");
hlength = new JTextField(5);
hwidth = new JTextField(5);
hdepth = new JTextField(5);
hvolume = new JTextField(5);
lengthLabel = new JLabel("Enter the tub's length (ft):");
widthLabel = new JLabel("Enter the tub's width (ft):");
depthLabel = new JLabel("Enter the tub's depth (ft):");
volumeLabel = new JLabel("The tub's volume (ft^3):");
p2.add(roundrButton);
p2.add(ovalrButton);
p2.add(lengthLabel);
p2.add(hlength);
p2.add(widthLabel);
p2.add(hwidth);
p2.add(depthLabel);
p2.add(hdepth);
p2.add(volumeLabel);
p2.add(hvolume);
p2.add(calcButton);
p2.add(exitButton);
tab.addTab( "Hot Tubs", null, p2, " Panel #1" );
calcButtonHandler2 ihandler =new calcButtonHandler2();
calcButton.addActionListener(ihandler);
exitButtonHandler ghandler =new exitButtonHandler();
exitButton.addActionListener(ghandler);
FocusHandler hhandler =new FocusHandler();
hlength.addFocusListener(hhandler);
hwidth.addFocusListener(hhandler);
hdepth.addFocusListener(hhandler);
hvolume.addFocusListener(hhandler);
// add JTabbedPane to container
getContentPane().add( tab );
setSize( 550, 500 );
setVisible( true );
}
public class calcButtonHandler implements ActionListener {
public void actionPerformed(ActionEvent e) {
DecimalFormat num =new DecimalFormat(",###.##");
double sLength, sWidth, sdepth, Total;
sLength = Double.valueOf(plength.getText());
sWidth =Double.valueOf(pwidth.getText());
sdepth =Double.valueOf(pdepth.getText());
if(e.getSource() == pcalcButton) {
Total = sLength * sWidth * sdepth;
pvolume.setText(num.format(Total));
try{
String value=pvolume.getText();
File file = new File("output.txt");
FileWriter fstream = new FileWriter(file,true);
BufferedWriter out = new BufferedWriter(fstream);
out.write("Length= "+sLength+", Width= "+sWidth+", Depth= "+sdepth+" so the volume of Swimming Pool is "+value);
out.newLine();
out.close();
}
catch(Exception ex){}
}
}
}
public class calcButtonHandler2 implements ActionListener {
public void actionPerformed(ActionEvent g) {
DecimalFormat num =new DecimalFormat(",###.##");
double cLength, cWidth, cdepth, Total;
cLength = Double.valueOf(hlength.getText());
cWidth = Double.valueOf(hwidth.getText());
cdepth = Double.valueOf(hdepth.getText());
try
{
AbstractButton roundrButton;
if(roundrButton.isSelected())
{
Total = Math.PI * Math.pow(cLength / 2.0, 2) * cdepth;
}
else
{
Total = Math.PI * Math.pow(cLength * cWidth, 2) * cdepth;
}
hvolume.setText(""+num.format(Total));
}
catch(Exception ex){}
}
}
}
class exitButtonHandler implements ActionListener {
public void actionPerformed(ActionEvent g){
System.exit(0);
}
}
class FocusHandler implements FocusListener {
public void focusGained(FocusEvent e) {
}
public void focusLost(FocusEvent e) {
}
public static void main( String args[] )
{
Final tabs = new Final();
tabs.setDefaultCloseOperation( JFrame.EXIT_ON_CLOSE );
}
I am getting an error that says my roundrButton may not have been initialized. Please help.
The problem is these lines here:
try
{
AbstractButton roundrButton;
if(roundrButton.isSelected())
{
You're declaring a variable called roundrButton, and then attempting to call .isSelected() on it. You never assign anything to roundrButton, though, so the compiler complains that it may not have been initialized.
It seems like you might want to be referring to the roundrButton that you declared earlier. Have you considered making it a field of your class?
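For example, a rough sketch of that change, assuming your frame class is the Final class used in main (only the relevant parts are shown; the rest of the constructor stays as it is):
public class Final extends JFrame {
    // declare the radio button as a field instead of a local variable
    private JRadioButton roundrButton;

    public Final() {
        // assign the field where the UI is built (was: JRadioButton roundrButton = new JRadioButton("Round", true);)
        roundrButton = new JRadioButton("Round", true);
        // ... rest of the constructor unchanged ...
    }

    public class calcButtonHandler2 implements ActionListener {
        public void actionPerformed(ActionEvent g) {
            // no local "AbstractButton roundrButton;" here - use the initialized field
            if (roundrButton.isSelected()) {
                // round tub calculation
            } else {
                // oval tub calculation
            }
        }
    }
}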
The error happens here:
try {
AbstractButton roundrButton;
if (roundrButton.isSelected()) {
You declared a variable called roundrButton but did not assign any object to it.
Then you called a method on, well, who knows? There is no object on which to call the method, because the variable roundrButton never received a value.
I guess your problem is in the actionPerformed method of class calcButtonHandler2:
try {
AbstractButton roundrButton;
if(roundrButton.isSelected())
roundrButton is indeed never initialized, which is why the compiler rejects it ("variable roundrButton might not have been initialized").
What's the simplest way to use MiniProfiler's database profiling with NHibernate? In order for the profiler to work, I need to wrap the DbConnection that NHibernate uses in a ProfiledDbConnection.
I'm not too familiar with the internals of NHibernate, so I don't know where all the extensibility points are. (I noticed that an NHibernate ISession has a Connection property, but it is read-only.)
[UPDATE] Please see the following links for a version that uses RealProxy to proxy the SqlCommand - batching is now supported
blog http://blog.fearofaflatplanet.me.uk/mvcminiprofiler-and-nhibernate-take-2
gist https://gist.github.com/1110153
I've left the original answer unaltered as it was accepted. [/UPDATE]
I've managed to partially get this to work by implementing a profiled client driver (example for SQL Server 2008 below). This works for simple examples; however, I haven't yet found a solution for NHibernate batching (which attempts to cast the command back to SqlCommand).
public class ProfiledSql2008ClientDriver : Sql2008ClientDriver
{
public override IDbCommand CreateCommand()
{
return new ProfiledDbCommand(
base.CreateCommand() as DbCommand,
null,
MiniProfiler.Current);
}
public override IDbConnection CreateConnection()
{
return ProfiledDbConnection.Get(
base.CreateConnection() as DbConnection,
MiniProfiler.Current);
}
}
I extended Robert's answer above to work with NHibernate batching. There is a lot of code here, so it can probably be shortened; some of it is based on the NHibernate source for the client driver.
<property name="connection.driver_class">YoureOnTime.Data.ProfiledSqlClientDriver, YoureOnTime.Common</property>
public class ProfiledSqlClientDriver : DriverBase, IEmbeddedBatcherFactoryProvider
{
public override IDbConnection CreateConnection()
{
return new ProfiledSqlDbConnection(
new SqlConnection(),
MiniProfiler.Current);
}
public override IDbCommand CreateCommand()
{
return new ProfiledSqlDbCommand(
new SqlCommand(),
null,
MiniProfiler.Current);
}
public override bool UseNamedPrefixInSql
{
get { return true; }
}
public override bool UseNamedPrefixInParameter
{
get { return true; }
}
public override string NamedPrefix
{
get { return "@"; }
}
public override bool SupportsMultipleOpenReaders
{
get { return false; }
}
public static void SetParameterSizes(IDataParameterCollection parameters, SqlType[] parameterTypes)
{
for (int i = 0; i < parameters.Count; i++)
{
SetVariableLengthParameterSize((IDbDataParameter)parameters[i], parameterTypes[i]);
}
}
private const int MaxAnsiStringSize = 8000;
private const int MaxBinarySize = MaxAnsiStringSize;
private const int MaxStringSize = MaxAnsiStringSize / 2;
private const int MaxBinaryBlobSize = int.MaxValue;
private const int MaxStringClobSize = MaxBinaryBlobSize / 2;
private const byte MaxPrecision = 28;
private const byte MaxScale = 5;
private const byte MaxDateTime2 = 8;
private const byte MaxDateTimeOffset = 10;
private static void SetDefaultParameterSize(IDbDataParameter dbParam, SqlType sqlType)
{
switch (dbParam.DbType)
{
case DbType.AnsiString:
case DbType.AnsiStringFixedLength:
dbParam.Size = MaxAnsiStringSize;
break;
case DbType.Binary:
if (sqlType is BinaryBlobSqlType)
{
dbParam.Size = MaxBinaryBlobSize;
}
else
{
dbParam.Size = MaxBinarySize;
}
break;
case DbType.Decimal:
dbParam.Precision = MaxPrecision;
dbParam.Scale = MaxScale;
break;
case DbType.String:
case DbType.StringFixedLength:
dbParam.Size = IsText(dbParam, sqlType) ? MaxStringClobSize : MaxStringSize;
break;
case DbType.DateTime2:
dbParam.Size = MaxDateTime2;
break;
case DbType.DateTimeOffset:
dbParam.Size = MaxDateTimeOffset;
break;
}
}
private static bool IsText(IDbDataParameter dbParam, SqlType sqlType)
{
return (sqlType is StringClobSqlType) || (sqlType.LengthDefined && sqlType.Length > MsSql2000Dialect.MaxSizeForLengthLimitedStrings &&
(DbType.String == dbParam.DbType || DbType.StringFixedLength == dbParam.DbType));
}
private static void SetVariableLengthParameterSize(IDbDataParameter dbParam, SqlType sqlType)
{
SetDefaultParameterSize(dbParam, sqlType);
// Override the defaults using data from SqlType.
if (sqlType.LengthDefined && !IsText(dbParam, sqlType))
{
dbParam.Size = sqlType.Length;
}
if (sqlType.PrecisionDefined)
{
dbParam.Precision = sqlType.Precision;
dbParam.Scale = sqlType.Scale;
}
}
public override IDbCommand GenerateCommand(CommandType type, SqlString sqlString, SqlType[] parameterTypes)
{
IDbCommand command = base.GenerateCommand(type, sqlString, parameterTypes);
//if (IsPrepareSqlEnabled)
{
SetParameterSizes(command.Parameters, parameterTypes);
}
return command;
}
public override bool SupportsMultipleQueries
{
get { return true; }
}
#region IEmbeddedBatcherFactoryProvider Members
System.Type IEmbeddedBatcherFactoryProvider.BatcherFactoryClass
{
get { return typeof(ProfiledSqlClientBatchingBatcherFactory); }
}
#endregion
}
public class ProfiledSqlClientBatchingBatcher : AbstractBatcher
{
private int batchSize;
private int totalExpectedRowsAffected;
private SqlClientSqlCommandSet currentBatch;
private StringBuilder currentBatchCommandsLog;
private readonly int defaultTimeout;
public ProfiledSqlClientBatchingBatcher(ConnectionManager connectionManager, IInterceptor interceptor)
: base(connectionManager, interceptor)
{
batchSize = Factory.Settings.AdoBatchSize;
defaultTimeout = PropertiesHelper.GetInt32(NHibernate.Cfg.Environment.CommandTimeout, NHibernate.Cfg.Environment.Properties, -1);
currentBatch = CreateConfiguredBatch();
//we always create this, because we need to deal with a scenario in which
//the user change the logging configuration at runtime. Trying to put this
//behind an if(log.IsDebugEnabled) will cause a null reference exception
//at that point.
currentBatchCommandsLog = new StringBuilder().AppendLine("Batch commands:");
}
public override int BatchSize
{
get { return batchSize; }
set { batchSize = value; }
}
protected override int CountOfStatementsInCurrentBatch
{
get { return currentBatch.CountOfCommands; }
}
public override void AddToBatch(IExpectation expectation)
{
totalExpectedRowsAffected += expectation.ExpectedRowCount;
IDbCommand batchUpdate = CurrentCommand;
string lineWithParameters = null;
var sqlStatementLogger = Factory.Settings.SqlStatementLogger;
if (sqlStatementLogger.IsDebugEnabled || log.IsDebugEnabled)
{
lineWithParameters = sqlStatementLogger.GetCommandLineWithParameters(batchUpdate);
var formatStyle = sqlStatementLogger.DetermineActualStyle(FormatStyle.Basic);
lineWithParameters = formatStyle.Formatter.Format(lineWithParameters);
currentBatchCommandsLog.Append("command ")
.Append(currentBatch.CountOfCommands)
.Append(":")
.AppendLine(lineWithParameters);
}
if (log.IsDebugEnabled)
{
log.Debug("Adding to batch:" + lineWithParameters);
}
currentBatch.Append(((ProfiledSqlDbCommand)batchUpdate).Command);
if (currentBatch.CountOfCommands >= batchSize)
{
ExecuteBatchWithTiming(batchUpdate);
}
}
protected void ProfiledPrepare(IDbCommand cmd)
{
try
{
IDbConnection sessionConnection = ConnectionManager.GetConnection();
if (cmd.Connection != null)
{
// make sure the commands connection is the same as the Sessions connection
// these can be different when the session is disconnected and then reconnected
if (cmd.Connection != sessionConnection)
{
cmd.Connection = sessionConnection;
}
}
else
{
cmd.Connection = (sessionConnection as ProfiledSqlDbConnection).Connection;
}
ProfiledSqlDbTransaction trans = (ProfiledSqlDbTransaction)typeof(NHibernate.Transaction.AdoTransaction).InvokeMember("trans", System.Reflection.BindingFlags.GetField | System.Reflection.BindingFlags.NonPublic | System.Reflection.BindingFlags.Instance, null, ConnectionManager.Transaction, null);
if (trans != null)
cmd.Transaction = trans.Transaction;
Factory.ConnectionProvider.Driver.PrepareCommand(cmd);
}
catch (InvalidOperationException ioe)
{
throw new ADOException("While preparing " + cmd.CommandText + " an error occurred", ioe);
}
}
protected override void DoExecuteBatch(IDbCommand ps)
{
log.DebugFormat("Executing batch");
CheckReaders();
ProfiledPrepare(currentBatch.BatchCommand);
if (Factory.Settings.SqlStatementLogger.IsDebugEnabled)
{
Factory.Settings.SqlStatementLogger.LogBatchCommand(currentBatchCommandsLog.ToString());
currentBatchCommandsLog = new StringBuilder().AppendLine("Batch commands:");
}
int rowsAffected;
try
{
rowsAffected = currentBatch.ExecuteNonQuery();
}
catch (DbException e)
{
throw ADOExceptionHelper.Convert(Factory.SQLExceptionConverter, e, "could not execute batch command.");
}
Expectations.VerifyOutcomeBatched(totalExpectedRowsAffected, rowsAffected);
currentBatch.Dispose();
totalExpectedRowsAffected = 0;
currentBatch = CreateConfiguredBatch();
}
private SqlClientSqlCommandSet CreateConfiguredBatch()
{
var result = new SqlClientSqlCommandSet();
if (defaultTimeout > 0)
{
try
{
result.CommandTimeout = defaultTimeout;
}
catch (Exception e)
{
if (log.IsWarnEnabled)
{
log.Warn(e.ToString());
}
}
}
return result;
}
}
public class ProfiledSqlClientBatchingBatcherFactory : IBatcherFactory
{
public virtual IBatcher CreateBatcher(ConnectionManager connectionManager, IInterceptor interceptor)
{
return new ProfiledSqlClientBatchingBatcher(connectionManager, interceptor);
}
}
public class ProfiledSqlDbCommand : ProfiledDbCommand
{
public ProfiledSqlDbCommand(SqlCommand cmd, SqlConnection conn, MiniProfiler profiler)
: base(cmd, conn, profiler)
{
Command = cmd;
}
public SqlCommand Command { get; set; }
private DbTransaction _trans;
protected override DbTransaction DbTransaction
{
get { return _trans; }
set
{
this._trans = value;
ProfiledSqlDbTransaction awesomeTran = value as ProfiledSqlDbTransaction;
Command.Transaction = awesomeTran == null ? (SqlTransaction)value : awesomeTran.Transaction;
}
}
}
public class ProfiledSqlDbConnection : ProfiledDbConnection
{
public ProfiledSqlDbConnection(SqlConnection connection, MiniProfiler profiler)
: base(connection, profiler)
{
Connection = connection;
}
public SqlConnection Connection { get; set; }
protected override DbTransaction BeginDbTransaction(System.Data.IsolationLevel isolationLevel)
{
return new ProfiledSqlDbTransaction(Connection.BeginTransaction(isolationLevel), this);
}
}
public class ProfiledSqlDbTransaction : ProfiledDbTransaction
{
public ProfiledSqlDbTransaction(SqlTransaction transaction, ProfiledDbConnection connection)
: base(transaction, connection)
{
Transaction = transaction;
}
public SqlTransaction Transaction { get; set; }
}
Try implementing NHibernate.Connection.IConnectionProvider (you could just inherit DriverConnectionProvider); in GetConnection(), wrap the IDbConnection as you need.
Plug your connection provider using the Environment.ConnectionProvider key in your config properties.
If anyone is interested, I have done an integration using a custom log4net appender instead. That way I feel safe that I don't mess with the Connection object.
The rough outline is: NHibernate emits the SQL strings as debug statements, and the appender configured in log4net.xml calls Start and Dispose on the MiniProfiler.
Hi, I am trying to run two jobs using the Spring Batch framework.
My problem is that the SimpleJobLauncher is running only one job, the last one in the job list.
Here is what I am doing:
I have two jobs in my database along with the steps for those jobs.
I read the job data from the database and process it as follows.
public class BatchJobScheduler {
private static Log sLog = LogFactory.getLog(BatchJobScheduler.class);
private ApplicationContext ac;
private DataSourceTransactionManager mTransactionManager;
private SimpleJobLauncher mJobLauncher;
private JobRepository mJobRepository;
private SimpleStepFactoryBean stepFactory;
private MapJobRegistry mapJobRegistry;
private JobDetailBean jobDetail;
private CronTriggerBean cronTrigger;
private SimpleJob job;
private SchedulerFactoryBean schedulerFactory;
private static String mDriverClass;
private static String mConnectionUrl;
private static String mUser;
private static String mPassword;
public static JobMetaDataFeeder metadataFeeder;
static {
try {
loadProperties();
metadataFeeder = new JobMetaDataFeeder();
metadataFeeder.configureDataSource(mDriverClass, mConnectionUrl,
mUser, mPassword);
} catch (FileNotFoundException e) {
} catch (IOException e) {
} catch (SQLException e) {
} catch (ClassNotFoundException e) {
}
}
private static void loadProperties() throws FileNotFoundException,
IOException {
Properties properties = new Properties();
InputStream is;
if (BatchJobScheduler.class.getClassLoader() != null) {
is = BatchJobScheduler.class.getClassLoader().getResourceAsStream(
"batch.properties");
} else {
is = System.class.getClassLoader().getResourceAsStream(
"batch.properties");
}
properties.load(is);
mDriverClass = properties.getProperty("batch.jdbc.driver");
mConnectionUrl = properties.getProperty("batch.jdbc.url");
mUser = properties.getProperty("batch.jdbc.user");
mPassword = properties.getProperty("batch.jdbc.password");
}
public void start(WebApplicationContext wac) throws Exception {
try {
ac = new FileSystemXmlApplicationContext("batch-spring.xml");
mTransactionManager = (DataSourceTransactionManager) ac
.getBean("mTransactionManager");
mJobLauncher = (SimpleJobLauncher) ac.getBean("mJobLauncher");
mJobRepository = (JobRepository) ac.getBean("mRepositoryFactory");
mJobLauncher.afterPropertiesSet();
List<JobMetadata> jobsMetaData = getJobsData(mDriverClass,
mConnectionUrl, mUser, mPassword, null);
createAndRunScheduler(jobsMetaData);
} catch (Exception e) {
e.printStackTrace();
sLog.error("Exception while starting job", e);
}
}
@SuppressWarnings("unchecked")
public List<CronTriggerBean> getJobTriggers(List<JobMetadata> jobsMetaData)
throws Exception {
List<CronTriggerBean> triggers = new ArrayList<CronTriggerBean>();
for (JobMetadata jobMetadata : jobsMetaData) {
job = (SimpleJob) ac.getBean("job");
job.setName(jobMetadata.getJobName());
ArrayList<Step> steps = new ArrayList<Step>();
for (StepMetadata stepMetadata : jobMetadata.getSteps()) {
// System.err.println(ac.getBean("stepFactory").getClass());
stepFactory = new SimpleStepFactoryBean<String, Object>();
stepFactory.setTransactionManager(mTransactionManager);
stepFactory.setJobRepository(mJobRepository);
stepFactory.setCommitInterval(stepMetadata.getCommitInterval());
stepFactory.setStartLimit(stepMetadata.getStartLimit());
T5CItemReader itemReader = (T5CItemReader) BeanUtils
.instantiateClass(Class.forName(stepMetadata
.getStepReaderClass()));
itemReader
.setItems(getItemList(jobMetadata.getJobParameters()));
stepFactory.setItemReader(itemReader);
stepFactory.setItemProcessor((ItemProcessor) BeanUtils
.instantiateClass(Class.forName(stepMetadata
.getStepProcessorClass())));
stepFactory.setItemWriter((ItemWriter) BeanUtils
.instantiateClass(Class.forName(stepMetadata
.getStepWriterClass())));
stepFactory.setBeanName(stepMetadata.getStepName());
steps.add((Step) stepFactory.getObject());
}
job.setSteps(steps);
ReferenceJobFactory jobFactory = new ReferenceJobFactory(job);
mapJobRegistry = (MapJobRegistry) ac.getBean("jobRegistry");
mapJobRegistry.register(jobFactory);
jobDetail = (JobDetailBean) ac.getBean("jobDetail");
jobDetail.setJobClass(Class.forName(jobMetadata.getMJoblauncher()));
jobDetail.setGroup(jobMetadata.getJobGroupName());
jobDetail.setName(jobMetadata.getJobName());
Map<String, Object> jobDataMap = new HashMap<String, Object>();
jobDataMap.put("jobName", jobMetadata.getJobName());
jobDataMap.put("jobLocator", mapJobRegistry);
jobDataMap.put("jobLauncher", mJobLauncher);
jobDataMap.put("timestamp", new Date());
// jobDataMap.put("jobParams", jobMetadata.getJobParameters());
jobDetail.setJobDataAsMap(jobDataMap);
jobDetail.afterPropertiesSet();
cronTrigger = (CronTriggerBean) ac.getBean("cronTrigger");
cronTrigger.setJobDetail(jobDetail);
cronTrigger.setJobName(jobMetadata.getJobName());
cronTrigger.setJobGroup(jobMetadata.getJobGroupName());
cronTrigger.setCronExpression(jobMetadata.getCronExpression());
triggers.add(cronTrigger);
}
return triggers;
}
private void createAndRunScheduler(List<JobMetadata> jobsMetaData)
throws Exception {
// System.err.println(ac.getBean("schedulerFactory").getClass());
schedulerFactory = new SchedulerFactoryBean();
List<CronTriggerBean> triggerList = getJobTriggers(jobsMetaData);
Trigger[] triggers = new Trigger[triggerList.size()];
int triggerCount = 0;
for (CronTriggerBean trigger : triggerList) {
triggers[triggerCount] = trigger;
triggerCount++;
}
schedulerFactory.setTriggers(triggers);
schedulerFactory.afterPropertiesSet();
}
private List<JobMetadata> getJobsData(String driverClass,
String connectionURL, String user, String password, String query)
throws SQLException, ClassNotFoundException {
metadataFeeder.createJobMetadata(query);
return metadataFeeder.getJobsMetadata();
}
private List<String> getItemList(String jobParameterString) {
List<String> itemList = new ArrayList<String>();
String[] parameters = jobParameterString.split(";");
for (String string : parameters) {
String[] mapKeyValue = string.split("=");
if (mapKeyValue.length == 2) {
itemList.add(mapKeyValue[0] + ":" + mapKeyValue[1]);
} else {
// exception for invalid job parameters
System.out.println("exception for invalid job parameters");
}
}
return itemList;
}
private Map<String, Object> getParameterMap(String jobParameterString) {
Map<String, Object> parameterMap = new HashMap<String, Object>();
String[] parameters = jobParameterString.split(";");
for (String string : parameters) {
String[] mapKeyValue = string.split("=");
if (mapKeyValue.length == 2) {
parameterMap.put(mapKeyValue[0], mapKeyValue[1]);
} else {
// exception for invalid job parameters
System.out.println("exception for invalid job parameters");
}
}
return parameterMap;
}
}
public class MailJobLauncher extends QuartzJobBean {
/**
* Special key in job data map for the name of a job to run.
*/
static final String JOB_NAME = "jobName";
private static Log sLog = LogFactory.getLog(MailJobLauncher.class);
private JobLocator mJobLocator;
private JobLauncher mJobLauncher;
/**
* Public setter for the {@link JobLocator}.
*
* @param jobLocator
* the {@link JobLocator} to set
*/
public void setJobLocator(JobLocator jobLocator) {
this.mJobLocator = jobLocator;
}
/**
* Public setter for the {@link JobLauncher}.
*
* @param jobLauncher
* the {@link JobLauncher} to set
*/
public void setJobLauncher(JobLauncher jobLauncher) {
this.mJobLauncher = jobLauncher;
}
@Override
@SuppressWarnings("unchecked")
protected void executeInternal(JobExecutionContext context) {
Map<String, Object> jobDataMap = context.getMergedJobDataMap();
executeRecursive(jobDataMap);
}
private void executeRecursive(Map<String, Object> jobDataMap) {
String jobName = (String) jobDataMap.get(JOB_NAME);
JobParameters jobParameters = getJobParametersFromJobMap(jobDataMap);
sLog.info("Quartz trigger firing with Spring Batch jobName=" + jobName
+ jobDataMap + jobParameters);
try {
mJobLauncher.run(mJobLocator.getJob(jobName), jobParameters);
} catch (JobInstanceAlreadyCompleteException e) {
jobDataMap.remove("timestamp");
jobDataMap.put("timestamp", new Date());
executeRecursive(jobDataMap);
} catch (NoSuchJobException e) {
sLog.error("Could not find job.", e);
} catch (JobExecutionException e) {
sLog.error("Could not execute job.", e);
}
}
/*
* Copy parameters that are of the correct type over to {@link
* JobParameters}, ignoring jobName.
* @return a {@link JobParameters} instance
*/
private JobParameters getJobParametersFromJobMap(
Map<String, Object> jobDataMap) {
JobParametersBuilder builder = new JobParametersBuilder();
for (Entry<String, Object> entry : jobDataMap.entrySet()) {
String key = entry.getKey();
Object value = entry.getValue();
if (value instanceof String && !key.equals(JOB_NAME)) {
builder.addString(key, (String) value);
} else if (value instanceof Float || value instanceof Double) {
builder.addDouble(key, ((Number) value).doubleValue());
} else if (value instanceof Integer || value instanceof Long) {
builder.addLong(key, ((Number) value).longValue());
} else if (value instanceof Date) {
builder.addDate(key, (Date) value);
} else {
sLog
.debug("JobDataMap contains values which are not job parameters (ignoring).");
}
}
return builder.toJobParameters();
}
}
I couldn't figure out why the launcher is ignoring all the other jobs. Please help me.
Regards
Make sure these properties are set:
org.quartz.threadPool.class=org.quartz.simpl.SimpleThreadPool
org.quartz.threadPool.threadCount=3
org.quartz.threadPool.threadPriority=5
This will allow a few jobs to run at the same time. Adjust the settings as needed.
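If you would rather set these in code than in a quartz.properties file, a small sketch of applying them to the SchedulerFactoryBean built in createAndRunScheduler() could look like this (same values as above; setQuartzProperties is the standard Spring SchedulerFactoryBean setter):
// pass the Quartz thread-pool settings to the SchedulerFactoryBean so that
// several triggers can fire at the same time
Properties quartzProperties = new Properties();
quartzProperties.setProperty("org.quartz.threadPool.class", "org.quartz.simpl.SimpleThreadPool");
quartzProperties.setProperty("org.quartz.threadPool.threadCount", "3");
quartzProperties.setProperty("org.quartz.threadPool.threadPriority", "5");

schedulerFactory = new SchedulerFactoryBean();
schedulerFactory.setQuartzProperties(quartzProperties);
schedulerFactory.setTriggers(triggers);
schedulerFactory.afterPropertiesSet();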