We have a pretty basic Lucene setup. We recently noticed that some documents aren't written to the index.
This is how we create the document:
private void addToDirectory(SpecialDomainObject specialDomainObject) throws IOException {
Document document = new Document();
document.add(new TextField("id", String.valueOf(specialDomainObject.getId()), Field.Store.YES));
document.add(new TextField("name", specialDomainObject.getName(), Field.Store.YES));
document.add(new TextField("tags", joinTags(specialDomainObject.getTags()), Field.Store.YES));
document.add(new TextField("contents", getContents(specialDomainObject), Field.Store.YES));
for (Language language : getAllAssociatedLanguages(specialDomainObject)) {
document.add(new IntField("languageId", language.getId(), Field.Store.YES));
}
specialDomainObjectIndexWriter.updateDocument(new Term("id", document.getField("id").stringValue()), document);
specialDomainObjectIndexWriter.commit();
}
This is how we create the analyzer and the index writer:
<bean id="luceneVersion" class="org.apache.lucene.util.Version" factory-method="valueOf">
<constructor-arg value="LUCENE_46"/>
</bean>
<bean id="analyzer" class="org.apache.lucene.analysis.standard.StandardAnalyzer">
<constructor-arg ref="luceneVersion"/>
</bean>
<bean id="specialDomainObjectIndexWriter" class="org.apache.lucene.index.IndexWriter">
<constructor-arg ref="specialDomainObjectDirectory" />
<constructor-arg>
<bean class="org.apache.lucene.index.IndexWriterConfig">
<constructor-arg ref="luceneVersion"/>
<constructor-arg ref="analyzer" />
<property name="openMode" value="CREATE_OR_APPEND"/>
</bean>
</constructor-arg>
</bean>
Indexing is done with a scheduled task:
@Component
public class ScheduledSpecialDomainObjectIndexCreationTask implements ScheduledIndexCreationTask {
private static final Logger logger = LoggerFactory.getLogger(ScheduledSpecialDomainObjectIndexCreationTask.class);
@Autowired
private IndexOperator specialDomainObjectIndexOperator;
@Scheduled(fixedDelay = 3600 * 1000)
@Override
public void createIndex() {
Date indexCreationStartDate = new Date();
try {
logger.info("Updating complete special domain object index...");
specialDomainObjectIndexOperator.createIndex();
if (logger.isDebugEnabled()) {
Date indexCreationEndDate = new Date();
logger.debug("Index creation duration: {} ms", indexCreationEndDate.getTime() - indexCreationStartDate.getTime());
}
} catch (IOException e) {
logger.error("Could update complete special domain object index.", e);
}
}
}
createIndex() is implemented as follows:
@Override
public void createIndex() throws IOException {
logger.trace("Preparing for index generation...");
IndexWriter indexWriter = getIndexWriter();
Date start = new Date();
logger.trace("Deleting all documents from index...");
indexWriter.deleteAll();
logger.trace("Starting index generation...");
long numberOfProcessedObjects = fillIndex();
logger.debug("Index written in " + (new Date().getTime() - start.getTime()) + " milliseconds.");
logger.debug("Number of processed objects: {}", numberOfProcessedObjects);
logger.debug("Number of documents in index: {}", indexWriter.numDocs());
indexWriter.commit();
indexWriter.forceMerge(1);
}
@Override
protected long fillIndex() throws IOException {
Page<SpecialDomainObject> specialDomainObjectsPage = specialDomainObjectRepository.findAll(new PageRequest(0, MAXIMUM_PAGE_ELEMENTS));
while (true) {
addToDirectory(specialDomainObjectsPage);
if (specialDomainObjectsPage.hasNextPage()) {
specialDomainObjectsPage =
specialDomainObjectRepository.findAll(new PageRequest(specialDomainObjectsPage.getNumber() + 1, specialDomainObjectsPage.getSize()));
} else {
break;
}
}
return specialDomainObjectsPage.getTotalElements();
}
There are about 2000 specialDomainObject instances and about 80 aren't written to the index (we checked this with Luke).
Is there anything that could cause the missing documents?
We found the problem: The default encoding of the operating system was not set to UTF-8.
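For anyone hitting the same symptom: the fix was to stop relying on the platform default charset wherever the document contents are read and to pass the charset explicitly. A minimal sketch of the idea (readContents and the InputStream source are illustrative, not our actual getContents() implementation):
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
// Read text with an explicit charset instead of the platform default,
// so indexing no longer depends on the operating system's encoding.
private static String readContents(InputStream in) throws IOException {
    StringBuilder text = new StringBuilder();
    BufferedReader reader = new BufferedReader(new InputStreamReader(in, StandardCharsets.UTF_8));
    try {
        String line;
        while ((line = reader.readLine()) != null) {
            text.append(line).append('\n');
        }
    } finally {
        reader.close();
    }
    return text.toString();
}
Starting the JVM with -Dfile.encoding=UTF-8 also works around it, but passing the charset explicitly is the less fragile option.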
I am trying to achieve a distributed transaction in Apache TomEE. In words, the flow is:
A message reader (i.e. a message driven bean) reads from a queue (1) and processes one message triggering:
one or more inserts/updates on a database (2)
sending a message to another queue (3)
Operations 1, 2, & 3 are all part of the same XA transaction controlled by TomEE. Therefore, under any circumstances they either all fail or all succeed.
tomee.xml
<?xml version="1.0" encoding="UTF-8"?>
<tomee>
<!-- this resource adapter is only needed to tell TomEE not to start the internal ActiveMQ instance -->
<Resource id="MyAdapter" type="ActiveMQResourceAdapter">
BrokerXmlConfig
ServerUrl tcp://fakehost:666
</Resource>
<Resource id="jms/MyIncomingConnFactory" type="javax.jms.ConnectionFactory" class-name="org.apache.activemq.ActiveMQXAConnectionFactory">
BrokerURL tcp://externalhost:61616?jms.redeliveryPolicy.maximumRedeliveries=0
</Resource>
<Resource id="jms/MyOutgoingConnFactory" type="javax.jms.ConnectionFactory" class-name="org.apache.activemq.ActiveMQXAConnectionFactory">
BrokerURL tcp://externalhost:61616?jms.redeliveryPolicy.maximumRedeliveries=0
</Resource>
<Resource id="jms/MyOutgoingQueue" class-name="org.apache.activemq.command.ActiveMQQueue">
PhysicalName MY_OUTGOING_QUEUE
</Resource>
<Resource id="jms/MyIncomingQueue" class-name="org.apache.activemq.command.ActiveMQQueue">
PhysicalName MY_INCOMING_QUEUE
</Resource>
<Resource id="jdbc/myDBXAPooled" type="DataSource">
XaDataSource myDBXA
DataSourceCreator dbcp
JtaManaged true
UserName TestUser
Password TestPassword
MaxWait 2000
ValidationQuery SELECT 1
MaxActive 15
</Resource>
<Resource id="myDBXA" type="XADataSource" class-name="com.mysql.jdbc.jdbc2.optional.MysqlXADataSource">
Url jdbc:mysql://localhost:3306/test
User TestUser
Password TestPassword
</Resource>
</tomee>
Springconfig.xml:
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:jee="http://www.springframework.org/schema/jee"
xmlns:tx="http://www.springframework.org/schema/tx"
xsi:schemaLocation="
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-4.2.xsd
http://www.springframework.org/schema/tx http://www.springframework.org/schema/tx/spring-tx-4.2.xsd
http://www.springframework.org/schema/jee http://www.springframework.org/schema/jee/spring-jee-4.2.xsd">
<!-- <jee:jndi-lookup jndi-name="myDBXAPooled" id="myDatasource" resource-ref="true" /> -->
<jee:jndi-lookup jndi-name="jms/MyOutgoingConnFactory" id="myOutgoingConnFactory" resource-ref="true" />
<jee:jndi-lookup jndi-name="jms/MyIncomingConnFactory" id="myIncomingConnFactory" resource-ref="true" />
<jee:jndi-lookup jndi-name="jms/MyOutgoingQueue" id="myOutgoingQueue" resource-ref="true" />
<jee:jndi-lookup jndi-name="jms/MyIncomingQueue" id="myIncomingQueue" resource-ref="true" />
<jee:jndi-lookup jndi-name="jdbc/myDBXAPooled" id="myDatasource" resource-ref="true" />
<tx:jta-transaction-manager/>
<!-- <bean id="transactionManager" class="org.springframework.transaction.jta.JtaTransactionManager"/> -->
<!-- the previous two ways of getting the transactionManager seems equivalent and both get Geronimo -->
</beans>
SpringConfig.xml
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:jee="http://www.springframework.org/schema/jee"
xmlns:tx="http://www.springframework.org/schema/tx"
xsi:schemaLocation="
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-4.2.xsd
http://www.springframework.org/schema/tx http://www.springframework.org/schema/tx/spring-tx-4.2.xsd
http://www.springframework.org/schema/jee http://www.springframework.org/schema/jee/spring-jee-4.2.xsd">
<bean id="messageListener" class="com.test.MyListener">
<property name="connectionFactory" ref="myIncomingConnFactory" />
<property name="destination" ref="myIncomingQueue" />
<!-- <property name="sessionTransacted" value="true" /> -->
<property name="concurrentConsumers" value="1" />
<property name="maxConcurrentConsumers" value="6" />
<property name="messageListener" ref="myMessageProcessor" />
<property name="transactionManager" ref="transactionManager" />
<property name="taskExecutor" ref="msgListenersTaskExecutor" />
</bean>
<bean id="myMessageProcessor" class="com.test.MyMessageReceiver">
<property name="forwardConnectionFactory" ref="myOutgoingConnFactory" />
<property name="forwardQueue" ref="myOutgoingQueue" />
<property name="datasource" ref="myDatasource" />
</bean>
<bean id="msgListenersTaskExecutor" class="org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor"/>
</beans>
MyMessageReceiver.java:
package com.test;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import javax.jms.ConnectionFactory;
import javax.jms.JMSException;
import javax.jms.Message;
import javax.jms.MessageListener;
import javax.jms.MessageProducer;
import javax.jms.Queue;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.sql.DataSource;
import org.apache.log4j.Logger;
import org.springframework.transaction.annotation.Propagation;
import org.springframework.transaction.annotation.Transactional;
public class MyMessageReceiver implements MessageListener {
static Logger log = Logger.getLogger(MyMessageReceiver.class);
private ConnectionFactory forwardConnectionFactory;
private Queue forwardQueue;
private DataSource datasource;
public void setForwardConnectionFactory(ConnectionFactory connFactory) {
forwardConnectionFactory=connFactory;
}
public void setForwardQueue(Queue queue) {
forwardQueue=queue;
}
public void setDatasource(DataSource ds) {
datasource=ds;
}
@Override
@Transactional(propagation=Propagation.REQUIRED)
public void onMessage(Message message) {
log.info("************************************");
MyListener listener = (MyListener)SpringContext.getBean("messageListener");
listener.printInfo();
log.info("************************************");
TextMessage msg = (TextMessage) message;
String text = null;
try {
text = msg.getText();
if (text != null) log.info("MESSAGE RECEIVED: "+ text);
updateDB(text); // function call to update DB
sendMsg(text); // function call to publish messages to queue
System.out.println("****************Rollback");
// Throwing exception to rollback DB, Message should not be
// published and consumed message sent to a DLQ
//(Broker side DLQ configuration already done)
throw new RuntimeException();
//if (text!=null && text.indexOf("rollback")!=-1) throw new RuntimeException("Message content includes the word rollback");
} catch (Exception e) {
log.error("Rolling back the entire XA transaction");
log.error(e.getMessage());
throw new RuntimeException("Rolled back because of "+e.getMessage());
}
}
private void updateDB(String text) throws Exception {
Connection conn = null;
PreparedStatement ps = null;
try {
System.out.println("*******datasource "+datasource);
conn = datasource.getConnection();
System.out.println("*******conn "+conn.getMetaData().getUserName());
if (conn!=null) {
System.out.println("*******conn "+conn.getMetaData().getUserName());
ps = conn.prepareStatement("INSERT INTO MY_TABLE (name) VALUES(?)");
ps.setString(1, text);
ps.executeUpdate();
}
} catch (Exception e) {
throw e;
} finally {
if (ps!=null) {
try {
ps.close();
} catch (SQLException e) {
log.error(e.getMessage());
// do nothing
}
}
if (conn!=null) {
try {
conn.close();
} catch (SQLException e) {
log.error(e.getMessage());
// do nothing
}
}
}
}
private void sendMsg(String msgToBeSent) throws Exception {
javax.jms.Connection conn = null;
Session session = null;
try {
System.out.println("*************forwardConnectionFactory"+forwardConnectionFactory);
conn = forwardConnectionFactory.createConnection();
session = conn.createSession(true, Session.AUTO_ACKNOWLEDGE);
MessageProducer messageProducer = session.createProducer(forwardQueue);
TextMessage msg = session.createTextMessage(msgToBeSent);
messageProducer.send(msg);
} catch (Exception e) {
throw e;
} finally {
if (session != null) {
try {
session.close();
} catch (JMSException e) {
// do nothing
}
}
if (conn != null) {
try {
conn.close();
} catch (JMSException e) {
// do nothing
}
}
}
}
}
MyListener.java:
package com.test;
import javax.transaction.Status;
import javax.transaction.SystemException;
import org.apache.log4j.Logger;
import org.springframework.jms.listener.DefaultMessageListenerContainer;
import org.springframework.transaction.jta.JtaTransactionManager;
public class MyListener extends DefaultMessageListenerContainer {
static Logger log = Logger.getLogger(MyListener.class);
public void printInfo() {
try {
log.info("trans manager="+((JtaTransactionManager)this.getTransactionManager()).getTransactionManager()+","+((JtaTransactionManager)this.getTransactionManager()).getTransactionManager().getStatus()+", this.isSessionTransacted()="+this.isSessionTransacted());
log.info("STATUS_ACTIVE="+Status.STATUS_ACTIVE);
log.info("STATUS_COMMITTEDE="+Status.STATUS_COMMITTED);
log.info("STATUS_COMMITTING="+Status.STATUS_COMMITTING);
log.info("STATUS_MARKED_ROLLBACK="+Status.STATUS_MARKED_ROLLBACK);
log.info("STATUS_NO_TRANSACTION="+Status.STATUS_NO_TRANSACTION);
log.info("STATUS_PREPARED="+Status.STATUS_PREPARED);
log.info("STATUS_PREPARING="+Status.STATUS_PREPARING);
log.info("STATUS_ROLLEDBACK="+Status.STATUS_ROLLEDBACK);
log.info("STATUS_ROLLING_BACK="+Status.STATUS_ROLLING_BACK);
log.info("STATUS_UNKNOWN="+Status.STATUS_UNKNOWN);
} catch (SystemException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
public void forceRollback() {
try {
((JtaTransactionManager)this.getTransactionManager()).getTransactionManager().setRollbackOnly();
} catch (IllegalStateException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (SecurityException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (SystemException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
After updating the database and sending the message to the outgoing queue I am purposefully throwing a RuntimeException just to test the transaction rollback of both the database and the message broker.
All three operations are committed in case of success, but on failure only the database operation is rolled back, while the two JMS operations are committed anyway.
It could either be:
wrong settings in my tomee.xml (especially queue connection factories) or somewhere else
a bug?
I have already spent quite a lot of time fighting with this and searching for possible solutions.
It would be great to hear your opinion on this and, once again, apologies in advance if it turns out to be a mistake on my side.
I believe you need to use the ActiveMQ JCA resource adapter to ensure that connections are automatically enlisted into the XA transaction. Try this:
<tomee>
<Resource id="MyJmsResourceAdapter" type="ActiveMQResourceAdapter">
# Do not start the embedded ActiveMQ broker
BrokerXmlConfig =
ServerUrl = tcp://externalhost:61616?jms.redeliveryPolicy.maximumRedeliveries=0
</Resource>
<Resource id="jms/MyIncomingConnFactory" type="javax.jms.ConnectionFactory">
resourceAdapter = MyJmsResourceAdapter
transactionSupport = xa
</Resource>
<Resource id="jms/MyOutgoingConnFactory" type="javax.jms.ConnectionFactory">
resourceAdapter = MyJmsResourceAdapter
transactionSupport = xa
</Resource>
<Resource id="jms/MyOutgoingQueue" type="javax.jms.Queue"/>
<Resource id="jms/MyIncomingQueue" type="javax.jms.Queue"/>
<Resource id="jdbc/myDBXAPooled" type="DataSource">
XaDataSource myDBXA
DataSourceCreator dbcp
JtaManaged true
UserName TestUser
Password TestPassword
MaxWait 2000
ValidationQuery SELECT 1
MaxActive 15
</Resource>
<Resource id="myDBXA" type="XADataSource" class-name="com.mysql.jdbc.jdbc2.optional.MysqlXADataSource">
Url jdbc:mysql://localhost:3306/test
User TestUser
Password TestPassword
</Resource>
</tomee>
I’m developing a web app with JSF and Hibernate, and I’m facing a problem with Hibernate. At some point, I want to INSERT a row in a table. Just after that, I want to retrieve the 3 last inserted elements of the same table. The problem is that the INSERT is being made correctly (I checked that with a DB Client), but when I try to retrieve those 3 elements, the one I have just inserted is not returned by the SELECT statement. It seems like a kind of hibernate session problem, not reflecting the change when I make the SELECT. I attach the relevant part of the code:
hibernate.cfg.xml:
<hibernate-configuration>
<session-factory>
<property name="hibernate.connection.driver_class">com.mysql.jdbc.Driver</property>
<property name="hibernate.connection.url">jdbc:mysql://localhost:3306/habana</property>
<property name="hibernate.connection.username">root</property>
<property name="connection.password"></property>
<property name="hibernate.dialect">org.hibernate.dialect.MySQLDialect</property>
<property name="current_session_context_class">thread</property>
<property name="show_sql">true</property>
<property name="hibernate.cache.use_second_level_cache">false</property>
<property name="hibernate.cache.use_query_cache">false</property>
<mapping class="org.blk.lahabana.model.ClaseMovimiento"/>
<mapping class="org.blk.lahabana.model.Devolucion"/>
<mapping class="org.blk.lahabana.model.DevolucionProducto"/>
<mapping class="org.blk.lahabana.model.DevolucionProductoId"/>
<mapping class="org.blk.lahabana.model.Evento"/>
<mapping class="org.blk.lahabana.model.Movimiento"/>
<mapping class="org.blk.lahabana.model.Pedido"/>
<mapping class="org.blk.lahabana.model.PedidoProducto"/>
<mapping class="org.blk.lahabana.model.PedidoProductoId"/>
<mapping class="org.blk.lahabana.model.Permiso"/>
<mapping class="org.blk.lahabana.model.Producto"/>
<mapping class="org.blk.lahabana.model.TipoProducto"/>
<mapping class="org.blk.lahabana.model.Trabajador"/>
<mapping class="org.blk.lahabana.model.Turno"/>
<mapping class="org.blk.lahabana.model.TurnoTrabajador"/>
<mapping class="org.blk.lahabana.model.TurnoTrabajadorId"/>
<mapping class="org.blk.lahabana.model.Usuario"/>
</session-factory>
</hibernate-configuration>
HibernateUtil:
public class HibernateUtil {
/** Logger for this class and subclasses */
protected static final Log logger = LogFactory.getLog(HibernateUtil.class);
private static SessionFactory sessionFactory;
static {
try {
sessionFactory = new AnnotationConfiguration().configure()
.buildSessionFactory();
} catch (Throwable ex) {
logger.error("Error al crear el sessionFactory de Hibernate",ex);
throw new ExceptionInInitializerError(ex);
}
}
public static SessionFactory getSessionFactory() {
// Alternatively, we could look up in JNDI here
return sessionFactory;
}
public static void closeSession() {
// Close caches and connection pools
getSessionFactory().close();
}
}
EventsController:
public String create(){
String view = "";
Integer id = eventosService.createEvent(evento);
if(id !=null){
FacesMessage facesMsg = new FacesMessage(FacesMessage.SEVERITY_INFO, "Evento Guardado", "El evento se ha guardado con éxito");
FacesContext.getCurrentInstance().addMessage(null, facesMsg);
}
else{
FacesMessage facesMsg = new FacesMessage(FacesMessage.SEVERITY_ERROR, "Evento No Guardado", "Se ha producido un error al guardar el evento");
FacesContext.getCurrentInstance().addMessage(null, facesMsg);
}
view = dashboardController.input();
return view;
}
@Override
public Integer createEvent(Evento event) {
Integer result = null;
event.setEstado(Constants.EVENT_STATE_PLANED);
boolean saved = eventoDao.save(event);
if( saved ){
result = event.getId();
}
return result;
}
public boolean save(T entity) {
Session hbsession = HibernateUtil.getSessionFactory().getCurrentSession();
Transaction transaction = null;
boolean result = false;
try {
transaction = hbsession.beginTransaction();
hbsession.save(entity);
hbsession.flush();
transaction.commit();
result = true;
} catch (HibernateException e) {
logger.error("Error al guardar una entidad entidad",e);
if(transaction != null){
transaction.rollback();
}
}
finally{
if(hbsession != null && hbsession.isOpen()){
hbsession.close();
}
}
return result;
}
DashboardController:
public String input(){
String view = "dashboard";
// Get the most recent events
List<Evento> events = eventosService.getLastEvents();
// Convert the events into VOs for the view
this.lastEvents = new ArrayList<EventoResumenVO>();
for (Evento evento : events) {
this.lastEvents.add(eventoResumenConverter.convertFromEvent(evento));
}
return view;
}
@Override
public List<Evento> getLastEvents() {
return eventoDao.findLastEvents(3);
}
@SuppressWarnings("unchecked")
@Override
public List<Evento> findLastEvents(Integer number) {
Session hbsession = HibernateUtil.getSessionFactory().getCurrentSession();
Transaction transaction = null;
List<Evento> result = null;
try {
transaction = hbsession.beginTransaction();
Criteria criteria = hbsession.createCriteria(entityClass);
criteria.setFetchMode("movimientos", FetchMode.JOIN);
criteria.addOrder(Order.desc("fechaFin"));
criteria.setMaxResults(number.intValue());
result = criteria.list();
} catch (HibernateException e) {
logger.error("Error al recuperar la lista de entidades",e);
if(transaction != null){
transaction.rollback();
}
}
finally{
if(hbsession != null && hbsession.isOpen()){
hbsession.close();
}
}
return result;
}
Thanks for helping!
I am using Lucene 3.6. I want to know why update does not work. Is there anything wrong?
public class TokenTest
{
private static String IndexPath = "D:\\update\\index";
private static Analyzer analyzer = new StandardAnalyzer(Version.LUCENE_33);
public static void main(String[] args) throws Exception
{
try
{
update();
display("content", "content");
}
catch (IOException e)
{
e.printStackTrace();
}
}
@SuppressWarnings("deprecation")
public static void display(String keyField, String words) throws Exception
{
IndexSearcher searcher = new IndexSearcher(FSDirectory.open(new File(IndexPath)));
Term term = new Term(keyField, words);
Query query = new TermQuery(term);
TopDocs results = searcher.search(query, 100);
ScoreDoc[] hits = results.scoreDocs;
for (ScoreDoc hit : hits)
{
Document doc = searcher.doc(hit.doc);
System.out.println("doc_id = " + hit.doc);
System.out.println("内容: " + doc.get("content"));
System.out.println("路径:" + doc.get("path"));
}
}
public static String update() throws Exception
{
IndexWriterConfig writeConfig = new IndexWriterConfig(Version.LUCENE_33, analyzer);
IndexWriter writer = new IndexWriter(FSDirectory.open(new File(IndexPath)), writeConfig);
Document document = new Document();
Field field_name2 = new Field("path", "update_path", Field.Store.YES, Field.Index.ANALYZED);
Field field_content2 = new Field("content", "content update", Field.Store.YES, Field.Index.ANALYZED);
document.add(field_name2);
document.add(field_content2);
Term term = new Term("path", "qqqqq");
writer.updateDocument(term, document);
writer.optimize();
writer.close();
return "update_path";
}
}
I assume you want to update your document such that field "path" = "qqqq". You have this exactly backwards (please read the documentation).
updateDocument performs two steps:
Find and delete any documents containing term
In this case, none are found, because your indexed documents do not contain path:qqqq
Add the new document to the index.
You appear to be doing the opposite: trying to look the document up and then add the term to it, and it doesn't work that way. What you are looking for, I believe, is something like:
Term term = new Term("content", "update");
document.removeField("path");
document.add("path", "qqqq");
writer.updateDocument(term, document);
I am able to write a Java program using the RabbitMQ Java APIs that does the following:
1. Client sends a message over the RabbitMQ exchange/queue with a correlation Id (say a UUID - "348a07f5-8342-45ed-b40b-d44bfd9c4dde").
2. Server receives the message.
3. Server sends a response message over the RabbitMQ exchange/queue with the same correlation Id - "348a07f5-8342-45ed-b40b-d44bfd9c4dde".
4. Client receives the correlated message, in the same thread as step 1.
Below are Send.java and Recv.java using the RabbitMQ APIs. I need help converting this sample to use Spring AMQP, especially the receiving part in step 4. I am looking for something like a receive method that can filter messages by correlation Id.
Send.java:
import java.util.UUID;
import com.rabbitmq.client.AMQP.BasicProperties;
import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;
import com.rabbitmq.client.QueueingConsumer;
public class Send {
private final static String REQUEST_QUEUE = "REQUEST.QUEUE";
private final static String RESPONSE_QUEUE = "RESPONSE.QUEUE";
public static void main(String[] argv) throws Exception {
ConnectionFactory factory = new ConnectionFactory();
factory.setHost("localhost");
Connection connection = factory.newConnection();
Channel channel = connection.createChannel();
channel.queueDeclare(REQUEST_QUEUE, false, false, false, null);
String message = "Hello World!";
String cslTransactionId = UUID.randomUUID().toString();
BasicProperties properties = (new BasicProperties.Builder())
.correlationId(cslTransactionId)
.replyTo(RESPONSE_QUEUE).build();
channel.basicPublish("", REQUEST_QUEUE, properties, message.getBytes());
System.out.println("Client Sent '" + message + "'");
Channel responseChannel = connection.createChannel();
responseChannel.queueDeclare(RESPONSE_QUEUE, false, false, false, null);
QueueingConsumer consumer = new QueueingConsumer(channel);
responseChannel.basicConsume(RESPONSE_QUEUE, false, consumer);
String correlationId = null;
while (true) {
QueueingConsumer.Delivery delivery = consumer.nextDelivery();
String responseMessage = new String(delivery.getBody());
correlationId = delivery.getProperties().getCorrelationId();
System.out.println("Correlation Id:" + correlationId);
if (correlationId.equals(cslTransactionId)) {
System.out.println("Client Received '" + responseMessage + "'");
responseChannel.basicAck(delivery.getEnvelope().getDeliveryTag(), true);
break;
}
}
channel.close();
connection.close();
}
}
Recv.java
import com.rabbitmq.client.ConnectionFactory;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.Channel;
import com.rabbitmq.client.QueueingConsumer;
import com.rabbitmq.client.AMQP.BasicProperties;
public class Recv {
private final static String REQUEST_QUEUE = "REQUEST.QUEUE";
private final static String RESPONSE_QUEUE = "RESPONSE.QUEUE";
public static void main(String[] argv) throws Exception {
ConnectionFactory factory = new ConnectionFactory();
factory.setHost("localhost");
Connection connection = factory.newConnection();
Channel channel = connection.createChannel();
channel.queueDeclare(REQUEST_QUEUE, false, false, false, null);
System.out.println(" [*] Waiting for messages. To exit press CTRL+C");
QueueingConsumer consumer = new QueueingConsumer(channel);
channel.basicConsume(REQUEST_QUEUE, true, consumer);
String correlationId = null;
while (true) {
QueueingConsumer.Delivery delivery = consumer.nextDelivery();
String message = new String(delivery.getBody());
correlationId = delivery.getProperties().getCorrelationId();
System.out.println("Correlation Id:" + correlationId);
System.out.println("Server Received '" + message + "'");
if (correlationId != null)
break;
}
String responseMsg = "Response Message";
Channel responseChannel = connection.createChannel();
responseChannel.queueDeclare(RESPONSE_QUEUE, false, false, false, null);
BasicProperties properties = (new BasicProperties.Builder())
.correlationId(correlationId).build();
channel.basicPublish("", RESPONSE_QUEUE, properties,responseMsg.getBytes());
System.out.println("Server Sent '" + responseMsg + "'");
channel.close();
connection.close();
}
}
After running the Java configuration provided by Gary, I am trying to move the configuration to XML format, adding the listener on the server side. Below is the XML configuration:
server.xml
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:int="http://www.springframework.org/schema/integration"
xmlns:int-amqp="http://www.springframework.org/schema/integration/amqp"
xmlns:rabbit="http://www.springframework.org/schema/rabbit"
xmlns:int-stream="http://www.springframework.org/schema/integration/stream"
xsi:schemaLocation="http://www.springframework.org/schema/integration/amqp http://www.springframework.org/schema/integration/amqp/spring-integration-amqp.xsd
http://www.springframework.org/schema/integration http://www.springframework.org/schema/integration/spring-integration.xsd
http://www.springframework.org/schema/integration/stream http://www.springframework.org/schema/integration/stream/spring-integration-stream.xsd
http://www.springframework.org/schema/rabbit http://www.springframework.org/schema/rabbit/spring-rabbit-1.3.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">
<bean
id="serviceListenerContainer"
class="org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer">
<property name="connectionFactory" ref="connectionFactory" />
<property name="queues" ref="requestQueue"/>
<property name="messageListener" ref="messageListenerAdaptor"/>
</bean>
<bean id="messageListenerAdaptor"
class="org.springframework.amqp.rabbit.listener.adapter.MessageListenerAdapter">
<property name="delegate" ref="pojoListener" />
</bean>
<bean
id="pojoListener"
class="PojoListener"/>
<bean
id="replyListenerContainer"
class="org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer">
<property name="connectionFactory" ref="connectionFactory"/>
<property name="queues" ref="replyQueue"/>
<property name="messageListener" ref="fixedReplyQRabbitTemplate"/>
</bean>
<!-- Infrastructure -->
<rabbit:connection-factory
id="connectionFactory"
host="localhost"
username="guest"
password="guest"
cache-mode="CHANNEL"
channel-cache-size="5"/>
<rabbit:template
id="fixedReplyQRabbitTemplate"
connection-factory="connectionFactory"
exchange="fdms.exchange"
routing-key="response.key"
reply-queue="RESPONSE.QUEUE">
<rabbit:reply-listener/>
</rabbit:template>
<rabbit:admin connection-factory="connectionFactory"/>
<rabbit:queue id="requestQueue" name="REQUEST.QUEUE" />
<rabbit:queue id="replyQueue" name="RESPONSE.QUEUE" />
<rabbit:direct-exchange name="fdms.exchange" durable="true" auto-delete="false">
<rabbit:bindings>
<rabbit:binding queue="RESPONSE.QUEUE" key="response.key" />
</rabbit:bindings>
</rabbit:direct-exchange>
</beans>
SpringReceive.java
import org.springframework.amqp.core.Message;
import org.springframework.amqp.core.MessageBuilder;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer;
import org.springframework.context.support.ClassPathXmlApplicationContext;
public class SpringReceive {
/**
* @param args
*/
public static void main(String[] args) {
ClassPathXmlApplicationContext context = new ClassPathXmlApplicationContext("cslclient.xml");
SimpleMessageListenerContainer serviceListenerContainer = context.getBean("serviceListenerContainer", SimpleMessageListenerContainer.class);
serviceListenerContainer.start();
}
}
You can use RabbitTemplate.sendAndReceive() (or convertSendAndReceive()) with a reply listener container (see the Spring AMQP reference documentation on request/reply messaging); the template will take care of the correlation for you.
If you are using Spring Integration, use an outbound gateway with an appropriately configured rabbit template.
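For the client side, a rough sketch of what that looks like in code (Spring AMQP 1.3-era API to match your spring-rabbit-1.3.xsd; the class name SpringSend is made up and the queue names are taken from your sample, so adjust to your own wiring):
import org.springframework.amqp.core.Queue;
import org.springframework.amqp.rabbit.connection.CachingConnectionFactory;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer;
public class SpringSend {
    public static void main(String[] args) {
        CachingConnectionFactory connectionFactory = new CachingConnectionFactory("localhost");
        RabbitTemplate template = new RabbitTemplate(connectionFactory);
        template.setReplyQueue(new Queue("RESPONSE.QUEUE")); // fixed reply queue
        template.setReplyTimeout(10000);
        // The reply listener container hands replies back to the template,
        // which matches them to the waiting sender by correlation id.
        SimpleMessageListenerContainer replyContainer = new SimpleMessageListenerContainer(connectionFactory);
        replyContainer.setQueueNames("RESPONSE.QUEUE");
        replyContainer.setMessageListener(template);
        replyContainer.afterPropertiesSet();
        replyContainer.start();
        // Blocks until the correlated reply arrives or the reply timeout expires.
        Object reply = template.convertSendAndReceive("", "REQUEST.QUEUE", "Hello World!");
        System.out.println("Client Received '" + reply + "'");
        replyContainer.stop();
        connectionFactory.destroy();
    }
}
This is essentially what your <rabbit:template> with a nested <rabbit:reply-listener/> declares in XML.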
In my app I have documents that represent my data for each category, and my application automatically indexes new and modified documents.
If I index all documents in one category, it works fine and retrieves correct results. The problem is that if I modify a document or create a new one, it is not retrieved, even when it matches my search query.
It usually keeps returning all docs except the last modified one.
Any help, please?
I have this IndexWriter config:
private IndexWriter getIndexWriter() throws IOException {
Directory directory = FSDirectory.open(new File(filepath));
IndexWriterConfig config = new IndexWriterConfig(Version.LUCENE_43, IndexFactory.ANALYZER);
config.setRAMBufferSizeMB(350);
TieredMergePolicy tmp = new TieredMergePolicy();
tmp.setUseCompoundFile(false);
config.setMergePolicy(tmp);
ConcurrentMergeScheduler scheduler = (ConcurrentMergeScheduler) config.getMergeScheduler();
scheduler.setMaxThreadCount(2);
scheduler.setMaxMergeCount(20);
IndexWriter writer = new IndexWriter(directory, config);
writer.forceMerge(1);
return writer;
}
My Collector:
public void collect(int docNum) throws IOException {
try {
if ((getCount() == getMaxSearchLimit() + 1) && getMaxSearchResults() != null) {
setCounterExceededLimit(true);
return;
}
addDocKey(); // method to add and render the matching docs in a customized way
} catch(IOException exp) {
if (!getErrors().toArrayList(getApplication().getLocale()).contains(exp.getMessage())) {
getErrors().addError(exp.getMessage());
}
} catch (BusinessException bEx) {
if (!getErrors().containsError(bEx.getErrorNumber())) {
getErrors().addError(bEx);
}
} catch (CounterExceededLimitException counterEx) {
return;
}
}
@Override
public boolean acceptsDocsOutOfOrder() {
// TODO Auto-generated method stub
return true;
}
@Override
public void setNextReader(AtomicReaderContext context) throws IOException {
// TODO Auto-generated method stub
}
@Override
public void setScorer(Scorer scorer) throws IOException {
// TODO Auto-generated method stub
}
Actually, I have this business logic to save my doc; then, if the doc was saved successfully, I add it to the indexing process.
public boolean saveDocument(CategoryDocument doc) {
boolean saved = false;
// code to save my doc
if(saved) {
//add this document to the index process
IndexManager.getInstance().addToIndex(this);
}
return saved;
}
Then my index manager creates a new thread to handle indexing this doc.
Here is my process to index the data document:
private void processDocument(IndexDocument indexDoc, DocKey docKey, boolean addToIndex) throws SearchException, BusinessException {
CategorySetting catSetting = docKey.getCategorySetting();
Integer catID = catSetting.getID();
IndexManager manager = IndexManager.getInstance();
IndexWriter writer = null;
try {
//Delete the lock file in case previous index operation failed to delete it
File lockFile = new File(filepath, IndexWriter.WRITE_LOCK_NAME);
if (lockFile != null && lockFile.exists()) {
lockFile.delete();
}
if(!manager.isGlobalIndexingProcess(catID)) {
writer = getIndexWriter();
} else {
writer = manager.getGlobalIndexWriter(catID);
}
writer.forceMerge(1);
removeDocument(docKey, writer);
if (addToIndex) {
writer.addDocument(indexDoc.getLuceneIndexDoc());
}
} catch(IOException exp) {
throw new SearchException(exp.getMessage(), true);
} finally {
if(!manager.isGlobalIndexingProcess(catID)) {
if (writer != null) {
try {
writer.close(true);
} catch(IOException ex) {
throw new SearchException(ex);
}
}
}
}
}
Use Lucene search to look for the word or phrase that you edited in the document, and let us know whether you get the correct hits. If you don't get any hits, then you are probably not indexing the edited or newly added documents.
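For example, something along these lines will tell you quickly whether the edited document actually made it into the index (Lucene 4.3 style to match your writer config; the "content" field name and the filepath variable are assumptions, and the analyzer should be the same one you index with). The key detail is opening a fresh reader after the write, because an IndexReader/IndexSearcher opened earlier never sees later changes:
import java.io.File;
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.index.DirectoryReader;
import org.apache.lucene.index.IndexReader;
import org.apache.lucene.queryparser.classic.QueryParser;
import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.search.Query;
import org.apache.lucene.search.ScoreDoc;
import org.apache.lucene.store.FSDirectory;
import org.apache.lucene.util.Version;
public class IndexCheck {
    // Searches the index for a word you just edited into a document.
    public static void check(String filepath, String editedWord) throws Exception {
        // Open a *new* reader after the document was (re)indexed.
        IndexReader reader = DirectoryReader.open(FSDirectory.open(new File(filepath)));
        try {
            IndexSearcher searcher = new IndexSearcher(reader);
            QueryParser parser = new QueryParser(Version.LUCENE_43, "content",
                    new StandardAnalyzer(Version.LUCENE_43)); // use the same analyzer as indexing
            Query query = parser.parse(editedWord);
            ScoreDoc[] hits = searcher.search(query, 10).scoreDocs;
            System.out.println("hits for '" + editedWord + "': " + hits.length);
        } finally {
            reader.close();
        }
    }
}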