I created a Lucene index in gfsh using the following command:
create lucene index --name=myLucIndex --region=myRegion --field=title
--analyzer=org.apache.lucene.analysis.en.EnglishAnalyzer --serializer=a.b.c.MyLatLongSerializer
My serializer is as follows:
class MyLatLongSerializer implements LuceneSerializer<Book> {
    @Override
    public Collection<Document> toDocuments(LuceneIndex luceneIndex, Book book) {
        logger.debug("inside custom lucene serializer ...");
        // Writes fields of Book into a document
        Document newDocument = new Document();
        newDocument.add(new StoredField("title", book.getTitle()));
        newDocument.add(new LatLonPoint("location", book.getLatitude(), book.getLongitude()));
        return Collections.singleton(newDocument);
    }
}
My Spring Boot configuration is as follows:
@Configuration
@ClientCacheApplication
@EnableClusterDefinedRegions(clientRegionShortcut = ClientRegionShortcut.CACHING_PROXY)
@EnableIndexing
public class BookConfiguration {

    @Bean(name = "bookGemfireCache")
    ClientCacheConfigurer bookGemfireCache(
            @Value("${spring.data.geode.locator.host:localhost}") String hostname,
            @Value("${spring.data.geode.locator.port:10334}") int port) {
        // Get clientCache
    }

    @Bean
    Region<Long, Book> bookRegion(ClientCache clientCache) {
        logger.debug("inside regions ...");
        return clientCache.getRegion("myRegion");
    }

    @Bean
    LuceneService ukBikesLuceneService(ClientCache clientCache) {
        return LuceneServiceProvider.get(clientCache);
    }
}
I load data into Geode using the following code:
bookRegion.putAll(books); // books is a Map<Long, Book>
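For concreteness, a minimal sketch of the load (the Book constructor and the values here are illustrative, not from the real application):
Map<Long, Book> books = new HashMap<>();
books.put(1L, new Book("Lucene in Action", 40.7128, -74.0060)); // hypothetical constructor: title, latitude, longitude
books.put(2L, new Book("Effective Java", 51.5074, -0.1278));
bookRegion.putAll(books);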
When I then run
describe lucene index --name=myLucIndex --region=myRegion
the document count is 0. But when I create the Lucene index without the custom serializer, using the command below:
create lucene index --name=myLucIndex --region=myRegion --field=title
--analyzer=org.apache.lucene.analysis.en.EnglishAnalyzer
then load the data again and run
describe lucene index --name=myLucIndex --region=myRegion
the document count is 96.
I use Spring Data Geode 2.1.8.RELEASE, geode-core 1.9.0 and lucene-core 8.2.0.
What am I missing here?
Apache Geode currently uses Apache Lucene version 6.6.6, while you're using lucene-core 8.2.0, which is not backward compatible with older major versions like 6.x; that's the reason why you're getting these failures. Everything should work just fine if you use the Lucene version shipped with Geode.
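As a quick sanity check, you can print which Lucene version actually ends up on your client's classpath (a minimal sketch; org.apache.lucene.util.Version.LATEST exists in both Lucene 6.x and 8.x):
import org.apache.lucene.util.Version;

public class LuceneVersionCheck {
    public static void main(String[] args) {
        // With the Lucene bundled by Geode 1.9.0 this should print a 6.6.x
        // version (per the note above); if it prints 8.2.0, the incompatible
        // jar is still winning on the classpath.
        System.out.println(Version.LATEST);
    }
}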
As a side note, there are current efforts to upgrade the Lucene version used by Geode, you can follow the progress through GEODE-7039.
Hope this helps.
Related
I am working on a legacy non-Spring application that is being migrated from Hibernate 3 to Hibernate 5.6.0.Final (the latest at this time). I have never really used Hibernate event listeners in my work, so this is quite new to me, and I am studying them in Hibernate 5.
Currently, in a test class, we have the following code for Hibernate 3:
protected static Configuration createSecuredDatabaseConfig() {
    Configuration config = createUnrestrictedDatabaseConfig();
    config.setListener("pre-insert", "com.app.server.services.db.eventlisteners.MySecurityHibernateEventListener");
    config.setListener("pre-update", "com.app.server.services.db.eventlisteners.MySecurityHibernateEventListener");
    config.setListener("pre-delete", "com.app.server.services.db.eventlisteners.MySecurityHibernateEventListener");
    config.setListener("pre-load", "com.app.server.services.db.eventlisteners.MySecurityHibernateEventListener");
    return config;
}
This is obviously no longer valid, and I believe I need to create a Hibernate Integrator, which I have done.
public class MyEventListenerIntegrator implements Integrator {

    @Override
    public void integrate(Metadata metadata, SessionFactoryImplementor sessionFactory,
                          SessionFactoryServiceRegistry serviceRegistry) {
        EventListenerRegistry eventListenerRegistry = serviceRegistry.getService(EventListenerRegistry.class);
        eventListenerRegistry.getEventListenerGroup(EventType.PRE_INSERT).appendListener(new MySecurityHibernateEventListener());
        eventListenerRegistry.getEventListenerGroup(EventType.PRE_UPDATE).appendListener(new MySecurityHibernateEventListener());
        eventListenerRegistry.getEventListenerGroup(EventType.PRE_DELETE).appendListener(new MySecurityHibernateEventListener());
        eventListenerRegistry.getEventListenerGroup(EventType.PRE_LOAD).appendListener(new MySecurityHibernateEventListener());
    }
}
So, now I believe the next step is to add this to the session via the registry builder. I am using this website to help me:
https://www.boraji.com/hibernate-5-event-listener-example
Because we were using older Hibernate 3, we had code to create our session factory as follows:
protected static SessionFactory buildSessionFactory(Database db) {
    if (db == null) {
        throw new NullPointerException("Database specifier cannot be null");
    }
    try {
        Configuration config = createSessionFactoryConfiguration(db);
        String url = config.getProperty("connection.url");
        String user = config.getProperty("connection.username");
        String password = config.getProperty("connection.password");
        try {
            // Sanity-check that the JDBC driver and the database are reachable.
            String dbDriver = config.getProperty("hibernate.connection.driver_class");
            Class.forName(dbDriver);
            Connection conn = DriverManager.getConnection(url, user, password);
            conn.close();
        }
        catch (SQLException | ClassNotFoundException error) {
            logger.info("Didn't find driver, on QA or production, so it's okay to assume we have DB connection");
            error.printStackTrace();
        }
        SessionFactory sessionFactory = config.buildSessionFactory();
        sessionFactoryConfigs.put(sessionFactory, config); // Cannot recover config from factory instance, must be stored.
        return sessionFactory;
    }
    catch (Throwable ex) {
        // Make sure you log the exception, as it might be swallowed
        logger.error("Initial SessionFactory creation failed.", ex);
        throw new ExceptionInInitializerError(ex);
    }
}
The link I referred to above has a much different way of creating the SessionFactory, so I'll be testing that out to see if it works in our app.
Without Spring handling our sessions and transactions, this app codes them by hand the way it was done before Spring, and I haven't seen that kind of code in years.
I solved this issue with help from the link I provided above. However, I didn't copy exactly what they did; some of it helped. My solution is as follows:
protected static SessionFactory createSecuredDatabaseConfig() {
    Configuration config = createUnrestrictedDatabaseConfig();
    BootstrapServiceRegistry bootstrapRegistry =
            new BootstrapServiceRegistryBuilder()
                    .applyIntegrator(new MyEventListenerIntegrator())
                    .build();
    ServiceRegistry serviceRegistry = new StandardServiceRegistryBuilder(bootstrapRegistry)
            .applySettings(config.getProperties())
            .build();
    SessionFactory sessionFactory = config.buildSessionFactory(serviceRegistry);
    return sessionFactory;
}
This was it. I tried multiple different ways to register the events without the BootstrapServiceRegistry, but none of those worked. I did have to create the integrator. What I did NOT include was the following:
MetadataSources sources = new MetadataSources(serviceRegistry)
        .addPackage("com.myproject.server.model");
Metadata metadata = sources.getMetadataBuilder().build();
// did not create the sessionFactory this way
sessionFactory = metadata.getSessionFactoryBuilder().build();
If I had gone further and used this method to create the sessionFactory, then all of my queries would have complained about not being able to find the parameter name, but that is something else.
The Hibernate Integrator and this way of creating the sessionFactory are all for the unit tests. Without registering these events, one unit test would fail, and now it doesn't. So this solves my problem for now.
I am trying to read a Solr index file. The file was created by an example from the Solr download pages, in version 6.4.
I am using this code:
import java.io.File;
import java.io.IOException;
import org.apache.lucene.document.Document;
import org.apache.lucene.index.IndexReader;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.FSDirectory;

public class TestIndex {
    public static void main(String[] args) throws IOException {
        Directory dirIndex = FSDirectory.open(new File("D:\\data\\data\\index"));
        IndexReader indexReader = IndexReader.open(dirIndex);
        Document doc = null;
        for (int i = 0; i < indexReader.numDocs(); i++) {
            doc = indexReader.document(i);
        }
        System.out.println(doc.toString());
        indexReader.close();
        dirIndex.close();
    }
}
Solr jar: solr-solrj-6.5.1.jar
Lucene: lucene-core-r1211247.jar
Exception:
Exception in thread "main" org.apache.lucene.index.IndexFormatTooOldException: Format version is not supported (resource: ChecksumIndexInput(MMapIndexInput(path="D:\data\data\index\segments_2"))): 1071082519 (needs to be between -9 and -12). This version of Lucene only supports indexes created with release 3.0 and later.
Updated code with Lucene 6.5.1:
Path path = FileSystems.getDefault().getPath("D:\\data\\data\\index");
Directory dirIndex = FSDirectory.open(path);
DirectoryReader dr = DirectoryReader.open(dirIndex);
Document doc = null;
for (int i = 0; i < dr.numDocs(); i++) {
    doc = dr.document(i);
}
System.out.println(doc.toString());
dr.close();
dirIndex.close();
Exception:
java.lang.UnsupportedClassVersionError: org/apache/lucene/store/Directory : Unsupported major.minor version 52.0.
Could you please help me run this code?
Thanks,
Virendra Agarwal
I suggest using Luke.
https://github.com/DmitryKey/luke
Luke is the GUI tool for introspecting your Lucene / Solr / Elasticsearch index. It allows:
- Viewing your documents and analyzing their field contents (for stored fields)
- Searching in the index
- Performing index maintenance: index health checking, index optimization (take a backup before running this!)
- Reading the index from HDFS
- Exporting the index or a portion of it into an XML format
- Testing your custom Lucene analyzers
- Creating your own plugins!
That Lucene jar seems to be from 2012, so it's over five years old. Use lucene-core-6.5.1 to read index files generated by Solr 6.5.1, and run it on a Java 8 (or newer) JVM: the UnsupportedClassVersionError with major.minor version 52.0 means the classes were compiled for Java 8 and your runtime is older.
You can pin your dependencies in your build file if it's picking up an arbitrarily named file by mistake.
I am trying to write a quick class to trigger the data import on Solr. I know I can just use HttpClient, but I've already got Spring Data Solr configured, and it has the server configured, etc.
Is it possible to use the Query interface and the SolrTemplate to just send a request to the dataimport request handler with "command=full-import" as params?
How can I do that?
If you have access to a SolrTemplate instance, you could execute a SolrCallback as follows:
solrTemplate.execute(new SolrCallback<Void>() {
    @Override
    public Void doInSolr(SolrServer solrServer) throws SolrServerException, IOException {
        // Address the /dataimport handler and ask for a full import.
        ModifiableSolrParams params = new ModifiableSolrParams();
        params.set("qt", "/dataimport");
        params.set("command", "full-import");
        solrServer.query(params);
        return null;
    }
});
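As a side note, a sketch of the equivalent on newer Spring Data Solr versions (assuming 2.x or later, where the callback receives a SolrClient instead of a SolrServer, with the same template setup):
solrTemplate.execute(new SolrCallback<Void>() {
    @Override
    public Void doInSolr(SolrClient solrClient) throws SolrServerException, IOException {
        // Same idea: query the /dataimport handler with command=full-import.
        ModifiableSolrParams params = new ModifiableSolrParams();
        params.set("qt", "/dataimport");
        params.set("command", "full-import");
        solrClient.query(params);
        return null;
    }
});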
I have a very loosely coupled system that takes just about any JSON payload and saves it in a Mongo collection.
There are no entities to expose as resources, only controller endpoints, e.g.:
@RequestMapping(method = RequestMethod.POST, consumes = MediaType.APPLICATION_JSON_VALUE, produces = MediaType.APPLICATION_JSON_VALUE)
public ResponseEntity<Map<String, Object>> publish(@RequestBody Map<String, Object> jsonBody) {
    // ... save the body in mongo ...
}
I still want to build a hypermedia-driven app with links for navigation and paging.
The controller therefore implements ResourceProcessor:
public class PublicationController implements ResourceProcessor<RepositoryLinksResource> {
    // ...
    @Override
    public RepositoryLinksResource process(RepositoryLinksResource resource) {
        resource.add(linkTo(methodOn(PublicationController.class).getPublications()).withRel("publications"));
        return resource;
    }
}
The problem is that the processor never gets called.
Putting @EnableWebMvc on a configuration class solves it (the processor gets called), but firstly that should not be necessary, and secondly the format of the HAL links seems broken,
e.g. links get formatted as a list:
links: [
{
"links":[
{
"rel":"self",
"href":"http://localhost:8080/api/publications/121212"
},
{
"rel":"findByStartTimeBetween",
"href":"http://localhost:8080/api/publications/search/findStartTimeBetween?timeStart=2015-04-10T13:44:56.437&timeEnd=2015-04-10T13:44:56.439"
}
]
}
Are there alternatives to @EnableWebMvc so the processor gets called?
Currently I'm running Spring Boot v1.2.3.
Well, it turns out that the answer was quite simple.
The problem was that I had static content (resources/static/index.html).
This suppresses the hypermedia links from the root.
Moving the static content made everything work great.
I am generating a PDF file via FOP 1.0 from a Java library. The unit tests are running fine and the PDF is rendered as expected, including an external graphic:
<fo:external-graphic content-width="20mm" src="url('images/image.png')" />
If I render this within a Java EE application in GlassFish 3.1, I always get the following error:
Image not found. URI: images/image.png. (No context info available)
I double-checked whether the image is available. It is available within the .jar file in the .ear file and should therefore be available via the ClasspathUriResolver. This is a code snippet showing how I set up the FopFactory:
FopFactory fopFactory = FopFactory.newInstance();
URIResolver uriResolver = new ClasspathUriResolver();
fopFactory.setURIResolver(uriResolver);
Fop fop = fopFactory.newFop(MimeConstants.MIME_PDF, out);
...
I also assigned the URI resolver to the TransformerFactory and the Transformer, with no success. It would be great if someone could help me out.
-- Wintermute
Btw, the ClasspathUriResolver looks like this:
public class ClasspathUriResolver implements URIResolver {
    @Override
    public Source resolve(String href, String base) throws TransformerException {
        Source source = null;
        InputStream inputStream = ClassLoader.getSystemResourceAsStream(href);
        if (inputStream != null) {
            source = new StreamSource(inputStream);
        }
        return source;
    }
}
You might need a different class loader than the system class loader that ClassLoader.getSystemResourceAsStream(href) uses; in an EE container the application's classes and resources are usually not visible to it.
Try InputStream inputStream = getClass().getResourceAsStream(href); or something else, maybe.
Does it work then?
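For illustration, a minimal sketch of the resolver using the thread context class loader instead (one way to pick a loader that can see the application's jars in an EE container; whether it finds the image still depends on how the .ear is packaged):
public class ClasspathUriResolver implements URIResolver {
    @Override
    public Source resolve(String href, String base) throws TransformerException {
        // The context class loader in an EE container typically delegates to the
        // application's class loader, unlike ClassLoader.getSystemResourceAsStream.
        InputStream inputStream = Thread.currentThread().getContextClassLoader().getResourceAsStream(href);
        return (inputStream != null) ? new StreamSource(inputStream) : null;
    }
}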