Custom Lucene Index not visible in Sitecore's Indexing Manager

I have been given the task of implementing content search on a Sitecore (v7.2) based website. I am planning to use the Lucene search provider, since it comes bundled with Sitecore out of the box and our search requirements don't seem extensive enough to justify moving to Solr.
We want users to be able to search a bucketable list of content residing in Sitecore from the main site.
The documentation and blogs explaining how to do this are sketchy and incomplete.
I used the below blog as a reference point:
http://www.mattburkedev.com/sitecore-7-contentsearch-tips/
After adding the index configuration file to the App_Config/Include folder, I expected to see the new index in Sitecore's Indexing Manager. However, it does not appear there. Any ideas on what I'm doing wrong?
I want to create a custom index so that I can target only particular Sitecore nodes. My configuration file is below; I only need to search data within the articles node, using the fields set in the articles item template.
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
  <sitecore>
    <contentSearch>
      <configuration type="Sitecore.ContentSearch.LuceneProvider.LuceneSearchConfiguration, Sitecore.ContentSearch.LuceneProvider">
        <indexes hint="list:AddIndex">
          <index id="book_index" type="Sitecore.ContentSearch.LuceneProvider.LuceneIndex, Sitecore.ContentSearch.LuceneProvider">
            <param desc="name">$(id)</param>
            <param desc="folder">$(id)</param>
            <!-- This initializes the index property store. Id has to be set to the index id. -->
            <param desc="propertyStore" ref="contentSearch/databasePropertyStore" param1="$(id)" />
            <strategies hint="list:AddStrategy">
              <!-- NOTE: the order of these controls the execution order -->
              <strategy ref="contentSearch/indexUpdateStrategies/syncMaster" />
            </strategies>
            <commitPolicyExecutor type="Sitecore.ContentSearch.CommitPolicyExecutor, Sitecore.ContentSearch">
              <policies hint="list:AddCommitPolicy">
                <policy type="Sitecore.ContentSearch.TimeIntervalCommitPolicy, Sitecore.ContentSearch" />
              </policies>
            </commitPolicyExecutor>
            <locations hint="list:AddCrawler">
              <crawler type="Sitecore.ContentSearch.SitecoreItemCrawler, Sitecore.ContentSearch">
                <Database>master</Database>
                <Root>/sitecore/content/support/articles</Root>
              </crawler>
            </locations>
          </index>
        </indexes>
      </configuration>
    </contentSearch>
  </sitecore>
</configuration>

I was finally able to see my index in the Indexing Manager. The problem turned out to be the name of the config file: I had named my index file "Sitecore.ContentSearch.Lucene.Downloads.config", so it was being patched in before the standard Lucene config, hence the issue. Once the file name was changed so that it is patched in after the standard config, the index appeared.

Just rename your config file to z.Sitecore.ContentSearch.Lucene.Downloads.config
This is because when Sitecore merges all your config files into one, the file name determines the patch order.
Thanks

Some of the type references in your config are off: the outer <configuration> element should use Sitecore.ContentSearch.ContentSearchConfiguration (not the Lucene provider configuration type), and the index needs a reference to contentSearch/indexConfigurations/defaultLuceneIndexConfiguration. Please try the config below and see if the index shows up for you; I have modified it to match your parameters.
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
  <sitecore>
    <contentSearch>
      <configuration type="Sitecore.ContentSearch.ContentSearchConfiguration, Sitecore.ContentSearch">
        <indexes hint="list:AddIndex">
          <index id="book_index" type="Sitecore.ContentSearch.LuceneProvider.LuceneIndex, Sitecore.ContentSearch.LuceneProvider">
            <param desc="name">$(id)</param>
            <param desc="folder">$(id)</param>
            <!-- This initializes the index property store. Id has to be set to the index id. -->
            <param desc="propertyStore" ref="contentSearch/databasePropertyStore" param1="$(id)" />
            <configuration ref="contentSearch/indexConfigurations/defaultLuceneIndexConfiguration" />
            <strategies hint="list:AddStrategy">
              <!-- NOTE: the order of these controls the execution order -->
              <strategy ref="contentSearch/indexUpdateStrategies/syncMaster" />
            </strategies>
            <commitPolicyExecutor type="Sitecore.ContentSearch.CommitPolicyExecutor, Sitecore.ContentSearch">
              <policies hint="list:AddCommitPolicy">
                <policy type="Sitecore.ContentSearch.TimeIntervalCommitPolicy, Sitecore.ContentSearch" />
              </policies>
            </commitPolicyExecutor>
            <locations hint="list:AddCrawler">
              <crawler type="Sitecore.ContentSearch.SitecoreItemCrawler, Sitecore.ContentSearch">
                <Database>master</Database>
                <Root>/sitecore/content/support/articles</Root>
              </crawler>
            </locations>
          </index>
        </indexes>
      </configuration>
    </contentSearch>
  </sitecore>
</configuration>

Related

Running Gatling tests not showing testFile.log in IntelliJ

We have an issue where, whenever our Gatling performance tests are run, the .log file that should be generated at the root of the folder is not there.
This is my whole logback file, if anyone is able to help:
<contextListener class="ch.qos.logback.classic.jul.LevelChangePropagator">
  <resetJUL>true</resetJUL>
</contextListener>
<appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
  <!-- <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
    <level>INFO</level>
  </filter> -->
  <encoder>
    <pattern>%d{HH:mm:ss.SSS} [%-5level] %logger{15} - %msg%n%rEx</pattern>
    <immediateFlush>true</immediateFlush>
  </encoder>
</appender>
<appender name="FILE" class="ch.qos.logback.core.FileAppender">
  <file>testFile.log</file>
  <append>false</append>
  <!-- encoders are assigned the type
       ch.qos.logback.classic.encoder.PatternLayoutEncoder by default -->
  <encoder>
    <pattern>%d{HH:mm:ss.SSS} [%-5level] %logger{15} - %msg%n%rEx</pattern>
  </encoder>
</appender>
<!-- uncomment and set to DEBUG to log all failing HTTP requests -->
<!-- uncomment and set to TRACE to log all HTTP requests -->
<logger name="io.gatling.http.engine.response" level="TRACE" />
<root level="WARN">
  <appender-ref ref="CONSOLE" />
  <appender-ref ref="FILE" />
</root>
Thank you very much.
Update
It seems the issue may be with IntelliJ itself, as we noticed that the file is there when we go to it directly in the Finder.
Disabling custom plugins should help; it seems one of the configurations was corrupted.
There's a good chance the file is simply not generated where you expect it. Try setting an absolute path instead to verify.
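For example, here is a minimal sketch of the FILE appender from the config above with an absolute path instead of the relative testFile.log (the /tmp/gatling path is just an illustration; use whatever location suits your machine):
<appender name="FILE" class="ch.qos.logback.core.FileAppender">
  <!-- absolute path, so the output location no longer depends on the working directory IntelliJ uses -->
  <file>/tmp/gatling/testFile.log</file>
  <append>false</append>
  <encoder>
    <pattern>%d{HH:mm:ss.SSS} [%-5level] %logger{15} - %msg%n%rEx</pattern>
  </encoder>
</appender>
If the file appears at the absolute location, the original problem is only the working directory of the run configuration, not the logback setup itself.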

Dynamically populating a variable in a log file

I have a web service (Jenkins) that handles user requests, and I want to be able to dynamically append the request session id to each log line without having to actually add that variable to each and every log action.
I'm using log4j2 with the SLF4J binding. I initialize the logging configuration from an external file with org.apache.logging.log4j.core.config.Configurator, and I create an instance of the logger for every session using
final Logger logger = LoggerFactory.getLogger(MyClass.class);
I have for example:
logger.debug("received new request");
...
logger.debug("added something");
And I want the user session id to be added to each line without having to add it myself like:
logger.debug("{} received new request",session.getId());
...
logger.debug("{} added something",session.getId());
My log4j2.xml file is:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE xml>
<Configuration status="INFO">
  <Properties>
    <Property name="logPath">...</Property>
    <Property name="rollingFileName">...</Property>
  </Properties>
  <Appenders>
    <Console name="console" target="SYSTEM_OUT">
      <PatternLayout pattern="[%highlight{%-5level}] %d{DEFAULT} %c{1}.%M() - %msg%n%throwable{short.lineNumber}" />
    </Console>
    <RollingFile name="rollingFile" fileName="${logPath}/${rollingFileName}.log" filePattern="${logPath}/${rollingFileName}_%d{yyyy-MM-dd}.log">
      <PatternLayout pattern="[%highlight{%-5level}] %d{DEFAULT} %c{1}.%M() - %msg%n%throwable{short.lineNumber}" />
      <Policies>
        <!-- Causes a rollover if the log file is older than the current JVM's start time -->
        <OnStartupTriggeringPolicy />
        <!-- Causes a rollover once the date/time pattern no longer applies to the active file -->
        <TimeBasedTriggeringPolicy interval="1" modulate="true" />
      </Policies>
    </RollingFile>
  </Appenders>
  <Loggers>
    <Logger name="com.project" level="debug" additivity="false">
      <AppenderRef ref="console"/>
      <AppenderRef ref="rollingFile"/>
    </Logger>
  </Loggers>
</Configuration>
Actual results from current log file:
[DEBUG] 2019-02-05 16:42:09,794 SpellCheck.getResult() - start
[DEBUG] 2019-02-05 16:42:10,420 SpellCheck.getResult() - Spelling correction returned no results.
[DEBUG] 2019-02-05 16:42:10,420 SpellCheck.getResult() - end
What I want to achieve:
[DEBUG] 2019-02-05 16:42:09,794 SpellCheck.getResult() - 1234 - start
[DEBUG] 2019-02-05 16:42:10,420 SpellCheck.getResult() - 1234 - Spelling correction returned no results.
[DEBUG] 2019-02-05 16:42:10,420 SpellCheck.getResult() - 1234 - end
Where 1234 is for example the session id.
Thanks.
I figured it out; it was a lot easier than I thought.
Basically, I added %X{userSessionId} to the
<PatternLayout pattern= ... />
row in log4j2.xml.
And in the code added
HttpSession session = request.getSession();
org.apache.logging.log4j.ThreadContext.put("userSessionId", session.getId());
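Putting the two pieces together, the console pattern from the question's log4j2.xml would look roughly like this (a sketch; only the %X{userSessionId} part is new, and it prints whatever was put into the ThreadContext under that key):
<PatternLayout pattern="[%highlight{%-5level}] %d{DEFAULT} %c{1}.%M() - %X{userSessionId} - %msg%n%throwable{short.lineNumber}" />
It is also worth calling ThreadContext.remove("userSessionId") (or ThreadContext.clearAll()) when the request finishes, so the id does not leak into log lines of the next request handled by the same thread.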

How to make Log4j2 configurable by environment using Spring Boot 1.3.6.RELEASE

I would like to change some properties in the log4j2.xml file based on my application.properties, for example define some properties there and then substitute them into the log4j2 config as parameters.
I have tried different approaches but still haven't gotten it right. I would like to have different configs depending on the environment (DEV, QA or PROD). How can I accomplish this?
I'm trying to have this in my properties file:
#Place holders for log4j2.xml file
log.file.path=/opt/tomcat/logs
log.file.name=dummydummy
log.file.size=100 MB
log.level=DEBUG
My log4j2.xml is below.
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN" monitorInterval="30">
  <Properties>
    <Property name="PID">????</Property>
    <Property name="name">my-log</Property>
  </Properties>
  <Appenders>
    <RollingFile name="file" fileName="${log.file.path}${log.file}.log"
                 filePattern="${log.file.path}${log.file}-%d{yyyy-MM-dd}-%i.log.gz">
      <PatternLayout
        pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} %5p ${sys:PID} --- [%t] %c{1}(%M:%L) : %m%n%wEx" />
      <Policies>
        <TimeBasedTriggeringPolicy /> <!-- Rotated every day -->
        <SizeBasedTriggeringPolicy size="${log.file.size}" /> <!-- Or every 100 MB -->
      </Policies>
    </RollingFile>
    <Console name="Console" target="SYSTEM_OUT" follow="true">
      <PatternLayout
        pattern="%clr{%d{yyyy-MM-dd HH:mm:ss.SSS}}{faint} %clr{%5p} %clr{${sys:PID}}{magenta} %clr{---}{faint} %clr{[%t]}{faint} %clr{%c{1}(%M:%L)}{cyan} %clr{:}{faint} %m%n%wEx" />
    </Console>
  </Appenders>
  <Loggers>
    <Logger name="org.hibernate.validator.internal.util.Version" level="warn" />
    <Logger name="org.apache.coyote.http11.Http11NioProtocol" level="warn" />
    <Logger name="org.apache.tomcat.util.net.NioSelectorPool" level="warn" />
    <Logger name="org.apache.catalina.startup.DigesterFactory" level="error" />
    <Logger name="org.springframework.web" level="error" />
    <Root level="${log.level}">
      <AppenderRef ref="Console" />
      <AppenderRef ref="file" />
    </Root>
  </Loggers>
</Configuration>
The properties lookup allows you to refer to properties from an external properties file in the Log4j configuration.
For your example it should be something like this:
A file env.properties contains the following properties:
log.file.path=/opt/tomcat/logs
log.file.name=dummydummy
log.file.size=100 MB
log.level=DEBUG
The properties lookup should be defined as properties of the log4j2.xml:
<Configuration>
  <Properties>
    <Property name="log.file.path">${bundle:env:log.file.path}</Property>
    <Property name="log.file.name">${bundle:env:log.file.name}</Property>
    <Property name="log.file.size">${bundle:env:log.file.size}</Property>
    <Property name="log.level">${bundle:env:log.level}</Property>
  </Properties>
Now the properties can be referenced in the appenders with the ${property_name} notation. Each property reference will be interpolated with the real value from env.properties.
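For instance, the appender and root logger from the question could then reference them along these lines (a sketch reusing the question's property names; only the relevant parts are shown):
<Appenders>
  <RollingFile name="file"
               fileName="${log.file.path}/${log.file.name}.log"
               filePattern="${log.file.path}/${log.file.name}-%d{yyyy-MM-dd}-%i.log.gz">
    <PatternLayout pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} %5p --- [%t] %c{1}(%M:%L) : %m%n%wEx" />
    <Policies>
      <TimeBasedTriggeringPolicy />
      <SizeBasedTriggeringPolicy size="${log.file.size}" />
    </Policies>
  </RollingFile>
</Appenders>
<Loggers>
  <Root level="${log.level}">
    <AppenderRef ref="file" />
  </Root>
</Loggers>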
You can find another example of properties lookup here.
As of Log4j 2.13.0, Log4j provides a Spring Lookup as part of its Spring Cloud Config support. It allows you to reference properties defined in the application.properties or application.yml file of your Spring Boot application from log4j2.xml.
Since Log4j 2.14.0, you can use Spring Boot environment properties without Spring Cloud and without referencing the properties file directly. You'll need at least Spring Boot 2.0.3:
<property name="applicationName">${spring:spring.application.name}</property>
Documentation: https://logging.apache.org/log4j/2.x/log4j-spring-boot/index.html
Maven repository: https://mvnrepository.com/artifact/org.apache.logging.log4j/log4j-spring-boot
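If you build with Maven, the module behind that lookup would be declared along these lines (a sketch; the version is left to the log4j2.version property managed by the Spring Boot parent, otherwise pin a 2.14.0+ version explicitly):
<dependency>
  <groupId>org.apache.logging.log4j</groupId>
  <artifactId>log4j-spring-boot</artifactId>
  <!-- assumes the Spring Boot parent (or a BOM) supplies log4j2.version -->
  <version>${log4j2.version}</version>
</dependency>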
Ensure that the log4j2 starter is on the classpath and remove the logging-related properties from application.properties; Spring will then load your log4j2.xml from the resources folder.
This way you have full control over logging. If you want to substitute values, refer to this link.
Note: if you have the actuator in your project, exclude the Spring Boot logging starter:
<dependency>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-starter-actuator</artifactId>
  <exclusions>
    <exclusion>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-logging</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-starter-log4j2</artifactId>
</dependency>
A workaround in case you just need different property values in the log4j2.xml config for different Spring profiles (dev, test, prod, etc.):
You can access the profile-specific properties files like this: ${bundle:application-${sys:spring.profiles.active}:log.file.name}
application-dev.properties file:
log.file.name=dev_log
application-prod.properties file:
log.file.name=app_name
log4j2.xml config:
<Configuration>
  <Properties>
    <Property name="file_name">${bundle:application-${sys:spring.profiles.active}:log.file.name}</Property>
  </Properties>
  <Appenders>
    <RollingFile name="RollingFile"
                 fileName="/logs/${file_name}.log"
                 ...
NOTE: You have to add the log.file.name property to each Spring profile's properties file; Log4j2 sees them just as separate text files and will not fall back to default values from application.properties.

MSBuild XmlPeek task help required

My sample.xml file is below
<deployment>
  <definition type="xpath">
    <xpath>configuration/Settings/add[@key='NetworkPath'][@value]</xpath>
    <attribute>value</attribute>
    <value>http://www.google.com</value>
  </definition>
</deployment>
I want to fetch the value "http://www.google.com" corresponding to the XPath "configuration/Settings/add[@key='NetworkPath'][@value]". I am writing the XmlPeek task below, but it is not working:
<XmlPeek XmlInputPath="C:\Sample.xml"
         Query="configuration/Settings/add[@key='NetworkPath'][@value]">
  <Output TaskParameter="Result" ItemName="Peeked" />
</XmlPeek>
<Message Text="Peeked value is @(Peeked)"/>
There might be some confusion about XPath.
If you want to retrieve http://www.google.com from Sample.xml, you need to apply this query:
<XmlPeek XmlInputPath="Sample.xml"
         Query="/deployment/definition/value/text()">
  <Output TaskParameter="Result" ItemName="Peeked" />
</XmlPeek>
If instead you want an XPath like configuration/Settings/add[@key='NetworkPath']/@value to extract the value, it corresponds to a different XML file having this form:
<configuration>
  <Settings>
    <add key="NetworkPath" value="http://www.google.com"/>
  </Settings>
</configuration>
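For that second layout, a sketch of the corresponding XmlPeek, selecting the attribute directly, would be:
<XmlPeek XmlInputPath="C:\Sample.xml"
         Query="/configuration/Settings/add[@key='NetworkPath']/@value">
  <Output TaskParameter="Result" ItemName="Peeked" />
</XmlPeek>
<Message Text="Peeked value is @(Peeked)" />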
Check out some XPath examples.

Retrieving dependencies with empty type

I'm trying to figure out how to omit the [type] part in an Ivy retrieve pattern for artifacts that don't have a type declared. I use the following Ant statement:
<ivy:retrieve pattern="${lib.dir}/[artifact](-[type]).[ext]" conf="compile" />
Despite the parentheses, Ivy produces files like
junit-jar.jar
junit-javadoc.jar
junit-source.jar
The latter two are as expected, but the first one should be "junit.jar" instead.
The result is the same as when I omit the parentheses.
Edit:
What I'm doing for now to work around the problem: I have multiple retrieve statements in the build.xml:
<ivy:retrieve pattern="${lib.dir}/[artifact]-[type].[ext]" type="source" />
<ivy:retrieve pattern="${lib.dir}/[artifact].[ext]" type="jar" />
(The "conf" attribute in the original post is not related to this topic.)
But that looks rather silly when there's the feature of optional tokens.
The type defaults to jar; it can't be omitted. See the documentation.
So (-[type]) does not have an effect.
Perhaps you could do something like this in the build.xml (if you control the ivy.xml).
<ivy:retrieve pattern="${lib.dir}/[artifact].[ext]" conf="compile" />
<ivy:retrieve pattern="${lib.dir}/[artifact](-[type]).[ext]" conf="extras" />
You'd have to publish the jar in the compile config and the other jars in the extras config, along the lines of the sketch below.
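A sketch of what the configurations and publications sections of such an ivy.xml might look like (the artifact names, types, and conf names here are illustrative, not taken from the original post):
<configurations>
  <conf name="compile" visibility="public"/>
  <conf name="extras" visibility="public"/>
</configurations>
<publications>
  <!-- the main jar is only published into the compile conf -->
  <artifact name="junit" type="jar" conf="compile"/>
  <!-- the extra artifacts carry a type, so [type] resolves for them -->
  <artifact name="junit" type="source" ext="jar" conf="extras"/>
  <artifact name="junit" type="javadoc" ext="jar" conf="extras"/>
</publications>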
Or just name the other ones junit-javadoc and junit-source in the ivy.xml:
<?xml version="1.0" encoding="UTF-8"?>
<ivy-module version="2.0">
  <info organisation="junit"
        module="junit"
        revision="4.8.2"
        status="release"
        publication="20110531150115"
        default="true"
  />
  <configurations>
    <conf name="default" visibility="public"/>
  </configurations>
  <publications>
    <artifact name="junit" type="jar" />
    <artifact name="junit-sources" type="jar" />
    <artifact name="junit-javadoc" type="jar" />
  </publications>
</ivy-module>