How to construct an ISO8583 message header for ASCIIChannel and the ISO93 ASCII packager in jPOS?

I am trying to create an ISO8583 message with jPOS in Java, using ASCIIChannel to send the message and the ISO93 ASCII packager to pack it.
But after sending the message I get an "invalid header" error from the server.
So my question is: what exactly is the header made up of, and how do I frame my header for MTI value 1200?
ISOMsg.setHeader("HEADER".getBytes());
How should I frame up my HEADER?
New development:
After taking a look at the server configuration, I need to send the header prepended by the length of the ISO8583 message (a 2-byte length in hex, converted to bytes). How can I do this using jPOS? Also, I am not able to set anything using channel.setHeader("xxx".getBytes()).
How do I see what raw message is being sent from my terminal to the server?
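The 2-byte length framing described above can be sketched in plain Java. This is an illustrative helper, not jPOS API (the class and method names here are made up for the example); in jPOS itself this framing is normally handled by picking a channel class with the right length prefix, e.g. NACChannel, which sends a two-byte binary length before the message.

```java
// Illustrative sketch (not jPOS API): frame a packed ISO message with a
// 2-byte binary length prefix, big-endian, as described above.
public class LengthFraming {
    static byte[] frame(byte[] packed) {
        if (packed.length > 0xFFFF)
            throw new IllegalArgumentException("message too long for a 2-byte length");
        byte[] out = new byte[packed.length + 2];
        out[0] = (byte) ((packed.length >> 8) & 0xFF); // high byte of length
        out[1] = (byte) (packed.length & 0xFF);        // low byte of length
        System.arraycopy(packed, 0, out, 2, packed.length);
        return out;
    }

    public static void main(String[] args) {
        // 300 = 0x012C, so the two prefix bytes are 0x01 and 0x2C
        byte[] framed = frame(new byte[300]);
        System.out.println(framed.length + " " + framed[0] + " " + framed[1]);
    }
}
```

If the server really expects this framing, switching the channel class in the deploy file is usually simpler than framing by hand.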
Here are some excerpts from the code.
Deploy files
filename: 10_clientsimulator_channel.xml
<?xml version="1.0" ?>
<channel-adaptor name='jpos-client-adaptor'
class="org.jpos.q2.iso.ChannelAdaptor" logger="Q2">
<channel class="org.jpos.iso.channel.ASCIIChannel" logger="Q2"
packager="org.jpos.iso.packager.ISO93APackager" header="ISO026000075">
<property name="host" value="xxx.xx.xx.xx" />
<property name="port" value="xxxxx" />
</channel>
<in>jpos-client-send</in>
<out>jpos-client-receive</out>
<reconnect-delay>10000</reconnect-delay>
</channel-adaptor>
Code:
packager = new ISO93APackager();
ISOMsg m = new ISOMsg();
m.setPackager(packager);
System.out.println(packager);
m.setHeader("ISO026000075".getBytes());
System.out.println("Head err..........." + new String(m.getHeader()));
Date now = new Date();
m.setMTI("1200");
m.set(2,"xx");
m.set(3,"xxxxx");
m.set(4,"000000010000");
m.set(11,"214491");
m.set(12,"160203");
m.set(123, "xxxxxx");
m.set(125, "xxxx");
byte b[] = m.pack();
System.out.println("\n\n\n\nPACKAGER =====------"+m.getPackager());
System.out.printf("\n\n\n\nMessage ===== %s",new String(b));
System.out.println("\n\n\n" + ISOUtil.hexdump(b));
return m;

How do I see what raw message is being sent from my terminal to the
server?
The simplest way is to extend your channel and do a hexdump from its send and receive methods.
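If you don't want to subclass the channel right away, a minimal stand-alone hexdump is enough to inspect the raw bytes of a packed message. This is a simplified stand-in for jPOS's ISOUtil.hexdump (the class name here is illustrative):

```java
// Minimal hexdump helper (simplified stand-in for ISOUtil.hexdump)
public class HexDump {
    static String hex(byte[] data) {
        StringBuilder sb = new StringBuilder();
        for (byte b : data)
            sb.append(String.format("%02X", b)); // two uppercase hex digits per byte
        return sb.toString();
    }

    public static void main(String[] args) {
        // The ASCII header "ISO70100000" dumps to the same hex value seen
        // in the <header> element of the channel logs further down
        System.out.println(hex("ISO70100000".getBytes())); // prints 49534F3730313030303030
    }
}
```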
Here is a simple test code that will create a client and server, send and receive a message with a header.
You will see the output contains the header definition that was sent.
Once you get this working, moving it to the Q2 way of doing things via deploy files should work.
Replace the packager with the one you are using in your code.
Make sure there is no header attribute in your packager xml.
Add the header to your client channel adapter deploy.
import java.io.IOException;
import org.jpos.iso.ISOChannel;
import org.jpos.iso.ISOException;
import org.jpos.iso.ISOMsg;
import org.jpos.iso.ISORequestListener;
import org.jpos.iso.ISOServer;
import org.jpos.iso.ISOSource;
import org.jpos.iso.ServerChannel;
import org.jpos.iso.channel.ASCIIChannel;
import org.jpos.iso.packager.GenericPackager;
import org.jpos.util.Logger;
import org.jpos.util.SimpleLogListener;
import org.jpos.util.ThreadPool;
public class Test {
public static void main(String[] args) throws IOException, ISOException {
Logger l = new Logger();
l.addListener(new SimpleLogListener());
GenericPackager serverPkg = new GenericPackager(
"C:\\temp\\iso93asciiB-custom.xml");
serverPkg.setLogger(l, "Server"); // So that the output can be differentiated based on realm
GenericPackager clientPkg = new GenericPackager(
"C:\\temp\\iso93asciiB-custom.xml");
clientPkg.setLogger(l, "Client");// So that the output can be differentiated based on realm
// Simulate a server and listen on a port
ISOChannel serverChannel = new ASCIIChannel(serverPkg);
((ASCIIChannel) serverChannel).setHeader("ISO70100000");
// An equivalent in your channel adaptor deploy file would be
// <channel class="org.jpos.iso.channel.ASCIIChannel"
// packager="org.jpos.iso.packager.GenericPackager"
// header="ISO70100000"> .....
// This is evident from the ChannelAdaptor code
// QFactory.invoke (channel, "setHeader", e.getAttributeValue ("header"));
((ASCIIChannel) serverChannel).setLogger(l, "server");
ISOServer server = new ISOServer(7654, (ServerChannel) serverChannel,
new ThreadPool(10, 100, "serverListeningThread"));
server.addISORequestListener(new ISORequestListener() {
// If the client sends a message, the server will respond with an approval if it is a request message
@Override
public boolean process(ISOSource source, ISOMsg msg) {
try {
if (!msg.isRequest()) {
msg.setResponseMTI();
msg.set(39, "000");
source.send(msg);
}
}
catch (ISOException | IOException ex) {
// ignored for this demo
}
return true;
}
});
Thread serverThread = new Thread(server);
serverThread.start(); // beyond this point the server is listening for a client connection
ASCIIChannel clientChannel = new ASCIIChannel("127.0.0.1", 7654, clientPkg);
clientChannel.setHeader("ISO70100000"); // Similar to the server, you can configure this constant in your deploy file
clientChannel.setLogger(l, "client");
clientChannel.connect(); // connect to server, it will be seen in the output console
ISOChannel connectChannel = server.getLastConnectedISOChannel();// Since server can have multiple connections,
// we get the last one that connected to it.
ISOMsg serverInitiatedRequest = new ISOMsg();
serverInitiatedRequest.set(0, "1804");
serverInitiatedRequest.set(7, "1607161705");
serverInitiatedRequest.set(11, "888402");
serverInitiatedRequest.set(12, "160716170549");
serverInitiatedRequest.set(24, "803");
serverInitiatedRequest.set(25, "0000");
serverInitiatedRequest.set(33, "101010");
serverInitiatedRequest.set(37, "619817888402");
connectChannel.send(serverInitiatedRequest); // use the last one connected to send a request message to the client.
ISOMsg receivedRequest = clientChannel.receive(); // receive the server's request message at the client
ISOMsg clientResponse = (ISOMsg) receivedRequest.clone();
clientResponse.setResponseMTI();
clientResponse.set(39, "000");
clientChannel.send(clientResponse); // send the response to server
}
}
Your output should look like this:
<log realm="client/127.0.0.1:7654" at="Sun Jul 17 12:31:55 IST 2016.764" lifespan="33ms">
<connect>
127.0.0.1:7654
</connect>
</log>
<log realm="Server" at="Sun Jul 17 12:31:55 IST 2016.784" lifespan="4ms">
<pack>
31383034023001808800000031363037313631373035383838343032313630373136313730353439383033303030303036313031303130363139383137383838343032
</pack>
</log>
<log realm="server/127.0.0.1:10614" at="Sun Jul 17 12:31:55 IST 2016.785" lifespan="7ms">
<send>
<isomsg direction="outgoing">
<!-- org.jpos.iso.packager.GenericPackager[C:\temp\iso93asciiB-custom.xml] -->
<field id="0" value="1804"/>
<field id="7" value="1607161705"/>
<field id="11" value="888402"/>
<field id="12" value="160716170549"/>
<field id="24" value="803"/>
<field id="25" value="0000"/>
<field id="33" value="101010"/>
<field id="37" value="619817888402"/>
</isomsg>
</send>
</log>
<log realm="Client" at="Sun Jul 17 12:31:55 IST 2016.787" lifespan="1ms">
<unpack>
31383034023001808800000031363037313631373035383838343032313630373136313730353439383033303030303036313031303130363139383137383838343032
<bitmap>{7, 11, 12, 24, 25, 33, 37}</bitmap>
<unpack fld="7" packager="org.jpos.iso.IFA_NUMERIC">
<value>1607161705</value>
</unpack>
<unpack fld="11" packager="org.jpos.iso.IFA_NUMERIC">
<value>888402</value>
</unpack>
<unpack fld="12" packager="org.jpos.iso.IFA_NUMERIC">
<value>160716170549</value>
</unpack>
<unpack fld="24" packager="org.jpos.iso.IFA_NUMERIC">
<value>803</value>
</unpack>
<unpack fld="25" packager="org.jpos.iso.IFA_NUMERIC">
<value>0000</value>
</unpack>
<unpack fld="33" packager="org.jpos.iso.IFA_LLNUM">
<value>101010</value>
</unpack>
<unpack fld="37" packager="org.jpos.iso.IF_CHAR">
<value>619817888402</value>
</unpack>
</unpack>
</log>
<log realm="client/127.0.0.1:7654" at="Sun Jul 17 12:31:55 IST 2016.789" lifespan="4ms">
<receive>
<isomsg direction="incoming">
<!-- org.jpos.iso.packager.GenericPackager[C:\temp\iso93asciiB-custom.xml] -->
<header>49534F3730313030303030</header>
<field id="0" value="1804"/>
<field id="7" value="1607161705"/>
<field id="11" value="888402"/>
<field id="12" value="160716170549"/>
<field id="24" value="803"/>
<field id="25" value="0000"/>
<field id="33" value="101010"/>
<field id="37" value="619817888402"/>
</isomsg>
</receive>
</log>
<log realm="Client" at="Sun Jul 17 12:31:55 IST 2016.791">
<pack>
31383134023001808A00000031363037313631373035383838343032313630373136313730353439383033303030303036313031303130363139383137383838343032303030
</pack>
</log>
<log realm="Server" at="Sun Jul 17 12:31:55 IST 2016.791">
<unpack>
31383134023001808A00000031363037313631373035383838343032313630373136313730353439383033303030303036313031303130363139383137383838343032303030
<bitmap>{7, 11, 12, 24, 25, 33, 37, 39}</bitmap>
<unpack fld="7" packager="org.jpos.iso.IFA_NUMERIC">
<value>1607161705</value>
</unpack>
<unpack fld="11" packager="org.jpos.iso.IFA_NUMERIC">
<value>888402</value>
</unpack>
<unpack fld="12" packager="org.jpos.iso.IFA_NUMERIC">
<value>160716170549</value>
</unpack>
<unpack fld="24" packager="org.jpos.iso.IFA_NUMERIC">
<value>803</value>
</unpack>
<unpack fld="25" packager="org.jpos.iso.IFA_NUMERIC">
<value>0000</value>
</unpack>
<unpack fld="33" packager="org.jpos.iso.IFA_LLNUM">
<value>101010</value>
</unpack>
<unpack fld="37" packager="org.jpos.iso.IF_CHAR">
<value>619817888402</value>
</unpack>
<unpack fld="39" packager="org.jpos.iso.IFA_NUMERIC">
<value>000</value>
</unpack>
</unpack>
</log>
<log realm="server/127.0.0.1:10614" at="Sun Jul 17 12:31:55 IST 2016.792" lifespan="26ms">
<receive>
<isomsg direction="incoming">
<!-- org.jpos.iso.packager.GenericPackager[C:\temp\iso93asciiB-custom.xml] -->
<header>49534F3730313030303030</header>
<field id="0" value="1814"/>
<field id="7" value="1607161705"/>
<field id="11" value="888402"/>
<field id="12" value="160716170549"/>
<field id="24" value="803"/>
<field id="25" value="0000"/>
<field id="33" value="101010"/>
<field id="37" value="619817888402"/>
<field id="39" value="000"/>
</isomsg>
</receive>
</log>
<log realm="client/127.0.0.1:7654" at="Sun Jul 17 12:31:55 IST 2016.793" lifespan="3ms">
<send>
<isomsg direction="outgoing">
<!-- org.jpos.iso.packager.GenericPackager[C:\temp\iso93asciiB-custom.xml] -->
<header>49534F3730313030303030</header>
<field id="0" value="1814"/>
<field id="7" value="1607161705"/>
<field id="11" value="888402"/>
<field id="12" value="160716170549"/>
<field id="24" value="803"/>
<field id="25" value="0000"/>
<field id="33" value="101010"/>
<field id="37" value="619817888402"/>
<field id="39" value="000"/>
</isomsg>
</send>
</log>

You can set your header at the channel level (i.e. channel.setHeader("xxx".getBytes())) or set it at a per-message level (i.e. m.setHeader("xxx".getBytes())).
It is important that the channel knows the header length at receive time, so even if you use a per-message header, you should set a dummy header at the channel level as well.
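To sanity-check which header a channel actually carried, you can decode the hex string shown in a log's <header> element back to ASCII with plain Java (the class and method names here are illustrative, not jPOS API):

```java
// Decode a hex-encoded header (as logged in <header>...</header>) back to ASCII
public class HeaderDecode {
    static String hexToAscii(String hex) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < hex.length(); i += 2)
            sb.append((char) Integer.parseInt(hex.substring(i, i + 2), 16));
        return sb.toString();
    }

    public static void main(String[] args) {
        // 49534F3730313030303030 is the value logged in the test output above
        System.out.println(hexToAscii("49534F3730313030303030")); // prints ISO70100000
    }
}
```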
Using Q2 and the ChannelAdaptor or QServer components will make your life much easier. Take a look at http://jpos.org/doc/proguide-draft.pdf

From jPOS you will only get a hexdump of the header. In order to send the ASCII 1993 version you need to convert the hexdump to bytes and then to bits.
https://github.com/vikrantlabde/iso8583-Java/blob/master/src/ISO/iso8583.java
You can use this program to pack and send an ISO 1993 ASCII message.

Related

No matching record found for external id 'groups_group1' in field 'Group'

I want to establish access rights for my module, so I modified the CSV file. When I try to install the module, this error appears at installation time:
File "/home/omar/odoo/odoo11/odoo/modules/loading.py", line 95, in _load_data
tools.convert_file(cr, module_name, filename, idref, mode, noupdate, kind, report)
File "/home/omar/odoo/odoo11/odoo/tools/convert.py", line 785, in convert_file
convert_csv_import(cr, module, pathname, fp.read(), idref, mode, noupdate)
File "/home/omar/odoo/odoo11/odoo/tools/convert.py", line 832, in convert_csv_import
raise Exception(_('Module loading %s failed: file %s could not be processed:\n %s') % (module, fname, warning_msg))
Exception: Module loading moduletest failed: file moduletest/security/ir.model.access.csv could not be processed:
No matching record found for external id 'groups_group1' in field 'Group'
No matching record found for external id 'groups_group2' in field 'Group'
One of the things that makes this error occur is not respecting the order of security files: you have to put the .xml file related to security before the .csv file in your __manifest__.py.
Here's a simple example:
'data': [
'views/security.xml',
'security/ir.model.access.csv',
# ... other includes ...
]
Here is sample code for the category and groups:
<record model="ir.module.category" id="module_category_stock_quotation_request">
<!-- <field name="parent_id" ref="module_category_localization" /> -->
<field name="name">Stock Transfert Request</field>
<field name="visible" eval="0" />
</record>
<record id="group_stock_quotation_request_user" model="res.groups">
<field name="name">User</field>
<field name="implied_ids" eval="[(4, ref('base.group_user'))]"/>
<field name="category_id" ref="module_category_stock_quotation_request"/>
</record>
<record id="group_stock_quotation_request_manager" model="res.groups">
<field name="name">Manager</field>
<field name="implied_ids" eval="[(4, ref('transfert_request.group_stock_quotation_request_user'))]"/>
<field name="category_id" ref="module_category_stock_quotation_request"/>
</record>

Sitecore custom index configure languages

I created a custom Lucene index in a Sitecore 8.1 environment like this:
<?xml version="1.0"?>
<configuration xmlns:x="http://www.sitecore.net/xmlconfig/">
<sitecore>
<contentSearch>
<configuration type="Sitecore.ContentSearch.ContentSearchConfiguration, Sitecore.ContentSearch">
<indexes hint="list:AddIndex">
<index id="Products_index" type="Sitecore.ContentSearch.LuceneProvider.LuceneIndex, Sitecore.ContentSearch.LuceneProvider">
<param desc="name">$(id)</param>
<param desc="folder">$(id)</param>
<param desc="propertyStore" ref="contentSearch/indexConfigurations/databasePropertyStore" param1="$(id)" />
<configuration ref="contentSearch/indexConfigurations/ProductIndexConfiguration" />
<strategies hint="list:AddStrategy">
<strategy ref="contentSearch/indexConfigurations/indexUpdateStrategies/onPublishEndAsync" />
</strategies>
<commitPolicyExecutor type="Sitecore.ContentSearch.CommitPolicyExecutor, Sitecore.ContentSearch">
<policies hint="list:AddCommitPolicy">
<policy type="Sitecore.ContentSearch.TimeIntervalCommitPolicy, Sitecore.ContentSearch" />
</policies>
</commitPolicyExecutor>
<locations hint="list:AddCrawler">
<crawler type="Sitecore.ContentSearch.SitecoreItemCrawler, Sitecore.ContentSearch">
<Database>master</Database>
<Root>/sitecore/content/General/Product Repository</Root>
</crawler>
</locations>
<enableItemLanguageFallback>false</enableItemLanguageFallback>
<enableFieldLanguageFallback>false</enableFieldLanguageFallback>
</index>
</indexes>
</configuration>
<indexConfigurations>
<ProductIndexConfiguration type="Sitecore.ContentSearch.LuceneProvider.LuceneIndexConfiguration, Sitecore.ContentSearch.LuceneProvider">
<initializeOnAdd>true</initializeOnAdd>
<analyzer ref="contentSearch/indexConfigurations/defaultLuceneIndexConfiguration/analyzer" />
<documentBuilderType>Sitecore.ContentSearch.LuceneProvider.LuceneDocumentBuilder, Sitecore.ContentSearch.LuceneProvider</documentBuilderType>
<fieldMap type="Sitecore.ContentSearch.FieldMap, Sitecore.ContentSearch">
<fieldNames hint="raw:AddFieldByFieldName">
<field fieldName="_uniqueid" storageType="YES" indexType="TOKENIZED" vectorType="NO" boost="1f" type="System.String" settingType="Sitecore.ContentSearch.LuceneProvider.LuceneSearchFieldConfiguration, Sitecore.ContentSearch.LuceneProvider">
<analyzer type="Sitecore.ContentSearch.LuceneProvider.Analyzers.LowerCaseKeywordAnalyzer, Sitecore.ContentSearch.LuceneProvider" />
</field>
<field fieldName="key" storageType="YES" indexType="UNTOKENIZED" vectorType="NO" boost="1f" type="System.String" settingType="Sitecore.ContentSearch.LuceneProvider.LuceneSearchFieldConfiguration, Sitecore.ContentSearch.LuceneProvider"/>
</fieldNames>
</fieldMap>
<documentOptions type="Sitecore.ContentSearch.LuceneProvider.LuceneDocumentBuilderOptions, Sitecore.ContentSearch.LuceneProvider">
<indexAllFields>true</indexAllFields>
<include hint="list:AddIncludedTemplate">
<Product>{843B9598-318D-4AFA-B8C8-07E3DF5C6738}</Product>
</include>
</documentOptions>
<fieldReaders ref="contentSearch/indexConfigurations/defaultLuceneIndexConfiguration/fieldReaders"/>
<indexFieldStorageValueFormatter ref="contentSearch/indexConfigurations/defaultLuceneIndexConfiguration/indexFieldStorageValueFormatter"/>
<indexDocumentPropertyMapper ref="contentSearch/indexConfigurations/defaultLuceneIndexConfiguration/indexDocumentPropertyMapper"/>
</ProductIndexConfiguration>
</indexConfigurations>
</contentSearch>
</sitecore>
</configuration>
Actually a quite straightforward index that points to a root, includes a template and stores a field. It all works. But I need something extra:
Limit the entries in the index to the latest version
Limit the entries in the index to one language (English)
I know this can be easily triggered in the query as well, but I want to get my indexes smaller so they will (re)build faster.
The first one could be solved by indexing web instead of master, but I actually also want the unpublished ones in this case. I'm quite sure I will need a custom crawler for this.
The second one (limiting the languages) is actually more important, as I will also need this in other indexes. The easy answer is probably also to use a custom crawler, but I was hoping there would be a way to configure this without custom code. Is it possible to configure a custom index to only include a defined set of languages (one or more) instead of all?
After some research I came to the conclusion that to match all the requirements a custom crawler was the only solution. Blogged the code: http://ggullentops.blogspot.be/2016/10/custom-sitecore-index-crawler.html
The main reason was that we really had to be able to set this per index, which is not possible with the filter approach. Also, when we added a new version, the previous version had to be deleted from the index, which is also not possible without the crawler.
There is quite some code, so I won't copy it all here, but we had to override the DoAdd and DoUpdate methods of SitecoreItemCrawler, and for the languages we also had to make a small override of the Update method.
You can easily create a custom processor that will let you filter what you want:
namespace Sitecore.Custom
{
    public class CustomInboundIndexFilter : InboundIndexFilterProcessor
    {
        public override void Process(InboundIndexFilterArgs args)
        {
            var item = args.IndexableToIndex as SitecoreIndexableItem;
            var language = Sitecore.Data.Managers.LanguageManager.GetLanguage("en");
            // Exclude anything that is not the latest version or not in English
            if (item != null && (!item.Item.Versions.IsLatestVersion() || item.Item.Language != language))
            {
                args.IsExcluded = true;
            }
        }
    }
}
And add this processor into the config:
<pipelines>
<indexing.filterIndex.inbound>
<processor type="Sitecore.Custom.CustomInboundIndexFilter, Sitecore.Custom"></processor>
</indexing.filterIndex.inbound>
</pipelines>

Not getting values of fields of types Droplist and TreeList from Lucene index

I am using Sitecore 7.2. I created a custom Lucene index.
While I am able to get the values of fields of type Single-line Text, Rich Text & DateTime, I am not getting the values of fields of types such as Droplist and TreeList.
I have tried changing the indexType of these fields to "UNTOKENISED" but still continue to get this issue. I also checked my index using Luke and observed that only the Title, Summary and Body fields are part of the index.
Below is the portion of my index configuration where I've defined my fields. products and type are Treelist and Droplist fields respectively.
<fieldMap type="Sitecore.ContentSearch.FieldMap, Sitecore.ContentSearch">
<fieldNames hint="raw:AddFieldByFieldName">
<field fieldName="title" storageType="YES" indexType="TOKENIZED" vectorType="NO" boost="1f" type="System.String" settingType="Sitecore.ContentSearch.LuceneProvider.LuceneSearchFieldConfiguration, Sitecore.ContentSearch.LuceneProvider">
<analyzer type="Sitecore.ContentSearch.LuceneProvider.Analyzers.LowerCaseKeywordAnalyzer, Sitecore.ContentSearch.LuceneProvider" />
</field>
<field fieldName="summary" storageType="YES" indexType="TOKENIZED" vectorType="NO" boost="1f" type="System.String" settingType="Sitecore.ContentSearch.LuceneProvider.LuceneSearchFieldConfiguration, Sitecore.ContentSearch.LuceneProvider">
<analyzer type="Sitecore.ContentSearch.LuceneProvider.Analyzers.LowerCaseKeywordAnalyzer, Sitecore.ContentSearch.LuceneProvider" />
</field>
<field fieldName="body" storageType="YES" indexType="TOKENIZED" vectorType="NO" boost="1f" type="System.String" settingType="Sitecore.ContentSearch.LuceneProvider.LuceneSearchFieldConfiguration, Sitecore.ContentSearch.LuceneProvider">
<analyzer type="Sitecore.ContentSearch.LuceneProvider.Analyzers.LowerCaseKeywordAnalyzer, Sitecore.ContentSearch.LuceneProvider" />
</field>
<field fieldName="datemodified" storageType="YES" indexType="TOKENIZED" vectorType="NO" boost="1f" type="System.String" settingType="Sitecore.ContentSearch.LuceneProvider.LuceneSearchFieldConfiguration, Sitecore.ContentSearch.LuceneProvider">
<analyzer type="Sitecore.ContentSearch.LuceneProvider.Analyzers.LowerCaseKeywordAnalyzer, Sitecore.ContentSearch.LuceneProvider" />
</field>
<field fieldName="products" storageType="YES" indexType="TOKENIZED" vectorType="NO" boost="1f" type="System.String" settingType="Sitecore.ContentSearch.LuceneProvider.LuceneSearchFieldConfiguration, Sitecore.ContentSearch.LuceneProvider">
<analyzer type="Sitecore.ContentSearch.LuceneProvider.Analyzers.LowerCaseKeywordAnalyzer, Sitecore.ContentSearch.LuceneProvider" />
</field>
<fieldType fieldName="type" storageType="YES" indexType="TOKENIZED" vectorType="NO" boost="1f" type="System.String"
settingType="Sitecore.ContentSearch.LuceneProvider.LuceneSearchFieldConfiguration, Sitecore.ContentSearch.LuceneProvider" />
</fieldNames>
</fieldMap>
The class I am using to fetch values from the index is as below:
public class DownloadResult : SearchResultItem
{
[IndexField("title")]
public string Title { get; set; }
[IndexField("summary")]
public string Summary { get; set; }
[IndexField("type")]
public string Type { get; set; }
[IndexField("body")]
public string Body { get; set; }
[IndexField("datemodified")]
public DateTime DateModified { get; set; }
[IndexField("products")]
public string products { get; set; }
}
In the file Sitecore.ContentSearch.Lucene.DefaultIndexConfiguration.config
you need to have the following fieldReaders for indexing Droplist, Treelist, etc.
These are the default configurations in Sitecore:
<fieldReader fieldTypeName="checklist|multilist|treelist|treelistex|tree list" fieldNameFormat="{0}" fieldReaderType="Sitecore.ContentSearch.FieldReaders.MultiListFieldReader, Sitecore.ContentSearch" />
<fieldReader fieldTypeName="icon|droplist|grouped droplist" fieldNameFormat="{0}" fieldReaderType="Sitecore.ContentSearch.FieldReaders.DefaultFieldReader, Sitecore.ContentSearch" />
<fieldReader fieldTypeName="name lookup value list|name value list" fieldNameFormat="{0}" fieldReaderType="Sitecore.ContentSearch.FieldReaders.NameValueListFieldReader, Sitecore.ContentSearch" />
<fieldReader fieldTypeName="droplink|droptree|grouped droplink|tree" fieldNameFormat="{0}" fieldReaderType="Sitecore.ContentSearch.FieldReaders.LookupFieldReader, Sitecore.ContentSearch" />
Please check that you have these configurations and that you reference this config file in your custom index file.
You should use computed index fields to get such fields (Treelist and Droplist) indexed.
Reference: How to Add Computed Index
Try declaring your fields like this, this always works for me:
[IndexField("products")]
public IEnumerable<ID> Products { get; set; }
[IndexField("type")]
public IEnumerable<ID> Type { get; set; }
You could also try adjusting the storage type settings in the AddFieldByFieldTypeName section:
<fieldType fieldTypeName="droptree" storageType="YES" indexType="TOKENIZED" vectorType="NO" boost="1f" type="System.String" settingType="Sitecore.ContentSearch.LuceneProvider.LuceneSearchFieldConfiguration, Sitecore.ContentSearch.LuceneProvider" />
Please check your search's default configuration file for the field types "droplist" and "treelist". If they are not there, please add them. It will look something like this:
<fieldTypes hint="raw:AddFieldTypes">
…
<fieldType name="droplist" storageType="NO" indexType="TOKENIZED" vectorType="NO" boost="1f" />
<fieldType name="treelist" storageType="NO" indexType="TOKENIZED" vectorType="NO" boost="1f" />
…
</fieldTypes>
(Not able to comment on Sairaj's answer.)
While indexAllFields is a solution, it is not ideal. The real problem, I assume (since it happened to us), is that you are trying to index a field whose value is pulled from standard values. While viewing the item in the Content Editor it looks to have a value, but per Lucene (don't ask me why) it does not have a value.
Apparently Lucene does not read a field's value from standard values unless you tell it to indexAllFields. While a custom field could also be used to solve this problem, I feel this is either A) a bug in Sitecore or B) a configuration option that no one has been able to track down.
(Edit)
We noticed this on 8.1
After trying all the different suggestions and hitting a blank, I changed the value of indexAllFields to true in the configuration. That solved it.

WSO2 ESB Store and Forward Processor not obeying endpoint message format

We have been testing the message processors and queues on WSO2 ESB. We have been using the sampling processor quite successfully to just log data to a DB. This is POX end to end. The sampling processor correctly dequeues a message and sends it to the endpoint. The endpoint is defined as POX and non-chunked, and all this works well.
We decided to try the store and forward processor as we wanted to test out the guaranteed delivery mechanism. So we created a new processor and defined it as store and forward. In the main sequence that stores the message, we added the target.endpoint property before storing the message. The property was set to the same endpoint that was being used in the sampling scenario.
However, what we have found is that in this mode the message transformation does not happen correctly. The content type is set to text/html and the output is chunked. This causes our service to return a 415 error.
We have tried adding messageType, contentType, etc. in multiple places (the Axis HTTP sender transport, etc.), but it seems to make no difference at all.
Any guidance on this would be appreciated.
Define the following property in the sequence before sending to the endpoint:
<property name="DISABLE_CHUNKING" value="true" scope="axis2"/>
The JS code below simply inserts the messageID into the return payload to the caller. Using this with a sampling processor and sequence that simply sends to the same endpoint works fine.
<?xml version="1.0" encoding="UTF-8"?>
<definitions xmlns="http://ws.apache.org/ns/synapse">
<registry provider="org.wso2.carbon.mediation.registry.WSO2Registry">
<parameter name="cachableDuration">15000</parameter>
</registry>
<endpoint name="test_e">
<address uri="http://192.168.45.168:8080/cgi-bin/esbcgi.pl" format="pox"/>
<property name="DISABLE_CHUNKING" value="true" scope="axis2"/>
</endpoint>
<sequence name="fault" trace="enable">
<log level="full">
<property name="MESSAGE" value="Executing default 'fault' sequence"/>
<property name="ERROR_CODE" expression="get-property('ERROR_CODE')"/>
<property name="ERROR_MESSAGE" expression="get-property('ERROR_MESSAGE')"/>
</log>
<drop/>
</sequence>
<sequence name="main" onError="fault" trace="enable">
<in>
<log level="full"/>
<property name="OUT_ONLY" value="true"/>
<property name="FORCE_HTTP_1.0" value="true" scope="axis2"/>
<script language="js">
var message = mc.getMessageID();
var messageId = message.substring(9,45);
var payload = mc.getPayloadXML().*;
mc.setPayloadXML(
<payload> <messageId>{messageId}</messageId>{payload}
</payload>);
</script>
<switch xmlns:ns="http://org.apache.synapse/xsd" xmlns:soapenv="http://www.w3.org/2003/05/soap-envelope" xmlns:ns3="http://org.apache.synapse/xsd" source="get-property('To')">
<case regex=".*/TEST/.*">
<property name="target.endpoint" value="test_e" scope="default"/>
<store messageStore="TEST"/>
</case>
<default/>
</switch>
<property name="OUT_ONLY" value="false"/>
<script language="js">var serviceMessageId = mc.getMessageID();
mc.setPayloadXML(
<tag xmlns="http://tagcmd.com">
<messageId>{serviceMessageId}</messageId> </tag>);
</script>
<send>
<endpoint key="MessageService"/>
</send>
</in>
<out>
<script language="js">
var messagePayload= mc.getPayloadXML().*.*;
mc.setPayloadXML(
<eventResponse> <messageId>{messagePayload}</messageId> </eventResponse>);
</script>
<send/>
</out>
<description>The main sequence for the message mediation</description>
</sequence>
<messageStore class="org.wso2.carbon.message.store.persistence.jms.JMSMessageStore" name="TEST">
<parameter name="java.naming.factory.initial">org.apache.qpid.jndi.PropertiesFileInitialContextFactory</parameter>
<parameter name="java.naming.provider.url">repository/conf/jndi.properties</parameter>
<parameter name="store.jms.destination">APP8</parameter>
</messageStore>
<messageProcessor class="org.apache.synapse.message.processors.forward.ScheduledMessageForwardingProcessor" name="test_p2" messageStore="TEST">
<parameter name="max.deliver.attempts">1</parameter>
</messageProcessor>
</definitions>

Python Debugger out of sync with code

I am in the process of debugging something in OpenERP using Python 2.7.3. The debugger seems to get out of sync with the code when stepping through with the Next (n) command. See Code and output below. I have the same problem with Python 2.6.5. I have never experienced this before using Python pdb.
I believe the problem may be related to the way OpenERP calls my method through a Python exec() statement, by reading the code field in the OpenERP XML below. Is it possible that calling dynamically constructed Python code via exec() confuses the pdb debugger?
If this is the case is there a work around?
The code is called via the OpenERP action below:
<record id="action_wash_st_method1" model="ir.actions.server">
<field name="type">ir.actions.server</field>
<field name="condition">True</field>
<field name="state">code</field>
<field name="model_id" ref="model_view_tree_display_address_list"/>
<field eval="5" name="sequence"/>
<field name="code">
action = self.view_calc_sales_tax(cr, uid, context)
</field>
<field name="name">wash state action request</field>
</record>
<record model="ir.values" id="action_wash_st_tax_trigger_method1" >
<field name="key2" eval="'tree_but_open'" />
<field name="model" eval="'view.tree.display.address.list'" />
<field name="name">Method1 Wash State</field>
<field name="value" eval="'ir.actions.server,%d'%action_wash_st_method1"/>
<field name="object" eval="True" />
</record>