I have built a couple of computed fields and they have been working fine, but after I refactored my code and structure, the computed fields exhibit odd behaviour.
One works and lets me halt the code at a breakpoint; the other does not.
The code looks identical and the index config is the same; nothing changed other than the assembly and namespace values after the refactoring.
<fieldMap ref="contentSearch/indexConfigurations/defaultLuceneIndexConfiguration/fieldMap">
<fieldNames hint="raw:AddFieldByFieldName">
<field fieldName="DateStart" storageType="YES" indexType="UNTOKENIZED" vectorType="YES" boost="1f" type="System.String" settingType="Sitecore.ContentSearch.LuceneProvider.LuceneSearchFieldConfiguration, Sitecore.ContentSearch.LuceneProvider" />
<field fieldName="DateEnd" storageType="YES" indexType="UNTOKENIZED" vectorType="YES" boost="1f" type="System.String" settingType="Sitecore.ContentSearch.LuceneProvider.LuceneSearchFieldConfiguration, Sitecore.ContentSearch.LuceneProvider" />
<field fieldName="Appetizer" storageType="YES" indexType="TOKENIZED" vectorType="YES" boost="1f" type="System.String" settingType="Sitecore.ContentSearch.LuceneProvider.LuceneSearchFieldConfiguration, Sitecore.ContentSearch.LuceneProvider" />
<field fieldName="Title" storageType="YES" indexType="TOKENIZED" vectorType="YES" boost="1f" type="System.String" settingType="Sitecore.ContentSearch.LuceneProvider.LuceneSearchFieldConfiguration, Sitecore.ContentSearch.LuceneProvider" />
<field fieldName="ApplicationDeadline" storageType="YES" indexType="TOKENIZED" vectorType="YES" boost="1f" type="System.String" settingType="Sitecore.ContentSearch.LuceneProvider.LuceneSearchFieldConfiguration, Sitecore.ContentSearch.LuceneProvider" />
<field fieldName="Location" storageType="YES" indexType="TOKENIZED" vectorType="YES" boost="1f" type="System.String" settingType="Sitecore.ContentSearch.LuceneProvider.LuceneSearchFieldConfiguration, Sitecore.ContentSearch.LuceneProvider" />
<field fieldName="Engagement" storageType="YES" indexType="TOKENIZED" vectorType="YES" boost="1f" type="System.String" settingType="Sitecore.ContentSearch.LuceneProvider.LuceneSearchFieldConfiguration, Sitecore.ContentSearch.LuceneProvider" />
<field fieldName="computedCountries" storageType="YES" indexType="UNTOKENIZED" vectorType="YES" boost="1f" type="System.String" settingType="Sitecore.ContentSearch.LuceneProvider.LuceneSearchFieldConfiguration, Sitecore.ContentSearch.LuceneProvider">
<Analyzer type="Sitecore.ContentSearch.LuceneProvider.Analyzers.LowerCaseKeywordAnalyzer, Sitecore.ContentSearch.LuceneProvider" />
</field>
<field fieldName="computedMarkets" storageType="YES" indexType="UNTOKENIZED" vectorType="YES" boost="1f" type="System.String" settingType="Sitecore.ContentSearch.LuceneProvider.LuceneSearchFieldConfiguration, Sitecore.ContentSearch.LuceneProvider">
<Analyzer type="Sitecore.ContentSearch.LuceneProvider.Analyzers.LowerCaseKeywordAnalyzer, Sitecore.ContentSearch.LuceneProvider" />
</field>
</fieldNames>
</fieldMap>
<fields hint="raw:AddComputedIndexField">
<field fieldName="computedMarkets">Ram.SC.ContentSearch.ComputedFields.MarketComputedField, Ram.SC.ContentSearch</field>
<field fieldName="computedCountries">Ram.SC.ContentSearch.ComputedFields.CountryComputedField, Ram.SC.ContentSearch</field>
</fields>
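For reference, a minimal sketch of what a class registered this way is expected to look like (it implements Sitecore.ContentSearch.ComputedFields.IComputedIndexField; the namespace here mirrors the config above, and the method body is illustrative only):
using Sitecore.ContentSearch;
using Sitecore.ContentSearch.ComputedFields;

namespace Ram.SC.ContentSearch.ComputedFields
{
    public class MarketComputedField : IComputedIndexField
    {
        // Populated from the fieldName attribute in the configuration above.
        public string FieldName { get; set; }
        public string ReturnType { get; set; }

        public object ComputeFieldValue(IIndexable indexable)
        {
            var item = indexable as SitecoreIndexableItem;
            if (item == null)
                return null;

            // ... compute and return the value to store in the index ...
            return null;
        }
    }
}
The type attribute in the AddComputedIndexField entry must match this class's full name and the assembly it is compiled into, which is why a namespace/assembly rename is the first thing to double-check.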
So I can't figure out where the problem lies.
I hope you can help me out :)
/Robin
Thanks for the comments.
In the end it was an inconsistency between framework versions, which I only found out by looking in the crawling.log file.
/Robin
I am having trouble indexing a folder in Solr.
example-data-config.xml:
<dataConfig>
<dataSource type="BinFileDataSource" />
<document>
<entity name="files"
dataSource="null"
rootEntity="false"
processor="FileListEntityProcessor"
baseDir="C:\Temp\" fileName=".*"
recursive="true"
onError="skip">
<field column="fileAbsolutePath" name="id" />
<field column="fileSize" name="size" />
<field column="fileLastModified" name="lastModified" />
<entity
name="documentImport"
processor="TikaEntityProcessor"
url="${files.fileAbsolutePath}"
format="text">
<field column="file" name="fileName"/>
<field column="Author" name="author" meta="true"/>
<field column="text" name="text"/>
</entity>
</entity>
</document>
</dataConfig>
Then I create the schema.xml:
<field name="id" type="string" indexed="true" stored="true" required="true" multiValued="false" />
<field name="fileName" type="string" indexed="true" stored="true" />
<field name="author" type="string" indexed="true" stored="true" />
<field name="title" type="string" indexed="true" stored="true" />
<field name="size" type="plong" indexed="true" stored="true" />
<field name="lastModified" type="pdate" indexed="true" stored="true" />
<field name="text" type="text_general" indexed="true" stored="true" multiValued="true"/>
Finally, I modify solrconfig.xml, adding the request handler and the dataimporthandler and dataimporthandler-extras jars:
<requestHandler name="/dataimport" class="solr.DataImportHandler">
<lst name="defaults">
<str name="config">example-data-config.xml</str>
</lst>
</requestHandler>
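For reference, with this handler registered the import is typically started from the Dataimport screen in the Solr admin UI, or with a request such as the following (host, port and core name are placeholders):
http://localhost:8983/solr/mycore/dataimport?command=full-import&clean=true&commit=true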
I run it, and the result is:
Inside that folder there are around 20,000 files in different formats (.py, .java, .wsdl, etc.).
Any suggestions will be appreciated. Thanks :)
Check your Solr logs. The answer to what the root cause is will definitely be there. I faced the same situation once and found through the Solr logs that my DataImportHandler was throwing exceptions because of encrypted documents present in the folder. Your reason may be different, but first analyze your Solr logs, execute your entity again in the DataImport section, and then check the immediate logs for errors by going to the Logging section of the admin page. If you are getting errors other than the one I mentioned, post them here so they can be understood and deciphered.
I'm trying to configure Solr to allow querying data from my DB. After configuring it, I added a new field that is a foreign key to another table.
Old records have this field set to NULL.
DB schema
Table: offers
Fields: id, type_material (foreign key), (other fields omitted)
Table: materials
Fields: id, name
Solr config
File db-data-config.xml:
<dataConfig>
<dataSource type="JdbcDataSource" driver="com.mysql.jdbc.Driver" url="jdbc:mysql://path" user="user" password="pwd" />
<document name="offers">
<entity name="offers"
query="SELECT o.* FROM offers o inner join offer_group g on o.offer_group_id = g.id where g.status = 0"
deltaQuery="select id from offers where updated_at > '${dataimporter.last_index_time}'">
<field column="id" name="id" />
<field column="product_code" name="product_code" />
<field column="gender" name="gender" />
<field column="colors" name="colors" />
<field column="year" name="year" />
<field column="tags" name="tags" />
<field column="size" name="size" />
<field column="size_typology" name="size_typology" />
<field column="season" name="season" />
<field column="quantity" name="quantity" />
<field column="price" name="price" />
<field column="typology" name="typology" />
<field column="model" name="model" />
<entity name="brands"
query="select name from brands where id='${offers.brand_id}'"
deltaQuery="select id from brands where updated_at > '${dataimporter.last_index_time}'" >
<field name="brand_name" column="name" />
</entity>
<entity name="materials"
query="select name from materials where id='${offers.type_material}' OR '${offers.type_material}' = NULL">
<field name="material_name" column="name" />
</entity>
<entity name="offer_group"
query="select shop_id from offer_group where id='${offers.offer_group_id}'"
deltaQuery="select id from offer_group where updated_at > '${dataimporter.last_index_time}'" >
<field name="shop_id" column="shop_id" />
</entity>
</entity>
</document>
</dataConfig>
File schema.xml:
<?xml version="1.0" encoding="UTF-8" ?>
<schema name="offers" version="1.5">
<fieldType name="string" class="solr.StrField"></fieldType>
<fieldType name="boolean" class="solr.BoolField" sortMissingLast="true"/>
<fieldType name="int" class="solr.TrieIntField" precisionStep="0" positionIncrementGap="0"/>
<fieldType name="float" class="solr.TrieFloatField" precisionStep="0" positionIncrementGap="0"/>
<fieldType name="long" class="solr.TrieLongField" precisionStep="0" positionIncrementGap="0"/>
<fieldType name="double" class="solr.TrieDoubleField" precisionStep="0" positionIncrementGap="0"/>
<fieldType name="date" class="solr.TrieDateField" precisionStep="0" positionIncrementGap="0"/>
<!-- Just like text_general except it reverses the characters of
each token, to enable more efficient leading wildcard queries. -->
<fieldType name="text_general" class="solr.TextField" positionIncrementGap="100">
<analyzer type="index">
<tokenizer class="solr.StandardTokenizerFactory"/>
<filter class="solr.StandardFilterFactory"/>
<filter class="solr.ASCIIFoldingFilterFactory" />
<filter class="solr.LowerCaseFilterFactory"/>
<filter class="solr.ReversedWildcardFilterFactory" withOriginal="true"
maxPosAsterisk="3" maxPosQuestion="2" maxFractionAsterisk="0.33" minTrailing="3" />
<filter class="solr.EdgeNGramFilterFactory" minGramSize="1" maxGramSize="15" />
</analyzer>
<analyzer type="query">
<tokenizer class="solr.StandardTokenizerFactory"/>
<filter class="solr.ASCIIFoldingFilterFactory" />
<filter class="solr.StandardFilterFactory"/>
<filter class="solr.LowerCaseFilterFactory"/>
</analyzer>
</fieldType>
<fieldType name="random" class="solr.RandomSortField" indexed="true" />
<dynamicField name="random_*" type="random" indexed="true" stored="false"/>
<!-- End randomize offers-->
<field name="_version_" type="long" indexed="true" stored="true" required="false"/>
<field name="id" type="long" indexed="true" stored="true" required="true" />
<field name="brand_id" type="long" indexed="true" stored="true" required="true" />
<field name="shop_id" type="long" indexed="true" stored="true" required="true" />
<field name="brand_name" type="text_general" indexed="true" stored="true" required="true" />
<field name="type_material" type="long" indexed="true" stored="true" default="NULL" />
<field name="material_name" type="text_general" indexed="true" stored="true" default="NULL" />
<field name="offer_group_id" type="long" indexed="true" stored="true" required="true" />
<field name="product_code" type="text_general" indexed="true" stored="true" default="NULL" />
<field name="gender" type="string" indexed="true" stored="true" default="NULL" />
<field name="colors" type="text_general" indexed="true" stored="true" default="NULL" />
<field name="year" type="text_general" indexed="true" stored="true" default="NULL" />
<field name="tags" type="text_general" indexed="true" stored="true" default="NULL" />
<field name="size" type="string" indexed="true" stored="true" default="NULL" />
<field name="size_typology" type="string" indexed="true" stored="true" default="NULL" />
<field name="season" type="text_general" indexed="true" stored="true" default="NULL" />
<field name="quantity" type="string" indexed="true" stored="true" default="NULL" />
<field name="price" type="float" indexed="true" stored="true" default="NULL" />
<field name="typology" type="text_general" indexed="true" stored="true" default="NULL" />
<field name="photo_url" type="string" indexed="true" stored="true" required="true" />
<field name="model" type="text_general" indexed="true" stored="true" default="NULL" />
<field name="created_at" type="date" indexed="true" stored="true"/>
<field name="updated_at" type="date" indexed="true" stored="true"/>
<field name="text" type="text_general" indexed="true" stored="false" multiValued="true"/>
<uniqueKey>id</uniqueKey>
<copyField source="colors" dest="text"/>
<copyField source="year" dest="text"/>
<copyField source="season" dest="text"/>
<copyField source="typology" dest="text"/>
<copyField source="model" dest="text"/>
<copyField source="tags" dest="text"/>
<copyField source="product_code" dest="text"/>
<copyField source="brand_name" dest="text"/>
<copyField source="material_name" dest="text" />
<copyField source="gender" dest="text"/>
</schema>
When the search query runs, it only returns offers whose type_material field is not NULL.
I want to retrieve those offers as well.
Just use a filter query &fq=type_material:NULL
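If the old records simply have no value in type_material (rather than holding a literal NULL value), a negative range query is the usual way to match them; a sketch using the field name from the schema above:
&fq=-type_material:[* TO *]
This matches only documents that have no value in that field.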
I am trying to import a CSV file with the BULK INSERT function. I am doing it through SQL because it can be an automated process. The problem I picked up is that the CSV file I get from the client has commas in the item description, e.g. Rosetta 10,5x25x4. When I try to import it, that comma in the description causes a problem. Here is the script I am running:
BULK INSERT ORDERS_DATA
FROM 'C:\back_orders_2013.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
TABLOCK
)
Is there a way to ignore that, like in Excel where you can treat consecutive delimiters as one?
Here is the solution:
<?xml version="1.0"?>
<BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<RECORD>
<FIELD ID="1" xsi:type="CharTerm" TERMINATOR=',' MAX_LENGTH="100" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
<FIELD ID="2" xsi:type="CharTerm" TERMINATOR=',' MAX_LENGTH="100" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
<FIELD ID="3" xsi:type="CharTerm" TERMINATOR=',' MAX_LENGTH="100" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
<FIELD ID="4" xsi:type="CharTerm" TERMINATOR=',' MAX_LENGTH="100" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
<FIELD ID="5" xsi:type="CharTerm" TERMINATOR=',' MAX_LENGTH="100" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
<FIELD ID="6" xsi:type="CharTerm" TERMINATOR=',' MAX_LENGTH="100" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
<FIELD ID="7" xsi:type="CharTerm" TERMINATOR=',' MAX_LENGTH="100" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
<FIELD ID="8" xsi:type="CharTerm" TERMINATOR=',' MAX_LENGTH="100" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
<FIELD ID="9" xsi:type="CharTerm" TERMINATOR=',' MAX_LENGTH="100" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
<FIELD ID="10" xsi:type="CharTerm" TERMINATOR=',' MAX_LENGTH="100" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
<FIELD ID="11" xsi:type="CharTerm" TERMINATOR=',' MAX_LENGTH="100" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
<FIELD ID="12" xsi:type="CharTerm" TERMINATOR=',' MAX_LENGTH="100" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
<FIELD ID="13" xsi:type="CharTerm" TERMINATOR=',' MAX_LENGTH="100" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
<FIELD ID="14" xsi:type="CharTerm" TERMINATOR=',' MAX_LENGTH="100" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
<FIELD ID="15" xsi:type="CharTerm" TERMINATOR=',' MAX_LENGTH="100" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
<FIELD ID="16" xsi:type="CharTerm" TERMINATOR=',' MAX_LENGTH="100" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
<FIELD ID="17" xsi:type="CharTerm" TERMINATOR=',' MAX_LENGTH="100" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
<FIELD ID="18" xsi:type="CharTerm" TERMINATOR=',' MAX_LENGTH="100" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
<FIELD ID="19" xsi:type="CharTerm" TERMINATOR='\n' MAX_LENGTH="100" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
</RECORD>
<ROW>
<COLUMN SOURCE="1" NAME="RKID" xsi:type="SQLVARYCHAR" />
<COLUMN SOURCE="2" NAME="ORNO" xsi:type="SQLVARYCHAR" />
<COLUMN SOURCE="3" NAME="PONO" xsi:type="SQLVARYCHAR" />
<COLUMN SOURCE="4" NAME="ITEM_CODE" xsi:type="SQLVARYCHAR" />
<COLUMN SOURCE="5" NAME="ITEM_DESCRIPTION" xsi:type="SQLVARYCHAR" />
<COLUMN SOURCE="6" NAME="ORDER_QTY" xsi:type="SQLVARYCHAR" />
<COLUMN SOURCE="7" NAME="DELIVER_QTY" xsi:type="SQLVARYCHAR" />
<COLUMN SOURCE="8" NAME="B_QTY" xsi:type="SQLVARYCHAR" />
<COLUMN SOURCE="9" NAME="RTS_DATE" xsi:type="SQLVARYCHAR" />
<COLUMN SOURCE="10" NAME="PRICE" xsi:type="SQLVARYCHAR" />
<COLUMN SOURCE="11" NAME="PLEV" xsi:type="SQLVARYCHAR" />
<COLUMN SOURCE="12" NAME="DISCOUNT1" xsi:type="SQLVARYCHAR" />
<COLUMN SOURCE="13" NAME="DISCOUNT2" xsi:type="SQLVARYCHAR" />
<COLUMN SOURCE="14" NAME="DISCOUNT3" xsi:type="SQLVARYCHAR" />
<COLUMN SOURCE="15" NAME="TOTAL" xsi:type="SQLVARYCHAR" />
<COLUMN SOURCE="16" NAME="CURRENCY" xsi:type="SQLVARYCHAR" />
<COLUMN SOURCE="17" NAME="ORDER_DATE" xsi:type="SQLVARYCHAR" />
<COLUMN SOURCE="18" NAME="BKYN" xsi:type="SQLVARYCHAR" />
<COLUMN SOURCE="19" NAME="PENTA_REF" xsi:type="SQLVARYCHAR" />
</ROW>
</BCPFORMAT>
This works 100%. The error I received:
String or binary data would be truncated. The statement has been terminated.
was because of the length of one of my data fields.
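For completeness, a sketch of how the format file is then referenced from the BULK INSERT statement; the path C:\orders_format.xml is an assumption for wherever you save the XML above:
BULK INSERT ORDERS_DATA
FROM 'C:\back_orders_2013.csv'
WITH
(
FORMATFILE = 'C:\orders_format.xml', -- field and row terminators now come from the format file
FIRSTROW = 2,
TABLOCK
)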
I'm having issues with deltaQuery, where it doesn't run automatically. Below is the data-config I have:
<dataConfig>
<dataSource type="JdbcDataSource"
driver="com.microsoft.sqlserver.jdbc.SQLServerDriver"
url="jdbc:sqlserver://WTL-sql-1.com;databaseName=eng_metrics"
user="metrics"
password="metrics"/>
<document name="content">
<entity name="id"
query="select defect_id,headline,description,modify_date,issue_type,category,product,state FROM defects WHERE state not like 'Duplicate'"
deltaImportQuery="select defect_id,headline,description,modify_date,issue_type,category,product,state FROM defects WHERE defect_id = '${dataimporter.delta.defect_id}' and state not like 'Duplicate'"
deltaQuery="select defect_id FROM defects WHERE modify_date > '${dataimporter.last_index_time}'">
<field column="defect_id" name="defect_id" />
<field column="headline" name="headline" />
<field column="description" name="description" />
<field column="modify_date" name="modify_date" />
<field column="issue_type" name="issue_type" />
<field column="category" name="category" />
<field column="product" name="product" />
<field column="state" name="state" />
</entity>
</document>
</dataConfig>
But what I see is that no matter how modify_date changes in the DB, no update happens unless I run a delta import explicitly.
Can someone share some thoughts on whether I need to change some config or query to make that happen automatically?
Actually, the DataImportHandler will not do it automatically. You have to trigger it by calling the delta-import URL.
You may want something like this:
http://wiki.apache.org/solr/DataImportHandler#Scheduling
or you can implement something similar yourself.
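For reference, the delta import is typically triggered with a request like the one below (host, port and core name are placeholders); a cron job or Windows scheduled task can simply call this URL on whatever interval you need:
http://localhost:8983/solr/mycore/dataimport?command=delta-import&clean=false&commit=true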
But I have this data-config, which works fine in some cases:
<dataConfig>
<dataSource type="JdbcDataSource"
driver="com.microsoft.sqlserver.jdbc.SQLServerDriver"
url="jdbc:sqlserver://127.0.0.1\SQLEXPRESS;databaseName=sustaining_trends"
user="sa"
password="metrics"/>
<document name="content">
<entity name="id"
query="select id,createtime,lastmodified,modifiedby,title,keywords,general,symptom,diagnosis,resolution FROM trends"
deltaImportQuery="select id,createtime,lastmodified,modifiedby,title,keywords,general,symptom,diagnosis,resolution FROM trends WHERE id = ${dataimporter.delta.id}"
deltaQuery="select id FROM trends WHERE lastmodified > '${dataimporter.last_index_time}' or createtime > '${dataimporter.last_index_time}'">
<field column="id" name="trendid" />
<field column="lastmodified" name="lastmodified" />
<field column="modifiedby" name="modifiedby" />
<field column="title" name="title" />
<field column="keywords" name="keywords" />
<field column="general" name="general" />
<field column="symptom" name="symptom" />
<field column="diagnosis" name="diagnosis" />
<field column="resolution" name="resolution" />
</entity>
</document>
</dataConfig>
Here, if an item is modified, it gets updated immediately without any intervention, but if new data is created, it doesn't get picked up until I either run a manual delta import or some existing entry gets modified.
How does this work automatically in the case of modification but not in the case of creation?
I have a problem with a bulk load into my database. The server throws the error "String or binary data would be truncated", but my data seems to be OK.
I'm using a UTF-16LE data file with an XML format file.
Thanks for your answers.
Here's my format file:
<BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<RECORD>
<FIELD ID="1" xsi:type="NCharTerm" TERMINATOR=";\0" />
<FIELD ID="2" xsi:type="NCharTerm" TERMINATOR=";\0" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
<FIELD ID="3" xsi:type="NCharTerm" TERMINATOR=";\0" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
<FIELD ID="4" xsi:type="NCharTerm" TERMINATOR=";\0" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
<FIELD ID="5" xsi:type="NCharTerm" TERMINATOR=";\0" />
<FIELD ID="6" xsi:type="NCharTerm" TERMINATOR=";\0" />
<FIELD ID="7" xsi:type="NCharTerm" TERMINATOR=";\0" />
<FIELD ID="8" xsi:type="NCharTerm" TERMINATOR=";\0" />
<FIELD ID="9" xsi:type="NCharTerm" TERMINATOR=";\0" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
<FIELD ID="10" xsi:type="NCharTerm" TERMINATOR=";\0" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
<FIELD ID="11" xsi:type="NCharTerm" TERMINATOR=";\0" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
<FIELD ID="12" xsi:type="NCharTerm" TERMINATOR=";\0" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
<FIELD ID="13" xsi:type="NCharTerm" TERMINATOR=";\0" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
<FIELD ID="14" xsi:type="NCharTerm" TERMINATOR=";\0" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
<FIELD ID="15" xsi:type="NCharTerm" TERMINATOR="\n\0" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
</RECORD>
<ROW>
<COLUMN SOURCE="1" NAME="RefundId" xsi:type="SQLINT"/>
<COLUMN SOURCE="2" NAME="Date" xsi:type="SQLNCHAR"/>
<COLUMN SOURCE="3" NAME="VariableSymbol" xsi:type="SQLNVARCHAR"/>
<COLUMN SOURCE="4" NAME="RefundType" xsi:type="SQLNVARCHAR"/>
<COLUMN SOURCE="5" NAME="TollAmount" xsi:type="SQLNUMERIC" PRECISION="28" SCALE="12"/>
<COLUMN SOURCE="6" NAME="DepositAmount" xsi:type="SQLNUMERIC" PRECISION="28" SCALE="12"/>
<COLUMN SOURCE="7" NAME="OperativeAmount" xsi:type="SQLNUMERIC" PRECISION="28" SCALE="12"/>
<COLUMN SOURCE="8" NAME="OIAmount" xsi:type="SQLNUMERIC" PRECISION="28" SCALE="12"/>
<COLUMN SOURCE="9" NAME="Currency" xsi:type="SQLNVARCHAR"/>
<COLUMN SOURCE="10" NAME="Prefix" xsi:type="SQLNVARCHAR"/>
<COLUMN SOURCE="11" NAME="BankAccountNumber" xsi:type="SQLNVARCHAR"/>
<COLUMN SOURCE="12" NAME="BankCode" xsi:type="SQLNVARCHAR"/>
<COLUMN SOURCE="13" NAME="BankIBANCode" xsi:type="SQLNVARCHAR"/>
<COLUMN SOURCE="14" NAME="BankSwiftCode" xsi:type="SQLNVARCHAR"/>
<COLUMN SOURCE="15" NAME="BeneficiaryName" xsi:type="SQLNVARCHAR"/>
</ROW>
</BCPFORMAT>
Here's my data file:
301;2008/05/05;444441;CN;50;1550;70;900;CZK;;8888888888;0800;;;
302;2008/05/06;444442;UP;50;1550;70;900;CZK;;8888888888;1234;;;
303;2008/05/07;444443;OP;50;1550;70;900;CZK;;8888888888;;;;
304;2008/05/08;444444;CN;60;1550;70;900;CZK;;;0400;;;
305;2008/05/09;444445;UP;70;1550;70;900;CZK;;1234567891;0800;;;
306;2008/05/10;444446;OP;80;1550;70;900;CZK;;8888888888;0800;;;
307;2008/05/11;444447;CN;90;1550;70;900;CZK;;8888888888;0800;CZ7308000000008888888888;;
308;2008/05/12;444448;UP;100;1550;70;900;CZK;;8888888888;0800;;12345678901;
309;2008/05/13;444449;OP;110;1550;70;900;CZK;;;;12234567890qqqq;COBADEFF;**123456789012345678901234567890qqqq**
310;2008/05/14;444450;CN;120;1550;70;900;CZK;;;;;COBADEFF;Herr Norbert Ramscheid
312;2008/05/16;444452;OP;140;1550;70;900;CZK;;;;RO49AAAA1B31007593840000;RNCBROBU;Mr. Papalescu
313;2008/05/17;444453;CN;150;1550;70;900;CZK;;678907890;;;RNCBROBU;Mojmir Fagaras
314;2008/05/18;444454;UP;160;1550;70;900;CZK;;;;;RNCBROBU;Mr. Without Bank Account
315;2008/05/19;444455;OP;170;1550;70;900;CZK;;;;RO49AAAA1B31007593840000;;NoSWIFT Bill
316;2008/05/20;444456;CN;180;2000;70;900;CZK;;;;RO49AAAA1B31007593840000;RNCBROBU;
317;2008/05/21;444457;UP;190;1550;70;900;CZK;;;;DE89370400440532013000;COBADEFF;
I assume there could be a problem with the destination table the data is being loaded into.
Example
E.g. when you load a string of 10 characters into a column that is, say, of varchar(6) data type, the data cannot be loaded into that column, as it only supports 6 characters, and you get exactly the error you are getting now: the data would be truncated.
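As a sketch of how to check for (and fix) such a mismatch; the destination table name Refunds is hypothetical, while BeneficiaryName comes from the format file above:
-- List the destination column types and lengths
SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'Refunds';
-- Widen a column that turns out to be too short for the incoming data
ALTER TABLE Refunds ALTER COLUMN BeneficiaryName NVARCHAR(100);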