I tend to use something like this:
return NHibernateSession.Current.CreateSQLQuery
(
#"
some sql
"
)
.SetResultTransformer(NHibernate.Transform.Transformers.AliasToBean(typeof(someviewmodel)))
.List<someviewmodel>();
to map my SQL output to a viewmodel. Is it straightforward to achieve the same mapping to a Dictionary while using CreateSQLQuery, when the query returns two int columns?
Thanks.
You would basically need to create your own transformer and specify it in your SetResultTransformer call.
It might look something like this:
public class CustomDictionaryTransformer : IResultTransformer
{
    public object TransformTuple(object[] tuple, string[] aliases)
    {
        // assumes the query returns exactly two int columns,
        // one aliased "key" and the other "value"
        int key = 0, value = 0;
        for (int i = 0; i < tuple.Length; i++)
        {
            if (aliases[i] == "key") key = (int)tuple[i];
            else value = (int)tuple[i];
        }
        // KeyValuePair is immutable, so build it once both halves are known
        return new KeyValuePair<int, int>(key, value);
    }

    public IList TransformList(IList collection)
    {
        return collection;
    }
}
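A minimal usage sketch, assuming the SQL aliases its two int columns as "key" and "value" (the table and column names here are placeholders), with ToDictionary collecting the pairs at the end:

var pairs = NHibernateSession.Current.CreateSQLQuery
(
    @"
    SELECT SomeId AS key, SomeCount AS value FROM SomeTable
    "
)
.SetResultTransformer(new CustomDictionaryTransformer())
.List<KeyValuePair<int, int>>();

// requires using System.Linq;
var dict = pairs.ToDictionary(p => p.Key, p => p.Value);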
I want to run a query like "banana apple cherry" against a "fruit" field.
Every fruit in a dessert needs to be in the query, but not every fruit in the query needs to be in the dessert.
Here's an example:
NAME      FRUIT                       RESULT
Dessert1  banana apple                OK (we got banana and apple in the query)
Dessert2  cherry apple banana         OK (the order doesn't matter)
Dessert3  cherry apple banana melon   NO (melon is missing from the query)
public class ArrayStringFieldBridge implements TwoWayFieldBridge {
@Override
public Object get(String name, Document document) {
IndexableField[] fields = document.getFields(name);
String[] values = new String[fields.length];
for (int i=0; i<fields.length; i++) {
values[i] = fields[i].stringValue();
}
return values;
}
@Override
public String objectToString(Object value) {
return StringUtils.join((String[])value, " ");
}
@Override
public void set(String name, Object value, Document document, LuceneOptions luceneOptions) {
String newString = StringUtils.join((String[])value, " ");
Field field = new Field(name, newString, luceneOptions.getStore(), luceneOptions.getIndex(), luceneOptions.getTermVector());
field.setBoost(luceneOptions.getBoost());
document.add(field);
}
}
@Indexed
@AnalyzerDef(name = "customanalyzer",
tokenizer = @TokenizerDef(factory = StandardTokenizerFactory.class))
public class Dessert {
@Analyzer(definition="customanalyzer")
@Field(name = "equipment", index=Index.YES, analyze = Analyze.YES, store=Store.YES)
@FieldBridge(impl=ArrayStringFieldBridge.class)
public String[] fruits = new String[]{};
}
Even if you are not using hibernate-search, any suggestions about the theory for handling this would be great... Thank you
Step 1: Fire the Lucene query "fruit:banana OR fruit:apple OR fruit:cherry"
Step 2: Gather all matched dessert documents
Step 3: Post-process each matched dessert document against the query:
convert the matched document to an array of terms - matchDocArr: {banana, apple}
convert the query terms to an array - queryArr: {banana, apple, cherry}
iterate over matchDocArr and make sure each of its terms is found in queryArr; if not (the melon use case), knock out this matched document
Here is an example function which needs to be called for every matched doc
public static boolean isDocInterested(String query, String matchDoc)
{
    List<String> matchDocArr = Arrays.asList(matchDoc.split(" "));
    List<String> queryArr = Arrays.asList(query.split(" "));
    int matchCounter = 0;
    for (int i = 0; i < matchDocArr.size(); i++)
    {
        if (queryArr.contains(matchDocArr.get(i)))
            matchCounter++;
    }
    // interested only if every term of the matched doc was found in the query
    return matchCounter == matchDocArr.size();
}
If the function returns TRUE we are interested in the doc/dessert; if it returns FALSE, ignore it.
Of course this function can be written in many different ways, but I think you get the point.
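For instance, run against the dessert examples above, a quick sanity check would look like this:

String query = "banana apple cherry";
isDocInterested(query, "banana apple");              // true  -> keep Dessert1
isDocInterested(query, "cherry apple banana");       // true  -> keep Dessert2
isDocInterested(query, "cherry apple banana melon"); // false -> knock out Dessert3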
I am trying to serialize Guava Range objects to JSON using Gson; however, the default serialization fails, and I'm unsure how to correctly implement a TypeAdapter for this generic type.
Gson gson = new Gson();
Range<Integer> range = Range.closed(10, 20);
String json = gson.toJson(range);
System.out.println(json);
Range<Integer> range2 = gson.fromJson(json,
new TypeToken<Range<Integer>>(){}.getType());
System.out.println(range2);
assertEquals(range2, range);
This fails like so:
{"lowerBound":{"endpoint":10},"upperBound":{"endpoint":20}}
PASSED: typeTokenInterface
FAILED: range
java.lang.RuntimeException: Unable to invoke no-args constructor for
com.google.common.collect.Cut<java.lang.Integer>. Register an
InstanceCreator with Gson for this type may fix this problem.
at com.google.gson.internal.ConstructorConstructor$12.construct(
ConstructorConstructor.java:210)
...
Note that the default serialization actually loses information - it fails to report whether the endpoints are open or closed. I would prefer to see it serialized similar to its toString(), e.g. [10‥20], but simply calling toString() won't work with generic Range instances, as the elements of the range may not be primitives (Joda-Time LocalDate instances, for example). For the same reason, implementing a custom TypeAdapter seems difficult, as we don't know how to deserialize the endpoints.
I've implemented most of a TypeAdapterFactory based on the template provided for Multimap, which ought to work, but now I'm stuck on the generics. Here's what I have so far:
public class RangeTypeAdapterFactory implements TypeAdapterFactory {
public <T> TypeAdapter<T> create(Gson gson, TypeToken<T> typeToken) {
Type type = typeToken.getType();
if (typeToken.getRawType() != Range.class
|| !(type instanceof ParameterizedType)) {
return null;
}
Type elementType = ((ParameterizedType) type).getActualTypeArguments()[0];
TypeAdapter<?> elementAdapter = (TypeAdapter<?>)gson.getAdapter(TypeToken.get(elementType));
// Bound mismatch: The generic method newRangeAdapter(TypeAdapter<E>) of type
// GsonUtils.RangeTypeAdapterFactory is not applicable for the arguments
// (TypeAdapter<capture#4-of ?>). The inferred type capture#4-of ? is not a valid
// substitute for the bounded parameter <E extends Comparable<?>>
return (TypeAdapter<T>) newRangeAdapter(elementAdapter);
}
private <E extends Comparable<?>> TypeAdapter<Range<E>> newRangeAdapter(final TypeAdapter<E> elementAdapter) {
return new TypeAdapter<Range<E>>() {
@Override
public void write(JsonWriter out, Range<E> value) throws IOException {
if (value == null) {
out.nullValue();
return;
}
String repr = (value.hasLowerBound() && value.lowerBoundType() == BoundType.CLOSED ? "[" : "(") +
(value.hasLowerBound() ? elementAdapter.toJson(value.lowerEndpoint()) : "-\u221e") +
'\u2025' +
(value.hasUpperBound() ? elementAdapter.toJson(value.upperEndpoint()) : "+\u221e") +
(value.hasUpperBound() && value.upperBoundType() == BoundType.CLOSED ? "]" : ")");
out.value(repr);
}
@Override
public Range<E> read(JsonReader in) throws IOException {
if (in.peek() == JsonToken.NULL) {
in.nextNull();
return null;
}
String[] endpoints = in.nextString().split("\u2025");
E lower = elementAdapter.fromJson(endpoints[0].substring(1));
E upper = elementAdapter.fromJson(endpoints[1].substring(0,endpoints[1].length()-1));
return Range.range(lower, endpoints[0].charAt(0) == '[' ? BoundType.CLOSED : BoundType.OPEN,
upper, endpoints[1].charAt(endpoints[1].length()-1) == ']' ? BoundType.CLOSED : BoundType.OPEN);
}
};
}
}
However the return (TypeAdapter<T>) newRangeAdapter(elementAdapter); line has a compilation error and I'm now at a loss.
What's the best way to resolve this error? Is there a better way to serialize Range objects that I'm missing? What about if I want to serialize RangeSets?
Rather frustrating that the Google utility library and Google serialization library seem to require so much glue to work together :(
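For what it's worth, the usual way past that bound-mismatch error is an unchecked cast through a raw type, the same trick the Multimap template ultimately leans on. A sketch of the create method under that approach (not necessarily the cleanest fix):

@SuppressWarnings({"unchecked", "rawtypes"}) // the element type is only known at runtime
public <T> TypeAdapter<T> create(Gson gson, TypeToken<T> typeToken) {
    Type type = typeToken.getType();
    if (typeToken.getRawType() != Range.class || !(type instanceof ParameterizedType)) {
        return null;
    }
    Type elementType = ((ParameterizedType) type).getActualTypeArguments()[0];
    TypeAdapter<?> elementAdapter = gson.getAdapter(TypeToken.get(elementType));
    // the raw TypeAdapter argument erases the Comparable bound, so the call compiles
    return (TypeAdapter<T>) newRangeAdapter((TypeAdapter) elementAdapter);
}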
This feels somewhat like reinventing the wheel, but it was a lot quicker to put together and test than getting Gson to behave, so at least for the present I'll be using the following Converters to serialize Range and RangeSet*, rather than Gson.
/**
* Converter between Range instances and Strings, essentially a custom serializer.
* Ideally we'd let Gson or Guava do this for us, but presently this is cleaner.
*/
public static <T extends Comparable<? super T>> Converter<Range<T>, String> rangeConverter(final Converter<T, String> elementConverter) {
final String NEG_INFINITY = "-\u221e";
final String POS_INFINITY = "+\u221e";
final String DOTDOT = "\u2025";
return new Converter<Range<T>, String>() {
@Override
protected String doForward(Range<T> range) {
return (range.hasLowerBound() && range.lowerBoundType() == BoundType.CLOSED ? "[" : "(") +
(range.hasLowerBound() ? elementConverter.convert(range.lowerEndpoint()) : NEG_INFINITY) +
DOTDOT +
(range.hasUpperBound() ? elementConverter.convert(range.upperEndpoint()) : POS_INFINITY) +
(range.hasUpperBound() && range.upperBoundType() == BoundType.CLOSED ? "]" : ")");
}
@Override
protected Range<T> doBackward(String range) {
String[] endpoints = range.split(DOTDOT);
Range<T> ret = Range.all();
if(!endpoints[0].substring(1).equals(NEG_INFINITY)) {
T lower = elementConverter.reverse().convert(endpoints[0].substring(1));
ret = ret.intersection(Range.downTo(lower, endpoints[0].charAt(0) == '[' ? BoundType.CLOSED : BoundType.OPEN));
}
if(!endpoints[1].substring(0,endpoints[1].length()-1).equals(POS_INFINITY)) {
T upper = elementConverter.reverse().convert(endpoints[1].substring(0,endpoints[1].length()-1));
ret = ret.intersection(Range.upTo(upper, endpoints[1].charAt(endpoints[1].length()-1) == ']' ? BoundType.CLOSED : BoundType.OPEN));
}
return ret;
}
};
}
/**
* Converter between RangeSet instances and Strings, essentially a custom serializer.
* Ideally we'd let Gson or Guava do this for us, but presently this is cleaner.
*/
public static <T extends Comparable<? super T>> Converter<RangeSet<T>, String> rangeSetConverter(final Converter<T, String> elementConverter) {
return new Converter<RangeSet<T>, String>() {
private final Converter<Range<T>, String> rangeConverter = rangeConverter(elementConverter);
@Override
protected String doForward(RangeSet<T> rs) {
ArrayList<String> ls = new ArrayList<>();
for(Range<T> range : rs.asRanges()) {
ls.add(rangeConverter.convert(range));
}
return Joiner.on(", ").join(ls);
}
@Override
protected RangeSet<T> doBackward(String rs) {
Iterable<String> parts = Splitter.on(",").trimResults().split(rs);
ImmutableRangeSet.Builder<T> build = ImmutableRangeSet.builder();
for(String range : parts) {
build.add(rangeConverter.reverse().convert(range));
}
return build.build();
}
};
}
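Usage then looks something like this (Ints.stringConverter() is Guava's built-in Converter<String, Integer>; reverse() flips it into the Converter<Integer, String> this factory expects):

Converter<Range<Integer>, String> conv = rangeConverter(Ints.stringConverter().reverse());
String s = conv.convert(Range.closed(10, 20));    // "[10‥20]"
Range<Integer> back = conv.reverse().convert(s);  // equal to Range.closed(10, 20)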
*For inter-process communication, Java serialization would likely work just fine, as both classes implement Serializable. However I'm serializing to disk for more permanent storage, meaning I need a format I can trust won't change over time. Guava's serialization doesn't provide that guarantee.
Here is a Gson JsonSerializer and JsonDeserializer that generically supports a Range: https://github.com/jamespedwards42/Fava/wiki/Range-Marshaller
@Override
public JsonElement serialize(final Range src, final Type typeOfSrc, final JsonSerializationContext context) {
final JsonObject jsonObject = new JsonObject();
if ( src.hasLowerBound() ) {
jsonObject.add( "lowerBoundType", context.serialize( src.lowerBoundType() ) );
jsonObject.add( "lowerBound", context.serialize( src.lowerEndpoint() ) );
} else
jsonObject.add( "lowerBoundType", context.serialize( BoundType.OPEN ) );
if ( src.hasUpperBound() ) {
jsonObject.add( "upperBoundType", context.serialize( src.upperBoundType() ) );
jsonObject.add( "upperBound", context.serialize( src.upperEndpoint() ) );
} else
jsonObject.add( "upperBoundType", context.serialize( BoundType.OPEN ) );
return jsonObject;
}
@Override
public Range<? extends Comparable<?>> deserialize(final JsonElement json, final Type typeOfT, final JsonDeserializationContext context) throws JsonParseException {
if ( !( typeOfT instanceof ParameterizedType ) )
throw new IllegalStateException( "typeOfT must be a parameterized Range." );
final JsonObject jsonObject = json.getAsJsonObject();
final JsonElement lowerBoundTypeJsonElement = jsonObject.get( "lowerBoundType" );
final JsonElement upperBoundTypeJsonElement = jsonObject.get( "upperBoundType" );
if ( lowerBoundTypeJsonElement == null || upperBoundTypeJsonElement == null )
throw new IllegalStateException( "Range " + json
+ "was not serialized with this serializer! The default serialization does not store the boundary types, therfore we can not deserialize." );
final Type type = ( ( ParameterizedType ) typeOfT ).getActualTypeArguments()[0];
final BoundType lowerBoundType = context.deserialize( lowerBoundTypeJsonElement, BoundType.class );
final JsonElement lowerBoundJsonElement = jsonObject.get( "lowerBound" );
final Comparable<?> lowerBound = lowerBoundJsonElement == null ? null : context.deserialize( lowerBoundJsonElement, type );
final BoundType upperBoundType = context.deserialize( upperBoundTypeJsonElement, BoundType.class );
final JsonElement upperBoundJsonElement = jsonObject.get( "upperBound" );
final Comparable<?> upperBound = upperBoundJsonElement == null ? null : context.deserialize( upperBoundJsonElement, type );
if ( lowerBound == null && upperBound != null )
return Range.upTo( upperBound, upperBoundType );
else if ( lowerBound != null && upperBound == null )
return Range.downTo( lowerBound, lowerBoundType );
else if ( lowerBound == null && upperBound == null )
return Range.all();
return Range.range( lowerBound, lowerBoundType, upperBound, upperBoundType );
}
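Wiring these into Gson is then a matter of registering both on a GsonBuilder. The class names below are stand-ins for whatever the linked project calls its implementations:

// RangeSerializer/RangeDeserializer are hypothetical names for the
// JsonSerializer/JsonDeserializer shown above
Gson gson = new GsonBuilder()
    .registerTypeAdapter(Range.class, new RangeSerializer())
    .registerTypeAdapter(Range.class, new RangeDeserializer())
    .create();

Type type = new TypeToken<Range<Integer>>() {}.getType();
String json = gson.toJson(Range.closed(10, 20), type);
Range<Integer> range = gson.fromJson(json, type); // deserialize needs the parameterized type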
Here is a straightforward solution that works very well.
import com.google.common.collect.BoundType;
import com.google.common.collect.Range;
import com.google.gson.*;
import java.lang.reflect.Type;
public class GoogleRangeAdapter implements JsonSerializer, JsonDeserializer {
public static String TK_hasLowerBound = "hasLowerBound";
public static String TK_hasUpperBound = "hasUpperBound";
public static String TK_lowerBoundType = "lowerBoundType";
public static String TK_upperBoundType = "upperBoundType";
public static String TK_lowerBound = "lowerBound";
public static String TK_upperBound = "upperBound";
@Override
public Object deserialize(JsonElement json, Type typeOfT, JsonDeserializationContext context) throws JsonParseException {
JsonObject jsonObject = (JsonObject)json;
boolean hasLowerBound = jsonObject.get(TK_hasLowerBound).getAsBoolean();
boolean hasUpperBound = jsonObject.get(TK_hasUpperBound).getAsBoolean();
if (!hasLowerBound && !hasUpperBound) {
return Range.all();
}
else if (!hasLowerBound && hasUpperBound){
double upperBound = jsonObject.get(TK_upperBound).getAsDouble();
BoundType upperBoundType = BoundType.valueOf(jsonObject.get(TK_upperBoundType).getAsString());
if (upperBoundType == BoundType.OPEN)
return Range.lessThan(upperBound);
else
return Range.atMost(upperBound);
}
else if (hasLowerBound && !hasUpperBound){
double lowerBound = jsonObject.get(TK_lowerBound).getAsDouble();
BoundType lowerBoundType = BoundType.valueOf(jsonObject.get(TK_lowerBoundType).getAsString());
if (lowerBoundType == BoundType.OPEN)
return Range.greaterThan(lowerBound);
else
return Range.atLeast(lowerBound);
}
else {
double lowerBound = jsonObject.get(TK_lowerBound).getAsDouble();
double upperBound = jsonObject.get(TK_upperBound).getAsDouble();
BoundType upperBoundType = BoundType.valueOf(jsonObject.get(TK_upperBoundType).getAsString());
BoundType lowerBoundType = BoundType.valueOf(jsonObject.get(TK_lowerBoundType).getAsString());
if (lowerBoundType == BoundType.OPEN && upperBoundType == BoundType.OPEN)
return Range.open(lowerBound, upperBound);
else if (lowerBoundType == BoundType.OPEN && upperBoundType == BoundType.CLOSED)
return Range.openClosed(lowerBound, upperBound);
else if (lowerBoundType == BoundType.CLOSED && upperBoundType == BoundType.OPEN)
return Range.closedOpen(lowerBound, upperBound);
else
return Range.closed(lowerBound, upperBound);
}
}
@Override
public JsonElement serialize(Object src, Type typeOfSrc, JsonSerializationContext context) {
JsonObject jsonObject = new JsonObject();
Range<Double> range = (Range<Double>)src;
boolean hasLowerBound = range.hasLowerBound();
boolean hasUpperBound = range.hasUpperBound();
jsonObject.addProperty(TK_hasLowerBound, hasLowerBound);
jsonObject.addProperty(TK_hasUpperBound, hasUpperBound);
if (hasLowerBound) {
jsonObject.addProperty(TK_lowerBound, range.lowerEndpoint());
jsonObject.addProperty(TK_lowerBoundType, range.lowerBoundType().name());
}
if (hasUpperBound) {
jsonObject.addProperty(TK_upperBound, range.upperEndpoint());
jsonObject.addProperty(TK_upperBoundType, range.upperBoundType().name());
}
return jsonObject;
}
}
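A quick usage sketch; note that this adapter hardcodes double endpoints, so it only suits numeric ranges:

Gson gson = new GsonBuilder()
    .registerTypeAdapter(Range.class, new GoogleRangeAdapter())
    .create();

String json = gson.toJson(Range.closed(10.0, 20.0));
Range<Double> range = gson.fromJson(json, new TypeToken<Range<Double>>() {}.getType());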
I've read a possible solution to this, but it would require a lot of rewriting; the possible solution is linked here. But there wouldn't be any sense in doing it that way if I am just a couple of words off in my DropDownListFor.
I'm having an issue with my DropDownListFor, as this is all new to me:
@Html.DropDownListFor(model => model.pageID, new SelectList(Enum.GetNames(typeof(PageIndex)), EnumHelper.GetSelectedItemList<PageIndex>().SelectedValue))
Trying to grab the "description" of my enum values as the drop down lists text values, then have an integer value returned to the database on POST.
Here's my enum:
public enum PageIndex : int
{
[Description("Developmental Disabilities Tip Sheet")]
ddTipSheets = 1,
[Description("Hiiiiiiiiiiiiiiiiiiii")]
Example1 = 2,
[Description("I don't know what I'm doing")]
Example2 = 3
};
and my EnumHelper:
public class EnumHelper
{
public static SelectList GetSelectedItemList<T>() where T : struct
{
T t = default(T);
if (!t.GetType().IsEnum) { throw new ArgumentException("Please make sure that T is of Enum type"); }
var nameList = t.GetType().GetEnumNames();
int counter = 0;
Dictionary<int, String> myDictionary = new Dictionary<int, string>();
if (nameList != null && nameList.Length > 0)
{
foreach (var name in nameList)
{
T newEnum = (T) Enum.Parse(t.GetType(), name);
string description = getDescriptionFromEnumValue(newEnum as Enum);
if (!myDictionary.ContainsKey(counter))
{
myDictionary.Add(counter, description);
}
counter++;
}
return new SelectList(myDictionary, "Key", "Value");
}
return null;
}
private static string getDescriptionFromEnumValue(Enum value)
{
DescriptionAttribute descriptionAttribute =
value.GetType()
.GetField(value.ToString())
.GetCustomAttributes(typeof(DescriptionAttribute), false)
.SingleOrDefault() as DescriptionAttribute;
return descriptionAttribute == null ?
value.ToString() : descriptionAttribute.Description;
}
}
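If the helper above is kept as-is, the SelectList it returns can be bound directly, e.g.:

@Html.DropDownListFor(model => model.pageID, EnumHelper.GetSelectedItemList<PageIndex>())

One caveat, though: GetSelectedItemList keys its dictionary with a 0-based counter rather than the enum's underlying value, so the posted pageID would be 0, 1, 2 instead of 1, 2, 3; using Convert.ToInt32(newEnum) as the dictionary key would line the posted values up with the enum.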
I have a list of Ids and I want to get all the rows back in one query, as a list of objects (so a List of Products or whatever).
I tried
public List<TableA> MyMethod(List<string> keys)
{
var query = "SELECT * FROM TableA WHERE Keys IN (:keys)";
var a = session.CreateQuery(query).SetParameter("keys", keys).List();
return a; // a is an IList, but not of TableA. So what do I do now?
}
but I can't figure out how to return it as a list of objects. Is this the right way?
// CreateQuery takes HQL, so query the mapped entity/property
// (the property name must match your mapping):
List<TableA> result = session.CreateQuery("from TableA a where a.Key in (:keys)")
    .SetParameterList("keys", keys)
    .List<TableA>();
However, there could be a limitation with this query if the number of ":keys" values exceeds 1000 (in the case of Oracle; not sure about other DBs), so I would recommend using ICriteria instead of CreateQuery/native SQL.
Do something like this:
[TestFixture]
public class ThousandIdsNHibernateQuery
{
    [Test]
    public void TestThousandIdsNHibernateQuery()
    {
        //Keys contains the 1000+ ids, not included here.
        var keys = new List<decimal>();
        using (ISession session = OpenSession()) // obtain the session however you normally do
        {
            var tableCrit = session.CreateCriteria(typeof(TableA));
            if (keys.Count > 1000)
            {
                //Oracle allows at most 1000 values per IN list, so split the
                //keys into chunks of 1000 and OR the IN expressions together.
                var disjunction = Expression.Disjunction();
                for (int i = 0; i < keys.Count; i += 1000)
                {
                    var chunk = keys.GetRange(i, Math.Min(1000, keys.Count - i));
                    disjunction.Add(Expression.In("Key", chunk));
                }
                tableCrit.Add(disjunction);
            }
            else
            {
                tableCrit.Add(Expression.In("Key", keys));
            }
            var results = tableCrit.List<TableA>();
        }
    }
}
I am interfacing with a PostgreSQL database with NHibernate.
Background
I made some simple tests... it seems it's taking 2 seconds to persist 300 records.
I have a Perl program with identical functionality, but it issues direct SQL instead and takes only 70% of the time.
I am not sure if this is expected. I thought C#/NHibernate would be faster, or at least on par.
Questions
One of my observations is that (with show_sql turned on) NHibernate issues the INSERT a few hundred times, instead of doing one bulk INSERT that takes care of multiple rows. And note I am assigning the primary key myself, not using the "native" generator.
Is that expected? Is there any way I could make it issue a bulk INSERT statement instead? It seems to me that this could be one area where I could speed up the performance.
As stachu found out correctly: NHibernate does not have a *BatchingBatcher(Factory) for PostgreSQL (Npgsql).
As stachu asks: did anybody manage to force NHibernate to do batch inserts to PostgreSQL?
I wrote a Batcher that doesn't use any Npgsql batching stuff, but instead manipulates the SQL string "oldschool style" (INSERT INTO [..] VALUES (...),(...), ...)
using System;
using System.Collections;
using System.Data;
using System.Diagnostics;
using System.Text;
using Npgsql;
namespace NHibernate.AdoNet
{
public class PostgresClientBatchingBatcherFactory : IBatcherFactory
{
public virtual IBatcher CreateBatcher(ConnectionManager connectionManager, IInterceptor interceptor)
{
return new PostgresClientBatchingBatcher(connectionManager, interceptor);
}
}
/// <summary>
/// Summary description for PostgresClientBatchingBatcher.
/// </summary>
public class PostgresClientBatchingBatcher : AbstractBatcher
{
private int batchSize;
private int countOfCommands = 0;
private int totalExpectedRowsAffected;
private StringBuilder sbBatchCommand;
private int m_ParameterCounter;
private IDbCommand currentBatch;
public PostgresClientBatchingBatcher(ConnectionManager connectionManager, IInterceptor interceptor)
: base(connectionManager, interceptor)
{
batchSize = Factory.Settings.AdoBatchSize;
}
private string NextParam()
{
return ":p" + m_ParameterCounter++;
}
public override void AddToBatch(IExpectation expectation)
{
//fall back to non-batching behavior for anything that cannot be batched or is not an INSERT ... VALUES
if (!expectation.CanBeBatched || !(CurrentCommand.CommandText.StartsWith("INSERT INTO") && CurrentCommand.CommandText.Contains("VALUES")))
{
//NonBatching behavior
IDbCommand cmd = CurrentCommand;
LogCommand(CurrentCommand);
int rowCount = ExecuteNonQuery(cmd);
expectation.VerifyOutcomeNonBatched(rowCount, cmd);
currentBatch = null;
return;
}
totalExpectedRowsAffected += expectation.ExpectedRowCount;
log.Info("Adding to batch");
int len = CurrentCommand.CommandText.Length;
int idx = CurrentCommand.CommandText.IndexOf("VALUES");
int endidx = idx + "VALUES".Length + 2;
if (currentBatch == null)
{
// begin new batch.
currentBatch = new NpgsqlCommand();
sbBatchCommand = new StringBuilder();
m_ParameterCounter = 0;
string preCommand = CurrentCommand.CommandText.Substring(0, endidx);
sbBatchCommand.Append(preCommand);
}
else
{
//only append Values
sbBatchCommand.Append(", (");
}
//append values from CurrentCommand to sbBatchCommand
string values = CurrentCommand.CommandText.Substring(endidx, len - endidx - 1);
//get all values
string[] split = values.Split(',');
ArrayList paramName = new ArrayList(split.Length);
for (int i = 0; i < split.Length; i++ )
{
if (i != 0)
sbBatchCommand.Append(", ");
string param = null;
if (split[i].StartsWith(":")) //first named parameter
{
param = NextParam();
paramName.Add(param);
}
else if(split[i].StartsWith(" :")) //other named parameter
{
param = NextParam();
paramName.Add(param);
}
else if (split[i].StartsWith(" ")) //other fix parameter
{
param = split[i].Substring(1, split[i].Length-1);
}
else
{
param = split[i]; //first fix parameter
}
sbBatchCommand.Append(param);
}
sbBatchCommand.Append(")");
//rename & copy parameters from CurrentCommand to currentBatch
int iParam = 0;
foreach (NpgsqlParameter param in CurrentCommand.Parameters)
{
param.ParameterName = (string)paramName[iParam++];
NpgsqlParameter newParam = /*Clone()*/new NpgsqlParameter(param.ParameterName, param.NpgsqlDbType, param.Size, param.SourceColumn, param.Direction, param.IsNullable, param.Precision, param.Scale, param.SourceVersion, param.Value);
currentBatch.Parameters.Add(newParam);
}
countOfCommands++;
//check for flush
if (countOfCommands >= batchSize)
{
DoExecuteBatch(currentBatch);
}
}
protected override void DoExecuteBatch(IDbCommand ps)
{
if (currentBatch != null)
{
//Batch command now needs its terminator
sbBatchCommand.Append(";");
countOfCommands = 0;
log.Info("Executing batch");
CheckReaders();
//set prepared batchCommandText
string commandText = sbBatchCommand.ToString();
currentBatch.CommandText = commandText;
LogCommand(currentBatch);
Prepare(currentBatch);
int rowsAffected = 0;
try
{
rowsAffected = currentBatch.ExecuteNonQuery();
}
catch (Exception e)
{
if(Debugger.IsAttached)
Debugger.Break();
throw;
}
Expectations.VerifyOutcomeBatched(totalExpectedRowsAffected, rowsAffected);
totalExpectedRowsAffected = 0;
currentBatch = null;
sbBatchCommand = null;
m_ParameterCounter = 0;
}
}
protected override int CountOfStatementsInCurrentBatch
{
get { return countOfCommands; }
}
public override int BatchSize
{
get { return batchSize; }
set { batchSize = value; }
}
}
}
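To make NHibernate actually use this batcher, the factory has to be registered in the configuration. As far as I know that is done via the adonet.factory_class setting (Environment.BatchStrategy) together with a non-zero adonet.batch_size - a sketch, with the assembly resolution left to you:

var cfg = new NHibernate.Cfg.Configuration();
// "adonet.factory_class" names the IBatcherFactory implementation to use
cfg.SetProperty(NHibernate.Cfg.Environment.BatchStrategy,
    typeof(NHibernate.AdoNet.PostgresClientBatchingBatcherFactory).AssemblyQualifiedName);
// batching only kicks in with a batch size > 1
cfg.SetProperty(NHibernate.Cfg.Environment.BatchSize, "20");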
I also found that NHibernate is not doing batch inserts into PostgreSQL.
I identified two possible reasons:
1) Npgsql driver does not support batch inserts/updates (see forum)
2) NHibernate does not have a *BatchingBatcher(Factory) for PostgreSQL (Npgsql). I tried using the Devart dotConnect driver with NHibernate (I wrote a custom driver for NHibernate), but it still did not work.
I suppose this driver should also implement the IEmbeddedBatcherFactoryProvider interface, but that seems non-trivial to me (using the one for Oracle did not work ;) )
Did anybody manage to force NHibernate to do batch inserts to PostgreSQL, or can you confirm my conclusion?