Array binding implementation over Npgsql - NHibernate

I am working with Npgsql and NHibernate on .NET, generating queries for PostgreSQL as part of a migration from Oracle, and I am confused about "array binding" via Npgsql.
I found some discussion of array binding with Npgsql, and the suggested solution seems to be a standard bulk insert.
Below is a sample query:
INSERT INTO <table_name>(C1,C2) VALUES (1, :V01);, where V01 is a "System.String[]"
On Oracle, ODP.NET appears to have an array binding implementation for bulk inserts (Bulk Data Insertion into Oracle Database in C#).
Does Npgsql have any implementation of array binding, or is one perhaps on the roadmap?
(Note: Npgsql version is 3.2.7, NHibernate v4.0.4.4000, and PostgreSQL version 9.4.)

Since PostgreSQL has first-class support for arrays, Npgsql allows you to pass an array as a single parameter value (i.e. just like you can have a string column, you can have an array column).
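For example, an entire .NET array can be bound as one parameter and written into an array-typed column. This is a minimal sketch; the table, column names, and connection string are made up:

```csharp
using Npgsql;

// Assumes a table created as: CREATE TABLE posts (id int, tags text[])
using (var conn = new NpgsqlConnection("Host=localhost;Database=mydb;Username=me"))
{
    conn.Open();
    using (var cmd = new NpgsqlCommand(
        "INSERT INTO posts (id, tags) VALUES (@id, @tags)", conn))
    {
        cmd.Parameters.AddWithValue("id", 1);
        // The whole string[] is sent as a single parameter value,
        // landing in the text[] column as one PostgreSQL array.
        cmd.Parameters.AddWithValue("tags", new[] { "npgsql", "arrays" });
        cmd.ExecuteNonQuery();
    }
}
```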
However, there's no automatic way to do bulk insert from an array - you're going to have to write the loop yourself. The most efficient way to do that is to use the bulk insert binary COPY API. Otherwise you could concatenate multiple INSERT statements inside the same DbCommand (delimited by semicolons), although that would be less efficient.
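The binary COPY API mentioned above looks roughly like this. A sketch with hypothetical table and column names; note that in Npgsql 3.x (the version in the question) the import completes when the importer is disposed, whereas Npgsql 4.0+ requires an explicit Complete() call:

```csharp
using Npgsql;
using NpgsqlTypes;

using (var conn = new NpgsqlConnection("Host=localhost;Database=mydb;Username=me"))
{
    conn.Open();
    using (var writer = conn.BeginBinaryImport(
        "COPY posts (id, title) FROM STDIN (FORMAT BINARY)"))
    {
        foreach (var post in posts)   // posts: some IEnumerable of your row type
        {
            writer.StartRow();
            writer.Write(post.Id, NpgsqlDbType.Integer);
            writer.Write(post.Title, NpgsqlDbType.Text);
        }
    } // disposing the importer finishes the COPY in Npgsql 3.x
}
```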
I don't think that bulk insert from array would make a lot of sense, since PostgreSQL doesn't support anything like it at the protocol level. So if we did this, Npgsql would simply be implementing the loop instead of the user (pure sugar), and in general we try to stay low-level and only expose PostgreSQL functionality.

Related

How do I clear all the rows from a table using Entity Framework Core?

New to Entity Framework Core. How do I clear all the rows from a table?
I searched around and found the following solution:
_context.Database.ExecuteSqlCommand("TRUNCATE TABLE [TableName]");
Unfortunately using ExecuteSqlCommand throws a compiler error.
Am I missing a namespace or is there another way to do this?
Thanks, JohnB
ExecuteSqlCommand is obsolete; you can see the details here.
For the execution of SQL queries using plain strings, use ExecuteSqlRaw instead. For the execution of SQL queries using interpolated string syntax to create parameters, use ExecuteSqlInterpolated instead.
So you can use:
_context.Database.ExecuteSqlRaw("TRUNCATE TABLE [TableName]");
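When values (not identifiers) come from variables, the interpolated variant is the safer companion, since it converts interpolated values into DbParameters. A sketch assuming an EF Core 3.0+ DbContext named _context and a made-up table/column:

```csharp
// Plain string SQL: ExecuteSqlRaw (EF Core 3.0+)
_context.Database.ExecuteSqlRaw("TRUNCATE TABLE [TableName]");

// Value parameters: ExecuteSqlInterpolated turns {cutoff} into a
// DbParameter. Identifiers such as table names cannot be
// parameterized, so they must still be part of the string itself.
var cutoff = DateTime.UtcNow.AddDays(-30);
_context.Database.ExecuteSqlInterpolated(
    $"DELETE FROM [TableName] WHERE CreatedAt < {cutoff}");
```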

Bulk Insert in multiple relational tables using Dapper .Net-using scope identity

I need to import millions of records into multiple SQL Server relational tables.
TableA(Aid(pk), Name, Color) -- returns its id via SCOPE_IDENTITY()
TableB(Bid, Aid(fk), Name) -- here we need to insert the Aid(pk) we got via SCOPE_IDENTITY()
How can I bulk insert a collection of millions of records using Dapper in one single INSERT statement?
Dapper just wraps raw ADO.NET; raw ADO.NET doesn't offer a facility for this, therefore dapper does not. What you want is SqlBulkCopy. You could also use a table-valued-parameter, but this really feels like a SqlBulkCopy job.
In a pinch, you can use dapper here - Execute will unroll an IEnumerable<T> into a series of commands about T - but it will be lots of commands; and unless you explicitly enable async-pipelining, it will suffer from latency per-command (the pipelined mode avoids this, but it will still be n commands). But SqlBulkCopy will be much more efficient.
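The Execute unrolling mentioned above is just this (a sketch; the table and member names are made up):

```csharp
using Dapper;

// Each element of the sequence produces one INSERT command with its
// own parameters -- convenient, but this is n round-trips (or n
// pipelined commands), not a single bulk operation.
connection.Execute(
    "insert into TableA (Name, Color) values (@Name, @Color)",
    items); // items: IEnumerable of objects with Name/Color members
```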
If the input data is an IEnumerable<T>, you might want to use ObjectReader from FastMember; for example:
IEnumerable<SomeType> data = ...
using (var bcp = new SqlBulkCopy(connection))
using (var reader = ObjectReader.Create(data, "Id", "Name", "Description"))
{
    bcp.DestinationTableName = "SomeTable";
    bcp.WriteToServer(reader);
}

NHibernate mapping: is it possible to insert values into the database via a mapping file without using a property?

I am writing an application that works with a legacy database (brownfield). I have a couple of tables into which I insert data. These tables have some fields that need values, but I do not want properties for them in my domain entities. Is there a way to insert a default value into such a field without having to create a property for it in my mapping file? I cannot alter the database to create a trigger, so it has to be done via the mapping file/.NET application.
Hope someone can help. I hoped I could use a formula, but that doesn't work, and I couldn't find any other way to do it either.
You could use a private/protected property.
That would mean introducing these fields into your domain model/mappings, but they would be limited to those and not exposed to whoever uses your entities.
Seems like a reasonable compromise to me.
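As a sketch of that idea, a private field can be mapped with field access so it never appears in the entity's public API (the field and column names here are made up):

```xml
<!-- Entity has: private string _legacyFlag = "Y"; and no public property -->
<property name="_legacyFlag"
          column="LEGACY_FLAG"
          access="field" />
```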
You could use event listeners:
in the OnPostInsert/OnPostUpdate events you can get the db connection and execute an ad-hoc SQL query.
NH makes it rather easy:
using xml, see here
using FluentNHibernate, see here
The basic idea is to use a PropertyAccessor on a non-existent property which always returns the constant value.
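A sketch of the constant-value accessor idea, written against NHibernate 4.x's IPropertyAccessor/IGetter/ISetter interfaces. Member details vary between NHibernate versions, so treat this as an outline rather than drop-in code; the class name and the constant are made up:

```csharp
using System;
using System.Collections;
using System.Reflection;
using NHibernate.Engine;
using NHibernate.Properties;

// Maps a non-existent property to a fixed value: every insert/update
// writes the constant, and values read from the db are discarded.
public class ConstantAccessor : IPropertyAccessor
{
    public bool CanAccessThroughReflectionOptimizer => false;
    public IGetter GetGetter(Type theClass, string propertyName) => new ConstantGetter();
    public ISetter GetSetter(Type theClass, string propertyName) => new NoopSetter();

    private class ConstantGetter : IGetter
    {
        public object Get(object target) => "DEFAULT"; // the constant written to the column
        public Type ReturnType => typeof(string);
        public string PropertyName => null;
        public MethodInfo Method => null;
        public object GetForInsert(object owner, IDictionary mergeMap,
            ISessionImplementor session) => Get(owner);
    }

    private class NoopSetter : ISetter
    {
        public void Set(object target, object value) { } // nothing to set
        public string PropertyName => null;
        public MethodInfo Method => null;
    }
}
```

The accessor is then referenced from the mapping via the access attribute, e.g. access="MyApp.ConstantAccessor, MyApp".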

Hibernate: UPDATE with a user-provided SQL code

I've been reading the documentation, but I couldn't find any information about this.
Is it possible to have Hibernate send user-provided SQL queries in order to UPDATE or INSERT an object in the database?
In other words, can session.saveOrUpdate(myObject);, which generates
update mySchema.myObject set field1=?, field2=?, field3=? where unique_key=?
be replaced with a manual query from a user-provided String?
This is well described in the reference documentation. There are caveats, though: the session and the second-level cache aren't aware of the changes made to the entities, the version field is not updated, etc.
And if HQL is still not sufficient, you may always fall back to SQL queries.
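The custom-SQL mechanism from the reference documentation looks like this in an hbm.xml mapping. A sketch with made-up mapping details; the positional placeholders must match the parameter order Hibernate expects, which is why enabling SQL logging to inspect the generated statements first is advisable:

```xml
<class name="MyObject" table="myObject" schema="mySchema">
    <id name="id" column="id">
        <generator class="native"/>
    </id>
    <property name="field1"/>
    <property name="field2"/>
    <property name="field3"/>

    <!-- Hibernate uses these instead of its own generated statements -->
    <sql-insert>insert into mySchema.myObject (field1, field2, field3, id) values (?, ?, ?, ?)</sql-insert>
    <sql-update>update mySchema.myObject set field1=?, field2=?, field3=? where id=?</sql-update>
    <sql-delete>delete from mySchema.myObject where id=?</sql-delete>
</class>
```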

insert blob via odp.net through dblink size limitations

I am using ODP.NET (version 2.111.7.0) and C#, with OracleCommand and OracleParameter objects and the OracleCommand.ExecuteNonQuery method.
I was wondering whether there is a way to insert a big byte array into an Oracle table that resides in another database, through a DB link. I know that LOB handling through DB links is problematic in general, but I am a bit hesitant to modify the code and add another connection.
Will creating a stored procedure that takes the BLOB as a parameter and talks internally via the dblink make any difference? I don't think so...
My current situation is that Oracle gives me "ORA-22992: cannot use LOB locators selected from remote tables" whenever the parameter I pass with the OracleCommand is a byte array with length 0, or with length > 32KB (I suspect, because 20KB worked and 35KB didn't).
I am using OracleDbType.Blob for this parameter.
Thank you.
Any ideas?
I ended up using a second connection, synchronizing the two transactions so that commits and rollbacks are always performed jointly. I also came to believe that there is a general issue with handling BLOBs through a dblink, so a second connection was the better choice, although in my case it slightly disrupted the design of my application: I needed to introduce both a second connection and a second transaction. The other option was to insert the BLOB in chunks of 32K, using PL/SQL blocks and some form of DBMS_LOB.WRITEAPPEND, but this would have required deeper changes to my code, so I opted for the easier and more straightforward first solution.
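The two-connection approach can be synchronized by hand along these lines. A minimal sketch with made-up connection strings, table, and data; this is not a true distributed transaction, so a crash between the two commits can still leave the databases inconsistent:

```csharp
using Oracle.DataAccess.Client; // ODP.NET

using (var localConn = new OracleConnection(localConnString))
using (var remoteConn = new OracleConnection(remoteConnString)) // direct, no dblink
{
    localConn.Open();
    remoteConn.Open();
    var localTx = localConn.BeginTransaction();
    var remoteTx = remoteConn.BeginTransaction();
    try
    {
        // ... local DML on localConn ...

        // The BLOB goes over a direct connection, avoiding ORA-22992
        using (var cmd = new OracleCommand(
            "insert into remote_table (id, payload) values (:id, :payload)",
            remoteConn))
        {
            cmd.Parameters.Add("id", OracleDbType.Int32).Value = 42;
            cmd.Parameters.Add("payload", OracleDbType.Blob).Value = bigByteArray;
            cmd.ExecuteNonQuery();
        }

        // Commit jointly; if either throws, both are rolled back below
        localTx.Commit();
        remoteTx.Commit();
    }
    catch
    {
        localTx.Rollback();
        remoteTx.Rollback();
        throw;
    }
}
```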