Update records with special conditions (NHibernate / SQL)

I have a table (InquiryTable). First I select some records from it, and after extracting those records from the database I update them back into the database. I need to know how I can do these two commands in one step (merge them into one) with NHibernate.
var inquiry = Session.Query<InquiryTable>().Where(c => c.ID == ID).ToList();
inquiry.FirstOrDefault().Time = sendTime;
Session.Update(inquiry.FirstOrDefault());
I want to merge Session.Query and Session.Update into one command; in other words, I need an UPDATE with a WHERE clause in the same query.

What would fit this concept is NHibernate DML:
13.3. DML-style operations
As already discussed, automatic and transparent object/relational
mapping is concerned with the management of object state. This implies
that the object state is available in memory, hence manipulating
(using the SQL Data Manipulation Language (DML) statements: INSERT,
UPDATE, DELETE) data directly in the database will not affect
in-memory state. However, NHibernate provides methods for bulk
SQL-style DML statement execution which are performed through the
Hibernate Query Language (HQL).
This feature set is not built on top of ICriteria or QueryOver, but it does use HQL.
An example from the doc, doing the UPDATE on filtered data in one shot:
string hqlUpdate = "update Customer c set c.name = :newName where c.name = :oldName";
// or string hqlUpdate = "update Customer set name = :newName where name = :oldName";
int updatedEntities = s.CreateQuery( hqlUpdate )
.SetString( "newName", newName )
.SetString( "oldName", oldName )
.ExecuteUpdate();
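Applied to the question's InquiryTable entity, a minimal sketch (assuming Time and ID are the mapped properties shown in the question's code):
// Filter and update in one round trip, without loading the entities into the session first.
int affectedRows = Session.CreateQuery(
        "update InquiryTable i set i.Time = :sendTime where i.ID = :id")
    .SetParameter("sendTime", sendTime)
    .SetParameter("id", ID)
    .ExecuteUpdate();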
See also:
Update Top n using NHibernate
Do I have to refresh entities after bulk updates / deletes with HQL?

Related

SQL update multiple rows with different values where they match a value from a list

So perhaps the title is a little confusing. If you can suggest better wording for that, please let me know and I'll update.
Here's the issue. I've got a table with many thousands of rows and I need to update a few thousand of them to store the latest email data.
For example:
OldEmail#1.com => NewEmail#1.com
OldEmail#2.com => NewEmail#2.com
I've got a list of old emails ('OldEmail#1.com','OldEmail#2.com') and a list of the new ones ('NewEmail#1.com','NewEmail#2.com'). The HOPE was to do it simply with something like:
UPDATE Table
SET Email = ('NewEmail#1.com','NewEmail#2.com')
WHERE Email = ('OldEmail#1.com','OldEmail#2.com')
I hope that makes sense. Any questions just ask. Thanks!
You could use a case expression:
update mytable
set email = case email
when 'OldEmail#1.com' then 'NewEmail#1.com'
when 'OldEmail#2.com' then 'NewEmail#2.com'
end
where email in ('OldEmail#1.com','OldEmail#2.com')
Or better yet, if you have a large list of values, you might create a table to store them (like myref(old_email, new_email)) and join it in your update query, like so:
update t
set t.email = r.new_email
from mytable t
inner join myref r on r.old_email = t.email
The actual syntax for an UPDATE with a JOIN does vary across databases; the above is SQL Server syntax.
Up to the exact syntax of your particular DBMS, you can also do it like this:
WITH cte AS (SELECT 'NewEmail#1.com' newvalue, 'OldEmail#1.com' oldvalue
UNION ALL
SELECT 'NewEmail#2.com', 'OldEmail#2.com')
UPDATE table
SET table.email = cte.newvalue
FROM cte
WHERE table.email = cte.oldvalue
or, if CTE is not available,
UPDATE table
SET table.email = cte.newvalue
FROM (SELECT 'NewEmail#1.com' newvalue, 'OldEmail#1.com' oldvalue
UNION ALL
SELECT 'NewEmail#2.com', 'OldEmail#2.com') cte
WHERE table.email = cte.oldvalue
Consider prepared statements for updating rows in large batches.
Basically it works as follows:
The database compiles the query pattern you provide the first time and keeps the compiled result for the current connection (the details depend on the implementation).
You then update all the rows by sending a short label for the prepared statement together with different parameter values, instead of sending the entire UPDATE statement several times for several updates.
The database parses the short label, which is linked to the pre-compiled result, and then performs the update.
The next time you perform row updates, the database may still use the pre-compiled result and complete the operation quickly (so the first step above can be skipped).
PostgreSQL supports prepared statements, and so do many other SQL databases (e.g. MariaDB, MySQL, Oracle).
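As an illustration only, a minimal C# sketch of this pattern against PostgreSQL via the Npgsql driver (the connection string, the mytable/email names and the emailPairs list are assumptions reusing the examples above):
using Npgsql;

// Hypothetical (old, new) pairs to apply.
var emailPairs = new[] { ("OldEmail#1.com", "NewEmail#1.com"), ("OldEmail#2.com", "NewEmail#2.com") };

using var conn = new NpgsqlConnection(connectionString);
conn.Open();

// The UPDATE pattern is sent and compiled once for this connection...
using var cmd = new NpgsqlCommand(
    "UPDATE mytable SET email = @newEmail WHERE email = @oldEmail", conn);
cmd.Parameters.Add(new NpgsqlParameter("newEmail", NpgsqlTypes.NpgsqlDbType.Text));
cmd.Parameters.Add(new NpgsqlParameter("oldEmail", NpgsqlTypes.NpgsqlDbType.Text));
cmd.Prepare();

// ...then executed repeatedly with different values, so only the parameters
// travel to the server instead of the full statement each time.
foreach (var (oldEmail, newEmail) in emailPairs)
{
    cmd.Parameters["newEmail"].Value = newEmail;
    cmd.Parameters["oldEmail"].Value = oldEmail;
    cmd.ExecuteNonQuery();
}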

SSIS (in SQL Server 2012): Upsert in Lookup component

I have > 10 packages that need to update/insert in the dataflow. I am able to do it by:
Lookup => Match output branch => OLE DB Command.
Lookup => No Match output branch => OLE DB Destination.
(http://www.rad.pasfu.com/index.php?/archives/46-SSIS-Upsert-With-Lookup-Transform.html)
(http://jahaines.blogspot.com/2009/09/sss-performing-upsert.html)
However, I was wondering if there is some way I can use the "Merge" statement in Lookup (or in any other) component such that I can do something like:
MERGE [DBPrac].[dbo].[TargetTable] AS tt
USING [SourceTable] AS st ON tt.Id = st.Id
WHEN MATCHED THEN --* Update the records, if record found based on Id.
UPDATE
SET tt.SSN = st.SSN
,tt.FirstName = st.FirstName
,tt.MiddleName = st.MiddleName
,tt.LastName = st.LastName
,tt.Gender = st.Gender
,tt.DateOfBirth = st.DateOfBirth
,tt.Email = st.Email
,tt.Phone = st.Phone
,tt.Comment = st.Comment
WHEN NOT MATCHED BY TARGET THEN --* Insert from source to target.
INSERT (Id, SSN, FirstName, MiddleName, LastName, Gender, DateOfBirth, Email, Phone, Comment)
VALUES (st.Id, st.SSN, st.FirstName, st.MiddleName, st.LastName, st.Gender, st.DateOfBirth, st.Email, st.Phone, st.Comment)
;
SELECT @@ROWCOUNT;
SET IDENTITY_INSERT [dbo].[TargetTable] OFF
GO
So far I tried:
In the Lookup component's "Advanced" pane, under "Custom query", I tried to use the above query, but stumbled on the "SourceTable" part. I don't know how to get hold of the input recordset inside the "Custom query"
(and I don't know if that is even possible).
Any help and/or pointer would be great.
Yes, you can use MERGE, but you need to load your data into a staging table first. This is the 'ELT' method - extract, load (into the database), transform - as opposed to the 'ETL' method - extract, transform (in the package), load (into the database).
I usually find the ELT method faster and more maintainable, if you don't mind working with SQL scripts. Certainly a single bulk update is faster than the row-by-row updates that occur in SSIS.
If I understand your question correctly, just execute the MERGE statement using an Execute SQL task. Then you don't need any Lookups. We use the same strategy for our warehouse's final load from staging.

LINQ to SQL: Update on table data not working

I have a LINQ query which is intended to update the table concerned.
The code is as follows:
LINQHelperDataContext PersonalDetails = new LINQHelperDataContext();
var PerDetails1 = (from details in PersonalDetails.W_Details_Ts
where details.UserId == userId
select details).First();
PerDetails1.Side = "Bridge";
PerDetails1.TotalBudget = 4000000;
PersonalDetails.SubmitChanges();
However, this change/update does not get reflected in the DB. Also, this does not throw any exception. Please suggest.
Make sure W_Details_Ts has one (or more) member properties marked as primary key. L2S can't generate update or delete statements if it does not know the underlying table's PK member(s).
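For attribute-based mappings, a minimal sketch of what that looks like (the entity shape and the choice of UserId as the key are assumptions based on the question's code):
using System.Data.Linq.Mapping;

[Table(Name = "W_Details_T")]
public class W_Details_T
{
    // Without at least one column marked IsPrimaryKey (or a primary key set in the
    // .dbml designer), LINQ to SQL silently skips generating UPDATE/DELETE statements.
    [Column(IsPrimaryKey = true)]
    public int UserId { get; set; }

    [Column]
    public string Side { get; set; }

    [Column]
    public int TotalBudget { get; set; }
}
If the model is generated from a .dbml file, the fix is typically to make sure the underlying table has a primary key defined and then refresh the designer.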

Structuring many update statements in SQL Server

I'm using SQL Server. I'm also relatively new to writing SQL... in a strong way. It's mostly self-taught, so I'm probably missing key ideas in terms of proper format.
I've a table called 'SiteResources' and a table called 'ResourceKeys'. SiteResources has an integer that corresponds to the placement of a string ('siteid') and a 'resourceid' which is an integer id that corresponds to 'resourceid' in ResourceKeys. ResourceKeys also contains a string for each key it contains ('resourcemessage'). Basically, these two tables are responsible for representing how strings are stored and displayed on a web page.
The best way to consistently update these two tables, is what? Let's say I have 5000 rows in SiteResources and 1000 rows in ResourceKeys. I could have an excel sheet, or a small program, which generates 5000 singular update statements, like:
update SiteResources set resourceid = 0
WHERE siteid IS NULL AND resourceid IN (select resourceid
from ResourceKeys where resourcemessage LIKE 'FooBar')
I could have thousands of those singular update statements, with FooBar representing each string in the database I might want to change at once, but isn't there a cleaner way to write such a massive number of update statements? From what I understand, I should be wrapping all of my statements in begin/end/go too, just in case of failure - which leads me to believe there is a more systematic way of writing these update statements. Is my hunch correct? Or is the way I'm going about this correct/ideal? I could change the structure of my tables, I suppose, or the structure of how I store data - that might simplify things - but let's just say I can't do that in this instance.
As far as I understand, you just need to update every row in SiteResources whose 'placement of a string' (siteid) is empty. If so, here is the code:
UPDATE a
SET resourceid = 0
FROM SiteResources a
WHERE EXISTS (select * from ResourceKeys b where a.resourceid = b.resourceid)
AND a.siteid IS NULL
For some specific things like 'FooBar'-rows you can add it like this:
UPDATE a
SET resourceid = 0
FROM SiteResources a
WHERE EXISTS (select * from ResourceKeys b where a.resourceid = b.resourceid and b.resourcemessage IN ('FooBar', 'FooBar2', 'FooBar3', ...))
AND a.siteid IS NULL
Let me see if I understood the question correctly. You'd like to set resourceid to 0 if the resourcemessage corresponds to a list of strings? If so, you can build your query like this:
UPDATE r
SET resourceid = 0
FROM SiteResources r
JOIN ResourceKeys k ON r.resourceid = k.resourceid
WHERE k.resourcemessage IN ('FooBar', ...)
AND r.siteid IS NULL;
This is using an extended UPDATE syntax in Transact-SQL that allows you to use a JOIN in the UPDATE statement. But maybe it's not exactly what you want? Why do you use the LIKE operator in your query without a wildcard (%)?
With table-valued parameters, you can pass a table from your client app to the SQL batch that your app submits for execution. You can use this to pass the list of all the strings you need to update to a single UPDATE that updates all rows at once.
That way you don't have to worry about any of your concerns: the number of updates, transactional atomicity, error handling. As a bonus, performance will be improved.
I recommend that you do a bit of research on what TVPs are and how they are used.
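A rough sketch of how that could look from C# against the tables in this question (the dbo.ResourceMessageList type name and its column are assumptions; the UPDATE itself mirrors the join-based answers above):
using System.Data;
using System.Data.SqlClient;

// One-time T-SQL setup (hypothetical type name):
//   CREATE TYPE dbo.ResourceMessageList AS TABLE (resourcemessage nvarchar(256));

var messages = new DataTable();
messages.Columns.Add("resourcemessage", typeof(string));
messages.Rows.Add("FooBar");
messages.Rows.Add("FooBar2");

using var conn = new SqlConnection(connectionString);
conn.Open();

using var cmd = new SqlCommand(@"
    UPDATE r
    SET    r.resourceid = 0
    FROM   SiteResources r
    JOIN   ResourceKeys k ON r.resourceid = k.resourceid
    JOIN   @messages m    ON m.resourcemessage = k.resourcemessage
    WHERE  r.siteid IS NULL;", conn);

var p = cmd.Parameters.AddWithValue("@messages", messages);
p.SqlDbType = SqlDbType.Structured;
p.TypeName = "dbo.ResourceMessageList";

// A single round trip updates all matching rows in one statement.
cmd.ExecuteNonQuery();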

using a temporary table with NHibernate

I'm trying to make what seems to be an advanced use of NHibernate with SQL Server functions.
I'm using NHibernate's ICriteria interface to supply paging, sorting and filtering for my list views.
One of my business objects is an aggregation of items from 3 different tables.
In order to do this aggregation in the DB I've used a Transact-SQL function that accepts parameters. I'm using the IQuery interface returned by session.GetNamedQuery to invoke the function, but in order to reuse the paging/filtering/sorting code I'd like to use the ICriteria interface.
In order to achieve that I considered:
opening a new transaction,
calling the function, which would create a global temporary table (instead of returning the result as it does now),
somehow altering the NHibernate mapping so it applies to the temporary table (not sure I can do that; also, this has to be specific to the scope where I create the transaction...),
running the query on the new table using the new mapping, via the ICriteria interface,
deleting the temporary table.
So, a number of questions:
Can you suggest an alternative?
Is it possible to replace the table in an NHibernate mapping at run time, locally for a specific code scope?
How costly would it be to generate and dispose of the temporary table?
Can you replace the function with a view? This view could aggregate the 3 tables, be mapped by NHibernate, and be easily paged/sorted/filtered.
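For example, with mapping-by-code the view can be mapped exactly like a table and then queried through the existing ICriteria code; the entity and view names below are hypothetical, a sketch rather than a drop-in mapping:
using NHibernate.Mapping.ByCode;
using NHibernate.Mapping.ByCode.Conformist;

// Hypothetical entity shaped like the aggregating view.
public class AggregatedItem
{
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }
    public virtual decimal Total { get; set; }
}

public class AggregatedItemMap : ClassMapping<AggregatedItem>
{
    public AggregatedItemMap()
    {
        Table("vw_AggregatedItems"); // the view that joins the 3 tables
        Mutable(false);              // read-only: never insert/update/delete through it
        Id(x => x.Id, m => m.Generator(Generators.Assigned));
        Property(x => x.Name);
        Property(x => x.Total);
    }
}
Once mapped, session.CreateCriteria<AggregatedItem>() with SetFirstResult/SetMaxResults and the usual restrictions pages, sorts and filters over the view just like over any mapped table.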
I wrote a blog post which shows how you can do exactly this here: pagination for performance intensive queries using nhibernate and sql server temporary tables
I use a table variable inside an NHibernate SQL query with the SQL Server 2012 dialect, like below.
For my scenario I couldn't get a particular parameterized, paginated query to run in under 17 seconds no matter how I constructed the filter. By using a table variable, the time to grab page N with a page size of 1000 dropped to a sub-second response.
All the magic happens in the SQL below, which:
creates a table variable - declare @temporderstatus table
selects the rows into the table variable - insert into @temporderstatus, filtered by the SQL in the sqlfilter string
selects the data from the table variable to return to NHibernate, ordered so that pagination returns a predictable result set - select OrderNum, Customer... from @temporderstatus ORDER BY StatusCodeChangedDate, OrderNum
uses the NHibernate paging functions to append the necessary OFFSET/FETCH clauses via the SQL Server 2012 dialect - SetFirstResult((pagination.Page - 1) * pagination.PageSize).SetMaxResults(pagination.PageSize)
Unlike a temp table, a table variable cleans up after itself, so it is ideal for NHibernate scenarios where you have a web server serving paginated requests.
Here is my code...
var session = _sessionManager.GetStatelessSession();

StringBuilder sqlfilter = new StringBuilder(
    @"FROM Orders o join OrderType ot ON o.OrderType = ot.OrderType where o.StatusDate between :fromDate and :toDate");

var mainQuery = session.CreateSQLQuery(
    $@"declare @temporderstatus table (OrderNum int not null, CustomerID int, OrderType varchar(16), Status varchar(3), StatusCodeChangedDate datetime, OrderDate datetime, DeliveryDate datetime)
    insert into @temporderstatus
    SELECT o.OrderNum, o.CustomerID, ot.Description AS OrderType, o.StatusCode AS Status, o.StatusCodeChangedDate, o.OrderDate, o.DeliveryDate
    {sqlfilter}
    select OrderNum, CustomerID, OrderType, Status, StatusCodeChangedDate, OrderDate, DeliveryDate
    from @temporderstatus
    ORDER BY StatusCodeChangedDate, OrderNum
    ");

//construct the count query
var totalCountQuery = session.CreateSQLQuery($"SELECT COUNT(1) OCount {sqlfilter} ");
totalCountQuery.AddScalar("OCount", NHibernateUtil.Int32);
totalCountQuery.SetParameter("fromDate", criteria.fromDate);
totalCountQuery.SetParameter("toDate", criteria.toDate);
var totalCountResults = totalCountQuery.UniqueResult<int>();
pagination.TotalResultCount = totalCountResults;
if (pagination.TotalResultCount == 0)
{
    //no results so don't waste time doing another query
    return new List<OrderDataDto>();
}

//finish constructing the main query
mainQuery.AddEntity(typeof(OrderDataDto));
mainQuery.SetParameter("fromDate", criteria.fromDate);
mainQuery.SetParameter("toDate", criteria.toDate);
var mainQueryResults = mainQuery
    .SetFirstResult((pagination.Page - 1) * pagination.PageSize)
    .SetMaxResults(pagination.PageSize);
var results = mainQueryResults.List<OrderDataDto>();
return results;