Does EF Code First support batch CUD, and do you have examples? - wcf

I have tried to Google this question without finding any answers.
I am trying to batch update/insert entities in a WCF, Entity Framework project.
My question is: does Entity Framework 4.1 (Code First) support batch insert, update and delete?
And if EF 4.1 does support batch CUD, do you have any examples?

Entity Framework (all versions) supports CUD (insert, update, delete), but it doesn't support batching commands. That means each insert, update and delete is executed in a separate roundtrip to the database.
This has no relation to WCF. If you want batching over WCF, you just need to send a collection of objects for processing and let the server process them with EF (without batching). WCF Data Services supports batching as a way of passing multiple objects to the server within one roundtrip.
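A minimal sketch of that approach, assuming a hypothetical IOrderService contract, an Order entity and a ShopContext DbContext (all names are illustrative, not taken from the question):

    using System.Collections.Generic;
    using System.ServiceModel;

    [ServiceContract]
    public interface IOrderService
    {
        // The client sends the whole set in a single WCF call.
        [OperationContract]
        void SaveOrders(List<Order> orders);
    }

    public class OrderService : IOrderService
    {
        public void SaveOrders(List<Order> orders)
        {
            // ShopContext and Order are assumed to exist elsewhere in the project.
            using (var context = new ShopContext())
            {
                foreach (var order in orders)
                {
                    context.Orders.Add(order);
                }
                // EF still issues one INSERT per entity; only the WCF traffic is batched.
                context.SaveChanges();
            }
        }
    }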

Related

EF Core temporary data save (query queuing)

I have a Windows service which uses EF Core to save data into a SQL database. The main requirement for this service is that it keeps working even if the database is unavailable. So, if the service needs to insert data into the database and the database is not available, it should temporarily save the data somewhere else and insert it later (when the database is available again).
What would be the best solution for this? I'm thinking about a queue of queries that weren't executed due to database unavailability. Are there any existing solutions for this type of problem? Does EF Core already have some kind of functionality to store data in a file and insert it into the database when it becomes available again? Is there maybe any other library to achieve this?
Thanks for your help!
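One way to sketch the queue idea described above, assuming a hypothetical local PendingWrites folder, a Measurement entity and an AppDbContext, all serialized with System.Text.Json (every name here is an illustrative assumption, not an EF Core feature):

    using System;
    using System.IO;
    using System.Text.Json;

    public class MeasurementWriter
    {
        private const string QueueFolder = "PendingWrites"; // hypothetical local fallback folder

        public void Save(AppDbContext context, Measurement measurement)
        {
            try
            {
                context.Measurements.Add(measurement);
                context.SaveChanges();
            }
            catch (Exception) // e.g. the database is unreachable
            {
                // Fall back to a local file; one JSON file per failed entity.
                Directory.CreateDirectory(QueueFolder);
                var path = Path.Combine(QueueFolder, Guid.NewGuid() + ".json");
                File.WriteAllText(path, JsonSerializer.Serialize(measurement));
            }
        }

        public void Replay(AppDbContext context)
        {
            if (!Directory.Exists(QueueFolder)) return;
            foreach (var file in Directory.GetFiles(QueueFolder, "*.json"))
            {
                var measurement = JsonSerializer.Deserialize<Measurement>(File.ReadAllText(file));
                context.Measurements.Add(measurement);
                context.SaveChanges();  // if this throws, the file stays and is retried later
                File.Delete(file);
            }
        }
    }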

Run SQL without transaction

Is there a way to execute SQL or a stored procedure without creating an additional transaction in Entity Framework? There is a solution for Entity Framework (Stored Procedure without transaction in Entity Framework), but it is not available for .NET Core.
The default behavior of ExecuteSqlCommand in EF Core is different from that of EF6:
Note that this method does not start a transaction. To use this method with a transaction, first call BeginTransaction(DatabaseFacade, IsolationLevel) or UseTransaction(DatabaseFacade, DbTransaction).
Note that the current ExecutionStrategy is not used by this method since the SQL may not be idempotent and does not run in a transaction. An ExecutionStrategy can be used explicitly, making sure to also use a transaction if the SQL is not idempotent.
In other words, what you are asking for is the default behavior in EF Core, so no action is needed.
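A small sketch of that default, assuming a DbContext called AppDbContext and a stored procedure named dbo.ArchiveOrders (both illustrative; in newer EF Core versions the method is named ExecuteSqlRaw):

    using Microsoft.EntityFrameworkCore;

    public static class ProcRunner
    {
        public static void Run(AppDbContext context)
        {
            // In EF Core this call does NOT open a transaction by itself.
            context.Database.ExecuteSqlCommand("EXEC dbo.ArchiveOrders");

            // If you do want a transaction, you have to opt in explicitly:
            using (var tx = context.Database.BeginTransaction())
            {
                context.Database.ExecuteSqlCommand("EXEC dbo.ArchiveOrders");
                tx.Commit();
            }
        }
    }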

Lightswitch 2011 Concurrency Control

I've got two questions.
Does LightSwitch 2011 also support pessimistic concurrency control? If so, how?
Does optimistic concurrency control support a ROWVERSION column on a table from an external data source, or does it only use the state of the row (using original values)?
Thank you for your response.
LightSwitch 2011 supports only optimistic concurrency. However, I integrated pessimistic concurrency into LightSwitch successfully. I used the Coarse-Grained Lock pattern from Patterns of Enterprise Application Architecture (Martin Fowler) with Entity Framework, which already contains a Unit of Work (ObjectContext) and a Repository (ObjectSet). Acquiring, releasing and checking the lock is done using ExecuteStoreCommand and ExecuteStoreQuery from EF. A custom RIA Service with CRUD operations for the root and child entities was implemented.
Optimistic control with ROWVERSION is possible with a custom RIA Service which uses a model from EF.
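A rough sketch of how such a coarse-grained lock can be driven from the ObjectContext, assuming a hypothetical Locks table keyed by the root entity's id and the current user (the table and all names are assumptions, not part of LightSwitch):

    using System;
    using System.Data.Objects;   // ObjectContext in EF 4.x
    using System.Linq;

    public static class PessimisticLock
    {
        public static bool TryAcquire(ObjectContext context, Guid rootId, string user)
        {
            // Succeeds only if no other user currently holds the lock.
            var heldByOthers = context.ExecuteStoreQuery<int>(
                "SELECT COUNT(*) FROM Locks WHERE RootId = {0} AND LockedBy <> {1}",
                rootId, user).Single();
            if (heldByOthers > 0) return false;

            context.ExecuteStoreCommand(
                "INSERT INTO Locks (RootId, LockedBy, LockedAt) VALUES ({0}, {1}, GETUTCDATE())",
                rootId, user);
            return true;
        }

        public static void Release(ObjectContext context, Guid rootId, string user)
        {
            context.ExecuteStoreCommand(
                "DELETE FROM Locks WHERE RootId = {0} AND LockedBy = {1}",
                rootId, user);
        }
    }

In a real implementation the check and the insert should be made atomic (for example with a unique constraint on RootId or a single INSERT ... WHERE NOT EXISTS statement) so two users cannot acquire the lock at the same time.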

Entity Framework, self-tracking entities and SQL Server FILESTREAM

I have just started a project where I need a WCF service that reads and writes files.
The architecture is based on DDD using Entity Framework Self-Tracking Entities.
The simple GUI should show a grid with a list of files; clicking a row downloads the corresponding file.
Can I use the SQL Server 2008 FILESTREAM feature with this architecture? Which strategy is the best one to manage this kind of entity?
Thanks.
FILESTREAM will not help you when using EF. EF doesn't use the streaming feature; it loads the column as varbinary(max). If you want to take advantage of FILESTREAM, you must load it from the database with ADO.NET directly, and you need a streaming service to pass it back to the client in an efficient way.
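A hedged sketch of the direct ADO.NET path, assuming a hypothetical Documents table with a FILESTREAM column named Content (table and column names are illustrative):

    using System.Data;
    using System.Data.SqlClient;
    using System.Data.SqlTypes;
    using System.IO;

    public static class DocumentReader
    {
        public static void CopyContentTo(SqlConnection connection, int documentId, Stream destination)
        {
            // The connection is assumed to be open; FILESTREAM access requires a transaction.
            using (var tx = connection.BeginTransaction())
            {
                var cmd = new SqlCommand(
                    "SELECT Content.PathName(), GET_FILESTREAM_TRANSACTION_CONTEXT() " +
                    "FROM Documents WHERE Id = @id", connection, tx);
                cmd.Parameters.AddWithValue("@id", documentId);

                string path;
                byte[] txContext;
                using (var reader = cmd.ExecuteReader(CommandBehavior.SingleRow))
                {
                    reader.Read();
                    path = reader.GetString(0);
                    txContext = (byte[])reader[1];
                }

                // SqlFileStream streams the blob from the NTFS store instead of
                // materializing a varbinary(max) value in memory.
                using (var source = new SqlFileStream(path, txContext, FileAccess.Read))
                {
                    source.CopyTo(destination);
                }
                tx.Commit();
            }
        }
    }

On the WCF side, the resulting stream can then be returned from a service operation whose binding uses transferMode="Streamed", so the file is never buffered whole in memory.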

What's the best way to insert/update/delete multiple records in a database from an application?

Given a small set of entities (say, 10 or fewer) to insert, delete, or update in an application, what is the best way to perform the necessary database operations? Should multiple queries be issued, one for each entity to be affected? Or should some sort of XML construct that can be parsed by the database engine be used, so that only one command needs to be issued?
I ask this because a common pattern at my current shop seems to be to format up an XML document containing all the changes, then send that string to the database to be processed by the database engine's XML functionality. However, using XML in this way seems rather cumbersome given the simple nature of the task to be performed.
It depends on how many you need to do, and how fast the operations need to run. If it's only a few, then doing them one at a time with whatever mechanism you have for doing single operations will work fine.
If you need to do thousands or more, and it needs to run quickly, you should re-use the connection and command, changing the arguments for the parameters to the query during each iteration. This will minimize resource usage. You don't want to re-create the connection and command for each operation.
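A small sketch of that reuse, assuming a hypothetical Items table with Id and Name columns (SQL Server syntax; the names are illustrative):

    using System.Collections.Generic;
    using System.Data;
    using System.Data.SqlClient;

    public static class ItemUpdater
    {
        public static void UpdateNames(string connectionString,
                                       IEnumerable<KeyValuePair<int, string>> changes)
        {
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(
                "UPDATE Items SET Name = @name WHERE Id = @id", connection))
            {
                var id = command.Parameters.Add("@id", SqlDbType.Int);
                var name = command.Parameters.Add("@name", SqlDbType.NVarChar, 100);

                connection.Open();
                foreach (var change in changes)
                {
                    // Same connection and command for every row; only the values change.
                    id.Value = change.Key;
                    name.Value = change.Value;
                    command.ExecuteNonQuery();
                }
            }
        }
    }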
You didn't mention which database you are using, but in SQL Server 2008 you can use table-valued parameters to pass complex data like this to a stored procedure. Parse it there and perform your operations. For more info, see Scott Allen's article on OdeToCode.
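A hedged sketch of passing a set of rows through a table-valued parameter, assuming a user-defined table type dbo.ItemChange and a stored procedure dbo.ApplyItemChanges already exist on the server (both names are assumptions):

    using System.Data;
    using System.Data.SqlClient;

    public static class BatchSender
    {
        public static void Send(string connectionString, DataTable changes)
        {
            // 'changes' must have the same columns as the dbo.ItemChange table type.
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand("dbo.ApplyItemChanges", connection))
            {
                command.CommandType = CommandType.StoredProcedure;

                var parameter = command.Parameters.AddWithValue("@changes", changes);
                parameter.SqlDbType = SqlDbType.Structured;   // table-valued parameter
                parameter.TypeName = "dbo.ItemChange";

                connection.Open();
                command.ExecuteNonQuery();   // one roundtrip for the whole set
            }
        }
    }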
Most databases support some form of bulk update or bulk delete, i.e. set-based statements that affect many rows in a single command.
From a "business entity" design standpoint, if you are doing different operations on each of a set of entities, you should have each entity handle its own persistence.
If there are common batch activities (like "delete all older than x date", for instance), I would write a static method on a collection class that executes the batch update or delete. I generally let entities handle their own inserts atomically.
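A brief sketch of such a static batch method on a collection class, assuming a hypothetical Invoices table reached through ADO.NET (all names are illustrative):

    using System;
    using System.Data.SqlClient;

    public class InvoiceCollection
    {
        // Batch activity handled at the collection level rather than per entity.
        public static int DeleteOlderThan(string connectionString, DateTime cutoff)
        {
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(
                "DELETE FROM Invoices WHERE CreatedAt < @cutoff", connection))
            {
                command.Parameters.AddWithValue("@cutoff", cutoff);
                connection.Open();
                return command.ExecuteNonQuery();   // number of rows deleted
            }
        }
    }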
The answer depends on the volume of data you're talking about. If you've got a fairly small set of records in memory that you need to synchronise back to disk then multiple queries is probably appropriate. If it's a larger set of data you need to look at other options.
I recently had to implement a mechanism where an external data feed gave me ~17,000 rows of data that I needed to synchronise with a local table. The solution I chose was to load the external data into a staging table and call a stored proc that did the synchronisation completely within the database.
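A rough sketch of that staging-table approach, assuming a hypothetical dbo.FeedStaging table and a dbo.SynchroniseFeed stored procedure (both names are assumptions, not from the answer):

    using System.Data;
    using System.Data.SqlClient;

    public static class FeedLoader
    {
        public static void Load(string connectionString, DataTable feedRows)
        {
            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();

                // Fast bulk load of the external feed into the staging table.
                using (var bulkCopy = new SqlBulkCopy(connection))
                {
                    bulkCopy.DestinationTableName = "dbo.FeedStaging";
                    bulkCopy.WriteToServer(feedRows);
                }

                // The stored procedure merges the staging rows into the real table
                // entirely inside the database.
                using (var command = new SqlCommand("dbo.SynchroniseFeed", connection))
                {
                    command.CommandType = CommandType.StoredProcedure;
                    command.ExecuteNonQuery();
                }
            }
        }
    }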