Lightswitch 2011 Concurrency Control - optimistic-concurrency

I've got two questions.
Does LightSwitch 2011 also support pessimistic concurrency control? If so, how?
Does optimistic concurrency control support a ROWVERSION column on a table from an external data source, or does it use only the state of the row (the original values)?
Thank you for your response.

LightSwitch 2011 supports only optimistic concurrency. However, I integrated pessimistic concurrency into LightSwitch successfully. I used the Coarse-Grained Lock pattern from Patterns of Enterprise Application Architecture (Martin Fowler) together with Entity Framework, which already provides a Unit of Work (ObjectContext) and a Repository (ObjectSet). Acquiring, releasing, and checking the lock is done with ExecuteStoreCommand and ExecuteStoreQuery from EF. I implemented a custom RIA Service with CRUD operations for the root and child entities.
Optimistic control with ROWVERSION is possible with a custom RIA Service that uses a model from EF.
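As a rough illustration of the lock-acquisition step described above (the table name `Locks` and its columns `EntityId`, `Owner`, `AcquiredAt` are hypothetical placeholders, not from the original answer), a coarse-grained lock can be acquired and released via `ExecuteStoreCommand` on the EF 4 `ObjectContext`:

```csharp
using System;
using System.Data.Objects; // EF 4.x ObjectContext API

public class LockManager
{
    private readonly ObjectContext _context;

    public LockManager(ObjectContext context)
    {
        _context = context;
    }

    // Try to acquire a coarse-grained lock on an aggregate root.
    // Returns true if this owner now holds the lock.
    public bool TryAcquire(Guid rootId, string owner)
    {
        // The INSERT succeeds only if no lock row exists yet for this root,
        // so exactly one caller can win the race.
        int rows = _context.ExecuteStoreCommand(
            @"INSERT INTO Locks (EntityId, Owner, AcquiredAt)
              SELECT {0}, {1}, GETUTCDATE()
              WHERE NOT EXISTS (SELECT 1 FROM Locks WHERE EntityId = {0})",
            rootId, owner);
        return rows == 1;
    }

    public void Release(Guid rootId, string owner)
    {
        _context.ExecuteStoreCommand(
            "DELETE FROM Locks WHERE EntityId = {0} AND Owner = {1}",
            rootId, owner);
    }
}
```

The custom RIA Service would call `TryAcquire` before letting a client edit the root entity, and `Release` on save or cancel.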

Related

How to pass data using Entity Framework and wcf

I'm trying to develop a .NET 4 application using C#, WCF, and Entity Framework. My first idea was to pass the EF-generated objects through WCF (first with the default entity objects, then with the POCO entities), but I soon ran into connection problems ("connection is closed" errors) due to non-serializable objects in the generated entities. I ended up writing several data-only classes to hold the data queried with EF, but now I don't see the role of EF with WCF. I guess I'm doing something wrong, so how do you send data through WCF using EF? What is the point of EF? Wouldn't it be easier to write stored procs and standard ADO.NET?
Entity Framework is just a data access technology. You can create a data access layer which talks to your database and returns the required data using Entity Framework, and then plug that into your WCF service so that the service gets its data from it. You can use the same data access layer with any other consumer (a Silverlight application, a Windows Forms project, or an MVC application). The advantage of using Entity Framework is that it loads the data into your domain objects (your POCO classes) so that you do not need to do it manually yourself. With a stored proc, you need to execute it and iterate through the DataReader/DataTable to fill your objects, which is code you have to write. Entity Framework does this for you, so you save some development time.
You should clearly separate your project logically so that there is a data access layer and a consumer (your WCF service) which uses it.
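A minimal sketch of that separation, assuming a hypothetical EF context `MyEntities` with a `Customers` set (all names here are illustrative, not from the original answer). The key point is that the query projects into a plain serializable DTO inside the data access layer, so only simple data crosses the WCF boundary:

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Runtime.Serialization;
using System.ServiceModel;

// Serializable DTO exposed over the wire instead of the EF entity.
[DataContract]
public class CustomerDto
{
    [DataMember] public int Id { get; set; }
    [DataMember] public string Name { get; set; }
}

[ServiceContract]
public interface ICustomerService
{
    [OperationContract]
    IList<CustomerDto> GetCustomers();
}

public class CustomerService : ICustomerService
{
    public IList<CustomerDto> GetCustomers()
    {
        // The projection to CustomerDto happens inside the EF query,
        // so no entity (with its lazy-loading proxies or connection
        // state) ever reaches the serializer.
        using (var db = new MyEntities())
        {
            return db.Customers
                     .Select(c => new CustomerDto { Id = c.Id, Name = c.Name })
                     .ToList();
        }
    }
}
```

The same `CustomerService`-style layer can back a Silverlight, Windows Forms, or MVC consumer without changes to the EF code.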

EF4.x and WCF Service (Persistence ignorant) Updating nested entities with 1 to n and m to n relationship.

I have a SQL Server database and would like to use LINQ to Entities, wrap it with a WCF layer, and expose it to the client (a typical N-tier architecture). I would also like to have a persistence-ignorant option, plus an option to prevent certain fields (sensitive information) in the database from being serialized to the client.
So what would be the best approach for using Entity Framework with persistence ignorance and self-tracking with WCF support? I could find a T4 template with either self-tracking or persistence ignorance, but everything is bundled as a single package.
Any help with this would be greatly appreciated.
STEs (self-tracking entities) don't allow any projections - you must expose your entities in their exact form. If you want to hide some fields, you must abandon STEs and create your own DTOs (data transfer objects) exposing only a subset of your entities' data. Once you use DTOs, you must handle all change tracking manually.
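To make the trade-off concrete, here is a sketch of the DTO approach with manual change tracking on the update path (the entity `Employee`, its sensitive `Ssn` column, and the context `MyEntities` are hypothetical names for illustration):

```csharp
using System.Linq;
using System.Runtime.Serialization;

// The DTO deliberately omits the sensitive Ssn column, so it can
// never be serialized to the client.
[DataContract]
public class EmployeeDto
{
    [DataMember] public int Id { get; set; }
    [DataMember] public string Name { get; set; }
}

public class EmployeeService
{
    public void UpdateEmployee(EmployeeDto dto)
    {
        using (var db = new MyEntities())
        {
            // Manual change tracking: reload the entity by key and copy
            // the editable fields back. Ssn is never read or written here.
            var entity = db.Employees.Single(e => e.Id == dto.Id);
            entity.Name = dto.Name;
            db.SaveChanges();
        }
    }
}
```

This is the cost the answer refers to: every updatable field must be copied back by hand, whereas STEs would have carried the change state for you.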

entity framework, self-tracking entity and sqlserver file stream

I'm just starting a project where I need a WCF service that reads and writes files.
The architecture is based on DDD using Entity Framework self-tracking entities.
The simple GUI should show a grid with a list of files; clicking a row downloads the file.
Can I use the SQL Server 2008 FILESTREAM feature with this architecture? Which strategy is best for managing this kind of entity?
Thanks.
FILESTREAM will not help you when using EF. EF doesn't use the streaming feature; it loads the column as varbinary(max). If you want to take advantage of FILESTREAM, you must load it from the database with ADO.NET directly, and you need a streaming service to pass it back to the client efficiently.
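A sketch of the direct ADO.NET path the answer suggests, using `SqlFileStream` (the `Files` table and `Content` column are placeholder names; error handling is omitted). Note that FILESTREAM access requires an open transaction, which the caller must keep alive until the stream has been consumed and then commit:

```csharp
using System.Data.SqlClient;
using System.Data.SqlTypes;
using System.IO;

public static class FileStore
{
    // Opens a FILESTREAM column for true streaming, bypassing EF.
    // The transaction must stay open while the returned stream is read.
    public static Stream OpenFileStream(
        SqlConnection conn, SqlTransaction tx, int fileId)
    {
        var cmd = new SqlCommand(
            @"SELECT Content.PathName(),
                     GET_FILESTREAM_TRANSACTION_CONTEXT()
              FROM Files WHERE Id = @id", conn, tx);
        cmd.Parameters.AddWithValue("@id", fileId);

        using (var reader = cmd.ExecuteReader())
        {
            reader.Read();
            string path = reader.GetString(0);
            byte[] txContext = (byte[])reader[1];
            // SqlFileStream reads directly from the NTFS file store
            // instead of pulling the blob through TDS as varbinary(max).
            return new SqlFileStream(path, txContext, FileAccess.Read);
        }
    }
}
```

On the WCF side, the returned `Stream` would be exposed through a service configured with `TransferMode.Streamed` so the file is never buffered whole in memory.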

Does Ef Code First support batch CUD, and do you have examples?

I have tried to Google this question without finding any answers.
I am trying to batch update/insert entities in a WCF + Entity Framework project.
My question is: does Entity Framework 4.1 (Code First) support batch insert, update, and delete?
And if EF 4.1 does support batch CUD, do you have any examples?
Entity Framework (all versions) supports CUD (insert, update, delete), but it doesn't support batching commands. That means each insert, update, and delete is executed in a separate roundtrip to the database.
This has no relation to WCF. If you want batching over WCF, just send a collection of objects for processing and let the server process them with EF (still without command batching). WCF Data Services supports batching for passing multiple objects to the server within one roundtrip.
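A sketch of that "batch over WCF, not over SQL" shape (the `Order`/`OrderDto` types and `MyContext` are hypothetical). One WCF roundtrip carries the whole collection; EF then still issues one command per row, but within a single `SaveChanges` transaction:

```csharp
using System.Collections.Generic;
using System.Runtime.Serialization;
using System.ServiceModel;

[DataContract]
public class OrderDto
{
    [DataMember] public string Number { get; set; }
}

[ServiceContract]
public interface IOrderService
{
    // The client makes one call for N orders instead of N calls.
    [OperationContract]
    void SaveOrders(List<OrderDto> orders);
}

public class OrderService : IOrderService
{
    public void SaveOrders(List<OrderDto> orders)
    {
        using (var db = new MyContext())
        {
            foreach (var dto in orders)
                db.Orders.Add(new Order { Number = dto.Number });
            // One transaction, but still one database command per row:
            // EF 4.1 does not batch the INSERT statements.
            db.SaveChanges();
        }
    }
}
```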

Querying database from different applications with nHibernate

At the moment, I have two web applications (one is an MVC2 application for managing my project, and the second is an application with web services). Both applications have to deal with the database and use NHibernate for querying it. Is this a good pattern? If not, what can I do?
Edit 1
Both applications can write to the database. I have a DLL project that handles the database transactions and holds the NHibernate instance, named "Repositorio". Nevertheless, each application will have a different instance of Repositorio.dll, so there are going to be multiple threads hitting the database. What do I have to do to make both applications use the same instance of Repositorio.dll?
The answer depends on whether or not both applications can write to the database.
If one is read-only, I'd say you're safe.
If not, I'd argue that a service-oriented approach would recommend creating a service that provides an interface for both applications and is the sole owner of the database.
"service-oriented" does not mean that the service has to be a distributed component (e.g., SOAP or REST or RPC). If you encapsulate the database access in a component with a well-defined interface you can choose to share the component as a DLL in both applications. Only make it a distributed component if that makes sense for both applications.
That sounds perfectly fine to me, even if both applications write to the database. I would simply recommend you create a third project, a class library, holding all your NHibernate-related code, to avoid writing redundant code in both projects.
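A minimal sketch of what that shared class library might expose (the class name `SessionManager` is illustrative). Each application process still builds its own `ISessionFactory`, which is the normal NHibernate pattern; concurrent writes from the two applications are coordinated by the database through transactions, not by sharing one in-memory instance:

```csharp
using NHibernate;
using NHibernate.Cfg;

// Lives in the shared class library; both web applications
// reference this assembly instead of duplicating mapping
// and configuration code.
public static class SessionManager
{
    // One ISessionFactory per process - it is expensive to build
    // and is designed to be thread-safe and long-lived.
    private static readonly ISessionFactory Factory =
        new Configuration().Configure().BuildSessionFactory();

    public static ISession OpenSession()
    {
        return Factory.OpenSession();
    }
}
```

Each request then does its work inside `using (var session = SessionManager.OpenSession())` with an explicit transaction, so both applications remain safe writers against the same database.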