WCF User Management Technology Recommendation

I'm writing C# WinForms and ASP.NET apps that talk to SQL Server through a WCF service. I want to avoid creating my own user management system and would like to use an existing component/technology that can create/delete/manage users and roles. What would you recommend?
Note: I'm aware of the Geneva Framework, but RTM for that is the second half of 2009.

The best solution for you would be the built-in ASP.NET role management for your web application; how much of the work the built-in role management can handle for you depends on your application's complexity.
Please check these links
http://msdn.microsoft.com/en-us/library/t32yf0a9.aspx
https://web.archive.org/web/20210513220018/http://aspnet.4guysfromrolla.com/articles/120705-1.aspx
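To give a feel for how little code the built-in providers require, here is a minimal sketch of the Membership/Roles API. It assumes the SqlMembershipProvider and SqlRoleProvider are configured in web.config (aspnet_regsql.exe creates the backing tables); the user and role names are illustrative only.

    using System.Web.Security;

    // Create a user (throws MembershipCreateUserException on failure).
    MembershipUser user = Membership.CreateUser(
        "jsmith", "P@ssw0rd!", "jsmith@example.com");

    // Create a role and assign the user to it.
    if (!Roles.RoleExists("Supervisors"))
        Roles.CreateRole("Supervisors");
    Roles.AddUserToRole("jsmith", "Supervisors");

    // Later: check membership, or remove the user entirely.
    bool isSupervisor = Roles.IsUserInRole("jsmith", "Supervisors");
    Membership.DeleteUser("jsmith", true); // true = delete related data

For the WinForms side, .NET 3.5's Client Application Services can consume the same membership and role providers, so both clients can share one user store.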
About Geneva: it's the claims-based access model that MS has been promising for a long time, and we are finally seeing it. I have evaluated the beta and it seems to work fine, but I don't know how well it holds up in production-level code.
If you are looking at multi-platform federated applications, you can go with Geneva; otherwise the ASP.NET role management should be enough.
Again, it entirely depends on the complexity of the application you are developing. Geneva can handle complex federated applications very well without you writing much code.

Is it ever OK to use straight SQL (or an ORM) instead of BCS in SharePoint 2010?

I am working on a custom web part from which I want to query and write to another database.
BCS seems to complicate this process more than help it, so I am wondering if it is ever OK to access SQL Server directly without the use of BCS?
Short answer... go for it - use the DB directly and don't use BCS.
But it really depends a bit...
If you use BCS, you can take advantage of some SharePoint facilities such as search.
Please find the full advantages of BCS here:
http://msdn.microsoft.com/en-us/library/ee556440.aspx
But there is a big gotcha here... if you are using SharePoint Foundation there are a number of limitations on what you can do with BCS, so please keep this in mind.
One disadvantage of accessing the database directly is that you no longer have one central location for your data, and this can have its problems. But if you design your architecture well, you should be fine.
So overall, if you do not need the advantages of BCS and you can design a solid architecture (perhaps service-oriented), then my personal recommendation would be to use the database directly in your web part.
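For the direct route, a hedged sketch of what the web part's data access could look like; the connection string name, table, and column names are assumptions, and the parameterized command keeps the query injection-safe:

    using System.Configuration;
    using System.Data.SqlClient;

    public static void SaveRequest(string title, string userLogin)
    {
        // Connection string kept in web.config rather than in code.
        string connStr = ConfigurationManager
            .ConnectionStrings["OtherDb"].ConnectionString;

        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand(
            "INSERT INTO dbo.Requests (Title, CreatedBy) " +
            "VALUES (@title, @user)", conn))
        {
            cmd.Parameters.AddWithValue("@title", title);
            cmd.Parameters.AddWithValue("@user", userLogin);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }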
Sorry - BCS is the wrong answer - you should be using the Secure Store Service; that is how we connect between 'outside' data sources and SharePoint. Otherwise the custom web part must somehow embed the login information (either through properties, the web.config, or the registry), so SSS is where you want to go.

How to test SQL scripts? (Data Integrity/Migration Testing)

Our team (QA) is facing the following problem:
We have a database that is accessed only by our Core application, which is a WCF service app. Our client applications use the Core to access the database.
At some point we were provided with a new version of our Core application and of our database. The Dev department also gave us an SQL script that alters a big part of our database's core data. The core data is used by the Core application to describe the logic of our system, so every change to that data may affect the functionality of any of our client applications.
My questions are:
Should we test all of our applications again (even if they are already fully tested), or is there a more efficient way to test the SQL script?
Is there a testing technique/tool for data integrity/migration testing?
I am looking for a quick validity/integrity test of the database after running a migration script, so that we don't lose time testing everything through the applications. If the validity/integrity test passes, then we can test the apps.
There are unit testing frameworks available for T-SQL. The TSQLUnit project is one such framework. You can use it to set up automated testing, just like you would in the applications.
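TSQLUnit tests themselves are written in T-SQL, but the same idea can be driven from the application side. Below is a hedged C# sketch of a post-migration smoke test; the connection string, tables, and columns are assumptions standing in for your real schema:

    using System.Data.SqlClient;
    using NUnit.Framework;

    [TestFixture]
    public class PostMigrationChecks
    {
        private const string ConnStr =
            "Server=.;Database=CoreDb;Integrated Security=SSPI;";

        // Helper: run a scalar query and return its integer result.
        private static int Scalar(string sql)
        {
            using (var conn = new SqlConnection(ConnStr))
            using (var cmd = new SqlCommand(sql, conn))
            {
                conn.Open();
                return (int)cmd.ExecuteScalar();
            }
        }

        [Test]
        public void NoOrphanedDetailRows()
        {
            // Referential integrity: every detail row has a parent.
            Assert.AreEqual(0, Scalar(
                @"SELECT COUNT(*) FROM dbo.OrderDetail d
                  LEFT JOIN dbo.[Order] o ON o.Id = d.OrderId
                  WHERE o.Id IS NULL"));
        }

        [Test]
        public void CoreLookupTableStillPopulated()
        {
            // The migration must not empty the core lookup data.
            Assert.Greater(Scalar("SELECT COUNT(*) FROM dbo.StatusCode"), 0);
        }
    }

A handful of such checks runs in seconds and catches gross migration damage before you commit to full application testing.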
As @Tim Lentine already posted, I would recommend testing the full application. As you commented, the new SQL script your team received makes important changes to the core of your database, according to your description both to the structure and to the data itself. So, to be sure that everything is still in one piece, I would prefer to do a full application test. As for a tool or technique, I can recommend the new Red Gate add-in for SSMS called "SQL Test" (no, I do not work for them). It uses the open-source unit testing framework tSQLt under the hood. Its only drawback is that someone will first need to learn how to work with tSQLt, but it is pretty straightforward.
From the description you gave:
We have a database that is accessed only by our Core application ...
we were provided with a new Version of our Core application and of our Database ...
tells me it is not your team's responsibility to test the database in isolation, but you can test the Core service from your client's perspective and therefore assume the database is correct.
Treat the Core application and the database as a black box and test using unit tests. These tests should not require you to go poking around in the database, as for all intents and purposes any application using your Core application doesn't know, nor should it care, that the information is actually stored in a database. Your development team could decide in 6 months that they are going to store the data in the cloud, in which case all your database tests would be broken.
If you do have to look in the database to check that data has been stored correctly, then there is a problem with your Core service's interface, as any data you put in should be retrievable via the same interface. (I just know someone is going to comment that their app stores data that cannot be read back, but without a more detailed description of your app it's easier to generalise.)
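A hedged sketch of that black-box round trip; CoreServiceClient, Timesheet, and the method names are assumptions standing in for your real WCF service contract:

    using NUnit.Framework;

    [TestFixture]
    public class CoreServiceRoundTripTests
    {
        [Test]
        public void SavedTimesheetCanBeReadBack()
        {
            using (var client = new CoreServiceClient())
            {
                var sheet = new Timesheet { EmployeeId = 42, Hours = 7.5m };
                int id = client.SaveTimesheet(sheet);

                // Assert through the same interface - no peeking at
                // tables, so a schema change can't break the test.
                var loaded = client.GetTimesheet(id);
                Assert.AreEqual(7.5m, loaded.Hours);
            }
        }
    }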
Back in the real world I am assuming you are part of the QA team and unless the database developers are doing some testing (they are, aren't they?) you are more than likely going to have to validate the database changes.
To that end, you may be interested in a question I posted on the DBA Stack Exchange site about performing a data comparison between two different schemas. Spoiler: there's no easy answer.
See the links below:
http://www.simple-talk.com/sql/t-sql-programming/sql-server-unit-testing-with-tsqlt/
http://www.red-gate.com/products/sql-development/sql-test/

What ORMs are developers using to connect to Azure?

I'm interested in finding out what techniques developers are using to connect to a Windows Azure instance running in the cloud.
From what I understand it is very similar to SQL Server, with two of the key differences being that Multiple Active Result Sets (MARS) are not supported and that idle/long-running connections are automatically terminated by Azure. For this, Microsoft suggests incorporating retry logic in your application to detect a closed connection and then attempt to complete the interrupted action. Does anyone have example code that they are currently using for this?
To build out the data layer I was looking at various ORMs. Since I'm going to be accessing SQL Azure from Windows Azure (i.e. separate boxes), it would seem key that any ORM would need to support asynchronous methods so as not to block any Windows Azure instances.
Any suggestions as to which ORM to use, or comments on what you are currently using?
I have successfully used NHibernate with Azure, and we are in the process of building a commercial app on top of NHibernate. The only problem I had was with the connection pools when running locally and connecting to SQL Azure in the cloud - which was fixed by turning connection pooling off.
You may find similar problems with other ORMs... SQL Azure is less patient (for obvious reasons) than most people are used to: connections time out quicker, recycle sooner, and so on.
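A sketch of the pooling fix described above, assuming NHibernate is otherwise configured via hibernate.cfg.xml; the server name and credentials are placeholders:

    using NHibernate.Cfg;

    var cfg = new Configuration().Configure(); // reads hibernate.cfg.xml
    // Pooling=False is the relevant part; the rest is a placeholder.
    cfg.SetProperty(NHibernate.Cfg.Environment.ConnectionString,
        "Server=tcp:myserver.database.windows.net;Database=MyDb;" +
        "User ID=myuser@myserver;Password=...;Encrypt=True;Pooling=False;");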
Test first!
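Since the question asks for retry examples: here is a minimal hand-rolled sketch, not taken from any specific library. The attempt count, backoff, and table name are illustrative; a real version would also filter on the transient SQL error numbers rather than retrying every SqlException.

    using System;
    using System.Data.SqlClient;
    using System.Threading;

    public static class SqlAzureRetry
    {
        // Runs an action, retrying on SqlException with linear backoff.
        public static T WithRetry<T>(Func<T> action, int maxAttempts)
        {
            for (int attempt = 1; ; attempt++)
            {
                try
                {
                    return action();
                }
                catch (SqlException)
                {
                    if (attempt >= maxAttempts)
                        throw;                                   // give up
                    Thread.Sleep(TimeSpan.FromSeconds(attempt)); // back off
                }
            }
        }

        // Usage: open the connection inside the lambda, so a retry gets
        // a fresh connection instead of the one SQL Azure just closed.
        public static int CountOrders(string connStr)
        {
            return WithRetry(() =>
            {
                using (var conn = new SqlConnection(connStr))
                using (var cmd = new SqlCommand(
                    "SELECT COUNT(*) FROM dbo.Orders", conn))
                {
                    conn.Open();
                    return (int)cmd.ExecuteScalar();
                }
            }, 3);
        }
    }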
Here's one specifically designed for Azure:
"Telerik recently announced the
availability of Open Access, the first
ORM that works seamlessly with SQL
Azure relational databases in the
Windows Azure cloud."
And a few commenters at the Azure User Group recommend LLBLGen and Entity Framework.
I've been using Entity Framework - runs without any problems, just a different connection string.
What you do have to think about is your connection strategy and how efficient your queries are. I've got a method that's easy to write in EF - I've got a new record that could be duplicated, so I check whether it's there and, if not, add it.
EF makes it really easy to do this, as if you're just accessing a local collection. BUT... if you're paying for your DB access because it's in Azure and not on your local network, hmm, maybe there's a better (aka cheaper) way of doing that.
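For concreteness, a sketch of that check-then-add pattern in EF 4's ObjectContext API; the context and entity names are assumptions. Note it costs a query plus an insert - two round trips of billable work against SQL Azure, which is the cost concern above:

    using System.Linq;

    public static void AddIfMissing(MyEntities ctx, string externalKey)
    {
        // One round trip to check for a duplicate...
        bool exists = ctx.Records.Any(r => r.ExternalKey == externalKey);
        if (!exists)
        {
            // ...and another to insert if it wasn't there.
            ctx.Records.AddObject(new Record { ExternalKey = externalKey });
            ctx.SaveChanges();
        }
    }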
According to Ayende, NHibernate "just works" with SQL Azure.
We have been using NHibernate without any customization on Azure (indeed, it just works); you can check Lokad.Translate as an open-source example of such use.

Design Advice for an HTA-Based CRUD App

I am developing a framework for various in-house CRUD apps. I've considered several MS technologies (WPF, Access, WinForms, ASP.NET) and have settled on ASP.NET MVC with HTA+Jquery for the client. My reason for doing so is that I need a way to write and deploy quick one-off GUI apps as well as maintaining more robust apps that are expected to have a long life time.
Firstly, I would appreciate some thoughts on the relative merits of using ADODB on the client side versus ADO.NET on the server side. I'm leaning towards ADODB since I'll have client-side access to the SQL Server (I've already written a JS library that handles interacting with ADODB). However, I can see how developing a RESTful service may eventually be useful.
Secondly, I need to incorporate reporting capability into the system. I can use SQL Server Reporting Services or Crystal Reports, but the users have grown accustomed to some older applications that use VBA to write reports in Word; so I'm considering using WordML to write the reports.
Thanks.
Database Access
If you need a thin client, then it's probably better to stay away from directly accessing the database from within the client.
The main issue is that you will introduce a high dependency on a specific network architecture and both your ASP.Net application and the HTA will be highly dependent on the database.
Instead, I would prefer to sever the dependency on a direct line of sight to the DB and have the data handled by the server.
This has a few advantages:
for many small changes to the db, you're probably only going to have to update the ASP app.
if you ever need your client app to be functional over the internet (say because some users are going to an outside meeting, need to work from home, or your company opens a new branch) then you won't have to rewrite your thin client.
you keep better control over access to the resources: only let the ASP app talk to the database and filter what comes in/out of it.
This saves you from having to implement all the security on the client: the ASP app becomes the guardian of the database. It's a much better way to secure information and it gives you a lot more control.
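A hedged sketch of that guardian pattern in ASP.NET MVC; OrdersController and OrderRepository are assumed names, and the HTA would call the action with jQuery (e.g. $.getJSON("/orders/list", ...)) without ever seeing a connection string:

    using System.Web.Mvc;

    public class OrdersController : Controller
    {
        // All ADO.NET lives server-side, behind this assumed repository.
        private readonly OrderRepository repository = new OrderRepository();

        public JsonResult List()
        {
            var orders = repository.GetOpenOrders();
            return Json(orders, JsonRequestBehavior.AllowGet);
        }
    }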
Reporting
For reporting I'd use the server again rather than implement complex reporting capabilities in the client itself.
The problem is that you're always going to be limited on the client if you're using an HTA and don't want to start installing dependencies on each user's machine.
You'll end up building a thick client in no time...
If you're using ASP.Net there are plenty of really good reporting tools that will make your life much easier and allow your users to get nice reports in Excel, Word, PDF, etc without you having to code these features yourself.
Crystal Reports is OK, but there are better and simpler alternatives; for example, the Developer Express report engine is pretty easy to use.

Does Anyone Have Experience Creating an Occasionally-Connected Browser App With NHibernate?

We need to make our enterprise ASP.NET/NHibernate browser-based application able to function when connected to or disconnected from the customer's server. Has anyone done this? If so, how did you do it? (Technology, architecture, etc.)
Background:
We develop and sell an enterprise browser-based application used by construction field personnel to enter timesheet information. Currently, it requires a connection to the server back in the customer's office and we'd like to build an occasionally-connected version of the application for those clients without wireless Internet availability.
Our application is an ASP.NET application using NHibernate for O/R mapping. Since we are a Microsoft shop, the Microsoft Sync Framework is attractive, but we don't know whether it "plays well" with NHibernate.
Any insight would be greatly appreciated.
Dave T
Maybe you could operate some kind of offline version using a small embedded database (I hear good things about VistaDB - http://www.vistadb.net/ - which I believe does play well with NHibernate), with a syncing tool to copy data in when they are back online. A ClickOnce launcher could handle installation and integration.
You want to be careful with anything involving syncing, though - if it is just single-user timesheets, that might be OK - but if there is any chance of conflicts between the online and offline data, you might be better off considering the problem from a different angle for pain avoidance...
Why not couple it with Google Gears? People put their data in while offline, and then they can sync it when they reconnect to the server.
In a modern world, using the HTML5 data store:
http://www.webreference.com/authoring/languages/html/HTML5-Client-Side/