Design Advice for an HTA-Based CRUD App

I am developing a framework for various in-house CRUD apps. I've considered several MS technologies (WPF, Access, WinForms, ASP.NET) and have settled on ASP.NET MVC with HTA+jQuery for the client. My reason for doing so is that I need a way to write and deploy quick one-off GUI apps as well as maintain more robust apps that are expected to have a long lifetime.
Firstly, I would appreciate some thoughts on the relative merits of using ADODB on the client side versus ADO.NET on the server side. I'm leaning towards ADODB since I'll have client-side access to the SQL Server (I've already written a JS library that handles interacting with ADODB). However, I can see how developing a RESTful service may eventually be useful.
Secondly, I need to incorporate reporting capability into the system. I could use SQL Server Reporting Services or Crystal Reports, but the users have grown accustomed to some older applications that use VBA to write reports in Word, so I'm considering using WordML to write the reports.
Thanks.

Database Access
If you need a thin client, then it's probably better to stay away from directly accessing the database from within the client.
The main issue is that you will introduce a tight dependency on a specific network architecture, and both your ASP.NET application and the HTA will be highly dependent on the database.
Instead, I would sever the dependency on a direct line of sight to the DB and have the data handled by the server.
This has a few advantages:
for many small changes to the db, you're probably only going to have to update the ASP app.
if you ever need your client app to be functional over the internet (say because some users are going to an outside meeting, need to work from home, or your company opens a new branch) then you won't have to rewrite your thin client.
you keep better control over access to the resources: only let the ASP app talk to the database and filter what comes in/out of it.
This saves you having to implement all security on the client: the ASP app becomes the guardian of the database. It's a much better way to secure information and it gives you a lot more control.
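As a rough sketch of what that guardian layer could look like (assuming ASP.NET MVC 2 or later; the Orders table, column names, and connection string are all hypothetical), the HTA would fetch this with jQuery's $.getJSON instead of opening an ADODB connection itself:

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Web.Mvc;

// Hypothetical controller: the only component that holds a connection
// string. The HTA client never sees the database, only this endpoint.
public class OrdersController : Controller
{
    private const string ConnectionString =
        @"Server=.\SQLEXPRESS;Database=AppDb;Integrated Security=true";

    public ActionResult List(int customerId)
    {
        var orders = new List<object>();
        using (var conn = new SqlConnection(ConnectionString))
        using (var cmd = new SqlCommand(
            "SELECT Id, Total FROM Orders WHERE CustomerId = @id", conn))
        {
            // Input is filtered and parameterized on the server.
            cmd.Parameters.AddWithValue("@id", customerId);
            conn.Open();
            using (var reader = cmd.ExecuteReader())
                while (reader.Read())
                    orders.Add(new { Id = reader.GetInt32(0),
                                     Total = reader.GetDecimal(1) });
        }
        return Json(orders, JsonRequestBehavior.AllowGet);
    }
}
```

If the database moves, or a new branch connects over the internet, only this one endpoint's configuration changes; the client keeps calling the same URL.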
Reporting
For reporting I'd use the server again rather than implement complex reporting capabilities in the client itself.
The problem is that you're always going to be limited on the client if you're using an HTA and don't want to start installing dependencies on each user's machine.
You'll end up building a thick client in no time...
If you're using ASP.NET, there are plenty of really good reporting tools that will make your life much easier and allow your users to get nice reports in Excel, Word, PDF, etc. without you having to code these features yourself.
Crystal Reports is OK, but there are better and simpler alternatives; for example, the Developer Express report engine is pretty easy to use.
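As a minimal illustration of keeping report generation on the server (the controller, the CSV output, and the data are all hypothetical; a real engine such as SSRS or the Developer Express one would sit behind the same kind of action), the client only ever follows a link:

```csharp
using System.Text;
using System.Web.Mvc;

public class ReportsController : Controller
{
    // Hypothetical endpoint: the server renders the report and streams it
    // down, so the HTA needs no local reporting dependencies at all.
    public ActionResult Timesheets()
    {
        var csv = new StringBuilder();
        csv.AppendLine("Employee,Hours");
        csv.AppendLine("J. Smith,40"); // in reality, pulled from the DB
        return File(Encoding.UTF8.GetBytes(csv.ToString()),
                    "text/csv", "timesheets.csv");
    }
}
```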

Related

Should I connect to SQL directly or use web services for a commercial tablet app

I have a requirement to create a tablet application for use in restaurants. It will all be on a private internal network so security is not an issue. The question is which will cause the least network traffic? I can either connect directly to SQL using entity framework or I can connect to web services I create on the SQL server in IIS and the tablets communicate with that.
I guess to simplify it, does a standard SQL connection transfer more data than is necessary?
It's difficult to give a general rule, as network architecture plays into the answer quite heavily.
As a general guideline I would suggest building web services or PHP "interfaces" on the server. It gives you an easier and more controllable data flow, and you can handle transactions more easily, since all of them go through one interface, with all DB accesses coming from one machine. It also makes debugging and error handling easier than if every client connects directly to the DB (log at the interface and you see everything that's happening, so you don't have to check logs on devices), and it gives you more control.
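A rough sketch of that single-interface idea (the names, schema, and connection string are all made up; the point is that the log line and the transaction live in exactly one place on the server):

```csharp
using System;
using System.Data.SqlClient;
using System.Transactions;

// Hypothetical server-side interface: every tablet request funnels
// through here, so one machine owns logging and transaction handling.
public static class OrderService
{
    public static void PlaceOrder(int tableNo, int itemId)
    {
        // One central log instead of logs scattered across devices.
        Console.WriteLine("{0:o} PlaceOrder table={1} item={2}",
                          DateTime.UtcNow, tableNo, itemId);

        using (var scope = new TransactionScope())
        using (var conn = new SqlConnection(
            @"Server=.\SQLEXPRESS;Database=Pos;Integrated Security=true"))
        {
            conn.Open();
            using (var cmd = new SqlCommand(
                "INSERT INTO Orders(TableNo, ItemId) VALUES(@t, @i)", conn))
            {
                cmd.Parameters.AddWithValue("@t", tableNo);
                cmd.Parameters.AddWithValue("@i", itemId);
                cmd.ExecuteNonQuery();
            }
            scope.Complete(); // commit only if everything succeeded
        }
    }
}
```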
Just a general suggestion: some kind of web service/interface layer is always worth the investment; sooner or later you will go in this direction anyway.
My humble opinion.

Stop exporting a SQL Server database to secure it

I have a vb.net Windows Forms application with a database on SQL Server 2008 on the .\SQLEXPRESS instance.
I have created a setup for my project using the walkthrough below:
http://msdn.microsoft.com/en-US/library/49b92ztk(v=vs.80).aspx
When a user installs my application, the database is deployed along with it, and the user can simply export the SQL Server database.
How can I secure my database so that the user doesn't have an easily available copy of it?
I thought of creating a new password-protected server instance during installation of my application on the user's PC, other than .\SQLEXPRESS (where I created the database in the walkthrough above), so that a complete copy of the database used by my application would not be simply available for the user to export.
So could anyone please guide me...
The question is: how far do you want to go to protect your data?
Better protection of your data usually comes at the cost of more development time and likely less user friendliness, for example due to lower performance (encryption is not free). More complex code usually results in more support requests too.
Where the best balance is depends on your business model (if any) and on your user requirements.
Keep in mind that anything you deploy to an end-user's machine is in the end vulnerable. If something is valuable enough, there will be people trying to steal it.
So, you could argue that the best protection is not to deploy the data at all. You could back your end-user application with a web service and keep the data on your own server, for example in the cloud.
I've found however that you sometimes just need to trust your users. If you build a good product that makes them happy, they have no reason to steal from you. In fact, they are probably glad to pay you.
If you decide that you need to deploy the data and that you need to encrypt it, you should think about why you chose SQL Server.
What database features do you need, exactly? Do you need a full-blown database server for that?
Any local admin can gain control over any SQL Server database in seconds so the built-in SQL server authentication will not bring you a lot of benefits.
You could switch to SQL Server CE and keep the database within your application. That would make the database a lot harder to access for a regular user.
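A minimal sketch of that embedded approach, assuming the System.Data.SqlServerCe assembly (file name, password, and schema are made up). The .sdf file can still be copied, but with Password and Encrypt Database set its contents are encrypted, so casual export is no longer trivial:

```csharp
using System.Data.SqlServerCe;

class LocalStore
{
    // Embedded, password-protected, encrypted database file; no server
    // instance is installed on the user's machine.
    const string ConnStr =
        @"Data Source=app.sdf;Password=s3cret;Encrypt Database=True";

    public static void EnsureCreated()
    {
        if (System.IO.File.Exists("app.sdf")) return;

        new SqlCeEngine(ConnStr).CreateDatabase();
        using (var conn = new SqlCeConnection(ConnStr))
        {
            conn.Open();
            using (var cmd = new SqlCeCommand(
                "CREATE TABLE Words (Word NVARCHAR(100) NOT NULL)", conn))
                cmd.ExecuteNonQuery();
        }
    }
}
```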
If all you're doing is looking up words, you may be better off with a different storage engine like Lucene.
Lucene is actually a search engine, so it's highly optimized for matching words or parts of words.
You can run Lucene inside your .NET application, so you don't even need the end-user to install SQL Server. There is a .NET port of Lucene, Lucene.Net.
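A rough sketch against the Lucene.Net 3.0.x API (the index directory, field name, and content are made up): index a word, then look it up, all in-process:

```csharp
using Lucene.Net.Analysis.Standard;
using Lucene.Net.Documents;
using Lucene.Net.Index;
using Lucene.Net.QueryParsers;
using Lucene.Net.Search;
using Lucene.Net.Store;
using Version = Lucene.Net.Util.Version;

class WordIndex
{
    public static void Demo()
    {
        var dir = FSDirectory.Open(
            new System.IO.DirectoryInfo("word-index"));
        var analyzer = new StandardAnalyzer(Version.LUCENE_30);

        // Build the index: one document per word.
        using (var writer = new IndexWriter(dir, analyzer, true,
                                IndexWriter.MaxFieldLength.UNLIMITED))
        {
            var doc = new Document();
            doc.Add(new Field("word", "timesheet",
                              Field.Store.YES, Field.Index.ANALYZED));
            writer.AddDocument(doc);
        }

        // Look a word up; no database server involved.
        using (var searcher = new IndexSearcher(dir))
        {
            var query = new QueryParser(Version.LUCENE_30, "word", analyzer)
                            .Parse("timesheet");
            TopDocs hits = searcher.Search(query, 10);
            // hits.TotalHits == 1 for the document indexed above.
        }
    }
}
```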
Lucene however doesn't protect your data. There's tooling available that will allow anybody to view and extract the data from the stored index files.
Since Lucene is open source though, you could extend it to support encrypted data storage (see this related question).

How to test SQL scripts? (Data Integrity/Migration Testing)

Our team (QA) is facing the following problem:
We have a database that is accessed only by our Core application which is a WCF services app. Our client applications are using the Core to access the database.
At some point we were provided with a new version of our Core application and of our database. The dev department also gave us a SQL script which alters a big part of our database's Core data. The Core data is used by the Core application to describe the logic of our system, so every change to that data may affect any of our client applications' functionality.
My questions are:
Should we test all of our applications again (even if they are already fully tested), or is there a more efficient way to test the SQL script?
Is there a testing technique/tool for data integrity/migration testing?
I am looking for a quick validity/integrity test of the database after running a migration script. That will prevent us from losing time by testing it through the applications. If the validity/integrity testing is successful, then we can test the apps.
There are unit testing frameworks available for T-SQL. The TSQLUnit project is one such framework. You can use it to set up automated testing, just like you would in the applications.
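TSQLUnit tests are themselves written in T-SQL; if your team is more at home in .NET, the same style of post-migration sanity check can also be automated there. A sketch (NUnit; the connection string, table names, and the orphan-row rule are all hypothetical):

```csharp
using System.Data.SqlClient;
using NUnit.Framework;

[TestFixture]
public class MigrationSmokeTests
{
    const string ConnStr =
        @"Server=.\SQLEXPRESS;Database=CoreDb;Integrated Security=true";

    // Hypothetical integrity rule: the migration script must not leave
    // detail rows pointing at deleted parents.
    [Test]
    public void NoOrphanedOrderLines()
    {
        using (var conn = new SqlConnection(ConnStr))
        using (var cmd = new SqlCommand(
            @"SELECT COUNT(*) FROM OrderLines ol
              LEFT JOIN Orders o ON o.Id = ol.OrderId
              WHERE o.Id IS NULL", conn))
        {
            conn.Open();
            Assert.AreEqual(0, (int)cmd.ExecuteScalar());
        }
    }
}
```

A suite of checks like this runs in seconds after each migration, which is exactly the quick validity gate you describe before handing the build to application testing.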
As @Tim Lentine already posted, I would recommend testing the full application. As you commented, the new SQL script your team received makes important changes to the core of your database, both to the structure and to the data itself. So, in order to be sure that everything is still in one piece, I would preferably do a full application test. As for a tool or technique, I can recommend the new Red Gate add-in for SSMS called "SQL Test" (no, I do not work for them). It uses the open-source unit testing framework tSQLt under the hood. Its only drawback is that someone will first need to learn how to work with tSQLt, but it is pretty straightforward.
From the description you gave:
We have a database that is accessed only by our Core application ...
we were provided with a new Version of our Core application and of
our Database ...
tells me it is not your team's responsibility to test the database in isolation; rather, you can test the Core service from your clients' perspective and therefore assume the database is correct.
Treat the Core application and the database as a black box and test using unit tests. These tests should not require you to go poking around in the database: for all intents and purposes, any application using your Core application doesn't know, nor should it care, that the information is actually stored in a database. Your development team could decide in 6 months that they are going to store the data in the cloud, in which case all your database tests would be broken.
If you do have to look in the database to check that data has been stored correctly, then there is a problem with your Core service's interface, as any data you put in should be retrievable via the same interface (I just know someone is going to comment that their app does store data which cannot be read back, but without a more detailed description of your app it's easier to generalise).
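To make that concrete, a black-box test might look like the sketch below (NUnit; CoreClient is a hypothetical WCF proxy generated from your Core service contract, and the operation names are made up). The data is round-tripped through the service alone, never read from the database underneath it:

```csharp
using NUnit.Framework;

[TestFixture]
public class CoreServiceBlackBoxTests
{
    [Test]
    public void SavedTimesheetCanBeReadBack()
    {
        // CoreClient: hypothetical generated WCF client proxy.
        using (var core = new CoreClient())
        {
            int id = core.SaveTimesheet("jsmith", hours: 40);
            var sheet = core.GetTimesheet(id);

            // If this fails after the migration, the Core data changes
            // broke observable behaviour, regardless of what the tables
            // underneath now look like.
            Assert.AreEqual(40, sheet.Hours);
        }
    }
}
```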
Back in the real world I am assuming you are part of the QA team and unless the database developers are doing some testing (they are, aren't they?) you are more than likely going to have to validate the database changes.
To that end, you may be interested to read a question I posted on the DBA Stack Exchange site about performing a data comparison between two different schemas. Spoiler: there's no easy answer.
See the links below:
http://www.simple-talk.com/sql/t-sql-programming/sql-server-unit-testing-with-tsqlt/
http://www.red-gate.com/products/sql-development/sql-test/

WCF User Management Technology Recommendation

I'm writing C# WinForm and asp.net apps that talk to a SQL server through a WCF service. I want to avoid creating my own user management system and would like to use an existing component/technology that can create/delete/manage users and roles. What would you recommend?
Note: I'm aware of Geneva framework but RTM for that is second half of 2009.
The best solution for you would be to use the built-in ASP.NET role management for your web application; how much the built-in role management can handle for you depends on your application's complexity.
Please check these links:
http://msdn.microsoft.com/en-us/library/t32yf0a9.aspx
https://web.archive.org/web/20210513220018/http://aspnet.4guysfromrolla.com/articles/120705-1.aspx
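As a minimal sketch of what the built-in providers give you (the Membership and Roles APIs in System.Web.Security; this assumes the providers are already configured in web.config, and the user, password, and role names are made up):

```csharp
using System.Web.Security;

public static class UserSetup
{
    public static void CreateManagerAccount()
    {
        // Create a role and a user, then link them; the provider
        // handles storage, hashing, and lockout rules for you.
        if (!Roles.RoleExists("Managers"))
            Roles.CreateRole("Managers");

        MembershipCreateStatus status;
        Membership.CreateUser("jsmith", "P@ssw0rd!", "jsmith@example.com",
                              "Favourite colour?", "blue",
                              true, out status);

        if (status == MembershipCreateStatus.Success)
            Roles.AddUserToRole("jsmith", "Managers");

        // Authorization checks then reduce to one call:
        bool canApprove = Roles.IsUserInRole("jsmith", "Managers");
    }
}
```

Because both your ASP.NET app and the WCF service can point at the same membership database, you avoid writing any user management code yourself.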
About Geneva: it's the claims-based access model that MS has been promising for a long time, and we are finally seeing it. I have evaluated the beta and it seems to work fine, but I don't know how well it can be used for production-level code.
If you are looking at multi-platform federated applications, you can go with Geneva; otherwise, the ASP.NET role management should be enough.
Again, it entirely depends on the complexity of the application you are developing. Geneva can handle complex federated applications very well without you having to write much code.

Does Anyone Have Experience Creating an Occasionally-Connected Browser App With NHibernate?

We need to make our enterprise ASP.NET/NHibernate browser-based application able to function when connected to or disconnected from the customer's server. Has anyone done this? If so, how did you do it? (Technology, architecture, etc.)
Background:
We develop and sell an enterprise browser-based application used by construction field personnel to enter timesheet information. Currently, it requires a connection to the server back in the customer's office and we'd like to build an occasionally-connected version of the application for those clients without wireless Internet availability.
Our application is an ASP.NET application using NHibernate for O/R mapping. Being a Microsoft shop, the Microsoft Sync Framework is attractive, but we don't know whether it "plays well" with NHibernate.
Any insight would be greatly appreciated.
Dave T
Maybe you could operate some kind of offline version using a small embedded database (I hear good things about VistaDB - http://www.vistadb.net/ - which I believe does play well with NHibernate), with a syncing tool to copy data in when they are back online. A ClickOnce launcher could handle installation and integration.
You want to be careful with anything involving syncing, though - if it is just single-user timesheets, that might be OK - but if there is any chance of conflicts in the online/offline data, you might be better off considering the problem from a different angle for pain avoidance...
Why not couple it with Google Gears? People put their data in while offline, and then they can sync it when they reconnect to the server.
In a modern world, you could use the HTML5 client-side data store:
http://www.webreference.com/authoring/languages/html/HTML5-Client-Side/