We need to make our enterprise ASP.NET/NHibernate browser-based application able to function when connected to or disconnected from the customer's server. Has anyone done this? If so, how did you do it? (Technology, architecture, etc.)
Background:
We develop and sell an enterprise browser-based application used by construction field personnel to enter timesheet information. Currently, it requires a connection to the server back in the customer's office and we'd like to build an occasionally-connected version of the application for those clients without wireless Internet availability.
Our application is an ASP.NET application using NHibernate for O/R mapping. Being a Microsoft shop, the Microsoft Sync Framework is attractive, but we don't know whether it "plays well" with NHibernate.
Any insight would be greatly appreciated.
Dave T
Maybe you could run an offline version against a small local database (I hear good things about VistaDB - http://www.vistadb.net/ - which I believe does play well with NHibernate), with a syncing tool to copy data in when they are back online. A ClickOnce launcher could handle installation and integration.
You'll want to be careful with anything involving syncing, though. If it is just single-user timesheets, that might be OK, but if there is any chance of conflicts between the online and offline data, you might be better off considering the problem from a different angle for pain avoidance...
Why not couple it with Google Gears? People put their data in while offline, and then they can sync it when they reconnect to the server.
In a modern world, you could use the HTML5 client-side data store:
http://www.webreference.com/authoring/languages/html/HTML5-Client-Side/
I have a requirement to create a tablet application for use in restaurants. It will all be on a private internal network, so security is not an issue. The question is which will cause the least network traffic. I can either connect directly to SQL Server using Entity Framework, or I can connect to web services I create on the SQL server in IIS, with the tablets communicating with those.
I guess to simplify it: does a standard SQL connection transfer more data than is necessary?
It's difficult to give a general rule, as network architecture plays into the answer quite heavily.
As a general guideline, I would suggest building web services or PHP "interfaces" on the server. It gives you an easier and more controllable data flow, and you can handle transactions more easily, since everything goes through one interface and all DB access comes from one machine. It also makes debugging and error handling easier than having every client connect directly to the DB (log at the interface and you see everything that's happening, so you don't have to check logs on the devices), and it gives you more control.
Just a general suggestion, but some kind of web service/interface layer is always worth the investment; sooner or later you will go this direction anyway. A minimal sketch of what that could look like is below.
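Here is a minimal sketch of such an interface as an ASP.NET Web API controller. The OrderDto shape and the endpoint are hypothetical, not from the question:

    using System.Web.Http;

    // Hypothetical DTO: the tablet sends only the fields the server needs.
    public class OrderDto
    {
        public int TableNumber { get; set; }
        public string[] Items { get; set; }
    }

    public class OrdersController : ApiController
    {
        // POST api/orders - every tablet funnels through this one endpoint,
        // so validation, logging and transaction handling live in one place.
        public IHttpActionResult Post(OrderDto order)
        {
            if (order == null || order.Items == null)
                return BadRequest("Invalid order payload");

            // Hand off to whatever data access runs on the server (EF, ADO.NET, ...).
            return Ok();
        }
    }

Only the serialized DTO crosses the wire, rather than TDS traffic and schema details, which also tends to answer the original bandwidth question in favour of the service layer.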
My humble opinion.
Right now, our application has only one Web Site instance, along with a SQL Database, deployed in the Azure US datacenter. We are looking at deploying more Web Site instances in other datacenters such as APAC and Europe, each with its own local SQL Database. We would like end users to be able to fail over to another instance if their registered instance is unavailable; for example, if the US web site instance is down, we could fail users over to the Europe instance. For this, we would need to synchronize the local SQL Databases across all datacenters: US, Europe and APAC.
So we are looking for the best approach to implementing database synchronization for Azure SQL Database. Here is what we have found so far:
Azure Data Sync. It looks like the perfect choice, since it is available right away in the Azure Management Portal and would be up and running with some simple configuration. However, there seem to be a couple of catches. The feature has been in preview for about 2 years now (see this link, with the following quote from a comment):
SQL Data Sync has been in preview for over 2 years and the last update was December 2012. Has this been abandoned? Is this a technology we should encourage our clients to use? There absolutely needs to be an ability to synchronize data between a local SQL DB and Azure but Microsoft seems to have dropped this and I'm leery of putting a client on this only to find that the plug has been pulled. You owe it to your users to give us some information
I also saw the post Azure data sync not syncing all databases on SO; it seems that this is a second-class feature at Azure and MS doesn't really pay sufficient attention to it. So I am worried about how good it is.
Microsoft Sync Framework. This seems to be a more generic sync framework, more suitable for client-server sync than for sync among server databases. Plus, it is not as simple as SQL Data Sync above, which is available just by configuration in Azure.
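For reference, database-to-database sync with the Sync Framework looks roughly like the following sketch. The scope name, table and connection strings are placeholders, and each database must be provisioned once before syncing:

    using System.Data.SqlClient;
    using Microsoft.Synchronization;
    using Microsoft.Synchronization.Data;
    using Microsoft.Synchronization.Data.SqlServer;

    var usConn = new SqlConnection("...US database connection string...");
    var euConn = new SqlConnection("...Europe database connection string...");

    // One-time provisioning: describe the sync scope and apply it to each database.
    var scopeDesc = new DbSyncScopeDescription("UserScope");
    scopeDesc.Tables.Add(SqlSyncDescriptionBuilder.GetDescriptionForTable("Users", usConn));
    new SqlSyncScopeProvisioning(usConn, scopeDesc).Apply();
    new SqlSyncScopeProvisioning(euConn, scopeDesc).Apply();

    // Bidirectional sync between the two databases.
    var orchestrator = new SyncOrchestrator
    {
        LocalProvider = new SqlSyncProvider("UserScope", usConn),
        RemoteProvider = new SqlSyncProvider("UserScope", euConn),
        Direction = SyncDirectionOrder.UploadAndDownload
    };
    SyncOperationStatistics stats = orchestrator.Synchronize();

Note that this is exactly the kind of plumbing (change tracking, conflict handling, scheduling across three regions) you would have to own yourself, which is why the configuration-only SQL Data Sync looked attractive in the first place.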
Any other suggestions on SQL database sync in Azure? It would be really appreciated if you could share your experience here.
Thanks very much in advance for your insight.
Update:
Azure Data Sync is built on the Microsoft Sync Framework: see this link; to quote:
Microsoft SQL Data Sync is a cloud-based data synchronization service built on the Microsoft Sync Framework technologies.
Since no one is answering this question, I am going to do it myself. Based on the latest information, Azure Data Sync is buggy and cannot be used in production at this point. I guess that's why it has never moved out of preview, even after around 2 years. There is no other good approach for handling Azure SQL Database sync at this point unless you want to build something yourself.
You can use Redgate Data Compare to sync your Azure SQL DB with your local DB.
I'm interested to find out what techniques developers are using to connect to a Windows Azure instance running in the cloud.
From what I understand, it is very similar to SQL Server, with two of the key differences being that Multiple Active Result Sets (MARS) are not supported and that idle/long-running connections are automatically terminated by Azure. For the latter, Microsoft suggests incorporating retry logic in your application to detect a closed connection and then attempt to complete the interrupted action. Does anyone have example code that they are currently using for this?
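As a starting point, the retry pattern Microsoft describes can be as simple as the following sketch; the retry count and back-off are illustrative, not official guidance:

    using System;
    using System.Data.SqlClient;
    using System.Threading;

    public static class SqlAzureRetry
    {
        // Wrap the whole interrupted action, not just the connection Open().
        public static T Execute<T>(Func<T> action, int maxRetries = 3)
        {
            for (int attempt = 1; ; attempt++)
            {
                try
                {
                    return action();
                }
                catch (SqlException)
                {
                    // SQL Azure may have killed an idle connection or hit a
                    // transient fault; rethrow once the retries are used up.
                    if (attempt >= maxRetries) throw;
                    Thread.Sleep(TimeSpan.FromSeconds(attempt)); // simple linear back-off
                }
            }
        }
    }

Usage would be something like var rows = SqlAzureRetry.Execute(() => RunQuery()); where RunQuery (a hypothetical method) opens a fresh connection each time it is called, so a retry never reuses the dead connection.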
To build out the data layer, I was looking at various ORMs. Since I'm going to be accessing SQL Azure from Windows Azure (i.e. separate boxes), it would seem key that any ORM support asynchronous methods so as not to block the Windows Azure instances.
Any suggestions as to which ORM to use, or comments on what you are currently using?
I have successfully used NHibernate with Azure, and we are in the process of building a commercial app on top of NHibernate. The only problem I had was with the connection pools when running locally and connecting to SQL Azure in the cloud - which was fixed by turning connection pooling off.
You may find similar problems with other ORMs... SQL Azure is less patient (for obvious reasons) than most people are used to. Connections time out quicker, recycle sooner, and so on.
Test first!
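For reference, turning pooling off is just the standard SqlClient Pooling=false flag in the connection string NHibernate is given. A minimal sketch, where the server, database and credentials are placeholders:

    using NHibernate.Cfg;

    var cfg = new Configuration();
    cfg.SetProperty(Environment.Dialect, "NHibernate.Dialect.MsSql2008Dialect");
    cfg.SetProperty(Environment.ConnectionDriver, "NHibernate.Driver.SqlClientDriver");
    // Pooling=False is the part that worked around the SQL Azure issue.
    cfg.SetProperty(Environment.ConnectionString,
        "Server=tcp:myserver.database.windows.net;Database=mydb;" +
        "User ID=user@myserver;Password=...;Encrypt=True;Pooling=False;");
    var sessionFactory = cfg.BuildSessionFactory();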
Here's one specifically designed for Azure:
"Telerik recently announced the
availability of Open Access, the first
ORM that works seamlessly with SQL
Azure relational databases in the
Windows Azure cloud."
And a few commenters at the Azure User Group recommend LLBLGen and Entity Framework.
I've been using Entity Framework - runs without any problems, just a different connection string.
What you do have to think about is your connection strategy and how efficient your queries are. I've got a method that's easy to write in EF: I've got a new record that could be duplicated, so I check whether it's there and, if not, add it.
EF makes it really easy to do this, as if you're just accessing a local collection. BUT... if you're paying for your DB access because it's in Azure and not on your local network, hmm, maybe there's a better (aka cheaper) way of doing that.
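The check-then-insert pattern described above looks roughly like this; the Customer entity and context are hypothetical stand-ins:

    using System.Data.Entity;
    using System.Linq;

    public class Customer
    {
        public int Id { get; set; }
        public string Email { get; set; }
    }

    public class ShopContext : DbContext
    {
        public DbSet<Customer> Customers { get; set; }
    }

    public static class CustomerStore
    {
        public static void AddIfMissing(Customer candidate)
        {
            using (var db = new ShopContext())
            {
                // Two round trips: one query, one insert. Trivially easy in EF,
                // but in SQL Azure each round trip is something you pay for, so
                // a single server-side upsert (e.g. a T-SQL MERGE) may be cheaper.
                if (!db.Customers.Any(c => c.Email == candidate.Email))
                {
                    db.Customers.Add(candidate);
                    db.SaveChanges();
                }
            }
        }
    }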
According to Ayende, NHibernate "just works" with SQL Azure.
We have been using NHibernate without any customization on Azure (indeed, it just works); you can check Lokad.Translate as an open-source example of such use.
Has anyone got the Sync Framework to work on a mobile device as a sync mechanism in place of RDA or merge replication?
If yes, could you point me to any resources available.
If one were to start a green-field Compact Framework-based application, what would one use as the sync mechanism (Sync Framework/RDA/merge replication/any other...)?
Thanks
Here is an example of SyncFX with SQL Server CE.
Here is a link comparing the technologies. Towards the bottom, in particular, there is a bit about deciding which technology to use.
From a CF green-field standpoint, I would use SyncFX. It seems like Microsoft is moving away from RDA, and SyncFX is programmer-centric instead of DBA-centric (like replication).
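For a flavour of the device side, syncing a local SQL Server CE file against a server database looks roughly like the sketch below. The scope name, file path and connection string are placeholders, both stores must already be provisioned for the scope, and in a real mobile deployment the remote provider usually sits behind a WCF proxy rather than a direct SQL connection:

    using System.Data.SqlClient;
    using System.Data.SqlServerCe;
    using Microsoft.Synchronization;
    using Microsoft.Synchronization.Data.SqlServer;
    using Microsoft.Synchronization.Data.SqlServerCe;

    var orchestrator = new SyncOrchestrator
    {
        // Local .sdf file on the device.
        LocalProvider = new SqlCeSyncProvider("TimesheetScope",
            new SqlCeConnection(@"Data Source=\AppData\local.sdf")),
        // Server database, already provisioned for the same scope.
        RemoteProvider = new SqlSyncProvider("TimesheetScope",
            new SqlConnection("...server connection string...")),
        Direction = SyncDirectionOrder.UploadAndDownload
    };
    orchestrator.Synchronize();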
I am developing a framework for various in-house CRUD apps. I've considered several MS technologies (WPF, Access, WinForms, ASP.NET) and have settled on ASP.NET MVC with HTA+jQuery for the client. My reason for doing so is that I need a way to write and deploy quick one-off GUI apps as well as to maintain more robust apps that are expected to have a long lifetime.
Firstly, I would appreciate some thoughts on the relative merits of using ADODB on the client side versus ADO.NET on the server side. I'm leaning towards ADODB since I'll have client-side access to the SQL Server (I've already written a JS library that handles interacting with ADODB). However, I can see how developing a RESTful service may eventually be useful.
Secondly, I need to incorporate reporting capability into the system. I could use SQL Server Reporting Services or Crystal Reports, but the users have grown accustomed to some older applications that use VBA to write reports in Word, so I'm considering using WordML to write the reports.
Thanks.
Database Access
If you need a thin client, then it's probably better to stay away from directly accessing the database from within the client.
The main issue is that you would introduce a high dependency on a specific network architecture, and both your ASP.NET application and the HTA would be highly dependent on the database.
Instead, I would prefer to sever the dependency on a direct line of sight to the DB and have the data handled by the server.
This has a few advantages:
For many small changes to the DB, you will probably only have to update the ASP.NET app.
If you ever need your client app to be functional over the internet (say because some users go to an outside meeting, need to work from home, or your company opens a new branch), you won't have to rewrite your thin client.
You keep better control over access to the resources: only let the ASP.NET app talk to the database and filter what goes in and out of it.
This saves you having to implement all the security on the client: the ASP.NET app becomes the guardian of the database. It's a much better way to secure information and it gives you a lot more control.
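As a concrete sketch of that "guardian" idea in ASP.NET MVC, where the action shape and the data-access stub are hypothetical:

    using System.Collections.Generic;
    using System.Web.Mvc;

    public class RecordsController : Controller
    {
        [HttpGet]
        public JsonResult ForUser(int userId)
        {
            // All database access happens here on the server; the HTA client
            // only ever receives filtered JSON, never a raw connection.
            var rows = LoadRecords(userId);
            return Json(rows, JsonRequestBehavior.AllowGet);
        }

        private static List<string> LoadRecords(int userId)
        {
            // Placeholder for the ADO.NET/ORM call against SQL Server.
            return new List<string>();
        }
    }

The client-side JS library then talks to this controller instead of to ADODB, so the dependency on a direct line of sight to the database disappears.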
Reporting
For reporting I'd use the server again rather than implement complex reporting capabilities in the client itself.
The problem is that you are always going to be limited on the client if you're using an HTA and don't want to start installing dependencies on each user's machine.
You'll end up building a thick client in no time...
If you're using ASP.NET, there are plenty of really good reporting tools that will make your life much easier and allow your users to get nice reports in Excel, Word, PDF, etc. without you having to code these features yourself.
Crystal Reports is OK, but there are better and simpler alternatives; for example, the Developer Express report engine is pretty easy to use.