At the moment I have two web applications (one is an MVC2 application for the management of my project, and the second is an application with web services). Both applications have to deal with the database, and both use NHibernate for querying it. Is this a good pattern? If not, what can I do?
Edit 1
Both applications can write to the database. I have a DLL project that handles the database transactions and holds the NHibernate instance, named "Repositorio". Nevertheless, each application will have a different instance of Repositorio.dll, so there are going to be multiple threads hitting the database. What do I have to do to make both applications use the same instance of Repositorio.dll?
The answer depends on whether or not both applications can write to the database.
If one is read-only, I'd say you're safe.
If not, I'd argue that a service-oriented approach would recommend creating a service that provides an interface for both applications and is the sole owner of the database.
"service-oriented" does not mean that the service has to be a distributed component (e.g., SOAP or REST or RPC). If you encapsulate the database access in a component with a well-defined interface you can choose to share the component as a DLL in both applications. Only make it a distributed component if that makes sense for both applications.
That sounds perfectly fine to me, even if both applications write to the database. I would simply recommend you create a third project as a class library with all your NHibernate-related stuff, to avoid writing redundant code in both projects.
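To make that concrete, here is a minimal sketch of what that shared class library might look like; the Product entity and repository are hypothetical stand-ins, and the point is simply that both applications reference the same assembly and go through one thread-safe ISessionFactory per process:

    // Repositorio.dll - referenced by both web applications.
    // NHibernate's ISessionFactory is expensive to build and thread-safe,
    // so each process should build exactly one and reuse it.
    using System;
    using NHibernate;
    using NHibernate.Cfg;

    public static class SessionFactoryProvider
    {
        // Lazy<T> gives thread-safe, one-time initialization per process.
        private static readonly Lazy<ISessionFactory> Factory =
            new Lazy<ISessionFactory>(() =>
                new Configuration()
                    .Configure()            // reads hibernate.cfg.xml
                    .BuildSessionFactory());

        public static ISessionFactory Instance
        {
            get { return Factory.Value; }
        }
    }

    // Hypothetical entity; members are virtual for NHibernate proxies.
    public class Product
    {
        public virtual int Id { get; set; }
        public virtual string Name { get; set; }
    }

    // The narrow, well-defined interface both applications program against.
    public interface IProductRepository
    {
        Product GetById(int id);
        void Save(Product product);
    }

    public class ProductRepository : IProductRepository
    {
        public Product GetById(int id)
        {
            using (var session = SessionFactoryProvider.Instance.OpenSession())
                return session.Get<Product>(id);
        }

        public void Save(Product product)
        {
            using (var session = SessionFactoryProvider.Instance.OpenSession())
            using (var tx = session.BeginTransaction())
            {
                session.SaveOrUpdate(product);
                tx.Commit();
            }
        }
    }

Note that "the same instance" can only mean the same assembly and the same pattern, not literally one shared object: the two applications are separate processes, and it is the database itself (through transactions) that serializes their concurrent writes. If you truly need a single gatekeeper process, that is exactly the service-oriented option described above.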
Currently I have a monolithic application with some modules, like financial and accounting. This application uses a single database and the modules are divided into schemas, so when I need to display the data in the user interface or in a report, I just need to do a simple query with a couple of joins.
My question is: in a microservices structure where each module has its own database, how do I retrieve this data and get the same result as if I were querying a single database?
When talking about splitting the database in the process of migrating a monolith to microservices, there are some well-known patterns:
The shared database
Database view
Database wrapping service
Database as a service
It seems the database view or the database-as-a-service pattern could be a candidate in this case, but of course nobody is better placed than you to decide which one is viable in your project.
I highly recommend you have a look at chapter 4 of "Monolith to Microservices" by Sam Newman.
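As a rough illustration of the wrapping-service direction (not a prescription): one service becomes the only thing that touches, say, the accounting schema, and everything else calls it over HTTP instead of joining across schemas. A minimal sketch using ASP.NET Core minimal APIs; the route and record are hypothetical:

    // AccountingService - the only component allowed to touch the
    // accounting schema; other modules compose this data over HTTP.
    var builder = WebApplication.CreateBuilder(args);
    var app = builder.Build();

    // Hypothetical read endpoint replacing what used to be a cross-schema join.
    app.MapGet("/accounts/{customerId}/balance", (int customerId) =>
    {
        // A real service would query the accounting database here.
        return Results.Ok(new AccountBalance(customerId, 1250.75m));
    });

    app.Run();

    public record AccountBalance(int CustomerId, decimal Balance);

The report that used to be a single join then becomes API composition: call each service, join the results in memory, and if that gets too slow, keep a read-optimized copy of the data that the services update.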
I'm required to create a small piece of software for a company to demonstrate how I code. I'm using a .NET Core MVC web app, and I believe it requires me to use a database; however, I need to upload my code to GitHub for them to inspect and run, and it obviously wouldn't be able to read the database from my machine. What are the alternatives? Can a fake DB be created within the project, for instance? Or is there something else I could do that doesn't involve Azure?
I tried scaffolding a DbContext from a controller, but it requires a connection to a database.
Have you considered mocking your data connection? It is the same thing you would do if you were unit testing your application. You would not want to connect directly to your database; instead, you would create a mock connection and return the data yourself.
You have multiple choices here. You can use a mocking framework like Moq, FakeItEasy, JustMock, or NSubstitute; otherwise, you can roll your own.
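For example, with Moq the repository interface behind your controller can be faked entirely in memory, so the project runs straight from GitHub with no database at all. The IProductRepository interface and Product type below are hypothetical stand-ins for whatever your controller actually depends on:

    using System.Collections.Generic;
    using Moq;

    public class Product
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public interface IProductRepository
    {
        IEnumerable<Product> GetAll();
    }

    public static class FakeData
    {
        public static IProductRepository CreateRepository()
        {
            // Moq generates an implementation of the interface at runtime
            // and returns the canned data below instead of hitting a database.
            var mock = new Mock<IProductRepository>();
            mock.Setup(r => r.GetAll()).Returns(new[]
            {
                new Product { Id = 1, Name = "Sample A" },
                new Product { Id = 2, Name = "Sample B" },
            });
            return mock.Object;
        }
    }

Register CreateRepository() with the dependency injection container (or skip the mocking library entirely and hand-write a trivial in-memory implementation of the interface); either way, the reviewers never need your connection string.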
What I mean by the title is: we have a system with different submodules, each with its own (MVC web) application. I thought about creating a REST service that accesses the database and serves data to the applications, so that no application accesses the database directly. The API exposes all the methods that access the database, and each application chooses which ones to use. Basically, the web applications' models aren't themselves mapped to any database entities, as is commonly done in MVC applications (like in ASP.NET with Entity Framework).
The reason I thought about this idea in the first place is that I couldn't figure out how to map models to database tables without having to map all of the tables and their attributes (switching some off for some applications; we're using Phalcon) and ending up with hundreds of unused models in each application. How bad an idea is creating a REST API for this?
If each application accesses the same database directly, you will have to maintain a lot of boilerplate model code (SQL/ORM). If the database changes, you'll have to propagate those changes to every application.
In terms of maintenance, it is better to expose business operations through a web service, which becomes the only point of contact with the database.
With a web service in front, changes inside the database are not visible to the applications.
On the other hand, without a web service in front, each change to the database requires a change in each application.
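Concretely, each application then carries only the view models it actually uses and fills them from the service rather than mapping every table. A small sketch of the consuming side in C# (the endpoint and model are hypothetical, and the client's BaseAddress is assumed to point at the service; the same shape works just as well in a Phalcon/PHP app with curl or Guzzle):

    using System;
    using System.Net.Http;
    using System.Net.Http.Json;
    using System.Threading.Tasks;

    // The application's model: shaped for the screen, not for the database.
    public record InvoiceSummary(int Id, string Customer, decimal Total);

    public class InvoiceClient
    {
        private readonly HttpClient _http;

        public InvoiceClient(HttpClient http) => _http = http;

        // The web service is the only component that knows the schema;
        // a schema change means updating the service, not this client.
        public async Task<InvoiceSummary[]> GetOpenInvoicesAsync() =>
            await _http.GetFromJsonAsync<InvoiceSummary[]>("/api/invoices?status=open")
                ?? Array.Empty<InvoiceSummary>();
    }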
We have several sites for several different clients, each with several different databases.
Some of the databases are at the client's location; some are on our site.
I have been tasked with creating a few SharePoint sites that will display information from the databases.
Is it okay to call stored procedures from my SharePoint sites? Since the database is not for the SharePoint site, I feel like that site should not have direct access to the DB and should get the data through web services. Certainly, this would be the case if the data were exposed to another company, but since we are responsible for all of it, is that okay?
In my opinion, you save yourself a lot of trouble by just going directly to the database, since you control both ends. Direct access to the DB will also perform better than putting web services between the two systems.
If the other system weren't yours, I'd definitely hope that it had a web services (or RESTful web services) interface. My reasoning here is that in most software, web services are really meant for integrations, and thus changes to them are kept to a minimum. Database schema changes are fairly typical during the lifetime of a software product, so it's generally not easy to evolve the schema if other people have built integrations directly against the DB.
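For context, the direct route really is simple, which is a large part of its appeal; a minimal ADO.NET sketch (the connection string, procedure, and column are hypothetical):

    using System.Collections.Generic;
    using System.Data;
    using System.Data.SqlClient;

    public static class ClientSalesData
    {
        // Calls a stored procedure directly - simple and fast, but every
        // schema or procedure change now ripples into this SharePoint site.
        public static List<string> GetRecentOrders(string connectionString, int clientId)
        {
            var orders = new List<string>();
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand("dbo.GetRecentOrders", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.Parameters.AddWithValue("@ClientId", clientId);
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                        orders.Add(reader.GetString(0)); // e.g. order number
                }
            }
            return orders;
        }
    }

The trade-off discussed above is exactly this coupling: the code is trivial, but it binds the site to the schema and to the procedure's signature.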
Querying SharePoint's own databases directly is not a supported scenario, so you shouldn't ever need to do that.
Best practice is to use the existing Web Services, or implement your own custom Web Service.
I am developing a framework for various in-house CRUD apps. I've considered several MS technologies (WPF, Access, WinForms, ASP.NET) and have settled on ASP.NET MVC with HTA + jQuery for the client. My reason for doing so is that I need a way to write and deploy quick one-off GUI apps as well as to maintain more robust apps that are expected to have a long lifetime.
Firstly, I would appreciate some thoughts on the relative merits of using ADODB on the client side versus ADO.NET on the server side. I'm leaning towards ADODB since I'll have client-side access to the SQL Server (I've already written a JS library that handles interacting with ADODB). However, I can see how developing a RESTful service may eventually be useful.
Secondly, I need to incorporate reporting capability into the system. I could use SQL Server Reporting Services or Crystal Reports, but the users have grown accustomed to some older applications that use VBA to write reports in Word, so I'm considering using WordML to write the reports.
Thanks.
Database Access
If you need a thin client, then it's probably better to stay away from directly accessing the database from within the client.
The main issue is that you will introduce a strong dependency on a specific network architecture, and both your ASP.NET application and the HTA will be highly dependent on the database.
Instead, I would sever the dependency on a direct line of sight to the DB and have the data handled by the server.
This has a few advantages:
For many small changes to the DB, you're probably only going to have to update the ASP.NET app.
If you ever need your client app to be functional over the internet (say, because some users go to an outside meeting, need to work from home, or your company opens a new branch), you won't have to rewrite your thin client.
You keep better control over access to the resources: only the ASP.NET app talks to the database, and it filters what comes in and out.
This saves you from having to implement all the security on the client: the ASP.NET app becomes the guardian of the database. It's a much better way to secure information, and it gives you a lot more control.
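In practice, that means the HTA only ever speaks HTTP/JSON to the ASP.NET MVC app; a minimal sketch of the server side (the controller, action, and data are hypothetical):

    using System.Web.Mvc;

    public class CustomersController : Controller
    {
        // The server owns all database access; the HTA client just calls
        // this action with jQuery and renders the result.
        public ActionResult Recent()
        {
            var customers = new[]   // stand-in for a real repository call
            {
                new { Id = 1, Name = "Contoso" },
                new { Id = 2, Name = "Fabrikam" },
            };
            return Json(customers, JsonRequestBehavior.AllowGet);
        }
    }

On the client, $.getJSON('/Customers/Recent', function (data) { /* render */ }); is all the HTA needs; it never touches ADODB or a connection string.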
Reporting
For reporting I'd use the server again rather than implement complex reporting capabilities in the client itself.
The problem is that you're always going to be limited on the client if you're using an HTA and don't want to start installing dependencies on each user's machine.
You'll end up building a thick client in no time...
If you're using ASP.NET, there are plenty of really good reporting tools that will make your life much easier and allow your users to get nice reports in Excel, Word, PDF, etc. without you having to code these features yourself.
Crystal Reports is okay, but there are better and simpler alternatives; for example, the Developer Express (DevExpress) report engine is pretty easy to use.
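And if your users really do insist on Word output, note that WordprocessingML is just XML, so the server can emit it with plain string building; a bare-bones sketch (a real report would need proper XML escaping and styling):

    using System.IO;
    using System.Text;

    public static class WordReport
    {
        // Emits a minimal WordprocessingML (Word 2003 XML) document;
        // Word opens files in this namespace as native documents.
        public static void Write(string path, string title, string body)
        {
            var xml = new StringBuilder();
            xml.AppendLine("<?xml version=\"1.0\" encoding=\"utf-8\"?>");
            xml.AppendLine("<?mso-application progid=\"Word.Document\"?>");
            xml.AppendLine("<w:wordDocument xmlns:w=\"http://schemas.microsoft.com/office/word/2003/wordml\">");
            xml.AppendLine("  <w:body>");
            xml.AppendLine("    <w:p><w:r><w:t>" + title + "</w:t></w:r></w:p>");
            xml.AppendLine("    <w:p><w:r><w:t>" + body + "</w:t></w:r></w:p>");
            xml.AppendLine("  </w:body>");
            xml.AppendLine("</w:wordDocument>");
            File.WriteAllText(path, xml.ToString(), Encoding.UTF8);
        }
    }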