How to organize common schemas referenced by multiple BizTalk applications in the same group?

I have a situation where I need to reference a schema in two different BizTalk applications.
I could either:
Put the schema in one application and make the other applications have a reference to it.
OR
Put the schema in a common application and have all the other applications reference this common application.
However, in both of the above cases, if there are any changes to the schema I end up having to rebuild/redeploy all the applications that have a reference to the schema. Is there a better way of organizing the applications?

From the Microsoft BizTalk Operations Guide:
Deploy shared artifacts in a separate application - If artifacts are going to be shared by two or more applications, deploy the shared artifacts into a separate application. For example, if two applications share a schema, place the schema in a separate application. We recommend this because only one artifact in a BizTalk group can have a single locally unique identifier (LUID). A LUID consists of the artifact name and optionally other attributes. If you include an artifact in one application, and then create a reference to it from another application, the referring application may not function correctly when you stop the application containing the artifact.
This best practice applies to all artifact types except for files, such as Readme files and scripts, which are added to the application as a File type of artifact. This is because more than one file artifact with the same name can be deployed in a BizTalk group. Therefore, you can use a file having the same name in two or more applications. In this case, stopping one application will not impact the other application. For more information about adding file artifacts, see "How to Add a File to an Application" in BizTalk Server 2006 R2 Help at http://go.microsoft.com/fwlink/?LinkId=106818.
The schemas going into the shared app really shouldn't change, as they're shared and a change to them is a major event. If you are adding schemas or modifying existing maps, there is no need to rebuild the old apps. You may have to recycle dependent host instances to get them to refresh their in-memory copies of the DLL. Otherwise it should be 95% hassle-free.

We put our common schemas and functionality into a separate application.

I am not sure I'm understanding your question, so correct me if I am off base.
You should organize your schemas as their own assembly that can be independently deployed to all the applications that need it. They can then be referenced by other projects during development. After deployment, if changes are made to the schemas, the schema assembly just has to be updated on the server; the references to the schemas from the applications will be maintained.
Hope this helps.

Related

TFS Migration - Database Project Circular Dependency

I am trying to create a solution for a web application (that also contains the database as a database project) and then deploy it from TFS using web deploy for the application and DACPAC for the SQL database.
Unfortunately the database is referencing another database using 3-part names:
Select * From Database1.dbo.Table1
This forces me to import the referenced database as a project in the solution for the application that references it and set it as a reference in the other project, as seen in the picture below:
The problem is that Database1 is referencing Database2, but Database2 is also referencing Database1.
However when I try to do this I get the following error:
I have searched online for a solution and found two:
1) Using composite projects to create a third project that contains the references between the two databases, and then making this project reference the other two.
See this link: Composite projects solution
2) Replacing all the 3-part name queries with dynamic SQL, such as this:
EXEC('Select * From Database1.dbo.Table1')
Neither of these solutions is good for me, as I don't just have two databases referencing each other, but many databases referencing a central database that references them back, as seen in the schema below:
The first solution would require me to import all the databases into the solution of each application (as they are linked to each other via the Central Database). Also, there would be the circular reference error for each pair of projects (Database, Central Database).
The second solution would work, as the queries would be seen as strings and would not require me to reference the Central Database in the solution; however, I do not like the idea of having so many dynamic queries. Also, it would be way too much work to replace all the queries with dynamic SQL in each application database.
I would like to know if there are any other solutions beside the two I have mentioned.
The right way to solve the circular reference problem is to use composite projects.
In general, the "trick" is to isolate all the shared objects (and the ones referenced by them; e.g. a shared view and all the tables/functions used in its definition) in a composite project for each database.
In this way each database will be defined by a pair of database projects: one containing the objects used only inside it (base) and one containing all the objects to be shared with the other databases (shared).
Then you link each base database project to the shared projects whose objects are needed in its definitions.
One picture is worth a thousand words:
The dashed lines represent the "Same database" references (composite project). The solid ones are "regular" references.
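As a rough sketch of what that split looks like in SQL (the project and object names below are invented for illustration), the shared composite project holds the objects the other databases need to see, and the base projects consume them either through a "Same database" reference or through a regular reference with a SQLCMD variable:

-- Database1.Shared (composite project): objects exposed to other databases.
CREATE TABLE dbo.Table1
(
    Id   INT          NOT NULL PRIMARY KEY,
    Name NVARCHAR(50) NOT NULL
);

-- Database1.Base: internal objects only. It has a "Same database" reference
-- to Database1.Shared, so shared objects are used with no database prefix.
CREATE VIEW dbo.InternalReport
AS
    SELECT Id, Name
    FROM dbo.Table1;

-- Database2.Base: has a regular reference to Database1.Shared, so the 3-part
-- name goes through a SQLCMD variable instead of a hard-coded database name.
CREATE PROCEDURE dbo.ReadFromDatabase1
AS
    SELECT Id, Name
    FROM [$(Database1)].dbo.Table1;

Since the shared projects reference nothing themselves, all project-to-project references point inwards at them and the cycle disappears.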
I've updated my blog post with a generic case:
SSDT: How to Solve the Circular References Issue
You can also create a dacpac out of the existing database and add that dacpac as the database reference. We did that using a "Schema" folder to store all of the dacpacs and updated/referenced those as needed.
http://schottsql.blogspot.com/2012/10/ssdt-external-database-references.html
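A minimal sketch of how that looks from the referencing project, assuming Database1.dacpac was extracted from the existing database and added as a "Different database, same server" reference with a SQLCMD variable named Database1 (the default name; yours may differ):

-- In the referencing project, the query from the question keeps its 3-part name,
-- but the database part becomes the reference's SQLCMD variable and is
-- validated against the dacpac at build time.
Select * From [$(Database1)].dbo.Table1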
You can create another project that references Database1 and Database2 and handles the calls between the two projects, and let this project communicate with the web application.

Should I use REST as a 'communication medium' between modules of a larger system?

What I mean by the title is: we have a system with different submodules, each with their own (MVC web) application. I thought about creating a REST service that accesses the database and gives data to the applications, so that no application can access the database directly. The API exposes all the methods that access the database, and each application chooses which ones to use, etc. Basically, the web applications' models aren't themselves mapped to any database entities, as is commonly done in MVC applications (like in ASP.NET with Entity Framework).
The reason I thought about this idea in the first place is that I couldn't figure out how to map models to database tables without having to map all of the tables and their attributes (switching some off for some applications; we're using Phalcon) and ending up with hundreds of unused models in each application. How bad of an idea is creating a REST API for this?
If each application accesses the same database directly, you will have to maintain a lot of boilerplate model code (SQL/ORM) in each of them. If the database changes, you'll have to propagate those changes to every application.
In terms of maintenance, it is better to expose business operations through a web service that is the only point of contact with the database.
With a web service in front, changes inside the database are not visible to the applications.
On the other hand, without a web service in front, each change to the database requires a change in each application.

How to create a database from individual scripts?

A project I am working on stores each database object (tables, stored procedures, etc.) in its own file in source control, TFS. I am thinking about implementing a workflow, tied to TFS commits, that will build the database in a Windows Azure SQL Server VM instance and run tests for continuous integration.
How does one reconstruct the database from these individual files? Since there are dependencies to consider, among other things, is there a standard practice for constructing a database with the needed structure when the objects are stored in individual files?
I am thinking that file by file might not actually be a realistic way to do this. If that is the case, do some companies keep an empty database in the testing environment to be filled with data for CI purposes, and not drop the database during test teardown?
Sounds like your team had a SQL Server Database Project at some point. If it's not there, you can create one, and include the individual script files in the appropriate folder in the SQL Server Database Project.
Then all you have to do is right-click and deploy to whichever environment you want to deploy the database to.
Here's more: CREATING A SQL SERVER DATABASE PROJECT IN VISUAL STUDIO 2012.
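If you decide not to go the database project route, one rough file-by-file alternative (the file names and folder layout here are invented for illustration) is a single SQLCMD script that pulls in the individual object scripts in dependency order:

-- build.sql: run with "sqlcmd -S <server> -i build.sql", or in SSMS with SQLCMD mode on.
:setvar DatabaseName CiDatabase

CREATE DATABASE [$(DatabaseName)];
GO
USE [$(DatabaseName)];
GO

-- Include scripts in dependency order: tables first, then the objects that use them.
:r .\Tables\Customers.sql
:r .\Tables\Orders.sql
:r .\StoredProcedures\GetOrdersForCustomer.sql

The database project is still the easier option for CI, because publishing the dacpac works out the dependency order for you instead of you maintaining it by hand.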

Best way of migrating customised metadata associated with source components into a Tridion environment

If we are migrating content from a source Content Management System to Tridion, what is the best way of migrating the customized metadata associated with the components (content) of the source Content Management System into Tridion? Should we migrate it directly into SQL Server, or is there an option to migrate it in the form of some XML file, etc.?
Migrating directly into SQL Server is unsupported, and the entire system would be unsupported at that point, due to possible data consistency issues.
The most straightforward way is to read the data from the source system, and use the Tridion API to recreate the item.
If migrating metadata, some of the data would likely fit best into a taxonomy, which would mean you'd want to migrate the keywords / structure first, then tag the content as it came into Tridion.
You have a few options when migrating content into Tridion.
I can't tell from the above whether you are talking about migrating to SQL Server as an intermediate format, or directly into the Tridion database. Importing directly into the Tridion database is definitely not a supported solution, and could lead to unpredictable results.
You need to use the API: either the Core Service or the TOM.NET API (if you have Tridion 2011), or the old TOM API if not.
A popular approach is to export all content into an XML format that you can then process with a .NET application.
There are some good articles on migrating content into Tridion by Ryan Durkin here, and Nuno Linhares here.
As mentioned before, migrating directly into the database is not an option if you are planning to use SDL Tridion as the final CMS.
Apart from the supported mechanism chosen for the migration, pay attention to how you are going to structure the metadata in the new CMS; depending on the volume, structure, hierarchy, and relations across metadata items, the process can become complex.
Also pay special attention to the BluePrint concept, as you can probably merge duplicated values from the old system into a single value that is inherited.
Don't think only about how to put the metadata into the system, but also about how that metadata will be used and maintained in the new CMS, in this case SDL Tridion.
You can also check a recent post about migration, and planning a migration in general, in case it adds some more information:
Can we automate migrating to SDL Tridion?

Querying a database from different applications with NHibernate

At the moment, I have two web applications (one is an MVC2 application for the management of my project and the second is an application with web services). Both applications have to deal with the database and use NHibernate for querying it. Is this a good pattern? If not, what can I do?
Edit 1
Both applications can write to the database. I have a DLL project that handles the database transactions and holds the NHibernate instance, named "Repositorio". Nevertheless, each application will have a different instance of Repositorio.dll, so there are going to be multiple threads hitting the database. What do I have to do to make both applications use the same instance of Repositorio.dll?
The answer depends on whether or not both applications can write to the database.
If one is read-only, I'd say you're safe.
If not, I'd argue that a service-oriented approach would recommend creating a service that provides an interface for both applications and is the sole owner of the database.
"service-oriented" does not mean that the service has to be a distributed component (e.g., SOAP or REST or RPC). If you encapsulate the database access in a component with a well-defined interface you can choose to share the component as a DLL in both applications. Only make it a distributed component if that makes sense for both applications.
That sounds perfectly fine to me, even if both applications write to the database. I would simply recommend you create a third project as a class library with all your NHibernate-related stuff, to avoid writing redundant code in both projects.