We are a team of two developers building an ASP.NET MVC app in Azure, and we are wondering about the best way to set up the database.
We had been using a local database in the App_Data folder, attached to SQL Express. This seemed to be working fine until it came time to check in and conflicts occurred.
We are using Git to check into Bitbucket, and a deployment runs off the master branch into Azure, deploying the database as well.
We are using Code First migrations, and all the data is seeded.
Can anyone help please?
Moving the discussion to an answer.
Regarding migrations:
I work in a team of 2 developers too. What works for us is that at the early stages of development (when the model changes a lot), each of us runs their own database locally (data seeded in the initializer). Once the project is relatively stable, we deploy to a website with a SQL Azure database. Whenever the model changes, we add a migration and run it against that database. Our team, like yours, is small, so this works for us. If the team grows, I recommend setting up a CI server (TeamCity).
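For reference, a local-development initializer along those lines might look something like the sketch below; the context, entity, and initializer names are placeholders of mine, not the actual project types:

using System.Data.Entity;

// Placeholder context and entity for illustration only.
public class AppDbContext : DbContext
{
    public DbSet<Product> Products { get; set; }
}

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Drops and recreates the local database whenever the model changes,
// then reseeds it, which is convenient while the model is still volatile.
public class DevDatabaseInitializer : DropCreateDatabaseIfModelChanges<AppDbContext>
{
    protected override void Seed(AppDbContext context)
    {
        context.Products.Add(new Product { Name = "Sample product" });
        context.SaveChanges();
    }
}

// Registered once at application start, e.g. in Global.asax:
// Database.SetInitializer(new DevDatabaseInitializer());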
Regarding database connections:
We don't use the .mdf file in App_Data. Each of us has a local instance of SQL Server Express running on their machine. For the connection strings, we have 4 environments set up (Local, Development, Staging, Production). These are set in the web.config (you can set them up in code too; it's your choice). When we run the application, we choose the environment we want to develop against. We deploy to an Azure Website using VS2013, and each web.config is configured accordingly per environment.
We read the environment from the web.config and, depending on which environment we're in, the corresponding connection string is injected into the application using an IoC container (Ninject).
<configuration>
  <appSettings>
    <add key="environment" value="Development"/>
  </appSettings>
</configuration>
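As a rough sketch of the Ninject side (the module name, the binding name, and the assumption that the connection strings are named after the environments are mine, not necessarily the original setup):

using System.Configuration;
using Ninject.Modules;

// Illustrative sketch: reads the "environment" appSetting shown above and
// binds the matching connection string so it can be injected where needed.
public class DataModule : NinjectModule
{
    public override void Load()
    {
        string environment = ConfigurationManager.AppSettings["environment"];

        // Assumes <connectionStrings> entries named "Local", "Development",
        // "Staging" and "Production" exist in the web.config.
        string connectionString =
            ConfigurationManager.ConnectionStrings[environment].ConnectionString;

        Bind<string>().ToConstant(connectionString).Named("ConnectionString");
    }
}

// A repository or DbContext can then receive the value via
// [Named("ConnectionString")] constructor injection.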
Hope this helps,
Related
Using Entity Framework Code First, I created my web application on my dev machine, used migrations, and published my beta application to my production server and database.
Then on my dev system I've made lots of changes, created several migrations, and applied them to my local dev database. Running update-database updates my local dev database, but how do I then apply the migrations to my production server database?
I've been using update-database -script to get the SQL to manually apply to my production server. Is there a better way?
You should ideally employ some kind of actual database deployment system like ReadyRoll. Short of that, you should generate SQL scripts that you can commit and deploy manually, preferably via a DBA role in your organization. Code-based migrations can do all sorts of potentially bad things to your database with no notice, but in a SQL script, you can easily see that a table is about to be dropped or a column with lots of irreplaceable data is about to be removed.
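For example, in EF6 the Package Manager Console can emit a single script covering all pending migrations, and EF Core has an equivalent CLI command (the output path below is just an example):

# EF6, Package Manager Console: script every migration from the initial state
Update-Database -Script -SourceMigration: $InitialDatabase

# EF Core CLI: an idempotent script that is safe to run against any schema version
dotnet ef migrations script --idempotent --output migrate.sql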
Your connection string in Web.config is what establishes which database the application is pointing to. When you point it at production and run the same EF commands (dotnet ef migrations add migrationName and dotnet ef database update), it should update your production environment.
For my setup I just don't deploy my web.config, so in production it always points to the production database. When I run the EF update scripts in production, it updates production and I'm good to go.
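Put differently, whichever connection string entry ships in the deployed web.config is the database EF will update; something like the following (the server and database names here are invented for illustration):

<connectionStrings>
  <!-- Illustrative entry only: in production this points at the production server -->
  <add name="DefaultConnection"
       connectionString="Server=prod-sql;Database=MyAppDb;Integrated Security=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>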
We are developing a small application that needs a local database installed on each user's computer, which will then sync up to the main database via web services, etc.
Anyway, when we deploy the application to the users' computers we want to use ClickOnce deployment. I have used ClickOnce before, but never with a SQL Server database attached. I know you can go to Prerequisites in the ClickOnce properties and select SQL Server Express.
The question is: once you have created your .mdf database file, including stored procedures and all, how do you get it attached and set up automatically in the local SQL Server Express instance that has just been installed through ClickOnce?
Also, once this is finished, we may in the future want to push database updates to the clients' machines. We would like to use ClickOnce to publish these database updates as well. Obviously we don't want to overwrite the database; we just want to publish the latest updates depending on whether they already have the database and, if so, which version.
How could this be achieved using ClickOnce? Thanks.
I am new to Visual Studio 2012 and MVC4, and have a development project for a website using C#, MVC4, and SQL Server 2012.
Publish using Web Deploy works for the website portion of the project, but it does not automatically update the database portion (schema). If I right-click the database portion of the project in VS2012 and click Publish, the database schema is updated properly. I am only interested in schema updates. What could be wrong?
I set up my development system to use Web Deploy 3.0. Here is a summary of my configuration:
Computer running win8 x64
SQL Express 2012 as my database server, running as the default instance (i.e. at localhost, NOT as .\SQLEXPRESS)
IIS Express 8 as my webserver, using the "Default Web Site" site (localhost)
Visual Studio 2012 Pro using MVC4 and C#
Web Deploy 3.0
The latest updates are all applied to the software
I created a SQL Server user, WDeployAdmin, for managing the database updates and gave it full permissions on the database used by the website. I also tried using Integrated Security (my administrator login), but that does not help.
I can correctly update the Default Web Site (views, controllers, etc.) using the Publish feature in VS2012, which uses Web Deploy 3.0.
When I use the Test Connection feature of the publishing setup options, it correctly connects to the database, and that certainly works fine when I do a separate Publish DB operation (right-click DB project, click Publish, and then pick my profile) for that part of my project.
So why doesn't the standard website Publish feature include the database schema updates? The standard website Publish always shows an empty database change, even when the schema has changed (e.g. a table or a stored procedure).
I have read through a lot of the MS docs, but nothing is apparent to me.
Any help is appreciated,
Bruce
Does anyone have any advice or techniques for deploying SSIS packages to the Integration Services database?
Basically I maintain a number of SSIS packages that need to be deployed to several environments (dev, test, and production), and the individual database connections need to be changed as well.
I would like to automate the process of deploying them to these environments, so it can be included in a full application deployment that can be done by the server admins.
I came up with a method for configuring packages for different environments using a single SQL Server configuration table (assuming all environments can connect to the configuration server).
http://www.sqlservercentral.com/articles/SSIS/66426/
If we have 3 developers working on the same BizTalk project, what is the best way to set up our development environment?
We are using TFS to store the BizTalk project.
Should we use one SQL Server and one BizTalk server, and then have one or more developer machines that access the SQL and BizTalk servers? The issue we get with this is that when one developer compiles and deploys their changes, it can affect other developers who are also trying to compile and deploy their work.
Should we have each developer host their own complete SQL and BizTalk server for local development, either on their machine or within their own virtual machine? The problem we find with this is that each developer could modify their server settings, and those settings are not stored in source control. This can cause confusion when changes are deployed to a testing server. Another, smaller issue is that each developer would need to have SQL Server, BizTalk Server and Windows Server installed.
Is there another way to set up a multiple-developer BizTalk development environment?
You will always want each developer to have a complete BizTalk installation on their own machine. Believe me, it doesn't work otherwise, as you'll just keep getting in each other's way while trying to deploy/test/debug changes.
That said, you will also want a centralized dev/test environment where you deploy your code for more complete integrated testing and making sure all the changes from everyone are seen together.
Your point about configuration is true, but only up to a point. This is because you should make your solution configuration part of your source code and keep it in source control as well. This is particularly important once you're a bit ahead in your development as you'll need to start maintaining multiple versions of your binding files for each environment (dev, test, production and so on).
tomasr is right. Also, if you have decent hardware and lots of RAM, you may want to set up a VM image of your full developer environment and then share this with all your team. It's not as fast as native hardware, but it does allow you to roll back changes or replace your VM if you really mess up, and everyone then has the same environment, ideally close to the target one.
Setting up a continuous build server is also a must. If your projects are small, you can get each check-in to cause a full build, BizTalk deploy, export of the MSI, and then run the tests. Later, as your solutions get more numerous, you might have to move to a continuous build of C# changes only, and then, say nightly or several times a day, do a full build and deploy. We have done this with CruiseControl.NET, NAnt, NUnit and various PowerShell scripts; it was pretty time-consuming, but each morning we come to work to find a fully compiled, deployed, exported and tested set of BizTalk solutions ready for the test team.