Execute Package Task on an external IS server - sql

We have recently started moving our development environment to a second data center. We have an Integration Services server and a SQL Server at both locations. We stage data from customers at our main data center and load it through SSIS to a production warehouse and to a development warehouse at our secondary site. The problem I'm having is that child packages stored on the development IS server still use the main IS server's resources when they're launched from a parent package on the main server. This isn't ideal because of the network latency caused by the physical distance between the servers (it's our DR site), and it forces our development load times to 2x-4x what they are to our production site.
I launch it as a child package instead of using dtexec because I pass variables to the child for connection and logging purposes.
I'm pretty sure I'll have to start with the ExecuteOutOfProcess property, but I haven't been able to find any hints beyond that. Has anyone found a solution for this type of execution short of a lot of custom coding?
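For what it's worth, here is a minimal sketch of one workaround along those lines: instead of chaining through the Execute Package Task, start the child directly on the development IS server so it runs under that box's resources, and pass the variables on the dtexec command line with /Set. This assumes PowerShell remoting is enabled between the servers; the server name, package path and variable names below are hypothetical.

```powershell
# Run dtexec ON the development IS server (via PowerShell remoting),
# so the child package executes with that server's resources.
# Server name, package path and variable names are made up.
$vars = @{ CustomerId = '42'; LogPath = '\\devsql\logs\load.log' }

Invoke-Command -ComputerName 'DEV-IS01' -ArgumentList $vars -ScriptBlock {
    param($vars)
    # Build one /Set pair per variable the parent used to pass down.
    $setArgs = $vars.GetEnumerator() | ForEach-Object {
        '/Set'
        "\Package.Variables[User::$($_.Key)].Value;$($_.Value)"
    }
    & dtexec /File 'D:\Packages\LoadDevWarehouse.dtsx' @setArgs
}
```

A SQL Server Agent job on the remote server (started with msdb.dbo.sp_start_job over a linked server) achieves much the same thing if remoting isn't an option, though Agent jobs can't take parameters, so the variables would have to go through a config table instead.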

Related

Is it possible to copy an Azure DevOps build and run it locally?

I originally felt this question was for Software Engineering, but they've closed it as off topic and sent me here, so here I am.
One of the biggest time sinks when doing the odd piece of DB development is setting up the environment locally. My process often goes like so:
Get database
Publish db server
Publish db
Load test data
Repeat for any dependencies (can go 3-4 levels deep)
This is a bit of a pain really and can take a while, so I was wondering whether there's any way to automate it.
We make use of ADO, and through ADO we run builds that deploy our changes and load our test data to make sure we haven't broken anything. I imagine ADO follows a process very similar to mine above, and reviewing the build definition, it looks like it does much the same thing.
Now, I'd love to get access to the script that runs this, so that when I start development it gets rid of all the setup downtime described above.
Does anyone know a way to do this? Or perhaps have any other recommendations?
No, you can't copy the build and run it locally. The build steps are all based on the existing tasks (see Build and release tasks and azure-pipelines-tasks).
However, you can develop your own scripts that call the corresponding tool for each step, then combine them.
Alternatively, you could set up a private agent on your development machine; then you can build with this private agent using that build definition.
Another way is to set up an on-premises Azure DevOps Server; you can then export the definition from your Azure DevOps Services organization and import it into the on-premises server to use the definition directly.
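If you do go the write-your-own-scripts route, the build steps map fairly directly onto command-line tools, so the manual list above collapses into something like the sketch below. The paths, database names and script name are placeholders, and it assumes the database projects produce dacpacs:

```powershell
# Local equivalent of the build's publish + test-data steps.
# Paths, database names and the data script are placeholders.
$server  = 'localhost'
$dacpacs = '.\Dependency1.dacpac', '.\Core.dacpac'   # dependencies first

foreach ($dacpac in $dacpacs) {
    $db = [IO.Path]::GetFileNameWithoutExtension($dacpac)
    # Publish the database project output; creates the db if it doesn't exist.
    & SqlPackage.exe /Action:Publish "/SourceFile:$dacpac" `
        "/TargetServerName:$server" "/TargetDatabaseName:$db"
}

# Load the test data the same way the build does.
& sqlcmd -S $server -d Core -i '.\LoadTestData.sql'
```

Once a script like that works, running it becomes the whole of the setup list above, dependencies included.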

Integration Test on Continuous Integration server

We would like to set up the CI system so that the integration tests run in a centralized place.
How can we set up a database for each developer for the branch they are working on?
We want to guarantee 100% compatibility with the deployed platform, at the cost of having multiple databases, each synchronized with a main db.
Installation and data transfer should be automated and painless during the application build.
You have to set up database sandboxes for your CI server. This setup will depend a lot on what database solution you use and the size of your database.
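As an illustration of what a sandbox-per-branch setup can look like with SQL Server (every name and path below is made up): restore a recent backup of the main db under a branch-specific name as part of the build, for example with the SqlServer module's Invoke-Sqlcmd.

```powershell
# Restore a branch-named copy of the main database on the CI server.
# Server, database, logical file names and paths are all made up.
$branch = (git rev-parse --abbrev-ref HEAD) -replace '[\\/]', '_'
$db     = "AppDb_$branch"

$sql = @"
RESTORE DATABASE [$db]
FROM DISK = N'\\backups\AppDb\AppDb_nightly.bak'
WITH MOVE 'AppDb'     TO N'D:\Data\$db.mdf',
     MOVE 'AppDb_log' TO N'D:\Data\$db.ldf',
     REPLACE;
"@
Invoke-Sqlcmd -ServerInstance 'ci-sql01' -Query $sql
```

The nightly backup keeps each sandbox synchronized with the main db, and dropping the database when the branch is merged keeps the server tidy.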

VB.NET application and Microsoft Access database deployment

I have developed an application using VB.NET with Microsoft Access as the database back end.
Deployment requirements: the application is to be deployed on a LAN with 5-15 machines. Any user profile can be accessed from any machine. Any changes to the database entries should be reflected on all machines.
I am confused about how I should achieve this deployment. According to my research :
1. The database should be deployed on one machine. This machine will act as the database server.
My problem(s): I am familiar with accessing databases on the local machine, but how do I access a remote database? What will the connection string look like? Do I need to install MS Access on all machines or only on the server machine? Do I have to deal with concurrency issues (multiple users accessing/modifying the same data simultaneously), or is that handled by the database engine?
2. The application can be deployed in two ways: i. storing the executable on a shared network drive on the server and providing a shortcut on the desktop of each machine; ii. storing the executable itself on each machine.
My problem(s): How does approach 1 work? (One instance of an executable running on multiple machines? :s) In approach 2, will changes to database entries be reflected appropriately on all machines? In approach 2, if there are changes to the application, is there any way to update it on all machines other than redeploying it to each one? Which approach is preferable? Do I need to install the .NET Framework on all machines? How will I set the connection string to be able to access the database over the network?
Will I have to make any other system changes (firewall, security, permissions)? If given a choice of which operating system to install on each machine, which version of Windows is preferable for such an application environment?
This is my first time deploying a multi-user database application on a network. I'll be very grateful for any suggestions, advice, references, etc.
I will try and answer your questions:
Yes, you should deploy the database onto a central machine (although Access may not be the best choice for this sort of thing; see: Is MS Access (JET) suitable for multiuser access?).
For connection strings look at this site: http://www.connectionstrings.com/access/
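To make the "remote database" part concrete: with an Access/Jet back end there is no server process to connect to; the database is just an .mdb file on a network share, so the connection string points at its UNC path. A quick PowerShell sketch of the idea (share, file and table names are made up):

```powershell
# The 'remote' Access database is just a file on a network share,
# so the connection string uses its UNC path (names are made up).
$connStr = 'Provider=Microsoft.Jet.OLEDB.4.0;Data Source=\\dbserver\appdata\app.mdb;'
$conn = New-Object System.Data.OleDb.OleDbConnection $connStr
$conn.Open()

$cmd = $conn.CreateCommand()
$cmd.CommandText = 'SELECT COUNT(*) FROM Users'
$cmd.ExecuteScalar()   # returns the row count
$conn.Close()
```

The same connection string works from OleDbConnection in VB.NET. This also answers the install question: clients only need the Jet OLE DB provider (included with Windows), not a full Access install. Jet does its own record locking through the .ldb file, but it is not a real server, which is why the multiuser-suitability caveat above matters.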
For deployment of your executable, you should look at ClickOnce. It simplifies the install and upgrade of your application significantly. A small learning curve now will reduce your administration headaches later.

SSIS: Is there a way to deploy packages to multiple SQL Server 2005 instances

Does anyone have any advice or techniques for deploying SSIS packages to the Integration Services database?
Basically, I maintain a number of SSIS packages that need to be deployed to several environments (dev, test and production), and the individual database connections need to be changed for each environment as well.
I would like to automate the process of deploying them to these environments, so it can be included in a full application deployment that can be done by the server admins.
I came up with a method for configuring packages for different environments using a single SQL Server configuration table (assuming all environments can connect to the configuration server).
http://www.sqlservercentral.com/articles/SSIS/66426/
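To automate the copy step itself, dtutil can push each package to every instance from the command line; combined with the configuration table above, the per-environment connections then resolve themselves. A sketch, with made-up server names and folder layout:

```powershell
# Copy every package to each environment's server with dtutil.
# Server names and folder layout are made up; connections are
# resolved per environment by the SQL Server configuration table.
$servers  = 'devsql01', 'testsql01', 'prodsql01'
$packages = Get-ChildItem '.\Packages' -Filter '*.dtsx'

foreach ($server in $servers) {
    foreach ($pkg in $packages) {
        & dtutil /Quiet /FILE $pkg.FullName /DestServer $server /COPY "SQL;$($pkg.BaseName)"
    }
}
```

Wrapped in a script like this, the deployment becomes something the server admins can run as part of the full application deployment.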

How to set up a multi-developer Biztalk environment?

If we have 3 developers working on the same BizTalk project, what is the best way to set up our development environment?
We are using TFS to store the BizTalk project.
Should we use one SQL Server and one BizTalk server, and then have one or more developer machines that access the SQL and BizTalk servers? The issue we get with this is that when one developer compiles and deploys their changes, it can affect other developers who are also trying to compile and deploy their work.
Should we have each developer host their own complete SQL Server and BizTalk server for local development, either on their machine or within their own virtual machine? The problem we find with this is that each developer could modify their server settings, and those settings are not stored in source control. This can cause confusion when changes are deployed to a testing server. Another, smaller issue is that each developer would need to have SQL Server, BizTalk Server and Windows Server installed.
Is there another way to set up a multi-developer BizTalk development environment?
You will always want each developer to have a complete BizTalk installation on their own machine. Believe me, it doesn't work otherwise, as you'll just keep getting in each other's way while trying to deploy/test/debug changes.
That said, you will also want a centralized dev/test environment where you deploy your code for more complete integration testing, and to make sure everyone's changes are seen working together.
Your point about configuration is true, but only up to a point. This is because you should make your solution configuration part of your source code and keep it in source control as well. This is particularly important once you're a bit ahead in your development as you'll need to start maintaining multiple versions of your binding files for each environment (dev, test, production and so on).
tomasr is right. Also, if you have decent hardware and lots of RAM, you may want to set up a VM image of your full developer environment and share it with your whole team. It's not as fast as native hardware, but it does allow you to roll back changes or replace your VM if you really mess up, and everyone then has the same environment, ideally close to the target one.
Setting up a continuous build server is also a must. If your projects are small, you can have each check-in trigger a full build, BizTalk deploy, MSI export and test run. Later, as your solutions get more numerous, you might have to move to a continuous build of C# changes only, and then do a full build and deploy nightly or several times a day. We have done this with CruiseControl.NET, NAnt, NUnit and various PowerShell scripts; it was pretty time consuming, but each morning we come to work to find a fully compiled, deployed, exported and tested set of BizTalk solutions ready for the test team.
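For a feel of what those scripts involve, the BizTalk-specific deploy and export steps can be driven through BTSTask. A stripped-down sketch (application name, binding file and paths are invented; the compile and NUnit steps are omitted):

```powershell
# Nightly redeploy + MSI export for one BizTalk application.
# Application name, binding file and paths are invented.
$app = 'Contoso.Orders'

& BTSTask.exe ImportApp "/Package:.\build\$app.msi" "/ApplicationName:$app" /Overwrite
& BTSTask.exe ImportBindings "/Source:.\bindings\$app.Dev.BindingInfo.xml" "/ApplicationName:$app"
& BTSTask.exe ExportApp "/ApplicationName:$app" "/Package:\\drop\nightly\$app.msi"
```

Keeping a binding file per environment in source control, as suggested above, is what makes the ImportBindings step repeatable.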