What is a convenient way to distribute configurations to hundreds of machines (automation)?

We run a daemon on all of our machines, but we need to feed different machines different configurations.
What we need is something like this:
When someone reconfigures something in the front end, we need to generate a new configuration and send it to the specified machines.
We also need to be able to execute commands, like a restart, after the configuration has been distributed.
Finally, we need a way to check whether the configuration on a given machine is the newest one, i.e. whether the distribution in the first phase succeeded or not.

I'd keep the generation of the new configs separate from the actual deployment.
I'd have a centralized daemon keeping track of the frontend changes and updating the configs and the necessary machine/config mapping in a central, well-known location. In a simpler form it could also be an on-demand, manually executed process.
Your existing daemon would be modified to periodically check whether the config for its own machine has changed and, if so, apply the change, perform the necessary deployment commands, and perhaps report its own progress and results back to the central location for the centralized daemon's overall status reporting. This keeps the daemon code simpler (free from the machine/config mapping logic) and thus more reliable.
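As a rough sketch, the polling side of such a daemon could look like the following. Everything here is a placeholder assumption rather than a prescription: the central URL, the config path, the restart command, and the status endpoint would all be specific to your environment, and a real version would need authentication and atomic writes.

```python
import hashlib
import subprocess
import time
import urllib.request

# Hypothetical central location publishing this machine's config (placeholder URL).
CENTRAL_URL = "http://config-server.internal/configs/myhost"
LOCAL_CONFIG = "/etc/mydaemon/config"
RESTART_CMD = ["systemctl", "restart", "mydaemon"]  # assumed deployment command
POLL_SECONDS = 60

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def local_hash() -> str:
    try:
        with open(LOCAL_CONFIG, "rb") as f:
            return sha256(f.read())
    except FileNotFoundError:
        return ""

while True:
    try:
        with urllib.request.urlopen(CENTRAL_URL) as resp:
            remote = resp.read()
        if sha256(remote) != local_hash():
            # Apply the new config, then run the deployment command.
            with open(LOCAL_CONFIG, "wb") as f:
                f.write(remote)
            subprocess.run(RESTART_CMD, check=True)
            # Report back to the central location (this endpoint is assumed).
            urllib.request.urlopen(CENTRAL_URL + "/status?ok=1")
    except Exception as exc:
        print("config poll failed:", exc)  # log locally; a real daemon would report this too
    time.sleep(POLL_SECONDS)
```

Because each machine compares content hashes rather than timestamps, the reported hash doubles as the verification you asked for: the central daemon can treat a matching hash as confirmation that the distribution succeeded.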

Related

Managing multiple Google Cloud VM instances / compute engines at the same time

I have created a few servers on Google Cloud as VM instances. They run the same script every day, but each server runs it with different arguments.
However, when changes or updates need to be made, I have to do them one by one; all the changes are the same, only the arguments differ. That means I SSH into a server, run apt updates, download some files, upload some files, change some arguments, and test. Then I repeat this process on every other server.
I would like to keep one copy of the server somewhere that gets pushed out to the rest, or make changes that apply automatically to each server.
Is there some way I can achieve this, i.e. update all the servers (apt update, download new files, or change scripts) at once?
I would suggest creating a managed instance group (MIG) that uses an instance template to create the VMs. Then you can roll out updates to the MIG.
You can also provide a startup script stored on Cloud Storage and apply it to the running instances.
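As a rough sketch, a rollout like that can be scripted against the gcloud CLI. The template, group, zone, and bucket names below are placeholders, and the flags should be checked against your gcloud version:

```python
import subprocess

# Placeholder names; substitute your own template, MIG, zone, and bucket.
TEMPLATE = "worker-template-v2"
MIG = "worker-group"
ZONE = "us-central1-a"

def run(cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Create a new instance template that points at a startup script in Cloud Storage.
run([
    "gcloud", "compute", "instance-templates", "create", TEMPLATE,
    "--machine-type=e2-small",
    "--metadata=startup-script-url=gs://my-bucket/startup.sh",  # assumed bucket/path
])

# Roll the managed instance group onto the new template.
run([
    "gcloud", "compute", "instance-groups", "managed",
    "rolling-action", "start-update", MIG,
    "--version=template=" + TEMPLATE,
    "--zone=" + ZONE,
])
```

The per-server arguments can then live in each instance's metadata instead of in the script, so a single template and startup script serve every server.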

What would cause SSIS to ignore Package Configuration Connections?

I have a very simple SSIS package that has 2 connections defined in the Connection Manager section: an MS Access data source and an MS SQL data source destination. All this package does is truncate a table in the SQL destination and import data from MS Access into the SQL table. This works as expected during development within VS2013.
Now, I have also enabled Package Configurations for the package and have a couple of XML configuration files (one per connection) in a folder on the root of the C: drive. The configuration file connections differ based on the server where they reside, but the folder structure exists on both servers, so the package can execute against whichever server it is run from.
I've checked the box to enable Package Configurations and deployed the package to 2 different servers, one for Development and the other for QA. When I execute the package via the SSMS Integration Services package execution on my Development server, the package uses the Development table. But when I execute the same package in my QA environment, it also uses the Development table.
Since the Development connection is the one embedded in the package via the Connection Manager, it appears (presumably, anyway) that the package is using the embedded connection and ignoring the configuration files.
I have also tried explicitly adding the path to the configuration file in the Configurations section of the Execute Package Utility to see if it made any difference, but the results are the same: the configuration file is not acknowledged. So it again appears that the package is using the embedded connections defined in the Connection Managers.
I suppose I "may" be able to remove the connections from the package in the Connection Managers section, turn off validation during design time, and then deploy again in an effort to force the package to use the config files, but that doesn't seem like the way to go and is a hack at best, provided it would even work.
Not that I think it should make a difference, but to provide more detail, here is a bit more concerning my server configuration:
Development - SQL 2014 [ServerName]
Quality Assurance - SQL 2014 [ServerName]\[InstanceName]
I don't recall ever having this issue before, hence my reason for posting.
OK, since I am working against a deadline, I was hoping to get an answer sooner rather than later. Since that wasn't the case, and because I've seen variations of this question before without a definitive answer (at least one that satisfies this scenario), I performed some tests and am posting the results for others who may also need this information.
The following conditions will cause configuration files to be ignored even when Package Configurations are enabled in an SSIS package. These findings are based on actual tests and confirmed to be true for SQL 2014, although prior versions may behave the same way.
Disclaimer: these tests focused on configuration files as they pertain to actual server connections (e.g. connection strings) and not other variables, although it's conceivable that any other values within the configuration file would be affected in the same way.
1. Executing the package from within SSMS while connected to the Integration Services component and selecting Run Package. The observed behavior is that whatever connection value was embedded prior to deployment to the server is the one that will be used, irrespective of the configuration files.
Note: this holds true even if configurations are added in the Configurations section prior to execution. Although there is a message stating that the configurations are not imported and cannot be edited, in testing they were not used either.
2. If a SQL Agent job step of type SQL Server Integration Services Package has no configuration file references added on its Configurations tab, the job will execute with whatever values were used during the last build within BIDS prior to deployment (the embedded values).
3. If multiple configuration files are used by the package but some are omitted from the Configurations tab of the job, the job will use the configuration files that are designated, but will fall back to the last values used in development (the embedded values) for those not present in the context of the job.
Some of these behaviors are not very obvious, and I'd imagine they could be a frustrating puzzle for someone following the usual online tutorials for Package Configuration files and expecting more straightforward results.
Identifying the root cause was a time-consuming testing task for me, and although I'm not an expert, I'm certainly far from a novice with SSIS.
At any rate, I hope this saves someone else hours of work and investigation.
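As a practical way to double-check which values a deployed package actually resolves, you can run it from the command line with an explicit configuration file and verbose reporting; the output lists the configuration files that were loaded. A minimal sketch, with placeholder paths (verify the dtexec switches against your SQL Server version):

```python
import subprocess

# Placeholder paths to the package and its configuration file.
PACKAGE = r"C:\SSIS\ImportAccess.dtsx"
CONFIG = r"C:\SSISConfig\QA\Connections.dtsConfig"

# /Rep V requests verbose reporting, which shows whether the
# configuration file was actually applied or silently ignored.
result = subprocess.run(
    ["dtexec", "/File", PACKAGE, "/ConfigFile", CONFIG, "/Rep", "V"],
    capture_output=True, text=True,
)
print(result.stdout)
print("exit code:", result.returncode)
```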

How to minimize JBoss AS 7 configuration to fit my needs

I need to configure JBoss AS 7.2 to start up only the services required by the project.
What is the best approach to customize the JBoss AS 7.2 configuration and reduce it to a user-defined configuration?
I'm intending to use:
JAAS, EJB, JSF, etc.
If I understand your request correctly, what you want is to remove the unnecessary subsystems.
In standalone mode, you can maintain multiple XML configurations for your different uses and decide at startup which standalone config to use, which lets you adapt to various needs quickly.
In domain mode, it's even easier, as you can define various profiles, each with specific subsystems present or not, and then assign the necessary profile to your server group.
Removing subsystems through XML modification or the CLI is simple; adding them back through the CLI sometimes requires figuring out some of the "default" expected entries to recreate the subsystem, but once figured out, it is easy as well.
The key thing in your case is to make sure you do not remove too many subsystems, as they sometimes have dependencies on one another.
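As a sketch, the removals can be scripted against jboss-cli so the trimmed profile is reproducible across environments. The subsystem names below are examples only; which ones are safe to remove depends on what your deployments (JAAS, EJB, JSF) actually pull in:

```python
import subprocess

# Example subsystems that a small web profile often doesn't need.
# Verify against your own dependencies before removing anything.
UNNEEDED = ["webservices", "jaxr", "mail"]

for name in UNNEEDED:
    # Each subsystem is removed with a management operation such as
    # /subsystem=webservices:remove (run from the JBoss home directory).
    subprocess.run(
        ["./bin/jboss-cli.sh", "--connect",
         "--command=/subsystem={}:remove".format(name)],
        check=True,
    )
```

Once no subsystem references them, the matching /extension=... entries can be removed the same way.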

Need advice regarding deployment on multiple remote machines

Currently I am using MSDeploy to build and deploy on several machines using TeamCity. In my current scenario, I need to build, package, and deploy on Dev. After this I need to deploy the same package on the Test and Live servers (which are on a different domain). I understand how we do it, but the problem is that web.config transformation for the Test and Live configs only occurs when we build a package. That means the package created for Dev cannot be reused, as the transformation was only applied to the Dev web.config. I also know that we can change the web.config when unpackaging, but those parameters are very limited, and we have a lot of changes, not just connection string or DB changes.
Another solution is to add a step that builds packages for Test and Live as part of the Dev deployment, but that means a lot of copying to remote servers, once for Test and once for Live, which is very time-consuming due to the different domains.
Can you please advise on the best solution for this scenario, so I can use TeamCity to publish to Dev, Test, and Live using the same package and different web configs in one go?
To configure items at deployment time that are not parameterized for you automatically, you can add a file named parameters.xml to your project and declare what you want to make available at deployment time.
Here's some documentation on the approach: Using Deployment Parameters for Web.Config File Settings.
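For example, a parameters.xml entry for a single app setting could look roughly like this; the parameter name, setting key, and XPath are illustrative, not taken from your project:

```xml
<parameters>
  <parameter name="ServiceEndpoint"
             description="Endpoint the app should call"
             defaultValue="http://localhost/service">
    <parameterEntry kind="XmlFile"
                    scope="\\web\.config$"
                    match="//appSettings/add[@key='ServiceEndpoint']/@value" />
  </parameter>
</parameters>
```

Each declared parameter surfaces in the generated SetParameters.xml, so the same package can be deployed to Dev, Test, and Live with environment-specific values supplied at deploy time rather than baked in at build time.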

How to set up a staging environment for WordPress / WordPress MU?

I have a WordPress MU site. I need to set up a test version of it so that the client can test the changes we make, try plugins with new updates, etc.
Anybody who has worked with WordPress knows it's a bit of a hassle to move between servers and/or domain names, due to the absolute paths used. Does anybody have a good solution for creating a staging environment for WordPress?
Here's how I do it, plus some adjustments I want to make:
- Two WP installs on identical environments: dev & production
- They each have their own FQDN
- Version control (SVN in this case) to handle merges from dev to production
- When merging, I don't ever merge database changes. I only merge code, and modify any of the domain-specific things during the merge (which really should only be in the DB).
- Recreate any DB changes needed during deployment
There are other ways to do it, but they often require changing the hosts file or having access to internal systems, so if you want to be able to show the site to an external client, those methods aren't likely to work.
I also sometimes copy the DB back from production to dev, and just do a find & replace for the FQDN.
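One caution on the find & replace: WordPress stores some options and plugin settings as PHP-serialized strings, which embed string lengths, so a plain text replacement can corrupt them. A serialization-aware tool such as WP-CLI's search-replace command avoids this. A minimal sketch, with placeholder domains:

```python
import subprocess

OLD = "https://www.example.com"  # production FQDN (placeholder)
NEW = "https://dev.example.com"  # dev/staging FQDN (placeholder)

# wp search-replace walks the database and fixes serialized data as it goes;
# --skip-columns=guid leaves post GUIDs alone, as WordPress recommends.
subprocess.run(
    ["wp", "search-replace", OLD, NEW, "--skip-columns=guid"],
    check=True,
)
```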
You can also develop locally and use the method listed above for staging only.