How can I duplicate my instance in SQL Server?
I want to have a test instance on the same server, just for testing purposes, instead of running all my tests against the real data.
Is it possible to create a new instance and copy all the same data and users with their permissions to it?
Or is there any way other than a VM? My DB is running and in use by other users, but I want to build my test environment without disturbing them, and all of that on the same server.
You could either install a new instance by running the installer again, or simply use the same instance and restore a backup of your prod database as a test database.
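For the second approach, here is a minimal sketch of restoring a production backup under a new name on the same instance; the database names, logical file names, and paths are placeholders:

BACKUP DATABASE ProdDB TO DISK = N'D:\Backups\ProdDB.bak' WITH COPY_ONLY, INIT
GO
-- Use RESTORE FILELISTONLY against the .bak to confirm the logical file names
RESTORE DATABASE TestDB
FROM DISK = N'D:\Backups\ProdDB.bak'
WITH MOVE 'ProdDB' TO N'D:\Data\TestDB.mdf',
     MOVE 'ProdDB_log' TO N'D:\Logs\TestDB_log.ldf',
     RECOVERY
GO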
without disturbing other users
This requirement alone means you will have to run your instance on another machine or VM. You cannot expect to maintain an instance on a server without certain things affecting the server as a whole, and any other instance running on it (e.g. restarts for patching or troubleshooting).
If you have the resources, there is no reason not to just put it on another VM, but that all depends on what you want to test (e.g. unit, integration, or performance testing).
With regards to duplicating your server, you can use dbatools. The Start-SqlMigration cmdlet (Start-DbaMigration in current dbatools releases) does the work of bringing over the major parts. To make the process as easy as possible, it helps to make sure your new SQL Server instance has the same drive configuration.
Yes, you can do it. Just create a new instance, then restore your prod database onto that instance. You might need to recreate the logins there.
The following might help with creating logins and mapping them to the users in the DB.
USE [master]
GO
CREATE LOGIN [myDBUser] WITH PASSWORD = N'myPassword' MUST_CHANGE, DEFAULT_DATABASE = [myDB], CHECK_EXPIRATION = ON, CHECK_POLICY = ON
GO
USE [myDB]
GO
-- Re-map the orphaned database user to the newly created login
EXEC sp_change_users_login @Action = 'Update_One', @UserNamePattern = 'myDBUser', @LoginName = 'myDBUser'
-- sp_change_users_login is deprecated; on newer versions prefer:
-- ALTER USER [myDBUser] WITH LOGIN = [myDBUser]
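Before the re-mapping, you can list any orphaned users in the restored database with the Report action:

EXEC sp_change_users_login @Action = 'Report'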
You can automate this work using the dbatools Start-SqlMigration PowerShell cmdlet.
However, I would warn against running both the production and test instances on the same physical hardware, as you will be starving the production instance of resources.
I want to create an Azure Function with a queue trigger that, when it fires, connects to a SQL DB, gets a record, and updates it.
How do I make sure the SQL connection queries the right database, e.g. the staging DB vs. the production DB?
Do I need two instances of the same Azure Function, one with its connection string in application settings pointing at the staging DB and the other pointing at the production DB? Surely not?!
Every article I can find talks about local.settings.json and production, which is fine. But in the real world we might have local, testing, staging, and production.
I can pass the environment through as part of the queued message that comes into my Azure Function, but surely there is a more elegant way and I'm missing something?
I think this depends on your solution design, size, and deployment strategy. Here are 3 options:
Option 1 (our solution):
We are using Azure Functions at a large scale across 4 environments (DEV, TEST, STAGE, PROD).
Therefore, we've created a separate function app per environment, each with the right connection string for its stage.
Option 2:
Another possibility would be to create different deployment slots with slot-specific settings; then you could use just one function with different settings per slot.
Option 3:
You could create parameterized settings and decide at runtime which one to use.
I have two databases for my customers, a LIVE database and a TEST database. Periodically my customers would like the LIVE data copied over to the TEST database so it has current data that they can mess around with.
I am looking for a simple way to run, say, a command in my application that would copy all the data from one database into the other.
Currently I have to remote in with their IT department or consultant and restore a backup of LIVE over TEST. I would prefer to have a button in my application that says RESTORE TEST and have it do that for me.
Any suggestions on how I could do this? Is it possible in SQL? Is there a tool out there in .NET that could help?
Thanks
If you have a backup plan, which I hope you do, you could simply restore the latest full .bak, provided it is accessible to your application. However, this would require some tunneling for your application to reach the latest backup file, and that is generally a no-no for network zones containing database servers.
Perhaps you could set up a scheduled delivery of a backup from machine to machine. Does the LIVE server have access to your TEST server? I wouldn't think a DBA would be inclined to set up backup delivery unless it was to a remote backup location for disaster recovery, and that is usually a dedicated location, not a testing server. Maybe you could work out a scenario where your TEST server doubles as an extra remote backup location for the production environment.
It would be better to be shipped a backup and then periodically or manually start a job that restores from the local backup. This would take the burden off your application; you would then only need to kick off the SQL Agent job from within your app as needed.
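Assuming such a restore job already exists on the TEST server (the job name here is hypothetical), the application only has to issue one call:

USE msdb
GO
-- Start the Agent job that restores TEST from the latest LIVE backup
EXEC dbo.sp_start_job @job_name = N'Restore TEST from LIVE'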
Our company has a proprietary T-SQL script that we need to run on several SQL Servers that are not owned by us. We would like a way to run this script so that it cannot be copied via the file system or the clipboard. We usually log in via a remote session (LogMeIn or similar), so we are concerned that a tech who has seen us run one script could prepare the next server to capture it, knowing how we would run it.
All the records the script creates will be in the DB for them to view; we just wish to protect the script itself.
Is there any method that would not require significant development?
It's not possible to fully protect your script (it can always be captured), but if you don't want to distribute it as-is for easy copy/paste, you could build an application that does nothing more than connect to the database and run the script, with the script embedded as an encrypted resource in the application.
However, I would simply suggest you don't do any of that and just trust your customer. A situation where you don't trust your direct customer, with whom you're in direct contact, strikes me as very bad. It's also often poorly thought out; you're most likely imagining the script as having more value than it actually does.
You can use the WITH ENCRYPTION option when creating the stored procedure or function, so that the underlying SQL can't be viewed but can still be executed.
http://msdn.microsoft.com/en-us/library/ms187926.aspx
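A minimal sketch (the procedure name and body are placeholders); keep in mind that WITH ENCRYPTION only obfuscates the stored definition, so, as noted above, a determined admin can still recover it:

CREATE PROCEDURE dbo.usp_ProprietaryLogic
WITH ENCRYPTION
AS
BEGIN
    SET NOCOUNT ON;
    -- proprietary T-SQL goes here
    SELECT 1 AS Placeholder;
END
GO
-- sp_helptext now reports that the text is encrypted instead of showing it:
-- EXEC sp_helptext 'dbo.usp_ProprietaryLogic'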
I'm working on a project as an outsourced developer where I don't have access to the testing and production servers, only the development environment.
To deploy changes I have to create SQL scripts containing the changes to make on each server for the feature I wish to deploy.
Examples:
When I make a change on the database, I save the script to a folder, but sometimes this is not enough: I once sent a script to alter a view but forgot to include the new tables I had created for another feature.
Another situation would be changing a table via the SSMS GUI, forgetting to create a script with the changed or new columns, and later having to send a script to update the table in testing.
Since some features can be sent to testing and others straight to production (example: queries to feed Excel files), it's hard to keep track of what I have to send to each environment.
Since the deployment team just executes the scripts I send them to update the database, how can I manage/keep track of changes to a SQL Server database without a compare tool?
[Edit]
The current tools that I use are SSMS, VS 2008 Professional, and TFS 2008.
I can tell you how we at xSQL Software do this using our tools:
The deployment team has an automated process that takes a schema snapshot of the staging and production databases and dumps the snapshots nightly to a share that the development team has access to.
Every morning the developers have up-to-date schema snapshots of the production and staging databases available. They use our Schema Compare tool to compare the dev database with the staging/production snapshot and generate the change scripts.
Note: to take the schema snapshot you can either use the Schema Compare tool or our Schema Compare SDK.
I'd say you could keep structural copies of the test and production servers as additional development databases, keeping in mind to always apply changes to them whenever you send something.
On these databases you can create triggers that capture all DDL events and write them into a table with a getdate() timestamp attached. With that you should be able to track changes pretty easily, and a simple compare will also be easier to apply.
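A minimal sketch of such a database-scoped DDL trigger; the table and trigger names are illustrative:

CREATE TABLE dbo.DDLChangeLog
(
    EventDate   datetime      NOT NULL DEFAULT (getdate()),
    EventType   nvarchar(100) NULL,
    ObjectName  nvarchar(256) NULL,
    CommandText nvarchar(max) NULL
);
GO
CREATE TRIGGER trg_LogDDL ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
BEGIN
    -- EVENTDATA() describes the DDL statement that fired the trigger
    DECLARE @e xml = EVENTDATA();
    INSERT dbo.DDLChangeLog (EventType, ObjectName, CommandText)
    VALUES (@e.value('(/EVENT_INSTANCE/EventType)[1]', 'nvarchar(100)'),
            @e.value('(/EVENT_INSTANCE/ObjectName)[1]', 'nvarchar(256)'),
            @e.value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]', 'nvarchar(max)'));
END
GO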
Look into Liquibase, especially its SQL format, and see if that gives you what you want. I use it for our database and it's great.
You can store all your objects in separate scripts, but when you do a Liquibase "build" it will generate one SQL script with all your changes in it. The really important part is getting your Liquibase configuration to put the objects in the correct dependency order; for example, tables get created before foreign key constraints.
http://www.liquibase.org/
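For reference, a changelog in Liquibase's formatted-SQL style looks roughly like this (the author, changeset id, and table are made up):

--liquibase formatted sql

--changeset myname:create-customer-table
CREATE TABLE dbo.Customer
(
    CustomerId int IDENTITY(1,1) NOT NULL PRIMARY KEY,
    Name nvarchar(100) NOT NULL
);
--rollback DROP TABLE dbo.Customer;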
My requirement is to retrieve data from a local SQL Server and store that data on a remote server. I would like to get the data from the local SQL Server and use it in my application to proceed further.
Yes, see Create linked server with SQL command.
You can create the linked server either locally (I'd recommend that) or remotely.
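A minimal local setup might look like this; the server name, host, login, and table below are placeholders:

EXEC sp_addlinkedserver
     @server = N'REMOTESRV',
     @srvproduct = N'',
     @provider = N'SQLNCLI',
     @datasrc = N'remotehost.example.com'
GO
-- Map local logins to one dedicated, least-privileged remote login
EXEC sp_addlinkedsrvlogin
     @rmtsrvname = N'REMOTESRV',
     @useself = N'FALSE',
     @locallogin = NULL,
     @rmtuser = N'linkedLogin',
     @rmtpassword = N'StrongPasswordHere'
GO
-- Query through the link with a four-part name
SELECT * FROM REMOTESRV.RemoteDb.dbo.SomeTable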
If you mean you want your remote server to execute queries on the local one, then yes, but...
Setting this up is a fair bit of work and, if I remember rightly, needs a hefty amount of privileges on the remote server.
It might be easier to set up the linked server locally. NB this assumes the account the local server runs as can reach the remote machine and access it.
I'd be a bit nervous about doing this internally, for fear of some admin type breaking it; if it's over the internet, securing it will be a nightmare even if they allowed their server to be accessed directly.
You might find it easier to do it via the client, though that will depend on how much data you want to synchronise; perhaps a briefcase approach would do the job better.
You can set up linked servers either via the GUI in SSMS or via scripts. A couple of things to look out for, though: first, make sure you create a login on each server that is mapped to the appropriate database with the least possible permissions, and verify the mapping after you've set it up. Be aware that you are expanding the attack surface with this solution, so you may want to have your admin set up auditing too.
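To verify connectivity once it's configured (using the hypothetical REMOTESRV name from the sketch above):

-- Raises an error if the linked server cannot be reached
EXEC sp_testlinkedserver N'REMOTESRV'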