Setting Autoincrement in SQL Server 2012 - sql

I was in the table designer trying to enable auto increment, and when I tried to save the changes this error appeared. I am using SQL Server 2012 Management Studio.
Does anyone know why? Thank you.

This is a feature in Management Studio that prevents you from accidentally dropping and re-creating a table without knowing it. You can turn it off in Tools -> Options -> Designers -> "Table and Database Designers"; the option is "Prevent saving changes that require table re-creation".
Turn this off with caution, as the feature exists to stop you from causing issues by accident. I recommend doing this only against development servers, or better still, always generate the script and execute that instead. That way you will always know what commands are being executed.
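If you do script the change yourself, the script the designer would otherwise generate boils down to rebuilding the table. A minimal sketch, assuming a hypothetical dbo.Customers table with just two columns, might look like this:

    -- Adding IDENTITY to an existing column requires rebuilding the table
    -- (hypothetical dbo.Customers table; adjust the columns to match your schema).
    BEGIN TRANSACTION;

    CREATE TABLE dbo.Tmp_Customers
    (
        CustomerID INT IDENTITY(1, 1) NOT NULL,
        Name       NVARCHAR(100)      NOT NULL
    );

    -- Preserve the existing key values while copying the data across.
    SET IDENTITY_INSERT dbo.Tmp_Customers ON;
    INSERT INTO dbo.Tmp_Customers (CustomerID, Name)
    SELECT CustomerID, Name FROM dbo.Customers;
    SET IDENTITY_INSERT dbo.Tmp_Customers OFF;

    DROP TABLE dbo.Customers;
    EXEC sp_rename 'dbo.Tmp_Customers', 'Customers';

    COMMIT;

Running something like this from a query window makes the drop-and-recreate explicit, which is exactly what the designer option is warning you about.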

Related

SQL server update dbo refresh

How come SQL Server 2014 IntelliSense doesn't recognize recently created database objects (e.g. a view or a table), yet a query can be run using those objects?
Is this a bug?
The IntelliSense cache is built when you open a connection in the Query Editor of SQL Server Management Studio. Unfortunately, the cache is not rebuilt after you create new objects, so it stays stale until you press the Ctrl+Shift+R shortcut suggested by Simon.
If you prefer to use the mouse, the same refresh is available from the Edit > IntelliSense > Refresh Local Cache menu.
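To see the behaviour, here is a small illustrative example (the object names are made up): after creating the object, IntelliSense in the same session may still flag it even though the query runs fine, until the cache is refreshed.

    -- Hypothetical view; creating it does not update the IntelliSense cache.
    CREATE VIEW dbo.vw_RecentOrders AS
        SELECT OrderID, OrderDate
        FROM dbo.Orders
        WHERE OrderDate >= DATEADD(DAY, -30, GETDATE());
    GO

    -- This runs, but the view name may be underlined in red until you press
    -- Ctrl+Shift+R (or use Edit > IntelliSense > Refresh Local Cache).
    SELECT * FROM dbo.vw_RecentOrders;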

Designed table cannot be saved in SQL Server Management Studio

I am trying to design a table in Microsoft SQL Server Management Studio.
I can modify it, but the changes cannot be saved. Does anyone know what the problem is and how to fix it?
I found the way to fix it.
Go to Tools > Options > Designers and uncheck "Prevent saving changes that require table re-creation".

What's the best version neutral method for deploying a SQL Server database?

On my development box, I always run the latest version of SQL Server. I often deploy databases from my dev box to a live/staging area for review or testing. I've done this many times and it has always been a painful process, but I am realizing that I need to find an easier, more reliable and consistent way of performing this basic operation.
I normally use WebMatrix purely for deployment and it's worked fine, but I've been having problems getting it to work on my server for some reason. Consequently, I am seeking an alternative solution.
Creating a SQL dump file would probably work, but it's not an acceptable solution: the database contains images and easily exceeds 2 GB of data, which would take forever to transfer.
The Import/Export utility fails due to issues with incomplete schema copies, identity inserts and checks. The solutions offered for these issues have failed to work in my particular case.
The Backup and Restore method also fails due to some strange incompatibilities between SQL Server 2008 and 2012. SQL Server 2008 Management Studio throws exceptions during the restore process of a 2012 database. It's odd that this happens, even though I set the compatibility of the database to version 2008.
I haven't bothered trying the detach, copy and reattach method, since it would probably fail for the same reasons the backup and restore method did.
Are there other alternatives out there? Also, why is this so unbelievably hard for a task that is so common and important, especially in this day and age (2013)? Get real, Microsoft!
We changed our method of deploying and moving databases between servers, instances and versions by adopting the tools from RedGate. They are expensive, but worth it IMHO.
My team creates scripts for pretty much everything: database creation, alters, inserts, and so on.
We write all of our scripts to check for the existence of objects before trying to create them.
In other words, we can run the scripts over and over and get the same results.
And we deploy to different environments by using SqlCmd.exe.
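As a rough sketch of that pattern (the object, database, and server names here are invented), an existence-checked creation script is safe to re-run, and SqlCmd.exe pushes the same file to whichever environment you point it at:

    -- Idempotent create script: safe to run repeatedly against any environment.
    IF NOT EXISTS (SELECT 1 FROM sys.tables
                   WHERE name = N'Orders' AND schema_id = SCHEMA_ID(N'dbo'))
    BEGIN
        CREATE TABLE dbo.Orders
        (
            OrderID   INT IDENTITY(1, 1) PRIMARY KEY,
            OrderDate DATETIME NOT NULL DEFAULT (GETDATE())
        );
    END;
    GO

    -- Deployed per environment from the command line, for example:
    --   sqlcmd -S DevServer\SQL2012 -d MyAppDb -E -i 010_create_orders.sql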
EDIT
See:
http://odetocode.com/blogs/scott/archive/2008/02/02/versioning-databases-views-stored-procedures-and-the-like.aspx
and
http://odetocode.com/blogs/scott/archive/2008/01/30/three-rules-for-database-work.aspx
=============
If that is "too much" then I agree with the other poster, RedGate is your friend.
Points below aside, have you considered the Database Projects in VS2012? They allow you to script out the tables, stored procedures, triggers, users, etc. that you want, generate SQLCMD scripts, make changes, run schema compares and keep your database code under version control. I'd certainly recommend it.
"Creating a SQL dump file would probably work, but it's not an
acceptable solution a database contains images and easily exceeds 2
gigs of data which would take forever."
Why is this a problem? where are you transferring the file from and to and over what connection?
"The Backup and Restore method also fails due to some strange
incompatibilities between SQL 2008 and 2012. SQL 2008 Management
Studio throws exceptions during the restore process of a 2012
database. It's odd that this happens, even though I set the
compatibility of the database to version 2008"
This shouldn't be an issue if file is created in 2008 prior to restoring. If you create a new DB in your 2008 instance, then take a backup from that and restore it to a 2012 instance with 2008 compatiblity, then you should be able to use it there, back it up from the 2012 instance and restore to 2008 again afterwards.
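If it helps, the compatibility level and the origin of a backup file can both be checked from T-SQL (the database name and path here are placeholders); keep in mind that compatibility level only affects query behaviour, not the backup format.

    -- 100 = SQL Server 2008 behaviour; this does not change the backup format.
    ALTER DATABASE MyAppDb SET COMPATIBILITY_LEVEL = 100;
    SELECT name, compatibility_level FROM sys.databases WHERE name = N'MyAppDb';

    -- Shows which server version wrote the backup, before attempting a restore.
    RESTORE HEADERONLY FROM DISK = N'C:\Backups\MyAppDb.bak';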

How can I get SSMS to provide Auto Completion for SQL Azure?

I'm trying to identify a SQL Server Management Studio option (when writing/running queries) that provides table/column auto-complete functionality in the query editor. Unfortunately, SSMS seems to stop giving you Intellisense when you're connected to a SQL Azure database. Is there any way to fix this?
Are there any options, hacks, plugins or anything else that can accomplish this?
I've finally stumbled upon an option. It's non-ideal but it is certainly a huge step in the right direction!
dbForge SQL Complete is a SSMS plugin that replaces SSMS's built-in Intellisense with its own auto-complete engine. This is a HUGE improvement when connected to SQL Azure, but so far the free version feels like a step backwards when connected to traditional SQL Server instances. Overall, I think we're going to prefer using this over not using it. I'll come back in a couple days to report how well (or not) it's going.
At least it's an option, though!
The new release of SQL Server Management Studio, v17.2, now supports IntelliSense against Azure SQL, so there is no need for any other tools.
Please note that this only works with SQL Server Authentication, i.e. when you log in with a SQL login such as sa.
https://connect.microsoft.com/SQLServer/feedback/details/3100677/ssms-2016-would-be-nice-to-have-intellisense-on-azure-sql-databases
Visual Studio has enabled basic IntelliSense for Azure SQL, but it isn't easily available.
You have to click on the table and select DROP AND CREATE TO -> New query window from the drop-down menu, and then IntelliSense will work in that window. If you simply use New Query it will not.

Modifying SQL Database on Shared Hosting

I have a live database on a shared hosting server. I am making some major changes to my site's code and I would like to fix some stupid mistakes I made in initially designing the database. These changes involve altering the size of a large number of fields, and enforcing referential integrity between tables properly. I would like to make the changes on both my local test server and the remote server if possible.
I should note that while I'm fairly comfortable with writing complex queries to handle data, I have very little experience modifying database structure without a graphical interface.
I can access the remote database in the Visual Studio database explorer, but I cannot use that for anything other than data manipulation. I installed SQL Server Management Studio Express last night and after 40+ crashes I gave up; I couldn't even patch the damn thing.
The remote server is SQL Server 2005, and the MyLittleAdmin web interface is available.
So my question is: what is the best way to accomplish these changes? Is there a graphical interface I can use on the remote server? If not, is there an easy way to copy the database to my local machine, fix it, and re-upload it? Finally, if none of the above are viable, does anyone have links to decent info on fixing referential integrity via queries?
Sorry for the somewhat general question; I feel like I am making this far harder than it should be, but after searching and trying all night I haven't gotten anywhere. Thanks in advance for the help. I really appreciate it.
...Also, does anyone have a time machine I can borrow? I need to go kick my past self's ass for this.
Usually hosting providers allow you to back up and restore your database, so the easiest way to accomplish the move is to back up your live DB, download the backup file, restore it locally, make all the changes, take a backup of the local DB, upload it, then restore it in the live service. Your site should be placed in an administrative shutdown during this time so it does not continue to update the data while you're doing these operations. You also have to make sure your local SQL instance is at exactly the same build version (check @@VERSION) as the hosting provider's, otherwise your local SQL Server may upgrade the database structure and you'll be unable to restore it back on the hosting provider (or you'll be unable to restore it on your local server if your version is earlier than the host's). The MSDN Books Online has a detailed guide on how to Copy Databases using Backup/Restore.
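A minimal sketch of that round trip in T-SQL, with invented database names and file paths (shared hosts usually expose the backup step through their control panel instead):

    -- Confirm both servers are on the same build before moving anything.
    SELECT @@VERSION;

    -- On the live server (or via the host's control panel): take a full backup.
    BACKUP DATABASE MySiteDb TO DISK = N'C:\Backups\MySiteDb.bak' WITH INIT;

    -- Locally: inspect the logical file names, then restore with relocated files.
    RESTORE FILELISTONLY FROM DISK = N'C:\Backups\MySiteDb.bak';
    RESTORE DATABASE MySiteDb
    FROM DISK = N'C:\Backups\MySiteDb.bak'
    WITH MOVE N'MySiteDb'     TO N'C:\Data\MySiteDb.mdf',
         MOVE N'MySiteDb_log' TO N'C:\Data\MySiteDb_log.ldf',
         REPLACE;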
An alternative to backup/restore is to detach/attach the database, but I do not recommend this because you need to move both the MDF and the LDF in sync, and they're also larger in size than a backup.
This assumes you can do all the schema changes on your local copy in a wizardly manner, i.e. fast and correct. Of course, that is not easy. The recommended way is to prepare, ahead of time, a script that applies all the transformations needed to reach the new schema. There are tools like SQL Diff, SQL Compare, SQL Delta and others that can generate such a script. Visual Studio Database Edition can also do this.
Here is how I would go about it:
1. Ensure I have exactly the same schema on my dev machine as on the live host. If I'm not sure, I take a backup of the live server and restore it locally. This is my reference v1 schema.
2. Keep the v1 backup for reference.
3. Start developing a script that changes the schema to my target. Sometimes I need to refresh my memory on script syntax myself; what I do is go to the SQL Server Management Studio wizard for the operation I want, select all the options in the UI, and then use the option to show the script, which shows me exactly what SSMS would run to accomplish the change.
4. For each change I add to the script, I can test it by restoring the v1 reference backup from step 1 and running the script.
5. Keep iterating on the script, adding one change at a time, until all the needed schema changes are done. After each change, I test it again as in step 4.
6. The script should contain not only DDL changes to the schema, but also any DML changes needed (modifying reference data, changing values, moving columns between tables, etc.); see the sketch after this list.
7. When the script is ready, I can download a newer backup, apply the script, then upload the updated backup and restore it on the live host. Alternatively, you can simply run the script on the live host (after, of course, you have backed it up in case something goes horribly wrong).
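To give a concrete idea of what such a script ends up containing (the table, column, and constraint names here are invented), a fragment that widens a field and then enforces referential integrity could look like this:

    -- Widen an undersized column.
    ALTER TABLE dbo.Orders ALTER COLUMN CustomerName NVARCHAR(200) NOT NULL;

    -- Remove orphaned rows that would violate the new foreign key.
    DELETE FROM dbo.Orders
    WHERE CustomerID NOT IN (SELECT CustomerID FROM dbo.Customers);

    -- Add the constraint only if it is missing, so the script stays re-runnable.
    IF NOT EXISTS (SELECT 1 FROM sys.foreign_keys WHERE name = N'FK_Orders_Customers')
    BEGIN
        ALTER TABLE dbo.Orders WITH CHECK
        ADD CONSTRAINT FK_Orders_Customers
            FOREIGN KEY (CustomerID) REFERENCES dbo.Customers (CustomerID);
    END;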
In my projects I always rely on scripts to deploy and upgrade the database. In fact, I use the database extended properties to store a 'version' of my application's deployed schema, and in my code I simply roll forward all the scripts needed to bring the schema up to the latest version. I have an article on my blog describing this technique: Version Control and your Database.
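A hedged sketch of that versioning idea (the property name and value are arbitrary): store the schema version as a database-level extended property and read it back before deciding which upgrade scripts still need to run.

    -- Stamp the database with its schema version after a successful upgrade.
    EXEC sys.sp_addextendedproperty @name = N'SchemaVersion', @value = N'1.4';

    -- Read it back at startup to decide which upgrade scripts to roll forward
    -- (class = 0 means a database-level extended property).
    SELECT value
    FROM sys.extended_properties
    WHERE class = 0 AND name = N'SchemaVersion';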