While recreating a MS SQL Server database, I get the following error.
"Filestream feature is disabled."
I really need that feature, and it was previously enabled on the existing copy of the database. But since I am trying to recreate the database, how is it possible to enable a feature on a database that does not exist yet?
Thanks.
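For what it's worth, FILESTREAM is an instance-level setting rather than a per-database one, so it can be enabled before the database exists; the Windows side is turned on in SQL Server Configuration Manager, and the SQL side looks roughly like this:

```sql
-- FILESTREAM is configured per instance, not per database; once it is enabled,
-- any database created afterwards can use FILESTREAM filegroups and columns.
EXEC sp_configure 'filestream access level', 2;  -- 0 = off, 1 = T-SQL access, 2 = T-SQL + Win32 streaming
RECONFIGURE;
```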
I am attempting to create a new database on an existing SQL Server instance from code first. I used the Package Manager Console to enable migrations, add a migration, and then update the database. The three commands executed without any errors and I got the usual messages afterwards. However, when I look in either SSMS or Server Explorer I don't see the database. I tried re-running the update-database command and got a message that the migration has already been applied. Any suggestions?
Update:
I've figured out that the project is adding the database to localdb. However, I have a connection string in the app.config file.
Probable causes are (the queries after this list can help rule each one out):
the database was never created
you use a login that can't see the database. Using Windows authentication/sa might allow you to see it and create user mappings for other logins.
you have multiple instances of SQL Server running and you're connecting to the default instance while the database was created on a named instance
you accidentally created the database on another server
your software connects to master by default, so instead of a new database being created, the database structure ended up in the system database master
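A few diagnostic queries that can help rule these causes out; run them over the same connection (or connection string) your application actually uses:

```sql
-- See exactly which server/instance and database this connection ends up on.
SELECT @@SERVERNAME AS server_instance,   -- server and instance name you are connected to
       DB_NAME()    AS current_database;  -- database the connection defaults to

-- List all databases visible to this login; if the new one is missing here,
-- it was never created, was created elsewhere, or this login cannot see it.
SELECT name, create_date
FROM sys.databases
ORDER BY create_date DESC;

-- If you suspect the objects landed in master, look for your tables there.
SELECT name, create_date
FROM master.sys.tables
ORDER BY create_date DESC;
```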
I've been working with MS SQL for more than a decade and have struggled with issues like these, but in the end I could always explain what happened; if SQL Server reports that it executed a query successfully, it did.
I am trying to create a full-text search mechanism. To do that, I created a full-text catalog in my local database, then opened its context menu, which had a Properties option.
I opened it and set up table/view tracking. But when I tried to do the same with a remote Azure database, I could not open the Properties window.
Any idea how I can open/edit the settings I need?
I was playing around with SSMS and found an interesting solution to this. As Alberto Morillo mentioned, to do operations against an Azure SQL DB you need to use Transact-SQL. I had never worked with it, and basic SQL knowledge turned out not to be enough. But I found an interesting option in SSMS: I went to my local DB, created a new catalog, set everything up the way I needed, and instead of saving, I pressed 'Save script to file'. This generated the Transact-SQL code that I needed; all I did afterwards was change the names and execute those queries on my Azure DB.
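For reference, the generated script boils down to statements along these lines; the catalog, table, column and key-index names below are just placeholders:

```sql
-- Create the full-text catalog (AS DEFAULT is optional).
CREATE FULLTEXT CATALOG MySearchCatalog AS DEFAULT;

-- Create a full-text index on the columns you want searchable.
-- PK_Articles is the name of the table's unique key index.
CREATE FULLTEXT INDEX ON dbo.Articles (Title LANGUAGE 1033, Body LANGUAGE 1033)
    KEY INDEX PK_Articles
    ON MySearchCatalog
    WITH CHANGE_TRACKING AUTO;

-- Query it with CONTAINS or FREETEXT.
SELECT Title FROM dbo.Articles WHERE CONTAINS(Body, N'search');
```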
I was in the table designer trying to enable auto-increment, and when trying to save the changes this error shows up. I am using SQL Server 2012 Management Studio.
Does anyone know why? Thank You.
This is a feature in Management Studio that prevents you from accidentally dropping and re-creating a table without knowing it. You can turn this off in Tools -> Options -> Designers -> "Table and Database Designers"; the option is "Prevent saving changes that require table re-creation".
You should turn this off with caution, as the feature is there to help prevent you from causing issues. I recommend only doing this against development servers, or better yet, always generating the script and executing that instead. That way you will always know what commands are being executed.
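For illustration, this is roughly what the generated script does when you add an IDENTITY property to an existing column (the dbo.Customers table and its columns are made up); it also shows why the designer warns you, since the table is dropped and re-created behind the scenes:

```sql
-- SQL Server cannot alter an existing column into an IDENTITY column,
-- so the designer builds a new table, copies the data, and swaps names.
BEGIN TRANSACTION;

CREATE TABLE dbo.Tmp_Customers
(
    Id   int IDENTITY(1,1) NOT NULL,
    Name nvarchar(100)     NULL
);

-- Copy the existing rows, preserving the current Id values.
IF EXISTS (SELECT * FROM dbo.Customers)
BEGIN
    SET IDENTITY_INSERT dbo.Tmp_Customers ON;
    INSERT INTO dbo.Tmp_Customers (Id, Name)
        SELECT Id, Name FROM dbo.Customers;
    SET IDENTITY_INSERT dbo.Tmp_Customers OFF;
END;

DROP TABLE dbo.Customers;
EXECUTE sp_rename N'dbo.Tmp_Customers', N'Customers', 'OBJECT';

COMMIT;
```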
I have created an application in VB.NET (using Microsoft Visual Basic 2010 Express) with a local database (SQL Server Compact 3.5 database) to store data.
I have installed this on the user's computer and added a "search online for updates" functionality (which can be selected when publishing).
Now I have noticed that sometimes when I upload a new version, the data in the database gets cleared (possibly when I opened the database while developing).
This is of course not how I want the system to behave; the data should always remain on the user's computer.
In 'Application Files' the database file (*.sdf) is currently set to 'Data File (Auto)', but I'm unsure of exactly how this works.
Could anyone help me understand how all of this works, and tell me how I can be sure that the data in the user's database will remain, even after an update?
If there is no solution to ensure this, is there a way to safely back up the data and reload it?
Thanks in advance!!
Basically, the ClickOnce install overwrites everything in the program folder that is included in your publish. So if you include the .sdf, it will be overwritten when the installer is executed. What you need to do is select "exclude" on the .sdf instead. This will keep the database in its previous state.
So my recommendation would be to have two different publishes: one that contains the .sdf, used only for the first-time installation, and then the update publishes, in which you exclude it.
To perform updates on your tables you would have to write the SQL for it in your software. Basically, on startup do a check on all tables to see that they have the proper setup, and if they don't, add the missing columns (a sketch of such a check follows).
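For example, a missing-column check could look like the sketch below (the Orders table and Notes column are hypothetical). Note that SQL Server Compact does not support IF/ELSE batches, so the branching has to happen in your VB.NET code, which runs each statement as a separate command:

```sql
-- Command 1: run from the application; a count of 0 means the column is missing.
SELECT COUNT(*)
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'Orders' AND COLUMN_NAME = 'Notes';

-- Command 2: executed by the application only when command 1 returned 0.
ALTER TABLE Orders ADD Notes nvarchar(200) NULL;
```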
Hope this helps.
I have a live database on a shared hosting server. I am making some major changes to my site's code and I would like to fix some stupid mistakes I made in initially designing the database. These changes involve altering the size of a large number of fields, and enforcing referential integrity between tables properly. I would like to make the changes on both my local test server and the remote server if possible.
I should note that while I'm fairly comfortable with writing complex queries to handle data, I have very little experience modifying database structure without a graphical interface.
I can access the remote database in the Visual Studio database explorer, but I cannot use that for anything other than data manipulation. I installed SQL Server Management Studio Express last night and after 40+ crashes I gave up - I couldn't even patch the damn thing.
The remote server is SQL Server 2005, and the MyLittleAdmin web interface is available.
So my question is: what is the best way to accomplish these changes? Is there a graphical interface I can use on the remote server? If not, is there an easy way to copy the database to my local machine, fix it, and re-upload it? Finally, if none of the above are viable, does anyone have links to decent info on fixing referential integrity via queries?
Sorry for the somewhat general question - I feel like I am making this far harder than it should be, but after searching and trying all night I haven't gotten anywhere. Thanks in advance for the help. I really appreciate it.
...Also does anyone have a time machine I can borrow- I need to go kick my past self's ass for this.
Usually hosting providers allow you to back up and restore your database, so the easiest way to accomplish the move is to back up your live DB, download the backup file, restore it locally, make all the changes, take a backup of the local DB, upload it, then restore it on the live service. Your site should be placed in administrative shutdown during this time so it does not continue to update the data while you're doing these operations. You have to make sure your local SQL Server instance is at exactly the same build version (@@VERSION) as the hosting provider's; otherwise your local SQL Server may upgrade the database structure and you'll be unable to restore it back on the hosting provider (or you'll be unable to restore it on your local server if your version is earlier than the host's). The MSDN BOL has a detailed guide on how to Copy Databases using Backup/Restore.
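For reference, the local restore part typically looks something like this; the file names, logical names (MySite_Data, MySite_Log) and paths below are just examples, and RESTORE FILELISTONLY shows the real ones inside your backup:

```sql
-- Inspect the backup to see the logical data and log file names it contains.
RESTORE FILELISTONLY FROM DISK = N'C:\Temp\MySite.bak';

-- Restore to the local instance, relocating the files.
RESTORE DATABASE MySite
FROM DISK = N'C:\Temp\MySite.bak'
WITH MOVE N'MySite_Data' TO N'C:\SQLData\MySite.mdf',
     MOVE N'MySite_Log'  TO N'C:\SQLData\MySite_log.ldf',
     REPLACE;

-- After making the changes locally, produce a backup to send back to the host.
BACKUP DATABASE MySite TO DISK = N'C:\Temp\MySite_fixed.bak' WITH INIT;

-- Check the build version on both ends before relying on backup/restore.
SELECT @@VERSION;
```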
An alternative to backup/restore is to detach/attach the database, but I do not recommend this because you need to move both the MDF and the LDF in sync, and they're also larger in size than a backup.
This assumes you can do all the schema changes on your local copy in a wizardly manner, i.e. fast and correct. Of course, that is not easy. The recommended way is to prepare ahead of time a script that applies all the transformations needed to reach the new schema. There are tools like SQL Diff, SQL Compare, SQL Delta and others that can generate such a script. Visual Studio Database Edition can also do this.
Here is how I would do it:
1. Ensure I have exactly the same schema on my dev machine as on the live host. If I'm not sure, I can take a backup of the live server and restore it locally. This is my reference, the v1 schema.
2. Keep the backup of v1 for reference.
3. Start developing a script that changes the schema to my target. Sometimes I need to refresh my memory on script syntax myself; what I do is go to the SQL Server Management Studio wizard for the operation I want, select all the options in the UI, and then select the 'show script' option, which shows me exactly the script SSMS would run to accomplish my desired change.
4. For each change I add to the script, I can test it by restoring the v1 reference backup from step 1 and running the script.
5. Keep iterating on the script, adding one change at a time, until all the needed schema changes are done. After each change, I can test it again as in step 4.
6. The script should make not only DDL changes to the schema, but also any DML changes needed (modifying reference data, changing values, moving columns between tables, etc.); see the sketch after this list for examples.
7. When the script is ready, I can download a newer backup, apply the script, then upload the updated backup and restore it on the live host. Alternatively, you can simply run the script on the live host (after, of course, backing it up in case something goes horribly wrong).
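As an illustration of the kind of statements such a script contains (the table, column and constraint names here are invented), widening a field and enforcing referential integrity look roughly like this:

```sql
-- Widen a field that was sized too small originally.
ALTER TABLE dbo.Users ALTER COLUMN Email nvarchar(254) NOT NULL;

-- Clean up orphaned rows first, or the constraint below will fail to create.
DELETE FROM dbo.Orders
WHERE UserId NOT IN (SELECT UserId FROM dbo.Users);

-- Enforce referential integrity between the two tables.
ALTER TABLE dbo.Orders WITH CHECK
    ADD CONSTRAINT FK_Orders_Users FOREIGN KEY (UserId)
    REFERENCES dbo.Users (UserId);
```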
In my projects I always rely on scripts to deploy and upgrade the database. In fact I use the database extended properties to store a 'version' of my application deployed schema and in my code I simply roll forward all the scripts that bring the schema to my last version. I have an article on my blog describing this technique: Version Control and your Database.
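A minimal sketch of that versioning idea (the property name 'SchemaVersion' and the version values are just examples):

```sql
-- Store the schema version as a database-level extended property (done once at deployment).
EXEC sys.sp_addextendedproperty @name = N'SchemaVersion', @value = N'1.0';

-- At startup the application reads the stored version...
SELECT CAST(value AS nvarchar(20)) AS SchemaVersion
FROM sys.fn_listextendedproperty(N'SchemaVersion', NULL, NULL, NULL, NULL, NULL, NULL);

-- ...runs every upgrade script newer than it, and each script ends by bumping the value.
EXEC sys.sp_updateextendedproperty @name = N'SchemaVersion', @value = N'1.1';
```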