I am creating and maintaining my SQL Server 2008 warehouse database using a Visual Studio 2008 project.
When I try to deploy the project, I get the error below:
Creating DataWarehouseReports...
Company.HigherEducation.DataWarehouse.dbschema(0,0)Error TSD01268: .Net SqlClient Data Provider: Msg 2714, Level 16, State 6, Line 2 There is already an object named 'DataWarehouseReports' in the database.
Company.HigherEducation.DataWarehouse.dbschema(0,0)Error TSD01268: .Net SqlClient Data Provider: Msg 2759, Level 16, State 0, Line 2 CREATE SCHEMA failed due to previous errors.
An error occurred while the batch was being executed.
Done executing task "SqlDeployTask" -- FAILED.
Done building target "DspDeploy" in project "Company.HigherEducation.DataWarehouse.dbproj" -- FAILED.
Done executing task "CallTarget" -- FAILED.
Done building target "DBDeploy" in project "Company.HigherEducation.DataWarehouse.dbproj" -- FAILED.
Done building project "Company.HigherEducation.DataWarehouse.dbproj" -- FAILED.
What I found is that the DataWarehouseReports schema exists in the master database (granted, by mistake), but I get the "duplicate" error when I try to create it in the OLAP database.
In development, my OLAP and OLTP databases are on the same server, and I run into the same problem with schema names. For now, I have named the schemas differently in the OLTP vs. the OLAP database, but for consistency's sake I would rather use the same name.
How do I debug this?
edit: PEBCAC Alert
I found out that I have a pre-deployment SQL script that changes the database context to master. My script was failing, so the context was never changed back.
So when the rest of my script executed, it was actually trying to create everything in master.
Lesson learned: Read the output when the project Deploys.
Basically, the VS2008 project generates a single .SQL file, and the only database-context change in it is where the database gets created. If you change context in the pre-deployment script, it is up to you to change it back.
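To make the pre-deployment script safe, the context switch can be bracketed so the generated deployment script always returns to the target database. A minimal sketch, assuming the standard `$(DatabaseName)` SQLCMD variable that VS database projects define at deploy time (the script body is a placeholder for whatever your actual pre-deployment work is):

```sql
-- Pre-deployment script: temporarily switch to master.
USE [master];
GO

-- ... work that genuinely has to run in master goes here ...

-- Always switch back to the target database, even if the work above
-- is conditional, so the rest of the generated deployment script
-- continues in the right context.
USE [$(DatabaseName)];
GO
```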
Related
I need to make a backup of my SQL Server database. When I try, I get this error:
System.Data.SqlClient.SqlError: Read on "c:..." failed: 23(...)(Data error (cyclic redundancy error))
Now, I'm trying to run this command:
DBCC CheckDB ('MYDATABASE') WITH NO_INFOMSGS, ALL_ERRORMSGS
But I get this error:
Msg 8921, Level 16, State 1, Line 18
Check terminated. A failure was detected while collecting facts. Possibly tempdb out of space or a system table is inconsistent. Check previous errors.
What can I do? I just need to make a backup.
I'm using Microsoft SQL Server Management Studio.
First of all, check the service account used for the SQL Server instance in Services.
Ensure the service account has enough permissions to read and write at the exact backup location on the physical disk.
Ensure the user you are using to log in to the SQL instance has enough permissions to perform a backup.
As a final option to recover the data, create another database with the same (blank) tables on a different machine in a different SQL instance, then export the whole database to the new one using Management Studio (right-click the database > Tasks > Export Data).
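If the CRC read error keeps blocking a normal backup, T-SQL also allows forcing a backup past damaged pages with CONTINUE_AFTER_ERROR. A sketch, using the database name from the question (the target path is hypothetical); note the resulting backup may contain corrupt pages, so treat it as a salvage copy rather than a trusted backup:

```sql
-- Force a backup even if read errors (such as CRC failures) occur.
-- The backup set is flagged as containing errors and should only be
-- used for salvage, not as a trusted restore source.
BACKUP DATABASE [MYDATABASE]
TO DISK = N'D:\Backup\MYDATABASE_salvage.bak'  -- hypothetical target path
WITH CONTINUE_AFTER_ERROR;
```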
My question is: how do you develop and deploy a database project to Azure SQL DB that uses external tables?
I’m using Visual Studio 2017 database projects to manage Azure SQL databases. I’ve been following the path of building the database in VS and then hitting the Publish button, and it works great.
Now I am trying to add external tables for doing Elastic Queries. In SSMS, I created the external data source and credential for it, and a remote table:
CREATE EXTERNAL DATA SOURCE [RemoteServer]
WITH (TYPE = RDBMS, LOCATION = N'myserver.database.windows.net',
CREDENTIAL = [RemoteUser], DATABASE_NAME = N'MyRemoteDb')
GO
CREATE EXTERNAL TABLE dbo.MyTable (
ID int not null,
MyColumn varchar(10) not null
) WITH (
DATA_SOURCE = [RemoteServer]
);
GO
(Credential for RemoteUser also exists.)
SELECT * FROM MyTable
produces the expected result and all is fine in the world.
Now I want to add this to my database project so it can be deployed via publishing (in VS) for any future changes (and saved in source control).
I am getting the error:
SQL71501: External Data Source: [RemoteServer] has an unresolved reference to Database Scoped Credential [RemoteUser].
I can remove the error by setting the properties of all the external pieces to not build, but that defeats the purpose of being able to publish from Visual Studio.
I’ve tried removing the credential from the data source definition. That removed the error, but publishing then fails because the credential is missing:
CREATE EXTERNAL DATA SOURCE [RemoteServer]
WITH (TYPE = RDBMS, LOCATION = N'myserver.database.windows.net',
DATABASE_NAME = N'MyRemoteDb')
GO
SQL72014: .Net SqlClient Data Provider: Msg 46505, Level 16, State 15, Line 1 Missing required external DDL option 'CREDENTIAL'.)
Any suggestions?
Elastic query in Azure is still in preview at the time of writing, and Microsoft lists a number of limitations on the overview page. Unfortunately, one of them concerns scripting external tables and external data sources via SSDT. It is noted here (see the second item in the list regarding SSDT).
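One common workaround (my suggestion, not something the overview page prescribes) is to exclude the external objects from the build, as the asker already tried, and instead create them idempotently in a post-deployment script. A sketch, reusing the object names from the question:

```sql
-- Post-deployment script sketch: create the external data source only
-- if it does not already exist, so repeated publishes stay idempotent.
-- Names ([RemoteServer], [RemoteUser], MyRemoteDb) are taken from the
-- question; wrap them in EXEC so the batch parses on servers that do
-- not support the syntax.
IF NOT EXISTS (SELECT 1 FROM sys.external_data_sources
               WHERE name = N'RemoteServer')
BEGIN
    EXEC (N'CREATE EXTERNAL DATA SOURCE [RemoteServer]
            WITH (TYPE = RDBMS,
                  LOCATION = N''myserver.database.windows.net'',
                  CREDENTIAL = [RemoteUser],
                  DATABASE_NAME = N''MyRemoteDb'')');
END
```

The trade-off is that these objects live outside the model, so the project no longer validates references to them at build time.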
Without going into too much detail, I had a system failure which meant that SQL got shut down unexpectedly. I have managed to detach the database and remove it from the list of databases in SSMS.
I am now trying to re-attach the database from a different location on the same NAS.
The command I am using is this:
CREATE DATABASE ArchiveManager
ON (Filename = '\\411352-web5\m$\FTP\Detroit\ArchiveManager.mdf'),
(Filename = '\\411352-web5\m$\FTP\Detroit\ArchiveManager_log.ldf')
FOR ATTACH;
Originally, the database sat on the j$ share. I have since had to move it to m$.
I get the error below when trying to attach the database:
File activation failure. The physical file name "\\192.168.200.222\j$\FTP\Detroit\ArchiveManager_log2.ldf" may be incorrect.
The log cannot be rebuilt because there were open transactions/users when the database was shutdown, no checkpoint occurred to the database, or the database was read-only. This error could occur if the transaction log file was manually deleted or lost due to a hardware or environment failure.
Msg 1813, Level 16, State 2, Line 7
Could not open new database 'ArchiveManager'. CREATE DATABASE is aborted.
Note that 192.168.200.222 resolves to 411352-web5.
(Here is a problem with a similar issue:
Publish to SQL Azure fails with 'Cannot drop the external data source' message)
There is this new pattern called Sql Db Elastic Query (Azure).
The gist of it is captured here:
https://learn.microsoft.com/en-us/azure/sql-database/sql-database-elastic-query-overview
Now, I have a SqlProj that defines an External Data Source (EDS) and Database Scoped Credentials (DSC). To keep passwords out of the script, I am using SQLCMD variables.
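For reference, keeping the secret out of the checked-in script with SQLCMD variables looks roughly like this (the credential and variable names here are hypothetical, chosen for illustration; the actual values are supplied at publish time):

```sql
-- A master key must exist before a database scoped credential
-- can be created in the database.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '$(MasterKeyPassword)';
GO

-- The credential the external data source references; the identity
-- and secret come from SQLCMD variables rather than source control.
CREATE DATABASE SCOPED CREDENTIAL [ElasticQueryCred]
WITH IDENTITY = '$(RemoteUserName)',
     SECRET = '$(RemoteUserPassword)';
GO
```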
I probably have a gazillion views on "external" tables, based on the elastic query pattern.
And during DACPAC deployments to SQL Azure, I always get an error on:
Error SQL72014: .Net SqlClient Data Provider: Msg 33165, Level 16, State 1, Line 1 Cannot drop the external data source 'XXXX' because it is used by an external table.
Error SQL72045: Script execution error. The executed script:
DROP EXTERNAL DATA SOURCE [XXXX];
Checking the logs, I realize that there are all these views/tables that exist and use this EDS/DSC combo.
The workaround comes at a price that keeps getting steeper.
So the question is, has anyone else hit this problem and found the root cause of this?
~~~~~~~~~~~~~~~~
Hi, all!
I have a database model, xxx.pdm, and a SQL script I want to apply to the database (so that the generated xxx_db.sql, xxx_triggers.sql, etc. will contain the changes; the files are used in the whole application build process to generate the yyy.db file).
I've tried to:
open the pdm file with PowerDesigner 16.5
go to Database->Update Model from Database...
select "using script files", specify a SQL file (with some CREATE INDEX and ALTER TABLE statements), and press OK
PowerDesigner showed a progress dialog, then a Merge Models dialog with yellow locks next to some of the entities.
Then I try to generate the database via Database->Generate Database..., with xxx_db.sql selected in the dialog.
The result: generation aborted due to errors detected during verification of the model.
Category Check Object Location
Reference Incomplete join Reference 'FK_table1_col1' <Model>
Reference Incomplete join Reference 'FK_table2_col2' <Model>
At the same time, the SQL script executes fine via Sybase Interactive (command line).
Is my approach correct?
Maybe I'm doing something wrong?