I have a SQL Server CLR trigger project that was created in VS2008. Opening the project in VS2012 does not present any problem, but on build of the solution, I get SQL71501 errors, with Trigger: [...] has an unresolved reference to object [...].
Based on my reading, this is due to a missing database reference in the project. When I try to add a database reference, I get the Add Database Reference dialog that gives me three options:
Database projects in the current solution (this option is grayed out/disabled)
System database (only shows system DBs)
Data-tier Application (.dacpac) (there are no options to select, as this was not how I created the project)
Further reading suggested that the reason there are no database projects to select for the first option is that no Data Connections have been added via the Server Explorer. In my case there are certainly Data Connections present, and while my project is open I can quite happily browse the database, look at data, etc.
I thought it might have something to do with the Target Framework, so I have tried targeting 3.5 and even 2, but the same problem occurs.
I feel like I'm missing something fundamental, but just can't quite work it out. Any help would be GREATLY appreciated.
I've seen this dialog not enable the 'OK' button because the Database Variable it defaults to is invalid (in my case it had a '.' in it).
The clue that this is the error is that the text in the 'Example Usage' field contains the error message - it's just hard to see, as it's dark grey on light grey.
Editing the Database Variable name fixes this.
I found a workaround for this; it isn't optimal, but at least it works:
try this
- open SQL Server Object Explorer
- create a new connection to your server
- right-click on the database and select Create New Project...
- the wizard will create the project with all the references and connection string attached to it.
cheers!
I changed my answer.
The problem is that you cannot assume an insert is a single row, and you can really only reference the primary key as a single value inside the trigger.
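To illustrate the multi-row point, here is a rough T-SQL sketch of the set-based pattern a trigger has to follow (the table, column, and audit names are placeholders, not from the original project; the same idea applies whether the trigger body is T-SQL or CLR):

CREATE TRIGGER dbo.trg_Orders_Insert
ON dbo.Orders
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    -- Read from the inserted pseudo-table as a set;
    -- never assume it contains exactly one row or one key value.
    INSERT INTO dbo.OrderAudit (OrderId, AuditedAt)
    SELECT i.OrderId, SYSDATETIME()
    FROM inserted AS i;
END;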
In VS2022, I have a database project. Within that project are some views and functions which refer to a system database, so I have added those system database references, both master and msdb, to the project. The references work; all is good.
I close the solution and reopen it, and now the project shows two references to each database, and a bunch of script errors because an unresolved reference exists:
So the fix is again to remove these 4 references, add the database reference back to master and msdb, and then all is good, until I reopen the solution again!
One side note, this solution was originally created in VS2019. Also, this happens on 2 separate machines. I'm running VS 17.3.3 64-bit.
For anyone facing the same problem, VS 2022 adds two references to the DBs, one from the VS extensions folder and the other from the SQL Server folder. It's definitely a bug and happens often when updating VS 2022.
However, the solution is to delete the second one from the project references (the SQL Server folder reference) and then click on the project and explicitly save it with Ctrl+S; otherwise the change will not be saved, and whenever you close and reopen the solution, the project will show invalid references.
I'm currently updating all of our ETLs using Visual Studio 2015 (made in BIDS 2008) and redeploying them to a new reporting server running on SQL Server 2016 (originally 2008R2).
While updating one of the ETLs and trying to run on the new server I got this error:
The package execution failed. The step failed.
Sometimes it also produces this error:
Source: Load Fact Table SSIS.Pipeline Description: "Copy To Fact Table" failed validation and returned validation status "VS_NEEDSNEWMETADATA".
I've tried deleting and re-adding the OLE DB Destination and connection strings, and opened up the column mappings to refresh the metadata. I also recreated the whole data flow task, but I'm still getting the same error.
The package runs fine on my local machine.
UPDATE:
I started taking the package apart and running only pieces of it to try and narrow down which part was failing. It seemed to be failing on loading into the staging table but I couldn't find out why.
I eventually decided to just try and re-create the whole thing. After re-creating the entire package, still no luck. The picture below is from the event viewer on the server itself but it didn't give me any new information.
Package error from event viewer
I have tried all the solutions provided above and the other sites. Nothing worked.
I got a suggestion from my friend which worked for me.
Here are the steps:
Right click on the Source/Target Data flow component.
Go to Advanced Editor -> Component Properties
Find ValidateExternalMetadata and set it to False.
Try your luck. This is a pathetic issue and left me clueless for 2 days.
I finally found the issue and here's how I did it.
Because the error messages I was getting from SSMS weren't very insightful I first opened up my remote desktop and logged into the server. Then I went to Administrative Tools>Event Viewer and then Windows Logs>Application to see if the failed event would provide greater detail.
It didn't give me much still.
The next step I took was to run the package from the command line because the messages should be more verbose. Opened up cmd, changed directory to the one my package was in and then...
DTEXEC /FILE YourPackageName.dtsx
Finally, the error message here showed a missing column in the tables the package was trying to write to. I added those columns and voila!
As stated in comments,
if it runs ok in your development environment, then the problem isn't with the package, it's with the scheduled job on the server. Try recreating that.
If that doesn't work,
It seems like the server has a cached instance of the package it's using instead of the updated one. Try renaming your package and creating a new job with the new package name and see if that works.
If that doesn't work,
all I can recommend at that point is to cut the package down until it succeeds, then add the next step that fails.
Sounds like from your solution the development environment is more forgiving of schema updates than the deployed solution. Glad you were able to resolve, eliminating clutter helps.
I had the same problem, and my issue was a difference between two environments: the same field in the same table was written once with a capital letter and once without. So the name was the same, but with this small difference (e.g. isActive vs IsActive).
This came from a refactoring effort, where we used the VS database publish, which did not update the field name.
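If you need to bring such a drifted column name back in line by hand, a minimal sketch would be something like this (the table and column names here are placeholders, not the actual schema):

EXEC sp_rename 'dbo.Customers.isActive', 'IsActive', 'COLUMN';

Once both environments agree on the exact casing, the package metadata comparison no longer sees a difference.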
Have you tried deleting and re-creating the source? When I get this error, I can generally modify any object that has the error, but I have to delete and rebuild the paths between them; sometimes I have to delete everything in the data flow and re-create it.
A Proxy for SSIS Package Execution should be created under the SQL Server Agent. You should then change your job step (or steps) to Run As the Proxy you've created.
I had your same problem some time ago and the proxy fixed it.
Forgive me if you've already tried this.
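In case it helps, here is a rough T-SQL sketch of setting up such a proxy (the credential, proxy, Windows account, and login names are placeholders; adjust to your environment):

USE master;
GO
-- Credential mapped to a Windows account that has the permissions the package needs
CREATE CREDENTIAL SsisProxyCredential
    WITH IDENTITY = N'DOMAIN\SsisProxyAccount',
         SECRET = N'StrongPasswordHere';
GO
USE msdb;
GO
-- Proxy based on that credential
EXEC dbo.sp_add_proxy
    @proxy_name = N'SsisPackageProxy',
    @credential_name = N'SsisProxyCredential',
    @enabled = 1;
-- Allow the proxy to be used for SSIS package job steps
EXEC dbo.sp_grant_proxy_to_subsystem
    @proxy_name = N'SsisPackageProxy',
    @subsystem_name = N'SSIS';
-- Let the job owner's login use the proxy
EXEC dbo.sp_grant_login_to_proxy
    @proxy_name = N'SsisPackageProxy',
    @login_name = N'DOMAIN\JobOwnerLogin';

Once the proxy exists, it shows up in the Run As drop-down of the job step.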
It is very common to get that message when 2 columns in the source file are being inserted into the same field of the table.
i.e.
My text file has "neighborhood" twice (the same label for different columns) and my table has "neighborhood" and "neighborhoodb" (notice the "b" at the end). The import will try to import both text columns into the field "neighborhood" and ignore the "neighborhoodb" field, and it will fail with the "VS_NEEDSNEWMETADATA" error.
Re-creating the job worked for me. Some cached version of the job may have been causing the VS_NEEDSNEWMETADATA error. The package executed correctly on its own, but it failed when it was executed by an agent job.
This ended up being a permissions issue for me. The OLE DB Source was using a stored procedure that selected from a SQL view. This view joined to a table in another database, and unfortunately the proxy account the SQL Agent job step was running the package under did not have SELECT permission on the table in that database. This is why the package ran fine in Visual Studio but not from a job when deployed to the server.
I found the root cause of the error by taking the SELECT statement out of the stored procedure and putting it directly in the Source Query box of the OLE DB Source control, which caused it to finally return the 'SELECT permission denied' error message. This error was apparently hidden from SSIS since the proxy account DID have execute permission on the stored procedure.
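Granting the proxy's account read access in that other database is what resolves this kind of problem. A rough sketch (the database, table, and account names are placeholders):

USE OtherDatabase;  -- the database the view joins to
GO
-- Create a database user for the proxy's login if one doesn't exist yet
CREATE USER [DOMAIN\SsisProxyAccount] FOR LOGIN [DOMAIN\SsisProxyAccount];
GO
-- Grant read access to the table the view references
GRANT SELECT ON dbo.ReferencedTable TO [DOMAIN\SsisProxyAccount];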
It worked for me after changing ValidateExternalMetadata to false on the "ADO NET Destination". I was transferring data from an MSSQL database to a MySQL database.
You may need to strongly type your Source Query.
Example:
If your DestinationDB has a FullName field of NVARCHAR(255)
and in your source query you have
select firstname + lastname as FullName from...
Try this:
Select CONVERT(NVARCHAR(255),firstname + lastname) as Fullname from...
So if you are going from DB to DB and both columns are NVARCHAR(255), I don't have this issue, but if you are concatenating fields in your query, specify the data type and length.
This error can also occur when an entire SSIS project needs to be redeployed rather than just one of the packages (for VS versions that allow deployment of a single package in a multi-package project), particularly when a project connection has been changed or added. For example, if you've added or removed columns from a flat-file project connection. In that case, you need to deploy the entire project to push out the updated project connection properties. This can be true even if the project only has one package in it. In VS Solution Explorer, rather than click on the package name to deploy, select the bolded project name at the top, and then click deploy.
I am getting an error that makes no sense when saving a table created within VS2012, SQL Express 2008 R2.
Item in the Virtualizing TreeView cannot be null
This is a table save, not a rebuild, unless that happens anyway. But I do not get this error at any other time, even after creating a new web page or saving changes to an existing page.
Really not sure where to look for the cause. I have not changed any of the pages that use the treeview in over 3 years. They are in a totally different part of the web site.
Any help appreciated.
I also encountered the same problem some time ago and landed here to find that not much of a solution has been provided.
What I found is that if that instance of VS is closed and a new instance is opened, the error is resolved.
You can go to Server Explorer in VS and right-click on your connection. It will ask you about Database Diagramming; select Yes for it. Now you will not get this error.
I have created an SSDT project for a SQL Server 2012 database. Since I already have the database present in the SQL Server database engine, I used the import feature to import all the objects into SSDT. Everything works fine, but I am now facing 2 problems.
1) One of the tables uses the HIERARCHYID data type for a column (Col1), and there is a computed column based on that HIERARCHYID column. The definition of the computed column is something like CASE WHEN Col1 = hierarchyid::GetRoot() THEN NULL ELSE someexpression END. After importing the table script into SSDT, unresolved reference errors start coming up.
If I change the definition to something like CASE WHEN hierarchyid::GetRoot() = Col1 THEN NULL ELSE someexpression END (note Col1 is now at the end), it works fine.
2) If I keep the above solution (i.e. keeping Col1 after the =), then at the time of publishing the project, SSDT has to drop the column on the production server and then recreate it. Since there is an index that depends on this column, the deployment fails every time with an error like ALTER TABLE DROP COLUMN failed because another object accesses it. I have no control over how SSDT designs/publishes the script, and if I have to keep an eye on dropping every dependent object before publishing the database project, then I think there is no use for it.
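For illustration, here is a rough sketch of the shape of the schema involved (all names and the non-hierarchy columns are placeholders, not the real table):

CREATE TABLE dbo.SomeTable
(
    Id INT NOT NULL PRIMARY KEY,
    Col1 HIERARCHYID NOT NULL,
    SomeValue INT NULL,
    -- With Col1 on the left of the comparison, SSDT reports an unresolved reference:
    --   ComputedCol AS (CASE WHEN Col1 = hierarchyid::GetRoot() THEN NULL ELSE SomeValue END)
    -- With the operands swapped it builds, but publish then wants to drop and recreate the column:
    ComputedCol AS (CASE WHEN hierarchyid::GetRoot() = Col1 THEN NULL ELSE SomeValue END)
);
GO
-- The index that depends on the computed column and blocks the generated ALTER TABLE ... DROP COLUMN
CREATE INDEX IX_SomeTable_ComputedCol ON dbo.SomeTable (ComputedCol);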
Please suggest how I can resolve this.
Thanks
Atul
I was able to reproduce the reference resolution problem you described. I would suggest submitting that issue to Microsoft via Connect here: https://connect.microsoft.com/SQLServer/feedback/CreateFeedback.aspx
I was not able to reproduce the publish failure. Which version of SSDT does the Visual Studio Help > About dialog show is installed? The most recent version ends with 40403.0. If you're not using the most recent version, I would suggest installing it to see if that fixes the publish failure. You can use Tools > Extensions and Updates to download SSDT updates.
If you do have the most recent version, could you provide an example schema that demonstrates the problem?
Compare your project to a production dacpac and have it generate scripts to make the changes. Then if need be, you can edit the scripts before they get applied to production. This is how my dev teams do it.
I was running into the same issue for a number of days now. After finding your post to confirm the issue was in SSDT, I realized that it may be fixed in a later version than the one we are currently using: 12.0.50730.0 (VS 2013, the version this project uses).
I also have version 14.0.3917.1 installed from VS 2017. I just attempted with that, no issues. So the solution is to upgrade your SSDT version.
Please ignore that solution; it appears my success last night was anomalous. While attempting to repeat it today after restoring a database with the issue, the deployment again failed to account for at least one index.
EDIT:
I have posted about this on User Voice: https://feedback.azure.com/forums/908035-sql-server/suggestions/33850309-computed-column-indexes-are-ignored-with-dacpac-de
Also, to maintain that this is at least a workable answer of sorts, the workaround I am implementing involves dropping and recreating the missed indexes myself using pre and post deployment scripts.
Not an ideal solution if the dacpac was meant to update various versions of the database that could have different levels of drift from the model, however it works for us as we have a tight control over all instances and can expect about the same delta generated each release for each db instance.
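As a rough sketch of that workaround (the index, table, and column names are placeholders for whatever the dacpac deployment misses), the pre-deployment script drops the index so the column change can go through, and the post-deployment script puts it back:

-- Pre-deployment script
IF EXISTS (SELECT 1 FROM sys.indexes
           WHERE name = N'IX_SomeTable_ComputedCol'
             AND object_id = OBJECT_ID(N'dbo.SomeTable'))
    DROP INDEX IX_SomeTable_ComputedCol ON dbo.SomeTable;
GO

-- Post-deployment script
IF NOT EXISTS (SELECT 1 FROM sys.indexes
               WHERE name = N'IX_SomeTable_ComputedCol'
                 AND object_id = OBJECT_ID(N'dbo.SomeTable'))
    CREATE INDEX IX_SomeTable_ComputedCol ON dbo.SomeTable (ComputedCol);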
I've read through a couple of previous similar questions and none seem to provide a fix.
So I ask again.
I'm using Visual Studio and am trying to connect to a DB in Server Explorer. Regardless of what database I try to connect to, it gives a "Given Key not Present in the Dictionary" error.
I have tried with SQL CE and SQL Express 2008 databases, and each gives the same issue.
I can connect quite easily with SQL Management Studio Express, so I believe the databases themselves are fine.
Thanks in Advance.
After a lot of time searching for the answer to this, I found the solution here.
This is not really a workaround but in fact a SOLUTION for getting rid of the error, in case you missed removing all connections before removing the provider:
Edit C:\Users...\AppData\Roaming\Microsoft\VisualStudio\10.0\ServerExplorer\DefaultView.SEView and remove the connection with the wrong provider manually. If you do not know which of the providers is failing, simply delete the file :)
If this doesn't help, also try deleting C:\Users...\AppData\Local\Microsoft\VisualStudio\10.0
PS: you have to quit all instances of Visual Studio first or the files will be re-created from memory.
Only the second part worked for me: deleting the whole folder.
I had recently installed the MySQL Connector, and had multiple instances of Visual Studio open. After shutting them all down, I was able to add a connection to the Server Explorer.
Adding to what BastanteCaro said, I had opened the DefaultView.SEView file in case I needed to go down that path. When I shut everything down and started a new instance of Visual Studio, Notepad++ reported that the file had changed. So either there was an uncommitted change to the file, or some sort of cleanup/addition was made on startup.