Unresolved reference to object [INFORMATION_SCHEMA].[TABLES]

I've created a UDF that accesses the [INFORMATION_SCHEMA].[TABLES] view:
CREATE FUNCTION [dbo].[CountTables]
(
    @name sysname
)
RETURNS INT
AS
BEGIN
    RETURN
    (
        SELECT COUNT(*) FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = @name
    );
END
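Calling the function directly works as expected, e.g. (table name illustrative):
SELECT [dbo].[CountTables](N'MyTable');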
Within Visual Studio, the schema and name for the view are both marked with a warning:
SQL71502: Function: [dbo].[CountTables] has an unresolved reference to object [INFORMATION_SCHEMA].[TABLES].
I can still publish the database project without any problems, and the UDF does seem to run correctly. IntelliSense populates the name of the view for me, so it doesn't seem to have a problem with it.
I also tried changing the implementation to use sys.objects instead of this view, but I was given the same warning for this view as well.
How can I resolve this warning?

Add a database reference to master:
Under the project, right-click References.
Select Add database reference....
Select System database.
Ensure master is selected.
Press OK.
Note that it might take a while for VS to update.
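For reference, the dialog ends up adding an entry to the .sqlproj along these lines (the exact dacpac path varies with the VS version and target platform, so treat this as illustrative):
<ItemGroup>
  <ArtifactReference Include="$(DacPacRootPath)\Extensions\Microsoft\SQLDB\Extensions\SqlServer\130\SqlSchemas\master.dacpac">
    <HintPath>$(DacPacRootPath)\Extensions\Microsoft\SQLDB\Extensions\SqlServer\130\SqlSchemas\master.dacpac</HintPath>
    <SuppressMissingDependenciesErrors>False</SuppressMissingDependenciesErrors>
    <DatabaseVariableLiteralValue>master</DatabaseVariableLiteralValue>
  </ArtifactReference>
</ItemGroup>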

In our project, we already have a reference to master, but we had this issue. Here was the error we got:
SQL71502: Procedure: [Schema].[StoredProc1] has an unresolved reference to object [Schema].[Table1].[Property1].
To resolve the reference error, right-click the table's .sql file, select Properties, and verify that the Build Action is set to Build.
Changing it to Build fixed it for us.

What Sam said is the best way to do this.
However, if you have a scenario where you need to deploy the dacpac from a machine that doesn't have that reference in that specific location, you may get into trouble.
Another way is to open your .sqlproj file and make sure the following tag has the value false for the build configuration you are trying to run.
<TreatTSqlWarningsAsErrors>false</TreatTSqlWarningsAsErrors>
This way you don't need to add a reference to your project.
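In a typical .sqlproj this tag lives in the configuration-specific PropertyGroup, e.g. (configuration and platform names are illustrative):
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
  <TreatTSqlWarningsAsErrors>false</TreatTSqlWarningsAsErrors>
</PropertyGroup>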

I'm using VS 2019, and even after adding the master db reference I still got this issue. I resolved it by changing the target platform of the DB project (under Project Properties > Project Settings > Target platform). I had to remove and re-add the master db reference after this change.


How do you see what SQL IronSpeed sends to the database?

I'm using IronSpeed Designer 12.2 and trying to write custom SQL in a WhereClause override. The custom SQL I wrote and submitted in the WhereClause is throwing an SQL exception, but I can't see the SQL IronSpeed is sending to the database. Without the SQL, I cannot troubleshoot.
I can't find where the SQL is submitted to the database, such as by an ExecuteReader method call.
I'm using a statement like this:
if (MiscUtils.IsValueSelected(this.MyFilter)) {
    String sql = "(EXISTS (SELECT TOP 1 CompanyId FROM Collateral as c WHERE CODE = '{0}' AND c.CompanyId = Company.CompanyId))";
    wc.iAND(String.Format(sql, this.MyFilter.SelectedValue));
}
I know my WhereClause SQL is correct when used outside of IronSpeed because I copy-pasted it from a query that works directly in MSSQL. However, I can't see how IronSpeed combines it with its internally generated SQL after it becomes a WhereClause.
I'm hoping someone has experience with this issue and can point me in the right direction. Thanks for the help!
If you look for an answer long enough, you can find it yourself. Here's how I found you can examine the SQL sent to the database:
Go to C:\Program Files\Iron Speed\Designer v12.2.0.
Copy the BaseClasses folder to the root of your IronSpeed solution folder.
Add the existing BaseClasses project to the IronSpeed solution.
Delete the existing references to baseclasses.dll from the projects in the IronSpeed solution (I'm using a web app rather than a web site project).
Add references to the BaseClasses project now included in the solution.
Open the file MicrosoftDynamicSQLAdapter.vb.
In the GetRecordValuesEx(...) method, go to the line 1514 statement "reader = SqlTransaction.ExecuteReader(myCommand, cmdBehavior)" and set a breakpoint on it.
Run the project. When the breakpoint is hit, examine the myCommand object.
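At the breakpoint you can print the complete statement in the Immediate window, e.g.:
?myCommand.CommandText
Inspecting myCommand.Parameters the same way shows the bound parameter values, which together give you the exact SQL IronSpeed sends to the database.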

SSIS package validation error: OLE DB Source failed

I am getting the following error when I try and run my package. I am new to SSIS. Any suggestions? Thanks.
===================================
Package Validation Error (Package Validation Error)
===================================
Error at Data Flow Task [SSIS.Pipeline]: "OLE DB Source" failed validation and returned validation status "VS_NEEDSNEWMETADATA".
Error at Data Flow Task [SSIS.Pipeline]: One or more component failed validation.
Error at Data Flow Task: There were errors during task validation.
(Microsoft.DataTransformationServices.VsIntegration)
Program Location:
at Microsoft.DataTransformationServices.Project.DataTransformationsPackageDebugger.ValidateAndRunDebugger(Int32 flags, IOutputWindow outputWindow, DataTransformationsProjectConfigurationOptions options)
at Microsoft.DataTransformationServices.Project.DataTransformationsProjectDebugger.LaunchDtsPackage(Int32 launchOptions, ProjectItem startupProjItem, DataTransformationsProjectConfigurationOptions options)
at Microsoft.DataTransformationServices.Project.DataTransformationsProjectDebugger.LaunchActivePackage(Int32 launchOptions)
at Microsoft.DataTransformationServices.Project.DataTransformationsProjectDebugger.LaunchDtsPackage(Int32 launchOptions, DataTransformationsProjectConfigurationOptions options)
at Microsoft.DataTransformationServices.Project.DataTransformationsProjectDebugger.Launch(Int32 launchOptions, DataTransformationsProjectConfigurationOptions options)
VS_NEEDSNEWMETADATA shows up when the underlying data behind one of the tasks changes. The fastest solution will probably be to just delete and re-create each element which is throwing an error.
How about disabling validation checks?
If you right-click the source or destination component and select Properties, you will find a property named ValidateExternalMetadata; set it to False and try again.
This solution is working for me.
This normally occurs if there has been a change to your schema. No need to stress: just double-click your input and output and it should resolve itself.
Make sure your connection is valid. If you are using dynamic connections, try setting DelayValidation = True on the package or data flow.
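If many packages need the same change, the property can also be set programmatically through the SSIS managed API; a minimal sketch (the package path is illustrative):
using Microsoft.SqlServer.Dts.Runtime;

class FixDelayValidation
{
    static void Main()
    {
        Application app = new Application();
        // Load the package, enable DelayValidation, and save it back out.
        Package pkg = app.LoadPackage(@"C:\packages\MyPackage.dtsx", null);
        pkg.DelayValidation = true;
        app.SaveToXml(@"C:\packages\MyPackage.dtsx", pkg, null);
    }
}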
In my case the destination table structure did not match the metadata in the OLE DB component. I added the missing column which I had forgotten, and after that it was fixed.
After researching a bit, I think I've found a nice workaround for when the metadata problem comes from an OLE DB object, but only for a very specific case.
The thing is that when you rename columns, remove columns, or add columns, you can't do anything but update the metadata.
However, if you use a SQL query to retrieve the data from the object, you won't need to update the metadata as long as the query itself can still ask for what it wants; basically, as long as the query is still valid.
I tried it within my own ETL: I changed an OLE DB object that was reading data from an Excel file, targeting one sheet, with all of the columns selected in the columns tab.
Changing it to a SQL query that retrieves the full sheet, like:
SELECT * FROM ['Sheet_Name$']
This completely solved the case for me, even when introducing files with different metadata in the headers.

How to set up FULL-TEXT Lucene SEARCH in H2 Database without errors?

I followed the H2 tutorial on setting up full-text search (FTL) using Lucene; however, I'm experiencing unknown exceptions.
This is how I did it:
Using SQuirreL SQL Client, I added the lucene-core-3.0.3.jar library to the Additional classpath (otherwise it complains it cannot import classes).
then I called this:
CREATE ALIAS IF NOT EXISTS FTL_INIT FOR "org.h2.fulltext.FullTextLucene.init";
CALL FTL_INIT();
Afterwards, the *.trace.db log said that {db.name} exists but is not a directory.
I fixed it by renaming the database file to something else, then making a directory named after the db.
Now running this:
CREATE ALIAS IF NOT EXISTS FTL_INIT FOR "org.h2.fulltext.FullTextLucene.init";
CALL FTL_INIT();
works.
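From there, the usual next steps from the same H2 tutorial are creating an index and searching it; a minimal sketch (schema, table, and keyword are illustrative):
CALL FTL_CREATE_INDEX('PUBLIC', 'TEST', NULL);
SELECT * FROM FTL_SEARCH('hello', 0, 0);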

MsTest, DataSourceAttribute - how to get it working with a runtime generated file?

For some tests I need to run a data-driven test with a configuration that is generated (via reflection) in the ClassInitialize method. I have tried everything, but I just cannot get the data source properly set up.
The test takes a list of classes in a csv file (one line per class) and then will test that the mappings to the database work out well (i.e. try to get one item from the database for every entity, which will throw an exception when the table structure does not match).
The testmethod is:
[DataSource(
    "Microsoft.VisualStudio.TestTools.DataSource.CSV",
    "|DataDirectory|\\EntityMappingsTests.Types.csv",
    "EntityMappingsTests.Types#csv",
    DataAccessMethod.Sequential)
]
[TestMethod()]
public void TestMappings () {
Obviously the file is EntityMappingsTests.Types.csv. It should be in the DataDirectory.
Now, in the Initialize method (marked with ClassInitialize) I put that together and then try to write it.
WHERE should I write it to? WHERE IS THE DataDirectory?
I tried:
File.WriteAllText(context.TestDeploymentDir + "\\EntityMappingsTests.Types.csv", types.ToString());
File.WriteAllText("EntityMappingsTests.Types.csv", types.ToString());
Both result in "the unit test adapter failed to connect to the data source or read the data". More exactly:
Error details: The Microsoft Jet database engine could not find the
object 'EntityMappingsTests.Types.csv'. Make sure the object exists
and that you spell its name and the path name correctly.
So where should I put that file?
I also tried just writing it to the current directory and taking out the DataDirectory part - same result. Sadly, there is limited debugging support here.
Please use the Process Monitor tool from technet.microsoft.com/en-us/sysinternals/bb896645. Put a filter on MSTest.exe or the associated qtagent32.exe and find out what locations it is trying to load from, and at what point in the test loading process. Then please provide an update with those details here.
After you add the CSV file to your VS project, you need to open its properties. Set the property "Copy to Output Directory" to "Copy always". DataDirectory defaults to the location of the compiled executable, which runs from the output directory, so the file will be found there.
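If the file must be generated at runtime instead (as in the question), a hedged variant is to write it to the same folder the test assembly runs from, since that is where |DataDirectory| resolves by default (GenerateCsv is a hypothetical stand-in for the reflection code):
using System.IO;
using System.Reflection;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[ClassInitialize]
public static void Init(TestContext context)
{
    // Write the generated CSV next to the test assembly so that
    // |DataDirectory| can resolve it.
    string dir = Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location);
    File.WriteAllText(Path.Combine(dir, "EntityMappingsTests.Types.csv"),
        GenerateCsv()); // hypothetical helper that builds the CSV text
}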

SSIS - Skip Missing Files

I have a SSIS 2008 package that calls about 25 other SSIS packages.
Each of those child packages loads a specific file into a table. But sometimes one or more of these input files will be missing.
How can I let a child package fail (because a file is missing) but let the rest of the parent package keep on running?
I've tried increasing the maximum error count on the parent package, the tasks in the parent package that call each child, and in the child package itself. None of that seemed to make any difference. I still get this error when I run it with a file missing:
SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED. The Execution method succeeded, but the number of errors raised (2) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
Edit:
FailPackageOnFailure and FailParentOnFailure are already set to false everywhere.
I haven't tried this, but this is how I would approach it.
Create a variable for the file name and the child package name.
Use a For Each Loop container. Have it go through the location of the files and pull the file names one at a time. Use each file name to set the child package name variable. Inside the container, have a task that runs the child package, with the package name set dynamically from the value of the child package name variable.
Then it should only try to run the child packages which have appropriate files.
In the properties of the Execute Package Task, you can set FailPackageOnFailure and FailParentOnFailure. I haven't worked with these, but you can probably play with them to get your desired results.
Side note: for simplicity, I'd set these settings on the parent SSIS package.
There are MaximumErrorCount values at the Sequence Container and package level. If you're using these, be sure the values are in sync, because the package-level settings take precedence.
Another option is the ForcedExecutionValue.
To set this up, open the Properties tab for each container and:
1) Set ForceExecutionValue to TRUE
This will cause the container to return whatever value you put in the variable (see step #2), despite the outcome of the task(s).
2) Set ForcedExecutionValue to 0
This acts as the return value for that task, setting it to 0 (success; think "return 0" as in C++).
I hope that helps.
I have done this kind of scenario development. First, plan the package execution method: whenever a file arrives, its package should be processed; if a file is missing, either fail or skip that package, the ultimate target being to process the packages for all the files that do exist. Take a variable for each package, and set it to "Y" or "N" in the parent package based on whether the file exists, using a Script Task or the connection string; then use that variable in a precedence-constraint expression to decide whether to execute the child package (a minimal sketch follows below).
This method gave us the desired results of processing multiple files with different occurrences of source files.
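A rough sketch of such a Script Task body, assuming parent-package variables User::SourceFilePath and User::FileExistsFlag (both names are illustrative):
using System.IO;

public void Main()
{
    // Dts and ScriptResults are provided by the SSIS-generated ScriptMain class.
    // Flag the file's presence so a precedence-constraint expression can test it.
    string path = (string)Dts.Variables["User::SourceFilePath"].Value;
    Dts.Variables["User::FileExistsFlag"].Value = File.Exists(path) ? "Y" : "N";
    Dts.TaskResult = (int)ScriptResults.Success;
}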
thanks
prav