SSRS Text Query: Variable names must be unique within a query batch or stored procedure - sql

I am developing an SSRS 2008 report, but instead of using stored procedures, I want to use all text queries. This report was working with stored procedures, but when I changed it to use the same logic via text queries, I got the following error:
An error occurred during local report processing
    Query execution failed for dataset 'BRSR_Totals'
        The variable name '@END_yEAR' has already been declared. Variable names must be unique within a query batch or stored procedure.
Operation cancelled by user.
The problem is that some of my datasets (text queries) re-use the same parameters and END_YEAR is one of these parameters. How do I make this report run correctly?

One area that you might want to check is case sensitivity. SSRS is case-sensitive when matching parameter names, but T-SQL is not. Take another look at your code and make sure that every reference to each parameter uses the same case.
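For example, here is a hypothetical text query (the table and column names are invented for illustration) where T-SQL would happily treat the two spellings as one variable, but SSRS would detect them as distinct parameters and end up declaring the variable twice when it wraps the query:

SELECT *
FROM dbo.Sales
WHERE SaleYear BETWEEN @START_YEAR AND @END_YEAR
UNION ALL
SELECT *
FROM dbo.SalesArchive
WHERE SaleYear BETWEEN @Start_Year AND @End_Year   -- mixed case: same variables to T-SQL, extra parameters to SSRS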

I just resolved a similar issue using a text query to populate a dataset. It worked in SQL Server Management Studio and it worked in the Query Designer within BIDS, but failed at runtime.
The issue turned out to be BIDS helpfully adding parameters to the Dataset that this query was referencing. Switching to the Parameters tab of the Dataset Properties showed that BIDS had duplicated the parameters I had already added earlier. Deleting the duplicates resolved my problem.
To respond to the suggestion that the logic be off-loaded into a stored procedure: in this case, the report is a custom report for a single customer. The query will only ever be used in this report, and it makes a few assumptions about the customer's configuration that should not be available globally.

I also just fixed this same issue in one of my queries. I was using a text query and had datetime variables/parameters. SSRS added a second set into the parameters for the dataset properties. I deleted them and my query ran fine after that and my graph populated.

I ran into a similar issue on a report where I had declared a substantial number of parameters at the beginning that I didn't want the end user to see. The issue was that I was using a comma at the beginning of each line, so I had:
DECLARE @Parameter1 VARCHAR(5) = 'text1'
, @Parameter2 VARCHAR(5) = 'text2'
It worked just fine in SSMS, but when I ran it in Report Builder 3.0 it threw the error shown in this thread. I changed it to remove the comma and to restate DECLARE at the beginning of each line and it worked perfectly.
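For reference, a sketch of the reworked declarations (the names and values here are placeholders, not the report's real parameters):

-- One DECLARE per variable; Report Builder parses this without complaint.
DECLARE @Parameter1 VARCHAR(5) = 'text1'
DECLARE @Parameter2 VARCHAR(5) = 'text2'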

Check that you didn't declare it twice: once in the parameter list of the CREATE PROC statement you're working on, and again in the body of the procedure. I've seen this problem while testing changes to SP code.

Related

SSRS conditional execution of dataset

I have two datasets in an SSRS report which return different numbers of output fields. I have used these two datasets in two different tables in the report.
Only one table should display output at a time, based on a condition.
Currently, while running the report, both datasets execute their SPs, so it takes more time to display the output.
Requirement:
I need to execute only one dataset's SP at a time, based on a condition. The other dataset's SP should not execute.
Example:
Dataset1 executes Sp1
Dataset2 executes Sp2
Table1 uses Sp1
Table2 uses Sp2
Normally, while executing the report, Table1 will display output (based on the default parameter selection),
but SP1 and SP2 execute at the same time, so the report takes more time to display output.
I need to execute one SP at a time, based on a condition, so that the other dataset's SP does not execute.
Step 1:
First, create a dummy SP for Dataset 1. That SP should have the same input parameters and the same output fields as the original SP1, but return zero rows. Do the same for SP2 and Dataset 2.
Step 2:
In the Dataset Properties, select the Stored Procedure option and, in the fx (expression) column, add the code below:
=IIF(Parameters!ManagerID.Value = -1, "SP1", "DummySP")
Note: the DummySP you create should look just like SP1.
Do the same for Dataset 2, and this works.
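A minimal sketch of such a dummy procedure, assuming for illustration that SP1 takes a @ManagerID parameter and returns EmployeeID and Total columns (adjust to match the real SP1's signature and output):

CREATE PROCEDURE dbo.DummySP
    @ManagerID int
AS
BEGIN
    SET NOCOUNT ON;
    -- Same column names and data types as SP1, but never any rows.
    SELECT EmployeeID = CAST(NULL AS int),
           Total      = CAST(NULL AS money)
    WHERE 1 = 0;
END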
I have seen this answer and it's nice, but creating two SPs for every dataset can be very cumbersome. Sometimes a report uses many SPs, which means they would all have to be duplicated. And if you multiply that by many reports... all of those copies would need to be maintained and kept in sync for their whole lives.
Instead, I have another solution:
Step 1: I create one SP, and use one of the parameters that already exists anyway.
Step 2: In the mapping of the dataset's parameters to the report's parameters, instead of passing the report parameter to the dataset as-is, you write an expression: if I want to display this SP's data, I pass through the parameter that was sent to the report; if I do not want to display this data, I pass a different value, which causes the SP to return no data.
All of this, of course, only works if the procedure has parameters that allow it. It is also possible to create a special parameter just for this purpose.
For example, an expression like:
=IIf(Parameters!Display_xxx.Value=1,Parameters!yyy.Value,0)
You must control the visibility of the tablix based on the received parameter. In addition, you can add a parameter to both queries to short-circuit execution and avoid running the expensive part, like:
WHERE @Condition = 1 AND ( your original WHERE clause )
As for the performance side, you don't want to run the other SP at all. In that case, in my experience, the best way is to modify your stored procedures to add a new parameter based on whatever you use to decide which dataset to call (i.e. which data to pull).
That way, if the parameter says a specific SP is not required, the SP skips its code using an IF-ELSE and returns a single blank row (see the sketch below).
And then, on top of that, you can hide or show your tablix.
Hope that makes sense?
And if the column count and data types are the same, you can always use an expression to decide which stored procedure to call.
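A rough sketch of that IF-ELSE guard, with invented parameter and column names (adapt to your actual procedure):

ALTER PROCEDURE dbo.Sp1
    @ManagerID int,
    @RunThis   bit    -- new parameter: 0 = skip the expensive work
AS
BEGIN
    SET NOCOUNT ON;
    IF @RunThis = 0
    BEGIN
        -- Return one blank row with the expected columns so the tablix still binds.
        SELECT EmployeeID = CAST(NULL AS int), Total = CAST(NULL AS money);
        RETURN;
    END
    -- ...original expensive query...
    SELECT EmployeeID, Total
    FROM dbo.Sales
    WHERE ManagerID = @ManagerID;
END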

Trim Function in Dataset Stored Procedure Causes SSRS Report to Fail

I have an SSRS report. It has multi-valued parameters and a sub-report. It ran fine until I changed the dataset for a parameter (in both reports). When I previewed the report, it showed the error below:
"One or more parameters were not specified for the subreport located
at /report".
I tried the steps below to resolve the issue:
1. Ensured the parameter names are correct.
2. Removed the .rdl.data file.
3. Deleted the dataset and recreated it.
When I revert the change made to the dataset, the report runs perfectly. The only change I made to the dataset was the addition of an LTRIM(RTRIM()) function in the dataset query, to remove the spaces from the multi-value parameter values.
The dataset query was like below
select Loan_no
From loan_header
where rowatatus = 'A'
I have changed it into
SELECT LTRIM(RTRIM(Loan_no)) As Loan_no
From loan_header
where rowatatus = 'A'
When I remove the LTRIM(RTRIM()) function from the dataset query, the report works fine, so the use of LTRIM(RTRIM()) is what causes the report failure.
What could be the reason behind this? Has anyone faced a similar issue before? Please help.

Pentaho Execute SQL Statements variable conversion to null

I am using PDI to delete and insert some data in a DB, and I have the following issue. I create two variables called START_DATE and END_DATE that are used to select the data that will be deleted from my DB. I am able to get them and run my transformation with no errors in the log file, but when I checked whether data was actually deleted, I found it wasn't. I then checked my "DeleteProcedure" step, and it says "Conversion error: null". I have tried different approaches to take the variables and pass them as Strings, but I haven't been able to solve this issue. It cannot be a SQL mistake, as I tested it with a constant and it works.
Any ideas? I attach some pics. Thanks!
As the documentation of the Execute SQL script step says:
Note: When you have an issue, that the SQL is started at the initialization phase of the transformation and not for each row, make sure to check the option "Execute for each row" (see description below).
In your case it executes during the initialization phase of the transformation; that's why it gets null values instead of the ones from the previous step.

Data Driven Subscriptions SSRS Standard Edition 2008

I'm fairly new to MSSQL and SSRS.
I'm trying to create a data-driven subscription in MSSQL 2008 Standard SSRS that does the following:
Email the results of the report to an email address found within the report.
Run daily.
For Example:
Select full_name, email_address from users where (full_name = 'Mark Price')
This would use the email_address column to figure out who to email. It must also work for multiple results with multiple email addresses.
The way I'm thinking of doing this is to create a subscription that runs the query; if no result is found, nothing happens.
But if a result is found, the report changes the row in the Subscriptions table so the report runs again in the next minute or so with the correct email information found in the results.
Is this a silly idea or not?
I've found a couple of blog posts claiming this works, but I couldn't understand their code well enough to know what it does.
So, any suggestions on how to go about this? Or can you point to something already out there on the internet, with a brief description?
This takes me back to my old job where I wrote a solution to a problem using data-driven subscriptions on our SQL Server 2005 Enterprise development box and then discovered to my dismay that our customer only had Standard.
I bookmarked this post at the time and it looked very promising, but I ended up moving jobs before I had a chance to implement it.
Of course, it is targeted at 2005, but one of the comments seems to suggest it works in 2008 as well.
I've implemented something like this on SQL Server Standard to avoid having to pay for Enterprise. First, I built a report called “Schedule a DDR” (Data Driven Report). That report has these parameters:
Report to schedule: the name of the SSRS report (including folder) that you want to trigger if the data test is met. E.g. "/Accounting/Report1".
Parameter set: a string that will be used to look up the parameters to use in the report. E.g. "ABC".
Query to check if report should be run: a SQL query that will return a single value, either zero or non-zero. Zero will be interpreted as "do not run this report"
Email recipients: a list of semicolon-separated email recipients that will receive the report, if it is run.
Note that the “Schedule a DDR” report is the report we’re actually running here, and it will send its output to me; what it does is run another report – in this case it’s “/Accounting/Report1” and it’s that report that needs these email addresses. So “Schedule a DDR” isn’t really a report, although it’s scheduled and runs like one – it’s a gadget to build and run a report.
I also have a table in SQL defined as follows:
CREATE TABLE [dbo].[ParameterSet](
[ID] [varchar](50) NULL,
[ParameterName] [varchar](50) NULL,
[Value] [varchar](2000) NULL
) ON [PRIMARY]
Each parameter set – "ABC" in this case – has a set of records in the table. In this case the records might be ABC/placecode/AA and ABC/year/2013, meaning that there are two parameters in ABC: placecode and year, and they have values "AA" and "2013".
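For example, the "ABC" set described above could be loaded like this:

INSERT INTO dbo.ParameterSet (ID, ParameterName, Value)
VALUES ('ABC', 'placecode', 'AA'),
       ('ABC', 'year',      '2013');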
The dataset for the "Schedule a DDR" report in SSRS is
DDR.dbo.DDR3 @reportName, @parameterSet, @nonZeroQuery, @toEmail;
DDR3 is a stored procedure:
CREATE PROCEDURE [dbo].[DDR3]
    @reportName nvarchar(200),
    @parameterSet nvarchar(200),
    @nonZeroQuery nvarchar(2000),
    @toEmail nvarchar(2000)
AS
BEGIN
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;
    select ddr.dbo.RunADDR(@reportName, @parameterSet, @nonZeroQuery, @toEmail) as DDRresult;
END
RunADDR is a CLR function. Here's an outline of how it works; I can post some code if anyone wants it.
Set up credentials
Select all the parameters in the ParameterSet table where the parameterSet field matches the parameter set name passed in from the Schedule A DDR report
For each of those parameters
Set up the parameters array to hold the parameters defined in the retrieved rows. (This is how you use the table to fill in parameters dynamically.)
End for
If there’s a “nonZeroQuery” value passed in from Schedule A DDR
Then run the nonZeroQuery and exit if you got zero rows back. (This is how you prevent the report from running if some condition is not met; any query that returns something other than zero will allow the report to run. See the example query after this outline.)
End if
Now ask SSRS to run the report, using the parameters we just extracted from the table, and the report name passed in from Schedule A DDR
Get the output and write it to a local file
Email the file to whatever email addresses were passed in from Schedule A DDR
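A hypothetical nonZeroQuery, just to show the shape the CLR expects: a single scalar value where zero means "don't run the report" (the table and column are invented for illustration):

-- Run the scheduled report only if there is something to report on.
SELECT COUNT(*) FROM dbo.Invoices WHERE Posted = 0;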
Instead of creating a subscription to modify the subscriptions table, I would put that piece somewhere else, such as in a SQL Agent job. But the idea is the same: a regularly running piece of SQL can add or change rows in the subscription table.
A Google search for "SSRS Subscription table" returns a few helpful results. There's an article based on 2005, but the principles should be the same for 2008, and another article for 2008 that is really close to what you are describing as well.
I would just look at the fields one by one in the subscriptions table and determine what you need for each. Try creating a row by hand (a manual insert statement) to send yourself a subscription.
R-Tag supports SSRS data driven reports with SQL Server standard edition
You can use SQL-RD, a third-party solution, to create and run data-driven schedules without having to upgrade to SQL enterprise. It also comes with event-based scheduling (triggers the report on events including database changes, file changes, emails received and so on).

SQL Server reports 'Invalid column name', but the column is present and the query works through management studio

I've hit a bit of an impasse. I have a query that is generated by some C# code. The query works fine in Microsoft SQL Server Management Studio when run against the same database.
However, when my code tries to run the same query, I get an error about an invalid column, and an exception is thrown. All queries that reference this column are failing.
The column in question was recently added to the database. It is a date column called Incident_Begin_Time_ts.
An example that fails is:
select * from PerfDiag
where Incident_Begin_Time_ts > '2010-01-01 00:00:00';
Other queries like Select MAX(Incident_Begin_Time_ts); also fail when run from code because it thinks the column is missing.
Any ideas?
Just press Ctrl + Shift + R and see...
In SQL Server Management Studio, Ctrl+Shift+R refreshes the local IntelliSense cache.
I suspect that you have two tables with the same name. One is owned by the schema 'dbo' (dbo.PerfDiag), and the other is owned by the default schema of the account used to connect to SQL Server (something like userid.PerfDiag).
When you have an unqualified reference to a schema object (such as a table) — one not qualified by schema name — the object reference must be resolved. Name resolution occurs by searching in the following sequence for an object of the appropriate type (table) with the specified name. The name resolves to the first match:
Under the default schema of the user.
Under the schema 'dbo'.
The unqualified reference is bound to the first match in the above sequence.
As a general recommended practice, one should always qualify references to schema objects, for performance reasons:
An unqualified reference may invalidate a cached execution plan for the stored procedure or query, since the schema to which the reference was bound may change depending on the credentials executing the stored procedure or query. This results in recompilation of the query/stored procedure, a performance hit. Recompilations cause compile locks to be taken out, blocking others from accessing the needed resource(s).
Name resolution slows down query execution as two probes must be made to resolve to the likely version of the object (that owned by 'dbo'). This is the usual case. The only time a single probe will resolve the name is if the current user owns an object of the specified name and type.
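In other words, prefer the schema-qualified form. Using the asker's table as the example (and assuming the intended copy is the dbo one):

-- Unqualified: resolved against the caller's default schema first, then dbo.
select * from PerfDiag where Incident_Begin_Time_ts > '2010-01-01 00:00:00';

-- Qualified: resolves directly, and the cached plan is stable across users.
select * from dbo.PerfDiag where Incident_Begin_Time_ts > '2010-01-01 00:00:00';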
[Edited to further note]
The other possibilities are (in no particular order):
You aren't connected to the database you think you are.
You aren't connected to the SQL Server instance you think you are.
Double check your connect strings and ensure that they explicitly specify the SQL Server instance name and the database name.
In my case, I restarted Microsoft SQL Server Management Studio and this worked for me.
If you are running this inside a transaction, and a SQL statement earlier in the transaction drops or alters the table, you can also get this message.
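A contrived illustration of that situation, reusing the question's table and column names:

BEGIN TRAN;
-- Schema change inside the transaction.
ALTER TABLE dbo.PerfDiag DROP COLUMN Incident_Begin_Time_ts;
-- This statement is recompiled after the schema change and now fails
-- with "Invalid column name 'Incident_Begin_Time_ts'".
SELECT Incident_Begin_Time_ts FROM dbo.PerfDiag;
ROLLBACK;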
I eventually shut down and restarted Microsoft SQL Server Management Studio, and that fixed it for me. But at other times, just opening a new query window was enough.
If you are using variables with the same name as your column, it could be that you forgot the '@' variable marker. In an INSERT statement it will be detected as a column.
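For instance (hypothetical table and column names):

DECLARE @Status varchar(10) = 'Open';
-- Forgetting the @ makes SQL Server look for a column named Status in the
-- source table; if none exists, you get "Invalid column name 'Status'".
INSERT INTO dbo.Tickets (Title, Status)
SELECT Title, Status   -- should be @Status
FROM dbo.StagingTickets;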
Just had the exact same problem. I renamed some aliased columns in a temporary table which is further used by another part of the same code. For some reason, this was not captured by SQL Server Management Studio and it complained about invalid column names.
What I simply did is create a new query, copy paste the SQL code from the old query to this new query and run it again. This seemed to refresh the environment correctly.
In my case, I was trying to get the value from the wrong ResultSet when querying multiple SQL statements.
In my case it seems the problem was a weird caching problem, and the solutions above didn't work.
If your code was working fine, you added a column to one of your tables, you now get the 'invalid column name' error, and the solutions above don't work, try this: first run only the section of code that creates the modified table, and then run the whole code.
Including this answer because this was the top result for "invalid column name sql" on google and I didn't see this answer here. In my case, I was getting Invalid Column Name, Id1 because I had used the wrong id in my .HasForeignKey statement in my Entity Framework C# code. Once I changed it to match the .HasOne() object's id, the error was gone.
I've gotten this error when running a scalar function using a table value, but the Select statement in my scalar function RETURN clause was missing the "FROM table" portion. :facepalms:
This also happens when you forget to change the ConnectionString and query a table that has no idea about the changes you're making locally.
I had this problem with a View, but the exact same SQL code worked perfectly as a query. In fact SSMS actually threw up a couple of other problems with the View, that it did not have with the query. I tried refreshing, closing the connection to the server and going back in, and renaming columns - nothing worked. Instead I created the query as a stored procedure, and connected Excel to that rather than the View, and this solved the problem.