SQL Server Reporting Services (SSRS) report showing "ERROR#" or invalid data type error

I struggled with this issue for too long before finally tracking down how to avoid/fix it. It seems like something that should be on StackOverflow for the benefit of others.
I had an SSRS report where the query worked fine and displayed the string results I expected. However, when I tried to add that field to the report, it kept showing "ERROR#". I was eventually able to find a little bit more info:
The Value expression used in [textbox] returned a data type that is not valid.
But, I knew my data was valid.

Found the answer here.
Basically, it's a problem with caching, and you need to delete the ".data" file that is created in the same directory as your report. Some also suggested copying the query/report to a new report, but that appears to be the hard way to achieve the same thing. I deleted the .data file for the report I was having trouble with and it immediately started working as expected.

After you preview the report, click the Refresh button on the report and it will pull the data again, creating an updated .rdl.data file.

Another solution to this issue is to click Refresh Fields in the Dataset Properties menu.
This will update the list of fields, and force SSRS to get new data, rather than relying on a cached version.
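To make the cache file concrete, here is a minimal VB.NET sketch of the cleanup (the report name and path are hypothetical; the point is only that the file to delete is <ReportName>.rdl.data, sitting next to the .rdl in the report project folder):

    ' Hypothetical path; deleting the file by hand in Windows Explorer
    ' achieves exactly the same thing. SSRS rebuilds it on the next preview.
    Imports System.IO

    Module ClearReportCache
        Sub Main()
            Dim cacheFile As String = "C:\Projects\SalesReports\SalesSummary.rdl.data"
            If File.Exists(cacheFile) Then
                File.Delete(cacheFile)
            End If
        End Sub
    End Module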

Related

ALV field in RIMARA20 program is missing after migration to S/4HANA

I have the following issue.
In the past, we have added some fields to transaction IH09.
However, last year we migrated to HANA, and a lot of programs were updated in the process, including RIMARA20, which is the program behind IH09.
IH09 worked fine and the added fields were displayed.
Last week I was asked to add another field, and I did; however, although the field catalog contains the new field, it is not shown in the output of IH09.
I have debugged the code countless times trying to figure out what is going on, but I can't see what is happening.
This program internally uses the function REUSE_ALV_GRID_DISPLAY.
We still have the former SAP environment, and I made exactly the same enhancement with this new field there and can see it as expected.
In other words: the same field, the same data element, and the same enhancement in both environments, but in the HANA instance I cannot see it.
I'm truly frustrated because I see the new field in the field catalog but I can't make it visible in the report.
Any advice on this issue?
Sounds like REUSE_ALV_GRID_DISPLAY has "remembered" the old field catalog.
Did you try resetting the saved layout, or adding the missing field to the layout?

Table Adapter not updating after adding parameter

I'm quite new to .NET and very new to TableAdapters. I am currently working on updating an existing program and am required to alter some SQL code to update a field in one of the databases where I work. I've changed my VB code, and everything in terms of my UI is good to go, so I am now trying to update the TableAdapter to account for my change.
I am adding a parameter (@jobNumber) and wish to update the field 'DISCRETE_JOB' in the database table I'm working in.
I've added the 'DISCRETE_JOB' column to my INSERT INTO statement as the third column and added '@jobNumber' as the third parameter in the VALUES() line of my SQL.
After finishing, the changes are there when I bring up the tool-tip (by hovering the mouse over the TableAdapter query), but the actual parameters listed beside the name of the query in the DataSet.xsd file are not updated.
Looks like -> InsertData(@Param1, @Param2, @Param3)
Should look like -> InsertData(@Param1, @Param2, @jobNumber, @Param3)
Even after saving changes to the .xsd file, the parameter list doesn't show the changes. I also took a look at the DataSet.Designer.vb file behind the DataSet.xsd, and it doesn't appear that my new parameter has been taken into account.
Again, I am new and there may be a couple of things I'm supposed to do to make these changes happen, but I just don't want to mess anything up, as there's a lot of queries in the file for this program.
Thanks! Let me know if I need to provide more detail.
UPDATE: I fixed my problem, though I'm still not sure exactly what was going on and why the parameters weren't updated automatically.
Here's the fix:
The SQL code I added remained in place, but I needed to manually add the parameter I wanted. To do this, I went to the properties of the query in question and clicked 'Add', filling in the correct fields using the existing parameters as a guideline.
Hopefully this can help someone else too!
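As a side note, the principle behind the fix can be spelled out in plain code: every @parameter that appears in the SQL text needs a matching entry in the query's Parameters collection, which is exactly what adding the parameter by hand restores. Here is a minimal stand-alone ADO.NET sketch, not the designer-generated TableAdapter code, with hypothetical table, column, and connection details:

    Imports System.Data.SqlClient

    Module InsertJobExample
        Sub Main()
            ' Hypothetical table and columns; DISCRETE_JOB is the field from the question.
            Dim sql As String =
                "INSERT INTO WORK_ORDERS (COL_A, COL_B, DISCRETE_JOB, COL_C) " &
                "VALUES (@Param1, @Param2, @jobNumber, @Param3)"

            Using conn As New SqlConnection("Data Source=.;Initial Catalog=PlantDb;Integrated Security=True")
                Using cmd As New SqlCommand(sql, conn)
                    ' Every @parameter in the SQL needs a matching entry here,
                    ' just as the TableAdapter query needs one in its Parameters collection.
                    cmd.Parameters.AddWithValue("@Param1", "value A")
                    cmd.Parameters.AddWithValue("@Param2", "value B")
                    cmd.Parameters.AddWithValue("@jobNumber", "JOB-12345")
                    cmd.Parameters.AddWithValue("@Param3", "value C")

                    conn.Open()
                    cmd.ExecuteNonQuery()
                End Using
            End Using
        End Sub
    End Module

Once the designer's parameter list matches the SQL, the regenerated method exposes the extra argument just as the corrected signature shown above.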

VB .NET 2012 Adding Scalar query to TableAdapter creates duplicate dataset.designer

I have a data set with multiple tables. In one of these tables I have included some scalar queries that take various fields of the table and spit out a single result (for instance average of fields X, Y, and Z), etc. Up to now, I have had great success with this, but now I am getting a very odd issue cropping up.
When I try to add a new scalar query, I am putting my SQL in the screen and naming my query, just like I normally do. However, whenever I do this now, it creates a duplicate of the DataSet.Designer file (now DataSet1.Designer), and I start to get compiler errors since all functions within the partial classes are duplicated. I am only able to back out of this by deleting the new designer file, in which case my new SQL query is now unavailable (but I still see it in the original designer view).
I am not sure why this is happening. Can anyone shed any light on why the IDE is creating a new DataSet.Designer file instead of modifying the original?
Discovered the answer. It looks like this may happen if some process is using the original designer file when the IDE tries to generate a new one; unfortunately, the IDE doesn't reconcile the fact that the old one is still there. The following steps will correct the issue:
1. Delete the newest (offending) designer file from your project.
2. Close the project.
3. Open the .vbproj file using a text editor.
4. Search for the following: <LastGenOutput>myDataSet1.Designer.vb</LastGenOutput>
5. Take the 1 off of the dataset name (see the corrected element below).
6. Save the file and reopen your project.
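For clarity (the dataset name "myDataSet" is just a placeholder; use whatever your dataset is actually called), the element goes from

    <LastGenOutput>myDataSet1.Designer.vb</LastGenOutput>

to

    <LastGenOutput>myDataSet.Designer.vb</LastGenOutput>

so that it once again points at the original designer file.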

SSIS Package Not Populating Any Results

I'm trying to load data from my database into an Excel file based on a standard template. The package is ready and it runs, throwing a couple of validation warnings stating that truncation may occur because my template has fields of a slightly smaller size than the DB columns I've matched them to.
However, no data is getting populated to my excel sheet.
No errors are reported, and when I click preview for my OLE DB source, it's showing me rows of results. None of these are getting populated into my excel sheet though.
You should first make sure that you have data coming through the pipeline. Double-click the arrow connecting your source task to your destination task (I'm assuming you don't have any steps in between) to open the Data Flow Path Editor. Click Data Viewer, then Add, and click OK. That will allow you to see what is moving through the pipeline.
Something to consider with Excel is that it prefers Unicode data types to non-Unicode. Chances are you have a database collation that is non-Unicode, so you might have to convert the values in a Data Conversion task.
Also, you may need to force the package to execute in the 32-bit runtime. Visual Studio develops in a 32-bit environment, so the drivers you have visibility to are 32-bit. If there is no 64-bit equivalent, the package will break when you try to run it. Right-click on your project, click Properties, and under the Debug menu change the Run64BitRuntime setting to False.
You don't provide much information. Add a Data Viewer between your source and your Excel destination to see if data is passing through. To do it, just double-click the data flow path, select Data Viewer, and then add a grid.
Run your app. If you see data, provide more details so we can help you.
Couple of questions that may lead to an answer:
Have you checked that data is actually passed through the SSIS package at run time?
Have you double checked your mapping?
Try converting within the package so you don't have the truncation issue
If you add some more details about what you're running, I may be able to give a better answer.
EDIT: Considering what you wrote in your comment, I'd definitely try the third option. Let us know if this doesn't solve the problem.
Just as an assist for anyone else running into this - I had a similar issue and beat my head against the wall for a long time before I found out what was going on. My export WAS writing data to the file, but because I was using a template file as the destination, and that template file had previous data that had been deleted, the process was appending the data BELOW the previously used rows. So, I was writing out three lines of data, for example, but the data did not start until row 344!!!
The solution was to select the entire spreadsheet in my template file, and delete every bit of it so that I had a completely clean sheet to begin with. I then added my header lines to the clean sheet and saved it. Then I ran the data flow task and...ta-daa!!! Perfect export!
Hopefully this will help some poor soul who runs into this same issue in the future!

SSRS 2005: How do I make available varbinary data for download in a report?

SSRS newbie question here...
I have a table where one column is varbinary(max) data. I would like to make a report that makes this data available for download as a hyperlink so the user can just click on the item and get a file download dialog for the binary data. In this particular case, the binary data happens to be the content of old pdf files, but that shouldn't matter.
I tried searching around but I can't find any pointers on how to do this. It seems to me that it should be possible. There are ways to display images in a report using varbinary data, so it makes sense that one should be able to make arbitrary binary data downloadable on a report, right?
No, it is not possible as far as I can tell. I don't see any way to do this.
The work-around that I used was to create a simple asp.net page to serve the binary content through some URL. I then hyperlinked to that page from the SSRS report giving the right variables in the URL. Works fine for me, YMMV if you have to worry about URL hacking or security issues.
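For reference, here is roughly what such a page can look like. This is a hedged sketch rather than the original poster's code: a generic handler (.ashx) in VB.NET that looks up the varbinary column by an id passed in the query string and streams it back as a PDF download. All table, column, handler, and connection-string names are hypothetical, and it omits the access-control checks mentioned above.

    <%@ WebHandler Language="VB" Class="PdfDownloadHandler" %>

    Imports System
    Imports System.Web
    Imports System.Data.SqlClient

    Public Class PdfDownloadHandler : Implements IHttpHandler

        Public Sub ProcessRequest(ByVal context As HttpContext) Implements IHttpHandler.ProcessRequest
            ' Hypothetical: the SSRS report builds a hyperlink such as
            '   http://myserver/PdfDownloadHandler.ashx?id=123
            Dim id As Integer = Integer.Parse(context.Request.QueryString("id"))

            Using conn As New SqlConnection("Data Source=.;Initial Catalog=Docs;Integrated Security=True")
                Using cmd As New SqlCommand("SELECT FileContent FROM DocumentStore WHERE DocumentId = @id", conn)
                    cmd.Parameters.AddWithValue("@id", id)
                    conn.Open()

                    ' The varbinary(max) column comes back as a byte array.
                    Dim bytes As Byte() = CType(cmd.ExecuteScalar(), Byte())

                    context.Response.ContentType = "application/pdf"
                    context.Response.AddHeader("Content-Disposition", "attachment; filename=document_" & id.ToString() & ".pdf")
                    context.Response.BinaryWrite(bytes)
                End Using
            End Using
        End Sub

        Public ReadOnly Property IsReusable() As Boolean Implements IHttpHandler.IsReusable
            Get
                Return False
            End Get
        End Property
    End Class

In the report, the textbox's Action can then be set to "Go to URL" with an expression along the lines of ="http://myserver/PdfDownloadHandler.ashx?id=" & Fields!DocumentId.Value (again, server and field names assumed).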