I have two MS reports, each with its own dataset.
The first one works, but the other does not fill anything into its table. When I debug the dataset just before showing the report, it is filled, and I used the same setup as with the first report.
I get no errors or other output. The table just does not show any rows at all.
Are there any log files that could tell me something, and if so, where can I find them? Thanks.
Check three things:
See that the dataset bound to the report is the same one you are filling (see the sketch after this list).
Is the report showing the formatted headers (table and column headers)? If so, the table format itself is OK.
If you want, you can check the report's .rdlc file (it is the XML-based file generated for the report).
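A minimal C# sketch of the first check, assuming a WinForms ReportViewer control named reportViewer1 and a dataset named "DataSet1" in the .rdlc (both names are placeholders; match them to your report):

    using Microsoft.Reporting.WinForms;

    // Rebind the data the report actually uses. The name passed to
    // ReportDataSource must match the dataset name defined inside the
    // .rdlc, not the name of the DataTable variable you filled.
    reportViewer1.LocalReport.DataSources.Clear();
    reportViewer1.LocalReport.DataSources.Add(
        new ReportDataSource("DataSet1", myFilledDataTable));
    reportViewer1.RefreshReport();

A common cause of an empty table is filling one DataTable while the viewer is still bound to another; clearing and re-adding the data source rules that out.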
The report server log file location can be found in the registry:
HKLM\Software\Microsoft\Microsoft SQL Server\{Report Server instance name; mine is MSRS10_50.MSSQLSERVER}\CPE
The log file location is the data of the ErrorDumpDir value under that key.
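If you prefer to read it programmatically, here is a small C# sketch assuming the instance name above (substitute your own):

    using Microsoft.Win32;

    // Read the ErrorDumpDir value for the given report server instance.
    var logDir = Registry.GetValue(
        @"HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\CPE",
        "ErrorDumpDir",
        null);
    System.Console.WriteLine(logDir ?? "value not found");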
When previewing in the report designer, any error messages will be displayed in the Preview tab. It sounds like you may have a different problem that won't be reported in the logs. Double-check that the query returns data. You may want to use SQL Server Profiler (assuming your database is SQL Server) to debug the queries executed against the database.
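A minimal sketch of that double-check, running the report's query directly with ADO.NET (the connection string and query are placeholders):

    using System;
    using System.Data;
    using System.Data.SqlClient;

    // Execute the same query the report's dataset uses and count the rows.
    using (var connection = new SqlConnection("Server=.;Database=MyDb;Integrated Security=true"))
    using (var adapter = new SqlDataAdapter("SELECT * FROM MyReportTable", connection))
    {
        var table = new DataTable();
        adapter.Fill(table);
        Console.WriteLine($"Query returned {table.Rows.Count} rows.");
    }

If the row count here is non-zero but the rendered table is still empty, the problem is in the binding or the report layout, not the query.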
SSAS Version: 14.0.226.1
Visual Studio Version: 4.7.02558
Issue: once the model is deployed to the server, it is processed without any errors. But if the SSAS server is rebooted, one of the dimensions throws an error while processing; it just loses one of the columns. Here is the error I get: Failed to save modifications to the server. Error returned: 'The 'Global_Code_SKU' column does not exist in the rowset.'
The column data sample looks like this:
The model contains 2 dimensions and a fact table with 632 million rows in it. Could the fact table size be the issue? Maybe the dictionary is too big?
How I fix it: by deploying the model again without partitions and roles, just metadata. This fixes the issue; however, the servers are sometimes rebooted without notification, so the processing job fails the next day (it runs once a day).
Is there anything I can try to fix this? I searched for a while but haven't found a solution.
It turned out there was a hidden character right before the first symbol in one of the names; after comparing the binary representations of the two strings, we found that we just had to recreate the table, and that solved the problem.
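A quick C# sketch of that kind of comparison, dumping the code points of two name strings so an invisible character shows up (the second sample string is hypothetical):

    using System;

    // Print each character's code point; a hidden character such as a
    // zero-width space (U+200B) or a BOM (U+FEFF) is exposed immediately.
    static void DumpCodePoints(string label, string s)
    {
        Console.Write($"{label}: ");
        foreach (char c in s)
            Console.Write($"U+{(int)c:X4} ");
        Console.WriteLine();
    }

    DumpCodePoints("model ", "Global_Code_SKU");
    DumpCodePoints("source", "\uFEFFGlobal_Code_SKU"); // hidden character before the first symbol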
Some suggestions to try:
After a reboot, connect to the SSAS server using SSMS, right-click the database in question and choose Script -> Script Database as. Is the Global_Code_SKU column still there? Is it hidden? Is it available in the source? (A programmatic version of this check is sketched below.)
What data type is Global_Code_SKU? I've had problems with columns with similar values being auto-identified by SSAS as binary and therefore excluded from the load.
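A sketch of the first check using the Tabular Object Model (AMO/TOM); the server, database, and table names are placeholders:

    using System;
    using Microsoft.AnalysisServices.Tabular;

    // Connect to the tabular instance and list the dimension's columns
    // to see whether Global_Code_SKU survived the reboot.
    var server = new Server();
    server.Connect("Data Source=localhost");
    var database = server.Databases.GetByName("MyModel");
    var table = database.Model.Tables["MyDimension"];

    foreach (Column column in table.Columns)
        Console.WriteLine($"{column.Name}  hidden={column.IsHidden}  type={column.DataType}");

    server.Disconnect();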
Got a weird problem.
I have created an SSIS package that imports data via an ODBC connection from an external database into SQL Server 2008.
The Data Flow has a Source Query (simple: select columns from table).
Data Conversion to String [DT_STR]
Destination: SQL Server table with same structure.
If I execute the task, 11,000+ rows are inserted. There are no errors.
However, the data is all whitespace and NULLs, except that the date fields are correct (which is how I can tell that it's trying to insert the right data).
If I change the Error Output on any single column in the Source Query to Redirect, all of the fields are populated correctly. If I set it to Ignore Failure or Fail Component, we get the blanks.
I added a Flat File Destination and connected the redirected rows, and nothing is written to it.
We have several packages like this, and I've never had this problem.
I also recreated the entire Package, and the same thing happens.
Any ideas?
There are 8 Data Flow tasks doing the same thing in this package for different databases. If I can't figure it out, is there a good way to set the Error Output to go nowhere?
UPDATE 2/26/18:
I added Data Viewers, and on the first viewer after the source, I get the blanks/NULLs if the Error Output on the source is NOT set to Redirect rows. If I set it to Redirect, the viewer shows data. NOTE: I can set ANY column's Error Output to Redirect on either the Error column OR the Truncation column; as long as one is selected, the data from the source comes through.
Since this is the viewer between the Source and the Data Conversion, doesn't that mean the problem is in the Source Query and NOT the Data Conversion?
Here it is with the Source Query's Error Output set to the default, Fail Component:
Here I set it to Redirect:
Hi, all!
I have a database model xxx.pdm and a SQL script I want to apply to the database (so that the generated xxx_db.sql, xxx_triggers.sql, etc. will contain the changes; the files are used in the application build process to generate the yyy.db file).
I've tried to:
open the pdm file with PowerDesigner 16.5
go to Database->Update Model from Database...
select "using script files" and specified a sql file (with some create index, alter table statements). pressed ok
PowerDesigner showed progress dialog and a dialog merge models with yellow locks near some of the entities.
I try to generate database: Database->Generate database... in the dialog xxx_db.sql is selected.
the result - generation aborted due to errors detected during the verification of the model.
Category Check Object Location
Reference Incomplete join Reference 'FK_table1_col1' <Model>
Reference Incomplete join Reference 'FK_table2_col2' <Model>
At the same time, the SQL script executes fine via Sybase Interactive (command line).
Is my approach correct?
Maybe I'm doing something wrong?
Backstory
I have recently been given the task of maintaining an SSIS process that a former colleague oversaw. I have only minimal experience with BIDS/SSIS/"whatever MS marketing wants to call it now" and have an issue to which I can't seem to find a solution.
Issue
I have a Data Flow that includes reading image data from a table as well as reading the image files themselves.
For the file read, an Import Column transformation (hereby called ICT) is used, and it hangs indefinitely.
The component gets handed 2,500 rows of image data (name, path, date created, etc.), and using the 'path' column the ICT tries to read each file. I've set the correct input column under 'Input and Output Properties' as well as the output column; the input column has the output column's ID in its FileDataColumnID property.
When the process runs, the component just hangs in yellow and nothing happens. I can access the images in Explorer and know that they exist (at least some of them).
Tools used
Windows 7
Visual Studio 2008 SP2
SQL Server 2012
All hints, tips or possible solutions would be appreciated.
I've never touched Pervasive SQL before, and now I have a bunch of .ddf and .btr files. I read that all I had to do was create a new database in the Control Center and point it at the folder that contains these files.
When I do this and look at the database, there is nothing in it. Since I am new to Pervasive, it's more than likely that I'm doing something wrong.
EDIT: Added a screenshot taken after running the command prompt.
To create a database name in the PCC, you need to connect to the engine, then right-click the engine name and select New, then Database. Once you do that, the following dialog should be displayed:
Enter the database name and path, the path being where the DDFs are located. In most cases the default options are sufficient.
A longer process is documented at http://docs.pervasive.com/products/database/psqlv11/wwhelp/wwhimpl/js/html/wwhelp.htm#href=uguide/using.02.5.html.
If you pointed to a directory that had DDF files (FILE.DDF, FIELD.DDF, and INDEX.DDF) when you created the database name, you should see tables listed.
If you pointed to a directory that does not have DDF files, the database will still be created but will have no tables defined. You'll either need to get DDFs from the vendor, create the table entries using CREATE TABLE (with IN DICTIONARY clauses), or use DDF Builder to add table entries.
Based on your screenshot, you only have 10 records in FILE.DDF. This is not enough. There are minimum system tables required (X$FILE, X$FIELD, X$INDEX, and a few others). It appears your DDFs are not a valid set. Contact the client/vendor that provided the DDFs and ask for a set that includes all of the table definitions.
Once you have tables listed in your Database Name, you can use ODBC to access the data.
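For example, here is a minimal C# sketch over ODBC, assuming a DSN matching the database name you created (the hypothetical DEMODATA) and a hypothetical table SOMETABLE:

    using System;
    using System.Data.Odbc;

    // Connect through the Pervasive ODBC interface; the DSN typically
    // matches the database name defined in the PCC.
    using (var connection = new OdbcConnection("DSN=DEMODATA"))
    {
        connection.Open();
        using (var command = new OdbcCommand("SELECT * FROM SOMETABLE", connection))
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
                Console.WriteLine(reader[0]);
        }
    }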