We have been developing a new SSAS Tabular (1400) model, initially doing the PoC by loading the data directly from SQL Server (not using expressions). Once we had an idea of what we wanted, we refactored and did the data imports with Expressions to allow further Power Query (M) processing. At first, all seemed fine. Along the way we started to mash up other data from an ODBC data source and didn't experience any issues during the process.
This morning we opened the project to make some modifications before running a new load of data and clicking "process tables" triggered an authentication dialog for the SQL Server data source again and then produced this error for all tables we were importing.
We have tried removing the data source connection string and adding it again. We removed all expressions and tried to start from scratch. No matter what we do, the Expression preview window WILL show the data preview, but attempting to import it into the model then fails with this error.
We've tried this on multiple workstations to rule out something in one developer's settings that might have changed.
Any thoughts?
After reading this error message again, it struck me that it mentions an ODBC error. The connection to the SQL Server database shouldn't be ODBC so that got me wondering if something was wrong with that connection instead. Sure enough, once I re-authenticated the ODBC data source, the data load worked as expected.
Note that we hadn't actually created a table in the model from the ODBC source yet. Regardless, processing the tables appears to validate every connection, and rather than clearly saying that something was wrong with this one data source (even though no data was being pulled from it into tables), the error dialog made it appear that the import was failing for the SQL Server connection, which is simply misleading.
Related
I am trying to use Excel to pull data from a large Oracle data warehouse via an ODBC connection. I have a query that works using the editor in Access. I've tried using Power Query and Microsoft SQL to run this query and get the data into Excel, and I get errors.
Therefore:
Does SQL executed from Excel need to be in a different syntax? Shouldn't it still be Oracle?
How can I use this pre-written query to ping the data warehouse and get what I need?
Here is the SQL that I have so far. I had to change some table names...sorry if that makes it weird.
The SQL you posted uses Access-specific functions. That is NOT a valid SQL query if run directly against Oracle. It only works in Access because, with linked tables, the query is executed by Access's own engine, which understands those functions.
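For instance, an Access-only expression and its Oracle equivalent might look like this (the qty column and orders table are hypothetical):

-- Access/Jet dialect: Nz() is an Access function that Oracle knows nothing about
SELECT Nz(qty, 0) AS qty_clean FROM orders;

-- Oracle equivalent
SELECT NVL(qty, 0) AS qty_clean FROM orders;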
An ODBC data source connection in Excel works differently: ODBC passes the query through and executes it directly at the data source. It does some validation first, and it supports only a subset of the SQL dialect available at the destination; what's included in that subset is determined in part by the driver selected for the connection.
So what you want to do is use a tool that lets you build the query directly in an Oracle environment, like Quest Toad or Oracle SQL Developer. Once you have the query working there, it should be easier to port it to Excel.
One thing I like to do is put my query into a view on the database. Then I can just select everything from the view when creating the Excel connection.
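As a minimal sketch with hypothetical table and column names, run something like this once in the Oracle database (e.g. from SQL Developer):

-- Wrap the complex query in a view
CREATE OR REPLACE VIEW excel_report_v AS
SELECT o.order_id, o.order_date, c.customer_name
FROM orders o
JOIN customers c ON c.customer_id = o.customer_id;

The Excel connection then only needs:

SELECT * FROM excel_report_v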
It's also worth pointing you to the My Data Sources folder. When you first set up an ODBC connection in Excel, the connection is saved by default in a folder called "My Data Sources" just under your Windows user profile folder. For example: C:\Users\UserName\My Data Sources\Data Source Name.odc.
You can open these *.odc files in any text editor and manually edit the SQL there. In particular, look for the <odc:CommandText> element. This way you can build a simple query up front, then improve the SQL command in your favorite environment and easily move the updated SQL into the existing ODBC connection.
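The relevant portion of the file looks roughly like this (a sketch only; the exact structure varies by Office version, and the query shown is the hypothetical view from above):

<odc:CommandType>SQL</odc:CommandType>
<odc:CommandText>SELECT * FROM excel_report_v</odc:CommandText>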
I'm currently updating all of our ETLs (originally made in BIDS 2008) using Visual Studio 2015 and redeploying them to a new reporting server running SQL Server 2016 (originally 2008 R2).
While updating one of the ETLs and trying to run it on the new server, I got this error:
The package execution failed. The step failed.
Sometimes it also produces this error:
Source: Load Fact Table SSIS.Pipeline
Description: "Copy To Fact Table" failed validation and returned validation status "VS_NEEDSNEWMETADATA".
I've tried deleting and re-adding the OLEDB Destination and the connection strings, and opened the column mappings to refresh the metadata. I also recreated the whole data flow task, but I'm still getting the same error.
The package runs fine on my local machine.
UPDATE:
I started taking the package apart and running only pieces of it to narrow down which part was failing. It seemed to be failing on the load into the staging table, but I couldn't figure out why.
I eventually decided to just try and re-create the whole thing. After re-creating the entire package, still no luck. The picture below is from the event viewer on the server itself but it didn't give me any new information.
Package error from event viewer
I tried all the solutions provided above and on other sites. Nothing worked. Then I got a suggestion from a friend, which worked for me.
Here are the steps:
Right click on the Source/Target Data flow component.
Go to Advanced Editor -> Component Properties
Find ValidateExternalMetadata and set it to False.
Try your luck. This is a painful issue that left me clueless for two days.
I finally found the issue and here's how I did it.
Because the error messages I was getting from SSMS weren't very insightful, I first opened Remote Desktop and logged into the server. Then I went to Administrative Tools > Event Viewer and then Windows Logs > Application to see if the failed event would provide greater detail.
It still didn't give me much.
The next step I took was to run the package from the command line, because the messages should be more verbose there. I opened cmd, changed directory to the one my package was in, and ran:
DTEXEC /FILE YourPackageName.dtsx
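If the default output still isn't detailed enough, dtexec also takes a /REPORTING switch; V requests verbose reporting:

DTEXEC /FILE YourPackageName.dtsx /REPORTING V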
Finally, the error message here showed a missing column in the tables the package was trying to write to. I added those columns and voila!
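For the record, the fix amounted to nothing more than adding the missing columns on the destination, something like this (table and column names are hypothetical):

ALTER TABLE dbo.FactSales ADD MissingAmount DECIMAL(18, 2) NULL;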
As stated in comments,
if it runs OK in your development environment, then the problem isn't with the package; it's with the scheduled job on the server. Try recreating that.
If that doesn't work,
It seems like the server has a cached instance of the package it's using instead of the updated one. Try renaming your package and creating a new job with the new package name and see if that works.
If that doesn't work,
all I can recommend at that point is to cut the package down until it succeeds, then add pieces back one at a time until you find the step that fails.
From your solution, it sounds like the development environment is more forgiving of schema updates than the deployed solution. Glad you were able to resolve it; eliminating clutter helps.
I had the same problem, and my issue was a difference between two environments: the same field in the same table was written with a capital letter in one and not in the other. So the name was the same except for this small difference (e.g. isActive vs IsActive).
This came from a refactoring effort, where we used Visual Studio database publish, which did not update the field name.
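If you hit the same thing, aligning the column name in one environment should be enough; a hypothetical T-SQL example:

EXEC sp_rename 'dbo.Customer.isActive', 'IsActive', 'COLUMN';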
Have you tried deleting and re-creating the source? When I get this error I can generally modify any object that reports it, but I have to delete and rebuild the paths between them; sometimes I have to delete everything in the data flow and re-create it.
A Proxy for SSIS Package Execution should be created under the SQL Server Agent. You should then change your job step (or steps) to Run As the Proxy you've created.
I had the same problem some time ago and the proxy fixed it.
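For reference, the proxy can also be created in T-SQL. A minimal sketch with hypothetical names, assuming the credential already exists:

USE msdb;
-- Create the proxy from an existing credential
EXEC dbo.sp_add_proxy @proxy_name = N'SsisProxy', @credential_name = N'SsisCredential', @enabled = 1;
-- Let a login use the proxy in its job steps
EXEC dbo.sp_grant_login_to_proxy @login_name = N'DOMAIN\JobOwner', @proxy_name = N'SsisProxy';
-- Allow the proxy to run SSIS package steps (the 'Dts' subsystem)
EXEC dbo.sp_grant_proxy_to_subsystem @proxy_name = N'SsisProxy', @subsystem_name = N'Dts';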
Forgive me if you've already tried this.
It is very common to get that message when two columns in the source file are being mapped to the same field of the table.
For example:
My text file has "neighborhood" twice (the same label for two different columns), while my table has "neighborhood" and "neighborhoodb" (note the "b" at the end). The import tries to load both text columns into the field "neighborhood", ignores the "neighborhoodb" field, and fails with the "VS_NEEDSNEWMETADATA" error.
Re-creating the job worked for me. Some cached version of the job may have been causing the VS_NEEDSNEWMETADATA error. The package executed correctly on its own but failed when it was run by an Agent job.
This ended up being a permissions issue for me. The OLE DB Source was using a stored procedure that selected from a SQL view. This view joined to a table in another database and unfortunately the proxy account the SQL Agent job step was running the package under did not have SELECT permission to the table in that database. This is why the package ran fine in Visual Studio but not from a job when deployed to the server. I found the root cause of the error by taking the SELECT statement out of the stored procedure and putting it directly in the Source Query box of the OLE DB Source control which caused it to finally return the 'SELECT permission denied' error message. This error was apparently hidden from SSIS since the proxy account DID have execute permission on the stored procedure.
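A hypothetical example of the grant that was missing (all names invented):

USE OtherDatabase;
GRANT SELECT ON dbo.JoinedTable TO [DOMAIN\SsisProxyAccount];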
It worked for me after changing ValidateExternalMetadata to False on the "ADO NET Destination". I was transferring data from a MSSQL database to a MySQL database.
You may need to strongly type your Source Query.
Example:
If your DestinationDB has a FullName field of type NVARCHAR(255)
and in your source query you have
select firstname + lastname as FullName from...
Try this:
Select CONVERT(NVARCHAR(255), firstname + lastname) as FullName from...
So if you are going from db to db and both columns are nvarchar(255), I don't see this issue; but if you are concatenating fields in your query, specify the data type and length explicitly.
This error can also occur when an entire SSIS project needs to be redeployed rather than just one of the packages (for VS versions that allow deployment of a single package in a multi-package project), particularly when a project connection has been changed or added. For example, if you've added or removed columns from a flat-file project connection. In that case, you need to deploy the entire project to push out the updated project connection properties. This can be true even if the project only has one package in it. In VS Solution Explorer, rather than click on the package name to deploy, select the bolded project name at the top, and then click deploy.
I am new to VB.NET programming and am facing a problem with database handling. I am using OleDb to work with the database, which is MS Access in my project. The problem is that my queries appear to succeed from the VB form but are not affecting the actual database. For example, when I add a record, it displays 'record added successfully' (the message I use for confirmation), but the actual database does not show the record I just entered. I have checked the query in a SQL editor and it works fine. I have checked the locals in VB debug mode, and they all contain correct values.
I don't understand the reason. Why is it displaying the success message but not modifying the actual database? The same happens when I fire a delete query; I have not tried an update query yet.
Technology: Visual Basic .NET with MS Access
I am using Access 2007 and Visual Studio 2013.
Please help with your suggestions.
Do the controls on your form have the correct control source, i.e. the database table/query they should be reading from and writing to?
Basically, the problem was a gap between my understanding and how .NET works.
Here is the solution.
You can include a database in two ways:
1. By importing it directly into your project, for example via drag and drop or some similar method.
2. By adding it via the wizard.
The difference lies in the connection string you use in your project. If you give the absolute path of the database, then you will see the alterations your application makes directly in that database, even when testing and debugging via the IDE:
connection.ConnectionString = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\xyz.accdb"
But if you are using the connection string provided by the wizard, for example,
connection.ConnectionString = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=|DataDirectory|\xyz.accdb"
then what the IDE does is this: every time you run the project for debugging or testing, it copies the actual database, contents and all, into the bin/Debug folder. The changes you make are only applied to that copy, not the original file. So if you want to verify the data, check the copy of the database in the bin/Debug folder; you will see the changes there. But every time you run the project for debugging, that copy is replaced with the original again.
So I was actually checking the original database file, not the copy, while the changes were only being made to the copy. That is why I was facing the above problem; it was not due to any programming fault.
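If you want to confirm at runtime which folder |DataDirectory| resolves to, here is a small sketch (assuming a plain WinForms or console app):

' Where |DataDirectory| points; if nothing was set explicitly,
' it falls back to the app's base directory (e.g. bin/Debug)
Dim dataDir = TryCast(AppDomain.CurrentDomain.GetData("DataDirectory"), String)
Console.WriteLine(If(dataDir, AppDomain.CurrentDomain.BaseDirectory))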
I'm using Oracle SQL Developer 3.0.04 and trying to connect to a remote DB. I don't have any issues with the database connection; it connects.
But when I try to view a selected table's rows, a small progress bar appears within a dialog box titled "Display Results", saying "Waiting for Checking if Object is Editable", and the progress is stuck. I cannot see any data in the selected table.
I tried to export the schema and data, but it froze immediately, so I closed SQL Developer.
When I tried again, I could not even view the table's details.
I used TOAD on another machine and connected with it; no issue there, the table listed all its data. I also checked on the remote machine using PuTTY and sqlplus to connect, and the data showed as expected.
So that tells me that something on my local machine is wrong; I cannot connect using SQL Developer. Has anyone out there faced a similar issue? Any ideas on this matter?
Thanks
Did you try connecting to any other database, remote or local? Does this happen with every database or only with this one connection? Perhaps something is wrong with that installation of SQL Developer, or something got corrupted while it was stuck. Try downloading the latest version of SQL Developer and trying again.
I've got an .rpt file that I did not write and can find no documentation about. I want to be able to review the SQL that is generated from this report so that I can figure out, well, what data it was pulling and what WHERE clause parameters were used.
I can open it up and see the report layout. But when I select Database|Show SQL Query... the report tries to connect to the data source. The problem is, the data source being used is unknown to me, probably an ODBC connection used by whoever wrote the query. All I can do at that stage is 'Cancel' and I'm back to looking at the report designer.
Am I missing something? Can I get to the SQL query without connecting to the datasource? It seems like viewing the selection criteria shouldn't be dependent on a data connection.
Thanks.
version: Crystal Reports 2008
I know that this is an old thread, but I encountered this same problem. Effectively, we used to have a database/application that has since been acquired by an external agency.
Although they now have the database/application, they don't have access to Crystal Reports, so we can't just send them the old report that we used to run. Likewise, we can't run it ourselves, as we don't even have the database set up anywhere. So instead our plan was just to extract the SQL code generated by the report and forward that on.
We experienced the same problem, but the solution is actually pretty simple.
If you don't have access to the original data source, just create a new 'blank' data source (such as an ODBC connection). As long as the connection to the data source works (i.e. it is some kind of valid data source), this works fine. When running the 'Show SQL' option, point the report to this data source. As long as you don't try to actually run the report (and only show the SQL), the operation won't fail. This worked for our situation anyway. (Crystal Reports 2008)
(I can give more details if it helps in any way.)
It should be possible to find out some details about the existing datasource by selecting Database > Set Datasource Location...
As well as enabling you to change the datasource location, this should show you some information about the current datasource, such as which type of datasource is being used, and possibly (dependent on the type of driver) the name of the database. It is likely to be less helpful if (as you surmise) the datasource is ODBC, but if it uses a native driver there may be something useful.
Without the password, I'm not sure how much you can do. It seems "Show SQL Query" requires the report to run first, and then it generates the SQL.
It's not ideal, but you could go to Database > Visual Linking Expert to at least see the tables and how they are joined, and then go to the Record Selection Formula Editor to see what the custom WHERE statements are.
Viewing the SQL of a Command in a Crystal Report File
There are times you have just the report file, but not the associated database structure that the report uses.
This is common when dealing with example reports of functionality you wish to mimic.
This is a workaround ONLY to allow you to see the SQL of a Command that a Crystal Report is based on, when you don't have the underlying database connection that the report is based on.
In essence, the dialog box has to be satisfied before it will show the SQL, so we fool it with a legitimate Data Source, just not one that would work with the SQL that is actually in the SQL Command.
Why does a report use a command? Doesn't Crystal Reports have the ability to link tables?
When a Crystal Report is based on a record set that is too complex for the table linking functionality within Crystal Reports, the report can instead be based on a SQL Query, usually developed/tested in another editor tool and pasted into the command. This allows advanced SQL functions to be utilized.
If you don't already have a Data Source on your computer set up that you can connect to, you will need to build one first.
A simple Microsoft Access .mdb file saved in a simple location will suffice.
I placed mine with the path C:\A_test\test.mdb to make it easy to find.
If you don't have one, google for a sample mdb file and download it, saving it with a name and location you can remember. (You won't ever actually open this file, but just connect to it.)
Once you have the file saved, open the ODBC Administrator and create a New Data Source.
(you can get to the ODBC Administrator quickly by clicking Start and typing ODBC in the search box)
On the User DSN tab, click the Add button.
Scroll down the driver list to Microsoft Access Driver (*.mdb), select it and click the Finish button.
In the Data Source Name box, type a name (I used MyTest).
Click the Select Button and select the mdb file you saved from a previous step, click OK.
Click OK again. You will see your new Data Source listed by the name you gave it. Click OK.
You now have the data source you will need for the next steps.
Open the Crystal Report you want to see the SQL command for, and click on Database Expert button or Database>Database Expert Menu.
Under Selected Tables, right-click on the Command and choose View Command.
The Data Source Selection Box appears. Select the Data Source you created (or one you already use) and click the Finish button. The View Command box should open with the SQL in the left pane. Copy the SQL into your favorite text editor.
What's happening is that Crystal Reports needs a database to connect to, regardless of whether it's the original source DB or not.
Create a local database or use one stored on a server, add it to your ODBC data sources, and use it when connecting. After a successful connection you should be able to view the SQL query without an error.