Oftentimes I need to troubleshoot a workbook that another person at my company has created and published to our server. To troubleshoot, I need to see their connection details, specifically their Custom SQL, to understand what data they are using in their extract.
Is there any way to view this connection info (specifically their SQL code) when viewing the published workbook on the server (web) version?
I am an admin, and I am able to download their workbook to my desktop version of Tableau, open it, reconnect to the data, and then look through the data connections they created to see their SQL. But it's a really cumbersome process.
All I'm looking to do is view the data connection details of a published workbook so that I can see the Custom SQL, without going through the download process described above.
You can get some details on the SQL statement by creating a performance recording.
From the Tableau Server Admin Guide:
Enable Performance Recordings:
Choose the Admin button in Tableau Server.
Choose Site.
Select a site.
Choose Edit.
In the Edit Site dialog box, select Allow Performance Recording.
Choose OK.
You start performance recording for a specific view by adding ?:record_performance=yes to the URL. For example:
http://server.site.com/views/Variety/BaseballStatistics?:record_performance=yes
Now, notice a new link at the top of your view called "Show Performance Recording".
Click this to open the generated performance workbook dashboard. Click on the bar chart and observe the SQL appear at the bottom of the view. Note that the SQL text is truncated after about 250 characters.
The admin guide suggests viewing the "Tableau Log" to find the full SQL statement. I have looked at all the server-side logs in C:\ProgramData\Tableau\Tableau Server\data\tabsvc\logs but cannot locate the SQL. (Please reply if you know where to find this.)
You can also run a database trace to see the SQL that the database actually receives. For example, for MS SQL Server, run the Profiler tool, set up a default trace, and filter on "Application Name" = "Tableau Protocol Server 8.0" or similar.
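If you prefer a quick script over the Profiler GUI, a DMV query can show the statements that Tableau sessions are currently executing. This is only a sketch: the program_name filter below is an assumption (check sys.dm_exec_sessions to see the exact name your Tableau version reports), and it requires VIEW SERVER STATE permission.

-- Show statements currently running from Tableau connections
-- (assumed program_name filter; adjust to what your Tableau version reports).
SELECT s.session_id, s.program_name, t.text AS sql_text
FROM sys.dm_exec_sessions AS s
JOIN sys.dm_exec_requests AS r ON r.session_id = s.session_id
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE s.program_name LIKE 'Tableau%';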
I have version 8.1, and this is how I got around this problem. Tableau shows a 'Custom SQL Warning' when you open a workbook that contains custom SQL. You can copy all the text in this message by simply pressing Ctrl + C, as with any other Windows warning message, and then paste it into your editor of choice to analyze it.
I do not know if this works on earlier versions.
I thought you could do this easily, and originally answered that you could, but I didn't pay close attention to your question. You can change some things about data connections without editing the workbook, including the IP address or name of the database server, but there doesn't appear to be a simple way to access Custom SQL without downloading the workbook.
Go to the Administrator page and select Data Connections.
You can enter some search criteria to filter the list of data connections shown (or not).
Find the workbook in question by scanning the second column -- you can sort the column if that helps.
Then select the corresponding data connection in the 4th column to see the details of the connection.
If it makes sense for the connection, you can also modify the connection directly on the server. This is really useful if you, say, need to move your enterprise database to a new IP address or change a database password, without downloading, modifying, and republishing a lot of workbooks.
An even better practice is to start using shared data connections hosted on Tableau Server instead of having each workbook keep its own local copy of the connection and related info.
Our project is moving from MySQL to MS SQL, and after a long time working with MySQL Workbench I really miss some features in SQL Server Management Studio (2014).
Do you know whether they exist in SSMS, or whether there is an alternative/replacement application for SSMS for working with the database?
The features are listed below:
Generate an update data script that I can review and copy-paste, rather than updating the data as soon as I move to another row while the table is open for editing.
Some changes are still made directly in the database in our project, and sometimes it's easier to add some rows manually in 5 tables, get the script, test it, and run that script in the production environment. I don't want to write a script for each update, and I don't want to make a mistake when copying data to the production server using the edit-table option.
Review the update table script BEFORE the changes are made, not after (I am talking about Tools - Options - Designer - Auto generate change scripts).
Upload a file into a binary field using a select-file dialog.
Again, I know about using the OPENROWSET function; I'm just interested in how to do it the way I used to (see the T-SQL sketch after this list).
Ability to view large text fields in a convenient way in SSMS. Now I have to copy the data from a field and paste it into Notepad (for example, an error message with a long trace log).
Save a few tabs with some useful scripts and open all of them when I open SSMS.
Is there any way to organize tabs to be able to work with 10+ tabs more effectively? Now only 6 of them can be shown on the screen (compare that to 15 tabs in MySQL WB).
Simple 'search field' (like Ctrl+F in Excel) to be able to search data in all fields displayed on the screen.
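For the binary-field item above, here is a minimal T-SQL sketch of the OPENROWSET approach the question refers to. The table, column, and file path are hypothetical, and it doesn't replicate the SSMS file-picker dialog; it only shows the server-side load.

-- Load a file from disk into a varbinary(max) column (hypothetical names/path).
INSERT INTO dbo.Documents (FileName, FileData)
SELECT 'report.pdf', blob.BulkColumn
FROM OPENROWSET(BULK 'C:\files\report.pdf', SINGLE_BLOB) AS blob;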
I would appreciate any ideas.
Thank you.
Seems like this would be a fairly popular question to get asked, but I did a quick search and didn't see anything. I'm curious to see if there is a way around this.
I spend my entire day in SSMS, and I am constantly changing my connection between many different servers while working within the same query file. I have them all set up as Registered Servers, but for some reason the change connection dialog box doesn't link up with that list... This occasionally becomes annoying.
You would think the "Browse for more" option under server name would have a tab for registered servers. I just installed SSMS 2014 today and was hoping maybe there'd be a new feature to cover this or something.
Have you tried SSMSBoost?
SSMSBoost URL
Several of the features include:
Preferred connections
Connection aliases/coloring
Quick Connections Switch
A drop-down on the toolbar, allowing you to switch connections between servers
I need to connect to an MS SQL server and send/receive information from it in my Lotus Notes app using @formula in real time (I can connect using an agent, but I need to use inline code for this).
The commands themselves seem pretty straightforward, but setting up the configuration seems to be a topic with scarce documentation. Apparently I need to install an ODBC driver. Where would I find that, and do I install it onto the server or onto the workstations that will run this app?
If any Lotus gurus could step me through setting this up, it would be greatly appreciated.
Thanks
You'll need to install the ODBC driver on the workstations that run this app if the users will be triggering the ODBC connections. If at all possible, I highly suggest setting this up on the server side and having it run via an agent. That'll save you from a few headaches, including having to maintain the ODBC connections on each workstation and worrying about whether each workstation has access to the data and the server.
First, you just want to make sure your ODBC setup is correct. You'll need the appropriate driver, of course, and the connection information. This site has a walkthrough to give you an idea of how to set up an ODBC database connection.
If you have MS Access, you can use it to test querying from the ODBC data source. Once you've tested that the connection works, you'll just refer to the data source name (DSN) in your @DbColumn, @DbLookup, or @DbCommand formulas.
Back to my suggestion on setting this up on the server side, that would mean you'd keep a copy of the data you're querying within the Notes database itself, and then users would be interacting with read-only data in Notes. You could schedule updates regularly on the server side of that read-only data and effectively create a cache of the data in your Notes environment. Then that data would replicate around to other replicas of the database, but remove the trouble of the ODBC connection being needed everywhere.
If you need real-time data, though, that approach is out the window and you'll have to go with a local solution. In that case, you might want to look at the LCConnection class or using an ADODB.Connection from script, as both will allow you to create DSN-less connections to data sources. You'd then save the trouble of requiring ODBC data sources on each workstation, and only have to worry about whether they can access the server from their workstation.
I would add another option to Ken's list. It involves having the server do the queries of the external database (therefore you are only setting up ODBC on the server - you don't have to deal with it on the workstations). You create an agent that is launched on the server using the 'run on server' technique. When the workstation needs to query the external data, the code creates a throw-away document in the database, puts the query criteria into the temporary document, saves the document, then calls the 'run on server' agent passing a reference to the temporary document. The server launches the agent, reads the criteria from the temporary document, does the query, and writes the results back to the temporary document. Then the workstation can access the query results from the temporary document. A scheduled agent can delete the temp docs on a regular basis.
It sounds complicated, and it all has to be done in script, but I've done this in many applications and it is fast, flexible, easy to administer, and gives your applications a lot of power. Note that end users must have the ACL rights to create a document in the db (the temp doc) in order for this to work.
Good luck!
I've got an .rpt file that I did not write and can find no documentation about. I want to be able to review the SQL that is generated from this report so that I can figure out, well, what data it was pulling and what WHERE clause parameters were used.
I can open it up and see the report layout. But when I select Database|Show SQL Query... the report tries to connect to the data source. The problem is, the data source being used is unknown to me, probably an ODBC connection used by whoever wrote the query. All I can do at that stage is 'Cancel' and I'm back to looking at the report designer.
Am I missing something? Can I get to the SQL query without connecting to the datasource? It seems like viewing the selection criteria shouldn't be dependent on a data connection.
Thanks.
version: Crystal Reports 2008
I know that this is an old thread, but I encountered this same problem. Effectively, we used to have a database/application that has since been acquired by an external agency.
Although they now have the database/application, they don't have access to Crystal Reports, so we can't just send them the old report that we used to run. Likewise, we can't run it, as we don't even have the database set up anywhere. So instead our plan was just to extract the SQL code generated by the report and forward that on.
We experienced the same problem, but the solution is actually pretty simple.
If you don't have access to the original data source, just create a new 'blank' data source (such as an ODBC connection). As long as the connection to the data source works (i.e. it is some kind of valid data source), this works fine. When running the 'Show SQL' option, point the report to this data source. As long as you don't try to actually run the report (and only show the SQL), the operation won't fail. This worked for our situation anyway. (Crystal Reports 2008)
(I can give more details if it helps in any way.)
It should be possible to find out some details about the existing datasource by selecting Database > Set Datasource Location...
As well as enabling you to change the datasource location, this should show you some information about the current datasource, such as which type of datasource is being used and possibly (dependent on the type of driver) the name of the database. It is likely to be less helpful if (as you surmise) the datasource is ODBC, but if it uses a native driver there may be something useful.
Without the password, I'm not sure how much you can do. It seems "Show SQL Query" requires the report to run first, and then it generates the SQL.
It's not ideal, but you could go to Database > Visual Linking Expert to at least see the tables and how they are joined, and then go to the Record Selection Formula Editor and see what the custom WHERE statements are.
Viewing the SQL of a Command in a Crystal Report File
There are times you have just the report file, but not the associated database structure that the report uses.
This is common when dealing with example reports of functionality you wish to mimic.
This is a workaround ONLY to allow you to see the SQL of a Command that a Crystal Report is based on, when you don't have the underlying database connection the report uses.
In essence, the dialog box has to be satisfied before it will show the SQL, so we fool it with a legitimate Data Source, just not one that would work with the SQL that is actually in the SQL Command.
Why does a report use a command? Doesn't Crystal Reports have the ability to link tables?
When a Crystal Report is based on a record set that is too complex for the table linking functionality within Crystal Reports, the report can instead be based on a SQL Query, usually developed/tested in another editor tool and pasted into the command. This allows advanced SQL functions to be utilized.
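For instance, a Command might hold a hand-written query like the hypothetical one below (illustrative table and column names only), which joins an aggregate back to detail rows in a way that is awkward to build with the visual table linking alone:

-- Hypothetical example of the kind of SQL pasted into a Crystal Command.
SELECT o.CustomerID,
       o.OrderID,
       o.OrderTotal,
       c.AvgOrderTotal
FROM Orders AS o
JOIN (SELECT CustomerID, AVG(OrderTotal) AS AvgOrderTotal
      FROM Orders
      GROUP BY CustomerID) AS c
  ON c.CustomerID = o.CustomerID;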
If you don't already have a Data Source on your computer set up that you can connect to, you will need to build one first.
A simple Microsoft Access .mdb file saved in a simple location will suffice.
I placed mine with the path C:\A_test\test.mdb to make it easy to find.
If you don't have one, google for a sample mdb file and download it, saving it with a name and location you can remember. (You won't ever actually open this file, but just connect to it.)
Once you have the file saved, open the ODBC Administrator and create a New Data Source.
(you can get to the ODBC Administrator quickly from Start > type ODBC in the Search)
On the User DSN tab, click the Add button.
Scroll down the driver list to Microsoft Access Driver (*.mdb), select it and click the Finish button.
In the Data Source Name box, type a name (I used MyTest).
Click the Select Button and select the mdb file you saved from a previous step, click OK.
Click OK again. You will see your new Data Source listed by the name you gave it. Click OK.
You now have the data source you will need for the next steps.
Open the Crystal Report you want to see the SQL command for, and click on Database Expert button or Database>Database Expert Menu.
Under Selected Tables, right-click on the Command and choose View Command.
The Data Source Selection Box appears. Select the Data Source you created (or one you already use) and click the Finish button. The View Command box should open with the SQL in the left pane. Copy the SQL into your favorite text editor.
What's happening is that Crystal Reports needs a database to connect to, regardless of whether it's the original source DB or not.
Create a local database or use a database stored on a server, add it to your ODBC data sources, and use it when connecting. After a successful connection you should be able to view the SQL query without an error.
I know it is possible to get data from a SQL database into an Excel sheet, but I'm looking for a way to make it possible to edit the data in Excel and, after editing, write it back to the SQL database.
It appears this is not a built-in function in Excel, and Google didn't come up with much that was useful.
If you want to have the Excel file do all of the work (retrieve from DB; manipulate; update DB) then you could look at ActiveX Data Objects (ADO). You can get an overview at:
http://msdn.microsoft.com/en-us/library/ms680928(VS.85).aspx
You want the Import/Export Wizard in SQL Server Management Studio. Depending on which version of SQL Server you are using, open SSMS (connect to the SQL instance you desire), right-click on the database you want to import into, and select Tasks > "Import Data".
In the wizard, click Next (past the intro screen) and from the Data Source drop list select "Microsoft Excel". You specify the path and file name of the Excel spreadsheet, and whether you have column headings or not, then press Next. Just follow the wizard through; it'll set up the destination (can be SQL Server or another destination) etc.
There is help available for this process in SQL Server Books Online and more (a walkthrough) from MSDN.
If you need something deployable/more robust (or less wizard driven) then you'd need to take a look at SQL Server Integration Services (for a more "Enterprise" and security conscious approach). It's probably overkill for what you want to accomplish though.
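If all you need is a repeatable script rather than the full wizard or SSIS, a T-SQL sketch along these lines can pull a sheet straight into a table. This assumes the ACE OLE DB provider is installed and 'Ad Hoc Distributed Queries' is enabled; the file, sheet, and table names are hypothetical.

-- Import an Excel sheet into a new table (hypothetical names; requires the
-- ACE provider and ad hoc distributed queries to be enabled).
SELECT *
INTO dbo.ImportedFromExcel
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;DATABASE=C:\MySpreadsheet.xlsx;HDR=YES',
                'SELECT * FROM [Sheet1$]');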
There is a new Excel plug-in named "MySQL for Excel": http://www.mysql.com/why-mysql/windows/
I just had a need to do this, and this thread has been quiet for a long time, so I thought it might be useful to supply a recent data point.
In my application roving salespeople use a copy of an Excel workbook that tracks the progress of a prospect through a loan application. The current stage of the application needs to be automatically saved back to a remote SQL database so that we can run reporting on it.
Rejected methods for updating the database from Excel:
SSIS and OpenRowSet are both methods for allowing SQL Server to pull the data from Excel, and don't work very well when the Excel workbook is sitting in an undefined location on a user's computer, and certainly not when the workbook is currently open in Excel.
ADO is now, if not actually deprecated, nevertheless looking very long in the tooth. Also, I wanted the solution to be robust in the face of the user possibly not being connected to the internet.
I also considered running a web API on the destination server. Macros in the Excel workbook connect to the web API to transfer data. However, it can sometimes be painful to allow a web API to talk to the outside world. Also, the code to make it robust in the face of temporary loss of internet connection is painful.
The adopted solution:
The solution I plan to adopt is low-tech: email. Excel emails the data to an address hosted on an Exchange server. Everyone in the company has Outlook installed, so the emails are sent by programmatically adding them to the Outlook Outbox. Outlook nicely handles the case when the user is offline. At the server end, a custom C# executable, fired up at regular intervals by the Task Scheduler, polls the inbox and processes the emails.
You could try these add-ins:
www.QueryCell.com (I created this one)
www.SQLDrill.com
www.Excel-DB.net
You can use the OPENROWSET function to manipulate Excel data from a T-SQL script. Example usage would be:
UPDATE OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                  'Excel 8.0;DATABASE=c:\MySpreadsheet.xls',
                  'SELECT * FROM MyTable')
SET Field1 = 'Value1' WHERE Field2 = 'Value2'
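Going the other way (pulling rows the user edited in Excel back into a SQL Server table) can be sketched with the same function. The table, sheet, and column names below are hypothetical, and this assumes the Jet/ACE provider is available and ad hoc distributed queries are enabled.

-- Pull edited values from the sheet back into a SQL Server table
-- (hypothetical table/sheet/column names).
UPDATE t
SET t.Field1 = x.Field1
FROM dbo.MyTable AS t
JOIN OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;DATABASE=c:\MySpreadsheet.xls',
                'SELECT KeyField, Field1 FROM [Sheet1$]') AS x
  ON x.KeyField = t.KeyField;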