I have an SQL query I'm using to pull beds by unit along with the related number of beds in use and available.
What is the SQL syntax to have the results of the query saved to an Excel file (existing or new)?
Please note: I have been searching and I can't believe this isn't easy to figure out. Lots of people seem to use other tools, but I want to do it in a SQL stored procedure - is this possible?
Thanks so much for your help!
The easier way to do this is from Excel: set up a connection to your SQL database and pull the data into Excel. That is the most efficient way to do it.
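That said, if you really want to stay inside T-SQL / a stored procedure, one approach people often reach for is writing into the workbook with OPENROWSET and the ACE OLE DB provider. This is only a sketch under several assumptions: the provider is installed on the SQL Server machine, 'Ad Hoc Distributed Queries' is enabled, and the target workbook already exists with a header row on the sheet; the file path, sheet, and table/column names below are hypothetical.
-- Hypothetical path, sheet and names; assumes the ACE OLE DB provider is installed,
-- 'Ad Hoc Distributed Queries' is enabled, and C:\Reports\BedReport.xlsx already
-- exists with matching column headers on Sheet1.
INSERT INTO OPENROWSET('Microsoft.ACE.OLEDB.12.0',
    'Excel 12.0;Database=C:\Reports\BedReport.xlsx;',
    'SELECT Unit, BedsInUse, BedsAvailable FROM [Sheet1$]')
SELECT Unit, BedsInUse, BedsAvailable
FROM dbo.BedSummary;   -- placeholder for your beds-by-unit query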
As the question title states:
In Power BI, from 'Get Data' -> 'SQL Server', when connecting to the SQL Server there are two options: import (the default) and Advanced. With Advanced you can write a SQL query to get the data; the default import shows all the tables on the server and you can just ETL from a click.
What is the real difference?
If you are comfortable writing your own T-SQL select statement, you can use it to bypass the Power Query editor and send your desired statement straight to the SQL database. That is also handy if you have code already written out from a previous query or project, which you can just paste into the Advanced query window.
If you use the Power Query Editor to build your query step by step, you get a better view of what data is returned by the previous step(s), and you can apply data manipulations after seeing the data.
Power Query uses query folding, which means that your individual steps are analysed and, where possible, translated into a single native SQL statement before being sent to the server.
That means that even if you don't speak T-SQL very well, you can still build efficient queries with the Query Editor, and if you feel you are an accomplished T-SQL developer, you can shortcut the Query Editor steps altogether. Of course that means that it is also possible to use "Advanced" and write clunky, inefficient T-SQL that performs slower than going through the Query Editor steps would.
In the end, it comes down to preference and familiarity. A seasoned DBA might just quickly write out a Select statement, a SQL rookie might prefer to click a few ribbon commands instead. The result can be identical in returned data and performance.
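To make the folding point concrete: a "choose columns" step plus a "filter rows" step in the Query Editor would typically be folded into one native statement before it hits the server, roughly like the sketch below (table and column names are made up for illustration), rather than pulling the whole table and filtering locally.
-- Roughly the native SQL that folded "choose columns" + "filter rows" steps
-- could send to the server (hypothetical table and column names):
SELECT UnitName, BedStatus, LastUpdated
FROM dbo.Beds
WHERE BedStatus = 'Available';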
I have T-SQL queries stored on a hard drive: I:\queries\query1.sql and I:\queries\query2.sql.
My usual workflow is to execute a query from the drive, copy the results into Excel, and then work on them there.
My problem is that query1.sql is already long, and now I would like to extend it by taking the result of query2.sql and joining it with the result of query1.sql.
What I could do is append the code from query2.sql to query1.sql, but then the query gets really long and hard to maintain.
I would like to do something like this:
SELECT * FROM ("Result of I:\queries\query1.sql") q1
LEFT JOIN ("Result of I:\queries\query2.sql") q2 ON q1.ID=q2.ID
Is there any way to write a query or stored procedure, which will be again stored on a drive to do this?
Basically, you need to ask your DBA for a database where you are able to store things. This can be on the same system where the data is stored, or it could be on a linked system. Gosh, you could even run SQL Server locally and store the information and data there.
Then, the queries that you are storing in files should be views in the database. You can then run the queries and store and combine the results locally.
You are essentially recreating database functionality using text files and data files -- going through a lot of effort when SQL Server already supports this functionality.
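As a rough sketch of that advice (object and column names below are placeholders for whatever your two scripts actually return): wrap each file's SELECT in a view, and the combining query stays short and easy to maintain.
-- Hypothetical names; paste the body of each .sql file into its own view.
CREATE VIEW dbo.vw_Query1 AS
SELECT ID, MeasureA
FROM dbo.SourceTable1;
GO

CREATE VIEW dbo.vw_Query2 AS
SELECT ID, MeasureB
FROM dbo.SourceTable2;
GO

-- The join you wanted, now written against the views:
SELECT q1.ID, q1.MeasureA, q2.MeasureB
FROM dbo.vw_Query1 AS q1
LEFT JOIN dbo.vw_Query2 AS q2
    ON q1.ID = q2.ID;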
To expand on Gordon's comment (+1): why are you running scripts off a drive? Most DBAs I've known would threaten bodily harm over this, as executing code that they can't control, troubleshoot, or see under source code control brings a whole host of security and supportability issues.
Far better to store this code in a stored procedure, which will have a cached query execution plan, can be tracked using various DMVs, and can have permissions assigned to it. Your outside Excel doc can then just set up a connection and execute the SP.
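For example, a minimal sketch (names are hypothetical and it assumes views like the ones above exist): the combined join lives in one procedure that an Excel data connection can execute.
-- Hypothetical procedure; Excel just runs: EXEC dbo.usp_GetCombinedResults
CREATE PROCEDURE dbo.usp_GetCombinedResults
AS
BEGIN
    SET NOCOUNT ON;

    -- Same join as before, but now versioned, permissioned and plan-cached on the server.
    SELECT q1.ID, q1.MeasureA, q2.MeasureB
    FROM dbo.vw_Query1 AS q1
    LEFT JOIN dbo.vw_Query2 AS q2
        ON q1.ID = q2.ID;
END;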
I'm having trouble performing a query on a remote Oracle server via Excel VBA while trying to perform, in the same query, an INNER JOIN with a local table in an Excel sheet.
Example:
Excel Sheet with local table ["LTE_Cells$LTE_Cells_Tmp"]:
Sheet "LTE_Cells"
Oracle SQL Query
SELECT a.STARTDATE, a.ENODEB, a.EUTRANCELLFDD,
       (a.COUNTER_1/8/1024) + (a.COUNTER2/8/1024) AS Total_Total_Traffic_TB
FROM »»»LOCAL_EXCEL_TABLE««««
INNER JOIN REMOTE_DATABASE.LTE_KPI_1 a
    ON »»»LOCAL_EXCEL_TABLE««««.EUTRANCELLFDD = REMOTE_DATABASE.LTE_KPI_1.EUTRANCELLFDD
   AND »»»LOCAL_EXCEL_TABLE««««.ENODEB = REMOTE_DATABASE.LTE_KPI_1.ENODEB
WHERE REMOTE_DATABASE.LTE_KPI_1.STARTDATE >= SYSDATE - 3;
Thanks in advance for the help!
This doesn't answer your question directly, and I may be all wet on this but I don't know that either Excel or Oracle explicitly handles what you are trying to accomplish.
However, MS Access will do this out of the box. Short answer: I think you are using the wrong tool for this task - the proverbial hammer to saw a board in half. Link the spreadsheet and the Oracle table as linked objects in Access, and your query should be easy-peasy.
Longer answer: while Access does this simply and easily, it can and probably will leave a path of destruction behind it on the DBMS. Specifically, you can expect to thrash the shared pool in Oracle, as Access will be issuing one query (using literals, no less) for every line in Excel. For 1,000 lines, it probably doesn't matter that much, but if you're going to do this on really large datasets, you will make a fast enemy out of your DBA.
Extended answer: really, the best way to do this is to load the contents of those Excel spreadsheets in Oracle tables and let the DBMS do the heavy lifting. This is bread and butter for the RDBMS.
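A rough sketch of that last suggestion, assuming the spreadsheet rows have been loaded into an Oracle staging table first (for example via SQL*Loader, an external table, or a simple insert script); the staging table name and datatypes are hypothetical, the rest mirrors the question:
-- Hypothetical staging table for the spreadsheet rows:
CREATE TABLE lte_cells_tmp (
    enodeb        VARCHAR2(50),
    eutrancellfdd VARCHAR2(50)
);

-- Once the Excel rows are in lte_cells_tmp, the join runs entirely in the database:
SELECT a.startdate, a.enodeb, a.eutrancellfdd,
       (a.counter_1/8/1024) + (a.counter2/8/1024) AS total_traffic_tb
FROM   lte_cells_tmp t
JOIN   lte_kpi_1 a
       ON  t.eutrancellfdd = a.eutrancellfdd
       AND t.enodeb        = a.enodeb
WHERE  a.startdate >= SYSDATE - 3;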
We have a migration task which requires migrating ~30 GB of data from SQL Server to a SAS grid. We already have close to 50 reports using SSIS/SSRS, and those need to move entirely to the SAS grid. I am more familiar with SAS and haven't worked on the SSIS/SSRS side.
How do I go about moving the process to SAS?
I know I can use pass-through in SAS to call the SSIS/SSRS queries from PROC SQL, but since they want everything to be coded in SAS, I am wondering if there is an easy way.
Is there a tool that can convert the SQL code and write PROC SQL code in SAS?
Thanks!
Park
You should be able to view the queries behind the SSIS jobs (run the report, then view the details behind it). Once you have the queries, copy/paste the SQL code into a SAS SQL passthrough statement. The table returned should give the same results as the report. It's then up to you to format the output as desired.
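A minimal pass-through sketch, assuming an ODBC connection to the SQL Server is available from the grid; the DSN, credentials, and the inner query are placeholders for whatever each report actually runs:
proc sql;
   /* Hypothetical DSN and credentials */
   connect to odbc (dsn="SQLSRV_DSN" uid="sas_user" pwd="XXXXXXXX");

   /* Paste the T-SQL copied from the SSIS/SSRS report inside the parentheses */
   create table work.report1 as
   select *
   from connection to odbc
      ( select Unit, BedsInUse, BedsAvailable
        from dbo.BedSummary );

   disconnect from odbc;
quit;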
I've been tasked with creating a data dictionary for a DB that has 90 tables. Is there any way to identify by which procedure/task/job a table is populated? I need to source the data in each of the tables and I'm not quite sure how to do this.
Any tips would be greatly appreciated.
-T
You can search for which stored procedures use a given table with something like:
SELECT OBJECT_NAME(id) FROM SYSCOMMENTS WHERE text LIKE '%table_name%'
You'll then have to manually examine and analyse the code within those SPs to see what it actually does with that table. I expect you'll need to manually eyeball any SQL Agent tasks and SSIS packages you may have as well. This kind of work tends to be hard graft - there aren't many shortcuts to simply grinding over all the code by hand.
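If you are on a more recent SQL Server version, the same search can also be run against sys.sql_modules (syscomments is a deprecated compatibility view); a quick sketch:
-- Find procedures, views, functions and triggers whose definition mentions the table name:
SELECT OBJECT_SCHEMA_NAME(object_id) AS schema_name,
       OBJECT_NAME(object_id)        AS object_name
FROM sys.sql_modules
WHERE definition LIKE '%table_name%';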