Running SQL query through RStudio via RODBC: How do I deal with Hash Tables?

I've got a very basic SQL query that I'd like to be able to view in R.
The trouble is, I need to be able to reference a #table:
select
RAND(1) as random
into #test
select * from #test
Is this possible, or will I need to create permanent tables, or find some other work around?
I currently do this via an RODBC script that lets me choose which SQL file to run:
require(RODBC)
# Pick the SQL file to run
sql.filename <- choose.files('T:\\*.*')
# Trusted connection to the SQL Server instance
sqlconn <- odbcDriverConnect("driver={SQL Server};Server=SERVER_NAME;Trusted_Connection=True;")
# Read the file and collapse it into a single query string, dropping bare '--' lines
file.content <- readLines(sql.filename)
output <- sqlQuery(sqlconn, paste(file.content[file.content != '--'], collapse=' '))
closeAllConnections()
Do you have any advice on how I can utilise #tables in my SQL scripts in R?
Thanks in advance!

When you use temp tables, SQL Server outputs a message with the number of rows affected. R doesn't know what to do with this message. If you begin your SQL query with SET NOCOUNT ON, SQL Server will not output the count message.
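For example, a minimal sketch reusing the connection from the R script above and the query from the question:
sqlQuery(sqlconn, "
SET NOCOUNT ON;
select
RAND(1) as random
into #test
select * from #test
")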

I use #tables by separating my query into two parts; it returns character(0) if I do it all in one call, like:
sqlQuery(test_conn, paste("
drop table #test;
select
RAND(1) as random
into #test
select * from #test
"))
So instead I would use:
sqlQuery(test_conn, paste("
drop table #test;
select
RAND(1) as random
into #test
"))
sqlQuery(test_conn,"select * from #test")
It seems to work fine if you send one query to make the #table, and a second to retrieve its contents. I also added drop table #test; to my query; this makes sure there is not already a #test. If you try to write to a #table name that already exists, you will get an error.
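Note that drop table #test; will itself error if #test does not exist yet (for example, on a fresh connection). A guarded drop is a common workaround; this sketch is standard T-SQL rather than part of the original answer:
sqlQuery(test_conn, "
IF OBJECT_ID('tempdb..#test') IS NOT NULL
    DROP TABLE #test;
")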

Related

How do we insert data into a table?

I'm attempting to insert data into a table:
@one_files =
EXTRACT //all columns
FROM "/1_Main{suffixOne}.csv"
USING Extractors.Text(delimiter : '|');
CREATE TABLE A1_Main (//all cols);
INSERT INTO A1_Main SELECT * FROM @one_files;
Within the same script I'm attempting to SELECT data:
@finalData =
SELECT //mycols
FROM A1_Main AS one;
OUTPUT @finalData
TO "/output/output.csv"
USING Outputters.Csv();
When I run the script I get an exception. What am I doing wrong? How do I select from my table? Can we not insert and query in the same script?
Some statements have restrictions on how they can be combined inside a script. For example, you cannot create a table and read from it in the same script, because the compiler requires that any input already physically exists when the script is compiled.
Check this:
https://learn.microsoft.com/en-us/u-sql/concepts/scripts
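In practice that means running the work as two separate scripts, one after the other. A sketch using the statements from the question (column lists still elided):
// Script 1: load the CSV and populate the table
@one_files =
EXTRACT //all columns
FROM "/1_Main{suffixOne}.csv"
USING Extractors.Text(delimiter : '|');
CREATE TABLE A1_Main (//all cols);
INSERT INTO A1_Main SELECT * FROM @one_files;

// Script 2: run only after Script 1 has completed
@finalData =
SELECT //mycols
FROM A1_Main AS one;
OUTPUT @finalData
TO "/output/output.csv"
USING Outputters.Csv();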

storing query outputs dynamically TSQL

I have a loop over different tables which returns results with different numbers of columns.
Is it possible to store the output of a query without creating a concrete table?
I've read some posts regarding temporary tables so I tried this simple example:
create table #temp_table1 (id int)
insert into #temp_table1 ('select * from table1')
table1 above could be any table
I get the following error message:
Column name or number of supplied values does not match table definition.
Is there any way to avoid hard-coding table definitions that exactly match the output of your query?
Thanks!
You could do a select into - that will create the temporary table automatically:
SELECT * INTO #Temp
FROM TableName
The problem is that since you are using dynamic SQL, your temporary table will only be available inside the dynamic SQL scope, so doing something like this will result in an error:
EXEC('SELECT * INTO #Temp FROM TableName')
SELECT *
FROM #Temp -- The #Temp table does not exist in this scope!
To do this kind of thing using dynamic SQL you must use a global temporary table (which you must drop as soon as you are done with it!):
EXEC('SELECT * INTO ##GlobalTemp FROM TableName')
SELECT * INTO #Temp
FROM ##GlobalTemp -- Since this is a global temporary table you can use it in this scope
DROP TABLE ##GlobalTemp
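A hedged end-to-end sketch of the same idea with an actual dynamic table name (the variable names here are illustrative, not from the original answer):
DECLARE @tableName sysname = N'table1' -- would come from your loop
DECLARE @sql nvarchar(max) = N'SELECT * INTO ##GlobalTemp FROM ' + QUOTENAME(@tableName)
EXEC sp_executesql @sql

SELECT * INTO #Temp FROM ##GlobalTemp -- copy into a local temp table
DROP TABLE ##GlobalTemp -- drop the global table as soon as possible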

How to execute sql query when debugging a stored procedure

I'm debugging a stored procedure on SQL Server 2008 and I have this:
INSERT INTO #tempTable (ID, Name)
SELECT ID, Name FROM dbo.MYTABLE WHERE dbo.MYTABLE.Old >= 15
How can I view the data in #tempTable at debug time?
In SQL Server Management Studio, you can't execute a query directly while debugging a stored procedure; that's still not implemented (I think). You can only view local variable values in the Locals debug window.
There are some workarounds to see temp table values while in debugging mode:
1) In the stored procedure, after inserting data into #tempTable, add the lines below to capture the temp table's contents in an XML variable at the point where you want to inspect them. You can then check the values in the Locals debug window in XML format:
--inserting data into temp table
INSERT INTO #tempTable (ID, Name)
SELECT ID, Name FROM dbo.MYTABLE WHERE dbo.MYTABLE.Old >= 15
--to see records of temp table
DECLARE @temptable XML
SET @temptable = (SELECT * FROM #tempTable FOR XML AUTO)
2) You can convert the local temp table (#tempTable) to a global temp table (##tempTable); then, once data has been inserted, you can open a new query window and see the global temp table's records with a plain SELECT.
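A minimal sketch of that second workaround, reusing the insert from the question (and assuming the table is created as ##tempTable rather than #tempTable):
-- Session 1: the code being debugged writes to a global temp table
INSERT INTO ##tempTable (ID, Name)
SELECT ID, Name FROM dbo.MYTABLE WHERE dbo.MYTABLE.Old >= 15

-- Session 2: a separate query window can read it while the debugger is paused
SELECT * FROM ##tempTable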
This blog post describes how to access a temporary table from another session:
http://web.archive.org/web/20180409190701/http://sqlblog.com:80/blogs/paul_white/archive/2010/08/14/viewing-another-session-s-temporary-table.aspx
Alternatively, you can use two ## in the table name to make the table globally accessible from other sessions: ##tempTable. (The table might be locked for reading while your insert is running.)
SQL Server Management Studio has some debugging functions, but I find them pretty useless.
I don't think there are any debugging tools for SQL Server that, like Visual Studio, give you step-by-step information at runtime.
The way developers normally debug SQL Server code is with PRINT statements: for stored procedures, take the procedure definition out, declare a variable for each parameter the procedure expects, hardcode values for those variables, and execute smaller logical blocks of code to see what's going on where.
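A hedged sketch of that style of debugging, reusing the table and filter from the question (the variable names are illustrative):
-- Hardcode what would normally be a procedure parameter
DECLARE @minOld int = 15

-- Execute one logical block at a time and PRINT intermediate state
DECLARE @rowCount int
SELECT @rowCount = COUNT(*) FROM dbo.MYTABLE WHERE dbo.MYTABLE.Old >= @minOld
PRINT 'Rows matching the filter: ' + CAST(@rowCount AS varchar(20))

SELECT ID, Name FROM dbo.MYTABLE WHERE dbo.MYTABLE.Old >= @minOld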

Executing dynamically created SQL Query and storing the Query results as a temporary table

I am creating a SQL query dynamically. After it's been created, I want to execute it and store the results in a temporary table:
WITH [VALIDACCOUNTS] AS (EXEC (@sqlQuery))
You have two solutions for this:
As a first solution, you can simply use INSERT ... EXEC. This works when your dynamic SQL returns exactly one result set with a fixed column layout.
Simply create your temporary table with matching columns and datatypes. After that you can call this:
INSERT INTO #yourTemporaryTable
EXEC(@sql)
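A minimal end-to-end sketch of that pattern (the column names and query text are illustrative, not from the question):
-- Hypothetical dynamic query with a known, fixed column layout
DECLARE @sql nvarchar(max)
SET @sql = N'SELECT AccountID, AccountName FROM dbo.Accounts'

-- Temp table columns must match the result set of @sql
CREATE TABLE #yourTemporaryTable (AccountID int, AccountName nvarchar(100))

INSERT INTO #yourTemporaryTable
EXEC (@sql)

SELECT * FROM #yourTemporaryTable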
The second solution is to use OPENROWSET, which may have some side effects.
You can read more about it here.
INSERT INTO #yourTemptable
SELECT *
FROM OPENROWSET('SQLNCLI',
    'Server=(local);Trusted_Connection=yes;', -- adjust the connection string for your server
    'EXEC dbo.yourProcedure') -- placeholder query: OPENROWSET requires a string literal here
Note that because the pass-through query must be a literal, you cannot splice @sql in directly; you would have to build and EXEC the entire OPENROWSET statement as yet another layer of dynamic SQL.

Debug SQL in pgAdmin when SQL contains variables

In SQL Server I could copy SQL code out of an application, paste it into SSMS, declare and assign the variables that appear in the SQL, and run it. Yay, great debugging scenario.
E.g. (please note I am rusty and syntax may be incorrect):
declare @x as varchar(10)
set @x = 'abc'
select * from sometable where somefield = @x
I want to do something similar with Postgres in pgAdmin (or another postgres tool, any recommendations?) where I can just drop my SQL (params & all) into something that will run against Postgres DB.
I realise you can create a pgScript, but it doesn't appear to be very good: if I do the equivalent of the above, it doesn't put single quotes around the value in @x, nor does it let me add them by doubling them up, and you don't get a table out afterwards - only text...
Currently I have a piece of SQL someone has written that has 3 unique variables in it which are used around 6 times each...
So the question is: how do other people debug SQL efficiently, preferably in a fashion similar to my SQL Server days?
You can achieve this using the PREPARE, EXECUTE and DEALLOCATE commands for handling prepared statements, which is really what we are talking about here.
For example:
PREPARE test AS SELECT * FROM users WHERE first_name = $1;
EXECUTE test ('paul');
DEALLOCATE test;
Perhaps not as graphical as some may like, but certainly workable.
I would take a shot at writing a SQL function that wraps your query. It can be something as simple as
CREATE OR REPLACE FUNCTION my_function(integer, integer)
RETURNS integer
AS
$$
SELECT $1 + $2;
$$
LANGUAGE SQL;
SELECT my_function(1, 2);
I would do this instead of a PREPARE since it will be simpler to update. Depending on how complex the query is, you might also want to look at some of the other PLs in Postgres.
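For example, a hedged PL/pgSQL sketch that wraps the query from the question and prints its parameter while running (the function and parameter names are illustrative):
CREATE OR REPLACE FUNCTION debug_query(p_somefield varchar)
RETURNS SETOF sometable
AS
$$
BEGIN
    -- RAISE NOTICE output shows up in pgAdmin's Messages pane
    RAISE NOTICE 'p_somefield = %', p_somefield;
    RETURN QUERY SELECT * FROM sometable WHERE somefield = p_somefield;
END;
$$
LANGUAGE plpgsql;
SELECT * FROM debug_query('abc');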
SQL procs are notoriously hard to debug. My lame but practical solution has been to write log messages to a log table, like this (please excuse syntax issues):
create table log_message (
log_timestamp timestamp not null default current_timestamp,
message varchar(1000)
);
then add lines to your stored proc like:
insert into log_message (message) values ('The value of x is ' || x);
Then after a run:
select * from log_message order by 1;
It's not pretty, but it works in every DB.