R - create temporary table in SQL Server from R data frame - sql

I know I can create a temporary table in SQL from R with, for example:
require(RODBC)
X<- odbcDriverConnect('driver={SQL Server};
server=s001000;database=X1;trusted_connection=true')
sqlQuery(X, "create table #temptable (test int)" )
sqlQuery(X, "insert into #temptable(test) values(201508)")
doesItWork <- sqlQuery(X, "select * from #temptable")
But I would like to create a temporary table in SQL Server from an R object (I have a table with the results of previous R calculations and I need to query it against another table in SQL). I don't want to export it as txt and upload it to SQL Server; there has to be a way to do it from R. I tried:
tabla<-data.frame(per=c(201508,201510))
sqlQuery(X, "Select * into ##temporal from tabla")
But I got an error message:
"42S02 208 [Microsoft][ODBC SQL Server Driver][SQL Server]Invalid
object name 'tabla'."
"[RODBC] ERROR: Could not SQLExecDirect
'Select * into ##temporal from tabla '"
I also know I can create a table with sqlSave:
sqlSave(X, tabla, rownames=FALSE,safer=FALSE)
But I want to create a temporary table.
How can I create a temporary table in SQL from an R object?

Unfortunately, I don't recall sqlSave(connection, new_data, table_name, append = TRUE) ever working correctly for inserting data into existing tables (as opposed to creating new ones), so you may have to use the less efficient approach of generating the INSERT statements yourself. For example,
con <- odbcConnect(...)
query <- "
SET NOCOUNT ON;
IF ( OBJECT_ID('tempdb..##tmp_table') IS NOT NULL )
DROP TABLE ##tmp_table;
CREATE TABLE ##tmp_table
(
[ID] INT IDENTITY(1, 1)
,[Value] DECIMAL(9, 2)
);
SET NOCOUNT OFF;
SELECT 1;
"
sqlQuery(con, gsub("\\s|\\t", " ", query))
df <- data.frame(Value = round(rnorm(5), 2))
update_query <- paste0(
"SET NOCOUNT ON; INSERT INTO ##tmp_table ([Value]) VALUES ",
paste0(sprintf("(%.2f)", df$Value), collapse = ", "),
" SET NOCOUNT OFF; SELECT * FROM ##tmp_table;"
)
sqlQuery(con, update_query)
# ID Value
# 1 1 0.79
# 2 2 -2.23
# 3 3 0.13
# 4 4 0.07
# 5 5 0.50
#sqlQuery(con, "DROP TABLE ##tmp_table;")
#odbcClose(con)
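If you need to repeat this for other data frames, the same idea can be wrapped in a small helper. This is only a sketch under the same assumptions as the code above (an open RODBC connection con, the global temp table ##tmp_table, and a numeric column formatted with two decimals); build_insert is a hypothetical name, not part of RODBC.
# Hypothetical helper: build one INSERT statement for a single numeric column
build_insert <- function(table, column, values) {
  paste0(
    "SET NOCOUNT ON; INSERT INTO ", table, " ([", column, "]) VALUES ",
    paste0(sprintf("(%.2f)", values), collapse = ", "),
    "; SET NOCOUNT OFF;"
  )
}
sqlQuery(con, build_insert("##tmp_table", "Value", df$Value))
sqlQuery(con, "SELECT * FROM ##tmp_table;")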

Related

Create temporary data or table in SQL and use as a database

I tried to create a temporary SQL table with the same fields as in the original database.
What I tried:
dirname = os.path.dirname(__file__)
database = os.path.join(dirname, "database/database.db")
conn = sqlite3.connect(database)
cursor = conn.cursor()
stm = cursor.execute("CREATE TABLE new_table SELECT * FROM database;")
#and also tried with some selected fields like following
stm = cursor.execute("CREATE TABLE new_table(pressure int,parameter varchar(20),day int,month int,latitude int,longitude int,surface int);")
tables = cursor.execute("INSERT into new_table SELECT pressure,parameter, day,month,latitude,longitude,surface FROM database;")
And the error is,
sqlite3.OperationalError: near "SELECT": syntax error
How can I create a new table with selected fields (or all fields) from the original database table?
Hope someone can help me.
In your first CREATE TABLE you need to define it with an AS (note that SQLite does not accept parentheses around the SELECT here):
CREATE TABLE new_table AS SELECT * FROM database;
This should clear up your error.

Save big dataset from R into temp table SQL database

I have a dataframe (predict_prc) with 60k rows and 2 variables (chrt_id and prc). And I need to save this dataframe into MS SQL database.
I chose the following approach: create a temp table, insert the new values, and exec the stored proc.
I tried the code below:
sql = paste("
CREATE TABLE #t (chrt_id INT PRIMARY KEY,prc FLOAT)
INSERT INTO #t
VALUES",
paste0(sprintf("(%.2i, ", predict_prc$chrt_id), sprintf("%.2f)", predict_prc$predict_prc), collapse = ", ")
,"EXEC DM.LoadChrtPrc
")
But there are too many values to insert in a single statement this way.
Then I tried the following code:
sql_create = paste("
IF (SELECT object_id('#t')) IS NOT NULL
BEGIN
DROP TABLE #t
END
CREATE TABLE #t (chrt_id FLOAT PRIMARY KEY, prc FLOAT)
")
sql_exec = paste("
EXEC DM.LoadChrtPrc
")
channel <- odbcConnect('db.w')
create <- sqlQuery(channel, sql_create)
save <- sqlSave(channel, predict_prc, tablename = '#t', fast=TRUE, append=F, rownames=FALSE)
output <- sqlQuery(channel, sql_exec)
odbcClose(channel)
But I got an error:
> save <- sqlSave(channel, predict_prc, tablename = '#t', fast=TRUE, append=F, rownames=FALSE)
Error in sqlSave(channel, predict_prc, tablename = "#t", fast = TRUE, :
42S01 2714 [Microsoft][ODBC SQL Server Driver][SQL Server]There is already an object named '#t' in the database.
[RODBC] ERROR: Could not SQLExecDirect 'CREATE TABLE "#t" ("chrt_id" float, "prc" float)'
If I run the sqlSave step without the create step, I get this error:
> save <- sqlSave(channel, predict_prc, tablename = '#t1', fast=TRUE, append=F, rownames=FALSE)
Error in sqlColumns(channel, tablename) :
‘#t’: table not found on channel
Can anybody help me with this issue?
SQL Server doesn't allow more than 1000 rows in a single INSERT ... VALUES statement (the table value constructor is limited to 1000 row expressions).
You can insert all the values by splitting them into chunks of 1000.
For every 1000 rows, build a new SQL query and run it.
To solve the problem I had to create several temporary tables and insert 1000 records per table.
So before you create the temporary table, count the number of records you are going to put in it, divide by 1000, and then create the temp tables as required.
This is a one-off solution; if you want to automate the process, use something else.
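As a rough illustration of that chunking advice, here is a sketch in R that builds one INSERT per batch of at most 1000 rows and sends it over a single RODBC channel, so the local temp table #t stays in scope for the stored proc. It assumes predict_prc has columns chrt_id and prc as described in the question; the batch size and variable names are illustrative, not taken from the original answer.
require(RODBC)
channel <- odbcConnect('db.w')

# One session for every sqlQuery call, so #t remains visible until odbcClose()
sqlQuery(channel, "IF OBJECT_ID('tempdb..#t') IS NOT NULL DROP TABLE #t;")
sqlQuery(channel, "CREATE TABLE #t (chrt_id INT PRIMARY KEY, prc FLOAT);")

chunk_size <- 1000   # SQL Server's limit for one VALUES list
chunks <- split(seq_len(nrow(predict_prc)),
                ceiling(seq_len(nrow(predict_prc)) / chunk_size))

for (i in chunks) {
  values <- paste0(sprintf("(%d, %.2f)",
                           as.integer(predict_prc$chrt_id[i]),
                           predict_prc$prc[i]),
                   collapse = ", ")
  sqlQuery(channel, paste0("INSERT INTO #t (chrt_id, prc) VALUES ", values, ";"))
}

sqlQuery(channel, "EXEC DM.LoadChrtPrc")
odbcClose(channel)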

R in SQL Server: Output data frame into a table

This probably has a simple answer but I cannot figure it out, as I'm still getting the hang of working with R in SQL Server. I have a piece of code that reads in data from a SQL Server table, executes in R and returns a data frame.
execute sp_execute_external_script
@language=N'R',
@script=N'inp_dat=InputDataSet
inp_dat$NewCol=max(inp_dat$col1,inp_dat$col2)
new_dat=inp_dat
OutputDataSet=new_dat',
@input_data_1=N'select * from IM_COMP_TEST_SQL2016.dbo.temp_table';
I want to insert new_dat into a SQL Server table (select * into new_table from new_dat). How do I go about this?
As shown in this tutorial, you can use INSERT INTO ... EXEC into a previously created table whose columns align with the data frame the script returns:
INSERT INTO Table1
execute sp_execute_external_script
@language=N'R',
@script=N'inp_dat <- InputDataSet
inp_dat$NewCol <- max(inp_dat$col1,inp_dat$col2)
new_dat <- inp_dat',
@input_data_1=N'SELECT * FROM IM_COMP_TEST_SQL2016.dbo.temp_table',
@output_data_1_name=N'new_dat';
However, a make-table query (SELECT ... INTO) may require OPENQUERY() or OPENROWSET() with an ad-hoc distributed query (the 'Ad Hoc Distributed Queries' option typically needs to be enabled with sp_configure), as described in this SO post, to capture the output of a stored procedure:
Stored Procedure
CREATE PROCEDURE dbo.R_DataFrame
AS
BEGIN
execute sp_execute_external_script
@language=N'R',
@script=N'inp_dat <- InputDataSet
inp_dat$NewCol <- max(inp_dat$col1,inp_dat$col2)
new_dat <- inp_dat',
@input_data_1=N'SELECT * FROM IM_COMP_TEST_SQL2016.dbo.temp_table',
@output_data_1_name=N'new_dat'
-- ADD ALL COLUMN TYPES
WITH RESULT SETS (([col1] varchar(20), [col2] float, [col3] int ...));
END
GO
Action Query
SELECT * INTO Table1
FROM OPENROWSET('SQLNCLI', 'Server=(local);Trusted_Connection=yes;',
'EXEC dbo.R_DataFrame')

SQL: Update table from a text file

Here's what I have to do :
I have a text file which has 3 columns: PID, X, Y.
Now I have two tables in my database:
Table 1 contains 4 columns: UID, PID, X, Y
Table 2 contains multiple columns, required ones being UID, X, Y
I need to update Table 2 with corresponding X and Y values.
I think we can use BULK INSERT to update Table 1, then some WHILE loop or something.
But I can't figure out the exact approach.
CREATE PROCEDURE [dbo].[BulkInsert]
(
@PID int,
@x int,
@y int
)
AS
BEGIN
SET NOCOUNT ON;
declare @query varchar(max)
CREATE TABLE #TEMP
(
[PID] [int] NOT NULL ,
[x] int NOT NULL,
[y] int NOT NULL
)
SET @query = 'BULK INSERT #TEMP FROM ''' + PathOfYourTextFile + ''' WITH ( FIELDTERMINATOR = '','',ROWTERMINATOR = ''\n'')'
--print #query
--return
execute(@query)
BEGIN TRAN;
MERGE TableName AS Target
USING (SELECT * FROM #TEMP) AS Source
ON (Target.YourTableId = Source.YourTextFileFieldId)
-- If a matching row already exists in Table 1, update it; otherwise insert a new row into Table 1.
WHEN MATCHED THEN
UPDATE SET
Target.PID= Source.PID, Target.x= Source.x, Target.y= Source.y
WHEN NOT MATCHED BY TARGET THEN
INSERT (PID, x, y) VALUES (Source.PID, Source.x, Source.y);
COMMIT TRAN;
END
You can use the approach above to solve your problem. Hope this helps. :)
How are you going to run it? From a stored procedure?
To save some performance, I would BULK INSERT into a temp table, then insert from the temp table into Tables 1 and 2.
It should look like this
INSERT INTO Table1 ( PID, X, Y)
SELECT PID, X, Y
FROM #tempTable
Some will say that temp tables are not good, but it really depends: if your file is big, reading it from disk takes time and you don't want to do it twice.
You don't need any loop to update Table 2; all you need is an insert from Table 1.
Or, if you are trying to update existing rows in table 2, use an update query that joins on table 1. See this question for an example.
However, you should consider changing your database design, as it appears to be incorrect: you are storing X and Y in two places; they should only be stored in one table, and you should join to this table if you need to use them in conjunction with other data. If you did this, you wouldn't have to worry about messy issues of keeping the two tables in sync.

SAS reading bit data type in SQL Server 2005

I have a SQL Server 2005 database with a table that has a column of data type bit. When I look at the data in SQL Server Management Studio I see the column value as 0 or 1; when I pull it with SAS I see 0 or -1, as if SAS were negating the 1 value. Does anyone have an explanation for this? Thanks.
I reckon you must be using libname oledb to connect to SQL Server from SAS. I'm able to replicate your problem here:-
SQL Server code to generate dummy data
create table dbo.tbl (
tblId int identity(1,1) not null
constraint pk_tbl_tblId primary key,
bool bit not null
)
go
insert into dbo.tbl(bool) values(0)
insert into dbo.tbl(bool) values(1)
SAS code using OLEDB
libname imm oledb provider=sqloledb
properties=(
"Integrated Security"=SSPI
"Persist Security Info"=False
"Initial Catalog"=test
"Data Source"=localhost
);
proc print data=imm.tbl; run;
The print out is:-
Obs tblId bool
1 1 0
2 2 -1
SAS code using PROC SQL
It seems like using PROC SQL should fix your problem.
proc sql noprint;
connect to sqlservr (
server='localhost'
database='test'
'Integrated Security'='SSPI'
'Persist Security Info'='False'
);
create table test as
select *
from connection to sqlservr (
select * from dbo.tbl
);
disconnect from sqlservr;
quit;
proc print data=test; run;
The print out is:-
Obs tblId bool
1 1 0
2 2 1