"first argument" error when using shinyapps.io and RODBC to show SQL query results on the web - sql

First of all, I need to use R to get SQL query results from a HANA database, which I have done using RODBC in RStudio.
Second of all, I need to share my code with others, which I do through shinyapps.io.
However, when I use shinyapps to show my SQL query result on other computers, I get the following error message:
error first argument is not an open rodbc channel
I used the answer from R shiny RODBC connection Failing, but it still does not work.
Here are my ui.R and server.R files:
ui.R:
library(dplyr)
library(RODBC)
library(stringr)
library(ggplot2)
fluidPage(
  titlePanel("Basic DataTable"),
  fluidRow(
    DT::dataTableOutput("table")
  )
)
server.R:
library(dplyr)
library(RODBC)
library(stringr)
library(ggplot2)
ch <- odbcConnect('HANARB1P', uid = '****', pwd = '****')
options(scipen = 200)
myOffice <- 0
StartDate <- 20170601
EndDate <- 20170610
office_clause <- ""
if (myOffice != 0) {
  office_clause <- paste(
    'AND "_outer"."/BIC/ZSALE_OFF" IN (', paste(myOffice, collapse = ", "), ')'
  )
}
function(input, output) {
  output$table <- DT::renderDataTable(DT::datatable({
    data <- sqlQuery(channel = ch, query = paste(' SELECT TOP 100
      "/BIC/ZSALE_OFF" AS "SalesOffice",
      "/BIC/ZHASHPAN" AS "CreditCard"
      FROM "SAPB1P"."/BIC/AZ_RT_A212"
      WHERE "CALDAY" BETWEEN ', StartDate, ' AND ', EndDate, '
      ', office_clause, '
    '))
    data
  }))
}
How can I use shinyapps.io and RODBC to show SQL query results on web pages for sharing?
Following that answer, I revised my code a little, but something weird happens again. When I use this code:
function(input, output) {
  output$table <- DT::renderDataTable(DT::datatable({
    data <- sqlQuery(channel = ch, query = paste(' SELECT TOP 50
      "/BIC/ZSALE_OFF" AS "SalesOffice",
      "/BIC/ZHASHPAN" AS "CreditCard"
      FROM "SAPB1P"."/BIC/AZ_RT_A212"
      WHERE "CALDAY" BETWEEN ', StartDate, ' AND ', EndDate, '
      ', office_clause, '
    '))
    data
  }))
}
I have the error information:
When I use the code:
shinyServer(
  function(input, output) {
    data <- sqlQuery(channel = ch, query = paste(' SELECT TOP 50
      "/BIC/ZSALE_OFF" AS "SalesOffice",
      "/BIC/ZHASHPAN" AS "CreditCard"
      FROM "SAPB1P"."/BIC/AZ_RT_A212"
      WHERE "CALDAY" BETWEEN ', StartDate, ' AND ', EndDate, '
      ', office_clause, '
    '))
    output$table <- DT::renderDataTable(data)
  }
)
I have the error information:
I am sure the channel works. If I just run the app locally with:
shiny::runApp('//paper/fchen4/feng.officeworks/mycode/myShiny')
it works fine. But I work in a company, and I do not know whether our firewall has something to do with this error. If I do not use SQL here, everything is OK.

Well, have you checked that the channel is actually open?
The error message can result from wrong credentials, an unreachable server, or anything else that would prevent a successful SQL connection.
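One quick way to check (a minimal sketch; the DSN and masked credentials are the placeholders from the question): RODBC::odbcConnect() returns -1 instead of a channel object when the connection fails, so you can test the class of the return value before running any query.

```r
library(RODBC)

# Attempt the connection; on failure odbcConnect() returns -1, not a channel
ch <- odbcConnect("HANARB1P", uid = "****", pwd = "****")

if (!inherits(ch, "RODBC")) {
  stop("Could not open ODBC channel - check DSN, credentials and firewall/network")
}

# odbcGetInfo() throws an error if the channel is not actually open
print(odbcGetInfo(ch))
```

On shinyapps.io this fails fast with a clearer message than the "first argument is not an open rodbc channel" error, which only appears once a query runs.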
I had no problems showing table content with the following code:
ui.R
library(shiny)
# Define UI for application that draws a histogram
shinyUI(fluidPage(
  # Application title
  titlePanel("Basic Data Table"),
  fluidRow(
    dataTableOutput("table")
  )
))
server.R
library(shiny)
library(RODBC)
ch <- odbcConnect("S12")
# Define server logic to provide table output
shinyServer(
  function(input, output) {
    query_result <- sqlQuery(channel = ch, query = 'SELECT * FROM M_DATABASE')
    output$table <- renderDataTable(query_result)
  }
)
There is no reason to call DT::datatable() around the result of the SQL query, as sqlQuery() already returns a data frame that can be fed straight into renderDataTable().
A few general hints:
- Never put your logon data into the application code. In the SCN blog post "HANA quick note – checking my connections and using them securely …" I explained how to securely store and use connection and logon data for SAP HANA systems. This also gives you a very easy way to check the connectivity to your HANA instance. Besides, just pointing to the ODBC DSN connection instead of providing all the parameters looks much cleaner.
- You don't need all the R libraries in the ui.R file, as the code that uses libraries like RODBC lives in server.R. Keeping only the minimum required libraries in each file will make your life a lot easier.
- It doesn't hurt to break up long nested function calls, as I did with your "call-SQL-statement, convert result set, feed it into the render function" chain. It's a lot easier to follow what happens where, and what fails where, when there are not too many commands in a single line.
This should work for you.
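For the first hint, here is a minimal sketch of keeping logon data out of the app code (the HANA_UID/HANA_PWD environment variable names are made up for illustration; the blog post describes the SAP-specific approach):

```r
library(RODBC)

# Read credentials from environment variables (e.g. set in ~/.Renviron or in
# the shinyapps.io deployment settings) instead of hard-coding them.
# HANA_UID / HANA_PWD are hypothetical names - use whatever your setup defines.
ch <- odbcConnect("HANARB1P",
                  uid = Sys.getenv("HANA_UID"),
                  pwd = Sys.getenv("HANA_PWD"))
```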

Related

Some clarification around sql verbage within R, specifically with dbListFields

I'm using R to connect to an Oracle Database. I'm able to do so with this code:
library(rJava)
library(RJDBC)
jdbcDriver <- JDBC(driverClass = "oracle.jdbc.driver.OracleDriver",
                   classPath = "jar files/ojdbc8.jar")
jdbcConnection <- dbConnect(
  jdbcDriver,
  "jdbc:oracle:thin:@(DESCRIPTION=(ADDRESS=(PROTOCOL=tcps)(HOST=blablablablabla.com)(PORT=1234))(CONNECT_DATA=(SERVICE_NAME=pdblhs)))",
  "myusername", "mypassword"
)
DataFrame <- dbGetQuery(jdbcConnection, "select protocol_id, disease_site_code, disease_site
                                         from abc_place_prod.Rv_sip_PCL_disease_site
                                         order by protocol_id")
# Close connection
dbDisconnect(jdbcConnection)
This snippet of code works: it downloads the data I want into a neat data frame, as planned.
But I'm trying to learn how to navigate with R/SQL more and I wanted to use dbListFields to explore all the possible column names in that table I'm pulling from. Per the documentation here, you could type something like this:
dbListFields(con, "mtcars")
and it would work. So I tried:
dbListFields(jdbcConnection, "abc_place_prod.Rv_sip_PCL_disease_site")
and it returned this error:
Error in dbSendQuery(conn, paste("SELECT * FROM ", dbQuoteIdentifier(conn, :
Unable to retrieve JDBC result set
JDBC ERROR: ORA-00933: SQL command not properly ended
It seems like someone else experienced something similar here, and I'm not sure it ever got answered. I've tried adding a semicolon at the end (that "ended" something else properly for me, and it's mentioned in the comments of that question), but no luck.
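One workaround, not from the original post but commonly used when dbListFields chokes on a quoted schema.table identifier, is to fetch zero rows and read the column names off the empty result set:

```r
library(rJava)
library(RJDBC)

# Assumes jdbcConnection from the snippet above is still open.
# ROWNUM = 0 keeps the query cheap on Oracle - no rows are transferred.
empty <- dbGetQuery(
  jdbcConnection,
  "SELECT * FROM abc_place_prod.Rv_sip_PCL_disease_site WHERE ROWNUM = 0"
)
names(empty)  # the column names of the table
```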

How to solve error "no applicable method for 'show_query' applied to an object of class "data.frame""

I am working with the R programming language. I am trying to convert "dplyr/dbplyr" code into SQL code using the "show_query()" option.
For example, I tried to run the following code:
#first code
library(dplyr)
library(dbplyr)
data(iris)
iris %>%
  filter(Species == "setosa") %>%
  summarise(mean.Sepal.Length = mean(Sepal.Length),
            mean.Petal.Length = mean(Petal.Length)) %>%
  show_query()
However, this returned the following error (note: when you remove "show_query()", the above code actually runs):
Error in UseMethod("show_query") :
no applicable method for 'show_query' applied to an object of class "data.frame"
I think I found a solution to this problem:
#second code
con <- DBI::dbConnect(RSQLite::SQLite(), ":memory:")
flights <- copy_to(con, iris)
flights %>%
  filter(Species == "setosa") %>%
  summarise(mean.Sepal.Length = mean(Sepal.Length),
            mean.Petal.Length = mean(Petal.Length)) %>%
  show_query()
<SQL>
SELECT AVG(`Sepal.Length`) AS `mean.Sepal.Length`, AVG(`Petal.Length`) AS `mean.Petal.Length`
FROM `iris`
WHERE (`Species` = 'setosa')
Warning message:
Missing values are always removed in SQL.
Use `mean(x, na.rm = TRUE)` to silence this warning
This warning is displayed only once per session.
Can someone please tell me why the original code did not work, but the second code does? Why is it necessary to establish a connection and add the "copy_to" statement even when I want to run everything locally? I am just trying to convert dplyr code into SQL; at this point I want to run everything locally, not connect to a remote database. So why do I need to establish a connection, and why does show_query() not work in the original code?
show_query() translates the dplyr syntax into query code for the backend you are using.
A database backend using dbplyr will result in an SQL query (just as a data.table backend using dtplyr will result in a DT[i,j,by] query).
show_query has no method for a plain data.frame backend, because there is nothing to translate a data.frame into, hence the error message you're getting.
An easy way to get an SQL query result is to transform the data.frame into an in-memory database with memdb_frame:
memdb_frame(iris) %>%
  filter(Species == "setosa") %>%
  summarise(mean.Sepal.Length = mean(Sepal.Length),
            mean.Petal.Length = mean(Petal.Length)) %>%
  show_query()
<SQL>
SELECT AVG(`Sepal.Length`) AS `mean.Sepal.Length`, AVG(`Petal.Length`) AS `mean.Petal.Length`
FROM `dbplyr_002`
WHERE (`Species` = 'setosa')
dbplyr translation of R commands to SQL only works for remote tables. show_query() reveals the translated query that would be used to fetch data from the database. If the table is in local R memory then there is no need for SQL translation.
Part of the reason for this is that dbplyr has different translations defined for different databases. So without knowing what flavor of SQL/database you are using it can not determine the correct translation.
If you want to produce translations without connecting to a database, you can use simulated connections:
library(dbplyr)
library(dplyr)
data(iris)
# SQL server translation
remote_df = tbl_lazy(iris, con = simulate_mssql())
remote_df %>%
filter(Species == 'setosa') %>%
head() %>%
show_query()
# MySQL translation
remote_df = tbl_lazy(iris, con = simulate_mysql())
remote_df %>%
filter(Species == 'setosa') %>%
head() %>%
show_query()
These will produce slightly different SQL translations (TOP vs LIMIT keywords).

R wont knit to html or pdf with SQL chunks. Error in eval(x, envir = envir) : object 'connection' not found

connection <- dbConnect(RSQLite::SQLite(), "AmericasCup.sqlite")
dbWriteTable(connection, "results", results, overwrite=TRUE)
```{sql connection=connection, output.var = "outQ6"}
SELECT Code,
       SUM(Result = 'Win')  AS Wins,
       SUM(Result = 'Loss') AS Losses
FROM results
GROUP BY Code
```
Everything runs fine, but when I try to knit to HTML or PDF I get the following error:
Line 79 is: SELECT Code,
Can anyone tell me what the problem is here? I have tried to research this but can't find anything similar, though I suspect it has something to do with my ```{sql connection=connection, output.var = "outQ6"} setup.
Thank you in advance.
I have fixed it: I put this in my R setup chunk instead, and now I just use {sql} to start a SQL chunk.
library(RSQLite)
library(DBI)
connection <- dbConnect(RSQLite::SQLite(), "AmericasCup.sqlite")
knitr::opts_chunk$set(connection = "connection")
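Put together, the setup and query chunks could look like this (a sketch based on the code above; the file and table names are taken from the question):

```{r setup, include=FALSE}
library(RSQLite)
library(DBI)
connection <- dbConnect(RSQLite::SQLite(), "AmericasCup.sqlite")
knitr::opts_chunk$set(connection = "connection")
```

```{sql}
SELECT Code,
       SUM(Result = 'Win')  AS Wins,
       SUM(Result = 'Loss') AS Losses
FROM results
GROUP BY Code
```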

R-SQL Invalid value from generic function ‘fetch’, class “try-error”, expected “data.frame”

I am having a problem fetching some data from a database using ROracle. Everything works perfectly (I am getting the data from different tables without any problem), but one of the tables throws an error:
library(ROracle)
con <- dbConnect(dbDriver("Oracle"), "xxx/x", username = "user", password = "pwd")
spalten <- dbListFields(con, name = "xyz", schema = "x")  # I still get the column names for this table
rs <- dbSendQuery(con, "Select * From x.xyz")             # no error
data <- fetch(rs)                                         # this line throws an error
dbDisconnect(con)
Fehler in .valueClassTest(ans, "data.frame", "fetch") : invalid
value from generic function ‘fetch’, class “try-error”, expected
“data.frame”
I followed this question on Stack Overflow and selected individual columns instead:
rs <- dbSendQuery(con, "Select a From x.xyz")
but none of that worked; it gave me the same error.
Any ideas what I am doing wrong?
P.S. I have checked the SQL query in Oracle SQL Developer, and I do get the data there.
Update:
If anyone can help me to locate/query my Oracle error log, then perhaps I can find out what is actually happening on the database server with my troublesome query.
This is for debugging purposes only: try running your code inside the following tryCatch construct. It will display all warnings and errors that occur.
result <- tryCatch({
  con <- dbConnect(dbDriver("Oracle"), "xxx/x", username = "user", password = "pwd")
  spalten <- dbListFields(con, name = "xyz", schema = "x")
  rs <- dbSendQuery(con, "Select * From x.xyz")  # no error
  data <- fetch(rs)                              # this line throws an error
  dbDisconnect(con)
}, warning = function(war) {
  print(paste("warning: ", war))
}, error = function(err) {
  print(paste("error: ", err))
})
print(paste("result =", result))
I know I'm late to the game on this question, but I had a similar issue and discovered the problem: my query also ran fine in SQL Developer, but SQL Developer only fetches 50 rows at a time, while ROracle fetches the whole data set. So an issue that appears later in the data set won't show immediately in SQL Developer. Once I paged through the results in SQL Developer, it threw an error at a certain point because there was a problem with the actual value stored in the table. I'm not sure how an invalid value got there, but fixing it fixed the problem in ROracle.
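Building on that observation, here is a hedged sketch for locating the offending row with ROracle: fetch the result set in small chunks and report how far you get before fetch() fails (the connection string and table name are the placeholders from the question):

```r
library(ROracle)

con <- dbConnect(dbDriver("Oracle"), "xxx/x", username = "user", password = "pwd")
rs  <- dbSendQuery(con, "Select * From x.xyz")

rows_read <- 0
repeat {
  chunk <- tryCatch(fetch(rs, n = 50), error = function(e) e)
  if (inherits(chunk, "error")) {
    message("fetch failed after ", rows_read, " rows: ", conditionMessage(chunk))
    break
  }
  if (nrow(chunk) == 0) break  # reached the end - the whole table is readable
  rows_read <- rows_read + nrow(chunk)
}
dbDisconnect(con)
```

The row count narrows down where in the table the invalid value sits, which you can then inspect in SQL Developer.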

Does R provide anything cleaner than tryCatch() to safely make SQL queries?

I am making SQL queries and using tryCatch() to prevent R from silently using up all the available database connection slots. It looks like this:
sql <- "SELECT * FROM addresses WHERE zipcode=10202"
con <- dbConnect(PostgreSQL(), user = 'user', password = 'pswd',
                 dbname = 'contacts', host = 'dbserv')
tryCatch({
  rs <- dbSendQuery(con, statement = sql)
  fp <- fetch(rs, n = -1)  # Fetch all
  dbClearResult(rs)
  fp
}, finally = dbDisconnect(con))
fp
Does R provide anything cleaner for the purpose? I'm thinking of how readLines() works with a string argument to make sure no file connection is left open.
You might try on.exit, something like the following:
foo <- function() {
  con <- dbConnect(
    PostgreSQL(),
    user = config$db.user,
    password = config$db.password,
    dbname = config$db.name,
    host = config$db.host
  )
  on.exit({
    dbDisconnect(con)
  })
  ## ... do something w/ connection
}
When the function foo is about to return (or exit due to an exception), the expression passed to on.exit will be evaluated.
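Following that pattern, you could wrap the whole lifecycle into a reusable helper in the spirit of the readLines() analogy from the question (a sketch; with_connection is a made-up name, and the connection parameters are the question's placeholders):

```r
library(DBI)
library(RPostgreSQL)

# Hypothetical helper: runs `f` with an open connection and guarantees the
# connection is closed afterwards, even if `f` throws an error.
with_connection <- function(f, ...) {
  con <- dbConnect(PostgreSQL(), ...)
  on.exit(dbDisconnect(con))
  f(con)
}

# Usage: the caller never touches the connection lifecycle
fp <- with_connection(
  function(con) dbGetQuery(con, "SELECT * FROM addresses WHERE zipcode=10202"),
  user = "user", password = "pswd", dbname = "contacts", host = "dbserv"
)
```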