In Access I have two tables, table_A and table_B. In Col2 of table_A, I have the text of an R function as the cell value:
mdPatternChart <- function(x, Str_PathFile)
{
    if (!(is.matrix(x) || is.data.frame(x)))
        stop("Data should be a matrix or dataframe")
    if (ncol(x) < 2)
        stop("Data should have at least two columns")
    R <- is.na(x)
    nmis <- colSums(R)
    R <- matrix(R[, order(nmis)], dim(x))
    pat <- apply(R, 1, function(x) paste(as.numeric(x), collapse = ""))
    sortR <- matrix(R[order(pat), ], dim(x))
    if (nrow(x) == 1) {
        mpat <- is.na(x)
    } else {
        mpat <- sortR[!duplicated(sortR), ]
    }
    if (all(!is.na(x))) {
        cat(" /\\ /\\\n{ `---' }\n{ O O }\n==> V <==")
        cat(" No need for mice. This data set is completely observed.\n")
        cat(" \\ \\|/ /\n `-----'\n\n")
        mpat <- t(as.matrix(mpat, byrow = TRUE))
        rownames(mpat) <- table(pat)
    } else {
        if (is.null(dim(mpat))) {
            mpat <- t(as.matrix(mpat))
        }
        rownames(mpat) <- table(pat)
    }
    r <- cbind(abs(mpat - 1), rowSums(mpat))
    r <- rbind(r, c(nmis[order(nmis)], sum(nmis)))
    png(file = paste(Str_PathFile, ".png", sep = ""), bg = "transparent")
    plot.new()
    if (is.null(dim(sortR[!duplicated(sortR), ]))) {
        R <- t(as.matrix(r[1:nrow(r) - 1, 1:ncol(r) - 1]))
    } else {
        if (is.null(dim(R))) {
            R <- t(as.matrix(R))
        }
        R <- r[1:nrow(r) - 1, 1:ncol(r) - 1]
    }
    par(mar = rep(0, 4))
    plot.window(xlim = c(-1, ncol(R) + 1), ylim = c(-1, nrow(R) + 1), asp = 1)
    M <- cbind(c(row(R)), c(col(R))) - 1
    shade <- ifelse(R[nrow(R):1, ], mdc(1), mdc(2))
    rect(M[, 2], M[, 1], M[, 2] + 1, M[, 1] + 1, col = shade)
    adj = c(0, 0.5)
    srt = 90
    for (i in 1:ncol(R)) {
        text(i - 0.5, nrow(R) + 0.3, colnames(r)[i], adj = adj, srt = srt)
        text(i - 0.5, -0.3, nmis[order(nmis)][i])
    }
    for (i in 1:nrow(R)) {
        text(ncol(R) + 0.3, i - 0.5, r[(nrow(r) - 1):1, ncol(r)][i], adj = 0)
        text(-0.3, i - 0.5, rownames(r)[(nrow(r) - 1):1][i], adj = 1)
    }
    text(ncol(R) + 0.3, -0.3, r[nrow(r), ncol(r)])
    dev.off()
}
Now I would like to insert this into Col2 of table_B. Col2 in both tables is a Memo field. It works with
CurrentDb.Execute "insert into Table_B (Col1,Col2) select Col1,Col2 from Table_A"
but it does not work if I use a DAO.Recordset as below:
CurrentDb.Execute "insert into Table_B (Col1,Col2) values (2,'" & Rs_TableA.Fields("Col2") & "')"
It gives run-time error 3075, which says something is wrong with the syntax. I tried replacing ! and " in the function text, but it did not work. I also tried saving the value in a string variable before inserting, and that did not work either. Since I need to loop through table_A, can anyone help? Thanks!
The function text contains apostrophes and quote characters, which have special meaning in SQL statements. The SELECT subquery version has no issue, but the SQL string built from the recordset value treats those characters as delimiters rather than plain text, which makes the compiled statement nonsense to the SQL engine. Review How do I escape a single quote in SQL Server?.
Options for handling:
Replace(Replace([fieldname], "'", "''"), Chr(34), Chr(34) & Chr(34))
Open a source recordset and a target recordset, loop through the source, and use AddNew and Update to write each record to the target
Maybe the SELECT subquery version will actually serve the requirement; if the ID should be supplied dynamically from a textbox:
CurrentDb.Execute "INSERT INTO Table_B (Col1,Col2) SELECT " & Me.tbxID & " As C1, Col2 FROM Table_A"
Also, there are two slanted apostrophes in the function text where I think there should be normal apostrophes.
Related
I'm trying to execute a raw query in Django where I want to pick the column names dynamically. For example:
from django.db import connections

def func(all=True):
    if all:
        query_parameters = {
            'col': '*'
        }
    else:
        query_parameters = {
            'col': 'a,b,c'
        }
    with connections["redshift"].cursor() as cursor:
        cursor.execute(
            "select %(col)s from table limit 2 ;",
            query_parameters,
        )
        val = dictfetchall(cursor)  # dictfetchall() is the row-to-dict helper from the Django docs
        return val
Django executes it as
select "*" from table limit 2;
so the output is just the literal string
*
and in the else case it executes
select "a,b,c" from table limit 2;
so the output is the literal string a,b,c
How can I run the command so that Django runs it as
select a, b, c from table limit 2
so that the output is
a b c
1 2 3
4 5 6
I found a hack myself. Here is the query, prepared step by step.
Input data (the columns I need):
self.export_col = "a,b,c"
def calc_col(self):
    if self.exp == 'T':
        select_col = ""
        self.export_col = self.export_col.split(',')
        for col in self.export_col:
            select_col += col + ","
        select_col = select_col[:-1]  # drop the trailing comma
        self.export_col = select_col
    else:
        self.export_col = '*'  # select every column

def prepare_query(self):
    query = " SELECT "
    query += self.export_col
    query += " from table limit 2;"
    return query
I have a large SQL database (7 GB), where the fields appear to have quotation marks stored in them.
For example:
res <- dbSendQuery(con, "
SELECT *
FROM master")
dbf2 <- fetch(res, n = 3)
dbClearResult(res)
Yields
NPI EntityTypeCode ReplacementNPI EmployerIdentificationNumber.EIN.
1 "1679576722" "1" "" ""
2 "1588667638" "1" "" ""
3 "1497758544" "2" "" "<UNAVAIL>"
ProviderOrganizationName.LegalBusinessName. ProviderLastName.LegalName. ProviderFirstName
1 "" "WIEBE" "DAVID"
2 "" "PILCHER" "WILLIAM"
3 "CUMBERLAND COUNTY HOSPITAL SYSTEM, INC" "" ""
I've been trying to get a smaller table by filtering on, say, EntityTypeCode, but I'm not getting any results. Here's an example of a query that returns nothing; any advice? I think the issue is the double quotes stored in the fields.
# Filter on State
res <- dbSendQuery(npi2, "
SELECT *
FROM master
WHERE (ProviderBusinessPracticeLocationAddressStateName = 'PA')
")
# Filter on State and type
res <- dbSendQuery(npi2, "
SELECT *
FROM master
WHERE (ProviderBusinessPracticeLocationAddressStateName = 'PA') AND
(EntityTypeCode = '1')
")
Escape the inner double quotes (i.e., the ones stored in the cell) with a \.
res <- dbSendQuery(npi2, "
SELECT *
FROM master
WHERE (ProviderBusinessPracticeLocationAddressStateName = '\"PA\"') AND
(EntityTypeCode = '1')
")
This produces the following string:
SELECT *
FROM master
WHERE (ProviderBusinessPracticeLocationAddressStateName = '"PA"')
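If you would rather not escape the quotes in every filter, a one-time cleanup is another option. This is only a sketch: it assumes the npi2 connection from the question, a DBI version that provides dbExecute(), and SQLite's REPLACE() function; the column names are taken from the output above.
library(DBI)
# Strip the stored double quotes once, so later filters can use plain values.
dbExecute(npi2, "
  UPDATE master
  SET EntityTypeCode = REPLACE(EntityTypeCode, '\"', ''),
      ProviderBusinessPracticeLocationAddressStateName =
        REPLACE(ProviderBusinessPracticeLocationAddressStateName, '\"', '')
")
After that, the original WHERE (EntityTypeCode = '1') filter should match without any escaping.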
I am attempting to write a data frame from R to Teradata. The data frame is wide (over 100 columns), and writing to Teradata requires declaring both the name and the class of each variable. Note that the data below just serves as an example.
iris$integerRandom <- seq_along(iris$Sepal.Length)
iris$Dates <- seq.Date(as.Date("2018-01-01"), by = "d", length.out = nrow(iris))
iris$Dates2 <- seq.Date(as.Date("2019-01-01"), by = "d", length.out = nrow(iris))
iris$Species <- as.character(iris$Species)
iris$characterRandom <- sample(letters, nrow(iris), replace = TRUE)
## Getting Numeric and Integer Names first
names_num <- names(iris)[which(sapply(iris, class) %in% c("integer", "numeric"))]
names_date <- names(iris)[which(sapply(iris, class) %in% "Date")]
names_character <- names(iris)[which(sapply(iris, class) %in% "character")]
## Generating variable names with corresponding variable types
paste(gsub("varchar(300)", '"varchar(300)"', gsub(",", " = varchar(300), ", toString(names_character))), "varchar(300)", sep = " = ")
paste(gsub(",", " = date", toString(names_date)), " = date")
paste(gsub("varchar(300)", '"float"', gsub(",", " = float, ", toString(names_num))), "float", sep = " = ")
Ideally, I would like the output to read
Species = "varchar(300)", characterRandom = "varchar(300)", and so forth. Note that the order of the variables is important, since the order matters when declaring the names and types to Teradata (or SQL in general; the code will probably work for both). So the output should keep the original column order, starting with Sepal.Length and ending with characterRandom.
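For reference, here is a minimal sketch of one way to build that declaration while preserving column order, assuming the modified iris from above; the named vector it produces is the kind of value that could feed a field-types argument when writing the table.
# Map each column's class to a Teradata type string, keeping column order.
type_map <- sapply(iris, function(col) {
  if (inherits(col, "Date")) "date"
  else if (is.character(col) || is.factor(col)) "varchar(300)"
  else "float"
})
# Collapse into the 'name = "type"' form shown above; the result starts with
# Sepal.Length = "float" and ends with characterRandom = "varchar(300)".
paste0(names(type_map), ' = "', type_map, '"', collapse = ", ")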
I have the following zoo object (res)
column1 column2 column3
2015-12-30 3.2735 2.3984 1.1250
2015-12-31 2.5778 1.8672 1.1371
2016-01-01 3.3573 2.4999 1.1260
2016-01-04 3.3573 2.4999 1.1463
and I would like to produce a vectorized update query.
UPDATE table SET column1=3.2735, column2=2.3984, column3=1.1250 WHERE dt = '2015-12-30';
UPDATE table SET column1=2.5778, column2=1.8672, column3=1.1371 WHERE dt = '2015-12-31';
etc.
I was able to do something similar previously for an INSERT query
sColumns <- paste0("dt, index, ", paste0(colnames(res), collapse=", "))
sValues = apply(data.frame(paste0("'", index(res), "'"), paste0("'", index, "'"), coredata(res)),
1, paste, collapse = ",")
sql <- paste0("INSERT INTO table (", sColumns, ") VALUES (", sValues, ")")
which was considerably easier because all the column names were grouped together and all the values were grouped together. For an UPDATE query, I have to alternate column names and values.
So far, I have the following:
sColumns <- paste0(colnames(res), "=")
tmp <- paste(c(matrix(c(sColumns, res[1, ]), 2, byrow = T)), collapse = ", ")
tmp <- gsub("=, ", "=", tmp)
This produces output like the following (for one row):
[1] "column1=3.2735, column2=2.3984, column3=1.125"
Can anyone provide guidance as to how I can use something like apply() to do this for all rows of 'res'?
1) Try this:
library(zoo)
sapply(1:nrow(res), function(i) paste0(
"UPDATE table SET ",
toString(paste0(names(res), "=", coredata(res)[i, ])),
" WHERE dt='", time(res)[i], "'"))
giving the following character vector:
[1] "UPDATE table SET column1=3.2735, column2=2.3984, column3=1.125 WHERE dt='2015-12-30'"
[2] "UPDATE table SET column1=2.5778, column2=1.8672, column3=1.1371 WHERE dt='2015-12-31'"
[3] "UPDATE table SET column1=3.3573, column2=2.4999, column3=1.126 WHERE dt='2016-01-01'"
[4] "UPDATE table SET column1=3.3573, column2=2.4999, column3=1.1463 WHERE dt='2016-01-04'"
2) And a variation giving the same result:
sapply(unname(split(res, time(res))), function(z) paste0(
"UPDATE table SET ",
toString(paste0(names(z), "=", z)),
" WHERE dt='", time(z), "'"))
Note 1: If your table is not too large then you could alternatively consider reading it into R, performing the update in R, and then writing it back.
Note 2: Here is the input shown reproducibly:
Lines <- "Date column1 column2 column3
2015-12-30 3.2735 2.3984 1.1250
2015-12-31 2.5778 1.8672 1.1371
2016-01-01 3.3573 2.4999 1.1260
2016-01-04 3.3573 2.4999 1.1463"
library(zoo)
res <- read.zoo(text = Lines, header = TRUE)
Is this what you want?
foo <- apply(res,1, function(x){
sprintf("%s = %f", names(x),x)
})
lapply(colnames(foo), function(nn) {
sprintf("UPDATE table SET %s WHERE dt = \'%s\'",
paste(foo[,nn], collapse=","),
nn)
})
which gives:
[[1]]
[1] "UPDATE table SET column1 = 3.273500,column2 = 2.398400,column3 = 1.125000 WHERE dt = '2015-12-30'"
[[2]]
[1] "UPDATE table SET column1 = 2.577800,column2 = 1.867200,column3 = 1.137100 WHERE dt = '2015-12-31'"
[[3]]
[1] "UPDATE table SET column1 = 3.357300,column2 = 2.499900,column3 = 1.126000 WHERE dt = '2016-01-01'"
[[4]]
[1] "UPDATE table SET column1 = 3.357300,column2 = 2.499900,column3 = 1.146300 WHERE dt = '2016-01-04'"
I would like to create a function to connect to a SQLite database by passing it two parameters: database name and table.
I tried this:
sqLiteConnect <- function(database, table) {
con <- dbConnect("SQLite", dbname = database)
query <- dbSendQuery(con, "SELECT * FROM ", table)
fetch(query, n = -1)
}
But when I call result <- sqLiteConnect(primary_database, "table_name") I get Error in sqliteExecStatement(conn, statement, ...) : RS-DBI driver: (error in statement: near " ": syntax error)
If I change my function to
sqLiteConnect <- function(database, table) {
con <- dbConnect("SQLite", dbname = database)
query <- dbSendQuery(con, "SELECT * FROM ", table, "")
fetch(query, n = -1)
}
I get Error in sqliteExecStatement(conn, statement, ...) : unused argument ("")
I guess the problem is in concatenating a variable to a string.
dbSendQuery requires the SQL statement as a single character string (it does not take the table as an argument), so you would need to create it using either paste() or sprintf(), for example:
sqLiteConnect <- function(database, table) {
con <- dbConnect("SQLite", dbname = database)
query <- dbSendQuery(con, paste("SELECT * FROM ", table, ";", sep=""))
fetch(query, n = -1)
}