SQL query in Orange data mining software

When I try to run a SQL query (a SELECT statement) against a PostgreSQL database in Orange data mining software, it returns an error:
INVALID CONNECTION OPTION 'PASSWD'
My query looks like this:
select * from CFAR_K7_DBTF_ALL;

This error still occurs, even though there is a four-year-old open ticket: http://old.biolab.si/trac/ticket/1218
I made this change to make it work:
--- Orange\data\sql.py
+++ (clipboard)
@@ -106,6 +106,7 @@
     }
     if schema == 'postgres':
         argTrans["database"] = "dbname"
+        argTrans["password"] = "password"
     elif schema == 'odbc':
         argTrans["host"] = "server"
         argTrans["user"] = "uid"

Related

R equivalent of SQL update statement

I use the following statement to update a PostgreSQL table:
update users
set col1='setup',
col2= 232
where username='rod';
Can anyone guide me on how to do something similar in R? I am not good at R.
Thanks in advance for the help.
Since you didn't provide any data, I've created some here.
users <- data.frame(username = c('rod','stewart','happy'), col1 = c(NA_character_,'do','run'), col2 = c(111,23,145), stringsAsFactors = FALSE)
To update using base R:
users[users$username == 'rod', c('col1','col2')] <- c('setup', 232)
If you prefer the more explicit syntax provided by the data.table package, you would execute:
library(data.table)
setDT(users)
users[username == 'rod', `:=`(col1 = 'setup', col2 = 232)]
To update your database through RPostgreSQL, you will first need to create a database connection, and then simply store your query in a string, e.g.
library(RPostgreSQL)
con <- dbConnect(PostgreSQL(), dbname = <your database name>, user = <user>, password = <password>)
statement <- "update <schema>.users set col1='setup', col2= 232 where username='rod';"
dbGetQuery(con, statement)
dbDisconnect(con)
Note: depending on your PostgreSQL configuration, you may also need to set your search path: dbGetQuery(con, 'set search_path = <schema>;')
I'm more familiar with RPostgres, so you may want to double-check the syntax and vignettes of the RPostgreSQL package.
EDIT: It seems RPostgreSQL prefers dbGetQuery over dbSendQuery for sending updates and commands.
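For readers coming from other languages, the same parameterized UPDATE follows the usual DB-API shape. Here is a minimal Python sketch against an in-memory SQLite database (the table and values mirror the question; a real PostgreSQL connection would use a driver such as psycopg2 instead):

```python
import sqlite3

# Stand-in database with the users table from the question.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (username TEXT, col1 TEXT, col2 INTEGER)")
conn.executemany("INSERT INTO users VALUES (?, ?, ?)",
                 [("rod", None, 111), ("stewart", "do", 23)])

# Parameterized form of: update users set col1='setup', col2=232 where username='rod';
conn.execute("UPDATE users SET col1 = ?, col2 = ? WHERE username = ?",
             ("setup", 232, "rod"))
conn.commit()

print(conn.execute("SELECT col1, col2 FROM users WHERE username = 'rod'").fetchone())
# ('setup', 232)
```

The placeholders keep the values out of the SQL string itself, which is the same reason to prefer bound parameters in RPostgreSQL.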

How to connect to SQL Server using Perl

I am new to Perl and am trying to create a script that connects to my Oracle SQL Server database and returns the results of a table query via email. I am having trouble with the database connection part. Any ideas where I should start? Any example code would be greatly appreciated. Thanks,
Below is some sample code to get you connected to Oracle using the Perl DBI module. Adjust database name, username, and password as necessary.
#!/usr/bin/perl
use strict;
use DBI;
my $dbName = 'mydb';
my $username = 'username';
my $password = 'password';
my $options = { RaiseError => 1 };
my $dbh = DBI->connect("dbi:Oracle:${dbName}", $username, $password, $options);
$dbh is a database handle that you can use to execute all the queries that you like. See the DBI documentation page at CPAN for a concise description of the methods available.
First, you should be aware of the TNSNAMES.ORA file, which has predefined connections in the form of:
ORA11 =
  (DESCRIPTION =
    (ADDRESS_LIST =
      (ADDRESS = (PROTOCOL = TCP)(HOST = 192.168.1.0)(PORT = 1521))
    )
    (CONNECT_DATA =
      (SERVICE_NAME = ORA12)
    )
  )
(Check for the above connection on the server host if you happen to be on a different machine.)
Now you can use ORA11 as the db name:
my $DB = DBI->connect(
    "dbi:Oracle:",
    "USER/PASSWORD\@ORA11",
    "",
    {
        # ChopBlanks => 1,
        # AutoCommit => 0,
    },
);
or use the complete connection string instead of ORA11:
"USER/PASSWORD\@(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(HOST=192.168.1.0)(PORT=1521)))(CONNECT_DATA=(SERVICE_NAME=ORA12)))"
More connection options can be found in DBD::Oracle.
I have a few dozen scripts that perform Oracle, Sybase, and MS SQL queries; some even run queries against multiple databases and combine the results, or build a query for one database based on the result of a prior query. But that only came after many days of trying to get the Perl libraries to play well together.
For those of us who are real hacks, we start by using the Oracle SQL command-line client sqlplus.exe, which makes this approach easy, but far from pretty:
my $SQLfile = "temp.sql";
my $run_sql = "sqlplus.exe -s DBuser/DBpwd\@DBname < $SQLfile";

sub GET_EMP_LIST
{
    my $status = $_[0];
    my $sql_text = "
set linesize 150
set pagesize 0
set numf 99999999999
set feedback off
SELECT
    EMP.FIRST || ',' ||
    EMP.LAST || ',' ||
    EMP.PHONE || ',' ||
    EMP.SALARY
FROM
    PERSONNEL.EMPLOYEES EMP
WHERE
    (EMP.STATUS = '$status');";

    # write the query to a temp file, run sqlplus on it, capture the output
    open (SQL, '>', $SQLfile) or die "can't write $SQLfile: $!";
    print SQL $sql_text;
    close (SQL);
    my @Results = `$run_sql`;
    unlink $SQLfile;
    return @Results;
}

# MAIN
my @Employees = GET_EMP_LIST("Active");
for (@Employees)
{
    chomp(my $temp = $_);
    $temp =~ s/\s+//g;    # get rid of white space
    my ($FIRST, $LAST, $PHONE, $SALARY) = split /,/, $temp;
    # ... do something with it ...
}
Like I say, it's far from pretty, but it's quick and easy, and using SQL query tools like TOAD you can generate the SQL in a drag-and-drop program before you integrate it into your script.
I know many folks will say this is a terrible solution, but we pull in data that includes hundreds of thousands of lines.
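The shell-out-and-parse pattern itself is language-agnostic. As a hedged sketch, here is the same capture-and-split loop in Python, with `echo` standing in for the sqlplus invocation (the command, columns, and data are all made up for illustration):

```python
import subprocess

# Stand-in for: sqlplus -s user/pwd@db < temp.sql
# (echo fakes the comma-separated rows sqlplus would print)
fake_output = subprocess.run(
    ["echo", "Ann,Lee,555-0101,50000\nBob,Kim,555-0102,60000"],
    capture_output=True, text=True).stdout

employees = []
for line in fake_output.splitlines():
    line = "".join(line.split())   # mirror the s/\s+//g cleanup
    if not line:
        continue
    first, last, phone, salary = line.split(",")
    employees.append((first, last, phone, int(salary)))

print(employees)
```

The parsing step is the fragile part of this approach: any comma inside a data value breaks the split, which is one more reason the DBI route is preferable when you can get it working.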

SQL statement with conditionals

I'm currently building a food application which has users select a range of options (perishable or non-perishable; snack, produce, or meal) via a set of radio buttons. I'm using Node.js and sqlite3 to query a database to determine which entries to return to the user after they search.
I want to write a query such that when the booleans from the client side are sent over to the server, the server chooses entries accordingly: if perishable is set to true on the client, the query finds just the perishable items, and vice versa. I also want the same functionality with the snack/meal/produce options.
Example:
var perishable = request.perishable;
var nonPerishable = request.nonPerishable;
var snack = request.snack;
var meal = request.meal;
var produce = request.produce;
var q = 'SELECT * FROM posts WHERE available = true AND (if perishable is true, all rows where the perishable column is set to true... etc)';
Why not just change the query based on perishable?
var q = 'SELECT * FROM posts WHERE available = true';
if (perishable)
    q += ' AND perishable = true';
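The same flag-driven WHERE-clause building can be sketched in Python with the stdlib sqlite3 module (table and column names are illustrative, and placeholders keep the values bound rather than concatenated):

```python
import sqlite3

def find_posts(conn, perishable=None, snack=None):
    # Build the WHERE clause from whichever flags the client sent;
    # a flag left as None adds no condition at all.
    clauses, params = ["available = ?"], [1]
    if perishable is not None:
        clauses.append("perishable = ?")
        params.append(1 if perishable else 0)
    if snack is not None:
        clauses.append("snack = ?")
        params.append(1 if snack else 0)
    query = "SELECT * FROM posts WHERE " + " AND ".join(clauses)
    return conn.execute(query, params).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER, available INTEGER, perishable INTEGER, snack INTEGER)")
conn.executemany("INSERT INTO posts VALUES (?, ?, ?, ?)",
                 [(1, 1, 1, 0), (2, 1, 0, 1), (3, 0, 1, 0)])

print(find_posts(conn, perishable=True))   # only rows that are available and perishable
```

Only the clause text is concatenated; every value travels as a bound parameter, which avoids injection even though the query shape is dynamic.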
You can do it with CASE WHEN syntax in the WHERE clause:
WHERE perishable IS CASE WHEN [condition] THEN TRUE ELSE FALSE END
Just a quick tip: search for this syntax in your database's documentation.

ROracle Errors When Trying to Use Bound Parameters

I'm using ROracle on a Win7 machine running the following R version:
platform x86_64-w64-mingw32
arch x86_64
os mingw32
system x86_64, mingw32
status
major 3
minor 1.1
year 2014
month 07
day 10
svn rev 66115
language R
version.string R version 3.1.1 (2014-07-10)
nickname Sock it to Me
Eventually, I'm going to move the script to a *nix machine, cron it, and run it with RScript.
I want to do something similar to:
select * from tablename where 'thingy' in ('string1','string2')
This would return two rows with all columns in SQLDeveloper (or Toad, etc).
(Ultimately, I want to pull results from one DB into a single column in a data.frame, then use those results to loop through and pull results from a second DB; but I also need to be able to do just this function as well.)
I'm following the documentation for ROracle from here.
I've also looked at this (which didn't get an answer):
Bound parameters in ROracle SELECT statements
When I attempt the query from ROracle, I get two different errors, depending on whether I try a dbGetQuery() or dbSendQuery().
As background, here are the versions, queries and data I'm using:
Driver name: Oracle (OCI)
Driver version: 1.1-11
Client version: 11.2.0.3.0
The connection information is standard:
library(ROracle)
ora <- dbDriver("Oracle")
dbcon <- dbConnect(ora, username = "username", password = "password", dbname = "dbnamefromTNS")
These two queries return the expected results:
rs_send <- dbSendQuery(dbcon, "select * from tablename where columname_A = 'thingy' and rownum <= 1000")
rs_get <- dbGetQuery(dbcon, "select * from tablename where columname_A = 'thingy' and rownum <= 1000")
That is to say, 1000 rows from tablename where 'thingy' exists in columnname_A.
I have a data.frame of one column, with two rows.
my.data = data.frame(RANDOM_STRING = as.character(c('string1', 'string2')))
and str(my.data) returns this:
str(my.data)
'data.frame': 2 obs. of 1 variable:
$ RANDOM_STRING: chr "string1" "string2"
My attempted queries are:
nope <- dbSendQuery(dbcon, "select * from tablename where column_A = 'thingy' and widget_name = :1", data = data.frame(widget_name = my.data$RANDOM_STRING))
which gives me an error of:
Error in .oci.SendQuery(conn, statement, data = data, prefetch = prefetch, :
bind data does not match bind specification
and
not_this_either <- dbGetQuery(dbcon, "select * from tablename where column_A = 'thingy' and widget_name = :1", data = data.frame(widget_name = my.data$RANDOM_STRING))
which gives me an error of:
Error in .oci.GetQuery(conn, statement, data = data, prefetch = prefetch, :
bind data has too many rows
I'm guessing that my problem is in the data=(widget_name=my.data$RANDOM_STRING) part of the queries, but haven't been able to rubber duck my way through it.
Also, I'm very curious as to why I get two separate and different errors depending on whether the queries use the send (and fetch later) format or the get format.
If you like the tidyverse, there's a slightly more compact way to achieve the above using purrr:
library(ROracle)
library(purrr)
ora <- dbDriver("Oracle")
con <- dbConnect(ora, username = "username", password = "password", dbname = "yourdbnamefromTNSlist")
yourdatalist <- c(12345, 23456, 34567)
output <- map_df(yourdatalist, ~ dbGetQuery(con, "select * from YourTableNameHere where YOURCOLUMNNAME = :d", .x))
Figured it out.
It wasn't a problem with Oracle or ROracle (I'd suspected this) but with my R code.
I stumbled over the answer trying to solve another problem.
This answer about "dynamic strings" was the thing that got me moving towards a solution.
It doesn't fit exactly, but close enough to rubberduck my way to an answer from there.
The trick is to wrap the whole thing in a function and run plyr's ldply() on it:
library(ROracle)
library(plyr)
ora <- dbDriver("Oracle")
con <- dbConnect(ora, username = "username", password = "password", dbname = "yourdbnamefromTNSlist")
yourdatalist <- c(12345, 23456, 34567)
thisfinallyworks <- function(x) {
dbGetQuery(con, "select * from YourTableNameHere where YOURCOLUMNNAME = :d", data = x)
}
ldply(yourdatalist, thisfinallyworks)
row1 of results where datapoint in YOURCOLUMNNAME = 12345
row2 of results where datapoint in YOURCOLUMNNAME = 23456
row3 of results where datapoint in YOURCOLUMNNAME = 34567
etc
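For comparison, the bind-one-row-per-call pattern that ldply()/map_df() implement can be sketched outside R as well. Here it is in Python's stdlib sqlite3 against an in-memory table (names and data are illustrative): each execute gets exactly one bound value, and the per-value result sets are stacked in order.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE widgets (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO widgets VALUES (?, ?)",
                 [(12345, "a"), (23456, "b"), (34567, "c"), (99999, "d")])

yourdatalist = [12345, 23456, 34567]

# One query execution per bound value, results concatenated --
# the same shape as looping dbGetQuery over yourdatalist.
rows = []
for value in yourdatalist:
    rows.extend(conn.execute("SELECT * FROM widgets WHERE id = ?", (value,)))

print(rows)  # [(12345, 'a'), (23456, 'b'), (34567, 'c')]
```

This sidesteps the "bind data has too many rows" class of error entirely, at the cost of one round trip per value.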

Update From clause in SQL Server CE doesn't work; any solutions?

I have this SQL statement:
UPDATE Movement_Item_Lots
SET Batch_Code = (SELECT WHSS.Batch_Code
FROM WH_Stock_Serials AS WHSS
WHERE WHSS.Item_Code = Movement_Item_Lots.Item_Code
AND WHSS.From_Distribution_Code = Movement_Item_Lots.Distribution_Code
)
it returns:
There was an error parsing the query.
[ Token line number = 2,Token line offset = 19,Token in error = SELECT ]
I know this is a common issue with SQL Server CE, in that it can't do UPDATE ... FROM. Is there any workaround for this issue?
Change to SQLite if possible; this SQL would work there. If that's not possible, you can always split the statement in your program:
var <- SELECT WHSS.Batch_Code...
UPDATE .. SET Batch_Code = var
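A minimal sketch of that split-the-statement workaround, shown here in Python on an in-memory SQLite database (the table and column names follow the question; the data-access calls in your actual app will differ):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Movement_Item_Lots (Item_Code TEXT, Distribution_Code TEXT, Batch_Code TEXT)")
conn.execute("CREATE TABLE WH_Stock_Serials (Item_Code TEXT, From_Distribution_Code TEXT, Batch_Code TEXT)")
conn.execute("INSERT INTO Movement_Item_Lots VALUES ('A1', 'D1', NULL)")
conn.execute("INSERT INTO WH_Stock_Serials VALUES ('A1', 'D1', 'BATCH-42')")

# Step 1: SELECT what the correlated subquery would have found.
matches = conn.execute(
    "SELECT Batch_Code, Item_Code, From_Distribution_Code FROM WH_Stock_Serials").fetchall()

# Step 2: issue a plain single-table UPDATE per matched row --
# no subquery or UPDATE ... FROM needed, so SQL Server CE's parser is happy.
for batch, item, dist in matches:
    conn.execute(
        "UPDATE Movement_Item_Lots SET Batch_Code = ? "
        "WHERE Item_Code = ? AND Distribution_Code = ?",
        (batch, item, dist))
conn.commit()

print(conn.execute("SELECT Batch_Code FROM Movement_Item_Lots").fetchone())  # ('BATCH-42',)
```

The trade-off is one UPDATE per matched row instead of a single set-based statement, which is usually acceptable on the small datasets typical of SQL Server CE deployments.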