sql: expected 3 destination arguments in Scan, not 1 in Golang

I am writing generic code to query data from any RDS table. I have gone through many StackOverflow answers, but none of them worked for me. I have gone through the links below:
panic: sql: expected 1 destination arguments in Scan, not <number> golang, pq, sql
How to query any table of RDS using Golang SDK
My first code is
package main
import (
"encoding/json"
"fmt"
)
type BA_Client struct {
ClientId int `json:"ClientId"`
CompanyName string `json:"CompanyName"`
CreateDate string `json:"CreateDate"`
}
func main() {
conn, _ := getConnection() // Get database connection
query := `select * from IMBookingApp.dbo.BA_Client
ORDER BY
ClientId ASC
OFFSET 56 ROWS
FETCH NEXT 10 ROWS ONLY ;`
var p []byte
err := conn.QueryRow(query).Scan(&p)
if err != nil {
fmt.Println("Error1", err)
}
fmt.Println("Data:", p)
var m BA_Client
err = json.Unmarshal(p, &m)
if err != nil {
fmt.Println("Error2", err)
}
}
My second code is
conn, _ := getConnection()
query := `select * from IMBookingApp.dbo.BA_Client__c
ORDER BY
ClientId__c ASC
OFFSET 56 ROWS
FETCH NEXT 10 ROWS ONLY ;`
rows, err := conn.Query(query)
if err != nil {
fmt.Println("Error:")
log.Fatal(err)
}
println("rows", rows)
defer rows.Close()
columns, err := rows.Columns()
fmt.Println("columns", columns)
if err != nil {
panic(err)
}
for rows.Next() {
receiver := make([]*string, len(columns))
err := rows.Scan(&receiver)
if err != nil {
fmt.Println("Error reading rows: " + err.Error())
}
fmt.Println("Data:", p)
fmt.Println("receiver", receiver)
}
With both code snippets, I am getting the same error:
sql: expected 3 destination arguments in Scan, not 1
Could it be because I am using SQL Server and not MySQL? I'd appreciate it if anyone can help me find the issue.

When you want to use a slice of inputs for your row scan, use the variadic 3-dots notation ... to convert the slice into individual parameters, like so:
err := rows.Scan(receiver...)
With your three columns, the variadic argument effectively expands like so:
// len(receiver) == 3
err := rows.Scan(receiver[0], receiver[1], receiver[2])
EDIT:
Scan's parameter values must be of type interface{}, so we need an intermediate slice. For example:
is := make([]interface{}, len(receiver))
for i := range is {
is[i] = &receiver[i]
// each is[i] is an interface{} value - compatible with Scan() -
// pointing at the corresponding `*string` element of `receiver`
}
// ...
err := rows.Scan(is...)
// `receiver` now holds the scanned values as `*string` items (nil for NULL columns)
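Putting the pieces together, a minimal sketch of the full row loop, reusing conn and query from the question's second snippet (NULL columns are simply left as nil):
rows, err := conn.Query(query)
if err != nil {
	log.Fatal(err)
}
defer rows.Close()
columns, err := rows.Columns()
if err != nil {
	log.Fatal(err)
}
for rows.Next() {
	receiver := make([]*string, len(columns)) // one *string per column; stays nil for NULL
	dest := make([]interface{}, len(columns)) // Scan wants interface{} destinations
	for i := range receiver {
		dest[i] = &receiver[i]
	}
	if err := rows.Scan(dest...); err != nil {
		fmt.Println("Error reading rows: " + err.Error())
		continue
	}
	for i, col := range columns {
		if receiver[i] != nil {
			fmt.Println(col, "=", *receiver[i])
		} else {
			fmt.Println(col, "= NULL")
		}
	}
}
if err := rows.Err(); err != nil {
	log.Fatal(err)
}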

Try this (no ampersand in front of receiver) for the second example:
err := rows.Scan(receiver)
receiver and *string are already reference types; there is no need to take another reference to them.

Select specific fields instead of the whole table, so the columns match your Scan destinations:
query := `select * from IMBookingApp.dbo.BA_Client__c
ORDER BY
ClientId__c ASC
OFFSET 56 ROWS
FETCH NEXT 10 ROWS ONLY ;`
change to
query := `select ClientId, CompanyName, CreateDate from IMBookingApp.dbo.BA_Client__c
ORDER BY
ClientId__c ASC
OFFSET 56 ROWS
FETCH NEXT 10 ROWS ONLY ;`
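With the column list fixed, each row can be scanned straight into a struct; a minimal sketch, assuming the three selected columns map onto the BA_Client struct from the question:
for rows.Next() {
	var c BA_Client
	if err := rows.Scan(&c.ClientId, &c.CompanyName, &c.CreateDate); err != nil {
		fmt.Println("Error reading rows: " + err.Error())
		continue
	}
	fmt.Printf("%+v\n", c)
}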

Related

Go SQL query inconsistency

I am experiencing some really weird inconsistencies when executing queries, and was wondering if anyone knew why.
Imagine I have a struct defined as follows:
type Result struct {
Afield string `db:"A"`
Bfield interface{} `db:"B"`
Cfield string `db:"C"`
Dfield string `db:"D"`
}
And a MySQL table with the following columns:
A : VARCHAR(50)
B : INT
C : VARCHAR(50)
D : VARCHAR(50)
The query I would like to execute:
SELECT A, B, C, D FROM table WHERE A="a"
first way it can be executed:
db.Get(&result, `SELECT A, B, C, D FROM table WHERE A="a"`)
second way it can be executed:
db.Get(&result, `SELECT A, B, C, D FROM table WHERE A=?`, "a")
The inconsistencies I am experiencing are as follows: when executing the query the first way, the type of Bfield is int. However, when executing it the second way, it is []uint8.
This outcome is occurring for example when B is 1.
Why is the type of Bfield different depending on how the query is executed?
connection declaration:
// Connection is an interface for making queries.
type Connection interface {
Exec(query string, args ...interface{}) (sql.Result, error)
Get(dest interface{}, query string, args ...interface{}) error
Select(dest interface{}, query string, args ...interface{}) error
}
EDIT
This is also happening using the Go database/sql package + driver. The queries below end up with Bfield of type []uint8 and int64 respectively.
db is of type *sql.DB
query 1:
db.QueryRow(`SELECT A, B, C, D FROM table WHERE A="a"`).Scan(&result.Afield, &result.Bfield, &result.Cfield, &result.Dfield)
-- > type of Bfield is []uint8
query 2:
db.QueryRow(`SELECT A, B, C, D FROM table WHERE A=?`, "a").Scan(&result.Afield, &result.Bfield, &result.Cfield, &result.Dfield)
--> type of Bfield is int64
EDIT
Something else to note: when chaining multiple WHERE conditions, as long as at least one of them is bound using ?, the query returns an int. If they are all written directly into the string, it returns []uint8.
Short answer: because the MySQL driver uses a different protocol for queries with and without parameters. Use a prepared statement to get consistent results.
The following explanation refers to the standard MySQL driver github.com/go-sql-driver/mysql, version 1.4
In the first case, the driver sends the query directly to MySQL, and interprets the result as a *textRows struct. This struct (almost) always decodes results into a byte slice, and leaves the conversion to a better type to the Go sql package. This works fine if the destination is an int, string, sql.Scanner etc, but not for interface{}.
In the second case, the driver detects that there are arguments and returns driver.ErrSkip. This causes the Go SQL package to use a PreparedStatement. And in that case, the MySQL driver uses a *binaryRows struct to interpret the results. This struct uses the declared column type (INT in this case) to decode the value, in this case to decode the value into an int64.
Fun fact: if you add the interpolateParams=true parameter to the database DSN (e.g. "root:testing@/mysql?interpolateParams=true"), the MySQL driver will prepare the query on the client side and not use a PreparedStatement. At that point both types of query behave the same.
A small proof of concept:
package main
import (
"database/sql"
"log"
_ "github.com/go-sql-driver/mysql"
)
type Result struct {
Afield string
Bfield interface{}
}
func main() {
db, err := sql.Open("mysql", "root:testing#/mysql")
if err != nil {
log.Fatal(err)
}
defer db.Close()
if _, err = db.Exec(`CREATE TABLE IF NOT EXISTS mytable(A VARCHAR(50), B INT);`); err != nil {
log.Fatal(err)
}
if _, err = db.Exec(`DELETE FROM mytable`); err != nil {
log.Fatal(err)
}
if _, err = db.Exec(`INSERT INTO mytable(A, B) VALUES ('a', 3)`); err != nil {
log.Fatal(err)
}
var (
usingLiteral Result
usingParam Result
usingLiteralPrepared Result
)
row := db.QueryRow(`SELECT B FROM mytable WHERE A='a'`)
if err := row.Scan(&usingLiteral.Bfield); err != nil {
log.Fatal(err)
}
row = db.QueryRow(`SELECT B FROM mytable WHERE A=?`, "a")
if err := row.Scan(&usingParam.Bfield); err != nil {
log.Fatal(err)
}
stmt, err := db.Prepare(`SELECT B FROM mytable WHERE A='a'`)
if err != nil {
log.Fatal(err)
}
defer stmt.Close()
row = stmt.QueryRow()
if err := row.Scan(&usingLiteralPrepared.Bfield); err != nil {
log.Fatal(err)
}
log.Printf("Type when using literal: %T", usingLiteral.Bfield) // []uint8
log.Printf("Type when using param: %T", usingParam.Bfield) // int64
log.Printf("Type when using prepared: %T", usingLiteralPrepared.Bfield) // int64
}
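To observe the interpolateParams behaviour mentioned above, the same check can be repeated with the option added to the DSN; a small sketch reusing the mytable from the proof of concept:
db2, err := sql.Open("mysql", "root:testing@/mysql?interpolateParams=true")
if err != nil {
	log.Fatal(err)
}
defer db2.Close()
var b interface{}
// with client-side interpolation the driver sends plain text queries,
// so even the parameterized form goes through *textRows
if err := db2.QueryRow(`SELECT B FROM mytable WHERE A=?`, "a").Scan(&b); err != nil {
	log.Fatal(err)
}
log.Printf("Type with interpolateParams=true: %T", b) // []uint8, same as the literal query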
Your first SQL string is ambiguous in MySQL and can have two meanings, as explained on StackOverflow here:
When to use single quotes, double quotes, and back ticks in MySQL
Depending on the SQL mode, your SQL command can be interpreted either as
SELECT A, B, C, D FROM table WHERE A='a'
which is what I think you are expecting, or as
SELECT A, B, C, D FROM table WHERE A=`a`
To avoid this ambiguity, could you rerun your FIRST test with the double quotes replaced by single quotes?
If the same behavior persists, my answer is not the right explanation.
If BOTH SELECTs return the same value, your question is solved.
With the ` character you reference an identifier (such as a column name), not a string value!
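In Go code the suggested test is just a change of quoting inside the query string; for example, using the db and result variables from the question:
row := db.QueryRow(`SELECT A, B, C, D FROM table WHERE A='a'`) // 'a' is now unambiguously a string literal
if err := row.Scan(&result.Afield, &result.Bfield, &result.Cfield, &result.Dfield); err != nil {
	log.Fatal(err)
}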

"WHERE ... IN (...)" using pgx [duplicate]

I am trying to execute the following query against the PostgreSQL database in Go using pq driver:
SELECT COUNT(id)
FROM tags
WHERE id IN (1, 2, 3)
where 1, 2, 3 are passed as a slice tags := []string{"1", "2", "3"}.
I have tried many different things like:
s := "(" + strings.Join(tags, ",") + ")"
if err := Db.QueryRow(`
SELECT COUNT(id)
FROM tags
WHERE id IN $1`, s,
).Scan(&num); err != nil {
log.Println(err)
}
which results in pq: syntax error at or near "$1". I also tried
if err := Db.QueryRow(`
SELECT COUNT(id)
FROM tags
WHERE id IN ($1)`, strings.Join(stringTagIds, ","),
).Scan(&num); err != nil {
log.Println(err)
}
which also fails with pq: invalid input syntax for integer: "1,2,3"
I also tried passing a slice of integers/strings directly and got sql: converting Exec argument #0's type: unsupported type []string, a slice.
So how can I execute this query in Go?
Pre-building the SQL query (preventing SQL injection)
If you're generating an SQL string with a param placeholder for each of the values, it's easier to just generate the final SQL right away.
Note that since the values are strings, there is room for an SQL injection attack, so we first test that all the string values are indeed numbers, and only proceed if so:
tags := []string{"1", "2", "3"}
buf := bytes.NewBufferString("SELECT COUNT(id) FROM tags WHERE id IN(")
for i, v := range tags {
if i > 0 {
buf.WriteString(",")
}
if _, err := strconv.Atoi(v); err != nil {
panic("Not number!")
}
buf.WriteString(v)
}
buf.WriteString(")")
Executing it:
num := 0
if err := Db.QueryRow(buf.String()).Scan(&num); err != nil {
log.Println(err)
}
Using ANY
You can also use Postgresql's ANY, whose syntax is as follows:
expression operator ANY (array expression)
Using that, our query may look like this:
SELECT COUNT(id) FROM tags WHERE id = ANY('{1,2,3}'::int[])
In this case you can declare the text form of the array as a parameter:
SELECT COUNT(id) FROM tags WHERE id = ANY($1::int[])
Which can simply be built like this:
tags := []string{"1", "2", "3"}
param := "{" + strings.Join(tags, ",") + "}"
Note that no check is required in this case as the array expression will not allow SQL injection (but rather will result in a query execution error).
So the full code:
tags := []string{"1", "2", "3"}
q := "SELECT COUNT(id) FROM tags WHERE id = ANY($1::int[])"
param := "{" + strings.Join(tags, ",") + "}"
num := 0
if err := Db.QueryRow(q, param).Scan(&num); err != nil {
log.Println(err)
}
This is not really a Golang issue; you are comparing a string to an integer (id) in your SQL request. That means SQL receives:
SELECT COUNT(id)
FROM tags
WHERE id IN ("1, 2, 3")
instead of what you want to give it. You just need to convert your tags into integers and pass them to the query.
EDIT:
Since you are trying to pass multiple values to the query, you need to tell it so:
params := make([]string, 0, len(tags))
for i := range tags {
params = append(params, fmt.Sprintf("$%d", i+1))
}
query := fmt.Sprintf("SELECT COUNT(id) FROM tags WHERE id IN (%s)", strings.Join(params, ", "))
This makes the query end with "($1, $2, $3, ...)". Then convert your tags to int, collecting them in an []interface{} so they can be expanded into the query call below:
values := make([]interface{}, 0, len(tags))
for _, s := range tags {
val, err := strconv.Atoi(s)
if err != nil {
// Do whatever is required with the error
fmt.Println("Err : ", err)
} else {
values = append(values, val)
}
}
And finally, you can use it in the query:
Db.QueryRow(query, values...)
This should do it.
Extending @icza's solution, you can use pq.Array instead of building the parameter yourself.
So using his example, the code can look like this:
tags := []string{"1", "2", "3"}
q := "SELECT COUNT(id) FROM tags WHERE id = ANY($1::int[])"
num := 0
if err := Db.QueryRow(q, pq.Array(tags)).Scan(&num); err != nil {
log.Println(err)
}

golang sqlite can't define Query variable

It seems golang's sqlite package doesn't like my db.Query statement, though it's exactly like the one found in the example on github.
db, err := sql.Open("sqlite3", "./database.db")
if err != nil {
log.Fatal(err)
}
defer db.Close()
rows, err = db.Query("select id, name from job")
if err != nil {
log.Fatal(err)
}
defer rows.Close()
fmt.Println("Jobs:")
for rows.Next() {
var name string
var id int
fmt.Printf("%v %v\n", id, name)
}
This is the error I'm getting:
./test.go:7: undefined: rows
./test.go:7: cannot assign to rows
./test.go:11: undefined: rows
./test.go:14: undefined: rows
Edit: I've tried using grave accent and single quote strings for db.Query() as well, to no avail.
You cannot assign values to undeclared variables.
rows, err = db.Query("select id, name from job")
Should be :
rows, err := db.Query("select id, name from job")
Theoretically this should solve the problem, but I haven't tried.
You should also add:
rows.Scan(&id, &name)
before the Printf call, so that the row's values are actually assigned to the id and name variables; otherwise it will print an empty string and 0. A complete corrected loop is sketched below.
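Putting both fixes together, the corrected loop might look like this (a sketch; db is opened exactly as in the question):
rows, err := db.Query("select id, name from job")
if err != nil {
	log.Fatal(err)
}
defer rows.Close()
fmt.Println("Jobs:")
for rows.Next() {
	var id int
	var name string
	if err := rows.Scan(&id, &name); err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%v %v\n", id, name)
}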

How to execute an IN lookup in SQL using Golang?

What does Go want for the second param in this SQL query?
I am trying to use the IN lookup in postgres.
stmt, err := db.Prepare("SELECT * FROM awesome_table WHERE id= $1 AND other_field IN $2")
rows, err := stmt.Query(10, ???)
What I really want:
SELECT * FROM awesome_table WHERE id=10 AND other_field IN (this, that);
It looks like you may be using the pq driver. pq recently added Postgres-specific Array support via pq.Array (see pull request 466). You can get what you want via:
stmt, err := db.Prepare("SELECT * FROM awesome_table WHERE id= $1 AND other_field = ANY($2)")
rows, err := stmt.Query(10, pq.Array([]string{"this", "that"}))
I think this generates the SQL:
SELECT * FROM awesome_table WHERE id=10 AND other_field = ANY('{"this", "that"}');
Note this utilizes prepared statements, so the inputs should be sanitized.
Query just takes varargs to replace the params in your SQL,
so in your example you would just do
rows, err := stmt.Query(10)
Say this and that from your second example were dynamic; then you'd do
stmt, err := db.Prepare("SELECT * FROM awesome_table WHERE id=$1 AND other_field IN ($2, $3)")
rows, err := stmt.Query(10,"this","that")
If you have variable args for the "IN" part, you can do (play)
package main
import "fmt"
import "strings"
func main() {
stuff := []interface{}{"this", "that", "otherthing"}
sql := "select * from foo where id=? and name in (?" + strings.Repeat(",?", len(stuff)-1) + ")"
fmt.Println("SQL:", sql)
args := []interface{}{10}
args = append(args, stuff...)
fakeExec(args...)
// This also works, but I think it's harder for folks to read
//fakeExec(append([]interface{}{10},stuff...)...)
}
func fakeExec(args ...interface{}) {
fmt.Println("Got:", args)
}
In case anyone like me was trying to use an array with a query, here is an easy solution.
Get https://github.com/jmoiron/sqlx
ids := []int{1, 2, 3}
q,args,err := sqlx.In("SELECT id,username FROM users WHERE id IN(?);", ids) //creates the query string and arguments
//you should check for errors of course
q = sqlx.Rebind(sqlx.DOLLAR,q) //only if postgres
rows, err := db.Query(q,args...) //use normal POSTGRES/ANY SQL driver important to include the '...' after the Slice(array)
With PostgreSQL, at least, you have the option of passing the entire array as a string, using a single placeholder:
db.Query("select 1 = any($1::integer[])", "{1,2,3}")
That way, you can use a single query string, and all the string concatenation is confined to the parameter. And if the parameter is malformed, you don't get an SQL injection; you just get something like: ERROR: invalid input syntax for integer: "xyz"
https://groups.google.com/d/msg/golang-nuts/vHbg09g7s2I/RKU7XsO25SIJ
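Applied to the question's query it might look like this (a sketch; other_field is assumed here to be a text column):
rows, err := db.Query(`SELECT * FROM awesome_table WHERE id = $1 AND other_field = any($2::text[])`, 10, "{this,that}")
// the whole list travels as a single parameter, so no string concatenation in the SQL itself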
if you use sqlx, you can follow this way:
https://github.com/jmoiron/sqlx/issues/346
arr := []string{"this", "that"}
query, args, err := sqlx.In("SELECT * FROM awesome_table WHERE id=10 AND other_field IN (?)", arr)
query = db.Rebind(query) // sqlx.In returns queries with the `?` bindvar, rebind it here for matching the database in used (e.g. postgre, oracle etc, can skip it if you use mysql)
rows, err := db.Query(query, args...)
var awesome AwesomeStruct
var awesomes []*AwesomeStruct
ids := []int{1,2,3,4}
q, args, err := sqlx.In(`
SELECT * FROM awesome_table WHERE id=(?) AND other_field IN (?)`, 10, ids)
// use .Select for multiple return
err = db.Select(&awesomes, db.SQL.Rebind(q), args...)
// use .Get for single return
err = db.Get(&awesome, db.SQL.Rebind(q), args...)
I tried a different way. A simpler and easier way, maybe not too efficient.
stringedIDs := fmt.Sprintf("%v", ids)
stringedIDs = stringedIDs[1 : len(stringedIDs)-1]
stringedIDs = strings.ReplaceAll(stringedIDs, " ", ",")
query := "SELECT * FROM users WHERE id IN (" + stringedIDs + ")"
//Then follow your standard database/sql Query
rows, err := db.Query(query)
//error checking
if err != nil {
// Handle errors
} else {
// Process rows
}
Rather pedestrian, and only to be used if the values are server-generated. Here UserIDs is a slice of strings:
sqlc := `select count(*) from test.Logins where UserID
in ("` + strings.Join(UserIDs,`","`) + `")`
errc := db.QueryRow(sqlc).Scan(&Logins)
You can also use this direct conversion.
awesome_id_list := []int{3,5,8}
var str string
for _, value := range awesome_id_list {
str += strconv.Itoa(value) + ","
}
query := "SELECT * FROM awesome_table WHERE id IN (" + str[:len(str)-1] + ")"
WARNING
This method is vulnerable to SQL injection. Use it only if awesome_id_list is server-generated.

Error with scan of nil float value using Go and database/sql

I'm writing a program that needs to determine opening values for a table prior to doing some inserts and updates on that table. The table in question (PostgreSQL in this case) could have zero rows initially. When I select the opening values, if there are zero rows, the total of the balances is returned as nil. This causes the scan to fail with the message:
Error on scan of test01 opening Row Count. Error = sql: Scan error on column
index 1: converting string "<nil>" to a float64: strconv.ParseFloat:
parsing "<nil>": invalid syntax
While I can "solve" the problem by doing two selects, one to select the COUNT(*) and the other to SUM() the balances if the row-count exceeds zero, it does not seem an elegant solution, and may not always solve the problem, and the two values selected (number of rows and total of balances) are not at the same point in time.
Is there a way to solve this problem doing one select of the table?
A small test program illustrating the problem is below. When there are rows in the table being selected, the program works fine. However if there are zero rows, the above error results.
Example Test Program:
package main
import (
"database/sql"
"fmt"
_ "github.com/lib/pq"
)
var db *sql.DB
func main() {
var err error
db, err = sql.Open("postgres",
"user=test dbname=testdb password=test sslmode=disable")
if err != nil {
fmt.Sprintf("Failed to open Db Connection. Error = %s\n", err)
return
}
defer fCloseDb()
var row *sql.Row
var sSql string = "SELECT COUNT(*), SUM(dbalance) FROM test01"
if row = db.QueryRow(sSql); row == nil {
println("No row returned selecting opening count(*) from test01")
return
}
var iRowCount int64 = 0
var fBalTot float64 = 0.00
if err = row.Scan(&iRowCount, &fBalTot); err != nil {
fmt.Printf("Error on scan of test01 opening Row Count. Error = %s\n", err)
return
}
fmt.Printf("Row Count = %d, Balance total value = %.2f\n", iRowCount, fBalTot)
}
func fCloseDb() {
if db != nil {
db.Close()
println("Db Closed")
}
}
The structure of the table is as follows :
sSql = "CREATE TABLE IF NOT EXISTS test01 " +
"(ikey SERIAL Primary Key, " +
"sname varchar(22) not null, " +
"dbalance decimal(12,2) not null)"
Many thanks, that worked:
"SELECT COUNT(*), coalesce(SUM(dbalance), 0.00) FROM test01"
I believe COALESCE returns the first non-null value among its arguments.
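For reference, in the test program above the only change needed is the query string; with COALESCE the Scan into int64 and float64 succeeds even when test01 is empty:
var sSql string = "SELECT COUNT(*), coalesce(SUM(dbalance), 0.00) FROM test01"
// row.Scan(&iRowCount, &fBalTot) now returns 0 and 0.00 for an empty table
// instead of failing on a NULL sum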