Scan a PostgreSQL field (of ARRAY type) into a slice of Go structs - sql

Let's say I have:
type User struct {
	ID    int64  `json:"id"`
	Posts []Post `json:"posts"`
}

type Post struct {
	ID   int64  `json:"id"`
	Text string `json:"text"`
}
The SQL query:
WITH temp AS (
	SELECT u.id AS user_id, p.id AS post_id, p.text AS post_text
	FROM users u
	JOIN posts p ON u.id = p.user_id
)
SELECT user_id, ARRAY_AGG(ARRAY[post_id::text, post_text])
FROM temp
GROUP BY user_id
What I want is to scan rows from the query above into a slice of User objects:
import (
	"context"
	"fmt"

	"github.com/jackc/pgx/v4/pgxpool"
	"github.com/lib/pq"
)

var out []User
rows, _ := client.Query(context.Background(), query) // no error handling for brevity
for rows.Next() {
	var u User
	if err := rows.Scan(&u.ID, pq.Array(&u.Posts)); err != nil {
		return
	}
	out = append(out, u)
}
As expected, the code above fails with:
pq: cannot convert ARRAY[4][2] to StringArray
This makes sense, but is there a way to read the SQL output into my slice of users?

Scanning of multi-dimensional arrays of arbitrary types, like structs, is not supported by lib/pq. If you want to scan such an array you'll have to parse and decode it yourself in a custom sql.Scanner implementation.
For example:
type PostList []Post

func (ls *PostList) Scan(src any) error {
	var data []byte
	switch v := src.(type) {
	case string:
		data = []byte(v)
	case []byte:
		data = v
	default:
		return fmt.Errorf("PostList.Scan: unsupported src type %T", src)
	}

	// The data var holds the multi-dimensional array value,
	// something like: {{"1","foo"},{"2","bar"}, ...}
	// The example above is easy to parse but too simplistic;
	// the array is likely to be more complex and therefore
	// harder to parse, but not at all impossible if that's
	// what you want. A naive sketch follows below.
	_ = data
	return nil
}
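For instance, here's a deliberately naive sketch of such a parser for the simple form shown in the comment (my addition, not part of the original answer; it assumes two-element inner arrays whose values are double-quoted and contain no commas, braces, or escaped quotes, and it needs fmt, strconv, and strings imported):

func parsePostArray(data []byte) ([]Post, error) {
	// Strip the outer {{ ... }} and split on the },{ separators.
	s := strings.TrimSuffix(strings.TrimPrefix(string(data), "{{"), "}}")
	var posts []Post
	for _, pair := range strings.Split(s, "},{") {
		fields := strings.SplitN(pair, ",", 2)
		if len(fields) != 2 {
			return nil, fmt.Errorf("unexpected array element: %q", pair)
		}
		id, err := strconv.ParseInt(strings.Trim(fields[0], `"`), 10, 64)
		if err != nil {
			return nil, err
		}
		posts = append(posts, Post{ID: id, Text: strings.Trim(fields[1], `"`)})
	}
	return posts, nil
}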
If you want to learn more about the PostgreSQL array representation syntax, see:
Array Input and Output Syntax
An approach that does not require you to implement a parser for PostgreSQL arrays would be to build and pass JSON objects, instead of PostgreSQL arrays, to array_agg. The result of that would be a one-dimensional array with jsonb as the element type.
SELECT user_id, array_agg(jsonb_build_object('id', post_id, 'text', post_text))
FROM temp
GROUP BY user_id
Then the custom sql.Scanner for the slice type just needs to delegate to lib/pq's GenericArray, and a second, element-specific sql.Scanner delegates to encoding/json.
type PostList []Post

func (ls *PostList) Scan(src any) error {
	return pq.GenericArray{A: ls}.Scan(src)
}

func (p *Post) Scan(src any) error {
	var data []byte
	switch v := src.(type) {
	case string:
		data = []byte(v)
	case []byte:
		data = v
	default:
		return fmt.Errorf("Post.Scan: unsupported src type %T", src)
	}
	return json.Unmarshal(data, p)
}
type User struct {
	ID    int64    `json:"id"`
	Posts PostList `json:"posts"`
}
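Putting it together, a minimal usage sketch (assuming a database/sql *sql.DB opened with the lib/pq driver and the jsonb_build_object query above; loadUsers is an illustrative name, not from the answer):

func loadUsers(db *sql.DB, query string) ([]User, error) {
	rows, err := db.Query(query)
	if err != nil {
		return nil, err
	}
	defer rows.Close()
	var out []User
	for rows.Next() {
		var u User
		// &u.Posts is a *PostList, so the custom Scanner kicks in.
		if err := rows.Scan(&u.ID, &u.Posts); err != nil {
			return nil, err
		}
		out = append(out, u)
	}
	return out, rows.Err()
}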

Related

golang unmarshal using db tags

I have a SQL query which fetches a list of cities nested inside provinces nested inside countries:
SELECT
	C.*,
	P.provinces
FROM countries AS C
LEFT JOIN (
	SELECT
		P.country_id,
		json_agg(json_build_object(
			'id', P.id,
			'name', P.name,
			'slug', P.slug,
			'cities', Ci.cities
		)) AS provinces
	FROM provinces AS P
	LEFT JOIN (
		SELECT
			Ci.province_id,
			json_agg(json_build_object(
				'id', Ci.id,
				'name', Ci.name,
				'slug', Ci.slug
			)) AS cities
		FROM cities AS Ci
		GROUP BY Ci.province_id
	) AS Ci ON Ci.province_id = P.id
	GROUP BY P.country_id
) AS P ON P.country_id = C.id
I am fetching this data into a slice of countries:
type Country struct {
	Id         int64         `json:"id" db:"id"`
	ISOCode2   string        `json:"isoCode2" db:"iso_code_2"`
	ISOCode3   string        `json:"isoCode3" db:"iso_code_3"`
	ISONumCode string        `json:"isoNumCode" db:"iso_num_code"`
	Name       string        `json:"name" db:"name"`
	Slug       string        `json:"slug" db:"slug"`
	Provinces  SliceProvince `json:"provinces" db:"provinces"`
}

type SliceProvince []Province

func (provinces *SliceProvince) Scan(src any) (err error) {
	if src == nil {
		return
	}
	var source []byte
	switch src := src.(type) {
	case []byte:
		source = src
	case string:
		source = []byte(src)
	default:
		return fmt.Errorf("unsupported type in scan: %v", src)
	}
	err = json.Unmarshal(source, provinces)
	return
}

type Province struct {
	Id     int64     `json:"id" db:"id"`
	Name   string    `json:"name" db:"name"`
	Slug   string    `json:"slug" db:"slug"`
	Cities SliceCity `json:"cities" db:"cities"`
}

type SliceCity []City

func (cities *SliceCity) Scan(src any) (err error) {
	if src == nil {
		return
	}
	var source []byte
	switch src := src.(type) {
	case []byte:
		source = src
	case string:
		source = []byte(src)
	default:
		return fmt.Errorf("unsupported type in scan")
	}
	err = json.Unmarshal(source, cities)
	return
}

type City struct {
	Id   int64  `json:"id" db:"id"`
	Name string `json:"name" db:"name"`
	Slug string `json:"slug" db:"slug"`
}
Now, my main question: in the Scan methods for these models, I want to unmarshal using the db tags instead of the json tags. Is there a workaround for this?
I came up with the idea of unmarshalling into a map, changing the keys from the db-tag names to the json-tag names, then marshalling and unmarshalling again into the corresponding struct. But when models are nested, or slices are involved, that would increase the complexity. (A reflection-based sketch of the map idea follows below.)
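For what it's worth, that map-based idea can be sketched with reflection. This is a minimal, hypothetical helper of my own (flat structs only, no nesting or slices, dst must be a non-nil pointer to a struct with exported fields) that decodes a JSON object into a struct using its db tags:

func unmarshalByDBTag(data []byte, dst any) error {
	var m map[string]json.RawMessage
	if err := json.Unmarshal(data, &m); err != nil {
		return err
	}
	v := reflect.ValueOf(dst).Elem()
	t := v.Type()
	for i := 0; i < t.NumField(); i++ {
		tag := t.Field(i).Tag.Get("db")
		raw, ok := m[tag]
		if tag == "" || !ok {
			continue
		}
		// Delegate each field's value to encoding/json.
		if err := json.Unmarshal(raw, v.Field(i).Addr().Interface()); err != nil {
			return err
		}
	}
	return nil
}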

How to parse a Json with Go

I have this code:
type Foo struct {
	field string
	Value string
}

type Guyy struct {
	info Foo
}

func main() {
	GuyJson := `{"info":[{"field":"phone","value":"11111-11111"},{"field":"date","value":"05072001"},{"field":"nationality","value":"american"},{"field":"dni","value":"000012345"}]}`
	var Guy Guyy
	json.Unmarshal([]byte(GuyJson), &Guy)
	fmt.Printf("%+v", Guy)
}
When I run it I get
{info:{field: Value:}}
How can I get the field and value of nationality?
Change the struct field to a slice (info represents a list of structs).
Your struct fields must be exported (Marshal, Unmarshal, A Tour of Go):
Struct values encode as JSON objects. Each exported struct field becomes a member of the object, using the field name as the object key, unless the field is omitted...
type Guyy struct {
	Info []Foo // 👈 exported slice
}
PLAYGROUND
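Note that Foo's first field must be exported as well, or "field" will be silently dropped. No tags are needed here because encoding/json matches JSON keys to exported field names case-insensitively (corrected struct added for completeness):

type Foo struct {
	Field string
	Value string
}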
Alternatively, you can decode into a generic map and pull out the field you want:
var Guy Guyy
var f interface{}
json.Unmarshal([]byte(GuyJson), &f)
m := f.(map[string]interface{})
foomap := m["info"]
v := foomap.([]interface{})
for _, fi := range v {
	vi := fi.(map[string]interface{})
	var f Foo
	f.Field = vi["field"].(string)
	f.Value = vi["value"].(string)
	if f.Field == "nationality" {
		fmt.Println(f.Field, "was found to be", f.Value)
	}
	Guy.Info = append(Guy.Info, f)
}
fmt.Println(Guy)
Refer to this link for the complete code: https://play.golang.org/p/vFgpE0GNJ7K

How do I get a slice from a Postgres array in Golang?

Let's say I have a postgres query like:
SELECT
id,
ARRAY_AGG(code) AS code
FROM
codes
WHERE id = '9252781'
GROUP BY id;
My return looks like:
id | codes
-----------+-------------
9252781 | {H01,H02}
Both id and codes are varchar.
In Go, when I scan along the rows of the result, it just freezes. No error, nothing.
If you're using the github.com/lib/pq postgres driver you can use their pq.Array helper function to scan and store postgres arrays.
var id string
var arr []string
row := db.QueryRow(`SELECT '9252781', ARRAY['H01','H02']`)
if err := row.Scan(&id, pq.Array(&arr)); err != nil {
	log.Fatal(err)
}
log.Println(id, arr)
// 9252781 [H01 H02]
I didn't want to use an additional driver (github.com/lib/pq), and I did not find any pgx way to do it other than creating my own type.
So I did one:
type Tags []string

func (t *Tags) Scan(v interface{}) error {
	if v == nil {
		*t = Tags{}
		return nil
	}
	s, ok := v.(string)
	if !ok {
		return fmt.Errorf("Scan expected to receive a string from the database, but got [%+v]", v)
	}
	s = strings.TrimPrefix(s, "{")
	s = strings.TrimSuffix(s, "}")
	*t = strings.Split(s, ",")
	return nil
}

func (t *Tags) Value() (driver.Value, error) {
	s := fmt.Sprintf("{%v}", strings.Join([]string(*t), ","))
	return s, nil
}
then you can scan the row returned from the database like:
...
var tags Tags
err := rows.Scan(
	...
	&tags,
	...
)
and you can use the type directly in your Exec queries.
This code works well for simple values in arrays, but you'll need more work if the array elements themselves can contain commas or brackets.
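As an aside (my addition, not part of the original answers): pgx's native interface can scan a text[] straight into a []string, so the custom type is mainly useful when you're stuck on the database/sql interface. A minimal sketch, assuming a pgx v4 connection named conn:

var codes []string
err := conn.QueryRow(context.Background(),
	"SELECT ARRAY['H01','H02']").Scan(&codes)
if err != nil {
	log.Fatal(err)
}
log.Println(codes) // [H01 H02]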

Golang - Supersede Embedded Structure for URL Encoding Parameters

I have multiple structures with the same format as MapParameters that are passed to the encodeParams function. Unfortunately, using that function against these structures produces unwanted encoding, including the embedded structure name. Is there any way I can fix this using reflect, without a huge switch of type assertions?
// Desired encoding
&required_param=1

// Current encoding
%5BRequired%5D&required_param=1

// Desired
type MapParameters struct {
	Required struct{ ... }
	Optional struct{ ... }
}

// Current
type MapParameters struct {
	MapRequired
	MapOptional
}

type MapRequired struct{ ... }
type MapOptional struct{ ... }

func encodeParams(s string, opt interface{}) (string, error) {
	v := reflect.ValueOf(opt)
	if v.Kind() == reflect.Ptr && v.IsNil() {
		return s, nil
	}
	u, err := url.Parse(s)
	if err != nil {
		return s, err
	}
	// from github.com/google/go-querystring/query
	qs, err := query.Values(opt)
	if err != nil {
		return s, err
	}
	u.RawQuery = u.RawQuery + qs.Encode()
	return u.String(), nil
}
Anonymous does not mean embedded; they are two completely different things. Embedding means the fields of the nested struct are promoted to the outer struct. Anonymous just means the struct type has no name. You are expecting anonymous structs to be treated as embedded, which is not a good idea.
Anyhow, if you want anonymous structs encoded as if they were embedded, change the code in the url-encoding library https://github.com/google/go-querystring/blob/master/query/encode.go:
if /*sf.Anonymous &&*/ sv.Kind() == reflect.Struct {
	// save embedded struct for later processing
	embedded = append(embedded, sv)
	continue
}
Please note that sf.Anonymous does not really mean anonymous; it means embedded, as the comment says:
type StructField struct {
	...
	Index     []int // index sequence for Type.FieldByIndex
	Anonymous bool  // is an embedded field
}
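A tiny self-contained illustration of that distinction (my addition):

package main

import (
	"fmt"
	"reflect"
)

type Inner struct{ X int }

type Outer struct {
	Inner       // embedded field: StructField.Anonymous == true
	Named Inner // regular named field: Anonymous == false
}

func main() {
	t := reflect.TypeOf(Outer{})
	fmt.Println(t.Field(0).Anonymous) // true
	fmt.Println(t.Field(1).Anonymous) // false
}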

How do I convert a database row into a struct

Let's say I have a struct:
type User struct {
	Name  string
	Id    int
	Score int
}
And a database table with the same schema. What's the easiest way to parse a database row into a struct? I've added an answer below but I'm not sure it's the best one.
Go package tests often provide clues as to ways of doing things. For example, from database/sql/sql_test.go,
func TestQuery(t *testing.T) {
	/* . . . */
	rows, err := db.Query("SELECT|people|age,name|")
	if err != nil {
		t.Fatalf("Query: %v", err)
	}
	type row struct {
		age  int
		name string
	}
	got := []row{}
	for rows.Next() {
		var r row
		err = rows.Scan(&r.age, &r.name)
		if err != nil {
			t.Fatalf("Scan: %v", err)
		}
		got = append(got, r)
	}
	/* . . . */
}

func TestQueryRow(t *testing.T) {
	/* . . . */
	var name string
	var age int
	var birthday time.Time
	err := db.QueryRow("SELECT|people|age,name|age=?", 3).Scan(&age)
	/* . . . */
}
Which, for your question, querying a row into a structure, would translate to something like:
var row struct {
	age  int
	name string
}
err = db.QueryRow("SELECT|people|age,name|age=?", 3).Scan(&row.age, &row.name)
I know that looks similar to your solution, but it's important to show how to find a solution.
I recommend github.com/jmoiron/sqlx.
From the README:
sqlx is a library which provides a set of extensions on go's standard
database/sql library. The sqlx versions of sql.DB, sql.TX,
sql.Stmt, et al. all leave the underlying interfaces untouched, so
that their interfaces are a superset on the standard ones. This makes
it relatively painless to integrate existing codebases using
database/sql with sqlx.
Major additional concepts are:
Marshal rows into structs (with embedded struct support), maps, and slices
Named parameter support including prepared statements
Get and Select to go quickly from query to struct/slice
The README also includes a code snippet demonstrating scanning a row into a struct:
type Place struct {
	Country       string
	City          sql.NullString
	TelephoneCode int `db:"telcode"`
}

// Loop through rows using only one struct
place := Place{}
rows, err := db.Queryx("SELECT * FROM place")
for rows.Next() {
	err := rows.StructScan(&place)
	if err != nil {
		log.Fatalln(err)
	}
	fmt.Printf("%#v\n", place)
}
Note that we didn't have to manually map each column to a field of the struct. sqlx has some default mappings for struct fields to database columns, as well as being able to specify database columns using tags (note the TelephoneCode field of the Place struct above). You can read more about that in the documentation.
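If you'd rather skip the loop entirely, sqlx also provides Get and Select (a short sketch based on the documented API; Place is the struct above):

// Fetch a single row into a struct.
var p Place
if err := db.Get(&p, "SELECT * FROM place WHERE telcode = $1", 50); err != nil {
	log.Fatalln(err)
}

// Fetch all rows into a slice of structs.
var places []Place
if err := db.Select(&places, "SELECT * FROM place"); err != nil {
	log.Fatalln(err)
}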
Here's one way to do it - just assign all of the struct values manually in the Scan function.
func getUser(name string) (*User, error) {
	var u User
	// this calls sql.Open, etc.
	db := getConnection()
	// note the $1 placeholder syntax below only works for postgres
	err := db.QueryRow("SELECT * FROM users WHERE name = $1", name).Scan(&u.Id, &u.Name, &u.Score)
	if err != nil {
		return &User{}, err
	}
	return &u, nil
}
rows, err := connection.Query("SELECT `id`, `username`, `email` FROM `users`")
if err != nil {
	panic(err.Error())
}
for rows.Next() {
	var user User
	if err := rows.Scan(&user.Id, &user.Username, &user.Email); err != nil {
		log.Println(err.Error())
	}
	users = append(users, user)
}
Full example
Here is a library just for that: scany.
You can use it like this:
type User struct {
	Name  string
	Id    int
	Score int
}

// db is your *sql.DB instance
// ctx is your current context.Context instance

// Use sqlscan.Select to query multiple records.
var users []*User
sqlscan.Select(ctx, db, &users, `SELECT name, id, score FROM users`)

// Use sqlscan.Get to query exactly one record.
var user User
sqlscan.Get(ctx, db, &user, `SELECT name, id, score FROM users WHERE id=123`)
It's well documented and easy to work with.
Disclaimer: I am the author of this library.
There's a package just for that: sqlstruct.
Unfortunately, last time I checked it did not support embedded structs (which are trivial to implement yourself; I had a working prototype in a few hours).
I've just committed the changes I made to sqlstruct.
Use go-models-mysql (a sqlbuilder):
val, err = m.ScanRowType(row, (*UserTb)(nil))
or the full code:
import (
	"database/sql"
	"fmt"

	lib "github.com/eehsiao/go-models-lib"
	mysql "github.com/eehsiao/go-models-mysql"
)

// MyUserDao : extends mysql.Dao
type MyUserDao struct {
	*mysql.Dao
}

// UserTb : sql table struct to store into mysql
type UserTb struct {
	Name  sql.NullString `TbField:"Name"`
	Id    int            `TbField:"Id"`
	Score int            `TbField:"Score"`
}

// GetFirstUser : a data-logic function; you can write more logic in here.
// Sample function that gets the first user.
func (m *MyUserDao) GetFirstUser() (user *User, err error) {
	m.Select("Name", "Id", "Score").From("user").Limit(1)
	fmt.Println("GetFirstUser", m.BuildSelectSQL().BuildedSQL())
	var (
		val interface{}
		row *sql.Row
	)
	if row, err = m.GetRow(); err == nil {
		if val, err = m.ScanRowType(row, (*UserTb)(nil)); err == nil {
			u, _ := val.(*UserTb)
			user = &User{
				Name:  lib.Iif(u.Name.Valid, u.Name.String, "").(string),
				Id:    u.Id,
				Score: u.Score,
			}
		}
	}
	row, val = nil, nil
	return
}