Hi everyone, I'm having some trouble with PostgreSQL in Go.
I have an SQL file (database.sql), and before starting my server I want to execute the commands in it. They look like this:
CREATE TABLE forums (
    id      BIGSERIAL PRIMARY KEY,
    slug    TEXT NOT NULL UNIQUE,
    title   TEXT,
    author  TEXT REFERENCES users(login),
    threads BIGINT DEFAULT 0,
    posts   BIGINT DEFAULT 0
);
I know that I should use db.Exec(request), but I have many statements ("CREATE TABLE user" and others...).
I have no idea how to do that.
Any help is appreciated, thanks a lot!
You can read the file's contents into a string and pass that to Exec.
query, err := ioutil.ReadFile("path/to/database.sql")
if err != nil {
    panic(err)
}
if _, err := db.Exec(string(query)); err != nil {
    panic(err)
}
If your database.sql is formatted in a way that breaks db.Exec, or it contains queries that db.Exec cannot handle, you could use os/exec together with psql (provided it's installed on the machine the code runs on).
cmd := exec.Command("psql", "-d", "database_name", "-f", "path/to/database.sql")
stderr, err := cmd.StderrPipe()
if err != nil {
panic(err)
}
if err := cmd.Start(); err != nil {
panic(err)
}
errout, _ := ioutil.ReadAll(stderr)
if err := cmd.Wait(); err != nil {
fmt.Println(errout)
panic(err)
}
I want to write a SQL data backup function in Go. I have written sample code that backs up the whole database, and a command for a single table too. But I don't know how to dump a single row of data into a given path.
For the database:
package main

import (
    "io/ioutil"
    "log"
    "os/exec"
)

func main() {
    cmd := exec.Command("mysqldump", "-P3306", "-hhost", "-uuser", "-ppassword", "database_name")
    stdout, err := cmd.StdoutPipe()
    if err != nil {
        log.Fatal(err)
    }
    if err := cmd.Start(); err != nil {
        log.Fatal(err)
    }
    bytes, err := ioutil.ReadAll(stdout)
    if err != nil {
        log.Fatal(err)
    }
    err = ioutil.WriteFile("./out.sql", bytes, 0644)
    if err != nil {
        panic(err)
    }
}
For a single table we can change the command like below:
cmd := exec.Command("mysqldump", "-P3306", "-hhost", "-uuser", "-ppassword", "database_name", "table_name")
To dump a single row of a table, what do I have to write? Any suggestions with short code would be appreciated.
For example: dump the row where id equals 1.
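One possible approach, sketched below on the assumption that your mysqldump build supports the --where option (it restricts the dump to rows matching a condition):

// Dump only the row(s) of table_name where id = 1; the stdout-pipe and
// file-writing code stays the same as in the database example above.
cmd := exec.Command("mysqldump", "-P3306", "-hhost", "-uuser", "-ppassword",
    "--where=id=1", "database_name", "table_name")

Since exec.Command passes arguments directly without a shell, the condition does not need extra quoting.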
In my HTML page, the user clicks a Download button after entering a start and end date to download data in CSV format.
I have used jQuery to get the user inputs from the page and pass the values to main.go.
Now in main.go, I am trying to run a SELECT query that reads all the data in the database between the two dates coming from jQuery.
Something like below:
func adminPage(w http.ResponseWriter, r *http.Request) {
    dsn := "server=eccdb1677.md3q.ford.com;user id=prx_mdesk_appl;password=******"
    db, err := sql.Open("mssql", dsn)
    if err != nil {
        log.Fatal(err)
    }
    err = db.Ping()
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()

    r.ParseForm()
    StartDate := r.FormValue("startdate")
    EndDate := r.FormValue("enddate")

    rows, _ := db.Query("SELECT * FROM mdesk.dbo.tbl_fdbk WHERE CreateDate BETWEEN StartDate and EndDate")
    for rows.Next() {
        var (
            CDSID      string
            CreateDate int
            Rating     int
            Comments   string
        )
        err = rows.Scan(&CDSID, &CreateDate, &Rating, &Comments)
        if err != nil {
            panic(err.Error()) // proper error handling instead of panic in your app
        }

        file, err := os.Create("reports.csv")
        if err != nil {
            log.Fatal(err)
        }
        defer file.Close()

        w := csv.NewWriter(file)
        err = w.WriteAll(rows)
        if err != nil {
            log.Fatal("Error writing record to csv:", err)
        }
    }
}
This code has errors on many levels, but right now I'm focusing on how to write the SELECT query that reads the data between the start date and the end date.
Any help would be appreciated.
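For the date filter itself, here is a minimal sketch using a parameterized query; it assumes the go-mssqldb driver's @p1/@p2 placeholders, that the listed columns exist, and that CreateDate is comparable with the date strings coming from the form:

rows, err := db.Query(
    `SELECT CDSID, CreateDate, Rating, Comments
       FROM mdesk.dbo.tbl_fdbk
      WHERE CreateDate BETWEEN @p1 AND @p2`,
    StartDate, EndDate)
if err != nil {
    log.Fatal(err)
}
defer rows.Close()

Passing the dates as query parameters (instead of splicing them into the SQL string) also avoids SQL injection from the form values.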
I'm trying to put a (key, value) pair into Aerospike, namely ("mykey", 3).
Is this the correct technique? Because I cannot fetch it afterwards...
key, err := as.NewKey("namespace", "set", "mykey")
if err != nil {
    log.Fatal(err)
}
exists, err := client.Exists(Policy, key)
if exists {
    // Read a record
    record, err := client.Get(Policy, key)
    if err != nil {
        log.Fatal(err)
    }
    var newval = 3
}
bin1 := as.NewBin("bin1", newval)
// Write a record
err = client.PutBins(WritePolicy, key, bin1)
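For comparison, here is a minimal sketch of a write followed by a read, assuming github.com/aerospike/aerospike-client-go is imported as "as" and that client, Policy, and WritePolicy are set up as in the snippet above:

key, err := as.NewKey("namespace", "set", "mykey")
if err != nil {
    log.Fatal(err)
}
// Write the record first...
if err := client.PutBins(WritePolicy, key, as.NewBin("bin1", 3)); err != nil {
    log.Fatal(err)
}
// ...then read it back.
record, err := client.Get(Policy, key)
if err != nil {
    log.Fatal(err)
}
log.Println(record.Bins["bin1"]) // expected to print 3

Note that in the snippet above the record is only read when it already exists, and newval is declared inside the if block, so nothing is written on the first run.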
Getting strange behaviour with a struct with embedded json.
package main

import (
    "database/sql"
    "encoding/json"
    "fmt"

    _ "github.com/lib/pq"
)

type Article struct {
    Id  int
    Doc *json.RawMessage
}

func main() {
    db, err := sql.Open("postgres", "postgres://localhost/json_test?sslmode=disable")
    if err != nil {
        panic(err)
    }
    _, err = db.Query(`create table if not exists articles (id serial primary key, doc json)`)
    if err != nil {
        panic(err)
    }
    _, err = db.Query(`truncate articles`)
    if err != nil {
        panic(err)
    }
    docs := []string{
        `{"type":"event1"}`,
        `{"type":"event2"}`,
    }
    for _, doc := range docs {
        _, err = db.Query(`insert into articles ("doc") values ($1)`, doc)
        if err != nil {
            panic(err)
        }
    }
    rows, err := db.Query(`select id, doc from articles`)
    if err != nil {
        panic(err)
    }
    articles := make([]Article, 0)
    for rows.Next() {
        var a Article
        err := rows.Scan(
            &a.Id,
            &a.Doc,
        )
        if err != nil {
            panic(err)
        }
        articles = append(articles, a)
        fmt.Println("scan", string(*a.Doc), len(*a.Doc))
    }
    fmt.Println()
    for _, a := range articles {
        fmt.Println("loop", string(*a.Doc), len(*a.Doc))
    }
}
Output:
scan {"type":"event1"} 17
scan {"type":"event2"} 17
loop {"type":"event2"} 17
loop {"type":"event2"} 17
So the articles end up pointing to the same json.
Am I doing something wrong?
UPDATE
Edited to a runnable example. I'm using Postgres and lib/pq.
I ran into this same issue, and after looking at it for a long time I read the doc on Scan, which says:
If an argument has type *[]byte, Scan saves in that argument a copy of the corresponding data. The copy is owned by the caller and can be modified and held indefinitely. The copy can be avoided by using an argument of type *RawBytes instead; see the documentation for RawBytes for restrictions on its use.
What I think is happening is that if you use *json.RawMessage, Scan does not treat it as a *[]byte and does not copy into it. So you get an internal slice that the next call to Scan overwrites.
Change your Scan to cast the *json.RawMessage to a *[]byte so Scan will copy the values to it.
err := rows.Scan(
    &a.Id,
    (*[]byte)(a.Doc),
)
In case that helps anyone:
I used masebase's answer to INSERT a json.RawMessage property of my struct into a PostgreSQL column of type jsonb.
All you need to do is cast: ([]byte)(a.Doc) in the insert binding call (without the * in my case).
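For example, a minimal sketch of that insert, assuming Doc is a json.RawMessage value (not a pointer, as in the comment above) and a jsonb "doc" column:

// json.RawMessage is a []byte under the hood, so the conversion lets the
// driver bind it as a byte slice.
_, err := db.Exec(`insert into articles ("doc") values ($1)`, []byte(a.Doc))
if err != nil {
    panic(err)
}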
I'd like to be able to get EXPLAIN ANALYZE output for my queries from inside my app, which is in Go and uses the github.com/lib/pq driver. Unfortunately, neither the lib/pq docs nor the database/sql docs seem to say anything about this, and nothing in the database/sql interfaces suggests it is possible.
Has anyone found a way to get this output?
A typical EXPLAIN ANALYZE returns several rows of text, so you can read it with a plain sql.Query. Here is an example:
package main

import (
    "database/sql"
    "fmt"
    _ "github.com/lib/pq"
    "log"
)

func main() {
    db, err := sql.Open("postgres", "user=test dbname=test sslmode=disable")
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()

    rows, err := db.Query("EXPLAIN ANALYZE SELECT * FROM accounts ORDER BY slug")
    if err != nil {
        log.Fatal(err)
    }
    for rows.Next() {
        var s string
        if err := rows.Scan(&s); err != nil {
            log.Fatal(err)
        }
        fmt.Println(s)
    }
    if err := rows.Err(); err != nil {
        log.Fatal(err)
    }
}