In my Go application I use the crontab package to run a Tracker function every minute. As you can see from the code, it calls a PostgreSQL stored procedure. To interact with the PostgreSQL database I use the gorm package. The application worked for several days without any problem, but now I see this error in the logs: pq: sorry, too many clients already. I know the same question has been asked on Stack Overflow before; for example, in this post people advise using the Exec or Scan methods. As you can see, I already use Exec, but I still get the error. As far as I understand, each database request opens a separate connection and never closes it. I can't figure out what I'm doing wrong.
main.go:
package main
import (
"questionnaire/controllers"
"questionnaire/database"
"questionnaire/utils"

"github.com/mileusna/crontab"
)
func main() {
database.ConnectPostgreSQL()
defer database.DisconnectPostgreSQL()
err := crontab.New().AddJob("* * * * *", controllers.Tracker)
if err != nil {
utils.Logger().Fatal(err)
}
select {} // block forever so the scheduled job keeps running; otherwise main returns immediately
}
tracker.go:
package controllers
import (
"questionnaire/database"
"questionnaire/utils"
"time"
)
var Tracker = func() {
err := database.DBGORM.Exec("CALL tracker($1)", time.Now().Format("2006-01-02 15:04:05")).Error
if err != nil {
utils.Logger().Println(err) // ERROR: pq: sorry, too many clients already
return
}
}
PostgreSQL.go:
package database
import (
"fmt"
"github.com/jinzhu/gorm"
_ "github.com/jinzhu/gorm/dialects/postgres"
"github.com/joho/godotenv"
"questionnaire/utils"
)
var DBGORM *gorm.DB
func ConnectPostgreSQL() {
err := godotenv.Load(".env")
if err != nil {
utils.Logger().Println(err)
panic(err)
}
databaseUser := utils.CheckEnvironmentVariable("PostgreSQL_USER")
databasePassword := utils.CheckEnvironmentVariable("PostgreSQL_PASSWORD")
databaseHost := utils.CheckEnvironmentVariable("PostgreSQL_HOST")
databaseName := utils.CheckEnvironmentVariable("PostgreSQL_DATABASE_NAME")
databaseURL := fmt.Sprintf("host=%s user=%s dbname=%s password=%s sslmode=disable", databaseHost, databaseUser, databaseName, databasePassword)
DBGORM, err = gorm.Open("postgres", databaseURL)
if err != nil {
utils.Logger().Println(err)
panic(err)
}
err = DBGORM.DB().Ping()
if err != nil {
utils.Logger().Println(err)
panic(err)
}
DBGORM.LogMode(true)
}
func DisconnectPostgreSQL() error {
return DBGORM.Close()
}
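A common mitigation for "pq: sorry, too many clients already" is to cap the connection pool that database/sql maintains underneath gorm, so the application can never exceed the server's max_connections. A minimal sketch, assuming gorm v1 (jinzhu/gorm), whose DB() method exposes the underlying *sql.DB; the limits are illustrative, and "time" must be added to the imports:

// Inside ConnectPostgreSQL, right after gorm.Open succeeds:
DBGORM.DB().SetMaxOpenConns(20)           // hard cap on simultaneous server connections
DBGORM.DB().SetMaxIdleConns(10)           // idle connections kept open for reuse
DBGORM.DB().SetConnMaxLifetime(time.Hour) // recycle long-lived connections

With an unbounded pool, a burst of slow queries can hold one server connection each until PostgreSQL's limit (100 by default) is exhausted.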
Related
I have a test suite that pollutes my database with seed data read from a YAML file.
I'm wondering whether there is a way to clean the database (delete all records used by the test suite) after running my tests.
// Open db and return a pointer plus a closer func
func prepareMySQLDB(t *testing.T) (db *sql.DB, closer func() error) {
db, err := sql.Open("mysql", "user:pass#/database")
if err != nil {
t.Fatalf("open mysql connection: %s", err)
}
return db, db.Close
}
// Pollute my database
func polluteDb(db *sql.DB, t *testing.T) {
seed, err := os.Open("seed.yml")
if err != nil {
t.Fatalf("failed to open seed file: %s", err)
}
defer seed.Close()
p := polluter.New(polluter.MySQLEngine(db))
if err := p.Pollute(seed); err != nil {
t.Fatalf("failed to pollute: %s", err)
}
}
func TestAllUsers(t *testing.T) {
t.Parallel()
db, closeDb := prepareMySQLDB(t)
defer closeDb()
polluteDb(db, t)
users, err := AllUsersD(db)
if err != nil {
t.Fatal("AllUsers() failed")
}
got := users[0].Email
if got != "myemail#gmail.com" {
t.Errorf("AllUsers().Email = %s; want myemail#gmail.com", got)
}
got1 := len(users)
if got1 != 1 {
t.Errorf("len(AllUsers()) = %d; want 1", got1)
}
}
// Test I'm interested in
func TestAddUser(t *testing.T) {
t.Parallel()
db, closeDb := prepareMySQLDB(t)
defer closeDb()
polluteDb(db, t)
_, err := AddUser(...)
if err != nil {
t.Fatal("AddUser() failed")
}
//how can I clean my database after this?
}
Should I retrieve the last ID inserted in TestAddUser() and delete that row manually, or is there another way to save my database state and restore it afterwards?
As I said, I'm new to Go, so any other comments on my code are strongly appreciated.
The best way is usually to run each test inside a transaction and then ROLLBACK it, so the changes are never committed in the first place.
The github.com/DATA-DOG/go-txdb package can help a lot with that.
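For illustration, here is a minimal sketch of the bare-transaction approach without txdb. AddUserTx is hypothetical: your AddUser rewritten to run its statements on the *sql.Tx instead of the *sql.DB:

func TestAddUserRollback(t *testing.T) {
    db, closeDb := prepareMySQLDB(t)
    defer closeDb()
    tx, err := db.Begin()
    if err != nil {
        t.Fatal(err)
    }
    defer tx.Rollback() // discard everything the test wrote; nothing is committed
    if err := AddUserTx(tx, "bla@gmail.com", "pass"); err != nil {
        t.Fatal("AddUser() failed")
    }
}

This works because *sql.Tx exposes the same Exec/Query methods as *sql.DB, so test helpers can target either.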
Final code:
import (
"database/sql"
"os"
"testing"
txdb "github.com/DATA-DOG/go-txdb"
"github.com/romanyx/polluter"
)
//mostly sql tests
func init() {
txdb.Register("txdb", "mysql", "root:root@/betell_rest")
}
func TestAddUser(t *testing.T) {
db, err := sql.Open("txdb", "root:root#/betell_rest")
if err != nil {
t.Fatal(err)
}
defer db.Close()
users, _ := AllUsers(db)
userscount := len(users)
err = AddUser(db, "bla@gmail.com", "pass")
if err != nil {
t.Fatal("AddUser() failed")
}
users, _ = AllUsers(db)
if (userscount + 1) != len(users) {
t.Fatal("AddUser() failed to write in database")
}
}
Note: you can also pass this db into your polluter, so your real database is not affected at all.
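Combining that note with the polluteDb helper from the question (a sketch; db here is the txdb-backed handle, so the seed rows live only inside the test's transaction and vanish on rollback):

db, err := sql.Open("txdb", "root:root@/betell_rest")
if err != nil {
    t.Fatal(err)
}
defer db.Close()
polluteDb(db, t) // seeds go into the test's transaction only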
I want to write a backup function for SQL data in Go. I have written sample code that backs up a whole database, and a command for a single table, but I don't know how to dump a single row into a given path.
For database:
package main
import (
"io/ioutil"
"log"
"os/exec"
)
func main() {
cmd := exec.Command("mysqldump", "-P3306", "-hhost", "-uuser", "-ppassword", "database_name")
stdout, err := cmd.StdoutPipe()
if err != nil {
log.Fatal(err)
}
if err := cmd.Start(); err != nil {
log.Fatal(err)
}
bytes, err := ioutil.ReadAll(stdout)
if err != nil {
log.Fatal(err)
}
err = ioutil.WriteFile("./out.sql", bytes, 0644)
if err != nil {
panic(err)
}
}
For a table we can change the command like below:
cmd := exec.Command("mysqldump", "-P3306", "-hhost", "-uuser", "-ppassword", "database_name", "table_name")
To dump a single row of a table, what should I write? Any suggestions with short code would help.
For example: dump the row where id equals 1.
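One way to do this (a sketch relying on mysqldump's --where option, which restricts the dump to rows matching a SQL condition) is to keep the program above unchanged and only adjust the command:

// Dump only the rows of table_name where id = 1.
cmd := exec.Command("mysqldump", "-P3306", "-hhost", "-uuser", "-ppassword",
    "--where=id=1", "database_name", "table_name")

The rest stays the same: read the command's stdout and write it to the target path.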
I am going through a list of files and unmarshalling the XML data in them into an array of structs, rArray. I intend to process about 18000 files. When I get to about 1300 files, the program panics, saying that too many files are open. If I limit the number of files processed to a safe 1000, the program does not crash.
As seen below, I am using ioutil.ReadFile to read the file data.
for _, f := range files {
func() {
data, err := ioutil.ReadFile("./" + recordDir + "/" + f.Name())
if err != nil {
fmt.Println("error reading %v", err)
return
} else {
if (strings.Contains(filepath.Ext(f.Name()), "xml")) {
//unmarshal data and put into struct array
err = xml.Unmarshal([]byte(data), &rArray[a])
if err != nil {
fmt.Println("error decoding %v: %v",f.Name(), err)
return
}
}
}
}()
}
I am not sure if Go is using too many file descriptors or not closing the files fast enough.
After reading https://groups.google.com/forum/#!topic/golang-nuts/7yXXjgcOikM and viewing the ioutil source at http://golang.org/src/pkg/io/ioutil/ioutil.go, I see that ioutil.ReadFile uses defer to close the file. defer runs when the enclosing function returns, and here ReadFile itself is the enclosing function, so the file should already be closed by the time ReadFile returns. Am I correct in this understanding?
I also tried wrapping the ioutil.ReadFile part of my code in a function, but it makes no difference.
My ulimit is set to unlimited.
UPDATE:
I believe the "too many open files" error is actually occurring in my Unzip function.
func Unzip(src, dest string) error {
r, err := zip.OpenReader(src)
if err != nil {
return err
}
for _, f := range r.File {
rc, err := f.Open()
if err != nil {
panic(err)
}
path := filepath.Join(dest, f.Name)
if f.FileInfo().IsDir() {
os.MkdirAll(path, f.Mode())
} else {
f, err := os.OpenFile(
path, os.O_WRONLY|os.O_CREATE|os.O_TRUNC, f.Mode())
if err != nil {
panic(err)
}
_, err = io.Copy(f, rc)
if err != nil {
panic(err)
}
f.Close()
}
rc.Close()
}
r.Close()
return nil
}
I initially got the Unzip function from https://gist.github.com/hnaohiro/4572580, but upon further inspection, the use of defer in the gist author's function seemed wrong: the files would only be closed after Unzip() returned, which is too late, because then 18000 file descriptors would be open.
I replaced the deferred closes with explicit Close() calls as shown above, but I am still getting the same "too many open files" error. Is there a problem with my modified Unzip function?
UPDATE #2
Oops, I was running this on Heroku and was pushing my changes to the wrong app this entire time. Lesson learned: verify the target app in the heroku toolbelt.
The Unzip code from https://gist.github.com/hnaohiro/4572580 does not work, as it does not close the files until all of them have been processed.
My unzip code with explicit closes above works, and so does the defer version in peterSO's answer.
I would modify the Unzip function from https://gist.github.com/hnaohiro/4572580 to the following:
package main
import (
"archive/zip"
"io"
"log"
"os"
"path/filepath"
)
func unzipFile(f *zip.File, dest string) error {
rc, err := f.Open()
if err != nil {
return err
}
defer rc.Close()
path := filepath.Join(dest, f.Name)
if f.FileInfo().IsDir() {
err := os.MkdirAll(path, f.Mode())
if err != nil {
return err
}
} else {
f, err := os.OpenFile(
path, os.O_WRONLY|os.O_CREATE|os.O_TRUNC, f.Mode())
if err != nil {
return err
}
defer f.Close()
_, err = io.Copy(f, rc)
if err != nil {
return err
}
}
return nil
}
func Unzip(src, dest string) error {
r, err := zip.OpenReader(src)
if err != nil {
return err
}
defer r.Close()
for _, f := range r.File {
err := unzipFile(f, dest)
if err != nil {
return err
}
}
return nil
}
func main() {
err := Unzip("./sample.zip", "./out")
if err != nil {
log.Fatal(err)
}
}
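The key point of this version is that the per-file work lives in its own function: the deferred rc.Close() and f.Close() run as soon as unzipFile returns for each entry, so only a couple of descriptors are open at any moment instead of one per archive entry.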
I'm getting strange behaviour with a struct that has an embedded *json.RawMessage field.
package main
import (
"database/sql"
"encoding/json"
"fmt"
_ "github.com/lib/pq"
)
type Article struct {
Id int
Doc *json.RawMessage
}
func main() {
db, err := sql.Open("postgres", "postgres://localhost/json_test?sslmode=disable")
if err != nil {
panic(err)
}
_, err = db.Exec(`create table if not exists articles (id serial primary key, doc json)`)
if err != nil {
panic(err)
}
_, err = db.Exec(`truncate articles`)
if err != nil {
panic(err)
}
docs := []string{
`{"type":"event1"}`,
`{"type":"event2"}`,
}
for _, doc := range docs {
_, err = db.Exec(`insert into articles ("doc") values ($1)`, doc)
if err != nil {
panic(err)
}
}
rows, err := db.Query(`select id, doc from articles`)
if err != nil {
panic(err)
}
articles := make([]Article, 0)
for rows.Next() {
var a Article
err := rows.Scan(
&a.Id,
&a.Doc,
)
if err != nil {
panic(err)
}
articles = append(articles, a)
fmt.Println("scan", string(*a.Doc), len(*a.Doc))
}
fmt.Println()
for _, a := range articles {
fmt.Println("loop", string(*a.Doc), len(*a.Doc))
}
}
Output:
scan {"type":"event1"} 17
scan {"type":"event2"} 17
loop {"type":"event2"} 17
loop {"type":"event2"} 17
So the articles end up pointing to the same json.
Am I doing something wrong?
UPDATE
Edited to a runnable example. I'm using Postgres and lib/pq.
I ran into this same issue, and after looking at it for a long time I read the doc on Scan, which says:
If an argument has type *[]byte, Scan saves in that argument a copy of the corresponding data. The copy is owned by the caller and can be modified and held indefinitely. The copy can be avoided by using an argument of type *RawBytes instead; see the documentation for RawBytes for restrictions on its use.
What I think is happening is that if you use a *json.RawMessage, Scan does not see it as a *[]byte and therefore does not copy into it. You end up holding the driver's internal slice, which the next call to Scan overwrites.
Change your Scan call to cast the *json.RawMessage to a *[]byte so that Scan copies the value into it:
a.Doc = new(json.RawMessage) // the pointer must be non-nil before scanning into it
err := rows.Scan(
&a.Id,
(*[]byte)(a.Doc),
)
In case it helps anyone:
I used masebase's answer to INSERT a json.RawMessage property of my struct into a PostgreSQL column of jsonb type.
All you need to do is cast: ([]byte)(a.Doc) in the insert binding (without the * in my case, because my Doc field is not a pointer).
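For instance, a sketch based on the Article struct above, where Doc is a *json.RawMessage and therefore needs a dereference before the cast:

_, err = db.Exec(`insert into articles ("doc") values ($1)`, []byte(*a.Doc))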
I'd like to be able to run EXPLAIN ANALYZE on my queries from inside my app, which is written in Go and uses the github.com/lib/pq driver. Unfortunately, neither the lib/pq docs nor the database/sql docs seem to say anything about this, and nothing in the database/sql interfaces suggests it is possible.
Has anyone found a way to get this output?
A typical EXPLAIN ANALYZE returns several rows of text, so you can read them with a plain sql.Query. Here is an example:
package main
import (
"database/sql"
"fmt"
_ "github.com/lib/pq"
"log"
)
func main() {
db, err := sql.Open("postgres", "user=test dbname=test sslmode=disable")
if err != nil {
log.Fatal(err)
}
defer db.Close()
rows, err := db.Query("EXPLAIN ANALYZE SELECT * FROM accounts ORDER BY slug")
if err != nil {
log.Fatal(err)
}
for rows.Next() {
var s string
if err := rows.Scan(&s); err != nil {
log.Fatal(err)
}
fmt.Println(s)
}
if err := rows.Err(); err != nil {
log.Fatal(err)
}
}
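If you would rather receive the whole plan as a single value, PostgreSQL can also emit it as one JSON document, which is easier to consume programmatically; a sketch:

var plan string
err = db.QueryRow("EXPLAIN (ANALYZE, FORMAT JSON) SELECT * FROM accounts ORDER BY slug").Scan(&plan)
if err != nil {
    log.Fatal(err)
}
fmt.Println(plan)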