Go ioutil using too many file descriptors/leak? - file-io

I am going through a list of files and unmarshalling the XML data in them into an array of structs, rArray. I intend to process about 18,000 files. When I get to about 1,300 files processed, the program panics and says that too many files are open. If I limit the number of files processed to a safe amount of 1,000, the program does not crash.
As seen below, I am using ioutil.ReadFile to read the file data.
for _, f := range files {
    func() {
        data, err := ioutil.ReadFile("./" + recordDir + "/" + f.Name())
        if err != nil {
            fmt.Printf("error reading %v\n", err)
            return
        } else {
            if strings.Contains(filepath.Ext(f.Name()), "xml") {
                // unmarshal data and put into struct array
                err = xml.Unmarshal([]byte(data), &rArray[a])
                if err != nil {
                    fmt.Printf("error decoding %v: %v\n", f.Name(), err)
                    return
                }
            }
        }
    }()
}
I am not sure if Go is using too many file descriptors or not closing the files fast enough.
After reading https://groups.google.com/forum/#!topic/golang-nuts/7yXXjgcOikM and viewing the ioutil source in http://golang.org/src/pkg/io/ioutil/ioutil.go, the code for ioutil.ReadFile shows that it uses defer to close the file. defer runs when the calling function returns, and ReadFile() is the calling function here. Am I correct in this understanding?
I also tried wrapping the ioutil.ReadFile part of my code in a function, but it makes no difference.
My ulimit is set to unlimited.
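To make the defer point concrete, here is a minimal sketch (not my actual code) of the two patterns I am comparing: deferring Close inside a plain loop keeps every descriptor open until the enclosing function returns, while moving the body into a helper releases each file as soon as the helper returns.
import (
    "io/ioutil"
    "os"
)

// leakyRead defers every Close until leakyRead itself returns, so all the
// files it touched stay open at the same time.
func leakyRead(paths []string) {
    for _, p := range paths {
        f, err := os.Open(p)
        if err != nil {
            continue
        }
        defer f.Close() // piles up: only runs when leakyRead returns
        _, _ = ioutil.ReadAll(f)
    }
}

// readOne closes its file as soon as it returns, i.e. once per iteration of
// whatever loop calls it.
func readOne(p string) ([]byte, error) {
    f, err := os.Open(p)
    if err != nil {
        return nil, err
    }
    defer f.Close()
    return ioutil.ReadAll(f)
}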
UPDATE:
I believe that the error of too many files is actually occurring during my Unzip function.
func Unzip(src, dest string) error {
    r, err := zip.OpenReader(src)
    if err != nil {
        return err
    }
    for _, f := range r.File {
        rc, err := f.Open()
        if err != nil {
            panic(err)
        }
        path := filepath.Join(dest, f.Name)
        if f.FileInfo().IsDir() {
            os.MkdirAll(path, f.Mode())
        } else {
            f, err := os.OpenFile(
                path, os.O_WRONLY|os.O_CREATE|os.O_TRUNC, f.Mode())
            if err != nil {
                panic(err)
            }
            _, err = io.Copy(f, rc)
            if err != nil {
                panic(err)
            }
            f.Close()
        }
        rc.Close()
    }
    r.Close()
    return nil
}
I initially got the Unzip function from https://gist.github.com/hnaohiro/4572580, but upon further inspection, the use of defer in the gist author's function seemed wrong: the files would only be closed after the Unzip() function returned, which is too late because then 18000 file descriptors would be open. ;)
I replaced the deferred Closes with explicit Close() as shown above, but am still getting the same "too many open files" error. Is there a problem with my modified Unzip function?
UPDATE # 2
Oops, I was running this on Heroku and was pushing to the wrong app with my changes this entire time. Lesson learned: verify target app in heroku toolbelt.
The Unzip code from https://gist.github.com/hnaohiro/4572580 does not work, as it does not close the files until all files have been processed.
My unzip code with explicit Close() above works, and so does the defer version in @peterSO's answer.

I would modify the Unzip function from https://gist.github.com/hnaohiro/4572580 to the following:
package main

import (
    "archive/zip"
    "io"
    "log"
    "os"
    "path/filepath"
)

func unzipFile(f *zip.File, dest string) error {
    rc, err := f.Open()
    if err != nil {
        return err
    }
    defer rc.Close()

    path := filepath.Join(dest, f.Name)
    if f.FileInfo().IsDir() {
        err := os.MkdirAll(path, f.Mode())
        if err != nil {
            return err
        }
    } else {
        f, err := os.OpenFile(
            path, os.O_WRONLY|os.O_CREATE|os.O_TRUNC, f.Mode())
        if err != nil {
            return err
        }
        defer f.Close()
        _, err = io.Copy(f, rc)
        if err != nil {
            return err
        }
    }
    return nil
}

func Unzip(src, dest string) error {
    r, err := zip.OpenReader(src)
    if err != nil {
        return err
    }
    defer r.Close()
    for _, f := range r.File {
        err := unzipFile(f, dest)
        if err != nil {
            return err
        }
    }
    return nil
}

func main() {
    err := Unzip("./sample.zip", "./out")
    if err != nil {
        log.Fatal(err)
    }
}

Related

What's the best way to delete all database records used for a Test suite?

I have a test suite that pollutes my database using a seed read from a YAML file.
I'm wondering if there is a way to clean my database (delete all records used for the test suite) after running my tests.
// Open db and returns pointer and closer func
func prepareMySQLDB(t *testing.T) (db *sql.DB, closer func() error) {
    db, err := sql.Open("mysql", "user:pass@/database")
    if err != nil {
        t.Fatalf("open mysql connection: %s", err)
    }
    return db, db.Close
}

// Pollute my database
func polluteDb(db *sql.DB, t *testing.T) {
    seed, err := os.Open("seed.yml")
    if err != nil {
        t.Fatalf("failed to open seed file: %s", err)
    }
    defer seed.Close()

    p := polluter.New(polluter.MySQLEngine(db))
    if err := p.Pollute(seed); err != nil {
        t.Fatalf("failed to pollute: %s", err)
    }
}
func TestAllUsers(t *testing.T) {
    t.Parallel()
    db, closeDb := prepareMySQLDB(t)
    defer closeDb()
    polluteDb(db, t)

    users, err := AllUsersD(db)
    if err != nil {
        t.Fatal("AllUsers() failed")
    }
    got := users[0].Email
    if got != "myemail@gmail.com" {
        t.Errorf("AllUsers().Email = %s; want myemail@gmail.com", got)
    }
    got1 := len(users)
    if got1 != 1 {
        t.Errorf("len(AllUsers()) = %d; want 1", got1)
    }
}

// Test I'm interested in
func TestAddUser(t *testing.T) {
    t.Parallel()
    db, closeDb := prepareMySQLDB(t)
    defer closeDb()
    polluteDb(db, t)

    user, err := AddUser(...)
    if err != nil {
        t.Fatal("AddUser() failed")
    }
    // how can I clean my database after this?
}
Should I retrieve the last ID inserted in TestAddUser() and just delete that row manually, or is there another way to save my database state and restore it afterwards?
As I said, I'm new to Go, so any other comments on my code or anything else are strongly appreciated.
The best way is usually to use a transaction, then ROLLBACK, so they are never committed in the first place.
The github.com/DATA-DOG/go-txdb package can help a lot with that.
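If you prefer not to add a dependency, the same rollback idea with plain database/sql looks roughly like this. It is only a sketch: the table and column names are assumptions, and your data-access functions would need to accept something satisfied by *sql.Tx rather than only *sql.DB.
// Sketch only: run the test inside a transaction and roll it back, so nothing
// is ever committed.
func TestAddUserRollback(t *testing.T) {
    db, closeDb := prepareMySQLDB(t)
    defer closeDb()

    tx, err := db.Begin()
    if err != nil {
        t.Fatalf("begin transaction: %s", err)
    }
    defer tx.Rollback() // undoes every write made during the test

    if _, err := tx.Exec(
        "INSERT INTO users (email, password) VALUES (?, ?)",
        "bla@gmail.com", "pass",
    ); err != nil {
        t.Fatalf("insert: %s", err)
    }
    // ... run assertions against tx here ...
}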
Final code:
import (
    "database/sql"
    "os"
    "testing"

    txdb "github.com/DATA-DOG/go-txdb"
    "github.com/romanyx/polluter"
)

// mostly sql tests
func init() {
    txdb.Register("txdb", "mysql", "root:root@/betell_rest")
}

func TestAddUser(t *testing.T) {
    db, err := sql.Open("txdb", "root:root@/betell_rest")
    if err != nil {
        t.Fatal(err)
    }
    defer db.Close()

    users, _ := AllUsers(db)
    userscount := len(users)
    err = AddUser(db, "bla@gmail.com", "pass")
    if err != nil {
        t.Fatal("AddUser() failed")
    }
    users, _ = AllUsers(db)
    if (userscount + 1) != len(users) {
        t.Fatal("AddUser() failed to write in database")
    }
}
Note: you can also pass the db into your polluter, so the seeding doesn't affect your database at all.
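For example, the polluteDb helper from the question can be pointed at the txdb connection, so the seed data is rolled back together with everything else when the connection closes (sketch, reusing the DSN above):
func TestAllUsersTxdb(t *testing.T) {
    db, err := sql.Open("txdb", "root:root@/betell_rest")
    if err != nil {
        t.Fatal(err)
    }
    defer db.Close() // closing rolls the wrapped transaction back

    polluteDb(db, t) // seed data only ever exists inside the transaction

    users, err := AllUsersD(db)
    if err != nil {
        t.Fatal("AllUsers() failed")
    }
    if len(users) != 1 {
        t.Errorf("len(AllUsers()) = %d; want 1", len(users))
    }
}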

How to use os.Open()'s return value as the third parameter of http.Post() and set Content-Length?

The third parameter of http.Post() accepts an io.Reader, which means the return value of os.Open() should work. But the code below gets an unexpected result; in other words, it won't set Content-Length properly. Perhaps the File type doesn't implement something. Is there any proper way to set Content-Length with *File?
package main

import (
    "bytes"
    "io/ioutil"
    "log"
    "net/http"
    "net/http/httptest"
    "os"
)

var sample = []byte(`hello`)

func main() {
    ts := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
        log.Println(r.Header)
        if int(r.ContentLength) != len(sample) {
            log.Fatal("Unexpected Content-Length:", r.ContentLength)
        }
        w.Header().Set("Content-Type", "application/json")
        w.Write([]byte(`{}`))
    }))
    defer ts.Close()

    file, err := ioutil.TempFile(os.TempDir(), "")
    if err != nil {
        log.Fatal(err)
    }
    defer os.Remove(file.Name())
    file.Write(sample)

    // This works
    buf, err := ioutil.ReadFile(file.Name())
    if err != nil {
        log.Fatal(err)
    }
    _, err = http.Post(ts.URL, "application/octet-stream", bytes.NewBuffer(buf))
    if err != nil {
        log.Fatal(err)
    }

    // This looks fine in my opinion, though it doesn't set Content-Length
    f, err := os.Open(file.Name())
    if err != nil {
        log.Fatal(err)
    }
    _, err = http.Post(ts.URL, "application/octet-stream", f)
    if err != nil {
        log.Fatal(err)
    }
}
Output:
2009/11/10 23:00:00 map[Content-Type:[application/octet-stream] Accept-Encoding:[gzip] User-Agent:[Go-http-client/1.1] Content-Length:[5]]
2009/11/10 23:00:00 map[Content-Type:[application/octet-stream] Accept-Encoding:[gzip] User-Agent:[Go-http-client/1.1]]
2009/11/10 23:00:00 Unexpected Content-Length:-1
https://play.golang.org/p/hJLN2H9Y9p
If you look at the source for NewRequest you can see that contentLength is handled specially for specific input types, and a file reader isn't one of them. You'll have to manually set the Content-Length header if that's important [chunked should also work fine, unless you're sending to an old server impl].
If you want to add the Content-Length, you need to stat the file to get the size. The ContentLength isn't calculated automatically because an os.File may not have a useful size.
f, err := os.Open(file.Name())
if err != nil {
    log.Fatal(err)
}
req, err := http.NewRequest("POST", ts.URL, f)
if err != nil {
    log.Fatal(err)
}
stat, err := f.Stat()
if err != nil {
    log.Fatal(err)
}
req.ContentLength = stat.Size()
req.Header.Set("Content-Type", "application/octet-stream")
resp, err := http.DefaultClient.Do(req)
...

GO lang : Communicate with shell process

I want to execute a shell script from Go.
The shell script takes standard input and echoes the result.
I want to supply this input from GO and use the result.
What I am doing is:
cmd := exec.Command("python","add.py")
in, _ := cmd.StdinPipe()
But how do I read from in?
Here is some code writing to a process, and reading from it:
package main

import (
    "bufio"
    "fmt"
    "os/exec"
)

func main() {
    // What we want to calculate
    calcs := make([]string, 2)
    calcs[0] = "3*3"
    calcs[1] = "6+6"
    // To store the results
    results := make([]string, 2)

    cmd := exec.Command("/usr/bin/bc")
    in, err := cmd.StdinPipe()
    if err != nil {
        panic(err)
    }
    defer in.Close()
    out, err := cmd.StdoutPipe()
    if err != nil {
        panic(err)
    }
    defer out.Close()
    // We want to read line by line
    bufOut := bufio.NewReader(out)

    // Start the process
    if err = cmd.Start(); err != nil {
        panic(err)
    }

    // Write the operations to the process
    for _, calc := range calcs {
        _, err := in.Write([]byte(calc + "\n"))
        if err != nil {
            panic(err)
        }
    }

    // Read the results from the process
    for i := 0; i < len(results); i++ {
        result, _, err := bufOut.ReadLine()
        if err != nil {
            panic(err)
        }
        results[i] = string(result)
    }

    // See what was calculated
    for _, result := range results {
        fmt.Println(result)
    }
}
You might want to read/write from/to the process in different goroutines.
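A minimal variant of the same bc example with the stdin writes moved into their own goroutine might look like this (only a sketch; it still assumes bc lives at /usr/bin/bc):
package main

import (
    "bufio"
    "fmt"
    "os/exec"
)

func main() {
    calcs := []string{"3*3", "6+6"}
    results := make([]string, len(calcs))

    cmd := exec.Command("/usr/bin/bc")
    in, err := cmd.StdinPipe()
    if err != nil {
        panic(err)
    }
    out, err := cmd.StdoutPipe()
    if err != nil {
        panic(err)
    }
    bufOut := bufio.NewReader(out)

    if err := cmd.Start(); err != nil {
        panic(err)
    }

    // Feed stdin from its own goroutine so a full pipe can never block
    // the reads below.
    go func() {
        defer in.Close()
        for _, calc := range calcs {
            if _, err := in.Write([]byte(calc + "\n")); err != nil {
                return
            }
        }
    }()

    // Read the results in the main goroutine.
    for i := range results {
        line, _, err := bufOut.ReadLine()
        if err != nil {
            panic(err)
        }
        results[i] = string(line)
    }

    if err := cmd.Wait(); err != nil {
        panic(err)
    }

    for _, result := range results {
        fmt.Println(result)
    }
}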

POST data using the Content-Type multipart/form-data

I'm trying to upload images from my computer to a website using Go. Usually, I use a bash script that sends a file and a key to the server:
curl -F "image"=@"IMAGEFILE" -F "key"="KEY" URL
It works fine, but I'm trying to convert this request into my Go program.
http://matt.aimonetti.net/posts/2013/07/01/golang-multipart-file-upload-example/
I tried this link and many others, but for every piece of code I try, the response from the server is "no image sent", and I have no idea why. Does anyone know what's wrong with the example above?
Here's some sample code.
In short, you'll need to use the mime/multipart package to build the form.
package main

import (
    "bytes"
    "fmt"
    "io"
    "mime/multipart"
    "net/http"
    "net/http/httptest"
    "net/http/httputil"
    "os"
    "strings"
)

func main() {
    var client *http.Client
    var remoteURL string
    {
        //setup a mocked http client.
        ts := httptest.NewTLSServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
            b, err := httputil.DumpRequest(r, true)
            if err != nil {
                panic(err)
            }
            fmt.Printf("%s", b)
        }))
        defer ts.Close()
        client = ts.Client()
        remoteURL = ts.URL
    }

    //prepare the reader instances to encode
    values := map[string]io.Reader{
        "file":  mustOpen("main.go"), // lets assume its this file
        "other": strings.NewReader("hello world!"),
    }
    err := Upload(client, remoteURL, values)
    if err != nil {
        panic(err)
    }
}

func Upload(client *http.Client, url string, values map[string]io.Reader) (err error) {
    // Prepare a form that you will submit to that URL.
    var b bytes.Buffer
    w := multipart.NewWriter(&b)
    for key, r := range values {
        var fw io.Writer
        if x, ok := r.(io.Closer); ok {
            defer x.Close()
        }
        // Add an image file
        if x, ok := r.(*os.File); ok {
            if fw, err = w.CreateFormFile(key, x.Name()); err != nil {
                return
            }
        } else {
            // Add other fields
            if fw, err = w.CreateFormField(key); err != nil {
                return
            }
        }
        if _, err = io.Copy(fw, r); err != nil {
            return err
        }
    }
    // Don't forget to close the multipart writer.
    // If you don't close it, your request will be missing the terminating boundary.
    w.Close()

    // Now that you have a form, you can submit it to your handler.
    req, err := http.NewRequest("POST", url, &b)
    if err != nil {
        return
    }
    // Don't forget to set the content type, this will contain the boundary.
    req.Header.Set("Content-Type", w.FormDataContentType())

    // Submit the request
    res, err := client.Do(req)
    if err != nil {
        return
    }
    // Check the response
    if res.StatusCode != http.StatusOK {
        err = fmt.Errorf("bad status: %s", res.Status)
    }
    return
}

func mustOpen(f string) *os.File {
    r, err := os.Open(f)
    if err != nil {
        panic(err)
    }
    return r
}
Here's a function I've used that uses io.Pipe() to avoid reading the entire file into memory or needing to manage any buffers. It handles only a single file, but could easily be extended to handle more by adding more parts within the goroutine. The happy path works well. The error paths have not had much testing.
import (
    "fmt"
    "io"
    "mime/multipart"
    "net/http"
    "os"
)

func UploadMultipartFile(client *http.Client, uri, key, path string) (*http.Response, error) {
    body, writer := io.Pipe()

    req, err := http.NewRequest(http.MethodPost, uri, body)
    if err != nil {
        return nil, err
    }

    mwriter := multipart.NewWriter(writer)
    req.Header.Add("Content-Type", mwriter.FormDataContentType())

    errchan := make(chan error)

    go func() {
        defer close(errchan)
        defer writer.Close()
        defer mwriter.Close()

        w, err := mwriter.CreateFormFile(key, path)
        if err != nil {
            errchan <- err
            return
        }

        in, err := os.Open(path)
        if err != nil {
            errchan <- err
            return
        }
        defer in.Close()

        if written, err := io.Copy(w, in); err != nil {
            errchan <- fmt.Errorf("error copying %s (%d bytes written): %v", path, written, err)
            return
        }

        if err := mwriter.Close(); err != nil {
            errchan <- err
            return
        }
    }()

    resp, err := client.Do(req)
    merr := <-errchan

    if err != nil || merr != nil {
        return resp, fmt.Errorf("http error: %v, multipart error: %v", err, merr)
    }

    return resp, nil
}
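For instance, extra form fields could be written from inside the same goroutine with WriteField; the field name and value here are just placeholders:
// Inside the goroutine above, alongside CreateFormFile:
if err := mwriter.WriteField("key", "KEY"); err != nil { // placeholder field
    errchan <- err
    return
}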
After having to decode the accepted answer to this question for use in my unit testing, I finally ended up with the following refactored code:
func createMultipartFormData(t *testing.T, fieldName, fileName string) (bytes.Buffer, *multipart.Writer) {
    var b bytes.Buffer
    var err error
    w := multipart.NewWriter(&b)
    var fw io.Writer
    file := mustOpen(fileName)
    if fw, err = w.CreateFormFile(fieldName, file.Name()); err != nil {
        t.Errorf("Error creating writer: %v", err)
    }
    if _, err = io.Copy(fw, file); err != nil {
        t.Errorf("Error with io.Copy: %v", err)
    }
    w.Close()
    return b, w
}

func mustOpen(f string) *os.File {
    r, err := os.Open(f)
    if err != nil {
        pwd, _ := os.Getwd()
        fmt.Println("PWD: ", pwd)
        panic(err)
    }
    return r
}
Now it should be pretty easy to use:
b, w := createMultipartFormData(t, "image", "../luke.png")

req, err := http.NewRequest("POST", url, &b)
if err != nil {
    return
}
// Don't forget to set the content type, this will contain the boundary.
req.Header.Set("Content-Type", w.FormDataContentType())
Here is an option that works for files or strings:
package main

import (
    "bytes"
    "io"
    "mime/multipart"
    "os"
    "strings"
)

func createForm(form map[string]string) (string, io.Reader, error) {
    body := new(bytes.Buffer)
    mp := multipart.NewWriter(body)
    defer mp.Close()
    for key, val := range form {
        if strings.HasPrefix(val, "@") {
            val = val[1:]
            file, err := os.Open(val)
            if err != nil {
                return "", nil, err
            }
            defer file.Close()
            part, err := mp.CreateFormFile(key, val)
            if err != nil {
                return "", nil, err
            }
            io.Copy(part, file)
        } else {
            mp.WriteField(key, val)
        }
    }
    return mp.FormDataContentType(), body, nil
}
Example:
package main

import "net/http"

func main() {
    form := map[string]string{"image": "@IMAGEFILE", "key": "KEY"}
    ct, body, err := createForm(form)
    if err != nil {
        panic(err)
    }
    http.Post("https://stackoverflow.com", ct, body)
}
https://golang.org/pkg/mime/multipart#Writer.WriteField
Send file from one service to another:
func UploadFile(network, uri string, f multipart.File, h *multipart.FileHeader) error {
    buf := new(bytes.Buffer)
    writer := multipart.NewWriter(buf)
    part, err := writer.CreateFormFile("file", h.Filename)
    if err != nil {
        log.Println(err)
        return err
    }
    b, err := ioutil.ReadAll(f)
    if err != nil {
        log.Println(err)
        return err
    }
    part.Write(b)
    writer.Close()

    req, _ := http.NewRequest("POST", uri, buf)
    req.Header.Add("Content-Type", writer.FormDataContentType())
    client := &http.Client{}
    resp, err := client.Do(req)
    if err != nil {
        return err
    }
    defer resp.Body.Close()

    b, _ = ioutil.ReadAll(resp.Body)
    if resp.StatusCode >= 400 {
        return errors.New(string(b))
    }
    return nil
}
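The multipart.File and *multipart.FileHeader arguments are typically what r.FormFile returns in the receiving handler, so a caller might look roughly like this (the form field name and target URL are placeholders):
func forwardHandler(w http.ResponseWriter, r *http.Request) {
    f, h, err := r.FormFile("file") // form field name is an assumption
    if err != nil {
        http.Error(w, err.Error(), http.StatusBadRequest)
        return
    }
    defer f.Close()

    // "tcp" is just filler for the unused network argument; the URL is a placeholder.
    if err := UploadFile("tcp", "http://other-service/upload", f, h); err != nil {
        http.Error(w, err.Error(), http.StatusBadGateway)
        return
    }
    w.WriteHeader(http.StatusOK)
}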
To extend on @attila-o's answer, here is the code I went with to perform a POST HTTP request in Go with:
1 file
configurable file name (f.Name() didn't work)
extra form fields.
Curl representation:
curl -X POST \
  http://localhost:9091/storage/add \
  -H 'content-type: multipart/form-data; boundary=----WebKitFormBoundary7MA4YWxkTrZu0gW' \
  -F owner=0xc916Cfe5c83dD4FC3c3B0Bf2ec2d4e401782875e \
  -F password=$PWD \
  -F file=@./internal/file_example_JPG_500kB.jpg
Go way:
client := &http.Client{
    Timeout: time.Second * 10,
}
req, err := createStoragePostReq(cfg)
res, err := executeStoragePostReq(client, req)

func createStoragePostReq(cfg Config) (*http.Request, error) {
    extraFields := map[string]string{
        "owner":    "0xc916cfe5c83dd4fc3c3b0bf2ec2d4e401782875e",
        "password": "pwd",
    }

    url := fmt.Sprintf("http://localhost:%d%s", cfg.HttpServerConfig().Port(), lethstorage.AddRoute)
    b, w, err := createMultipartFormData("file", "./internal/file_example_JPG_500kB.jpg", "file_example_JPG_500kB.jpg", extraFields)
    if err != nil {
        return nil, err
    }
    req, err := http.NewRequest("POST", url, &b)
    if err != nil {
        return nil, err
    }
    req.Header.Set("Content-Type", w.FormDataContentType())

    return req, nil
}

func executeStoragePostReq(client *http.Client, req *http.Request) (lethstorage.AddRes, error) {
    var addRes lethstorage.AddRes
    res, err := client.Do(req)
    if err != nil {
        return addRes, err
    }
    defer res.Body.Close()

    data, err := ioutil.ReadAll(res.Body)
    if err != nil {
        return addRes, err
    }
    err = json.Unmarshal(data, &addRes)
    if err != nil {
        return addRes, err
    }
    return addRes, nil
}

func createMultipartFormData(fileFieldName, filePath string, fileName string, extraFormFields map[string]string) (b bytes.Buffer, w *multipart.Writer, err error) {
    w = multipart.NewWriter(&b)
    var fw io.Writer
    file, err := os.Open(filePath)
    if err != nil { // check was missing in the original snippet
        return
    }
    if fw, err = w.CreateFormFile(fileFieldName, fileName); err != nil {
        return
    }
    if _, err = io.Copy(fw, file); err != nil {
        return
    }
    for k, v := range extraFormFields {
        w.WriteField(k, v)
    }
    w.Close()
    return
}
I have found this tutorial very helpful in clarifying my confusion about file uploading in Go.
Basically you upload the file via ajax using form-data on a client and use the following small snippet of Go code on the server:
file, handler, err := r.FormFile("img") // img is the key of the form-data
if err != nil {
    fmt.Println(err)
    return
}
defer file.Close()
fmt.Println("File is good")
fmt.Println(handler.Filename)
fmt.Println()
fmt.Println(handler.Header)

f, err := os.OpenFile(handler.Filename, os.O_WRONLY|os.O_CREATE, 0666)
if err != nil {
    fmt.Println(err)
    return
}
defer f.Close()
io.Copy(f, file)
Here r is *http.Request. P.S. this just stores the file in the same folder and does not perform any security checks.
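Wrapped in a complete handler, that snippet might look roughly like the sketch below; the route, form key and the 32 MB in-memory limit are just illustrative choices:
// Register with: http.HandleFunc("/upload", uploadHandler)
func uploadHandler(w http.ResponseWriter, r *http.Request) {
    // Limit how much of the multipart body is buffered in memory (32 MB here).
    if err := r.ParseMultipartForm(32 << 20); err != nil {
        http.Error(w, err.Error(), http.StatusBadRequest)
        return
    }
    file, handler, err := r.FormFile("img") // same form key as above
    if err != nil {
        http.Error(w, err.Error(), http.StatusBadRequest)
        return
    }
    defer file.Close()

    f, err := os.OpenFile(handler.Filename, os.O_WRONLY|os.O_CREATE, 0666)
    if err != nil {
        http.Error(w, err.Error(), http.StatusInternalServerError)
        return
    }
    defer f.Close()
    if _, err := io.Copy(f, file); err != nil {
        http.Error(w, err.Error(), http.StatusInternalServerError)
        return
    }
    fmt.Fprintln(w, "uploaded", handler.Filename)
}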

How can I use the "compress/gzip" package to gzip a file?

I'm new to Go, and can't figure out how to use the compress/gzip package to my advantage. Basically, I just want to write something to a file, gzip it and read it directly from the zipped format through another script. I would really appreciate if someone could give me an example on how to do this.
All the compress packages implement the same interface. You would use something like this to compress:
var b bytes.Buffer
w := gzip.NewWriter(&b)
w.Write([]byte("hello, world\n"))
w.Close()
And this to unpack:
r, err := gzip.NewReader(&b)
io.Copy(os.Stdout, r)
r.Close()
Pretty much the same answer as Laurent, but with the file io:
import (
    "bytes"
    "compress/gzip"
    "io/ioutil"
)

// ...

var b bytes.Buffer
w := gzip.NewWriter(&b)
w.Write([]byte("hello, world\n"))
w.Close() // You must close this first to flush the bytes to the buffer.
err := ioutil.WriteFile("hello_world.txt.gz", b.Bytes(), 0666)
For the Read part, something like the useful ioutil.ReadFile for .gz files could be:
func ReadGzFile(filename string) ([]byte, error) {
    fi, err := os.Open(filename)
    if err != nil {
        return nil, err
    }
    defer fi.Close()

    fz, err := gzip.NewReader(fi)
    if err != nil {
        return nil, err
    }
    defer fz.Close()

    s, err := ioutil.ReadAll(fz)
    if err != nil {
        return nil, err
    }
    return s, nil
}
Here is a func to unpack a gzip file to a destination file:
func UnpackGzipFile(gzFilePath, dstFilePath string) (int64, error) {
    gzFile, err := os.Open(gzFilePath)
    if err != nil {
        return 0, fmt.Errorf("open file %q to unpack: %w", gzFilePath, err)
    }
    dstFile, err := os.OpenFile(dstFilePath, os.O_CREATE|os.O_WRONLY|os.O_APPEND, 0660)
    if err != nil {
        return 0, fmt.Errorf("create destination file %q to unpack: %w", dstFilePath, err)
    }
    defer dstFile.Close()

    ioReader, ioWriter := io.Pipe()
    defer ioReader.Close()

    go func() { // goroutine leak is possible here
        gzReader, _ := gzip.NewReader(gzFile)
        // it is important to close the writer, or reading from the other end of
        // the pipe with io.Copy() will never finish
        defer func() {
            gzFile.Close()
            gzReader.Close()
            ioWriter.Close()
        }()

        io.Copy(ioWriter, gzReader)
    }()

    written, err := io.Copy(dstFile, ioReader)
    if err != nil {
        return 0, err // goroutine leak is possible here
    }
    return written, nil
}
I decided to combine ideas from other answers and just provide a full example program. Obviously there are many different ways to do the same thing. This is just one way:
package main

import (
    "compress/gzip"
    "fmt"
    "io/ioutil"
    "os"
)

var zipFile = "zipfile.gz"

func main() {
    writeZip()
    readZip()
}

func writeZip() {
    handle, err := openFile(zipFile)
    if err != nil {
        fmt.Println("[ERROR] Opening file:", err)
    }
    zipWriter, err := gzip.NewWriterLevel(handle, 9)
    if err != nil {
        fmt.Println("[ERROR] New gzip writer:", err)
    }
    numberOfBytesWritten, err := zipWriter.Write([]byte("Hello, World!\n"))
    if err != nil {
        fmt.Println("[ERROR] Writing:", err)
    }
    err = zipWriter.Close()
    if err != nil {
        fmt.Println("[ERROR] Closing zip writer:", err)
    }
    fmt.Println("[INFO] Number of bytes written:", numberOfBytesWritten)
    closeFile(handle)
}

func readZip() {
    handle, err := openFile(zipFile)
    if err != nil {
        fmt.Println("[ERROR] Opening file:", err)
    }
    zipReader, err := gzip.NewReader(handle)
    if err != nil {
        fmt.Println("[ERROR] New gzip reader:", err)
    }
    defer zipReader.Close()

    fileContents, err := ioutil.ReadAll(zipReader)
    if err != nil {
        fmt.Println("[ERROR] ReadAll:", err)
    }
    fmt.Printf("[INFO] Uncompressed contents: %s\n", fileContents)

    // ** Another way of reading the file **
    //
    // fileInfo, _ := handle.Stat()
    // fileContents := make([]byte, fileInfo.Size())
    // bytesRead, err := zipReader.Read(fileContents)
    // if err != nil {
    //     fmt.Println("[ERROR] Reading gzip file:", err)
    // }
    // fmt.Println("[INFO] Number of bytes read from the file:", bytesRead)

    closeFile(handle)
}

func openFile(fileToOpen string) (*os.File, error) {
    return os.OpenFile(fileToOpen, openFileOptions, openFilePermissions)
}

func closeFile(handle *os.File) {
    if handle == nil {
        return
    }
    err := handle.Close()
    if err != nil {
        fmt.Println("[ERROR] Closing file:", err)
    }
}

const openFileOptions int = os.O_CREATE | os.O_RDWR
const openFilePermissions os.FileMode = 0660
Having a full example like this should be helpful for future reference.
To compress any Go object passed in as an interface{} value:
func compress(obj interface{}) ([]byte, error) {
    var b bytes.Buffer
    objBytes, err := json.Marshal(obj)
    if err != nil {
        return nil, err
    }
    gz := gzip.NewWriter(&b)
    defer gz.Close() // NOT SUFFICIENT, DON'T DEFER WRITER OBJECTS
    if _, err := gz.Write(objBytes); err != nil {
        return nil, err
    }
    // NEED TO CLOSE EXPLICITLY
    if err := gz.Close(); err != nil {
        return nil, err
    }
    return b.Bytes(), nil
}
To decompress the same,
func decompress(obj []byte) ([]byte, error) {
    r, err := gzip.NewReader(bytes.NewReader(obj))
    if err != nil {
        return nil, err
    }
    defer r.Close()
    res, err := ioutil.ReadAll(r)
    if err != nil {
        return nil, err
    }
    return res, nil
}
Note: ioutil.ReadAll(r) returns io.EOF or io.ErrUnexpectedEOF if you do not close the Writer object after writing. I assumed a deferred Close() would flush everything in time, but the deferred call only runs after b.Bytes() has already been returned, so the gzip writer has to be closed explicitly before reading the buffer. Don't rely on defer alone for writer objects.
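A quick round trip with the two helpers above could look like this (sketch; it assumes an arbitrary payload struct and the usual encoding/json, fmt and log imports):
type payload struct {
    Name  string `json:"name"`
    Count int    `json:"count"`
}

func main() {
    compressed, err := compress(payload{Name: "example", Count: 3})
    if err != nil {
        log.Fatal(err)
    }
    raw, err := decompress(compressed)
    if err != nil {
        log.Fatal(err)
    }
    var p payload
    if err := json.Unmarshal(raw, &p); err != nil {
        log.Fatal(err)
    }
    fmt.Printf("%+v\n", p) // prints {Name:example Count:3}
}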