Go Database Connector: go-sql-driver works, everything else gives "unknown driver (forgotten import?)"

When I try to use database/sql in this way, it compiles and works:

import (
    "database/sql"
    _ "github.com/go-sql-driver/mysql"
)

But if I try to use Postgres-specific connectors, it fails:

import (
    "database/sql"
    _ "github.com/lib/pq"
)

import (
    "database/sql"
    _ "github.com/jbarham/gopgsqldriver"
)

Both fail with the error:

sql: unknown driver "mysql" (forgotten import?)

I have run go get for both of these packages and am really not sure why this is happening.

Are you doing

db, err := sql.Open("mysql", ...)

later on? When you import "github.com/lib/pq", for example, it registers itself by calling sql.Register, and then in the source of sql.Open you have:

func Open(driverName, dataSourceName string) (*DB, error) {
    driversMu.RLock()
    driveri, ok := drivers[driverName]
    driversMu.RUnlock()
    if !ok {
        return nil, fmt.Errorf("sql: unknown driver %q (forgotten import?)", driverName)
    }
    // ...
}

Note that this is a runtime error returned by sql.Open, not a compile error. So, since you are no longer importing mysql, you need to change sql.Open to use the pq driver (or whichever one you end up picking).
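For example, a minimal sketch of the Postgres variant with lib/pq (the connection string here is illustrative; adjust it to your setup):

package main

import (
    "database/sql"
    "log"

    _ "github.com/lib/pq" // registers the "postgres" driver via sql.Register
)

func main() {
    // The driver name passed to sql.Open must match what the imported
    // driver registered: lib/pq registers itself as "postgres".
    db, err := sql.Open("postgres", "user=me dbname=mydb sslmode=disable")
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()

    // sql.Open only validates its arguments; Ping forces a real connection.
    if err := db.Ping(); err != nil {
        log.Fatal(err)
    }
}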

Related

Write data from the Stripe API to a CSV file in golang

I am trying to retrieve Stripe data and write it to a CSV file.
Here is my code:
package main

import (
    "encoding/csv"
    "fmt"
    "os"

    "github.com/stripe/stripe-go"
    "github.com/stripe/stripe-go/invoice"
)

func main() {
    stripe.Key = "" // I can't share the API key
    params := &stripe.InvoiceListParams{}
    params.Filters.AddFilter("limit", "", "3")
    params.Filters.AddFilter("status", "", "paid")
    i := invoice.List(params)

    // Create a CSV file
    csvdatafile, err := os.Create("./mycsvfile.csv")
    if err != nil {
        fmt.Println(err)
    }
    defer csvdatafile.Close()

    // Write Unmarshaled json data to CSV file
    w := csv.NewWriter(csvdatafile)

    // Column title
    var header []string
    header = append(header, "ID")
    w.Write(header)

    for i.Next() {
        in := i.Invoice()
        fmt.Printf(in.ID) // It is working
        w.Write(in)       // It is not working
    }
    w.Flush()
    fmt.Println("Appending succed")
}
When I run my program with go run *.go I get the following error:

./main.go:35:10: cannot use in (type *stripe.Invoice) as type []string in argument to w.Write

I think I am not far from the solution; I just need to understand how to write to the CSV file correctly with the w.Write() method.
According to the doc, the Write function is:

func (w *Writer) Write(record []string) error

That is, it expects a slice of strings representing one line of CSV data, with each string being one field. So, if you have only one field, you have to pass a string slice of length 1:

w.Write([]string{in.ID})
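Putting it together, a sketch of the corrected loop from the question (only the ID column is written here; checking Write's error is also worth doing):

w.Write([]string{"ID"}) // header row
for i.Next() {
    in := i.Invoice()
    // Each record must be a []string with one entry per column;
    // non-string fields would need strconv before being appended.
    if err := w.Write([]string{in.ID}); err != nil {
        fmt.Println(err)
    }
}
w.Flush()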

Tables not getting created in Postgresql using Gorm

I am trying to create a table from a struct using the code below. It initially worked with hard-coded credentials; after switching to env vars, I wanted to verify that the tables and schemas would still get created as expected.
So far I have tried:
Removing the tables from the db and running "go run main.go".
Result: connection to the db is established successfully, but the tables do not get created.
Deleting the database, recreating it with psql's "CREATE DATABASE" command, and running "go run main.go".
Result: connection to the db is established successfully, but the tables do not get created.
Using AutoMigrate, but it did not create the tables either.
Debug: when I run it in debug mode, the debug console shows that it is connected and I see no indication of any errors. I have not been coding for many years and am still learning.
Below are two files, main.go and api.go (which opens the db).
MAIN.GO
package main

import (
    "fmt"
    "log"
    "net/http"
    "time"

    "github.com/gorilla/handlers"
    "gitlab......"
    _ "gitlab....."
)

var err error

func main() {
    api := controllers.API{}
    // Using env vars from a config file
    api.Initialize("user=%s password=%s dbname=%s port=%s sslmode=disable")

    // BIND TO A PORT AND PASS OUR ROUTER IN
    log.Fatal(http.ListenAndServe(":8000", handlers.CORS()(api.Router)))
    if err != nil {
        panic(err.Error())
    }

    // CREATE TABLES AND SCHEMA IF TABLES DO NOT EXIST
    if !api.Database.HasTable(&Application{}) {
        api.Database.CreateTable(&Application{})
    }
}

// Models
type Application struct {
    ID        string    `json:"id" gorm:"primary_key"`
    CreatedAt time.Time `json:"-"`
    UpdatedAt time.Time `json:"-"`
    Name      string    `json:"name"`
    Ci        string    `json:"ci"`
}
API.GO
func (api *API) Initialize(opts string) {
    // Initialize DB
    dbinfo := fmt.Sprintf("user=%s password=%s dbname=%s port=%s sslmode=disable",
        config.DB_USER, config.DB_PASSWORD, config.DB_NAME, config.PORT)
    api.Database, err = gorm.Open("postgres", dbinfo)
    if err != nil {
        log.Print("failed to connect to the database")
        log.Fatal(err)
    }
    fmt.Println("Connection established")
    log.Printf("Postgres started at %s PORT", config.PORT)
}
I have already created the database and am able to establish a connection to it; I just cannot get the tables created.
Ideas?
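One thing worth checking: log.Fatal(http.ListenAndServe(...)) never returns, so the table-creation block placed after it in main.go never runs. A minimal sketch that runs the migration before starting the server, reusing the api variable and Application struct from the question (gorm's AutoMigrate creates any missing tables):

func main() {
    api := controllers.API{}
    api.Initialize("user=%s password=%s dbname=%s port=%s sslmode=disable")

    // Create or update the schema before the server starts listening;
    // anything placed after log.Fatal(http.ListenAndServe(...)) never executes.
    api.Database.AutoMigrate(&Application{})

    log.Fatal(http.ListenAndServe(":8000", handlers.CORS()(api.Router)))
}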

How can I properly vendor github.com/docker/docker?

Here is my main.go:
package cmd

import (
    "context"
    "fmt"

    "github.com/docker/docker/api/types"
    "github.com/docker/docker/client"
)

func main() {
    cli, err := client.NewClientWithOpts(client.WithVersion("1.38"))
    if err != nil {
        panic(err)
    }
    networks, err := cli.NetworkList(context.Background(), types.NetworkListOptions{})
    if err != nil {
        panic(err)
    }
    fmt.Println(networks)
}
I tried to run dep init, but the vendor folder ended up with an older version of docker/docker because the newest tag is 17.05. I tried to pin the actual commit, but that did not work either.
I gave go mod vendor a shot, but that also relies on git tags.
Strangely enough, docker/docker is an alias for moby/moby and docker/engine.
Could anyone explain and give an example of how to successfully use vendoring with the Docker API?
These two entries in Gopkg.toml solved the dependency issue, followed by running dep ensure:

[[constraint]]
  name = "github.com/docker/docker"
  branch = "master"

[[override]]
  name = "github.com/docker/distribution"
  branch = "master"
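If you end up on Go modules instead of dep, you can also pin an arbitrary commit rather than a tag; a sketch, with <commit-sha> standing in for whichever moby/moby commit you want:

go get github.com/docker/docker@<commit-sha>
go mod vendor

go get resolves the commit to a pseudo-version in go.mod, so you are not limited to the published release tags.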

How to test the passing of arguments in Golang?

package main

import (
    "flag"
    "fmt"
)

func main() {
    passArguments()
}

func passArguments() string {
    username := flag.String("user", "root", "Username for this server")
    flag.Parse()
    fmt.Printf("Your username is %q.", *username)
    usernameToString := *username
    return usernameToString
}
Passing an argument to the compiled code:

./args -user=bla

results in:

Your username is "bla"

so the username that has been passed is displayed.
Aim: to avoid having to build and run the code manually every time, the goal is to write a test that exercises the passing of arguments.
Attempt
Running the following test:
package main

import (
    "os"
    "testing"
)

func TestArgs(t *testing.T) {
    expected := "bla"
    os.Args = []string{"-user=bla"}
    actual := passArguments()
    if actual != expected {
        t.Errorf("Test failed, expected: '%s', got: '%s'", expected, actual)
    }
}
results in:

Your username is "root".Your username is "root".--- FAIL: TestArgs (0.00s)
    args_test.go:15: Test failed, expected: 'bla', got: 'root'
FAIL
coverage: 87.5% of statements
FAIL    tool    0.008s

Problem
It looks like os.Args = []string{"-user=bla"} is not able to pass this argument to the function, as the outcome is root instead of bla.
Per my comment, the very first value in os.Args is the (path to the) executable itself, so os.Args = []string{"cmd", "-user=bla"} should fix your issue. You can take a look at the flag tests in the standard library, where they do something similar.
Also, as os.Args is a global variable, it might be a good idea to keep the state from before the test and restore it afterwards, similarly to the linked test:

oldArgs := os.Args
defer func() { os.Args = oldArgs }()

This might be useful where other tests are, for example, examining the real arguments passed when invoking go test.
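Putting both points together, a sketch of the adjusted test (the "cmd" value is just a stand-in for the program name in os.Args[0]):

func TestArgs(t *testing.T) {
    // Keep the real arguments and restore them when the test finishes.
    oldArgs := os.Args
    defer func() { os.Args = oldArgs }()

    // os.Args[0] is the executable itself; the flags start at os.Args[1].
    os.Args = []string{"cmd", "-user=bla"}

    expected := "bla"
    actual := passArguments()
    if actual != expected {
        t.Errorf("Test failed, expected: '%s', got: '%s'", expected, actual)
    }
}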
This question is old enough but still gets searched out, and the above now seems outdated because Go 1.13 changed things.
I find this approach helpful: put the flag.*() definitions in init() and the flag.Parse() call in Test*().
Also note that -args cannot take -<test-arg>=<val> after it, only <test-arg>; otherwise the test args will be treated as go test's own command-line parameters instead of your Test*'s.
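A standalone sketch of that layout (note the flag must be registered only once per binary, so this replaces the flag.String call inside passArguments rather than adding to it):

var username string

func init() {
    // Register the flag at package init time so it exists before any flag.Parse call.
    flag.StringVar(&username, "user", "root", "Username for this server")
}

func TestUserFlag(t *testing.T) {
    // Parse inside the test rather than in init: since Go 1.13 the testing
    // package registers its own flags only when the test binary starts up.
    flag.Parse()
    if username != "bla" {
        t.Errorf("expected 'bla', got %q", username)
    }
}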

How to create Fields

I am trying to create a script to add Fields to a MarkLogic database with the Admin API. I have created the following functions to perform this task:
declare function local:createField($config as element(configuration), $server-config as element(http-server))
{
  let $dbid := xdmp:database(fn:data($server-config/database))
  let $addField :=
    let $fieldspec := admin:database-field("VideoTitle1", fn:false())
    return admin:save-configuration(admin:database-add-field($config, $dbid, $fieldspec))
  return "SUCCESS"
};

declare function local:createFieldPath($config as element(configuration), $server-config as element(http-server))
{
  let $dbid := xdmp:database(fn:data($server-config/database))
  let $addPath :=
    let $fieldpath := admin:database-field-path("/Video/BasicInfo/Title", 1.0)
    return admin:save-configuration(admin:database-add-field-paths($config, $dbid, "VideoTitle1", $fieldpath))
  return "SUCCESS"
};

declare function local:createFieldRangeIndex($config as element(configuration), $server-config as element(http-server))
{
  let $dbid := xdmp:database(fn:data($server-config/database))
  let $addRange :=
    let $rangespec := admin:database-range-field-index("string", "VideoTitle1", "http://marklogic.com/collation/", fn:false())
    return admin:save-configuration(admin:database-add-range-field-index($config, $dbid, $rangespec))
  return "SUCCESS"
};
But I am getting this error:
[1.0-ml] ADMIN-BADPATHFIELDTYPE: (err:FOER0000) Incorrect field:
the field VideoTitle1 already has include-root.
In /MarkLogic/admin.xqy on line 5255
In database-check-path-field(<configuration/>, xs:unsignedLong("12095791717198876597"), "VideoTitle1")
$config := <configuration/>
$database-id := xs:unsignedLong("12095791717198876597")
$field-name := "VideoTitle1"
$field := <field xmlns="http://marklogic.com/xdmp/database" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"><field-name>VideoTitle1</field-name><include-root>false</include...</field>
$field-path := ()
I am running the complete script through Query Console (QC) and my MarkLogic version is "7.0-1". I have successfully created an Element Range Index and an Attribute Range Index with this script, but for this step I am getting the error.
The error description says that the field has include-root, but I am setting it to false():

admin:database-field("VideoTitle1", fn:false())

Am I using the wrong function, or have I missed something?
Please help.
If you're trying to set up an entire database, you're usually better off using the packaging API and services: https://developer.marklogic.com/learn/packaging-tutorial gives a tour of the Configuration Manager web UI, and then there is a guide, an XQuery API, and a REST API.
That said, let's try to debug. It's difficult to debug your error message because the variable names and line numbers in it don't match up with your sample code: for example, the stack trace has $database-id but your code has $dbid. A test case needs to be reproducible.
However, I notice that you aren't calling the right function to construct your field configuration. If you want to use paths, say so up front by using admin:database-path-field (https://docs.marklogic.com/admin:database-path-field) rather than admin:database-field. The error message could use some work: it should be something more like "Incorrect field: the field VideoTitle1 is not a path field".
If you really want to stick with the Admin API for this, I recommend changing your code so that you only call admin:save-configuration once. That's more efficient, and more robust in the face of any need to restart. One way to arrange this would be for each of your function calls to take a $config as element(configuration) param and return a new element(configuration) with the changes. Another method is to have a module variable $CONFIG as element(configuration) and mutate it with xdmp:set. Take a look at https://stackoverflow.com/a/12252515/908390 for examples of both techniques.
Here's a working version of your code:
import module namespace admin="http://marklogic.com/xdmp/admin"
  at "/MarkLogic/admin.xqy";

declare function local:createField(
  $config as element(configuration),
  $server-config as element(http-server))
as element(configuration)
{
  let $dbid := xdmp:database(fn:data($server-config/database))
  let $fieldspec :=
    admin:database-path-field(
      "VideoTitle1",
      admin:database-field-path("/Video/BasicInfo/Title", 1.0))
  return admin:database-add-field($config, $dbid, $fieldspec)
};

declare function local:createFieldRangeIndex(
  $config as element(configuration),
  $server-config as element(http-server))
as element(configuration)
{
  let $dbid := xdmp:database(fn:data($server-config/database))
  let $rangespec :=
    admin:database-range-field-index(
      "string",
      "VideoTitle1",
      "http://marklogic.com/collation/",
      fn:false())
  return
    admin:database-add-range-field-index(
      $config, $dbid, $rangespec)
};

let $cfg := admin:get-configuration()
let $fubar := <http-server><database>test</database></http-server>
let $_ := xdmp:set($cfg, local:createField($cfg, $fubar))
let $_ := xdmp:set($cfg, local:createFieldRangeIndex($cfg, $fubar))
return admin:save-configuration($cfg)
Not a direct answer, but why reinvent a wheel that others have already invented? There are several solutions that can help deploy database settings and more. One of them is Roxy:
https://github.com/marklogic/roxy
Roxy provides a full framework for managing a MarkLogic project. You can find docs and tutorials on the wiki of the GitHub project.
Another, less intrusive solution could be to configure your databases once, then use the built-in Configuration Manager (http://host:8002/nav/) to export your database settings and import them elsewhere.
HTH!