I have a strange issue with Go's database/sql and probably denisenkom/go-mssqldb.
Here is the relevant part of my code:
func Auth(username string, password string, remote_ip string, user_agent string) (string, User, error) {
	var token string
	var user = User{}

	query := `exec dbo.sp_get_user ?,?`
	rows, err := DB.Query(query, username, password)
	if err != nil {
		return token, user, err
	}
	defer rows.Close()

	rows.Next()
	if err = rows.Scan(&user.Id, &user.Username, &user.Description); err != nil {
		log.Printf("SQL SCAN: Failed scan User in Auth. %v \n", err)
		return token, user, err
	}

	hashFunc := md5.New()
	hashFunc.Write([]byte(username + time.Now().String()))
	token = hex.EncodeToString(hashFunc.Sum(nil))

	query = `exec dbo.sp_set_session ?,?,?,?`
	_, err = DB.Exec(query, user.Id, token, remote_ip, user_agent)
	if err != nil {
		return token, user, err
	}

	return token, user, nil
}
The problem: defer rows.Close() does not seem to work properly.
After this runs, DB.Connection.Stats().OpenConnections always shows 2 open connections (and after repeated user logins it is still 2 connections for the whole app lifecycle).
But if I rewrite the function as:
...
	query := `exec dbo.sp_get_user ?,?`
	rows, err := DB.Query(query, username, password)
	if err != nil {
		return token, user, err
	}
	defer rows.Close()

	rows.Next()
	if err = rows.Scan(&user.Id, &user.Username, &user.Description); err != nil {
		log.Printf("SQL SCAN: Failed scan User in Auth. %v \n", err)
		return token, user, err
	}
	rows.Close()
...
Then the statement underlying rows is closed, and DB.Connection.Stats().OpenConnections always shows 1 open connection.
DB in my app simply wraps the handle returned by sql.Open.
The problem appears only in this function, where two statements are executed, one with Query and one with Exec.
Maybe Query and Exec use different connections, but I can't find anything like that in the driver source or in the database/sql source.
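While rows is still open it pins its connection, so the DB.Exec that follows has to pull a second connection from the pool; with defer, rows is only closed when the function returns. For a single-row lookup, a minimal sketch that avoids this is database/sql's QueryRow, which releases the connection as soon as Scan returns:

// Sketch: QueryRow closes the underlying rows right after Scan,
// so the connection is back in the pool before DB.Exec runs.
err := DB.QueryRow(`exec dbo.sp_get_user ?,?`, username, password).
	Scan(&user.Id, &user.Username, &user.Description)
if err != nil {
	return token, user, err
}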
Thank you! (Sorry if my English is bad.)
PS:
exec dbo.sp_get_user ?,? is a simple select from the user table, nothing more.
exec dbo.sp_set_session ?,?,?,? is a simple insert into the user table, nothing more.
In MSSQL, DBCC INPUTBUFFER shows me the query 'cast(##identity as bigint)', which denisenkom/go-mssqldb executes in mssql.go on line 593.
Related
I am writing an AWS Lambda to query 10 different tables from RDS (SQL Server) using Go. What I have learned so far is that we have to create a matching struct for a table in order to query it. But since I want to query 10 tables, I don't want to create a struct for every table, especially as a table's schema may change someday.
Later, I want to create a CSV file per table as a backup of the queried data and upload it to S3. So is it possible to build the CSV file directly inside the Lambda, so that I can upload it straight to S3?
You can see my current code below:
func executeQuery(dbconnection *sql.DB) {
	fmt.Println("\n\n----------Executing Query ----------")

	query := "select TOP 5 City,State,Country from IMBookingApp.dbo.Address"
	rows, err := dbconnection.Query(query)
	if err != nil {
		fmt.Println("Error:")
		log.Fatal(err)
	}
	defer rows.Close()
	fmt.Println("rows", rows)

	count := 0
	for rows.Next() {
		var City, State, Country string
		err := rows.Scan(&City, &State, &Country)
		if err != nil {
			fmt.Println("Error reading rows: " + err.Error())
		}
		fmt.Printf("City: %s, State: %s, Country: %s\n", City, State, Country)
		count++
	}
}
This code only works for the Address table, not for the other tables.
I have also tried it with GORM:
package main

import (
	"fmt"

	"github.com/jinzhu/gorm"
	_ "github.com/jinzhu/gorm/dialects/mssql"
)

type Currency struct {
	CurrencyId  int    `gorm:"column:CurrencyId;"`
	Code        string `gorm:"column:Code;"`
	Description string `gorm:"column:Description;"`
}

func main() {
	db, err := gorm.Open("mssql", "sqlserver://***")
	db.SingularTable(true)
	gorm.DefaultTableNameHandler = func(dbVeiculosGorm *gorm.DB, defaultTableName string) string {
		return "IMBookingApp.dbo.Currency"
	}
	fmt.Println("HasTable-Currency:", db.HasTable("ClientUser"))

	var currency Currency
	db.Debug().Find(&currency)
	fmt.Println("Currency:", currency)
	fmt.Println("Error", err)

	defer db.Close()
}
With both approaches, I couldn't find any way to make the code generic for multiple tables. I would appreciate it if anyone could give me some suggestions or point me to some resources.
I did not test this code, but it should give you an idea of how to fetch rows into a slice of strings.
defer rows.Close()

columns, err := rows.Columns()
if err != nil {
	panic(err)
}
for rows.Next() {
	receiver := make([]*string, len(columns))
	dest := make([]interface{}, len(columns))
	for i := range receiver {
		dest[i] = &receiver[i] // Scan needs one destination argument per column
	}
	if err := rows.Scan(dest...); err != nil {
		fmt.Println("Error reading rows: " + err.Error())
	}
}
Go internally converts many types into strings, see https://github.com/golang/go/blob/master/src/database/sql/convert.go#L219
If the data cannot be converted, you have two options:
Easy: update your SQL query to return strings or string-compatible data.
Complicated: use a slice of interface{} instead of a slice of *string, and fill it with default values of the correct type based on rows.ColumnTypes(). Later you will have to convert the real values into strings to save them into the CSV, as sketched below.
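A sketch of that second option (untested, like the code above; the type names in the switch are only examples of what a SQL Server driver might report):

// Sketch: choose a typed destination per column based on
// rows.ColumnTypes(), then stringify the values for the CSV later.
columnTypes, err := rows.ColumnTypes()
if err != nil {
	panic(err)
}
dest := make([]interface{}, len(columnTypes))
for i, ct := range columnTypes {
	switch ct.DatabaseTypeName() {
	case "INT", "BIGINT", "SMALLINT":
		dest[i] = new(sql.NullInt64)
	case "DECIMAL", "FLOAT", "REAL":
		dest[i] = new(sql.NullFloat64)
	case "BIT":
		dest[i] = new(sql.NullBool)
	default:
		dest[i] = new(sql.NullString)
	}
}
for rows.Next() {
	if err := rows.Scan(dest...); err != nil {
		fmt.Println("Error reading rows: " + err.Error())
	}
	// convert each *sql.Null* value in dest back to a string here
}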
The code below worked for me:
conn, _ := getConnection() // Get database connection
rows, err := conn.Query(query)
if err != nil {
	fmt.Println("Error:")
	log.Fatal(err)
}
defer rows.Close()

columns, err := rows.Columns()
if err != nil {
	panic(err)
}

for rows.Next() {
	receiver := make([]string, len(columns))
	is := make([]interface{}, len(receiver))
	for i := range is {
		is[i] = &receiver[i]
		// each is[i] will be of type interface{} - compatible with Scan()
		// using the underlying concrete `*string` values from `receiver`
	}
	err := rows.Scan(is...)
	if err != nil {
		fmt.Println("Error reading rows: " + err.Error())
	}
	fmt.Println("receiver", receiver)
}
Reference: sql: expected 3 destination arguments in Scan, not 1 in Golang
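Since the end goal is a CSV backup per table, the receiver slice from the generic scan above can be fed straight into encoding/csv; a sketch (the bytes.Buffer would then be uploaded to S3 with the AWS SDK, which is omitted here):

// Sketch: needs "bytes", "database/sql", "encoding/csv".
func rowsToCSV(rows *sql.Rows) (*bytes.Buffer, error) {
	columns, err := rows.Columns()
	if err != nil {
		return nil, err
	}
	var buf bytes.Buffer
	w := csv.NewWriter(&buf)
	w.Write(columns) // header row with the column names
	receiver := make([]string, len(columns))
	dest := make([]interface{}, len(columns))
	for i := range receiver {
		dest[i] = &receiver[i]
	}
	for rows.Next() {
		if err := rows.Scan(dest...); err != nil {
			return nil, err
		}
		w.Write(receiver) // one CSV record per row
	}
	w.Flush()
	return &buf, w.Error()
}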
I have 1,000,000 records in BigQuery. What is the best way to fetch the data from the DB and process it using Go? I'm getting a timeout issue if I fetch all the data without a limit. I already increased the limit to 5 minutes, but it takes more than 5 minutes.
I want to do some kind of streaming call or pagination implementation, but I don't know how to do that in Go.
var FetchCustomerRecords = func(req *http.Request) *bigquery.RowIterator {
	ctx := appengine.NewContext(req)
	// cancel is discarded so the deadline outlives this function for the returned iterator
	ctxWithDeadline, _ := context.WithTimeout(ctx, 5*time.Minute)
	log.Infof(ctx, "Fetch Customer records from BigQuery")

	client, err := bigquery.NewClient(ctxWithDeadline, "ddddd-crm")
	if err != nil {
		log.Infof(ctx, "%v", err)
	}
	q := client.Query("SELECT * FROM Something")
	q.Location = "US"

	job, err := q.Run(ctxWithDeadline)
	if err != nil {
		log.Infof(ctx, "%v", err)
	}
	status, err := job.Wait(ctxWithDeadline)
	if err != nil {
		log.Infof(ctx, "%v", err)
	}
	if err := status.Err(); err != nil {
		log.Infof(ctx, "%v", err)
	}

	it, err := job.Read(ctxWithDeadline)
	if err != nil {
		log.Infof(ctx, "%v", err)
	}
	return it
}
You can read the table contents directly without issuing a query. This doesn't incur query charges, and provides the same row iterator as you would get from a query.
For small results, this is fine. For large tables, I would suggest checking out the new storage api, and the code sample on the samples page.
For a small table or simply reading a small subset of rows, you can do something like this (reads up to 10k rows from one of the public dataset tables):
func TestTableRead(t *testing.T) {
	ctx := context.Background()
	client, err := bigquery.NewClient(ctx, "my-project-id")
	if err != nil {
		t.Fatal(err)
	}

	table := client.DatasetInProject("bigquery-public-data", "stackoverflow").Table("badges")
	it := table.Read(ctx)

	rowLimit := 10000
	var rowsRead int
	for {
		var row []bigquery.Value
		err := it.Next(&row)
		if err == iterator.Done || rowsRead >= rowLimit {
			break
		}
		if err != nil {
			t.Fatalf("error reading row offset %d: %v", rowsRead, err)
		}
		rowsRead++
		fmt.Println(row)
	}
}
You can split your query into 10 chunks of 100,000 records each and run them in multiple goroutines, using SQL queries like:
select * from somewhere order by id DESC limit 100000 offset 0
and, in the next goroutine:
select * from somewhere order by id DESC limit 100000 offset 100000
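A sketch of that idea with the BigQuery client from the question (it assumes the table has an id column to page on, which is an assumption on my part):

// Sketch: fetch ten pages of 100000 rows concurrently.
// Needs "fmt", "log", "sync", "cloud.google.com/go/bigquery",
// "google.golang.org/api/iterator"; client is a *bigquery.Client.
var wg sync.WaitGroup
pageSize := 100000
for page := 0; page < 10; page++ {
	wg.Add(1)
	go func(offset int) {
		defer wg.Done()
		q := client.Query(fmt.Sprintf(
			"SELECT * FROM Something ORDER BY id DESC LIMIT %d OFFSET %d",
			pageSize, offset))
		it, err := q.Read(ctx)
		if err != nil {
			log.Printf("query at offset %d: %v", offset, err)
			return
		}
		for {
			var row []bigquery.Value
			err := it.Next(&row)
			if err == iterator.Done {
				break
			}
			if err != nil {
				log.Printf("read at offset %d: %v", offset, err)
				return
			}
			// process row here
		}
	}(page * pageSize)
}
wg.Wait()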
I'm trying to make an HTTP request to get data from a ClickHouse database using Go. I don't have much experience with this and I'm not sure how to get the value returned by the query.
This is what I have:
reader := strings.NewReader("SELECT COUNT(*) FROM system.tables WHERE database = 'local' AND name = 'persons'")
request, err := http.NewRequest("GET", "http://localhost:8123", reader)
if err != nil {
	fmt.Println(err)
}

client := &http.Client{}
resp, err := client.Do(request)
if err != nil {
	fmt.Println(err)
}

fmt.Println("The answer is: ", resp.Body)
The expected output should be a number (1 means the table exists and 0 means it doesn't), but in resp.Body I'm getting this output:
The answer is: &{0xc4201741c0 {0 0} false <nil> 0x6a9bb0 0x6a9b20}
Any idea how to get just the value of the query?
I had to add a couple of extra lines:
message, _ := ioutil.ReadAll(resp.Body)
fmt.Println(string(message))
That helped me to get what I wanted.
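For completeness, resp.Body is an io.ReadCloser, so it has to be read (and should be closed) before you can see the payload; a slightly fuller sketch:

defer resp.Body.Close() // always close the body when done
body, err := ioutil.ReadAll(resp.Body)
if err != nil {
	fmt.Println(err)
}
// trim the trailing newline that ClickHouse's default TabSeparated format adds
fmt.Println("The answer is:", strings.TrimSpace(string(body)))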
I have a stored procedure, let its name be "vijaystoredprocedure". If it were an ordinary query in MSSQL, I would query it in Go like this:
l_query_str = fmt.Sprintf(`select * from Users where Fname='%s'`, l_firstanme)
row, err := DBC.Query(l_query_str)
if err != nil {
	log.Fatal("Prepare failed:", err.Error())
}

_, rows, r_err := DBScan_fn(row)
if r_err != nil {
	fmt.Println("no data found err")
	return
}
Now, since I have to get values from a stored procedure, can someone suggest how to achieve this in Go?
I'm using the github.com/alexbrainman/odbc driver.
Example of executing a stored procedure:
proc := "exec Dbo.vijaystoredprocedure ?, ?, ?, ?" //(Number of parameters)
parms := []interface{}{"parm1","parm2","parm3","parm4"}// Parameters if needed
if Stmt, err := DBC.Prepare(proc); err != nil {
log.Fatal(err.Error())
} else {
defer Stmt.Close()
if result, err := Stmt.Exec(parms...); err != nil {
log.Fatal(err.Error())
}
}
Example of calling a table-valued function:
proc := "SELECT * From Dbo.[vijaystoredprocedure](?,?)" //(Number of parameters)
parms := []interface{}{"parm1","parm2"}// Parameters if needed
row, err := DBC.Query(proc, parms...)
if err != nil {
log.Fatal("Prepare failed:", err.Error())
}
_, rows, r_err := DBScan_fn(row)
if r_err != nil {
fmt.Println("no data found err")
return
}
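For what it's worth, with the denisenkom/go-mssqldb driver from the first question, a stored procedure that returns rows can also be queried directly; a sketch:

// Sketch: call the procedure through Query and scan its result set.
rows, err := DBC.Query("exec Dbo.vijaystoredprocedure ?, ?, ?, ?",
	"parm1", "parm2", "parm3", "parm4")
if err != nil {
	log.Fatal(err.Error())
}
defer rows.Close()
for rows.Next() {
	// scan the procedure's result columns here with rows.Scan(...)
}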
I want to update data in a remote DB table and then do further tasks, but I cannot get it to work.
Using the same code with an insert query, I am able to insert values into the same table, where I get a response very fast and move on to further tasks.
But with the update query, the values do get updated in the table, yet execution never moves on.
Here is the sample code that I have tried:
package main

import (
	"database/sql"
	"fmt"
	"log"
	"net"

	_ "github.com/go-sql-driver/mysql" // blank import: the driver registers itself via init
)

const (
	DB_NAME = "test_db"
	DB_HOST = "remote db ip address:port"
	DB_USER = "username"
	DB_PASS = "password"
)

const (
	bufferSize int    = 1024
	port       string = ":23456"
)

func main() {
	udpAddr, err := net.ResolveUDPAddr("udp6", port)
	var s Server // Server type is defined elsewhere in my app
	s.conn, err = net.ListenUDP("udp4", udpAddr)

	fmt.Println("Trying to connect remote db")
	dsn := DB_USER + ":" + DB_PASS + "@tcp(" + DB_HOST + ")/" + DB_NAME + "?charset=utf8"
	db, err := sql.Open("mysql", dsn)
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()
	fmt.Println("Connected db")

	rows, err := db.Query("UPDATE user_name_table SET user_id = ? WHERE user_name = ? ", "abcd", "admin")
	defer rows.Close()
	if err != nil {
		log.Fatal(err)
		fmt.Println("error in updating values")
	} else {
		fmt.Println("updated successfully in db")
	}

	// not coming here for doing further tasks
}
Also, I have tried updating the DB table from a PHP web service and it works fine, but using Go it neither gives a response after updating the values nor moves on.
Please help me.
Thanks in advance.
This happens because log.Fatal(err) calls os.Exit(1). In other words, it terminates your program whenever err != nil.
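For reference, a minimal sketch of what log.Fatal does, per the standard library documentation:

// log.Fatal(err) is equivalent to:
log.Print(err)
os.Exit(1) // exits immediately; deferred calls such as rows.Close() never run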
Found the solution. I just removed the unwanted rows returned for the update query, and everything is working fine again:
_, err := db.Query("UPDATE user_name_table SET user_id = ? WHERE user_name = ? ", "abcd", "admin")
// defer rows.Close()
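As a further cleanup (a sketch, not from the original post): for statements that return no rows, database/sql's Exec is the idiomatic call, and its sql.Result tells you how many rows were affected:

res, err := db.Exec("UPDATE user_name_table SET user_id = ? WHERE user_name = ? ", "abcd", "admin")
if err != nil {
	log.Fatal(err)
}
affected, err := res.RowsAffected()
if err != nil {
	log.Fatal(err)
}
fmt.Println("rows affected:", affected) // nothing to Close, so execution continues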