Does pgx offer any support for 'where in' clauses? I found in another Stack Overflow thread that one should use string concatenation to build the query manually. IMO this is a bit error-prone, though, as you have to take care of escaping/SQL injection and the like on your own.
I also tried to figure it out on my own:
const updatePurgedRecordingsStmt = "update recordings set status = 'DELETED', deleted = now() where status <> 'DELETED' and id in ($1);"

func (r *Repository) DeleteRecordings() error {
    pool, err := r.connPool()
    if err != nil {
        return errors.Wrap(err, "cannot establish connection")
    }
    pgRecIds := &pgtype.Int4Array{}
    if err := pgRecIds.Set([]int32{int32(1), int32(2)}); err != nil {
        return errors.Wrap(err, "id conversion failed")
    }
    if _, err = pool.Exec(updatePurgedRecordingsStmt, pgRecIds); err != nil {
        return errors.Wrap(err, "update stmt failed")
    }
    return nil
}
When I execute this code, I get the following error though:
ERROR: incorrect binary data format in bind parameter 1 (SQLSTATE 22P03)
The versions I am using:
Postgres:
db=> SELECT version();
version
-----------------------------------------------------------------------------------------------------------
PostgreSQL 9.6.11 on x86_64-pc-linux-gnu, compiled by gcc (GCC) 4.8.2 20140120 (Red Hat 4.8.2-16), 64-bit
(1 row)
PGX:
github.com/jackc/fake v0.0.0-20150926172116-812a484cc733 h1:vr3AYkKovP8uR8AvSGGUK1IDqRa5lAAvEkZG1LKaCRc=
github.com/jackc/fake v0.0.0-20150926172116-812a484cc733/go.mod h1:WrMFNQdiFJ80sQsxDoMokWK1W5TQtxBFNpzWTD84ibQ=
github.com/jackc/pgx v3.3.0+incompatible h1:Wa90/+qsITBAPkAZjiByeIGHFcj3Ztu+VzrrIpHjL90=
github.com/jackc/pgx v3.3.0+incompatible/go.mod h1:0ZGrqGqkRlliWnWB4zKnWtjbSWbGkVEFm4TeybAXq+I=
github.com/lib/pq v1.0.0 h1:X5PMW56eZitiTeO7tKzZxFCSpbFZJtkMMooicw2us9A=
github.com/lib/pq v1.0.0/go.mod h1:5WUZQaWbwv1U+lTReE5YruASi9Al49XbQIvNi/34Woo=
As you may already know, IN expects a list of scalar expressions, not an array; pgtype.Int4Array, however, represents an array, not a list of scalar expressions.
"IMO this is a bit error prone though, as you have to take care of escaping/sql injection and the like on your own. "
Not necessarily: you can loop over your slice, construct a string of parameter references, concatenate that to the query, and then execute it, passing the slice in expanded with ... .
var paramrefs string
ids := []interface{}{1, 2, 3, 4}
for i := range ids {
    paramrefs += `$` + strconv.Itoa(i+1) + `,`
}
paramrefs = paramrefs[:len(paramrefs)-1] // remove last ","
query := `UPDATE ... WHERE id IN (` + paramrefs + `)`
pool.Exec(query, ids...)
Alternatively, you can use ANY instead of IN.
ids := &pgtype.Int4Array{}
ids.Set([]int{1,2,3,4})
query := `UPDATE ... WHERE id = ANY ($1)`
pool.Exec(query, ids)
(Here you may have to cast the parameter reference to the appropriate array type; I'm not sure. Give it a try without the cast, and if that doesn't work, try it with the cast, as sketched below.)
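A minimal sketch of the cast variant, reusing the pool, errors.Wrap, and recordings table from the question (assumptions only, not tested against your schema):

// Sketch: ANY with an explicit cast on the array parameter; whether the
// cast is needed depends on how the driver transmits the array.
pgRecIds := &pgtype.Int4Array{}
if err := pgRecIds.Set([]int32{1, 2}); err != nil {
    return errors.Wrap(err, "id conversion failed")
}
query := `update recordings set status = 'DELETED', deleted = now()
          where status <> 'DELETED' and id = ANY ($1::int4[])`
if _, err := pool.Exec(query, pgRecIds); err != nil {
    return errors.Wrap(err, "update stmt failed")
}

If you build IN lists in more than one place, the parameter-reference loop from the first snippet can be extracted into a helper: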
func prepareWhereINString(count int) string {
    var paramrefs string
    for i := 0; i < count; i++ {
        paramrefs += `$` + strconv.Itoa(i+1) + `,`
    }
    paramrefs = paramrefs[:len(paramrefs)-1] // remove last ","
    return paramrefs
}
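For completeness, a hedged usage sketch of that helper; the []int32 to []interface{} conversion is my addition, since Exec takes variadic interface{} arguments:

recIDs := []int32{1, 2, 3}
args := make([]interface{}, len(recIDs))
for i, id := range recIDs {
    args[i] = id // Exec needs []interface{}, so copy the ids over
}
query := `update recordings set status = 'DELETED', deleted = now()
          where status <> 'DELETED' and id in (` + prepareWhereINString(len(recIDs)) + `)`
if _, err := pool.Exec(query, args...); err != nil {
    return errors.Wrap(err, "update stmt failed")
}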
Related
I am trying to run a Postgres IF statement in my Golang project, but I ran into an error. Could you help me figure it out?
The code is:
newDate := "2022-06-22"
query := `
DO $$
DECLARE
    new_date date := $1;
BEGIN
    IF EXISTS (SELECT * FROM systemtable WHERE date = new_date) THEN
        UPDATE systemtable SET is_latest = TRUE WHERE date = new_date;
    ELSE
        INSERT INTO systemtable (date, is_latest) VALUES (new_date, TRUE);
    END IF;
END$$;`
if _, err := txi.Exec(query, newDate); err != nil {
    return err
}
Then the error returned is "pq: bind message supplies 1 parameters, but prepared statement "" requires 0"
Do the job with two separate statements within a transaction. That way you preserve consistency and don't have to perform any business logic on the database side.
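A minimal sketch of the two-statement approach, assuming database/sql with the lib/pq driver, a *sql.Tx, and the table/columns from the question (the helper name is made up for illustration):

func upsertSystemDate(tx *sql.Tx, newDate string) error {
    // Try to flag an existing row first.
    res, err := tx.Exec(`UPDATE systemtable SET is_latest = TRUE WHERE date = $1`, newDate)
    if err != nil {
        return err
    }
    n, err := res.RowsAffected()
    if err != nil {
        return err
    }
    // No row matched the date, so insert a new one.
    if n == 0 {
        _, err = tx.Exec(`INSERT INTO systemtable (date, is_latest) VALUES ($1, TRUE)`, newDate)
    }
    return err
}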
I found two ways to insert data into SQL Server with GORM.
GORM.DB.Exec("insert into [tableA] (value1,value2) VALUES (?,?)", v1, v2). It works.
GORM.DB.Create(&myDataStruct).Error. This reports an error with the message "LastInsertId is not supported. Please use the OUTPUT clause or add select ID = convert(bigint, SCOPE_IDENTITY()) to the end of your query."
I understand what this message means, but I don't know how to code it.
Thanks for any help.
Following is my code:
db := mssql.GetMssqlDB()
defer db.Close()

newData := mssql.People{
    Name: "Tom",
    Age:  12,
}

err := db.Create(&newData).Error
if err != nil {
    fmt.Println(err)
}
and the data struct:
type People struct {
    ID   int64  `gorm:"primary_key;column:id"`
    Name string `gorm:"column:name"`
    Age  int    `gorm:"column:age"`
}

func (p People) TableName() string {
    return "dbo.people"
}
This is a bug and has been fixed since https://github.com/go-gorm/gorm/pull/2690.
If you get the same error, try updating your gorm and it should work.
I am writing an AWS Lambda to query 10 different tables from RDS (SQL Server) using the Golang SDK. What I have learned so far is that we have to create a matching struct for a table in order to query it. But as I want to query 10 tables, I don't want to create a struct for every table, especially as the table schemas may change someday.
Eventually, I want to create a CSV file per table as a backup of the queried data and upload it to S3. So is it possible to create the CSV file directly in the Lambda, so that I can upload it to S3 from there?
You can see my current code below:
func executeQuery(dbconnection *sql.DB) {
    println("\n\n----------Executing Query ----------")
    query := "select TOP 5 City,State,Country from IMBookingApp.dbo.Address"
    rows, err := dbconnection.Query(query)
    if err != nil {
        fmt.Println("Error:")
        log.Fatal(err)
    }
    println("rows", rows)
    defer rows.Close()
    count := 0
    for rows.Next() {
        var City, State, Country string
        err := rows.Scan(&City, &State, &Country)
        if err != nil {
            fmt.Println("Error reading rows: " + err.Error())
        }
        fmt.Printf("City: %s, State: %s, Country: %s\n", City, State, Country)
        count++
    }
}
This code only works for the Address table, not for other tables.
I have also tried it with GORM:
package main

import (
    "fmt"

    "github.com/jinzhu/gorm"
    _ "github.com/jinzhu/gorm/dialects/mssql"
)

type Currency struct {
    CurrencyId  int    `gorm:"column:CurrencyId;"`
    Code        string `gorm:"column:Code;"`
    Description string `gorm:"column:Description;"`
}

func main() {
    db, err := gorm.Open("mssql", "sqlserver://***")
    db.SingularTable(true)
    gorm.DefaultTableNameHandler = func(dbVeiculosGorm *gorm.DB, defaultTableName string) string {
        return "IMBookingApp.dbo.Currency"
    }
    fmt.Println("HasTable-Currency:", db.HasTable("ClientUser"))

    var currency Currency
    db.Debug().Find(&currency)
    fmt.Println("Currency:", currency)
    fmt.Println("Error", err)

    defer db.Close()
}
With both approaches, I couldn't find any way to make the code generic for multiple tables. I would appreciate it if anyone could give me some suggestions or point me to some resources.
I did not test this code, but it should give you an idea of how to fetch rows into a slice of strings.
defer rows.Close()

columns, err := rows.Columns()
if err != nil {
    panic(err)
}

for rows.Next() {
    receiver := make([]*string, len(columns))
    // Scan needs one destination argument per column, so wrap a pointer to
    // each element in a []interface{} and expand it with ...
    // (a pointer-to-pointer destination also lets NULLs come back as nil).
    dest := make([]interface{}, len(columns))
    for i := range receiver {
        dest[i] = &receiver[i]
    }
    err := rows.Scan(dest...)
    if err != nil {
        fmt.Println("Error reading rows: " + err.Error())
    }
}
Go internally converts many types into strings - https://github.com/golang/go/blob/master/src/database/sql/convert.go#L219
If the data cannot be converted, you have two options:
Easy: update your SQL query to return strings or string-compatible data.
Complicated: use a slice of interface{} instead of a slice of *string and fill it with default values of the correct type based on rows.ColumnTypes(). Later you will have to convert the real values into strings to save into the CSV (see the sketch after this list).
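A rough, untested sketch of the complicated option; it assumes database/sql plus the reflect and fmt packages are imported, and the concrete types returned by ScanType() depend on the driver:

columnTypes, err := rows.ColumnTypes()
if err != nil {
    panic(err)
}
for rows.Next() {
    dests := make([]interface{}, len(columnTypes))
    for i, ct := range columnTypes {
        // Allocate a pointer to whatever type the driver reports for this column.
        dests[i] = reflect.New(ct.ScanType()).Interface()
    }
    if err := rows.Scan(dests...); err != nil {
        fmt.Println("Error reading rows: " + err.Error())
        continue
    }
    record := make([]string, len(dests))
    for i, d := range dests {
        // Dereference and stringify each value for the CSV record.
        record[i] = fmt.Sprint(reflect.ValueOf(d).Elem().Interface())
    }
    fmt.Println(record)
}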
The code below worked for me:
conn, _ := getConnection() // Get database connection
rows, err := conn.Query(query)
if err != nil {
    fmt.Println("Error:")
    log.Fatal(err)
}
defer rows.Close()

columns, err := rows.Columns()
if err != nil {
    panic(err)
}

for rows.Next() {
    receiver := make([]string, len(columns))
    is := make([]interface{}, len(receiver))
    for i := range is {
        is[i] = &receiver[i]
        // each is[i] will be of type interface{} - compatible with Scan()
        // wrapping the underlying concrete *string values from `receiver`
    }
    err := rows.Scan(is...)
    if err != nil {
        fmt.Println("Error reading rows: " + err.Error())
    }
    fmt.Println("receiver", receiver)
}
Reference: sql: expected 3 destination arguments in Scan, not 1 in Golang
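Since the question also asks about producing a CSV file per table, here is a hedged sketch of the same row-reading loop, but writing each record out with encoding/csv instead of printing it (the file path is a placeholder; a Lambda can write to /tmp before uploading to S3; assumes os, log, and encoding/csv are imported):

f, err := os.Create("/tmp/table_backup.csv") // placeholder path
if err != nil {
    log.Fatal(err)
}
defer f.Close()

w := csv.NewWriter(f)
defer w.Flush()

if err := w.Write(columns); err != nil { // header row from rows.Columns()
    log.Fatal(err)
}
for rows.Next() {
    receiver := make([]string, len(columns))
    is := make([]interface{}, len(receiver))
    for i := range is {
        is[i] = &receiver[i]
    }
    if err := rows.Scan(is...); err != nil {
        log.Fatal(err)
    }
    if err := w.Write(receiver); err != nil { // one CSV record per row
        log.Fatal(err)
    }
}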
I have a PostgreSQL function that basically returns a number, but, as you can see, the function also receives an array of strings.
Create Or Replace Function fnRegisterUserRoleArray(idUserFather int, rolesArray Text[]) returns int language plpgsql
as
$body$
declare ids INT;
declare roleID INT;
declare sanitazedRole TEXT;
declare counter int = 0;
begin
    if (empty2null(idUserFather::text) is null) then
        ids := 0;
    elsif exists (select 1 from win_users where id_user = idUserFather limit 1) then
        for counter in 1 .. array_upper(rolesArray, 1)
        loop
            select id_role from win_roles where rolename = rolesArray[counter] into roleID;
            insert into win_user_role(id_user, id_role) values (idUserFather, roleID);
            ids := ids + 1;
        end loop;
    else
        ids := 0;
    end if;
    return ids;
end $body$;
and in my Go function I have a variable database that holds the connection to the PostgreSQL database,
database, err := getConnection()
However, when I call the function fnRegisterUserViewsPermission and send the values, I receive an error.
This is how I set the values:
var resultRole int
err = database.QueryRow("Select fnRegisterUserRoleArray($1, $2);", resultUser, pq.Array(roleArray)).Scan(&resultRole, &int)
fmt.Println(resultRole)
if err != nil {
    fmt.Println(resultRole)
    return nil, err
}
if resultRole != 0 && resultRole != 1 {
    response = response + "their roles has been assigned correctly "
} else {
    response = response + "however there was an error during the assignation of the role."
}
and the output that I receive is this:
"message": "sql: Scan error on column index 0: converting driver.Value
type (\"\") to a int: invalid syntax",
But the values do get stored in my database, so the function receives the values correctly; it is the return value where it goes bam :(
This only occurs when I send an array; if I send anything other than an array, the Scan returns the id obtained from the PostgreSQL function.
Is there any way to obtain the result id while also sending the array in a more elegant way?
These are the values that roleArray contains:
[Carrier Brand]
And the value that resultUser contains is just a number that represents the id of the user that has been registered in the database; in this case the registered user has an ID of...
result user: 63
Thanks! :)
As the error says:
"message": "sql: Scan error on column index 0: converting driver.Value
type (\"\") to a int: invalid syntax",
The error occurs when scanning the result returned from the function.
Remove the pointer to the int primitive type (&int), since only a single value, of int type, is returned from the function.
Scan the value into an int variable:
var resultRole int
err = database.QueryRow("Select fnRegisterUserRoleArray($1, $2);", resultUser, pq.Array(roleArray)).Scan(&resultRole)
For handling these types of situations, it is better to handle scanning the result separately from the query:
sqlStatement := `Select fnRegisterUserRoleArray($1, $2);`
var resultRole int
// QueryRow returns at most one row; sql.ErrNoRows signals that
// nothing was returned.
row := db.QueryRow(sqlStatement, resultUser, pq.Array(roleArray))
switch err := row.Scan(&resultRole); err {
case sql.ErrNoRows:
    fmt.Println("No rows were returned!")
case nil:
    fmt.Println(resultRole)
default:
    panic(err)
}
Using the above approach of separating QueryRow from Scan helps you analyze the result and the returned errors in more detail.
I have the following query:
SELECT ... ,
grade as [grade],
grade as [grade2]
FROM dbo.[qc_runs] r
JOIN ...
WHERE ...
I send it to MS SQL Server 2014 from my Go code and want to get the data back (I am using the github.com/denisenkom/go-mssqldb driver). However, I can read the first grade value (type nvarchar(max)), but the second one arrives empty! They are the same table field, just duplicated. If I delete the first grade value from the query and leave just one, it still arrives empty! The column is defined as follows:
[grade] [nvarchar](max) NULL,
SQL Server Management Studio executes this query just fine; both grade values are non-empty. The Go code, however, does not get the second one!
UPDATE #1
Go code:
evaluations, err := db.Query("EXEC qa.dbo.sp_get_evaluation_list ?", uid)
if err != nil {
    if err == sql.ErrNoRows {
        return list, nil
    }
    return list, err
}

// Get column names
colNames, err := evaluations.Columns()
if err != nil {
    log.Logf("Failed to get columns: %v", err)
    return list, err
}

// Result is your slice string.
readCols := make([]interface{}, len(colNames))

// Read data
for evaluations.Next() {
    writeCols := make([]string, len(colNames))
    for i := range writeCols {
        readCols[i] = &writeCols[i]
    }
    evaluations.Scan(readCols...)

    for i := range writeCols {
        fmt.Println(colNames[i], ": ", writeCols[i])
    }
    ...
}
Output:
...
grade : <some text from DB>
grade2 :
I'm not a Go programmer, but you need to name those fields differently. How could your code possibly discern between the first and second [grade]?
SELECT ... ,
grade as [grade],
grade as [grade2]
FROM dbo.[qc_runs] r
JOIN ...
WHERE ...
Given that you're getting an empty value on the second read, I suspect that the Go driver (and the data reader you're using from it) is one-way only.
Problem solved: evaluations.Scan(readCols...) returned the error "sql: Scan error on column index 3: unsupported Scan, storing driver.Value type <nil> into type *string" and left the rest of the array blank.
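For reference, a hedged sketch of one way to make that Scan tolerant of NULL columns: check the error it returns and scan into sql.NullString instead of string (based on the code in the update above; the changes are mine, not from the original post):

readCols := make([]interface{}, len(colNames))
writeCols := make([]sql.NullString, len(colNames)) // NullString survives NULL values
for i := range writeCols {
    readCols[i] = &writeCols[i]
}
for evaluations.Next() {
    if err := evaluations.Scan(readCols...); err != nil {
        fmt.Println("Scan failed:", err) // don't ignore the Scan error
        continue
    }
    for i, c := range writeCols {
        fmt.Println(colNames[i], ": ", c.String) // empty string when the column was NULL
    }
}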