I have the following query:
SELECT ... ,
grade as [grade],
grade as [grade2]
FROM dbo.[qc_runs] r
JOIN ...
WHERE ...
I send it to MS SQL Server 2014 from my Go code and want to get the data back (I am using the github.com/denisenkom/go-mssqldb driver). However, I can read the first grade value (type nvarchar(max)), but the second one arrives empty! These are the same table field, just duplicated. If I delete the first grade value from the query and leave just one, it still arrives empty! The column is defined as follows:
[grade] [nvarchar](max) NULL,
SQL Server Management Studio executes this query just fine and both grade values are non-empty, but the Go code fails!
UPDATE #1
Go code:
evaluations, err := db.Query("EXEC qa.dbo.sp_get_evaluation_list ?", uid)
if err != nil {
if err == sql.ErrNoRows {
return list, nil
}
return list, err
}
// Get column names
colNames, err := evaluations.Columns()
if err != nil {
log.Logf("Failed to get columns: %v", err)
return list, err
}
// readCols holds one scan destination per column.
readCols := make([]interface{}, len(colNames))
// Read data
for evaluations.Next() {
writeCols := make([]string, len(colNames))
for i := range writeCols {
readCols[i] = &writeCols[i]
}
evaluations.Scan(readCols...)
for i := range writeCols {
fmt.Println(colNames[i], ": ", writeCols[i])
}
...
}
Output:
...
grade : <some text from DB>
grade2 :
I'm not a Go programmer, but you need to name those fields differently. How could your code possibly discern between the first and second [grade]?
SELECT ... ,
grade as [grade],
grade as [grade2]
FROM dbo.[qc_runs] r
JOIN ...
WHERE ...
Given that you're getting an empty value on the second read, I suspect that the Go driver (and the data reader you're using from it) is one-way only.
Problem solved: evaluations.Scan(readCols...) returned the error sql: Scan error on column index 3: unsupported Scan, storing driver.Value type <nil> into type *string, and left the rest of the array blank.
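For anyone hitting the same thing, here is a minimal sketch of the two changes that surface and fix it, assuming the same column layout and the surrounding variables (list, colNames, evaluations) from the code above: check the error returned by Scan, and scan the nullable nvarchar(max) columns into sql.NullString instead of string.
// Scan into sql.NullString so NULL columns don't abort the whole Scan.
readCols := make([]interface{}, len(colNames))
writeCols := make([]sql.NullString, len(colNames))
for i := range writeCols {
    readCols[i] = &writeCols[i]
}
for evaluations.Next() {
    // Always check the error: a failed Scan leaves the later destinations blank.
    if err := evaluations.Scan(readCols...); err != nil {
        return list, err
    }
    for i, col := range writeCols {
        if col.Valid {
            fmt.Println(colNames[i], ": ", col.String)
        } else {
            fmt.Println(colNames[i], ": NULL")
        }
    }
}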
I am trying to run a Postgres IF statement from my Golang project, but I ran into this error. Could you help me figure it out?
The code is:
newDate := "2022-06-22"
query := `
DO $$
DECLARE
new_date date:= $1;
BEGIN
IF EXISTS (SELECT * FROM systemtable WHERE date = new_date) THEN
UPDATE systemtable SET is_latest = TRUE WHERE date = new_date;
ELSE
INSERT INTO systemtable (date, is_latest) VALUES (new_date, TRUE);
END IF;
END$$;`
if _, err := txi.Exec(query, newDate); err != nil {
return err
}
Then the error returned is "pq: bind message supplies 1 parameters, but prepared statement "" requires 0"
A DO block cannot take bind parameters, which is why the driver reports that the prepared statement requires 0 parameters. Do the job with two separate statements within a transaction instead. That way you preserve consistency and don't have to perform any business logic on the database side.
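A minimal sketch of that approach, assuming txi is a *sql.Tx and systemtable holds at most one row per date:
newDate := "2022-06-22"
// Try to mark an existing row first.
res, err := txi.Exec(`UPDATE systemtable SET is_latest = TRUE WHERE date = $1`, newDate)
if err != nil {
    return err
}
affected, err := res.RowsAffected()
if err != nil {
    return err
}
// No row for that date yet, so insert one.
if affected == 0 {
    if _, err := txi.Exec(`INSERT INTO systemtable (date, is_latest) VALUES ($1, TRUE)`, newDate); err != nil {
        return err
    }
}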
I found two ways to insert data into SQL Server with GORM.
GORM.DB.Exec("insert into [tableA] (value1,value2) VALUES (?,?)", v1, v2). It works.
GORM.DB.Create(&myDataStruct).Error. This reports an error with the message "LastInsertId is not supported. Please use the OUTPUT clause or add select ID = convert(bigint, SCOPE_IDENTITY()) to the end of your query."
I understand what this message means, but I don't know how to write the code for it.
Thanks for any help.
My code follows:
db := mssql.GetMssqlDB()
defer db.Close()
newData := mssql.People{
Name: "Tom",
Age: 12,
}
err := db.Create(&newData).Error
if err != nil {
fmt.Println(err)
}
and data struct
type People struct {
ID int64 `gorm:"primary_key;column:id"`
Name string `gorm:"column:name"`
Age int `gorm:"column:age"`
}
func (p People) TableName() string {
return "dbo.people"
}
This is a bug and has been fixed since https://github.com/go-gorm/gorm/pull/2690.
If you get the same error, try updating your gorm and it should work.
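If updating GORM is not an option, a rough sketch of the OUTPUT-clause workaround hinted at by the error message could look like the following; it uses raw SQL with the table and columns from the question and is an assumption, not a confirmed GORM recipe:
// Insert with raw SQL and read the generated identity back explicitly,
// since the mssql dialect in this GORM version cannot use LastInsertId.
var newID int64
err := db.Raw(
    "INSERT INTO dbo.people (name, age) OUTPUT INSERTED.id VALUES (?, ?)",
    "Tom", 12,
).Row().Scan(&newID)
if err != nil {
    fmt.Println(err)
}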
I have a PostgreSQL function that basically returns a number but, as you can see, also receives an array of strings.
Create Or Replace Function fnRegisterUserRoleArray(idUserFather int, rolesArray Text[]) returns int language plpgsql
as
$body$
declare ids INT;
declare roleID INT;
declare sanitazedRole TEXT;
declare counter int = 0;
begin
if(empty2null(idUserFather::text) is null) then
ids := 0;
elsif exists( select 1 from win_users where id_user = idUserFather limit 1) then
for counter in 1 .. array_upper(rolesArray, 1)
loop
select id_role from win_roles where rolename = rolesArray[counter] into roleID;
insert into win_user_role(id_user, id_role) values (idUserFather, roleID);
ids := ids + 1;
end loop;
else
ids := 0;
end if;
return ids;
end $body$;
and in my Go function I have the variable database, which holds the connection to the PostgreSQL database,
database, err := getConnection()
However, when I call the function fnRegisterUserRoleArray and send the values, I receive an error.
This is how I set the values:
var resultRole int
err = database.QueryRow("Select fnRegisterUserRoleArray($1, $2);", resultUser, pq.Array(roleArray)).Scan(&resultRole, &int)
fmt.Println(resultRole)
if err != nil {
fmt.Println(resultRole)
return nil, err
}
if resultRole != 0 && resultRole != 1 {
response = response + "their roles has been assigned correctly "
} else {
response = response + "however there was an error during the assignation of the role."
}
and the output that I receive is this:
"message": "sql: Scan error on column index 0: converting driver.Value
type (\"\") to a int: invalid syntax",
But the values get stored in my database, so the function receives the values correctly; it is the return value that blows up :(
This only occurs when I send an array; if I send anything other than an array, the scanner returns the ID obtained from the PostgreSQL function.
Is there any way to obtain the result ID while still sending the array, in a more elegant way?
These are the values contained in roleArray:
[Carrier Brand]
And the value of resultUser is just a number representing the ID of the user that has been registered in the database; in this case the registered user has an ID of...
result user: 63
Thanks! :)
As the error says:
"message": "sql: Scan error on column index 0: converting driver.Value
type (\"\") to a int: invalid syntax",
The error occurs when scanning the value returned from the function.
Remove the second scan destination (the pointer to the int primitive type), since the function returns only a single value, of type int.
Scan the value into an int variable:
var resultRole int
err = database.QueryRow("Select fnRegisterUserRoleArray($1, $2);", resultUser, pq.Array(roleArray)).Scan(&resultRole)
For handling these kinds of situations, it is better to separate scanning the result from the query itself:
sqlStatement := `Select fnRegisterUserRoleArray($1, $2);`
var resultRole int
// QueryRow returns at most one row; Scan reports sql.ErrNoRows if there is none.
row := database.QueryRow(sqlStatement, resultUser, pq.Array(roleArray))
switch err := row.Scan(&resultRole); err {
case sql.ErrNoRows:
fmt.Println("No rows were returned!")
case nil:
fmt.Println(resultRole)
default:
panic(err)
}
Separating QueryRow from Scan in this way makes it easier to reason about the result and the returned errors.
I've seen people execute queries in two ways using Go's built-in database/sql package. One of them is using fmt.Sprintf:
func (db *DB) CreateUserTable() (sql.Result, error) {
statement := "CREATE TABLE %s (%s, %s, %s, %s, %s)"
v := []interface{}{"User", "ID int PRIMARY KEY NOT NULL", "Name varchar(100) UNIQUE", "Email varchar(100) UNIQUE", "Address varchar(100) ", "Username varchar(100) UNIQUE"}
return db.Exec(fmt.Sprintf(statement, v...))
}
and the other one is using a prepared statement:
func (db *DB) CreateUserTable() (sql.Result, error) {
statement, err := db.Prepare("INSERT INTO User(tbl1,tbl2,tbl3) VALUES(?,?,?)")
if err != nil {
log.Fatal(err)
}
return statement.Exec("value1", "value2", "value3")
}
The first gives you the benefit of dynamically setting the table name, the column names, and the values, while the second parameterizes only the values. What's the difference? Which one should I use?
Never build SQL from strings that come from outside your system.
Always use the ? syntax.
If you must set SQL parts like table names, prepare multiple complete SQL statements that contain ? for the values, then select which statement to execute, perhaps based on user input, but never build SQL from user input.
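For example, a minimal sketch of that idea (the table names, column names, and the insertStatements map are made up for illustration; database/sql and fmt are assumed to be imported):
// Complete, fixed SQL statements chosen by key; only the values are parameterized.
var insertStatements = map[string]string{
    "User":    "INSERT INTO User(col1, col2, col3) VALUES(?, ?, ?)",
    "Account": "INSERT INTO Account(col1, col2, col3) VALUES(?, ?, ?)",
}

func insertInto(db *sql.DB, table string, v1, v2, v3 string) (sql.Result, error) {
    stmt, ok := insertStatements[table]
    if !ok {
        // Reject anything that is not one of our known statements.
        return nil, fmt.Errorf("unknown table %q", table)
    }
    return db.Exec(stmt, v1, v2, v3)
}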
It is cleaner to use prepared statements, so that whenever a requirement changes you can easily modify the statements, and also to prevent SQL injection.
Prepared statements are much better than concatenating strings, for all the usual reasons (avoiding SQL injection attacks, for example).
In MySQL, the parameter placeholder is ?, and in PostgreSQL it is $N,
where N is a number. SQLite accepts either of these.
One more thing: prepared statements suit repetitive work; they can be executed multiple times and then closed.
stmt, err := db.Prepare("select id, name from users where id = ?")
if err != nil {
log.Fatal(err)
}
defer stmt.Close() // closing the statement
rows, err := stmt.Query(1)
if err != nil {
    log.Fatal(err)
}
defer rows.Close()
And in your first example you are using a slice of empty interfaces:
func (db *DB) CreateUserTable() (sql.Result, error) {
statement := "CREATE TABLE %s (%s, %s, %s, %s, %s)"
v := []interface{}{"User", "ID int PRIMARY KEY NOT NULL", "Name varchar(100) UNIQUE", "Email varchar(100) UNIQUE", "Address varchar(100) ", "Username varchar(100) UNIQUE"}
return db.Exec(fmt.Sprintf(statement, v...))
}
which can take any type of parameter under the hood, and that can be vulnerable.
For more detailed information, see this link.
In an application I have a globally scoped
var db *sql.DB
that is later called with
slcstrSource, slcint64Timestamp, slcstrContent, err := DB_functions.GetContent(db)
if err != nil {
fmt.Println("Error: " + err.Error())
}
GetContent is this:
func GetContent(db *sql.DB) ([]string, []int64, []string, error) {
var slcstrContent []string
var slcint64Timestamp []int64
var slcstrSource []string
// Run the query
rows, err := db.Query("SELECT source, timestamp, content FROM MyDatabase.MyTable")
if err != nil {
return slcstrSource, slcint64Timestamp, slcstrContent, err
}
defer rows.Close()
for rows.Next() {
// Holding variables for the content in the columns
var source, content string
var timestamp int64
// Get the results of the query
err := rows.Scan(&source, &timestamp, &content)
if err != nil {
return slcstrSource, slcint64Timestamp, slcstrContent, err
}
// Append them into the slices that will eventually be returned to the caller
slcstrSource = append(slcstrSource, source)
slcstrContent = append(slcstrContent, content)
slcint64Timestamp = append(slcint64Timestamp, timestamp)
}
return slcstrSource, slcint64Timestamp, slcstrContent, nil
}
When I run the application and this code is hit, I get:
Error: mssql: Invalid object name 'MyDatabase.MyTable'.
When I db.Ping() the database, it seems to work. From what I've narrowed down, the error happens right at the query, but I can't find what's wrong. I checked the database: there is a database called MyDatabase with a table called MyTable, and the table has data in those three columns...
Is there something I'm missing before making the query, or in making the query itself?
I checked the database and there is a database called MyDatabase with
a table called MyTable and the table has information in those three
columns...
It seems like the driver is working just as it should. In order to query a table in SQL Server you should use [Database].[Schema].[TableName]. If you have not defined a particular schema for your table, it is created under the default dbo schema.
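For example, a fully qualified query reusing the names from the question would look like this (the default dbo schema is assumed, and the return values are those of GetContent above):
// Fully qualified three-part name: [Database].[Schema].[Table].
rows, err := db.Query("SELECT source, timestamp, content FROM [MyDatabase].[dbo].[MyTable]")
if err != nil {
    return slcstrSource, slcint64Timestamp, slcstrContent, err
}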
That said, you don't really need to specify the database name in the query; you define that in the connection string instead. I'm not sure how you have set up your connection details, but have a look at the example below and adapt it to your needs.
var (
    debug    = flag.Bool("debug", false, "enable debugging")
    password = flag.String("password", "mypwd", "the database password")
    port     = flag.Int("port", 1433, "the database port")
    server   = flag.String("server", "MyServer", "the database server")
    user     = flag.String("user", "MyUser", "the database user")
)

func main() {
    flag.Parse()
    // The database name lives in the connection string (Initial Catalog),
    // not in the query itself.
    connStr := fmt.Sprintf("server=%s;Initial Catalog=MyDatabase;user id=%s;password=%s;port=%d",
        *server, *user, *password, *port)
    db, err := sql.Open("mssql", connStr)
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()
}
Then you can query your table like this:
rows, err := db.Query("SELECT source, timestamp, content FROM MySchema.MyTable")