Using go-pg to retrieve virtual columns from Postgres

I'm using go-pg (https://github.com/go-pg/pg) and this code:
type Book struct {
    ID   int
    Name string
}

var books []Book
err := db.Model(&books).Select()
and everything works well, but I need to add a "virtual" column like this:
concat ('info:', 'id:', id, '...') AS info
and I tried to use:
query.ColumnExpr("concat ('info:', 'id:', id, '...') AS info")
but:
go-pg complains with: error="pg: can't find column=info in model=Book (try discard_unknown_columns)"
go-pg no longer includes the id and name columns in the query: it selects ONLY the concat expression
I can understand that, because go-pg no longer knows how to bind the data, but I really need that string, and I can only retrieve it from the DB.
Is there a way?
Can I use a custom type like this below?
type CustomBook struct {
    Info string
    Book
}
Does this make sense?

This approach could work for you:
type Book struct {
    ID   int
    Name string
    Info string `pg:"-"`
}
...
db.Model(&books).ColumnExpr("book.*").ColumnExpr("CONCAT('id:', id, 'name:', name) AS info").Select()
pg:"-" ignores the struct field and it is not created nor it produces any errors
this ignored column is documented here: https://pg.uptrace.dev/models/
Another approach, depending on your requirements, could look like this:
var r []struct {
    Name string
    Info string
}

db.Model((*Book)(nil)).Column("name").ColumnExpr("CONCAT('id:', id, 'name:', name) AS info").Select(&r)
This second approach is documented here: https://pg.uptrace.dev/queries/
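For completeness, here is a minimal end-to-end sketch of that second approach with a connection wired in; the pg.Options values, the go-pg v10 import path, and the books table contents are assumptions for illustration, not part of the original answer:

package main

import (
    "fmt"

    "github.com/go-pg/pg/v10"
)

// Book is only used so go-pg can derive the table name ("books").
type Book struct {
    ID   int
    Name string
}

func main() {
    // Assumed connection settings; adjust to your environment.
    db := pg.Connect(&pg.Options{
        User:     "postgres",
        Database: "mydb",
    })
    defer db.Close()

    // Plain columns plus the "virtual" info column, scanned into an ad-hoc struct.
    var rows []struct {
        ID   int
        Name string
        Info string
    }
    err := db.Model((*Book)(nil)).
        Column("id", "name").
        ColumnExpr("concat('info:', 'id:', id) AS info").
        Select(&rows)
    if err != nil {
        panic(err)
    }
    for _, r := range rows {
        fmt.Println(r.ID, r.Name, r.Info)
    }
}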

Related

Filter db rows with GORM preload

Currently, I have this struct:
type Token struct {
    gorm.Model
    Name         string             `gorm:"column:name"`
    Enabled      bool               `gorm:"column:enabled"`
    Symbol       string             `gorm:"column:symbol"`
    TokenDetails []*EarnTokenDetail `gorm:"foreignkey:TokenID;references:ID"`
}

type EarnTokenDetail struct {
    gorm.Model
    Name    string     `gorm:"column:name"`
    TokenID uint64     `gorm:"column:token_id"`
    Enabled bool       `gorm:"column:enabled"`
    Chains  *EarnChain `gorm:"foreignkey:ID;references:ChainID"`
}

type EarnChain struct {
    ID      uint64 `gorm:"primary_key;column:id"`
    Enabled bool   `gorm:"column:enabled"`
    Name    string `gorm:"column:name"`
}
And this GORM query:
var tokens []*model.Token

result := e.db.
    WithContext(ctx).
    Preload("TokenDetails", "token_details.enabled = true").
    Preload("TokenDetails.Chains", "chains.enabled = true").
    Find(&tokens, "tokens.enabled = true")
It works fine when everything is enabled, but when I disable chains in the database, the result will still show the tokens with disabled chains, with the Chains field empty.
How can I filter out those rows while still using preload?
According to the GORM documentation, that's the expected behavior. Preload runs one query after another, so there is no way to "drop" results of the earlier queries. If you don't want these results at all, not even with an empty Chains field, consider using Joins() to filter them out (a sketch follows below). I think I even found a comment pointing that out: How to do multiple table joins in GORM
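A minimal sketch of that Joins-based filtering, reusing e.db, ctx, and the models from the question; the chain_id join column and the exact table names are assumptions here, so adjust them to your schema:

var tokens []*model.Token

result := e.db.
    WithContext(ctx).
    Distinct("tokens.*").
    // Inner joins drop tokens whose details or chains are disabled.
    Joins("JOIN token_details ON token_details.token_id = tokens.id AND token_details.enabled = true").
    Joins("JOIN chains ON chains.id = token_details.chain_id AND chains.enabled = true").
    // Preloads still control what ends up in TokenDetails / Chains.
    Preload("TokenDetails", "token_details.enabled = true").
    Preload("TokenDetails.Chains", "chains.enabled = true").
    Find(&tokens, "tokens.enabled = true")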

Fetching a struct and other data in one query

I have the following table schema:
user
----------
id        uuid
name      string

user_model
----------
id        uuid
user_id   uuid
model_id  uuid
role      int

model
----------
id        uuid
name      string
model_no  string
I have the following code which fetches the data from the "model" table.
underlyingModel := &model{}
var model IModel
model = underlyingModel
role := 0

db.Table("model").
    Joins("INNER JOIN user_model ON user.id = user_model.uuid").
    Joins("INNER JOIN model ON user.id = model_id").
    Find(&model)
In my actual code, the model can be many different struct types with different fields, they're all behind the IModel interface.
What I want to do is to fetch that extra role field from the user_model in one query. Something like .Find(&model, &role).
Is it possible using Gorm?
One possible solution is to create an anonymous struct to put the results in, in combination with the Select() method.
var selectModel struct {
    ID      string // I'm assuming uuid maps to a string
    Name    string
    ModelNo string
    Role    int
}

db.Table("model").
    Joins("INNER JOIN user_model ON user.id = user_model.uuid").
    Joins("INNER JOIN model ON user.id = model_id").
    Select("model.id, model.name, model.model_no, user_model.role").
    Find(&selectModel)
Basically, you create an anonymous struct in the selectModel variable, containing all the fields you want returned. Then you need an explicit Select() because some of the fields are not part of the model table.
Here you can find more info on Smart Select Fields in GORM.
EDIT:
Based on additional info from the comments, there is a solution that might work.
Your IModel interface could have two methods in its signature, one to extract a string for the SELECT part of the SQL query, and the other one to get a pointer of the selectModel that you would use in the Find method.
type IModel interface {
    SelectFields() string
    GetSelectModel() interface{}
}
The implementation would go something like this:
func (m *model) SelectFields() string {
    return "model.id, model.name, model.model_no, user_model.role"
}

func (m *model) GetSelectModel() interface{} {
    return &m.selectModel
}

type model struct {
    selectModel
    ID  uint64
    Age int
}

type selectModel struct {
    Name  string
    Email string
}
Then, your query could look something like this:
var m IModel
m = &model{}

db.Table("model").
    Joins("INNER JOIN user_model ON user.id = user_model.uuid").
    Joins("INNER JOIN model ON user.id = model_id").
    Select(m.SelectFields()).
    Find(m.GetSelectModel())

Is there a simpler way to decode this json in Go?

I am trying to parse some JSON from Jira into variables. This uses the go-jira package (https://godoc.org/github.com/andygrunwald/go-jira).
Currently I have some code to get the developer:
dev := jiraIssue.Fields.Unknowns["customfield_11343"].(map[string]interface{})["name"]
and this to get the team they are part of:
team := jiraIssue.Fields.Unknowns["customfield_12046"].([]interface{})[0].(map[string]interface{})["value"]
Getting the team they are on is a bit gross; is there a cleaner way to get the team besides having to type assert, index into the slice, then type assert again?
Here is the complete JSON (modified, but the structure is the same; the original is way too long):
{
   "expand":"renderedFields,names,schema,operations,editmeta,changelog,versionedRepresentations",
   "id":"136944",
   "self":"https://jira.redacted.com/rest/api/2/issue/136944",
   "key":"RM-2506",
   "fields":{
      "customfield_11343":{
         "self":"https://redacted.com/rest/api/2/user?username=flast",
         "name":"flast",
         "key":"flast",
         "emailAddress":"flast@redacted.com",
         "displayName":"first last",
         "active":true,
         "timeZone":"Europe/London"
      },
      "customfield_12046":[
         {
            "self":"https://jira.redacted.com/rest/api/2/customFieldOption/12045",
            "value":"diy",
            "id":"12045"
         }
      ]
   }
}
Thanks
The way I go about problems like this is:
Copy some JSON with things I am interested in and paste it into https://mholt.github.io/json-to-go/
Remove fields that aren't of interest.
Just read the data and unmarshal.
You might end up with something like this given the two custom fields of interest, but you can cut the structure down further if you just need the name.
type AutoGenerated struct {
    Fields struct {
        Customfield11343 struct {
            Self         string `json:"self"`
            Name         string `json:"name"`
            Key          string `json:"key"`
            EmailAddress string `json:"emailAddress"`
            DisplayName  string `json:"displayName"`
            Active       bool   `json:"active"`
            TimeZone     string `json:"timeZone"`
        } `json:"customfield_11343"`
        Customfield12046 []struct {
            Self  string `json:"self"`
            Value string `json:"value"`
            ID    string `json:"id"`
        } `json:"customfield_12046"`
    } `json:"fields"`
}
The effect you get is that all extra information in the feed is discarded, but you get the data you want very cleanly.
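For illustration, a short sketch of how the two lookups then become plain field access; raw is assumed to hold the JSON bytes from Jira, and "encoding/json", "fmt", and "log" are assumed to be imported:

var issue AutoGenerated
if err := json.Unmarshal(raw, &issue); err != nil {
    log.Fatal(err)
}
dev := issue.Fields.Customfield11343.Name      // replaces the map[string]interface{} assertion
team := issue.Fields.Customfield12046[0].Value // replaces the []interface{} + map assertion
fmt.Println(dev, team)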
This is a tough one, since the second field is in array form, which makes it hard to use a map.
For the first one, it's simple enough to use:
type JiraCustomField struct {
    Self         string `json:"self"`
    Name         string `json:"name"`
    Key          string `json:"key"`
    EmailAddress string `json:"emailAddress"`
    DisplayName  string `json:"displayName"`
    Active       bool   `json:"active"`
    TimeZone     string `json:"timeZone"`
}

type JiraPayload struct {
    Expand string                     `json:"expand"`
    ID     string                     `json:"id"`
    Key    string                     `json:"key"`
    Fields map[string]JiraCustomField `json:"fields"`
}
https://play.golang.org/p/y8-g6r0kInV
Specifically, note the Fields map[string]JiraCustomField part; for the second case it looks like you need it in array form, like Fields map[string][]JiraCustomField.
In a case like this, I think you'll need to write your own unmarshaler. This is a good tutorial: https://blog.gopheracademy.com/advent-2016/advanced-encoding-decoding/
What you could do in your custom unmarshaler/marshaler is check whether each field is an array or a single object (via reflection on an interface{} value, or by peeking at the raw JSON); if it's a single object, wrap it in an array and store it in Fields map[string][]JiraCustomField (see the sketch below).
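A minimal sketch of that idea, using a trimmed-down JiraCustomField (only the fields used here); instead of reflection it peeks at the first byte of each raw field to decide between object and array. This is an illustration only, not part of go-jira:

package main

import (
    "encoding/json"
    "fmt"
)

type JiraCustomField struct {
    Self  string `json:"self"`
    Name  string `json:"name"`
    Value string `json:"value"`
    ID    string `json:"id"`
}

type JiraPayload struct {
    Expand string                       `json:"expand"`
    ID     string                       `json:"id"`
    Key    string                       `json:"key"`
    Fields map[string][]JiraCustomField `json:"fields"`
}

// UnmarshalJSON normalizes every entry of "fields" to a slice:
// single objects are wrapped in a one-element array before decoding.
func (p *JiraPayload) UnmarshalJSON(data []byte) error {
    var raw struct {
        Expand string                     `json:"expand"`
        ID     string                     `json:"id"`
        Key    string                     `json:"key"`
        Fields map[string]json.RawMessage `json:"fields"`
    }
    if err := json.Unmarshal(data, &raw); err != nil {
        return err
    }
    p.Expand, p.ID, p.Key = raw.Expand, raw.ID, raw.Key
    p.Fields = make(map[string][]JiraCustomField, len(raw.Fields))
    for name, msg := range raw.Fields {
        var many []JiraCustomField
        if len(msg) > 0 && msg[0] == '[' { // already an array
            if err := json.Unmarshal(msg, &many); err != nil {
                return err
            }
        } else { // single object: wrap it in a slice
            var one JiraCustomField
            if err := json.Unmarshal(msg, &one); err != nil {
                return err
            }
            many = []JiraCustomField{one}
        }
        p.Fields[name] = many
    }
    return nil
}

func main() {
    blob := []byte(`{"fields":{"customfield_11343":{"name":"flast"},"customfield_12046":[{"value":"diy"}]}}`)
    var p JiraPayload
    if err := json.Unmarshal(blob, &p); err != nil {
        panic(err)
    }
    fmt.Println(p.Fields["customfield_11343"][0].Name, p.Fields["customfield_12046"][0].Value)
}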

Decimal type in Go and Postgresql with gorm

So I'm creating an API and I need to store the price of something.
I'm using GORM and gormigrate for my database migration.
I'm just wondering which type I should use for storing decimals. I've read somewhere that I shouldn't use floats when storing currencies.
type MyStruct struct {
    Name        string                     `json:"name" gorm:"not null"`
    Description string                     `json:"description" gorm:"null"`
    Price       <what type should be here> `json:"price"`
}
So, based on the suggestion of @ain, I used shopspring/decimal. But it was giving me an error when I ran AutoMigrate.
It turns out that I only needed to set the type to numeric using a gorm tag to make it work:
type MyStruct struct {
    Name        string          `json:"name" gorm:"not null"`
    Description string          `json:"description" gorm:"null"`
    Price       decimal.Decimal `json:"price" gorm:"type:numeric"`
}
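For context, a minimal sketch of how this can be wired up end to end, assuming GORM v2 with the Postgres driver; the DSN and the added ID field are assumptions for illustration, and gormigrate would wrap the same AutoMigrate call:

package main

import (
    "github.com/shopspring/decimal"
    "gorm.io/driver/postgres"
    "gorm.io/gorm"
)

type MyStruct struct {
    ID          uint            `gorm:"primarykey"`
    Name        string          `json:"name" gorm:"not null"`
    Description string          `json:"description" gorm:"null"`
    Price       decimal.Decimal `json:"price" gorm:"type:numeric"`
}

func main() {
    // Assumed DSN; adjust to your environment.
    dsn := "host=localhost user=postgres dbname=mydb sslmode=disable"
    db, err := gorm.Open(postgres.Open(dsn), &gorm.Config{})
    if err != nil {
        panic(err)
    }

    // The gorm:"type:numeric" tag is what lets AutoMigrate create a numeric column.
    if err := db.AutoMigrate(&MyStruct{}); err != nil {
        panic(err)
    }

    // decimal.Decimal keeps the exact value, avoiding float rounding for currency.
    price, err := decimal.NewFromString("19.99")
    if err != nil {
        panic(err)
    }
    db.Create(&MyStruct{Name: "book", Description: "example", Price: price})
}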

Dapper Results (DapperRow) with Bracket Notation

According to the Dapper documentation, you can get a dynamic list back from Dapper using the code below:
var rows = connection.Query("select 1 A, 2 B union all select 3, 4");

((int)rows[0].A).IsEqualTo(1);
((int)rows[0].B).IsEqualTo(2);
((int)rows[1].A).IsEqualTo(3);
((int)rows[1].B).IsEqualTo(4);
What, however, is the use of dynamic if you have to know the field names and data types in advance?
If I have:
var result = Db.Query("Select * from Data.Tables");
I want to be able to do the following:
Get a list of the field names and data types returned.
Iterate over it using the field names and get the data back in the following ways:
result.Fields
["Id", "Description"]
result[0].values
[1, "This is the description"]
This would allow me to get
result[0].["Id"].Value
which would give the result 1 and be of type e.g. Int32
result[0].["Id"].Type --- what data type is the returned value
result[0].["Description"]
which would give the result "This is the description" and be of type string.
I see there is a results[0].table which has a DapperRow object with an array of the field names, and there is also a result.values which is an object[2] with the values in it, but it cannot be accessed. If I add a watch on the drilled-down column name, I can get the id. The automatically created watch is:
(new System.Collections.Generic.Mscorlib_CollectionDebugView<Dapper.SqlMapper.DapperRow>(result as System.Collections.Generic.List<Dapper.SqlMapper.DapperRow>)).Items[0].table.FieldNames[0] "Id" string
So I should be able to get result[0].Items[0].table.FieldNames[0] and get "Id" back.
You can cast each row to an IDictionary<string, object>, which should provide access to the names and the values. We don't explicitly track the types currently - we simply don't have a need to. If that isn't enough, consider using the dapper method that returns an IDataReader - this will provide access to the raw data, while still allowing convenient call / parameterization syntax.
For example:
var rows = ...
foreach(IDictionary<string, object> row in rows) {
Console.WriteLine("row:");
foreach(var pair in row) {
Console.WriteLine(" {0} = {1}", pair.Key, pair.Value);
}
}
