Filter db rows with GORM preload

Currently, I have these structs:

type Token struct {
    gorm.Model
    Name         string             `gorm:"column:name"`
    Enabled      bool               `gorm:"column:enabled"`
    Symbol       string             `gorm:"column:symbol"`
    TokenDetails []*EarnTokenDetail `gorm:"foreignkey:TokenID;references:ID"`
}

type EarnTokenDetail struct {
    gorm.Model
    Name    string     `gorm:"column:name"`
    TokenID uint64     `gorm:"column:token_id"`
    Enabled bool       `gorm:"column:enabled"`
    Chains  *EarnChain `gorm:"foreignkey:ID;references:ChainID"`
}

type EarnChain struct {
    ID      uint64 `gorm:"primary_key;column:id"`
    Enabled bool   `gorm:"column:enabled"`
    Name    string `gorm:"column:name"`
}
And this GORM query:
var tokens []*model.Token
result := e.db.
    WithContext(ctx).
    Preload("TokenDetails", "token_details.enabled = true").
    Preload("TokenDetails.Chains", "chains.enabled = true").
    Find(&tokens, "tokens.enabled = true")
It works fine when everything is enabled, but when I disable a chain in the database, the result still includes the tokens whose chains are disabled, just with the Chains field left empty.
How can I filter out those rows while still using Preload?

According to the GORM documentation, that's the expected behavior. Preload runs one query after another, so there is no way to "drop" results of the earlier queries. If you don't want those results at all, not even with an empty Chains field, consider using Joins() to filter them out. I think I even found a comment pointing that out: How to multiple table joins in GORM
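For example, here is a minimal sketch of what that Joins-based filtering could look like. The join conditions and the token_details/chains table and column names (in particular token_details.chain_id) are assumptions based on the structs and Preload conditions above, so adjust them to your actual schema:

var tokens []*model.Token
result := e.db.
    WithContext(ctx).
    // Inner joins drop tokens whose details or chains are disabled (or missing).
    Joins("JOIN token_details ON token_details.token_id = tokens.id AND token_details.enabled = true").
    Joins("JOIN chains ON chains.id = token_details.chain_id AND chains.enabled = true").
    // The preloads still populate the nested structs for the rows that survive the joins.
    Preload("TokenDetails", "token_details.enabled = true").
    Preload("TokenDetails.Chains", "chains.enabled = true").
    // DISTINCT, because a token with several enabled details would otherwise appear once per join row.
    Distinct("tokens.*").
    Find(&tokens, "tokens.enabled = true")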

Related

Using go-pg to retrieve virtual columns from Postgres

I'm using go-pg (https://github.com/go-pg/pg) and this code:
type Book struct {
    ID   int
    Name string
}
var books []Book
err := db.Model(&books).Select()
and everything works well, but I need to add a "virtual" column like this:
concat ('info:', 'id:', id, '...') AS info
and I tried to use:
query.ColumnExpr("concat ('info:', 'id:', id, '...') AS info")
but:
go-pg complains with: error="pg: can't find column=info in model=Book (try discard_unknown_columns)"
go-pg no longer includes the id and name columns in the query; it selects ONLY the concat expression.
I can understand that, because go-pg no longer knows how to bind the data, but I really need that string, which I can only get from the database.
Is there a way?
Can I use a custom type like this below?
type CustomBook struct {
    Info string
    Book
}
Does this make sense?
this approach could work for you:
type Book struct {
    ID   int
    Name string
    Info string `pg:"-"`
}
...
db.Model(&books).ColumnExpr("book.*").ColumnExpr("CONCAT('id:', id, 'name:', name) AS info").Select()
pg:"-" ignores the struct field and it is not created nor it produces any errors
this ignored column is documented here: https://pg.uptrace.dev/models/
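As a quick usage sketch of that first approach (assuming db is the *pg.DB you already have and Book is the struct above):

var books []Book
err := db.Model(&books).
    ColumnExpr("book.*").
    ColumnExpr("CONCAT('id:', id, ' name:', name) AS info").
    Select()
if err != nil {
    panic(err)
}
for _, b := range books {
    fmt.Println(b.Info) // the computed "virtual" column ends up in the ignored field
}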
another approach, depending on your requirements, could be like this:

var r []struct {
    Name string
    Info string
}

db.Model((*Book)(nil)).Column("name").ColumnExpr("CONCAT('id:', id, 'name:', name) AS info").Select(&r)
this second one is documented here: https://pg.uptrace.dev/queries/

Is there a simpler way to decode this json in Go?

I am trying to parse some JSON from Jira to variables. This is using the go-jira package (https://godoc.org/github.com/andygrunwald/go-jira)
Currently I have some code to get the developer:
dev := jiraIssue.Fields.Unknowns["customfield_11343"].(map[string]interface{})["name"]
and this to get the team they are a part of:
team := jiraIssue.Fields.Unknowns["customfield_12046"].([]interface{})[0].(map[string]interface{})["value"]
Getting the team they are on is a bit gross. Is there a cleaner way to get the team besides having to type assert, index into the slice, then type assert again?
Here is the complete JSON (modified, but the structure is the same; the original is way too long):
{
    "expand":"renderedFields,names,schema,operations,editmeta,changelog,versionedRepresentations",
    "id":"136944",
    "self":"https://jira.redacted.com/rest/api/2/issue/136944",
    "key":"RM-2506",
    "fields":{
        "customfield_11343":{
            "self":"https://redacted.com/rest/api/2/user?username=flast",
            "name":"flast",
            "key":"flast",
            "emailAddress":"flast@redacted.com",
            "displayName":"first last",
            "active":true,
            "timeZone":"Europe/London"
        },
        "customfield_12046":[
            {
                "self":"https://jira.redacted.com/rest/api/2/customFieldOption/12045",
                "value":"diy",
                "id":"12045"
            }
        ]
    }
}
Thanks
The way I go about problems like this is:
Copy some JSON with things I am interested in and paste it into https://mholt.github.io/json-to-go/
Remove fields that aren't of interest.
Just read the data and unmarshal.
You might end up with something like this given the two custom fields of interest, but you can cut the structure down further if you just need the name.
type AutoGenerated struct {
    Fields struct {
        Customfield11343 struct {
            Self         string `json:"self"`
            Name         string `json:"name"`
            Key          string `json:"key"`
            EmailAddress string `json:"emailAddress"`
            DisplayName  string `json:"displayName"`
            Active       bool   `json:"active"`
            TimeZone     string `json:"timeZone"`
        } `json:"customfield_11343"`
        Customfield12046 []struct {
            Self  string `json:"self"`
            Value string `json:"value"`
            ID    string `json:"id"`
        } `json:"customfield_12046"`
    } `json:"fields"`
}
The effect you get is that all extra information in the feed is discarded, but you get the data you want very cleanly.
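For example, a minimal sketch of reading the two values with that struct (rawJSON is a placeholder []byte holding the issue JSON you got back from Jira; uses encoding/json):

var parsed AutoGenerated
if err := json.Unmarshal(rawJSON, &parsed); err != nil {
    log.Fatal(err)
}
dev := parsed.Fields.Customfield11343.Name
// Guard the index in real code: the slice may be empty.
team := parsed.Fields.Customfield12046[0].Value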
This is a tough one, since the second field is in array form, which makes it hard to use a map.
For the first one, it's simple enough to use:
type JiraCustomField struct {
    Self         string `json:"self"`
    Name         string `json:"name"`
    Key          string `json:"key"`
    EmailAddress string `json:"emailAddress"`
    DisplayName  string `json:"displayName"`
    Active       bool   `json:"active"`
    TimeZone     string `json:"timeZone"`
}

type JiraPayload struct {
    Expand string                     `json:"expand"`
    ID     string                     `json:"id"`
    Key    string                     `json:"key"`
    Fields map[string]JiraCustomField `json:"fields"`
}
https://play.golang.org/p/y8-g6r0kInV
Specifically, the Fields map[string]JiraCustomField part. For the second case it looks like you need it in array form, like Fields map[string][]JiraCustomField.
In a case like this, I think you'll need to make your own Unmarshaler. This is a good tutorial: https://blog.gopheracademy.com/advent-2016/advanced-encoding-decoding/
What you could do in your custom Unmarshal/Marshal methods is use the reflect package to check whether the value is an array or a struct. If it's a struct, put it into an array and store it in Fields map[string][]JiraCustomField.
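For what it's worth, here is a rough sketch of that idea without reflection, by simply trying both JSON shapes when unmarshaling. The JiraCustomFields wrapper type is hypothetical, not part of go-jira:

// JiraCustomFields accepts either a single object or an array of objects.
type JiraCustomFields []JiraCustomField

func (f *JiraCustomFields) UnmarshalJSON(data []byte) error {
    // Try the array form first.
    var many []JiraCustomField
    if err := json.Unmarshal(data, &many); err == nil {
        *f = many
        return nil
    }
    // Fall back to the single-object form and wrap it in a slice.
    var one JiraCustomField
    if err := json.Unmarshal(data, &one); err != nil {
        return err
    }
    *f = JiraCustomFields{one}
    return nil
}

With that, Fields could be declared as map[string]JiraCustomFields, and both customfield_11343 and customfield_12046 decode into the same shape.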

No Field Error Updating Values in Firestore

I am attempting to update a document in Firestore using the Go library. For some reason I am getting a "no field \"BirthYear\"" error and I am not sure why. BirthYear is definitely one of the values that I am attempting to update.
I assume that I have configured my struct incorrectly but I cannot see how. Here is my struct and my update code:
sharedstructs.Profile
type Profile struct {
    UID                string                 `json:"UID" firestore:"UID"`
    ContactEmail       string                 `json:"ContactEmail,omitempty" firestore:"ContactEmail"`
    BirthMonth         int64                  `json:"BirthMonth,omitempty" firestore:"BirthMonth"`
    BirthYear          int64                  `json:"BirthYear,omitempty" firestore:"BirthYear"`
    Gender             string                 `json:"Gender,omitempty" firestore:"Gender"`
    Unit               string                 `json:"Unit,omitempty" firestore:"Unit"`
    CurrentStatus      string                 `json:"CurrentStatus,omitempty" firestore:"CurrentStatus"`
    Country            string                 `json:"Country,omitempty" firestore:"Country"`
    ExperienceType     string                 `json:"ExperienceType,omitempty" firestore:"ExperienceType"`
    DateJoined         time.Time              `json:"DateJoined,omitempty" firestore:"DateJoined"`
    Abilities          []Ability              `json:"Abilities,omitempty" firestore:"Abilities"`
    Goals              []Goal                 `json:"Goals,omitempty" firestore:"Goals"`
    Roles              []Role                 `json:"Roles,omitempty" firestore:"Roles"`
    TermsAndConditions []TermsAndConditions   `json:"TermsAndConditions,omitempty" firestore:"TermsAndConditions"`
    TimeZone           string                 `json:"TimeZone,omitempty" firestore:"TimeZone"`
    BaselineTests      []BaselineTestResults  `json:"BaselineTests,omitempty" firestore:"BaselineTests"`
    UpdatedDate        time.Time              `json:"UpdatedDate,omitempty" firestore:"UpdatedDate"`
    FirstName          *string                `json:"FirstName,omitempty" firestore:"FirstName"`
    LastName           string                 `json:"LastName,omitempty" firestore:"LastName"`
    DisplayName        string                 `json:"DisplayName,omitempty" firestore:"DisplayName"`
}
Update Function
func updateProfileWithSpecficValues(documentName string, values sharedstructs.Profile, overwriteValues []string) error {
    ctx := context.Background()
    app := firestorehelper.GetFirestoreApp()
    client, err := app.Firestore(ctx)
    if err != nil {
        return err
    }
    defer client.Close()

    // Set the updated date
    values.UpdatedDate = time.Now()

    wr, err := client.Doc(collectionName+"/"+documentName).Set(ctx, values, firestore.Merge(overwriteValues))
    if err != nil {
        return err
    }
    fmt.Println(wr.UpdateTime)

    // Assume success
    return nil
}
https://godoc.org/cloud.google.com/go/firestore#Merge
Merge returns a SetOption that causes only the given field paths to be
overwritten. Other fields on the existing document will be untouched.
It is an error if a provided field path does not refer to a value in the data passed to Set.
You are sending no BirthYear (default value) in values, but BirthYear is specified in overwriteValues.
As of cloud.google.com/go/firestore v1.3.0, I don't think you can accomplish an update through Set(..., valueStruct, firestore.Merge(sliceOfPaths)) when valueStruct is the complete struct you might read or write to Firestore.
I receive an error string including the 'no field "[SomeFieldName]"' string the OP references, but there is more information in this error. If my sliceOfPaths refers to the names of two elements of my struct, say an int and a time.Time, I often receive an error such as 'no field "[NameOfMyIntField] for value 2020-09-14T00:00:00Z"' or vice-versa, with it trying to update my Firestore doc's time field with the integer.
I just used Update() rather than Set(). It's a little clunkier because you have to pass Update() a specially crafted subset of your original struct (a slice of firestore.Update), but it works.
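For illustration, a rough sketch of that Update()-based variant of the function above (reusing client, ctx, values, collectionName, and documentName from the OP's code; the field list is just an example, built from whatever subset you actually want to change):

updates := []firestore.Update{
    {Path: "BirthYear", Value: values.BirthYear},
    {Path: "BirthMonth", Value: values.BirthMonth},
    {Path: "UpdatedDate", Value: time.Now()},
}
wr, err := client.Doc(collectionName+"/"+documentName).Update(ctx, updates)
if err != nil {
    return err
}
fmt.Println(wr.UpdateTime)

Unlike Set with firestore.Merge, Update only ever touches the paths you list, so there is no complete struct for the field paths to be checked against.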

Dapper Results(Dapper Row) with Bracket Notation

According to the Dapper documentation, you can get a dynamic list back from Dapper using the code below:
var rows = connection.Query("select 1 A, 2 B union all select 3, 4");
((int)rows[0].A)
.IsEqualTo(1);
((int)rows[0].B)
.IsEqualTo(2);
((int)rows[1].A)
.IsEqualTo(3);
((int)rows[1].B)
.IsEqualTo(4);
What, however, is the use of dynamic if you have to know the field names and data types of the fields?
If I have:
var result = Db.Query("Select * from Data.Tables");
I want to be able to do the following:
Get a list of the field names and data types returned.
Iterate over it using the field names and get back data in the following ways:
result.Fields
["Id", "Description"]
result[0].values
[1, "This is the description"]
This would allow me to get
result[0]["Id"].Value
which would give the result 1 and be of type e.g. Int32,
result[0]["Id"].Type
which would tell me what data type the value is, and
result[0]["Description"]
which would give the result "This is the description" and be of type string.
I see there is a result[0].table, which is a DapperRow table object with an array of the field names, and there is also a result[0].values, which is an object[2] with the values in it, but it cannot be accessed. If I add a watch on the drilled-down column name, I can get the Id. The automatically created watch is:
(new System.Collections.Generic.Mscorlib_CollectionDebugView<Dapper.SqlMapper.DapperRow>(result as System.Collections.Generic.List<Dapper.SqlMapper.DapperRow>)).Items[0].table.FieldNames[0] "Id" string
So I should be able to get result[0].Items[0].table.FieldNames[0] and get "Id" back.
You can cast each row to an IDictionary<string, object>, which should provide access to the names and the values. We don't explicitly track the types currently - we simply don't have a need to. If that isn't enough, consider using the dapper method that returns an IDataReader - this will provide access to the raw data, while still allowing convenient call / parameterization syntax.
For example:
var rows = ...
foreach(IDictionary<string, object> row in rows) {
    Console.WriteLine("row:");
    foreach(var pair in row) {
        Console.WriteLine(" {0} = {1}", pair.Key, pair.Value);
    }
}

How to filter on the value of a specific element in a list?

Using GAE-Java-JDO, is it possible to filter on the value of a specific element in a list?
WHAT WORKS
Normally, I would have the following:
@PersistenceCapable
class A {
    String field1;
    String field2;
    // id, getters and setters
}
Then I would build a simple query:
Query q = pm.newQuery(A.class, "field1 == val");
q.declareParameters("String val");
List<A> list = new ArrayList<A>((List<A>) q.execute("foo"));
WHAT I WOULD LIKE
The above works fine. But what I would like to have is all of the fields stored in a list:
@PersistenceCapable
class AA {
    ArrayList<String> fields;
    // id, getters and setters
}
and then be able to query on a specific field in the list:
int index = 0;
Query q = pm.newQuery(AA.class, "fields.get(index) == val");
q.declareParameters("int index, String val");
List<AA> list = new ArrayList<AA>((List<AA>) q.execute(index, "foo"));
But this throws an exception:
org.datanucleus.store.appengine.query.DatastoreQuery$UnsupportedDatastoreFeatureException:
Problem with query
<SELECT FROM xxx.AA WHERE fields.get(index) == val PARAMETERS int index, String val,>:
Unsupported method <get> while parsing expression:
InvokeExpression{[PrimaryExpression{strings}].get(ParameterExpression{ui})}
My impression from reading the GAE-JDO doc is that this is not possible:
"The property value must be supplied by the application; it cannot refer to or be calculated in terms of other properties"
So... any ideas?
Thanks in advance!
If you only need to filter by index+value, then I think prefixing the actual list values with their index should work. (If you also need to filter by the actual values, then you'll need to store both lists.)
i.e. instead of the equivalent of
fields = ['foo', 'bar', 'baz'] with query-filter fields[1] == 'bar'
you'd have
fields = ['0-foo', '1-bar', '2-baz'] with query-filter fields == '1-bar'
(but in Java)
