TL;DR: How do I create a REST API on a model that has a foreign key (or a database relationship in general) in the Buffalo framework?
I am an absolute beginner in Go and I am trying to write a RESTful service using the Buffalo framework, following the example given on its official website. I am able to create RESTful APIs on models that have no database relationship, but I am stuck on a model that has a foreign key. I have not been able to find any documentation or reference on the web. My grasp of Go concepts is also weak, so feel free to educate me on those as well.
Models (ref: https://gobuffalo.io/en/docs/db/relations#example):
type Composer struct {
    ID          uuid.UUID `json:"id" db:"id"`
    Name        string    `json:"name" db:"name"`
    Description string    `json:"description" db:"description"`
    CreatedAt   time.Time `json:"created_at" db:"created_at"`
    UpdatedAt   time.Time `json:"updated_at" db:"updated_at"`
}

type Track struct {
    ID          uuid.UUID `json:"id" db:"id"`
    Title       string    `json:"title" db:"title"`
    Description string    `json:"description" db:"description"`
    Composer    Composer  `has_one:"composer" fk_id:"id"`
    CreatedAt   time.Time `json:"created_at" db:"created_at"`
    UpdatedAt   time.Time `json:"updated_at" db:"updated_at"`
}
Resources: (ref: https://gobuffalo.io/en/docs/resources)
type TrackResource struct {
    buffalo.Resource
}

func (v TrackResource) List(c buffalo.Context) error {
    tx, ok := c.Value("tx").(*pop.Connection)
    if !ok {
        return errors.WithStack(errors.New("no transaction found"))
    }
    pieces := &models.Tracks{}
    q := tx.PaginateFromParams(c.Params())
    if err := q.All(pieces); err != nil {
        return errors.WithStack(err)
    }
    c.Set("pagination", q.Paginator)
    return c.Render(200, r.JSON(pieces))
}

func (v TrackResource) Show(c buffalo.Context) error {
    tx, ok := c.Value("tx").(*pop.Connection)
    if !ok {
        return errors.WithStack(errors.New("no transaction found"))
    }
    piece := &models.Track{}
    if err := tx.Find(piece, c.Param("track_id")); err != nil {
        return c.Render(404, r.JSON(err))
    }
    return c.Render(200, r.JSON(piece))
}

func (v TrackResource) Create(c buffalo.Context) error {
    piece := &models.Track{}
    if err := c.Bind(piece); err != nil {
        return errors.WithStack(err)
    }
    tx, ok := c.Value("tx").(*pop.Connection)
    if !ok {
        return errors.WithStack(errors.New("no transaction found"))
    }
    verrs, err := piece.Create(tx)
    if err != nil {
        return errors.WithStack(err)
    }
    if verrs.HasAny() {
        return c.Render(422, r.JSON(verrs))
    }
    return c.Render(201, r.Auto(c, piece))
}

func (v TrackResource) Update(c buffalo.Context) error {
    tx, ok := c.Value("tx").(*pop.Connection)
    if !ok {
        return errors.WithStack(errors.New("no transaction found"))
    }
    piece := &models.Track{}
    if err := tx.Find(piece, c.Param("track_id")); err != nil {
        return c.Error(404, err)
    }
    if err := c.Bind(piece); err != nil {
        return errors.WithStack(err)
    }
    verrs, err := piece.Update(tx)
    if err != nil {
        return errors.WithStack(err)
    }
    if verrs.HasAny() {
        return c.Render(422, r.JSON(verrs))
    }
    return c.Render(200, r.JSON(piece))
}

func (v TrackResource) Destroy(c buffalo.Context) error {
    tx, ok := c.Value("tx").(*pop.Connection)
    if !ok {
        return errors.WithStack(errors.New("no transaction found"))
    }
    piece := &models.Track{}
    if err := tx.Find(piece, c.Param("track_id")); err != nil {
        return c.Error(404, err)
    }
    if err := tx.Destroy(piece); err != nil {
        return errors.WithStack(err)
    }
    return c.Render(200, r.JSON(piece))
}
When I try to create a track, I get this error:
json: cannot unmarshal string into Go struct field Track.Composer of type models.Composer
gitlab.com/****/****/actions.TrackResource.Create
Please help.
You're trying to bind a Track whose composer is sent as an ID (a JSON string), but Composer is defined as a struct.
To make that work as-is, you need to implement the json.Unmarshaler interface and define how to convert that ID into the Composer struct you want.
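A minimal sketch of that approach, in case it helps. The method goes on the Composer model; accepting either a bare ID string or a full object is an assumption about the incoming JSON, and it relies on the uuid type supporting text unmarshaling (as gofrs/uuid, used by Buffalo, does):

func (c *Composer) UnmarshalJSON(data []byte) error {
    // Accept a bare UUID string such as "composer": "b9ee…".
    var id uuid.UUID
    if err := json.Unmarshal(data, &id); err == nil {
        c.ID = id
        return nil
    }
    // Otherwise fall back to decoding a full composer object.
    // The local alias type prevents infinite recursion into this method.
    type alias Composer
    var a alias
    if err := json.Unmarshal(data, &a); err != nil {
        return err
    }
    *c = Composer(a)
    return nil
}

That said, the ComposerID approach described in the next answer is usually simpler.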
You should add a ComposerID field to your Track struct. Then you can use Eager or Load, and pop will load the associated Composer object for you. This is shown on the page you've linked to for your Models reference.
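A rough sketch of what that can look like, based on the pop relations docs. The composer_id column on the tracks table and the belongs_to tag are assumptions about your schema; belongs_to is used here rather than has_one because the foreign key lives on the tracks table:

type Track struct {
    ID          uuid.UUID `json:"id" db:"id"`
    Title       string    `json:"title" db:"title"`
    Description string    `json:"description" db:"description"`
    ComposerID  uuid.UUID `json:"composer_id" db:"composer_id"`
    Composer    Composer  `json:"composer" belongs_to:"composer"`
    CreatedAt   time.Time `json:"created_at" db:"created_at"`
    UpdatedAt   time.Time `json:"updated_at" db:"updated_at"`
}

// In Show (and similarly in List), ask pop to load the association:
piece := &models.Track{}
if err := tx.Eager().Find(piece, c.Param("track_id")); err != nil {
    return c.Render(404, r.JSON(err))
}

With that in place the client POSTs "composer_id" with a plain UUID instead of a nested composer object, so the bind error goes away, and GET responses include the loaded Composer.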
Related
I am receiving the following POST request JSON data from a React frontend:
{
    "field_one": "first",
    "field_two": "second",
    "field_three": "3.00"
}
but I want Go to convert it to this before processing the request:
{
    "field_one": "first",
    "field_two": "second",
    "field_three": 3.00
}
I want to convert field_three from a string to a float64, but I am unable to get Go to accept the string and use the proper data type.
Here is my Go function processing the POST request data:
func PostCreate(c *fiber.Ctx) error {
    type PostCreateData struct {
        FieldOne   string  `json:"field_one" form:"field_one" validate:"required"`
        FieldTwo   string  `json:"field_two" form:"field_two" validate:"required"`
        FieldThree float64 `json:"field_three" form:"field_three" validate:"required"`
    }
    data := PostCreateData{}
    if err := c.BodyParser(&data); err != nil {
        return err
    }
    validate := validator.New()
    if err := validate.Struct(data); err != nil {
        return err
    }
    postCreate := models.PostCreate{
        FieldOne:   data.FieldOne,
        FieldTwo:   data.FieldTwo,
        FieldThree: data.FieldThree,
    }
    database.DB.Create(&postCreate)
    return c.JSON(postCreate)
}
Currently the request is not getting processed because of the wrong data type for field_three, which is supposed to be a float64, but the frontend is sending everything as a string.
What steps am I missing here?
Fixed by updating the function to this:
func PostCreate(c *fiber.Ctx) error {
    type PostCreateData struct {
        FieldOne   string  `json:"field_one" form:"field_one" validate:"required"`
        FieldTwo   string  `json:"field_two" form:"field_two" validate:"required"`
        FieldThree float64 `json:"field_three,string" form:"field_three" validate:"required"`
    }
    data := PostCreateData{}
    if err := c.BodyParser(&data); err != nil {
        return err
    }
    validate := validator.New()
    if err := validate.Struct(data); err != nil {
        return err
    }
    postCreate := models.PostCreate{
        FieldOne:   data.FieldOne,
        FieldTwo:   data.FieldTwo,
        FieldThree: data.FieldThree,
    }
    database.DB.Create(&postCreate)
    return c.JSON(postCreate)
}
So I updated the struct to expect a string in the body:
FieldThree float64 `json:"field_three,string" form:"field_three" validate:"required"`
and it works like a charm!
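For context: the ,string option is standard encoding/json tag syntax (which Fiber's default JSON body parser uses under the hood); it tells the decoder that the numeric value arrives wrapped in a JSON string. A self-contained sketch of just that behaviour:

package main

import (
    "encoding/json"
    "fmt"
)

type payload struct {
    FieldOne   string  `json:"field_one"`
    FieldTwo   string  `json:"field_two"`
    FieldThree float64 `json:"field_three,string"` // accepts "3.00" and decodes it as 3.0
}

func main() {
    body := []byte(`{"field_one":"first","field_two":"second","field_three":"3.00"}`)
    var p payload
    if err := json.Unmarshal(body, &p); err != nil {
        panic(err)
    }
    fmt.Printf("%+v\n", p) // {FieldOne:first FieldTwo:second FieldThree:3}
}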
I would like to connect my server, which is written in Go, to MongoDB, but I'm not sure how to do it in an efficient way. A couple of examples I found implemented it as shown below.
libs/mongodb/client.go
package mongodb

import (
    "context"
    "log"
    "project/keys"

    "go.mongodb.org/mongo-driver/mongo"
    "go.mongodb.org/mongo-driver/mongo/options"
)

func GetClient() *mongo.Database {
    client, err := mongo.Connect(
        context.Background(),
        options.Client().ApplyURI(keys.GetKeys().MONGO_URI),
    )
    if err != nil {
        log.Fatal(err)
    }
    return client.Database(keys.GetKeys().MONGO_DB_NAME)
}
services/user/findOne.go
package userservices

import (
    "context"
    "log"
    "project/libs/mongodb"
    "project/models"

    "go.mongodb.org/mongo-driver/bson"
)

func FindOne(filter bson.M) (models.User, error) {
    var user models.User
    collection := mongodb.GetClient().Collection("users")
    result := collection.FindOne(context.TODO(), filter)
    if result.Err() != nil {
        return user, result.Err()
    }
    if err := result.Decode(&user); err != nil {
        log.Println("Failed to decode user with error:", err)
        return user, err
    }
    return user, nil
}
The GetClient function returns a database instance that is then used throughout the app. This seems to work, but I'm wondering whether it really is best practice: it appears to open a new connection every time a client is requested, as in the second code snippet, or is that assumption incorrect? I also thought about turning GetClient into a singleton that always returns the same database instance, but how would a lost connection be handled in that case? Thank you.
I do it this way: connect once at service start and then pass the MongoDatastore object around to the orchestrator, service, and repository layers. I am using the "github.com/mongodb/mongo-go-driver/mongo" driver, which I believe internally monitors and recycles idle connections, so we don't have to worry about broken connections as long as the reference to the mongo.Client object is not lost.
const CONNECTED = "Successfully connected to database: %v"

type MongoDatastore struct {
    db      *mongo.Database
    Session *mongo.Client
    logger  *logrus.Logger
}

func NewDatastore(config config.GeneralConfig, logger *logrus.Logger) *MongoDatastore {
    var mongoDataStore *MongoDatastore
    db, session := connect(config, logger)
    if db != nil && session != nil {
        // log statements here as well
        mongoDataStore = new(MongoDatastore)
        mongoDataStore.db = db
        mongoDataStore.logger = logger
        mongoDataStore.Session = session
        return mongoDataStore
    }
    logger.Fatalf("Failed to connect to database: %v", config.DatabaseName)
    return nil
}
var connectOnce sync.Once

func connect(generalConfig config.GeneralConfig, logger *logrus.Logger) (*mongo.Database, *mongo.Client) {
    var db *mongo.Database
    var session *mongo.Client
    // connectOnce is package-level so the connection really is made only once.
    connectOnce.Do(func() {
        db, session = connectToMongo(generalConfig, logger)
    })
    return db, session
}

func connectToMongo(generalConfig config.GeneralConfig, logger *logrus.Logger) (*mongo.Database, *mongo.Client) {
    session, err := mongo.NewClient(generalConfig.DatabaseHost)
    if err != nil {
        logger.Fatal(err)
    }
    if err = session.Connect(context.TODO()); err != nil {
        logger.Fatal(err)
    }
    db := session.Database(generalConfig.DatabaseName)
    logger.Infof(CONNECTED, generalConfig.DatabaseName)
    return db, session
}
You may now create your repository as below:
type TestRepository interface {
    Find(ctx context.Context, filters interface{}) ([]models.Document, error)
}

type testRepository struct {
    store *datastore.MongoDatastore
}

func (r *testRepository) Find(ctx context.Context, filters interface{}) ([]models.Document, error) {
    cur, err := r.store.GetCollection("some_collection_name").Find(ctx, filters)
    if err != nil {
        return nil, err
    }
    defer cur.Close(ctx)
    var result = make([]models.Document, 0)
    for cur.Next(ctx) {
        var currDoc models.Document
        if err := cur.Decode(&currDoc); err != nil {
            // log the decode error and skip this document
            continue
        }
        result = append(result, currDoc)
    }
    return result, cur.Err()
}
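To round this out, here is a rough sketch of how it might be wired at startup. The GetCollection helper is assumed (the repository calls it but it is not shown above), and the exact package layout is a guess:

// In the datastore package: the helper the repository relies on.
func (ds *MongoDatastore) GetCollection(name string) *mongo.Collection {
    return ds.db.Collection(name)
}

// In main: connect once and hand the datastore to every layer that needs it.
func main() {
    logger := logrus.New()
    cfg := config.GeneralConfig{ /* DatabaseHost, DatabaseName, ... */ }

    store := datastore.NewDatastore(cfg, logger)

    repo := &testRepository{store: store}
    docs, err := repo.Find(context.Background(), bson.M{})
    if err != nil {
        logger.Fatal(err)
    }
    logger.Infof("found %d documents", len(docs))
}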
I solved it by doing this:
var CNX = Connection()

func Connection() *mongo.Client {
    // Set client options
    clientOptions := options.Client().ApplyURI("mongodb://localhost:27017")
    // Connect to MongoDB
    client, err := mongo.Connect(context.TODO(), clientOptions)
    if err != nil {
        log.Fatal(err)
    }
    // Check the connection
    err = client.Ping(context.TODO(), nil)
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println("Connected to MongoDB!")
    return client
}

// call the connection
func main() {
    collection := db.CNX.Database("tasks").Collection("task")
    _ = collection // use the collection for queries here
}

Output: "Connected to MongoDB!"
How can I insert an array of documents into MongoDB with the mgo library using only a single DB call, as in db.collection.insert()?
I have the following Transaction structure:
type Transaction struct {
    Brand string `json:"brand"`
    Name  string `json:"name"`
    Plu   string `json:"plu"`
    Price string `json:"price"`
}
From a POST request I will receive an array of these structures. I want to insert them into MongoDB as individual documents, but using a single DB call as explained for db.collection.insert().
I tried using mgo's c.Insert.
The following is the code snippet:
func insertTransaction(c *gin.Context) {
    var transactions []Transaction
    err := c.BindJSON(&transactions)
    if err != nil {
        c.AbortWithStatusJSON(http.StatusBadRequest, map[string]string{"error": "invalid JSON"})
        return
    }
    err = InsertTransactons(transactions)
    if err != nil {
        c.AbortWithStatusJSON(http.StatusInternalServerError, &map[string](interface{}){
            "status":  "error",
            "code":    "500",
            "message": "Internal server error",
            "error":   err,
        })
        return
    }
    c.JSON(http.StatusCreated, &map[string](interface{}){
        "status":  "success",
        "code":    "0",
        "message": "created",
    })
}

func InsertTransactons(u []Transaction) error {
    s := GetSession()
    defer s.Close()
    c := s.DB(DB).C(TransactionColl)
    err := c.Insert(u...)
    if err != nil {
        return err
    }
    return nil
}
But when I compile the code, I get the following error:
go/database.go:34:17: cannot use u (type *[]Transaction) as type
[]interface {} in argument to c.Insert
You cannot pass a []Transaction as []interface{}; the two slice types have different memory layouts. You need to convert each Transaction to interface{} individually:
var ui []interface{}
for _, t := range u {
    ui = append(ui, t)
}
Pass ui to c.Insert instead.
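Put together, a corrected InsertTransactons could look like this (mgo's Collection.Insert is variadic, so the converted slice is expanded with ...):

func InsertTransactons(u []Transaction) error {
    s := GetSession()
    defer s.Close()
    c := s.DB(DB).C(TransactionColl)

    // Insert takes ...interface{}, so copy the typed slice into []interface{} first.
    docs := make([]interface{}, len(u))
    for i, t := range u {
        docs[i] = t
    }
    return c.Insert(docs...)
}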
Create a slice of interface values for the document structs by appending, and then insert the data using bulk insert, which takes variadic arguments:
type Bulk struct {
    // contains filtered or unexported fields
}

func (b *Bulk) Insert(docs ...interface{})
For inserting documents in bulk:
const INSERT_COUNT int = 10000

type User struct {
    Id    bson.ObjectId `bson:"_id,omitempty" json:"_id"`
    Email string        `bson:"email" json:"email"`
}

func (self *User) Init() {
    self.Id = bson.NewObjectId()
}
Call the Bulk() function on the collection returned from the db connection; it returns a *Bulk. Assign it to a variable and use that to call the Insert() method on the Bulk pointer receiver:
bulk := dbs.Clone().DB("").C("users").Bulk()
bulk.Insert(users...)
func main() {
    // Database
    dbs, err := mgo.Dial("mongodb://localhost/")
    if err != nil {
        panic(err)
    }
    defer dbs.Close()

    // Collection
    uc := dbs.DB("").C("users")

    // Build the documents to insert as a []interface{}.
    count := INSERT_COUNT
    users := make([]interface{}, count)
    for i := 0; i < count; i++ {
        loopUser := User{}
        loopUser.Init()
        loopUser.Email = fmt.Sprintf("report-%d#example.com", i)
        users[i] = loopUser
    }

    bulk := uc.Bulk()
    bulk.Unordered()
    bulk.Insert(users...)
    if _, err := bulk.Run(); err != nil {
        panic(err)
    }
}
On App Engine I have a large number of entities of a particular kind.
I want to run a function on each entity (e.g. edit the entity or copy it).
I would do this in a task queue, but a task queue is limited to 10 minutes of runtime and each function call is prone to many kinds of errors. What is the best way to do this?
Here's my solution, although I'm hoping someone out there has a better one. I also wonder whether it is prone to fork bombs, e.g. if the task runs twice, it will set off two chains of iteration! I'm only using it to iterate over a few hundred thousand entities, although the operation on each entity is expensive.
First I create a task queue for running each individual function call on an entity, one at a time:
queue:
- name: entity-iter
  rate: 100/s
  max_concurrent_requests: 1
  retry_parameters:
    task_retry_limit: 3
    task_age_limit: 30m
    min_backoff_seconds: 200
and then I have an iterate-entity method which, given the kind, will call your delay func on each entity with its key:
package sysadmin

import (
    "golang.org/x/net/context"
    "google.golang.org/appengine/datastore"
    "google.golang.org/appengine/delay"
    "google.golang.org/appengine/log"
    "google.golang.org/appengine/taskqueue"
)

func ForEachEntity(kind string, f *delay.Function) *delay.Function {
    var callWithNextKey *delay.Function // func(c context.Context, depth int, cursorString string) error
    callWithNextKey = delay.Func("something", func(c context.Context, depth int, cursorString string) error {
        q := datastore.NewQuery(kind).KeysOnly()
        if cursorString != "" {
            if curs, err := datastore.DecodeCursor(cursorString); err != nil {
                log.Errorf(c, "error decoding cursor %v", err)
                return err
            } else {
                q = q.Start(curs)
            }
        }
        it := q.Run(c)
        if key, err := it.Next(nil); err != nil {
            if err == datastore.Done {
                log.Infof(c, "Done %v", err)
                return nil
            }
            log.Errorf(c, "datastore error %v", err)
            return err
        } else {
            curs, _ := it.Cursor()
            if t, err := f.Task(key); err != nil {
                return err
            } else if _, err = taskqueue.Add(c, t, "entity-iter"); err != nil {
                log.Errorf(c, "error %v", err)
                return err
            }
            if depth-1 > 0 {
                if err := callWithNextKey.Call(c, depth-1, curs.String()); err != nil {
                    log.Errorf(c, "error2 %v", err)
                    return err
                }
            }
        }
        return nil
    })
    return callWithNextKey
}
Example usage:
// CopyCourse (not shown here) is assumed to be the per-entity delay func,
// i.e. func CopyCourse(c context.Context, key *datastore.Key) error.
var DoCopyCourse = delay.Func("something2", CopyCourse)
var DoCopyCourses = ForEachEntity("Course", DoCopyCourse)

func CopyCourses(c context.Context) {
    //sharedmodels.MakeMockCourses(c)
    DoCopyCourses.Call(c, 9999999, "")
}
I've found myself needing to do a GetMulti operation with an array of keys for which some entities exist, but some do not.
My current code, below, returns an error (datastore: no such entity).
err := datastore.GetMulti(c, keys, infos)
So how can I do this? I'd use a "get or insert" method, but there isn't one.
GetMulti can return an appengine.MultiError in this case. Loop through that and look for datastore.ErrNoSuchEntity. For example:
if err := datastore.GetMulti(c, keys, dst); err != nil {
    if me, ok := err.(appengine.MultiError); ok {
        for i, merr := range me {
            if merr == datastore.ErrNoSuchEntity {
                // keys[i] is missing
            }
        }
    } else {
        return err
    }
}
I know this topic has been up for more than a few days, but I'd like to post an alternative using a type switch.
if err := datastore.GetMulti(c, keys, dst); err != nil {
    switch errt := err.(type) {
    case appengine.MultiError:
        for ix, e := range errt {
            if e == datastore.ErrNoSuchEntity {
                // keys[ix] not found
            } else if e != nil {
                // keys[ix] has error "e"
            }
        }
    default:
        // datastore returned an error that is not a multi-error
    }
}
Thought I'd throw my answer in to show another use case. The following will take in any number of keys and return only the valid keys.
// Validate keys
var validKeys []*ds.Key
if err := c.DB.GetMulti(ctx, tempKeys, dst); err != nil {
    if me, ok := err.(ds.MultiError); ok {
        for i, merr := range me {
            if merr == ds.ErrNoSuchEntity {
                continue
            }
            validKeys = append(validKeys, tempKeys[i])
        }
    } else {
        return "", err
    }
} else {
    // All tempKeys are valid
    validKeys = append(validKeys, tempKeys...)
}