We have two different approaches for an appointment scheduling system using MongoDB.
1st approach:
appointments:
{
  resourceId: "string",
  resourceType: "doc"/"nut"...,
  userId: "string",
  userName: "string",
  startDate: "2020-05-18T16:00:00Z",
  endDate: "2020-05-18T17:00:00Z",
  title: "string",
  description: "string",
  type: "string"/"off"
},
{
  resourceId: "string",
  resourceType: "doc"/"nut"...,
  userId: "string",
  userName: "string",
  startDate: "2020-05-21T12:00:00Z",
  endDate: "2020-05-21T12:30:00Z",
  title: "string",
  description: "string",
  type: "string"/"off"
},
...
...
resources:
{
  resourceId: "string",
  resourceName: "string",
  resourceType: "doc"/"nut"/"room",
  autoApprove: true/false,
  constantDaysOff: ["sunday"]
},
{
  resourceId: "string",
  resourceName: "string",
  resourceType: "doc"/"nut"/"room",
  autoApprove: true/false,
  constantDaysOff: ["sunday"]
},
{
  resourceId: "string",
  resourceName: "string",
  resourceType: "doc"/"nut"/"room",
  autoApprove: true/false,
  constantDaysOff: ["sunday"]
}
Here appointments and resources are different collections, with sample documents in each collection.
2nd approach:
resources:
{
  resourceId: "string",
  resourceName: "string",
  resourceType: "doc"/"nut"...,
  constantDaysOff: ["sunday"],
  "2020-05-21": [
    {
      startDate: "2020-05-21T12:00:00Z",
      endDate: "2020-05-21T12:30:00Z",
      userId: "string",
      userName: "string",
      title: "string",
      description: "string",
      type: "string"/"off"
    },
    {
      startDate: "2020-05-21T14:00:00Z",
      endDate: "2020-05-21T14:30:00Z",
      userId: "string",
      userName: "string",
      title: "string",
      description: "string",
      type: "string"/"off"
    }
  ],
  "2020-05-22": [
    {
      startDate: "2020-05-22T12:00:00Z",
      endDate: "2020-05-22T12:30:00Z",
      userId: "string",
      userName: "string",
      title: "string",
      description: "string",
      type: "string"/"off"
    },
    {
      startDate: "2020-05-22T14:00:00Z",
      endDate: "2020-05-22T14:30:00Z",
      userId: "string",
      userName: "string",
      title: "string",
      description: "string",
      type: "string"/"off"
    }
  ]
  ...
}
Here we have only one collection, and appointment dates are keys within each resource document. Each date key holds an array of JSON objects representing the different appointments on that day.
NOTE:
There are no appointments longer than one day; the reason we have startDate and endDate is to calculate the length of the appointment and its start and end times.
We need to be able to perform queries, as efficiently as possible, along the lines of:
get all appointments for a specific resource id
get appointments for a resource between two different dates
get appointments for a user between different dates
cancel/remove appointments
A resource could be anything, for example a doctor, coach, room...
So my question is: which approach would be more efficient/feasible when it comes to MongoDB queries?
Most articles advise not storing startDate and endDate, but rather a date plus start/end times (not even in Date format): you mostly select (query) by day or days, and then handle the time.
Times are more flexible, especially when you need some time step between appointments. You can also easily edit the duration without changing the date, or change the date without changing the times.
IMHO, according to the previous point, it's much better to use your first variant; I dislike the second one.
In the second variant it will be hard to change a date: you would need to handle the whole array, upsert by date, and then upsert the whole set back.
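To illustrate why the first variant queries cleanly: each operation from the question's list becomes a single filter over the appointments collection. This is only a sketch; the helper names are invented, the field names come from the question, and the range filters assume dates are stored as UTC ISO strings (which compare correctly as plain strings). A compound index such as { resourceId: 1, startDate: 1 } would back these finds.

```javascript
// Filter builders for the first schema's `appointments` collection
// (field names from the question; helper names are invented).

// Get all appointments for a specific resource id.
function byResource(resourceId) {
  return { resourceId };
}

// Get appointments for a resource between two dates. ISO-8601 UTC strings
// sort lexicographically in chronological order, so $gte/$lt work directly.
function byResourceBetween(resourceId, from, to) {
  return { resourceId, startDate: { $gte: from, $lt: to } };
}

// Get appointments for a user between two dates.
function byUserBetween(userId, from, to) {
  return { userId, startDate: { $gte: from, $lt: to } };
}

// With the Node driver these would be used as, e.g.:
//   db.collection("appointments").find(byResourceBetween("r1", "2020-05-18T00:00:00Z", "2020-05-22T00:00:00Z"))
//   db.collection("appointments").deleteMany(byResource("r1"))   // cancel/remove

console.log(byResourceBetween("r1", "2020-05-18T00:00:00Z", "2020-05-22T00:00:00Z"));
```

Under the second variant, the same queries would have to match dynamic per-date keys, which is much harder to index and to express as a single find.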
I am trying to create a floating-point column using go-memdb, but I seem to be missing enough understanding (and documentation) to create one, so I'm reaching out for help.
In the example below I need "price" to be a decimal number, but in its current form it loses precision. I'm not sure which indexer to implement.
type Product struct {
	ID          string  `json:"id"`
	Name        string  `json:"name"`
	Description string  `json:"description"`
	Price       float32 `json:"price"`
	SKU         string  `json:"sku"`
	CreatedOn   string  `json:"-"`
	UpdatedOn   string  `json:"-"`
	DeletedOn   string  `json:"-"`
}
schema := &memdb.DBSchema{
	Tables: map[string]*memdb.TableSchema{
		"product": {
			Name: "product",
			Indexes: map[string]*memdb.IndexSchema{
				"id": {
					Name:         "id",
					Unique:       true,
					AllowMissing: false,
					Indexer:      &memdb.StringFieldIndex{Field: "ID"},
				},
				"name": {
					Name:    "name",
					Indexer: &memdb.StringFieldIndex{Field: "Name"},
				},
				"description": {
					Name:    "description",
					Indexer: &memdb.StringFieldIndex{Field: "Description"},
				},
				"price": {
					Name:    "price",
					Indexer: &memdb.StringFieldIndex{Field: "Price"},
				},
				"sku": {
					Name:    "sku",
					Indexer: &memdb.StringFieldIndex{Field: "SKU"},
				},
				"createdon": {
					Name:    "createdon",
					Indexer: &memdb.StringFieldIndex{Field: "CreatedOn"},
				},
				"updatedon": {
					Name:         "updatedon",
					AllowMissing: true,
					Indexer:      &memdb.StringFieldIndex{Field: "UpdatedOn"},
				},
				"deletedon": {
					Name:         "deletedon",
					AllowMissing: true,
					Indexer:      &memdb.StringFieldIndex{Field: "DeletedOn"},
				},
			},
		},
	},
}
The go-memdb package currently doesn't have a dedicated floating-point index; it only ships one for integer numbers. But the package exposes the Indexer interface, so anyone can implement an index for a new type. Reading the IntFieldIndex implementation gives a good idea of the work that needs to be done.
Running into a problem with the API generated by AWS Amplify.
Basically, I keep getting the following warning whenever I try to create one entity, and it is not being persisted in DynamoDB.
Variable 'input' has coerced Null value for NonNull type 'String!'
The following are the pertinent parts of the GraphQL schema I used to create the backend.
enum EntityStatus {
  ACTIVE
  INACTIVE
  ARCHIVED
}
type Address {
  streetAddress1: String!
  streetAddress2: String
  city: String!
  state: String!
  zipCode: String!
  country: String!
  location: Location!
}
type Location {
  lat: Float
  lng: Float
}
type Tenant
  @model
  @auth(
    rules: [
      { allow: groups, groups: ["Admin", "Coordinator", "Employees"], operations }
      { allow: groups, groups: ["Auditor"], operations: [read] }
    ]
  ) {
  id: ID!
  name: String!
  address: Address!
  phone: AWSPhone!
  email: AWSEmail!
  status: EntityStatus!
  locale: String!
}
The code to create one of the Tenant entities is a simple call
try {
return await DataStore.save(new Tenant({ ...values }));
} catch (error) {
console.error(error);
}
The payload sent by DataStore is as follows:
{
  "name": "Tenant 1",
  "phone": "1234567890",
  "email": "tenant@tenant.com",
  "status": "ACTIVE",
  "address": {
    "city": "Anytown",
    "state": "TAB",
    "zipCode": "12345",
    "country": "US",
    "location": { "lat": 123.12, "lng": 123.12 }
  },
  "locale": "en-US",
  "id": "f8be53bd-b1cb-4cbd-9b64-01fdf930da8a"
}
Here is the full Warning message
[WARN] 40:26.787 DataStore
Object { localModel: {…}, message: "Variable 'input' has coerced Null value for NonNull type 'String!'", operation: "Create", errorType: undefined, errorInfo: undefined, remoteModel: null }
errorInfo: undefined
errorType: undefined
localModel: Object { id: "f8be53bd-b1cb-4cbd-9b64-01fdf930da8a", name: "Tenant 1", phone: "1234567890", … }
_deleted: undefined
_lastChangedAt: undefined
_version: undefined
address: Object { city: "Anytown", state: "TAB", zipCode: "12345", … }
createdAt: undefined
email: "tenant@tenant.com"
id: "f8be53bd-b1cb-4cbd-9b64-01fdf930da8a"
locale: "en-US"
name: "Tenant 1"
phone: "1234567890"
status: "ACTIVE"
updatedAt: undefined
<prototype>: Object { … }
message: "Variable 'input' has coerced Null value for NonNull type 'String!'"
operation: "Create"
remoteModel: null
<prototype>: Object { … }
react_devtools_backend.js:3973:25
Figured it out. My payload was missing two fields.
I wish the error messages were more helpful.
Is something like this possible? How should this be designed for this use case? Do I have to add a Lambda function that adds the user (owner) to the post when it is created?
Can anyone help me accomplish that? Thanks!
This is the Post schema:
type Post
  @model
  @key(name: "byClub", fields: ["clubId"])
  @auth(
    rules: [
      { allow: owner, operations: [create, update, delete, read] }
      { allow: private, operations: [read] }
    ]
  ) {
  id: ID!
  content: String!
  upVotes: Int!
  downVotes: Int!
  image: String
  clubId: ID!
  comments: [Comment] @connection(keyName: "byPost", fields: ["id"])
}
And when I fetch the post, this is what I get:
{
"id": "xxxxxxx",
"content": "xxxxx!",
"upVotes": 0,
"downVotes": 0,
"image": null,
"clubId": "xxxxxx",
"comments": {
"nextToken": null
},
"createdAt": "2021-12-05T10:46:59.797Z",
"updatedAt": "2021-12-05T10:46:59.797Z",
"owner": "moneebalalfi@gmail.com"
}
I want something like this:
{
"id": "xxxxx",
"content": "xxxxxxx",
"upVotes": 0,
"downVotes": 0,
"image": null,
"clubId": "xxxxxxxx",
"comments": {
"nextToken": null
},
"createdAt": "2021-12-05T10:46:59.797Z",
"updatedAt": "2021-12-05T10:46:59.797Z",
"owner": {
name: "xxxxx",
email: "xxxx#gmail.com",
image: "xxxxxx",
// and so on ...
}
}
The owner field is only used for checking whether the requesting user is the owner of the data.
Owner authorization specifies whether a user can access or operate against an object. To do so, each object will get an ownerField field (by default owner will be added to the object if not specified) that stores ownership information and is verified in various ways during resolver execution.
from Amplify Docs
To be able to make a connection to the user, you need to create another GraphQL type.
type User
  @model
  @auth(
    rules: [
      { allow: owner, operations: [create, update, delete, read] }
      { allow: private, operations: [read] }
    ]
  ) {
  id: ID!
  email: String!
  image: String
}
type Post
  @model
  @key(name: "byClub", fields: ["clubId"])
  @auth(
    rules: [
      { allow: owner, operations: [create, update, delete, read] }
      { allow: private, operations: [read] }
    ]
  ) {
  id: ID!
  content: String!
  upVotes: Int!
  downVotes: Int!
  image: String
  clubId: ID!
  owner: ID
  user: User @connection(fields: ["owner"])
  comments: [Comment] @connection(keyName: "byPost", fields: ["id"])
}
You might need to configure amplify codegen to increase the max-depth, if it's not showing.
$ amplify configure codegen
$ amplify codegen
I am making a MongoDB model:
const mongoose = require('mongoose');
const { Schema } = mongoose;

const locationSchema = new Schema({
  name: String,
  Address: String,
  ContactInfo: {
    phone: Number,
    email: String,
  },
  Website: String,
  Hours: {
    DaysOpen: String,
    OpeningTime: [Number],
    ClosingTime: [Number],
  },
  Services: String,
  Languages: String,
  Documentation: Boolean,
  OtherNotes: String,
});

mongoose.model('Locations', locationSchema);
When I run a GET request to see what is in my database, this is returned:
{
  "error": false,
  "location": {
    "Hours": {
      "OpeningTime": [
        1215,
        898
      ],
      "ClosingTime": [
        1400
      ],
      "DaysOpen": "Sunday"
    },
    "_id": "5ee8fd2e57aa5126d4c1c854",
    "name": "Evergreen Christian Center Food Pantry",
    "Address": "4400 NW Glencoe Rd, Hillsboro, OR 97124",
    "ContactInfo": {
      "phone": 5033196590,
      "email": "gonzocyn2@msn.com"
    },
    "Website": "https://www.ecc4.org/home",
    "Services": "All foods available including meat and frozen foods",
    "Languages": "English, Spanish",
    "Documentation": false,
    "OtherNotes": "Bring own bag or box. Sign up starts at 9:00am",
    "__v": 0
  }
}
The problem is that "Hours" is displayed before the name, address, and contact info. This only occurs when the "OpeningTime" and "ClosingTime" fields are arrays. Any idea how to fix this?
In my sample document, I have a campaign document that contains the document's _id and an importData array. importData is an array of objects, each containing a unique date and source value.
My goal is to have an object updated with a unique date/source pair, and to have the new object replace any matching object. In the example below, Fred may have originally donated a TV, but I want my application to update the object to reflect that he donated both a TV and a radio.
// Events (sample document)
{
  "_id": "Junky Joe's Jubilee",
  "importData": [
    {
      "date": "2015-05-31",
      "source": "Fred",
      "items": [
        { item: "TV", value: 20.00 },
        { item: "radio", value: 5.34 }
      ]
    },
    {
      "date": "2015-05-31",
      "source": "Mary",
      "items": [
        { item: "Dresser", value: 225.00 }
      ]
    }
  ]
}
My original thought was to do something like the code below, but not only am I updating importData with Fred's donations, I'm also blowing away everything else in the importData array:
var collection = db.collection("events");
collection.update(
  // See if we can find a campaign object with this name
  {
    _id: "Junky Joe's Jubilee",
    importData: {
      date: "2015-05-31",
      source: 'Fred'
    }
  },
  {
    $set: {
      "importData": {
        date: "2015-05-31",
        source: 'Fred',
        items: [
          { item: "TV", value: 20.00 },
          { item: "radio", value: 5.34 }
        ]
      }
    }
  },
  // Create a document if one does not exist for this campaign
  { upsert: true }
);
When I tried $push (instead of $set), I was getting multiple entries for the same date/source combination (e.g. Fred would appear to have donated two items multiple times on "2015-05-31").
How would I go about doing this with the MongoDB native driver and Node.js?
Try this:
var collection = db.collection("events");
collection.update(
  // See if we can find a campaign object with this name
  {
    _id: "Junky Joe's Jubilee",
    importData: {
      date: "2015-05-31",
      source: 'Fred'
    }
  },
  {
    $set: {
      "importData.$": {
        date: "2015-05-31",
        source: 'Fred',
        items: [
          { item: "TV", value: 20.00 },
          { item: "radio", value: 5.34 }
        ]
      }
    }
  },
  // Create a document if one does not exist for this campaign
  { upsert: true }
);
According to the documentation under Array update operators, this should modify only the first element in the array that matches the query.
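One caveat worth adding: the positional $ operator updates only the first matching element, and combining it with upsert: true can fail when no array element matches. On MongoDB 3.6+ an alternative is arrayFilters together with the modern updateOne method. The sketch below (helper name invented; data taken from the question) only builds the three arguments and does not talk to a database:

```javascript
// Build the arguments for an arrayFilters-based replacement of one
// date/source element (MongoDB 3.6+). Names mirror the sample document.
function replaceDonation(date, source, items) {
  return {
    filter: { _id: "Junky Joe's Jubilee" },
    update: { $set: { "importData.$[elem]": { date, source, items } } },
    options: { arrayFilters: [{ "elem.date": date, "elem.source": source }] },
  };
}

// Usage with the Node driver would look like:
//   const { filter, update, options } = replaceDonation(
//     "2015-05-31", "Fred",
//     [{ item: "TV", value: 20.00 }, { item: "radio", value: 5.34 }]
//   );
//   await db.collection("events").updateOne(filter, update, options);

console.log(replaceDonation("2015-05-31", "Fred", []).options.arrayFilters);
```

Unlike the positional $ form, the arrayFilters condition lives in the options rather than the query, so the query itself stays a plain _id lookup.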