I have a question about inserting an object into Firestore with AngularFire:
My object, Person.ts:
name: String
age: Number
//--constructor--
//--getters and setters--
If I do this, the insert works (but is this good practice?):
[person.component.ts]
this.db.collection("person").add({
  name: this.person.$nome,
  age: this.person.$email
})
...
but if I try:
[person.component.ts]
this.db.collection("person").add({
  Person: this.person
  // or just this.person
})
I get this error in the browser console:
Function DocumentReference.set() called with invalid data. Unsupported field value: a custom Person object (found in field Person)
at new FirestoreError (error.js:149)
at
Firestore only accepts a JavaScript object embedded within a document if it is a "pure" (plain) object, which means you can't pass custom class instances directly while coding in TypeScript.
Change your code to:
this.db.collection("person").add(Object.assign({}, this.person));
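A minimal sketch of the same fix in context, using object spread (equivalent to Object.assign({}, ...)); the import paths, the injected AngularFirestore service and the no-argument Person constructor are assumptions that depend on your AngularFire version and setup:

// person.component.ts (sketch only)
import { Component } from '@angular/core';
import { AngularFirestore } from '@angular/fire/compat/firestore'; // path varies by AngularFire version
import { Person } from './person'; // assumed location of the Person class

@Component({ selector: 'app-person', template: '' })
export class PersonComponent {
  person: Person = new Person(); // hypothetical no-argument constructor

  constructor(private db: AngularFirestore) {}

  save() {
    // Spread copies the instance's own enumerable fields into a plain object,
    // which Firestore accepts as document data.
    this.db.collection('person').add({ ...this.person });
  }
}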
Can anyone explain to me why and how this might happen?
I have a TypeScript app with Zustand state management.
At one point in the app I update certain elements by extracting them from the state and cloning them via a simple Object.assign:
let elemToUpdate = Object.assign({}, story?.content?.elementsData[nodeId]);
console.log(elemToUpdate);
if (elemToUpdate) {
  if (elemToUpdate.title) elemToUpdate.title[editorLang] = newName;
  else elemToUpdate.title = { [editorLang]: newName } as TextDictionary;
  updateElement(nodeId, elemToUpdate);
}
Now the interesting part: on my first try the update goes through without fail, but the next object I try to update fails with the following message:
Tree.tsx:39 Uncaught TypeError: Cannot assign to read only property 'en' of object '#<Object>'
I can't understand WHY the first one comes through, but the second gets blocked.
(I know HOW to fix it, need to do deep clone, I just want to understand WHY)
Thanks
First, let's start with why some objects in your code are read-only. Based on what you described in the question, you use a Zustand state manager. Such managers traditionally wrap the stored data in read-only objects to prevent manual mutation (the expectation is that you change the state only via the built-in mechanisms), in order to guarantee data stability. So if story?.content?.elementsData[nodeId] is a Zustand state object, it and all of its nested objects are made read-only.
Second, let's define which objects are affected. I see at least two objects here: elemToUpdate: { ..., title: { [lang]: string } }, i.e. elemToUpdate itself and its title. Both are made read-only.
Third, you use Object.assign({}, ...), which creates a new object (a new reference) and copies all properties of the source object. This happens only for the first level of properties; there is no deep clone. So, since title is a reference to another object, it is copied as-is, and in the new object it still points to the existing read-only { [lang]: string } object. There are several ways to solve that: 1) a deep clone, as you mentioned; 2) manually cloning the title property, for instance { ...elemToUpdate, title: { ...elemToUpdate.title } } or via Object.assign (see the sketch below).
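A minimal sketch of option 2, using the element shape implied by the question's snippet (the ElementData interface, the cloneForUpdate helper and the assumption that TextDictionary is a simple language-to-string map are illustrative, not part of Zustand):

type TextDictionary = Record<string, string>;

interface ElementData {
  title?: TextDictionary;
}

function cloneForUpdate(source: ElementData, editorLang: string, newName: string): ElementData {
  // Shallow-copy the element, but give it its own writable copy of `title`,
  // so the read-only object held by the store is never touched.
  return {
    ...source,
    title: { ...source.title, [editorLang]: newName },
  };
}

// usage (illustrative): updateElement(nodeId, cloneForUpdate(source, editorLang, newName));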
But I would suggest not mutating your object this way at all. Your overall algorithm probably has some architectural issues.
That is expected, because in the first case you are not assigning a value to title, you are only changing the value of the title property. In the second case you are reassigning the value of the title property, and since it is a read-only value you can't change it. Let's understand this with a simple example.
JavaScript (just an example, not related to the problem):
const user = {
  name: 'John',
};
user.name = "Pete"; // This works

const user = {
  name: 'John',
};
user = { name: 'Pete' }; // This doesn't work
TypeScript:
const user: Readonly<{
  a: {
    name: string
  }
}> = {
  a: { name: 'John' },
};
user.a.name = "Pete";      // This works
user.a = { name: 'John' }; // Doesn't work: 'a' is read-only
The same thing is happening there: TypeScript does not check Readonly deeply, it only applies to the top-level properties.
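To tie this back to the runtime error in the question, here is a minimal sketch that reproduces it by deep-freezing the source object with Object.freeze (the freezing stands in for whatever makes the store's data read-only; it is not claimed to be what Zustand does internally):

type Title = { en: string };
type StoryElement = { title: Title };

const stored: StoryElement = { title: { en: 'Old name' } };
// Simulate a read-only store: freeze the object and its nested object.
Object.freeze(stored);
Object.freeze(stored.title);

// Shallow copy: `clone` is a new writable object, but `clone.title`
// still points at the frozen `stored.title`.
const clone: StoryElement = Object.assign({}, stored);

// clone.title.en = 'New name'; // throws in strict mode:
//                              // TypeError: Cannot assign to read only property 'en'

clone.title = { en: 'New name' }; // fine: this only writes a property of the writable clone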
MongoDB has a field for every document called "_id". I see people using it everywhere as a primary key, and using it in queries to find documents by the _id.
This field defaults to using an ObjectId which is auto-generated, an example is:
db.tasks.findOne()
{
  _id: ObjectID("ADF9"),
  description: "Write lesson plan",
  due_date: ISODate("2014-04-01"),
  owner: ObjectID("AAF1") // Reference to another document
}
But in JavaScript, a leading underscore on an object field is a convention for "private", and since MongoDB uses JSON (specifically, BSON), should I be using these _ids for querying, finding and describing relationships between documents? It doesn't seem right.
I saw that MongoDB has a way to generate a UUID: https://docs.mongodb.com/manual/reference/method/UUID
Should I forget about that _id property and create my own indexed id property with a UUID?
Use UUIDs for user-generated content, e.g. to name image uploads. UUIDs can be exposed to the user in a URL or when the user inspects an image on the client side. For everything that is on the server and not exposed to the user, there is no need to generate a UUID, and using the auto-generated _id is preferred.
A simple example of using a UUID would be:
const uuid = require('uuid');

exports.nameFile = async (req, res, next) => {
  // `extension` is assumed to be derived elsewhere, e.g. from the uploaded file's MIME type
  req.body.photo = `${uuid.v4()}.${extension}`;
  next();
};
How MongoDB names its things should not dictate how you name your things. If data sent by a third party conflicts with the conventions you agreed to follow, transform that data into the format you want as soon as it arrives in your application.
An example based on your case:
function findTaskById(id) {
  var result = db.tasks.findOne({ "_id": id });
  var task = {
    id: result._id,
    description: result.description,
    something: result.something
  };
  return task;
}
This way you isolate the use of Mongo's _id in the layer of your application that is responsible for interacting with the database. In all other places where you need a task, you can use task.id.
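The same isolation works in the write direction too; a minimal sketch (the updateTaskDescription name and the $set fields are illustrative, not from the original answer):

// Translate the application-level `id` back to Mongo's `_id` only at the database boundary.
function updateTaskDescription(id, description) {
  return db.tasks.updateOne(
    { "_id": id },
    { $set: { description: description } }
  );
}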
Hi, I created a SimpleSchema for a Mongo collection which has a variable number of sub-documents called measurables. Unfortunately it's been a while since I've done this and I can't remember how to insert into this type of schema! Can someone help me out?
The schema is as follows:
const ExerciseTemplates = new Mongo.Collection('ExerciseTemplates');

const ExerciseTemplateSchema = new SimpleSchema({
  name: {
    type: String,
    label: 'name',
  },
  description: {
    type: String,
    label: 'description',
  },
  createdAt: {
    type: Date,
    label: 'date',
  },
  measurables: {
    type: Array,
    minCount: 1,
  },
  'measurables.$': Object,
  'measurables.$.name': String,
  'measurables.$.unit': String,
});

ExerciseTemplates.attachSchema(ExerciseTemplateSchema);
The method is:
Meteor.methods({
  addNewExerciseTemplate(name, description, measurables) {
    ExerciseTemplates.insert({
      name,
      description,
      createdAt: new Date(),
      measurables,
    });
  },
});
The data sent by my form for measurables is an array of objects.
The SimpleSchema docs seem to be out of date. If I use the example they show, with measurables: { type: [Object] } for an array of objects, I get an error that the type can't be an array and that I should set it to Array.
Any suggestions would be awesome!!
Many thanks in advance!
Edit:
The measurables variable contains the following data:
[{ name: 'weight', unit: 'kg' }]
With the schema above I get no error at all; it is silent as if it were successful, but when I check the db via the CLI I have no collections. Am I doing something really stupid? When I create a new Meteor app it creates a Mongo db for me, I assume; I'm not forgetting to actually create a db or something dumb?
Turns out I was being stupid. The schema I posted is correct and works exactly as intended. The problem was that I defined my schema and method in a file in my imports directory, outside both the client and server directories. This methods file was imported into the file with the form that calls the method, and was therefore available on the client, but it was never imported on the server.
I guess the method was being called on the client as a stub, so I saw the console.log firing, but it was never called on the server and therefore never hit the db.
Good lesson for me regarding the new recommended file structure. Always import server-side code in server/main.js!!! :D
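For anyone hitting the same thing, a minimal sketch of the missing import (the /imports/api/exercise-templates.js path is just an assumed location for the schema and methods file above):

// server/main.js: make sure the collection, schema and methods are loaded on the server
import '/imports/api/exercise-templates.js';

// client/main.js: import it on the client too so the method stub keeps working
import '/imports/api/exercise-templates.js';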
Thanks for your help, thought I was going to go mad!
Got a server returning a JSON object like so:
{
  "key1": "value",
  "key2": {
    "key2_0": "value"
  }
}
And a collection:
var Collection = Backbone.Collection.extend({
url:api.url//which returns the object above
});
var collection = new Collection();
collection.fetch({
success:function(data){
//do something
}
});
Now I need to use certain properties of the collection throughout my application, but say I need key1, I always have to do collection.at(0).get('key1'); // returns 'value', because the data returned is stored within the collection, in a new array at index 0.
Question:
How can I directly do collection.get('key1')? // now returns undefined... because it is.
I know I could expose an object to the global scope in the collection's success callback (some_other_var = data.toJSON()[0]) and access the some_other_var properties directly, but that's not what I'm looking for.
In order to use the get() function of a Backbone.Collection you need to know the id or cid of the model you want.
For instance, let's say your data coming from the server looks like this:
[{
  id: '123',
  name: 'Alex'
}, {
  id: '456',
  name: 'Jhon'
}]
In that case you can do this:
this.collection.get('123').get('name') // Returns "Alex"
Keep in mind that a collection is just a set of models, so behind the scenes collection.get() is getting a model by its id.
Tip: if you don't have any kind of id in your server data, there is always the option of using the Underscore methods mixed into collections (see the sketch after this list):
find
filter
some
contains
etc
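A minimal sketch of that Underscore-style lookup, reusing the name attribute from the example data above:

// Find the first model whose `name` attribute is 'Alex' (no id needed).
var alex = collection.find(function(model) {
  return model.get('name') === 'Alex';
});

// findWhere is a shortcut for the same kind of lookup:
var alexAgain = collection.findWhere({ name: 'Alex' });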
It seems like you're trying to ascribe attributes to a collection, but a collection is merely a set of models. Having additional data that is constant throughout the collection suggests that it should be wrapped inside another Model, which is demonstrated here: Persisting & loading metadata in a backbone.js collection
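Since the server returns a single object rather than an array of records, one option along those lines is to model the response as a Backbone.Model instead of a Collection; a sketch, reusing api.url from the question (the Config name is illustrative):

var Config = Backbone.Model.extend({
  url: api.url // the endpoint that returns the single JSON object
});

var config = new Config();
config.fetch({
  success: function(model) {
    console.log(model.get('key1'));        // 'value'
    console.log(model.get('key2').key2_0); // 'value'
  }
});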
I'm using Backbone + backbone.localStorage to persist my data, and I'm getting some incorrect behavior:
I've got a model Settings with one attribute called user:
Settings = Backbone.Model.extend({
  localStorage: new Backbone.LocalStorage('settingsStore')
});

var settings = new Settings();
settings.set({ user: 'USERNAME' });
settings.save();
After this code, if I output the settings.attributes data in weinre I get the following:
settings.attributes
Object
id: "3ac78cfb-ad60-1ab8-8391-f058ae9bfcfb"
user: "USERNAME"
__proto__: Object
Then I save the model to the localStorage, clear, and fetch it again:
settings.save();
settings.clear();
settings.fetch();
And the problem is that if I output settings.attributes now, the attributes are stored inside a nested object:
settings.attributes
Object
0: Object
id: "3ac78cfb-ad60-1ab8-8391-f058ae9bfcfb"
user: "USERNAME"
__proto__: Object
__proto__: Object
And the problem is that when I set the user name again in order to modify it, a new attribute is added like this:
settings.attributes
Object
0: Object
id: "3ac78cfb-ad60-1ab8-8391-f058ae9bfcfb"
user: "USERNAME"
__proto__: Object
user: "NEWUSER"
__proto__: Object
And if I save this model and fetch it again, I get two new objects in the attributes... and it keeps growing each time.
The answer given by fguillen in the linked question gives the correct solution to this problem.
You just need to create the model object with a hardcoded id if you want to save and fetch it correctly.
After doing this:
var settings = new Settings({ id: 1 });
The save() and fetch() methods now work correctly. Obviously you have to take care not to repeat ids...
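A minimal sketch of the full flow with the hardcoded id (same Settings model as in the question; the comments describe the expected behavior, not edge cases such as fetching before the record exists):

var Settings = Backbone.Model.extend({
  localStorage: new Backbone.LocalStorage('settingsStore')
});

// Always address the single settings record by the same id.
var settings = new Settings({ id: 1 });

settings.fetch();                    // loads the existing record with id 1, if there is one
settings.save({ user: 'USERNAME' }); // updates that same record instead of creating a new one

console.log(settings.attributes);    // { id: 1, user: 'USERNAME' }: flat, no nested objects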