Flutter Firestore overwrites data when I use setData - database

I have been trying to add data to sections in my Firestore database; I have a collection -> document -> data fields. Whenever I use setData({'key': 'value'}) it always overwrites the data already in the document. Is there any way around this?

That is because of what setData(object) is for:
to create or overwrite a single document.
update(object), on the other hand, is for
updating some fields of a document without overwriting the entire document.
So what you need is update().
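A minimal sketch of that difference, shown here with the Go client for Cloud Firestore rather than the Flutter SDK (project ID, collection, and field names are made up for illustration):

```go
package main

import (
	"context"
	"log"

	"cloud.google.com/go/firestore"
)

func main() {
	ctx := context.Background()

	// "my-project" is a placeholder project ID.
	client, err := firestore.NewClient(ctx, "my-project")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	doc := client.Collection("users").Doc("alice")

	// Set without options creates the document or overwrites it completely:
	// any existing field not listed here is removed.
	if _, err := doc.Set(ctx, map[string]interface{}{"name": "Alice"}); err != nil {
		log.Fatal(err)
	}

	// Update only changes the listed fields and leaves the rest of the
	// document untouched; it fails if the document does not exist.
	if _, err := doc.Update(ctx, []firestore.Update{
		{Path: "age", Value: 30},
	}); err != nil {
		log.Fatal(err)
	}
}
```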

Using merge: true in your setData() statement prevents the overwrite.
This seems similar to the update() method, but you can use it even when the document does not exist yet (i.e. is being created).
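In the Go client, the counterpart of merge: true is the firestore.MergeAll set option; a hedged sketch, reusing the client setup from the sketch above:

```go
// Assumes the same imports (context, cloud.google.com/go/firestore)
// and client setup as the previous sketch.
func setWithMerge(ctx context.Context, client *firestore.Client) error {
	// firestore.MergeAll behaves like merge: true in the Flutter plugin:
	// only the fields given here are written, other fields are preserved,
	// and the document is created if it does not exist yet.
	_, err := client.Collection("users").Doc("alice").Set(ctx, map[string]interface{}{
		"email": "alice@example.com",
	}, firestore.MergeAll)
	return err
}
```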

Related

How to get the key using "Put" inside RunInTransaction without waiting until after the transaction

The structure of the database I'm building is like a chain; it looks like this:
Click here to see the structure
Where those parts are:
Click here to see what each part represents
So, when I want to add new data to my chain:
Click here to see the new data coming
in any place I want, I can just update their values by updating the datastore.Key of the structs: click here to see the update
So, in this case I just need to update b.NextBlock, c.LastBlock, e.LastBlock and e.NextBlock and everything is fine. But let's suppose I want to add more new data: click here to see the new data coming
and I don't want to save the chain if any of that data fails. What should I do?
The normal thing to think of in both cases is to do it with the "client.RunInTransaction" method for each new piece of data, so I can guarantee that everything went fine. But this is not possible, because I can't get the "datastore.Key" when appending data to Datastore inside "client.RunInTransaction": as the documentation says (https://godoc.org/cloud.google.com/go/datastore#Transaction.Put), it returns a *PendingKey, not the key itself, and I need to be outside "client.RunInTransaction" to get the "datastore.Key" of the element after the commit, as the documentation says: https://godoc.org/cloud.google.com/go/datastore#Commit.Key
So, I want the "Put" function inside "client.RunInTransaction" to give me the key of that element while the code is still inside "client.RunInTransaction", not afterwards, so I can guarantee that everything was OK with the update; if I only get the key afterwards, the next append may fail and in that case I don't want my data to be saved.
First create the new data entity separately, to get its key. The LastBlock and NextBlock properties would still be empty at this point.
Only after you have the entity's key, use a transaction to perform the entity's insertion into the list, in which you only update the key references of that entity as well as of the previous and next entities (if any) between which it is being inserted.
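A hedged sketch of that two-step approach with cloud.google.com/go/datastore; the Block type, its fields, and the kind name are assumptions inferred from the question:

```go
package chain

import (
	"context"

	"cloud.google.com/go/datastore"
)

// Block mirrors the structs described in the question; the type, field
// and kind names are assumptions for illustration.
type Block struct {
	Data      string
	LastBlock *datastore.Key
	NextBlock *datastore.Key
}

// insertBetween inserts a new block between prevKey and nextKey.
func insertBetween(ctx context.Context, client *datastore.Client, prevKey, nextKey *datastore.Key, data string) error {
	// Step 1: create the entity on its own, outside any transaction, so
	// that client.Put returns its completed *datastore.Key right away.
	newKey, err := client.Put(ctx, datastore.IncompleteKey("Block", nil), &Block{Data: data})
	if err != nil {
		return err
	}

	// Step 2: wire up the references atomically. If anything fails here,
	// the transaction rolls back and the chain is left untouched (only
	// the new entity remains, still unlinked).
	_, err = client.RunInTransaction(ctx, func(tx *datastore.Transaction) error {
		var prev, next, cur Block
		if err := tx.Get(prevKey, &prev); err != nil {
			return err
		}
		if err := tx.Get(nextKey, &next); err != nil {
			return err
		}
		if err := tx.Get(newKey, &cur); err != nil {
			return err
		}
		prev.NextBlock = newKey
		next.LastBlock = newKey
		cur.LastBlock = prevKey
		cur.NextBlock = nextKey
		if _, err := tx.Put(prevKey, &prev); err != nil {
			return err
		}
		if _, err := tx.Put(nextKey, &next); err != nil {
			return err
		}
		_, err := tx.Put(newKey, &cur)
		return err
	})
	return err
}
```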

Mongoose: difference between findOneAndUpdate and update

What is the difference between findOneAndUpdate and update?
Both accept criteria to query and doc to update.
Well, there is the respective documentation to view for both .update() and .findAndModify(), which is the root method of .findOneAndUpdate().
But the main differences are:
update(): Is meant to perform an atomic update operation against "one or more" documents matched by its query condition in a collection. It returns the number of modified documents in its response.
findOneAndUpdate(): Has the purpose of both processing an update statement on a "singular" document and retrieving the content of that "singular" document. The state returned depends on the value of the "new" option as passed to the operation. Where true, the "modified" document is returned. Where false, the "original" document is returned, before any modification. The latter form is the default option.
In short, one is meant to modify in "bulk" and not worry about the document content in the result, and the other is meant to modify a singular document and return the document content in the result.
That's the difference.
The .findOneAndUpdate method issues a MongoDB .findAndModify command and returns the found document (if any) to the callback, or the modified document rather than the original if the new option is true; .update simply executes the query as an update() operation.
Note that there is an option returnNewDocument on the findOneAndXXX methods; its default value is false, i.e. the original document is returned. If you are using the Node.js driver, the option is called returnOriginal.
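The same distinction exists outside mongoose; here is a minimal sketch with the official MongoDB Go driver (v1 API, made-up URI, collection, and field names): UpdateMany only reports counts, while FindOneAndUpdate returns the document, with ReturnDocument(After) playing the role of new: true.

```go
package main

import (
	"context"
	"fmt"
	"log"

	"go.mongodb.org/mongo-driver/bson"
	"go.mongodb.org/mongo-driver/mongo"
	"go.mongodb.org/mongo-driver/mongo/options"
)

func main() {
	ctx := context.Background()
	client, err := mongo.Connect(ctx, options.Client().ApplyURI("mongodb://localhost:27017"))
	if err != nil {
		log.Fatal(err)
	}
	defer client.Disconnect(ctx)
	coll := client.Database("test").Collection("users")

	// Bulk-style update: modifies every matching document and only
	// reports how many were modified, not their content.
	res, err := coll.UpdateMany(ctx, bson.M{"active": true}, bson.M{"$set": bson.M{"checked": true}})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("modified:", res.ModifiedCount)

	// Find-and-modify style: atomically updates one document and
	// returns it; ReturnDocument(After) returns the modified version.
	var updated bson.M
	err = coll.FindOneAndUpdate(ctx,
		bson.M{"name": "alice"},
		bson.M{"$inc": bson.M{"logins": 1}},
		options.FindOneAndUpdate().SetReturnDocument(options.After),
	).Decode(&updated)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("updated doc:", updated)
}
```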

Update Drupal Application with an external DB script

I'm trying to update my Drupal application with an external script that writes directly to the MySQL DB; I only need to modify one field of one specific datatype. I see it in a table named field_data_field_FIELDNAME, but when I update it the application doesn't change. Do I need to modify something else? Thanks
Assuming you are trying to change the field's value, try modifying both field_data_field_FIELDNAME and field_revision_field_FIELDNAME, then clear all caches.
Note that you do not actually need to clear all caches if you know what needs clearing; e.g. to clear the field cache for a particular node, you can use cache_clear_all('field:node:[your nid here]', 'cache_field');
If you are trying to change more than the field's value, I suggest you do it through the Field API.
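If you do stay with the raw-SQL approach for the single value, both tables need the same update; a rough sketch in Go with database/sql, assuming a Drupal 7 text field whose value column is field_FIELDNAME_value, plus a hypothetical DSN and node id:

```go
package main

import (
	"database/sql"
	"log"

	_ "github.com/go-sql-driver/mysql"
)

func main() {
	// DSN, table, and column names are assumptions: Drupal 7 stores a
	// text field's value in field_FIELDNAME_value inside both the
	// field_data_* and field_revision_* tables.
	db, err := sql.Open("mysql", "user:pass@tcp(localhost:3306)/drupal")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	nid := 42          // hypothetical node id
	newValue := "demo" // hypothetical new field value

	for _, table := range []string{"field_data_field_FIELDNAME", "field_revision_field_FIELDNAME"} {
		_, err := db.Exec(
			"UPDATE "+table+" SET field_FIELDNAME_value = ? WHERE entity_type = 'node' AND entity_id = ?",
			newValue, nid,
		)
		if err != nil {
			log.Fatal(err)
		}
	}
	// The field cache still has to be cleared from within Drupal, e.g.
	// cache_clear_all('field:node:42', 'cache_field'); as noted above.
}
```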

RavenDB - How to backpopulate "old" documents after adding new property to POCO?

I'm just starting to learn about NoSQL/Document storage this morning. I am used to EntityFramework/SQLServer.
My question is the following: if I have a bunch of "documents" stored and somewhere down the line I add a property to my class that is needed by my app, how do I back-populate the already existing records?
If you change the model after the fact then you have a few options.
If you have a default value for the additional field and can wait until the next time that entity is saved to the database, then you can simply add the new property and set it to the default value in the constructor.
You can use an IDocumentConversionListener (http://ayende.com/blog/66563/ravendb-migrations-rolling-updates).
You can also use RavenMigrations (https://github.com/khalidabuhakmeh/RavenMigrations), which I have never used, but it seems like it would do what you want.

How to store Propel objects in a file?

I need to have the option for Propel to store the data in a file as well as in the database. Is there any way to do that? I'll create the objects, fill in the data, and then need to store them in a file (session) and be able to recreate the objects later. At some point they will go into the database as well. Any idea?
I assume you could always create the object, fill some fields, and then serialize the object to a string, which you can save to a file. If you then deserialize this string, you get your original object. Watch out for references to objects or resources that can't be serialized or that should be re-created on deserialization.
Once you get this working in one class, you can write a behavior (in Propel 1.5) that adds it to all your model classes.
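Propel itself is PHP (serialize()/unserialize()), but the round trip described above is generic; a small Go sketch of the same pattern, with a made-up struct and file name, writing the serialized object to a file and recreating it later:

```go
package main

import (
	"encoding/json"
	"fmt"
	"log"
	"os"
)

// Book stands in for a Propel model object; in PHP you would call
// serialize($book) / unserialize($data) instead of JSON marshalling.
type Book struct {
	Title  string
	Author string
}

func main() {
	b := Book{Title: "Propel Notes", Author: "Jane"}

	// Serialize to bytes and store them in a file (or a session).
	data, err := json.Marshal(b)
	if err != nil {
		log.Fatal(err)
	}
	if err := os.WriteFile("book.json", data, 0o644); err != nil {
		log.Fatal(err)
	}

	// Later: read the file back and recreate the object. Open
	// connections and other resources do not survive this round trip
	// and must be re-created by hand.
	raw, err := os.ReadFile("book.json")
	if err != nil {
		log.Fatal(err)
	}
	var restored Book
	if err := json.Unmarshal(raw, &restored); err != nil {
		log.Fatal(err)
	}
	fmt.Println(restored.Title, restored.Author)
}
```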
