Is Firestore a good database for storing many large objects in? - reactjs

I have been using React/Leaflet to create a choropleth map that can color any country on the map. What I am trying to do is develop a save/load function that saves the colored countries and can later load them back from the database, so that the exact same countries come back colored. I have been using firebase/firestore but haven't had any luck so far.
This is what my object of map data looks like:
Is Firestore the right database for this, or should I use another database? I need a database that can store multiple objects like the one in the picture above.

If you can convert that file into JSON smaller than 1 MB, which is the Firestore document size limit, it is possible. For the case you are proposing I would use the following structure, based on the information you shared, but feel free to adapt it as you see fit:
Map (collection)
    field1
    ...
    fieldN
    CountriesOptions (subcollection)
        optionObject: {}
Each object is stored as a separate document in the CountriesOptions subcollection, converted to a JSON string with JSON.stringify(obj).
For more information on how to structure your Firestore data with subcollections, see the documentation on the Firestore data model.
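Here is a minimal sketch of that structure using the Firebase v9 modular SDK. The "maps" collection, the CountriesOptions subcollection, and the optionObject field follow the outline above; the IDs and the size check are assumptions to adapt to your own data model.

import { initializeApp } from "firebase/app";
import { getFirestore, doc, setDoc, collection, getDocs } from "firebase/firestore";

const app = initializeApp({ /* your Firebase config */ });
const db = getFirestore(app);

// Save: one document per colored country, serialized with JSON.stringify.
async function saveMap(mapId, countriesOptions) {
  await setDoc(doc(db, "maps", mapId), { updatedAt: Date.now() });
  for (const [countryCode, options] of Object.entries(countriesOptions)) {
    const json = JSON.stringify(options);
    // Firestore rejects documents larger than ~1 MB, so check before writing.
    if (new Blob([json]).size > 1000000) {
      throw new Error(`Options for ${countryCode} exceed the 1 MB document limit`);
    }
    await setDoc(doc(db, "maps", mapId, "CountriesOptions", countryCode), {
      optionObject: json,
    });
  }
}

// Load: parse every subcollection document back into one object.
async function loadMap(mapId) {
  const snap = await getDocs(collection(db, "maps", mapId, "CountriesOptions"));
  const countriesOptions = {};
  snap.forEach((d) => {
    countriesOptions[d.id] = JSON.parse(d.data().optionObject);
  });
  return countriesOptions;
}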

Related

Structuring the Firestore: Should I make another collection to store the changes that were made?

I am using Reactjs and Firestore.
I have this collection of products:
The colorMap is a map, and below it are the different colors and their quantity.
Now I want to create a list, or a history, of whenever a product is added and whenever more quantity is added for those colors.
Should I add another collection that stores when a product is added or when quantities are added for a color? I'm also thinking of adding a createdDate.
Or is there any other way I could do this? As much as possible, I'd like to avoid using any Cloud Functions.
A common way to keep the history of each document is to create a subcollection under that document (say history) and, for every update you perform, write a new document there containing either the complete old document data or just the old values of the fields that were modified.
While it is convenient to do this from Cloud Functions, as they already get both the previous and the new data for each document write, you can accomplish the same from client-side code too.
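A minimal client-side sketch of this pattern with the Firebase v9 modular SDK (no Cloud Functions). The "products" collection and colorMap field come from the question; the "history" subcollection and field names are assumptions to rename as needed.

import { getFirestore, doc, getDoc, updateDoc, collection, addDoc, serverTimestamp } from "firebase/firestore";

const db = getFirestore();

// Update a color's quantity and record the previous value in
// products/{productId}/history.
async function updateColorQuantity(productId, color, newQuantity) {
  const productRef = doc(db, "products", productId);
  const snapshot = await getDoc(productRef);
  const oldData = snapshot.data();

  // Write the history entry first so a failed update doesn't lose the trail.
  await addDoc(collection(productRef, "history"), {
    changedField: `colorMap.${color}`,
    oldValue: oldData?.colorMap?.[color] ?? null,
    newValue: newQuantity,
    createdDate: serverTimestamp(),
  });

  await updateDoc(productRef, { [`colorMap.${color}`]: newQuantity });
}

If the history entry and the product update need to succeed or fail together, a batched write or transaction would make the two writes atomic.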

React: How to create interface's attributes dynamically during runtime

At the moment I read documents from a collection according to a particular data model defined by an interface in React.
This means the document's data is bound to hard-coded attributes like:
dealContent.dealWelcome.productId, where dealWelcome is a document ID and an interface in React.
Now we create documents dynamically in Firestore like:
dealcontent.MMDEde_dealWelcome.productId, where MMDEde_dealWelcome is the document ID.
Is there a way to create an attribute like MMDEde_dealWelcome during runtime, so the correct path in Firestore is read?
Best regards,
Johannes
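A plain-JavaScript sketch of the kind of runtime access being asked about: a computed property key and bracket notation let you build the document ID, and the attribute that holds its data, at runtime instead of hard-coding it. The dealcontent collection and the MMDEde_ prefix are taken from the question; everything else is an assumption.

import { getFirestore, doc, getDoc } from "firebase/firestore";

const db = getFirestore();

async function readProductId(prefix) {
  const documentId = `${prefix}_dealWelcome`;             // e.g. "MMDEde_dealWelcome"
  const snapshot = await getDoc(doc(db, "dealcontent", documentId));
  const dealContent = { [documentId]: snapshot.data() };  // computed property key
  return dealContent[documentId].productId;               // dynamic access at runtime
}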

Selenium WebDriver: Which is the better way to get data from an external data file?

I am just trying to log into a web application, fill out the input criteria (10 text fields), and click on submit. I am getting the data from XML.
My doubt here is that we can get input data from Excel, XML, JSON, etc., but which is better, more efficient, and lightweight? Please suggest.
You can use different approaches. For example:
If you want a representation of an object (your data has an object structure), you can store the information in XML/JSON, load the data into an entity, and pass it to the page using this entity (see the sketch below).
If you just want to load data and don't want to use objects, or your data is unstructured (it can't be represented as an object), you can use txt/csv/Excel.
Something else, depending on your situation.
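A minimal sketch of the first approach using the selenium-webdriver Node bindings: structured JSON is loaded into an object and used to fill the form. The testdata.json file, the login URL, the field IDs, and the "Dashboard" title are all illustrative assumptions.

const fs = require("fs");
const { Builder, By, until } = require("selenium-webdriver");

async function loginWithTestData() {
  // Load structured test data once; each key in data.fields maps to a form field id.
  const data = JSON.parse(fs.readFileSync("testdata.json", "utf8"));
  const driver = await new Builder().forBrowser("chrome").build();
  try {
    await driver.get("https://example.com/login");
    for (const [fieldId, value] of Object.entries(data.fields)) {
      await driver.findElement(By.id(fieldId)).sendKeys(value);
    }
    await driver.findElement(By.css("button[type='submit']")).click();
    await driver.wait(until.titleContains("Dashboard"), 5000);
  } finally {
    await driver.quit();
  }
}

loginWithTestData();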

Serialize Old App Engine 'db' Queryset into JSON

I'm just working on a rather dated project on App Engine that still uses the old 'db' model format instead of 'ndb'.
What would be the simplest way to serialize a 'db' query into JSON?
For example:
sections = Section.all() >>> JSON
All of the methods that I found from a Google search use the to_list method of 'ndb' models.
Thanks!!!
A quick read of the docs (have you done that?) turns up to_dict (https://cloud.google.com/appengine/docs/python/datastore/functions?hl=en#to_dict), which lets you transform a model entity into a dictionary. Dictionaries can be converted to JSON (unless they contain Decimal values and a few other types, but I am sure you can work around that). Then just iterate over the query result, producing a list of dicts, which you can pass to json.dump(thelist).
However, if you have a lot of entities you will have to take some additional steps; you can read the docs to work those out.

How to store Propel objects into file?

I need an option for Propel to store the data in the database and in a file as well. Is there any way to do that? I'll create the objects, fill in the data, and then need to store them in a file (session) and be able to recreate the objects later. At some point the data will go to the database as well. Any idea?
I assume you could always create the object, fill some fields, and then serialize the object to a string, which you can save to a file. If you then deserialize this string, you get your original object. Watch out for references to objects or resources that can't be serialized, or should be re-created on serialization.
Once you get this working in one class, you can write a behavior (in Propel 1.5) that adds it to all your model classes.
