What is the maximum size for redux-persist?

I have a medium-sized object whose maximum size is 10 MB.
What is the maximum amount of data that can be stored and persisted in Redux using redux-persist?
Also, what is the best approach for data of this size: redux-persist, or a database like Realm?

Update
When this post was originally written it was not possible to set the size of AsyncStorage. However, in May 2019 the following commit changed that.
You can read more about it here.
Current Async Storage's size is set to 6 MB. Going over this limit causes a "database or disk is full" error. This 6 MB limit is a sane limit to protect the user from the app storing too much data in the database. This also protects the database from filling up the disk cache and becoming malformed (endTransaction() calls will throw an exception, not roll back, and leave the db malformed). You have to be aware of that risk when increasing the database size. We recommend ensuring that your app does not write more data to AsyncStorage than space is left on disk. Since AsyncStorage is based on SQLite on Android, you also have to be aware of the SQLite limits.
If you still wish to increase the storage capacity, you can add the following to your android/gradle.properties:
AsyncStorage_db_size_in_MB=10
This will set the size of the database to 10 MB instead of the default 6 MB.
Original Answer
If you use the default storage settings for redux-persist in React Native, it will use AsyncStorage.
AsyncStorage has limits on how much it can store, depending on the operating system you are using.
On Android, if we look at the native code behind AsyncStorage, we can see that the upper limit is just 6 MB.
On iOS there is no such limit on the amount of storage that can be used. You can see this SO answer for further discussion.
If you don't want to use AsyncStorage, there are alternatives like redux-persist-fs-storage or redux-persist-filesystem-storage which get around the 6 MB limitation on Android.
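If you go the filesystem route, the wiring is a one-line change to the persist config. A minimal sketch, assuming the redux-persist and redux-persist-filesystem-storage packages are installed and that you have a rootReducer of your own (the reducer import path here is made up for illustration):

```javascript
// Sketch: swap AsyncStorage for a filesystem-backed engine
// to get around Android's 6 MB AsyncStorage limit.
import { createStore } from 'redux';
import { persistStore, persistReducer } from 'redux-persist';
import FilesystemStorage from 'redux-persist-filesystem-storage';

import rootReducer from './reducers'; // your app's root reducer (assumption)

const persistConfig = {
  key: 'root',
  storage: FilesystemStorage, // files on disk instead of AsyncStorage's SQLite table
};

const store = createStore(persistReducer(persistConfig, rootReducer));
const persistor = persistStore(store);
```

Everything else about your redux-persist setup stays the same; only the storage engine changes.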

Related

What is the best way to store large data in React.js: cookies, localStorage, or any other?

I found some similar questions, but this one is unique because of the data size itself.
I have an axios request for a large amount of data that gets called only when the app starts. It is an array of hundreds of thousands of objects and can take about a minute to load the first time through, so I would like to store it somewhere so that when you open the app you don't have to wait a minute again. What is the best procedure for this? I was thinking localStorage, but I wasn't sure whether it can hold large amounts of data, or whether it is even the best option. I read that 5 MB is the maximum you can store in localStorage, so I should ask about both scenarios: assuming the data is less than 5 MB, is this the best way? And what if it is over 5 MB?
The maximum size of session/local storage depends on the browser's settings. I don't recommend using local/session storage for large amounts of data. Cookies are set via the Set-Cookie response header (or manually in client-side code), but they are not a good way to store data either.
How about using IndexedDB? The following link may help you:
https://visualstudiomagazine.com/articles/2016/08/30/storing-data-client-javascript-typescript.aspx
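Before committing to localStorage at all, you can estimate the serialized payload size up front. A rough sketch (the ~5 MB figure is a common per-origin default, not a guarantee, and localStorage internally stores UTF-16 strings, so this byte count is an estimate):

```javascript
// Rough check: how many bytes will this payload occupy once stringified?
function payloadSizeBytes(value) {
  return new TextEncoder().encode(JSON.stringify(value)).length;
}

// Compare against a budget; 5 MB is a commonly cited localStorage default.
function fitsInLocalStorage(value, budgetBytes = 5 * 1024 * 1024) {
  return payloadSizeBytes(value) <= budgetBytes;
}

const rows = Array.from({ length: 1000 }, (_, i) => ({ id: i, name: 'item-' + i }));
console.log(payloadSizeBytes(rows) + ' bytes; fits: ' + fitsInLocalStorage(rows));
```

If the check fails, that is a strong signal to move to IndexedDB (or to not cache client-side at all).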

What is considered "Too much data" in react state

I'm currently building an app with a dashboard that lists a ton of statistics. The data is loaded via an API and stored in my component's state (currently not using Redux, just plain React). I'm loading more than 100,000 (small) rows of data and storing them as an array in state. My question is: at what point does the size of my state become a problem? Surely there will be memory limitations at some point? Are 100,000 entries in an array a problem? Is 1,000,000? And if so, what are alternative solutions for handling this amount of data? Is this where Redux can help?
For the most part, it does not matter where you store this data as much as how much data you are storing. All of the data that you store, regardless of whether it is in a store or in a static variable, is kept in RAM. Because of this, your application might crash the browser by taking too many resources.
A much better solution for storage (if you absolutely have to store data client-side) is something called IndexedDB. IndexedDB stores data on your hard disk instead of in RAM.
In most use cases, however, it is recommended to store the data in the backend, paginate it, and then send only the individual pages to the client as needed. This ensures that:
The client doesn't have to load a massive chunk of data before the application works.
The client does not have to store large amounts of data in RAM.
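The slicing the answer describes can be sketched as a small pure function (this is the server-side logic; in practice the page and page size would arrive as query parameters on an endpoint of your own design):

```javascript
// Minimal pagination sketch: the server returns one page at a time,
// so the client never holds the full dataset in memory.
function paginate(rows, page, pageSize) {
  const start = (page - 1) * pageSize;
  return {
    page,
    totalPages: Math.ceil(rows.length / pageSize),
    rows: rows.slice(start, start + pageSize),
  };
}

// Example: 100,000 rows on the server, but the client only ever receives 50.
const all = Array.from({ length: 100000 }, (_, i) => ({ id: i }));
const firstPage = paginate(all, 1, 50);
console.log(firstPage.rows.length, firstPage.totalPages); // 50 2000
```

The dashboard then requests page 2 only when the user actually scrolls or clicks to it.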

Making Large IndexedDB Persistent in Browser

I am looking at making a line-of-business (LOB) HTML5 web application. The main thing we are trying to accomplish is making the application offline-capable. This will mean taking a large chunk of SQL data from the server and storing it in the browser. It will need to live in the browser for quite a while; we don't want to have to continuously refresh it every time the browser is closed and reopened.
We are looking at storing the data client-side in IndexedDB, but I read that IndexedDB is kept in temporary storage, so its lifetime cannot be relied on. Does anyone know of any strategies for prolonging its lifetime? Also, we will be pulling down massive chunks of data, so 1-5 MB of storage might not suffice for what we require.
My current thought is to store it in browser storage using the HTML5 storage APIs and hydrate it into IndexedDB as it's required. We just need to make sure we can grow the storage limit to whatever we need.
Any advice on how we approach this?
We are looking at storing the data inside the client in indexedDB but I read that indexedDB is stored in temporary storage so the lifetime of it cannot be relied on.
That is technically true, but in practice I've never seen the browser actually delete data. More commonly, if you're storing a lot of data, you will hit quota limits, which are annoying and sometimes inconsistent/buggy.
Regardless, you shouldn't rely on data in IndexedDB always being there forever, because users can always delete data, have their computers break without backups, etc.
If you create a Chrome extension, you can enable unlimited storage. I've successfully stored several thousand large text documents persistently in IndexedDB using this approach.
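Modern browsers also expose the Storage API's navigator.storage.persist(), which asks the browser not to evict this origin's data (including IndexedDB) under storage pressure. A sketch, with the storage manager injectable so the helper can be exercised outside a browser; availability and the grant decision vary by browser:

```javascript
// Sketch: ask the browser to keep this origin's data under storage pressure.
// In a real page, call requestPersistence() with no argument so it picks up
// the browser's navigator.storage.
async function requestPersistence(storageManager = globalThis.navigator?.storage) {
  if (!storageManager || !storageManager.persist) {
    return false; // Storage API not supported here
  }
  const granted = await storageManager.persist();
  const { usage, quota } = await storageManager.estimate();
  console.log(`persistent: ${granted}, using ${usage} of ${quota} bytes`);
  return granted;
}
```

Even when persistence is granted, treat it as a strong hint rather than a guarantee; the user can still clear site data manually.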
This might be a silly question to add, but can you access the storage area outside of the browser? For instance, if I did not want a huge lag when my app starts up and loads a bunch of data, could I make an external app to "refresh" the local data so that when the browser starts, it is ready to rock and roll?
I assume the answer here will be no, but I had to ask.
Has anyone worked around this for large data sets? For instance, loading in one tab and working in another? A Chrome extension to load, but access via the app?

(HTML 5) How much is too much Local Storage?

Some questions regarding HTML5 client-side storage:
How much data in Local Storage is considered too much?
Is there a limit on the size?
Since it's saved to files, will it have any effect on the browser's speed?
Why use database storage? Is it indexed?
Why not use localStorage, where the key is the index (if unique) of the record and the value is the record, JSON-stringified?
EDIT
Just a follow-up to the answer: after the Web SQL Database project was dropped, all browsers proceeded to implement the soon-to-be-standard IndexedDB.
Check this other Question.
HTML5 localStorage size limit for subdomains
It depends on your application.
5 MB is the max size.
No impact.
Database storage (Web SQL) is deprecated, so it will not receive further updates. Current browsers support it, yet their implementations may not be standard, so it is not a good idea to use it.
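On the "key as the record's index" idea from the question: that works fine within the ~5 MB budget, as long as writes handle the quota error. A sketch with the storage object injectable, so it can run against window.localStorage in a browser or any getItem/setItem-compatible object elsewhere:

```javascript
// Sketch: store one JSON-stringified record per key, surfacing quota errors.
function saveRecord(storage, id, record) {
  try {
    storage.setItem('record:' + id, JSON.stringify(record));
    return true;
  } catch (err) {
    // Browsers throw a DOMException named "QuotaExceededError" when full.
    if (err && err.name === 'QuotaExceededError') return false;
    throw err;
  }
}

function loadRecord(storage, id) {
  const raw = storage.getItem('record:' + id);
  return raw === null ? undefined : JSON.parse(raw);
}
```

The trade-off versus IndexedDB is that every lookup pays a JSON.parse, there are no secondary indexes, and all keys share the one small quota.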

How to cap memory usage by the Extensible Storage Engine (JET Blue)?

I have an app that every so often hits an ESE database quite hard and then stops for a long time. After hitting the database, memory usage goes way up (over 150 MB) and stays high. I'm assuming ESE has lots of cached data.
Is there a way to cap the memory usage by ESE? I'm happy to suffer any perf hit.
The only way I've seen to drop the memory usage is to close the DB.
You can control the database cache size by setting the database cache size system parameter (JET_paramCacheSize). That number can be changed on-the-fly.
You might not need to set it, though: by default ESENT will manage its cache size automatically by looking at available system memory, system paging, and database I/O load. If you have hundreds of MB of free memory then ESENT won't see any reason to reduce the cache size. On the other hand, if you start using the memory on your system, you should find that ESENT will automatically reduce the size of the database cache in your application. You can set the limits for automatic cache sizing with the JET_paramCacheSizeMin and JET_paramCacheSizeMax parameters.
Documentation link for the system parameters: http://msdn.microsoft.com/en-us/library/ms683044.aspx
