I am developing a website in React that extracts data from packages that users request around the world. I have a search component that fetches the data from my backend. I want to save the latest packages so that the next time users search for other packages there is something like a "result of the last search", and keep it saved even if the customer navigates to another route. The question is that I don't know where to save it: should I use Redux, Local Storage, or something else?
If you want the results to persist across multiple page loads, Redux won't be enough, since Redux only keeps data in the currently running script's memory, which is not persistent.
Local Storage is an option, but it has a relatively low size limit (typically around 5 MB per origin). If you have a lot of data, you may hit that limit quickly and further writes will fail.
IndexedDB is like Local Storage, but its size limit is much higher, and it has a more complicated interface. If you have a lot of data, that's what I'd recommend. You may wish to use a library like localForage to make things easier.
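As a minimal sketch of that approach (the key name lastSearchResults and the fetchPackages() helper are only illustrative, not part of your code):

```javascript
// Sketch: cache the last search results with localForage (backed by IndexedDB).
// 'lastSearchResults' and fetchPackages() are hypothetical names for illustration.
import localforage from 'localforage';

async function searchPackages(query) {
  const results = await fetchPackages(query); // your existing backend call
  await localforage.setItem('lastSearchResults', { query, results, savedAt: Date.now() });
  return results;
}

async function loadLastSearch() {
  // Resolves to null if nothing has been cached yet.
  return localforage.getItem('lastSearchResults');
}
```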
Both IndexedDB and Local Storage save the data to the user's local machine. So, for example, if you want to save data for users regardless of what machine they log on from, or to cache data from different users' requests, the local machine options won't be enough - you'll need to save the data in a database on your server instead.
I'm building an application which uses a publicly available set of data that is rather large. I have two options to query it:
Via an API. For each query, my application would send a request using this dataset's API.
Alternatively, I could download the entire dataset (the CSV files total over 4.0 GB) and store it locally.
The type of operations and analysis I'd like to perform on the data for my web application is easily done with either method. However, I'm wondering which way is best and why.
The only thing I can think of is that querying a local database would be faster; using the API, however, would ensure the data is up to date ("valid" data in this dataset is said to expire after 10 years according to the organisation's website).
As you said, both options are valid; which one is better depends on your use case.
Consider the following questions:
How often is the data updated? Is it perhaps completely historical data that will never be updated, or will only new values be added while existing ones never change? How much effort would it be to update your locally stored copy automatically (a rough sketch follows this list)?
How time-critical are the response time and availability? Locally stored data makes you independent of network delay to the API, an outage of the API, a rate limit the service provider could introduce to throttle requests, or the data being taken offline entirely. How much data is requested on average, and what is the API's response time?
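As a rough sketch of one hybrid approach, keeping a local copy and refreshing it only when it gets stale (the refresh interval, the file path, and the downloadDataset() helper are all assumptions, not anything from the dataset's documentation):

```javascript
// Sketch: serve from a local copy of the dataset, re-download it when it is stale.
// DATA_FILE, REFRESH_AFTER_DAYS and downloadDataset() are hypothetical.
const fs = require('fs');

const DATA_FILE = './dataset.csv';
const REFRESH_AFTER_DAYS = 30;

async function ensureFreshDataset() {
  let stale = true;
  if (fs.existsSync(DATA_FILE)) {
    const ageMs = Date.now() - fs.statSync(DATA_FILE).mtimeMs;
    stale = ageMs > REFRESH_AFTER_DAYS * 24 * 60 * 60 * 1000;
  }
  if (stale) {
    await downloadDataset(DATA_FILE); // hypothetical helper that pulls the CSVs again
  }
  return DATA_FILE;
}
```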
I'm currently building an app with a dashboard that lists a ton of statistics. The data is loaded via an API and stored in my component's state (currently not using redux — just plain react). I'm loading more than 100.000 (small) rows of data, and storing them as an array in state. My question is: at what point does the size of my state become a problem? Surely there will be memory limitations at some point? is 100.000 entries in an array a problem, is 1.000.000? And if yes, what are alternative solutions to handling this amount of data? Is this where redux can help?
For the most part, it does not matter where you store this data so much as how much data you are storing. All of the data that you store, regardless of whether it lives in a store or in a static variable, is held in RAM. Because of this, your application might crash the browser by consuming too many resources.
A much better solution for storage (if you absolutely have to store data client-side) is something called IndexedDB. IndexedDB stores data on the user's disk instead of in RAM.
In most use cases, however, it is recommended to store the data in the backend, paginate it, and then send only the individual pages to the client as needed (a rough sketch follows the list below). This ensures that:
The client doesn't have to load a massive chunk of data before the application works.
The client does not have to store large amounts of data in RAM.
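As a minimal sketch of that pattern, assuming a hypothetical /api/stats endpoint that accepts page and pageSize parameters and returns a rows array:

```javascript
// Sketch: fetch only the current page of data in a React component.
// The /api/stats endpoint, its parameters, and the row shape are assumptions.
import { useEffect, useState } from 'react';

function StatsPage({ page, pageSize = 100 }) {
  const [rows, setRows] = useState([]);

  useEffect(() => {
    let cancelled = false;
    fetch(`/api/stats?page=${page}&pageSize=${pageSize}`)
      .then((res) => res.json())
      .then((data) => {
        if (!cancelled) setRows(data.rows); // only the current page is kept in memory
      });
    return () => { cancelled = true; }; // ignore stale responses after unmount
  }, [page, pageSize]);

  return <ul>{rows.map((r) => <li key={r.id}>{r.label}</li>)}</ul>;
}
```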
I have noticed that apps like instagram keep some data persistent through app closures. Even if all internet connection is removed (perhaps via airplane mode) and the app is closed, reopening it still shows the last loaded data despite the fact that the app cannot call any loading functions from the database. I am curious as to how this is achieved? I would like to implement a similar process into my app (Xcode and swift 4), but I do not know which method is best. I know that NSUserDefaults can persist app data, but I have seen that this is for small and uncomplicated data, of which mine would not be. I know that I can store some of the data in an internal SQL db, via FMDB, but some of the data I would like to persist is image data, which I am not sure exactly how to save into SQL. I also know of Core Data but after reading through some of the documentation I have become a bit confused as to whether or not it fits my purpose. Which of these (or others?) would be best?
As an additional question: regardless of which persistence method I choose, I feel as though every time the data is actually loaded from the database (when an internet connection is available), which happens in viewDidLoad, I would also need to update the persistent copy in case the connection drops later. I am concerned that doubling my write procedures will slow the app down. Is there any validity to this concern, or is it unavoidable anyway?
I am looking at making a LOB HTML5 web application. The main thing we are trying to accomplish is to make the application offline capable. This will mean taking a large chunk of SQL data from the server and storing it in the browser. That data will need to live in the browser for quite a while; we don't want to have to continuously refresh it every time the browser is closed and reopened.
We are looking at storing the data client-side in IndexedDB, but I read that IndexedDB is stored in temporary storage, so its lifetime cannot be relied on. Does anyone know of any strategies for prolonging its lifetime? Also, we will be pulling down massive chunks of data, so 1-5 MB of storage might not suffice for what we require.
My current thought is to write the data down to browser storage using the HTML5 storage APIs and hydrate it into IndexedDB as it's required. We just need to make sure we can grow the storage limit to whatever we need.
Any advice on how we approach this?
We are looking at storing the data client-side in IndexedDB, but I read that IndexedDB is stored in temporary storage, so its lifetime cannot be relied on.
That is technically true, but in practice I've never seen the browser actually delete the data. More commonly, if you're storing a lot of data, you will hit quota limits, which are annoying and sometimes inconsistent/buggy.
Regardless, you shouldn't rely on data in IndexedDB always being there forever, because users can always delete data, have their computers break without backups, etc.
If you create a Chrome extension, you can enable unlimited storage. I've successfully stored several thousand large text documents persistently in indexedDB using this approach.
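For a plain web page (outside an extension), one option worth checking, assuming the browser supports the StorageManager API, is to ask for persistent storage and inspect your quota; a minimal sketch:

```javascript
// Sketch: check available quota and ask the browser not to evict this origin's data.
// Assumes a browser that implements the StorageManager API (navigator.storage).
async function checkStorage() {
  if (navigator.storage && navigator.storage.estimate) {
    const { usage, quota } = await navigator.storage.estimate();
    console.log(`Using ${usage} of ~${quota} bytes`);
  }
  if (navigator.storage && navigator.storage.persist) {
    const persisted = await navigator.storage.persist(); // the browser may still refuse
    console.log(persisted ? 'Storage marked as persistent' : 'Storage may be evicted under pressure');
  }
}
```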
This might be a silly question to add, but can you access the storage area outside of the browser? For instance, if I did not want to have a huge lag when my app starts up and loads a bunch of data, could I make an external app to "refresh" the local data so that when the browser starts, it is ready to rock and roll?
I assume the answer here will be no, but I had to ask.
Has anyone worked around this for large data sets? For instance loading in one tab and working in another? Chrome extension to load, but access via the app?
I have an existing database containing some pictures in blob fields. For a web application I have to display them.
What's the best way to do that, considering stress on the server and the maintenance and coding effort?
I can think of the following:
"Cache" the blobs to external files and send the files to the browser.
Read them directly from the database every time they're requested.
Some additional facts:
I cannot change the database and get rid of the blobs altogether, keeping only file references in the database (like in the good ol' Access days), because the database is used by another application which actually requires the blobs.
The images change rarely, i.e. if an image is in the database it mostly stays that way forever.
There'll be many read accesses to the pictures; 10-100 pictures will be displayed per view (depending on the user's settings).
The pictures are relatively small, < 500 KB.
I would suggest a combination of your two ideas: the first time an item is requested, read it from the database, but afterwards make sure it is cached by something like Squid so you don't have to retrieve it from the database every time it's requested.
One important thing is to use proper HTTP cache control: set expiration dates properly, respond to HEAD requests correctly (not all platforms/web servers allow that), and so on.
Caching those blobs to the file system makes sense to me, even more so if the DB is running on another server. But even if not, I think simple file access is a lot cheaper than piping the data through a local socket. If you do cache the blobs to the file system, you can most probably configure any web server to do good cache control for you. If it's not already the case, you should maybe request a field indicating the last update of each image, to make your cache more efficient.
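As a minimal sketch of that kind of cache control when serving the blobs over HTTP (the Express routing, the image MIME type, and the getImageFromDb() helper are assumptions for illustration only):

```javascript
// Sketch: serve DB-stored images with Cache-Control / Last-Modified headers (Express).
// getImageFromDb() is a hypothetical helper returning { data: Buffer, lastUpdated: Date }.
const express = require('express');
const app = express();

app.get('/images/:id', async (req, res) => {
  const image = await getImageFromDb(req.params.id); // hypothetical DB lookup
  if (!image) return res.status(404).end();

  res.set('Cache-Control', 'public, max-age=86400'); // let browsers/proxies cache for a day
  res.set('Last-Modified', image.lastUpdated.toUTCString());

  // Answer conditional requests with 304 so the blob isn't re-sent unnecessarily.
  const ifModifiedSince = req.headers['if-modified-since'];
  if (ifModifiedSince && new Date(ifModifiedSince) >= image.lastUpdated) {
    return res.status(304).end();
  }

  res.type('image/jpeg').send(image.data);
});

app.listen(3000);
```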