Persisting and keeping mobile app data in sync with an online backend

I am building a mobile app using AngularJS and PhoneGap. The app gives the user access to a large number of data items, which ship with the app as a set of .json files.
One use case is that a user can favorite any of those data items.
Currently, I store the ids of the favorited items in localStorage. It works and it's great and very simple.
But now I would like to create an online backend for the app. By this I mean that the ids of the favorited items should also be stored on a server somewhere, in some form of database.
Now my question is:
How do I best do this?
How do I keep the localStorage data and the online-backend data in sync?
In particular, the user might not have an internet connection at the time when they favorite a data item. Additionally, if the user favorites x data items in a row, I would need to make x update calls to the server database, which clearly isn't great.
So, how do people do it?
Does Angular have anything built-in for this?
Is there any plugin?
Any other framework?
This very much seems like a common problem that must have a well-known solution.

I think you've almost got the entire solution. All you need to do is periodically send the JSON out to a service that puts it in a database: on app start, load the data from the service if it is available, otherwise use the current localStorage copy; then, perhaps on a timer and on app close, push updates to the service whenever the device is connected. The service itself can be written in PHP (my usual preference), Python, Java, Ruby, Perl, whatever floats your boat. If you're concerned with merging synchronization changes, you'll need timestamps on the data in localStorage and in the database so you can make the right call on what should be inserted versus what should be updated.
I don't think there's a one-size-fits-all solution to the problem. Someone may well have crafted a library that handles the different potential scenarios, but its configuration may be as complicated as just writing the logic yourself.
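A minimal sketch of that idea as an AngularJS service, assuming a module named 'app' and a made-up /api/favorites/sync endpoint (both the endpoint and the payload shape are placeholders for illustration, not an existing API):

// Hypothetical sync service; the endpoint and payload shape are assumptions.
angular.module('app').factory('favoriteSync', function ($http, $window) {
  var KEY = 'favorites';

  function readLocal() {
    // Each entry looks like { id: 'item-42', favoritedAt: 1412345678901 }.
    return JSON.parse($window.localStorage.getItem(KEY) || '[]');
  }

  function writeLocal(favorites) {
    $window.localStorage.setItem(KEY, JSON.stringify(favorites));
  }

  return {
    // Favoriting only touches localStorage, so it works offline.
    add: function (itemId) {
      var favorites = readLocal();
      favorites.push({ id: itemId, favoritedAt: Date.now() });
      writeLocal(favorites);
    },

    // Push the whole local set in one request and keep whatever merged
    // result the server returns (e.g. newest timestamp wins, decided server-side).
    sync: function () {
      return $http.post('/api/favorites/sync', { favorites: readLocal() })
        .then(function (response) {
          writeLocal(response.data.favorites);
        });
    }
  };
});

Calling sync() on app start and again whenever the device regains connectivity batches all pending favorites into a single request, which also takes care of the "x favorites, x calls" concern.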

Related

Restricting data in PouchDB

I have an offline ready application that I am currently building in electron.
The core requirements are that all data is restricted (you have to be a user to read or write) and that within that data some is further restricted to a single user (account information, messages, etc.).
Now, I do not want to replicate any data offline that a user should not have access to (because all of it can be seen using the devtools regardless of restriction), so essentially I only want to sync to PouchDB's offline store the data that user has access to, plus the data that all users have access to.
Now I have read the following posts/guides but I am still a little confused.
https://pouchdb.com/2015/04/05/filtered-replication.html
https://www.joshmorony.com/creating-a-multiple-user-app-with-pouchdb-couchdb/
Restricting Access to local PouchDB
From my understanding, filtered replication is a bad choice performance-wise, even though it could do what I want.
Setting up a proxy would work, but then it essentially becomes a REST API and the data synchronization falls apart.
And the final option, which I think is what I want, is to have a database for every user that contains their private information, plus additional databases to hold the information that is available to every user.
The only real question I have with this approach is how data that is private but shared between two users (messages, etc.) is handled.
I am more after an overarching view of how the data should be stored as opposed to code examples, just really struggling with the conceptual architecture of the application.
There are many solutions to your problem. One looks very promising: IBM Cloudant has started work on Cloudant Envoy, a proxy that simulates the CouchDB interface rather than exposing a plain REST API. You can read more about it on the site for Envoy over at ibm.com. A custom replicator for PouchDB is also available on GitHub.
There's also a blog post on Medium.com on this.
The idea is the same as the much older Couchbase Sync Gateway. Although Couchbase has common roots with CouchDB, I have not tracked if they still support replication with CouchDB.
The easiest way to start would be to create a single database per user on the server, and a common database that you just pull the shared data from. Let me know if you need more info on this solution.
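A rough sketch of that layout with PouchDB, assuming a userdb-<name> naming convention on the server; the URL, database names, and credentials are placeholders, not a prescribed setup:

// One private database per user, plus a shared database the client only reads.
var PouchDB = require('pouchdb');

var remoteBase = 'https://couch.example.com';
var userName = 'alice';          // placeholder credentials
var userPassword = 'secret';

// Private data: only this user's documents ever land here.
var localPrivate = new PouchDB('private');
var remotePrivate = new PouchDB(remoteBase + '/userdb-' + userName, {
  auth: { username: userName, password: userPassword }
});

// Shared data: readable by every authenticated user.
var localShared = new PouchDB('shared');
var remoteShared = new PouchDB(remoteBase + '/shared', {
  auth: { username: userName, password: userPassword }
});

// Two-way live sync for the private database...
localPrivate.sync(remotePrivate, { live: true, retry: true });

// ...and one-way pull replication for the shared one.
localShared.replicate.from(remoteShared, { live: true, retry: true });

For data that is private but shared between two users (messages and the like), the same idea could be extended with a small database per conversation that both participants replicate, so nothing private has to pass through the globally shared database.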

Using a file to store JSON data

I am working on a webapp for a client that has a cPanel virtual server, and it appears that I can only use MySQL, but I want to store the data using a JSON-like structure so that I can more easily use Angular.js on the frontend.
I've looked into installing a NoSQL database and I can't find anything viable (if you know of a way to do that, it would be my best solution), so I'm thinking of storing the data as JSON strings in a series of text files on the server that I would write to with PHP.
I'd like to hear some opinions, and whether there are any better solutions I'm not thinking of.
Go look at Firebase and thank me afterwards.
In short, Firebase is a cloud-hosted real-time JSON data store. Everything on the backend is done for you, and all you need to build is the frontend. Their servers are globally distributed, which means it will serve you well if you're looking to reach the entire world. All you need to do is define your data structure and use it!
It also provides sockets, which is great for real-time data (used for games, chat, etc.).
There is a free tier. The only downside is that it gets a little expensive if you want to scale; nevertheless, if your app really gets to that stage, I'm sure you'll have the money to hire people to develop a similar backend yourself.
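For what it's worth, a minimal sketch of reading and writing with the classic Firebase JavaScript client; the project URL and data shape are placeholders:

// Placeholder Firebase URL; substitute your own project.
var ref = new Firebase('https://your-app.firebaseio.com');

// Writing: push a new item under /items, much like appending to a JSON array.
ref.child('items').push({
  title: 'First item',
  createdAt: Date.now()
});

// Reading: 'value' fires once with the current data and again on every change,
// so the frontend stays up to date without any polling code.
ref.child('items').on('value', function (snapshot) {
  console.log(snapshot.val());
});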

How to make App Engine Datastore private

I'm developing an App Engine app that lets users keep a diary.
Now, I noticed that I can inspect all the data in the Datastore through the Developers Console.
This is not good for a diary app, privacy-wise.
So I want to know how to make the Datastore private, to prevent even me from reading users' data.
Please help me.
This is a little bit tricky: the code can read the data in the Datastore, so by definition anyone who can update the running code can also read that data. However, there are ways to at least make it more difficult to inadvertently examine the data (though accessing it will still be technically possible for you or any of the owners). The simplest way is to encrypt the data before storing it in the Datastore model objects, and decrypt it when you read it back out. Be aware that indexed fields will no longer work on encrypted content, so you will need to decide whether that content really needs to be indexable or whether it is worthwhile to add manual indexing.
If you want the data to not be readable by you at all, then you will need to encrypt/decrypt it with a key that is only available to your application while the user is interacting with it (e.g. encrypting the data in the client that communicates with your server); however, you need to be aware that this will make any sort of indexing or background processing of the data impossible.
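To illustrate the first approach, here is a rough sketch of encrypting a diary entry before it is put into a model object, using Node's built-in crypto module (the key handling is deliberately simplified; in practice the key would come from a proper secret store, and the same idea applies in any App Engine language):

var crypto = require('crypto');

// Simplified: a real app must load this key from a secret store, not generate
// or hard-code it alongside the data.
var key = crypto.randomBytes(32); // 256-bit key for AES-256-GCM

function encryptEntry(plainText) {
  var iv = crypto.randomBytes(12);
  var cipher = crypto.createCipheriv('aes-256-gcm', key, iv);
  var ciphertext = cipher.update(plainText, 'utf8', 'base64') + cipher.final('base64');
  // The iv and auth tag are stored alongside the ciphertext; neither is secret.
  return {
    ciphertext: ciphertext,
    iv: iv.toString('base64'),
    tag: cipher.getAuthTag().toString('base64')
  };
}

function decryptEntry(record) {
  var decipher = crypto.createDecipheriv(
    'aes-256-gcm', key, Buffer.from(record.iv, 'base64'));
  decipher.setAuthTag(Buffer.from(record.tag, 'base64'));
  return decipher.update(record.ciphertext, 'base64', 'utf8') + decipher.final('utf8');
}

// Only the opaque fields go into the Datastore entity, so browsing the
// Developers Console shows ciphertext instead of the diary text.
var entity = encryptEntry('Dear diary...');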
The only way to prevent you from viewing data in the Datastore is to remove you from the developers of the app. A developer can always extract data if he wants to, either by looking at it directly in the Datastore viewer or by writing code that reads and forwards this data.

AngularJS: Do I need to know database structure to implement?

I'm a beginner AngularJS user. I've been trying to pull hard coded JSON (backend and server data not ready) currently. It seems that in order to pull data, for instance when using the very common ng-repeat, I need to know the database structure (as the rendered JSON will mirror that structure, right?).
So while I can code independently of the back end, am I correct in my assumption that I must know the database structure? For instance, I might want to pull user comment data. This could be in its own database, and I might do this: ng-repeat='comment in comments' and filter for the specific user within each comment entry in the database. Whereas if comments are only stored within a user table, it would be ng-repeat='comment in user[0].comments'. I would imagine the former is the correct approach, but I honestly have never learned about proper database structure. It seems that it is something you must know in order to properly implement AngularJS, though.
Any help is appreciated. I really want to make sure I approach things properly. Thanks!
I don't think you need to (or should) know the database structure. AngularJS is an MVC framework, and a basic principle of that architecture is the separation of concerns. Simply put: don't mix stuff. More specifically, you're talking about the communication between two systems: a local one (the browser running AngularJS) and a remote one (a server that might, or might not, be the same one that served the Angular files to the client).
For example, your view should not be accessing your database (if you were working with, say, PHP, you should not have things like mysql_query(...) in a view).
You should also design components to be loosely coupled: make them as independent as possible. Unit tests help you think that way, and AngularJS is particularly unit-test-friendly with Karma. Following this principle, what if you used the Twitter API to show tweets in your AngularJS application? You wouldn't need to know anything about the internals of Twitter; there is an API that serves the JSON in a format you can use.
Your backend should provide the same thing (for example, with a façade controller), and you should agree with the backend team on what data will be available.
Instead of making your design depend on the database structure, make the backend API depend on your requirements. This way you'll have two loosely coupled systems, and the backend team can do whatever they want without affecting you, for example changing the DBMS or the structure of the tables.
If you want to pull comments, you might have a remote call ($http or ngResource) in a service or controller that gets all the comments for a specific user (or for a few users, because you might want to minimize the number of remote calls). The server responds with JSON that represents this (and probably some more things that will be needed soon, like profile picture URLs, user ids, etc.). Then you put the data you want to expose to the view (a subset of what you fetched from the server) on $scope.
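A small sketch of that separation; the /api/users/:id/comments endpoint is a made-up contract you would agree on with the backend team, and the module name 'app' is a placeholder:

// Service: the only place that knows where comments come from.
angular.module('app').factory('CommentService', function ($http) {
  return {
    forUser: function (userId) {
      return $http.get('/api/users/' + userId + '/comments')
        .then(function (response) {
          return response.data; // e.g. [{ text: '...', authorName: '...' }, ...]
        });
    }
  };
});

// Controller: exposes only the data the view needs.
angular.module('app').controller('CommentsCtrl', function ($scope, CommentService) {
  CommentService.forUser(42).then(function (comments) {
    $scope.comments = comments;
  });
});

The view then just uses ng-repeat="comment in comments"; if the backend later reshuffles its tables, only the API (or this service) has to change, not your templates.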

Is Redis right for storing and retrieving messages against a user, à la Twitter?

I'm building a web app, primarily in PHP, but we need to pull down messages from Twitter and various other services (email, SMS). I'm writing a small service in Node.js to handle the Twitter connection etc., but am just trying to work out what is best to do with the content that is pulled down.
Right now I'm leaning towards a combination of MySQL for all our standard info in the main PHP app, and Redis with the Node.js service to store each message against a key that will probably be the username plus some sort of unique identifier.
I've used Redis before, but this data needs to persist rather than being something that can expire like sessions. Redis' in-memory nature makes me a bit nervous about this: over time, with this being our main message store, won't the dataset quickly become unruly in RAM?
This blog post gives a good and concise overview of NoSQL-type databases. Perhaps you can find confirmation for, or an alternative to, Redis there. Since you have not given any numbers on how much data you need to pull from the sources and how often, it's hard to answer from my side.
Also, Redis supports two methods of persistence: timed snapshots (RDB) and an append-only file (AOF) to which every change to the database is written. The second one is the safer alternative.
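As a sketch of the Node.js side, assuming the classic callback-based 'redis' client and a made-up key scheme of messages:<username>:

var redis = require('redis');
var client = redis.createClient();

// LPUSH keeps the newest message at the head of the user's list.
function storeMessage(username, message, done) {
  client.lpush('messages:' + username, JSON.stringify(message), done);
}

// Fetch the most recent `count` messages for a user.
function recentMessages(username, count, done) {
  client.lrange('messages:' + username, 0, count - 1, function (err, raw) {
    if (err) return done(err);
    done(null, raw.map(function (m) { return JSON.parse(m); }));
  });
}

On the server side, setting appendonly yes in redis.conf turns on the append-only file so those lists survive a restart; keep in mind that the whole dataset must still fit in RAM, which is exactly the trade-off you are worried about.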
