Make Angular Web App available offline - angularjs

I am struggling with some details about finding a solution to make an Angular / C# app available offline.
My idea would be to use upup.js to make the business logic of the Angular SPA available offline. upup.js uses service workers to do so. I would store the data required by the offline app using angular-localForage, which uses IndexedDB and falls back to WebSQL and localStorage where necessary.
The problem is that I also have to make files and images available offline, without requiring the user to visit the page they are used on, and I am worried about the maximum quotas. I could store them either with upup.js, adding those files as assets, or with angular-localForage as blobs. IndexedDB is supposed to be unlimited by now, if I am informed correctly? I couldn't find any maximum quota for a service worker, though, which the upup.js solution would use. AppCache is deprecated, so I wouldn't use that... Or maybe I have understood something completely wrong, or there is an even better solution? Anyhow, the question is:
TL;DR: What is the better way to store files for an AngularJS offline application: angular-localForage (IndexedDB etc.) or a service worker (upup.js)? What are the maximum quotas for each solution? Or is there an even better solution?
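For reference, a minimal sketch of the angular-localForage option described above (the asset URL and storage key are hypothetical):

// Fetch an image over HTTP and persist it as a blob with angular-localForage.
// Assumes the 'LocalForageModule' from angular-localForage is loaded.
angular.module('offlineApp', ['LocalForageModule'])
  .run(['$http', '$localForage', function ($http, $localForage) {
    $http.get('/assets/strip-001.png', { responseType: 'blob' })
      .then(function (response) {
        // Stored in IndexedDB where available; note that the localStorage
        // fallback serializes values and handles blobs poorly.
        return $localForage.setItem('strip-001.png', response.data);
      });
  }]);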

In my opinion a Service Worker (SW) is better than traditional local storage. Plus, a SW can also use IndexedDB.
For the implementation, it very much depends: how is your app structured, which front-end technologies are used alongside your Angular app, and what are your main goals in using a SW? Consider these cases:
1. Traditional JS loading: you likely merge all the JS files into one, so app.js contains everything.
You also don't care about push notifications or any of the other cool features that SW offers.
=> In this case it seems like upup.js suits you best.
NOTE: beware that upup.js attempts to register the SW on its own, which will likely block or complicate your work if you later expand your use of SW features.
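For illustration, a minimal upup.js sketch along the lines of its documentation (the asset paths are hypothetical):

// Tell UpUp which content and assets to serve when the user is offline.
UpUp.start({
  'content-url': 'offline.html',  // page to show while offline
  'assets': ['js/app.js', 'css/app.css', 'img/logo.png']  // files to cache
});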
2. Advanced AMD user, where almost all of your JS is chopped into small pieces, like fooCtrl.js, barCtrl.js, etc.
For sure you don't want to configure 100+ JS files, and furthermore you will have a lot of HTML templates to load.
=> In this case I suggest sw-toolbox, a very powerful and lightweight tool made by Google. If you are not yet familiar with SW concepts, you will have a bit of trouble setting it up for your site at first (but it won't take longer than a day if you are an advanced JS developer).
Once everything is configured, it all becomes very simple. For example, this is how I cache all the static content on my site:
self.toolbox.router.get(/\.(js|css|png|jpg|json|html)/, self.toolbox.fastest);
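(Note the escaped dots in the pattern; an unescaped "." would match any character.) A minimal sketch of how that line fits into a sw-toolbox setup, assuming the worker script lives at /sw.js (the paths are hypothetical):

// sw.js - the service worker script
importScripts('/sw-toolbox.js');  // exposes self.toolbox
// "fastest" races the cache against the network; whichever responds
// first wins, and the cache is refreshed in the background.
self.toolbox.router.get(/\.(js|css|png|jpg|json|html)/, self.toolbox.fastest);

// In your page: register the worker once.
if ('serviceWorker' in navigator) {
  navigator.serviceWorker.register('/sw.js');
}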
3. You don't care about what kind of technology is used on the front-end side; you are just interested in SW.
=> Simply go for sw-toolbox; it's a real time-saver for basic configuration, and if you want to expand your usage of SW you can extend it at will.

Related

Custom Angular setup for MIT library/project

Hi, I need your help trying to figure something out.
First, a little background: I used to code in Django; it was fast to code with and good, but times change and Node is taking over most new apps, so I moved to Express a couple of years ago. However, I still missed a lot of the Django functionality, so I started coding a little library to help me with common tasks. It kept growing to the current point, where you have a core library and pluggable "apps" to handle recurring tasks, like notifications, auth and more, or at least that's the plan.
Right now an app works something like this:
./controllers (All renders)
./endpoints (RESTful API endpoints)
./models (Static and DB models)
./public (Public files)
./style (Scss styles with bootstrap injected)
./views (Default templates, distributed with the app as example, loaded by default)
…/…/views (Custom views to rewrite the default ones from the app)
On start, JSloth compiles everything for you and runs the server (hot reload included).
Now, that works fine in a static multipage environment, but I would love to use Angular for this. Changes will be needed, but I want you guys to lead me in the right direction.
So far I have 2 ideas:
1- Split routes/HTML apps and RESTful endpoints, then basically use a standard setup for that kind of app with webpack and AngularSSR.
The big downside: you can't provide an out-of-the-box frontend implementation for apps.
2- Figure out a way to provide an Angular app for each JSloth app; that way, if you install/copy the auth app, you are provided with user lists, interfaces to update your profile, etc.
I'm thinking that may be a problem in terms of performance, because this way you will have a lot of Angular apps. Am I wrong?
I need an easy way to allow the final user to share at least footers and headers, maybe even styles or SCSS variables for colors, so that it does not look like a huge change.
Do you have any other option? Any better idea?
Thank you so much for the help, this is the repository: https://github.com/chrissmejia/JSloth
Edit 1: Forgot to add the models folder

Caching best practice for AngularJS dynamic sites?

I'm working on an AngularJS project that will have considerable traffic.
While in development I stumbled upon the issue of partials being cached and not updated after various actions. Sure, I can get rid of this using .run with $templateCache.removeAll(), for example, but I want to make sure this is actually a good idea.
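For reference, a minimal sketch of that .run approach (the module name is hypothetical):

// Clear all cached partials on application start so stale templates
// are re-fetched from the server.
angular.module('myApp').run(['$templateCache', function ($templateCache) {
  $templateCache.removeAll();
}]);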
Some partials are updated dynamically (user input, or automatically in intervals), while some are static or updated very infrequently.
What would be the best approach to caching in this case?
With non-Angular sites I prefer to keep responsibilities cleanly split, for example:
1. Cache-headers are set on app level
2. nginx - just to serve sites per se
3. Varnish doing FPC + CDN for static assets or CDN doing full page caching (depending on a client/project, etc.) etc. etc. etc.
Key point: every part has its own distinct responsibility.
With this project I can use Varnish and a CDN for static assets in a multi-server setup. There is also the possibility to re-use Varnish for load balancing, i.e. I may have Varnish in front of several web nodes. I have some flexibility in terms of infrastructure.
Please share your thoughts on the optimal setup.
In particular: is it still worth caching partials?
If yes, what would be the best place to set CC headers?
What would be the best way to flush their cache then, esp. if I need to flush only some sub-selection?
Thank you!!
D.
Posting the answer myself, as I cannot select rob's reply as an answer. My solution is based on his suggestion, which helped me a lot to move forward:
During the production build:
I'm using "gulp-rev-easy" gulp module for revving the previously
concatenated and uglified css & js files.
However, no existing revving modules could provide me with functionality I needed: replacing specific strings in an index.html template file with my own paths to CDN both for CSS and JS scripts. So I had to add my own function/override reveasy a little.
As per rob's I've started using templatecache and its works great in conjunction with revving.
Images are just going through a S3 -> CDN pair.
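For anyone setting this up from scratch, a minimal sketch of the revving idea using the generic gulp-rev plugin (not the exact gulp-rev-easy setup described above; the paths are hypothetical):

var gulp = require('gulp');
var rev = require('gulp-rev');

// Append a content hash to each bundle (app.js -> app-d41d8cd9.js)
// and write a manifest mapping original names to revved names.
gulp.task('rev', function () {
  return gulp.src(['dist/app.js', 'dist/app.css'])
    .pipe(rev())
    .pipe(gulp.dest('dist'))
    .pipe(rev.manifest())
    .pipe(gulp.dest('dist'));
});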
Some quick notes:
Using revving to "break" the cache turned out to be the fastest and easiest way to deal with caching, as CloudFront takes approx. 20 minutes to invalidate a path; other CDNs take less, but you still have to issue the request, etc.
For dynamically updated images that keep the same name, like avatar images that a user can update via his profile area, I suggest adding a timestamp or some random string to the file name. While adding a random URL path sounds easier, you won't be able to cache that.
Hope this helps,
Dennis

Is there a way to save offline google map on hybrid mobile app on ionic?

I am planning to develop a hybrid mobile app using Ionic. One of the features I need is an offline Google map. Is there a way to do it?
It depends on the requirements of your application whether this will be possible. Are your users on "modern" devices, i.e. is HTML5 fully supported? Do your users need to view/edit the map globally, or just in a specific area? Does the map really need to be provided by Google? I'll address some issues below to point you at possible takes on this problem.
Do you really need Google Maps? (The most optimal scenario)
First off, do you really need Google Maps? Also relevant: how far do your users need to zoom their maps? If it can be any map, and zooming is not really a high priority (if it is, including all map tiles will make the app eat all available storage), you could probably ship map tiles as a packaged part of your app and display them with a library like http://leafletjs.com/. The library is well documented and provides a map interface for a variety of map providers, and it is doable to configure it to use your own local map tiles. You could include map tiles for multiple zoom levels if necessary, and limit the min/max zoom levels to the tiles you actually have available. This will make your maps work offline.
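A minimal sketch of that idea with Leaflet, assuming the tiles are bundled with the app under tiles/{z}/{x}/{y}.png (a hypothetical layout):

// Point Leaflet at the locally bundled tiles and clamp the zoom range
// to the levels we actually ship.
var map = L.map('map', { minZoom: 10, maxZoom: 14 })
  .setView([51.505, -0.09], 12);  // hypothetical start position
L.tileLayer('tiles/{z}/{x}/{y}.png', {
  minZoom: 10,
  maxZoom: 14
}).addTo(map);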
I can't or don't want to provide my own tiles: make sure you have really looked into the option above; there are systems out there that provide map tiles you could use (check https://www.mapbox.com/ for example).
Okay, so you really don't want to do what I suggested. What are the options now? JavaScript mapping solutions typically render tiles based on the location of the map you want to see and the zoom level; these tiles are requested from the tile provider. I do not know how to implement this for Google exactly - you might need to do some research there - but I'll try to point you in a direction. There will be requests to fetch the tiles from the servers. I checked on http://maps.google.com which images are loaded when navigating the map; find out which URLs are used in your situation, as we will need these kinds of URLs later (just inspect the network tab in your browser console and see which requests are made when scrolling the map). If we only need our users to work in a certain area when offline, we could use service workers to cache the responses of these requests while we are online, and serve those cached responses when we are offline. Read up on service workers if you are not familiar with them.
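A minimal sketch of that caching idea in a service worker (the tile URL pattern below is a hypothetical placeholder; use the request URLs you found in your own network tab):

self.addEventListener('fetch', function (event) {
  // Only intercept requests that look like map tiles.
  if (event.request.url.indexOf('/tiles/') === -1) return;
  event.respondWith(
    caches.open('map-tiles').then(function (cache) {
      return cache.match(event.request).then(function (cached) {
        if (cached) return cached;  // offline or repeat visit: serve the cache
        return fetch(event.request).then(function (response) {
          cache.put(event.request, response.clone());  // remember for offline use
          return response;
        });
      });
    })
  );
});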
Advantage: real offline map functionality for any tile visited before (as long as your cache hasn't overflowed, depending on your implementation of the service workers, and only on service-worker-supported browsers/devices).
Disadvantages: no support for tiles that were never put in the cache (i.e. never seen before). Also, this will only work on devices that support service workers, so it might only be an option where you either don't care about users on "older" devices or can control the users' device choices. Note that using Crosswalk could ease your development efforts here, since you then only have to consider one browser runtime; but Crosswalk doesn't support older devices either.
However, this solution could be fine for people who need to work in a specific area, which might be true for the case provided by #vipul-r. If you or your users know in advance where they need their maps to work, you can instruct/help them in loading and caching their maps correctly.
If you can't make either of these 2 solutions work, then I highly doubt there is a way to do it. I don't see any other way, to the best of my knowledge.

Feasibility of using angular js for web app with over 200 medium to complex screens

My team was considering using AngularJS for web app UI development. But knowing at a high level how single-page apps work, I had a question as to how feasible it is to use the AngularJS framework for a large web application. It would have at least 200 screens, each containing an average of 30 UI controls such as text boxes, calendar controls, drop-downs, etc.
The user will be accessing the web app on a desktop or laptop and will be using the application in the browser throughout an 8-hour day, without ever closing the browser.
Given the above usage, would AngularJS memory usage / performance be an issue?
Are there web apps with huge and complex UIs, built using AngularJS, that are in production and used every day?
You can have not just 200 but 1000s of screens; the number does not matter as long as you address the core and fundamental questions below. They all boil down to memory and performance.
At a given point of time, how many $watches will be active?
At a given point of time, how many listeners are created?
At a given point of time, what is the complexity of the DOM, i.e. the number of DOM elements and their nesting/depth?
At a given point of time, how many JavaScript modules (services, controllers, etc.) will be loaded in memory? Even though these things are singletons, they consume memory.
For each such screen, how much memory will be consumed? (If you use jQueryUI etc., your memory grows much more rapidly than with pure Angular + HTML components.)
This brings us to what you can do to control the above factors, which, if not watched, will make your application sink/crash.
1) How do you break up the run-time complexity of your "big" application? Think of routers or dialogs. Each of your screens will come and go, i.e. you will never display 200 screens at the same time.
2) Make sure that when a screen is gone there are no "leftovers". Don't use jQuery; if you do, make sure you clean up on $scope.$destroy (see the sketch after this list).
3) Multitudes of directives: directives are powerful, but don't overuse them, and try not to deep-nest too many of them. templateUrl is tempting, but sometimes inlining a template is the best choice. Use build tools that inline the templates.
4) Load modules on demand using RequireJS. This will force you to modularize your application and think hard about your concatenation strategy (combining JS files), and it will force you to write independent modules.
5) Browser-cache your assets and use a good cache-busting mechanism. Grunt-based plugins are at your rescue. Minify your assets.
6) Compress the assets for efficient network utilization and faster transmission.
7) Keep your app encapsulated in Angular. Don't create any global objects. Chances are they have access to some closure or refer to something within an Angular $scope, and those $scopes then keep hanging around or are on a difficult trajectory to being garbage collected.
8) Use one-time (bind-once) model binding as much as possible.
9) A splash screen is an excellent weapon; use it. It helps you load some stuff upfront while the user watches a smooth/jazzy picture :)
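As promised in point 2, a minimal cleanup sketch (the controller name and polling task are hypothetical):

angular.module('app').controller('ScreenCtrl', ['$scope', '$interval',
  function ($scope, $interval) {
    // A recurring task that would keep running if the screen went away silently.
    var poll = $interval(function () { /* refresh data */ }, 5000);
    // Tear it down when the screen is destroyed.
    $scope.$on('$destroy', function () {
      $interval.cancel(poll);
    });
  }]);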
Regarding 8 hours a day of constant use:
Unless there is a leak of one of the following kinds, you should be fine:
1) Listeners leaking, i.e. hanging around.
2) DOM leaks, i.e. the detached-DOM issue.
3) The size of JavaScript objects: JavaScript objects coded in a certain way create repeated instances of functions.
(I am developing an AngularJS app with more than 450 screens, an MDI-like app. The app is not yet in production. The above points come from my burnouts in the functionality we have developed so far.)
I've worked on multiple projects that are extremely large single-page applications in Angular and Ember.js, at companies like McKesson and Netflix.
I can tell you for a fact that it's completely feasible to build huge, complex applications with frameworks such as Angular. In fact, I believe Angular is uniquely well suited to this because of its modular structure.
One thing to consider when creating your Angular application is whether every user will benefit from all "200 pages" of your application. That is to say, is it necessary for all 200 views to be part of the same single-page application, or should you break it into a few single-page applications with views that share concerns?
A few tips:
Watch out for name collisions in the IoC container: if you create enough services and controllers, even if you break them into separate modules, it's very easy to create two services with the same name. This may or may not result in an outright error. What happens when you register two "fooService" services? The last one wins.
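A small demo of that collision (module and service names are hypothetical):

// Two modules each register a service called "fooService".
angular.module('moduleA', []).factory('fooService', function () {
  return { source: 'A' };
});
angular.module('moduleB', []).factory('fooService', function () {
  return { source: 'B' };
});
// No error is raised: injecting fooService anywhere in "app"
// silently yields the moduleB version, because it registered last.
angular.module('app', ['moduleA', 'moduleB']);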
If you decide to break your app into more than one single-page app, you have to be sure you have solid boundaries of functionality between them. They will not be able to share state easily other than with something like cookies or local storage.
If you decide to keep everything in one single-page app, you're going to want to watch your initial download time keenly. Always check how long it takes to start your app "cold" over a slow-ish connection. If the entire app is in one bundle, then depending on how you structure things (are you going to use AMD?) you might see a long initial load time.
AVOID rendering HTML on your server. I can't stress this enough. Let Angular do that work for you. The only rendering your server should be doing is rendering JSON to be returned from some RESTful service.
Flesh out your JS build process early on. Look into Node-based tools like Grunt, Gulp, and Broccoli for linting/concatenating/minifying your JS files. Check out Karma for running unit tests, and look into Istanbul for code coverage. Protractor is a great tool as well, but I recommend strong unit tests in Karma over integration tests with Protractor, simply because WebDriver-based tests tend to be brittle.
Other than that, I think you'll find a single-page app written in any modern framework to be extremely snappy after the initial load is done. In fact, it will make any "old" PHP/ASP.NET-style app that renders the entire page on the server seem slow as dirt in comparison. This is because you'll only be loading data and the occasional .html file over HTTP.

Question on serving images on App Engine (2 alternatives)

I'm planning to launch a comic site which serves comic strips (images).
I have little prior experience with serving/caching images,
so these are the 2 methods I'm considering:
1. Using LinkProperty
class Comic(db.Model):
    image_link = db.LinkProperty()
    timestamp = db.DateTimeProperty(auto_now=True)
Advantages:
The images are fetched from disk space itself (and disk space is cheap, I take it?)
I can easily set up app.yaml with an expiration date to cache the content in the user's browser
I can set up memcache to retrieve the entities faster (for high traffic)
2. Using BlobProperty
I used this tutorial and it worked pretty neatly: http://code.google.com/appengine/articles/images.html
Side question: can I say that using BlobProperty sort of "protects" my images from outside linkage? That is, people can't just link directly to the comic strips.
I have a few worries for method 2.
I can obviously memcache these entities for faster reads.
But then:
Is memcaching images a good thing? My images are large (100-200 KB per image). I think memcache allows only up to 4 GB of cached data? Or is it 1 MB per memcached entity, with unlimited entities?
What if App Engine's memcache fails? -> Solution: I'd have to go back to the datastore.
How do I cache these images in the user's browser? If I were using method no. 1, I could just easily add the expiration date for the content to my app.yaml, and pictures would get cached on the user's side.
I would like to hear your thoughts.
Should I use method 1 or 2? Method 1 sounds dead simple and straightforward; should I be wary of it?
[EDITED]
How do I solve this dilemma?
Dilemma: the last thing I want is for people to grab the direct link to an image and put it up on bit.ly, because users would then be directed to just the image on my server (and not the advertising/content around it, as they would be if they had accessed it from the main page itself).
You're going to be using a lot of bandwidth to transfer all these images from the server to the clients (browsers). Remember that App Engine has a maximum number of files you can upload; I think it is 1000, but it may have increased recently. And if you want to control access to the files, I do not think you can use option #1.
Option #2 is good, but your bandwidth and storage costs are going to be high if you have a lot of content. To solve this problem people usually turn to Content Delivery Networks (CDNs). Amazon S3 and edgecast.com are two such CDNs that support token-based access URLs. Meaning, you can generate a token in your App Engine app that is good for an IP address, time window, geography and some other criteria, and then hand out your CDN URL with this token to the requester. The CDN serves your images and does the access checks based on the token. This will help you control access, but remember: if there is a will, there is a way, and you can't secure anything 100% - but you can probably get reasonably close.
So instead of storing the content in App Engine, you would store it on the CDN, and use App Engine to create URLs with tokens pointing to the content on the CDN.
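To make the token idea concrete, a generic sketch of an HMAC-signed, expiring URL in Node-style JavaScript (the parameter names and signing scheme are illustrative; real CDNs each define their own):

var crypto = require('crypto');

// Build a URL that is only valid until `expires`; the CDN (or your own
// edge code) recomputes the HMAC and rejects tampered or stale requests.
function signedUrl(baseUrl, path, secret, ttlSeconds) {
  var expires = Math.floor(Date.now() / 1000) + ttlSeconds;
  var token = crypto.createHmac('sha256', secret)
    .update(path + expires)
    .digest('hex');
  return baseUrl + path + '?expires=' + expires + '&token=' + token;
}

// Example: a link to a comic strip that dies after 10 minutes.
console.log(signedUrl('https://cdn.example.com', '/comics/001.png', 'secret-key', 600));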
Here are some links about signed URLs; I've used both of these:
http://jets3t.s3.amazonaws.com/toolkit/code-samples.html#signed-urls
http://www.edgecast.com/edgecast_difference.htm - look at 'Content Security'
In terms of solving your dilemma, I think that there are a couple of alternatives:
you could cause the images to be rendered in a Flash object that would download the images from your server in some kind of encrypted format that it would know how to decode. This would involve quite a bit of up-front work.
you could have a valid-one-time link for the image. Each time that you generated the surrounding web page, the link to the image would be generated randomly, and the image-serving code would invalidate that link after allowing it one time. If you have a high-traffic web site, this would be a very resource-intensive scheme.
Really, though, you want to consider just how much work it is worth to force people to see ads, especially when a goodly number of them will be coming to your site via Firefox, and there's almost nothing that you can do to circumvent AdBlock.
In terms of choosing between your two methods, there are a couple of things to think about. With option one, where you are storing the images as static files, you will only be able to add new images by doing an appcfg.py update. Since App Engine applications do not allow you to write to the filesystem, you will need to add new images to your development code and do a code deployment. This might be difficult from a site-management perspective. Also, serving the images from memcache would likely not offer a performance improvement over having them served as static files.
Your second option, putting the images in the datastore, protects your images from linking only to the extent that you have some power to control, through logic, whether they are served or not. The problem you will encounter is that making that decision is difficult. Remember that HTTP is stateless, so finding a way to distinguish a request coming from a link external to your application from one internal to it is going to require trickery.
My personal feeling is that jumping through hoops to make sure people can't see your comics without seeing ads is solving the problem the wrong way. If the content you are publishing is worth protecting, people will flock to your website to enjoy it anyway. Through high volumes of traffic, you will more than make up for anyone who directly links to your image, thus circumventing a few ad serves. Don't try to outsmart your consumers. Deliver outstanding content, and you will make plenty of money.
Your method #1 isn't practical: you'd need to upload a new version of your app for each new comic strip.
Your method #2 should work fine. It doesn't automatically "protect" your images from being hotlinked - they're still served from a URL like any other image - but you can write whatever code you want in the image-serving handler to try to prevent abuse.
A third option, and a variant of #2, is to use the newer Blobstore API. Instead of storing the image itself in the datastore, you store the blob key, and your image handler just instructs the blobstore infrastructure which image to serve.
