Using Babel/i18n with webapp2 and deferred tasks in Google App Engine

I have a relatively expensive email-a-set-of-people task for which I am trying to use the "deferred" library in Google App Engine. I'd like to send the handler just the list of recipients and some details, and have the handler format the emails in the correct language for each person. I'm using Babel.
If I do this without deferring, it works great: the translations for the content load and all is good. But if I defer and move the fetching of the content into the handler, the i18n library isn't loaded properly - i18n.get_i18n().translations is a NullTranslations object. So the deferred task handler apparently doesn't get the context when it kicks off. Is there a way to initialize i18n so that it reloads properly? Or some way to tell App Engine to preserve some context?
Thanks!

Ah. If I use the regular task queue instead of the deferred library, it works fine. The 'deferred' library is supposed to be just a wrapper around the task queue system, but it apparently interacts differently with the i18n library. i18n does want the 'request' object; maybe the deferred handler does something odd with it. No matter - using the queue directly is almost as easy.

Related

ExtJS6: Partial App load for special request that always opens in a new window

We have an ExtJS 7 app. For special requests like password reset, which always open in a new tab via an email reset link, the app gets loaded in full. We would like to load only the few pages that are needed for this kind of request.
Is there a way in ExtJS to load only a particular page and its dependencies?
I have not seen tutorials on this subject in the official documentation. I did the following myself: I just created another app (or bundle) for login. The backend is responsible for the logic of what to display (the login app or the main app) - in the absence of a session, the user receives the login app.
Absolutely. You can make another app - each app is a page, and will have its own packaged dependencies.
That's the easiest approach. A more complicated approach is to break your application into several ExtJS packages. You can then configure the app.json to exclude all of the packages from the micro loader. You then need to load these packages dynamically, presumably after logging in.
Doing this, though, is extremely complicated, and almost certainly not worth doing.

Why is my UI updated only once?

I have built a simple SPA with AngularJS/SignalR that sends a notification to my Hello hub when the app starts.
On the client side I have a list of notifications managed by a controller. This list gets updated only once, although console.log shows that my callback is called every time (and the debugger shows that this.messages grows with every notification received).
I don't get why the UI only updates on the first call (which is the one the current client emitted).
Here is the code that works only once:
NotificationCtrl.prototype.hello = function() {
    console.log("hello");
    this.messages.push(new Notification("Tom", "Now", "Is now connected"));
}
You need to call $scope.$apply(), because the callback is not scope-aware: it is not invoked from within Angular.
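A minimal sketch of the fix, assuming the controller keeps a reference to its scope (the this.$scope property here is an assumption for illustration):
NotificationCtrl.prototype.hello = function() {
    var self = this;
    // $apply runs the wrapped change inside a digest cycle,
    // so Angular notices the new message and updates the view.
    self.$scope.$apply(function() {
        self.messages.push(new Notification("Tom", "Now", "Is now connected"));
    });
}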
I think you would benefit a lot from my library, SignalR.EventAggregatorProxy.
It's designed around the event aggregation pattern and is a great fit for MV*-enabled sites using Knockout or Angular.
Have a look at the wiki for setting it up:
https://github.com/AndersMalmgren/SignalR.EventAggregatorProxy/wiki
Demo project
https://github.com/AndersMalmgren/SignalR.EventAggregatorProxy/tree/master/SignalR.EventAggregatorProxy.Demo.MVC4
Recent blog post I did about it
http://andersmalmgren.com/2014/05/27/client-server-event-aggregation-with-signalr/
Install using NuGet:
Install-Package SignalR.EventAggregatorProxy

How to create a dynamic front end based on Node JS, MongoDB, Sails JS

Basically I'm writing an app and am using Sails, MongoDB, and Node.js for the back end. I'll use Sails' API features, and was wondering what would be the best way to make the app realtime.
For instance, I could use AJAX to call the API, manipulate the DOM using jQuery, and update the DB through $.post, then let the model update the database in the backend. However, I'm finding this approach quite cumbersome, not to mention that the code could become quite difficult to maintain after a while.
I've been doing some research and - if I understood correctly - it seems I could use Backbone, Angular, or Knockout to manipulate the data/DOM on the front end. However, I'm not sure which would be the best approach in my case, nor whether any of these would indeed suit my needs:
Being able to get the data dynamically
Update the data and the DOM dynamically as the user interacts with the page
Post the updated data dynamically, with as little data transformation on the back end as possible
All the above asynchronously
I don't want this to become a heated debate on which library is best, so I would like to know only whether any of the aforementioned libraries can do what I need, and which is the leanest/simplest/has the lightest learning curve.
I did similar research a while ago, and when I found AngularJS I just stopped looking any further.
Right to your questions:
Being able to get the data dynamically
It is a pure pleasure to do this in Angular. For the very basic functionality you have the $http service, which allows you to send an HTTP request and register a callback for when the data arrives.
For more complicated things there are the ngResource module and Restangular (external).
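A minimal sketch of $http in a controller (the /api/messages endpoint and the module/controller names are illustrative):
angular.module('app', [])
    .controller('MessagesCtrl', function($scope, $http) {
        // $http.get returns a promise; the callback runs once
        // the response arrives from the server.
        $http.get('/api/messages').then(function(response) {
            $scope.messages = response.data;
        });
    });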
Update the data and the DOM dynamically as the user interacts with the page
For manipulating the DOM, Angular introduced the concept of directives. It is basically the future of the web (Shadow DOM and Web Components) available right now. At this point in time, there is nothing more elegant out there.
Post the updated data dynamically, with as little data transformation on the back end as possible
Yes. JSON.
All the above asynchronously.
Yes, of course.
SailsJS lets HTTP and socket.io connections be used interchangeably. In your case I think sockets would be a better fit than AJAX.
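For example, with the sails.io.js client the same blueprint routes can be reached over the socket (a minimal sketch; the user model is illustrative):
// io.socket is exposed globally once sails.io.js is included;
// requests made through it travel over socket.io.
io.socket.get('/user', function(resData, jwres) {
    console.log('Current users:', resData);
});

// Sails' resourceful pub/sub pushes model events to subscribed
// sockets under the model's identity.
io.socket.on('user', function(event) {
    console.log('User event:', event.verb, event.data);
});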

AngularJS getting data from backend

I would like to know the proper way to get data from the backend when I want to use AngularJS (or similar) in my web app.
The only way I see is to render the HTML (static HTML with JS scripts, e.g. AngularJS) with no data from the backend, and then download the data via AJAX requests to my backend API. But I think this solution is not good, because of the many HTTP requests:
For example, on a blog website I want to show a post, its comments, and the related posts in the sidebar. So I probably need to make at least 3 HTTP requests to get the data, unless I prepare the API to return everything I need in one request.
I can also imagine websites that would need many more HTTP requests. Is this the proper way to do it? Doesn't it overload the server? Or is my way of thinking wrong?
It is either websockets or HTTP requests. Preparing the API to get everything in one request is one option. Another two options are XMLHttpRequest streaming and iframe streaming, both methods of a technique known as Comet.
I would go with websockets, since they are meant to solve the problem that was previously solved with weird workarounds like iframe streaming. There are libraries that properly handle fallbacks if the browser does not support websockets:
web-socket-js (this needs a websocket server)
Socket.IO (this has a node.js module, and also implements a somewhat unnecessary protocol on top of the websocket protocol)
If you choose the old methods, there will be many problems waiting for you on the road, like reading XMLHttpRequest.responseText while loading (readyState == 3) in Chrome.
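For reference, the native WebSocket API needs very little code (the endpoint URL is illustrative):
// Open a connection and react to messages pushed by the server.
var socket = new WebSocket('ws://example.com/updates');
socket.onmessage = function(event) {
    var update = JSON.parse(event.data);
    // Apply the pushed update to the page model here.
    console.log('Server pushed:', update);
};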
I think you have to distinguish two cases:
You render the page for the first time.
You update parts of your page when something changes.
Of course in the second case it makes sense to fetch only parts of the page via individual HTTP requests. However, in the first case you can simply serialize your complete model as one JSON object and embed it in the page like this:
<script type="text/javascript">
var myCompleteModel = { /* Here goes your model */ };
</script>
The controllers of the components on your page can then access this global variable to extract the parts relevant to them. You can also wrap access to the initial model in a service, to avoid touching a global variable in all your controllers, as sketched below.
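A minimal sketch of such a wrapper (the module, service, and controller names are illustrative):
// Wrap the server-embedded global in a service so controllers
// depend on an injectable value instead of window state.
angular.module('app').factory('initialModel', function($window) {
    return $window.myCompleteModel;
});

angular.module('app').controller('PostCtrl', function($scope, initialModel) {
    $scope.post = initialModel.post;
});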

How to handle expired files without refreshing the browser when using Single Page Application (SPA)?

I have built a full Single Page Application (SPA) using AngularJS.
So far so good.
As everyone knows, all JavaScript files are loaded on first access, or some files are lazy-loaded when needed.
So far so good...
The situation is: the server updates all the files (HTML partials, JavaScripts, CSS), and the client is left with a lot of outdated files.
This would simply be solved by refreshing the browser: hitting the F5 key, Ctrl+F5, or the refresh button. But this concept does not exist when working with an SPA.
I'm not sure how to solve this problem.
I could somehow detect the update (maybe with a ping) and just reload that specific file, with a document.write strategy. But that raises another problem: I have a single JavaScript file with all the JavaScript minified.
I could try to force a full reload in the browser, or force a re-login (and a reload, because the login page is part of the SPA).
But reloading is an ugly solution: imagine the client losing all the data in a form because he was unlucky enough that the server had just updated. And worse, I must now create some "auto-save" feature just because of this.
I'm not sure how to handle this, if possible, in the "Angular way".
I wonder how Gmail handles this, because I stay logged in for many, many hours without logging off.
As others have already suggested, keep logged-in users on the old version of your webapp.
Not only is what you ask difficult to do in Angular, but it can also lead to a bad user experience and surprising behaviour, since there may not be a mapping between what the user is doing in the old version and what the new version provides. Views may be removed, renamed, split, or merged. The behaviour of the same view may have changed, and changing it without notice may cause the user to make mistakes.
You gave the example of Gmail, but you may have noticed that changes to its UI always happen after you log out, never while you're using it.
First of all, if your app is an intranet website used during office hours, just update it while nobody is using it. This is a much simpler solution.
Otherwise, if you need to provide round-the-clock availability, my suggestion is:
When you deploy the new version of your SPA, keep the old version running in parallel, keep the current users on the old version, and log new users into the new version. This can be done in a number of ways depending on your setup, but it's not difficult to do; see the sketch after this list.
Keep the old version around until you're confident that nobody is still using it, or until you're pretty sure that the new version is OK and you won't need to roll back to the old one.
The backend services should be backward-compatible with the old version of the frontend. If that's not possible, you should keep multiple versions of the backend services too.
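A minimal sketch of one way to pin users to a version, using Express and a cookie (the cookie name, directory layout, and version scheme are all assumptions for illustration):
var path = require('path');
var express = require('express');
var cookieParser = require('cookie-parser');

var CURRENT_VERSION = '2.0.0'; // bumped on each deploy

var app = express();
app.use(cookieParser());

// New sessions get the current version; existing sessions keep
// the build they started with until their cookie expires.
app.get('/', function(req, res) {
    var version = req.cookies.appVersion || CURRENT_VERSION;
    res.cookie('appVersion', version);
    res.sendFile(path.join(__dirname, 'builds', version, 'index.html'));
});

app.listen(3000);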
As the rest of the answers have said, one solution is to version your files. That way, every time the browser checks those files, it notices that they differ from the ones on the server, and it caches the new files.
I suggest using a build tool like gulp, grunt, or webpack; the last one is becoming more popular.
At the moment I use gulp for my projects, though I'm moving to webpack.
If you are interested in gulp, have a look at the gulp-rev and gulp-rev-replace plugins.
What do they do?
Let's say you have the file app.js in your project. After applying gulp-rev to your project you get something like app-4j8888dp.js, but the HTML file where app.js is injected still points to app.js, so you need to replace that reference. To do that you can use the gulp-rev-replace plugin.
E.g. a gulp task:
var gulp = require('gulp');
var rev = require('gulp-rev');
var revReplace = require('gulp-rev-replace');
var useref = require('gulp-useref');
var filter = require('gulp-filter');
var uglify = require('gulp-uglify');
var csso = require('gulp-csso');

gulp.task("index", function() {
    var jsFilter = filter("**/*.js", { restore: true });
    var cssFilter = filter("**/*.css", { restore: true });
    var indexHtmlFilter = filter(['**/*', '!**/index.html'], { restore: true });
    return gulp.src("src/index.html")
        .pipe(useref())                // Concatenate with gulp-useref
        .pipe(jsFilter)
        .pipe(uglify())                // Minify any JavaScript sources
        .pipe(jsFilter.restore)
        .pipe(cssFilter)
        .pipe(csso())                  // Minify any CSS sources
        .pipe(cssFilter.restore)
        .pipe(indexHtmlFilter)
        .pipe(rev())                   // Rename the concatenated files (but not index.html)
        .pipe(indexHtmlFilter.restore)
        .pipe(revReplace())            // Substitute in new filenames
        .pipe(gulp.dest('public'));
});
If you want to know further details, see the links below.
https://github.com/sindresorhus/gulp-rev
https://github.com/jamesknelson/gulp-rev-replace
A single-page application is just that: a single stack that controls the client logic of your application. Thus, any navigation done through the application should be handled by your client, not by the server. The goal is to have one single "fat" HTTP request that loads everything you need, and then perform only small HTTP requests.
That's why you can only have one ng-app in your app. You are not supposed to have multiple ones and load just the modules you need (although the AngularJS team wants to move in that direction). In all cases, you should serve the same minified file and handle everything from your application.
It seems to me that you are more worried about the state of your application. As Tom Dale (EmberJS) described in the last Cage Match, we should aim for applications that can reflect the same data between server and client at any point in time. You have many ways to do so: cookies, sessions, or local storage.
Usually an SPA communicates with a REST-based server, and hence performs idempotent operations on your data.
tl;dr You are not supposed to refresh anything from the server (styles or scripts, for instance), just the data that your application is handling. A single initial load is what an SPA is all about.
Separate your data and logic, and reload the data using AJAX whenever you want; for that I suggest you use a REST API to get the data from the server.
An SPA saves you from making HTTP requests again and again, but it still requires some HTTP requests to bring new data into the view.
Well, you would have to unload the old existing code (i.e. the old AngularJS app, modules, controllers, services, and so on). Theoretically, you could create a custom (randomized) app name (with all modules carrying this prefix for each unique build!) and then rebuild your app in the browser. But seriously... that's a) very complex and b) will probably fail due to memory leaks.
So, my answer is: Don't.
Caching issues
I would personally recommend naming/prefixing all resources a build depends on with a unique id: the build id, an SCM hash, a timestamp, or the like. So the URL to the JavaScript is not domain.tld/resources/scripts.js but domain.tld/resources-1234567890/scripts.js, which ensures that this path's content will never conflict with a (newer) version. You can choose the path/URL however you want (depending on the underlying structure: it is all virtual, as long as you can remap URLs). It would not even be required for each version to exist forever (i.e. you could map every resources-(\d+)/ path to resources/); however, that would not be nice for the concept of URLs.
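One way to realize that remapping, sketched with Express (the route pattern and directory are illustrative):
var express = require('express');
var app = express();

// Strip the version segment so every versioned path maps to the
// same physical directory; the id only exists to defeat stale caches.
app.use(function(req, res, next) {
    req.url = req.url.replace(/^\/resources-\d+\//, '/resources/');
    next();
});
app.use('/resources', express.static(__dirname + '/resources'));

app.listen(3000);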
Application state
Well, the question is how often the application will change in ways that make such reloads necessary. How long does the SPA stay open in a browser? Is it really impossible to support two versions at the same time? The client (the app in the browser) could even send its own version within its HTTP requests.
In the beginning of a new application there are a lot of changes that require a reload, but soon after the application reaches a solid state the changes become smaller and will not require a reload. The users themselves will refresh more often... more than we ever expected :/
As with what everyone else is saying...
Don't. And while socket.io could work, it's asking for trouble unless you are VERY careful.
You have two options. The first: upon a server update, invalidate any previous session (I would also give users half an hour's notice, or 5 minutes, depending on the application, before the maintenance is done).
The second option is versioning. If they are on version 10, then they communicate with backend 10. If you release version 11 and they are still on 10, then they can still communicate with backend 10.
Remember Google Wave? It failed for a reason: too many people writing to one source at the same time causes more problems than it solves.
Use the $state service. Create the state while loading the page, using a constructor; after the specified time, re-create the state and load the page.
function(state) {
    state.stateCtor(action);
    $state.transitionTo(action + '.detail', {}, {
        notify: true
    });
}
Version your files, so that on every update the version number is incremented and the browser picks up the update automatically.
My solution consists of several points.
While this is not essential, I send one JavaScript file to the client side, and I use grunt to prepare my release. The same grunt file adds a query string with the version number to the JavaScript tag. Regardless of whether you have one file or lots of files, you need to add a version number to them. You may need to do this for other resources as well (see the example at the end).
In all my responses from the server (I use Node), I send back the version number of the app.
In Angular, when I receive any response, I check the version number against the version number loaded. If it has changed (meaning the server code has been updated), I alert the user that the app is going to reload, and I use location.reload(true);
Once reloaded, the browser will fetch all the new files again, because the version number in the script tag is now different, so it will not get them from the cache.
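A minimal sketch of that check as an Angular HTTP interceptor (the X-App-Version header name and the baked-in loadedVersion value are assumptions for illustration):
angular.module('app').factory('versionInterceptor', function($window) {
    var loadedVersion = '0.1.1-1011'; // written in at build time
    return {
        response: function(response) {
            var serverVersion = response.headers('X-App-Version');
            if (serverVersion && serverVersion !== loadedVersion) {
                $window.alert('A new version was deployed; the app will reload.');
                // Passing true asks the browser to bypass its cache.
                $window.location.reload(true);
            }
            return response;
        }
    };
});

angular.module('app').config(function($httpProvider) {
    $httpProvider.interceptors.push('versionInterceptor');
});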
Hope this helps.
<script src="/scripts/spa.min.js?v=0.1.1-1011"></script>
