So I'm not sure if there is truly a way to make this happen.
Basically I have an Angular app with a Node.js backend that shows long-running queries on our production environment. It is set on an interval to refresh this info every 20 seconds. The issue is this:
Buttons on the page, if clicked while a refresh is occurring, will not fire. They have to wait until the whole stream of calls to refresh the data has finished, and even then they sometimes do not fire.
If I happen to catch it while no refresh is occurring, the button works as expected.
Is there a way to force this action to take precedence over the refresh so it occurs immediately even if there are other processes occurring?
Thanks!
Based on your clarification in the comments, it sounds like you're hitting your browser's limit on how many connections can be made to any given hostname. Look at the Connections per Hostname column here.
To fix this, send fewer calls at once. For example, you could implement your own queuing system for your refresh-related calls instead of firing them all off at once, leaving plenty of connections open for other requests (like your button's).
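A minimal sketch of such a queue in plain JavaScript (all the names here are mine, not from your code):

var refreshQueue = [];
var refreshRunning = false;

function queueRefresh(url) {
  refreshQueue.push(url);
  runNextRefresh();
}

function runNextRefresh() {
  // Run refresh calls one at a time so they occupy at most one
  // connection, leaving the rest free for user-triggered requests.
  if (refreshRunning || refreshQueue.length === 0) return;
  refreshRunning = true;
  var xhr = new XMLHttpRequest();
  xhr.open('GET', refreshQueue.shift(), true);
  xhr.onload = xhr.onerror = function () {
    refreshRunning = false;
    runNextRefresh();   // start the next queued refresh, if any
  };
  xhr.send();
}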
Related
I am creating a socket.io app using Angular 2 on the frontend, and I am getting some very weird behavior that I have never seen before when working with socket.io. I have no idea if my code is causing the issue or if it is something in the interaction between Angular 2 and socket.io, but if it is my code, I can't say which code I would need to post.
The mysterious behavior: my first instinct for testing whether my socket connections work properly is to open an incognito tab, go to my project site, log in as a different user, and see if API requests are emitted properly across the users. However, right now EVERY action made by either user happens to the other user as well. For example, if I type into a form on one of the clients, the other client's form gets updated with the same information. If I click the form's submit button to post the data, the other client's submit button is clicked as well. Occasionally it happens when navigating between states, where the other client will also navigate to that state. The behavior also occurs when logging in from a completely different computer, so I imagine it is an issue with how socket.io emits data.
All the clients are connecting and disconnecting appropriately and are getting assigned unique socket IDs.
Turns out, the solution was a bit simpler than I was expecting. The strange behavior came from a conflict between npm live-server and my socket.io connection running at the same time. I still cannot explain why the conflict manifested as this sort of strange behavior, but at least I got it to stop by running the app as an Express app serving up the index.html.
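In case it helps anyone, the replacement was essentially just this (a sketch; the 'public' folder name and port are assumptions, adjust them to your project):

var express = require('express');
var app = express();
var server = require('http').createServer(app);
var io = require('socket.io')(server);

app.use(express.static('public'));   // serve the built js/css assets
app.get('*', function (req, res) {   // always fall back to index.html
  res.sendFile(__dirname + '/public/index.html');
});

io.on('connection', function (socket) {
  console.log('client connected:', socket.id);
});

server.listen(3000);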
If anyone could explain why this might have been happening, I would love to hear.
I want to ask one question. Suppose my web page is like a text editor: I have some boxes where the user can write.
Suppose I want to implement autosave in Angular. I have ng-models for the info inside those boxes, and I currently use $watch to listen to all of my boxes' models.
So, in normal use, $watch continuously reports that my models are changing.
Will this method hurt my web app's performance?
If it does, would you recommend a way to implement autosave in Angular?
EDIT:
In brief: what I want to know is whether using $watch on multiple scope values that change continuously can affect the web page's performance.
Maybe you can just bind the text to the model via ng-model and set up a timer with whatever interval you want. That timer would then post the current state to the backend.
About performance: $watch will, as you said, fire very often, and you probably don't want traffic to the backend on every keystroke.
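A sketch of what I mean (the endpoint and controller names are made up):

angular.module('app', []).controller('EditorCtrl', function ($scope, $http, $interval) {
  $scope.document = { text: '' };   // bound to the textarea via ng-model

  // Post the current state every 10 seconds instead of on every change.
  var timer = $interval(function () {
    $http.post('/api/autosave', $scope.document);   // '/api/autosave' is made up
  }, 10000);

  // Stop saving when the view is torn down.
  $scope.$on('$destroy', function () {
    $interval.cancel(timer);
  });
});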
Hope it helps! Ask for more info if you want :)
EDIT:
If you want live updates, you should look at something like socket.io.
EDIT 2:
I wouldn't fear for the performance either, so much as for simplicity: I've used $watch a few times, and everything becomes harder to debug.
Cheers!
Well, instead of $watch'ing the models, you may opt for ng-change or ng-blur.
If you use ng-change, you can also set a timer that only re-evaluates your model once the user stops typing for a second.
ng-blur triggers only once, after the user is done typing and blurs/unfocuses the input.
Note: if you don't want to send autosave requests to the backend too often, you can autosave client-side (using localStorage), which is instant. Then you can send that final saved info to your server once the user is done editing and submits a form/is ready to continue. A sketch of both ideas combined follows below.
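Roughly like this (the endpoint and function names are made up):

angular.module('app', []).controller('EditorCtrl', function ($scope, $http, $timeout) {
  var pending;

  // Wired up in the template as:
  // <textarea ng-model="text" ng-change="onTextChange()"></textarea>
  $scope.onTextChange = function () {
    localStorage.setItem('draft', $scope.text);   // instant client-side save
    if (pending) $timeout.cancel(pending);
    // Only hit the backend once the user has stopped typing for a second.
    pending = $timeout(function () {
      $http.post('/api/autosave', { text: $scope.text });   // made-up endpoint
    }, 1000);
  };
});

In Angular 1.3+, ng-model-options="{ debounce: 1000 }" gives you the same delay declaratively, without the manual $timeout.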
I had the idea to make a SPA using AngularJS and then just send AJAX updates to the server when I need to.
My initial idea was to make the client application fly, but if I have to do an AJAX round trip to the server, I think the time would be approximately the same as requesting a single web page.
Requesting a page just transfers more bytes of data; it's not like I'm requesting 20 resources as in this article: https://community.compuwareapm.com/community/display/PUB/Best+Practices+on+Network+Requests+and+Roundtrips
I would be requesting one page or resource per request.
So in the end, even if I build my client-side application as a SPA using AngularJS, these requests would take some time and make the user wait, just about as long as requesting a full page. (They would have to be synchronous, showing a please-wait message until they return, as I don't want the user to take further actions before I'm sure the request passes validation and is processed correctly.)
I think SPA pages would be very useful if I had something like a wizard in my app, with multiple pages/steps that submit their results to the server at the end, which I don't.
Also found this article:
https://help.optimizely.com/hc/en-us/articles/203326524-AngularJS-Backbone-js-and-other-Single-Page-Applications
One of the biggest advantages of Single Page Apps is that they reduce data transfer. As a result, pages after the initial loading usually can be displayed faster and seem more interactive.
But I don't believe this last quote is really true.
Am I right, or is there a way that I'm not seeing to build an application that would look like it's executing locally?
I know people will start saying "it depends on what you want", but let's focus on this scenario where there are no wizards.
What you said is right. But most of these frameworks (Angular, Backbone) cache the HTML templates in the browser, so rendering is pretty fast compared to a traditional application, which has to fetch the HTML from the server on every request, and that is time-consuming.
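For instance, a tiny sketch of AngularJS's $templateCache, which keeps templates in the browser so later route changes never re-fetch them (the template name is made up):

angular.module('app', []).run(function ($templateCache) {
  // After this, ng-include and routing resolve 'greeting.html'
  // from memory instead of requesting it from the server.
  $templateCache.put('greeting.html', '<p>Hello {{name}}</p>');
});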
Hope this helps you!!!
If you want to go through that synchronous server-side validation step for each page request, then there is probably no big advantage to using AngularJS.
If you are requesting a page and then manipulating that page's contents once it's loaded, you might want to consider AngularJS. A good example would be requesting a page that displays a list of items. Now let's say we want to search that list or order it in different ways. Rather than using AJAX to ask the server to filter the list and then re-rendering it, it can be much faster to use AngularJS to filter and re-render the list without making any further requests to the server.
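A minimal sketch of that list example (made-up data; only the initial page load touches the server):

<script src="https://ajax.googleapis.com/ajax/libs/angularjs/1.8.2/angular.min.js"></script>
<div ng-app="listApp" ng-controller="ListCtrl">
  <input ng-model="query" placeholder="Search...">
  <ul>
    <!-- filter and orderBy run entirely in the browser -->
    <li ng-repeat="item in items | filter:query | orderBy:'name'">{{item.name}}</li>
  </ul>
</div>
<script>
  angular.module('listApp', []).controller('ListCtrl', function ($scope) {
    $scope.items = [{name: 'beta'}, {name: 'alpha'}];   // loaded once from the server
  });
</script>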
I'm having an issue where a long synchronous request will freeze Internet Explorer.
Let me explain the context: this is a web application which only supports IE8 and which can only use synchronous requests*.
We have a Silverlight component with a save button. When the user presses the button, some information is sent to the server using a synchronous XMLHttpRequest, then some other actions are done client-side in the Silverlight component.
The server-side part includes calculations that will sometimes take a while (several minutes).
In short, here is the code (C#, Silverlight part):
ScriptObject _XMLHttpRequest;                         // wraps the browser's XMLHttpRequest
_XMLHttpRequest.Invoke("open", "POST", url, false);   // false = synchronous mode
_XMLHttpRequest.Invoke("send", data);                 // blocks the UI thread until the server responds
checkResponse(_XMLHttpRequest);                       // never reached, see below
doOtherThings();
I know that the server does its work properly because I can see, in the verbose logs, the end of the page rendering for the "url" called from Silverlight.
However, in debug mode I can see that I never reach the checkResponse line. After the send line, IE freezes forever and does not unfreeze even once the server log shows that "url" has been processed.
I also tried adding _XMLHttpRequest.SetParameter("timeout", 5000) between the open and send lines. IE freezes for 5 seconds, then checkResponse and doOtherThings are executed. Then IE freezes again while the server-side calculations are processed and doesn't unfreeze once the server is done with its work.
IE's timeout is supposed to be 3 hours (registry key ReceiveTimeout set to 10800000), and I also got rid of IE's two-connection limit (MaxConnectionsPer1_0Server and MaxConnectionsPerServer set to 20).
One last important piece of information: there is no issue when the server-side part only takes a few seconds instead of several minutes.
Do you know where the freeze could come from (an IE bug, an XMLHttpRequest bug, something I have done wrong) and how I can avoid it?
Thank you !
Kévin B
*(While trying to solve my issue with the help of Google, I found an incredible number of "use async" and "sync is bad" posts, but I can't make this change in my app. Switching the application, the AJAX loads, and all the server-side calculations to asynchronous is a huge amount of work which has been quoted for our client and is a long-term objective. I need a short-term fix for now.)
Silverlight virtually requires that everything be done asynchronously; any long-running synchronous process will hang the browser if run on the UI thread. If you never reach the checkResponse line, it is possible that an unhandled exception was thrown on the previous line and is being swallowed. You can check your browser's dev tools for JavaScript errors. I am surprised that calling XMLHttpRequest synchronously works at all, since I would expect it to lock up the UI thread. But the solution depends on your definition of async.
You can try:
- calling the sync XHR request on a background thread and then marshalling to the UI thread (e.g. with Dispatcher.BeginInvoke) when you are ready
- setting up an XMLHttpRequest wrapper that makes the call in async mode and raises a callback in Silverlight on completion (a sketch follows below)
- using HttpClient or WebClient
While these options are async, they don't require your server code to be written any differently (it can stay synchronous). I have seen a web server process a call for over an hour before finally returning a response, at which time the Silverlight app raised a callback. You could even use tools like TPL async/await, or the co-routines available in many MVVM frameworks, to make the code appear very procedural/synchronous while still performing its actions asynchronously.
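For the wrapper option, the JavaScript side could look roughly like this (a sketch only; the function name, the plugin element id, and the 'Bridge'/'OnResponse' scriptable names are all assumptions, to be matched by a HtmlPage.RegisterScriptableObject call on the Silverlight side):

function sendAsync(url, data) {
  var xhr = new XMLHttpRequest();
  xhr.open('POST', url, true);   // async: the UI thread stays responsive
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4) {
      // Hand the result back to a [ScriptableMember] method on the plugin.
      var plugin = document.getElementById('silverlightControl');
      plugin.Content.Bridge.OnResponse(xhr.status, xhr.responseText);
    }
  };
  xhr.send(data);
}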
I hope you already solved your issue, but maybe this will be helpful to whomever may come across this post.
I implemented the observer design pattern in my application, but my app sends requests to a remote server over HTTP, and they take some time to resolve.
So, naturally, I put the sending and receiving part in a separate thread.
Can you please tell me how to make a window that observes the RequestObject and modifies its state based on the state of the request?
In the debugger's step-by-step mode the window runs the code that I want it to, but the window never refreshes itself.
Since I don't have a sample of your code, I don't know exactly how you are updating your UI. If you are attempting to update the UI from the separate thread, that could be your issue. This may be of some help: http://msdn.microsoft.com/en-us/magazine/cc188732.aspx
You may also consider using the Task Parallel Library to perform your async operations.
http://msdn.microsoft.com/en-us/library/dd997423.aspx