I used to do performance testing on websites mostly with JMeter scripts.
However, more and more projects are built with frontend MVC frameworks such as AngularJS, and a current project loads all of its content via Angular view files, REST calls, etc.
Unfortunately, JMeter doesn't execute any JavaScript, so my load test returns the homepage in just ~400 ms.
In reality it takes several seconds to load in a browser, and when I check the response data it doesn't yet contain the actual content, because Angular loads that afterwards.
Instead of investigating the network traffic and individually loading each component (e.g. profile.html, notification.html, REST calls, etc.), is there a product on the market or a best practice I could follow that is similar to executing JMeter scripts, but takes JavaScript execution and the external resources it loads into account?
(I am not planning to profile JavaScript execution times. This is still about testing whether the infrastructure behind the site is capable of serving xyz simultaneous users.)
Although JMeter isn't capable of executing client-side JavaScript, it can record the relevant requests via the HTTP(S) Test Script Recorder. Once recorded, you should be able to combine all the standalone requests into one "aggregate" using JMeter's Transaction Controller.
If this simple approach doesn't work for you for some reason, check out How to Load Test AJAX/XHR Enabled Sites With JMeter for more options and clues.
I use Chrome DevTools to do this kind of performance testing on web apps.
I suggest you read the Chrome profiling docs (https://developer.chrome.com/devtools/docs/javascript-memory-profiling). The whole Performance and Profiling section of Google's documentation is really good!
You can try the 'Use as Monitor' option for the requests you fire from your test.
http://jmeter.apache.org/usermanual/build-monitor-test-plan.html
They are performance killers, though. Another option is to use the 'Save Responses to a file' listener to see whether the final HTML is delivered. It won't give you the ideal result, but it might help.
If you want to track down the performance of XHRs for a single user, you can play with Selenium and BrowserMob Proxy, but that falls under functional testing rather than stress testing.
You can try https://github.com/kidk/felt, which is built for this specific purpose.
It uses PhantomJS/SlimerJS to generate load against a website, so you get all the API/JS/CSS and image calls you would get in a normal browser. It is still a young project, but it might be the solution you are looking for.
(This is my personal project)
Related
Our team is constantly working on an Angular application, and every week or two we update it with new features or fixes for bugs that came up.
We are using a C# backend with web services.
QUESTION: When we update the application on the server, we sometimes (not every time) find that users are still seeing the old HTML and functionality. What is the way to solve this?
I didn't find this on Google; maybe I'm not searching for the right terms.
Any help is appreciated.
Users have to clear their cache to get the new version of the application.
What you are seeing are cached copies of the JS files (possibly HTML partials too).
When the browser parses the HTML page, it issues the request for the JS resource and looks at various cache information before deciding whether to reuse its cached copy (if any) or to query the server again.
You can find some extra details in Google's Web Fundamentals article on HTTP caching.
A solution I have adopted myself is to set the cache headers to cache the file for a very long period, and then use build tools to version the file, either in the filename or with a request parameter.
I have used grunt-cache-breaker and found it to serve this purpose well. I know there is a gulp equivalent too.
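As an illustration of the idea (not grunt-cache-breaker's actual configuration), here is a minimal hand-rolled Grunt task that appends the package version as a query parameter to the script references in index.html; the file paths and the use of the package.json version are assumptions:

// Gruntfile.js - minimal sketch of query-string cache busting.
// Assumes dist/index.html references scripts like <script src="app.js"></script>
// and that the version in package.json is bumped on every release.
module.exports = function (grunt) {
  grunt.registerTask('cachebust', 'Append the package version to script URLs', function () {
    var version = grunt.file.readJSON('package.json').version;
    var html = grunt.file.read('dist/index.html');

    // Rewrite src="something.js" (with or without an existing ?v=) to src="something.js?v=<version>"
    html = html.replace(/src="([^"]+\.js)(\?v=[^"]*)?"/g, 'src="$1?v=' + version + '"');

    grunt.file.write('dist/index.html', html);
    grunt.log.ok('Cache-busted script references with version ' + version);
  });
};

Combined with far-future cache headers on the JS files, every release then forces the browser to fetch the new scripts because the URL changes.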
The most common way to make sure cached versions of your JavaScript aren't used is to add the version as a parameter in the reference to the script, like so:
<script src="/app.js?v=v1.0"></script>
Is there any way to trigger my JavaScript modules in JMeter to check whether they are working properly during load/performance tests?
JMeter is not a browser, so it isn't able to execute client-side JavaScript. All that JMeter can do in this regard is:
Download scripts along with other embedded resources like images and styles (but only once per virtual user) to simulate real browser behavior more or less closely.
Simulate JavaScript-driven calls like AJAX/XHR via separate HTTP Request samplers.
The most common practice for monitoring a web application during a performance test session is manually accessing the application under test and assessing its responsiveness and behavior.
This process can also be automated using JMeter's WebDriver Sampler plugin, which lets you invoke arbitrary Selenium code to orchestrate one or several real browsers and check, for example, actual page load speed during the performance test session.
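For illustration, a WebDriver Sampler script (written in its JavaScript dialect) could look roughly like the sketch below; the URL and the .profile selector are placeholders and not part of the original answer:

// WebDriver Sampler script - minimal sketch for measuring real browser page load.
// WDS is the object the sampler exposes; the URL and selector are placeholders.
WDS.sampleResult.sampleStart();                       // start the timer

WDS.browser.get('http://example.com/#/dashboard');    // full browser load, JavaScript included

// Wait for an Angular-rendered element before stopping the clock,
// so the sample time includes client-side rendering.
var wait = new org.openqa.selenium.support.ui.WebDriverWait(WDS.browser, 10);
wait.until(org.openqa.selenium.support.ui.ExpectedConditions.presenceOfElementLocated(
    org.openqa.selenium.By.cssSelector('.profile')));

WDS.sampleResult.sampleEnd();                          // elapsed time becomes the sample time
WDS.log.info('Page rendered in ' + WDS.sampleResult.getTime() + ' ms');

Run a handful of these in parallel with the main HTTP load to get browser-level timings while the backend is under stress.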
We're developing a website with AngularJS, and would like to have static HTML files for every AngularJS page for search engines.
The site is small and static, about 10-20 pages.
Is there any Grunt task that can generate HTML files from an Angular app? E.g. one that generates /static/about.html for the /#about page of the Angular app.
I saw services and scripts like PhantomJS, but they look too complicated for our case. As the site is static, we can run the task every time we are about to publish changes.
PhantomJS is indeed the way to go. However, there are existing tools that do most of what you want.
This one is a good example:
https://github.com/cburgdorf/grunt-html-snapshot
It runs PhantomJS through the pages of your website and generates an HTML version of each one. It can be automated via Grunt.
The only caveat is that you have to list manually, in the configuration, all the pages you need to index.
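As a rough sketch, the Grunt configuration could look something like the following; the option names should be double-checked against the plugin's README, and the local server URL, output path and page list are assumptions:

// Gruntfile.js - sketch of a grunt-html-snapshot configuration.
// Verify option names against the plugin's README; the values below are assumptions.
module.exports = function (grunt) {
  grunt.initConfig({
    htmlSnapshot: {
      all: {
        options: {
          snapshotPath: 'static/',                      // where the rendered HTML files go
          sitePath: 'http://localhost:9000/',           // the running Angular app
          urls: ['#/about', '#/contact', '#/pricing'],  // every page to index, listed by hand
          msWaitForPages: 1000                          // give Angular time to render
        }
      }
    }
  });

  grunt.loadNpmTasks('grunt-html-snapshot');
  grunt.registerTask('snapshot', ['htmlSnapshot']);
};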
Is that OK for what you need?
EDIT: This is also a quite interesting and complete article about SEO-friendly Angular applications: http://www.ng-newsletter.com/posts/serious-angular-seo.html
It covers a few alternatives depending on your needs.
Sometimes starting up a new instance on App Engine can take a while; it obviously depends on your choice of libraries, etc. My question is: is it possible to serve some other page while users are waiting? Reddit does a nice job of this. Other sites like Twitter show a similar notice to users when the load is too high (they are probably also starting more instances in the background).
Does anybody have any experience in doing this on GAE?
Another hacky approach that is very simple to implement without code changes: use a second App Engine app that statically serves a tiny HTML page containing just an iframe pointing at your real App Engine app.
This might break the TOS, so check that before doing it. It will also cost you more.
The wrapper app can serve JS that shows the "loading" page, inserts a hidden iframe, and, once the iframe finishes loading, removes the loading content and shows the iframe.
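A minimal sketch of what the wrapper page's script might look like (the #loading element and the target URL are assumptions):

// Script on the wrapper page. Assumes the static HTML contains a
// <div id="loading">...</div> message; the iframe URL is a placeholder.
var frame = document.createElement('iframe');
frame.src = 'https://your-real-app.appspot.com/';    // the real App Engine app
frame.style.cssText = 'display:none;width:100%;height:100vh;border:0';

frame.addEventListener('load', function () {
  // The real app finished loading inside the hidden iframe:
  // drop the "loading" message and reveal the iframe.
  var loading = document.getElementById('loading');
  if (loading) loading.parentNode.removeChild(loading);
  frame.style.display = 'block';
});

document.body.appendChild(frame);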
I would like to write a scenario test that runs against a remote web site. How can I do that?
You should be able to use browser().navigateTo('http://whereveryouwant.com') and then use any of the e2e API methods to manipulate the page and make assertions.
The major caveat is that Angular's scenario runner doesn't support full page reloads, so this will limit what you can do in your tests. If you do anything on the page that results in a full page reload, the test runner will freeze up.
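A minimal sketch of such a scenario test, using the scenario runner's DSL (the URL, selectors and expected text are placeholders):

// scenarios/remote.js - sketch of an Angular scenario (e2e) test against a remote site.
// The URL, the selectors and the expected text are placeholders.
describe('remote site', function () {
  it('should load the dashboard and greet the user', function () {
    browser().navigateTo('http://whereveryouwant.com/#/dashboard');

    input('user.name').enter('Alice');                 // fill an ng-model bound input
    element('.btn-login').click();                     // click an element by CSS selector

    expect(element('h1').text()).toEqual('Welcome, Alice');  // assert on rendered content
  });
});

Keep in mind the caveat above: any action that triggers a full page reload will hang the runner.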
Browsers don't allow cross-site requests, so there is no direct way to do what you want.
You need to make it look to the browser as if the local e2e runner and the remote site are on the same domain. And the only way to do that is with a proxy.