$httpBackend from ngMockE2E never called - AngularJS

I'm trying to write some backendless e2e tests, so I need to mock API calls.
Here is what I did:
angular.module('foo.bar.e2eConf', ['foo.bar', 'ngMockE2E']).
  run(function($httpBackend) {
    $httpBackend.whenGET('/foo/bar').respond({foo:'bar'});
  });
Then I configured my conf/karma.e2e.conf like this (the paths are OK):
var basePath = '../';
var files = [
ANGULAR_SCENARIO,
ANGULAR_SCENARIO_ADAPTER,
// bower libs
'components/angular/index.js',
'components/jquery/jquery.js',
'components/angular-resource/index.js',
'components/angular-mocks/index.js',
'components/chai/chai.js',
'test/chai.conf.js',
'src/app/**/*.js',
{pattern:'src/app/**/partials/*.tpl.html', included:false},
'test/e2e/**/*.js'
];
var singleRun = false;
var browsers = ['Chrome'];
var proxies = {'/': 'http://localhost:8000/'};
I can run tests that don't involve API calls, but when I run a test that does, I get a nice: Failed to load resource: the server responded with a status of 404 (Not Found) http://localhost:9876/foo/bar
I guess I misconfigured something, but I can't figure out what.
Is there a conflict between the proxy and the mock, i.e. is /foo/bar being proxied to http://localhost:8000/foo/bar instead of being handled by the mock?
Any idea?
Regards
Xavier

You need to create a version of your app that bootstraps off the foo.bar.e2eConf module, instead of bootstrapping off the foo.bar module.
You'll have to include angular-mocks.js and the file containing the new module you defined above in your app's index page.
You should be able to test this outside Karma by just using this new app and seeing it return data from your mocks.
You probably don't need to add half of those files to the Karma configuration. That's just for adding files to the testing scenario: Karma loads your app in an iframe, and your app is responsible for loading its own JavaScript.
I'm using PHP to serve up either version of the app depending on which URL I use: the real version that makes the API calls, or the e2e version that uses the mocks.
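For illustration, here is a minimal sketch of what such an e2e setup could look like; the file name, the passThrough() rule, and the ng-app hint are assumptions layered on top of the question's code, not something prescribed by the answer:

// app-e2e.js - loaded only by the e2e index page, after angular-mocks.js
angular.module('foo.bar.e2eConf', ['foo.bar', 'ngMockE2E'])
  .run(function($httpBackend) {
    // mocked API call from the question
    $httpBackend.whenGET('/foo/bar').respond({foo: 'bar'});
    // let everything else (templates, assets) go through to the real server
    $httpBackend.whenGET(/.*/).passThrough();
  });
// The e2e index page then bootstraps with ng-app="foo.bar.e2eConf"
// instead of ng-app="foo.bar".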

Related

Understanding Selenium + browsermob-proxy + protractor + AngularJS

What I have: several integration test specs written in Jasmine for my AngularJS app (they navigate through my entire app)
What I want: monitor my app's network traffic and export the data using HAR
Naive solution: just write a script which receives a URL and exports the data using HAR. It's easy, but it's not automatic (I need to provide the URLs manually)
Enhanced solution: automate the process mentioned above. A script that navigates through all the pages of my app and extracts the network data for each. But since I'm already navigating through all the pages of my app via integration tests (Protractor + Jasmine), I want to just "plug in" the part about exporting the network traffic.
I've found "How can I use BrowserMob Proxy with Protractor?" and was checking out the example provided there, but I'm not quite sure how it works.
What should I put as the host and port for the proxy?
I'm using Selenium, and I've specified the host and port for it, but I'm getting ECONNREFUSED errors.
This is my protractor file config:
var Proxy = require('browsermob-proxy').Proxy;
...
protractorConf = exports.base = {
  //... more things
  onPrepare: function() {
    //... more things
    browser.params.proxy = new Proxy({ // my selenium config for browsermob
      selHost: '10.243.146.33',
      selPort: 9456
    });
    //... more things
  }
}
And in one of my integration test specs (it's CoffeeScript, btw):
beforeEach ->
  browser.get BASE_URL
  browser.params.proxy.doHAR 'some/page/of/my/app', (err, data) ->
    if err
      console.log err
    else
      console.log data
But as I've said, I'm getting an ECONNREFUSED error. I'm quite lost about the integration of Selenium with Protractor and browsermob.
Any ideas or alternatives? Thanks!
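For context, a rough sketch of how the browsermob-proxy Node module is typically wired up. Note that the BrowserMob Proxy REST API (a separately started Java process) has its own host and port, distinct from the Selenium server's; the host/port option names, default values, and the example URL below are assumptions based on that module's documentation, not from the question:

var Proxy = require('browsermob-proxy').Proxy;

var proxy = new Proxy({
  host: 'localhost',        // BrowserMob REST API host (assumption)
  port: 8080,               // BrowserMob REST API port (assumption)
  selHost: '10.243.146.33', // Selenium host, as in the question
  selPort: 9456             // Selenium port, as in the question
});

// doHAR takes a full URL and a callback, as in the question's spec.
// An ECONNREFUSED here typically means nothing is listening on the
// BrowserMob REST host/port configured above.
proxy.doHAR('http://your-app-host/some/page/of/my/app', function(err, data) { // hypothetical URL
  if (err) { console.log(err); } else { console.log(data); }
});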

AngularJS better way to read config file?

I built an AngularJS app that I wanted dynamically configured, so I created a config.json file with the needed configuration and decided to load it in app.config like so:
angular.module("myapp",[]).config([ my injections] , function(my providers){
if (window.XMLHttpRequest) {
xhr = new XMLHttpRequest();
} else if (window.ActiveXObject) {
xhr = new ActiveXObject("Microsoft.XMLHTTP");
}
xhr.open("GET","config.json"); //my config file
xhr.send();
xhr.onreadystatechange=function()
{
if (xhr.readyState==4 && xhr.status==200)
{
//config file parsed, set up params
}
}
})
The reason I am doing it this way is that $http cannot be injected during the config phase, and I do not want to "configure" the app at the controller level.
The application works fine: it does what I want it to do, and everything works great... EXCEPT when it comes to unit testing (with Karma + Jasmine).
Even though in karma.conf I have:
{pattern: 'config.json',served:true,watched:false,included:false}
defined, when I launch Karma I get a CLI [WARN] about config.json returning a 404, and my unit tests that check whether everything is configured fail (i.e. it didn't read config.json).
Is there a better way to write config files for unit testing?
In our apps we have an external file, config.js, which contains just an ordinary module that provides the configuration as constants.
angular.module('myAppPrefixconfig', [])
  .constant('PLAIN_CONSTANT', 'value')
  .constant('APP_CONFIG', {...});
Your app declares a dependency on it, and config.js is fetched by an ordinary HTTP request, which your backend can resolve with the proper configuration.
In Karma tests you can then provide 'testing config' directly in karma.conf.
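A minimal sketch of that idea, assuming a hypothetical test/config.mock.js file and the constant names above (the values are made up for illustration):

// test/config.mock.js - a test-only stand-in for the real config.js
angular.module('myAppPrefixconfig', [])
  .constant('PLAIN_CONSTANT', 'test-value')
  .constant('APP_CONFIG', { apiBase: 'http://localhost/fake-api' });

In karma.conf you would then list test/config.mock.js in the files array instead of serving the real config.json.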

environment specific config with angularjs

I am working on creating a website with AngularJS and Web API as the backend, using Visual Studio.
I have defined the URL to the Web API in app.config.js as an app constant:
var serviceBase = 'http://localhost/Webapiservice/';
app.constant('ngAppSettings', {
  apiServiceBaseUri: serviceBase,
  clientId: 'ngTestApp'
});
Now for QA environments (http://QAServer/Webapiservice/), the webapi resides at a different URL, the same goes for our production environment (http://ProdServer/Webapiservice/).
I can manually update the JS file to the appropriate location, but is there a way to automate this process so the app points at the correct Web API URL?
Is it possible to do this with Grunt? Again, I have never used Grunt before.
ngConstant does a great job, along with Grunt: https://github.com/werk85/grunt-ng-constant. This way you can specify your environments as JSON files, and at compile/run time Grunt generates an environment.js file with a module (I always call mine ENV) which can be injected into every part of your application.
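For illustration, a rough Gruntfile sketch of that approach; the target names, output path, and values are assumptions, and exact option names may differ between grunt-ng-constant versions:

// Gruntfile.js (sketch)
module.exports = function(grunt) {
  grunt.initConfig({
    ngconstant: {
      options: {
        name: 'ENV',                     // name of the generated Angular module
        dest: 'app/scripts/environment.js'
      },
      development: {
        constants: { ngAppSettings: { apiServiceBaseUri: 'http://localhost/Webapiservice/', clientId: 'ngTestApp' } }
      },
      qa: {
        constants: { ngAppSettings: { apiServiceBaseUri: 'http://QAServer/Webapiservice/', clientId: 'ngTestApp' } }
      },
      production: {
        constants: { ngAppSettings: { apiServiceBaseUri: 'http://ProdServer/Webapiservice/', clientId: 'ngTestApp' } }
      }
    }
  });
  grunt.loadNpmTasks('grunt-ng-constant');
  // e.g. run "grunt ngconstant:qa" as part of the QA build
};

The generated ENV module is then listed as a dependency of the main app module in place of the hand-maintained constant.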
I would do something like this:
angular
  .module('app', [])
  .value('$path', {
    rest: "http://localhost/Webapiservice"
  })
Then you would use it with something like this:
app.constant('ngAppSettings', {
  apiServiceBaseUri: $path.rest,
  clientId: 'ngTestApp'
});
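For completeness, a sketch of consuming the $path value through injection in a service; the service and endpoint names here are hypothetical:

angular.module('app').factory('apiClient', ['$http', '$path', function($http, $path) {
  return {
    // hypothetical call against the configured backend
    getItems: function() {
      return $http.get($path.rest + '/api/items');
    }
  };
}]);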

Using require.js with jasmine to load scripts returning 404

I'm using Backbone.js, and I use require.js to load each dependent Backbone widget before firing up my app and putting everything in a custom namespace, in this case "Foo". I'd like to have Jasmine load this loader file and pick up all the dependent JavaScript files (located in /public/js of my main app); however, I'm getting all 404s, as Jasmine doesn't know about the /public/js directory on port 8888. How can I get Jasmine to load these scripts?
Foo = {};
jQuery(function() {
  var include = ['/js/widget.js', '/js/delta_widget.js', '/js/inbox.js', '/js/time_widget.js', '/js/high_stock_widget.js', '/js/daily_summary_widget.js'];
  require(include, function() {
    $.getScript('/js/app.js');
  });
});
For each of these scripts, I'm getting:
Failed to load resource: the server responded with a status of 404 (Not Found) http://0.0.0.0:8888/js/widget.js
It would seem that your Jasmine loader file / SpecRunner is in a different directory than your require.js loader (main.js by default). You will have to configure require.js to use a different base path by doing the following:
jQuery(function() {
  require.config({
    baseUrl: '/public'
  });
  var include = ['js/widget.js',
                 'js/delta_widget.js',
                 'js/inbox.js',
                 'js/time_widget.js',
                 'js/high_stock_widget.js',
                 'js/daily_summary_widget.js'];
  require(include, function() {
    $.getScript('/js/app.js');
  });
});
You have to configure the above "baseUrl" property to point to the proper URL/Path.
For example, if your Jasmine SpecRunner is located in:
base
- main.js
- js
-- widget.js
-- app.js
- Libs
-- Jasmine
--- SpecRunner.html
then you need to configure
baseUrl: "../../"
Hope this helps.

How to use Web Workers in a module built with RequireJS?

I have a working app written with RequireJS and Backbone.js, but it sometimes gets really slow, for example when it has to do some arithmetic work! I tried to use a Web Worker to do this arithmetic work, like this:
My module (traffic.js):
define(["jquery", "use!underscore", "use!backbone", "namespace" ],
function ($, _, Backbone, globals) {
.....
var worker = new Worker("arithmetic.js");
worker.addEventListener('message', function(e) {
console.log(e.data);
}, false);
worker.postMessage(globals.data); // Send data to our worker.
});
arithmetic.js:
define(["use!underscore", "use!backbone" ],
function ($, _) {
//Here die Operations
});
But I get the error "define is not defined"!
I tried it like this too, but with no success!
How can I use a Web Worker with RequireJS or Backbone.js?
Thanks!
You can use RequireJS from web workers: see the API docs for more info.
The only requirement is to load RequireJS at the start of the web worker with importScripts(…). Once that is loaded, you can use define and use RequireJS as normal.
The key part for me when I was getting it to work was to make sure that you are also loading the same bootstrap configuration (e.g. config.js or main.js) in the web worker that you are using in the rest of your app. This is what the doc is talking about when it says:
You will likely need to set the baseUrl configuration option to make sure require() can find the scripts to load.
Another thing is that you can load the worker from your traffic.js file with a module ID (instead of hardcoding the script path) utilizing this requireJS plugin.
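A minimal sketch of what arithmetic.js could look like with that approach; the require.js path, the baseUrl, and the 'workers/heavyMath' module are assumptions for illustration:

// arithmetic.js - runs inside the Web Worker
// Load RequireJS itself first; the path is an assumption about this app's layout.
importScripts('/js/libs/require.js');

// Mirror the main app's bootstrap configuration (baseUrl, paths, shims),
// ideally by loading the same config.js/main.js the rest of the app uses.
require.config({
  baseUrl: '/js'
});

self.onmessage = function(e) {
  // Dependencies can be required on demand inside the worker.
  // 'workers/heavyMath' is a hypothetical module containing the arithmetic.
  require(['workers/heavyMath'], function(heavyMath) {
    self.postMessage(heavyMath.compute(e.data));
  });
};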
