In my AngularJS app, I have a lot of different factories for making different REST requests.
Going over them, I realized that they all share a lot of similar code. This is terrible for DRYness and also makes them messier and longer than necessary.
For example, they all do the same checks for caches:
// Cache stuff
var cacheName = 'projects.rsrc.getProjectUsersService';
var cache = cacheAppFactory.get(cacheName);
if (clearCache) {
    cacheAppFactory.clear(cacheName);
}
I want to simply place this code in ONE function which takes a cacheName and call it from every factory.
Is there a way to reference some "global" code, or functions, from inside factories?
You can extend the functionality of your cache factory to register every other factory that uses the cache functionality of the cache factory.
Then simply create a method in the cache factory that would return caches from all the registered factories.
That's probably as DRY as you can get, and the same principle can be applied to everything you think repeats too much, not just the cache handling.
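As a sketch of that idea (all names here, such as cacheAppFactory, checkCache and register, are illustrative, not an existing API): one shared helper does the lookup-and-clear dance, and the cache factory keeps a registry of every factory using it so it can report on all caches at once.

```javascript
// Illustrative sketch: a cache factory with a registry, plus the single
// shared checkCache() function the question asks for. None of these names
// come from an existing library.
var cacheAppFactory = (function () {
  var caches = {};     // cacheName -> cached value
  var registered = []; // names of every factory that uses the cache

  return {
    get: function (name) { return caches[name]; },
    put: function (name, value) { caches[name] = value; },
    clear: function (name) { delete caches[name]; },
    register: function (name) { registered.push(name); },
    // Return the caches of all registered factories in one place.
    all: function () {
      return registered.map(function (name) {
        return { name: name, cache: caches[name] };
      });
    }
  };
})();

// The one shared function each REST factory calls instead of repeating
// the cache boilerplate inline:
function checkCache(cacheName, clearCache) {
  cacheAppFactory.register(cacheName);
  if (clearCache) {
    cacheAppFactory.clear(cacheName);
  }
  return cacheAppFactory.get(cacheName);
}
```

In a real app, checkCache would itself live on the cache factory (or a small utility service) and be injected wherever it is needed.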
Say I have the following AngularJs service:
angular.module("foo")
    .service("fooService", function(){
        var svc = this;
        svc.get = function(id){...};
        svc.build = function(id){...};
        svc.save = function(thing){...}; //posts, then returns the saved thing
        svc.getOrCreate = function(id){
            return svc.get(id).then(function(thing){
                return thing || svc.build(id).then(function(builtThing){
                    return svc.save(builtThing);
                });
            });
        };
    });
I can unit test the get method by making sure the right API endpoint is being reached, with the right data.
I can test the build method by making sure it pulls data from the proper endpoints/services and builds what it's supposed to.
I can test the save method by making sure the right API endpoint is being reached.
What should I do to test the getOrCreate method? I get two differing opinions on this:
stub the get, build and save methods and verify they're called when appropriate, and with the proper parameters
stub the API endpoints that are being called in get and build, then verify that the endpoints within save are being called with the proper parameters
The first approach is basically saying, "I know these three methods work because they're independently tested. I don't care how they actually work, but I care that they're being called within this method."
The second approach is saying, "I don't care about how this method acts internally, just that the proper API endpoints are being reached"
Which of these approaches is more "correct"? I feel the first approach is less fragile since it's independent of how the get, build and save methods are implemented, but it's not quite right in that it's testing the implementation instead of the behavior. However, option 2 is requiring me to verify the behavior of these other methods in multiple test areas, which seems more fragile, and fragile tests make people hate programming.
This is a common tradeoff I find myself facing quite often with tests... anybody have suggestions on how to handle it?
This is going to come down to a matter of opinion.
If you are unit testing, your tests should exercise very specific functionality.
If you start chasing promises, and you have promise chaining, where does it stop?
Most importantly, as your unit test's scope gets bigger and bigger, there are more things that it depends on (services, APIs, etc.), and more ways in which it can break that may have nothing to do with the "unit": the very thing that you want to make sure works.
Question: if you have a solid controller that works great with your template, and a unit test that ensures that controller is rock solid, should a twice-detached promise that resolves from the response of a web service HTTP API call be able to break your controller test?
On the other hand, the same way you test your API client end points by mocking the service, you can test the service with its own tests using something like Angular's $httpBackend service.
I have seen it done both ways and don't have a strong preference either way. Personally, however, I would consider option 2, where you don't mock the other functions that are tested elsewhere, to be more of an integration test, because it exercises multiple publicly visible functions at once; for that reason I would prefer option 2.
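For what it's worth, option 1 can be sketched without any framework at all. Below, fooService is rebuilt as a plain object, and the Jasmine spies are replaced by hand-rolled stubs that record their calls; everything here is illustrative, not the asker's real code.

```javascript
// Framework-free sketch of option 1: stub get/build/save and verify that
// getOrCreate wires them together in the right order.
function makeFooService() {
  var svc = {};
  svc.get = function (id) { /* real impl would hit the API */ };
  svc.build = function (id) { /* real impl would assemble a new thing */ };
  svc.save = function (thing) { /* real impl would POST */ };
  svc.getOrCreate = function (id) {
    return svc.get(id).then(function (thing) {
      return thing || svc.build(id).then(function (built) {
        return svc.save(built);
      });
    });
  };
  return svc;
}

// Hand-rolled "spies": stubs that record each call they receive.
var svc = makeFooService();
var calls = [];
svc.get = function (id) { calls.push('get'); return Promise.resolve(null); };
svc.build = function (id) { calls.push('build'); return Promise.resolve({ id: id }); };
svc.save = function (t) { calls.push('save'); return Promise.resolve(t); };
```

A test then asserts that when get resolves to nothing, getOrCreate falls through to build and save, without ever touching a real endpoint.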
We are using Angularjs and ui-router. We generally have a layout of each page that utilizes views. We have a filter view, sort view, and pagination view; as well as display views that can be swapped in and out.
Logically, when changes are made to any of these controllers, we need to update the displayData as appropriate. Changes to the filter should run the filterMethod but also need to run the sort and pagination logic afterwards, while changes to sort should run just pagination afterwards, making a clear order of operations for when each controller needs to run its update.
My problem comes when I consider that in some cases we may not want to utilize all 3 controllers. We may want filtering, but not pagination for example.
We are having trouble finding a clean way to make these controllers 'just work', so that we can plug in whichever controls we want in ui-router and have them function. The problem is mostly one of scoping. If I do the obvious thing and have each controller define its own updateData method when changes are made to it, I run into scoping problems when I want it to call the next controller's update afterwards. The filter controller can't call sort because the two controllers don't share a scope. I could use broadcasts, but what if I want a filter and a pagination controller, but not a sort? How do I ensure that sort runs before pagination if both are present, but that if the sort controller doesn't exist, pagination knows to run after filter?
I could instead move everything up to my top-level controller, and then things just work. However, I then end up with a controller that feels like it's doing way too much. It's cleaner to have one controller for each type of control if possible.
We have other approaches we could use, but they make pretty strong presumptions about our controller scheme. If I later added some fourth controller, I would have to modify everything, because each controller is hard-coded with very explicit presumptions about how the others run.
This seems like a common issue. Is there a best practice or convenient technology for handling this splitting of functionality across controllers?
Hoist the data, not the display logic. There are only two ways to share data between controllers cleanly: a service, and a parent controller.
If what you were sharing was data (e.g. displayData), I might suggest a service object, but this sounds more like application state (e.g. orderBy), so I think these "settings" should live on a parent controller.
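As an illustration of why hoisting helps with ordering: if the parent owns the settings and a single update pipeline, each stage runs in a fixed order and is simply skipped when its controller isn't plugged in. This is a framework-free sketch; all names (makeDisplayPipeline, filterText, and so on) are made up.

```javascript
// Sketch: the parent owns the settings and one update pipeline. Child
// controllers only mutate settings and call update(); the order of
// filter -> sort -> paginate lives in exactly one place, and any stage
// whose setting is absent is skipped.
function makeDisplayPipeline(sourceData) {
  var settings = { filterText: null, orderBy: null, pageSize: null, page: 0 };
  var stages = [
    function filter(items) {
      if (settings.filterText == null) return items;
      return items.filter(function (s) {
        return s.indexOf(settings.filterText) !== -1;
      });
    },
    function sort(items) {
      if (settings.orderBy == null) return items;
      return items.slice().sort(); // a real impl would use settings.orderBy
    },
    function paginate(items) {
      if (settings.pageSize == null) return items;
      var start = settings.page * settings.pageSize;
      return items.slice(start, start + settings.pageSize);
    }
  ];
  return {
    settings: settings,
    update: function () {
      return stages.reduce(function (items, stage) {
        return stage(items);
      }, sourceData);
    }
  };
}
```

The filter controller never calls the sort controller; it just sets settings.filterText and calls update(), and pagination is guaranteed to run last whether or not a sort controller exists.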
So I am currently starting to create my first app in AngularJS and I am having a bit of trouble working out how components such as modules, factories and services should be applied for my specific app.
In the tutorials I have seen, for the sake of simplicity, the data collected via $http in a service is $scope ready. What I mean by "scope ready" is a controller can directly call the service and place the output directly into the scope without any modification.
However, in my app, the data retrieved via $http is in an arbitrary JSON format Data1 and needs to be converted to an arbitrary JSON format Data2 before it can be placed into the $scope. Once converted, the data in format Data1 is no longer required. The user then makes changes to the data (in format Data2) and clicks a submit button when finished. The data is then converted back from format Data2 to format Data1 and sent to the server. This process of converting between data formats is, I guess, "the brains" of my app, and if I were to write it in vanilla JavaScript it would require multiple functions (not sure how that translates to AngularJS though).
So my first question is, where would I place my data format conversions, forward and back? Should I put them in services, factories, modules, controllers, etc.? My guess is I would put the conversion between data formats in a service but I am not sure.
To make things a bit more complex, the data for my app only needs to be fetched once (as data format Data1) and then used across multiple views (as data format Data2). From reading this question and its suggested answer I think the best option is to use the $rootScope to store my data (in format Data2). So my second question is, how do I get my data into the $rootScope independently of any particular view so it only runs once at the initialization of the app? (I know how to complete the reverse process with the button described above).
If I haven't explained myself very well please let me know and I will try to clarify.
Thanks, JamesStewy
Let's just get the terminology aligned first - it might simplify things.
Factories and Services are the same thing as far as controllers are concerned. The only difference is in how they are created, but that happens before they are injected into the controller.
Modules are like packages, or containers of functionality. So, for the purposes of this question, they are of no importance.
Models are objects representing the data (and associated functionality) of your backend's data layer.
ViewModels are objects used and modified by the View. This is what you referred to as "scope-ready".
Controllers are responsible for handling calls coming from the View (e.g. ng-click) and for marshaling data between Models and ViewModels (often via Services). It is a good practice to keep the controllers as lean as possible.
So this is where you need to make a judgement call. Is this conversion business-logic independent? Is this conversion specific to a single controller? Is this conversion short and readable? If the answer to the above is no, then you should place this logic in a Service. Services are used to encapsulate the backend/business logic or more-than-trivial front-end logic.
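For example, such a conversion service could look like the following sketch. The Data1/Data2 field names (i, n, id, label) are invented, since the real formats are specific to the asker's backend.

```javascript
// Sketch of a conversion service. The controller only ever calls
// toViewModel/toServerFormat; all the "brains" live here. Field names
// are hypothetical stand-ins for the real Data1/Data2 formats.
var converterService = {
  // Data1 (server format) -> Data2 (view model)
  toViewModel: function (data1) {
    return data1.items.map(function (it) {
      return { id: it.i, label: it.n };
    });
  },
  // Data2 (view model) -> Data1 (server format)
  toServerFormat: function (data2) {
    return {
      items: data2.map(function (vm) {
        return { i: vm.id, n: vm.label };
      })
    };
  }
};
```

In Angular this would be registered with app.service("converterService", ...) and injected into any controller that needs the conversion, keeping those controllers lean.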
As to your second question, I would suggest having the Service fetch and cache the data in a local service variable. Services are singletons, so every controller will get the same instance.
app.service("myService", function($http){
    var promise = null;
    this.get = function(){
        if (!promise){
            promise = $http.get(url); // url: wherever the Data1 payload lives
        }
        return promise;
    };
});
app.controller("ctrl1", function($scope, myService){
    myService.get().then(function(response){
        $scope.items = response.data;
    });
});
You could use $rootScope - and store the data there within the .run block. But I would only do so if this data is going to be used in the View (i.e. HTML) and by all or most of the Views. Also, be careful (!) - if the data is asynchronously fetched it would not yet be available when controllers run.
I'd like to know if using
angular.extend($scope,MyService);
Does it break the OOP encapsulation principle?
Does it smell like MyService.call($scope)?
Could you face variable and function conflicts?
Is this a bad/good practice?
Typically, in my experience, services are injected into the controller and then called from that scope. I wouldn't say that using extend to copy over functions and properties is necessarily bad, but it might undermine some of the purposes of IoC (inversion of control), which is what injection in Angular is based on.
Does it break oop...?
My understanding is that what you would get from this call is additional functions and service calls directly on your scope. This doesn't break OOP in the sense that scope is an object and would have functions applied. Provided those functions + properties make sense on the scope it seems like a fine thing to do from that perspective.
Does it smell like MyService.call($scope)?
As I said in the first paragraph - I don't see why you wouldn't just call the service and either share data or pass in references to objects to the service. Another pattern that is common in angular is to use a promise to process returned data in your scope. That looks like:
MyService.callFunction(parameters).then(function (data) {
    // process data here. Since this runs in $scope you can also use $scope normally.
    // $scope.$apply() is in progress and will complete when the function returns
});
All the service does is provide the data to the scope then. Point is that I think there are better patterns than "extend".
Can you face conflicts?
In the call angular.extend(a, b), data, properties and functions are copied from b to a. If something already exists on a, it will be overwritten with the data from b. So technically the short answer is "yes", you can face conflicts.
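Since angular.extend(a, b) does a shallow copy, much like Object.assign, the conflict is easy to demonstrate in plain JavaScript, using Object.assign as a stand-in:

```javascript
// Demonstration of the overwrite hazard. Object.assign stands in for
// angular.extend here; both shallow-copy own properties from right to left.
var $scope = { reload: function () { return 'controller reload'; } };
var MyService = { reload: function () { return 'service reload'; } };

Object.assign($scope, MyService); // stand-in for angular.extend($scope, MyService)

// The controller's reload has been silently replaced by the service's:
console.log($scope.reload()); // "service reload"
```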
The bottom line
So at the end of the day this isn't a bad pattern but there are probably more common patterns I would try to use first.
Context: I'm new to Angular, and this feels like a lot more of a "What's the right way to do this in AngularJS" kind of question.
My API backend has a couple of related objects that I need to request and assemble into a coherent user interface. It can be modeled as a subscription hub thing, so I have: Subscription hasMany Subscription_Items, belongsTo Source.
What I want to do is look up a user's Subscriptions (/api/subscriptions?user_id=1), which gives me some JSON that includes a subscription_item_ids=[1,2,3 ...] array. I then want to query on those ids, as well as query the server on source_id to pull shared info, and then repackage everything nicely into the $scope so the view layer has easy access to the system and can do stuff like ng-repeat="item in subscription.subscription_items" inside an outer ng-repeat="subscription in subscriptions".
Conceptually this makes sense, and I've thought of a few ways to load this linked data, but what I'm curious about is: what's the best practice here? I don't want to excessively reload the data, so it seems like a plain old function that does a REST request every time I look at an item is a bad idea, but at the same time, I don't want to just push the data in once and then miss out on updates to items.
So, the question is: what's the best way to handle linked resources like this, to trace hasMany and belongsTo types of connections out to other models in a way that aligns with the ideas embedded in $scope and the $apply cycle?
I like to use a lazy-loaded dataModel service which will cache results and return promises. The interface looks like this:
dataModel.getInstitution(institutionId).then(manageTheInstitution);
If I need something that is a child, I call it like this:
dataModel.getStudents(institutionId).then(manageStudents);
Internally, getStudents looks something like this:
function getStudents(institutionId) {
    var deferred = $q.defer();
    getInstitution(institutionId).then(function(institution) {
        institution.students = Student.query({institutionId: institutionId});
        institution.students.$promise.then(function(students) {
            deferred.resolve(students);
        });
    });
    return deferred.promise;
}
These functions are a bit more complex. They cache the results and don't request them again if they already exist... and return or chain the existing promise. They also handle errors.
By carefully crafting my dataModel service this way, I can manage any nesting of resources and I can optimize my network requests. I've been very happy with this approach so far.
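The caching part described above can be sketched with plain promises. Here fetchStudents stands in for the Student.query call, and makeDataModel is an invented name; the real service would use $q and $resource.

```javascript
// Sketch of the lazy-loaded, promise-caching dataModel idea: the first
// caller triggers the request, every later caller for the same id chains
// the same stored promise, so the network is hit at most once per id.
function makeDataModel(fetchStudents) {
  var studentPromises = {}; // institutionId -> promise of students

  return {
    getStudents: function (institutionId) {
      if (!studentPromises[institutionId]) {
        studentPromises[institutionId] = fetchStudents(institutionId);
      }
      return studentPromises[institutionId];
    }
  };
}
```

Error handling (e.g. evicting the cached promise on rejection so a retry is possible) would sit in the same place, which is one reason the author's "a bit more complex" real functions earn their keep.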