I'm wondering whether Restangular has support for the following. I have a UserModel that is part of my model layer. It may have custom attributes that the server doesn't have in its model, as well as behavior. I'm not clear whether I can use my custom UserModel, send it to the backend, and, when it comes back, transform it into the UserModel object of my model layer again so that I still have the custom attributes and methods.
Here's the plunker: http://plnkr.co/edit/IlYcSRuX3GPWmewxniuq?p=preview
Where do I handle the transformation? Do I add the methods in the config block, or should I add them via a response interceptor? What about custom attributes that the server might not send back to me? I haven't run across any good examples of this.
The UserInfoCntrl controller sends the UserModel object into the contactInformationService in my example.
Some of this comes down to design choices, i.e. use what you think is best. However, a common pattern [citation needed ;)] would be to put the synchronization logic between client and server in the "model" service.
The UserModel service would then be responsible for providing the User object to the rest of the application, keeping it in sync with the server (perhaps via methods like save(), or perhaps automatically?). The service would then be the only module responsible for communicating with the server, at least when it comes to user objects. It can also automatically pull the user data from the server when instantiated.
The architecture feels very clean, at least to me.
I don't have any concrete examples that exactly suit your needs, but this authentication service by Fnakstad springs to mind. It maintains an object (actually a user object!) using $http and $cookieStore. Restangular is a bit more high-level than $http, but the concept of a self-contained service that provides methods for manipulation and storage still stands.
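As a rough sketch of that idea with Restangular (the 'users' route, the load()/save() methods and the getFullName() helper are placeholders of my own, not anything Restangular or your app prescribes):

app.factory('UserModel', function (Restangular) {
  // client-side user object; custom attributes the server doesn't know about live here
  var user = { isDirty: false };

  return {
    current: user,
    // pull the user from the server and merge the response into the same object
    load: function (id) {
      return Restangular.one('users', id).get().then(function (data) {
        angular.extend(user, data.plain());
        return user;
      });
    },
    // push the client-side object back to the server
    save: function () {
      return Restangular.one('users', user.id).customPUT(user);
    },
    // behavior that only exists on the client
    getFullName: function () {
      return user.firstName + ' ' + user.lastName;
    }
  };
});

Restangular also exposes an extendModel configuration hook if you would rather decorate every response for a route in the config block instead of keeping a wrapper service.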
Title says it all, I think.
I have encountered both $_SESSION['Auth'] and $this->Authentication->getIdentity(), and I don't know which one I should use. Is one safer than the other?
Thank you,
Simon
With CakePHP you should always use the abstracted APIs to access any superglobal data like $_POST, $_COOKIE, $_SESSION, etc.
This is advised for a multitude of reasons that depend on the specific situation, but generally it touches on the principle of dependency inversion, and on decoupling in general: your code should depend on abstractions, not on concretions, so that implementations can change without breaking your application. And while the session object, the request object, and the authentication component aren't interfaces, they still abstract the access to the underlying data (the concretion, so to speak).
One place where this generally matters is testing. With the exception of the CakePHP session object, which must write its data to the $_SESSION superglobal internally, the other superglobals like $_GET, $_POST, $_COOKIE, etc. are not populated when you use the API provided by CakePHP; the data is written into the request object instead, which exposes it via its own API. So if you were, for example, to access $_POST directly in your code and then pass POST data in a test like $this->post('/url', $postData), your code wouldn't see the data, because it lands in the request object rather than in the $_POST superglobal.
As far as the authentication example specifically goes, the authentication middleware could have obtained the identity from who knows where (the session, cookies, tokens, etc.), and likewise it could persist the identity anywhere (the session, cookies, etc.). The inner layers of your application shouldn't have to care about such implementation details: they obtain the identity via the component, or from the request object, and that's it. They don't need to know anything else, and you can then easily change how authentication is handled without breaking the rest of your application.
I am currently learning AngularJS for an application that will need role-based access control logic. In some scenarios I will need to restrict access to certain pages based on the user's role. In other scenarios I will have to restrict access to a section of a page, or to certain fields on a form, based on the user's role. Since AngularJS is a client-side framework, this seems to present a problem if I don't want the client to have any access to an item they aren't supposed to have access to.
What is the current approach for handling these scenarios from the server without interfering with Angularjs?
I know I can use Razor to restrict access to page sections, but what problems would this present for AngularJS, and would it be a good idea to mix Razor and Angular view syntax?
In my transition to Angularjs, I am having a problem wrapping my mind around how to handle this.
You can request views from MVC using ajax and handle any access restriction in the response.
I don't have a good example at hand, but it might point you in the right direction.
With Knockout I have used a template engine to retrieve partial views from MVC.
The same thing should work here, and you can keep the access restriction on the server side (which you should).
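A rough sketch of the idea, assuming a hypothetical /Views/AdminPanel action that returns the rendered partial for authorized users and a 403 otherwise:

// ask MVC for a server-rendered partial; the server decides whether the user may see it
$http.get('/Views/AdminPanel').then(function (response) {
  // render with ng-bind-html (after $sce.trustAsHtml), or swap in ng-include instead
  $scope.adminHtml = response.data;
}, function (response) {
  if (response.status === 403) {
    $scope.adminHtml = null;   // not authorized: nothing to show for this section
  }
});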
One of the methods is to set up a service (in Angular) that checks whether the user is authorized for the page, and then to set a param on the route:
$routeProvider.when('/admin', {
  templateUrl: 'partials/admin.html',
  controller: 'AdminCtrl',
  requireLogin: true // THIS is the custom flag your service checks
});
and then canceling navigation within a controller by handling the $locationChangeStart event
(This is all shown in the article I posted below).
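A minimal version of that guard might look like the following; authService and its isLoggedIn() method are placeholders for your own service, and I'm using $routeChangeStart here because it hands you the route definition (the article's $locationChangeStart variant works along the same lines):

app.run(function ($rootScope, $location, authService) {
  $rootScope.$on('$routeChangeStart', function (event, next) {
    // send unauthenticated users to the login page instead of the flagged route
    if (next && next.requireLogin && !authService.isLoggedIn()) {
      $location.path('/login');
    }
  });
});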
My suggestion would be to create views for each logical page element, with individual controllers, and handle access the same way as in the example. Instead of per page, it will be per view.
For adding/removing inputs on your forms, you could also handle the check inside the service you created and then ng-show/ng-hide the element depending on the access level.
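For the form-field case, a hedged sketch (authService.hasRole() and the 'admin' role are again assumptions about your own service):

// the controller asks the service for the user's access level and exposes it to the view
app.controller('UserFormCtrl', function ($scope, authService) {
  $scope.canEditSalary = authService.hasRole('admin');
});
// in the template: <input ng-model="user.salary" ng-show="canEditSalary">
// (this only hides the field in the UI; the server must still reject unauthorized writes)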
A quick Google search for 'Role Based Access Control in Angular' will pull up tons of useful tutorials and articles.
See This
And This
(more in-depth with the same examples)
You need to think about it in two parts:
Securing your API (ASP.NET Web API)
Using security/role information to present an appropriate UI (available options and routes, elements enabled / disabled etc. -> AngularJS)
You should assume that API clients are malicious - they aren't necessarily your application.
In terms of authentication/authorisation options: HTTP header tokens such as JWT are a common option, but HTTP-only cookies are still a good option, especially if your clients are all web-based. The other advantage of JWT is that you can allow the client (AngularJS) to read the payload and easily share information about what the user is allowed to access. If you use cookies, you generally have to supply that information in another way (server-side injection or an API call, for example).
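For example, letting the AngularJS side read the claims is just a matter of decoding the token's middle segment. This is only a sketch, the claim names are made up, and the server must still validate the signature on every request:

// read the (unverified) claims out of a JWT so the UI can adapt to the user's roles
function readJwtClaims(token) {
  var payload = token.split('.')[1];
  // base64url -> base64, then pad and decode
  payload = payload.replace(/-/g, '+').replace(/_/g, '/');
  while (payload.length % 4) { payload += '='; }
  return JSON.parse(atob(payload));
}
// usage: readJwtClaims(storedToken) might return { sub: '42', role: 'admin', exp: 1516239022 }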
How you generate the token / cookie (and what it contains in terms of claims) depends on how you need to authenticate people. It can be your own ASP.NET MVC login form, with credentials stored in a database if necessary.
You can use MVC views as AngularJS templates if you like, though I tend not to see many advantages beyond the layout.
Let's say we have a User entity. Should I have two smaller services (User and Users)? Or one larger service that deals with both a collection of Users and an individual User? If it is the latter, is it best practice to name the service User or Users?
I use one service per entity that houses the collection, the methods used by the entity-collection controller, and the methods used by the individual entity's controller. This works for my team, as we follow the repository pattern in our server code. I keep the collection in the service because it is accessed often, and parts of the collection are nice to share in other areas, for example to keep a count in the menu or to create a relational list in another controller. The individual entity is typically only accessed by the controller for its view and can be disposed of when the route changes, as long as the corresponding list item was updated if the entity was changed.
The only time I used separate services was one edge case where the customer wanted an entity to keep its state when the route changed, without persisting the entity to the server or a cache. The entity needed to be saved somewhere, so that was reason enough to create a service for the individual entity.
I do use a separate service per entity to manage HTTP requests. So each entity does have two services: one to manage the collection and all CRUD+ functionality, and the other for HTTP, for separation of concerns.
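A stripped-down sketch of that two-service split (the 'user' entity and the /api/users endpoints are just examples):

// HTTP service: the only place that talks to the server for this entity
app.factory('userHttp', function ($http) {
  return {
    list:   function ()     { return $http.get('/api/users'); },
    save:   function (user) { return $http.post('/api/users', user); },
    remove: function (id)   { return $http.delete('/api/users/' + id); }
  };
});

// collection service: caches the list and exposes the CRUD used by the controllers
app.factory('userStore', function (userHttp) {
  var users = [];                        // shared, so a menu count etc. stays in sync
  return {
    users: users,
    load: function () {
      return userHttp.list().then(function (response) {
        users.length = 0;                // mutate in place so existing references stay valid
        Array.prototype.push.apply(users, response.data);
        return users;
      });
    }
  };
});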
In the single-page web app I've recently built, I'm getting data for my models using the Restangular module. I'd like to add real-time updates to the app, so that whenever any model is changed or added on the server I can update my model list.
I've seen this working very well in web apps like Trello, where you can see the updates without refreshing the page. I'm sure the Trello web client uses a REST API.
What is a proper way to architect both the server and the client to achieve this?
First of all, your question is too general and can have a lot of solutions that depend on your needs and conditions. I'll give you a brief overview of a single case: leaving your REST APIs in place and adding some realtime with web sockets.
Get all data from the REST API -- sockets for notifications only.
Pros: Easy to implement on both the server side and the client side. You only need to emit events on the server with info about the modified resource (like resource name and ID), then catch these events on the client side and fetch the data with the REST API (a tiny client-side sketch follows the two cases).
Cons: One more request to the server on every notification. That can increase traffic dramatically when you have a lot of active clients for a single resource (they will generate a lot of reverse requests to the server).
Get the initial load from the REST API -- sockets for notifications with a data payload.
Pros: All the info comes with the notification and does not cause new requests to the server, so there is less traffic.
Cons: Harder to implement on both the server side and the client side. You will need to attach data to every event on the server, and read data from every event on the client side.
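For illustration, case 1 on the client could be as small as this (socket.io as the transport, the 'tasks' resource and the updateLocalCopy() helper are all assumptions; the gists linked further down are more complete):

// case 1: the socket only says *what* changed; the client refetches over REST
socket.on('UpdateTask', function (message) {
  // the notification carries only the resource name/ID; fetch the fresh data via REST
  Restangular.one('tasks', message.id).get().then(function (task) {
    updateLocalCopy(task.plain());   // hypothetical function that merges it into your list
  });
});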
Updated according to the comment
As for handling different types of models (just a way to go).
Client side.
Keep a factory for each model.
Keep in mind that you need realtime updates only for displayed data (in most cases), so you can easily use an in-memory cache (which lets you find any entity by its ID).
Add a listener for every type of change (Created, Updated, Deleted).
In each listener you should call some initObject function that finds the entity in the cache by its ID and extends it; if there is no entity with that ID, it just creates a new one and adds it to the cache (see the sketch after this list).
Any Delete just removes the entity from the cache.
Any time you need this resource, you should return a reference to the cached object in order to keep two-way data binding (that is why I use extend and not =). Of course, you need to handle cases like "the user is editing the resource while a notification about its deletion comes in".
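A minimal sketch of that initObject idea (the cache here is just a plain object keyed by ID):

// in-memory cache keyed by ID; notifications merge into it instead of replacing objects
var cache = {};

function initObject(data) {
  if (cache[data.id]) {
    angular.extend(cache[data.id], data);  // extend, not =, so existing bindings stay live
  } else {
    cache[data.id] = data;                 // first time we see this entity
  }
  return cache[data.id];
}

function removeObject(id) {
  delete cache[id];                        // Delete notifications simply drop the entry
}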
Server side.
It is easier to send the whole model than just the modified fields (in both cases you must send the ID of the resource).
For any Create, Update or Delete, push an event to all engaged users (a minimal emit sketch follows this list).
The event name should contain the action name (like New, Update, Delete) and the name of the resource (like User, Task, etc.). So you will have NewTask and UpdateTask events.
The event payload should contain the model, or just the modified fields together with the ID.
Collection changes can be handled in two ways: by adding/updating/removing items in the collection, or by replacing the whole collection.
All modifications like PUT, POST, DELETE are of course still made with REST.
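On the server side this could be as simple as the following (socket.io is an assumption about the stack, and the 'task:<id>' room is just one way to model "engaged users"):

// after a successful REST handler (e.g. PUT /tasks/:id), notify the engaged clients
function notifyTaskUpdated(io, task) {
  // event name = action + resource; payload = the model (or just the changed fields + ID)
  io.to('task:' + task.id).emit('UpdateTask', task);
}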
I've made a super simple pseudo gist for case 1: https://gist.github.com/gpstmp/9868760. It can be updated for case 2 like so: https://gist.github.com/gpstmp/9900454
Hope this helps.
I have a function that collates an array of objects received from a REST service into groups, while also applying an ordering that is set by a user preference available only on the client.
Currently this collation is handled in the service that calls the REST service, but I see a need to separate this functionality from the actual $http call, because the user can switch between different collating instructions without reissuing the $http call.
Since this isn't a simple reordering, is it appropriate to create this collation process as another service, or as a filter?
Filters are UI constructs, whereas services are not. If you find yourself making $http calls, they should be in a service (or factory). If you find yourself wanting to create a filter that is also responsible for collecting data, I'd do this (a small sketch follows the list):
Create a service which is responsible for fetching the data
Inject the service into your controller and expose the service data via the controller
Bind the data from the controller to the filter parameter
Doing this will preserve proper separation of concerns.
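Put together, those three steps might look roughly like this (the itemService/collate names, the endpoint, and the sort-based collation are all placeholders for your own logic):

// 1. service: responsible for fetching the data (endpoint is an example)
app.factory('itemService', function ($http) {
  return { load: function () { return $http.get('/api/items'); } };
});

// 2. controller: exposes the data and the user's current collation preference
app.controller('ItemsCtrl', function ($scope, itemService) {
  itemService.load().then(function (response) { $scope.items = response.data; });
  $scope.grouping = 'category';            // the client-only user preference
});

// 3. filter: only transforms what it is given; changing the preference needs no new request
app.filter('collate', function () {
  return function (items, grouping) {
    if (!items) { return items; }
    // placeholder collation: sort by the chosen key (swap in your real grouping logic)
    return items.slice().sort(function (a, b) {
      return String(a[grouping]).localeCompare(String(b[grouping]));
    });
  };
});

// in the template: <li ng-repeat="item in items | collate:grouping">...</li>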