I'm working on a project that uses Angular, Breeze and SignalR. I want to send data changes using SignalR instead of AJAX calls from the client side. How can I save changes to the Breeze data?
Are you planning to send the entire object graph or just individual objects?
In the case of individual objects it should be as easy as attaching them to the DbContext.
An object graph will be another story... I don't even know how that would work with SignalR.
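For what it's worth, here is a rough client-side sketch of pushing a changed entity over SignalR instead of an AJAX call. It assumes the newer @microsoft/signalr client and a hub at /datahub with a SaveChanges method; both names are placeholders, and this sits outside Breeze's own save pipeline.

import * as signalR from "@microsoft/signalr";

// Hub URL and method names below are placeholders.
const connection = new signalR.HubConnectionBuilder()
  .withUrl("/datahub")
  .build();

async function saveEntity(entity: { id: number; name: string }) {
  if (connection.state !== signalR.HubConnectionState.Connected) {
    await connection.start();
  }
  // Send the changed entity over the socket instead of an AJAX call;
  // the server-side hub method would attach it to the DbContext and save.
  await connection.invoke("SaveChanges", entity);
}

// The hub can broadcast the saved entity back so every client stays in sync.
connection.on("EntitySaved", (saved) => {
  console.log("entity saved", saved);
});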
Specifically, my question is about how to return a true observable from a .NET Core Web API controller using EF Core. The goal is to return datasets as a stream, so the front end can start building the page as soon as it receives the first items and keep going until all data has been received.
I have seen observable collections. I have also seen that EF Core now streams rather than buffers.
Can anyone point me at the documentation, or an example, so I can do more reading?
Let's say I am returning 20 records from my Web API to a React project. React supports observables via RxJS. What do I need to do in the Web API app to support this observable flow from SQL Server all the way up to the controller level?
I am not exactly sure what you are asking for, but the normal process is to create a service which returns a DTO, in your case an array of DTOs or a DTO which contains an array of DTOs.
Then you can send this data through the controller.
I'm not sure if Web API is able to stream IObservable out of the box (most probably not), so there are two options:
1. You can use SignalR sockets to stream objects to the frontend.
2. You can return IAsyncEnumerable from the controller. This way the JSON serializer streams the data element by element, and the frontend can start deserializing items before the whole result has arrived.
Personally, I'd use approach 2 unless you already use websockets in your project. In the controller it's mostly a matter of using ToAsyncEnumerable (here's a source), but you need to verify whether your frontend libraries support that.
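To illustrate the frontend half of option 2, here is a minimal RxJS sketch of consuming such a streamed response. It assumes the endpoint writes newline-delimited JSON (one object per line); with the default JSON-array serialization you would need an incremental JSON parser instead. The URL and renderRow are placeholders.

import { Observable } from "rxjs";

// Wraps a streamed NDJSON response in an RxJS Observable so the UI can
// render each record as soon as it arrives.
function streamRecords<T>(url: string): Observable<T> {
  return new Observable<T>((subscriber) => {
    const controller = new AbortController();

    (async () => {
      const response = await fetch(url, { signal: controller.signal });
      const reader = response.body!.getReader();
      const decoder = new TextDecoder();
      let buffer = "";

      while (true) {
        const { done, value } = await reader.read();
        if (done) break;
        buffer += decoder.decode(value, { stream: true });
        const lines = buffer.split("\n");
        buffer = lines.pop() ?? ""; // keep the incomplete last line for the next chunk
        for (const line of lines) {
          if (line.trim()) subscriber.next(JSON.parse(line) as T);
        }
      }
      if (buffer.trim()) subscriber.next(JSON.parse(buffer) as T);
      subscriber.complete();
    })().catch((err) => subscriber.error(err));

    return () => controller.abort(); // cancel the HTTP request on unsubscribe
  });
}

// Usage: rows appear as they are deserialized, not after the full response.
// streamRecords("/api/records").subscribe((record) => renderRow(record));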
I am starting to work with Angular and am fascinated by its bi-directional data-binding capabilities and by its $http service, which lets me save changes into my MySQL database without refreshing the page.
Another thing I am currently fascinated by is the real-time capability across multiple clients using Firebase, where all clients are updated in real time when the database receives any changes. I'd probably like to use Firebase, but I would have to drop Laravel and MySQL as a persistence layer entirely, and I would like to keep them for the moment, since my application is already working in Laravel, just not in real time.
How would I go about having a real-time application, which updates every client without refreshing the view, in Laravel using MySQL and Angular?
If I am not mistaken, Pusher and PubNub provide this necessary open connection with the server using websockets, so when the server has something to share, Angular will know about it and render it.
Since I would like to keep Laravel and MySQL as the persistence layer, I am not sure what the best way would be. I am not even sure if I understood everything I wrote above correctly, since I am new to Angular and real-time applications.
What would be the next necessary steps to get some real-time capability into a PHP/MySQL application?
The solution to your problem is:
1º - Open a websocket connection to the websocket server and subscribe to a channel; after that, send the data to your server using AJAX.
tutorial angular pusher
2º - On the server side, receive the data, save it to your database, and PUBLISH it to the respective channel on the websocket server.
lib useful for this
3º - Through the subscription, the client gets the data in real time:
var pusher = new Pusher('APP_KEY'); // your Pusher application key
var channel = pusher.subscribe('channel');
channel.bind('event', function (item) {
    // update the view with the received data
});
I had a similar problem recently and I finally ended up using Redis publish/subscribe. You can publish data to a channel and then subscribe to any changes. When something changes you can send it to Pusher, which will then push it to the clients.
I also recommend considering Node.js and Socket.IO, since you can achieve very good performance without a third-party service, and even if you don't have experience with Node you can find very good examples of how to write such an application with Socket.IO.
For Redis there is a good PHP library called Predis, and there is a Redis client for Node as well, so you can mix it all together.
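As a rough illustration (not a drop-in implementation), a small Node/Socket.IO process like the one below could sit next to Laravel and bridge Redis to the browsers. The channel name, event name and port are placeholders; Laravel saves to MySQL and then PUBLISHes the record to the "updates" channel via Predis.

import { createServer } from "http";
import { Server } from "socket.io";
import { createClient } from "redis";

// Forward every message published on the Redis channel to all connected browsers.
const httpServer = createServer();
const io = new Server(httpServer, { cors: { origin: "*" } });
const subscriber = createClient({ url: "redis://localhost:6379" });

async function main() {
  await subscriber.connect();
  await subscriber.subscribe("updates", (message) => {
    // Push the payload published by the PHP side to all clients.
    io.emit("updates", JSON.parse(message));
  });
  httpServer.listen(3000);
}

main().catch(console.error);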
I'm unsure how I should structure an AngularJS app logic-wise. On the server side I do it like this:
Requests are handled by controllers
Controllers call a service with the incoming data
Services do database queries by calling the appropriate methods on the database layer
So on the client side I have the controllers that bind the scope to my logic, my JSON data objects and my services.
Do I put everything that is not related to the scope into a separate service?
Where do I put my data objects? Into a separate service?
What about two-way data binding? Doesn't that write back to my JSON data objects even if I don't want it to?
How do I organize my run method? Do I group everything by resource and put it into ArticlesService.init() for instance?
Should my services keep the JSON data objects and do all the updating on the local and remote collection (as in delete a JSON object and then call a DELETE method on the remote server?)
Is it possible to use web sockets to update the content of a web page?
What I am trying to achieve is to create a dynamic sign-up list. Using Backbone and an MVC Web API, I get JSON from the server by querying a database, then I apply this JSON to a template.
When someone new signs up, I want this to trigger an update for all clients connected to the server, sending a JSON representation of the new sign-up to every client.
I'm not sure what you're using on the server side, but essentially you want to override Backbone.sync, which uses $.ajax by default, so that it uses the socket framework of your choice.
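Something along these lines shows the idea; it's only a sketch, with Socket.IO picked for illustration, placeholder message names ("sync", "signup:created"), a placeholder server URL and a placeholder collection.

import * as Backbone from "backbone";
import { io } from "socket.io-client";

const socket = io("http://localhost:3000"); // placeholder server URL
const signupList = new Backbone.Collection(); // the collection your template renders
const ajaxSync = Backbone.sync;

// Cast to any so the sketch stays simple with the community typings.
(Backbone as any).sync = function (method: string, model: any, options: any) {
  if (method === "read") {
    // Keep plain AJAX for reads; push creates/updates/deletes through the socket.
    return ajaxSync.call(Backbone, method, model, options);
  }
  socket.emit("sync", { method: method, data: model.toJSON() }, (response: any) => {
    if (options && options.success) {
      options.success(response);
    }
  });
};

// Every connected client hears about the new sign-up and re-renders.
socket.on("signup:created", (attrs: object) => {
  signupList.add(attrs);
});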
Here are a couple of articles that may help:
SignalR
Socket IO
Hope this helps (p.s. we did this and it results in a seriously sexy app). Good Luck
I have a Backbone.js application connecting to a REST API. I noticed that if you delete multiple models at once, a separate API request has to be sent for each model.
Is there any way to handle the deletes with a single request?
You would need your server to expose an endpoint for deleting multiple models at once by passing the IDs of the models to be deleted. If you have that available, the common way to handle it is to add a method to your collection, called something along the lines of deleteByIds, which accepts an array of IDs. This method removes the models from the collection on a successful delete request (if sync) or straight away, before sending the delete request to the API endpoint, which makes sure they are all removed from the server.
By default that's how RESTful interfaces work: batch processing is always a custom extension, so there is no out-of-the-box way to do it, and it will involve some extra work both on the Backbone client and on the backend.
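As a sketch of what such a collection method could look like, assuming a hypothetical DELETE /api/items/batch endpoint on your server (the URL and payload shape are placeholders):

import * as Backbone from "backbone";

const ItemCollection = Backbone.Collection.extend({
  url: "/api/items", // placeholder base URL

  // Delete several models with one request instead of one DELETE per model.
  deleteByIds: function (this: any, ids: number[]) {
    return Backbone.ajax({
      url: this.url + "/batch",
      type: "DELETE",
      contentType: "application/json",
      data: JSON.stringify({ ids: ids }),
    }).done(() => {
      // Drop the deleted models locally once the server confirms.
      this.remove(ids.map((id: number) => this.get(id)));
    });
  },
});

// Usage, where items is an ItemCollection instance:
// items.deleteByIds([1, 2, 3]);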