I have no problem saving models one at a time: I wrote a recursive save over an array of objects. On each successful save I shift an object off the array, and if the array length is not 0 I repeat. Once the array length reaches zero I know all saves were successful and perform the appropriate action.
Wondering out loud if there is a better way than the approach described above? The REST service API doesn't take a collection, but if I had an example of saving a collection, I'd ask for the service to be modified.
Although there is no save method built in to a Collection, this can certainly be a good idea. Your situation sounds like it would benefit from this, as looping through each model and saving them individually is a very expensive way to get this done in terms of network traffic.
At its core, Backbone uses jQuery's AJAX to post to the server... so why not take advantage of that with your collection?
$.ajax({
    type: "POST",
    url: "/my/api",
    contentType: "application/json",
    dataType: "json",
    data: JSON.stringify(myCollection.toJSON())
});
This will post an array of objects as JSON back to the server at the /my/api endpoint. You could easily wrap this up into a method on your collection or in another object.
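For example, a hypothetical saveAll method on the collection could wrap that same call (the /my/api endpoint is just a placeholder, and this is only a sketch, not a built-in Backbone feature):
var MyCollection = Backbone.Collection.extend({
    url: "/my/api",
    // Post the whole collection as a single JSON array
    saveAll: function (options) {
        return $.ajax(_.extend({
            type: "POST",
            url: this.url,
            contentType: "application/json",
            data: JSON.stringify(this.toJSON())
        }, options));
    }
});
// Usage: myCollection.saveAll({ success: function () { console.log("all saved"); } });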
I do it with a save method defined directly on the collection:
save: function () {
    Backbone.sync('update', this, {
        success: function () {
            alert('saved');
        }
    });
}
Hope it helps.
I haven't found resources online to solve my problem.
I'm creating an app with React Native that fetches and shows news articles from my database.
At the top of the page, there are some buttons with filters inside, for example:
one button "energy",
one button "politics"
one button "people"
one button "china"
etc...
Every time I press one of those buttons, the corresponding filter is stored in an array "selectedFilters", and I want to query my database to show only the articles that match those filters.
Multiple filters can be selected at the same time.
I know one way of doing it, with a POST request:
await fetch('187.345.32.33:3000/fetch-articles', {
    method: 'POST',
    headers: {'Content-Type': 'application/x-www-form-urlencoded'},
    body: `filters=${JSON.stringify(selectedFilters)}`
});
But the fact is, I have read everywhere, and I was also taught, that POST requests are used when creating or removing, and theoretically what I should use is a GET request.
But I don't know how to send an array with a GET request.
I read online that I can pass multiple parameters in my URL (for example: arr[0]=selectedFilters[0]&arr[1]=...), but the fact is I never know in advance how many items will be in my array.
And I'm also not sure if I could write it exactly the same way as my POST request above, but with GET:
await fetch('187.345.32.33:3000/fetch-articles', {
    method: 'GET',
    headers: {'Content-Type': 'application/x-www-form-urlencoded'},
    body: `filters=${JSON.stringify(selectedFilters)}`
});
Or if I can only pass items in the URL, but does this work?
await fetch(`187.345.32.33:3000/fetch-articles?arr[0]=${selectedFilters[0]}`, {
Or even better if something like this could work:
await fetch(`187.345.32.33:3000/fetch-articles?filters=${JSON.stringify(selectedFilters)}`, {
Thanks for your help
You should definitely use a GET request if your purpose is to fetch the data.
One way of passing the array through the URL is to join the filters into a single comma-separated string. This way you don't need to know in advance how many elements are in the array. The server can then read the string from the URL and split it on the commas.
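A quick sketch of that idea, reusing the URL from the question (the server-side split is shown as an Express-style comment, which is an assumption about your backend):
// Build a single comma-separated query parameter from however many filters are selected
const query = encodeURIComponent(selectedFilters.join(','));
const response = await fetch(`187.345.32.33:3000/fetch-articles?filters=${query}`, {
    method: 'GET'
});
const articles = await response.json();
// On the server, split the parameter back into an array, e.g.:
// const filters = req.query.filters ? req.query.filters.split(',') : [];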
One more method you can try is to keep a filters array on the server side for the session. You can then use a POST/PUT request to modify that array as the user adds or removes filters. Finally, you can use a plain GET request with no parameters to fetch the news, since the server will already have the filters for that session.
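A rough sketch of that session-based variant (the /session-filters endpoint is made up for illustration, and the server would need session support):
// 1. Tell the server which filters are active for this session
await fetch('187.345.32.33:3000/session-filters', {
    method: 'PUT',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ filters: selectedFilters })
});
// 2. Fetch articles with a plain GET; the server applies the filters it stored
const response = await fetch('187.345.32.33:3000/fetch-articles');
const articles = await response.json();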
But the fact is, I have read everywhere, and I was also taught, that POST requests are used when creating or removing, and theoretically what I should use is a GET request.
Yes, you do read that everywhere. It's wrong (or at best incomplete).
POST serves many useful purposes in HTTP, including the general purpose of “this action isn’t worth standardizing.” (Fielding, 2009)
It may help to remember that on the HTML web, POST was the only supported method for requesting changes to resources, and the web was catastrophically successful.
For requests that are effectively read only, we should prefer to use GET, because general purpose HTTP components can leverage the fact that GET is safe (for example, we can automatically retry a safe request if the response is lost on an unreliable network).
I'm not sure if I could write it exactly the same way as my POST request above, but with GET
Not quite exactly the same way
A client SHOULD NOT generate content in a GET request unless it is made directly to an origin server that has previously indicated, in or out of band, that such a request has a purpose and will be adequately supported. An origin server SHOULD NOT rely on private agreements to receive content, since participants in HTTP communication are often unaware of intermediaries along the request chain. -- RFC 9110
The right idea is to think about this in the framing of HTML forms; in HTML, the same collection of input controls can be used with both GET and POST. The difference is what the browser does with the information.
Very roughly, a GET form is used when you want to put the key value pairs described by the submitted form into the query part of the request target. So something roughly like
await fetch(`187.345.32.33:3000/fetch-articles?filters=${JSON.stringify(selectedFilters)}`, {
    method: 'GET'
});
Although we would normally want to use a URI Template to generate the request URI, rather than worrying about escaping everything correctly "by hand".
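If a full URI Template library is more than you need, something like URLSearchParams can handle the escaping for you (supported in browsers; React Native may need a polyfill), for example:
// Let URLSearchParams take care of the query-string escaping
const params = new URLSearchParams({ filters: selectedFilters.join(',') });
const response = await fetch(`187.345.32.33:3000/fetch-articles?${params}`, {
    method: 'GET'
});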
However, there's no rule that says general purpose HTTP components need to support infinitely long URIs (for instance, Internet Explorer used to have a limit of just over 2000 characters).
To work around these limits, you might choose to support POST. It's a tradeoff: you lose the benefits of safe semantics and general purpose cache invalidation, but you gain that it works in extreme cases.
The question I have is about how to store a collection of models to a RESTful back end.
I'm using Backbone.js, and I'm considering either of the following:
Use Async.js parallel method and post / put each model separately in a loop, after which a general callback method is triggered;
Send a collection of objects to the back end, and use a database transaction to make sure that all models are properly saved with a single commit;
The first method seems to cause a lot of overhead, because I have to make separate calls to save the models.
But when considering the second approach, Laravel 4 does not by default allow a POST/PUT on a collection.
What would be your preferred approach, and more importantly, why?
The second approach is definitely the way to go in my opinion: it reduces latency and saves bandwidth when used with a large number of models, since all the HTTP overhead (headers, etc.) is sent only once.
In your controller, use something like this (it may not work as-is, since I don't have access to a Laravel installation to test it):
public function postCollection() {
    $collection = Input::get("collection");
    DB::transaction(function() use ($collection) {
        foreach ($collection as $data) {
            // In this example we assume it's a collection of users
            // Of course in a real app you would also do input validation
            $user = User::create(["name" => $data["name"], "email" => $data["email"]]);
        }
    });
    // Example success response, will be automatically serialized to JSON
    return ["status" => "success"];
}
This loops over the collection element of your JSON input, which should be a list of models. Then it should obviously do validation and possibly other things. The whole loop is wrapped in a DB::transaction(), which will roll back everything if an exception occurs inside it.
Context: I'm new to Angular, and this feels like a lot more of a "What's the right way to do this in AngularJS" kind of question.
My API backend has a couple of related objects that I need to request and assemble into a coherent user interface. It can be modeled as a subscription hub thing, so I have: Subscription hasMany Subscription_Items, belongsTo Source.
What I want to do is look up a user's Subscriptions (/api/subscriptions?user_id=1), which gives me some JSON that includes a subscription_item_ids=[1,2,3 ...] array. I then want to query on those ids, as well as query the server on source_id to pull shared info, and then repackage everything nicely into the $scope so the view layer has easy access to the system and can do stuff like ng-repeat="item in subscription.subscription_items" inside an outer ng-repeat="subscription in subscriptions".
Conceptually this makes sense, and I've thought of a few ways to load this linked data, but what I'm curious about is: what's the best practice here? I don't want to excessively reload the data, so it seems like a plain old function that does a REST request every time I look at an item is a bad idea, but at the same time, I don't want to just push the data in once and then miss out on updates to items.
So, the question is: what's the best way to handle linked resources like this, to trace hasMany and belongsTo types of connections out to other models in a way that aligns with the ideas embedded in $scope and the $apply cycle?
I like to use a lazy-loaded dataModel service which will cache results and return promises. The interface looks like this:
dataModel.getInstitution(institutionId).then(manageTheInstitution);
If I need something that is a child, I call it like this:
dataModel.getStudents(institutionId).then(manageStudents);
Internally, getStudents looks something like this:
function getStudents(institutionId) {
    var deferred = $q.defer();
    getInstitution(institutionId).then(function(institution) {
        institution.students = Student.query({institutionId: institutionId});
        institution.students.$promise.then(function(students) {
            deferred.resolve(students);
        });
    });
    return deferred.promise;
}
These functions are a bit more complex. They cache the results and don't request them again if they already exist... and return or chain the existing promise. They also handle errors.
By carefully crafting my dataModel service this way, I can manage any nesting of resources and I can optimize my network requests. I've been very happy with this approach so far.
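A simplified sketch of the caching part (the module name app and the $resource services Institution and Student are illustrative assumptions; the real service also handles errors):
app.factory('dataModel', ['Institution', 'Student', function (Institution, Student) {
    // Cache of promises keyed by institution id, so repeat calls reuse the same request
    var institutionPromises = {};
    function getInstitution(institutionId) {
        if (!institutionPromises[institutionId]) {
            institutionPromises[institutionId] = Institution.get({ id: institutionId }).$promise;
        }
        return institutionPromises[institutionId];
    }
    function getStudents(institutionId) {
        return getInstitution(institutionId).then(function (institution) {
            if (!institution.students) {
                institution.students = Student.query({ institutionId: institutionId });
            }
            return institution.students.$promise;
        });
    }
    return { getInstitution: getInstitution, getStudents: getStudents };
}]);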
What technique shall one use to implement batch insert/update for Backbone.sync?
I guess it depends on your usage scenarios, and how much you want to change the calling code. I think you have two options:
Option 1: No Change to client (calling) code
Oddly enough the annotated source for Backbone.sync gives 'batching' as a possible reason for overriding the sync method:
Use setTimeout to batch rapid-fire updates into a single request.
Instead of actually saving on sync, add the request to a queue, and only batch-save every so often. _.throttle or _.delay might help you here.
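A rough sketch of this option, assuming a made-up /my/api/batch endpoint and using _.debounce (a close relative of the _.throttle/_.delay suggestion) to collapse rapid-fire saves:
var originalSync = Backbone.sync;
var pending = [];
// Flush the queue as one request once saves stop arriving for 100ms
var flush = _.debounce(function () {
    var payload = _.map(pending, function (model) { return model.toJSON(); });
    pending = [];
    $.ajax({
        type: 'POST',
        url: '/my/api/batch',
        contentType: 'application/json',
        data: JSON.stringify(payload)
    });
}, 100);
Backbone.sync = function (method, model, options) {
    if (method === 'create' || method === 'update') {
        pending.push(model);
        flush();
        return;
    }
    return originalSync(method, model, options);
};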
Option 2: Change client code
Alternatively, instead of calling save on your models, you could add some sort of save method to collections. You'd have to track which models were actually modified and hence in need of update, since as far as I can tell, Backbone only knows whether they're new or not (but I could be wrong about that).
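As a rough sketch of this option (the bulk endpoint is hypothetical, and the change tracking here is deliberately simple):
var BatchCollection = Backbone.Collection.extend({
    initialize: function () {
        this._dirty = {};
        // Remember every model that is added or changed so we know what needs saving
        this.on('add change', function (model) {
            this._dirty[model.cid] = model;
        }, this);
    },
    saveChanged: function () {
        var models = _.values(this._dirty);
        if (!models.length) { return; }
        var self = this;
        return $.ajax({
            type: 'PUT',
            url: '/my/api/batch',
            contentType: 'application/json',
            data: JSON.stringify(_.invoke(models, 'toJSON'))
        }).done(function () {
            self._dirty = {};
        });
    }
});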
Here's how I did it
Backbone.originalSync = Backbone.sync;
Backbone.sync = function (method, model, options) {
    //
    // code to extend sync
    //
    // calling original sync
    return Backbone.originalSync(method, model, options);
};
This works fine for me, and I use it to control every AJAX request coming out of any model or collection.
I am using Backbone.js, Lawnchair, and backbone.lawnchair.js.
I was wondering what the correct way to "empty" a collection (both from the application and from localStorage) would be?
Currently, I am employing something along these lines:
Favorites.Collection = Backbone.Collection.extend({
    model: Favorites.Model,
    lawnchair: new Lawnchair({ name: "favorites" }, function(e){}),
    empty: function(){
        this.lawnchair.nuke();
        this.fetch();
    }
});
This is essentially destroying the elements in localStorage (lawnchair provides the nuke method) and fetching from localStorage. This seems a bit clunky, and I was wondering if I am thinking about this right - or if there is a better way.
Cheers!
Backbone follows sort-of-RESTish principles, and sadly REST doesn't describe a bulk-delete procedure; resources can only be deleted one at a time. Typically APIs don't support HTTP DELETE on the entire collection URI like DELETE /favorites, only DELETE /favorites/42 - one at a time. Thus there's no single Backbone method that just does this, as the collection is mostly designed to do fetch/GET and then delegate to the individual models to do saves and deletes.
So yes, you're on your own for a bulk-delete approach. You could do something more RPC-like and pass a list of IDs to a delete procedure, but what you have above seems perfectly adequate to me, and deleting everything directly and then re-fetching the collection also seems very reasonable. So I think your solution is fine as is, but I'm also curious to see what others suggest.