model.fetch always going to error callback - backbone.js

var mdl = Backbone.Model.extend({
    defaults: {
        url: '/displayPostVariables.php',
        age: 0
    },
    initialize: function (opt) {
        this.url = function () {
            return opt.url;
        };
    }
});
mdli = new mdl({
    'name': 'rajkamal'
});
jQuery.ajaxSetup({
    'beforeSend': function (xhr) {
        xhr.setRequestHeader("Accept", "text/html");
    }
});
mdli.fetch({
    success: successcallback,
    error: errorcallback
});
The Ajax call goes out, but it always ends up in the error callback.
This looks like the post "model.fetch success callback does not fire on Firefox, but works on Chrome", but there is no JavaScript code in that one.
Thanks.

Try also passing dataType: 'json' to the fetch.
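For example, a minimal sketch reusing the call and callbacks from the question:
mdli.fetch({
    dataType: 'json', // tell jQuery to expect a JSON response body
    success: successcallback,
    error: errorcallback
});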

I had the same problem with fetch only ever returning my error callback.
In the end, it was because I had not specified an id in the object returned by the server, like this:
{"title":"The Green Mile ","author":"Stephen King","img":"green_mile.jpg","id":2}
I think Backbone expects certain properties to be present in the JSON, though I was unable to find any documentation about this. The way I solved it was to do model.save() and to look at the object that was being saved.

For reference, I had the same issue because an unsafe method was incorrectly injecting NaN into the JSON response:
{"progress":NaN}
NaN is not valid JSON, so parsing failed and the error callback was triggered.

I had this issue, and it was due to using single quotes instead of double quotes to surround attributes and values in the API's JSON response.
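For example, a response body like this (illustrative, not from the original question) fails JSON parsing:
{'title': 'The Green Mile'}
while the double-quoted equivalent parses fine:
{"title": "The Green Mile"}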

Specify the url outside of defaults:
var mdl = Backbone.Model.extend({
    url: "/displayPostVariables.php",
    defaults: {
    },
    validation: {
    }
});

Related

AngularJS: model does not update after modifying it through an $http transform

I need to transform objects coming from an $http call to an API. My code adds some fields (functions) to the object coming from the API; here is the constructor of this object:
(function () {
    window.TransformedObject = function (obj) {
        var self = this;
        self = {};
        if (obj) {
            self = angular.copy(obj);
        }
        self.hasChanged = function () {
            // return true or false if the object has changed
        };
        return self;
    };
}());
The $http transform code looks like this:
$http({
    url: 'api/...',
    method: 'GET',
    transformResponse: function (value) {
        return new TransformedObject(JSON.parse(value));
    }
}).success(function (data) {
    vm.obj = angular.copy(data);
});
Note that the value in the transformResponse callback is a string and needs to be parsed to get the object.
All of this works fine: suppose the object coming from the API contains a key called title; doing obj.title = 'some title' will update the object.
The problem:
Binding the title field to an input tag will not update the object if the change comes from the view.
I use a regular ng-model to do it:
<input type="text" placeholder="Title" ng-model="vm.obj.title"/>
Even a $rootScope.$watch will never be triggered if the change comes from the view, i.e. the input tag:
$rootScope.$watch(function () {
    return vm.obj;
}, function () {
    console.log('watch');
    // this log will never appear in the console
});
Am I doing something wrong? Why does transforming the object coming from the API break Angular's binding?
Thanks.
http://www.bennadel.com/blog/2605-scope-evalasync-vs-timeout-in-angularjs.htm
Sometimes, in an AngularJS application, you have to explicitly tell
AngularJS when to initiate its $digest() lifecycle (for dirty-data
checking). This requirement is typically contained within a Directive;
but, it may also be in an asynchronous Service. Most of the time, this
can be easily accomplished with the $scope.$apply() method. However,
some of the time, you have to defer the $apply() invocation because it
may or may not conflict with an already-running $digest phase. In
those cases, you can use the $timeout() service; but, I'm starting to
think that the $scope.$evalAsync() method is a better option.
...
Up until now, my approach to deferred-$digest-invocation was to
replace the $scope.$apply() call with the $timeout() service (which
implicitly calls $apply() after a delay). But, yesterday, I discovered
the $scope.$evalAsync() method. Both of these accomplish the same
thing - they defer expression-evaluation until a later point in time.
But, the $scope.$evalAsync() is likely to execute in the same tick of
the JavaScript event loop.
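A minimal sketch of the pattern the article describes, applied to the assignment from the question (assumes you have the controller's scope in hand):
// Defer the expression so it is evaluated inside an Angular $digest,
// without risking a "$digest already in progress" error.
$scope.$evalAsync(function () {
    vm.obj = angular.copy(data);
});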

Restangular PUT requests: how and why?

If I have a resource, e.g.
var resource = Restangular.all('things');
and I have a JSON object that I want to post to the API:
jsonObj = {
    someVar: "x",
    anotherVar: "y"
}
I can simply do
resource.post(jsonObj).then( ...etc... )
Now if I update the model on the client side and want to save the changes, why can I not do:
resource.put(thingId, updatedJsonObj)
I'm having trouble getting my head around the demos on the internet, as they all appear to need to do a GET request before they can do a PUT, which seems odd. It's worth mentioning that I am using Restangular in a service, not in a controller or directive. My service has a variable allMyThings, which is the result of resource.getList(), and I use that in various places in the application.
Actually, if you take one item from the collection returned by getList(), you can call .save() on it, and it will issue a PUT request.
angular.module('demo').controller('DemoCtrl', function (myList) {
    // myList is populated by Restangular.all('...').getList();
    var item = myList[0];
    item.name = 'foo';
    item.save() // this one did a PUT :)
        .then(function (res) {
            // ...
        });
});
See it in the docs.
NOTE:
Restangular will use the id property as the id for PUT, DELETE, etc. calls.
If you want to change that behaviour, use this in the module config:
RestangularProvider.setRestangularFields({
    id: "_id"
});
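If you really want to PUT without fetching first, here is a minimal sketch using customPUT (assuming a things endpoint; thingId and updatedJsonObj are the names from the question):
Restangular.one('things', thingId)
    .customPUT(updatedJsonObj)
    .then(function (res) {
        // handle the server's response to the PUT
    });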

Restangular PUT attaches _id two times for no reason

I am using Restangular on the client side with _id as the id field. Sadly, Restangular generates wrong URLs; maybe you can tell me where the error is?
Restangular.all('/users').one(id).get().then(function (results) {
    $scope.data = results;
});
After the user edits the data:
$scope.save = function () {
    $scope.data.put().then(...);
};
This very simple sample generates the following URL with the id twice. I have no idea what went wrong. :(
PUT /users/537283783b17a7fab6e49f66/537283783b17a7fab6e49f66
Solved it by changing the request workflow of Restangular.
I don't know why, but this approach does not work:
Restangular.all('/users').one(id).get() ... result.put();
But this does:
Restangular.one('/users/',id).get() ... result.put();
Also, it is important to tell Restangular that you are using _id instead of id:
angular.module('App').config(function (RestangularProvider, AppSettings) {
    RestangularProvider.setRestangularFields({ id: "_id" });
});

How do I define Angular $resource to pass parameters correctly

I need to call a web service that requires a list of IDs in the form:
http://service_addr?itemID=34&itemID=36 ...
I tried setting up my service factory as:
.factory("doService", [$resource, function($resource) {
return $resource("service_addr", {}, {
'create' : {method:POST, isArray:true} }); }])
In my controller I invoke the service with this code:
var ids = [];
angular.forEach(listofIDs, function (anId) {
    ids.push({ itemID: anId });
});
doService.create(ids, {}, function (response) {
    // ... do response stuff
});
In the console, the POST returns a 400 Bad Request error. The request parameters are malformed, as shown below:
http://service_addr?0=%5Bobject+Object%5D&1=%5Bobject+Object%5
How can I get the required parameters passed correctly?
Adding to ricick's answer, you could also pass the IDs in the format
http://service_addr?itemIDs=34,36,38
by doing
ids.join(',')
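A minimal sketch of that call (assuming the backend accepts a single comma-separated itemIDs query parameter; doService is the factory from the question):
var ids = [34, 36, 38];
// the first object becomes query parameters, the second is the (empty) request body
doService.create({ itemIDs: ids.join(',') }, {}, function (response) {
    // ... do response stuff
});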
The issue is that you can't have more than one parameter with the same name in a GET request, so even if Angular could pass the data, your server will only see one value (unless you're cheating and processing the URL string manually).
A better solution would be to do something like:
http://service_addr?itemID0=34&itemID1=36&itemID2=38&itemIDCount=3
That way you create a separate parameter for each value.
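A minimal sketch of building those indexed parameters (names are illustrative; listofIDs is the array from the question):
var params = { itemIDCount: listofIDs.length };
angular.forEach(listofIDs, function (anId, index) {
    params['itemID' + index] = anId;
});
doService.create(params, {}, function (response) {
    // ... do response stuff
});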

Backbone.Model.destroy not triggering success function on success

So, within one of my views, I've got this function:
delete_model: function () {
    var answer = confirm("Are you sure you want to delete this element?");
    if (answer) {
        this.model.destroy({
            success: function () {
                console.log("delete was a success");
            }
        });
    }
}
When I ping that, the Ajax call goes out, the backend properly deletes the model and returns a 200 header with "OK" as the body...but the success event never fires. Am I missing something? What should I have the backend serve to fire that event?
I just had the same problem. The solution that worked for me was to make sure I return a JSON model from the server that corresponds to the deleted one.
Edit: returning an empty JSON response will suffice.
Did not work:
delete(model) {
    // deleted model from db
    return "Model was deleted";
}
This did work:
delete(model) {
    // deleted model from db
    return model;
}
or:
delete(id) {
    // deleted model from db with this id
    return new Model({ id: id });
}
Had the same problem using Backbone 1.5.3 with Rails. I tried rudena's solution, and it works!
Here's what my controller's delete function looked like initially:
def destroy
  @cell = current_user.cells.find(params[:id])
  @cell.destroy
  render :json => "success"
end
And here's what worked:
def destroy
  @cell = current_user.cells.find(params[:id])
  @cell.destroy
  render :json => @cell
end
That looks good to me; it's exactly what I have everywhere (except I have function(model), but that shouldn't matter at all). I do know that older versions of Backbone didn't use destroy(options) but instead had destroy(success, failure). Can you make sure you have the latest version?
Had this problem come up with my UI as well. Upon DELETE, the API came back with an empty 200 response.
What's happening is that jQuery expects a JSON response body, but when the response comes back empty, JSON parsing fails and the error callback is triggered.
My solution was to override the model's sync method:
var MyModel = Backbone.Model.extend({
    // Fix for empty DELETE response
    sync: function (method, model, options) {
        if (method === 'delete') {
            options.dataType = 'html';
        }
        Backbone.sync.call(this, method, model, options);
    }
});
This works because options is passed to jQuery's ajax call, and we're instructing jQuery not to expect JSON.
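An alternative sketch is to pass the dataType per call instead of overriding sync (assumes the same empty-body DELETE response; Backbone forwards these options to jQuery.ajax):
this.model.destroy({
    dataType: 'text', // don't try to parse the empty body as JSON
    success: function () {
        console.log("delete was a success");
    }
});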
