I recently started learning AngularJS + Firebase. I'm trying to write an object like this to my Firebase:
{
title: "Personal Information",
say: [
[{ "eng": "What's", "ukr": "Що є" }, { "eng": "your", "ukr": "твоє" }, { "eng": "surname?", "ukr": "прізвище?" }],
[{ "eng": "Smith", "ukr": "Сміт" }],
[{ "eng": "What's", "ukr": "Що є" }, { "eng": "your", "ukr": "твоє" }, { "eng": "first", "ukr": "перше" }, { "eng": "name?", "ukr": "ім'я?(не фамілія)" }]
]
}
with the line:
lessondata.add($scope.topic);
where lessondata is a service created with angularFireCollection() and $scope.topic is an object bound to my UI.
But I got the following error:
Firebase.push failed: first argument contains an invalid key ($$hashKey) in property 'say.0.0'. Keys must be non-empty strings and can't contain ".", "#", "$", "/", "[", or "]"
As I understand it, Firebase does not allow 0 to be used as a key, even though it's the natural key inside a nested array. So should I change my object structure to something hardcoded, or am I missing something? Thanks in advance!
EDIT: As Anant points out in the comments, in the latest stable version of Angular (1.0.7 atm), you can use angular.copy(obj) to remove $$hashKey attributes.
Like Michael said, the '$' in '$$hashKey' is the issue. Angular creates the $$hashKey properties behind the scenes (see more here: https://groups.google.com/forum/#!topic/angular/pI0IgNHKjxw). I've gotten around this issue by doing something like myRef.push(angular.fromJson(angular.toJson(myAngularObject))).
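For instance, a minimal sketch of that round-trip (myRef here stands for whatever Firebase reference you are pushing to, and $scope.topic is the object from the question):
// angular.toJson drops Angular-internal $$-prefixed keys such as $$hashKey;
// parsing it back gives a plain object that Firebase will accept
var clean = angular.fromJson(angular.toJson($scope.topic));
myRef.push(clean);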
The issue is the $ in "$$hashKey", not the 0. 0 is allowed.
I wanted to throw another answer in here that is much simpler: just use track by in your ng-repeat. It will get rid of the $$hashKey attribute that is causing so much grief.
<div ng-repeat="item in items track by $index">{{item.name}}</div>
I'm a bit late to this, but I thought I'd add my two cents, as I was shaking my head at all the other answers. Hopefully this can help you avoid this issue altogether.
Use the AngularFire library, which is intended to handle Angular data, and use its methods.
Yes, you can use the plain JavaScript library's methods .push(), .add(), .update(), .set(), etc. But if you want to avoid any clashes when Firebase data passes through Angular, you need to use the appropriate AngularFire .$foo() methods (i.e. .$save()). In your case, just add a $ to your .add() (make it .$add()),
so use lessondata.$add($scope.topic);
Differences when saving with Firebase vs. AngularFire:
AngularFire's $save() method is implemented using Firebase's set() method.
Firebase's push() operation corresponds to AngularFire's $add() method.
Typically you should be using set()/$save() if you either have an object that already exists in the database or if you are working with objects that have a natural key.
more info on that here: https://stackoverflow.com/a/35959496/4642530
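As a rough sketch of that distinction (assuming AngularFire 1.x with $firebaseObject/$firebaseArray and a hypothetical lessons node):
var obj = $firebaseObject(ref.child('lessons/lesson1')); // natural/known key -> $save()
obj.$loaded().then(function() {
  obj.title = 'Personal Information';
  obj.$save(); // maps to Firebase set()
});

var list = $firebaseArray(ref.child('lessons')); // push-style id -> $add()
list.$add({ title: 'Personal Information' }); // maps to Firebase push()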
Things to note with AngularFire
For a callback: if you want to know whether your data saved correctly, you can do something like this:
var list = $firebaseArray(ref);
list.$add({ foo: "bar" }).then(function(ref) {
var id = ref.key();
console.log("added record with id " + id);
list.$indexFor(id); // returns location in the array
});
I was surprised this wasn't mentioned earlier in any of the other answers, but take a look at these docs https://www.firebase.com/docs/web/libraries/angular/api.html#angularfire-firebasearray-addnewdata
Cheers.
The best way to get rid of the $$hashKey is to use "track by" in your ng-repeats, as in the following (as mentioned in the answer above):
<div ng-repeat="(key, value) in myObj track by $index"> ... </div>
but you can also set trackBy as part of ng-model-options
ng-model-options="{ trackBy: '$value.someKeyOnYourObject' }"
on a form control. This also improves the performance of your Angular app.
Another way to remove the $$hashKey is to use
angular.copy(someObj);
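Applied to the original question, that would be something along these lines (lessondata and $scope.topic taken from the question):
lessondata.add(angular.copy($scope.topic)); // the copy no longer carries $$hashKey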
If all else fails, you could also use lodash to remove the keys that start with the "$".
_.omitBy(yourObject, function(value, key){return _.startsWith(key, '$');});
You can also strip out the property before pushing it.
delete $scope.topic.$$hashKey
Related
I'm following the Meteor To-Do App tutorial with Angular integration and am learning about filtering collections. I've been able to implement a simple filter on a collection for the app I'm working on by following the principles in the tutorial, but now I'm stuck trying to figure out how to add multiple queries to the filter.
In the example, you can view incomplete tasks by toggling a checkbox. This is implemented in the controller by watching $scope.hideCompleted for changes and passing it as a Mongo query to filter the Meteor collection.
Watcher
$scope.$watch('hideCompleted', function() {
if ($scope.hideCompleted)
$scope.query = {checked: {$ne: true}};
else
$scope.query = {};
});
Collection filter
$scope.tasks = $meteor.collection(function() {
return Tasks.find($scope.getReactively('query'), {sort: {createdAt: -1}})
});
How do I make the query support multiple filters? For example, say I've chosen to extend the example and have ranked each to-do item by priority. I then would have an input field for the user to filter the collection by priority, whose value is bound to $scope.priority. Now, if I wanted to filter the to-do list by incomplete and priority=$scope.priority tasks, I understand the Mongo query would need to be something along the lines of Tasks.find({ $and: [{ checked: {$ne: true} },{ priority: $scope.priority }]},{ sort: { createdAt: -1 } }).
In my app, I've been able to make two watchers properly track changes to two scope variables, analogous to my example with $scope.hideCompleted and $scope.priority, but I don't know how to take the next step and merge the queries when filtering the collection. I've also tinkered around a little with this package, since I eventually hope to be able to filter and sort by many criteria, but I didn't get too far with it before switching to the concepts I've described here.
I'd appreciate any help with this. Thank you!
This would be my approach:
$meteor.autorun($scope, function() {
// uncomment subscribe after you've got to that point
// $scope.$meteorSubscribe('yourSubscription').then(function() {
$scope.tasks = $scope.$meteorCollection(function() {
return Tasks.find({
checked: $scope.getReactively('model.hideCompleted'),
priority: $scope.getReactively('model.priority')
}, { sort: { createdAt: -1 } });
});
// });
});
A couple of things here:
Once you have removed autopublish you can uncomment the $scope.$meteorSubscribe method and replace "yourSubscription" with the name of your actual subscription.
$meteor.autorun will fire every time any getReactively variable changes.
$scope.$meteorSubscribe and $scope.$meteorCollection are favored as they will remove the subscriptions and object/collection when the scope is destroyed.
If you have any issues then perhaps I can setup a demo for you to look at.
Well, I guess I was a lot closer than I had expected, so I'll answer my question and share what I did to implement multiple filters with regards to the hypothetical extension of the to-do app.
I made the hideCompleted and priority scope variables into properties of a single scope object, model, and used one watcher with true as the last argument so it checks for object equality (i.e. it fires on changes to model or any of its properties). I then generated $scope.query by stitching together "sub-queries". I've added the code below.
This seems to be working fine for now, but I'm not sure if it's the best solution, so I will continue experimenting, and I will update my answer if I find anything better. I'd be interested in any other approaches, though!
Watcher
var initQuery = true;
var subQueries = [];
$scope.$watch('model', function() {
  if (!initQuery) {
    subQueries = [];
    if ($scope.model.hideCompleted)
      subQueries.push({checked: {$ne: true}});
    if ($scope.model.priority)
      subQueries.push({priority: $scope.model.priority});
    // $and must not be an empty array, so fall back to an empty query when no filter is active
    $scope.query = subQueries.length ? {$and: subQueries} : {};
  } else {
    initQuery = false;
    $scope.query = {};
  }
}, true);
Filter Collections (unchanged)
$scope.tasks = $meteor.collection(function() {
return Tasks.find($scope.getReactively('query'), {sort: {createdAt: -1}})
});
So I'm using a REST API with ngResource to do get, query, post and update requests. What I'm looking for is a way to define the structure for each entity.
For example, assuming we have:
module.factory('app.entity.item', function($resource) {
return $resource('http://xmpl.io/items/:itemId', { itemId: '@id' });
});
I want to instantiate it in a controller like:
module.controller('AddItemCtrl', ['$scope', 'app.entity.item', function($scope, Item) {
  $scope.item = new Item();
}]);
and bind it to the respective form in my template.
The actual problem I have run into is that I have to deal with 1:m tables.
An example of the entity structure would be:
{
"name": "",
"categories": [],
"list": [
{
"value": "",
"list": [
{
"value": "",
"list": [
{
"value": ""
}
]
}
]
}
]
}
(A more thorough example in the fiddle below)
Now the first two fields are obviously not the problem. It is the third one. The list. Each one of these lists can have a variable number of items.
I am currently using ngRepeat and an add(type, context) method, which adds a new set of fields to the scope (a value field in this example, plus child lists for the first two levels). These appear in the UI via ngRepeat so the user can fill them in and submit them to the service.
First off, I have to define the structure, so the UI would not be empty when the page loads.
module.controller('AddItemCtrl', ['$scope', 'app.entity.item', function($scope, Item) {
$scope.item = new Item({
"name": "",
"categories": [],
"list": [
{
"value": "",
"list": [
{
"value": "",
"list": [
{
"value": ""
}
]
}
]
}
]
});
}]);
But that is redundant. I have to do it everywhere!
Another issue is that when item.$save() is called, the model is emptied (perhaps re-instantiated?), and the fields inside the list property (managed by the ngRepeat directive) are gone.
So I'm wondering, what would you do under such circumstances.
Is there a way to define the entity (resource) structure?
SAMPLE: http://jsfiddle.net/g15sqd5s/3/
Trying to give a simple answer: for simple structures I would use something like
module.factory('Item', function($resource) {
  var resource = $resource('http://xmpl.io/items/:itemId', { itemId: '@id' },
    // you can also define a transformRequest on an action here:
    { save: {
        method: 'POST',
        transformRequest: function(data) {
          // data can be transformed here
          return angular.toJson(data);
        }
      }
    });
  // default fields for every instance (note: the categories array lives on the
  // prototype, so it is shared until an instance assigns its own)
  angular.extend(resource.prototype, {
    name: null,
    categories: []
  });
  return resource;
});
but then be aware of the need to 'flatten' the object.
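If you do need to flatten the nested list structure from the question, a helper along these lines could be called from that transformRequest (a sketch only; flattenList and the flat shape are just illustrative assumptions):
// hypothetical helper: collapse the nested "list" tree into a flat array of values
function flattenList(nodes, out) {
  out = out || [];
  angular.forEach(nodes, function(node) {
    out.push({ value: node.value });
    if (node.list) { flattenList(node.list, out); }
  });
  return out;
}
// inside the transformRequest above you could then send the flattened copy:
// return angular.toJson(angular.extend({}, data, { list: flattenList(data.list) }));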
And for a more complex model I would check out Restangular.
A similar topic is also discussed here:
How can I extend the constructor of an AngularJS resource ($resource)?
I would go ahead and revise my model structure in the backend in the first place - the models on the client side should merely follow the ones already defined, rather than being re-defined in a transform block. So, to answer your question, the "default" model structure comes from the server. What you get in your $resource objects has the structure of what your server returns.
To start off, is it really ok to invoke $save on the Item model when the user has populated some values? What we want to save are obviously the lists associated with an item, not the item itself. A separate resource defined in the backend, say items/<item_id>/list, may be a cleaner solution. It may not scale very well, as you'll have to make a separate GET request for each item to fetch its list, but that's the proper RESTful way to do it.
Extending this approach to the example in your fiddle, I imagine a routing scheme like buildings/<building_id>/floors/<floor_id>/units/<unit_id> would be a proper solution. Making a GET request to buildings/ should yield you a list of buildings; each building in the array returned should be an instance of a Building model, which has the proper URL set so the user can perform a single POST and update only the building name, instead of sending back the whole structure back to the server. Applying this recursively to the nested resources should give you a clean and concise way to deal with model changes.
Regarding the UI part - I would go ahead and define three directives for buildings, floors and units, and let each one manage an array with the respective resources, also taking care for the UI bindings to the model values.
So what could a Building model look like?
var BuildingResource = $resource('/buildings/:id', { id: '#id' });
Invoking BuildingResource.query() should yield an array of existing buildings. Adding a new building could look like this:
var newBuilding = new BuildingResource();
newBuilding.$save().then(function(building) {
$scope.buildings.push(building);
}, function(errData) {
//Handle error here...
});
It should be easy to extend this pattern for the rest of the resources - note that what the server needs to return for every building is just the name and the id; knowing the id is sufficient to construct an URL (and a $resource object, respectively) to fetch the needed child resources (in this case, floors).
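For example, a sketch of what the nested Floor resource could look like under the URL scheme above (names and fields are purely illustrative):
var FloorResource = $resource('/buildings/:buildingId/floors/:id',
                              { buildingId: '@buildingId', id: '@id' });

// fetch the floors of one building (only the building's id is needed)
var floors = FloorResource.query({ buildingId: building.id });

// add a floor to a building with a single POST
var newFloor = new FloorResource({ buildingId: building.id, name: '1st floor' });
newFloor.$save();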
Using AngularFire, I am extending the object factories in order to have encapsulated data and to allow specific features, as explained in the official tutorial. I have a data structure like the following:
{
'articles': {
'article-asd876a': {
title: 'abc',
text: 'lorem ipsum ...',
comments: {
'comment-ad4e6a': true,
'comment-dsd9a7': true
}
}
},
'comments': {
'comment-ad4e6a': {
text: 'comment text1',
articleId: 'article-asd876a'
},
'comment-dsd9a7': {
text: 'comment text2',
articleId: 'article-asd876a'
}
}
}
Now I would love to be able to do this:
var article = new Article(8); // Returns the object created by my object factory, fetching data from firebase
var comments = article.getComments(); // Returns an array of type Comment
var firstText = comments[0].getText();
var article2 = comments[0].getArticle(); // article2 === article
But this fails for me on many levels. One of them being: In Article, I can only store the Comment ID, and therefore have to recreate the Comment Object using new Comment(commentId), for which I need to inject Comment into Article. The same is true for Comment, so that I end up with a circular dependency Article -> Comment -> Article. The following fiddle shows the behavior: http://jsfiddle.net/michaschwab/v0qzdgtq/.
What am I doing wrong? Is this a bad concept/structure for angular? Thanks!!
"What am I doing wrong? Is this a bad concept/structure for angular? Thanks!!"
You are creating circular dependencies.
"I can do var Comment = $injector.get('Comment'); to avoid the error. Is that the best solution?"
I see two solutions:
1) Lazy injection (which you suggested yourself)
This is the best way to avoid those circular dependencies in AngularJS. That said, looking at the AngularFire documentation, you are in somewhat uncharted territory, as some of these things in AngularFire are experimental in nature.
Here is a working fiddle based on yours, using
var Comment = $injector.get('Comment');
You are essentially lazy-injecting your references:
http://jsfiddle.net/yogeshgadge/ymxkt6up/5/
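In code, the lazy lookup could look roughly like this (a sketch only; the Article/Comment factory names come from the question, the rest is illustrative):
module.factory('Article', ['$injector', function($injector) {
  function Article(id) {
    this.id = id; // load the article data from Firebase here
  }
  Article.prototype.getComments = function() {
    // resolve Comment only when it is actually needed, which breaks the DI cycle
    var Comment = $injector.get('Comment');
    return (this.commentIds || []).map(function(cid) { return new Comment(cid); });
  };
  return Article;
}]);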
2) Module Run block:
With this option you may be able to inject those dependencies into your factories instead of using $injector.
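A sketch of that run-block variant (assuming each factory exposes a small setter so the run block can wire them together; names are illustrative):
module.factory('Article', function() {
  var Comment; // filled in by the run block below
  function Article(id) { this.id = id; }
  Article.setCommentType = function(C) { Comment = C; };
  Article.prototype.getComments = function() {
    return (this.commentIds || []).map(function(cid) { return new Comment(cid); });
  };
  return Article;
});

module.run(['Article', 'Comment', function(Article, Comment) {
  // neither factory injects the other directly, so there is no circular dependency
  Article.setCommentType(Comment);
  // ...and similarly Comment.setArticleType(Article);
}]);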
Suppose I'm working with an API which returns JSON data, but which has a complex or variable structure. For example, a string-valued property may be a plain literal, or may be tagged with a language:
/* first pattern */
{ "id": 1,
"label": "a foo"
}
/* second pattern */
{ "id": 2,
"label": [ {"value": "a foo", "lang": "en"},
{"value": "un foo", "lang": "fr"}]
}
In my client-side code, I don't want to have view code worrying about whether a label is available in multiple-languages, and which one to pick, etc. Or I might want to hide the detailed JSON structure for other reasons. So, I might wrap the JSON value in an object with a suitable API:
/** Value object for foo instances sent from server */
var Foo = function( json ) {
this.json = json;
};
/** Return a suitable label for this foo object */
Foo.prototype.label = function() {
var i18n = ... ;
if (i18n.prefLang && _.isArray(this.json.label)) // ... etc etc
};
So this is all pretty normal value-object pattern, and it's helpful because it's more decoupled from the specific JSON structure, more testable, etc. OK good.
What I currently don't see a way around is how to use one of these value objects with Backbone and Marionette. Specifically, I'd like to use a Foo object as the basis for a Backbone Model, and bind it to a Marionette ItemView. However, as far as I can see, the values in a Model are taken directly from the JSON structure - I can't see a way to recognise that the objects are functions:
var modelFoo = new Backbone.Model( foo );
> undefined
modelFoo.get( "label" ).constructor
> function Function() { [native code] }
So my question is: what is a good way to decouple the attributes of a Backbone Model from the specifics of a given JSON structure, such as a complex API value? Can value objects, models and views be made to play nice?
Edit
Let me add one more example, as I think the example above focussing on i18n issues only conveys part of my concern. Simplifying somewhat, in my domain, I have waterbodies comprising rivers, lakes and inter-tidal zones. A waterbody has associated with it one or more sampling points, and each sampling point has a latest sample. This might come back from the data API on the server as something like:
{"id": "GB12345678",
"centre": {"lat": 1.2345, "long": "-2.3456"},
"type": "river",
"samplingPoints": [{"id": "sp98765",
"latestSample": {"date": "20130807",
"classification": "normal"}
}]
}
So in my view code, I could write expressions such as:
<%= waterbody.samplingPoints[0].latestSample.classification %>
or
<% if (waterbody.type === "river") { %>
but that would be horrible, and easily broken if the API format changes. Slightly better, I could abstract such manipulations out into template helper functions, but they are still hard to write tests for. What I'd like to do is have a value object class Waterbody, so that my view code can have something like:
<%= waterbody.latestClassification() %>
One of the main problems I'm finding with Marionette is the insistence on calling toJSON() on the models passed to views, but perhaps some of the computed property suggestions have a way of getting around that.
The cleanest solution IMO is to put the label accessor into the model instead of the VO:
var FooModel = Backbone.Model.extend({
getLabel : function(){
return this.getLocalized("label");
},
getLocalized : function(key){
//return correct value from "label" array
}
});
and let the views use FooModel#getLabel instead of FooModel#get("label").
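One possible way to flesh out that getLocalized stub (a sketch; i18n.prefLang is assumed to hold the preferred language code, as in the question, with a fallback to the first entry):
var FooModel = Backbone.Model.extend({
  getLabel : function(){
    return this.getLocalized("label");
  },
  getLocalized : function(key){
    var raw = this.get(key);
    if (!_.isArray(raw)) { return raw; }                   // first pattern: plain literal
    var match = _.findWhere(raw, { lang: i18n.prefLang }); // second pattern: pick the preferred language
    return (match || raw[0]).value;
  }
});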
--EDIT 1
This lib seems interesting for your use case as well: Backbone.Schema
It allows you to formally declare the type of your model's attributes, but also provides some syntax sugar for localized strings and allows you to create dynamic attributes (called 'computed properties'), composed from the values of other attributes.
--EDIT 2 (in response to the edited question)
IMO the VO returned from the server should be wrapped inside a model, and this model is passed to the view. The model implements latestClassification, not the VO; this allows the view to call that method directly on the model.
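For example, a sketch of a Waterbody model along those lines (field names taken from the sample JSON above):
var Waterbody = Backbone.Model.extend({
  latestClassification : function(){
    var points = this.get("samplingPoints") || [];
    return points.length ? points[0].latestSample.classification : undefined;
  },
  isRiver : function(){
    return this.get("type") === "river";
  }
});
// the view can then call waterbody.latestClassification() instead of digging into the raw JSON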
A simple approach to this (possibly too simple for your implementation) would be to override the model's parse method to return suitable attributes:
var modelFoo = Backbone.Model.extend({
parse: function ( json ) {
var i18n = ... ;
if (i18n.prefLang && _.isArray(json.label)) {
// json.label = "complex structure"
}
return json;
}
});
That way only your model worries about how the data from the server is formatted without adding another layer of abstraction.
What seemed like a simple task turned out to be a challenge for me.
I have the following mongodb structure:
{
(...)
"services": {
"TCP80": {
"data": [{
"status": 1,
"delay": 3.87,
"ts": 1308056460
},{
"status": 1,
"delay": 2.83,
"ts": 1308058080
},{
"status": 1,
"delay": 5.77,
"ts": 1308060720
}]
}
}}
Now, the following query returns the whole document:
{ 'services.TCP80.data.ts':{$gt:1308067020} }
I wonder: is it possible for me to receive only those "data" array entries matching the $gt criteria (a kind of shrunken doc)?
I was considering MapReduce, but could not locate even a single example of how to pass external arguments (the timestamp) to the Map() function. (This feature was added in 1.1.4: https://jira.mongodb.org/browse/SERVER-401)
Also, there's always the alternative of writing a stored JS function, but since we're speaking of large quantities of data, DB locks can't be tolerated here.
Most likely I'll have to redesign the structure to something 1-level deep, like:
{
status:1,delay:3.87,ts:138056460,service:TCP80
},{
status:1,delay:2.83,ts:1308058080,service:TCP80
},{
status:1,delay:5.77,ts:1308060720,service:TCP80
}
but the DB will grow dramatically, since "service" is only one of many fields that would be appended to each document.
Please advise!
Thanks in advance.
In version 2.1 with the aggregation framework you are now able to do this:
1: db.test.aggregate(
2: {$match : {}},
3: {$unwind: "$services.TCP80.data"},
4: {$match: {"services.TCP80.data.ts": {$gte: 1308060720}}}
5: );
You can use a custom criteria in line 2 to filter the parent documents. If you don't want to filter them, just leave line 2 out.
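For instance, if you only want parent documents that contain at least one matching entry, that first $match could use the criteria from the question (same collection name assumed):
db.test.aggregate(
  {$match : {"services.TCP80.data.ts": {$gt: 1308067020}}},  // filter the parent documents first
  {$unwind: "$services.TCP80.data"},
  {$match : {"services.TCP80.data.ts": {$gt: 1308067020}}}   // then keep only the matching array entries
);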
This is not currently supported. By default you will always receive the whole document/array unless you use field restrictions or the $slice operator. Currently these tools do not allow filtering the array elements based on the search criteria.
You should watch this request for a way to do this: https://jira.mongodb.org/browse/SERVER-828
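For reference, field restrictions and $slice look like this; they can only limit by position, not by your ts criteria (collection name assumed):
// returns only the first 2 entries of the data array for each matching document, regardless of ts
db.test.find(
  { "services.TCP80.data.ts": { $gt: 1308067020 } },
  { "services.TCP80.data": { $slice: 2 } }
);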
I'm attempting to do something similar. I tried your suggestion of using the GROUP function, but I couldn't keep the embedded documents separate or was doing something incorrectly.
I needed to pull/get a subset of embedded documents by ID. Here's how I did it using Map/Reduce:
db.parent.mapReduce(
function(parent_id, child_ids){
if(this._id == parent_id)
emit(this._id, {children: this.children, ids: child_ids})
},
function(key, values){
var toReturn = [];
values[0].children.forEach(function(child){
if(values[0].ids.indexOf(child._id.toString()) != -1)
toReturn.push(child);
});
return {children: toReturn};
},
{
mapparams: [
"4d93b112c68c993eae000001", //example parent id
["4d97963ec68c99528d000007", "4debbfd5c68c991bba000014"] //example embedded children ids
]
}
).find()
I've abstracted my collection name to 'parent' and it's embedded documents to 'children'. I pass in two parameters: The parent document ID and an array of the embedded document IDs that I want to retrieve from the parent. Those parameters are passed in as the third parameter to the mapReduce function.
In the map function I find the parent document in the collection (which I'm pretty sure uses the _id index) and emit its id and children to the reduce function.
In the reduce function, I take the passed in document and loop through each of the children, collecting the ones with the desired ID. Looping through all the children is not ideal, but I don't know of another way to find by ID on an embedded document.
I also assume in the reduce function that there is only one document emitted since I'm searching by ID. If you expect more than one parent_id to match, then you will have to loop through the values array in the reduce function.
I hope this helps someone out there, as I googled everywhere with no results. Hopefully we'll see a built in feature soon from MongoDB, but until then I have to use this.
Fadi, as for "keeping embedded documents separate": group should handle this with no issues.
function getServiceData(collection, criteria) {
  var res = db[collection].group({
    key: {},                          // no grouping key: everything is reduced into one result
    cond: criteria,
    initial: { vals: [], globalVar: 0 },
    reduce: function(doc, out) {
      if (out.globalVar % 2 == 0)
        out.vals.push(doc.whatever.kind.and.depth);  // push whatever fields you need from the doc
      out.globalVar++;
    },
    finalize: function(out) {
      if (out.vals.length == 0)
        out.vals = 'sorry, no data';
      return out.vals;
    }
  });
  return res[0];
}