I'm a newbie with Firebase + GeoFire and I'm having trouble with the GeoFire query function.
I want to add the results from the geoQuery to an array and return that array from a function. But the data I manipulate inside the geoQuery.on callback seems to be out of scope or not yet available, maybe because of promises, I don't know... The fact is that outside the geoQuery.on callback the variable sellers is empty.
How can I return the results from the geoQuery and save them into a variable I can return?
//Set seller position in firebase db
var setPosition = function() {
  navigator.geolocation.getCurrentPosition(setPositionSuccess, error, options);
  //navigator.geolocation.watchPosition(setPositionSuccess, positionError, { enableHighAccuracy:true })
};

//Get sellers near buyer position
var getSellersForCurrentPosition = function() {
  navigator.geolocation.getCurrentPosition(getPositionSuccess, error, options);
  //navigator.geolocation.watchPosition(positionSuccess, positionError, { enableHighAccuracy:true })
};

//Callback function from html 5 geo api
function getPositionSuccess(pos) {
  var crd = pos.coords;
  var currentPosition = [crd.latitude, crd.longitude];

  // Query radius
  var radiusInKm = 2;

  var firebaseRef = new Firebase(FBURL + "/geofire/sellers/");
  var geoFire = new GeoFire(firebaseRef);
  var geoQuery = geoFire.query({
    center: currentPosition,
    radius: radiusInKm
  });

  var sellers = [];
  var oneSeller = {};

  var onKeyEnteredRegistration = geoQuery.on("key_entered", function(key, location, distance) {
    oneSeller = {
      id: key,
      distance: distance,
      location: location
    };
    sellers.push(oneSeller);
  });

  var onReadyRegistration = geoQuery.on("ready", function() {
    geoQuery.cancel();
  });

  return sellers;
}
By the way, how accurate is HTML5 geolocation? Does it differ between desktop and mobile browsers?
GeoFire monitors the sellers within the range you specify. Any time a seller enters or exits that range, it fires a key_entered or key_exited event. These events can fire at any time after you start the query; in JavaScript terms, the callbacks run asynchronously.
A simple event flow might best explain what happens:
you call getPositionSuccess()
you start a Geoquery to monitor the sellers that are in range: geoFire.query()
no sellers are immediately in range, so your callback doesn't fire
the getPositionSuccess() function is done and exits
a seller comes in range
GeoFire fires the key_entered event and your callback runs
but getPositionSuccess() has already exited, so how can it return a value?
Even if you were to wait for the first seller to come into range before returning (not possible in a browser, though it is possible in other languages/environments), how would you return the value when a second seller comes into range?
For this reason, you have to deal with asynchronous data differently. Typically you do this by moving the code that would otherwise consume the return value of getPositionSuccess() into that function (and its callbacks).
Say you are now trying to do this:
var sellers = getPositionSuccess(pos);
sellers.forEach(function(seller) {
  addSellerToMap(seller);
});
To handle the asynchronous nature of the events, you'd move this code into getPositionSuccess:
//Callback function from html 5 geo api
function getPositionSuccess(pos) {
  var crd = pos.coords;
  var currentPosition = [crd.latitude, crd.longitude];

  // Query radius
  var radiusInKm = 2;

  var firebaseRef = new Firebase(FBURL + "/geofire/sellers/");
  var geoFire = new GeoFire(firebaseRef);
  var geoQuery = geoFire.query({
    center: currentPosition,
    radius: radiusInKm
  });

  var oneSeller = {};
  geoQuery.on("key_entered", function(key, location, distance) {
    oneSeller = {
      id: key,
      distance: distance,
      location: location
    };
    addSellerToMap(oneSeller);
  });
}
I understand that in your use-case your sellers won't move, so it may be more intuitive to think of them as a static list. But even in this case, the results are loaded from a remote database and it will take some time before that data is loaded. The modern web loads data asynchronously and all your code will have to deal with it in a way similar to what I outlined above.
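That said, if all you need is a one-time snapshot of the sellers currently in range, you can collect the key_entered results and hand them to whoever needs them once the ready event fires (GeoFire fires ready after the initial set of keys has been loaded). A minimal sketch, using the names from your question and a plain Promise:
// A minimal sketch (not the asker's code) of a one-shot query: collect the
// sellers that are in range and resolve a promise when GeoFire says "ready".
function getSellersOnce(geoFire, center, radiusInKm) {
  return new Promise(function(resolve) {
    var sellers = [];
    var geoQuery = geoFire.query({ center: center, radius: radiusInKm });

    geoQuery.on("key_entered", function(key, location, distance) {
      sellers.push({ id: key, location: location, distance: distance });
    });

    geoQuery.on("ready", function() {
      geoQuery.cancel();   // we only want a snapshot, so stop listening
      resolve(sellers);    // hand the collected sellers to the caller
    });
  });
}

// The caller still has to consume the result asynchronously:
getSellersOnce(geoFire, currentPosition, 2).then(function(sellers) {
  sellers.forEach(addSellerToMap);
});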
Trying to cut down on code repetition, I've set up a $firebaseArray extension as follows:
var listUsersFactory = $firebaseArray.$extend({
  $$added: function (snap) {
    return new Customer(snap);
  },
  $$updated: function (snap) {
    var c = this.$getRecord(snap.key);
    var updated = c.updated(snap);
    return updated;
  },
});
and the Customer code:
function Customer(snap) {
  this.$id = snap.key;
  this.updated(snap);
}

Customer.prototype = {
  updated: function(snap) {
    var oldData = angular.extend({}, this.data);
    this.data = snap.val();
    // checks and such
  }
};
This works wonders when loading, showing and saving a list of customers, and I'm satisfied with it.
Now, the problem lies in retrieving a single customer for its detail page: because the Customer object isn't an extension of $firebaseObject, it lacks a $save method.
Single customer loading:
customersRef.once("value", function(snapshot) {
  if (snapshot.child(uuid).exists()) {
    customersFactory.customerDetails = new Customer(snapshot.child(uuid));
    return deferred.resolve();
  }
});
but when I call customersFactory.customerDetails.$save() I get an error.
How can I extend my class so that it works for both array and single object uses?
In case anyone's wondering: I couldn't find a way to do this, so I ended up using the $firebaseArray and getting single records off it to pass as details, roughly like the sketch below.
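A minimal sketch of that workaround (assuming the customersRef, uuid and listUsersFactory from above); the extended array's own $loaded, $getRecord and $save are used instead of a $firebaseObject:
var customers = new listUsersFactory(customersRef);

customers.$loaded().then(function () {
  // look the single customer up by its key in the synchronized array
  var customer = customers.$getRecord(uuid);
  customersFactory.customerDetails = customer;

  // later, after editing customer.data on the detail page:
  customers.$save(customer);   // persists just this record back to Firebase
});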
I'm trying to set up this example: https://github.com/AngularClass/angular-websocket#usage
Here is my code:
App.factory('MyData', function($websocket, $q) {
  var dataStream = $websocket('wss://url');
  var collection = [];

  dataStream.onMessage(function(message) {
    var result = JSON.parse(message.data);
    console.log(result);
    collection = result;
  });

  var methods = {
    collection: collection,
    get: function() {
      dataStream.send(JSON.stringify({
        api: "volume",
        date: "2017-02-01",
        interval: 600
      }));
    }
  };

  return methods;
});
In my controller I wrote:
$interval(function () {
  console.log(MyData.collection);
}, 1000);
The problem is that I don't receive any values; however, when a message arrives I see the console log, so the websocket itself is obviously alive. If I change it to collection.push(result) (like in the example) I get a constantly growing array. I only need the last value, however. Why is collection = result wrong?
var collection = []; instantiates a new array and stores its reference in the variable collection. That reference is then assigned to methods.collection and, hence, MyData.collection. However, JSON.parse instantiates another new array, and collection = result; overwrites the original reference with the reference of that new array. MyData.collection, though, still holds the reference to the original array.
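Stripped of the app specifics, the same effect can be seen in a few lines of plain JavaScript:
var collection = [];                        // reference to array A
var methods = { collection: collection };   // methods.collection also points to A

collection = [1, 2, 3];                     // collection now points to a NEW array B
console.log(methods.collection);            // [] -- still pointing to A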
So, there are two ways to solve the problem:
Don't overwrite the reference to the original array. push is fine, but you need to clear the array first so that only the last value is shown:
collection.splice(0, collection.length);
collection.push(result);
However, that would be an array in an array. You probably need to push the values individually (Array.concat will create a new array, too):
collection.splice(0, collection.length);
result.forEach(function(value) {
  collection.push(value);
});
Assign the reference of the new array directly to methods.collection. In this case, no extra variable collection is needed.
App.factory('MyData', function($websocket, $q) {
  var dataStream = $websocket('wss://url');

  var methods = {
    collection: [],
    get: function() {
      dataStream.send(JSON.stringify({
        api: "volume",
        date: "2017-02-01",
        interval: 600
      }));
    }
  };

  dataStream.onMessage(function(message) {
    var result = JSON.parse(message.data);
    console.log(result);
    methods.collection = result;
  });

  return methods;
});
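With this variant, just make sure the controller reads MyData.collection on every access instead of caching it in a local variable, since the reference is replaced on each message. For example (assuming a $scope-based controller, as an illustration):
$interval(function () {
  // read the property on every tick; a cached copy would go stale
  $scope.latest = MyData.collection;
  console.log($scope.latest);
}, 1000);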
(I was not sure what to use as a title, so if you have a better suggestion, feel free to propose one and I will correct it.)
I am working on an Angular application where I have some menus and a search result list. I also have a document view area.
You can sort of say that the application behaves like an e-mail application.
I have a few controllers:
DateCtrl: creates a list of dates so the users can choose which dates they want to see posts from.
SourceCtrl: creates a list of sources so the user can choose which sources he/she wants to see posts from.
ListCtrl: the controller populating the list. The data comes from an Elasticsearch index. The list is updated every 10-30 seconds (I'm trying to find the best interval) using the $interval service.
What I have tried
Sources: I have tried to make this a filter, but when a user clicks two checkboxes the list is no longer sorted by date, but by which checkbox the user clicked first.
If it is possible to make this work as a filter, I'd rather continue doing that.
The current code is like this, it does not do what I want:
.filter("bureauFilter", function(filterService) {
return function(input) {
var selectedFilter = filterService.getFilters();
if (selectedFilter.length === 0) {
return input;
}
var out = [];
if (selectedFilter) {
for (var f = 0; f < selectedFilter.length; f++) {
for (var i = 0; i < input.length; i++) {
var myDate = input[i]._source.versioncreated;
var changedDate = dateFromString(myDate);
input[i]._source.sort = new Date(changedDate).getTime();
if (input[i]._source.copyrightholder === selectedFilter[f]) {
out.push(input[i]);
}
}
}
// return out;
// we need to sort the out array
var returnArray = out.sort(function(a,b) {
return new Date(b.versioncreated).getTime() - new Date(a.versioncreated).getTime();
});
return returnArray;
} else {
return input;
}
}
})
Date: I have found in production that this cannot be used as a filter. The list of posts shows the latest 1000 posts, which is only a third of all the posts arriving each day, so this has to be changed to a date search.
I am trying something like this:
.service('elasticService', ['es', 'searchService', function (es, searchService) {
  var esSearch = function (searchService) {
    if (searchService.field === "versioncreated") {
      // doing some code
    } else {
      // doing some other type of search
    }
and a search service:
.service('searchService', function () {
  var selectedField = "";
  var selectedValue = "";

  var setFieldAndValue = function (field, value) {
    selectedField = field;
    selectedValue = value;
  };

  var getFieldAndValue = function () {
    return {
      "field": selectedField,
      "value": selectedValue
    };
  };

  return {
    setFieldAndValue: setFieldAndValue,
    getFieldAndValue: getFieldAndValue
  };
})
What I want to achieve is this:
When no dates or sources are clicked the whole list shall be shown.
When Source or Date are clicked it shall get the posts based on these selections.
I cannot use a filter for Date, as the application receives some 3000 posts a day, so I have to query Elasticsearch to get the posts for the selected date.
Up until now I have done the Elasticsearch query in the listController, but I am now refactoring so the ES search happens in a service. That way the listController will receive the correct posts based on the selections the user has made.
Question is: What is the best pattern or method to use when trying to achieve this?
Where your data comes from is pretty irrelevant; it's up to you to hook it up with your data source.
With regards to how to render a list:
The view would be:
<div ng-controller='MyController as myCtrl'>
  <form>
    <input name='searchText' ng-model='myCtrl.searchText'>
  </form>
  <ul>
    <li ng-repeat='item in myCtrl.list | filter:myCtrl.searchText' ng-bind='item'></li>
  </ul>
  <button ng-click='myCtrl.doSomethingOnClick()'></button>
</div>
controller would be:
myApp.controller('MyController', ['ElasticSearchService', function(ElasticSearchService) {
  var self = this;
  self.searchText = '';

  ElasticSearchService.getInitialList().then(function(list) {
    self.list = list;
  });

  self.doSomethingOnClick = function() {
    ElasticSearchService.updateList(self.searchText).then(function(list) {
      self.list = list;
    });
  };
}]);
service would be:
myApp.service('ElasticSearchService', ['$q', function($q) {
  var obj = {};

  obj.getInitialList = function() {
    var defer = $q.defer();
    // do some elastic search stuff here
    // on success
    defer.resolve(esdata);
    // on failure
    defer.reject();
    return defer.promise;   // note: promise is a property, not a function
  };

  obj.updateList = function(param) {
    var defer = $q.defer();
    // do some elastic search stuff here
    // on success
    defer.resolve(esdata);
    // on failure
    defer.reject();
    return defer.promise;   // note: promise is a property, not a function
  };

  return obj;
}]);
This code has NOT been tested but gives you an outline of how you should approach this. $q is used because promises allow things to be dealt with asynchronously.
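For the date selection, the "do some elastic search stuff here" part could send a range query on versioncreated. A hedged sketch, assuming the es service is the official elasticsearch.js client, with a hypothetical index name and example dates:
es.search({
  index: 'posts',                        // hypothetical index name
  body: {
    query: {
      range: {
        versioncreated: {
          gte: '2015-06-01T00:00:00Z',   // example: start of the selected day
          lt:  '2015-06-02T00:00:00Z'    // example: start of the next day
        }
      }
    },
    sort: [{ versioncreated: 'desc' }]
  }
}).then(function(response) {
  // response.hits.hits contains the matching posts
});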
I've got a factory that gets my data from Firebase, and I want my controller to be able to access it. However, when I console.log the data in my controller, it isn't the Array[10] I would expect, but rather an Array with keys 0, 1, 2, .., 10, $$added, $$error, $$moved, ... and so on. However, when I skip the factory and use the $asArray() method on my Firebase ref directly in my controller, it shows up nicely as Array[10].
In my factory, this is what it looks like:
var listingsref = new Firebase("https://something.firebaseio.com");
var sync2 = $firebase(listingsref);
var products = sync2.$asArray();

factory.getProducts = function(){
  return products;
};
Controller
$scope.products = marketFactory.getProducts();
console.log($scope.products) in my controller should be Array[10], but instead it's an Array with the data + a lot more $$ methods. Anyone know what's going on? Thanks
EDIT: Full Factory File
(function(){
  var marketFactory = function($firebase){
    var listingsref = new Firebase("https://something.firebaseio.com");
    var sync2 = $firebase(listingsref);
    var products = sync2.$asArray();

    var factory = {};

    factory.getProducts = function(){
      console.log(products);
      return products;
    };

    factory.getProduct = function(productId){
      for(var x = 0; x < products.length; x++){
        if(productId == products[x].id){
          return {
            product: products[x],
            dataPlace: x
          };
        }
      }
      return {};
    };

    factory.getNextProduct = function(productId, e){
      var currentProductPlace = factory.getProduct(productId).dataPlace;
      if (e == "next" && currentProductPlace < products.length){
        return products[currentProductPlace + 1];
      }
      else if(e == "prev" && currentProductPlace > 0){
        return products[currentProductPlace - 1];
      }
      else{
        return {};
      }
    };

    factory.componentToHex = function(c){
      var hex = c.toString(16);
      return hex.length == 1 ? "0" + hex : hex;
    };

    factory.rgbToHex = function(r, g, b){
      return "#" + factory.componentToHex(r) + factory.componentToHex(g) + factory.componentToHex(b);
    };

    factory.hexToRgb = function(hex) {
      if(hex.charAt(0) === "#"){
        hex = hex.substr(1);
      }
      var bigint = parseInt(hex, 16);
      var r = (bigint >> 16) & 255;
      var g = (bigint >> 8) & 255;
      var b = bigint & 255;
      return r + ", " + g + ", " + b;
    };

    factory.parseRgb = function(rgb){
      rgb = rgb.replace(/\s/g, '');
      var red = parseInt(rgb.split(',')[0]);
      var green = parseInt(rgb.split(',')[1]);
      var blue = parseInt(rgb.split(',')[2]);
      return {
        r: red,
        g: green,
        b: blue
      };
    };

    return factory;
  };

  marketFactory.$inject = ['$firebase'];
  angular.module('marketApp').factory('marketFactory', marketFactory);
}());
This snippet gets a synchronized AngularFire array of products:
var products = sync2.$asArray();
The AngularFire documentation is a bit off on this point: what you get back from $asArray() is not an array, but the promise of an array. At some point in the future your products variable will contain an array. This is done because it may take (quite) some time for your array data to be downloaded from Firebase. Instead of blocking your code/browser while the data is downloading, it returns a wrapper object (called a promise) and just continues.
Such a promise is good enough for AngularJS: if you simply bind products to the scope and ng-repeat over it, your view will show all products just fine. This is because AngularFire behind the scenes lets AngularJS know when the data is available and Angular then redraws the view.
But you said:
console.log($scope.products) in my controller should be Array[10]
That is where you're mistaken. While AngularFire ensures that its $asArray() promise works fine with AngularJS, it doesn't do the same for console.log. So your console.log code runs before the data has been downloaded from Firebase.
If you really must log the products, you should wait until the promise is resolved. You do this with the following construct:
products.$loaded().then(function(products) {
  console.log(products);
});
When you code it like this snippet, the data for your products will have been downloaded by the time console.log runs.
Note that the object will still have extra helper methods on it, such as $add. That is normal and also valid on an array. See the documentation for FirebaseArray for more information on what the methods are, what they're for and how to use them.
So I edited the code in the plnkr at http://plnkr.co/M4PqojtgRhDqU475NoRY.
The main differences are the following:
// Add $FirebaseArray so we can extend the factory
var marketFactory = function($firebase, $FirebaseArray){
  var listingsref = new Firebase("https://something.firebaseio.com");

  // Actually extend the AngularFire factory and return the array
  var MarketFactory = $FirebaseArray.$extendFactory(factory);
  return function() {
    var sync = $firebase(listingsref, {arrayFactory: MarketFactory});
    return sync.$asArray();
  };
Check out https://www.firebase.com/docs/web/libraries/angular/guide.html#section-extending-factories for more information on extending AngularFire factories. You will likely need to make some adjustments to the rest of the factory code.
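With that change the factory returns a function, so a controller calls it to get the extended, synchronized array. A hypothetical usage, assuming it is still registered as marketFactory on marketApp:
angular.module('marketApp').controller('MarketCtrl', ['$scope', 'marketFactory',
  function($scope, marketFactory) {
    // calling the factory builds the extended $FirebaseArray
    $scope.products = marketFactory();
  }
]);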
This problem has me stumped.
For some reason, the autoincrementing key generator in IndexedDB resets after performing an update on an existing object with a put transaction, leading to overwrites of data in the database.
For my app, I'm using a self-written IndexedDB service for AngularJS with all the basic CRUD functions implemented.
I may also add that I'm developing with the Ionic Framework, even though I doubt that is to blame.
Considering the service is a work in progress, I've let the key path for an object store default to "id" with an autoincrementing strategy.
The indices for a given store, however, are up to the user to define in a configuration object.
As an example:
dbHelper.objectStores = [{name: 'employees',
                          indices: [{indexName: 'name', isUnique: false},
                                    {indexName: 'phone', isUnique: true}]}];
This would, unless already created in the db, create the object store 'employees' with indices 'name' and 'phone', where 'phone' would have to be a unique value while 'name' would not.
Here is the implementation of the openDB function.
Please note that dbHelper.objectStores is supposed to be empty, as it's up to the user to assign these properties before openDB is invoked (or else defaults are used).
angular.module('dbProvider', [])
.factory('$db', ['$window', function($window) {
  // DB Object
  var dbHelper = {};

  // Properties - Are given defaults unless assigned manually by user before openDB is invoked.
  dbHelper.dbName = 'defaultDB';
  dbHelper.dbVersion = 1;
  dbHelper.objectStores = [];

  dbHelper.openDB = function(onCompleteCallback, onErrorCallback) {
    console.log('Atempting to open db with name ' + dbHelper.dbName + '.');
    var request = $window.indexedDB.open(dbHelper.dbName, dbHelper.dbVersion);

    // Invoked by indexedDB if version changes
    request.onupgradeneeded = function(e) {
      console.log('Version change. Current version: ' + dbHelper.dbVersion);
      var db = e.target.result;
      e.target.transaction.onerror = onErrorCallback;

      if(dbHelper.objectStores.length === 0) {
        dbHelper.objectStores.push({name:'defaultStore', indices: []});
      }

      for(var store in dbHelper.objectStores) {
        if(db.objectStoreNames.contains(dbHelper.objectStores[store].name)) {
          console.log(dbHelper.objectStores[store].name + ' deleted.');
          db.deleteObjectStore(dbHelper.objectStores[store].name);
        }

        var newStore = db.createObjectStore(dbHelper.objectStores[store].name, {keyPath: "id", autoIncrement: true});
        for(var index in dbHelper.objectStores[store].indices) {
          newStore.createIndex(dbHelper.objectStores[store].indices[index].indexName,
                               dbHelper.objectStores[store].indices[index].indexName,
                               {unique : dbHelper.objectStores[store].indices[index].isUnique});
        }
        console.log(dbHelper.objectStores[store].name + ' created.');
      }
    };

    request.onsuccess = function(e) {
      console.log('DB ' + dbHelper.dbName + ' open.');
      dbHelper.indexedDB.db = e.target.result;
      onCompleteCallback();
    };

    request.onerror = onErrorCallback;
  };
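For context, a hypothetical caller (assuming the factory exposes dbHelper as the $db service) configures the stores and then opens the database, along these lines (names and values made up for illustration):
// Illustration only: configure the helper before opening the db.
$db.dbName = 'companyDB';
$db.objectStores = [{name: 'employees',
                     indices: [{indexName: 'name', isUnique: false},
                               {indexName: 'phone', isUnique: true}]}];

$db.openDB(function() {
  console.log('DB ready.');
}, function(err) {
  console.log('DB error: ' + err);
});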
Here are some of the CRUD functions (the ones in question):
dbHelper.findItemWithIndex = function(keyValue, storename,
                                      onCompleteCallback, onErrorCallback) {
  var db = dbHelper.indexedDB.db;
  var trans = db.transaction([storename], "readwrite");
  var store = trans.objectStore(storename);
  var index = store.index(keyValue.key);

  index.get(keyValue.value).onsuccess = function(event) {
    onCompleteCallback(event.target.result);
  };
};

dbHelper.addItemToStore = function(item, storename,
                                   onCompleteCallback, onErrorCallback) {
  var db = dbHelper.indexedDB.db;
  var trans = db.transaction([storename], "readwrite");
  var store = trans.objectStore(storename);

  var request = store.add(item);
  trans.oncomplete = onCompleteCallback;
  request.onerror = onErrorCallback;
};

dbHelper.deleteItemFromStore = function(itemId, storename,
                                        onCompleteCallback, onErrorCallback) {
  var db = dbHelper.indexedDB.db;
  var trans = db.transaction([storename], "readwrite");
  var store = trans.objectStore(storename);

  var request = store.delete(itemId);
  trans.oncomplete = onCompleteCallback;
  request.onerror = onErrorCallback;
};

dbHelper.updateItem = function(item, storename, onCompleteCallback, onErrorCallback) {
  var db = dbHelper.indexedDB.db;
  var trans = db.transaction([storename], "readwrite");
  var store = trans.objectStore(storename);

  var request = store.put(item);
  trans.oncomplete = onCompleteCallback;
  request.onerror = onErrorCallback;
};
Finally, the code from the controller where the transactions are invoked.
The strategy here is that the item is added to the db using the addItemToStore function the first time it is persisted, and afterwards updated with the updateItem function.
After the first add, the object is immediately fetched again in order to keep working on it with the id assigned by the db.
$scope.updateTemplate = function() {
  console.log('Saving..');
  var onCompleteCallback = {};

  if(!$scope.formTemplate.firstSave) {
    onCompleteCallback = $scope.updateModel;
  } else {
    $scope.formTemplate.firstSave = false;
    onCompleteCallback = $scope.setId;
  }

  $db.updateItem($scope.formTemplate, $scope.objectStore.name,
                 onCompleteCallback, $scope.dbError);
};

$scope.newItem = function() {
  $db.addItemToStore($scope.formTemplate, $scope.objectStore.name,
                     $scope.setId, $scope.dbError);
};

$scope.setId = function() {
  $db.findItemWithIndex(
    {key: 'title',
     value: $scope.formTemplate.title},
    $scope.objectStore.name,
    function(result) {
      console.log(JSON.stringify(result));
      $scope.formTemplate = result;
    },
    function(error) {
      $scope.dbError(error);
    });
};
It's here that everything goes to hell.
I add an object, go back to another view and find it in the list with id=1.
I add another object, go back to the list view, and there it is with id=2.
And so forth and so forth..
Then, after updating either of the objects with the $scope.updateTemplate function, which also works like a charm, things get interesting:
The next object added gets id=1 and totally erases good old numero uno from earlier.
The next objects also get ids that cause them to replace already existing objects.
What could cause this?
For testing I'm using Safari 8 on OS X 10.10, and I'm deploying to an LG G2 with KitKat 4.4.2.
To be honest, I skimmed, but I saw this: "Safari 8" - the latest iOS and Safari have serious bugs with IndexedDB: http://www.raymondcamden.com/2014/9/25/IndexedDB-on-iOS-8--Broken-Bad
In iOS9, many of the IndexedDb bugs are fixed, but not all. We are currently testing on iOS9 Beta 2 and this particular bug that you found is not fixed.
We were able to work around this problem by not using autoincrement on our object stores. We just manually find the max key value and increment that.
Inserting an object looks something like this:
var store = db.transaction([entity], "readwrite").objectStore(entity);

// Open a cursor at the end of the store ("prev") so the first result holds the max key.
store.openCursor(null, "prev").onsuccess = function (event) {
  var cursor = event.target.result;
  var maxKey = cursor ? cursor.key : 0;   // empty store: start at 0
  object.id = maxKey + 1;
  store.add(object);
};