Breeze 1-m-1 in HotTowel Angular with local storage - angularjs

I've had a requirement recently to implement a UI for managing a many-to-many relationship. Ward Bell kindly provided this plunker showing how to implement it using 1-m-1 with Angular and Breeze.
My app's design (especially the datacontext and the local storage) is based largely on John Papa's recent Pluralsight courses.
In my app, BusUnit = Hero and Dimension = Power (in reference to Ward's example).
Everything seems to be working well when I force the app to fetch data from the server, in that my updates to a business unit's dimensions reflect correctly. The problem I'm facing now is when I navigate away from the page and back again (which gets data from local storage). In this case:
if I previously added a new dimension to a business unit, everything is ok, but
if I previously marked a business unit's dimension for deletion and then saved, the dimension still appears for the business unit in question.
This is the controller code that initially gets business units and their dimensions:
function getdboardStructure() {
    var busUnitsPromise = datacontextSvc.busUnits.getByDboardConfigId(vm.dboardConfig.id);
    var dimensionsPromise = datacontextSvc.dimensions.getByDboardConfigId(vm.dboardConfig.id);
    $q.all([busUnitsPromise, dimensionsPromise])
        .then(function (values) {
            vm.busUnits = values[0];
            vm.dims = values[1];
            createBusUnitVms();
            //vm.currentBusUnitVm = vm.busUnitVms[0]; // not required as using accordion instead of drop-down
            vm.hasChanges = false;
        });
}
This is the code in my controller that prepares for the save:
function applyBusUnitDimensionSelections(busUnitVm) {
    var busUnit = busUnitVm.busUnit;
    var mapVms = busUnitVm.dimensionMapVms;
    var dimensionHash = createBusUnitDimensionHash(busUnit);
    mapVms.forEach(function (mapVm) {
        var map = dimensionHash[mapVm.dimension.id];
        if (mapVm.selected) {
            if (!map) {
                datacontextSvc.busUnits.addBusUnitDimension(busUnit, mapVm.dimension)
                    .then(function () {
                    });
            }
        } else {
            if (map) {
                datacontextSvc.markDeleted(map);
            }
        }
    });
}
This is the code in my controller that executes the save:
function save() {
    if (!canSave()) {
        return $q.when(null);
    }
    vm.isSaving = true;
    vm.busUnitVms.forEach(applyBusUnitDimensionSelections);
    return datacontextSvc.save().then(function (saveResult) {
        vm.isSaving = false;
        trapSavedDboardConfigId(saveResult); // not relevant to use case
    }, function (error) {
        vm.isSaving = false;
    });
}
This is the code in my repository that adds a new busUnitDimension entity:
function addBusUnitDimension(busUnit, dimension) {
    var newBusUnitDimension = this.em.createEntity(busUnitDimension);
    newBusUnitDimension.busUnitId = busUnit.id;
    newBusUnitDimension.dimensionId = dimension.id;
    return this.$q.when(newBusUnitDimension);
}
This is my datacontext code for marking an item deleted:
function markDeleted(entity) {
    return entity.entityAspect.setDeleted();
}
And finally, this is the repository code that gets business units and their join-table entities:
function getByDboardConfigId(dboardConfigId, forceRefresh) {
    var self = this;
    var predicate = pred.create('dboardConfigId', '==', dboardConfigId);
    var busUnits;
    if (self.zStorage.areItemsLoaded('busUnits') && !forceRefresh) {
        busUnits = self._getAllLocal(entityName, orderBy, predicate);
        return self.$q.when(busUnits);
    }
    return eq.from('BusUnits')
        .expand('BusUnitDimensions')
        .where(predicate)
        .orderBy(orderBy)
        .using(self.em).execute()
        .to$q(succeeded, self._failed);

    function succeeded(data) {
        busUnits = data.results;
        self.zStorage.areItemsLoaded('busUnits', true);
        self.zStorage.save();
        //self.logSuccess('Retrieved ' + busUnits.length + ' business units from server', busUnits.length, true);
        return busUnits;
    }
}
My departure from John's course examples is that I'm using expand in the function that gets business units from the server. My hypothesis is that this has something to do with Breeze going to the server every time I refresh the page (without clearing the cache), and that it also has something to do with the error I'm receiving when I navigate away and then back to the page.
Can anyone offer any suggestions?

Appreciate this was a long time ago and you have probably solved it or moved on, but I came up against the same problem recently and it took me ages to resolve.
The answer I found is that you have to edit JP's angular.breeze.storagewip.js file.
It contains the names of the entities hard-coded into the file, and you will need to change these to match your own entities.
There are two functions where you need to do this, examples below show the changes with the four entities I am using:
function zStorageCore($rootScope, zStorageConfig) {
    var storeConfig = zStorageConfig.config;
    var storeMeta = {
        breezeVersion: breeze.version,
        appVersion: storeConfig.version,
        isLoaded: {
            elementAssets: false,
            surveyors: false,
            elements: false,
            assets: false
        }
    };
and...
function checkStoreImportVersionAndParseData(importedData) {
    if (!importedData) {
        return importedData;
    }
    try {
        var data = JSON.parse(importedData);
        var importMeta = data[0];
        if (importMeta.breezeVersion === storeMeta.breezeVersion &&
            importMeta.appVersion === storeMeta.appVersion) {
            if (importMeta.isLoaded) {
                storeMeta.isLoaded.assets = storeMeta.isLoaded.assets || importMeta.isLoaded.assets;
                storeMeta.isLoaded.elements = storeMeta.isLoaded.elements || importMeta.isLoaded.elements;
                storeMeta.isLoaded.surveyors = storeMeta.isLoaded.surveyors || importMeta.isLoaded.surveyors;
                storeMeta.isLoaded.elementAssets = storeMeta.isLoaded.elementAssets || importMeta.isLoaded.elementAssets;
            }
            return data[1];
        } else {
            _broadcast(storeConfig.events.error,
                'Did not load from storage because mismatched versions',
                { current: storeMeta, storage: importMeta });
        }
    } catch (ex) {
        _broadcast(storeConfig.events.error, 'Exception during load from storage: ' + ex.message, ex);
    }
    return null; // failed
}
I solved this by comparing JP's Style Guide course files with his SPA/Angular/Breeze course.
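For the question above, the same edit would presumably use its own entity names. A minimal sketch of the isLoaded map, assuming the cached sets are busUnits, dimensions and busUnitDimensions (only 'busUnits' is confirmed by the repository code, which calls zStorage.areItemsLoaded('busUnits'); the other two names are assumptions):
var storeMeta = {
    breezeVersion: breeze.version,
    appVersion: storeConfig.version,
    isLoaded: {
        busUnits: false,          // confirmed by zStorage.areItemsLoaded('busUnits') above
        dimensions: false,        // assumed entity-set name
        busUnitDimensions: false  // assumed entity-set name
    }
};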

Related

getData() is not called when using authentication type of PATH_USER_PASS

I have encountered a problem while trying to build a Google Data Studio community connector. Specifically, everything works and I can see the data rendered on the explorer screen if I use the USER_PASS option of authentication, but if I use PATH_USER_PASS it doesn't render properly. When I look at the stack trace it doesn't even show that the getData() function is executed. Can somebody help me?
Publishing the code below from the manifest file and hitting CONNECT and EXPLORE will successfully give one row of data.
There will also be success when I change nothing but the AuthType to USER_PASS.
However, it will break when I change nothing but the AuthType to PATH_USER_PASS.
Note that I hardcoded things into my getData and getSchema, so running this code doesn't need any user input. The process of getting data is not bound in any shape or form to the authentication process. This demonstrates that this is possibly one of Google Data Studio's bugs.
As I said, the getData() function is not even run when I switch to the PATH_USER_PASS authentication method.
Any help will be appreciated!
main.js :
var cc = DataStudioApp.createCommunityConnector();

function getAuthType() {
    return cc.newAuthTypeResponse()
        .setAuthType(cc.AuthType.NONE)
        .build();
}

function isAuthValid() {
    return true;
}

function setCredentials(request) {}

function getConfig(request) {
    var config = cc.getConfig();
    return config.build();
}

function getSchema(request) {
    var fields = getFields(request).build();
    return { schema: fields };
}

function responseToRows(requestedFields, response) {
    return response.map(function(submissions) {
        var row = [];
        requestedFields.asArray().forEach(function (field) {
            switch (field.getId()) {
                case 'student_name':
                    return row.push(submissions.student_name);
                case 'student_age':
                    return row.push(submissions.student_age);
                case 'student_school_year':
                    return row.push(submissions.student_school_year);
                case 'submissionDate':
                    return row.push(submissions.__system.submissionDate);
                default:
                    return row.push('');
            }
        });
        return { values: row };
    });
}

function getData(request) {
    var requestedFieldIds = request.fields.map(function(field) {
        return field.name;
    });
    var requestedFields = getFields().forIds(requestedFieldIds);
    // fake response from hardCodedData.
    var hardCodedData = '{"value":[{"__id":"uuid:3ab058df-5039-41cd-b16b-5c21f01bf60b","student_name":"Pieter Benjamin","student_age":21,"student_school_year":"Senior","select_student_school_year":"senior","__system":{"submissionDate":"2020-10-10T21:02:40.428Z","submitterId":"532","submitterName":"Pieter Benjamin","attachmentsPresent":0,"attachmentsExpected":0,"status":null},"meta":{"instanceID":"uuid:3ab058df-5039-41cd-b16b-5c21f01bf60b"}}],"#odata.context":"https://sandbox.central.getodk.org/v1/projects/124/forms/odata%20connector%20scheme.svc/$metadata#Submissions"}';
    var parsedResponse = JSON.parse(hardCodedData).value;
    var rows = responseToRows(requestedFields, parsedResponse);
    return {
        schema: requestedFields.build(),
        rows: rows
    };
}

// hard coded schema
function getFields(request) {
    var cc = DataStudioApp.createCommunityConnector();
    var fields = cc.getFields();
    var types = cc.FieldType;
    var aggregations = cc.AggregationType;

    fields.newDimension()
        .setId('student_name')
        .setType(types.TEXT);

    fields.newMetric()
        .setId('student_age')
        .setType(types.NUMBER);

    fields.newMetric()
        .setId('student_school_year')
        .setType(types.TEXT);

    fields.newDimension()
        .setId('submissionDate')
        .setType(types.YEAR_MONTH_DAY);

    return fields;
}

function isAdminUser() {
    return true;
}
appsscript.json:
{
    "timeZone": "America/Los_Angeles",
    "dependencies": {
    },
    "webapp": {
        "access": "ANYONE",
        "executeAs": "USER_ACCESSING"
    },
    "oauthScopes": ["https://www.googleapis.com/auth/script.external_request"],
    "runtimeVersion": "V8",
    "dataStudio": {
        "name": "ODK central API connector",
        "logoUrl": "https://getodk-a3b1.kxcdn.com/uploads/default/original/2X/3/381d364b5dd1069f6540bbd7a38ea48f11023ae9.jpg",
        "company": "UW Impact++",
        "companyUrl": "https://sites.google.com/view/udubimpact",
        "addonUrl": "https://github.com/UDub-Impact/OData-Connector/blob/master/readme.md",
        "supportUrl": "https://github.com/googledatastudio/community-connectors/issues",
        "description": "Get your data from ODK central",
        "sources": ["npm"],
        "templates": {
            "default": "1twu0sHjqR5dELAPyGJcw4GS3-D0_NTrQ"
        }
    }
}
As of 11/3/2020, this problem of not being able to use PATH_USER_PASS has been resolved. I didn't change any code but that option of authentication works now.
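For anyone setting this up now, here is a minimal sketch of the PATH_USER_PASS variant of the auth functions (field names per the Community Connector auth documentation; the property keys used for storage are placeholders, not part of the original connector):
function getAuthType() {
    var cc = DataStudioApp.createCommunityConnector();
    return cc.newAuthTypeResponse()
        .setAuthType(cc.AuthType.PATH_USER_PASS)
        .build();
}

function setCredentials(request) {
    // For PATH_USER_PASS the credentials arrive under request.pathUserPass.
    var creds = request.pathUserPass;
    var userProperties = PropertiesService.getUserProperties();
    userProperties.setProperty('dscc.path', creds.path);          // placeholder property keys
    userProperties.setProperty('dscc.username', creds.username);
    userProperties.setProperty('dscc.password', creds.password);
    return { errorCode: 'NONE' };
}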

Extending $firebaseArray with an extended $firebaseObject

Trying to cut down code repetition, I've set up a $firebaseArray extension as follows:
var listUsersFactory = $firebaseArray.$extend({
    $$added: function (snap) {
        return new Customer(snap);
    },
    $$updated: function (snap) {
        var c = this.$getRecord(snap.key);
        var updated = c.updated(snap);
        return updated;
    },
});
and the Customer code:
function Customer(snap) {
    this.$id = snap.key;
    this.updated(snap);
}

Customer.prototype = {
    updated: function(snap) {
        var oldData = angular.extend({}, this.data);
        this.data = snap.val();
        // checks and such
    }
}
This works wonders when loading, showing and saving a list of customers, and I'm satisfied with it.
Now, the problem lies in retrieving a single customer and its detail page, because the Customer object isn't an extension of $firebaseObject and is therefore lacking a $save method.
Single customer loading:
customersRef.once("value", function(snapshot) {
    if (snapshot.child(uuid).exists()) {
        customersFactory.customerDetails = new Customer(snapshot.child(uuid));
        return deferred.resolve();
    }
});
but when I call customersFactory.customerDetails.$save() I get an error
How can I extend my class so that it works for both array and single object uses?
In case anyone's wondering: I couldn't find a way to do this, so I ended up using the $firebaseArray and getting single records off it to pass as details.
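A minimal sketch of that workaround, assuming the question's customersRef and listUsersFactory names (and that uuid is the record key); the record is pulled out of the loaded array and saved back through the array's own $save:
var customers = new listUsersFactory(customersRef);
customers.$loaded().then(function () {
    // $getRecord finds the item by key, so no separate $firebaseObject is needed
    customersFactory.customerDetails = customers.$getRecord(uuid);
});

// later, after editing the record:
customers.$save(customersFactory.customerDetails); // $firebaseArray's $save accepts a record or an index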

angular push result to controller

(I was not sure what to have as a title, so if you have a better suggestion, feel free to come up with one and I will correct it.)
I am working on an angular application where I have some menus and a search result list. I also have a document view area.
You can sort of say that the application behaves like an e-mail application.
I have a few controllers:
DateCtrl: creates a list of dates so the users can choose which dates they want to see posts from.
SourceCtrl: Creates a list of sources so the user can choose from which sources he/she wants to see posts from.
ListCtrl: The controller populating the list. The data comes from an elastic search index. The list is updated every 10-30 seconds (trying to find the best interval) by using the $interval service.
What I have tried
Sources: I have tried to make this a filter, but when a user clicks two checkboxes the list is not sorted by date, but by which checkbox the user clicked first.
If it is possible to make this work as a filter, I'd rather continue doing that.
The current code is like this; it does not do what I want:
.filter("bureauFilter", function(filterService) {
return function(input) {
var selectedFilter = filterService.getFilters();
if (selectedFilter.length === 0) {
return input;
}
var out = [];
if (selectedFilter) {
for (var f = 0; f < selectedFilter.length; f++) {
for (var i = 0; i < input.length; i++) {
var myDate = input[i]._source.versioncreated;
var changedDate = dateFromString(myDate);
input[i]._source.sort = new Date(changedDate).getTime();
if (input[i]._source.copyrightholder === selectedFilter[f]) {
out.push(input[i]);
}
}
}
// return out;
// we need to sort the out array
var returnArray = out.sort(function(a,b) {
return new Date(b.versioncreated).getTime() - new Date(a.versioncreated).getTime();
});
return returnArray;
} else {
return input;
}
}
})
Date: I have found in production that this cannot be used as a filter. The list of posts shows the latest 1000 posts, which is only a third of all posts arriving each day, so this has to be changed to a date search.
I am trying something like this:
.service('elasticService', ['es', 'searchService', function (es, searchService) {
    var esSearch = function (searchService) {
        if (searchService.field === "versioncreated") {
            // doing some code
        } else {
            // doing some other type of search
        }
and a search service:
.service('searchService', function () {
    var selectedField = "";
    var selectedValue = "";
    var setFieldAndValue = function (field, value) {
        selectedField = field;
        selectedValue = value;
    };
    var getFieldAndValue = function () {
        return {
            "field": selectedField,
            "value": selectedValue
        }
    };
    return {
        setFieldAndValue: setFieldAndValue,
        getFieldAndValue: getFieldAndValue
    };
})
What I want to achieve is this:
When no dates or sources are clicked the whole list shall be shown.
When a Source or Date is clicked, it shall get the posts based on those selections.
I cannot use a filter on Date, as the application receives some 3000 posts a day, so I have to query Elasticsearch to get the posts for the selected date.
Up until now I have put the Elasticsearch call in the listController, but I am now refactoring so the search happens in a service. This is so the listController will receive the correct posts based on the selections the user has made.
Question is: What is the best pattern or method to use when trying to achieve this?
Where your data is coming from is pretty irrelevant; it's up to you to do the hook-up with your data source.
With regards to how to render a list:
The view would be:
<div ng-controller='MyController as myCtrl'>
<form>
<input name='searchText' ng-model='myCtrl.searchText'>
</form>
<ul>
<li ng-repeat='item in myCtrl.list | filter:myCtrl.searchText' ng-bind='item'></li>
</ul>
<button ng-click='myCtrl.doSomethingOnClick()'>
</div>
controller would be:
myApp.controller('MyController', ['ElasticSearchService', function(ElasticSearchService) {
    var self = this;
    self.searchText = '';
    ElasticSearchService.getInitialList().then(function(list) {
        self.list = list;
    });
    self.doSomethingOnClick = function() {
        ElasticSearchService.updateList(self.searchText).then(function(list) {
            self.list = list;
        });
    }
}]);
service would be:
myApp.service('ElasticSearchService', ['$q', function($q) {
    var obj = {};
    obj.getInitialList = function() {
        var defer = $q.defer();
        // do some elastic search stuff here
        // on success
        defer.resolve(esdata);
        // on failure
        defer.reject();
        return defer.promise; // note: promise is a property of the deferred, not a method
    };
    obj.updateList = function(param) {
        var defer = $q.defer();
        // do some elastic search stuff here
        // on success
        defer.resolve(esdata);
        // on failure
        defer.reject();
        return defer.promise;
    };
    return obj;
}]);
This code has NOT been tested but gives you an outline of how you should approach this. $q is used because promises allow things to be dealt with asynchronously.

Angular - Organise controller, factory and "class"

I would like to understand how to have a nice organisation in my angular project.
[see code below]
Does it make sense to have the getFireList function in the factory? Or should I put it in the controller?
Does the "class" Fire make sense? Should I remove it? Should I move it to the controller? Should I move it to the factory?
If you see anything wrong in this code, I'm really interested to learn more.
For now, I've got this:
A class "Fire" to create new objects of type Fire.
function Fire(p_power) {
    // ATTRIBUTES
    this.id = null;
    this.power = p_power;
    this.position = {
        x: null,
        y: null
    }

    // GETTERS/SETTERS
    // id
    this.getId = function() {
        return this.id;
    }
    this.setId = function(p_id) {
        this.id = p_id;
    }
    // power
    this.getPower = function() {
        return this.power;
    }
    this.setPower = function(p_power) {
        this.power = p_power;
    }
    // position
    this.getPosition = function() {
        return this.position;
    }
    this.setPosition = function(p_position) {
        this.position = p_position;
    }

    // METHODS
    this.increasePower = function(p_plus) {
        this.power += p_plus;
    }
    this.decreasePower = function(p_minus) {
        this.power -= p_minus;
    }
}
A controller
simuApp.controller('FireController', function($scope, FireFactory) {
    // ...
});
And a factory
simuApp.factory('FireFactory', function($http) { // $http must be injected here for the call below to work
    return {
        fire_list: [],
        getFireList: function() {
            return $http.get(site_url + 'fire/fireList')
                .then(
                    function(success) {
                        var data = success.data;
                        var fires = [];
                        var fire_tmp;
                        for (var i = 0; i < data.length; i++) {
                            fire_tmp = new Fire(data[i].power);
                            fire_tmp.setId(data[i].idFire);
                            fires.push(fire_tmp);
                        }
                        fire_list = fires;
                        return fire_list;
                    }, function(err) {
                        // ...
                    }
                );
        }
    }
});
Thanks for your help.
First, let's get the terminology right. .factory is a method to register a function that generates an instance of the service - hence "factory". What it generates, though, is a singleton service instance.
So, the service you create would be more properly named as FireSvc (as opposed to FireFactory), whereas the function that creates it could have the word "factory" in it (although, in the case below, that function name is not really needed - it could just be an anonymous function):
.factory("FireSvc", function FireSvcFactory(){
});
It is a good practice to use a Service to abstract away any domain/business logic from the controller. Keep the controller thin, responsible only to define the ViewModel, and react to events by changing the ViewModel or invoking functions on the Model.
So, having FireSvc.getFireList() makes sense.
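For example, a minimal sketch of what that thin controller might look like (FireSvc is the renamed service from above, and getFireList() is assumed to return a promise):
simuApp.controller('FireController', ['FireSvc', function (FireSvc) {
    var vm = this;
    vm.fires = [];

    // the controller only asks the service for data and updates the view model
    FireSvc.getFireList().then(function (fires) {
        vm.fires = fires;
    });
}]);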
Now, whether the list is a collection of plain objects, or instances of Fire is completely independent of Angular and is entirely up to you. In any case, it is too broad of a topic to discuss in a SO answer.

Fetch a Backbone.Collection made up of other collections

So I have a few types of data:
Post
Project
Event
And each of those data models have their own collection and a route to view them:
/posts => app.postsCollection
/projects => app.projectsCollection
/events => app.eventsCollection
Now I want to add another route:
/ => app.everythingCollection
How can I create a collection which displays an aggregate of the other three collections, but without fetching all the post, project and event data again?
Similarly, calling everythingCollection.fetch() would fill the postsCollection, projectsCollection and eventsCollection so that their data was available when they were rendered independently.
The whole point being never to download the same data twice.
Your app.everythingCollection doesn't have to be a real Backbone collection. All it needs is access to the other collections and the ability to fetch them.
You can mix in Backbone.Events to gain all the event facilities as well.
var fetchedRecords = {posts: 0, projects: 0, events: 0};

var Everything = function () {};

_.extend(Everything.prototype, Backbone.Events, {
    fetch: function (option) {
        var that = this;
        this.count = 0;
        option.success = function () { that.doneFetch.apply(that, arguments); };
        if (fetchedRecords.posts == 0) {
            option.fetchedName = "posts";
            app.postsCollection.fetch(option);
            this.count++;
        }
        if (fetchedRecords.projects == 0) {
            option.fetchedName = "projects";
            app.projectsCollection.fetch(option);
            this.count++;
        }
        if (fetchedRecords.events == 0) {
            option.fetchedName = "events";
            app.eventsCollection.fetch(option);
            this.count++;
        }
    },
    doneFetch: function (collection, response, options) {
        if (this.count <= 0) return;
        this.count--;
        if (this.count == 0) {
            if (options.reset) this.trigger("reset");
        }
        fetchedRecords[options.fetchedName]++;
    },
    posts: function () { return app.postsCollection; },
    projects: function () { return app.projectsCollection; },
    events: function () { return app.eventsCollection; }
});

app.everythingCollection = new Everything();
everythingView.listenTo(app.everythingCollection, "reset", everythingView.render);
app.everythingCollection.fetch({reset: true});
You will need to increment the fetchedRecords count to prevent fetching multiple times.
Something like this. The code is untested, but the idea is the same.
var EverythingCollection = Backbone.Collection.extend({
    customFetch: function () {
        var collections = [app.postsCollection, app.projectsCollection, app.eventsCollection],
            index = -1,
            collection,
            that = this;
        this.reset(); // clear everything collection.

        // this function checks collections one by one whether they have data or not.
        // If a collection doesn't have any data, go and fetch it.
        function checkCollection() {
            if (index >= collections.length) { // at this point all collections have data.
                fillEverything();
                return;
            }
            index = index + 1;
            collection = collections[index];
            if (collection && collection.models.length === 0) { // if no data in collection.
                collection.fetch({success: function () {
                    checkCollection();
                }});
            } else { // if collection has data already, go to next collection.
                checkCollection();
            }
        }

        function fillEverything() {
            collections.forEach(function (collection) {
                if (collection) {
                    that.add(collection.models); // refer to http://backbonejs.org/#Collection-add
                }
            });
        }

        checkCollection(); // start walking through the collections
    }
});
Use it like below:
app.everythingCollection = new EverythingCollection();
app.everythingCollection.customFetch();
For other collections, check the models length before fetching data. Something like below:
if (app.postsCollection.models.length === 0) {
    app.postsCollection.fetch();
}
Store all necessary collections in an array or object at app startup, attach an event listener to each of them listening for the first reset event, and remember the ones you fetched in a second array. When the route where you need all collections is used, you can fetch the ones not found in the array of already-fetched collections:
(untested, but it will give you the idea of how I would do it)
var allCollections = [app.postsCollection, app.projectsCollection, app.eventsCollection];
var fetchedCollections = [];

$.each(allCollections, function(i, coll){
    coll.once("reset", function(){
        fetchedCollections.push(coll);
    });
});

var fetchAll = function(){
    $.each(allCollections, function(i, coll){
        if( $.inArray(coll, fetchedCollections) == -1 ){
            coll.fetch();
        }
    });
}
Do this in your everythingCollection and you have the everythingCollection.fetchAll() functionality you need. You could also override the fetch function of the everythingCollection to first fetch all other collections:
fetch: function(options){
    this.fetchAll();
    return Backbone.Collection.prototype.fetch.call(this, options);
}
It sounds like braddunbar's supermodel or benvinegar's backbone.uniquemodel might address your problem.
It's also worth checking out Soundcloud's article (see Sharing Models Between Views) on building Soundcloud next. They have a similar approach to the above two plugins in solving this problem.
