How to set up client-side metadata for a SPA using Breeze and AngularJS

Hello all. I am working on a small report-library SPA using Angular and Breeze that allows a user to manage reports they create. I know you can use Breeze along with the EF context to build the model by referencing the API, but given how this is being implemented I want to leave EF out of it. The API (Web API 2) is basically calling other repositories to do the work, almost like an interface, and the result coming back is just a JSON object. I have looked over the Edmunds sample from the Breeze website and I can see how to build a client model as well as handle the mapping on the return. The current issue is that I am not certain I have the mapping in the jsonResultsAdapter correct, or I may be missing something about how the mapping is supposed to work.
I'm currently testing this with some mock data I have stubbed into the API until I can get it to work. Once this is bound and mapping correctly I can go against the actual data. Here is what I have so far:
The mock data is a report object that contains an internal collection called labels (tags, basically). A report can have multiple labels, and a label can be attached to multiple reports.
//report dto
public class ReportDto
{
    public Int64 ReportId { get; set; }
    public string ReportName { get; set; }
    public string ReportDescription { get; set; }
    public string ReportDateCreated { get; set; }
    public string ReportOwner { get; set; }
    public IEnumerable<ReportLabelDto> ReportLabels { get; set; }
}

public class ReportLabelDto
{
    public Int64 LabelId { get; set; }
    public string LabelName { get; set; }
    public bool IsPrivate { get; set; }
    public bool IsFavorite { get; set; }
    public IEnumerable<ReportDto> Reports { get; set; } //placeholder?
}
Here is the code currently being used within the Web API controller, which at this time is just for testing:
[Route ("reportlibrary/myreports/{userid}")]
public IEnumerable<ReportDto> GetAllReports(string userId)
{
List<ReportDto> result = new List<ReportDto>();
List<ReportLabelDto> label = new List<ReportLabelDto>();
//create 5 reports
for (int i = 0; i < 5; i++)
{
ReportDto r = new ReportDto();
ReportLabelDto l = new ReportLabelDto();
r.ReportId = i;
r.ReportOwner = "John Smith";
r.ReportDateCreated = DateTime.Now.ToString();
r.ReportDescription = "Report Description # " + i.ToString();
r.ReportName = "Report Description # " + i.ToString();
//generate labels
l.LabelId = i;
l.LabelName = "Special Label" + i.ToString();
l.IsPrivate = true;
l.IsFavorite = false;
label.Add(l);
r.ReportLabels = label;
result.Add(r);
}
return result;
}
The object that is currently coming back looks like this:
[{"ReportId":0,"ReportName":"Report Description # 0","ReportDescription":"Report Description # 0","ReportDateCreated":"12/22/2014 6:32:05 PM","ReportOwner":"John Smith","ReportLabels":[{"LabelId":0,"LabelName":"Special Label0","IsPrivate":true,"IsFavorite":false,"Reports":null},{"LabelId":1,"LabelName":"Special Label1","IsPrivate":true,"IsFavorite":false,"Reports":null},{"LabelId":2,"LabelName":"Special Label2","IsPrivate":true,"IsFavorite":false,"Reports":null},{"LabelId":3,"LabelName":"Special Label3","IsPrivate":true,"IsFavorite":false,"Reports":null},{"LabelId":4,"LabelName":"Special Label4","IsPrivate":true,"IsFavorite":false,"Reports":null}]},{"ReportId":1,"ReportName":"Report Description # 1","ReportDescription":"Report Description # 1","ReportDateCreated":"12/22/2014 6:32:05 PM","ReportOwner":"John Smith","ReportLabels":[{"LabelId":0,"LabelName":"Special Label0","IsPrivate":true,"IsFavorite":false,"Reports":null},{"LabelId":1,"LabelName":"Special Label1","IsPrivate":true,"IsFavorite":false,"Reports":null},{"LabelId":2,"LabelName":"Special Label2","IsPrivate":true,"IsFavorite":false,"Reports":null},{"LabelId":3,"LabelName":"Special Label3","IsPrivate":true,"IsFavorite":false,"Reports":null},{"LabelId":4,"LabelName":"Special Label4","IsPrivate":true,"IsFavorite":false,"Reports":null}]},...]
I have the services and controllers all talking and I can hit the API and get an object returned, so I am going to omit that code for now.
For the JS model I defined the report object as follows:
app.factory('model', function () {
    var DT = breeze.DataType;

    return {
        initialize: initialize
    }

    function initialize(metadataStore) {
        metadataStore.addEntityType({
            shortName: "Report",
            namespace: "Inform",
            dataProperties: {
                reportid: { dataType: DT.Int64, isPartOfKey: true },
                reportname: { dataType: DT.String },
                reportdescription: { dataType: DT.String },
                reportdatecreated: { dataType: DT.String },
                reportowner: { dataType: DT.String },
                mappedlabels: { dataType: DT.Undefined },
                ishared: { dataType: DT.Bool },
                isfavorite: { dataType: DT.Bool }
            },
            navigationProperties: {
                labels: {
                    entityTypeName: "Label:#Inform", isScalar: false,
                    associationName: "Report_Labels"
                }
            }
        });

        metadataStore.addEntityType({
            shortName: "Label",
            namespace: "Inform",
            dataProperties: {
                labelid: { dataType: DT.Int64, isPartOfKey: true },
                reportid: { dataType: DT.Int64 },
                labelname: { dataType: DT.String },
                ispublic: { dataType: DT.Bool },
                mappedreports: { dataType: DT.Undefined }
            },
            navigationProperties: {
                labels: {
                    entityTypeName: "Report:#Inform", isScalar: false,
                    associationName: "Report_Labels", foreignKeyNames: ["reportid"]
                }
            }
        });
    }
})
This is where I think the issue is. I don't understand this adapter well enough to be sure that I am receiving what I think I am, or that it is handling the mapping correctly:
/* jsonResultsAdapter: parses report data into entities */
app.value('jsonResultsAdapter',
    new breeze.JsonResultsAdapter({
        name: "inform",

        extractResults: function (data) {
            var results = data.results;
            if (!results) throw new Error("Unable to resolve 'results' property");
            // Parse only the make and model types
            return results && (results.reportHolder || results.labelHolder);
        },

        visitNode: function (node, parseContext, nodeContext) {
            //Report parser
            if (node.reportid && node.labels) {
                node.mappedlabels = node.labels;
                node.labels = [];
                return { entityType: "Report" }
            }
            // Label parser
            else if (node.labelid && node.reports) {
                node.mappedreports = node.reports;
                node.mappedreports = [];
                return { entityType: "Label" };
            }
        }
    }));
When I step through the code in Chrome I can see that an object is returned, with 5 reports, and each report has 5 labels (I know the labels are showing null reports currently). When I set breakpoints within the jsonResultsAdapter I can see the result with 5 objects, but what gets passed back to the service as a result is null. Can anyone help me verify whether the model and mapping are correct, or whether you see anything out of place in the jsonResultsAdapter? I'd also appreciate any suggestions on things I may want to do differently. I feel very black-box right now, as I don't see/understand a good way to troubleshoot this mapping piece.
-cheers

Here I'll pick up on some of PW Kad's observations and add a few of my own.
Let's first understand the different roles of metadata and the JsonResultsAdapter.
Metadata is where you define the schema and validation rules for the client-side entity model. It describes the entity objects that Breeze keeps in cache and makes available to your program.
But the metadata have nothing to say about the JSON payload arriving from the server. That's a completely separate and lower level concern. That's the concern of the JsonResultsAdapter.
The JsonResultsAdapter sits in the pipeline between the JSON data arriving from the server as a result of an AJAX call ... and the entities in cache. The JSON data don't have to be shaped like the entities. They don't have to conform to the metadata you wrote. The metadata describe the entities as you would like to consume them. The JSON are the sad reality that the service gives you. The JsonResultsAdapter is there to bridge the gap.
Whether the entity schema conforms to the shape of the JSON payload is anyone's guess. Often the JSON data needs a little tweaking. It's the JsonResultsAdapter's job to manipulate the JSON "nodes" into something that Breeze can map into your entities. The job is easier if the JSON payload closely approximates the entity shape described by your metadata. Let's hope your JSON aligns well with your entities.
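To make that pipeline concrete, here is a minimal wiring sketch, not taken from your code: the service name is hypothetical, and the 'model' and 'jsonResultsAdapter' factories are the ones you defined elsewhere in this question.

app.factory('entityManagerFactory', function (model, jsonResultsAdapter) {
    // Assumed endpoint; hasServerMetadata: false because the metadata is hand-written.
    var dataService = new breeze.DataService({
        serviceName: 'breeze/reportlibrary',
        hasServerMetadata: false,
        jsonResultsAdapter: jsonResultsAdapter
    });

    function newManager() {
        var manager = new breeze.EntityManager({ dataService: dataService });
        model.initialize(manager.metadataStore); // register the client-side entity types
        return manager;
    }

    return { newManager: newManager };
});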
Metadata and materialization
Now Breeze does use the metadata when mapping the JSON into the entities. The MetadataStore has a NamingConvention that prescribes how to translate between the client entity property names and the service property names. The "materialization" process expects the JSON emerging from the JsonResultsAdapter to have the expected service property names. That's why I was adamant that the node property names (if you need them) be spelled in PascalCase ... assuming that you are using the standard Breeze camelCase convention and that your service does, in fact, spell property names in PascalCase.
Most C# and Java servers do. Rails and Node servers generally don't; they use camelCase on the server too ... which means you'd want NamingConvention.none if you're consuming feeds from these kinds of servers.
Ideally the JsonResultsAdapter has to do very little. The JSON property names typically map easily and obviously to the entity property names and you can handle whatever translation is needed with a NamingConvention. Such appears to be the case for you (see below).
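As a small sketch (assuming the standard conventions rather than anything in your code), the camelCase convention is usually put in place before the MetadataStore is created:

// Make camelCase the default for every MetadataStore created afterwards...
breeze.NamingConvention.camelCase.setAsDefault();

// ...or attach it explicitly to a single store.
var metadataStore = new breeze.MetadataStore({
    namingConvention: breeze.NamingConvention.camelCase
});

With that in place, a server property named ReportName surfaces on the client entity as reportName.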
For sure you're not accomplishing a thing with the code you showed us:
node.ReportId = node.ReportId;
node.ReportName = node.ReportName;
node.ReportDescription = node.ReportDescription;
That is the most elaborate "no op" code I've seen in a very long time. I wonder what you had in mind.
JsonResultsAdapter is often needed when identifying the EntityType corresponding to a JSON node. If you're not sourcing the data from .NET using the Json.Net serializer, your server may not be sending the type name down with the JSON data. Your JSON nodes are missing the $type property that Breeze is looking for by default.
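For illustration only (this is the shape Json.NET produces with type-name handling turned on; the namespace and assembly names here are made up, and it is not your actual payload), a node that does carry type information looks roughly like this:

{
    "$id": "1",
    "$type": "Inform.ReportDto, Inform.Server",
    "ReportId": 0,
    "ReportName": "Report Description # 0"
}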
If that's your situation (and it seems it is), your JsonResultsAdapter has to supply the type name.
Apparently, you can do that for your data by examining each node's key property. It seems that key property name contains within itself the distinguishing part of the type name.
Perhaps your JsonResultsAdapter.visitNode method could look like this:
visitNode: function (node, parseContext, nodeContext) {
    //Report parser
    if (node.ReportId != null) {
        return { entityType: "Report:#Inform" }
    }
    // Label entity
    else if (node.LabelId != null) {
        return { entityType: "Label:#Inform" };
    }
}
Notice that I included the namespace (:#Inform) in the entityType name property. The namespace is part of each EntityType's full name ... and you must supply it.
Notice that I did NOT do any property name mapping. I didn't see any reason for it. The node property names look just like the entity metadata names ... except for the PascalCasing ... and we take care of that with the NamingConvention.camelCase.
Bad Metadata?
Well actually the node property names do NOT look like the entity property names in your metadata, not even after accounting for the Pascal-to-camel-case conversion. I think this is what PW Kad was pointing out.
The problem is that the entity property names in your metadata are all lower case. Not camelCase; lower case. For example:
reportid: { dataType: DT.Int64, isPartOfKey: true },
reportname: { dataType: DT.String },
reportdescription: ...
Shouldn't they be:
reportId: { dataType: DT.Int64, isPartOfKey: true },
reportName: { dataType: DT.String },
reportDescription: ...
That would correspond nicely to your JSON property names:
ReportId
ReportName
ReportDescription
Why would you want all-lowercase property names on the client?
You could go all lowercase and write a really wacky custom NamingConvention to navigate between client entity names and service names. That's a lot of work to no purpose in my book.
Why is there no $type in your JSON?
I just scrolled to the top of this question and realized that your server is written in C# and it looks like you're using the Web API.
Why did you not annotate your Web API controller with the [BreezeController] attribute? Doing so would have configured your controller to serialize the data in the manner a Breeze client understands by default. You might not need a JsonResultsAdapter at all.
Don't change the type name!
Looking again I see yet another problem looming ahead. Your server-side class names have the suffix "Dto" but you don't want that suffix on your client type names. You are also changing the type name completely in one case: "ReportLabelDto" to "Label".
Breeze has a naming convention for morphing property names. It doesn't have a naming convention for "entity type" names.
It will be a royal pain if you insist on having different type names on the client and the server. I'm not sure it can be done.
Yes you can morph the entity name in the JsonResultsAdapter. That covers communications on the way in. But you also have to worry about the communications on the way out. The server is not going to be happy when you ask it to save an entity of class "Label" ... which it knows nothing about.
As I write I can't think of an easy way around this. At the moment, Breeze requires that the server-side type name be the same as the client EntityType name. If the type name on the server is "ReportLabelDto", you'll have to name the corresponding EntityType "ReportLabelDto". There is no easy way around that.
Fortunately, unlike property names which show up everywhere, you don't often refer to the EntityType name on the client so calling it "ReportLabelDto" shouldn't be a big deal.
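If you go that route, a hedged sketch of the Label metadata with the server's type name kept intact might look like this (the property names and the Inform namespace are assumed from your question; DataType.Boolean is the Breeze name for boolean properties):

metadataStore.addEntityType({
    shortName: "ReportLabelDto",   // same name as the server-side class
    namespace: "Inform",
    dataProperties: {
        labelId:    { dataType: DT.Int64, isPartOfKey: true },
        labelName:  { dataType: DT.String },
        isPrivate:  { dataType: DT.Boolean },
        isFavorite: { dataType: DT.Boolean }
    }
});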

It looks like the properties you are defining are neither properly camelCased nor PascalCased. Breeze.js will look for properties that match, but unless I am missing something in what you have defined, it will not toLowerCase them for you.
You need to set your model properties up like this -
ReportName: { dataType: DT.String },
and then in your results adapter you need to check for the property names properly like this -
if (node.ReportId && node.Labels) {

Thanks to PW Kad for pointing me in the right direction. I had forgotten the case-sensitivity portion. In addition, I went back and reassessed what I was trying to do in the jsonResultsAdapter.js file. I finally realized this file works much like AutoMapper; I was under the impression that Breeze would internally resolve the mappings (maybe it does, but with the EF context), but when creating client-side metadata I had to explicitly set the mappings. The updated code now looks like this:
app.value('jsonResultsAdapter',
    new breeze.JsonResultsAdapter({
        name: "inform",

        extractResults: function (data) {
            var results = data.results;
            if (!results) throw new Error("Unable to resolve 'results' property");
            return results;
        },

        visitNode: function (node, parseContext, nodeContext) {
            //Report parser
            if (node) {
                node.ReportId = node.ReportId;
                node.ReportName = node.ReportName;
                node.ReportDescription = node.ReportDescription;
                node.ReportDateCreated = node.ReportDateCreated;
                node.ReportOwner = node.ReportOwner;
                node.ReportLabels = node.ReportLabels;
                node.ReportLabels = [];
                node.IsShared = node.IsShared;
                node.IsFavorite = node.IsFavorite;
                return { entityType: "Report" }
            }
            // Label parser
            else if (node.ReportLabels) {
                node.LabelId = node.LabelId;
                node.LabelName = node.LabelName;
                node.IsPrivate = node.IsPrivate;
                node.IsFavorite = node.IsFavorite;
                node.Reports = node.Reports;
                node.Reports = [];
                return { entityType: "Label" };
            }
        }
    }));
I know I more than likely still have some tweaking to do on the mappings, or on how the result is parsed from the API, but making this change properly mapped the mock data and allowed it to bind/display within the UI.
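In case it helps anyone else verify the same thing, here is a rough sketch of how I could inspect the materialized entities after a query (the resource name is made up, and 'manager' is the EntityManager created elsewhere in the app):

breeze.EntityQuery.from('reportlibrary/myreports/123')
    .using(manager)
    .execute()
    .then(function (data) {
        var first = data.results[0];
        console.log(first.entityAspect.entityState.name); // "Unchanged" once materialized
        console.log(first.reportName); // or whatever property name your metadata defines
    });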
Hope this helps

Related

Libgdx Load/save array in Json

I want to save an array of Mission.class, which has fields as follows:
public class Mission {
    public MissionEnum missionEnum;
    public int progress;

    public Mission(MissionEnum missionEnum, int progress) {
        this.missionEnum = missionEnum;
        this.progress = progress;
    }
}
and I also save missions in another Java class:
public void saveMissions() {
    Json json = new Json();
    json.setOutputType(JsonWriter.OutputType.json);
    json.addClassTag("Mission", Mission.class);
    FileHandle missionFile = Gdx.files.local("missions_array.json");
    missionFile.writeString(json.prettyPrint(missions), false);
}
and load missions:
public void loadMissions() {
    if (Gdx.files.local("missions_array.json").exists()) {
        try {
            FileHandle file = Gdx.files.local("missions_array.json");
            Json json = new Json();
            json.addClassTag("Mission", Mission.class);
            missions = json.fromJson(Array.class, Mission.class, file);
            for (Mission mission : missions) {
                Gdx.app.log(TAG, "Mission loaded: " + mission.missionEnum);
            }
            Gdx.app.log(TAG, "Load missions successful");
        } catch (Exception e) {
            Gdx.app.error(TAG, "Unable to read Missions: " + e.getMessage());
        }
    }
}
I got json like this:
[
    {
        "class": "Mission",
        "missionEnum": "BUY_POWERUP"
    },
    {
        "class": "Mission",
        "missionEnum": "DISTANCE_ONE_RUN_2"
    },
    {
        "class": "Mission",
        "missionEnum": "BANANA_TOTAL_2",
        "progress": 35
    }
]
However, when loadMissions() runs I see the "Load missions successful" log, but the "Mission loaded..." lines are not shown, and there is no error log. The missions appear not to be loaded properly. I do not know what went wrong, because another array is loaded successfully the same way.
Not sure why there are no errors in your logs, since while reproducing your problem I got an exception.
The problem is in the loadMissions() method: you create a new Json parser without setting the class tag:
Json json = new Json();
// add the line below
json.addClassTag("Mission", Mission.class);
missions = json.fromJson(Array.class, Mission.class, file);
....
Without the tag, the parser doesn't know what "class": "Mission" means in the json file.
Update
Another thing that may cause this issue is the args-constructor. At least, when I added it I got an exception. If you don't use it, just delete it.
Still, it's quite weird that you don't have any exceptions in your logs, because I definitely do.
Updated response:
Add an empty constructor and read this and this
You will either have to add a no-argument constructor to (Mission), or
you will have to add a custom serializer (see
https://code.google.com/p/libgdx/wiki/JsonParsing#Customizing_serialization)
that knows how to save a (Mission) instance and knows the appropriate
constructor to invoke when reading a (Mission) back in.
public Mission() {
// Do nothing.
}
Reading & writing JSON
The class implementing Json.Serializable must have a zero argument
constructor because object construction is done for you.
Alternatively, delete the unused constructor. I think Enigo's answer is also correct, so I'm going to upvote his answer.
Providing Constructors for Your Classes
You don't have to provide any constructors for your class, but you
must be careful when doing this. The compiler automatically provides a
no-argument, default constructor for any class without constructors.
This default constructor will call the no-argument constructor of the
superclass. In this situation, the compiler will complain if the
superclass doesn't have a no-argument constructor so you must verify
that it does. If your class has no explicit superclass, then it has an
implicit superclass of Object, which does have a no-argument
constructor.
Note: I didn't test our responses; I have not developed games or used libgdx in the last two years.
Also read this libgdx issue: Json - constructor default value with Enum:
I don't know if this would be called a bug but, I have a case where I
have an enum like this;
Then I have a class with 2 constructors;
This second gets called by my framework, the first by Json
deserialization.
...
Previous response:
I guess that the missing progress field in some Mission instances could be the source of the issue, but it would be interesting to read the error logs to be sure.
I followed this, this and this to confirm it, but it's hard without extra information about the exact log error.

best practice for overriding ExtJS model getters & setters

I want to intercept field get/set on a model so that I can transform the values before they get displayed, as well as before they get stored in the model. In this case, I want to store URI-encoded data in the model but display decoded values.
When I directly override the get/set methods I see that they are also being used by the proxy to put raw data into the model. I don't want to override that process, and I am not even sure how to tell whether it is the proxy loading a model or the UI.
What is the best method for this? convert, by the way, appears to be a poor choice: it is not two-way and live.
Note: I have not tested this code, so use the concepts and comment later if you have questions.
When you define your Ext.data.Model class for your data entities (use Ext.data.Record for Ext JS 3), add a convert function as part of your Ext.data.Field property instance. This can be done for all fields if you want, but probably not recommended.
I'm new to this framework, so one thing I do not know how to do is use the "this.self" notation in the convert function in case you are extending another class in your Model class. For example, if you do not have a default value defined in the EmployeeDeveloper class and it inherits from another class (like Employee, for example), but the Employee constructor sets that property, it'll take the value defined in the super class (base class or parent class) constructor when you encode the data. Right now, it just gets the EmployeeDeveloper.dataValue property.
Ext.define('EmployeeDeveloper', {
    extend: 'Ext.data.Model',
    fields: [
        { name: 'name', type: 'string', defaultValue: 'Paul' },
        { name: 'profession', type: 'string', defaultValue: 'Ext JS developer' },
        { name: 'salary', type: 'float', defaultValue: '95000' },
        {
            name: 'dataValue', type: 'string',
            convert: function (value, record) {
                // feel free to use the value and record parameters in this function
                // you may have to use the "value" parameter instead of record.get('dataValue')
                // if referencing the field in its own convert function (haven't tested)
                return encodeData(record.get('dataValue'));
            }
        }
    ]
});
var empDev = Ext.create(
    'EmployeeDeveloper',
    {
        name: 'MacGyver',
        dataValue: 'ABC'
    },
    'MacGyver',
    123
);
var encodedDataValue = empDev.get('dataValue');
Caution about convert: If you are referencing a field that is defined below the convert function signature, the code will not know about it and won't be able to retrieve the value.
OR IF YOU HAVE A CUSTOM CLASS BUT NOT USING THE MODEL CLASS...
Tonight I also read about something else that's kind of neat in Ext JS. They added auto getters and setters in Ext JS 4. The new version of the framework automatically prefixes your property with "get", "set", "reset", or "apply" and capitalizes the first letter of the property inside of the "config" property of your defined class.
In regular JavaScript, they don't have real classes, but Sencha has made it possible to define a class (with the "define" function) and the ability to instantiate an object instance of your user defined class.
For example, suppose you have the following class defined:
Ext.define('Paul.MyClass', {
    extend: 'Ext.Window',
    config: {
        name: 'Paul'
    }
});
If you create an instance of your object like so, you automatically have access to four functions.
var win = Ext.create('Paul.MyClass'); // create instance of MyClass object
var myName = win.getName(); // get name
win.setName('MacGyver'); // set name
win.resetName(); // this resets the value back to the default value ('Paul')
win.applyName(); // this calls custom code
win.show();
I'd suggest overriding the apply* method with the logic you are suggesting. Then calling win.get*(); afterward to get the new value. It might be wise to have an extra encoded field for each field you plan on encoding so you don't have to manipulate your properties every time you access (get or set) your data in your proxies and/or stores.
Ext.define('Paul.MyClass', {
    extend: 'Ext.Window',
    config: {
        name: 'Paul',
        dataValue: 'non encoded value of your data'
    },
    applyName: function (title) {
        this.name = this.name + ' ' + title; // custom logic goes here
    },
    applyDataValue: function (encodeKey) {
        // get the encoded data value
        this.self.dataValue = 'encoded data value'; // custom logic goes here
    }
});
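As a hedged, untested sketch of the same idea (the class name is made up): in the Ext JS 4 config system the applyFoo hook receives the incoming value, and whatever it returns is what actually gets stored, which makes it a natural place for URI encoding.

Ext.define('Paul.EncodedValue', {
    config: {
        dataValue: ''
    },
    constructor: function (cfg) {
        this.initConfig(cfg); // generates getDataValue/setDataValue and runs applyDataValue
    },
    applyDataValue: function (newValue) {
        return encodeURIComponent(newValue); // the encoded form is what gets stored
    }
});

var rec = Ext.create('Paul.EncodedValue', { dataValue: 'a b&c' });
console.log(rec.getDataValue());                     // "a%20b%26c" (stored, encoded)
console.log(decodeURIComponent(rec.getDataValue())); // "a b&c" for display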

Parameter must be an entity type exposed by the DomainService?

Trying to implement a domain service in a SL app and getting the following error:
Parameter 'spFolderCreate' of domain method 'CreateSharePointFolder' must be an entity type exposed by the DomainService.
[EnableClientAccess()]
public class FileUploadService : DomainService
{
    public void CreateSharePointFolder(SharePointFolderCreate spFolderCreate)
    {
        SharePointFolder spf = new SharePointFolder();
        spf.CreateFolder_ClientOM(spFolderCreate.listName, spFolderCreate.fileName);
    }
}

[OperationContract]
void CreateSharePointFolder(SharePointFolderCreate spFolderCreate);

[DataContract]
public class SharePointFolderCreate
{
    private string m_listName;
    private string m_fileName;

    [DataMember]
    public string listName
    {
        get { return m_listName; }
        set { m_listName = value; }
    }

    [DataMember]
    public string fileName
    {
        get { return m_fileName; }
        set { m_fileName = value; }
    }
}
So am I missing something simple here to make this all work?
It may be that the framework is inferring the intended operation because you have the word "Create" prefixing the function name (CreateSharePointFolder). Details of this behaviour can be found here
Although that is all fine for DomainServices and Entity Framework, following the information in that article it can be inferred that methods beginning with "Delete" will be performing a delete of an entity, and so must accept an entity as a parameter. The same is true for "Create" or "Insert" prefixed methods. Only "Get" or "Select" methods can take non-entity parameters, making it possible to pass a numeric id (for example) to a "Get" method.
Try changing your method name temporarily to "BlahSharePointFolder" to see if it is this convention of inference that's causing your problem.
Also, as there is no metadata defined for your SharePointFolderCreate DC, you might need to decorate the class (in addition to the [DataContract] attribute) with the [MetadataType] attribute. You will see how to implement this if you used the DomainServiceClass wizard and point to an EF model. There is a checkbox at the bottom for generating metadata. Somewhere in your solution.Web project you should find a domainservice.metadata.cs file. In this file, you will find examples of how to use the [MetadataType] attribute.
For the RIA WCF service to work correctly with your own methods, you need to ensure that all entities existing on the parameter list have at least one member with a [Key] attribute defined in their metadata class, and that the entity is returned somewhere on your DomainService in a "Get" method.
HTH
Lee

RIA Services and MVVM loading, a question about querying data (separating data)

First of all, sorry for the bad title, I can only describe the problem
Let's say the database on the server has a table/type called Tasks
and these tasks can be owned by a user and assigned to a user.
SomeTask.Owner = SomeUser
SomeTask.Assignee = SomeOtherUser
In the server some additional queries are defined:
public IQueryable<Task> GetAssignedTasks(int UserId) { /* gets the assigned tasks */ };
public IQueryable<Task> GetOwnedTasks(int UserId) { /* gets the owned tasks */ };
In the ViewModel these could be loaded as such:
var ownedTasksQuery = context.GetOwnedTasksQuery(userId);
context.Load(ownedTasksQuery);
var assignedTasksQuery = context.GetAssignedTasksQuery(userId);
context.Load(assignedTasksQuery);
The problem here is that both results get loaded into the context,
i.e., context.Tasks contains the union of both query results.
My first thought here was to simply change the getters for the properties in my ViewModel:
public IEnumerable<Task> OwnedTasks
{
    get { return context.Tasks.Where(t => t.UserId == userId); }
}

public IEnumerable<Task> AssignedTasks
{
    get { return context.Tasks.Where(t => t.UserId == userId); }
}
However, when I bind the view to these properties, nothing is returned,
whereas if I were to use the following, all loaded records are returned (obviously):
public IEnumerable<Task> OwnedTasks
{
    get { return context.Tasks; }
}

public IEnumerable<Task> AssignedTasks
{
    get { return context.Tasks; }
}
I'm guessing I'm going about this completely the wrong way;
what's the correct way to handle a situation like this?
Update: Or should I simply handle this by creating another instance of the context?
Update: Seems I was going at this the wrong way...
I'm still thinking in terms of classic database queries...
All I have to do to solve my problem here is load the User including the assigned and owned tasks...
ObjectContext.Users.Include("OwnedTasks").Include("AssignedTasks")
Will make this community wiki, in case somebody else does the same thing.
Did you mean to put Where(t => t.UserId = userId) or did you mean:
Where(t => t.UserId == userId);
i.e. you used an assignment operator instead of compare. (LINQ converts "==" to "=" in SQL behind the scenes, but you must use C# operators).

How to achieve "Blendability" when using DataServiceCollection in my ViewModel

I'm looking at using oData endpoints in my Silverlight client. Naturally, I'm doing MVVM and I want the project to be nice and "Blendable" (i.e. I must be able to cleanly use static data instead of the oData endpoints when in design mode.)
Now to the problem. I'd like to use the DataServiceCollection in my ViewModels, since it allows for nice bindable collections without having to worry too much with BeginExecute/EndExecute etc.
Now, let's look at some code. My Model interface looks like this:
public interface ITasksModel
{
    IQueryable<Task> Tasks { get; }
}
The oData endpoint implementation of that interface:
public class TasksModel : ITasksModel
{
    Uri svcUri = new Uri("http://localhost:2404/Services/TasksDataService.svc");
    TaskModelContainer _container;

    public TasksModel()
    {
        _container = new TaskModelContainer(svcUri);
    }

    public IQueryable<Task> Tasks
    {
        get
        {
            return _container.TaskSet;
        }
    }
}
And the "Blendable" design-time implementation:
public class DesignModeTasksModel : ITasksModel
{
    private List<Task> _taskCollection = new List<Task>();

    public DesignModeTasksModel()
    {
        _taskCollection.Add(new Task() { Id = 1, Title = "Task 1" });
        _taskCollection.Add(new Task() { Id = 2, Title = "Task 2" });
        _taskCollection.Add(new Task() { Id = 3, Title = "Task 3" });
    }

    public IQueryable<Task> Tasks
    {
        get
        {
            return _taskCollection.AsQueryable();
        }
    }
}
However, when I try to use this last one in my ViewModel constructor:
public TaskListViewModel(ITasksModel tasksModel)
{
    _tasksModel = tasksModel;
    _tasks = new DataServiceCollection<Task>();
    _tasks.LoadAsync(_tasksModel.Tasks);
}
I get an exception:
Only a typed DataServiceQuery object can be supplied when calling the LoadAsync method on DataServiceCollection.
First of all, if this is the case, why not make the input parameter of LoadAsync be typed as DataServiceQuery?
Second, what is the "proper" way of doing what I'm trying to accomplish?
The reason LoadAsync requires a DataServiceQuery is that plain IQueryable doesn't define an asynchronous way of executing the query. The reason the method takes the IQueryable type as its parameter is so that users don't have to cast the query object to DataServiceQuery explicitly (it makes the code shorter), and since we assume that users will try to run their code at least once, they would see the error immediately (as you did).
LoadAsync only supports asynchronous operations, so it needs the DataServiceQuery. If you already have the results (without a need to execute an async request) you can call the Load method instead, which is the answer to your second question. Instead of calling LoadAsync for both design time and run time, you could use Load for design time and LoadAsync for run time. But due to tracking constraints you might need to create the DataServiceCollection in a different way.
Something like this:
DataServiceCollection<Task> dsc;
DataServiceQuery<Task> dsq = _tasksModel.Tasks as DataServiceQuery<Task>;
if (dsq != null)
{
    dsc = new DataServiceCollection<Task>();
    dsc.LoadAsync(dsq);
}
else
{
    dsc = new DataServiceCollection<Task>(myDataServiceContext);
    dsc.Load(_tasksModel.Tasks);
    // Invoke the LoadAsyncCompleted handler here
}
If you pass the DataServiceContext to the constructor before calling Load, the entities will be tracked (just like in the LoadAsync case). If you don't need that, you can call the constructor which takes IEnumerable and TrackingMode and turn off tracking.
