Silverlight IsolatedStorage techniques for larger files?

I am using IsolatedStorage in Silverlight 3 to store some settings when a user navigates away from a page hosting the application.
Currently I'm using a DataContractSerializer to write the settings to a file. In some circumstances the resulting file is quite large, over 10 MB (much of this size is due to the serializer itself and the XML it generates). This causes problems because:
I have to request the extra space from the user
it is really slow writing the data to the file
Can anyone share some strategies they have used for dealing with larger files in IsolatedStorage?
How do you determine the likely amount of disk space you will need?
Do you use a DataContract or XML serializer and then zip the result before saving?
Or do you use some sort of binary/custom serialization? If so, did you gain any substantial space or time savings?
Is there some way of declaratively stating that your application requires a certain quota, so that the user doesn't have to be prompted at some arbitrary point?
I personally don't like writing large quantities of data to a file like this, but I need to know all the available options before I explain the issues to a product manager and persuade them to change the requirements.
Thanks!

slugster,
You may want to consider switching over to XMLSerializer instead. Here is what I have determined over time:
The XMLSerializer and DataContractSerializer classes provide a simple means of serializing and deserializing object graphs to and from XML.
The key differences are:
1.
XMLSerializer has a much smaller payload than DCS if you use [XmlAttribute] instead of [XmlElement]
DCS always stores values as elements
2.
DCS is "opt-in" rather than "opt-out"
With DCS you explicitly mark what you want to serialize with [DataMember]
With DCS you can serialize any field or property, even if they are marked protected or private
With DCS you can use [IgnoreDataMember] to have the serializer ignore certain properties
With XMLSerializer public properties are serialized, and need setters to be deserialized
With XmlSerializer you can use [XmlIgnore] to have the serializer ignore public properties
3.
BE AWARE! DCS.ReadObject DOES NOT call constructors during deserialization
If you need to perform initialization, DCS supports the following callback hooks:
[OnDeserializing], [OnDeserialized], [OnSerializing], [OnSerialized]
(also useful for handling versioning issues)
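For example, here is a minimal sketch of the [OnDeserializing] hook (the Settings/Theme/Cache names are placeholders for illustration; it needs using System.Runtime.Serialization and System.Collections.Generic):
[DataContract]
public class Settings
{
    [DataMember]
    public string Theme { get; set; }

    // Not serialized; must be rebuilt after deserialization
    [IgnoreDataMember]
    public Dictionary<string, string> Cache { get; set; }

    // DCS calls this instead of a constructor, so initialize non-serialized state here
    [OnDeserializing]
    private void OnDeserializing(StreamingContext context)
    {
        Cache = new Dictionary<string, string>();
    }
}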
If you want the ability to switch between the two serializers, you can use both sets of attributes simultaneously, as in:
[DataContract]
[XmlRoot]
public class ProfilePerson : NotifyPropertyChanges
{
    [XmlAttribute]
    [DataMember]
    public string FirstName { get { return m_FirstName; } set { SetProperty(ref m_FirstName, value); } }
    private string m_FirstName;

    [XmlElement]
    [DataMember]
    public PersonLocation Location { get { return m_Location; } set { SetProperty(ref m_Location, value); } }
    private PersonLocation m_Location = new PersonLocation(); // Should change over time

    [XmlIgnore]
    [IgnoreDataMember]
    public Profile ParentProfile { get { return m_ParentProfile; } set { SetProperty(ref m_ParentProfile, value); } }
    private Profile m_ParentProfile = null;

    public ProfilePerson()
    {
    }
}
Also, check out my Serializer class that can switch between the two:
using System;
using System.IO;
using System.Runtime.Serialization;
using System.Text;
using System.Xml;
using System.Xml.Serialization;

namespace ClassLibrary
{
    // Instantiate this class to serialize objects using either XmlSerializer or DataContractSerializer
    internal class Serializer
    {
        private readonly bool m_bDCS;

        internal Serializer(bool bDCS)
        {
            m_bDCS = bDCS;
        }

        internal TT Deserialize<TT>(string input)
        {
            // ToByteArray() is a custom string extension (not shown here)
            MemoryStream stream = new MemoryStream(input.ToByteArray());
            if (m_bDCS)
            {
                DataContractSerializer dc = new DataContractSerializer(typeof(TT));
                return (TT)dc.ReadObject(stream);
            }
            else
            {
                XmlSerializer xs = new XmlSerializer(typeof(TT));
                return (TT)xs.Deserialize(stream);
            }
        }

        internal string Serialize<TT>(object obj)
        {
            MemoryStream stream = new MemoryStream();
            if (m_bDCS)
            {
                DataContractSerializer dc = new DataContractSerializer(typeof(TT));
                dc.WriteObject(stream, obj);
            }
            else
            {
                XmlSerializer xs = new XmlSerializer(typeof(TT));
                xs.Serialize(stream, obj);
            }
            // Be aware that the Unicode byte-order mark will be at the front of the string.
            // ToUtfString() is a custom byte[] extension (not shown here).
            return stream.ToArray().ToUtfString();
        }

        internal string SerializeToString<TT>(object obj)
        {
            StringBuilder builder = new StringBuilder();
            XmlWriter xmlWriter = XmlWriter.Create(builder);
            if (m_bDCS)
            {
                DataContractSerializer dc = new DataContractSerializer(typeof(TT));
                dc.WriteObject(xmlWriter, obj);
            }
            else
            {
                XmlSerializer xs = new XmlSerializer(typeof(TT));
                xs.Serialize(xmlWriter, obj);
            }
            xmlWriter.Flush(); // flush the writer so the StringBuilder contains the full document

            // Strip the XML declaration and namespace noise to shrink the payload.
            // RegexHelper is a custom helper class (not shown here).
            string xml = builder.ToString();
            xml = RegexHelper.ReplacePattern(xml, RegexHelper.WildcardToPattern("<?xml*>", WildcardSearch.Anywhere), string.Empty);
            xml = RegexHelper.ReplacePattern(xml, RegexHelper.WildcardToPattern(" xmlns:*\"*\"", WildcardSearch.Anywhere), string.Empty);
            xml = xml.Replace(Environment.NewLine + " ", string.Empty);
            xml = xml.Replace(Environment.NewLine, string.Empty);
            return xml;
        }
    }
}
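Usage is then symmetrical regardless of which serializer you pick (a quick sketch; it assumes the custom ToByteArray/ToUtfString string helpers referenced in the class exist in your project):
// Hypothetical round trip with the Serializer class above
Serializer serializer = new Serializer(true); // true = DataContractSerializer, false = XmlSerializer
string xml = serializer.Serialize<ProfilePerson>(somePerson);
ProfilePerson copy = serializer.Deserialize<ProfilePerson>(xml);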
Good Luck,
Jim McCurdy
Face To Face Software and YinYangMoney

Another alternative is to zip the contents of the XML serialization. We also have a large serialization that has a rough compression ratio of 10-to-1. Of course the compression can take a fair bit of CPU to do its magic, so we spawn off the compression in a thread to make sure the user interface doesn't slow down. We are using a modified SharpZipLib that works under Silverlight; the shape of the code is sketched below.
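A minimal sketch of that approach, assuming SharpZipLib's GZipOutputStream and a DataContractSerializer (error handling and the background thread omitted):
using System.IO.IsolatedStorage;
using System.Runtime.Serialization;
using ICSharpCode.SharpZipLib.GZip;

// Serialize the settings graph straight through a compressing stream into isolated storage
public static void SaveCompressed<T>(T settings, string fileName)
{
    using (IsolatedStorageFile store = IsolatedStorageFile.GetUserStoreForApplication())
    using (IsolatedStorageFileStream file = store.CreateFile(fileName))
    using (GZipOutputStream gzip = new GZipOutputStream(file))
    {
        DataContractSerializer serializer = new DataContractSerializer(typeof(T));
        serializer.WriteObject(gzip, settings);
    }
}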

Another option is to serialize to JSON. I do not know about performance, but I just compared the output when serializing a fairly complex list of entities to JSON vs. XML, and JSON is much more compact. Using JSON the resulting string was 1,301,303 bytes; with XML, 2,429,630. So it's almost half the size using JSON.
Below is the helper class I use when serializing/deserializing to json.
EDIT
I did some performance testing, and it actually turns out that JSON is faster as well. With XML, serializing 10,000 objects took 636 milliseconds; with JSON, only 257. Does anybody know if there are reasons not to choose JSON over XML?
EDIT
Tested again, with real data this time:
(1000 objects)
Uncompressed JSON: 605 KB
Uncompressed XML: 3.53 MB (!)
Zipped JSON: 28.5 KB
Zipped XML: 69.9 KB
Performance when using a pre-initialized serializer:
JSON: ~350 ms
XML: ~120 ms
using System;
using System.IO;
using System.Text;
using System.Runtime.Serialization.Json;

namespace GLS.Gui.Helper
{
    public static class SerializationHelper
    {
        public static string SerializeToJsonString(object objectToSerialize)
        {
            using (MemoryStream ms = new MemoryStream())
            {
                DataContractJsonSerializer serializer = new DataContractJsonSerializer(objectToSerialize.GetType());
                serializer.WriteObject(ms, objectToSerialize);
                ms.Position = 0;
                using (StreamReader reader = new StreamReader(ms))
                {
                    return reader.ReadToEnd();
                }
            }
        }

        public static T Deserialize<T>(string jsonString)
        {
            using (MemoryStream ms = new MemoryStream(Encoding.Unicode.GetBytes(jsonString)))
            {
                DataContractJsonSerializer serializer = new DataContractJsonSerializer(typeof(T));
                return (T)serializer.ReadObject(ms);
            }
        }
    }
}
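Usage is then a one-liner each way (a sketch; MySettings stands in for your own serializable type):
MySettings settings = new MySettings();
string json = SerializationHelper.SerializeToJsonString(settings);
MySettings restored = SerializationHelper.Deserialize<MySettings>(json);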

I have a compact binary serializer class for Silverlight and .NET that creates reasonably compact representations of an object graph. I had to build it for the same reason (and to cut the cost of sending stuff over the wire to my WCF service).
You can find the code and a further description on my blog.

Another open-source serializer is SharpSerializer. It can serialize even very complicated structures to a binary format. There is no need to mark them beforehand with attributes. Additionally it can serialize the data to XML, e.g. for debugging.
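From its documented API, usage looks roughly like this (a sketch; the stream and MySettings type are placeholders):
// true selects the compact binary format; the default constructor emits XML
var serializer = new Polenter.Serialization.SharpSerializer(true);
serializer.Serialize(settings, stream);
stream.Position = 0;
var restored = (MySettings)serializer.Deserialize(stream);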

Related

Put Text type to datastore with Objectify 6

I'm currently migrating a project's DAO classes from a JDO implementation to Objectify V6.
The requirement I have is to make sure that in the case of a rollback it will be possible to load entities, which were saved by Objectify, with the old version of the DAO.
In the old code, strings are stored as Text. And if I leave the Text field in the entity definition, Objectify puts it as a String into the datastore (because there is no Text type any more).
Currently the new DAO implementation is not backward compatible because of a ClassCastException which arises when the JDO implementation casts String to the Text type.
Is there a way to store Text type to datastore with Objectify V6?
I tried to use String instead of Text in entity definition and create a TranslatorFactory to make the conversion, but I wasn't able to find correct datastore Value implementation type.
public class StringTextTranslatorFactory implements TranslatorFactory<String, Text> {
    @Override
    public Translator<String, Text> create(TypeKey<String> tk, CreateContext ctx, Path path) {
        return new Translator<String, Text>() {
            @Override
            public String load(Value<Text> node, LoadContext ctx, Path path) throws SkipException {
                Text text = node.get();
                return text != null ? text.getValue() : "";
            }

            @Override
            public Value<Text> save(String pojo, boolean index, SaveContext ctx, Path path)
                    throws SkipException {
                return ???;
            }
        };
    }
}
Update
The project is using an implementation of JDO 2.3 for the App Engine Datastore. The implementation is based on version 1.0 of the DataNucleus Access Platform.
Data entity defined as the following:
@PersistenceCapable(identityType = IdentityType.APPLICATION)
public class CrmNote {
    @PrimaryKey
    @Persistent(valueStrategy = IdGeneratorStrategy.IDENTITY)
    private Long id;

    @Persistent
    private Text note;
}
Stacktrace:
java.lang.ClassCastException: java.lang.String cannot be cast to com.google.appengine.api.datastore.Text
at com.timzon.snapabug.server.data.CrmNote.jdoReplaceField(CrmNote.java)
at com.timzon.snapabug.server.data.CrmNote.jdoReplaceFields(CrmNote.java)
at org.datanucleus.state.JDOStateManagerImpl.replaceFields(JDOStateManagerImpl.java:2772)
at org.datanucleus.state.JDOStateManagerImpl.replaceFields(JDOStateManagerImpl.java:2791)
at org.datanucleus.store.appengine.DatastorePersistenceHandler.fetchObject(DatastorePersistenceHandler.java:519)
at org.datanucleus.store.appengine.query.DatastoreQuery.entityToPojo(DatastoreQuery.java:649)
at org.datanucleus.store.appengine.query.DatastoreQuery.entityToPojo(DatastoreQuery.java:603)
at org.datanucleus.store.appengine.query.DatastoreQuery.access$300(DatastoreQuery.java:119)
at org.datanucleus.store.appengine.query.DatastoreQuery$6.apply(DatastoreQuery.java:783)
at org.datanucleus.store.appengine.query.DatastoreQuery$6.apply(DatastoreQuery.java:774)
at org.datanucleus.store.appengine.query.LazyResult.resolveNext(LazyResult.java:94)
at org.datanucleus.store.appengine.query.LazyResult.resolveAll(LazyResult.java:116)
at org.datanucleus.store.appengine.query.LazyResult.size(LazyResult.java:110)
at org.datanucleus.store.appengine.query.StreamingQueryResult.size(StreamingQueryResult.java:130)
at org.datanucleus.store.query.AbstractQueryResult.toArray(AbstractQueryResult.java:399)
at java.util.ArrayList.<init>(ArrayList.java:178)
at com.timzon.snapabug.server.dao.CrmNoteDAO.getOrderedCrmNotes(CrmNoteDAO.java:27)
The exception happens in the auto-generated jdoReplaceField method which is added by JDO post-compilation "enhancement". I decompiled the enhanced class and I can see that the datastore object is cast directly to the Text type:
public void jdoReplaceField(int index) {
    if (this.jdoStateManager == null) {
        throw new IllegalStateException("state manager is null");
    } else {
        switch (index) {
            case 0:
                this.id = (Long)this.jdoStateManager.replacingObjectField(this, index);
                break;
            case 1:
                this.note = (Text)this.jdoStateManager.replacingObjectField(this, index);
                break;
            default:
                throw new IllegalArgumentException("out of field index :" + index);
        }
    }
}
So, if the note field is saved in the datastore as a String, then in the case of a rollback a ClassCastException will be thrown.
There's no way to explicitly store a Text type with the Google-provided SDK that Objectify 6 uses; there is only StringValue. Text is not even in the jar.
However, I don't think this should matter. Ultimately both SDKs (the old appengine one and the new one) are just converting back and forth to protobuf structures. They are supposed to be compatible.
It's especially strange because the old low-level API wrote strings into the Entity structure; Text was required only if the strings exceeded a certain length. So JDO should handle String. Do you have some sort of special annotation on your String field to force it to expect Text? What does that stacktrace look like?

How do I write EF.Functions extension method?

I see that EF Core 2.0 has an EF.Functions property (see the EF Core 2.0 announcement) which can be used by EF Core or providers to define methods that map to database functions or operators so that they can be invoked in LINQ queries. It includes a LIKE method that gets sent to the database.
But I need a different method, SOUNDEX(), that is not included. How do I write such a method that passes the function to the database the way the DbFunction attribute did in EF6? Or do I need to wait for MS to implement it? Essentially, I need to generate something like:
SELECT * FROM Customer WHERE SOUNDEX(lastname) = SOUNDEX(@param)
Adding a new scalar method to EF.Functions is easy: you simply define an extension method on the DbFunctions class. However, providing the SQL translation is hard and requires digging into EFC internals.
Fortunately, EFC 2.0 also introduces a much simpler approach, explained in the Database scalar function mapping section of the New features in EF Core 2.0 documentation topic.
According to that, the easiest would be to add a static method to your DbContext derived class and mark it with DbFunction attribute. E.g.
public class MyDbContext : DbContext
{
    // ...

    [DbFunction("SOUNDEX")]
    public static string Soundex(string s) => throw new Exception();
}
and use something like this:
string param = ...;
MyDbContext db = ...;
var query = db.Customers
    .Where(e => MyDbContext.Soundex(e.LastName) == MyDbContext.Soundex(param));
You can declare such static methods in a different class, but then you need to manually register them using HasDbFunction fluent API.
EFC 3.0 has changed this process a little, as per https://learn.microsoft.com/en-us/ef/core/what-is-new/ef-core-3.0/breaking-changes#udf-empty-string
Example of adding CHARINDEX in a partial context class:
public partial class MyDbContext
{
    [DbFunction("CHARINDEX")]
    public static int? CharIndex(string toSearch, string target) => throw new Exception();

    partial void OnModelCreatingPartial(ModelBuilder modelBuilder)
    {
        modelBuilder
            .HasDbFunction(typeof(MyDbContext).GetMethod(nameof(CharIndex)))
            .HasTranslation(
                args => SqlFunctionExpression.Create("CHARINDEX", args, typeof(int?), null));
    }
}
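For illustration, a LINQ query using it might look like this (Customers and LastName are hypothetical names from your model); EF Core translates the call into CHARINDEX in the generated SQL:
// e.g. SELECT ... FROM Customers WHERE CHARINDEX(N'smith', LastName) > 0
var matches = db.Customers
    .Where(c => MyDbContext.CharIndex("smith", c.LastName) > 0)
    .ToList();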

How to read, edit and export word documents in WPF without Microsoft office being installed?

I have a WPF application that relies heavily on manipulating documents; I want to know if there is a library that works independently of Microsoft Office Word and that provides the following features:
Reading Word documents (*.doc or RTF will be sufficient, *.docx will be perfect)
Enable me to edit the document from my WPF app
Enable me to export the document again into other formats (Word, Excel, PDF)
Free :)
Thanks in advance.
I will try to answer in order:
Reading: This article is good for you.
Edit & export: Maybe this library works for you.
Free: the most difficult part of your question. You can do it for free using the Interop Assemblies for Office, but free controls are hard to find; most of the controls around the net are not free.
Hope it helps.
I was faced with a similar question some years ago. I had a Windows Forms application with some 20 reports and about 100 users, and I needed to generate Word documents from the application. The application was installed on a server. My first attempt was done by using Office interop, but it caused problems with performance and all kinds of unpredictable exceptions. So I started to look for alternatives and I soon landed with OpenXML.
First idea was that our team would use OpenXML SDK to generate and manipulate documents. It soon turned out that the learning curve was way too steep and our management wasn't willing to pay for the extra work.
So we started to look for alternatives. We didn't find any useful free library and so we tried some commercial ones (Aspose, Docentric). Aspose gave great results, but it was too expensive. Docentric's license is cheaper and the product performed well in Word document generation, so we finally decided to purchase it.
WHAT IT TAKES TO GENERATE A DOCUMENT FROM A TEMPLATE
Install Docentric Toolkit (you can get a 30-day trial version for free)
In your Visual Studio project add references to the 4 Docentric DLLs, which you can find in the installation folder C:\Program Files (x86)\Docentric\Toolkit\Bin
Include Entity Framework via its NuGet package if you will fill data from a SQL database into the Word document
Prepare a Word template, where you define the layout and include fields which will get filled with data at document generation (see the online documentation for how to do it).
It doesn't take much code to prepare the data to be merged with the template. In my example I prepare an order for customer "BONAP" from the Northwind database. Orders include customer data, order details and product data. The data model also includes header and footer data.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Docentric.Word;
using System.Diagnostics;

namespace WordReporting
{
    // Report data model
    public class ReportData
    {
        public ReportData()
        { }

        public string headerReportTemplatetName { get; set; }
        public string footerDateCreated { get; set; }
        public string footerUserName { get; set; }
        public List<Order> reportDetails { get; set; }
    }

    // model extensions
    public partial class Order
    {
        public decimal TotalAmount { get; set; }
    }

    public partial class Order_Detail
    {
        public decimal Amount { get; set; }
    }

    // Main
    class Program
    {
        static void Main(string[] args)
        {
            // variable declaration
            List<Order> orderList = new List<Order>();
            string templateName = @"c:\temp\Orders_template1.docx";
            string generatedDocument = @"c:\temp\Orders_result.docx";

            // reading data from database
            using (var ctx = new NorthwindEntities1())
            {
                orderList = ctx.Orders
                    .Include("Customer")
                    .Include("Order_Details")
                    .Include("Order_Details.Product")
                    .Where(q => q.CustomerID == "BONAP").ToList();
            }

            // collecting data for the report
            ReportData repData = new ReportData();
            repData.headerReportTemplatetName = templateName;
            repData.footerUserName = "<user name comes here>";
            repData.footerDateCreated = DateTime.Now.ToString();
            repData.reportDetails = new List<Order>();
            foreach (var o in orderList)
            {
                Order tempOrder = new Order();
                tempOrder.Customer = new Customer();
                tempOrder.OrderID = o.OrderID;
                tempOrder.Customer.CompanyName = o.Customer.CompanyName;
                tempOrder.Customer.Address = o.Customer.Address;
                tempOrder.Customer.City = o.Customer.City;
                tempOrder.Customer.Country = o.Customer.Country;
                tempOrder.OrderDate = o.OrderDate;
                tempOrder.ShippedDate = o.ShippedDate;
                foreach (Order_Detail od in o.Order_Details)
                {
                    Order_Detail tempOrderDetail = new Order_Detail();
                    tempOrderDetail.Product = new Product();
                    tempOrderDetail.OrderID = od.OrderID;
                    tempOrderDetail.ProductID = od.ProductID;
                    tempOrderDetail.Product.ProductName = od.Product.ProductName;
                    tempOrderDetail.UnitPrice = od.UnitPrice;
                    tempOrderDetail.Quantity = od.Quantity;
                    tempOrderDetail.Amount = od.UnitPrice * od.Quantity;
                    tempOrder.TotalAmount = tempOrder.TotalAmount + tempOrderDetail.Amount;
                    tempOrder.Order_Details.Add(tempOrderDetail);
                }
                repData.reportDetails.Add(tempOrder);
            }

            try
            {
                // Word document generation
                DocumentGenerator dg = new DocumentGenerator(repData);
                DocumentGenerationResult result = dg.GenerateDocument(templateName, generatedDocument);

                // start MS Word and show the generated document
                ProcessStartInfo startInfo = new ProcessStartInfo();
                startInfo.FileName = "WINWORD.EXE";
                startInfo.Arguments = "\"" + generatedDocument + "\"";
                Process.Start(startInfo);
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.Message);
                // wait for the input to terminate the application
                Console.WriteLine("Press Enter to exit...");
                Console.ReadLine();
            }
        }
    }
}

Exporting SQL to Excel (xlsx) using SSIS?

I'm an SSIS noob (less than a week experience) so please bear with me.
I am running a stored procedure to export its result to an Excel file.
From my research I have found that SSIS's Excel Destination does not play nicely with .xlsx files (it can't be .xls since I have more than the ~65K rows that format allows), but I found that I can use an OLE DB Destination to write to an Excel file.
The issue I am seeing is an error message that occurs on run that says:
OLE DB Destination [212] Error:
An error occurred while setting up a binding for the "Main Job Notes" column.
The binding status was "DT_NTEXT".
The fields that are erroring are coming in as text streams ([DT_TEXT]), and since I was getting an error about not being able to convert between Unicode and non-Unicode, I used a Data Conversion to transform them into Unicode text streams ([DT_NTEXT]).
If it helps at all, my setup is as follows:
[screenshot of the data flow]
Any help would be amazing. Thank you.
You should consider doing this using a Script Component. Keep in mind that inside a Data Flow Task you cannot debug directly, but you can use a MessageBox snippet to check results. Also keep in mind that Excel will always try to guess your column data types automatically; for example, when you import a file from Excel where one of the columns starts with numbers but row 3455 contains a character, it will import the column as a number, you will lose the character value, and you will find it as null in your database.
I will give you some code to construct the file you need programmatically; maybe it can give you an idea. (This example reads a file as one column, then splits it as if you had chosen fixed width in the Excel import wizard, and outputs the result to a CSV file.)
/* Microsoft SQL Server Integration Services Script Component
 * Write scripts using Microsoft Visual C# 2008.
 * ScriptMain is the entry point class of the script. */
using System;
using System.IO;
using System.Linq;
using System.Text;

[Microsoft.SqlServer.Dts.Pipeline.SSISScriptComponentEntryPointAttribute]
public class ScriptMain : UserComponent
{
    #region Variables
    private string _jumexDailyData;
    private string[] _jumexValues;
    private string[] _jumexWidthValues;
    #endregion

    /// <summary>
    /// Default constructor
    /// </summary>
    public ScriptMain()
    {
        this._jumexValues = new string[22];
    }

    public override void PreExecute()
    {
        base.PreExecute();
        /*
         * Add your code here for preprocessing or remove if not needed
         */
    }

    public override void PostExecute()
    {
        base.PostExecute();
        /*
         * Add your code here for postprocessing or remove if not needed
         * You can set read/write variables here, for example:
         * Variables.MyIntVar = 100
         */
    }

    public override void JumexDailyData_ProcessInput(JumexDailyDataBuffer Buffer)
    {
        while (Buffer.NextRow())
            JumexDailyData_ProcessInputRow(Buffer);
    }

    public override void JumexDailyData_ProcessInputRow(JumexDailyDataBuffer Row)
    {
        this._jumexDailyData = Row.JumexDailyData;
        if (this._jumexDailyData != null)
        {
            // Column widths are configured in a comma-separated SSIS variable
            this._jumexWidthValues = this.Variables.JUMEXLOADSALESATTACHMENTFILEWIDTHVALUES.Split(new string[] { "," }, StringSplitOptions.RemoveEmptyEntries);
            if (this._jumexWidthValues != null && this._jumexWidthValues.Count() > 0)
                for (int i = 0; i < this._jumexWidthValues.Count(); i++)
                {
                    // Cut the next fixed-width field off the front of the row
                    this._jumexValues[i] = this._jumexDailyData.Substring(0, int.Parse(this._jumexWidthValues[i])).Trim();
                    this._jumexDailyData = this._jumexDailyData.Substring(int.Parse(this._jumexWidthValues[i]), (this._jumexDailyData.Length - int.Parse(this._jumexWidthValues[i])));
                }

            // Skip separator/header rows and write the parsed fields as a pipe-delimited line
            if (string.IsNullOrEmpty(this._jumexValues[3].Trim()) == false &&
                string.IsNullOrEmpty(this._jumexValues[17].Trim()) == false &&
                !this._jumexValues[3].Contains("---") &&
                !this._jumexValues[17].Contains("---") &&
                !this._jumexValues[3].Trim().ToUpper().Contains("FACTURA") &&
                !this._jumexValues[17].Trim().ToUpper().Contains("PEDIDO"))
                using (StreamWriter streamWriter = new StreamWriter(this.Variables.JUMEXFULLQUALIFIEDLOADSALESATTACHMENTFILENAME.Replace(".TXT", ".CSV"), true, Encoding.Default))
                {
                    streamWriter.WriteLine(string.Join("|", this._jumexValues));
                }
        }
    }
}

Using stored procedures (Linq-to-SQL, not EF) in WCF RIA - Silverlight 4

For the love of heaven and earth I really wish someone could help me out with this issue. It seems everyone has something to say about EF but nothing about Linq-to-SQL.
I am trying to grab some data from my table via a stored procedure, believe me, that's all.
I added the Linq-to-SQL model (LAMP.dbml)
Added the stored procedure (getAffectedParcel) from the Server Explorer; getAffectedParcel takes 2 strings as parameters
Built the application.
Added a domain service class (LAMPService)
Selected the (LAMPDataContext) as the data context class (normally I would tick generate metadata, but since I am not working with tables it's not enabled for ticking)
Added the following function to the LAMPService.cs:
public IEnumerable<getAffectedParcelResult> GetTheAffectedParcels(String v, String vf)
{
    return this.DataContext.getAffectedParcel(v, vf).AsEnumerable();
}
Added the following code to a Silverlight page in an attempt to consume the stored procedure:
LAMPContext db = new LAMPContext();
try
{
    var q = db.GetTheAffectedParcels("18606004005", "").Value;
    foreach (getAffectedParcelResult GAP in q)
    {
        MessageBox.Show(GAP.Owner);
    }
}
catch (Exception ex)
{
    MessageBox.Show(ex.Message.ToString());
}
Built and ran the application. An error occurs stating:
Object reference not set to an instance of an object.
I have tried ~1,000,000 ways to see if this thing would work, but to no avail. Please don't tell me to use Entity Framework, I want to use Linq-to-SQL. Can someone (anyone) help me out here.
//houdini
Calling a stored procedure from the Silverlight client happens in the Async world. Let's consider an example from the AdventureWorks database...
Here's what the Domain Service method looks like. It is calling the EF on a stored procedure in the database called 'BillOfMaterials'.
public IQueryable<BillOfMaterial> GetBillOfMaterials()
{
    return this.ObjectContext.BillOfMaterials;
}
Back on the client side, here is the code for setting up the call...
public GetSp()
{
    InitializeComponent();
    DomainService1 ds1 = new DomainService1();
    var lo = ds1.Load(ds1.GetBillOfMaterialsQuery());
    lo.Completed += LoCompleted;
}
First, the Domain Service is created, and then it is used to load the results of the stored procedure. In this particular case, the result of this is an instance of 'LoadOperation'. These things are async, so the LoadOperation needs to have a callback for when it is finished. The callback code looks like this...
public ObservableCollection<BillOfMaterial> MyList { get; set; }

void LoCompleted(object sender, EventArgs e)
{
    LoadOperation lo = sender as LoadOperation;
    if (lo != null)
    {
        MyList = new ObservableCollection<BillOfMaterial>();
        foreach (BillOfMaterial bi in lo.AllEntities)
        {
            MyList.Add(bi);
        }
        dataGrid1.ItemsSource = MyList;
    }
}
In this method, the 'sender' is dereferenced into the LoadOperation instance, and then all the goodies from the database can be accessed. In this trivial example, a list is built and passed to the DataGrid as its ItemsSource. It's good for understanding, but you would probably do something else in practice.
That should solve your problem. :)
The best advice I can give on Silverlight and RIA is never do ANYTHING on your own until you have tried it in AdventureWorks. You will just waste your time and beat your head against the wall.
Firstly, it seems like your DomainService code is written for Invoke() rather than Query(). You should use Query as it enables you to update data back to the server.
Solution: you should add a [Query] attribute to GetTheAffectedParcels on the domain service.
[Query]
public IQueryable<Parcel> GetTheAffectedParcels(string ParcelNumber, string LotNumber)
{
    // etc.
}
Secondly, RIA Services needs to know which is the primary key on the Parcel class.
Solution: apply a MetadataType attribute to the Parcel class, which allows you to add metadata to the Parcel class indirectly, since it is generated by Linq2Sql and you couldn't add annotations directly to ParcelId; they'd get wiped away.
[MetadataType(typeof(ParcelMetadata))]
public partial class Parcel
{
}

public class ParcelMetadata
{
    [System.ComponentModel.DataAnnotations.Key]
    public int ParcelId { get; set; }
}
Thirdly, modify your client. Try this on the Silverlight client instead:
LAMPContext db = new LAMPContext();
try
{
    var q = db.GetTheAffectedParcelsQuery("18606004005", "");
    db.Load(q, (op) =>
    {
        if (op.HasError)
        {
            label1.Text = op.Error.Message;
            op.MarkErrorAsHandled();
        }
        else
        {
            foreach (var parcel in op.Entities)
            {
                // your code here
            }
        }
    }, false);
}
catch (Exception ex)
{
    label1.Text = ex.Message;
}
Much thanks to Chui and Garry who practically kicked me in the right direction :) [thanks guys...ouch]
This is the procedure I finally undertook:
After adding the data model (LINQ2SQL) and the domain service, I created a partial class [as suggested by Chui] and included the following metadata info therein:
[MetadataTypeAttribute(typeof(getAffectedParcelResult.getAffectedParcelResultMetadata))]
public partial class getAffectedParcelResult
{
    internal sealed class getAffectedParcelResultMetadata
    {
        [Key]
        public string PENumber { get; set; }
    }
}
Then I adjusted the domain service to include the following:
[Query]
public IQueryable<getAffectedParcelResult> GetTheAffectedParcels(string v, string vf)
{
    // IEnumerable<getAffectedParcelResult> ap = this.DataContext.getAffectedParcel(v, vf);
    return this.DataContext.getAffectedParcel(v, vf).AsQueryable();
}
Then I built the app, after which the getAffectedParcelResult stored procedure appeared in the Data Sources panel. I wanted to access it via code, however, so I accessed it in Silverlight [.xaml page] via the following:
LAMPContext db = new LAMPContext();
var q = db.GetTheAffectedParcelsQuery("18606004005", "");
db.Load(q, (op) =>
{
    if (op.HasError)
    {
        MessageBox.Show(op.Error.Message);
        op.MarkErrorAsHandled();
    }
    else
    {
        foreach (getAffectedParcelResult gap in op.Entities)
        {
            ownerTextBlock.Text = gap.Owner.ToString();
        }
    }
}, false);
This worked nicely. The thing is, my stored procedure returns a complex type, so to speak. As such, it was not possible to map it to any particular entity.
Oh and by the way this article helped out as well:
http://onmick.com/Home/tabid/154/articleType/ArticleView/articleId/2/Pulling-Data-from-Stored-Procedures-in-WCF-RIA-Services-for-Silverlight.aspx
