I have a Silverlight client accessing data through ADO.NET Data Services. One of my queries has a number of expand clauses and gets back quite a number of entries. The XML response is enormous, and I'm looking for ways to make this more efficient.
I have tried:
Paging (not an option for this behaviour)
HTTP compression (some client PCs are running IE6)
Doing the expands as separate queries and joining the entities later (this improved things a little)
Is it possible to use JSON as a transport format with the silverlight client? I haven't found anything about this on the web...
You can see a demonstration of using JSON in Silverlight at the link below:
http://timheuer.com/blog/archive/2008/05/06/use-json-data-in-silverlight.aspx
I am not sure how much of a performance gain is achieved by using JSON, but I definitely remember that ADO.NET Data Services supports JSON.
Well. I got a chance to talk to Tim Heuer about this, who awesomely went and asked Pablo Castro for me. Thanks Tim!
JSON can't be used by the Silverlight client, but Silverlight 3 will use binary XML by default to talk to web services. Rawr.
One other thing I worked out for myself was that using expand can sometimes result in a lot more data than performing multiple requests. If you batch a few queries together and then hand-stitch the objects together, you can save quite a bit of XML.
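For illustration, here's a rough sketch of that batching approach (Order and Customer are hypothetical client-side entity types, and the service URI is made up); in Silverlight only the Begin/End async pattern is available:

    using System;
    using System.Collections.Generic;
    using System.Data.Services.Client;
    using System.Linq;

    // Two batched queries instead of one Orders.Expand("Customer") query.
    public void LoadOrdersWithCustomers(DataServiceContext ctx)
    {
        var orderReq = new DataServiceRequest<Order>(new Uri("Orders", UriKind.Relative));
        var customerReq = new DataServiceRequest<Customer>(new Uri("Customers", UriKind.Relative));

        ctx.BeginExecuteBatch(ar =>
        {
            DataServiceResponse batch = ctx.EndExecuteBatch(ar);

            List<Order> orders = null;
            var customersById = new Dictionary<int, Customer>();

            foreach (QueryOperationResponse op in batch)
            {
                if (op.Query == orderReq)
                    orders = op.Cast<Order>().ToList();
                else if (op.Query == customerReq)
                    customersById = op.Cast<Customer>().ToDictionary(c => c.CustomerId);
            }

            // Hand-stitch the graph: each customer travels over the wire once,
            // instead of once per order as it would with an expand.
            foreach (Order order in orders)
                order.Customer = customersById[order.CustomerId];
        }, null, orderReq, customerReq);
    }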
I am looking for the easiest method to pull data from a WCF and load it into a SQL Server database.
I have a request for some data in our database. To learn EF, I created an Entity Model based on a view that pulls the data (v_report_all_clients). I created a WCF service based on that model and tested it with a simple WPF datagrid. The WCF service exports a collection of v_report_all_clients objects, which load easily into the simple data grid.
The user requesting the data is going to load the results into their own database. I believe they have access to SSIS packages, but I'm not sure using SSIS is the easiest method. I tried two approaches and neither was successful, let alone easy. Briefly, the methods I tried are:
Use a Web Service Task - This works, but unless I missed something you have to send the output to an XML file, which seems like a wasteful middle step.
Use a Script Component - I am running into various issues trying to get this to work, even after following some online examples. I can try to fight through the issues if this ends up being the best method, but I am still not sure it is the easiest. Plus, the online examples didn't include the next logical step of loading into a database.
This is my first attempt to use WCF as a means to distribute data from our database to various users. I'd like to leverage this technology for some of our larger web-based reports that end up being almost an entire table dump. But if the users cannot easily integrate the WCF output into something they can use, then I might have to stick with web-based reporting.
WCF services generally do not "contain" any data. From your description, I am assuming the data you want to pull is probably contained in a SQL database somewhere on the LAN/WAN.
If it's in a database, and you want to put it in another database, then there is already a technology for doing exactly that: SSIS, which you're already using.
So my solution would be: don't use WCF for this task.
If you're going outside the local LAN/WAN for the data, then there are other ways to replicate the data locally.
Appreciate this does not answer your question as asked.
EDIT
I'm uncertain how database ownership "chaining" works - but suffice it to say that either of the two methods you propose above (built-in task or script task) would appear to be eminently feasible. I am not an SSIS guru, so I would struggle to recommend which one to use.
However, by introducing a WCF service call into an ETL process, you need to bear a few things in mind:
You are introducing an out-of-process call with potentially large performance implications. WCF is not a lightweight stack - opening a channel takes seconds rather than the milliseconds it takes to open a DB connection.
WCF service calls fail all the time. Handling timeouts and errors on the service side will need to be planned into your ETL process (see the sketch after this list).
If you want to "leverage the service for other uses", you may need to put quite a lot of upfront time into making sure the service contract is useful, coherent, and extensible - this is time you would not have to spend going direct to the data.
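To illustrate the failure-handling point, here's a rough sketch of a guarded service call from an ETL step. IReportDataService, its GetClients operation and the ReportRow DTO are hypothetical stand-ins for your actual contract:

    using System;
    using System.Runtime.Serialization;
    using System.ServiceModel;
    using System.ServiceModel.Channels;
    using System.Threading;

    [DataContract]
    public class ReportRow
    {
        [DataMember] public int ClientId { get; set; }
        [DataMember] public string ClientName { get; set; }
    }

    [ServiceContract]
    public interface IReportDataService
    {
        [OperationContract]
        ReportRow[] GetClients();
    }

    public static class EtlServiceCall
    {
        // Calls the service with simple retry/backoff so a transient fault
        // doesn't kill the whole ETL run.
        public static ReportRow[] FetchClients(Binding binding, EndpointAddress address)
        {
            const int maxAttempts = 3;
            for (int attempt = 1; ; attempt++)
            {
                var factory = new ChannelFactory<IReportDataService>(binding, address);
                IReportDataService channel = factory.CreateChannel();
                try
                {
                    ReportRow[] rows = channel.GetClients();
                    ((IClientChannel)channel).Close();
                    factory.Close();
                    return rows;
                }
                catch (Exception ex)
                {
                    // A faulted channel cannot be reused; abort rather than close.
                    ((IClientChannel)channel).Abort();
                    factory.Abort();

                    bool transient = ex is TimeoutException || ex is CommunicationException;
                    if (!transient || attempt == maxAttempts)
                        throw;

                    Thread.Sleep(TimeSpan.FromSeconds(5 * attempt)); // crude backoff
                }
            }
        }
    }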
Hope this has helped you.
I think the idea given by preet sangha is the best answer. I've experimented with a few possible approaches and I think a WCF Data Service is the easiest. After installing the OData add-on package for SSIS, I was able to pipe the WCF Data Service directly to another table (or any other SSIS destination). It was very simple and easy in my opinion.
I think a second-place vote might go to Web API, which will give you more flexibility and control. The problem is that this comes at the price of requiring more effort. I am also finding some quirks in the Web API scaffolding.
I need to fetch data from a normalized MSSQL database and feed it into a Solr index.
I was just wondering whether Apatar can be used to perform the job. I've gone through its documentation, but I haven't found the information I'm looking for. It states that it can fetch data from SQL Server and post it over HTTP, but I'm still not sure whether it can post the fetched data as XML over HTTP.
Any advice will be highly valued. Thank you.
I am not familiar with Apatar, but seeing as it is a Java application, it may be a bit challenging to implement in a Windows environment. However, for various scenarios where I need to fetch data from an MSSQL database and feed it to Solr, I have written custom C# code leveraging the SolrNet client. This tends to be pretty straightforward and simple code, and in the cases where we need to load data at specified intervals, we use scheduled tasks calling a console application. I would recommend checking out the Create/Update section of the SolrNet site for some examples of loading/updating data with the .NET client.
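As a minimal sketch of that approach, assuming a hypothetical dbo.Clients table and a Solr schema with matching id and name fields:

    using System.Data.SqlClient;
    using Microsoft.Practices.ServiceLocation; // CommonServiceLocator, used by SolrNet
    using SolrNet;
    using SolrNet.Attributes;

    // Document shape must match the fields defined in your Solr schema.
    public class ClientDoc
    {
        [SolrUniqueKey("id")]
        public string Id { get; set; }

        [SolrField("name")]
        public string Name { get; set; }
    }

    class Program
    {
        static void Main()
        {
            Startup.Init<ClientDoc>("http://localhost:8983/solr");
            var solr = ServiceLocator.Current.GetInstance<ISolrOperations<ClientDoc>>();

            using (var conn = new SqlConnection("Data Source=.;Initial Catalog=MyDb;Integrated Security=True"))
            using (var cmd = new SqlCommand("SELECT id, name FROM dbo.Clients", conn))
            {
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        solr.Add(new ClientDoc
                        {
                            Id = reader.GetInt32(0).ToString(),
                            Name = reader.GetString(1)
                        });
                    }
                }
            }

            solr.Commit(); // make the new documents searchable
        }
    }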
For my new project I'm planning to use JSON data stored in a text file rather than fetching data from the database. My idea is to save a JSON file on the server whenever the admin creates a new entry in the database.
As there is no security concern, will this approach make user access to the data faster, or shall I go with the usual database queries?
JSON is typically used as a way to format the data for the purpose of transporting it somewhere. Databases are typically used for storing data.
What you've described may be perfectly sensible, but you really need to say a little bit more about your project before the community can comment on your approach.
What's the pattern of access? Is it always read-only for the user, editable only by site administrator for example?
You shouldn't worry about performance early on. Worry more about ease of development, maintenance and reliability; you can always optimise afterwards.
You may want to look at http://www.mongodb.org/. MongoDB is a document-centric store that uses JSON as its storage format.
JSON in combination with jQuery is a great option for fast, smooth web page updates, but ultimately it still comes down to the same database query.
Just make sure your query is efficient. Use a stored proc.
JSON is just the way the data is sent from the server (a Web controller in MVC, or code-behind in standard C#) to the client (jQuery or JavaScript).
Ultimately the database will be queried the same way.
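A rough sketch of that point with a hypothetical repository and DTO - JSON is only the wire format, and the SQL query underneath is the same either way:

    using System.Collections.Generic;
    using System.Web.Mvc;

    public class Client
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    // Hypothetical data-access layer; ideally backed by a stored proc,
    // as suggested above.
    public interface IClientRepository
    {
        IEnumerable<Client> GetClients();
    }

    public class ClientsController : Controller
    {
        private readonly IClientRepository _repository;

        public ClientsController(IClientRepository repository)
        {
            _repository = repository;
        }

        // The page fetches this with e.g. $.getJSON("/Clients/List", renderList);
        public ActionResult List()
        {
            return Json(_repository.GetClients(), JsonRequestBehavior.AllowGet);
        }
    }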
You should stick with the classic method (database), because you'll face many problems with concurrency and with having too many files to handle.
I think you should go with the usual database query.
If you use JSON files you'll have to keep them in sync with the DB (which means extra work) and you'll face I/O problems if your site is very busy.
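For illustration, here's a rough sketch of that sync step (the Entry type is hypothetical); writing to a temp file and swapping means concurrent readers never see a half-written file:

    using System.Collections.Generic;
    using System.IO;
    using System.Web.Script.Serialization; // System.Web.Extensions assembly

    // Hypothetical shape of one database entry.
    public class Entry
    {
        public int Id { get; set; }
        public string Title { get; set; }
    }

    public static class JsonCache
    {
        // Call this after the admin's INSERT succeeds to rewrite the cached JSON file.
        public static void Rebuild(IEnumerable<Entry> entries, string cachePath)
        {
            string json = new JavaScriptSerializer().Serialize(entries);
            string tempPath = cachePath + ".tmp";
            File.WriteAllText(tempPath, json);

            if (File.Exists(cachePath))
                File.Replace(tempPath, cachePath, null); // atomic swap on the same volume
            else
                File.Move(tempPath, cachePath);
        }
    }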
Architecture:
A database on a central server, which contains a complex hierarchical structure.
The clients should be able to insert data into tables through the API. The data would be inserted into multiple tables in the database at the same time, not only into one table.
The clients should be able to retrieve data by using a complex search query.
The clients can upload/download files to the server, which could be multiple GBs in size.
Would SOAP be better for this job than REST? Can you please explain why?
Almost all the things you mention are equally achievable using either SOAP or REST, though perhaps a little easier with SOAP. Certainly it's easier to create client APIs for SOAP interfaces; client tooling support is significantly more advanced on the majority of languages.
However, you say that you want to deal with multi-gigabyte uploads and downloads. That's a crucial point, as REST is able to handle that sort of thing far more easily. SOAP is almost always tooled in terms of DOM processing, and that means building full messages in memory; you don't ever want to do that with a multi-GB payload.
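For example, here's a rough sketch of a streaming PUT to a REST endpoint (the URL is made up); the request body is streamed from disk rather than built up in memory:

    using System;
    using System.IO;
    using System.Net;

    class UploadSketch
    {
        static void Upload(string filePath)
        {
            var request = (HttpWebRequest)WebRequest.Create("http://example.com/api/files/archive.bin");
            request.Method = "PUT";
            request.ContentType = "application/octet-stream";
            request.AllowWriteStreamBuffering = false; // don't buffer the body in memory
            request.SendChunked = true;                // stream without a Content-Length up front

            using (var fileStream = File.OpenRead(filePath))
            using (var requestStream = request.GetRequestStream())
            {
                fileStream.CopyTo(requestStream); // copies in small chunks
            }

            using (var response = (HttpWebResponse)request.GetResponse())
            {
                Console.WriteLine(response.StatusCode);
            }
        }
    }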
So go with REST. That's definitely your best option for achieving all your listed objectives.
It's a pretty classic problem. The company I work for has numerous business reports that are used to track sales, data feeds, and various other metrics. Of course this also means that there is a conglomerate of disparate frameworks, ASP.NET pages, and areas where these reports can be found. There have been some attempts at consolidating these into a single entity, but nothing has stuck yet.
Since this is a common problem, and one I am sure has been solved innumerable times, I wanted to see what others have done. For the most part these reports can be boiled down to the following pieces:
A SQL query against our database to gather data
A presentation of data, generally in a data grid
Filtering that can vary based on data types and the business needs
Some way to organize the reports, a single drop down gets long and unmanageable quickly
A method to download data to alter further, perhaps a csv file
My first thought was to create a framework in Silverlight with LINQ to SQL, mainly just because I like it and want to play with it, which probably is not the best reason. I also thought the controls grant a lot of functionality like sorting, dragging columns, etc. I was also curious about the printing support in Silverlight 4.
Which brings me around to my original question: what is the best way to do this? Is there a package out there I can just buy that will do it for me? The Silverlight approach seems pretty easy once it's set up and templated, but maybe it's a bad idea and I can learn from someone else?
It may sound contrived, but we just used SSRS. Once it is installed, the /ReportServer web site does a decent job of managing and organizing reports, permissions, etc. You can also put wrappers in front of SSRS via ASP.NET controls or SharePoint. Cost = free. It works nicely with the SQL Server Developer Edition too.
If your SQL server is MS SQL Server 2005+, then I would definitely recommend SQL Server Reporting Services. It covers all the cases you outlined very well and is very easy to get into for someone already familiar with SQL.
The myDBR reporting tool might be suitable for you. With myDBR you can create reports quite easily using the built-in Query Builder. Once your reports are done, you can organize them into categories and also specify access rights for users and user groups.
You can fully concentrate on creating your report content (using SQL) and myDBR will take care of the data presentation. For example creating charts from data is just a matter of adding one extra line to your stored procedure.
myDBR also provides Single-Sign-On Integration so that your users can continue to log in with their already existing username/password.
The tool is also very affordable: the community version is free of charge and the premium version is only 129 EUR / year.
We use FastReport .NET. It supports SQL queries, presentation of data in a data grid, filtering, and export to many popular formats (PDF, DOC, XLS, CSV, DBF, etc.).