Method to pull from a WCF service to a SQL Server database

I am looking for the easiest method to pull data from a WCF service and load it into a SQL Server database.
I have a request for some data in our database. To learn EF, I created an Entity Model based on a view that pulls the data (v_report_all_clients). I created the WCF service based on that model and tested it with a simple WPF datagrid. The WCF service exports a collection of v_report_all_clients objects, which loads easily into the simple data grid.
The user requesting the data is going to load the results into their database. I believe they have access to SSIS packages, but I'm not sure using SSIS is the easiest method. I tried two approaches and neither was successful, let alone easy. Briefly, the methods I tried are:
Use a Web Service Task - This works, but unless I missed something you have to send the output to an XML file, which seems like a wasteful middle step.
Use a Script Component - I am running into various issues trying to get this to work, even after following some online examples. I can try to fight through the issues if this ends up being the best method, but I am still not sure it is the easiest. Plus, the online examples didn't include the next logical step of loading into a database.
This is my first attempt to use a WCF service as a means to distribute data from our database to various users. I'd like to leverage this technology for some of our larger web-based reports that end up being almost an entire table dump. But if the users cannot easily integrate the WCF output into something they can use, then I might have to stick with web-based reporting.

WCF services generally do not "contain" any data. From your description I am assuming the data you want to pull is probably contained in a SQL database somewhere on the LAN/WAN.
If it's in a database and you want to put it in another database, then there is already a technology for doing that: SSIS, which you're already using.
So my solution would be: don't use WCF for this task.
If you're going outside the local LAN/WAN for the data, then there are other ways to replicate the data locally.
I appreciate this does not answer your question as asked.
EDIT
I'm uncertain how database ownership "chaining" works, but suffice to say that either of the two methods you're proposing above (built-in task or script component) appears eminently feasible. I am not an SSIS guru, so I would struggle to recommend one over the other.
However, by introducing a WCF service call into an ETL process, you need to bear a few things in mind:
You are introducing an out-of-process call with potentially large performance implications. WCF is not a lightweight stack - opening a channel takes seconds rather than the milliseconds it takes to open a DB connection.
WCF service calls fail all the time. Handling timeouts and errors on the service side will need to be planned into your ETL process.
If you want to "leverage the service for other uses", you may need to put quite a lot of upfront time into making sure the service contract is useful, coherent, and extensible - this is time you would not have to spend going direct to the data.
Hope this has helped you.

I think the idea given by preet sangha is the best answer. I've experimented with a few possible solutions and I think a WCF Data Service is the easiest solution. After installing the OData add-on package for SSIS I was able to pipe the WCF Data Service directly to another table (or any other SSIS destination). It was very simple and easy in my opinion.
I think a second-place vote might go to Web API, which will give you more flexibility and control. The problem is that this comes at the price of requiring more effort. I am also finding some quirks in the Web API scaffolding.

Related

How to design the consumption of a REST API from SQL Server?

I'm using a desktop application that writes its data to SQL Server. I don't have the source code of this application, nor is there an API I could interact with, but I do have access to the database.
Besides this desktop application, I'm working with other web based applications that offer a REST API for interaction.
Now, my goal is to act on specific changes I make in the desktop software and push these changes automatically to the web service. For example, if I create a new customer in my application, I want to have the same customer created in the web service.
The easiest way of doing this IMO is to just introduce a trigger in the SQL database and consume the API of the web service directly from T-SQL.
Researching this topic, I came across many comments saying using SQL Server for this is not recommended, too expensive etc. I fully understand where they are coming from, even though in my own case cost or performance really won't matter that much.
Still, I'm wondering, what would be the correct (or at least better) way of doing what I'm trying to do without considerably blowing up complexity?

Is there a generic OData provider for SQL Server?

I've created a gadget for our CRM consultants that allows them to present data from an OData source in CRM. At the moment it will connect to any data source, but for customer sites we need to develop an OData service using WCF each time, for each data source.
Does anyone know if there's a decent generic tool out there that can retrieve data from SQL Server, present it (via IIS) as OData, and that can be configured without Visual Studio by a non-developer?
We (the WCF Data Services team) have heard this ask a couple of times; what follows are a few of my thoughts in no particular order.
We haven't heard the ask a lot. There's a reasonable amount of work to do here, and without sufficient asks it's hard to justify. That said, there's nothing stopping the community from spinning up an effort to achieve this (hint, hint :)).
There are a number of questions you would need to answer. For instance, what sort of default limitations would the provider have? Would you really want to allow arbitrary expands on something that's probably a production database server? What about permissions? What about read/write?
What happens for mutable schemas? Is this a completely dynamic provider? How much overhead is there in scanning the database schema, and how frequently would the database schema need to be scanned?
How would clients take advantage of a dynamic OData service? Most clients use some form of code generation to make interacting with the service easier.
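To make the schema-scanning question above concrete, here is a hypothetical scan such a dynamic provider might perform at startup. This is a sketch in Python with sqlite3 standing in for SQL Server; a real provider would query the `INFORMATION_SCHEMA` views instead, and how often to re-run the scan is exactly the open question.

```python
import sqlite3

# Hypothetical schema scan a dynamic OData provider might perform to
# build its metadata model. Table/column names here are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL, placed TEXT)")

def scan_schema(conn):
    model = {}
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'").fetchall()
    for (table,) in tables:
        # Each table_info row: (cid, name, type, notnull, default, pk)
        cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
        model[table] = {c[1]: c[2] for c in cols}
    return model

model = scan_schema(conn)
# model == {'orders': {'id': 'INTEGER', 'total': 'REAL', 'placed': 'TEXT'}}
```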
These thoughts aren't really intended to dissuade at all, but hopefully they give you some things to think about should you attempt to create a generic provider on your own. If you do so, I'd love to hear about it.

Designing web service calls that read/write from database

Apologies for the newbie web service question -
I am trying to create a web service that has a list of methods to perform reads/writes to a database. An example function would be of the form:
CreateNewEmployee(string username, string employeeid, string deptname)
I created a web service in .NET (ASMX) that has the above-mentioned web method. In it, I open the connection to the database, do an insert into the database, and then close the connection. Is this the right way to design the web service call?
Should I instead be passing an object instead of multiple parameters?
Any pointers toward best practices when trying to create a webservice that writes data into a database?
To add some more information
We would like to have web services since they might be reused by many different applications within the organization (both web and desktop).
We are also planning to create an environment where users can use these web services to create data mashups.
Thanks,
Nate
Yes, pass objects rather than large parameter sets. Also, have you considered WCF if you're in a .NET environment? If you look at how ADO.NET Data Services (formerly Astoria) works, it will put you in the right direction.
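The "pass an object, not a parameter list" advice might look like the following sketch. This is Python standing in for a C#/WCF data contract, and `CreateEmployeeRequest` is a hypothetical name; the point is that adding a field later changes one type, not every caller's signature.

```python
from dataclasses import dataclass

# Hypothetical request object replacing the long parameter list
# CreateNewEmployee(username, employeeid, deptname).
@dataclass
class CreateEmployeeRequest:
    username: str
    employee_id: str
    dept_name: str

def create_new_employee(request: CreateEmployeeRequest) -> str:
    """Service method taking a single request object.

    In a real service this would validate the request and insert a row;
    here it just echoes what it would have created."""
    return f"created {request.employee_id} ({request.username}) in {request.dept_name}"

result = create_new_employee(
    CreateEmployeeRequest(username="jdoe", employee_id="E42", dept_name="Sales"))
# result == "created E42 (jdoe) in Sales"
```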
Quoting from the winning answer to this SO question:
Web Services are an absolutely horrible choice for data access.
It's a ton of overhead and complexity for almost zero benefit.
You can read the rest of the discussion there.
Edit: One excellent approach to having common data access functionality that can be shared by multiple applications (web, desktop, service) is to create a Visual Studio project that compiles to a DLL. Each solution that wants to use the data access functionality references the DLL, which can be deployed to the GAC or some other central location, or just added to the project's bin folder. Alternatively, in order to be able to step through the data access code, the data access project can be added to a solution.
This is a very common practice in large enterprises, where many back office applications share common functionality. It is used not just for data access, but also for other services such as logging and authentication/authorization. Some divisions create a set of these DLLs, which they refer to as their "framework". It ensures that every application will have the same functionality and the same business logic, and that there is a single place for revisions to be made that will affect all of the applications. This is a similar benefit to using web services, but it avoids the overhead and performance hit of web services.

Calls to webservice are slow, should i be using something else?

Currently we have a web service up and running that handles all our CRUD calls to our database, and pretty much every interaction between the user and the database
(starting with checking the database to see if there's an update for the particular version of the end user's application, checking credentials, checking/reading various other application settings, etc.).
Pretty much everything our desktop application does (written in C# with WPF on .NET Framework 3.5) involves a call to the web service. The problem is that on slower machines this takes way too long. Some users have to wait up to 3 minutes for the app to cold start (this is probably partly because of the .NET Framework's slow load time, but the web service calls don't help anything).
I guess since our applications are more or less a fancy front end to a database (SQL Server 2005), should we be using something else to communicate with it besides a web service? What else is there? I chose a web service because it's the only thing I knew how to do (besides connecting directly to the database). What do other similar apps use?
Thank You
As mentioned by @Chris Marisic, first profile to ensure that the problem really is in the web calls. Assuming it is, there are a few things you can try (I don't know WPF, so you will have to see how they work with the framework).
1. Batch similar items. For example, instead of loading one row at a time from a table (or equivalent), load several.
2. Make web calls asynchronous. If you have to send out a bunch of independent calls to the web service, send them asynchronously so that multiple requests are going across the network at once.
3. Cache values. This can add a lot of complexity if you are not careful (depending on how much you care whether the cache is up to date). The ability to listen to the server for changes (or to have the server push changes) makes this one easier to handle.
I had a similar problem on a different framework and I got quite a bit of speedup just with #1.
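The asynchronous-calls suggestion can be sketched as follows. This is Python rather than C#/WPF, and `asyncio.sleep` stands in for real network latency, but the shape is the same: independent requests go out concurrently, so total wait time is roughly one round trip instead of one per call.

```python
import asyncio
import time

# Simulated web-service call; the 0.05 s delay stands in for one
# network round trip to the service.
async def fetch_record(record_id: int) -> dict:
    await asyncio.sleep(0.05)
    return {"id": record_id}

async def fetch_all_concurrently(ids: list[int]) -> list[dict]:
    # All requests are in flight at once, so the total elapsed time is
    # roughly one round trip, not len(ids) round trips.
    return list(await asyncio.gather(*(fetch_record(i) for i in ids)))

start = time.perf_counter()
records = asyncio.run(fetch_all_concurrently(list(range(10))))
elapsed = time.perf_counter() - start
# Ten sequential calls would take ~0.5 s; the concurrent version takes ~0.05 s.
```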
Profile Profile Profile. Don't make assumptions.
By "web service" do you mean "SOAP service"? Profiling will tell you whether or not SOAP is buying you something.
It's likely that latency and the (n+1) query problem are killing you. It should be easy to measure.

Web services and database concurrency

I'm building a .NET client application (C#, WinForms) that uses a web service for interaction with the database. The client will be run from remote locations using a WAN or VPN, hence the idea of using a web service rather than direct database access.
The issue I'm grappling with right now is how to handle database concurrency. That is, if two people in different locations update the same data, how do I deal with it? I'm considering putting a timestamp on each database record and including it in the WHERE clause of each UPDATE, but that means the timestamps have to travel back and forth through the web service interface, which seems kind of ugly.
What is the best way to approach this?
I don't think you want your web service to talk directly to the database. You probably want your service to interact with some type of business components, which in turn interact with a data access layer. Any concurrency exceptions can be passed from the DAL up to the business layer, where they can be handled so that the web service never has to see the timestamps.
But if you are passing something like a data table up to the client and you want to avoid timestamps, you can do concurrency checking by comparing field by field. The Table Adapter wizards generate this type of concurrency checking by default if you ask for optimistic concurrency checking.
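The version-in-the-WHERE-clause idea can be sketched with a plain integer column (a SQL Server rowversion/timestamp works the same way). This is Python with sqlite3 standing in for SQL Server, and the table and column names are illustrative.

```python
import sqlite3

# Optimistic concurrency: the UPDATE only succeeds if the row still has
# the version the client originally read.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, version INTEGER)")
conn.execute("INSERT INTO customer VALUES (1, 'Acme', 1)")

def update_name(conn, cust_id, new_name, expected_version):
    cur = conn.execute(
        "UPDATE customer SET name = ?, version = version + 1 "
        "WHERE id = ? AND version = ?",
        (new_name, cust_id, expected_version))
    return cur.rowcount == 1  # False means someone updated the row first

first = update_name(conn, 1, "Acme Ltd", expected_version=1)   # succeeds
second = update_name(conn, 1, "Acme Inc", expected_version=1)  # stale version, rejected
```

A zero row count is the concurrency conflict; the service can then re-read the row and decide how to resolve it.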
If your collisions occur infrequently enough that they can be resolved manually, a simple solution is to add an update trigger that copies a row's pre-update values to an audit table. This way the most recent write is the "winner", but no data is ever lost to an overwrite, and an administrator can restore an earlier row state or even combine them.
This technique has its downsides, and is not a very good solution where frequent overwrites are common.
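A minimal sketch of the audit-trigger idea, using Python's sqlite3 in place of SQL Server (a SQL Server trigger would read the pre-update values from the `deleted` pseudo-table rather than `OLD`); the table names are illustrative.

```python
import sqlite3

# Last write wins, but a trigger copies each row's pre-update values to
# an audit table so nothing is ever lost to an overwrite.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE customer_audit (id INTEGER, old_name TEXT,
                             changed_at TEXT DEFAULT CURRENT_TIMESTAMP);
CREATE TRIGGER customer_before_update
BEFORE UPDATE ON customer
BEGIN
    INSERT INTO customer_audit (id, old_name) VALUES (OLD.id, OLD.name);
END;
""")
conn.execute("INSERT INTO customer VALUES (1, 'Acme')")
conn.execute("UPDATE customer SET name = 'Acme Ltd' WHERE id = 1")
conn.execute("UPDATE customer SET name = 'Acme Inc' WHERE id = 1")

history = [row[0] for row in
           conn.execute("SELECT old_name FROM customer_audit ORDER BY rowid")]
# history holds every overwritten value: ['Acme', 'Acme Ltd']
```

An administrator can later restore or merge any earlier state from the audit table by hand, which is why this only works when conflicts are rare.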
Also, this is slightly off-topic, but using web services isn't necessarily the way to go just because the clients will be remoting into the network. ASP.NET web services are XML-based and very verbose. If your client application can count on being always connected, you'd be better off not using web services.
