We have a mobile application made in VB.NET for Windows CE/Mobile smart devices for shipping/reception operations in a warehouse. This application connects to a SQL Server and extensively uses stored procedures. Since Windows Mobile devices are discontinued and replaced by Android devices, we have to convert our solution to Android, using Visual Studio's Xamarin and C#.
I'm a newbie at Android programming. Is there a way we can connect directly to a SQL Server instance and call a stored procedure from Android? I've done some searching and people say it's better to use web services as an intermediary between Android and SQL Server. Is that the best practice?
Thanks for your insight and help
Upgrading legacy applications can be incredibly complicated, depending on the technology and frameworks used.
The first thing I would suggest taking a look at is the architecture documentation Microsoft produced for Xamarin and cross-platform frameworks, which you can see here
Typical Application Layers
Data Layer – Non-volatile data persistence, likely to be an SQLite database but could be implemented with XML files or any other suitable mechanism.
Data Access Layer – Wrapper around the Data Layer that provides Create, Read, Update, Delete (CRUD) access to the data without exposing implementation details to the caller. For example, the DAL may contain SQL statements to query or update the data, but the referencing code would not need to know this.
Business Layer – (sometimes called the Business Logic Layer or BLL) contains business entity definitions (the Model) and business logic. Candidate for the Business Façade pattern.
Service Access Layer – Used to access services in the cloud: from complex web services (REST, JSON, WCF) to simple retrieval of data and images from remote servers. Encapsulates the networking behavior and provides a simple API to be consumed by the Application and UI layers.
Application Layer – Code that's typically platform specific (not generally shared across platforms) or code that is specific to the application (not generally reusable). A good test of whether to place code in the Application Layer versus the UI Layer is (a) to determine whether the class has any actual display controls or (b) whether it could be shared between multiple screens or devices (e.g. iPhone and iPad).
User Interface (UI) Layer – The user-facing layer; contains screens, widgets and the controllers that manage them.
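To make those layers more concrete for someone new to C#, here is a minimal, hypothetical sketch of how the Data Access and Business layers might map onto plain C# types in a Xamarin solution; every name in it is invented for illustration and is not from the documentation above.

// Hypothetical layer mapping; namespaces, interfaces and the "InTransit" status are made up.
namespace Warehouse.Data                   // Data Access Layer
{
    public interface IShipmentStore
    {
        string GetStatus(string shipmentNumber);   // hides SQL/SQLite/HTTP details from callers
    }
}

namespace Warehouse.Business               // Business Layer
{
    public class ReceivingService
    {
        private readonly Warehouse.Data.IShipmentStore _store;

        public ReceivingService(Warehouse.Data.IShipmentStore store)
        {
            _store = store;
        }

        // The business rule lives here, not in the UI and not in the database client code.
        public bool CanReceive(string shipmentNumber)
            => _store.GetStatus(shipmentNumber) == "InTransit";
    }
}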
Now you could simply use the System.Data.SqlClient assembly and fire off stored procedure calls against your database directly. However, a more common approach would be to create a REST API that sits in between your client and your back-end services.
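For completeness, here is a hedged sketch of that direct option, assuming System.Data.SqlClient is usable from your Xamarin project (support on Mono/Xamarin has historically been patchy); the procedure name, parameter and connection string are hypothetical.

using System.Data;
using System.Data.SqlClient;

public static class ShipmentQueries
{
    public static string GetShipmentStatus(string connectionString, string shipmentNumber)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("dbo.GetShipmentStatus", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;                  // call a stored procedure
            cmd.Parameters.AddWithValue("@ShipmentNumber", shipmentNumber);
            conn.Open();
            return cmd.ExecuteScalar() as string;                           // first column of first row
        }
    }
}

Bear in mind that this puts SQL credentials on every device and requires port 1433 to be reachable from the warehouse network, which is part of why the service-based approach is usually preferred.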
You can download a handy e-book created by the devs over at Microsoft that shows some common enterprise patterns for cross-platform technologies like Xamarin, which can be found here
Here's one such example of the kind of patterns those links refer to.
You can also find an overview of various web services that you can use at this link
The options it gives you an overview of are:
ASMX
WCF
REST
So plenty of choice, but it depends on your current approach.
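If you do go the service route, the device-side code ends up looking something like this hedged HttpClient sketch: the Android/Xamarin client never talks to SQL Server, it calls an HTTP endpoint, and the server-side API is what actually runs the stored procedure. The URL, DTO and JSON library choice below are all assumptions.

using System.Net.Http;
using System.Threading.Tasks;
using Newtonsoft.Json;        // Json.NET, commonly used in Xamarin apps

public class ShipmentStatusDto
{
    public string Number { get; set; }
    public string Status { get; set; }
}

public class ShipmentApiClient
{
    private static readonly HttpClient Http = new HttpClient();

    // The server-side API (ASMX, WCF or REST) executes the stored procedure;
    // the device only ever speaks HTTP/JSON.
    public async Task<ShipmentStatusDto> GetShipmentAsync(string shipmentNumber)
    {
        var json = await Http.GetStringAsync(
            "https://warehouse.example.com/api/shipments/" + shipmentNumber);
        return JsonConvert.DeserializeObject<ShipmentStatusDto>(json);
    }
}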
Related
I am building a desktop application that will be used on local network, with SQL Server as database.
This application would have around 50 concurrent users at most. In what particular scenario would I need to use a WCF service? Is it recommended to create a WCF service on the server computer where the database would reside, so we connect to this server through the WCF service instead of connecting to the database directly? What is the recommended way to connect to SQL Server data, and why?
Edit: Let me explain in more detail. I have used WCF RIA Services before, so I know how they work. Let's assume that WCF services work in the same way. The question was directed toward why we would use WCF instead of directly connecting to the database. I didn't want to specify my current application's requirements, since I would get a specific answer for a specific requirement. My goal was to understand in general why and when you would use one instead of the other. And I have received satisfying answers so far.
It appears to me that the general consensus is to use WCF only if there would be demand from another type of application that would use web access to get data from the service. Also, if I understood correctly, from a security point of view there is no difference between the two.
There would be a statistical app in the future that uses the web to provide read-only statistics to users, and naturally some service will be required for that task (the application has no specific client in mind; it will be offered to lots of clients). Since I need a demo application done very rapidly for particular clients, I am thinking of neglecting the service part for now and doing proper layering (WPF -> VM -> Model -> EF), so later I could just insert a service between the model and EF. I guess it should not take too much time to get the WPF app running with the inserted layer. I am also postponing the service for another reason: since HTML5 is (going to be) the main technology for the web, and there is a possibility that Silverlight (which I have been using) will be abandoned as a technology, the logical decision would be to choose HTML5 over Silverlight. But since I am totally unfamiliar with HTML5 and its requirements, I am not sure if a WCF service is the best choice for it, and this is also one of the reasons to postpone the decision about the service type (along with the requirement to get the desktop demo app done as fast as possible).
I think a better way to consider the question is whether you should abstract your database and data access layer from the application behind a service interface. You could use WCF and SOAP, or you could use a REST-based HTTP service; the choice of technology is secondary to whether the current or future requirements of your application indicate that an additional layer of abstraction is warranted (a minimal sketch of such a service interface follows the list of reasons below).
Reasons you might consider using a service interface instead of directly connecting to the SQL database include but are not limited to:
Ease of supporting multiple operating systems/client UIs
Ability to evolve the data/service interface separately from your database schema
Isolate the application from changes to the database schema or location (you don't have to redeploy the application, only change the internals of the services it is calling)
If data could be used by other systems, you have a standard means of allowing these systems to interface with the data your application is managing
Reduced SQL database connection security concerns (only service identity connects to database, allowing you to use a variety of authentication/authorization strategies on the client side)
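Here is the minimal sketch referred to above: a hedged example of what such a service interface could look like with WCF. All contract, type and member names are invented for illustration.

using System.ServiceModel;

[ServiceContract]
public interface IOrderService
{
    [OperationContract]
    OrderDto GetOrder(int orderId);

    [OperationContract]
    int CreateOrder(OrderDto order);
}

// Only this implementation knows about SQL Server. The schema can change, or the
// database can move, without touching the ~50 clients, and only the service account
// needs database permissions.
public class OrderService : IOrderService
{
    public OrderDto GetOrder(int orderId)  { /* ADO.NET or EF query goes here */ return null; }
    public int CreateOrder(OrderDto order) { /* validation + insert goes here */ return 0; }
}

public class OrderDto
{
    public int Id { get; set; }
    public string Customer { get; set; }
    public decimal Total { get; set; }
}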
The trade-off you are looking at is the time/cost/complexity of implementing a service interface versus the flexibility and maintainability benefits you will gain. You should evaluate the needs of your application and your customer before you decide whether to connect directly to your data store using ADO.NET or use a service layer.
You should take a look at the Microsoft Service Layer Guidelines as they cover a lot of the considerations to take into account.
Unless you need to create a reusable service, I can't think of a reason to add a WCF layer, unless you are just looking for a reason to do it. I think you can just go with some sort of ORM like EF or nHibernate and be happy.
The main reason for WCF is security. If the client connects directly to the DB, then the client must be given rights on tables; the client can hack into the connection and use T-SQL directly, and you must expose port 1433 to the network in a single-tier application. With WCF there is no direct access from the client to SQL Server.
It is not just more secure in general, you can also have more granular security: .NET service code can enforce row-level security, while a table only has column-level security. If this is a business app on a private network and you don't expect anyone to try to hack into your DB, then having clients connect directly to the SQL server is easier to build. With a server-side service, the other factor is that a change to server-side code happens in one spot, so you don't have to update 50 devices.
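As a hedged illustration of the row-level security point, the sketch below shows a service method that filters rows by the authenticated WCF caller before anything leaves the server; the table, column and class names are made up.

using System.Data;
using System.Data.SqlClient;
using System.ServiceModel;

public class OrderService
{
    // Only the service identity holds a SQL login; clients never see the connection string.
    private readonly string _connectionString;

    public OrderService(string connectionString)
    {
        _connectionString = connectionString;
    }

    public DataTable GetMyOrders()
    {
        // Identity of the authenticated WCF caller, not a shared SQL account.
        var callerId = ServiceSecurityContext.Current.PrimaryIdentity.Name;

        using (var conn = new SqlConnection(_connectionString))
        using (var cmd = new SqlCommand(
            "SELECT OrderId, Status FROM dbo.Orders WHERE OwnerLogin = @Owner", conn))
        {
            cmd.Parameters.AddWithValue("@Owner", callerId);
            var table = new DataTable();
            new SqlDataAdapter(cmd).Fill(table);   // only this caller's rows leave the server
            return table;
        }
    }
}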
Apologies for the newbie web service question -
I am trying to create a web service that has a list of methods to perform reads/writes to a database. An example function would be of the form:
CreateNewEmployee(string username, string employeeid, string deptname)
I created a web service in .NET (ASMX) that has the above-mentioned web method. In it, I open a connection to the database, do an insert into the database, and then close the connection. Is this the right way to design the web service call?
Should I instead be passing an object instead of multiple parameters?
Any pointers toward best practices when trying to create a webservice that writes data into a database?
To add some more information
We would like to have web services since they might be reused by many different applications within the organization (both web and desktop).
We are also planning to create an environment where users can use these web services to create data mashups.
Thanks,
Nate
Yes - pass objects instead of large parameter sets. Also, have you considered WCF if you're in a .NET environment? If you look at how ADO.NET Data Services (formerly Astoria) works, it will put you in the right direction.
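A hedged sketch of the "pass an object" suggestion against the ASMX example above; the request class, namespace URI and service name are illustrative, not part of the original question.

using System.Web.Services;

public class NewEmployeeRequest
{
    public string Username { get; set; }
    public string EmployeeId { get; set; }
    public string DeptName { get; set; }
}

[WebService(Namespace = "http://example.org/hr")]
public class EmployeeService : WebService
{
    [WebMethod]
    public void CreateNewEmployee(NewEmployeeRequest request)
    {
        // Same body as before (open connection, insert, close), but the contract can
        // now grow by adding properties to the request object instead of changing the
        // method signature and breaking existing callers.
    }
}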
Quoting from the winning answer to this SO question:
Web Services are an absolutely horrible choice for data access.
It's a ton of overhead and complexity for almost zero benefit.
You can read the rest of the discussion there.
Edit: One excellent approach to having common data access functionality that can be shared by multiple applications - web, desktop, service - is to create a Visual Studio project that compiles to a DLL. Each solution that wants to use the data access functionality references the DLL, which can be deployed to the GAC or some other central location, or just added to the project's bin folder. Alternatively, in order to be able to step through the data access code, the data access project can be added to a solution.
This is a very common practice in large enterprises, where many back office applications share common functionality. It is used not just for data access, but also for other services such as logging and authentication/authorization. Some divisions create a set of these DLLs, which they refer to as their "framework". It ensures that every application will have the same functionality and the same business logic, and that there is a single place for revisions to be made that will affect all of the applications. This is a similar benefit to using web services, but it avoids the overhead and performance hit of web services.
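As a hedged sketch of that shared-DLL idea, the data access code might live in its own class library that each solution references; the project, class and procedure names below are invented.

// Class library project: Contoso.DataAccess (compiles to Contoso.DataAccess.dll and is
// referenced by the web app, the desktop app and any services).
using System.Data;
using System.Data.SqlClient;

namespace Contoso.DataAccess
{
    public class EmployeeRepository
    {
        private readonly string _connectionString;

        public EmployeeRepository(string connectionString)
        {
            _connectionString = connectionString;
        }

        // Every referencing application gets exactly this insert logic; a fix here is
        // made in one place and picked up by all of them on their next deployment.
        public void CreateNewEmployee(string username, string employeeId, string deptName)
        {
            using (var conn = new SqlConnection(_connectionString))
            using (var cmd = new SqlCommand("dbo.CreateNewEmployee", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.Parameters.AddWithValue("@Username", username);
                cmd.Parameters.AddWithValue("@EmployeeId", employeeId);
                cmd.Parameters.AddWithValue("@DeptName", deptName);
                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }
}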
Starting from scratch in terms of knowledge I have written a desktop application using WPF and NHibernate. This has helped me get up to speed with both NHibernate and WPF.
However, there is a requirement to make the application accessible from multiple places - these include handheld devices which have very simple web browsers (no JavaScript capability), web services, the Internet, the desktop application and potentially other user interfaces.
I believe this requires moving the application from its current NHibernate.dll deployed on the client to a web-based application. The sheer choice of technology stacks is overwhelming and I am hoping I can get pointed in the right direction.
In essence, I want to be able to access the data from the server side in the desktop client, from the web service, from the handheld devices.
On the server side I guess I would have a web server (IIS?), NHibernate, a database, and some way of communicating between the clients and the server.
What would be the best choice in this circumstance? Is it REST? SOAP? WCF? Something I don't know about / haven't mentioned?
Any assistance and advice from people who have implemented similar things would be very much appreciated.
Well, since you have already done most of the work on the desktop, moving the actual data access part to a web service is not going to be hard.
If I were you, I would make that transition and give the SOAP/WCF service the same signatures (or at least as close as possible) as the data layer. This will make the transition easier for the desktop application you have already written. You will have to add code for synchronizing, or at least safety code for when the service is inaccessible.
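A hedged sketch of that "same signatures as the data layer" idea: the existing repository interface is mirrored almost verbatim by a WCF contract, so the WPF client mostly just swaps which implementation it is handed. All type and member names are illustrative.

using System.Collections.Generic;
using System.ServiceModel;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// What the desktop application consumes today (NHibernate-backed, in-process).
public interface ICustomerRepository
{
    Customer GetById(int id);
    IList<Customer> FindByName(string name);
    void Save(Customer customer);
}

// The service contract mirrors it, so call sites in the client barely change when the
// in-process repository is replaced by a service proxy.
[ServiceContract]
public interface ICustomerService
{
    [OperationContract] Customer GetById(int id);
    [OperationContract] IList<Customer> FindByName(string name);
    [OperationContract] void Save(Customer customer);
}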
For mobile access via a thin web client such as the browser, you are looking at a web site/application that touches the DB (possibly via the same data-access web service) and generates HTML.
I can't advise between REST/SOAP/WCF as I have only minor prior experience with each, but I can tell you that I have created a similar setup with a SOAP web service feeding a web application and a .NET Compact Framework application for WM6+, and it works well.
Please I need help.
I have a project in which I need an application that communicates with a local DB server and simultaneously with a central remote DB server to complete some tasks (read stock quotas from the local server, create an order, and then write the order to the central orders DB, ...).
So, I don't know which architecture and technology to use for this.
A web application, .NET WinForms client applications on each computer, or a web-service-based central application with client applications?
What are the general differences between these approaches?
Thanks
If you don't want to expose your database directly to the clients, I'd recommend having a web service layer in between. Depending on the sensitivity of your data and the security level of your network, I'd recommend either a web service approach (where you can manage the encryption of data yourself, without the need for expensive SSL certificates) or a web interface (which might be easier to construct, but with limitations in security).
I agree with Tomas that a web service layer might be good. However, when it comes to choosing between WebForms and WinForms, I don't think your question includes enough information to make the choice.
I'd say that if you want a powerful and feature-rich user interface and want to make development easy, WinForms is probably the way to go. But if you need it to be usable from a varied array of clients and want easier maintenance and deployment, a web app might be best.
First, focus on the exact relationship between these databases. What does "local" mean? Right there on the user's desktop? Shared between all the users in their office? Presumably the local quotes (you do mean stock quotes and not quotas?) could potentially be a little out of date relative to the central order server's view of the world. Does that matter? If I place an order for 100 X at price 78.34, the real price may be different. What is the intended behaviour?
My guess is that there is at least some business logic, and so we need to decide where that runs. One (thick-client) approach is to put that logic on the desktop; the desktop app then might write directly to the central DB. I don't tend to do this, for several reasons:
Every client desktop gets a database connection. Scaling is not good, eventually the database gets unhappy when the number of users gets very large.
If we need a slightly different app, perhaps exposed to a different set of users via the Web or whatever, we end up reproducing that business logic.
An alternative (thin, or browser-based) approach keeps the UI on the desktop but puts the logic on the server. The client can then invoke some kind of service. There are lots of possible ways of doing that; a simple web service or REST service will do the job. I hope it's clear that this service-based approach addresses my two points above.
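A hedged sketch of that thin-client shape, using ASP.NET Web API purely as an example host; the controller, request type and return values are invented for illustration.

using System.Web.Http;

public class OrderRequest
{
    public string Symbol { get; set; }
    public int Quantity { get; set; }
    public decimal ClientSeenPrice { get; set; }
}

public class OrdersController : ApiController
{
    [HttpPost]
    public IHttpActionResult PlaceOrder(OrderRequest request)
    {
        // 1. Re-read the current quote from the central DB (the client's local copy may be stale).
        // 2. Apply the business rules once, here, for every kind of client (desktop, web, mobile).
        // 3. Write the order to the central orders DB and report what was actually accepted.
        return Ok(new { OrderId = 12345, AcceptedPrice = 78.34m });
    }
}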
By symmetry I would treat the local databases in the same way: wrap them in services. However, it's possible that some more complex relationship between the databases exists, in which case you might need the local service layer to interact with the central service layer.
I'm touting the general principle of Do Not Repeat Yourself: implement each piece of business logic once.
We have a relatively large application that is strongly tied into Firebird (stored procedures, views etc). We are now getting a lot of requests to support additional databases and we would also like to move a lot of the functionality from the client to the server.
Now seems like a good time to move to a 3(4) tier architecture. We have already looked at DataSnap 2009 and RemObjects SDK/DataAbstract. Both seem like they would do the job, but are there any advantages/disadvantages we should look out for? Are there any other frameworks that you could recommend?
Cheers,
Paul
I can recommend using the KBM Middleware components from Components4Developers. There is a bit of a learning curve, but they are very flexible and hold up well under use in real-world conditions.
Comment from a user (http://www.components4programmers.com/usercomments/commentfromapowerusertoaquestion.htm)
Changing your application to multi-tier with a new framework (RemObjects, DataSnap, kbmMW, or whatever) will require a lot of changes to your application architecture. I recommend going in that direction in the future, but you can achieve support for multiple databases with other products, such as:
UniDAC from Devart (best components for database access, with direct connections).
AnyDAC (from the same company that offers RemObjects).
SQLDirect (supports 9 major DBs and also ODBC).
ZeosDB (open source).
Using one of the components above will give you support for most major databases, and it will not require a lot of changes; in some cases you just replace the old database components with the new ones and maybe change some properties.
However, changing to multi-tier will not only let you support more databases; it will also separate your business logic from the presentation layer, so you can have more presentation layers for your application, such as a web interface or smart devices.
But the most important thing about a multi-tier architecture is that you will have a scalable system that can grow beyond the number of connections your database can handle, besides other benefits such as being able to use other languages to write the client applications.
In the process of moving to a multi-tier application, you could consider using a transport protocol between the layers which is language/technology independent (like web services; I think RemObjects supports that).
This could make a re-implementation of a layer simpler later (for example, if you later have to make another version of the client application in a browser/Java/Silverlight).
You can also investigate Midware http://www.overbyte.be/frame_index.html
For multi-tier architectures I also recommend checking out message-oriented middleware.
With message-oriented middleware, cross-language and cross-platform application integration can be implemented using the peer-to-peer or the publish/subscribe communication model. Messaging systems are loosely coupled, asynchronous and reliable. For example, they are core components in Java(tm) application servers such as JBoss.
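To make the publish/subscribe idea concrete in C# (the article's client libraries mentioned below are for Delphi/Free Pascal), here is a hedged sketch roughly following the RabbitMQ .NET client tutorials; the broker choice, queue name and payload are all assumptions for illustration.

using System;
using System.Text;
using RabbitMQ.Client;
using RabbitMQ.Client.Events;

class StockEventsDemo
{
    static void Main()
    {
        var factory = new ConnectionFactory { HostName = "localhost" };
        using (var connection = factory.CreateConnection())
        using (var channel = connection.CreateModel())
        {
            channel.QueueDeclare(queue: "stock-updates", durable: false,
                                 exclusive: false, autoDelete: false, arguments: null);

            // Subscriber side: any interested client, in any language, reacts asynchronously.
            var consumer = new EventingBasicConsumer(channel);
            consumer.Received += (sender, ea) =>
                Console.WriteLine(Encoding.UTF8.GetString(ea.Body.ToArray()));
            channel.BasicConsume(queue: "stock-updates", autoAck: true, consumer: consumer);

            // Publisher side: e.g. fired after a row changes, instead of a Firebird database event.
            var body = Encoding.UTF8.GetBytes("ITEM-42 stock changed");
            channel.BasicPublish(exchange: "", routingKey: "stock-updates",
                                 basicProperties: null, body: body);

            Console.ReadLine();   // keep the consumer alive for the demo
        }
    }
}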
For Firebird, I recently wrote a blog article on Firebird database events, their limitations, and ways to replace them with message-broker-based solutions (which are available as open source):
Firebird Database Events and Message-oriented Middleware (part 1)
Firebird Database Events and Message-oriented Middleware (part 2)
(disclaimer: I am a developer of Delphi and Free Pascal client libraries for open source message brokers).