Web service orchestration for CRUD web services - database

I have 4 web services providing CRUD methods to 4 different simple databases (one table per database).
What I want to do is create another web service which calls the aforementioned services to gather some specific data; the data requested might live in one database or be spread across all four.
I've tried working with orchestration (OW2 Orchestra), but I've found that it is aimed at business processes, whereas in my case I only have services representing a CRUD application for a database.
I've also thought of using an ESB, but it seems a laborious undertaking for a simple case study like mine.
Do I have to create this web service manually, or is there another approach?
Thank you.

I've done a lot of work with Mule ESB - development, design, running in production, etc. I would say an ESB would be overkill for you. If all you need to do is have one wrapper web service that calls 1-4 other web services, I would keep it simple and just implement it manually.
An ESB requires a server to run your projects on and there is a fairly large learning curve.
It would be easier to learn how to build the web service yourself than to learn the ESB, and a hand-written web service is also easier to maintain than an ESB.
However, if you are going to have web services multiplying all over the place, then an ESB might make sense.
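To make the "just implement it manually" option concrete, here is a minimal sketch of an aggregating service using JAX-RS and its standard client. The downstream URLs and the keys of the returned map are made-up placeholders, not something from the original question.

// Minimal hand-written aggregator (JAX-RS). Downstream URLs are hypothetical.
import java.util.HashMap;
import java.util.Map;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.client.Client;
import javax.ws.rs.client.ClientBuilder;
import javax.ws.rs.core.MediaType;

@Path("/customer-overview")
public class CustomerOverviewResource {

    private final Client client = ClientBuilder.newClient();

    @GET
    @Path("/{id}")
    @Produces(MediaType.APPLICATION_JSON)
    public Map<String, Object> get(@PathParam("id") String id) {
        Map<String, Object> result = new HashMap<>();
        // Call each CRUD service and merge the pieces it returns.
        result.put("profile", client.target("http://crud-service-a/customers/" + id)
                .request(MediaType.APPLICATION_JSON).get(String.class));
        result.put("orders", client.target("http://crud-service-b/orders?customerId=" + id)
                .request(MediaType.APPLICATION_JSON).get(String.class));
        // ...same pattern for the remaining two CRUD services, only when needed.
        return result;
    }
}

If the four services expose SOAP rather than REST, the shape stays the same; only the generated client stubs change.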

Related

How to design/develop an integration layer or bus for different external services/apps

We are currently looking into replacing one of our apps, possibly with an ESB or some similar tool, and are looking for some insights into how best to approach this.
We currently have a standalone service that consumes/interacts with different external services and data sources, some delivered through SOAP web services and others accessed through a plain DB connection. This service is exposed through SOAP, and the other apps that consume it are very tightly coupled to it. Now we also have other apps that need to consume some of the external services, and we would like to replace all of this with an ESB or some sort of SOA platform.
What would be the best way to replace this 'external' services integration layer with an ESB? We were thinking of having a 'global' contract/API in which all of the services we consume are exposed as one single contract, with all the operations and data structures we use under one single namespace. Would this be the best way of approaching it? If so, are there any tools that could help us automate this process, or do we basically have to handcraft this contract/API? This would also mean that for any changes to the underlying services/APIs we would have to update this new API as well.
If not, then the other option I see is to basically use the ESB as a 'proxy' layer in which all of our sources are exposed as they are, so we would end up with several different contracts/API endpoints, but I don't really see the value in this.
Also, given the above, what would be the best tool for the job? Is a full-blown ESB overkill, or are we much better off rolling our own using something like Apache Camel or Spring Integration?
A few more details:
We are currently integrating over 5 different external services, with more to come in the future.
Only a couple of apps consume our current app at the moment, but several other apps/systems will need to consume some of these external services in the future.
We are currently using a single method of communication (SOAP) between these services, but some apps might use pub/sub messaging in the future, although SOAP will still be the main protocol used.
I am new to ESB integration, so I apologize in advance if I'm misunderstanding a lot of these technologies and the problems they are meant to solve.
Any help/tips/pointers will be greatly appreciated.
Thanks.
You need to put some design thought into what you want to achieve over time.
There are multiple benefits and potential pitfalls to introducing an ESB.
Here are some typical benefits/use cases:
When your applications are hard to change or have very different release cycles, it's convenient to have an ESB in the middle that can adapt to changes quickly. This is very much the case when your organization buys a lot of COTS products and cloud services that might come with an update the next day that breaks the current API.
When you need to adapt data from one master data system to several other systems that might not support the same interfaces: the CRM system might want data imported via web services as soon as it's available, the ERP wants data through DB/staging tables, and the production system wants data every weekend in a flat file delivered via FTP. To keep the master data system clean and easy to maintain, implement one single integration service in the master data system and adapt that interface to the various other applications within the ESB platform instead.
Aggregation or splitting of data from various sources to protect your sensitive systems can be another use case. Say you have an old system that can only take small updates of information at a time and it's not worth upgrading; then an integration solution that can do aggregation, splitting, or throttling can be a good answer.
Other benefits and use cases include the ability to track and wire-tap every message passing between systems, which can even be used together with business intelligence tools to gather KPIs.
A conceptual ESB can also introduce a canonical message format that is used by all services that need to communicate. If a lot of applications share the same data with several other applications (not only point to point), then the benefits of a canonical message format can outweigh the cost (which is, or can be, high). An ESB server can be useful for dealing with canonical data, as it is usually very good at mapping from one format to another.
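As a small illustration of the master-data fan-out and canonical-mapping cases above, here is a minimal Apache Camel route sketch; every endpoint URI, queue name, stylesheet, and table name is a made-up assumption, not taken from the question.

// Minimal Apache Camel sketch: one master-data feed mapped to a canonical format
// and fanned out to consumers with different interface requirements.
// All endpoint URIs, names, and the staging table are hypothetical.
import org.apache.camel.builder.RouteBuilder;

public class MasterDataRoutes extends RouteBuilder {
    @Override
    public void configure() {
        from("jms:queue:masterdata.updates")                       // single integration point
            .to("xslt:canonical-customer.xsl")                     // map to the canonical format
            .multicast()
                .to("cxf:bean:crmImportEndpoint")                  // CRM: push via SOAP right away
                .to("sql:insert into erp_staging (payload) values (:#${body})") // ERP: staging table
                .to("file:/data/export/production")                // production: flat file pickup
            .end();
    }
}

The point is not the specific components but that the adaptation logic lives in one place instead of inside the master data system.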
However, introducing an ESB without a plan for which benefits you are trying to achieve is not really a good thing, since it introduces overhead: you need another server to keep alive, and perhaps another team to understand all the data flows. You need particular knowledge of your integration product. Finally, you need to have some governance around it so that your ESB initiative does not drift away from the goals/benefits you have foreseen.
You should choose a technology that you are comfortable with, or think you can become comfortable with. Apache Camel is indeed very powerful and my favorite integration engine, but it's not an ESB, as it does not come with a runtime that you can use to deploy/manage/monitor your integration services. You can use it together with most Java EE application servers or, even better, Apache ServiceMix (= Karaf + Camel + ActiveMQ + CXF), which is built for this task.
The same goes for Spring Integration: you need to run it somewhere, on an application server or the like.
There is a large set of products, both open source and commercial, that do these things.

Application backend and networking?

I'm not sure if the title I used is good, but hopefully my question will be better.
I have a Java application that tracks data on penny auction sites (such as beezid.com) and will eventually upload all of the data to a database. Clients will download a Java application and run it on their local machine, but this local application will have to be able to access the database and obtain all of the data.
All I have ever done is Java application programming, so this is all very new to me. Can anybody help me with a solution that will accomplish this?
This is all I can think of:
A server that will run the backend application 24/7 and use JDBC to upload data to the database.
A separate server for the database alone.
I have no idea how the client application should connect to the database, though.
Any help/links to tutorials on stuff like this would be appreciated.
You could actually let your clients connect to the database directly if the database is on a public IP.
An architecturally much better way of doing this is with web services. This would make your system much safer, more robust, and more scalable.
Web services are client and server applications that communicate over the World Wide Web’s (WWW) HyperText Transfer Protocol (HTTP). As described by the World Wide Web Consortium (W3C), web services provide a standard means of interoperating between software applications running on a variety of platforms and frameworks. Web services are characterized by their great interoperability and extensibility, as well as their machine-processable descriptions, thanks to the use of XML. Web services can be combined in a loosely coupled way to achieve complex operations. Programs providing simple services can interact with each other to deliver sophisticated added-value services.
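To show what that looks like in practice, here is a minimal sketch of an HTTP endpoint that the client application could call instead of holding database credentials itself; the URL mapping, JDBC URL, credentials, and table/column names are all placeholders.

// Minimal servlet sketch: the desktop client calls this endpoint over HTTP
// instead of opening a JDBC connection to the database directly.
// The JDBC URL, credentials, and table/column names are hypothetical.
import java.io.IOException;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

@WebServlet("/auctions")
public class AuctionDataServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        resp.setContentType("application/json");
        StringBuilder json = new StringBuilder("[");
        try (Connection con = DriverManager.getConnection(
                     "jdbc:mysql://localhost:3306/auctions", "appuser", "secret");
             PreparedStatement ps = con.prepareStatement(
                     "SELECT item, final_price FROM tracked_auctions");
             ResultSet rs = ps.executeQuery()) {
            while (rs.next()) {
                if (json.length() > 1) json.append(',');
                json.append("{\"item\":\"").append(rs.getString("item"))
                    .append("\",\"finalPrice\":").append(rs.getDouble("final_price")).append('}');
            }
        } catch (SQLException e) {
            resp.sendError(500, "database error");
            return;
        }
        resp.getWriter().write(json.append(']').toString());
    }
}

The client then just performs an HTTP GET against /auctions and parses the JSON, and the database never needs to be exposed on a public IP.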

Designing web service calls that read/write from database

Apologies for the newbie web service question -
I am trying to create a web service that has a list of methods to perform reads/writes against a database. An example function will be of the form:
CreateNewEmployee(string username, string employeeid, string deptname)
I created a web service in .NET (asmx) that has the above-mentioned web method. In it, I open a connection to the database, do an insert into the database, and then close the connection. Is this the right way to design the web service call?
Should I be passing an object instead of multiple parameters?
Any pointers toward best practices when trying to create a web service that writes data into a database?
To add some more information:
We would like to have web services since they might be reused by many different applications within the organization (both web and desktop).
We are also planning to create an environment where users can use these web services to create data mashups.
Thanks,
Nate
Yes - pass objects rather than large parameter sets. Also, have you considered WCF if you're in a .NET environment? If you look at how ADO.NET Data Services (formerly Astoria) works, it will point you in the right direction.
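The question is about .NET, but the "pass an object" advice is stack-neutral; as a rough illustration, here is the same shape sketched in Java with JAX-WS, where the request class and its fields are invented for the example.

// Sketch of "pass one request object rather than a long parameter list".
// The NewEmployeeRequest type and its fields are hypothetical.
import javax.jws.WebMethod;
import javax.jws.WebService;

@WebService
public class EmployeeService {

    // Instead of createNewEmployee(String username, String employeeId, String deptName),
    // take a single request object; adding a field later does not break the signature.
    @WebMethod
    public void createNewEmployee(NewEmployeeRequest request) {
        // validate the request, then hand it off to the data-access layer
    }
}

class NewEmployeeRequest {
    private String username;
    private String employeeId;
    private String deptName;
    // getters and setters omitted for brevity
}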
Quoting from the winning answer to this SO question:
Web Services are an absolutely horrible choice for data access.
It's a ton of overhead and complexity for almost zero benefit.
You can read the rest of the discussion there.
Edit: One excellent approach to having common data access functionality that can be shared by multiple applications - web, desktop, service - is to create a Visual Studio project that compiles to a DLL. Each solution that wants to use the data access functionality references the DLL, which can be deployed to the GAC or some other central location, or just added to the project's bin folder. Alternatively, in order to be able to step through the data access code, the data access project can be added to a solution.
This is a very common practice in large enterprises, where many back office applications share common functionality. It is used not just for data access, but also for other services such as logging and authentication/authorization. Some divisions create a set of these DLLs, which they refer to as their "framework". It ensures that every application will have the same functionality and the same business logic, and that there is a single place for revisions to be made that will affect all of the applications. This is a similar benefit to using web services, but it avoids the overhead and performance hit of web services.

Are "WCF Data Services" going in the right direction?

I really like the idea of "WCF Data Services", but how does it work in a real-life scenario? WCF Data Services provide just a nice way for the client to CRUD the data. However, it's very limited in what you can pass and get back, so one ends up having all the business logic written on the client side. That's probably OK for small applications that just need a database back end. You don't want that in serious enterprise applications: your client side will grow too large, and if your business logic is some kind of know-how, it can easily be disassembled.
Don't be misled into thinking that SOAP is for the enterprise and REST is for sucky little side web apps. Many people have wasted a lot of time on SOAP frameworks, me included, and the trouble these frameworks cause for inter-enterprise communication would count into the billions of dollars.
REST provides an opportunity to care only about the data being passed to and from services and the semantics used to operate against them; the rest (excuse the pun) is handled by transport-level mechanisms. Do you want encrypted data channels? HTTPS is there for that. Do you need authentication? There are plenty of frameworks on top of HTTP that support this already, rather than using complex WS-* protocols. Do you want reliable messaging? You can engineer it quite simply using message queue software - I have only ever seen one SOAP framework handle this well, and it wasn't very interoperable at that point.
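As a small illustration of leaning on transport-level mechanisms, here is a plain Java HTTP call over HTTPS with Basic authentication (Java 11+ HttpClient); the URL and credentials are placeholders.

// Encryption (HTTPS) and authentication handled at the transport level,
// with no WS-* machinery. The URL and credentials are placeholders.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

public class RestCallExample {
    public static void main(String[] args) throws Exception {
        String credentials = Base64.getEncoder()
                .encodeToString("serviceUser:servicePassword".getBytes());

        HttpRequest request = HttpRequest.newBuilder(
                        URI.create("https://partner.example.com/api/orders/42"))
                .header("Authorization", "Basic " + credentials)
                .header("Accept", "application/json")
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        System.out.println(response.statusCode() + ": " + response.body());
    }
}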
Whilst I am not discounting SOAP as enterprise-grade, all I am saying is: don't discount REST-based services as an excellent way for your enterprise modules to communicate.
I personally have integrated multi-million-dollar systems using REST and SOAP, and I currently prefer REST-based services for their ease of development, third-party integration, understandability, ease of documentation, and the ability to rapidly deploy services across businesses.
I can understand your confusion given the naming... WCF Data Services are REST-based, which is notoriously poor for enterprise environments. However, you can have normal SOAP-based WCF services, which work fine for the enterprise.

Using a web service to secure a database

There are some rumors floating around that the team at my company will soon be using web services for all future application development. The architecture is supposed to be something like this:
Application --> Web Service --> Database
The stated reasoning behind it is security. This sounds like a huge waste of time for little if any benefit. My question is: in what ways does a web service make your data more secure than a database? I would think that if an attacker wanted to get all your data and had already gotten onto the app server, it would be fairly trivial to figure out how the application is getting its data.
Please keep in mind that these web services would be purely for data, would have little if any business/validation logic, and would also be outside the application developers' control (at least that's the way it has worked with all previous applications that have used web services).
If it's true that there will be no business logic or validation in the web services, then there is only a limited security benefit to adding the additional layer of abstraction. I say limited because the interface between your application and the database is still more constrained than if they were talking to each other directly.
If you add validation and business logic to the equation, there is a significant security benefit, as anyone who has access to the application account can only do to the database what the application is able to do. Additionally, this is a better design because it reduces coupling between your application and the implementation details of how the data is stored in the database. If you want to change the database schema, you only need to update the web services, not entire applications.
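A small sketch of that validation point: the service exposes only a narrow, validated operation, so a caller holding the application's credentials still cannot issue arbitrary SQL. The class, method, and table names below are invented for the example.

// Callers can only perform this validated operation; they never see the schema
// or the connection, so arbitrary queries are off the table. Names are hypothetical.
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import javax.sql.DataSource;

public class EmployeeDataService {

    private final DataSource dataSource;

    public EmployeeDataService(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    public void createNewEmployee(String username, String employeeId, String deptName)
            throws SQLException {
        // validation / business rules live here, in front of the database
        if (username == null || username.isBlank() || employeeId == null || employeeId.isBlank()) {
            throw new IllegalArgumentException("username and employeeId are required");
        }
        try (Connection con = dataSource.getConnection();
             PreparedStatement ps = con.prepareStatement(
                     "INSERT INTO employees (username, employee_id, dept_name) VALUES (?, ?, ?)")) {
            ps.setString(1, username);
            ps.setString(2, employeeId);
            ps.setString(3, deptName);
            ps.executeUpdate();
        }
    }
}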
One important thing about web services is interoperability, so that different applications on different platforms can later use the services and data. Your company will benefit a lot from that. And you are right about security: it is definitely one of the good reasons to use a web service rather than expose a public endpoint of the database, which is dangerous!
Web services also make your data more accessible. For example, your data can be accessed from the browser with JavaScript; there is no way for JavaScript to access the database on the server directly.
All in all, go for it; that is the right approach.
The security argument is questionable; authenticating to a web service is no different from authenticating to the database.
There are legitimate reasons for moving DB operations to web services and SOA in general, but security isn't one of them.
If you use a web service, hopefully you will also be using some kind of queue when sending data to the database. With a web service and queue combination, the benefit is a smaller chance of lost data: without it, if you send data to the database and it never gets there, there is nowhere else for it to go and it just disappears.
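As a sketch of that web service plus queue combination, the service below hands each write to a durable queue (JMS 2.0 API) instead of hitting the database directly, so a database outage does not lose the data; the queue name and how the ConnectionFactory is obtained are assumptions.

// The web service layer calls submitOrder(); a separate consumer drains the queue
// into the database and can retry if the database is temporarily unavailable.
// The queue name and the injected ConnectionFactory are hypothetical.
import javax.jms.ConnectionFactory;
import javax.jms.JMSContext;

public class OrderWriteService {

    private final ConnectionFactory connectionFactory;

    public OrderWriteService(ConnectionFactory connectionFactory) {
        this.connectionFactory = connectionFactory;
    }

    public void submitOrder(String orderJson) {
        try (JMSContext context = connectionFactory.createContext()) {
            context.createProducer()
                   .send(context.createQueue("orders.pending"), orderJson);
        }
    }
}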
You are correct, though: if someone wants to break into your system, a web service isn't going to help. If anything, it might make things worse if you make the web service public and they find its name, because then they can just query your DB through the web service, and any security features on your servers will just think it is your application requesting the information.

Resources