Accessing SSAS using External User Database - sql-server

We have a BI team that has created a wonderful data warehouse that works fine for the internal staff using it through Excel on the internal network. They use Windows authentication through the domain controller, and everything works fine, including restricting access to specific users and AD groups.
The issue is that we now want to provide the same access to a wider audience that is not part of our Windows Domain. This is further complicated by the fact that the information on the users that need access to the system is already stored in another location (an application with a SQL database).
The goal is to make it so that they connect (probably using HTTPS) to the cube (using Excel Analysis services integration), and be authenticated using the Username/Password that they had in the main application.
The main application has a WCF service interface for user authentication and session management, so all I really need is a way to provide authentication in front of MSMDPUMP.dll against that web service. We can also add in a role mapping so that we can define the SSAS roles against the users in the application.
I was thinking that I could create a DLL that has the same interface as MSMDPUMP.dll and have it translate calls between the client and the real DLL, but this seems like overkill.
Are there any pre-built tools to do this? (and yes I know that Sharepoint can do something like this, but that's not an option so please don't suggest it). Does anybody know of any blogs detailing how to do it?
Any pointers in where to start with creating an interface between the 2?
The question is similar to "How to secure MS SSAS 2005 for HTTP remote access via Internet?"; however, I'm looking at providing the authentication mechanism from another data source and providing the roles to SSAS, not the users. We don't want to have to set up a new user in SSAS for every user that is set up in the external application.
UPDATE: To be clear, the external users need to connect to the cube using Excel, and the data returned needs to be filtered by the role they're in and the security applied in the cube.
We are able to change the cube to use Dynamic Dimension Security and use CustomData attributes if that helps.

The eventual solution ended up being a combination of a third-party control and Dynamic Dimension Security.
We found that it isn't possible to put a MembershipProvider-style interface in front of MSMDPUMP without significant effort, so our solution gave them a web interface to use instead.
The control we used is DevExpress's "PivotGrid" control. It's not free, but it costs significantly less than the development resources needed to implement any other custom solution.
Along with the control, we've applied Dynamic Dimension Security to the cube, so each user of the site has a dedicated connection string to the cube with "CustomData" appended. This allows us to delegate data segregation to the cube and the BI team, and lets the web developers concentrate on the display of the controls.
The solution is working quite well, and doesn't involve heavyweight applications like SharePoint/Excel Services. It can be built directly into your site and branded as you need, providing a sales tool as well as a useful functional tool.
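For anyone going the same route, here is a minimal sketch of how the web tier might build those per-user connection strings, assuming ADOMD.NET is used to query the cube; the server URL, catalog name and role name are placeholders, not our real ones:

using Microsoft.AnalysisServices.AdomdClient;

public static class CubeConnectionFactory
{
    public static AdomdConnection CreateForUser(string appUserId, string ssasRole)
    {
        // CustomData is surfaced inside the cube by the MDX CustomData() function,
        // which the dynamic dimension security expressions use to filter members.
        string connectionString =
            "Data Source=https://bi.example.com/olap/msmdpump.dll;" +  // placeholder URL
            "Initial Catalog=SalesWarehouse;" +                        // placeholder catalog
            "Roles=" + ssasRole + ";" +
            "CustomData=" + appUserId + ";";

        return new AdomdConnection(connectionString);
    }
}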

Can you create a limited-access user on the cube db (read-only, only for the relevant cube, etc.) and hard-code that user/password into a connection string on the app db?
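If you went down that route over HTTP, the connection string might look something like the sketch below (every value is a placeholder); the trade-off is that the cube can then no longer tell individual users apart, so any per-user filtering has to happen in the application:

// Hypothetical shared read-only account; it would be a member of a single
// read-only SSAS role scoped to the relevant cube and nothing else.
string sharedReadOnlyConnection =
    "Data Source=https://bi.example.com/olap/msmdpump.dll;" +
    "Initial Catalog=SalesWarehouse;" +
    "User ID=EXAMPLEDOMAIN\\cube_reader;" +
    "Password=...;"; // pulled from the app db or protected configuration, as suggested above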

Related

Is WCF recommended to use with WPF and MVVM to retrieve data from SQL Server?

I am building a desktop application that will be used on local network, with SQL Server as database.
This application would have around 50 users at most at the same time. In what particular scenario would I need to use a WCF service? Is it recommended to create a WCF service on the server computer where the database would reside, so we connect to this server through the WCF service instead of connecting to the database directly? What is the recommended way to connect to SQL Server data, and why?
Edit: Let me explain in more detail. I have used WCF RIA Services before, so I know how they work. Let's assume that WCF services work in the same way. The question was directed toward why we would use WCF instead of connecting directly to the database. I didn't want to specify my current application requirement, since I would get a specific answer for a specific requirement. My goal was to understand in general why and when you would use one instead of the other. And I have received satisfying answers so far.
It appears to me that the general consensus is to use WCF only if there would be a demand for another type of application that would use web access to get data from the service. Also, if I understood correctly, from a security point of view there is no difference between the two.
There would be a statistical app in the future that uses the web to provide read-only statistics to users, and naturally some service will be required for that task (the application has no specific client in mind; it will be offered to lots of clients). Since I need a demo application done very rapidly for particular clients, I am thinking to neglect the service part and make a proper layering (WPF -> VM -> Model -> EF), so that later I could just insert a service between the Model and EF. I guess it should not take too much time to get the WPF app running with the inserted layer. I am also postponing the service for the following reason: since HTML5 is (going to be) the main technology for the web, and there is a possibility that SL will be abandoned as a technology (which I have been using), the logical decision would be to choose HTML5 over SL. But since I am totally unfamiliar with HTML5 and its requirements, I am not sure if a WCF service is the best choice for it, and this is also one of the reasons to postpone the decision of choosing the service type (along with the requirement to make the desktop demo app as fast as possible).
I think a better way to consider the question is whether you should abstract your database and data access layer from the application using a service interface. You could use WCF and SOAP, or you could use a REST-based HTTP service; the choice of technology is secondary to whether the current or future requirements of your application indicate that an additional layer of abstraction is warranted.
Reasons you might consider using a service interface instead of directly connecting to the SQL database include but are not limited to:
Ease of supporting multiple operating systems/client UIs
Ability to evolve the data/service interface separately from your database schema
Isolate application from changes to database schema or location (you don't have to redeploy change to application, only change internals of the services it is calling)
If data could be used by other systems, you have a standard means of allowing these systems to interface with the data your application is managing
Reduced SQL database connection security concerns (only service identity connects to database, allowing you to use a variety of authentication/authorization strategies on the client side)
The trade-off you are looking at is the time/cost/complexity of implementing a service interface versus the flexibility and maintainability benefits you will gain. You should evaluate the needs of your application and your customer before you make a decision on whether to connect directly to your data store using ADO.NET or use a service layer.
You should take a look at the Microsoft Service Layer Guidelines as they cover a lot of the considerations to take into account.
Unless you need to create a reusable service, I can't think of a reason to add a WCF layer, unless you are just looking for a reason to do it. I think you can just go with some sort of ORM like EF or nHibernate and be happy.
The main reason for WCF is security. If the client connects directly to the DB, then the client must be given rights on the tables, and a client can hack into the connection and run T-SQL directly. In a single-tier application you must also expose port 1433 to the network. With WCF there is no direct access from the client to SQL Server. It is not just more secure in general; you also get more granular security: .NET service code can enforce row-level security, whereas a table only has column-level security. If this is a business app on a private network and you don't expect anyone to try to hack into your DB, then having clients connect directly to the SQL Server is easier to build. With a server-side service, the other factor is that a change to server-side code is made in one spot, so you don't have to update 50 devices.
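As a rough sketch of what that looks like in practice (the service contract, DTO, table and connection string below are all invented for illustration, not specific recommendations):

using System.Collections.Generic;
using System.Data.SqlClient;
using System.Runtime.Serialization;
using System.ServiceModel;

[DataContract]
public class OrderDto
{
    [DataMember] public int Id { get; set; }
    [DataMember] public decimal Amount { get; set; }
}

[ServiceContract]
public interface IOrderService
{
    [OperationContract]
    List<OrderDto> GetMyOrders();
}

public class OrderService : IOrderService
{
    // The connection string lives only on the server; clients never see port 1433.
    private const string ConnectionString =
        "Server=dbserver;Database=Sales;Integrated Security=true;"; // placeholder

    public List<OrderDto> GetMyOrders()
    {
        // Row-level security enforced in .NET code: the caller only ever sees
        // rows that belong to the authenticated identity.
        string caller = ServiceSecurityContext.Current.PrimaryIdentity.Name;

        var result = new List<OrderDto>();
        using (var conn = new SqlConnection(ConnectionString))
        using (var cmd = new SqlCommand(
            "SELECT Id, Amount FROM Orders WHERE OwnerLogin = @owner", conn))
        {
            cmd.Parameters.AddWithValue("@owner", caller);
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    result.Add(new OrderDto
                    {
                        Id = reader.GetInt32(0),
                        Amount = reader.GetDecimal(1)
                    });
                }
            }
        }
        return result;
    }
}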

Designing web service calls that read/write from database

Apologies for the newbie web service question -
I am trying to create a web service that has a list of methods to perform reads/writes to a database. An example method would be of the form:
CreateNewEmployee(string username, string employeeid, string deptname)
I created a web service in .NET (asmx) that has the above-mentioned web method. In it, I open a connection to the database, do an insert, and then close the connection. Is this the right way to design the web service call?
Should I instead be passing an object instead of multiple parameters?
Any pointers toward best practices when trying to create a webservice that writes data into a database?
To add some more information
We would like to have web services since it might be reused by many different applications within the organization (both web and desktop).
We are also planning to create an environment where users can use these web services to create data mashups.
Thanks,
Nate
Yes - pass objects vs large parameter sets. Also, have you considered WCF if you're in a .Net environment? If you look at how ADO.Net Data Services (formerly Astoria) works, it will put you in the right direction.
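To make the first point concrete, here is a rough sketch of the same CreateNewEmployee call taking a single object, still as an .asmx web method (the Employee type, the table and the "Hr" connection string name are invented for illustration):

using System.Configuration;
using System.Data.SqlClient;
using System.Web.Services;

public class Employee
{
    public string Username { get; set; }
    public string EmployeeId { get; set; }
    public string DeptName { get; set; }
}

public class EmployeeService : WebService
{
    [WebMethod]
    public void CreateNewEmployee(Employee employee)
    {
        // Open the connection as late as possible and dispose of it immediately.
        string connectionString =
            ConfigurationManager.ConnectionStrings["Hr"].ConnectionString; // placeholder name

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "INSERT INTO Employees (Username, EmployeeId, DeptName) VALUES (@u, @id, @dept)",
            conn))
        {
            cmd.Parameters.AddWithValue("@u", employee.Username);
            cmd.Parameters.AddWithValue("@id", employee.EmployeeId);
            cmd.Parameters.AddWithValue("@dept", employee.DeptName);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}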
Quoting from the winning answer to this SO question:
Web Services are an absolutely horrible choice for data access.
It's a ton of overhead and complexity for almost zero benefit.
You can read the rest of the discussion there.
Edit: One excellent approach to having common data access functionality that can be shared by multiple applications - web, desktop, service - is to create a Visual Studio project that compiles to a DLL. Each solution that wants to use the data access functionality references the DLL, which can be deployed to the GAC or some other central location, or just added to the project's bin folder. Alternatively, in order to be able to step through the data access code, the data access project can be added to a solution.
This is a very common practice in large enterprises, where many back office applications share common functionality. It is used not just for data access, but also for other services such as logging and authentication/authorization. Some divisions create a set of these DLLs, which they refer to as their "framework". It ensures that every application will have the same functionality and the same business logic, and that there is a single place for revisions to be made that will affect all of the applications. This is a similar benefit to using web services, but it avoids the overhead and performance hit of web services.
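A minimal sketch of what such a shared project might contain (the namespace, class and table are invented for illustration); each application just references the compiled DLL and supplies its own connection string:

using System.Collections.Generic;
using System.Data.SqlClient;

namespace Company.Framework.DataAccess
{
    public class CustomerRepository
    {
        private readonly string _connectionString;

        public CustomerRepository(string connectionString)
        {
            _connectionString = connectionString;
        }

        // Every application that references the DLL gets the same query logic.
        public IList<string> GetCustomerNames()
        {
            var names = new List<string>();
            using (var conn = new SqlConnection(_connectionString))
            using (var cmd = new SqlCommand("SELECT Name FROM Customers", conn))
            {
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        names.Add(reader.GetString(0));
                    }
                }
            }
            return names;
        }
    }
}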

Use custom authorisation on access to cube

Is it possible to use our solution's existing authentication mechanism for determining access rights to a MS Analysis services cube?
We already have a system that manages usage policies and we would like to avoid duplicating this on the SQL Server.
Our authentication system is based on NetSqlAzMan and we could expose it as a web service, or a set of managed .NET assemblies (or just about anything if it got us the above functionality)...
If you mean custom authentication, my answer is no. SSAS uses Windows authentication, and you should work within that and its options (Active Directory, other integrated solutions, etc.).
If you mean custom authorization, then my answer is yes. Basically, you should follow the steps below:
Create a .NET assembly that uses your NetSqlAzMan backed web services or whatever you have for integration.
Register the assembly with Analysis Services
Use the registered assembly's functions in the advanced Dimension Security section to restrict the dimension members by users.
The custom assembly in this context must be as performant as possible, because every MDX query will consult it when filtering members.
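A rough sketch of the kind of assembly those steps describe (the class name and the CallAuthorizationService stub are invented; the real implementation would call your NetSqlAzMan-backed service). Once the assembly is registered with the Analysis Services database, the function can be referenced from the role's Allowed Member Set expression; the cache matters because that expression may invoke the function once per member per user:

using System.Collections.Generic;

public static class CubeSecurity
{
    private static readonly object SyncRoot = new object();
    private static readonly Dictionary<string, bool> Cache = new Dictionary<string, bool>();

    // Called from the dimension security expression of an SSAS role.
    public static bool IsMemberAllowed(string userName, string memberKey)
    {
        string key = userName + "|" + memberKey;
        lock (SyncRoot)
        {
            bool allowed;
            if (!Cache.TryGetValue(key, out allowed))
            {
                allowed = CallAuthorizationService(userName, memberKey);
                Cache[key] = allowed;
            }
            return allowed;
        }
    }

    private static bool CallAuthorizationService(string userName, string memberKey)
    {
        // Stub for illustration only: the real code would ask the external
        // authorization service whether this user may see this member.
        return false;
    }
}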

Access database sharing strategies

What are the strategies you employ to let multiple people work on an access database?
Is it possible to host it online and have its features still functional without having to develop a custom frontend?
MS Access as a software has a few nice features that don't require any programming to configure:
Drop down lists - choose one
Multi Checkbox lists - choose multiple
Is it possible to get all of these features available even when hosted online? I'm basically thinking of an alternate way to quickly get people to work with data using GUI features like the above without going the webapp<>MySQL way.
You have some good comments here. Keep in mind that things have changed quite a bit for Access 2010.
Access 2010 allows you to build web applications. The development process is very much the same as it's been for years, but you can't use VBA in forms for these web applications (you use a new macro language instead). This new feature set allows you to publish the applications you build to a website. Here is a video of an application of mine running in Access 2010; at the halfway point in the video I switch to running the Access application 100% in a web browser:
http://www.youtube.com/watch?v=AU4mH0jPntI
The above is for Access 2010, due out this year. It will require you to be running SharePoint services, or to use a hosting service that supports "Access web" services.
For previous versions of Access, for all intents and purposes, it's not a web-based system at all. Now, when you say multiple users, you have to clarify what kind of users and where they plan to be. If your users are on a local office network, then MS Access can be used as a multi-user system right out of the box, with no additional coding or programming required. It is recommended, however, that you split your application into a front-end part that's deployed on each user's computer. This concept is outlined in the following article of mine.
http://www.members.shaw.ca/AlbertKallal/Articles/split/index.htm
Now, perhaps the users are going to be on notebooks and in different locations all over the country? In that case you are attempting to connect over a wide area network, or to have users connect to the application over the Internet. This is a different problem. In this type of scenario, a good solution is to use something like SQL Server for the back end, and you continue to deploy the Access front ends to each user's computer. This approach also tends to be about the most affordable, and using SQL Server + MS Access means you get to continue developing in Access for the most part as you always have. Another way to accomplish wide-area use without resorting to SQL Server is to use something called Terminal Services. I outline these possibilities in the following article:
http://www.members.shaw.ca/AlbertKallal//Wan/Wans.html
As mentioned, a few others here posted links to some of the new SharePoint features that you can consider using, but they are not out until later this year.
Multi-user Access apps are pretty easy to do for small workgroup user populations in the 15-25 range or smaller. Above that, a developer should consider upsizing to a server back end, with the trade-off being greater administrative overhead for the server vs. having to program the app more carefully if you retain the Jet/ACE back end.
As to online access, this isn't possible over HTTP, but if you have a Windows Terminal Server available, you can host your app there and give users access to that. This is actually an extremely easy, efficient, and inexpensive way to support remote users of an app, though the larger the user population, the more problematic it becomes. But by the time an Access app has a user population that would strain a Windows Terminal Server setup, you're no longer going to be using a Jet/ACE back end.
And with a server back end, you could give access to a SQL Server on a VPN over the Internet, and if you write your Access app really efficiently, even over a standard broadband connection, your users could still work productively.
Then there's the future of Access: in Access 2010, a great deal of work has been done to integrate with a host of new features in Sharepoint 2010. If you create your A2010 app using the new type of Access web forms and reports, your app can be uploaded to a Sharepoint server running the new Access Services, and it can then be used running in a web browser (not limited to IE and not dependent on any plugins or web controls, as was the case in the past with the completely worthless Access Data Access Pages). The data store can either be a SQL Server, or you could keep it Jet/ACE for users not accessing it via the web browser, and have the data stored in Sharepoint for the online users. Also, you can have an app integrated with Sharepoint running locally in Access that uses Sharepoint when connected to the Internet, and still be able to work offline when disconnected. When connected again, you synch your local changes with the Sharepoint server, resolve any differences and continue working.
The features are really quite remarkable, and according to what I've heard and seen, if the Access app is built entirely of web forms and reports, it will look and function identically when run in Access and when run in the web browser via Sharepoint. And if you need to have client-side features that you don't expose to the users running the app in the browser, you can still use traditional Access objects!
The Access development team's blog has a number of posts on what's coming in A2010, and there's a good video posted there demonstrating how A2010 integrates with Sharepoint 2010's new Access Services.
This constitutes a quantum leap in Access's web capabilities, which were previously almost non-existent, and I'm quite excited about this. I was formerly quite wary of the changes being made to Access that seemed entirely to make it a servant of Sharepoint, but now I can see that the benefit to Access users and Access developers will be huge.
One way I've heard of is to import the Access database into a SQL Server database (almost any version will do).
Then link to the SQL Server database with Access and let users use it as they did before.
Look at this link: http://office.microsoft.com/en-us/access/HA010345991033.aspx
If you want an online solution, I'd recommend going with a normal web application architecture, talking to a proper database.
I have never needed to support it myself, but from what I heard so far, performance dramatically breaks down as soon as you need to support multiple users writing simultaneously. I think this is because Access uses simple file locking to implement isolation, and this just is not the right technique for a concurrent database system.
Hosted online? Do you mean on the network? Technically it will work on a network, but there is a reason MS Access is not in Visual Studio: it is not considered a development platform; it is a desktop application. When MS Access first hit the scene, many people built applications using it. The multi-user functionality just is not there. Up to four or five users is OK, but I would not go for more.

How to protect a database?

There is a website with a server database. I'm building a desktop application which uses data from one of the tables. A hacker can just take the password from the assembly.
How can I protect the database?
I wouldn't store the database information in the application at all. Instead, I would create an API to the database on the website, perhaps implementing a RESTful interface or having queries that return data in an appropriate format, such as JSON, XML, or even plain text. The application could then call these web services and process the results. All of your database information stays on the server, where it is (hopefully) secure.
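As a rough sketch of the client side of that idea (the base address, route and use of HttpClient are assumptions for illustration; the server could be WCF, ASP.NET or anything else that speaks HTTP), the desktop app then holds no database credentials at all:

using System;
using System.Net.Http;
using System.Threading.Tasks;

public class CatalogClient
{
    private static readonly HttpClient Http = new HttpClient
    {
        BaseAddress = new Uri("https://www.example.com/api/") // placeholder
    };

    public async Task<string> GetProductsJsonAsync()
    {
        // The user authenticates against the website (e.g. with a token);
        // only the web server ever talks to the database.
        HttpResponseMessage response = await Http.GetAsync("products");
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync();
    }
}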
The API adds a sometimes unnecessary application layer. Not all applications I've been involved with convert easily from using database calls to web service calls. If the application has not been written yet, I guess it would not matter that much.
My alternative implementation is:
Connect to the server using a secure tunnel of some sort.
Save the password encrypted on disk.
This would save me the effort of creating an API, which in most of my projects would be a waste of time.
This alternative is not viable if let's say you want to distribute the application to customers.
You can
A) Create a three-tier system. Your client could interface with a server that in turn interfaces with the database. The server stores the access credentials.
B) Create personal accounts on the database for your users. This two-tier model is applicable if fine-grained access control to data is needed, e.g. in an in-house application with different user roles.
Don't let the database user that the application logs in as perform any write operations, or read anything other than the application's own data.
Or choose a sane architecture, as Thomas mentions above. Databases are for storing and retrieving data; they are not generic application servers.
