Using the Membership Provider database with Entity Framework 4.1 - WPF

I have created the membership database using the ASP.NET configuration tool (aspnet_regsql.exe). I want to use EF 4.1 to write the data access layer, which I'll then expose through a WCF service and consume from a WPF application.
How can I use EF 4.1 to do that?
For the sake of a simple demo, I want to write functions for user management.
I know I have to use the database-first approach in EF 4.1, but there are a lot of tables, and database entry is normally done through stored procedures (or through the classes provided by the SQL Membership Provider). When I add a single user or role, many tables get updated simultaneously because the stored procedures handle that. Will EF 4.1 do the same just by examining the structure of the database?
I can't write any code yet because I don't know how to start (all I have done is create the database).
How can I mimic the same behavior using EF 4.1?
Any pointers in this regard will be helpful.

I want to write functions for user management
No you don't, or at least you should not! The Membership API is self-contained: it contains the whole logic for user management, and that logic is divided between .NET and stored procedures. If you want to access the API through WCF, either use the Authentication Service directly or wrap the standard API calls in a new WCF service, without working directly with the database.
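As a rough sketch of that second option, wrapping the Membership API in a WCF service might look something like this (the contract and method names are made up for illustration, not part of any standard API):

    using System.ServiceModel;
    using System.Web.Security;

    // Hypothetical contract exposing a few Membership operations over WCF.
    [ServiceContract]
    public interface IUserManagementService
    {
        [OperationContract]
        bool CreateUser(string userName, string password, string email);

        [OperationContract]
        bool ValidateUser(string userName, string password);

        [OperationContract]
        void AddUserToRole(string userName, string roleName);
    }

    public class UserManagementService : IUserManagementService
    {
        public bool CreateUser(string userName, string password, string email)
        {
            // Membership performs all the coordinated table updates through its
            // own stored procedures; the service never touches the tables itself.
            MembershipCreateStatus status;
            Membership.CreateUser(userName, password, email, null, null, true, out status);
            return status == MembershipCreateStatus.Success;
        }

        public bool ValidateUser(string userName, string password)
        {
            return Membership.ValidateUser(userName, password);
        }

        public void AddUserToRole(string userName, string roleName)
        {
            Roles.AddUserToRole(userName, roleName);
        }
    }

The WPF client would then call this service and never see the membership tables at all.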
Direct access to the membership database means breaking the Membership API contract, and in most cases it also means creating a less secure and less encapsulated solution. The standard membership and role classes cannot be mapped with EF back to the membership database - you would have to create your own entities, which would break the original encapsulation.
If you just want custom authentication and you don't want to use anything from the Membership API except the database, you should create your own database tailored to your exact requirements.

Related

Accessing SSAS using External User Database

We have a BI team that has created a data warehouse which works well for the internal staff using it through Excel on the internal network. They use Windows authentication through the domain controller, and everything works fine, including restricting access to users and AD groups.
The issue is that we now want to provide the same access to a wider audience that is not part of our Windows Domain. This is further complicated by the fact that the information on the users that need access to the system is already stored in another location (an application with a SQL database).
The goal is to make it so that they connect (probably using HTTPS) to the cube (using Excel Analysis services integration), and be authenticated using the Username/Password that they had in the main application.
The main application has a WCF service interface for user authentication and session management, so all I really need is a way to provide authentication in front of MSMDPUMP.dll against that web service. We can also add role mapping so that we can define the SSAS roles against the users in the application.
I was thinking I could create a DLL that has the same interface as MSMDPUMP.dll and have it translate calls between the client and the real DLL, but this seems like overkill.
Are there any pre-built tools to do this? (and yes I know that Sharepoint can do something like this, but that's not an option so please don't suggest it). Does anybody know of any blogs detailing how to do it?
Any pointers in where to start with creating an interface between the 2?
The question is similar to How to secure MS SSAS 2005 for HTTP remote access via Internet? however, I'm looking at providing the authentication mechanism from another datasource, and providing the Roles to SSAS, not the users. We don't want to have to setup a new user in SSAS for every user that is setup in the external application.
UPDATE: To be clear, the external users need to connect to the cube using Excel, and the data returned needs to be filtered by the role they're in and by the security applied in the cube.
We are able to change the cube to use Dynamic Dimension Security and use CustomData attributes if that helps.
The eventual solution ended up being a combination of a Third Party Control and Dynamic Dimension Security.
We found that it isn't possible to put a MembershipProvider interface in front of the MSMDPUMP interface without significant effort, so our solution gave the external users a web interface to use instead.
The Control we used was by DevExpress and is their "PivotGrid" control. It's not free, but is significantly less than the development resource costs of implementing any other custom solution.
Along with the control, we applied Dynamic Dimension Security to the cube, so each user of the site gets a dedicated connection string to the cube with "CustomData" appended to it. This allows us to delegate data-segregation tasks to the cube and BI team, and lets the web developers concentrate on the display of the controls.
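As a rough illustration of that connection-string approach (the server, catalog and helper names below are invented, and the cube's dynamic security roles are assumed to filter on the MDX CustomData() function):

    // Hypothetical helper that builds a per-user connection string for the cube.
    // Roles in the cube use CustomData() in their allowed-member expressions,
    // so the value appended here drives what each user is allowed to see.
    public static class CubeConnection
    {
        public static string ForUser(string userName)
        {
            return "Provider=MSOLAP;" +
                   "Data Source=https://bi.example.com/olap/msmdpump.dll;" +
                   "Initial Catalog=SalesWarehouse;" +
                   "CustomData=" + userName + ";";
        }
    }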
The solution is working quite well and doesn't involve heavyweight applications like SharePoint/Excel Services. It can be built directly into your site and branded as you need, providing a sales tool as well as a useful functional tool.
Can you create a limited-access user on the cube db (read-only, only for the relevant cube, etc.) and hard-code that user/password into a connection string on the app db?

Is it recommended to use WCF with WPF and MVVM to retrieve data from SQL Server?

I am building a desktop application that will be used on a local network, with SQL Server as the database.
The application would have at most around 50 simultaneous users. In what particular scenario would I need a WCF service? Is it recommended to create a WCF service on the server where the database resides, so that clients connect through the WCF service instead of connecting to the database directly? What is the recommended way to connect to SQL Server data, and why?
Edit: Let me explain in more detail. I have used WCF RIA Services before, so I know how they work; let's assume that plain WCF services work in the same way. The question was directed toward why we would use WCF instead of connecting directly to the database. I didn't want to specify my current application's requirements, since I would get an answer specific to that requirement; my goal was to understand in general why and when you would use one instead of the other. I have received satisfying answers so far.
It appears to me that the general consensus is to use WCF only if there will be a demand for another type of application that needs web access to the data through a service. Also, if I understood correctly, from a security point of view there is no difference between the two.
There will be a statistics app in the future that uses the web to provide read-only statistics to users, and naturally some service will be required for that task (the application has no specific client in mind; it will be offered to many clients). Since I need a demo application done very rapidly for particular clients, I am thinking of neglecting the service part for now and doing proper layering (WPF -> VM -> Model -> EF), so that later I can just insert a service between the model and EF. I guess it should not take too much time to get the WPF app running with the inserted layer. I am also postponing the service for another reason: since HTML5 is (going to be) the main technology for the web, and there is a possibility that Silverlight (which I have been using) will be abandoned as a technology, the logical decision would be to choose HTML5 over Silverlight. But since I am totally unfamiliar with HTML5 and its requirements, I am not sure whether a WCF service is the best choice for it, and this is another reason to postpone choosing the service type (along with the requirement to get the desktop demo app done as fast as possible).
I think a better way to frame the question is whether you should abstract your database and data access layer from the application behind a service interface. You could use WCF and SOAP, or you could use a REST-based HTTP service; the choice of technology is secondary to whether the current or future requirements of your application indicate that an additional layer of abstraction is warranted.
Reasons you might consider using a service interface instead of directly connecting to the SQL database include but are not limited to:
Ease of supporting multiple operating systems/client UIs
Ability to evolve the data/service interface separately from your database schema
Isolate the application from changes to the database schema or location (you don't have to redeploy a change to the application, only change the internals of the services it calls)
If data could be used by other systems, you have a standard means of allowing these systems to interface with the data your application is managing
Reduced SQL database connection security concerns (only service identity connects to database, allowing you to use a variety of authentication/authorization strategies on the client side)
The trade-off you are looking at is the time/cost/complexity of implementing a service interface versus the flexibility and maintainability benefits you will gain. You should evaluate the needs of your application and your customer before you decide whether to connect directly to your data store using ADO.NET or use a service layer.
You should take a look at the Microsoft Service Layer Guidelines as they cover a lot of the considerations to take into account.
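A minimal sketch of what such a service interface might look like, using invented contract and DTO names:

    using System.Collections.Generic;
    using System.Runtime.Serialization;
    using System.ServiceModel;

    // Data contract exposed to clients instead of the raw database schema.
    [DataContract]
    public class CustomerDto
    {
        [DataMember] public int Id { get; set; }
        [DataMember] public string Name { get; set; }
    }

    [ServiceContract]
    public interface ICustomerService
    {
        [OperationContract]
        IList<CustomerDto> GetCustomers();

        [OperationContract]
        void SaveCustomer(CustomerDto customer);
    }

The WPF view models would depend only on ICustomerService; whether the implementation behind it uses EF directly or calls out over WCF can then change without touching the client.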
Unless you need to create a reusable service, I can't think of a reason to add a WCF layer, unless you are just looking for a reason to do it. I think you can just go with some sort of ORM like EF or nHibernate and be happy.
The main reason for WCF is security. If the client connects directly to the database, the client must be given rights on the tables, a user could hijack the connection and run T-SQL directly, and you must expose port 1433 to the network in a single-tier application. With WCF there is no direct access from the client to SQL Server. It is not just more secure in general; you can also have more granular security: .NET service code can enforce row-level security, whereas a table only has column-level security. If this is a business application on a private network and you don't expect anyone to try to hack into your database, then having the client connect directly to SQL Server is easier to build. With a server-side service, the other factor is that a change to server-side code happens in one place, so you don't have to update 50 devices.
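A tiny sketch of the kind of row-level filtering a service layer makes possible (the entity, context and DTO types here are all invented):

    using System.Collections.Generic;
    using System.Data.Entity;
    using System.Linq;
    using System.ServiceModel;

    public class Order
    {
        public int Id { get; set; }
        public string OwnerUserName { get; set; }
        public decimal Total { get; set; }
    }

    public class SalesContext : DbContext
    {
        public DbSet<Order> Orders { get; set; }
    }

    public class OrderDto
    {
        public int Id { get; set; }
        public decimal Total { get; set; }
    }

    [ServiceContract]
    public interface IOrderService
    {
        [OperationContract]
        IList<OrderDto> GetMyOrders();
    }

    public class OrderService : IOrderService
    {
        public IList<OrderDto> GetMyOrders()
        {
            // The service decides which rows the caller may see - something a
            // plain table-level grant cannot express.
            string userName = ServiceSecurityContext.Current.PrimaryIdentity.Name;

            using (var db = new SalesContext())
            {
                return db.Orders
                         .Where(o => o.OwnerUserName == userName)
                         .Select(o => new OrderDto { Id = o.Id, Total = o.Total })
                         .ToList();
            }
        }
    }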

Basic Login protocol

I'm wondering what the basic protocol is for storing users in a database, creating accounts, and authenticating them, using the ASP.NET MVC 3 framework (C#) and SQL Azure.
More specifically:
1.) Where in an ASP.NET C# MVC 3 Visual Studio project do I write code that only runs on the back end, such as logging into my database as an admin so I can write to and read from the database?
2.) Where should I make database calls from when using the MVC framework? Do I call a back-end function (e.g. to create a new account in the database) from the controller?
Thanks for any help!
I'm not 100% sure whether you are talking about SQL Users or Application users.
However, generally, what ASP.Net MVC applications do is:
they use one or two defined users to connect to the database (e.g. they might define a read-write and a read-only connection for different types of queries)
they use the ASP.Net Membership API for application-level user accounts
they use an ORM framework like NHibernate or Entity Framework for other database access
There are lots of tutorials and articles for this sort of information out there - one place to look might be Scott Hanselman's blog - see:
a tutorial on using ASP.Net Membership with SQL Azure
the Mix11 tools walkthrough - including Code First Entity Framework
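For point 2, a minimal sketch of a controller action that authenticates a user through the Membership API (roughly what the stock MVC 3 template does; the action name and redirect targets are illustrative):

    using System.Web.Mvc;
    using System.Web.Security;

    public class AccountController : Controller
    {
        // Back-end code like this lives in the controller (or a service it calls)
        // and runs only on the server, never in the browser.
        [HttpPost]
        public ActionResult LogOn(string userName, string password)
        {
            if (Membership.ValidateUser(userName, password))
            {
                FormsAuthentication.SetAuthCookie(userName, false);
                return RedirectToAction("Index", "Home");
            }

            ModelState.AddModelError("", "The user name or password provided is incorrect.");
            return View();
        }
    }

Membership talks to the database for you, using the connection string configured in web.config, so the controller never opens a SQL connection itself.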
1) You could use Membership (which I used to use) or simply write your own authentication code. There are two drawbacks with Membership. First, there are a ton of tables and stored procedures that get installed; worse, there's no way to change a user name via the Membership API. Try telling a customer that they cannot change their username (which is usually their email address) and they'll give you weird looks.
2) Forget EF and use your own repository, which simply harnesses stored procedures. Go ahead, make a ton of changes to your EF design canvas, change the db schema, and I can guarantee you will run into issues with the "behind the scenes" files at one point or another.
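A bare-bones sketch of that kind of hand-rolled repository (the stored procedure and parameter names are made up):

    using System.Data;
    using System.Data.SqlClient;

    public class UserRepository
    {
        private readonly string _connectionString;

        public UserRepository(string connectionString)
        {
            _connectionString = connectionString;
        }

        // Calls a hypothetical stored procedure instead of going through EF.
        public void CreateUser(string userName, string email, string passwordHash)
        {
            using (var connection = new SqlConnection(_connectionString))
            using (var command = new SqlCommand("dbo.usp_CreateUser", connection))
            {
                command.CommandType = CommandType.StoredProcedure;
                command.Parameters.AddWithValue("@UserName", userName);
                command.Parameters.AddWithValue("@Email", email);
                command.Parameters.AddWithValue("@PasswordHash", passwordHash);

                connection.Open();
                command.ExecuteNonQuery();
            }
        }
    }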

Designing web service calls that read/write from database

Apologies for the newbie web service question -
I am trying to create a web service that has a list of methods which perform reads and writes against a database. An example function would be of the form:
CreateNewEmployee(string username, string employeeid, string deptname)
I created a web service in .NET (ASMX) that has the above-mentioned web method. In it, I open a connection to the database, do the insert, and then close the connection. Is this the right way to design the web service call?
Should I instead be passing an object instead of multiple parameters?
Any pointers toward best practices when trying to create a webservice that writes data into a database?
To add some more information
We would like to have web services since they might be reused by many different applications within the organization (both web and desktop).
We are also planning to create an environment where users can use these web services to create data mashups.
Thanks,
Nate
Yes - pass objects rather than large parameter sets. Also, have you considered WCF, given that you're in a .NET environment? Looking at how ADO.NET Data Services (formerly Astoria) works will put you in the right direction.
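For example, rather than the three string parameters in the question, a request object could be passed (the type and member names below are only illustrative):

    using System.Runtime.Serialization;
    using System.ServiceModel;

    [DataContract]
    public class NewEmployeeRequest
    {
        [DataMember] public string UserName { get; set; }
        [DataMember] public string EmployeeId { get; set; }
        [DataMember] public string DeptName { get; set; }
    }

    [ServiceContract]
    public interface IEmployeeService
    {
        // Adding a field later means extending the data contract,
        // not changing the method signature for every caller.
        [OperationContract]
        void CreateNewEmployee(NewEmployeeRequest request);
    }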
Quoting from the winning answer to this SO question:
Web Services are an absolutely horrible choice for data access.
It's a ton of overhead and complexity for almost zero benefit.
You can read the rest of the discussion there.
Edit: One excellent approach to having common data access functionality that can be shared by multiple applications - web, desktop, service - is to create a Visual Studio project that compiles to a DLL. Each solution that wants to use the data access functionality references the DLL, which can be deployed to the GAC or some other central location, or just added to the project's bin folder. Alternatively, the data access project can be added to a solution so that you can step through the data access code.
This is a very common practice in large enterprises, where many back office applications share common functionality. It is used not just for data access, but also for other services such as logging and authentication/authorization. Some divisions create a set of these DLLs, which they refer to as their "framework". It ensures that every application will have the same functionality and the same business logic, and that there is a single place for revisions to be made that will affect all of the applications. This is a similar benefit to using web services, but it avoids the overhead and performance hit of web services.

Should I duplicate my Entity Model?

Let me set up my LOB scenario.
I am re-writing our core business app. The requirements are that I create an internally usable app (I'd like to use Silverlight) that our employees use on a daily basis. I also need to provide a SOAP service that can be used to input orders, get invoices, etc.
I also will be doing this in pieces, so when I update a record in the new SQL Server database, I need to make sure to update our legacy SQL Server as well.
So, it certainly makes sense to create a DAL that will pull data from the new SQL server, as well as write back to 2 data stores.
It would also make sense to create a BLL that can be used by both Silverlight/RIA and the WCF web services.
I have created a data entity model of the new database in its own project, and it is used in all the other projects. The problem is that RIA seems to require that I create it right inside the ASP.NET project in order to get the metadata for Silverlight. Without this, I need to manually re-create the metadata for Silverlight to access it correctly.
My question then, should I create duplicates of the Entity Model? One for RIA and one for everything else? Is there a better way to do this? Should I just forego using RIA and have Silverlight access WCF services? Or should I just continue to duplicate the metadata in RIA?
We use entities for direct reference to storage, and Data Transfer Objects (DTOs), which are almost identical, for passing back and forth between the BLL and WCF/GUI/etc. We map between the two using AutoMapper, which means there's very little additional work, but we don't have to worry about whether a given entity is attached to the context, about tracking state changes, and so on.
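A rough sketch of that pattern, using invented Customer types and the older static AutoMapper API:

    using AutoMapper;

    // EF entity used below the BLL.
    public class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    // DTO passed up to WCF/GUI; deliberately free of any EF baggage.
    public class CustomerDto
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public static class MappingConfig
    {
        public static void Configure()
        {
            Mapper.CreateMap<Customer, CustomerDto>();
            Mapper.CreateMap<CustomerDto, Customer>();
        }
    }

    // Usage: var dto = Mapper.Map<Customer, CustomerDto>(entity);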
Edit: You definitely want to keep your code as DRY as possible. Personally, I'd look at using DTOs above the BLL and either having two sets of repositories which are coordinated in the DAL (one read-write, one write-only), or even having meta-repositories which handle the data sets on the two stores themselves.
If you're not already using them, Unity and IoC would be of real benefit to you here. You might also want to use one of the modular code patterns to allow you to register [n] data stores in different modes, so that when you finally want to retire the old store, you don't need to do much work.
I'd also question whether your entities need to be defined in ASP.NET - you may simply be able to reference the appropriate DLLs from your entity/DTO project and add the appropriate markup/config.
