Should I just use 1 database? - winforms

Hi, I am building a Windows Forms retail POS application and was wondering about the best way to design the database. Should I just use one database to store all my clients' data?
That is, if I have 100 clients from different businesses using my app, all of their data would be stored in one database.
For example, I would store a Company column in each table to indicate which company a customer or transaction belongs to.
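Roughly something like this is what I have in mind (table and column names are just an illustration):

    -- Rough illustration of the shared-database idea: every row carries a CompanyId.
    CREATE TABLE dbo.Companies
    (
        CompanyId INT IDENTITY(1,1) PRIMARY KEY,
        Name      NVARCHAR(200) NOT NULL
    );

    CREATE TABLE dbo.Customers
    (
        CustomerId INT IDENTITY(1,1) PRIMARY KEY,
        CompanyId  INT NOT NULL REFERENCES dbo.Companies(CompanyId),
        Name       NVARCHAR(200) NOT NULL
    );

    -- Every query is then filtered by the company the caller belongs to:
    SELECT CustomerId, Name
    FROM dbo.Customers
    WHERE CompanyId = 42;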
My current practice is to create a new database for each business and install it on their local machine (which means manually installing SQL Server Express each time).
Do you think it would be easier for me to design it the shared way and just put the database online in SQL Server? Will I get any latency, and how bad will it be? I heard Windows Azure handles this well. In my case the speed and data size per business are not really a concern.
Could you advise?

You should definitely look at other alternatives within Azure for storing data, specifically Azure Storage Tables and Blobs.
Utilizing all of the Azure storage options alongside SQL Azure will allow you to choose different data tiers depending on your application's needs and your desired cost structure. Running everything inside SQL Azure will cost you more in the long run, but it makes a good place to tie together federated data for relational reporting. Each tenant's day-to-day data, on the other hand, can be stored in Azure Tables, using PartitionKeys to keep each client's data separated from the others.
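If you do keep a shared SQL Azure database as the relational reporting hub, the tenant-keyed piece might look roughly like this (purely a sketch; the table and column names are assumptions, not anything Azure prescribes):

    -- Illustrative reporting "hub" table in SQL Azure: summary rows keyed by tenant,
    -- while the bulk of each tenant's raw data lives in Azure Table Storage
    -- under a per-tenant PartitionKey.
    CREATE TABLE dbo.TenantDailySales
    (
        TenantId    INT           NOT NULL,
        SalesDate   DATE          NOT NULL,
        TotalAmount DECIMAL(18,2) NOT NULL,
        CONSTRAINT PK_TenantDailySales PRIMARY KEY (TenantId, SalesDate)
    );

    -- Cross-tenant reporting is then a plain relational query:
    SELECT TenantId, SUM(TotalAmount) AS MonthTotal
    FROM dbo.TenantDailySales
    WHERE SalesDate >= '20130101' AND SalesDate < '20130201'
    GROUP BY TenantId;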

Related

SQL Server move data between databases

We have a requirement to move data between different database instances on a regular basis (e.g. some customers are willing to pay more for better performance), so this is not going to be a one-off.
The database tables have referential integrity. Is there a way this can be done without rewriting a SQL script (or some other method) every time we migrate a customer's data?
I came across this: How to move data between multiple database's table while maintaining foreign-key relationships/referential integrity?. However, it appears that we have to write a script every time we migrate data (please correct me if I misunderstood the answer on that thread).
Thanks
Edit:
- Both servers are running SQL Server 2012 (same version). It is an Azure SQL Server database.
- They are not necessarily linked (no firewall between them).
- We are only transferring some data, not the whole database. This is only for certain customers who opted to pay more.
- The schema is exactly the same in both databases.
Preyash - please see the documentation on the Split-Merge tool. The Split-Merge tool enables you to move data between databases, as you have described, based on a sharding key (e.g., customer ID). One modification you will need in your application is to add a shard map (i.e., a database that understands the global state of which customers reside in which databases).
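To illustrate the shard-map idea only (the Split-Merge tooling maintains its own metadata; the names below are made up):

    -- Sketch of a minimal shard map: one row per customer, recording which
    -- server/database currently holds that customer's data.
    CREATE TABLE dbo.ShardMap
    (
        CustomerId   INT           NOT NULL PRIMARY KEY,
        ServerName   NVARCHAR(128) NOT NULL,
        DatabaseName NVARCHAR(128) NOT NULL
    );

    -- The application looks up the shard before opening a connection:
    SELECT ServerName, DatabaseName
    FROM dbo.ShardMap
    WHERE CustomerId = 42;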
Have a look at Azure Data Sync; it is much more aligned with your requirements, although you may end up having another SQL Azure DB to maintain as the hub. Azure Data Sync follows a hub-and-spoke pattern and will let you do flexible directional syncs with a syncing gap of a few minutes. It is simpler, and you can set it up very quickly without any scripts, as you wanted.

How to amalgamate client databases into one database?

In my company we have a selected list of companies that are using our in-house built tool (e.g. Northwind).
When we make changes we deploy these to all our client locations.
The structure currently is: the application is installed at the client's location and the databases sit with them.
However, we would like to consolidate all this information into one database and clients will connect via web services for any data requests.
For example, we have deployed the Northwind app and DB to companies X, Y and Z, and would like to create a single database to maintain all of these companies' data.
We have reviewed one option, which is to add a Company field to the various tables to associate each row with a company; another option is to create a schema for each company so that we can grant permissions to the relevant company (a rough sketch of this is below). Is there an alternative to these, and what are the pros and cons of the ways we could do this?
One con of adding a company field is that we would have to keep the indexes the same across what are currently separate client databases, which makes things more difficult, and the performance of the app as a whole could suffer due to multiple requests hitting the same DB. Please help!
Note: Using SQL Server 2008
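For reference, the schema-per-company option would look roughly like this (company, role and table names are made up):

    -- Sketch of "schema per company": each company gets its own schema plus a role
    -- that is only granted rights on that schema.
    CREATE SCHEMA CompanyX AUTHORIZATION dbo;
    GO
    CREATE TABLE CompanyX.Orders
    (
        OrderId   INT IDENTITY(1,1) PRIMARY KEY,
        OrderDate DATETIME NOT NULL
    );
    GO
    CREATE ROLE CompanyX_Users;
    GRANT SELECT, INSERT, UPDATE, DELETE ON SCHEMA::CompanyX TO CompanyX_Users;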
Research "multi-tenant database architecture". (For your purposes, think of one tenant as one client.) You'll find a spectrum from "one database per tenant" to "every tenant in every table".
Read carefully. Writers in this field can confuse you. Expect technical terms like shared schema to mean different things to different writers.
See this SO answer for tradeoffs.
For your first step, I wouldn't consider anything besides simply moving those client databases in-house. Just doing that is going to give you and your application programmers enough headaches. You don't need an architectural change on top of it.
That will also give you time for research and testing.

Which Multi-tenant approach is recommended with SQL Server 2008

I have to use ASP.NET MVC 3 or above and SQL Server 2008. As per the Multi-Tenant Data Architecture post, there are three ways to implement multi-tenancy:
1. Separate Databases
2. Shared Database, Separate Schemas
3. Shared Database, Shared Schema
I have the following details:
- Users should be able to back up and restore their data.
- No. of tenants: 1000 (approx.)
- Each tenant might belong to a different domain (URL).
- It must support monitoring and management of tenants.
- It must support user authentication and authorization for each tenant.
- It must support tenant customization (enabling/disabling feature sets).
- No. of tables per tenant: 100 (initially)
I would like to know which approach your experience suggests is more suitable for the project, considering economics and security. Is there any good real-world example (open source project) similar to this? I can use one dedicated server for the project.
Your requirement that users should be able to back up their data is more easily achievable with approaches 1 and 2, since it will be a native database task.
If you go with approach 3 (shared/shared), you will need to develop the logic to extract all the rows belonging to a single tenant and export them to an XML file or something like that. Then, if you need to allow users to restore that backup file, you need to develop the restore logic as well.
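A rough sketch of that per-tenant export, assuming a TenantID column on every table (table names are just examples):

    -- Export one tenant's rows from a shared-schema database as a single XML document.
    DECLARE @TenantID INT = 42;

    SELECT
        (SELECT * FROM dbo.Customers WHERE TenantID = @TenantID
         FOR XML PATH('Customer'), ROOT('Customers'), TYPE),
        (SELECT * FROM dbo.Orders WHERE TenantID = @TenantID
         FOR XML PATH('Order'), ROOT('Orders'), TYPE)
    FOR XML PATH(''), ROOT('TenantBackup');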
I think this is the only requirement that could make you move away from #3.
Once you set up your database using TenantID columns in your tables, you can easily use one database for a single tenant or for a small group of tenants if a client is heavily concerned about security. For instance, you could have one database holding tenants that are not paying (free/demo accounts) and another one for paying customers. That way you are using approach #3, but you are able to behave like #1 when you need to.
::: BONUS :::
AUTHENTICATION:
You will need to extend the SQL Membership and Role providers used in your MVC 3 app so that a user's login is only valid in the tenant it belongs to.
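One way the database side of that check could look (purely illustrative; the mapping table and parameter values are assumptions, not part of the standard membership schema):

    -- Map each membership user to exactly one tenant; the extended provider
    -- validates the (user, tenant) pair during login.
    CREATE TABLE dbo.TenantUsers
    (
        UserId   UNIQUEIDENTIFIER NOT NULL PRIMARY KEY,  -- matches aspnet_Users.UserId
        TenantId INT              NOT NULL
    );

    -- Inside the provider's ValidateUser logic; the values come from the login
    -- form and from the tenant resolved via the subdomain.
    DECLARE @UserName NVARCHAR(256) = N'kate';
    DECLARE @TenantId INT = 7;

    SELECT 1
    FROM dbo.aspnet_Users u
    JOIN dbo.TenantUsers t ON t.UserId = u.UserId
    WHERE u.UserName = @UserName
      AND t.TenantId = @TenantId;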
MULTIPLE DOMAINS
Here you can see some approaches using ASP.NET MVC3 Routing:
MVC 3 Subdomain Routing
I would always use (3) Shared Database, Shared Schema.
If you want an example, how about Wordpress, Joomla, or any other popular open source web-based project?
Creating separate schemas or databases on a per-tenant basis will lead to massive management overhead. Not to mention increased complexity of analysing your data, costs, etc.
The only reason you'd go for (1) (or perhaps 2) is if you were to give your actual tenant direct access to some/all of the database. As you're using ASP.NET MVC 3, this isn't a consideration.

CakePHP multiple tenants - single DB versus multiple DBs

We are working on an application in CakePHP that will be used by multiple companies. We want to ensure performance, scalability, code manageability and security for our application.
Our current beta version creates a new database for each customer. When a new company joins the site we run a SQL script to create a blank database.
This has the following advantages:
- Better security (each company's users are separated from the others)
- We can choose the database via the subdomain (e.g. monkey.site.com uses the site_monkey database)
- Single code base.
- Performance for SQL queries is generally quite good as data is split across smaller databases.
Unfortunately, this also has several disadvantages:
- Manageability: changes to the database have to be applied across all existing databases
- The SQL script method of creating a database is clunky and not as reliable as we would like
- We want to allow users to log in from the home page (e.g. www.site.com), but we currently can't do this because the subdomain determines which database to use
- We would like a central place to keep metrics/customer usage
So we are torn/undecided as to what is the best solution to our database structure for our application.
Currently we see three options:
- Keep multiple database design
- Merge all companies into one DB and identify each by a 'companyId'
- Some kind of split model, where certain tables are in a 'core database' and others are in a customer specific database.
Can you offer some advice on how you think we should best do this?
Any feedback / info would be greatly appreciated.
Many thanks,
kSeudo
Just my suggestion:
I think it is better to keep the customer-related data in separate databases and the authentication-related data in a common database. When a user logs in, you look up which domain that user belongs to, redirect to that domain, and access the corresponding database and data.
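A rough sketch of that common authentication database (written in T-SQL here; table and column names are purely illustrative):

    -- Central "auth" database: users live here along with the tenant/subdomain they
    -- belong to; each tenant's business data stays in its own database.
    CREATE TABLE dbo.Tenants
    (
        TenantId     INT           IDENTITY(1,1) PRIMARY KEY,
        Subdomain    VARCHAR(63)   NOT NULL UNIQUE,  -- e.g. 'monkey' -> monkey.site.com
        DatabaseName NVARCHAR(128) NOT NULL          -- e.g. 'site_monkey'
    );

    CREATE TABLE dbo.Users
    (
        UserId       INT           IDENTITY(1,1) PRIMARY KEY,
        TenantId     INT           NOT NULL REFERENCES dbo.Tenants(TenantId),
        Email        NVARCHAR(256) NOT NULL UNIQUE,
        PasswordHash VARBINARY(64) NOT NULL
    );

    -- Login from www.site.com: find the user's tenant, then connect to its database.
    SELECT t.Subdomain, t.DatabaseName
    FROM dbo.Users u
    JOIN dbo.Tenants t ON t.TenantId = u.TenantId
    WHERE u.Email = 'someone@example.com';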
As for your concern about changes to the database: yes, you would need to apply the changes to each database separately, but I think there are some advantages to this as well. Some customers may ask for small changes to fit their own processes, and that is easier to manage if you keep a separate database for each customer.

Organising DBs and tables in SSMS

This is a repost of a question I asked 4 or 5 days ago with zero responses. Hoping for more luck this time...
(Using SQL Server 2008)
Within the next few weeks I plan to introduce SQL Server to an office that is in dire need of a proper data server. Currently there is a heavy reliance on loose Excel and Access files (supplemented with a frighteningly large amount of impenetrable VB code to do data manipulation) strewn all over the internal network.
We need SQL server for two things:
1. For internal databases that will be designed upfront and will be capturing data on an ongoing basis
2. For ad hoc uploads of datasets received from clients, which we then analyse
I am the only person in this office who is familiar with SQL. I will have to train the other 5 or 6 people to use it.
Now, my question is this: how would you guys set up the DBs so that it would be easy using Management Studio to visually recognize where what is being stored? To be more precise: if this were a windows file system it would look something like this:
c:\client work\client 1\piece of work 1 (db with 10 tables)\
c:\client work\client 1\piece of work 2 (db with 8 tables)\
c:\client work\client 1\piece of work 3 (db with 7 tables)\
c:\internal\accounting system\some db with 8 tables\
c:\internal\accounting system\some db with 5 tables\
c:\internal\some other system\some db with 7 tables\
etc.
So briefly, I need to visually split by internal and client work. Client work I need to split by different clients. For each client I need to split out the different distinct sets of work. (Internal work follows a similar pattern).
Solutions that I am aware of:
- Run multiple data servers (e.g. one internal, one for client work) - not sure what the cons of this would be, though
- Assign schemas to tables
I would love to hear your suggestions!
Your organizational tools for managing SQL Server are instances, databases and schemas:
- A server can run multiple instances. An instance is basically a completely separate server instance on the same machine.
- An instance can manage multiple databases. The database is the standard boundary of integrity - you (usually) back up an entire database, referential integrity is constrained to objects in the same database, etc.
- Each database can contain multiple schemas, which allow you to organize your objects and code (see the sketch after this list).
All of these "containers" relate to security in some way.
I recommend that you take an inventory of the organization's data and processes first, so that you understand what data you are dealing with, who uses it and how - with special attention to data which is public or collaborative (used by certain people together) and data which needs compartmentalized access (only used by a particular role). SQL Server is not really a great choice for storing unstructured data - I would not view it as a simple replacement for a file server, for instance.
From there, proceed to define roles for your users. Having roles is a much better strategy than assigning rights to individual users, because it documents the semantic meaning of the access: any person performing this role needs this access, as opposed to "john and kate need access", which tells you nothing about why they need it. Be certain that the roles are sufficiently fine-grained. A departmental role like AccountsReceivable isn't nearly as useful as PaymentApprover, InvoiceProcessor or AccountsSupervisor. Users can act in multiple roles - this will give you a lot more self-documenting ability in your infrastructure and a lot fewer security holes and headaches.
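A minimal sketch of what such fine-grained roles could look like in T-SQL (role, schema and login names are made up; sp_addrolemember is used because this is SQL Server 2008):

    -- Fine-grained roles instead of per-user grants.
    CREATE ROLE PaymentApprover;
    CREATE ROLE InvoiceProcessor;

    -- Grant each role only what it needs (assumes an Accounting schema with an Invoices table).
    GRANT SELECT, UPDATE ON SCHEMA::Accounting TO PaymentApprover;
    GRANT SELECT, INSERT ON Accounting.Invoices TO InvoiceProcessor;

    -- Users (already added as database users) are then made members of the roles.
    EXEC sp_addrolemember 'PaymentApprover', 'DOMAIN\kate';
    EXEC sp_addrolemember 'InvoiceProcessor', 'DOMAIN\john';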
This should help to define which containers you will need and what access to grant and guide your data infrastructure from there.
As far as giving users direct access goes, I'm with Randy Minder: SQL Server is an expert-user tool at best. If they are familiar with Access, a good option is to let them use Access against carefully designed and chosen views in SQL Server until they are ready for a more systematic data engineering approach.
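For the Access users, that could be as simple as the following (illustrative names; assumes the Reporting schema, the Accounting.Invoices table and a ReportReaders role already exist):

    -- Expose only a curated, read-only slice of the data to Access users.
    CREATE VIEW Reporting.ClientInvoiceSummary
    AS
    SELECT ClientName, InvoiceDate, TotalAmount
    FROM Accounting.Invoices;
    GO
    GRANT SELECT ON Reporting.ClientInvoiceSummary TO ReportReaders;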
IMO, users of your databases should not have to know or care where or how your databases are set up. And they shouldn't be given access to SSMS unless they are well trained in SQL. This is a disaster waiting to happen. You should be creating applications and/or reports that allow the user access to the data they need. That way they don't care where the data sits, and don't need to know.
