Multiple database connectivity

We have four products, and each supports the four data sources below:
Oracle
SQL Server 2005
DB2
Datopia
We are now building an administration product which will interact with all the products and hence their databases. We have some requirements where we have to access tables from different data sources in a single query. We initially thought of using Oracle Transparent Gateway to create DB links and then access tables in the different data sources, but this requires Oracle to be installed alongside one of the products. That restriction cannot be imposed in our environment (for example, among the four products, two may have a SQL Server installation and the other two a DB2 installation). What is the best way to connect to all the data sources without any such restriction? One more thing: we are using Java to connect to these databases. Thanks in advance.

You mention your client software uses Java; Java, .NET, and Perl all provide data access modules that will let you connect to the various table servers. You can connect to all of them from a single client process easily enough.
Your db access won't be perfectly transparent. You'll need some aspects of your program to be Oracle- or SQL-Server-specific, for example. On the other hand, if you do this right, it won't be hard to add MySQL and PostgreSQL support if your customers need it.
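To make that concrete, here's a minimal JDBC sketch of one client process holding connections to several vendors at once. It is a sketch only: the hosts, credentials, and the orders table are invented, and Datopia would need whatever JDBC driver its vendor ships.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class MultiSourceClient {
        public static void main(String[] args) throws SQLException {
            // One JDBC URL per vendor; the matching driver jars must be on the classpath.
            Connection oracle = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//oracle-host:1521/ORCL", "user", "pw");
            Connection sqlServer = DriverManager.getConnection(
                    "jdbc:sqlserver://mssql-host:1433;databaseName=Prod", "user", "pw");
            Connection db2 = DriverManager.getConnection(
                    "jdbc:db2://db2-host:50000/PROD", "user", "pw");

            // Vendor-specific SQL stays isolated per connection: e.g. row limiting is
            // FETCH FIRST n ROWS ONLY on DB2 but TOP n on SQL Server 2005.
            try (Statement st = oracle.createStatement();
                 ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM orders")) {
                if (rs.next()) {
                    System.out.println("Oracle row count: " + rs.getLong(1));
                }
            }
            // ...query sqlServer and db2 the same way...

            oracle.close();
            sqlServer.close();
            db2.close();
        }
    }

Note that a plain JDBC connection cannot join across servers the way a DB link can; cross-source joins have to happen in your own code or in a staging database.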
You'll have a fairly steep QA burden -- you'll need to test with at least one, and then two, instances of each of the four table servers connected simultaneously to make sure everything works.
But this kind of product usually has high value, so you should be able to justify the QA effort.

Related

Connecting Power Apps Model-Driven Apps to SQL Server tables

Trying to figure out the best way of managing data stored in an on-premises SQL Server database for a Power Apps model-driven app. For canvas apps, I typically use a connector either to access the tables directly or to access the data via a Power Automate flow.
From my research, it seems like I have two options:
Create Dataverse tables which represent all of my SQL Server tables, and use an ETL tool to keep those tables in sync.
Use a Virtual Table to connect "directly" to the SQL Server tables
Neither of those options really seems great, though. The entire premise of the Power Platform is a relatively low-code experience, and both of those options require quite a bit of coding. The ETL option, for example, requires me to write ETL operations for every table that I want to manage, along with dealing with any number of concurrency issues that may arise. The Virtual Table, on the other hand, seems on the surface like the better approach, but to actually implement a Virtual Table for an SQL Server table, I need to write a custom connector, and I need to leverage the old D365 UI to manage it.
The "best" approach I've seen so far is explored a bit by Jukka Niiranen in this blog post; however, a year later, the approach he wrote about is still in preview: https://jukkaniiranen.com/2021/06/virtual-dataverse-tables-with-no-code-via-connectors/ and https://learn.microsoft.com/en-us/power-apps/maker/data-platform/create-virtual-tables-using-connectors?tabs=sql
Does anyone have any other suggestions? Is the above connector approach the recommended one, despite still being in preview?

Microsoft SQL Server Express vs ACE

Currently, I'm developing a C#-based program for a small rental company (3 locations). Right now they use MS Access 97 (Jet SQL based) as the database, and I wish to upgrade this. However, I do want to keep Access as the front end, since I will be gone after the development and the local personnel know how to use Access (some changes require direct editing in the database).
I am doubting between two options:
Upgrade to Access 2013, therefore using MS ACE as DB engine
Use SQL Server Express with Access as front-end, therefore using MS SQL Server as DB engine
The system will have one shared database and one for each location. They are using a shared drive for this (they work on MS Server 2008). Their databases are pretty small (< 1 GB combined), so I won't need the extra performance that e.g. MySQL provides. I know the difference between ACE and SQL Server in terms of design (file sharing vs. client/server), but I still don't know which would be better suited for this situation.
What is the better option here when looking at performance, reliability, security and connection to the application?
Thanks in advance.
As @granadaCoder points out, the security, performance, and reliability of SQL Server Express are far better than Jet and ACE, and it is just as easy to connect/link to your Access 97 front end. Microsoft provides a free migration tool that is very powerful and easy to use.
Converting an MS Access 97 application to 2013 may present some real challenges as well. Converting from Access 97 to 2013 is a two-step process: you must first convert it to 2002-2003 format and then to 2007/2013. You will also need to purchase licenses for all users and the back-end database.
In addition, if your 97 application references external objects, they may not work with later versions of Access.
As @granadaCoder also suggests, a good medium- to long-term plan would be to convert the front end to .NET.
Microsoft Jet is just a file sitting on a network drive.
So when you do queries......the Jet-Runtime (on the local PC) has to bring large chunks of data (entire tables) across the network.
Thus it is brutal.
Sql Server (Express or Other)....runs as a service on a host computer. And when a query is executed, it does processing on the Server and returns "smaller buckets of information".
(You mention knowing the difference between file-sharing and client/server.)
If you cannot give up your Access (the program) front end... then linked tables to SQL Server would be your best bet, IMHO.
Well, I'm talking about performance here.
Security-wise, you have more options for different users and passwords. And you can slice up which logins/db-users are allowed to do what.
IIRC, a Jet database allows one password. Aka, all or nothing.
https://www.connectionstrings.com/ace-oledb-12-0/with-database-password/
That alone would make me go with SqlExpress.
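For illustration, this is the kind of per-login slicing SQL Server supports, issued here from Java via JDBC. Every name in it is made up; the point is only the granularity -- separate logins, separate rights per table.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class GrantSketch {
        public static void main(String[] args) throws Exception {
            try (Connection con = DriverManager.getConnection(
                    "jdbc:sqlserver://localhost\\SQLEXPRESS;databaseName=Rental;"
                            + "integratedSecurity=true");
                 Statement st = con.createStatement()) {
                // A separate login per kind of user...
                st.execute("CREATE LOGIN front_desk WITH PASSWORD = 'Str0ng!Passw0rd'");
                st.execute("CREATE USER front_desk FOR LOGIN front_desk");
                // ...with rights sliced per table: rentals yes, payroll no.
                st.execute("GRANT SELECT, UPDATE ON dbo.Rentals TO front_desk");
                st.execute("DENY SELECT ON dbo.Payroll TO front_desk");
            }
        }
    }

A Jet/ACE database password, by contrast, is a single secret for the whole file.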
..
The big early design decision was to use Microsoft-Access-Forms. You're paying the price for that early decision.
Even when people use a Jet database, I would only use it for basic data storage, and put a .NET layer application on top of it. Then swapping out to a different data store isn't as drastic.
Good luck dude.

How to build database reports using multiple remote databases

Does anyone have experience building database reports - doesn't matter which database, I just want design ideas - for a system that is made up of many separate, but identical, databases?
I cannot "combine" all databases into one. They must be separate.
But the structure is identical across all databases...
I need to build a web interface that will allow a user to get a "global" report that will query all databases and build one combined report.
Do you have any comments on what the model would look like, or anything you think I need to beware of?
Thanks.
I don't have first-hand experience with cross-database reports; my experience comes from a product the company I work for sells, which can create reports from multiple databases. From your description, I believe you need something of the "combine tables" kind. In that case, I recommend you detect the tables used in the query, unify them in a single temporary intermediary database - for example Access, SQL Server CE, or SQLite - and then run the query against this temporary database or table.
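A rough Java sketch of that approach, staging rows from each source into an in-memory SQLite database (via the sqlite-jdbc driver) and running the combined query there. The sales table and the source URLs are invented; real code would build the staging schema from the query's metadata.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;
    import java.util.List;

    public class IntermediaryReport {
        public static void main(String[] args) throws SQLException {
            // One identical database per site.
            List<String> sources = List.of(
                    "jdbc:sqlserver://site1;databaseName=App;user=u;password=p",
                    "jdbc:sqlserver://site2;databaseName=App;user=u;password=p");

            try (Connection temp = DriverManager.getConnection("jdbc:sqlite::memory:");
                 Statement setup = temp.createStatement()) {
                setup.execute("CREATE TABLE sales (site TEXT, amount REAL)");

                // Stage the rows the report needs from every source.
                for (String url : sources) {
                    try (Connection src = DriverManager.getConnection(url);
                         Statement st = src.createStatement();
                         ResultSet rs = st.executeQuery("SELECT site, amount FROM sales");
                         PreparedStatement ins = temp.prepareStatement(
                                 "INSERT INTO sales VALUES (?, ?)")) {
                        while (rs.next()) {
                            ins.setString(1, rs.getString("site"));
                            ins.setDouble(2, rs.getDouble("amount"));
                            ins.executeUpdate();
                        }
                    }
                }

                // One query over the combined data.
                try (ResultSet rs = setup.executeQuery(
                        "SELECT site, SUM(amount) FROM sales GROUP BY site")) {
                    while (rs.next()) {
                        System.out.println(rs.getString(1) + ": " + rs.getDouble(2));
                    }
                }
            }
        }
    }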
If your databases are Microsoft SQL Server, then using SQL Server Reporting Services seems like a good solution. The software for the report generation / display is bundled along with the database software.
It gives you a web interface, where you can configure 'data sources' from any number of remote databases, and combine data from these sources into reports. It is user friendly and you can do all the report design / configuration through the web interface without having to write any code.
Some references:
Building report using SQL Server stored procedure
http://blog.hoegaerden.be/2009/11/10/reporting-on-data-from-stored-procedures-part-1/

What is the best way to configure a SQL Server for 50 developers?

If I am running an organization that has 50 .net developers and all are using SQL Server, what is the best way to make a single SQL Server available to them?
Here are some of the concerns that I want to be careful about:
Should I configure database users per project or per user? or both?
Should I provide single SQL Server instance?
Edit:
How can I track changes done by each user in database?
There are some more concerns but I think getting answer of these two will be a good starting point.
You should definitely configure a database per project, as only project-specific items should be in that database. A database per project is also a good idea for backup and restore purposes.
Configuring databases for your developers depends on how many of them will actually develop the database itself: creating tables, views, etc. Database developers should probably have some sort of test copy of the database they can use to develop their end of things, while the "regular" developers work against a published copy of this database.
So a setup could be: 2 databases per project, one for db development and one for other development.
This way, changes to the database schema can first be developed and tested before being pushed out to the rest of the developers.
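A sketch of that setup in code (the server, project, and login names are all invented): two databases per project, with each developer's login added only to the project databases they work on.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class ProjectSetupSketch {
        public static void main(String[] args) throws Exception {
            try (Connection con = DriverManager.getConnection(
                    "jdbc:sqlserver://devserver;databaseName=master;integratedSecurity=true");
                 Statement st = con.createStatement()) {
                // Two databases per project: one for schema development, one published.
                st.execute("CREATE DATABASE ProjectX_Dev");
                st.execute("CREATE DATABASE ProjectX");
                // One login per developer, granted access per project database.
                st.execute("CREATE LOGIN alice WITH PASSWORD = 'Str0ng!Passw0rd'");
                st.execute("USE ProjectX");
                st.execute("CREATE USER alice FOR LOGIN alice");
                st.execute("EXEC sp_addrolemember 'db_datareader', 'alice'");
            }
        }
    }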

Organising DBs and tables in SSMS

This is a repost of a question I asked 4 or 5 days ago, with zero response. Hoping for more luck this time...
(Using SQL Server 2008)
Within the next few weeks I plan to introduce SQL Server to an office that is in dire need of a proper data server. Currently there is a heavy reliance on loose Excel and Access files (supplemented with a frighteningly large amount of impenetrable VB code to do data manipulations) strewn all over the internal network.
We need SQL server for two things:
1. For internal databases that will be designed upfront and will be capturing data on an ongoing basis
2. For ad hoc uploads of datasets received from clients, which we then analyse
I am the only person in this office who is familiar with SQL. I will have to train the other 5 or 6 people to use it.
Now, my question is this: how would you guys set up the DBs so that it would be easy using Management Studio to visually recognize where what is being stored? To be more precise: if this were a windows file system it would look something like this:
c:\client work\client 1\piece of work 1 (db with 10 tables)\
c:\client work\client 1\piece of work 2 (db with 8 tables)\
c:\client work\client 1\piece of work 3 (db with 7 tables)\
c:\internal\accounting system\some db with 8 tables\
c:\internal\accounting system\some db with 5 tables\
c:\internal\some other system\some db with 7 tables\
etc.
So briefly, I need to visually split by internal and client work. Client work I need to split by different clients. For each client I need to split out the different distinct sets of work. (Internal work follows a similar pattern).
Solutions that I am aware of:
Run multiple data servers (e.g. one internal, one for client work). Not sure what the cons of this would be though
Assign schemas to tables
I would love to hear your suggestions!
Your organizational tools for managing SQL Server are instances, databases and schemas:
A server can run multiple instances; an instance is essentially a completely separate copy of SQL Server running on the same machine.
An instance can manage multiple databases. The database is the standard boundary of integrity - you (usually) back up an entire database, referential integrity is constrained to being between objects in the same database, etc.
Each database can contain multiple schemas, which allow you to organize code.
All these "containers" relate to security in some way.
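As a sketch of the schema option (all names invented), the file-system layout above could map to one database per area with a schema per client, and the piece-of-work grouping carried in table name prefixes:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class LayoutSketch {
        public static void main(String[] args) throws Exception {
            try (Connection con = DriverManager.getConnection(
                    "jdbc:sqlserver://localhost;databaseName=master;integratedSecurity=true");
                 Statement st = con.createStatement()) {
                st.execute("CREATE DATABASE ClientWork");   // one database per area
                st.execute("USE ClientWork");
                st.execute("CREATE SCHEMA Client1");        // one schema per client
                st.execute("CREATE TABLE Client1.PieceOfWork1_Orders (Id INT PRIMARY KEY)");
                // In Object Explorer this shows up under
                // ClientWork > Tables > Client1.PieceOfWork1_Orders,
                // which gives the internal/client split visually.
            }
        }
    }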
I recommend that you take an organizational data and process inventory first, so that you understand what data you are dealing with, who uses it, and how - with special attention to data which is public or collaborative (used by certain people together) and data which needs compartmentalized access (only used by a particular role). SQL Server is not really a great choice for storing unstructured data - I would not view it as a simple replacement for a file server, for instance.
From there, proceed to define roles for your users. Having roles is a much better strategy than assigning rights to individual users: it documents the semantic meaning of the access (any person performing this role needs this access), whereas the user's identity alone (john and kate need access) tells you nothing about why they need it. Be certain that the roles are sufficiently fine-grained. A departmental role like AccountsReceivable isn't nearly as useful as PaymentApprover, InvoiceProcessor, or AccountsSupervisor. Users can act in multiple roles - this will give you a lot more self-documenting ability in your infrastructure and a lot fewer security holes and headaches.
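For example, on SQL Server 2008 a fine-grained role might be set up like this (the role, table, and user names are made up):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class RoleSketch {
        public static void main(String[] args) throws Exception {
            try (Connection con = DriverManager.getConnection(
                    "jdbc:sqlserver://localhost;databaseName=Accounts;integratedSecurity=true");
                 Statement st = con.createStatement()) {
                // The role carries the semantic meaning of the access...
                st.execute("CREATE ROLE PaymentApprover");
                st.execute("GRANT SELECT, UPDATE ON dbo.Payments TO PaymentApprover");
                // ...and users are simply members (sp_addrolemember on 2008).
                st.execute("EXEC sp_addrolemember 'PaymentApprover', 'kate'");
            }
        }
    }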
This should help to define which containers you will need and what access to grant and guide your data infrastructure from there.
As far as giving users direct access goes, I'm with Randy Minder: SQL Server is an expert-user tool at best. If they are familiar with Access, a good option is to let them use Access against carefully designed and chosen views in SQL Server until they are ready for a more systematic data engineering approach.
IMO, users of your databases should not have to know or care where or how your databases are set up. And they shouldn't be given access to SSMS unless they are well trained in SQL. This is a disaster waiting to happen. You should be creating applications and/or reports that allow the user access to the data they need. That way they don't care where the data sits, and don't need to know.
