Access database design in SQL [closed] - sql-server

We have 52 MS Access databases and each database has 4 tables. The total data in our databases is around 5 million. Now we are planning to move to SQL Server. We have designed our new database which will be an SQL Server database with approximately 60 tables.
My question is - how will we integrate the 52 Access databases into one SQL Server database?
Is it possible, or would we have to create 52 databases in SQL Server as well in order to migrate our data? These 52 Access databases are interrelated with each other and all have the same structure.

If I was you (and I'm not, but if I was...) I would load all of that data into 4 tables: Doctor, Project, Contract, Institution. Just append all the data from each Access database into the corresponding table. However, as I'm appending each database, I would add a new field to each table: Country. Then, when you append the data for England to the tables, you also populate the Country field of those rows with "England". Etc... with all your countries.
Now, when it comes time to access the data, you can force certain users to only be able to see the data for England, and certain other people to only see the data for Spain, etc... This way, those 4 tables can house all of your data, and you can still filter by any country you like.
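For the append step, a minimal T-SQL sketch, assuming each Access file has already been imported into its own staging table; the staging schema, table and column names here are hypothetical:
-- Repeat once per source country, hard-coding that country's name
INSERT INTO dbo.Doctor (DoctorID, DoctorName, Country)
SELECT DoctorID, DoctorName, 'England'
FROM staging.Doctor_England;

INSERT INTO dbo.Doctor (DoctorID, DoctorName, Country)
SELECT DoctorID, DoctorName, 'Spain'
FROM staging.Doctor_Spain;
The same pattern applies to the Project, Contract and Institution tables.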

From a technical point of view, there's no problem in creating only one SQL Server database, containing all 52 * 4 tables from the MS Access databases. SQL Server provides various options for logically separating your objects, for example by using Schemas. But then again, if you decide to create separate databases, you still have the ability to write queries across databases, even if the databases are not hosted on the same SQL Server instance (although there might be a performance penalty when writing queries across linked servers).
It's difficult to give a more precise answer with the limited detail in your question, but in most cases a single database with multiple schemas (perhaps one schema for each MS Access database) would probably be the best solution.
Now, for migrating the data from MS Access to SQL Server, you have various options. If you just need to perform a one-time migration, you could simply use the Import/Export wizard that comes with SQL Server. The wizard automatically creates the tables in the destination database for you, and it also lets you save SSIS packages that you can use to migrate the data again.
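If you go the schema-per-source-database route, a minimal sketch (the schema, table and column names here are hypothetical):
CREATE SCHEMA England;
GO
CREATE SCHEMA Spain;
GO
-- With each country's Access tables imported into its own schema,
-- you can still query them all together inside the single database:
SELECT 'England' AS Country, DoctorID, DoctorName FROM England.Doctor
UNION ALL
SELECT 'Spain' AS Country, DoctorID, DoctorName FROM Spain.Doctor;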

Related

Is there a way to monitor which data items are queried in SQL Server? [migrated]

I am working on a largish SQL Server database and we have Extended Events logging switched on.
One of the main tables has a column called DataItem which contains a relatively small (<100) number of values across the millions of records.
The client would like a report showing who has accessed each DataItem, when it was accessed, and with which technology.
Is there any SQL Server function or other software that can provide this?
Extended Events gives the who, when and how, but not the what.
You can use an AUDIT for this.
https://learn.microsoft.com/en-us/sql/relational-databases/security/auditing/sql-server-audit-database-engine?view=sql-server-ver16
Create a new server audit: give it a name and a file location, and set how large the audit file can grow. Enable it.
Then, in the database you would like to audit, add a database audit specification that points to that server audit.
Choose the Audit Action Type, in this case SELECT, but perhaps you would also like to see the UPDATE, INSERT and DELETE actions as well.
Don't forget to enable the database audit specification too.
After running some SELECT statements, you can read back the audit information.
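The same steps can also be scripted in T-SQL; a minimal sketch, where the audit name, file path, database and table names are hypothetical:
USE master;
GO
-- 1. Create and enable the server audit (name, file location, max file size)
CREATE SERVER AUDIT DataItemAudit
TO FILE (FILEPATH = N'C:\AuditLogs\', MAXSIZE = 100 MB);
GO
ALTER SERVER AUDIT DataItemAudit WITH (STATE = ON);
GO
-- 2. In the database you want to audit, add and enable a database audit specification
USE YourDatabase;
GO
CREATE DATABASE AUDIT SPECIFICATION DataItemAuditSpec
FOR SERVER AUDIT DataItemAudit
ADD (SELECT ON OBJECT::dbo.MainTable BY public)
WITH (STATE = ON);
GO
-- 3. Read the captured audit events
SELECT event_time, server_principal_name, statement
FROM sys.fn_get_audit_file(N'C:\AuditLogs\*.sqlaudit', DEFAULT, DEFAULT);
The statement column contains the query text, from which you can work out which DataItem values were touched.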

How to join datasource results from linked servers in an SSRS drill-through report? [closed]

I need some advice and direction on the following requirement: I currently have an SSRS report with one dataset and one datasource executing a stored proc
that passes parameters to other drill-through reports; however, next month the database the datasource points to will be split across two linked servers.
The servers will contain the same databases, tables, etc.; only the data will be split.
How can I combine those datasources into one and use it for a drill-through report?
I will have 3 servers configured like this:
Server 1 is the current server the datasource points to, with all the employee data, but starting next week sales employee data will no longer be loaded into this server/db.
Server 2 will be the new server containing only the new sales employee data. The old sales employee data remains on the old server but still needs to be included in the reports.
Server 3 will have linked servers to both server 1 and server 2.
All SQL Servers are on version 2016, as is Report Builder.
We have a lot of SSRS reports that get data from SEVERAL servers. Better yet, they get data from both SQL Server 2016 and DB2 (AS/400) servers.
Here is how it is done in my company: one stored procedure in SQL Server combines all the necessary data from all the servers in question. From there, I've seen the reports built two ways:
Upload all this data to a new table on the server and use this table and this server as the data source in SSRS.
Or skip creating a new table. Some people are icky about just throwing a bunch of tables on the server (not us though, we have thousands of databases and tables because it is easier to manipulate). If you don't want to create a new table, you can just SELECT the combined data in the stored procedure and that will work just fine. Make the data source point to the server where the stored procedure lives and make the dataset query call this stored procedure; it will return whatever data the stored procedure produces.
We also have a job that runs this stored procedure every day on SQL Server. That way, our end table is up to date and in sync with the data from the other servers.
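A minimal sketch of such a combining stored procedure, assuming linked servers named SERVER1 and SERVER2 already exist (the next answer shows how to create them) and that the database, schema and column names are hypothetical:
CREATE PROCEDURE dbo.usp_GetSalesEmployees
AS
BEGIN
    SET NOCOUNT ON;

    -- Combine the old data on server 1 with the new data on server 2
    SELECT EmployeeID, EmployeeName, SaleDate, Amount
    FROM [SERVER1].[SalesDB].[dbo].[SalesEmployees]
    UNION ALL
    SELECT EmployeeID, EmployeeName, SaleDate, Amount
    FROM [SERVER2].[SalesDB].[dbo].[SalesEmployees];
END;
Point the SSRS data source at the server hosting this procedure and have the dataset call it; the drill-through parameters keep working because the report still sees a single result set.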
Hope this helps.
Create linked servers from your current main server to the new ones. That way you can access the data directly from the same server by qualifying the table name with the four-part name
[Servername].[DatabaseName].[Schema].[Table]
How you link to your servers will depend on your particular setup, but there is lots of info on how to do this; it's pretty straightforward.
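For example, a minimal sketch of registering the linked servers on server 3 and querying across them (server, database and table names are hypothetical; depending on your security setup you may also need sp_addlinkedsrvlogin to map logins):
-- Register both servers as linked servers on server 3
EXEC sp_addlinkedserver @server = N'SERVER1', @srvproduct = N'SQL Server';
EXEC sp_addlinkedserver @server = N'SERVER2', @srvproduct = N'SQL Server';

-- Query them together with four-part names
SELECT * FROM [SERVER1].[SalesDB].[dbo].[SalesEmployees]
UNION ALL
SELECT * FROM [SERVER2].[SalesDB].[dbo].[SalesEmployees];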

Copy tables containing BLOB columns between Oracle Databases [closed]

On an ad hoc basis, we want to copy the contents of 4 of our Oracle production tables to the QA/UAT environments.
This is not a direct copy; we need to copy data based on some input criteria for filtering.
Earlier we were using a Sybase database, so the BCP utility worked like a charm there. However, we have recently migrated to Oracle and have a similar data-copy requirement.
Based on my analysis so far, I have considered the options below:
RMAN (Recovery Manager) - Cannot use it, as it does not allow us to copy selected tables or filter the data.
SQL*Loader (SQLLDR) - Cannot use this, as we have BLOB columns and are not sure how to create a CSV file for these BLOBs. Any suggestions?
Oracle Data Pump (expdp/impdp) - Cannot use this because, even though it allows copying selected tables, it does not allow us to filter data using a query with joins (I know it allows adding a query, but it works only on a single table). A workaround is to create temp tables with the desired dataset and dump them using expdp and impdp. Any suggestions if I have missed anything in this approach?
Database link - This seems the best approach for this use case, but we need to check whether the DBA will allow us to create links to/from the PRD database.
SQL*Plus COPY - Cannot use this, as it does not work with BLOB fields.
Can someone please advise which would be the best approach with respect to performance?
I would probably use a DATAPUMP format external table. So it would be something like
create table my_ext_tab
organization external
(
type oracle_datapump
default directory UNLOAD
location( 'my_ext_tab.dmp' )
)
as
<my query>
You can then copy the file across to your other database, create the external table there, and then load the data into your new table with an INSERT ... SELECT, something like:
insert /*+ APPEND */ into my_tab
select * from my_ext_tab
You can also use parallelism to read and write the files.
Taking all your constraints into account, it looks like a database link is the best option. You can create views for your queries with joins and filters in the PROD environment and select from these views through the db link. That way, the filtering is done before the transfer over the network, not afterwards on the target side.
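A minimal sketch of that approach, assuming the DBA allows the link and that the link, view, table and column names below are hypothetical:
-- On PROD: encapsulate the join/filter logic in a view
CREATE OR REPLACE VIEW filtered_docs_v AS
SELECT d.doc_id, d.doc_name, d.doc_blob
FROM   docs d
JOIN   batches b ON b.batch_id = d.batch_id
WHERE  b.load_date >= DATE '2017-01-01';

-- On QA/UAT: create a link back to PROD and pull the pre-filtered rows
CREATE DATABASE LINK prd_link
  CONNECT TO app_reader IDENTIFIED BY secret
  USING 'PRODDB';

INSERT INTO qa_docs (doc_id, doc_name, doc_blob)
SELECT doc_id, doc_name, doc_blob
FROM   filtered_docs_v@prd_link;
Note that older Oracle releases do not let you SELECT a BLOB directly over a database link, but INSERT ... SELECT (and CREATE TABLE ... AS SELECT) across the link is allowed, which is what is needed here.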

SQL Server data transfer

In SQL Server, I have a data source server which has 22 databases, and in each database there are 5 tables. Every database has the same tables; they just contain different data, separated by year.
I want to collect all this data into one single database. The destination database will have only 5 tables, while the source has 22 x 5 = 110 tables. I'm using the Import/Export wizard to transfer the data, but it takes too long and is really annoying: for 110 tables I would have to run the wizard over and over.
Is there a simple way, tool to do this? There is no linked server between servers.
Posting my comment as an answer:
Back up each database, restore it to server 2, insert the records across using a simple INSERT ... SELECT statement, then drop the restored database and restore the next. You should be able to script this to work unattended; even the creation of all the backups could be scripted so that a single run handles all the databases.
Your other option (if space permits) is to create a new database on server 1 (potentially a restore of the database from server 2, if it already has data in it), import all the records into this new database, then back up this database and restore it on server 2.
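A minimal sketch of one iteration of that loop on server 2, with hypothetical database, file and table names (the logical names in the MOVE clauses come from RESTORE FILELISTONLY):
-- Restore one of the 22 source databases from its backup
RESTORE DATABASE SourceDb_2001
FROM DISK = N'\\backupshare\SourceDb_2001.bak'
WITH MOVE N'SourceDb_2001'     TO N'D:\Data\SourceDb_2001.mdf',
     MOVE N'SourceDb_2001_log' TO N'D:\Log\SourceDb_2001.ldf';

-- Append its rows into the consolidated table, then drop it and move on
INSERT INTO Warehouse.dbo.Sales (SaleYear, CustomerID, Amount)
SELECT SaleYear, CustomerID, Amount
FROM SourceDb_2001.dbo.Sales;

DROP DATABASE SourceDb_2001;
Wrap this in a loop (or generate the script per database) and repeat for each of the 5 tables.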
It depends on several things, like how often you want the data to be moved and whether it will be changed in the destination DBs.
SQL Server offers several replication and high-availability methods; one of them will surely fit your scenario (probably merge replication):
http://msdn.microsoft.com/en-us/library/ms190202.aspx

Linking tables between databases

I'm after a bit of advice on the best way to go about this in SQL Server 2008 R2 Express. I have a number of applications that are in separate databases on the same server. They are all "plugins" that use a central staff/structure list that will be in a separate database. The application is in the process of being migrated from JET.
What I’m looking for is the best way of all the “plugin” databases being able to see the central database and use those tables in standard queries and views etc.
As I'm using Express, that rules out any replication solution, and so far the only option I can think of is to use triggers or a stored procedure to "push" all the changes out to the plugins. The information needs to be populated on a near enough real-time basis; however, the number of changes will be very small, maybe up to 100 a day, and the biggest table only has about 1000 rows at the moment (the staff names table).
Hopefully that covers everything, but if anyone needs any more details then just ask.
Thanks
Apologies if I've misunderstood, but from your description it sounds like all these databases are hosted on the same instance of SQL Server; it's your mention of replication that makes me uncertain.
Assuming that's the case, you should be able to replace any copies of central-database tables held in the "plugin" databases with views or synonyms that reference the central tables directly, since SQL Server lets you reference objects in other databases on the same server using three-part naming (database_name.schema_name.object_name).
For example, if each plugin db has a table StaffNames, you could replace this with a view by dropping the table, then creating a view:
drop table StaffNames
go
create view StaffNames
as
select * from <centraldbname>.<schema - probably dbo>.StaffNames
go
and your code should continue to work seamlessly, as long as permissions are set up.
Alternatively, you could replace all the references to the shared tables in the plugin databases with three-part name references to the central database, but the view method requires less work.
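If you prefer synonyms over views, the equivalent sketch (the central database name here is hypothetical) is:
drop table StaffNames
go
create synonym StaffNames for CentralStaffDb.dbo.StaffNames
go
A synonym is just a name alias, so unlike the view it needs no maintenance if columns are later added to the central table.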
