Storing/creating local table from linked SQL table - sql-server

I have a linked table in my Access database (dbo_Billing_denied (DSN=WTSTSQL05_BB;DATABASE=DEPTFINANCE), etc.) and I want to create a table that will store the data from this linked table in a local table, so I can use it to run other queries. Currently I can't use it, because it tells me that it cannot make a connection (ODBC--connection to 'WTSTSQL05_BB' failed).
Do I have to create a table first and assign all the fields before I can do this (i.e., create a table with the same fields as the linked table and then create an append query to populate it)?

It sounds like you might have two problems; I will address the second one. You will still need to reestablish the connection to the linked table before any of this will work.
You can use a "make table query" in Access to make a local copy of the linked table. You can use the GUI for this, or you can structure the SQL something like this:
SELECT <list of various fields, or * for all fields>
INTO <name of new local table>
FROM <name of linked table(s) on the server>
WHERE <any other conditions you want to put on which records are included>;
I mentioned that there might be more than one table. You can also do this with joined tables or unions, etc. The WHERE clause is optional; removing it will copy the entire data set.
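For example, a minimal sketch using the linked table named in the question and a hypothetical local table name:
SELECT *
INTO Billing_denied_local
FROM dbo_Billing_denied;
Running this again later will drop and recreate Billing_denied_local, as explained below.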
You will get a warning when you try to execute this query in Access. It will tell you that you are about to write (or overwrite) a table. If you are trying to write a cleaner application with fewer nuisance messages for the end user, call this query from a macro. The macro would need to turn the warnings off, execute the query, then turn the warnings back on.
Microsoft Access does not require you to create this table before you write it; if the table does not exist Access will create this table for you, based on the field definitions in the source data. If a table of the same name does exist, Access will drop this table from your database and then create a new table of that name.
This also implies that the local table you are generating will need a unique name. If your query tries to overwrite the linked table by using the same name, the first thing Access will do is drop the linked table. It will then look for field definitions and input data in the linked table that it just dropped.
Since the new local table will have a new name, queries developed for the linked table will not work with the new local table. One possible work-around would be to rename the linked table in your local Access database. The table name in Access does not need to equal the name in the database it's linking to. The query could then write to a table with the correct name, and previous queries should work. Still, keep in mind that these queries would no longer be working on live data.

Related

Datafactory - dynamically copy subsection of columns from one database table to another

I have a database on SQL Server on premises and need to regularly copy the data from 80 different tables to an Azure SQL Database. For each table, the columns I need to select and map are different; for example, for TableA I need columns 1, 2 and 5, while for TableB I need just column 1. The tables are named the same in the source and target, but the column names are different.
I could create multiple Copy data pipelines and select the source and target data sets and map to the target table structures, but that seems like a lot of work for what is ultimately the same process repeated.
I've so far created a meta table, which lists all the tables and the column mapping information. This table holds the following data:
SourceSchema, SourceTableName, SourceColumnName, TargetSchema, TargetTableName, TargetColumnName.
For each table, data is held in this table to map the source tables to the target tables.
I have then created a Lookup which selects each table from the mapping table, followed by a ForEach loop that does another Lookup to get the source and target column data for the table in the current iteration.
From this information, I'm able to map the Source table and the Sink table in a Copy Data activity created within the foreach loop, but I'm not sure how I can dynamically map the columns, or dynamically select only the columns I require from each source table.
I have the "activity('LookupColumns').output" from the column lookup, but would be grateful if someone could suggest how I can use this to then map the source columns to the target columns for the copy activity. Thanks.
In your case, you can use an expression in the mapping setting of the Copy activity.
The expression needs to provide data shaped like this:
{
    "type": "TabularTranslator",
    "mappings": [
        { "source": { "name": "Id" }, "sink": { "name": "CustomerID" } },
        { "source": { "name": "Name" }, "sink": { "name": "LastName" } },
        { "source": { "name": "LastModifiedDate" }, "sink": { "name": "ModifiedDate" } }
    ]
}
So you need to add a column named Translator to your meta table, and its value should be JSON like the above. Then use this expression to do the mapping: @item().Translator
Reference: https://learn.microsoft.com/en-us/azure/data-factory/copy-activity-schema-and-type-mapping#parameterize-mapping
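If it helps, the Translator value can be generated from the mapping rows you already have. A rough sketch, assuming SQL Server 2017+ for STRING_AGG and using dbo.ColumnMapping as a stand-in name for your meta table:
-- Sketch only: build one TabularTranslator JSON string per source table
-- from the column-mapping rows (meta table name is hypothetical).
SELECT
    SourceSchema,
    SourceTableName,
    '{"type":"TabularTranslator","mappings":['
        + STRING_AGG('{"source":{"name":"' + SourceColumnName
                     + '"},"sink":{"name":"' + TargetColumnName + '"}}', ',')
        + ']}' AS Translator
FROM dbo.ColumnMapping
GROUP BY SourceSchema, SourceTableName;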

How to use the pre-copy script from the copy activity to remove records in the sink based on the change tracking table from the source?

I am trying to use change tracking to copy data incrementally from a SQL Server to an Azure SQL Database. I followed the tutorial on Microsoft Azure documentation but I ran into some problems when implementing this for a large number of tables.
In the source part of the copy activity I can use a query that gives me a change table of all the records that are updated, inserted or deleted since the last change tracking version. This table will look something like
PersonID   Age    Name    SYS_CHANGE_OPERATION
--------   ----   -----   --------------------
1          12     John    U
2          15     James   U
3          NULL   NULL    D
4          25     Jane    I
with PersonID being the primary key for this table.
The problem is that the copy activity can only append the data to the Azure SQL Database so when a record gets updated it gives an error because of a duplicate primary key. I can deal with this problem by letting the copy activity use a stored procedure that merges the data into the table on the Azure SQL Database, but the problem is that I have a large number of tables.
I would like the pre-copy script to delete the deleted and updated records on the Azure SQL Database, but I can't figure out how to do this. Do I need to create separate stored procedures and corresponding table types for each table that I want to copy or is there a way for the pre-copy script to delete records based on the change tracking table?
You have to use a Lookup activity before the Copy activity. With that Lookup activity you can query the database so that you get the deleted and updated PersonIDs, preferably all in one field, separated by commas (so it's easier to use in the pre-copy script). More information here: https://learn.microsoft.com/en-us/azure/data-factory/control-flow-lookup-activity
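For example, the Lookup query could look something like the sketch below (dbo.Person and dbo.ChangeTrackingWatermark are hypothetical names; STRING_AGG needs SQL Server 2017+, with FOR XML PATH as the usual alternative on older versions):
-- Sketch only: collect the keys of updated and deleted rows since the last
-- synced change tracking version as one comma-separated string.
DECLARE @last_sync_version bigint =
    (SELECT LastSyncVersion
     FROM dbo.ChangeTrackingWatermark
     WHERE TableName = 'dbo.Person');

SELECT STRING_AGG(CAST(CT.PersonID AS varchar(20)), ',') AS PersonIDs
FROM CHANGETABLE(CHANGES dbo.Person, @last_sync_version) AS CT
WHERE CT.SYS_CHANGE_OPERATION IN ('U', 'D');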
Then you can do the following in your pre-copy script:
delete from TableName where PersonID in (@{activity('MyLookUp').output.firstRow.PersonIDs})
This way you will be deleting all the deleted or updated rows before inserting the new ones.
Hope this helped!
In the meantime, Azure Data Factory provides the metadata-driven copy task. After going through the dialog-driven setup, a metadata table is created which has one row for each dataset to be synchronized. I solved this upsert problem by adding a stored procedure as well as a table type for each dataset to be synchronized, and then added the relevant information to the metadata table for each row, like this:
{
    "preCopyScript": null,
    "tableOption": "autoCreate",
    "storedProcedure": "schemaname.UPSERT_SHOP_SP",
    "tableType": "schemaname.TABLE_TYPE_SHOP",
    "tableTypeParameterName": "shops"
}
After that you need to adapt the sink properties of the copy task like this (stored procedure, table type, table type parameter name):
@json(item().CopySinkSettings).storedProcedure
@json(item().CopySinkSettings).tableType
@json(item().CopySinkSettings).tableTypeParameterName
If the destination table does not exist, you need to run the whole task once before adding the above variables, because auto-create of tables works only as long as no stored procedure is given in the sink properties.
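For reference, a rough sketch of what one such table type and stored procedure pair could look like (only the object names come from the metadata example above; the target table schemaname.Shop and its columns are hypothetical):
-- Sketch only: table type passed by the copy activity, and an upsert proc.
CREATE TYPE schemaname.TABLE_TYPE_SHOP AS TABLE
(
    ShopId   int           NOT NULL,
    ShopName nvarchar(100) NULL
);
GO

CREATE PROCEDURE schemaname.UPSERT_SHOP_SP
    @shops schemaname.TABLE_TYPE_SHOP READONLY
AS
BEGIN
    MERGE schemaname.Shop AS tgt
    USING @shops AS src
        ON tgt.ShopId = src.ShopId
    WHEN MATCHED THEN
        UPDATE SET tgt.ShopName = src.ShopName
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (ShopId, ShopName)
        VALUES (src.ShopId, src.ShopName);
END;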

Creating Array from column data using Dynamic SQL

I'm working on creating some stored procedures to automatically mirror data from multiple servers/databases (40+) into one central server. I have created a reference table that holds the column names from the databases that I am referencing.
What I want to do is essentially grab the COUNT(ColumnID) based on the @TableID table variable that I declare. The central server already has corresponding columns for each of these sets of columns, and the reference table also lists, for each TableID and ColumnID, the name of the central database's column. I want to pull these column names into an array and/or string so I can EXEC a dynamic query such as EXEC('SELECT ' + @ColumnsString + ' FROM [LinkedServer].' + @TableName);
I already have the dynamic linked servers set up and working. But I'm looking for a way to store the multiple column names in a single array that I can reference to build a string variable and also to reference in UPDATE queries, i.e. SET @ColumnString = @arrayvalue[1] + ',' + @arrayvalue[2] + ',' + @arrayvalue[n];
Is there a function available in SQL Server that could accomplish this?
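To illustrate the intent, a rough sketch (dbo.ColumnReference, its columns, and [RemoteDb] are hypothetical stand-ins for the reference table and linked database; STRING_AGG needs SQL Server 2017+, with FOR XML PATH/STUFF as the usual alternative on older versions):
-- Sketch only: build a comma-separated column list for one table from the
-- reference table, then run the dynamic query against the linked server.
DECLARE @TableID int = 1;
DECLARE @TableName sysname;
DECLARE @ColumnsString nvarchar(max);

SELECT @TableName     = MAX(TableName),
       @ColumnsString = STRING_AGG(QUOTENAME(ColumnName), ', ')
FROM dbo.ColumnReference
WHERE TableID = @TableID;

DECLARE @sql nvarchar(max) =
    N'SELECT ' + @ColumnsString +
    N' FROM [LinkedServer].[RemoteDb].dbo.' + QUOTENAME(@TableName) + N';';

EXEC sp_executesql @sql;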

Need a recommendation on copying data in SQL Server (2008R2)

I have several versions of a single Access table which I need to move to a SQL Server database that I have built. What I mean by several versions is that there are 6 different schools that each built their own Access table to track records, and I am consolidating them. The wrinkle is that each location made their table slightly different.
The Access table is like:
Student (StudentId,
StudentName,
StudentAddress,
TeacherId,
TeacherName,
Class1,
Class2,
Class3,
Class4,
Class5,
StudentStatus)
I have built a new database with multiple tables, such as (this is a rough layout):
Student table (StudentId, FirstName, LastName, TeacherId)
Teacher table (TeacherId, FirstName, LastName)
Class table (ClassId, ClassName, TeacherId)
etc
I imported the access tables to a 'staging' schema in the sql server DB (staging.Location1, staging.Location2, etc). Now I need to either create an SSIS package to copy the data to the new tables, or write some stored procs. I imagine I will need to do each location individually.
As I stated above, each Access table is slightly different, with different data types and column names. For example, one location has StudentID as a varchar called StudId, while another has it as an int called SID or StudentId, etc.
I cannot decide on the most efficient route. Am I correct in thinking that, either way, this will require at least one SSIS package or stored procedure per location due to the differences?
I say do it this way:
Create only one staging table that covers all the data you have in those source tables (see the sketch after these steps). Use the right data types and field names, use an identity field as the primary key, and use the school name as an identifier for each school.
Copy the data from each Access db into that single staging table, doing data conversion where needed. This way you will have all the data in one place, and it will all be consistent.
Read the data from staging and populate the tables in your new design. You only need one SSIS package to do this part; you just change the destination connection and the filter on school name.
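A hedged sketch of what that single staging table could look like (all names and types are illustrative only):
-- Sketch only: one wide staging table covering every location, with an
-- identity key and a school identifier; types are widened to fit all sources.
CREATE TABLE staging.StudentImport
(
    ImportId       int IDENTITY(1,1) PRIMARY KEY,
    SchoolName     varchar(50)  NOT NULL,   -- identifies the source location
    StudentId      varchar(20)  NULL,       -- varchar covers both int and varchar sources
    StudentName    varchar(100) NULL,
    StudentAddress varchar(200) NULL,
    TeacherId      varchar(20)  NULL,
    TeacherName    varchar(100) NULL,
    Class1         varchar(50)  NULL,
    Class2         varchar(50)  NULL,
    Class3         varchar(50)  NULL,
    Class4         varchar(50)  NULL,
    Class5         varchar(50)  NULL,
    StudentStatus  varchar(20)  NULL
);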

A generic sql query to archive more than one table

How could we manage the archive process without writing separate stored procedures
in SQL Server 2000?
For example,
There are two tables in the current db: student and employee.
The objective is to archive the data in these tables:
student table - data older than 1 year
employee table - data older than 2 years
The date field to be compared in the student table is CreatedDate, and for the employee table it is DOJ.
In addition, I have kept a configuration table with the columns ConfigtableName, ConfigColumnName, ConfigCutoffdate.
a) How can I write a generic query so that it dynamically takes the table name as well as the column name from the configuration table and inserts the data into the archive db's tables?
Something like this....
INSERT INTO <ArchiveDb>.Dbo.<Table name obtained from config table>
SELECT *
FROM <CurrentDb>.Dbo.<Table name obtained from config table>
WHERE
<ConfigColumnName obtained from config table> < <Cutoffdate obtained from config table>
b) How do I manage the identity field set option?
c) If an error occurs in the nth iteration, is it possible to save the error details to a log?
The only way to construct such a dynamic query in a stored procedure is by using the sp_executesql stored procedure. Read the documentation I linked; it's pretty straightforward.
I am not sure I understand what you mean by "identity field set option", but if you are concerned about duplicate values in a column that should have unique values (a PK), I'd recommend that you disable the unique indexes on the archive tables, since they are only for archiving. I don't expect a major issue with duplicate values in an id column, but most importantly, that situation should never arise anyway if the archive tables are identical copies of the source tables.
If you want to catch errors in the nth iteration, you are going to have to enclose every iteration in a begin tran/commit tran block and check for errors. If there's one, you can log to any other table you choose; if not, then you commit the transaction. Read this for an example (scroll all the way down to the Transactions section).
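Putting the pieces together, a rough sketch of the loop (database, table, and log names are placeholders; error handling is simplified to an @@ERROR check, which is roughly what was available on SQL Server 2000):
-- Sketch only: loop over the configuration table and archive each table
-- with dynamic SQL via sp_executesql, logging failures per iteration.
DECLARE @tbl sysname, @col sysname, @cutoff datetime, @sql nvarchar(4000);

DECLARE cfg CURSOR FOR
    SELECT ConfigtableName, ConfigColumnName, ConfigCutoffdate
    FROM dbo.ArchiveConfig;              -- placeholder name for the config table

OPEN cfg;
FETCH NEXT FROM cfg INTO @tbl, @col, @cutoff;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'INSERT INTO ArchiveDb.dbo.' + QUOTENAME(@tbl)
             + N' SELECT * FROM CurrentDb.dbo.' + QUOTENAME(@tbl)
             + N' WHERE ' + QUOTENAME(@col) + N' < @cutoff';

    BEGIN TRAN;
    EXEC sp_executesql @sql, N'@cutoff datetime', @cutoff = @cutoff;
    IF @@ERROR <> 0
    BEGIN
        ROLLBACK TRAN;
        INSERT INTO dbo.ArchiveErrorLog (TableName, ErrorDate)   -- placeholder log table
        VALUES (@tbl, GETDATE());
    END
    ELSE
        COMMIT TRAN;

    FETCH NEXT FROM cfg INTO @tbl, @col, @cutoff;
END
CLOSE cfg;
DEALLOCATE cfg;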
