I'm working on creating some stored procedures to automatically mirror data from multiple servers/databases (40+) into one central server. I have created a table that has the column names from the databases that I am referencing like this:
What I want to do is essentially grab the COUNT(ColumnID) based on the @TableID table variable that I declare. From there, the central server will already have corresponding columns for each of these sets of columns, and the same reference table also lists, by TableID and ColumnID, the name of the central database's column for each of them. I want to pull these column names into an array and/or string that I can use to EXEC a dynamic query such as EXEC('SELECT '+@ColumnsString+' FROM [LinkedServer].'+@TableName+'');
I already have the dynamic linked servers set up and working, but I'm looking for a way to store the multiple column names in a single array that I can reference both to build a string variable and to use in UPDATE queries. I.e.: SET @ColumnString = @arrayvalue[1]+','+@arrayvalue[2]+','+@arrayvalue[n];
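Roughly, this is the shape of what I have in mind (just a sketch; the reference table dbo.ColumnReference and its column names are made up here, since my actual table isn't shown above):

DECLARE @TableID int = 1;
DECLARE @TableName sysname;
DECLARE @ColumnString nvarchar(max);

-- Table name for this TableID (assumes the reference table also stores it).
SELECT TOP (1) @TableName = TableName
FROM dbo.ColumnReference
WHERE TableID = @TableID;

-- Build a comma-separated column list; FOR XML PATH works on SQL Server 2008+.
SELECT @ColumnString = STUFF((
    SELECT ',' + QUOTENAME(SourceColumnName)
    FROM dbo.ColumnReference
    WHERE TableID = @TableID
    ORDER BY ColumnID
    FOR XML PATH(''), TYPE).value('.', 'nvarchar(max)'), 1, 1, '');

-- Dynamic pull from the linked server, as described above.
EXEC('SELECT ' + @ColumnString + ' FROM [LinkedServer].' + @TableName);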
Is there a function available in SQL Server that could accomplish this?
I have a database on SQL Server on premises and need to regularly copy the data from 80 different tables to an Azure SQL Database. For each table, the columns I need to select and map are different; for example, for TableA I need columns 1, 2 and 5, while for TableB I need just column 1. The tables are named the same in the source and target, but the column names are different.
I could create multiple Copy data pipelines and select the source and target data sets and map to the target table structures, but that seems like a lot of work for what is ultimately the same process repeated.
I've so far created a meta table, which lists all the tables and the column mapping information. This table holds the following data:
SourceSchema, SourceTableName, SourceColumnName, TargetSchema, TargetTableName, TargetColumnName.
For each table, data is held in this table to map the source tables to the target tables.
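A rough DDL sketch of that meta table (the table name and the data types here are just assumptions) looks like this:

CREATE TABLE dbo.TableColumnMapping
(
    SourceSchema     sysname NOT NULL,
    SourceTableName  sysname NOT NULL,
    SourceColumnName sysname NOT NULL,
    TargetSchema     sysname NOT NULL,
    TargetTableName  sysname NOT NULL,
    TargetColumnName sysname NOT NULL
);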
I have then created a Lookup which selects each table from the mapping table. It then goes into a ForEach loop and does another Lookup to get the source and target column data for the table in the ForEach iteration.
From this information, I'm able to map the Source table and the Sink table in a Copy Data activity created within the foreach loop, but I'm not sure how I can dynamically map the columns, or dynamically select only the columns I require from each source table.
I have the "activity('LookupColumns').output" from the column lookup, but would be grateful if someone could suggest how I can use this to then map the source columns to the target columns for the copy activity. Thanks.
In your case, you can use an expression in the mapping setting of the Copy Data activity.
You need to provide an expression whose value looks like this:
{"type":"TabularTranslator","mappings":[
    {"source":{"name":"Id"},"sink":{"name":"CustomerID"}},
    {"source":{"name":"Name"},"sink":{"name":"LastName"}},
    {"source":{"name":"LastModifiedDate"},"sink":{"name":"ModifiedDate"}}
]}
So you need to add a column named Translator to your meta table, and its value should be JSON like the above. Then use this expression to do the mapping: @item().Translator
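For example, the Translator value could be stored with something like this (a sketch only; the meta table name dbo.TableColumnMapping and the mapping values are illustrative, and it assumes the row feeding the ForEach iteration is one row per table):

-- Add a Translator column to the meta table and store the mapping JSON for one table.
ALTER TABLE dbo.TableColumnMapping ADD Translator nvarchar(max) NULL;

UPDATE dbo.TableColumnMapping
SET Translator = N'{"type":"TabularTranslator","mappings":[
    {"source":{"name":"Id"},"sink":{"name":"CustomerID"}},
    {"source":{"name":"Name"},"sink":{"name":"LastName"}},
    {"source":{"name":"LastModifiedDate"},"sink":{"name":"ModifiedDate"}}]}'
WHERE SourceTableName = N'Customer';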
Reference: https://learn.microsoft.com/en-us/azure/data-factory/copy-activity-schema-and-type-mapping#parameterize-mapping
I don't know if this already has an answer as I wasn't able to find it, but if so please provide the link.
Column names are used in user-defined stored procedures and functions when accessing certain columns in a table. But what if I want to change a column name in a table when I have referenced that column name many times in multiple usp's/udf's?
I am currently using SQL Server and obviously it will not let me rename a table column since it is referenced in multiple existing usp's/udf's. Is there a way to update the column name of a table together with the column names in those usp's/udf's?
I have a DapperRow dynamic object containing fields retrieved from a table. I had gotten the DapperRow from a previous Query.
I'd like to insert these fields, and a few others, into a different table. However, I can't pass the dynamic to Execute:
con.Execute(insertStatment, rowData);
I will drop back to standard SQL and bind the parameters myself -- I'm just wondering if there's some Dapper feature I'm missing.
Thanks
I have a linked table in my Access database (dbo_Billing_denied (DSN=WTSTSQL05_BB;DATABASE=DEPTFINANCE), etc.) and I want to create a table that will store the data from this linked table in a local table, so I can use it to run other queries. Currently I can't use the linked table because it tells me that it cannot make a connection (ODBC--connection to 'WTSTSQL05_BB' failed).
Do I have to create a table first and define all the fields before I can do this (create a table with fields that are the same as what's in the linked table and then create an append query to do this...)?
It sounds like you might have two problems; I will address the second one. You will need to re-establish the connection to the linked table before any of this will work.
You can use a "make table query" in Access to make a local copy of the linked table. You can use the GUI for this, or you can structure the SQL something like this:
SELECT <list of various fields, or * for all fields>
INTO <name of new local table>
FROM <name of linked table(s) on the server>
WHERE <any other conditions you want to put on which records are included>;
I mentioned that there might be more than one table. You can also do this with joined tables or unions etc. The "where" clause is optional. Removing it will copy the entire data set.
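For a concrete example using the linked table from the question (the local table name Billing_Denied_Local is just an illustration), the make-table query could be as simple as:

SELECT *
INTO Billing_Denied_Local
FROM dbo_Billing_denied;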
You will get a warning when you try to execute this query in Access. It will tell you that you are about to write (or overwrite) a table. If you are trying to write a cleaner application with fewer nuisance messages for the end user, call this query from a macro. The macro would need to turn the warnings off, execute the query, then turn the warnings back on.
Microsoft Access does not require you to create this table before you write it; if the table does not exist Access will create this table for you, based on the field definitions in the source data. If a table of the same name does exist, Access will drop this table from your database and then create a new table of that name.
This also implies that the local table you are generating will need a unique name. If your query tries to overwrite the linked table by using the same name, the first thing Access will do is drop the linked table. It will then look for field definitions and input data in the linked table that it just dropped.
Since the new local table will have a new name, queries developed for the linked table will not work with the new local table. One possible work-around would be to rename the linked table in your local Access database. The table name in Access does not need to equal the name in the database it's linking to. The query could then write to a table with the correct name, and previous queries should work. Still, keep in mind that these queries would no longer be working on live data.
Is there any way to get information about the updated/deleted tables in a trigger, meaning
which columns are in the inserted/deleted tables?
which data types do they have?
The background for this question is: I would like to create a 'generic' trigger which can be used without needing to be adapted for the table in question.
Dummy code:
foreach column in inserted table
    get column name and data type (e.g. to exclude text columns or columns with a special name)
    if the value has changed, do some logging into another table
Unfortunately I have not been able to find the needed information so far; I have to admit that I don't know the proper keywords to search for.
It would even be helpful to get a list of functions (e.g. UPDATE()) which can be used in triggers to query information about the inserted/deleted tables.
The TSQL code should work on MS SQL Server >= 2008.
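To illustrate what I mean, here is a rough sketch of the kind of trigger I'm picturing (assuming the column metadata can be read from sys.columns; the table and log-table names are placeholders):

CREATE TRIGGER trg_SomeTable_Audit
ON dbo.SomeTable
AFTER UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    -- @@PROCID is the trigger's own object_id; its parent_object_id is the base table.
    DECLARE @BaseTable int;
    SELECT @BaseTable = parent_object_id
    FROM sys.objects
    WHERE object_id = @@PROCID;

    -- Column names and data types of the table the trigger fires on
    -- (e.g. to exclude text columns or columns with a special name).
    SELECT c.name AS ColumnName,
           t.name AS DataTypeName
    FROM sys.columns AS c
    JOIN sys.types   AS t ON t.user_type_id = c.user_type_id
    WHERE c.object_id = @BaseTable;

    -- From here the idea would be to compare inserted/deleted per column
    -- (COLUMNS_UPDATED() and UPDATE(column) can help) and log changes into another table.
END;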
TIA