I have an Excel list with object names and column names, like so:
Object Name    Column Name
OBJECT1        Column A
OBJECT2        Column C
And I want to know if there's a way to search for all of the object names and/or column names from the list at once, to see whether they exist in any stored procedure in any database on the SQL Server instance. I know I can use the SQL Search utility, but it can only search for one object at a time.
Thanks
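A minimal sketch of one possible approach in plain T-SQL, assuming the Excel list has first been imported into a temp table (all object names below are placeholders, and it relies on the undocumented but widely used sp_MSforeachdb procedure):

-- Terms to search for, imported from the Excel list.
CREATE TABLE #SearchTerms (Term SYSNAME);
INSERT INTO #SearchTerms VALUES ('OBJECT1'), ('Column A'), ('OBJECT2'), ('Column C');

CREATE TABLE #Matches (DbName SYSNAME, ProcName SYSNAME, Term SYSNAME);

-- Scan every procedure definition in every database; '?' is replaced
-- with each database name in turn.
EXEC sp_MSforeachdb '
    USE [?];
    INSERT INTO #Matches (DbName, ProcName, Term)
    SELECT DB_NAME(), OBJECT_NAME(m.object_id), t.Term
    FROM sys.sql_modules AS m
    JOIN sys.procedures  AS p ON p.object_id = m.object_id
    JOIN #SearchTerms    AS t ON m.definition LIKE ''%'' + t.Term + ''%'';';

SELECT * FROM #Matches ORDER BY DbName, ProcName;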
I have a database on SQL Server on premises and need to regularly copy the data from 80 different tables to an Azure SQL Database. For each table, the columns I need to select and map are different - for example, for TableA I need columns 1, 2 and 5, while for TableB I need just column 1. The tables are named the same in the source and target, but the column names are different.
I could create multiple Copy data pipelines and select the source and target data sets and map to the target table structures, but that seems like a lot of work for what is ultimately the same process repeated.
I've so far created a meta table, which lists all the tables and the column mapping information. This table holds the following data:
SourceSchema, SourceTableName, SourceColumnName, TargetSchema, TargetTableName, TargetColumnName.
For each table, data is held in this table to map the source tables to the target tables.
I have then created a Lookup activity that selects each table from the mapping table, followed by a ForEach loop containing another Lookup that retrieves the source and target column data for the table in the current iteration.
From this information, I'm able to map the Source table and the Sink table in a Copy Data activity created within the foreach loop, but I'm not sure how I can dynamically map the columns, or dynamically select only the columns I require from each source table.
I have the "activity('LookupColumns').output" from the column lookup, but would be grateful if someone could suggest how I can use this to then map the source columns to the target columns for the copy activity. Thanks.
In your case, you can use an expression in the mapping setting of the Copy Data activity.
The expression needs to produce JSON data shaped like this:
{
    "type": "TabularTranslator",
    "mappings": [
        { "source": { "name": "Id" }, "sink": { "name": "CustomerID" } },
        { "source": { "name": "Name" }, "sink": { "name": "LastName" } },
        { "source": { "name": "LastModifiedDate" }, "sink": { "name": "ModifiedDate" } }
    ]
}
So add a column named Translator to your meta table whose value is JSON like the above, then use this expression to do the mapping: @item().Translator (if the Lookup returns the value as a string, you may need to wrap it as @json(item().Translator)).
Reference: https://learn.microsoft.com/en-us/azure/data-factory/copy-activity-schema-and-type-mapping#parameterize-mapping
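For completeness, a minimal sketch of how the Translator value could be generated from the meta table described in the question, using SQL Server's FOR JSON PATH (SQL Server 2016+; the table name dbo.MetaMapping is an assumption):

-- Build the TabularTranslator JSON per table from the mapping rows.
SELECT t.SourceSchema,
       t.SourceTableName,
       '{"type":"TabularTranslator","mappings":'
       + (SELECT m.SourceColumnName AS [source.name],
                 m.TargetColumnName AS [sink.name]
          FROM dbo.MetaMapping AS m
          WHERE m.SourceSchema    = t.SourceSchema
            AND m.SourceTableName = t.SourceTableName
          FOR JSON PATH)
       + '}' AS Translator
FROM (SELECT DISTINCT SourceSchema, SourceTableName
      FROM dbo.MetaMapping) AS t;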
I don't know if this already has an answer as I wasn't able to find it, but if so please provide the link.
Column names are used in user-defined stored procedures and functions when accessing certain columns in a table. But what if I want to rename a column in a table when I have referenced that column many times across multiple usp's/udf's?
I am currently using SQL Server, and obviously it will not simply let me rename a table column, since it is referenced in multiple existing usp's/udf's. Is there a way to rename a column of a table together with the column names in those usp's/udf's?
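There is no built-in cascade rename for the text of modules, but a rough sketch of the usual two-step approach looks like this (all object names below are placeholders):

-- Rename the column itself. This succeeds with a dependency warning
-- unless schema-bound objects reference the column.
EXEC sp_rename 'dbo.MyTable.OldCol', 'NewCol', 'COLUMN';

-- Then find every module whose definition still mentions the old name,
-- so each one can be ALTERed by hand.
SELECT OBJECT_SCHEMA_NAME(m.object_id) AS SchemaName,
       OBJECT_NAME(m.object_id)        AS ModuleName
FROM sys.sql_modules AS m
WHERE m.definition LIKE '%OldCol%';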
Background
I'm using Azure data factory v2 to load data from on-prem databases (for example SQL Server) to Azure data lake gen2. Since I'm going to load thousands of tables, I've created a dynamic ADF pipeline that loads the data as-is in the source based on parameters for schema, table name, modified date (for identifying increments) and so on. This obviously means I can't specify any type of schema or mapping manually in ADF. This is fine since I want the data lake to hold a persistent copy of the source data in the same structure. The data is loaded into ORC files.
Based on these ORC files I want to create external tables in Snowflake with virtual columns. I have already created normal tables in Snowflake with the same column names and data types as in the source tables, which I'm going to use in a later stage. I want to use the information schema for these tables to dynamically create the DDL statement for the external tables.
The issue
Since unquoted column names are always upper case in Snowflake, and Snowflake is case-sensitive in many ways, it is unable to parse the ORC files with the dynamically generated DDL statement, because the definitions of the virtual columns no longer match the casing of the source column names. For example, it will generate a virtual column as ID NUMBER AS (value:ID::NUMBER).
This will return NULL, as the column is named "Id" with a lower-case "d" in the source database, and therefore also in the ORC file in the data lake.
This feels like a major drawback with Snowflake. Is there any reasonable way around this issue? The only options I can think of is to:
1. Load the information schema from the source database to Snowflake separately and use that data to build a correct virtual column definition with correct cased column names.
2. Load the records in their entirety into some variant column in Snowflake, converted to UPPER or LOWER.
Both options add a lot of complexity or even mess up the data. Is there any straightforward way to return only the column names from an ORC file? Ultimately, I would need something like Snowflake's DESCRIBE TABLE that works on the files in the data lake.
Unless you set the parameter QUOTED_IDENTIFIERS_IGNORE_CASE = TRUE, you can declare your columns in the casing you want:
CREATE TABLE "MyTable" ("Id" NUMBER);
If your dynamic SQL carefully uses "Id" and not just Id, you will be fine.
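The same applies to the external tables themselves. A hypothetical example of generated DDL that preserves the source casing with quoted identifiers (database, stage and table names are placeholders):

CREATE EXTERNAL TABLE my_ext_table (
    -- Quoted on both sides, so the virtual column and the ORC path
    -- both keep the original "Id" casing.
    "Id" NUMBER AS (value:"Id"::NUMBER)
)
LOCATION = @my_db.my_schema.my_stage/my_directory/
FILE_FORMAT = (TYPE = ORC);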
Found an even better way to achieve this, so I'm answering my own question.
With the query below we can get the path/column names directly from the ORC file(s) in the stage, with a hint of the data type from the source. It filters out columns that only contain NULL values. I will most likely create some kind of data type ranking table for the final data type determination for the virtual columns we're aiming to define dynamically for the external tables.
SELECT f.path AS "ColumnName"
     , TYPEOF(f.value) AS "DataType"
     , COUNT(1) AS NbrOfRecords
FROM (
    SELECT $1 AS "value"
    FROM @<db>.<schema>.<stg>/<directory>/ (FILE_FORMAT => '<fileformat>')
),
LATERAL FLATTEN(input => "value", recursive => TRUE) f
WHERE TYPEOF(f.value) != 'NULL_VALUE'
GROUP BY f.path, TYPEOF(f.value)
ORDER BY 1;
I'm working on creating some stored procedures to automatically mirror data from multiple servers/databases (40+) into one central server. I have created a reference table that holds the column names from the databases that I am referencing.
What I'm wanting to do is essentially grab the COUNT(ColumnID) based on the @TableID table variable that I declare. The central server will already have corresponding columns for each of these sets of columns, and the reference table also holds, for each TableID and ColumnID, the name of the corresponding column in the central database. I want to pull these column names into an array and/or string that I can use to EXEC a dynamic query such as:
EXEC('SELECT ' + @ColumnsString + ' FROM [LinkedServer].' + @TableName);
I already have the dynamic linked servers set up and working. But I'm looking for a way to store the multiple column names in a single array that I can reference both to build a string variable and in UPDATE queries, i.e. something like:
SET @ColumnString = @arrayvalue[1] + ',' + @arrayvalue[2] + ',' + @arrayvalue[n];
Is there a function available in SQL Server that could accomplish this?
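T-SQL has no native arrays, but the comma-separated list can be built directly with string aggregation. A sketch, assuming the reference table is named dbo.ColumnMap with a SourceColumnName column (STRING_AGG needs SQL Server 2017+; older versions can use the FOR XML PATH trick instead):

DECLARE @TableName     SYSNAME = N'dbo.SomeTable';  -- placeholder
DECLARE @ColumnsString NVARCHAR(MAX);

-- Collapse the column names for the current table into one
-- comma-separated, properly quoted list.
SELECT @ColumnsString = STRING_AGG(QUOTENAME(SourceColumnName), ',')
FROM dbo.ColumnMap
WHERE TableID = 1;  -- the table currently being processed

EXEC('SELECT ' + @ColumnsString + ' FROM [LinkedServer].' + @TableName + ';');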
Is there any way to get information about the inserted/deleted tables in a trigger, meaning:
- which columns are in the inserted/deleted tables?
- which data types do they have?
The background for this question is: I would like to create a 'generic' trigger which can be used without having need to be adapted for the table in question.
Dummy code:
foreach column in inserted table
    get column name and data type (e.g. to exclude text columns or columns with a special name)
    if the value has changed, do some logging into another table
Unfortunately I could not find the needed information so far; I have to admit that I don't know the proper keywords to search for.
It would even be helpful to get a list of functions (e.g. UPDATE()) which can be used in triggers to query information about the inserted/deleted tables.
The T-SQL code should work on MS SQL Server >= 2008.
TIA
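A sketch of the metadata part of such a trigger (SQL Server 2008+; trigger and table names are placeholders). Inside a trigger, @@PROCID is the trigger's own object ID, and sys.triggers.parent_id points at the base table, whose columns have the same shape as the inserted/deleted tables:

CREATE TRIGGER dbo.trg_GenericLog ON dbo.SomeTable
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    -- Column names and data types of the table the trigger fires on
    -- (shown as a plain SELECT here; real code would feed the logging).
    SELECT c.name AS ColumnName,
           t.name AS DataType,
           c.column_id
    FROM sys.columns AS c
    JOIN sys.types   AS t ON t.user_type_id = c.user_type_id
    WHERE c.object_id = (SELECT parent_id
                         FROM sys.triggers
                         WHERE object_id = @@PROCID)
      AND t.name NOT IN ('text', 'ntext', 'image');  -- example exclusion

    -- COLUMNS_UPDATED() returns a varbinary bitmask with one bit per
    -- column (ordered by column_id) marking columns assigned by the
    -- UPDATE; IF UPDATE(ColumnName) is the per-column shorthand.
END;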