Automatic Generation of DDL scripts - snowflake-cloud-data-platform

In practice, when an organization is migrating from an on-prem Teradata system to Snowflake, how do you create all the objects in Snowflake (databases, schemas, tables, and views, for example)?
Clearly one cannot create each object manually (let's say there are some 5,000-odd tables). How is this automatic generation of DDL scripts done in practice?
I am also trying to avoid any third-party tools like Roboquery.

Yes, you can generate DDL scripts using the GET_DDL() function.
You can find more information here:
https://docs.snowflake.com/en/sql-reference/functions/get_ddl.html
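For example, to retrieve the DDL for a single object (the names below are placeholders):
SELECT GET_DDL('TABLE', 'MYDB.MYSCHEMA.MYTABLE');
SELECT GET_DDL('DATABASE', 'MYDB', TRUE); -- TRUE makes the recreated objects use fully qualified names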

Are you possibly looking for something like this?
It will require more work to automate the process fully, but here is the idea:
Iterate through your information schema to get the DDL for each object type (database/table/procedure/view/etc.).
Then automate pushing the results to a file, and you can assemble an entire "rebuild" script as you see fit (see the Snowflake Scripting sketch after the notes below).
Hopefully this gets the idea going for what you might be looking for.
USE DATABASE SNOWFLAKE;
SELECT (
    'USE DATABASE ' || CHAR(39) || DB.DATABASE_NAME || CHAR(39) || '; ' ||
    'SELECT GET_DDL(' || CHAR(39) || 'DATABASE' || CHAR(39) || ',' || CHAR(39) || DB.DATABASE_NAME || CHAR(39) || ', 1);' || CHAR(13) || CHAR(10)
) AS SHOW_DATABASE_DDLS
FROM INFORMATION_SCHEMA.DATABASES AS DB;
It will generate one USE + GET_DDL statement per database:
-- EXAMPLE
-- USE DATABASE 'ONEOFYOURDBS'; SELECT GET_DDL('DATABASE','ONEOFYOURDBS', 1);
The next layer down is tables/views. This is subject to the restrictions of each database, so you MUST change databases; then you can run these for each database to get all your DDLs.
USE DATABASE TESTDATABASE;
WITH CONFIG AS (
    SELECT
          '%' AS TGT_DATABASE_FILTER
        , '%' AS TGT_SCHEMA_FILTER
        , '%' AS TGT_TABLE_FILTER
)
SELECT
(
    '/* =============================== AUTO-GENERATED FROM SCRIPT BEGIN =============================== */' || CHAR(13) || CHAR(10) ||
    'SELECT GET_DDL(' || CHAR(39) || 'TABLE' || CHAR(39) || ', ' || CHAR(39) || CHAR(34) ||
    T.TABLE_CATALOG || CHAR(34) || '.' || CHAR(34) || T.TABLE_SCHEMA || CHAR(34) || '.' || CHAR(34) || T.TABLE_NAME || CHAR(34) || CHAR(39) || ', 1);' || CHAR(13) || CHAR(10) ||
    '/* =============================== AUTO-GENERATED FROM SCRIPT END =============================== */' || CHAR(13) || CHAR(10)
) AS SHOW_TABLE_DDLS
FROM INFORMATION_SCHEMA.TABLES AS T
CROSS JOIN CONFIG AS C
WHERE UPPER(T.TABLE_TYPE) = 'BASE TABLE'
  AND T.TABLE_SCHEMA != 'INFORMATION_SCHEMA'
  AND UPPER(T.TABLE_CATALOG) LIKE UPPER(C.TGT_DATABASE_FILTER)
  AND UPPER(T.TABLE_SCHEMA) LIKE UPPER(C.TGT_SCHEMA_FILTER)
  AND UPPER(T.TABLE_NAME) LIKE UPPER(C.TGT_TABLE_FILTER)
;
This generates statements like the following. (The trailing ", 1" argument makes GET_DDL return the fully qualified object path.)
/* =============================== AUTO-GENERATED FROM SCRIPT BEGIN =============================== */
SELECT GET_DDL('TABLE', '"TESTDATABASE"."TESTSCHEMA"."TESTTABLE"', 1);
/* =============================== AUTO-GENERATED FROM SCRIPT END =============================== */
NOTES:
-- FYI
SELECT CHAR(39); -- SINGLE QUOTE
SELECT CHAR(34); -- DOUBLE QUOTE
SELECT CHAR(13); -- CARRIAGE RETURN
SELECT CHAR(10); -- NEW LINE/LINE BREAK
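To take the idea one step further inside Snowflake itself, here is a minimal Snowflake Scripting sketch (TESTDATABASE and all names are assumptions, not part of the original answer) that loops over the base tables with a cursor and concatenates each table's DDL into one string, which your client can then spool to a file. In a classic worksheet or SnowSQL you may need to wrap the block in EXECUTE IMMEDIATE $$ ... $$:
DECLARE
    full_ddl STRING DEFAULT '';
    c1 CURSOR FOR
        SELECT TABLE_CATALOG, TABLE_SCHEMA, TABLE_NAME
        FROM TESTDATABASE.INFORMATION_SCHEMA.TABLES
        WHERE TABLE_TYPE = 'BASE TABLE'
          AND TABLE_SCHEMA != 'INFORMATION_SCHEMA';
BEGIN
    FOR rec IN c1 DO
        -- Append each table's DDL, fully qualified, one statement per line
        full_ddl := full_ddl || GET_DDL('TABLE',
            '"' || rec.TABLE_CATALOG || '"."' || rec.TABLE_SCHEMA || '"."' || rec.TABLE_NAME || '"',
            TRUE) || CHAR(10);
    END FOR;
    RETURN full_ddl;
END;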

Related

Looking for Dynamic Stored Procedure, to use for multiple queries [duplicate]

This question already has an answer here:
THE CURSOR CURSOR NAME IS NOT IN A PREPARED STATE
(1 answer)
Closed 2 years ago.
I want to design a dynamic stored procedure where I will pass in the column names, table name, and WHERE clause, so that I can use the same stored procedure to run a SELECT on different tables with different parameters.
I'm not sure if this is possible. If yes, can anyone help me with an example?
For example.
Query 1: Select name, number, total into out_name, out_number, out_total from student where total > 100;
Query 2: select book into out_book from lib where cost > 100;
I should be able to execute the above queries in a single stored procedure by passing the columns, table, and WHERE clause.
I created something like the code below. I did something similar for DELETE, and the delete version works fine.
SET V_SELECT =
'SELECT ' || SELECT_FIELDS ||
' FROM ' || TABLE_NAME ||
' WHERE ' || WHERE_CLAUSE ||
' WITH UR';
EXECUTE IMMEDIATE V_SELECT INTO || INTO_FIELDS ||;
CREATE PROCEDURE usp_DynamicProc
(
      @SelectFields  NVARCHAR(1024)
    , @IntoTableName NVARCHAR(255)
    , @TableName     NVARCHAR(255)
    , @WhereClause   NVARCHAR(1024)
)
AS
DECLARE @SQL NVARCHAR(MAX);
SELECT @SQL = 'SELECT ' + @SelectFields + ISNULL(' INTO ' + @IntoTableName, '') + ' FROM ' + @TableName + ' WHERE ' + @WhereClause;
EXEC sp_executesql @SQL;
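Since everything is concatenated as raw strings, this procedure is open to SQL injection. A hedged variant of the same idea (the procedure name here is made up for illustration) brackets the table identifier with QUOTENAME, assuming a single-part table name; the column list and WHERE clause are still raw text, so treat this as a sketch rather than a hardened implementation:
CREATE PROCEDURE usp_DynamicProcQuoted
(
      @SelectFields NVARCHAR(1024)
    , @TableName    SYSNAME
    , @WhereClause  NVARCHAR(1024)
)
AS
BEGIN
    DECLARE @SQL NVARCHAR(MAX) =
          N'SELECT ' + @SelectFields
        + N' FROM '  + QUOTENAME(@TableName)  -- brackets the identifier, e.g. [MyTable]
        + N' WHERE ' + @WhereClause + N';';
    EXEC sp_executesql @SQL;
END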

How do I dump a Postgres DDL into a CSV that can be pasted into Google Sheets?

I am trying to create an output from Postgres that will do something like the following:
schema, table, column, type, attributes
public, table1, id, bigserial, not null primary key
public, table1, name, string, not null
...
As long as the output is formatted with the column headers, I will be fine if it's not CSV (I can export a normal query as a CSV just fine). I am having trouble wrapping my head around the right query.
The closest I've come is as follows. But it feels like I'm on the wrong track and there is a simpler/cleaner way of doing this. I am also not currently getting a single row per table column (which is what I would like).
SELECT table_schema || ',' || table_name || ',' ||
string_agg(column_list.column_expr, ';' || '') ||
'' || ');'
FROM (
SELECT table_schema, table_name, ' ' || column_name || ',' || data_type ||
coalesce('(' || character_maximum_length || ')', ', ') ||
case when is_nullable = 'YES' then '' else ' NOT NULL' end as column_expr
FROM information_schema.columns
WHERE table_schema = 'public'
ORDER BY ordinal_position) column_list
group by table_schema,table_name;
I guess I was overthinking this a bit. Here is the final SQL I ended up using:
SELECT table_schema, table_name, column_list.column_name, column_list.column_expr
FROM (
SELECT table_schema, table_name, column_name, data_type ||
coalesce('(' || character_maximum_length || ')', '') ||
case when is_nullable = 'YES' then '' else ' NOT NULL' end as column_expr
FROM information_schema.columns
WHERE table_schema = 'public'
ORDER BY table_name, column_name) column_list;
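If you want CSV with headers directly (e.g., to paste into Google Sheets), one option, sketched here under the assumption that you run it from psql or another client that supports COPY ... TO STDOUT, is to wrap the query in COPY:
COPY (
    SELECT table_schema, table_name, column_name,
           data_type || coalesce('(' || character_maximum_length || ')', '') AS type,
           CASE WHEN is_nullable = 'YES' THEN '' ELSE 'not null' END AS attributes
    FROM information_schema.columns
    WHERE table_schema = 'public'
    ORDER BY table_name, ordinal_position
) TO STDOUT WITH (FORMAT csv, HEADER true);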

How can I revoke all the permissions a role has in sql server?

I have a role, and it has some SELECT permissions on various tables. I would like to remove all the SELECT permissions that the role has across all tables. E.g.,
revoke all from my_role_name;
But this doesn't seem to work. How can I do this?
I just had the need to do this.
You can use something along the lines of the code below (note that STRING_AGG requires SQL Server 2017 or later):
DECLARE @RevokeScript NVARCHAR(MAX);
DECLARE @PrincipalName SYSNAME = 'my_role_name';
SELECT @RevokeScript = STRING_AGG(CAST('REVOKE ' + permission_name
    + CASE class_desc
          WHEN 'OBJECT_OR_COLUMN' THEN ' ON ' + QUOTENAME(OBJECT_SCHEMA_NAME(major_id)) + '.' + QUOTENAME(OBJECT_NAME(major_id))
          WHEN 'SCHEMA' THEN ' ON SCHEMA::' + SCHEMA_NAME(major_id)
          WHEN 'DATABASE' THEN ''
      END
    + ' TO ' + QUOTENAME(@PrincipalName) COLLATE SQL_Latin1_General_CP1_CI_AS AS NVARCHAR(MAX)), ';')
FROM sys.database_permissions AS pe
WHERE pe.grantee_principal_id = DATABASE_PRINCIPAL_ID(@PrincipalName);
PRINT @RevokeScript;
EXEC (@RevokeScript);
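For a role that has, say, SELECT on two tables, @RevokeScript would contain something like the following (object names assumed for illustration):
-- REVOKE SELECT ON [dbo].[Orders] TO [my_role_name];REVOKE SELECT ON [dbo].[Customers] TO [my_role_name]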

Altering a column (null -> not null) without knowing the data type

Is there a way to alter a column from "allows null" to "does not allow null" without knowing the actual data type of the column?
I think not, so I have written this basic skeleton code for my stored procedure:
SELECT t.name, c.max_length FROM sys.types t
LEFT JOIN sys.columns c ON (t.system_type_id = c.system_type_id)
WHERE object_id = OBJECT_ID(@TableName) AND c.name = @FieldName;
and
EXEC('UPDATE ' + @TableName + ' SET ' + @FieldName + ' = ' + @DefaultValue + ' WHERE ' + @FieldName + ' IS NULL');
EXEC('ALTER TABLE ' + @TableName + ' ALTER COLUMN ' + @FieldName + ' NOT NULL');
I guess now I only have to get the return values from the first query back into the second. I can't get my head around how to get the values into a variable and then access them again. Ideas?
Since the INFORMATION_SCHEMA has all required information and is part of a SQL standard, it might be better to use that in this case (however, SQL Server's ALTER TABLE ALTER COLUMN is non-standard anyway so it might not matter as much).
Either way, you should also be checking for whether there's character length and/or numeric precision being specified, and make sure you're altering the table in the correct schema (and not getting dbo.TableName instead of customschema.TableName). You could try something like this (I used INFORMATION_SCHEMA here but you could easily refactor this to use the sys.columns view):
DECLARE @retVal VARCHAR(500);
SELECT @retVal =
    CASE WHEN CHARACTER_MAXIMUM_LENGTH > 0
             THEN CONCAT(DATA_TYPE, '(', CHARACTER_MAXIMUM_LENGTH, ')')
         WHEN CHARACTER_MAXIMUM_LENGTH = -1 AND DATA_TYPE <> 'xml'
             THEN CONCAT(DATA_TYPE, '(MAX)')
         WHEN DATA_TYPE IN ('numeric', 'decimal')
             THEN CONCAT(DATA_TYPE, '(', NUMERIC_PRECISION, ',', NUMERIC_SCALE, ')')
         ELSE DATA_TYPE
    END
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = @schemaName
  AND TABLE_NAME = @tableName
  AND COLUMN_NAME = @columnName;
@retVal will now capture data types like int, varchar(100), varbinary(MAX), or decimal(10,2) correctly.
And then build up a dynamic SQL Query like this:
DECLARE @sql VARCHAR(MAX);
SET @sql = 'ALTER TABLE ' + @schemaName + '.' + @tableName + ' ALTER COLUMN ' + @columnName + ' ' + @retVal + ' NOT NULL;';
EXEC(@sql);
You select values into variables like this:
SELECT @Var1 = t.name, @Var2 = c.max_length FROM sys.types t
LEFT JOIN sys.columns c ON (t.system_type_id = c.system_type_id)
WHERE object_id = OBJECT_ID(@TableName) AND c.name = @FieldName;
This of course assumes that you have already declared @Var1 and @Var2, and that your query will only return one row.
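Putting the pieces together, here is a minimal end-to-end sketch (all object names are assumptions for illustration) that backfills NULLs with a supplied default and then re-declares the column as NOT NULL:
DECLARE @schemaName   SYSNAME = N'dbo',
        @tableName    SYSNAME = N'MyTable',
        @columnName   SYSNAME = N'MyColumn',
        @defaultValue NVARCHAR(100) = N'0',
        @retVal       VARCHAR(500);

-- Look up the column's full type, as shown above
SELECT @retVal =
    CASE WHEN CHARACTER_MAXIMUM_LENGTH > 0 THEN CONCAT(DATA_TYPE, '(', CHARACTER_MAXIMUM_LENGTH, ')')
         WHEN CHARACTER_MAXIMUM_LENGTH = -1 AND DATA_TYPE <> 'xml' THEN CONCAT(DATA_TYPE, '(MAX)')
         WHEN DATA_TYPE IN ('numeric', 'decimal') THEN CONCAT(DATA_TYPE, '(', NUMERIC_PRECISION, ',', NUMERIC_SCALE, ')')
         ELSE DATA_TYPE
    END
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = @schemaName AND TABLE_NAME = @tableName AND COLUMN_NAME = @columnName;

-- Backfill existing NULLs, then tighten the column
EXEC(N'UPDATE ' + QUOTENAME(@schemaName) + N'.' + QUOTENAME(@tableName)
   + N' SET ' + QUOTENAME(@columnName) + N' = ' + @defaultValue
   + N' WHERE ' + QUOTENAME(@columnName) + N' IS NULL;');
EXEC(N'ALTER TABLE ' + QUOTENAME(@schemaName) + N'.' + QUOTENAME(@tableName)
   + N' ALTER COLUMN ' + QUOTENAME(@columnName) + N' ' + @retVal + N' NOT NULL;');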

change collation on all objects in a database

I need to change the collation on a restored database to match the server and the tempdb database. I understand that I can use ALTER DATABASE and ALTER TABLE to change the collation, but will it only affect new data added to the tables? Do I have to go down the road of:
1. Scripting everything needed to re-create your user databases and all the objects in them
2. Exporting all your data using a tool such as the bcp Utility
3. Creating a new database
4. Scripting the tables in the right collation and importing all the data
It's Microsoft SQL Server 2008.
ALTER DATABASE ... COLLATE only changes the default collation for new objects; existing columns keep their old collation, so I used this script to generate one ALTER statement per affected column:
DECLARE @collation NVARCHAR(64);
SET @collation = 'Latin1_General_CI_AS';
SELECT
    'ALTER TABLE [' + TABLE_SCHEMA + '].[' + TABLE_NAME + '] '
    + 'ALTER COLUMN [' + COLUMN_NAME + '] '
    + DATA_TYPE + '(' + CASE CHARACTER_MAXIMUM_LENGTH
                            WHEN -1 THEN 'MAX'
                            ELSE CAST(CHARACTER_MAXIMUM_LENGTH AS VARCHAR) END + ') '
    + 'COLLATE ' + @collation + ' '
    + CASE WHEN IS_NULLABLE = 'NO' THEN 'NOT NULL' ELSE 'NULL' END
FROM INFORMATION_SCHEMA.columns
WHERE COLLATION_NAME IS NOT NULL
  AND TABLE_NAME IN (SELECT TABLE_NAME FROM information_schema.tables WHERE table_type = 'BASE TABLE')
  AND COLLATION_NAME <> @collation
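Each row of the result is a ready-to-run statement like the following (table and column names assumed for illustration):
-- ALTER TABLE [dbo].[MyTable] ALTER COLUMN [Name] varchar(100) COLLATE Latin1_General_CI_AS NOT NULL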
Also check "Set or Change the Database Collation" in the SQL Server documentation.