In Snowflake, you can use the get_ddl function to get the DDL for tables, views, and procedures, like this:
select get_ddl('view', 'SOME_VIEW_I_CREATED')
But this doesn't seem to work with tasks.
Is there any way to easily get the DDL for Snowflake tasks?
NOTE: This is an old answer. The get_ddl() function now supports tasks.
Here's a script that will kind of generate the DDL you're looking for. It's not perfect but it's worked for my use cases thus far.
show tasks like '%';
SELECT concat('CREATE OR REPLACE TASK ', "name", chr(10)
    , '  warehouse = ', "warehouse", chr(10)
    , case when "predecessor" is null then concat('  ,schedule = ''', "schedule", '''', chr(10)) else '' end
    , '  ,STATEMENT_TIMEOUT_IN_SECONDS = 14400', chr(10)
    , '  ,comment = ''', coalesce("comment", ''), '''', chr(10)  -- coalesce: concat returns NULL if any argument is NULL
    , case when "predecessor" is not null then concat('after ', "predecessor", chr(10)) else '' end
    , 'as', chr(10)
    , "definition", ';'
    , chr(10), 'alter task ', "name", ' resume;') as T
FROM TABLE(RESULT_SCAN(LAST_QUERY_ID()));
As of at least 2019-12-24 (when I discovered it), you can now do this:
select get_ddl('task', 'demo_task')
In order to run this, you have to have the following permissions:
(USAGE on db & OWNERSHIP on schema & Any Permission on task) OR
(USAGE on db & USAGE on schema & Any Permission on task)
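If you want the DDL for every task at once, you can combine get_ddl with show tasks and result_scan. This is a sketch I haven't run against every edge case (my_db.my_schema is a placeholder; note the SHOW output columns are lowercase, so they must be double-quoted):

show tasks in schema my_db.my_schema;

select get_ddl('task', "database_name" || '.' || "schema_name" || '.' || "name")
from table(result_scan(last_query_id()));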
I am trying to build a stored procedure in Snowflake which will:
i. Delete existing records from the dimension table.
ii. Insert records into the dimension table from the corresponding materialized view.
iii. Once the data is inserted, record the entry in a config table.
The first two steps are running fine; the third one is not working. Code below.
Table DIM_ROWCOUNT
CREATE TABLE ANALYTICSLAYER.AN_MEDALLIA_P.DIM_ROWCOUNT
(
"TABLE_NAME" VARCHAR(500),
"VIEW_NAME" VARCHAR(500),
"TABLE_ROWCOUNT" VARCHAR(500) ,
"VIEW_ROWCOUNT" VARCHAR(500),
"LOADDATE" timestamp
)
The SP has a parameter, SRC_TABLE_NAME, which should be loaded into the TABLE_NAME column.
VIEW_NAME will be derived inside the code.
TABLE_ROWCOUNT and VIEW_ROWCOUNT will be calculated within the code.
I need the insert query to span multiple lines.
How do I create the INSERT statement as a multi-line string?
var config_details = `INSERT INTO DBNAME.SCHEMANAME.TABLENAME
SELECT '${VAR_TABLE_NAME}','${VAR_VIEW_NAME}','${VAR_TABLE_ROW_COUNT}','${VAR_VIEW_ROW_COUNT}',getdate();`
var exec_config_details = snowflake.createStatement( {sqlText: config_details} );
var result_exec_config_details = exec_config_details.execute();
result_exec_config_details.next();
Any help is appreciated.
I tend to write complex SQL statements in a JavaScript SP by appending each line/portion of text to the variable, as I find it easier to read/debug.
Something like this should work (though I haven't tested it so there may be typos). Note that to get single quotes round each value I am escaping them in the code (\'):
var insert_DIM_ROWCOUNT = 'INSERT INTO DBNAME.TEST_P.DIM_ROWCOUNT ';
insert_DIM_ROWCOUNT += 'SELECT \'' + SRC_TABLE_NAME + '\', \'' + VIEW_NAME + '\', \'';
insert_DIM_ROWCOUNT += TABLE_ROWCOUNT + '\', \'' + VIEW_ROWCOUNT + '\', ';
insert_DIM_ROWCOUNT += 'GETDATE()'; // keep GETDATE() inside the SQL string so Snowflake evaluates it, not JavaScript
I then test this to make sure the SQL statement being created is what I want by adding a temporary line of code, after this, to just return the SQL string:
return insert_DIM_ROWCOUNT;
Once I'm happy the SQL is correct I then comment out/delete this line and let the SQL execute.
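The same string-building idea can be wrapped in a small helper that also escapes embedded single quotes, which the snippet above doesn't handle. This is just a sketch (the function and variable names are illustrative, and the table name is the one from the example); in a real Snowflake JavaScript SP you'd pass the SP's parameters in:

```javascript
// q() wraps a value in single quotes and doubles any embedded quotes,
// so values like O'Brien are safe to inline in the SQL text.
function q(value) {
  return "'" + String(value).replace(/'/g, "''") + "'";
}

// Build the full INSERT statement line by line, as in the answer above.
function buildInsert(tableName, viewName, tableRowCount, viewRowCount) {
  var sql = 'INSERT INTO DBNAME.TEST_P.DIM_ROWCOUNT ';
  sql += 'SELECT ' + q(tableName) + ', ' + q(viewName) + ', ';
  sql += q(tableRowCount) + ', ' + q(viewRowCount) + ', ';
  sql += 'GETDATE()'; // left as SQL text so Snowflake evaluates it
  return sql;
}
```

You can `return buildInsert(...)` from the SP first to eyeball the generated SQL, then switch to executing it via snowflake.createStatement once it looks right.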
Can anyone help me with creating a replica in EXASOL? That is, I need to copy all the tables, including views, functions, and scripts, from one schema to another schema on the same server.
For example: I want all the data from schema A to be copied (not moved) to schema B.
Many thanks.
Thank you wildraid for your suggestion :)
In order to copy the DDL of all the tables in a schema, I've got a simple way that will give us the DDLs for all the tables:
select t1.CREATE_STATEMENT || coalesce(t2.PK, '') || ');' from  -- coalesce so tables without a primary key still generate a statement
(select C.COLUMN_TABLE, 'CREATE TABLE ' || C.COLUMN_TABLE || '(' || group_concat('"' || C.COLUMN_NAME || '"' || ' ' || COLUMN_TYPE || case when (C.COLUMN_DEFAULT is not null
and C.COLUMN_IS_NULLABLE='true') or (C.COLUMN_DEFAULT<>'NULL' and C.COLUMN_IS_NULLABLE='false') then
' DEFAULT ' || C.COLUMN_DEFAULT end || case when C.COLUMN_IS_NULLABLE='false' then ' NOT NULL ' end
order by column_ordinal_position) CREATE_STATEMENT
from EXA_ALL_COLUMNS C
where
upper(C.COLUMN_SCHEMA)=upper('Source_Schema') and column_object_type='TABLE'
group by C.COLUMN_SCHEMA, C.COLUMN_TABLE order by C.COLUMN_TABLE) t1 left join
(select CONSTRAINT_TABLE, ', PRIMARY KEY (' || group_concat('"' || COLUMN_NAME || '"' order by ordinal_position) || ')' PK
from EXA_ALL_CONSTRAINT_COLUMNS where
constraint_type='PRIMARY KEY' and upper(CONSTRAINT_SCHEMA)=upper('Source_Schema') group by CONSTRAINT_TABLE) t2
on t1.COLUMN_TABLE=t2.constraint_table
order by 1;
Replace Source_Schema with your schema name and it will generate the CREATE statements that you can run in EXAplus.
For copying the data, I have used the same way that you've mentioned in step 2.
Ok, this question consists of two smaller problems.
1) How to copy DDL of all objects in schema
If you need to copy only small amount of schemas, the fastest way is to use ExaPlus client. Right click on schema name and select "CREATE DDL". It will provide you with SQL to create all objects. You may simply run this SQL in context of new schema.
If you have to automate it, you may take a look at this official script: https://www.exasol.com/support/browse/SOL-231
It creates DDL for all schemas, but it can be adapted to use single schema only.
2) How to copy data
This is easier. Just run the following SQL to generate INSERT ... SELECT statements for every table:
SELECT 'INSERT INTO <new_schema>.' || table_name || ' SELECT * FROM <old_schema>.' || table_name || ';'
FROM EXA_ALL_TABLES
WHERE table_schema='<old_schema>';
Copy-paste result and run it to make the actual copy.
I have just recently begun using OPENROWSET to insert images into a table. Previously, I would specify the path to each image (1 image = 1 INSERT statement), and use PHP to generate the image's binary string:
INSERT INTO nopCommerce..Picture (PictureBinary, MimeType, SeoFilename, AltAttribute, TitleAttribute, IsNew)
VALUES (
(
SELECT *
FROM OPENROWSET(BULK '" . $this->image['path'] . "', SINGLE_BLOB) AS Binary
),
'" . $this->image['mime_type'] . "',
'" . $this->image['seo_filename'] . "',
'" . $this->image['alt'] . "',
'',
0
)
However, I am trying to insert all images with a single query. So, I have begun storing the path to each image in a table, and now I need to insert each one as I did before (just using the table's path field instead of a PHP string). But when I attempt the following:
INSERT INTO nopCommerce..Picture (PictureBinary, MimeType, SeoFilename, AltAttribute, TitleAttribute, IsNew)
SELECT
(
SELECT *
FROM OPENROWSET(BULK ImagePath, SINGLE_BLOB) AS Binary
),
MimeType,
Name,
Alt,
'',
0
FROM nopRMS..Product_Image_Mappings
I receive the following error:
Msg 102, Level 15, State 1, Line 5
Incorrect syntax near 'ImagePath'.
So, I tried adding quotes around the column's name (to no avail):
INSERT INTO nopCommerce..Picture (PictureBinary, MimeType, SeoFilename, AltAttribute, TitleAttribute, IsNew)
SELECT
(
SELECT *
FROM OPENROWSET(BULK 'ImagePath', SINGLE_BLOB) AS Binary
),
MimeType,
Name,
Alt,
'',
0
FROM nopRMS..Product_Image_Mappings
Msg 4860, Level 16, State 1, Line 1
Cannot bulk load. The file "ImagePath" does not exist.
There has to be a way to accomplish this, I just cannot find the proper syntax online anywhere. Does anyone know how to tell SQL Server to get the path (string) from dbo.Product_Image_Mappings.ImagePath?
UPDATE
I forgot to give you an example of a value that dbo.Product_Image_Mappings.ImagePath would return. It's paths like \\DEREK\WebImages\1\ca-82300.jpg...
UPDATE
It appears that Eirikur Eiriksson has provided a solution in this thread, but this looks like an overly-complicated method of achieving the same end...
UPDATE (Attempt Using Eirikur Eiriksson's Method)
DECLARE @SQL_STR NVARCHAR(MAX) = N'';
SELECT @SQL_STR = STUFF(
(
SELECT
N'
UNION ALL
SELECT '
+ N'(SELECT X.BulkColumn FROM OPENROWSET(BULK '
+ NCHAR(39) + im.ImagePath + NCHAR(39)
+ N', SINGLE_BLOB) AS X) AS PictureBinary,'
+ NCHAR(39) + im.MimeType + NCHAR(39)
+ N' AS MimeType,'
+ NCHAR(39) + im.Name + NCHAR(39)
+ N' AS SeoFilename,'
+ NCHAR(39) + REPLACE(im.Alt, '''', '''''') + NCHAR(39)
+ N' AS AltAttribute,'
+ N'NULL AS TitleAttribute,'
+ N'0 AS IsNew'
FROM nopRMS..Product_Image_Mappings im
FOR XML PATH(''), TYPE
).value('.[1]','NVARCHAR(MAX)'),1,12,N''
)
INSERT INTO nopCommerce..Picture (PictureBinary, MimeType, SeoFilename, AltAttribute, TitleAttribute, IsNew)
EXEC (@SQL_STR);
This kinda worked, but it only inserted 42 rows (out of 7,200+)... I need this to be 100% accurate :( I admit, though, that I may need to change something about this query, but I don't know enough about it (aside from the basic INSERT, SELECT, etc.)
Maybe don't use OPENROWSET? What you are wanting can be handled in a much simpler and cleaner manner using SQLCLR. You can create a Scalar UDF to just read the contents of a file and return that as a VARBINARY(MAX). Then it will fit in nicely to your existing query. For example:
INSERT INTO nopCommerce.dbo.Picture (PictureBinary, MimeType, SeoFilename,
AltAttribute, TitleAttribute, IsNew)
SELECT
dbo.GetBinaryFile([ImagePath]) AS [PictureBinary],
MimeType,
Name,
Alt,
'',
0
FROM nopRMS.dbo.Product_Image_Mappings;
And how much code does it take for dbo.GetBinaryFile()? Here it is:
using System;
using System.Data.SqlTypes;
using System.IO;
using Microsoft.SqlServer.Server;

// The class name must match the T-SQL wrapper below ([CSVParser].[CSVUtils].[GetBinaryFile])
public class CSVUtils
{
    [return: SqlFacet(MaxSize = -1)]
    [SqlFunction(IsDeterministic = false, IsPrecise = true)]
    public static SqlBytes GetBinaryFile([SqlFacet(MaxSize = 1000)] SqlString FilePath)
    {
        if (FilePath.Value.Trim().Equals(String.Empty))
        {
            return SqlBytes.Null;
        }

        return new SqlBytes(File.ReadAllBytes(FilePath.Value));
    }
}
And the T-SQL wrapper object is the following (please note the WITH RETURNS NULL ON NULL INPUT line as it skips execution if NULL is passed in, hence no need to check for FilePath.IsNull in the C# code :-)
CREATE FUNCTION [dbo].[GetBinaryFile](@FilePath NVARCHAR(1000))
RETURNS VARBINARY(MAX)
WITH RETURNS NULL ON NULL INPUT
AS
EXTERNAL NAME [CSVParser].[CSVUtils].[GetBinaryFile];
The Assembly will need to be marked as WITH PERMISSION_SET = EXTERNAL_ACCESS. Many people go the easy route of setting the database property of TRUSTWORTHY to ON in order to accomplish this, but that is a security risk and isn't even necessary. Just do the following and you can set the Assembly to EXTERNAL_ACCESS while keeping TRUSTWORTHY set to OFF:
Sign the Assembly.
Create an Asymmetric Key in master from that DLL.
Create a Login (also in master) from that Asymmetric Key.
Grant the new Login the EXTERNAL ACCESS ASSEMBLY permission.
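As a sketch, those steps look roughly like this in T-SQL (the key name, login name, and DLL path are illustrative, and the DLL must already be signed):

USE [master];

-- Create an Asymmetric Key from the signed DLL
CREATE ASYMMETRIC KEY [SqlClrFileAccessKey]
    FROM EXECUTABLE FILE = N'C:\Deploy\CSVParser.dll';

-- Create a Login from that Asymmetric Key
CREATE LOGIN [SqlClrFileAccessLogin] FROM ASYMMETRIC KEY [SqlClrFileAccessKey];

-- Grant that Login the EXTERNAL ACCESS ASSEMBLY permission
GRANT EXTERNAL ACCESS ASSEMBLY TO [SqlClrFileAccessLogin];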
You can find detailed instructions on how to do this in Visual Studio / SSDT (SQL Server Data Tools) in the following article that I wrote: Stairway to SQLCLR Level 7: Development and Security (that site does require free registration in order to view the content).
Also, for anyone that does not want to bother with creating / deploying the Assembly, a similar function is available (though not for free) in the Full version of the SQL# library (which I created, and while many functions are free, the File_* file system functions are only in the Full version).
Using the stored procedure sp_msforeachtable it's possible to execute a script for all tables in a database.
However, there are system tables which I'd like to exclude from that. Instinctively, I would check the properties IsSystemTable or IsMSShipped. These don't work like I expect - I have for example a table called __RefactorLog:
But when I query if this is a system or MS Shipped table, SQL Server reports none of my tables are system tables:
exec (N'EXEC Database..sp_msforeachtable "PRINT ''? = '' + CAST(ObjectProperty(Object_ID(''?''), ''IsSystemTable'') AS VARCHAR(MAX))"') AS LOGIN = 'MyETLUser'
-- Results of IsSystemTable:
[dbo].[__RefactorLog] = 0
[schema].[myUserTable] = 0
and
exec (N'EXEC Database..sp_msforeachtable "PRINT ''? = '' + CAST(ObjectProperty(Object_ID(''?''), ''IsMSShipped'') AS VARCHAR(MAX))"') AS LOGIN = 'MyETLUser'
-- Results of IsMSShipped:
[dbo].[__RefactorLog] = 0
[schema].[myUserTable] = 0
When I look into the properties of the table (inside SSMS), the table is marked as a system object. An object property like IsSystemObject doesn't exist though (AFAIK).
How do I check if a table is a system object, apart from the object property? How does SSMS check if a table is a system object?
Management Studio 2008 seems to run some quite ugly code when opening the "System Tables" folder in the object explorer; the key bit seems to be:
CAST(
  case
    when tbl.is_ms_shipped = 1 then 1
    when (
      select
        major_id
      from
        sys.extended_properties
      where
        major_id = tbl.object_id and
        minor_id = 0 and
        class = 1 and
        name = N'microsoft_database_tools_support')
      is not null then 1
    else 0
  end
AS bit) AS [IsSystemObject]
(Where tbl is an alias for sys.tables)
So it seems that it's a combination - either is_ms_shipped from sys.tables being 1, or having a particular extended property set.
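Putting that together, a sketch of a standalone query that lists only genuine user tables, applying the same two checks SSMS uses, would be:

SELECT s.name AS schema_name, t.name AS table_name
FROM sys.tables AS t
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
WHERE t.is_ms_shipped = 0
  AND NOT EXISTS (
        SELECT 1
        FROM sys.extended_properties AS ep
        WHERE ep.major_id = t.object_id
          AND ep.minor_id = 0
          AND ep.class = 1
          AND ep.name = N'microsoft_database_tools_support');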
__refactorlog is, in contrast to what SSMS suggests, a user table. It is used during deployment to track schema changes that cannot be deduced from the current database state, for example renaming a table.
If all your other user tables are in a custom (non-dbo) schema, you can use a combination of the isMSshipped/isSystemTable attributes and the schema name to decide if a table is 'in scope' for your script.
In the past I've worked on the assumption that, in the sys.objects table, column is_ms_shipped indicates whether an object is or is not a system object. (This column gets inherited by other system tables, such as sys.tables.)
This flag can be set by procedure sp_ms_markSystemObject. This, however, is an undocumented procedure, is not supported by Microsoft, I don't think we're supposed to know about it, so I didn't tell you about it.
Am I missing something?
However, there are system tables which I'd like to exclude from that
At least on SQL Server 2008, sp_MSforeachtable already excludes system tables, as this excerpt from it shows:
+ N' where OBJECTPROPERTY(o.id, N''IsUserTable'') = 1 ' + N' and o.category & ' + @mscat + N' = 0 '
I have several variables in an SSIS package that I would like inserting into a table.
example:-
@financialMonth, @Status, @Comments
The Variables have been populated along the way with values based on lookups, filename, dates, etc, and I want to store them in a results table.
Is using the execute SQL task the way to do this ?
Do I need to call a sproc and pass those variales as parameters ?
I've tried putting the following T-SQL into the SQLStatement property
INSERT INTO FilesProcessed
(ProcessedOn, ProviderCode, FinancialMonth,
FileName, Status, Comments)
SELECT GETDATE(), 'ABC' , 201006,
'ABC_201005_Testology.csv',
'Imported','Success'
I tried hardcoding the values above to get it to work
These are the columns on the table I'm inserting into
Column_name Type Computed Length
fileID int no 4
ProcessedOn datetime no 8
ProviderCode nchar no 6
FinancialMonth int no 4
FileName nvarchar no 510
Status nvarchar no 40
Comments nvarchar no 510
This is the Expression code that feeds the SQLStatementSource property
"INSERT INTO FilesProcessed (ProcessedOn, ProviderCode, FinancialMonth,
FileName, Status, Comments) SELECT GETDATE() AS ProcessedOn, '"
+ @[User::providerCode] + "' , "
+ (DT_STR,6,1252)@[User::financialMonth] + ", '"
+ @[User::fileName] + "', 'Imported' AS Status, 'Successfully' AS Comments "
Unfortunately I'm missing something, and can't quite get it to work.
The Error message I'm getting is ...
Error: 0xC002F210 at Log entry in FilesProcessed, Execute SQL Task: Executing the query "INSERT INTO FilesProcessed (ProcessedOn, ProviderCode, FinancialMonth, FileName, Status, Comments) SELECT GETDATE(), 'ABC', 201006, 'DAG_201005_Testology.csv', 'Imported','Successfully'" failed with the following error: "An error occurred while extracting the result into a variable of type (DBTYPE_I2)". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
Please
a). Advise whether the Execute SQL Task is the way to do what I want to do.
b). Give me any pointers or pitfalls to look out for and check.
Thanks in advance.
OK, here is what I did.
I created an Execute SQL task and configured, thus :-
General Tab
ConnectionType = OLE DB
SQLSourceType = Direct Input
SQLStatement = (left blank)
BypassPrepare = True
ResultSet = None
Parameter Mapping
(none - leave blank)
Result Set
(none - leave blank)
Expressions
SQLStatementSource = "INSERT INTO FilesProcessed (ProcessedOn, ProviderCode, FinancialMonth, FileName, Status, Comments) SELECT GETDATE(), '" + @[User::providerCode] + "' , " + (DT_STR,6,1252)@[User::financialMonth] + ", '" + @[User::fileName] + "', 'Import - Success', '" + @[User::fileComments] + "'"
Then it works, as long as I set up the variables and populate them in the Variables window first. (The expression editor will not let you save an expression that references a variable that does not exist, so keep Notepad handy to hold the expression text while you go back to the Variables window and add the new variables.)
Build the expression slowly, using the Parse expression button regularly to check.
Make sure that the data types of the VALUES match the destination column data types.
see: http://social.msdn.microsoft.com/forums/en-US/sqlintegrationservices/thread/e8f82288-b980-40a7-83a6-914e217f247d/
A couple of speculative suggestions
The Error message says An error occurred while extracting the result into a variable of type (DBTYPE_I2). But this is a straight insert statement. There shouldn't be a result except for rows affected. Do you have any parameter mappings erroneously set to Output?
What if you try and run the SQL Query from the error message directly in management studio? Does that give you an error?
In the table definition above, FinancialMonth has an int datatype:
FinancialMonth int no 4
while the insert casts it as a string:
(DT_STR,6,1252)@[User::financialMonth]
I think it's purely a datatype mismatch with the target table definition.