Using variables to create tasks in Snowflake

I want to use variables while creating tasks in snowflake. Below for example:
Instead of this:
CREATE OR REPLACE TASK BEAST_DSC_SBX.DSC_STG.DTCI_DSC_R2O_AVAIL_INCREMENTAL_TRUNCATE
WAREHOUSE = 'BEAST_DSC_ADHOC_WH_SBX' SCHEDULE = '10 MINUTE' WHEN SYSTEM$STREAM_HAS_DATA('BEAST_DSC_SBX.KAFKA.DTCI_DSC_R2O_AVAIL_CHANGES') = True
AS
TRUNCATE TABLE BEAST_DSC_SBX.DSC_STG.DTCI_DSC_R2O_AVAIL_INCREMENTAL;
I want to use this:
SET STG_SCHEMA_NAME = 'BEAST_DSC_SBX.DSC_STG';
SET KAFKA_SCHEMA_NAME = 'BEAST_DSC_SBX.KAFKA';
CREATE OR REPLACE TASK $STG_SCHEMA_NAME.DTCI_DSC_R2O_AVAIL_INCREMENTAL_TRUNCATE
WAREHOUSE = $WAREHOUSE SCHEDULE = '10 MINUTE' WHEN SYSTEM$STREAM_HAS_DATA($KAFKA_SCHEMA_NAME'.DTCI_DSC_R2O_AVAIL_CHANGES') = True
AS
TRUNCATE TABLE $STG_SCHEMA_NAME.DTCI_DSC_R2O_AVAIL_INCREMENTAL;
I am getting errors when executing the above. Is it even possible to variablize Snowflake tasks? If yes, how?

You cannot create the task name from a combination of text and a variable in the Snowflake UI. That doesn't work.
The workaround is to use SnowSQL with this format:
&{<variable>}_<TEXT>
Please refer to this KB article as an example: https://community.snowflake.com/s/article/How-to-use-variable-when-creating-user-account-in-Snowflake-UI-and-SnowSQL
The KB article above shows that you can combine text and a variable in SnowSQL like so:
create USER &{DIV}_LOCAL_&{ENV}_ADMIN
PASSWORD = '&passwd'
LOGIN_NAME = '&{DIV}_LOCAL_&{ENV}_ADMIN'
COMMENT = '&DIV &ENV local account admin user'
DISPLAY_NAME = '&{DIV}_LOCAL_&{ENV}_ADMIN'
FIRST_NAME = '&{DIV}_LOCAL_&{ENV}_ADMIN'
LAST_NAME = '&{DIV}_LOCAL_&{ENV}_ADMIN'
EMAIL = 'test@test.com'
MUST_CHANGE_PASSWORD = TRUE;
The example above creates a user; you can adapt the SQL statement to create a task.
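For instance, here is a minimal SnowSQL sketch of the task from the question (this assumes SnowSQL rather than the web UI, and that variable substitution is enabled; object names are copied from the question):
!set variable_substitution=true
!define STG_SCHEMA_NAME=BEAST_DSC_SBX.DSC_STG
!define KAFKA_SCHEMA_NAME=BEAST_DSC_SBX.KAFKA
!define WAREHOUSE=BEAST_DSC_ADHOC_WH_SBX
CREATE OR REPLACE TASK &{STG_SCHEMA_NAME}.DTCI_DSC_R2O_AVAIL_INCREMENTAL_TRUNCATE
WAREHOUSE = '&{WAREHOUSE}'
SCHEDULE = '10 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('&{KAFKA_SCHEMA_NAME}.DTCI_DSC_R2O_AVAIL_CHANGES')
AS
TRUNCATE TABLE &{STG_SCHEMA_NAME}.DTCI_DSC_R2O_AVAIL_INCREMENTAL;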

Related

Using binding inside SQL procedures

I am having trouble getting the following code to work:
create or replace secure procedure create_wh (wh_name varchar)
returns varchar
language sql
comment = '<string_literal>'
execute as owner
as
begin
create warehouse if not exists :wh_name
warehouse_size = xsmall
auto_suspend = 60
auto_resume = true
initially_suspended = true;
return 'SUCCESS';
end;
The idea is that the SP can be called with a name for a warehouse. When I try to run the above code, it fails with an "unexpected 'if'" error after the CREATE WAREHOUSE statement.
I am guessing I am missing something in relation to binding the param to the query, but I can't figure out what.
It is possible to provide the warehouse name as a parameter by using IDENTIFIER(:wh_name):
create or replace secure procedure create_wh (wh_name varchar)
returns varchar
language sql
comment = '<string_literal>'
execute as owner
as
begin
create warehouse if not exists IDENTIFIER(:wh_name)
warehouse_size = xsmall
auto_suspend = 60
auto_resume = true
initially_suspended = true;
return 'SUCCESS';
end;
CALL create_wh('test');
SHOW WAREHOUSES;
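The same IDENTIFIER(:variable) pattern also accepts fully qualified object names, which connects back to the task question above. A hedged sketch (the procedure, schema, and table names below are made up for illustration):
create or replace procedure truncate_stage_table(table_name varchar)
returns varchar
language sql
as
begin
-- bind the fully qualified table name passed by the caller
truncate table IDENTIFIER(:table_name);
return 'done';
end;
CALL truncate_stage_table('MY_DB.MY_STG.MY_TABLE');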

Is there a way to automatically unload data in Snowflake to local machine using tasks or stored procedure?

I would like to automatically unload my data using a task or stored procedure, but it didn't work out; it seems the GET command wasn't run. Is there a way to fix it? Thank you.
CREATE OR REPLACE STREAM unload_dimads_stream
ON TABLE "FA_PROJECT01_DB"."ADSBI"."DIM_ADS";
CREATE OR REPLACE TASK unload_dimads_task1
WAREHOUSE = FA_PROJECT01_CLOUDDW
SCHEDULE = '1 minute'
WHEN SYSTEM$STREAM_HAS_DATA('unload_dimads_stream')
AS
COPY INTO @adsbi.%dim_ads from AdsBI.Dim_Ads file_format = (TYPE=CSV FIELD_DELIMITER = '|' BINARY_FORMAT = 'UTF-8' compression=none) header= true overwrite=true;
CREATE OR REPLACE TASK unload_dimads_task2
WAREHOUSE = FA_PROJECT01_CLOUDDW
AFTER unload_dimads_task1
WHEN SYSTEM$STREAM_HAS_DATA('unload_dimads_stream')
AS
GET @adsbi.%dim_ads file://D:\unload\table1;
ALTER TASK unload_dimads_task2 resume;
ALTER TASK unload_dimads_task1 resume;
GET @adsbi.%dim_ads file://D:\unload\table1;
You cannot run this step in a stored procedure or task. Stored procedures and tasks run on Snowflake servers, which have no access to your local file system. You can script GET operations outside of Snowflake using the SnowSQL client, but something else will need to invoke it.
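As a hedged sketch of that client-side approach (the script and connection names below are assumptions, not from the question), the COPY INTO and GET can go into a small script that SnowSQL runs from the local machine on a schedule (cron, Windows Task Scheduler, etc.):
-- unload_dimads.sql, executed locally with: snowsql -c my_connection -f unload_dimads.sql
COPY INTO @adsbi.%dim_ads FROM AdsBI.Dim_Ads
file_format = (TYPE=CSV FIELD_DELIMITER = '|' compression=none) header=true overwrite=true;
-- GET runs client-side, so the files land on the machine running SnowSQL
GET @adsbi.%dim_ads file://D:\unload\table1;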

How to set the CONNECTION_OPTIONS = 'ApplicationIntent=ReadOnly' for elastic queries on Azure SQL?

I am accessing the other database using elastic queries. The data source was created like this:
CREATE EXTERNAL DATA SOURCE TheCompanyQueryDataSrc WITH (
TYPE = RDBMS,
--CONNECTION_OPTIONS = 'ApplicationIntent=ReadOnly',
CREDENTIAL = ElasticDBQueryCred,
LOCATION = 'thecompanysql.database.windows.net',
DATABASE_NAME = 'TheCompanyProd'
);
To reduce the database load, a read-only replica was created and should be used. As far as I understand it, I should add CONNECTION_OPTIONS = 'ApplicationIntent=ReadOnly' (commented out in the above code). However, I only get: Incorrect syntax near 'CONNECTION_OPTIONS'.
Both databases (the one that defines the connection and external tables, and the other, to-be-read-only one) are on the same server (thecompanysql.database.windows.net). Both are set to compatibility level SQL Server 2019 (150).
What else should I set to make it work?
The CREATE EXTERNAL DATA SOURCE syntax doesn't support the option CONNECTION_OPTIONS = 'ApplicationIntent=ReadOnly'. We can't use it in these statements.
If you want to achieve read-only access, the workaround is to use a user account that only has read-only (db_datareader) permission to log in to the external database.
For example:
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<password>' ;
CREATE DATABASE SCOPED CREDENTIAL SQL_Credential
WITH
IDENTITY = '<username>', -- read-only user account
SECRET = '<password>' ;
CREATE EXTERNAL DATA SOURCE MyElasticDBQueryDataSrc
WITH
( TYPE = RDBMS ,
LOCATION = '<server_name>.database.windows.net' ,
DATABASE_NAME = 'Customers' ,
CREDENTIAL = SQL_Credential
) ;
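If the read-only account doesn't exist yet, it can be created on the target server first. A hedged sketch (the login and user names below are placeholders):
-- run on the master database of the server:
CREATE LOGIN readonly_login WITH PASSWORD = '<password>';
-- run on the target database (e.g. TheCompanyProd):
CREATE USER readonly_user FOR LOGIN readonly_login;
ALTER ROLE db_datareader ADD MEMBER readonly_user;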
Since the option is not supported, we can't use it with elastic query. The only place ApplicationIntent=ReadOnly can be set is in the client connection itself, for example in the SSMS connection dialog.
HTH.

I am trying to run multiple query statements created when using the python connector with the same query id

I have created a Python function which creates multiple query statements.
Once it creates the SQL statement, it executes it (one at a time).
Is there any way to bulk-run all the statements at once (assuming I was able to create all the SQL statements and wanted to execute them once they were all generated)? I know there is an execute_stream in the Python connector, but I think this requires a file to be created first. It also appears to me that it runs a single query statement at a time.
Since this question is missing an example of the file, here is some file content I have provided as an extra that we can work from.
# connection test file for python multiple queries
import snowflake.connector

conn = snowflake.connector.connect(
    user='xxx',
    password='',
    account='xxx',
    warehouse='xxx',
    database='TEST_xxx',
    session_parameters={
        'QUERY_TAG': 'Rachel_test',
    }
)

try:
    conn.cursor().execute("CREATE WAREHOUSE IF NOT EXISTS tiny_warehouse_mg")
    conn.cursor().execute("CREATE DATABASE IF NOT EXISTS testdb_mg")
    conn.cursor().execute("USE DATABASE testdb_mg")
    conn.cursor().execute(
        "CREATE OR REPLACE TABLE "
        "test_table(col1 integer, col2 string)")
    conn.cursor().execute(
        "INSERT INTO test_table(col1, col2) VALUES "
        "(123, 'test string1'), "
        "(456, 'test string2')")
    # query ID of the most recent statement run on this connection
    print(conn.sfqid)
except Exception as e:
    conn.rollback()
    raise e
conn.close()
The reference linked from this question describes a method that works on a file; the example in the documentation is as follows:
from codecs import open
with open(sqlfile, 'r', encoding='utf-8') as f:
    for cur in con.execute_stream(f):
        for ret in cur:
            print(ret)
Reference to guide I used
Now, when I ran these, they were not perfect, but in practice I was able to execute multiple SQL statements in one connection, just not many at once. Each statement had its own query ID. Is it possible to have a .sql file associated with one query ID?
Is it possible to have a .sql file associated with one query id?
You can achieve that effect with the QUERY_TAG session parameter. Set the QUERY_TAG to the name of your .SQL file before executing its queries. Access the .SQL file's QUERY_IDs later using the QUERY_TAG field in QUERY_HISTORY().
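A hedged sketch of that approach (the file name my_batch.sql is made up for illustration):
ALTER SESSION SET QUERY_TAG = 'my_batch.sql';
-- ... run the statements from my_batch.sql in this session ...
SELECT query_id, query_text
FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY())
WHERE query_tag = 'my_batch.sql'
ORDER BY start_time;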
I believe, though, that even if you generate the .sql file, each statement will have a unique query ID when executed in Snowflake.
If you want to run one SQL statement independently of the others, you may try the multiprocessing/multithreading concepts in Python, as sketched below.
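A hedged Python sketch of that idea (connection parameters and statements are placeholders; one connection per worker thread keeps the statements independent):
import snowflake.connector
from concurrent.futures import ThreadPoolExecutor

STATEMENTS = [
    "SELECT CURRENT_VERSION()",
    "SELECT CURRENT_TIMESTAMP()",
]

def run(sql):
    # one connection per worker; each statement still gets its own query ID
    conn = snowflake.connector.connect(user='xxx', password='xxx', account='xxx')
    try:
        cur = conn.cursor()
        cur.execute(sql)
        return cur.sfqid
    finally:
        conn.close()

with ThreadPoolExecutor(max_workers=4) as pool:
    print(list(pool.map(run, STATEMENTS)))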
The Python and Node.js libraries do not allow multiple statement executions.
I'm not sure about Python, but for Node.js there is a library that extends the original one and adds a method called "ExecutionAll" to it:
snowflake-multisql
You just need to wrap the multiple statements with BEGIN and END.
BEGIN
<statement_1>;
<statement_2>;
END;
With these keywords, I was able to execute multiple statements in Node.js.

PhpMyAdmin does not list a database I have access to

When I start phpMyAdmin, it does not list a database I have access to.
I know that the DB exists and that I have access to it, because it is the database used by one of my Joomla sites.
I copied the credentials from my Joomla configuration into the config.inc.php file.
Here is the file:
$cfg['Servers'][$i]['verbose'] = 'SQL9_MODULES';
$cfg['Servers'][$i]['host'] = 'sql9.modules';
$cfg['Servers'][$i]['port'] = '';
$cfg['Servers'][$i]['socket'] = '';
$cfg['Servers'][$i]['connect_type'] = 'tcp';
$cfg['Servers'][$i]['extension'] = 'mysql';
$cfg['Servers'][$i]['auth_type'] = 'config';
$cfg['Servers'][$i]['user'] = 'xxxx';
$cfg['Servers'][$i]['password'] = 'yyy';
$cfg['DefaultLang'] = 'en';
$cfg['ServerDefault'] = 1;
$cfg['UploadDir'] = '';
$cfg['SaveDir'] = '';
As mentioned in another post, I tried this command:
SELECT SCHEMA_NAME AS `Database` FROM INFORMATION_SCHEMA.SCHEMATA;
The result is empty.
I have also run the "Synchronize" function from my Joomla database to my Joomla database, and phpMyAdmin found all the tables!
Then I tried to write some SELECTs directly on the tables, but I got a
#1046 - No database selected
Last remark: this is shared hosting, so I cannot connect as root to grant myself some extra privileges!
Thanks for your help
MySQL version : 5.1.66-0+squeeze1-log - (Debian)
phpMyAdmin version : 3.5.3
Update:
SHOW GRANTS FOR CURRENT_USER
gives
GRANT USAGE ON *.* TO 'xxxx'@'%' IDENTIFIED BY PASSWORD '*****************'
GRANT SELECT, INSERT, UPDATE, DELETE, CREATE, DROP, INDEX, ALTER, CREATE TEMPORARY TABLES, LOCK TABLES, EXECUTE, CREATE VIEW, SHOW VIEW, CREATE ROUTINE, ALTER ROUTINE ON `xxxx`.* TO 'xxxx'@'%'
Any missing grants?
I got an answer from my ISP's support. I needed to add the following line to the configuration:
$cfg['Servers'][$i]['only_db'] = 'my-database';
Now it works.
