How to use the SnowSQL CLI to call a Snowflake procedure

I am trying to call a Snowflake procedure from the SnowSQL CLI. Has anyone tried this and succeeded? I need to pass parameters to the procedure in order to deploy artifacts.

Have you tried it yourself? If so, what error did you get? Here is a working example:
create or replace procedure my_proc(param varchar)
returns string
language javascript
AS
$$
return "PARAM VALUE: " + PARAM;
$$;
Call from SnowSQL:
snowsql -q "call ericlin.stack_overflow.my_proc('test');"
* SnowSQL * v1.2.20
Type SQL statements or !help
+-------------------+
| MY_PROC           |
|-------------------|
| PARAM VALUE: test |
+-------------------+
1 Row(s) produced. Time Elapsed: 0.369s
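For the original goal of passing deployment parameters from the shell, SnowSQL variable substitution can help. A minimal sketch, assuming a hypothetical variable named artifact_name and substitution enabled via -o:
snowsql -o variable_substitution=true -D artifact_name=build_42 \
  -q "call ericlin.stack_overflow.my_proc('&{artifact_name}');"
Here -D defines the variable and &{artifact_name} is expanded before the statement is sent.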

Related

How to check if a query located in JDBC_SESSION_INIT_STATEMENT is working? DataFrameReader

I am trying to connect to SQL Server with spark-jdbc, using JDBC_SESSION_INIT_STATEMENT to create a temporary table and then load data from that temporary table in the main query.
I have the following code:
// df is an org.apache.spark.sql.DataFrameReader
val s = """select * into #tmp_table from ( SELECT op.ID,
| op.Date,
| op.DocumentID,
| op.Amount,
| op.AmountCurr,
| op.CurrencyID,
| operson.ObjectTypeId AS PersonOT,
| op.PersonID,
| ocontract.ObjectTypeId AS ContractOT,
| op.ContractID,
| op.DocNum,
| op.MomentCreate,
| op.ObjectTypeID,
| op.OwnerObjectID
|FROM dbo.Operation op With (Index = IX_Operation_Date) -- Without the hint it sometimes falls back to a full table scan
|LEFT JOIN dbo.Object ocontract ON op.ContractID = ocontract.ID
|LEFT JOIN dbo.Object operson ON op.PersonID = operson.ID
|WHERE op.Date>='2019-01-01' and op.Date<'2020-01-01' AND 1=1
|) wrap_for_single_connect
|OPTION (LOOP JOIN, FORCE ORDER, MAX_GRANT_PERCENT=25)""".stripMargin
df
  .option(JDBCOptions.JDBC_SESSION_INIT_STATEMENT, s)
  .jdbc(
    jdbcUrl,
    "(select * from tempdb.#tmp_table) sub",
    connectionProps)
I get com.microsoft.sqlserver.jdbc.SQLServerException: Invalid object name '#tmp_table'.
I have a feeling that JDBC_SESSION_INIT_STATEMENT is not working at all, because I deliberately tried to mess up the statement and still got the same Invalid object error.
How can I check whether the query in JDBC_SESSION_INIT_STATEMENT is actually executed?
One way to know whether your JDBCOptions.JDBC_SESSION_INIT_STATEMENT is executed is to enable the INFO logging level for the org.apache.spark.sql.execution.datasources.jdbc logger.
That should trigger the corresponding line in Spark's JDBC data source and print the following message to the logs:
Executing sessionInitStatement: [sql]
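For example, with log4j 1.x (the default in Spark 2.x), a single line in log4j.properties on the driver classpath is enough; a minimal sketch, assuming that setup:
log4j.logger.org.apache.spark.sql.execution.datasources.jdbc=INFO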
Given the comment in Spark's source, I don't think you should use it to create a source table to load records from:
// This executes a generic SQL statement (or PL/SQL block) before reading
// the table/query via JDBC. Use this feature to initialize the database
// session environment, e.g. for optimizations and/or troubleshooting.
You should use the dbtable or query parameter instead.
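A minimal sketch of the query-based approach, assuming jdbcUrl plus hypothetical user and password variables from the original setup, and with innerSelect holding the inner SELECT (without the select * into #tmp_table wrapper):
// Spark 2.4+: pass the SELECT itself via the "query" option instead of staging a temp table
val result = spark.read
  .format("jdbc")
  .option("url", jdbcUrl)
  .option("user", user)
  .option("password", password)
  .option("query", innerSelect)
  .load()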

How to convert the time 2020-06-02 10:40:28.001 to 1591065628 in the TDengine database?

I want to convert the time to a certain format and see it in the shell. For example, I want to convert the time 2020-06-02 10:40:28.001 to 1591065628. What should I do?
You can use the taos -r option; it will output timestamps as uint64_t values:
ubuntu#taos ~ $ taos -r
taos> use test;
Database changed.
taos> select * from tb;
           ts | speed | desc |
==============================
1644216894189 |     1 | test |
Query OK, 1 row(s) in set (0.006103s)
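Note that the raw value is a Unix epoch in milliseconds, so dividing by 1000 gives the seconds value from the question: 1591065628001 / 1000 = 1591065628. A non-interactive sketch, assuming the same test database and tb table:
taos -r -s "select ts from test.tb;"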

403 error running data unload with snowsql GET

I'm having issues testing a data unload flow from Snowflake using the GET command to store the files on my local machine.
Following the documentation here, it should be as simple as creating a stage, copying the data I want to that stage, and then running a snowsql command locally to retrieve the files.
I'm on Windows 10, running the following snowsql command to try and unload the data, against a database populated with the sample TPC-H data that Snowflake provides:
snowsql -a <account id> -u <username> -q "
USE DATABASE TESTDB;
CREATE OR REPLACE STAGE TESTSNOWFLAKESTAGE;
copy into @TESTSNOWFLAKESTAGE/supplier from SUPPLIER;
GET @TESTSNOWFLAKESTAGE file://C:/Users/<local user>/Downloads/unload;"
All commands run successfully, except for the final GET:
* SnowSQL * v1.2.14
Type SQL statements or !help
+----------------------------------+
| status                           |
|----------------------------------|
| Statement executed successfully. |
+----------------------------------+
1 Row(s) produced. Time Elapsed: 0.121s
+-----------------------------------------------------+
| status                                              |
|-----------------------------------------------------|
| Stage area TESTSNOWFLAKESTAGE successfully created. |
+-----------------------------------------------------+
1 Row(s) produced. Time Elapsed: 0.293s
+---------------+-------------+--------------+
| rows_unloaded | input_bytes | output_bytes |
|---------------+-------------+--------------|
|        100000 |    14137839 |      5636225 |
+---------------+-------------+--------------+
1 Row(s) produced. Time Elapsed: 7.548s
+-----------------------+------+--------+------------------------------------------------------------------------------------------------------+
| file                  | size | status | message                                                                                              |
|-----------------------+------+--------+------------------------------------------------------------------------------------------------------|
| supplier_0_0_0.csv.gz |   -1 | ERROR  | An error occurred (403) when calling the HeadObject operation: Forbidden, file=supplier_0_0_0.csv.gz |
+-----------------------+------+--------+------------------------------------------------------------------------------------------------------+
1 Row(s) produced. Time Elapsed: 1.434s
This 403 looks like it's coming from the S3 storage backing my Snowflake account, but that's part of the abstracted service layer provided by Snowflake, so I'm not sure where I would have to go to flip auth switches.
Any guidance is much appreciated.
You need to use Windows-style backslashes in your local file path. So, assuming that, to @NickW's point, you are filling in your local user correctly, the format should be like the following:
file://C:\Users\<local user>\Downloads
There are some examples of this in the documentation here:
https://docs.snowflake.com/en/sql-reference/sql/get.html#required-parameters
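Putting it together, the GET statement from the question would become (a sketch, placeholders kept as-is):
GET @TESTSNOWFLAKESTAGE file://C:\Users\<local user>\Downloads\unload;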

Is there a way to put comments in SnowSQL files that do not cause errors?

I'm using files with snowsql to automate certain processes. For maintenance, I want to include comments in the files (using // at the start of the line) to explain key steps. However, when I do this, snowsql reports an error: 000900 (42601): SQL compilation error: Empty SQL statement.
For example:
select 'hello world';
// and now exit the session
!exit
will cause the error:
$ snowsql --filename comments.sql
* SnowSQL * v1.2.5
Type SQL statements or !help
+---------------+
| 'HELLO WORLD' |
|---------------|
| hello world |
+---------------+
1 Row(s) produced. Time Elapsed: 0.209s
000900 (42601): SQL compilation error:
Empty SQL statement.
Goodbye!
If I remove the comments and leave empty lines instead:
select 'hello world';
!exit
Then it works with no errors reported:
$ snowsql --filename no-comments.sql
* SnowSQL * v1.2.5
Type SQL statements or !help
+---------------+
| 'HELLO WORLD' |
|---------------|
| hello world |
+---------------+
1 Row(s) produced. Time Elapsed: 1.088s
Goodbye!
This occurs with snowsql version 1.2.5.
Is there a way to include comments in a SQL file that do not cause errors in snowsql?
You can use the standard SQL comment markers, double dashes. I tested it and it works:
select 'hello world';
-- and now exit the session
!exit
I think that if it works in the web UI it should work the same way in SnowSQL, so I'll open a ticket to check on this.
It looks like the issue is that snowsql treats the comment text as a SQL statement. Even though it correctly detects that it's just a comment, it still tries to execute it and finds no actual statement, hence "Empty SQL statement."
As a workaround, you should just be able to put the comment before the closing semicolon so that snowsql realizes there's only one statement to run.
You could also put in a dummy select 1 or something similar (a sketch of that variant follows the example below).
select 'hello world'
// and now exit the session
;
!exit
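And a sketch of the dummy-statement variant:
select 'hello world';
// and now exit the session
select 1;
!exit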
I'm not sure there's a problem ... though the !exit might be one. First, another example:
My test with comments.sql:
select 'hello Grovers Corner';
-- this is a double dash comment
select 'hello Indiana'
-- dashes in middle of sql statement
from information_schema.tables
fetch 1 row only;
select 'hello United States'
// slashes in middle of sql statement
from information_schema.tables
fetch 1 row only;
// some final slashes on the last line
... note there is no !exit at the end of the script
... and the snowsql execution with results:
$ snowsql -f comments.sql -o echo=True
* SnowSQL * v1.1.86
Type SQL statements or !help
select 'hello Grovers Corner';
+------------------------+
| 'HELLO GROVERS CORNER' |
|------------------------|
| hello Grovers Corner |
+------------------------+
1 Row(s) produced. Time Elapsed: 0.085s
-- this is a double dash comment
select 'hello Indiana'
-- dashes in middle of sql statement
from information_schema.tables
fetch 1 row only;
+-----------------+
| 'HELLO INDIANA' |
|-----------------|
| hello Indiana |
+-----------------+
1 Row(s) produced. Time Elapsed: 1.803s
select 'hello United States'
// slashes in middle of sql statement
from information_schema.tables
fetch 1 row only;
+-----------------------+
| 'HELLO UNITED STATES' |
|-----------------------|
| hello United States |
+-----------------------+
1 Row(s) produced. Time Elapsed: 1.748s
// some final slashes on the last line
000900 (42601): SQL compilation error:
Empty SQL statement.
Goodbye!
$
So, the double slashes did not impact the SQL statements, but they do confuse the end of the script!
Finally, I replaced the double slashes on the last line with double dashes, and there was no SQL compilation error. The results were the same without the echo=True option on the command-line execution step.
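In other words, the last line of the script becomes:
-- some final dashes on the last line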
I think this is related to my own question about the apparent inability to NOT exit a script when called via the snowsql command line. (See SO question login.sql and NOT exit)

Execute a stored procedure with DbSlim and FitNesse (Selenium, Xebium)

https://github.com/markfink/dbslim
I'd like to execute stored procedures with DbSlim using FitNesse (Selenium, Xebium).
What I tried to do is:
!define dbQuerySelectCustomerbalance (
execute dbo.uspLogError
)
| script | Db Slim Select Query | !-${dbQuerySelectCustomerbalance}-! |
This gives a green indicator; however, Microsoft SQL Server Profiler shows no activity/logging.
So what I'd like to know is: is it possible to use DbSlim to execute stored procedures, and if so, what is the correct way to do it?
By the way, I have the connection to the database on one page, and on the query page I included that connection page. (Is that OK?)
Take out the !- ... -!. It is used to escape wikified words, but in this case you want it to be translated into the actual query:
!define dbQuerySelectCustomerbalance ( execute dbo.uspLogError )
| script | Db Slim Select Query | ${dbQuerySelectCustomerbalance} |
| show | data by column index | 1 | and row index | 1 |
You can add the last line, which outputs the first column of the first row, for testing purposes if your SP returns a result (or you can create a simple SP just to test this out).
Specifying the connection anywhere before this block is fine, be it on the same page or in a SetUp/SuiteSetUp/normal page included or executed earlier.
