Calling a procedure through SnowSQL

I am calling a stored procedure through SnowSQL and getting the error below:
002141 (42601): SQL compilation error:
Unknown user-defined function ETL_SCHEMA.PROC
Below is the snowsql query:
snowsql -c newConnection -o log_level=DEBUG -r ACCT_ROLE -w ETL_XS_WH -d ETL_DEV_DB -s ETL_SCHEMA -q "CALL ETL_SCHEMA.PROC('202')" -o friendly=False -o header=False -o output_format=plain -o timing=False
Is anything wrong here?

Is CALL ETL_SCHEMA.PROC('202') working in the Snowflake web UI? Maybe it's not a stored procedure but a user-defined function.

The issue you are having is either permissions-based or a search-path issue.
I'd recommend prefixing ETL_SCHEMA with the database name (i.e., using the fully qualified name) and trying that. You can also run a select current_role(), current_database(), current_schema(); command instead of the CALL to see what the session context actually is; you might have something in the config file that is overriding the arguments passed in on the command line.
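Both checks can be sketched directly in SnowSQL. This reuses the connection, role, and warehouse from the question, and assumes the procedure lives in ETL_DEV_DB (the database passed with -d):

```shell
# Show the session context the command-line arguments actually produce.
snowsql -c newConnection -r ACCT_ROLE -w ETL_XS_WH -d ETL_DEV_DB -s ETL_SCHEMA \
  -q "select current_role(), current_database(), current_schema();"

# Call the procedure by its fully qualified name: database.schema.procedure.
snowsql -c newConnection -r ACCT_ROLE -w ETL_XS_WH \
  -q "CALL ETL_DEV_DB.ETL_SCHEMA.PROC('202')"
```

If the fully qualified call succeeds while the schema-qualified one fails, the session context is being overridden somewhere, most likely in the named connection's settings.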

Related

Pass config file as arg when calling snowsql from command line

I am trying to run a snowsql query from the command line and also to pass config file while calling snowsql. On this blog there is an option presented:
--config PATH    SnowSQL config file path.
I tried including this:
#!/bin/bash
snowsql -f training-data.sql \
-o quiet=true \
-o friendly=false \
-o header=false \
-config=./config
When I attempt to run this I get:
No connection could be found for onfig=./config
It's odd because previously, I could swear the error message was (Note onfig Vs. nfig!):
No connection could be found for nfig=./config
How can I tell snowsql to use ./config as the config file when running the query?
You need a double dash and no equals sign. It should just be:
--config ./config
With a single dash, -config=./config is parsed as the -c (connection) option followed by the value onfig=./config, which is exactly what the error message is complaining about.
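A corrected version of the script might look like this (a sketch; training-data.sql and ./config are the paths from the question, and --config is the form shown in the option listing above):

```shell
#!/bin/bash
# Same invocation, with the config-file option given as --config <path>.
snowsql -f training-data.sql \
  -o quiet=true \
  -o friendly=false \
  -o header=false \
  --config ./config
```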

snowsql option while exporting data to a csv file in unix

We are exporting data into a CSV file using a Unix shell script (via SnowSQL).
Below is the script:
#!/bin/ksh
snowsql -c newConnection -o log_level=DEBUG -o log_file=~/snowsql_sso_debug.log \
  -r SRVC_ACCT_ROLE -w LOAD_WH -d ETL_DEV_DB -s CTL_DB \
  -q "select * from mytable" \
  -o friendly=False -o header=False -o output_format=pipe -o timing=False > test_file.csv
The output starts with something like the line below:
|:--------|:-----------|
I don't want that line in my CSV file. What option do I need to use in my snowsql command?
Appreciate your response. Thanks.
Providing my comment as an answer, just in case it works better for you.
I would leverage a COPY INTO command to write a CSV file to an internal stage location:
https://docs.snowflake.com/en/sql-reference/sql/copy-into-location.html
And then use a GET statement to pull the file down to your Unix machine.
https://docs.snowflake.com/en/sql-reference/sql/get.html
This gives you greater control over the output format and will likely perform faster, as well.
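A sketch of that two-step flow (the stage name my_stage and the local path are placeholders, and the FILE_FORMAT options shown are one reasonable choice, not the only one):

```sql
-- 1) Unload the table to an internal stage as pipe-delimited CSV (runs in Snowflake).
COPY INTO @my_stage/exports/mytable
  FROM mytable
  FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = '|' COMPRESSION = NONE)
  OVERWRITE = TRUE;

-- 2) Download the staged file(s) to the Unix machine (run from SnowSQL).
GET @my_stage/exports/mytable file:///tmp/exports/;
```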

outputing select statement result into file

I have an SQL file containing the script below, which is run via isql.
May I ask what's wrong with my output syntax? I am getting "Incorrect syntax near the keyword 'output'".
Sybase ASE version is 15.7
select * from tempdb..M3_STI_extracts_checking
output to employee.txt format ASCII
GO
isql can write its output to a file if you set the -o option (see the Utility Commands Reference).
input.sql
select * from tempdb..M3_STI_extracts_checking
go
isql -i input.sql -o employee.txt
-J sets the charset (ASE 15.7 charsets)
isql -i input.sql -o employee.txt -J ascii_7
Was able to workaround by passing the variable from a shell script.
test.sh
output_file=test_file_$(date +%m%d%Y)
${PARAM} isql << EOF
select * from tempdb..M3_STI_extracts_checking
GO > ${output_file}
EOF

cscope: -c or -T mismatch between command line and old symbol database

I'm trying to create tags for *.c, *.x and *.h files.
These are the following commands which I executed.
find <absolute_path_of_code> -name *.c -o -name *.x -o -name *.h > cscope.files
cscope -bkqc cscope.files
Up to this point everything is OK.
But after this when I execute the command,
cscope -Rb
I get the following message at console.
cscope: -c or -T option mismatch between command line and old symbol database
How do I resolve this?
If you generate a database using the -c or -T options (you use -c in your original command) you are required to pass those options to every subsequent invocation of cscope. Just add -c to your second command (making it cscope -Rbc) and it should work.
cscope -Rb generates only the cscope.out file, whereas cscope -bkqc -i cscope.files generates cscope.in.out, cscope.po.out, and cscope.out (the extra index files come from -q). So there is no need to execute cscope -Rb at all.
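Putting both answers together, a consistent rebuild might look like this (a sketch; /path/to/code is a placeholder, the -name patterns are quoted so the shell does not expand them, and -i passes the file list explicitly):

```shell
# Collect the C sources, headers, and .x files into cscope's file list.
find /path/to/code \( -name '*.c' -o -name '*.x' -o -name '*.h' \) > cscope.files

# Build the database, and keep the same flags (notably -c) on every rebuild.
cscope -bkqc -i cscope.files
```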

BCP utility:"Copy direction must be either 'in', 'out' or 'format'"

I get the error below when I run a BCP command with a trusted connection:
Copy direction must be either 'in', 'out' or 'format'.
I searched MSDN, which suggests that the server name passed could be incorrect.
The command I am trying is:
bcp %SQL_database%..TABLE1 in \FileSERVER\file.dat -f\fileserver\Formats\file.fmt -eERR.txt -m1000000 -C -T RAW -S%SQL_server%
When I pass a username and password instead of using the -T option, it works. The command is executed from command prompt by passing parameters from command line.
Your -C and -T options are flip-flopped: you have -C -T RAW instead of -C RAW -T. It is -C (the code page specifier) that takes the RAW argument; -T (trusted connection) takes no argument at all.
Check the bcp utility's online documentation to confirm that RAW belongs to -C rather than -T.
Try this instead:
bcp %SQL_database%..TABLE1 in \FileSERVER\file.dat -f\fileserver\Formats\file.fmt -eERR.txt -m1000000 -C RAW -T -S%SQL_server%
My guess is that you misplaced the arguments when switching to a trusted connection (the -T option, i.e., integrated security) from SQL Server authentication (the -U and -P options).
