snowsql option while exporting data to a csv file in unix

We are exporting data into a CSV file using a Unix shell script (via snowsql).
Below is the script:
#!/bin/ksh
snowsql -c newConnection -o log_level=DEBUG \
  -o log_file=~/snowsql_sso_debug.log -r SRVC_ACCT_ROLE -w LOAD_WH \
  -d ETL_DEV_DB -s CTL_DB -q "select * from mytable" -o friendly=False \
  -o header=False -o output_format=pipe -o timing=False > test_file.csv
The output starts with something like the below:
|:--------|:-----------|
I don't want to display the above lines in my CSV file. What option do we need to use in the snowsql query?
Appreciate your response.
Thanks.

Providing my comment as an answer, just in case it works better for you.
I would leverage a COPY INTO command to write a CSV file to an internal stage location:
https://docs.snowflake.com/en/sql-reference/sql/copy-into-location.html
And then use a GET statement to pull the file down to your Unix machine.
https://docs.snowflake.com/en/sql-reference/sql/get.html
This gives you greater control over the output format and will likely perform faster, as well.
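A minimal sketch of that approach, assuming the user stage (@~), the mytable name from the question, and a local /tmp target directory; adjust the stage, paths, and file format options to your environment:

# Unload the table to the user stage as pipe-delimited CSV
snowsql -c newConnection -q "COPY INTO @~/unload/mytable FROM mytable FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = '|') OVERWRITE = TRUE"
# Download the resulting file(s) to the local machine
snowsql -c newConnection -q "GET @~/unload/mytable file:///tmp/"

Note that unloaded files are gzip-compressed by default and may be split into multiple parts; the SINGLE and COMPRESSION copy options can change that.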

Related

Pass config file as arg when calling snowsql from command line

I am trying to run a snowsql query from the command line and also to pass a config file while calling snowsql. On this blog there is an option presented:
--config PATH   SnowSQL config file path.
I tried including this:
#!/bin/bash
snowsql -f training-data.sql \
-o quiet=true \
-o friendly=false \
-o header=false \
-config=./config
When I attempt to run this I get:
No connection could be found for onfig=./config
It's odd because previously, I could swear the error message was (note onfig vs. nfig!):
No connection could be found for nfig=./config
How can I tell snowsql to use ./config as the config file when running the query?
You don't need an equals sign, and the option takes two dashes (the single dash gets parsed as the -c connection option, which is why the error message complains about onfig=./config). It should just be:
--config ./config
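Putting that into the original script, as a sketch reusing the training-data.sql and ./config paths from the question:

#!/bin/bash
# Run the SQL file quietly, without banners or headers, using ./config as the config file
snowsql -f training-data.sql \
  -o quiet=true \
  -o friendly=false \
  -o header=false \
  --config ./config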

Calling procedure through Snowsql

I am calling a stored procedure through SnowSQL and getting the below error.
002141 (42601): SQL compilation error:
Unknown user-defined function ETL_SCHEMA.PROC
Below is the snowsql query:
snowsql -c newConnection -o log_level=DEBUG -r ACCT_ROLE -w ETL_XS_WH -d ETL_DEV_DB -s ETL_SCHEMA -q "CALL ETL_SCHEMA.PROC('202')" -o friendly=False -o header=False -o output_format=plain -o timing=False
Is anything wrong here?
Is CALL ETL_SCHEMA.PROC('202') working in your Snowflake Web GUI? Maybe it's not a stored procedure but a user-defined function.
The issue you are having is either permissions-based or it's a search-path issue.
I'd recommend prefixing ETL_SCHEMA with the database name (i.e. using the fully qualified name) and trying that. You can also simply run a select current_role(), current_database(), current_schema(); command instead of the call command to see what the context is; you might have something in the config that is overriding the arguments passed in via the command line.
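A quick sketch of both checks, reusing the connection and context flags from the question:

# Verify what role/database/schema the session actually resolves to
snowsql -c newConnection -r ACCT_ROLE -w ETL_XS_WH -d ETL_DEV_DB -s ETL_SCHEMA -q "select current_role(), current_database(), current_schema();"
# Call the procedure with a fully qualified name
snowsql -c newConnection -r ACCT_ROLE -w ETL_XS_WH -q "CALL ETL_DEV_DB.ETL_SCHEMA.PROC('202')"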

cscope: -c or -T mismatch between command line and old symbol database

I'm trying to create tags for *.c, *.x and *.h files.
These are the following commands which I executed.
find <absolute_path_of_code> -name "*.c" -o -name "*.x" -o -name "*.h" > cscope.files
cscope -bkqc cscope.files
Up to this point everything is OK.
But after this when I execute the command,
cscope -Rb
I get the following message at console.
cscope: -c or -T option mismatch between command line and old symbol database
How do I resolve this?
If you generate a database using the -c or -T options (you use -c in your original command) you are required to pass those options to every subsequent invocation of cscope. Just add -c to your second command (making it cscope -Rbc) and it should work.
cscope -Rb generates only the cscope.out file, but cscope -bkqc -i cscope.files generates cscope.in.out, cscope.po.out and cscope.out, so there is no need to execute cscope -Rb at all.
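A corrected end-to-end sequence, as a sketch (note the quoted glob patterns, so the shell doesn't expand them in the current directory, and the -i flag for the file list):

# Build the list of source files
find <absolute_path_of_code> -name "*.c" -o -name "*.x" -o -name "*.h" > cscope.files
# Build the cross-reference plus inverted indexes; -c keeps the database uncompressed,
# so remember to pass -c on every later invocation as well
cscope -bkqc -i cscope.files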

BCP utility:"Copy direction must be either 'in', 'out' or 'format'"

I get the below error when I run a BCP command with a trusted connection:
Copy direction must be either 'in', 'out' or 'format'.
I tried searching MSDN, where it specifies that the servername passed could be incorrect.
The command I am trying is:
bcp %SQL_database%..TABLE1 in \\FileSERVER\file.dat -f\\fileserver\Formats\file.fmt -eERR.txt -m1000000 -C -T RAW -S%SQL_server%
When I pass a username and password instead of using the -T option, it works. The command is executed from command prompt by passing parameters from command line.
Your -C and -T options are flip-flopped: you have -C -T RAW instead of -C RAW -T.
Check the bcp utility's online documentation for confirmation that -C rather than -T should precede RAW.
Try this instead:
bcp %SQL_database%..TABLE1 in \\FileSERVER\file.dat -f\\fileserver\Formats\file.fmt -eERR.txt -m1000000 -C RAW -T -S%SQL_server%
My guess is that you probably misplaced the -T option when switching to a trusted connection (the -T option) from SQL Server authentication (the -U and -P options).

bcp export for unicode data

I am using the bcp command in SQL Server to export data generated from a query to a .csv file, with the help of the following command:
xp_cmdshell bcp EXEC .DBO. QUERYOUT -U -P /c /t, -T -S
It is working fine and exporting data as expected, but now we have a column which contains multilingual data, and during export with the above command the data in the CSV file shows as "????????".
After doing some googling I found another switch, -w, to be used for Unicode characters, but this option creates a Unicode file that does not open in Excel properly (columns are not separated by commas).
Can anybody tell me if I am missing anything?
/t, is not correct. It should read -t, to set the field delimiter. And -w is the correct flag for Unicode. Also, I don't think -U/-P is used with -T.
bcp AdventureWorks2012..myTestUniCharData out C:\myTestUniCharData-w.Dat -w -t, -T
See the examples area in the following tech note:
http://technet.microsoft.com/en-us/library/ms188289.aspx
Try using -w -T only.
Don't add a field delimiter.
I don't think "/c" or "/t" are the correct switches; see the documentation: it should be -c, or in your case -w for Unicode. Try from the command line using:
bcp AdventureWorks2012.dbo.myTestUniCharData IN C:\temp\logs\ex170112.log -w -S localhost\SQLEXPRESS -T
You could also use a format file (.fmt) to tell it how to format certain columns.
Using the -w switch produces a file in UTF-16 format, which is not easy to work with.
If your special characters are covered by the ISO 8859-1 character set, then use the switches -C -c in your bcp command. -C makes an ISO 8859-1 file.
The following worked for me to import Unicode from a text file:
bcp dbo.MyTable in unicode-input.txt -S localhost -T -d EWBKonfig -C 65001 -c -t, -r '0x0A'
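The same code-page approach works in the export direction. A sketch, assuming a hypothetical dbo.MyTable, a local trusted connection, and a bcp version recent enough to support code page 65001 (UTF-8):

REM Export as comma-separated UTF-8 so Excel splits the columns correctly
bcp "SELECT * FROM dbo.MyTable" queryout mytable.csv -C 65001 -c -t, -T -S localhost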
