I am exporting a table from Snowflake to a local folder using SnowSQL in tab-delimited format. In the Snowflake table, one of the column values contains a tab character (char(9)) which I would like to retain in the output. However, SnowSQL seems to escape it automatically. Is there a way to set the escape character to none for SnowSQL exports?
e.g.
Here is some test data to replicate the issue:
create or replace table test as
select '1234' as col1, 'test'||char(9)||'data' as col2
union
select '2468', 'more_test_data';
You can see we have a tab value in col2 for the first record.
Now if we try to export using snowsql:
snowsql -c [config_file_name] -d [database] -s [schema] -q "select * from test" -o output_format=tsv -o header=false -o timing=false -o friendly=false > C:\Users\user\snowsql\test.txt
When opened, the export file shows:
Any ideas of how I can force the output to be like this please?
Can you try "output_format=plain"?
snowsql -c [config_file_name] -d [database] -s [schema] -q "select * from test" -o output_format=plain -o header=false -o timing=false -o friendly=false > C:\Users\user\snowsql\test.txt
https://docs.snowflake.com/en/user-guide/snowsql-config.html#output-format
I found an alternative solution, based on the file encoding, described in this doc: https://community.snowflake.com/s/article/SnowSQL-Non-ASCII-whitespace-characters-are-displayed-as-Unicode-escape-sequences-in-CSV-TSV-format.
Try this command instead:
SNOWSQL_OUTPUT_AS_UNICODE=true snowsql -c [config_file_name] -d [database] -s [schema] -q "select * from test" -o output_format=tsv -o header=false -o timing=false -o friendly=false > C:\Users\user\snowsql\test.txt
(Note: the VAR=value prefix syntax works in a Unix shell; on Windows cmd, run "set SNOWSQL_OUTPUT_AS_UNICODE=true" on its own line first, then run the snowsql command.)
Sample Output with Tab:
Related
Using the command below generates CSV output in which every value is wrapped in double quotes ("). I am using Windows. If this has already been answered, please share a link to the post.
snowsql -a <account> -u <username> -r accountadmin -d mydb -s myschema -q "select * from customer limit 20" -o output_file="e:\customer.csv" -o quiet=true -o friendly=false -o output_format=csv -o header=false -o timing=false
Sample output is as follows:
"C_CUSTKEY","C_NAME","C_ADDRESS","C_NATIONKEY","C_PHONE","C_ACCTBAL","C_MKTSEGMENT","C_COMMENT"
"1369097","Customer#001369097","jOccbXiKQLaDjnL1VlzTm","2","12-827-936-7420","6053.92","FURNITURE","ng packages cajole upon the slyly bold dolphins. fin"
Thank you for your help!
We are exporting data into a CSV file using a Unix shell script (with snowsql).
Below is the script:
#!/bin/ksh
snowsql -c newConnection -o log_level=DEBUG \
  -o log_file=~/snowsql_sso_debug.log -r SRVC_ACCT_ROLE -w LOAD_WH \
  -d ETL_DEV_DB -s CTL_DB -q "select * from mytable" -o friendly=False \
  -o header=False -o output_format=pipe -o timing=False > test_file.csv
The output starts with something like the line below:
|:--------|:-----------|
I don't want these lines in my CSV file. What option do I need to pass in my snowsql command?
Appreciate your response.
Thanks.
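One pragmatic workaround (my own suggestion, not from the question above): output_format=pipe emits a Markdown-style alignment row like the one shown, so you can simply filter it out after the export. A minimal sketch, reusing the test_file.csv name from the script; the sample data here is made up to keep the example self-contained:

```shell
# Simulate a pipe-format export: an alignment row followed by data rows
printf '|:--------|:-----------|\n| 1234 | test |\n| 2468 | more |\n' > test_file.csv

# Drop the alignment row (lines beginning "|:-"); data rows pass through.
# Assumes no real data row starts with "|:-".
grep -v '^|:-' test_file.csv > test_file_clean.csv
```

In the real script you would just append the grep step after the snowsql redirect.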
Providing my comment as an answer, just in case it works better for you.
I would leverage a COPY INTO <location> command to create a CSV file in an internal stage:
https://docs.snowflake.com/en/sql-reference/sql/copy-into-location.html
And then use a GET statement to pull the file down to your Unix machine.
https://docs.snowflake.com/en/sql-reference/sql/get.html
This gives you greater control over the output format and will likely perform faster, as well.
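A minimal sketch of that approach (my illustration, not from the answer above; the stage path, file name, and format options below are placeholder assumptions):

```sql
-- Unload the table to the user stage (@~) as one uncompressed pipe-delimited file
COPY INTO @~/exports/mytable.csv
  FROM mytable
  FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = '|' COMPRESSION = NONE)
  SINGLE = TRUE
  OVERWRITE = TRUE;

-- Then, from a SnowSQL session on the Unix machine, download the file locally
GET @~/exports/mytable.csv file:///tmp/;
```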
I need to copy 2 million records from a result set to a flat file or Excel. How can I do that?
I tried using a WHERE clause with the row number, but that is not yielding the results I need.
Thanks
Export it with SnowSQL: https://docs.snowflake.com/en/user-guide/snowsql-use.html#exporting-data e.g.
snowsql -c my_example_connection -d sales_db -s public -q "select * from mytable limit 10" -o output_format=csv -o header=false -o timing=false -o friendly=false > output_file.csv
(Remove the "limit 10" clause to export the full 2 million rows.)
I get below mentioned error when I run BCP command for trusted connection:
Copy direction must be either 'in', 'out' or 'format'.
I tried searching MSDN, where it specifies that servername passed could be incorrect.
The command I am trying is:
bcp %SQL_database%..TABLE1 in \FileSERVER\file.dat -f\fileserver\Formats\file.fmt -eERR.txt -m1000000 -C -T RAW -S%SQL_server%
When I pass a username and password instead of using the -T option, it works. The command is executed from command prompt by passing parameters from command line.
Your -C and -T options are flip-flopped: you have -C -T RAW where it should be -C RAW -T.
Check the bcp utility's online documentation for confirmation that -C rather than -T should precede RAW.
Try this instead:
bcp %SQL_database%..TABLE1 in \FileSERVER\file.dat -f\fileserver\Formats\file.fmt -eERR.txt -m1000000 -C RAW -T -S%SQL_server%
My guess is that you misplaced the RAW argument when switching to a trusted connection (the -T option) from SQL Server authentication (the -U and -P options): RAW belongs to -C (the code page option), not to -T.
I'm trying to set up a simple loop to periodically query a database table in bash. Normally I seem to have to do:
sqsh -s SERV -U user -P passwd -D db -L bcp_colsep=','
then within sqsh I have to type:
select * from some_table where foo=bar
\go -m bcp > /path/to/output.out
I was trying to use the -C option to sqsh to pass in the command like this:
sqsh -s SERV -U user -P passwd -D db -L bcp_colsep=',' -C 'select * from some_table where foo=bar \go -m bcp > /path/to/output.out'
but I keep getting:
Incorrect syntax near '\'.
How can I get the desired effect?
When you use the -C option to pass a SQL statement to sqsh, the \go command is executed implicitly, so you should not include it yourself. To get the output in 'bcp' result style, set the variable 'style=bcp' with the -L parameter or pass -m bcp on the command line, then redirect stdout to a file (or use sqsh's -o parameter to specify an output filename). So your command would look like:
sqsh -S SERV -U user -P passwd -D db -L bcp_colsep=',' -m bcp \
-C 'select * from some_table where foo=bar' > /path/to/output.out
HTH, Martin