How to copy 2 million records to csv - snowflake-cloud-data-platform

I need to copy 2 million records from a result set to a flat file or Excel. How can I do that?
I tried using a WHERE clause with the row number, but that is not yielding the results.
Thanks

Export it with SnowSQL (https://docs.snowflake.com/en/user-guide/snowsql-use.html#exporting-data), e.g.:
snowsql -c my_example_connection -d sales_db -s public -q "select * from mytable limit 10" -o output_format=csv -o header=false -o timing=false -o friendly=false > output_file.csv
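One caveat if Excel is the eventual target: an Excel worksheet caps out at 1,048,576 rows, so 2 million records will not fit on one sheet. A sketch of splitting the exported CSV into Excel-sized chunks with GNU split (the file names here are illustrative, not from the question):

```shell
# Stand-in for the real snowsql export (the real file would have ~2M rows)
printf 'row%s\n' 1 2 3 4 5 > output_file.csv

# Split into chunks of at most 2 lines for this demo; for the real file,
# use -l 1000000 so each chunk stays under Excel's 1,048,576-row limit.
split -l 2 -d --additional-suffix=.csv output_file.csv output_part_

ls output_part_*.csv
```

Each chunk can then be opened or imported into Excel separately.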

Related

how to set the escape character for Snowsql export

I am exporting a table from Snowflake to a local folder using Snowsql in a tab-delimited format. In the Snowflake table, one of the column values contains a tab (char(9)) which I would like to retain in the output. However, Snowsql seems to escape it automatically. Is there a way to set the escape character to none for Snowsql exports?
e.g.
Here is some test data to replicate the issue:
create or replace table test as
select '1234'as col1, 'test'||char(9)||'data' as col2
union
select '2468', 'more_test_data';
You can see we have a tab value in col2 for the first record.
Now if we try to export using snowsql:
snowsql -c [config_file_name] -d [database] -s [schema] -q "select * from test" -o output_format=tsv -o header=false -o timing=false -o friendly=false > C:\Users\user\snowsql\test.txt
When opened, the export file shows:
Any ideas on how I can force the output to be like this, please?
Can you try "output_format=plain"?
snowsql -c [config_file_name] -d [database] -s [schema] -q "select * from test" -o output_format=plain -o header=false -o timing=false -o friendly=false > C:\Users\user\snowsql\test.txt
https://docs.snowflake.com/en/user-guide/snowsql-config.html#output-format
I found an alternate solution based on the file encoding, described in this doc: https://community.snowflake.com/s/article/SnowSQL-Non-ASCII-whitespace-characters-are-displayed-as-Unicode-escape-sequences-in-CSV-TSV-format
Try this instead:
SNOWSQL_OUTPUT_AS_UNICODE=true snowsql -c [config_file_name] -d [database] -s [schema] -q "select * from test" -o output_format=tsv -o header=false -o timing=false -o friendly=false > C:\Users\user\snowsql\test.txt
Sample Output with Tab:

How to remove " from Snowsql generated CSV file?

Running the command below generates CSV output with every field wrapped in double quotes ("). I am using Windows. If this has already been answered, please share the link.
snowsql -a <account> -u <username> -r accountadmin -d mydb -s myschema -q "select * from customer limit 20" -o output_file="e:\customer.csv" -o quiet=true -o friendly=false -o output_format=csv -o header=false -o timing=false
Sample output is as follows:
"C_CUSTKEY","C_NAME","C_ADDRESS","C_NATIONKEY","C_PHONE","C_ACCTBAL","C_MKTSEGMENT","C_COMMENT"
"1369097","Customer#001369097","jOccbXiKQLaDjnL1VlzTm","2","12-827-936-7420","6053.92","FURNITURE","ng packages cajole upon the slyly bold dolphins. fin"
Thank you for your help!
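A common workaround is to strip the quotes in a post-processing step. A sed sketch, which is only safe when no field contains an embedded comma or double quote (the file names are assumptions):

```shell
# Sample row in the quoted format that snowsql produced
echo '"1369097","Customer#001369097","FURNITURE"' > customer.csv

# Delete every double quote; only safe if no field embeds a comma or quote
sed 's/"//g' customer.csv > customer_noquotes.csv

cat customer_noquotes.csv   # 1369097,Customer#001369097,FURNITURE
```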

snowsql option while exporting data to a csv file in unix

We are exporting data into a CSV file using a Unix shell script (via snowsql).
Below is the script:
#!/bin/ksh
snowsql -c newConnection -o log_level=DEBUG -o log_file=~/snowsql_sso_debug.log \
  -r SRVC_ACCT_ROLE -w LOAD_WH -d ETL_DEV_DB -s CTL_DB \
  -q "select * from mytable" -o friendly=False -o header=False \
  -o output_format=pipe -o timing=False > test_file.csv
The output starts with something like the line below:
|:--------|:-----------|
I don't want this line in my CSV file. What option do I need to use in my snowsql command?
Appreciate your response.
Thanks.
Providing my comment as an answer, just in case it works better for you.
I would leverage a COPY INTO command to create a CSV file in an internal stage location:
https://docs.snowflake.com/en/sql-reference/sql/copy-into-location.html
And then use a GET statement to pull the file down to your Unix machine.
https://docs.snowflake.com/en/sql-reference/sql/get.html
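Put together, the two steps might look like the sketch below; the stage path, file format options, and local target directory are assumptions, not taken from the question:

```sql
-- Unload the query result to the user stage as pipe-delimited files
COPY INTO @~/unload/mytable_
  FROM (SELECT * FROM mytable)
  FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = '|')
  OVERWRITE = TRUE;

-- Then download the staged files to the local Unix machine
GET @~/unload/ file:///tmp/unload/;
```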
This gives you greater control over the output format and will likely perform faster, as well.
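If rewriting the pipeline isn't an option, the separator row can also be filtered out of the existing pipe-format output in the shell; the pattern below assumes the separator is the only line made up entirely of pipes, colons, and dashes:

```shell
# Stand-in for snowsql's pipe-format output, including the separator row
printf '|:--------|:-----------|\n|a|b|\n|c|d|\n' > test_file.csv

# Drop any line consisting only of | : - characters (the separator row)
grep -Ev '^\|[-:|]+\|$' test_file.csv > clean_file.csv

cat clean_file.csv
```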

how do i pass in a command to sqsh and get the output to a file in one go?

I'm trying to set up a simple loop to periodically query a database table in bash. Normally I seem to have to do:
sqsh -s SERV -U user -P passwd -D db -L bcp_colsep=','
then within sqsh I have to type:
select * from some_table where foo=bar
\go -m bcp > /path/to/output.out
I was trying to use the -C option to sqsh to pass in the command like this:
sqsh -s SERV -U user -P passwd -D db -L bcp_colsep=',' -C 'select * from some_table where foo=bar \go -m bcp > /path/to/output.out'
but I keep getting:
Incorrect syntax near '\'.
How can I get the desired effect?
When you use the -C option to pass a SQL statement to sqsh, the \go command is executed implicitly. To get output in 'bcp' result style, you need to set the variable 'style=bcp' using the -L parameter or pass -mbcp on the command line, and then either redirect the output to a file or use the sqsh -o parameter to specify an output filename. So your command would look like:
sqsh -S SERV -U user -P passwd -D db -L bcp_colsep=',' -m bcp \
-C 'select * from some_table where foo=bar' > /path/to/output.out
HTH, Martin

bcp export for unicode data

I am using bcp command in sql server to export data generated from a query to .csv file with the help of following command
xp_cmdshell bcp EXEC .DBO. QUERYOUT -U -P /c /t, -T -S
It is working fine and exporting data as expected, but now we have a column that contains multilingual data, and when exporting with the above command the data in the CSV file shows as "????????".
After some googling I found the -w switch for Unicode characters, but this option creates a Unicode file that does not open properly in Excel (columns are not separated by commas).
Can anybody help me if I am missing anything?
/t, is not correct; it should read -t, to set the field delimiter. And -w is the correct flag for Unicode. Also, I don't think -U/-P is used with -T.
bcp AdventureWorks2012..myTestUniCharData out C:\myTestUniCharData-w.Dat -w -t, -T
See the examples area in the following tech note:
http://technet.microsoft.com/en-us/library/ms188289.aspx
Try using -w -T only.
Don't add a field delimiter.
I don't think "/c" or "/t" are the correct switches; see the documentation: it should be -c, or in your case -w for Unicode. Try from the command line using:
bcp AdventureWorks2012.dbo.myTestUniCharData IN C:\temp\logs\ex170112.log -w -S localhost\SQLEXPRESS -T
you could also use a format file (.fmt) to tell it to format certain columns. see here.
Using the -w switch produces a UTF-16 file, which is not easy to work with.
If your special characters are covered by ISO 8859-1, then use the switches -C -c in your bcp command. -C makes an ISO 8859-1 file.
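If -w is otherwise doing the right thing, another option is to keep it and convert the UTF-16 file to UTF-8 afterwards, so Excel and other tools read it as a normal comma-separated file. A sketch assuming iconv is available (file names are illustrative):

```shell
# Stand-in for bcp -w output: a small UTF-16LE, comma-delimited file
printf 'id,name\n1,müller\n' | iconv -f UTF-8 -t UTF-16LE > export.dat

# Convert the UTF-16LE export to UTF-8 for downstream tools
iconv -f UTF-16LE -t UTF-8 export.dat > export.csv

cat export.csv
```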
The following worked for me to import Unicode from a text file:
bcp dbo.MyTable in unicode-input.txt -S localhost -T -d EWBKonfig -C 65001 -c -t, -r '0x0A'
