How to export data to local system from snowflake cloud data warehouse?

I am using the Snowflake cloud data warehouse, which, like Teradata, hosts data. I am able to run queries and get results in the web UI itself. But I am unclear how one can export the results to a local PC so that we can report based on the data.
Thanks in advance

You have two options, both of which use sfsql, the Snowflake command-line client based on HenPlus. The first option is to export the result of your query to an S3 staging file, as shown below:
CREATE STAGE my_stage URL='s3://loading/files/' CREDENTIALS=(AWS_KEY_ID='****' AWS_SECRET_KEY='****');
COPY INTO @my_stage/dump
FROM (select * from orderstiny limit 5) file_format=(format_name='csv' compression='gzip');
The other option is to capture the SQL result into a file:
test.sql:
set-property column-delimiter ",";
set-property sql-result-showheader off;
set-property sql-result-showfooter off;
select current_date() from dual;
$ ./sfsql < test.sql > result.txt
For more details and help, log in to your Snowflake account and access the online documentation, or post your question to Snowflake support via the support portal, which is accessible through the Snowflake help section: Help -> Support Portal.
Hope this helps.

You can use a COPY command to export a table (or query results) into a file on S3 (using "stage" locations), and then a GET command to save it onto your local filesystem. You can only do this from the "sfsql" Snowflake command-line tool (not from the web UI).
Search the documentation for "unloading"; you'll find more info there.
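A minimal sketch of that unload-then-download flow, using a named internal stage so no S3 credentials are needed (the table and stage names are hypothetical):
-- Create a named internal stage to hold the unloaded files
CREATE STAGE my_unload_stage;
-- Unload the query result into the stage as gzip-compressed CSV
COPY INTO @my_unload_stage/mytable_dump
FROM (SELECT * FROM mytable)
FILE_FORMAT = (TYPE = 'CSV' COMPRESSION = 'GZIP');
-- Download the staged files to the local filesystem (command-line client only)
GET @my_unload_stage/mytable_dump file:///tmp/exports/;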

You can directly download the data from Snowflake to the local filesystem, without staging to S3 or redirecting via a Unix pipe.
Use COPY INTO to unload the table data to the table stage:
https://docs.snowflake.net/manuals/sql-reference/sql/copy-into-location.html
snowsql$> copy into @%test_table/result/data_ from test_table
file_format = (TYPE ='[FILE_TYPE]' compression='[COMPRESSION_TYPE]');
Use GET to download the data from the table stage to the local filesystem:
https://docs.snowflake.net/manuals/sql-reference/sql/get.html
snowsql$> get @%test_table/result/data_ file:///tmp/;
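For example, with the placeholders filled in for gzip-compressed CSV (test_table is still just an illustrative table name):
-- Unload the table to its table stage as gzip-compressed CSV
copy into @%test_table/result/data_ from test_table
file_format = (TYPE = 'CSV' compression = 'GZIP');
-- Pull the resulting files down to /tmp on the local machine
get @%test_table/result/data_ file:///tmp/;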

Related

How to load data from UNIX to snowflake

I have created CSV files on a UNIX server using Informatica, which resides on that server. I want to load those CSV files directly from the UNIX box to Snowflake using SnowSQL; can someone help me with how to do that?
Log into SnowSQL:
https://docs.snowflake.com/en/user-guide/getting-started-tutorial-log-in.html
Create a Database, Table and Virtual Warehouse, if not done so already:
https://docs.snowflake.com/en/user-guide/getting-started-tutorial-create-objects.html
Stage the CSV files, using PUT:
https://docs.snowflake.com/en/user-guide/getting-started-tutorial-stage-data-files.html
Copy the files into the target table using COPY INTO:
https://docs.snowflake.com/en/user-guide/getting-started-tutorial-copy-into.html
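Putting those steps together, a minimal SnowSQL sketch might look like this (the stage, table, and file path names are illustrative, not taken from the question):
-- Stage the local CSV files into a named internal stage
CREATE STAGE IF NOT EXISTS my_csv_stage;
PUT file:///data/informatica/output/*.csv @my_csv_stage AUTO_COMPRESS=TRUE;
-- Load the staged files into the target table
COPY INTO my_db.my_schema.my_target_table
FROM @my_csv_stage
FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);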

How to stream data from Snowflake SQL output to S3 on AWS

I have a SQL query to run against three Snowflake tables hosted on an AWS account. I would like to stream any new records, based on the output of my SQL, to an S3 bucket, possibly using Kafka or another streaming service. What are my options to implement this?
You can unload data directly into an S3 bucket:
Create a storage integration.
Create a stage, or specify the bucket URL directly in the query:
copy into s3://mybucket/unload/ from mytable storage_integration = s3_int;
Ref : https://docs.snowflake.com/en/user-guide/data-unload-considerations.html
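A minimal sketch of that setup (the integration name, role ARN, and bucket are placeholders):
-- One-time setup: a storage integration pointing at the target bucket
CREATE STORAGE INTEGRATION s3_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/my_snowflake_role'
  STORAGE_ALLOWED_LOCATIONS = ('s3://mybucket/unload/');
-- Unload the query result to the bucket as CSV
COPY INTO 's3://mybucket/unload/'
FROM (SELECT * FROM mytable)
STORAGE_INTEGRATION = s3_int
FILE_FORMAT = (TYPE = 'CSV');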

End user initiating SQL commands to create a file from a SQL table?

Using SQL Manager ver 18.4 on 2019 servers.
Is there an easier way to allow an end user, with NO access to anything SQL related, to fire off some SQL commands that:
1.) create and update a SQL table, and
2.) then create a file from that table (CSV in my case) that they have access to in a folder share?
Currently I do this using xp_cmdshell with bcp commands in a cloud-hosted environment, hence I am not in control of ANY permissions or access, etc. For example:
declare @bcpCommandIH varchar(200)
set @bcpCommandIH = 'bcp "SELECT * from mydb.dbo.mysqltable order by 1 desc" queryout E:\DATA\SHARE\test\testfile.csv -S MYSERVERNAME -T -c -t, '
exec master..xp_cmdshell @bcpCommandIH
The way I achieve this now is by allowing the end users to run a Crystal Report which fires a SQL STORED PROCEDURE that creates and updates a SQL table and then creates a CSV file the end user can access. Creating and updating the table is easy. Getting the table into the hands of the end user is nothing but trouble in this hosted environment.
We always end up with permission or other folder-share issues, and it's a complete waste of time. The cloud service admins tell me "this is a huge security issue and you need to start and stop xp_cmdshell with some commands every time you want to generate this file, to be safe".
Well, this is nonsense to me. I don't want to have to touch any of this, and it needs to be AUTOMATED for the end user, start to finish.
Is there some easier way to AUTOMATE a process for an END USER to create and update a SQL table and simply get the contents of that table exported to a CSV file, without all the administration trouble?
Are there other, simpler options than xp_cmdshell and bcp to achieve this?
Thanks,
MP
Since the environment allows you to run a Crystal Report, you can use the report to create a table via ODBC export. There are third-party tools that allow that to happen even if the table already exists (giving you an option to replace or append records in an existing target table).
But it's not clear why you can't get the data directly into the Crystal Report and simply export it to CSV.
There are free/inexpensive tools that allow you to automate/schedule the exporting/emailing/printing of a Crystal Report. See list here.

Can we load files from a file location to a named internal stage using a stored proc in Snowflake?

Can we load files from a file location to an internal stage using a stored proc in Snowflake? I understand we can't use the PUT command here. Kindly help.
You won't be able to use a Snowflake stored procedure to PUT files from the local filesystem to an internal stage.
Below are a few links which might be helpful:
https://community.snowflake.com/s/article/How-to-use-Variable-Substitution-in-PUT-command-using-Snowflake-Python-connector
https://community.snowflake.com/s/article/How-to-use-an-ODBC-DSN-connection-in-a-NET-Client-to-put-the-file-to-Snowflake-internal-stage
You can't use a Snowflake stored procedure. If the file is in cloud storage (AWS, Azure, GCP) then you could set up an external stage and ingest from there directly. If the file is local, you will need to use a tool to push it into an internal stage, e.g. the SnowSQL client, Python, .NET, or a tool that can use an ODBC or JDBC client.
The short answer is no.
The reason is that Snowflake runs in the cloud and does not have access to your local computer.
You have to upload your files from local machines into a cloud storage solution via supported tools; then you are able to use stored procedures to process the files.
Below is some additional information.
To load a file from a computer or VM to Snowflake, you are looking at one of two options:
Upload to a Snowflake internal stage via the PUT command,
or
Upload from your computer to a Snowflake external stage via cloud provider tools:
AWS:
aws s3 cp filename.txt s3://bucket-name
Azure:
az storage blob upload \
--account-name <storage-account> \
--container-name <container> \
--name helloworld \
--file helloworld \
--auth-mode login
GCP:
gsutil cp *.txt gs://my-bucket
After you have loaded your data into a stage (if you are unsure what type of stage to use, see Choosing a Stage), you load the data into your table via the COPY INTO command.
If you are using a Snowflake internal stage, the following sample command uploads your file:
put file:///tmp/data/mydata.csv @my_int_stage;
You can run the PUT command from SnowSQL or from Python using the Snowflake Connector for Python.
Provide us with the code you are attempting to use and the error messages, and we can assist you in moving forward.
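For completeness, a minimal SnowSQL sketch of the full flow, with hypothetical stage and table names:
-- Upload the local file to a named internal stage (PUT gzip-compresses it by default)
put file:///tmp/data/mydata.csv @my_int_stage;
-- Load the staged file into the target table
copy into my_table
from @my_int_stage/mydata.csv.gz
file_format = (TYPE = 'CSV' SKIP_HEADER = 1);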

Azure SQL blob storage select

I am attempting to create a temp table to store the values of an xlsx file in my Azure blob storage. I have followed numerous Microsoft articles now, and I am under the impression that I should be using SELECT * FROM OPENROWSET(); this seems to be working, or at least selecting something.
Here is my code:
SELECT * INTO ##TempTest FROM OPENROWSET(BULK 'test.xlsx',
DATA_SOURCE = 'DevStoreAccount', SINGLE_CLOB) AS a;
SELECT * FROM ##TempTest
This all runs fine, but the output is not what I am expecting; surely this should return all my columns/rows from the Excel file? Or am I mistaken?
The above code returns the following:
What exactly is it returning and should I be doing something different? Any help would really be appreciated.
I'm trying this route because the columns in the Excel file could change at any time, so I need to create my tables dynamically.
I'd recommend checking this thread; although the post is old, it is still relevant to your question.
The approach taken for the similar scenario:
1- Create and update Excel file using Open XML SDK
2- Upload Excel Template in Azure BLOB
3- Download Excel template in azure web role local storage
4- Read and update excel file from azure web role local storage
5- Upload updated excel in Azure BLOB.
You could also use another similar concept, as mentioned here:
Downloading the Excel file as a Stream from BLOB
Creating the Excel document using the Open XML SDK
After editing, saving the document to a Stream
Uploading the Stream back to BLOB
