Can we load files from a file location to a named internal stage using a stored proc in Snowflake? - snowflake-cloud-data-platform

Can we load files from a file location to a named internal stage using a stored procedure in Snowflake? I understand we can't use the PUT command here. Kindly help.

You won't be able to use a Snowflake stored procedure to PUT files from your local machine into an internal stage.
Below are a few links which might be helpful:
https://community.snowflake.com/s/article/How-to-use-Variable-Substitution-in-PUT-command-using-Snowflake-Python-connector
https://community.snowflake.com/s/article/How-to-use-an-ODBC-DSN-connection-in-a-NET-Client-to-put-the-file-to-Snowflake-internal-stage

You can't use a Snowflake stored procedure for this. If the file is in cloud storage (AWS, Azure, GCP), you could set up an external stage and ingest from there directly (see the sketch below). If the file is local, you will need a client tool to push it into an internal stage, e.g. the SnowSQL client, Python, .NET, or any tool that can use an ODBC or JDBC driver.
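As a rough sketch of the external-stage route (the stage, integration, table, and bucket names below are placeholders, and the CSV file format is an assumption):
create stage my_ext_stage                          -- register the cloud location as an external stage
  url = 's3://my-bucket/path/'
  storage_integration = my_s3_integration;
copy into my_table                                 -- ingest the staged files into the target table
  from @my_ext_stage
  file_format = (type = 'csv' skip_header = 1);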

The short answer is no.
The reason is that Snowflake runs in the cloud and does not have access to your local computer.
You have to upload your files from the local machine into a cloud storage solution (or a Snowflake internal stage) via supported tools; then you can use stored procedures to process the files.
Below is some additional information.
To load a file from a computer or VM into Snowflake you are looking at one of two options:
Upload to a Snowflake internal stage via the PUT command,
or
Upload from your computer to cloud storage referenced by a Snowflake external stage, via the cloud provider's tools:
AWS:
aws s3 cp filename.txt s3://bucket-name
Azure:
az storage blob upload \
--account-name <storage-account> \
--container-name <container> \
--name helloworld \
--file helloworld \
--auth-mode login
GCP:
gsutil cp *.txt gs://my-bucket
After you have loaded your data into a stage (if you are unsure what type of stage to use, see Choosing a Stage in the documentation), you load the data into your table via the COPY INTO command.
If you are using a Snowflake internal stage, the following sample command would upload your file:
put file:///tmp/data/mydata.csv @my_int_stage;
You can run the PUT command from SnowSQL or from Python using the Snowflake Connector for Python.
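For completeness, a minimal SnowSQL sketch that stages a file and then copies it into a table (my_int_stage, my_table, and the file path are placeholders, and the file format options are assumptions):
put file:///tmp/data/mydata.csv @my_int_stage;     -- PUT auto-compresses the file to mydata.csv.gz
copy into my_table                                 -- load the staged file into the target table
  from @my_int_stage/mydata.csv.gz
  file_format = (type = 'csv' skip_header = 1);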
Provide us with the code you are attempting to use and any error messages, and we can assist you in moving forward.

Related

Azure Data Factory - File System to Oracle Cloud Storage

Is it possible to copy files from an on-prem file system to Oracle Cloud Storage? Note that we are not concerned with the data inside the files; in simple terms it's as if copying files from one folder to another.
Here is what I have tried:
1. Created a self-hosted integration runtime for the file system (testing on my local machine)
2. Created a linked service for the file system
3. Created a linked service for Oracle Cloud Storage (OCS)
4. Created a dataset for the file system
5. Created a dataset for OCS
However, when the connection is tested I get an error in step 2 saying that my C:\ cannot be resolved, and in step 5 it says it cannot be used as a sink because OCS is not supported. At this point it seems like it is not possible to copy files into OCS?
I tried different configurations to see if OCS can be used as a drop container for files.

How to load data from UNIX to snowflake

I have created CSV files on the UNIX server where Informatica resides. I want to load those CSV files directly from the UNIX box into Snowflake using SnowSQL. Can someone help me with how to do that?
Log into SnowSQL:
https://docs.snowflake.com/en/user-guide/getting-started-tutorial-log-in.html
Create a Database, Table and Virtual Warehouse, if not done so already:
https://docs.snowflake.com/en/user-guide/getting-started-tutorial-create-objects.html
Stage the CSV files, using PUT:
https://docs.snowflake.com/en/user-guide/getting-started-tutorial-stage-data-files.html
Copy the files into the target table using COPY INTO:
https://docs.snowflake.com/en/user-guide/getting-started-tutorial-copy-into.html
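Putting those steps together, a minimal SnowSQL session might look like the following (the stage, table, and file path names are placeholders, and the CSV file format is an assumption):
create stage if not exists my_csv_stage;                    -- a named internal stage (name is hypothetical)
put file:///data/informatica/output/*.csv @my_csv_stage;    -- stage the CSV files; PUT auto-compresses them
copy into my_table                                          -- load the staged files into the target table
  from @my_csv_stage
  file_format = (type = 'csv' skip_header = 1);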

Where is local log file located for Google App Engine Java app

I am writing to the log with my Google App Engine (GAE) app but I cannot find where it is being stored. I have read a lot of posts on Stack Overflow but none of them indicate where to find this. I'm running my app locally in IntelliJ, using Gradle.
The logs are not stored in a plain text file; they're stored in an SQLite database. From Local Development Server Options:
--logs_path=...
By default, the logs for the local development server are stored in
memory only. Specify this option when you run the local development
server to store the logs into a file, which makes the logs available
across server restarts. You must specify the directory path and name
to a SQLite database file. A SQLite database file is created with the
specified name if the file does not already exist. For example:
--logs_path=/home/logs/boglogs.db
Not sure about the in-memory storage, though; I can see my devserver writing to a default db file even without setting this option explicitly (on Linux, using lsof on the PyCharm-driven devserver's PID):
$ lsof -p 22811 | grep -i log
python2.7 22811 username 4ur REG 8,3 1648899072 1705871 /tmp/appengine.<app_name>.username/logs.db
python2.7 22811 username 24u REG 8,3 3608 1712816 /tmp/appengine.<app_name>.username/logs.db-journal
$ file /tmp/appengine.<app_name>.username/logs.db
/tmp/appengine.<app_name>.username/logs.db: SQLite 3.x database
Note: the above is for the Python devserver; Java is a bit different, but on Linux the same method may be usable to identify the location and type of the default file holding the logs. The --generated_dir option might be the one to use to override the default location. From Command-Line Arguments:
--generated_dir=...
Set the directory where generated files are created.

Informatica Cloud - Picking up files from SFTP and inserting records in Salesforce

Our objective is as follows:
a) Pick up a file "Test.csv" from a Secure FTP location.
b) After picking up the file we need to insert the contents of the file into an object in Salesforce.
I created the following connection for the remote SFTP location (which will contain "Test.csv"):
Step 1: created the SFTP connection (screenshot omitted).
Step 2: started to build a Data Synchronization task (screenshot omitted).
What we want is for Informatica Cloud to connect to the secure FTP location and load the contents of a .csv file from that location into our object in Salesforce.
But as you can see in Step 2, it does not allow me to choose a .csv file from that remote location.
Instead the wizard prompts me to choose a file from a local directory (on my machine, where the secure agent is running), which is not what I want.
What should I do in this scenario? Can someone help?
You can write a UNIX script to transfer the file to your secure agent and then use Informatica to read the file. Although I have never tried using SFTP in Informatica Cloud, I have used the cloud product and I do know that all files are tied to the location of the secure agent (either a server or a local computer).
The local directory is used for template files. The idea is that you set up the task using a local template and then IC will connect to the FTP site when you actually run the task.
The Informatica video below shows how this works at around 1:10:
http://videos.informaticacloud.com/2FQjj/secure-ftp-and-salesforececom-using-informatica-cloud/
Can you clarify the secure agent OS, Windows or Linux?
For a Windows environment you will have to call the script using the WinSCP or Cygwin utility; I recommend the former.
For Linux, the basic commands in the script should work.

How to export data to local system from snowflake cloud data warehouse?

I am using Snowflake cloud data warehouse, which, like Teradata, hosts data. I am able to run queries and get results in the web UI itself, but I am unclear how one can export the results to a local PC so that we can report based on the data.
Thanks in advance
You have two options, both of which use sfsql (which is based on HenPlus). The first option is to export the result of your query to an S3 staging location, as shown below:
CREATE STAGE my_stage URL='s3://loading/files/' CREDENTIALS=(AWS_KEY_ID='****' AWS_SECRET_KEY='****');
COPY INTO @my_stage/dump
FROM (select * from orderstiny limit 5) file_format=(format_name='csv' compression='gzip');
The other option is to capture the SQL result into a file.
test.sql:
set-property column-delimiter ",";
set-property sql-result-showheader off;
set-property sql-result-showfooter off;
select current_date() from dual;
$ ./sfsql < test.sql > result.txt
For more details and help, log in to your Snowflake account and access the online documentation, or post your question to Snowflake support via the support portal, which is accessible through the Snowflake help section: Help -> Support Portal.
Hope this helps.
You can use a COPY INTO <location> command to export a table (or query results) into files in a stage, and then a GET command to save the files onto your local filesystem (GET works with internal stages; for an S3 stage you would download with the cloud provider's tools). You can only do this from the "sfsql" Snowflake command-line tool (not from the web UI).
Search the documentation for "unloading"; you'll find more info there.
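As a minimal sketch of that unload-and-download flow using a named internal stage (the stage, table, and path names are placeholders):
create stage if not exists my_unload_stage;         -- a named internal stage to unload into
copy into @my_unload_stage/result/data_             -- unload query results into the stage
  from (select * from my_table)
  file_format = (type = 'csv' compression = 'gzip');
get @my_unload_stage/result/ file:///tmp/;          -- download the unloaded files to the local machine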
You can directly download the data from Snowflake to the local filesystem without staging to S3 or redirecting via a Unix pipe.
Use COPY INTO <location> to unload the table data into the table stage:
https://docs.snowflake.net/manuals/sql-reference/sql/copy-into-location.html
snowsql$> copy into @%test_table/result/data_ from test_table
file_format = (TYPE ='[FILE_TYPE]' compression='[COMPRESSION_TYPE]');
Use the GET command to download the data from the table stage to the local filesystem:
https://docs.snowflake.net/manuals/sql-reference/sql/get.html
snowsql$> get @%test_table/result/data_ file:///tmp/;
