I am trying to figure out how Snowpipes execute when you set up Snowflake to automatically import data using Azure Event Grid notifications, as this document describes - https://docs.snowflake.com/en/user-guide/data-load-snowpipe-auto-azure.html
So say I have an Azure Data Lake Gen2 container attached to Snowflake as an external stage, this container has three folders (FolderA, FolderB, and FolderC), and I have a Snowpipe set up for each folder. Then I add a file to FolderA, and Snowflake gets a message from Azure Event Grid saying that the file has been added (the Event Grid message includes the full file name). Does Snowflake know to run only the Snowpipe set up for FolderA, or will it run all three Snowpipes? And when the Snowpipe runs, does it scan for files, or does it import just the specific file named in the Event Grid message?
Setting up Event Grid, Snowpipe, and the overall handshake in the Azure/Snowflake combination is a bit tricky, and I have never tried it with multiple folders and Snowpipes. But I prefer to give a folder and file pattern, to make sure that even if a Snowpipe is triggered, it only picks up the files targeted by the COPY command that the Snowpipe wraps (see the sketch below).
In AWS, all Snowpipes with the auto_ingest = true flag generate the same ARN key, and SNS (the equivalent of Event Grid) also takes a file pattern on each folder and calls the same ARN. So I assume a pipe may run but will not copy anything that does not match its folder and pattern.
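A minimal sketch of that idea in Snowflake SQL, assuming a hypothetical external stage my_adls_stage, hypothetical target tables, and a hypothetical notification integration my_notification_int. With auto_ingest = true, Snowflake matches the file path in the event message against the stage path in each pipe's COPY statement, so scoping each pipe to its folder should keep it from picking up the other folders' files:

    -- One pipe per folder, each scoped by the stage path in its COPY statement.
    -- A notification for FolderA/file.csv should only match pipe_folder_a.
    create or replace pipe pipe_folder_a
      auto_ingest = true
      integration = 'MY_NOTIFICATION_INT'  -- hypothetical notification integration
      as
      copy into table_a
      from @my_adls_stage/FolderA/
      file_format = (type = 'CSV');

    create or replace pipe pipe_folder_b
      auto_ingest = true
      integration = 'MY_NOTIFICATION_INT'
      as
      copy into table_b
      from @my_adls_stage/FolderB/
      file_format = (type = 'CSV');

A PATTERN clause on the COPY statement can narrow this further to specific file names within each folder.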
But I will surely try and simulate how it works.
I have an Azure Logic App that monitors my emails, and when a target email is found, it drops the attachment into Blob storage. The app is on a Consumption plan.
The issue is, sometimes it takes up to 50 minutes for the email to be grabbed and dropped. I know there is a startup delay when things go idle, but I was reading seconds to minutes, not close to an hour. Does anyone know how I can troubleshoot this?
"sometimes it takes up to 50 minutes to grab and drop the email"

Based on this doc, the reason for the delay is:
When the trigger encounters a new file, it tries to ensure that the file is completely written. For instance, the file may still be being written or modified at the time the trigger polls the file server. To avoid returning a file with partial content, the trigger takes note of the timestamp of files that were modified recently and does not immediately return them; those files are returned only when the trigger polls again. Sometimes this can lead to a delay of up to twice the trigger polling interval. This also means that the trigger does not guarantee to return all files in a single run when the "Split On" option is disabled.
For more information you can refer to:
Automate tasks to process emails by using Azure Logic Apps | MS DOC
How to Send an Email with one or more attachments after getting the content from Blob storage? | SO Thread
Logic App created to add email attachments to Blob storage
I have created a Snowpipe for one of my Snowflake tables. Source files land in an AWS S3 bucket periodically, so I followed the steps below to create the Snowpipe:
Created an external stage
Queried the files using the LIST command (able to see the list of available files in the result panel)
Created the Snowpipe
Configured SQS notification on top of the S3 bucket
Added one sample file, and it is not loaded automatically
Altered the Snowpipe using the following command:
alter pipe snowpipe_content refresh;
The file got added to the Snowflake target table after some time.
Can someone please help me figure out what I missed in the Snowpipe setup?
Follow the steps below to troubleshoot your Snowpipe:
Step I: Check the status of your Snowpipe:
SELECT SYSTEM$PIPE_STATUS('pipe_name');
Make sure your pipe status is RUNNING
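If the status comes back as PAUSED instead of RUNNING, you can resume the pipe; a quick sketch using the pipe name from the question:

    -- Resume a paused pipe so pending notifications are processed again.
    alter pipe snowpipe_content set pipe_execution_paused = false;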
Step II: Check the copy history for the table associated with the Snowpipe:
select *
from table(information_schema.copy_history(
    table_name => 'table_name',
    start_time => dateadd(hours, -1, current_timestamp())));
From the list, check whether your file is missing, loaded, or errored.
Step III: Validate your Snowpipe load:
select *
from table(validate_pipe_load(
    pipe_name => 'pipe_name',
    start_time => dateadd(hour, -1, current_timestamp())));
If the above steps all look good, the issue might be with your SQS notification setup.
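One thing worth verifying, sketched here with the pipe name from the question: the SQS queue ARN that Snowflake generated for the pipe must be the same ARN configured in the S3 bucket's event notification.

    -- The notification_channel column shows the SQS queue ARN that the
    -- S3 bucket event notification needs to point to.
    show pipes like 'snowpipe_content';
    desc pipe snowpipe_content;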
Follow the Snowflake article at the link below:
Snowflake KB
I want to migrate files (attachments) from an FTP server to another server (Salesforce). To do that I am going to use Talend, but I have no clue which components to use, and in which order, to download the files (multiple formats, but downloadable via an HTTP link) and insert them into the Salesforce database. I would be grateful if someone could explain how to proceed (which components to use and how to connect them).
Based on the info provided, you will first obtain the files from the remote server, then load them as BLOBs into a database.
See the diagram for a typical FTP flow. The first component is a connection to the server, which allows connection reuse. The second component is optional; it lets you get a count of the files prior to your operations (you can use it later to make sure you retrieved all the files). The third component (tFTPGet) is technically all you need: it grabs the files based on the file mask you set. The final component, tFTPDelete, cleans up the remote directory.
Once you have the files locally, see this help link for information on how to insert files as BLOBs into a database. You will have to tweak it for your Salesforce db.
I have to make a mailing list that people can subscribe to in WordPress. I've found a WP plugin that provides a form with two fields, name and email. These are saved into a CSV file which I can export with the press of a button, literally. I want to automatically export this CSV file into a database or a simple text file that keeps updating as new subscribers are added.
The plugin I'm using now is called "Mail Subscribe List".
I'm using WordPress version 4.0.
You need to create a cron job to do the export automatically at regular intervals. On Linux, schedule it with crontab; on Windows, use Task Scheduler (the 'at' command only schedules one-off jobs). Google them for details on how to set them up.
Basically, you will need to create a PHP file that does the export, then set up the cron job to run that PHP script at whatever interval you require.
I am trying to overwrite, or perform some other file operation on, files uploaded to a web server.
I previously uploaded the files from a Joomla extension, which created them with owner UID 99 (the web server user). Without changing their owner to my login user, I am unable to perform file operations using FTP or cPanel.
What can be done?
You could enable the FTP layer of Joomla.
It does depend a bit on how your hosting sets permissions (whether they use ACLs, etc.), but the FTP layer of Joomla is designed to get around exactly this issue.
Documentation for this feature is here:
http://help.joomla.org/content/view/1941/302/1/2/