I am facing an error while loading data via Snowpipe into Snowflake

Creating a pipe with auto_ingest=true fails with the following error:
Pipe Notifications bind failure: "Cross cloud integration is not supported for pipe creation in AZURE using a stage in AWS."
I couldn't find any possible solutions to try.

Snowflake Snowpipe on Azure can only ingest from Azure cloud storage services. See Snowpipe Supported Cloud Storage Services for details.
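Since the error says the pipe is being created in Azure against an AWS stage, the fix (for an account hosted on Azure) is to recreate the stage on Azure storage so the pipe and the stage sit in the same cloud. A minimal sketch using the Python connector, with placeholder names for the account, integrations, container, and target table:

import snowflake.connector

# Placeholder connection details; the account here is assumed to be hosted on Azure.
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",
    warehouse="my_wh", database="my_db", schema="my_schema",
)
cur = conn.cursor()

# Stage backed by Azure Blob Storage (not S3), via an Azure storage integration.
cur.execute("""
    CREATE OR REPLACE STAGE my_azure_stage
      URL = 'azure://myaccount.blob.core.windows.net/mycontainer/path/'
      STORAGE_INTEGRATION = my_azure_storage_int
""")

# On Azure, an auto-ingest pipe also needs a notification integration (Event Grid).
cur.execute("""
    CREATE OR REPLACE PIPE my_pipe
      AUTO_INGEST = TRUE
      INTEGRATION = 'MY_NOTIFICATION_INT'
      AS COPY INTO my_table FROM @my_azure_stage FILE_FORMAT = (TYPE = 'CSV')
""")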

Related

Integrating Datadog RUM data into Snowflake

My team is trying to integrate Datadog's RUM data into Snowflake for our data scientists to consume. Is this possible? If so, how?
So far I have found documentation on how to integrate data from Snowflake into the Datadog dashboard, but not the other way around.
There are a number of options:
Use an ETL tool that can connect to both Snowflake and Datadog
Bulk load: export the data to an S3 (or similar) file and use the Snowflake COPY INTO command
Streaming: stream the data out of Datadog and then into Snowflake using Snowpipe
Poll the RUM events API with an application you develop yourself (see the sketch after this list).
https://docs.datadoghq.com/api/latest/rum/
Write microbatches to your target tables using one of the language connectors, or the Spark connector.
https://docs.snowflake.com/en/user-guide/spark-connector-use.html
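A rough sketch combining the last two options: poll the RUM events API and land each page in Snowflake as a microbatch. The table rum_events_raw, its payload column, and all credentials are hypothetical, and the pagination fields follow the documented v2 cursor pattern:

import json
import requests
import snowflake.connector

# Placeholder Datadog credentials.
DD_HEADERS = {"DD-API-KEY": "...", "DD-APPLICATION-KEY": "..."}

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",
    warehouse="my_wh", database="my_db", schema="my_schema",
)
cur = conn.cursor()

cursor_token = None
while True:
    params = {"page[limit]": 100}
    if cursor_token:
        params["page[cursor]"] = cursor_token
    resp = requests.get("https://api.datadoghq.com/api/v2/rum/events",
                        headers=DD_HEADERS, params=params)
    resp.raise_for_status()
    body = resp.json()
    events = body.get("data", [])
    if not events:
        break
    # Land the raw JSON in a staging table; transform it downstream in Snowflake.
    cur.executemany("INSERT INTO rum_events_raw (payload) VALUES (%s)",
                    [(json.dumps(e),) for e in events])
    cursor_token = body.get("meta", {}).get("page", {}).get("after")
    if not cursor_token:
        break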

Where does Snowflake store data like metadata and table data?

Where does Snowflake store data such as metadata, table data, and everything else? Does it use the public cloud that we configured while creating the Snowflake account, and if yes, where under that cloud does it keep it? If not, which cloud provider does it use for storage?
Each Snowflake deployment has its own metadata servers. You can find more information on what is used for storing metadata here:
https://www.snowflake.com/blog/how-foundationdb-powers-snowflake-metadata-forward/
Based on the additional questions:
The data (micro-partitions) is stored in the object storage service of the same cloud provider (i.e. S3 for AWS, etc.).
Yes, all the data and metadata are stored in the cloud itself where the account is deployed.
Yes, it's deployed on the cloud service linked to the account.
Snowflake consists of the following three layers: Database Storage, Query Processing, and Cloud Services.
https://docs.snowflake.com/en/user-guide/intro-key-concepts.html
Metadata is managed in the Cloud Services layer and is clearly separated from database storage.
Snowflake's core storage concept is the immutable micro-partition. When something needs to be updated, Snowflake doesn't overwrite the original micro-partitions; it writes new copies and the Cloud Services layer updates the metadata to reference them.

Data copy from Salesforce to Salesforce using Azure Data Factory

Can we use the Data Flow activity in ADF for copying data from Salesforce to Salesforce?
According to the documentation, yes, you can, as long as you use the proper Linked Service.
Specifically, this Salesforce connector supports:
Salesforce Developer, Professional, Enterprise, or Unlimited editions.
Copying data from and to Salesforce production, sandbox, and custom domain.
https://learn.microsoft.com/en-us/azure/data-factory/connector-salesforce?tabs=data-factory
Currently, Data Flow activity does not support copying data to or from Salesforce.
You can refer to these Microsoft documents to check the supported datasets in source and sink transformations in mapping data flows.
You can also raise a feature request from the ADF portal.
Alternatively, you can use copy activity to copy data to or from Salesforce.
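For reference, the shape of a Copy activity that moves data between two Salesforce datasets looks roughly like the following sketch (expressed here as a Python dict mirroring the pipeline JSON). The dataset names, query, and external ID field are placeholders, and both datasets are assumed to point at existing Salesforce linked services:

# Hypothetical Copy activity definition; only the structure is the point here.
copy_activity = {
    "name": "CopySalesforceToSalesforce",
    "type": "Copy",
    "inputs": [{"referenceName": "SalesforceSourceDataset", "type": "DatasetReference"}],
    "outputs": [{"referenceName": "SalesforceSinkDataset", "type": "DatasetReference"}],
    "typeProperties": {
        "source": {"type": "SalesforceSource", "query": "SELECT Id, Name FROM Account"},
        "sink": {"type": "SalesforceSink",
                 "writeBehavior": "Upsert",
                 "externalIdFieldName": "External_Id__c"},
    },
}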

Can Snowflake stage files standalone, without the help of any cloud or local machine?

For staging in Snowflake, we need an AWS S3 layer, Azure, or a local machine. Instead of this, can we FTP a file from a source team directly to Snowflake internal storage, so that from there Snowpipe can pick up the file and load it into our Snowflake table?
If yes, please tell me how. If no, please confirm that as well. If no, isn't it a big drawback of Snowflake to depend on other platforms every time?
You can use just about any driver from Snowflake to move files to an internal stage on Snowflake: ODBC, JDBC, Python, SnowSQL, etc. FTP isn't a very common protocol in the cloud, though. Snowflake has a lot of customers without any presence on AWS, Azure, or GCP that are using Snowflake without issues in this manner.
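For example, a minimal sketch with the Python connector (file path, table, and connection details are placeholders): PUT copies the local file into the table's internal stage over the Snowflake driver, and a COPY INTO (or a Snowpipe on that stage) then loads it:

import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",
    warehouse="my_wh", database="my_db", schema="my_schema",
)
cur = conn.cursor()

# Upload the local file to the table stage of ORDERS; no S3/Azure/GCS bucket involved.
cur.execute("PUT file:///tmp/orders.csv @%orders AUTO_COMPRESS=TRUE")

# Load whatever is sitting in the table stage into the table.
cur.execute("COPY INTO orders FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)")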

Using Amazon RDS from Google App Engine, please help!

I have some queries.
By default, Google App Engine (a cloud-based deployment platform) does not support MySQL or any database, for that matter. So we thought of using Amazon RDS as an option (since it is in the cloud). After reading the documentation, I understood that Amazon exposes web services and provides APIs for basic operations like creating a DB instance, but I am not sure whether it provides APIs for CRUD operations, so that I can programmatically configure Amazon RDS and perform CRUD operations on it. Please answer.
Can I write a web service, similar to Amazon's web services, to perform CRUD operations on Amazon RDS? Is it feasible? Please answer my questions ASAP.
Amazon RDS exposes MySQL databases using the standard MySQL protocol. App Engine can only make outgoing connections over HTTP, so it won't be possible to connect directly to RDS from App engine. You certainly could write a web service such as you describe, but you'd need to run it on a separate server (such as an EC2 instance), and you'd need to write your own interface for accessing the database on the client end, separate from the MySQL libraries.
Note that we're planning to introduce support for relational databases in the future with App Engine for Business.
I have never used them, but RdbHost was built (so it seems) for this reason. You can make your SQL calls over HTTPS. This will be slower, though.
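A rough sketch of the separate-server idea from the first answer: a tiny Flask service running next to RDS (e.g. on an EC2 instance) exposes a couple of CRUD endpoints over HTTP that App Engine can call. The table, endpoint, and credentials below are all hypothetical:

from flask import Flask, jsonify, request
import pymysql

app = Flask(__name__)

def get_conn():
    # Placeholder RDS endpoint and credentials.
    return pymysql.connect(host="mydb.example.us-east-1.rds.amazonaws.com",
                           user="app", password="...", database="appdb",
                           cursorclass=pymysql.cursors.DictCursor)

@app.route("/items", methods=["GET"])
def list_items():
    conn = get_conn()
    try:
        with conn.cursor() as cur:
            cur.execute("SELECT id, name FROM items")
            return jsonify(cur.fetchall())
    finally:
        conn.close()

@app.route("/items", methods=["POST"])
def create_item():
    payload = request.get_json()
    conn = get_conn()
    try:
        with conn.cursor() as cur:
            cur.execute("INSERT INTO items (name) VALUES (%s)", (payload["name"],))
            new_id = cur.lastrowid
        conn.commit()
        return jsonify({"id": new_id}), 201
    finally:
        conn.close()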
