Is it possible to upload files to Snowflake via REST protocol directly?

Does anybody know whether it is possible to upload files to Snowflake using a REST API endpoint directly, without using 3rd-party drivers like https://docs.snowflake.com/en/user-guide/dotnet-driver.html?
I didn't find such information in their general API docs: https://docs.snowflake.com/en/user-guide/data-load-snowpipe-rest-apis.html But I assume this information may simply not be publicly available. Does anybody know?

The API you're referencing is the Snowpipe REST API. This API is supported and publicly documented, but I don't think it's what you want.
The Snowpipe REST API does not upload files. Instead, you invoke it to inform Snowpipe that there are new files in an external stage ready for copying into a table. Something else needs to get the files uploaded to the external stage in S3, Azure Blob Storage, or Google Cloud Storage.
As for a general-purpose REST API, one exists, but it's supported only for use by Snowflake and partner developers and is not publicly documented. The best method is to use one of the drivers or connectors (ODBC, JDBC, the .NET driver, etc.) to upload files. If that doesn't work for you, you can put the files into an external stage using whatever supported method you like for that cloud host. You can then use the Snowpipe REST API to initiate the copy into the table, or just use SQL and a warehouse to do the copy into the table.
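Here's a minimal sketch of that second approach, assuming an S3-backed external stage and Snowflake key-pair authentication; the account identifier, pipe name, bucket, and JWT below are all placeholders:

```python
# Sketch: upload a file to the external stage yourself (S3 via boto3),
# then call the documented Snowpipe REST endpoint so Snowpipe runs the
# COPY into the target table. All identifiers are placeholders.
import boto3
import requests

ACCOUNT = "myorg-myaccount"        # placeholder account identifier
PIPE = "MYDB.MYSCHEMA.MYPIPE"      # fully qualified pipe name
JWT_TOKEN = "<key-pair JWT>"       # generated per Snowflake key-pair auth docs

# 1) Put the file into the S3 bucket backing the pipe's external stage.
boto3.client("s3").upload_file("orders.csv", "my-stage-bucket", "orders.csv")

# 2) Tell Snowpipe the file is there.
resp = requests.post(
    f"https://{ACCOUNT}.snowflakecomputing.com/v1/data/pipes/{PIPE}/insertFiles",
    json={"files": [{"path": "orders.csv"}]},
    headers={"Authorization": f"Bearer {JWT_TOKEN}",
             "Accept": "application/json"},
)
resp.raise_for_status()
print(resp.json())  # request id and load status
```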

Related

Connecting API to database

I am using this API to build an app (Xcode), and the maximum number of calls per day is 5,000. The way I have currently built the app for testing purposes is to call the API every time the user refreshes the data, so I am running out of calls per day. I was wondering how to connect an API to a database like Firebase, then update the data in the database maybe four times a day at specific times. When the user refreshes, they would pull data from the database instead. I'm new to programming and am not sure if this is the best solution, so I would appreciate it if anyone could direct me to more resources. Thanks!
This is the api I am using: https://projects.propublica.org/api-docs/congress-api/?
Edit: Would something like this also mean I would need to build a REST API? https://github.com/unitedstates/congress It is a repository that includes data-importing scripts and scrapers. I'm guessing this isn't compatible with Swift, but is it compatible with building a REST API in AWS or Firebase?
You can use AWS (Amazon Web Services). Their free tier includes many of their services for free (for 12 months, within usage limits), including the ones I would recommend for this project:
Make an AWS account.
Use S3 storage buckets to host a datafile.
Use API Gateway to make an API.
Use Lambda to run a Python/JavaScript function in the cloud which connects the API with the S3 bucket (your data).
Use IAM to create roles and permissions for the S3 bucket, API and Lambda scripts to communicate.
Here's how you set up the API: https://www.youtube.com/watch?v=uFsaiEhr1zs
Here's how you read the S3 bucket: https://www.youtube.com/watch?v=6LvtSmJhVRE
You can also work with these tools to set up an API that PUTs data to the S3 bucket and updates the data regularly; a sketch of the Lambda side is below.
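As a minimal sketch of the read path, a Lambda handler behind API Gateway could serve the cached file instead of hitting the rate-limited upstream API. The bucket and key names are hypothetical, and the file is assumed to be refreshed by a separate scheduled job:

```python
# Sketch: Lambda handler that serves a cached data file from S3.
# Bucket and key names are placeholders.
import boto3

s3 = boto3.client("s3")
BUCKET = "my-congress-cache"   # hypothetical bucket
KEY = "congress/latest.json"   # file refreshed by a scheduled job

def lambda_handler(event, context):
    obj = s3.get_object(Bucket=BUCKET, Key=KEY)
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": obj["Body"].read().decode("utf-8"),
    }
```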

Using a large API via custom connectors when the API's definition exceeds the 1 MB file size limit

The company I work for uses a product named Pension Pro for workflow management. They offer an API and we would like to look into using it for automating tasks and connecting multiple systems. We want to do this through Azure Logic Apps and/or Microsoft Flow.
Pension Pro provides documentation for their API on their website here: https://api.pensionpro.com/#/
In theory, I should be able to save the API documentation from their website and import it into Azure. The issue is that, once saved to a text file, the API definition exceeds the 1 MB file limit for importing an OpenAPI file.
I tried copy/pasting the JSON data into the swagger editor inside of Azure and ran into issues when saving the connector as well.
What would a solution to this problem be?
Another approach would be to use the Azure API Management Built-In Connector.
You can import the OpenAPI spec into APIM and then use the built-in connector to call the operations directly from Logic Apps.
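If you'd rather stay with a custom connector, one workaround (not from the answer above, just a common approach) is to trim the OpenAPI definition down to only the operations you actually call before importing it. A rough sketch, where the file names and kept paths are hypothetical:

```python
# Sketch: shrink an OpenAPI (Swagger) definition under the 1 MB
# custom-connector import limit by keeping only the paths you need.
# Unused schema definitions may also need pruning; not shown here.
import json

KEEP_PATHS = ["/Plans", "/Tasks"]   # hypothetical operations you actually use

with open("pensionpro-full.json") as f:
    spec = json.load(f)

spec["paths"] = {p: v for p, v in spec["paths"].items() if p in KEEP_PATHS}

with open("pensionpro-trimmed.json", "w") as f:
    json.dump(spec, f, separators=(",", ":"))  # compact JSON saves space
```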

Saving images in Azure storage

I am building a web application where users can upload images & videos and store them in their account. I want to store these files somewhere and save only the URL in the DB.
What is the right way to do it using Azure services? Is there a dedicated server for this, or some VM?
Yes, there is a dedicated service for this purpose: Azure Blob Storage. You are highly advised to save any and all user-uploaded content to that service instead of to the local file system.
The Blob Storage documentation has samples for almost every language that has a client SDK provided by Microsoft.
If, in the end, you use a platform or language that is not directly supported by an SDK, you can always refer to the Blob Storage REST API documentation.
You will need to go through the blob service concepts to get a deeper understanding of the service and how to use it.
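For instance, with the Python SDK (azure-storage-blob), uploading a file and recording its URL might look like this sketch; the connection string, container, and blob names are placeholders:

```python
# Sketch: upload a user file to Azure Blob Storage and keep only the
# resulting URL in your database. All names are placeholders.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")
container = service.get_container_client("user-uploads")

with open("photo.jpg", "rb") as data:
    blob_client = container.upload_blob(name="user123/photo.jpg", data=data)

print(blob_client.url)  # store this URL in the DB, not the file itself
```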

Integration from SFDC with SFTP

My requirement is:
I have to create a CSV file or spreadsheet which contains usernames and emails. The file will be automatically generated and stored with the help of a batch script and a scheduler. I want to send the file to a secure file system (SFTP) without using third-party software. This task should run in an automated way.
Could you please tell me a feasible solution for this requirement?
You cannot use Secure File Transfer Protocol to insert data into Salesforce. You must access an appropriate API. There are several clients available, including the Apex Data Loader. This article describes using the command-line interface to automate uploads. The Apex Data Loader makes connections over SSL and also requires you to provide your user's security token.
It is also possible to modify Salesforce objects from Heroku.
Both of these are Salesforce applications, not 3rd party.
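As an alternative to the Data Loader CLI (not mentioned in the answer above), the same upload can be scripted against the Salesforce REST API directly. A hedged sketch, where the instance URL, API version, token, and CSV columns are all assumptions:

```python
# Sketch: insert Contacts from a CSV via the Salesforce REST API.
# Obtaining the OAuth access token (e.g. via a connected app) is not shown.
import csv
import requests

INSTANCE = "https://yourInstance.my.salesforce.com"  # placeholder
TOKEN = "<OAuth access token>"                       # placeholder

with open("users.csv", newline="") as f:
    for row in csv.DictReader(f):  # assumes Username,Email header columns
        resp = requests.post(
            f"{INSTANCE}/services/data/v58.0/sobjects/Contact/",
            json={"LastName": row["Username"], "Email": row["Email"]},
            headers={"Authorization": f"Bearer {TOKEN}"},
        )
        resp.raise_for_status()
```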

Is there a way to access an external database using the force.com platform?

My organization wants to be able to regularly read in data from an external web service which provides an ODBC interface, and update our salesforce data with that information. I've been hunting around Salesforce's documentation, and it seems like there's no way to do this except by using the Apex Data Loader's batch functionality. Unfortunately, this means that my organization would have to maintain a local computer to run the data loader nightly, which we're trying to avoid doing.
What we'd like to do is create an Apex Schedulable class or something similar and run code that can access the ODBC interface from our external data source on the salesforce platform itself. Is it possible to do this?
There's no support for making outbound ODBC connections from Salesforce. If the external service has an HTTP-based API, you can use the HTTP client in Apex to make the API calls and get the data.
For outbound access, as mentioned, you'd have to wrap your database in a web service (a sketch of such a wrapper is below). For loading data in, you could use Data Loader/Talend/Informatica/etc.
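A minimal sketch of such a wrapper, exposing the external database over HTTP so Apex callouts can reach it; Flask, pyodbc, the DSN, table, and route are all assumptions:

```python
# Sketch: a tiny HTTP wrapper around the external ODBC data source,
# so Salesforce can fetch rows with an Apex callout instead of ODBC.
from flask import Flask, jsonify
import pyodbc

app = Flask(__name__)

@app.route("/customers")
def customers():
    conn = pyodbc.connect("DSN=ExternalSource")  # placeholder DSN
    rows = conn.cursor().execute("SELECT id, name FROM customers").fetchall()
    return jsonify([{"id": r.id, "name": r.name} for r in rows])

if __name__ == "__main__":
    app.run()  # in practice, serve over HTTPS with authentication
```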
