Integration from SFDC with SFTP - salesforce

My requirement is:
I have to create a CSV file or spreadsheet which contains usernames and emails. The file will be generated automatically and stored with the help of a batch script and a scheduler. I then want to send the file to a secure file system over SFTP, without using third-party software, and the whole task should run in an automated way.
Could you please suggest a feasible solution for this requirement?

You cannot use the Secure File Transfer Protocol to insert data into Salesforce. You must access an appropriate API instead. There are several clients available, including the Apex Data Loader. This article describes using its command-line interface to automate uploads. The Apex Data Loader makes connections over SSL and also requires you to provide your user's security token.
It is also possible to modify Salesforce objects from Heroku.
Both of these are Salesforce applications, not third-party.
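For illustration, here is a minimal Python sketch of the API-based route: it exports usernames and emails through the Salesforce REST API and pushes the resulting CSV over SFTP. Note it relies on the third-party simple-salesforce and paramiko packages (so it does not satisfy a strict "no third-party software" constraint, for which the Data Loader CLI is the closer fit), and all host names and credentials are hypothetical.

```python
import csv

import paramiko
from simple_salesforce import Salesforce

# Authenticate against the Salesforce API; simple-salesforce appends
# the security token to the password for you (hypothetical credentials).
sf = Salesforce(username="user@example.com",
                password="password",
                security_token="TOKEN")

# Pull the usernames and emails with a SOQL query.
records = sf.query_all("SELECT Username, Email FROM User")["records"]

# Write the result to the CSV file the scheduler expects.
with open("users.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Username", "Email"])
    for rec in records:
        writer.writerow([rec["Username"], rec["Email"]])

# Push the CSV to the (hypothetical) SFTP server.
transport = paramiko.Transport(("sftp.example.com", 22))
transport.connect(username="sftpuser", password="sftppass")
sftp = paramiko.SFTPClient.from_transport(transport)
sftp.put("users.csv", "/upload/users.csv")
sftp.close()
transport.close()
```

Scheduled via cron or Windows Task Scheduler, this covers the "automated" part of the requirement.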

Related

Is it possible to upload files to Snowflake via REST protocol directly?

Does anybody know whether it is possible to upload files to Snowflake using a REST API endpoint directly, rather than using third-party drivers like https://docs.snowflake.com/en/user-guide/dotnet-driver.html
I didn't find such information in their general API docs: https://docs.snowflake.com/en/user-guide/data-load-snowpipe-rest-apis.html. But I assume maybe this information is not publicly available. Does anybody know?
The API you're referencing is the Snowpipe REST API. It is supported and publicly documented, but I don't think it's what you want.
The Snowpipe REST API does not upload files. Instead, you invoke it to inform Snowpipe that there are new files in an external stage, ready for copying into a table. Something else needs to get the files uploaded to the external stage in S3, Azure Blob, or GCP.
As for a general-purpose REST API, one exists, but it is supported only for use by Snowflake and partner developers and is not publicly documented. The best method is to use one of the drivers or connectors (ODBC, JDBC, the .NET driver, etc.) to upload files. If that doesn't work for you, you can put the files into an external stage using whatever supported method you like for that cloud host, and then either use the Snowpipe REST API to initiate the copy into the table or just use SQL and a warehouse to do the copy.
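As a sketch of the driver route, this is roughly what the PUT-then-COPY flow looks like with the official snowflake-connector-python package; the account, credentials, stage, and table names are all hypothetical.

```python
import snowflake.connector

# Connect with the official Python connector (hypothetical account).
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    warehouse="my_wh",
    database="my_db",
    schema="public",
)
cur = conn.cursor()

# PUT uploads the local file to an internal stage through the driver.
cur.execute("PUT file:///tmp/data.csv @my_stage AUTO_COMPRESS=TRUE")

# COPY INTO then loads the staged file into the target table.
cur.execute("COPY INTO my_table FROM @my_stage FILE_FORMAT=(TYPE=CSV)")

cur.close()
conn.close()
```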

Azure Logic Apps HTTP connector for ADLS is corrupting the zip file

I am using Azure Logic Apps to get the attachments from an email (Outlook) and dump them into Azure Data Lake Gen2, using the HTTP connector to write the file to ADLS.
Though I am able to dump the file into the data lake, the zip file is getting corrupted.
Previously I had Azure Data Lake Gen1 and used the ADLS Upload File action to upload the attachment, and I didn't face this type of issue.
I am not sure whether I am making a mistake or there is an issue with the HTTP connector.
Hence I am seeking help from the community.
I am also attaching part of the Logic Apps flow:
It is always better to use the built-in connectors in Logic Apps.
For Azure Data Lake Storage Gen2 (ADLS Gen2) accounts, you can use the Azure Blob Storage connector (recommended by Microsoft), since ADLS Gen2 supports multi-protocol access. You can read more about this feature, including its availability and known limitations, in this blog.
Known issues and limitations:
The action Extract archive to folder ignores empty files and folders in the archive; they are not extracted to the destination.
The trigger does not fire if a file is added/updated in a subfolder. If it is required to trigger on subfolders, multiple triggers should be created.
Logic apps can't directly access storage accounts that are behind firewalls when both are in the same region. As a workaround, you can put your logic app and storage account in different regions. For more information about enabling access from Azure Logic Apps to storage accounts behind firewalls, see Access storage accounts behind firewalls.
For more information about this, you can visit here.
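If you want to rule the connector in or out, one way to test is to write the same bytes with the Blob API directly (the multi-protocol access mentioned above), for example with the official azure-storage-blob package; the connection string, container, and file names here are hypothetical.

```python
from azure.storage.blob import BlobServiceClient

# Connect to the ADLS Gen2 account through the Blob endpoint
# (hypothetical connection string).
service = BlobServiceClient.from_connection_string("<connection-string>")
blob = service.get_blob_client(container="attachments", blob="report.zip")

# Upload the zip as raw bytes so no text transcoding can corrupt it.
with open("report.zip", "rb") as f:
    blob.upload_blob(f, overwrite=True)
```

If a file uploaded this way opens fine, the corruption is likely happening in the HTTP connector step rather than in the storage account.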

Edit server-side file using GWT

I'm new to GWT. I need to read a text file from the server, display its content in a TextArea widget, allow the user to modify it, and save the updated file back to the server. Can anyone tell me whether this is possible and what the right way to do it would be?
It is possible.
You can create a service to access the file (read and write), and the GWT client can easily call the service methods and update the user interface (the TextArea).
Read here for more details.
You can also start right away by making a new GWT project in Eclipse and choosing to generate sample code. This generates a simple service and a simple GWT page that calls it; you can add your own methods to that service to try things out as a proof of concept.
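Conceptually, the service just exposes a read method and a save method over the file; a real GWT service would be a Java RemoteServiceServlet, but the round trip it performs looks like this minimal Python sketch (file name and port are hypothetical).

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

FILE_PATH = "notes.txt"  # hypothetical server-side file

class FileHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the current file contents to populate the TextArea.
        with open(FILE_PATH, "rb") as f:
            body = f.read()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.end_headers()
        self.wfile.write(body)

    def do_POST(self):
        # Save the edited contents posted back by the client.
        length = int(self.headers["Content-Length"])
        with open(FILE_PATH, "wb") as f:
            f.write(self.rfile.read(length))
        self.send_response(204)
        self.end_headers()

HTTPServer(("localhost", 8080), FileHandler).serve_forever()
```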
If you are using the Google App Engine server, there is no way to write a file to the server because of its sandbox restrictions.
To write a file to a server, you will need to run your own server, create a service (in Java or another server-side language), and then use one of these methods to communicate with it.
If you still want to use the GAE server (on an appspot.com domain), you can use another method to store your data, such as the Datastore or Google Cloud Storage; see this article for more information.
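For the Google Cloud Storage option, the read/write round trip could look like this sketch with the official google-cloud-storage package (bucket and object names are hypothetical, and credentials are picked up from the environment).

```python
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-app-files")
blob = bucket.blob("notes.txt")

# Save the edited text instead of writing to the local filesystem.
blob.upload_from_string("updated contents", content_type="text/plain")

# Read it back to populate the TextArea on the next load.
text = blob.download_as_text()
```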

Uploading a file from Access 2010

I have an Access 2010 front end with MySQL as the back end. So far it is working fine. I would like to upload documents, and I have decided to save them on the server rather than in the database. My first question is: how do I upload a file from the Access front end to a remote server/location?
I was thinking of perhaps storing the data in the database and using some kind of trigger or script that reads the BLOB from the database, saves it on the server, and fills the file path into another column.
Is there any easier way to upload files from an Access front end to a remote server? I am using MySQL Server as the back end.
Thank you in advance
SFTP with PuTTY
This might help you; it's a great example of using PuTTY on the Windows machine to communicate over SFTP with a Linux server from VBA: SFTP upload with VBA
You would need to install PuTTY on each Windows machine that uses Access and ensure that the appropriate rights are in place on the Linux server.
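The linked example boils down to shelling out to PuTTY's psftp with a batch file of commands; it is shown here driven from Python rather than VBA, just for consistency with the other sketches (host, password, and paths are hypothetical).

```python
import subprocess

# commands.txt would contain psftp batch commands such as:
#   put C:\exports\invoice.pdf /upload/invoice.pdf
#   quit
subprocess.run(
    ["psftp", "user@linux-server.example.com", "-pw", "secret",
     "-b", "commands.txt"],
    check=True,
)
```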
Custom Add-In
You could use .NET to create an add-in for Access that transfers the file to the server over sockets, but this would require you to write a server-side application to listen for requests. You would have complete freedom over how you implement it, at the cost of added complexity for you as the developer.
You would need to:
Create an add-in using Visual Studio (or another .NET IDE)
Add this into your Access application and use the API you've built.
Create a server-side application to listen for requests (this could be a simple Python application; see the sketch below)
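As a sketch of that listener, the "simple Python application" could be as small as a socket server that writes whatever bytes arrive to disk; the port and output file name are hypothetical, and a real implementation would add framing and authentication.

```python
import socket

# Listen for a single connection from the Access add-in.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("0.0.0.0", 9000))
server.listen(1)

conn, addr = server.accept()

# Write the incoming bytes straight to disk until the sender closes.
with open("received.bin", "wb") as f:
    while True:
        chunk = conn.recv(4096)
        if not chunk:
            break
        f.write(chunk)

conn.close()
server.close()
```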
SMTP Approach
If you want to be creative, you could email the file to your own mock SMTP server using Access's CDO functionality: Sending emails with Access
Again, you would have to create a handler application that speaks the SMTP protocol, but I'm sure there are some great examples out there.
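As a rough sketch of that flow, this is what mailing the file to a handler looks like (shown with Python's smtplib rather than CDO; server name, addresses, and file names are hypothetical).

```python
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "access-frontend@example.com"
msg["To"] = "file-drop@example.com"
msg["Subject"] = "invoice.pdf"

# Attach the file to be stored by the receiving handler.
with open("invoice.pdf", "rb") as f:
    msg.add_attachment(f.read(), maintype="application",
                       subtype="pdf", filename="invoice.pdf")

# Hand the message to the mock SMTP server (hypothetical host).
with smtplib.SMTP("mock-smtp.example.com", 25) as smtp:
    smtp.send_message(msg)
```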
HTTP Approach
You could even encode the file and send it over HTTP to a simple PHP server in a plain POST request: Example web request with Access. You would need to encode the file to base64 or similar, or find another way of handling the file upload.
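The client side of that POST could look like this sketch, with Python and the third-party requests package standing in for the Access web request; the URL and field names are hypothetical, and the receiving script must base64-decode the payload again.

```python
import base64

import requests

# Encode the file so it survives transport as a plain form field.
with open("invoice.pdf", "rb") as f:
    payload = base64.b64encode(f.read()).decode("ascii")

# POST it to the (hypothetical) server-side upload script.
resp = requests.post("https://example.com/upload.php",
                     data={"filename": "invoice.pdf", "content": payload})
resp.raise_for_status()
```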
Conclusion
As you can see, the easiest approach by far is using PuTTY, but there are some interesting custom approaches you could take.
I'd say either SMTP or HTTP would be suitable, but that depends on how easily you could set up the server-side handler. There may be existing SMTP emulators out there that you could use to handle receiving and managing files.
This might help someone.
I have used the Chilkat FTP ActiveX component and it's working fine. Chilkat provides prewritten code that I just copied from their website, and everything works. However, I could not find out how to show the transfer progress.
Regards,
krish

Is there a way to access an external database using the force.com platform?

My organization wants to be able to regularly read in data from an external web service that provides an ODBC interface, and update our Salesforce data with that information. I've been hunting around Salesforce's documentation, and it seems like there's no way to do this except by using the Apex Data Loader's batch functionality. Unfortunately, this means that my organization would have to maintain a local computer to run the Data Loader nightly, which we're trying to avoid.
What we'd like to do is create an Apex Schedulable class or something similar and run code on the Salesforce platform itself that accesses the ODBC interface of our external data source. Is it possible to do this?
There's no support for making outbound ODBC connections from Salesforce. If the external service has an HTTP-based API, you can use the HTTP client in Apex to make the API calls and get the data.
For outbound access, as mentioned, you'd have to wrap your database in a web service. You could also load the data in using Data Loader/Talend/Informatica/etc.
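As a sketch of that wrapping step, a thin HTTP layer over the ODBC source would give Apex something it can call out to. This uses the third-party Flask and pyodbc packages, and the DSN, query, and route are hypothetical.

```python
import pyodbc
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/users")
def users():
    # Query the external ODBC source (hypothetical DSN and schema).
    conn = pyodbc.connect("DSN=ExternalSource")
    rows = conn.cursor().execute(
        "SELECT username, email FROM users").fetchall()
    conn.close()
    # Return JSON that an Apex HTTP callout can parse.
    return jsonify([{"username": r[0], "email": r[1]} for r in rows])

if __name__ == "__main__":
    app.run(port=8000)
```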
