Azure Logic Apps HTTP connector for ADLS is corrupting the zip file - azure-logic-apps

I am using Azure Logic Apps to get the attachments from an email (Outlook) and dump them into Azure Data Lake Gen2. I am using the HTTP connector to dump the file into ADLS.
Though I am able to dump the file into the data lake, the zip file is getting corrupted.
Previously I had Azure Data Lake Gen1 and used the ADLS Upload File action to upload the attachment; I didn't face this type of issue then.
I am not sure whether I am making a mistake or there is an issue with the HTTP connector, hence I am seeking help from the community.
I am also attaching part of the Logic Apps flow:

It is always better to use the built-in connectors in Logic Apps.
For Azure Data Lake Storage Gen2 (ADLS Gen2) accounts, you can use the Azure Blob Storage connector (recommended by Microsoft), thanks to multi-protocol access. You can read more about this feature, including its availability and known limitations, in this blog.
Known issues and limitations:
The Extract archive to folder action ignores empty files and folders in the archive; they are not extracted to the destination.
The trigger does not fire if a file is added or updated in a subfolder. If triggering on subfolders is required, multiple triggers should be created.
Logic apps can't directly access storage accounts that are behind firewalls when both are in the same region. As a workaround, you can put your logic apps and storage account in different regions. For more information about enabling access from Azure Logic Apps to storage accounts behind firewalls, see Access storage accounts behind firewalls.
For more information about this, you can visit here.
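As a rough illustration of the recommended approach, here is a minimal sketch that uploads a zip attachment to an ADLS Gen2 account through its Blob endpoint with the Azure SDK for Python. The account URL, container, and blob names are placeholders, not values from the question; the point to note is that the payload is passed as raw bytes, since writing base64-encoded text as-is is a common cause of corrupted binary files.

    # A minimal sketch, assuming the azure-storage-blob and azure-identity
    # packages; account URL, container, and blob name are placeholders.
    from azure.identity import DefaultAzureCredential
    from azure.storage.blob import BlobClient

    blob = BlobClient(
        account_url="https://<storage-account>.blob.core.windows.net",
        container_name="attachments",      # hypothetical container
        blob_name="mail/archive.zip",      # hypothetical target path
        credential=DefaultAzureCredential(),
    )

    # Pass the attachment as raw bytes; base64-encoded text written as-is
    # would produce exactly the kind of corrupted zip described above.
    with open("archive.zip", "rb") as data:
        blob.upload_blob(data, overwrite=True)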

Related

Transfer data collected from Log Analytics to Blob Storage Account using Logic Apps

I am new to Azure. I have created a custom log in Azure Log Analytics which captures all the logs placed in a file on my desktop. Now I wish to transfer these logs into a Blob Storage account using Azure Logic Apps. Can someone explain to me in detail how to proceed? A detailed explanation would be appreciated.
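As a hedged sketch of the underlying flow (in a Logic App you would typically pair the Azure Monitor Logs "Run query and list results" action with the Blob "Create blob" action), the same idea in Python looks roughly like this; the workspace ID, table name, and storage account are all hypothetical:

    # A rough sketch, assuming azure-monitor-query, azure-storage-blob and
    # azure-identity; workspace ID, table, and account are placeholders.
    from datetime import timedelta

    from azure.identity import DefaultAzureCredential
    from azure.monitor.query import LogsQueryClient
    from azure.storage.blob import BlobClient

    credential = DefaultAzureCredential()

    # Pull rows from the custom log table for the last day.
    logs = LogsQueryClient(credential)
    result = logs.query_workspace(
        workspace_id="<workspace-id>",
        query="MyCustomLog_CL | take 100",   # hypothetical custom table
        timespan=timedelta(days=1),
    )
    lines = "\n".join(
        str(row) for table in result.tables for row in table.rows
    )

    # Write the result set into a blob.
    blob = BlobClient(
        account_url="https://<storage-account>.blob.core.windows.net",
        container_name="exported-logs",
        blob_name="custom-log-export.txt",
        credential=credential,
    )
    blob.upload_blob(lines, overwrite=True)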

Saving images in Azure storage

I am building a web application where users can upload images and videos and store them in their account. I want to store these files somewhere and save only the URL in the DB.
What is the right way to do it using Azure services? Is there a dedicated server for this, or some VM?
Yes, there is a dedicated service for this purpose: Azure Blob Storage. You are highly advised to save any and all user-uploaded content to that service instead of to the local file system.
The provided link has samples for almost every language that has a client SDK provided by Microsoft.
If the platform or language you use is not directly supported by an SDK, you can always refer to the Blob Storage REST API documentation.
You will need to go through the blob service concepts to get a deeper understanding of the service and how to use it.
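To make the pattern concrete, here is a minimal sketch with the Python SDK: upload the file to a container and keep only the blob's URL for the database. The account, container, and blob names are invented for the example.

    # A minimal sketch, assuming azure-storage-blob and azure-identity;
    # account, container, and blob names are placeholders.
    from azure.identity import DefaultAzureCredential
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient(
        account_url="https://<storage-account>.blob.core.windows.net",
        credential=DefaultAzureCredential(),
    )
    container = service.get_container_client("user-uploads")

    # Upload the user's file...
    with open("photo.jpg", "rb") as data:
        blob = container.upload_blob(name="user-123/photo.jpg", data=data)

    # ...and store only this URL in the database.
    print(blob.url)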

Display onedrive filelist on my website

Here is the basic concept of what I am trying to do. My web app allows my clients to log in to a dashboard.
One of the things I want to show on their dashboard is THEIR work files, i.e. PDF files.
I store these files in OneDrive in a separate folder for each client:
Root Doc Directory
- Client A
  - File1.pdf
  - File2.pdf
- Client B
  - File1.pdf
etc.
So when Client A logs in, I want to show all the files in the ClientA folder.
The concept sounds simple, and with storage on my own server I can do this easily, but I can't find how to do it using OneDrive.
Does anyone out there have any ideas? All the info I have found about the OneDrive APIs requires users to actually log into OneDrive, which I don't want.
Basically you're using OneDrive wrong. You should be asking each user of your service to sign in with their Microsoft account and store the files in the user's OneDrive. Storing them all in your OneDrive means they can't access those files outside of your app (like by logging into OneDrive). Instead of using the Microsoft account as the security boundary for those files, you're putting all of the security requirements on your own ability to protect access to your OneDrive account. Basically, doing it the way you proposed is strongly not recommended.
You can pretty easily integrate OAuth into your website so that a user can connect your site to OneDrive and then have access to their files from OneDrive in your service.
The alternative would be to use something like Azure Blob Storage to store and retrieve these files. Then your app would just hold the set of access keys required to access storage, and you wouldn't have to deal with signing into a single OneDrive account from the service and keeping the access and refresh tokens up to date.
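For the OAuth route, a hedged sketch of what the dashboard's backend might do once a user has consented and an access token is available: list the children of that client's folder through the Microsoft Graph drive API. The folder name and token acquisition below are placeholders.

    # A rough sketch using the Microsoft Graph drive API; the access token
    # would come from the OAuth flow (e.g. via MSAL) and is a placeholder.
    import requests

    ACCESS_TOKEN = "<token-from-oauth-flow>"
    folder = "ClientA"   # hypothetical per-client folder name

    resp = requests.get(
        f"https://graph.microsoft.com/v1.0/me/drive/root:/{folder}:/children",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    )
    resp.raise_for_status()

    # Each drive item has a name and a web URL that can be shown on the
    # client's dashboard.
    for item in resp.json()["value"]:
        print(item["name"], item.get("webUrl"))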

AngularJS and Ruby on Rails - Uploading multiple files directly to Amazon S3

I'm writing an app where users can write Notes, and each note can have many files attached to it.
I would like users to be able to click 'Browse', select multiple files, which will be uploaded when the user clicks 'Save Note'.
I want these files to be uploaded directly into Amazon S3 (or some other cloud storage solution?) without going through my server, so I don't have to worry about uploads blocking my server.
What is the best way to accomplish this?
I have seen many examples of uploading directly into Amazon S3, but none of them support multiple files. Will I somehow have to do this all in JavaScript by looping through a collection of files selected with the Browse button?
Thanks!
Technically, your JavaScript residing in the browser could make RESTful HTTP calls to AWS and store data in S3, but then you would be exposing the security credentials for AWS in the script, which is not good.
I guess the only way is to process it through a web server which can securely access AWS and store the notes. Alternatively, you could just write those notes to a local disk (where the web server sits) and schedule a tool like s3cmd to automatically sync them with S3 buckets.
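A middle ground worth sketching (not from the original answer; the technique is pre-signed URLs, mentioned in the AWS answer further down): the web server keeps the AWS credentials and hands the browser a short-lived pre-signed URL per file, so the bytes still travel straight to S3 without exposing keys in JavaScript. The bucket and key names below are invented.

    # A hedged sketch with boto3: the server generates one pre-signed URL
    # per selected file; bucket and key names are placeholders.
    import boto3

    s3 = boto3.client("s3")

    def presigned_upload_url(note_id: int, filename: str) -> str:
        # The returned URL accepts an HTTP PUT for this key only, and
        # expires after an hour.
        return s3.generate_presigned_url(
            "put_object",
            Params={"Bucket": "my-notes-bucket",
                    "Key": f"notes/{note_id}/{filename}"},
            ExpiresIn=3600,
        )

    # The browser-side JavaScript would loop over the selected files and
    # PUT each one to its own URL.
    print(presigned_upload_url(42, "attachment-1.pdf"))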

iPhone App Built on Amazon Web Services

I am building an iPhone app that stores user logon credentials in an AWS DynamoDB table. In another DynamoDB table I am storing the locations of files (stored in S3) for that user. What I don't understand is how to make this secure. If I use a Token Vending Machine that gives the application an ID with access to the user DynamoDB table, isn't it possible that any user could access the entire DB and just add or delete any information they desire? They would also be able to access the entire S3 bucket using this setup. Any recommendations on how I could set this up securely and properly?
I am new to user DB management, and any links to helpful resources would be much appreciated.
Regarding S3 and permissions, you may find the answer to the following question useful:
Temporary Credentials Using AWS IAM
IAM permissions are more fine-grained than you think. You can allow or disallow specific API calls, so for example you might only allow read operations. You can also allow access to a specific resource only. On S3 this means you can limit access to a specific file or folder, but DynamoDB policies can only be set at the table level.
Personally, I wouldn't allow users direct access to DynamoDB; I'd have a web service mediating access to it. Users being able to upload directly to S3 or download straight from S3 is a good thing, though (your web service can in general hand out pre-signed URLs for that).
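As a hedged illustration of the fine-grained point, a token-vending service could issue temporary credentials scoped to a single user's S3 prefix, for example via STS federation tokens. The bucket name and prefix layout below are assumptions, not part of the original setup.

    # A rough sketch with boto3/STS; bucket name and prefix are placeholders.
    import json

    import boto3

    sts = boto3.client("sts")

    def temporary_credentials_for(user_id: str) -> dict:
        # The inline policy narrows the token to this user's own prefix;
        # anything outside users/<user_id>/ is denied by default.
        policy = {
            "Version": "2012-10-17",
            "Statement": [{
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:PutObject"],
                "Resource": f"arn:aws:s3:::my-app-bucket/users/{user_id}/*",
            }],
        }
        token = sts.get_federation_token(
            Name=user_id[:32],          # federation name is capped at 32 chars
            Policy=json.dumps(policy),
            DurationSeconds=3600,
        )
        # Contains AccessKeyId, SecretAccessKey, and SessionToken.
        return token["Credentials"]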
