Get filename from proxy response when writing to file in OSB

I want to make an OSB service that requests a file from another service (URI) and stores the received file in a specific location. I managed to make the request and get the response from the external service, and now I want to store the file with a Business Service (BS) over a JCA connector by writing the binary to the local disk (for instance).
I'm calling the BS from the proxy with a Publish action. My problem is that I don't know how to pass the filename from the response in the proxy to JCA through the Publish and the BS.
Can anyone help me? Thank you.

What I understand is that you have the file name in your proxy but are not able to pass it to JCA. Here is what you have to do:
Inside your Publish action, place a 'Transport Headers' action.
Set the direction to Outbound Request.
Set the protocol to jca.
Set jca.file.FileName and jca.file.Directory to the values that you have in the proxy.

Related

Uploading images to S3 with React and Elixir/Phoenix

I'm trying to scheme how I'm going to accomplish this and so far I have the following:
I grab a file on the front end, and on submit I send the file name and type to the back end, which generates a presigned URL. The back end sends that URL to the FE, and the FE then uploads the file.
The issue here is that when I generate the presigned URL, I want to commit the UUID filename that's going to S3 to my database via the back end. I don't know whether the front end will actually complete the upload. I can think of some janky ways to garbage collect this - but I'm wondering, is there a typically prescribed way to do this that doesn't introduce the possibility of failures the BE isn't aware of?
Yes, there's an alternate way. You can configure your bucket so that it sends an event whenever an object is created or updated. You can send this event either to an SNS topic or to AWS Lambda.
From there you can make a request to your Phoenix app's webhook, which can insert the record into the database.
The advantage is that the event only arrives once the file has actually been created.
For more info, you can read the following: https://docs.aws.amazon.com/AmazonS3/latest/dev/NotificationHowTo.html
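If you go the Lambda route, the handler could look roughly like this. This is only a minimal Python sketch: the webhook URL and the payload fields are placeholders you'd swap for your own Phoenix route.

    import json
    import urllib.request

    # Hypothetical webhook endpoint exposed by the Phoenix app -- adjust to your route.
    WEBHOOK_URL = "https://example.com/api/s3_webhook"

    def handler(event, context):
        """Invoked by the S3 ObjectCreated notification; forwards the key to the backend."""
        for record in event.get("Records", []):
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]
            payload = json.dumps({"bucket": bucket, "key": key}).encode("utf-8")
            req = urllib.request.Request(
                WEBHOOK_URL,
                data=payload,
                headers={"Content-Type": "application/json"},
            )
            # The Phoenix controller behind this route can then insert the key into the DB.
            urllib.request.urlopen(req)
        return {"statusCode": 200}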
The way I'm currently handling this is as such:
Compress the image client side.
Send the image to the backend application server.
Create a UUID on the backend.
Send the image to s3 from the backend, using the UUID as the key.
On success, put the UUID into the database.
Respond to the client with the UUID so it can display the image.
By following these steps, you don't introduce errors into your database.
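For illustration, here is roughly what steps 3-6 could look like on a Python backend using boto3. The bucket name and the db.insert_image helper are made up for the sketch; the point is just the ordering (upload first, record the key only on success).

    import uuid

    import boto3

    s3 = boto3.client("s3")
    BUCKET = "my-app-uploads"  # assumed bucket name


    def store_image(image_bytes: bytes, content_type: str, db) -> str:
        """Upload the image the client sent us, record it, and return the key."""
        key = str(uuid.uuid4())

        # Steps 3 and 4: create the UUID and push the bytes to S3 under that key.
        s3.put_object(Bucket=BUCKET, Key=key, Body=image_bytes, ContentType=content_type)

        # Step 5: only after the upload succeeded do we record the key,
        # so the database never references an object that does not exist.
        db.insert_image(key)  # placeholder for your real persistence call

        # Step 6: the client can now build the image URL from this key.
        return key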

Content of a file with the ".ehi" extension is not readable

Has anyone encountered a file that has an extension of ".ehi" before?
If I open it using WordPad, its contents are not readable strings.
What app should I use in order to open this file type?
What is an EHI file?
Config file used by HTTP Injector, an Android app used to modify requests and access blocked websites behind firewall with SSH support and proxy server; contains settings that configure HTTP Injector, which includes server login credentials and security/locking information.
Based on this information, it makes sense that the data isn't stored in plain text. The credentials are likely encrypted or otherwise stored in an obfuscated format to help prevent unauthorized access.
Source: https://fileinfo.com/extension/ehi

Azure Logic Apps: Download File from URL

I have a requirement in Logic Apps where I need to do an HTTP GET against a website URL that returns a file, and I need to download that file to Azure File Storage.
I am able to call the downloadable URL, but I'm not sure how to go about saving the file to an Azure File Storage directory.
Please let me know your inputs. Do I need to write an Azure Function, or can I get the HTTP action to do the trick and download the file?
Thanks,
SP
I suppose Logic Apps has moved on a little since you first asked this question.
The short answer is yes you can do this solely within Logic Apps.
I'm assuming you're making an HTTP request at some point and the downloadable file is being returned with a content type of application/octet-stream.
Use a 'Blob Storage' -> 'Create blob' action; the only thing I needed to do was use the binary function as the content in this action,
e.g. binary(body('HTTP'))
This caused my zip file to be created in the Azure storage account as a blob.
Ping me if you need more info.
1) You need to create a web API or Azure Function that returns the file content, as I did for a zip file.
2) You need to call that method using the HTTP connector.
3) You can use the 'Azure File Storage' connector's 'Create file' action; pass it the file name and the file content returned from your GET API URL.
If you need more help, feel free to ask.
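If you do go with point 1, here is a rough idea of what such a function could look like with the Python programming model for Azure Functions. The 'url' query parameter is just an assumption for this sketch, not part of the original answer.

    import urllib.request

    import azure.functions as func


    def main(req: func.HttpRequest) -> func.HttpResponse:
        """HTTP-triggered function: fetch the remote file and return its raw content."""
        # Assumed query parameter carrying the downloadable URL, e.g. ?url=https://...
        source_url = req.params.get("url")
        if not source_url:
            return func.HttpResponse("Missing 'url' parameter", status_code=400)

        with urllib.request.urlopen(source_url) as resp:
            data = resp.read()

        # The 'Create file' action in the Logic App can take this body as the file content.
        return func.HttpResponse(body=data, mimetype="application/octet-stream")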

Best approach to write generic azure logic app/azure functions to do FTP operations

I would like to build an FTP service using Azure Logic Apps / Azure Functions. I would like the logic app to be invoked via an HTTP request (I will expose this app as a REST API later). The FTP server details, such as directory, username, and password, will be sent in the request.
Is there a way I can have my logic app create an FTP connector dynamically based on the incoming request and then do an FTP upload or download?
You cannot create a connection at Logic Apps run time; it needs to happen at author time. If it's a pre-defined list of connections, you can create them first, then use a switch-case to branch to the connection that should be used at run time.
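If you drop down to an Azure Function (which the question also mentions) instead of a Logic Apps connector, nothing stops you from opening the FTP connection dynamically from the details sent in the request. A bare-bones Python sketch with ftplib, where all the parameter names are placeholders:

    import io
    from ftplib import FTP


    def ftp_upload(host: str, username: str, password: str,
                   directory: str, filename: str, content: bytes) -> None:
        """Connect with the credentials supplied in the request and upload one file."""
        ftp = FTP(host)           # connection details come straight from the incoming request
        ftp.login(username, password)
        ftp.cwd(directory)
        ftp.storbinary(f"STOR {filename}", io.BytesIO(content))
        ftp.quit()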

How to store mail info for a mail client

I'm working on a side project right now for an email client. I'm using a library to handle the retrieval of the messages from the server. However, I have a question on caching.
I don't want to fetch the entire list of headers every time I load the client. Ideally, what I'd like to do is cache them and then update the list with what is on the server.
What's the best way to go about this? Should I store all the header information (including the server's message ID) in a database, load the headers from that DB, and then sync up with the server as a background task?
Or is there a better way?
Look at the webmail sample of this open source project that uses local caching:
http://mailsystem.codeplex.com/
If I remember correctly, it uses a combination of locally stored RFC822 plain-text emails, with the message id as the filename, and an index file with high-level data.
The messages themselves may have been zipped to save disk space.
That's just a sample for the library, so don't expect code art there, but it's a start.
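If you prefer the database variant described in the question, a minimal Python sketch with imaplib and sqlite3 could look like this. The table layout and the idea of only fetching UIDs above the highest cached one are my assumptions for the sketch, not something taken from the codeplex sample.

    import imaplib
    import sqlite3

    # Assumed local cache: one row per message, keyed by the server-side UID.
    db = sqlite3.connect("headers.db")
    db.execute("CREATE TABLE IF NOT EXISTS headers (uid INTEGER PRIMARY KEY, raw_header TEXT)")


    def sync_headers(host: str, user: str, password: str) -> None:
        """Fetch only the headers we have not cached yet; run this as a background task."""
        imap = imaplib.IMAP4_SSL(host)
        imap.login(user, password)
        imap.select("INBOX", readonly=True)

        last_uid = db.execute("SELECT COALESCE(MAX(uid), 0) FROM headers").fetchone()[0]
        _, data = imap.uid("SEARCH", None, f"UID {last_uid + 1}:*")

        for raw_uid in data[0].split():
            uid = int(raw_uid)
            if uid <= last_uid:   # the server echoes the last known UID back; skip it
                continue
            _, msg_data = imap.uid("FETCH", str(uid), "(BODY.PEEK[HEADER])")
            raw_header = msg_data[0][1].decode("utf-8", errors="replace")
            db.execute("INSERT OR IGNORE INTO headers (uid, raw_header) VALUES (?, ?)",
                       (uid, raw_header))

        db.commit()
        imap.logout()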
