So currently I am doing a synchronous call to MuleSoft which returns the raw image (no encoding is done) and then storing the image in a Document. So whenever we get bigger images, more than 6 MB, it hits the governor limit for max size. So I wanted to know: is there a way to get a reduced or compressed image?
I have no idea if Mule has anything to preprocess images, compress...
In Apex you could try to make the operation asynchronous to benefit from the 12 MB limit. But there will be no UI element for it anymore; your component / user would have to periodically check if the file got saved, or something.
You could always change the direction: make Mule push to Salesforce over the standard API instead of Apex code pulling from Mule. From what I remember, the standard Files API is good for files up to 2 GB.
Maybe send some notification to Mule that you want file XYZ attached to account 123; Mule would insert the ContentVersion and ContentDocumentLink (a rough sketch of those calls is below)? And have Apex periodically check.
And when the file is no longer needed - a nightly job to delete files created by "Mr. Mule" over a week ago?
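For what it's worth, here is a rough sketch of what that push could look like against the standard Salesforce REST API, written in TypeScript / Node purely for illustration: create a ContentVersion from the (base64-encoded) image, look up its ContentDocumentId, then link it to the record with a ContentDocumentLink. The instance URL, token, API version and function name are placeholders; Mule would do the equivalent with its own Salesforce connector.

```typescript
const INSTANCE = "https://yourInstance.my.salesforce.com"; // placeholder
const API = `${INSTANCE}/services/data/v58.0`;
const headers = {
  Authorization: `Bearer ${process.env.SF_ACCESS_TOKEN}`, // placeholder token
  "Content-Type": "application/json",
};

export async function attachImage(accountId: string, fileName: string, base64Image: string) {
  // 1. Insert the file itself as a ContentVersion (VersionData is base64).
  const cvRes = await fetch(`${API}/sobjects/ContentVersion`, {
    method: "POST",
    headers,
    body: JSON.stringify({ Title: fileName, PathOnClient: fileName, VersionData: base64Image }),
  });
  const { id: contentVersionId } = await cvRes.json();

  // 2. Find the ContentDocumentId that was generated for that version.
  const q = encodeURIComponent(
    `SELECT ContentDocumentId FROM ContentVersion WHERE Id = '${contentVersionId}'`
  );
  const queryRes = await fetch(`${API}/query?q=${q}`, { headers });
  const contentDocumentId = (await queryRes.json()).records[0].ContentDocumentId;

  // 3. Link the document to the account so it shows up on the record.
  await fetch(`${API}/sobjects/ContentDocumentLink`, {
    method: "POST",
    headers,
    body: JSON.stringify({
      ContentDocumentId: contentDocumentId,
      LinkedEntityId: accountId,
      ShareType: "V", // viewer access
    }),
  });
}
```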
Related
I'm working on a WordPress project that takes entries from a 3rd-party API and adds them as individual posts on the site, with the data saved as metadata. Up until now it was going well, but with the number of entries on the API increasing I'm starting to run into issues with the server timing out while making the API request.
What I have done now is to write the response from the API, that is, all the entries in JSON format, to a file and then create the posts on the site from that file. But I'm still running into timeout issues.
That brings me to my question.
I want to break the data (from the JSON file) up into smaller, manageable requests to the server, only processing, let's say, 20 entries at a time out of the 1000+. But I need to find out how to select the entry number in the JSON file; for example, after processing the first 20 entries I want the function to go back to the JSON file but this time start at the 21st entry, if at all possible. Or is there a better method of programmatically creating posts from a JSON file with a large number of entries?
// Just some more info.
I'm using wp_remote_post() with blocking set to false to run the function that creates the posts in batches in the background.
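To make the offset idea in the question concrete, here is a minimal sketch (in TypeScript only because the thread has no code; the file name and the createPost helper are made up): read the saved JSON file, slice out 20 entries starting at a given offset, process them, and report the offset the next batch should start from.

```typescript
import { readFileSync } from "fs";

const BATCH_SIZE = 20;

// Process one batch and return the offset the next request should start from,
// or null when every entry has been handled.
function processBatch(offset: number): number | null {
  const entries: unknown[] = JSON.parse(readFileSync("entries.json", "utf8")); // assumed file name
  const batch = entries.slice(offset, offset + BATCH_SIZE); // e.g. offset 20 -> entries 21..40
  for (const entry of batch) {
    createPost(entry); // hypothetical stand-in for "insert one post plus its metadata"
  }
  const next = offset + BATCH_SIZE;
  return next < entries.length ? next : null;
}

function createPost(entry: unknown): void {
  // insert the post and save the entry data as metadata here
}

// In the WordPress setup, each background request fired via wp_remote_post() would
// carry the offset it should start from and then schedule the next batch:
let offset: number | null = 0;
while (offset !== null) {
  offset = processBatch(offset);
}
```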
I have an Azure Logic App that monitors my emails and, when a target email is found, drops the attachment into Blob Storage. The plan is a Consumption plan.
The issue is, sometimes it takes up to 50 minutes for the email to be grabbed and dropped. I know there is a startup time when things go idle, but I was reading seconds/minutes, not close to an hour. Does anyone know how I can troubleshoot this?
sometimes it takes up to 50 minutes to grab and drop the email
Based on this doc, the reason for the delay is:
When the trigger encounters a new file, it will try to ensure that the new file is completely written. For instance, it is possible that the file is being written or modified, and updates are being made at the time the trigger polled the file server. To avoid returning a file with partial content, the trigger will take note of the timestamp of files which were modified recently, but will not immediately return those files. Those files will be returned only when the trigger polls again. Sometimes, this may lead to a delay of up to twice the trigger polling interval. This also means that the trigger does not guarantee to return all files in a single run when the "Split On" option is disabled.
For more information you can refer to these:
Automate tasks to process emails by using Azure Logic Apps | MS DOC
How to Send an Email with one or more attachments after getting the content from Blob storage? | SO Thread, and Logic app created with add email attachments in Blob storage.
We have a few Node.js servers where the details and payload of each request need to be logged to SQL Server for reporting and other business analytics.
The amount of requests and the similarity of needs between servers has me wanting to approach this with a centralized logging service. My first instinct is to use something like Amazon SQS and let it act as a buffer, either in front of SQL Server directly or in front of a small logging server which would make database calls as directed by SQS.
Does this sound like a good use for SQS or am I missing a widely used tool for this task?
The solution will really depend on how much data you're working with, as each service has limitations. To name a few:
SQS
First off, since you're dealing with logs, you don't want duplication. With this in mind you'll need a FIFO (first in, first out) queue.
SQS by itself doesn't really invoke anything. What you'll want to do here is set up the queue, then make a call to submit a message via the AWS JS SDK. Then when you get the message back in your callback, get the message ID and pass that data to an invoked Lambda function (you can write those in NodeJS as well) which stores the info you need in your database. (A rough sketch of the producer side is at the end of this section.)
That said it's important to know that messages in an SQS queue have a size limit:
The minimum message size is 1 byte (1 character). The maximum is 262,144 bytes (256 KB).
To send messages larger than 256 KB, you can use the Amazon SQS Extended Client Library for Java. This library allows you to send an Amazon SQS message that contains a reference to a message payload in Amazon S3. The maximum payload size is 2 GB.
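As a rough illustration of the producer side using the AWS SDK for JavaScript v3 (the queue URL, region and function name are assumptions, and each message needs to stay under the 256 KB limit quoted above):

```typescript
import { SQSClient, SendMessageCommand } from "@aws-sdk/client-sqs";

const sqs = new SQSClient({ region: "us-east-1" }); // region is a placeholder

// Send one request-log entry to a FIFO queue; requestId doubles as the deduplication key.
export async function enqueueLog(requestId: string, entry: { route: string; payload: unknown }) {
  await sqs.send(
    new SendMessageCommand({
      QueueUrl: process.env.LOG_QUEUE_URL, // e.g. https://sqs.us-east-1.amazonaws.com/.../request-logs.fifo
      MessageBody: JSON.stringify(entry),
      MessageGroupId: "request-logs",    // required for FIFO queues
      MessageDeduplicationId: requestId, // or enable content-based deduplication on the queue
    })
  );
}
```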
CloudWatch Logs
(not to be confused with the high-level CloudWatch service itself, which is more about sending metrics)
The idea here is that you submit event data to CloudWatch Logs.
It also has a limit here:
Event size: 256 KB (maximum). This limit cannot be changed.
Unlike SQS, CloudWatch Logs can be automated to pass log data to Lambda, which can then write it to your SQL Server. The AWS docs explain how to set that up.
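A minimal sketch of what such a Lambda receives: a subscription filter delivers the log batch as base64-encoded, gzip-compressed JSON under event.awslogs.data. The SQL Server write itself is left out here.

```typescript
import { gunzipSync } from "zlib";

export const handler = async (event: { awslogs: { data: string } }) => {
  // Decode and decompress the subscription payload.
  const payload = JSON.parse(
    gunzipSync(Buffer.from(event.awslogs.data, "base64")).toString("utf8")
  );
  for (const logEvent of payload.logEvents) {
    // logEvent.message holds the original log line; parse it and insert a row into
    // SQL Server with whatever client you use (e.g. the "mssql" package) -- omitted here.
    console.log(logEvent.timestamp, logEvent.message);
  }
};
```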
S3
Simply set up a bucket and have your servers write data out to it. The nice thing here is that since S3 is meant for storing large files, you really don't have to worry about the previously mentioned size limitations. S3 buckets also have events which can trigger Lambda functions. Then you can happily go on your way sending out log data.
If your log data gets big enough, you can scale out to something like AWS Batch, which gets you a cluster of containers that can be used to process log data. Finally, you also get a data backup: if your DB goes down, you've got the log data stored in S3 and can throw together a script to load everything back up. You can also use Lifecycle Policies to migrate old data to lower-cost storage, or remove it altogether.
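As a sketch, an S3-triggered Lambda along these lines could read each uploaded log object when the bucket's "ObjectCreated" notification fires (the database insert is again omitted, and the event shape shown is the standard S3 notification format):

```typescript
import { S3Client, GetObjectCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({});

export const handler = async (event: { Records?: any[] }) => {
  for (const record of event.Records ?? []) {
    const bucket = record.s3.bucket.name;
    // Keys arrive URL-encoded in the notification (spaces become "+").
    const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, " "));
    const obj = await s3.send(new GetObjectCommand({ Bucket: bucket, Key: key }));
    const body = await obj.Body?.transformToString();
    // Parse the log file and insert rows into SQL Server here (client omitted).
    console.log(`Fetched ${key} from ${bucket}, ${body?.length ?? 0} bytes`);
  }
};
```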
How can I upload, parse and download Excel files in Google App Engine when the work takes more than 30 seconds? I use Java POI and backend tasks, but once the backend has done the job I cannot notify the client, and I cannot download the Excel file that is created by the backend task... Any suggestions would be much appreciated.
The best approach here is not to fight HTTP and a web service architecture but rather to work with it.
Introduce the notion of a job id. When your client uploads a file, immediately return a token that represents that job. Extra credit: include an estimated duration of the job. For starters, let's say it's 2 minutes.
The client is then responsible for querying the server for the state of that job id using the token. The server either returns the answer, or it returns the token back with an updated ETA.
For starters, you could just always tell the client to check back in 2 minutes (or whatever constant makes most sense for your workload). As your server processing becomes smarter, you could give more accurate estimates, and decrease the busy-waiting the client does.
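To make the pattern concrete, here is a minimal sketch with an in-memory job map, written in TypeScript/Express purely for illustration (the thread's actual stack is Java on App Engine, and every endpoint and name below is made up):

```typescript
import express from "express";
import { randomUUID } from "crypto";

type Job = { status: "processing" | "done" | "failed"; etaSeconds: number; resultUrl?: string };
const jobs = new Map<string, Job>();
const app = express();

// Upload endpoint: register the job, kick off the work, and answer immediately with a token.
app.post("/files", (req, res) => {
  const jobId = randomUUID();
  jobs.set(jobId, { status: "processing", etaSeconds: 120 });
  processFileAsync(jobId); // stand-in for "hand the file to a background task / task queue"
  res.status(202).json({ jobId, etaSeconds: 120 }); // 202 Accepted plus the token to poll with
});

// Status endpoint: either the answer (a link to the finished file) or an updated ETA.
app.get("/jobs/:id", (req, res) => {
  const job = jobs.get(req.params.id);
  if (!job) return res.status(404).end();
  res.json(job);
});

function processFileAsync(jobId: string) {
  // Placeholder for the real work (e.g. parsing with POI and writing the output file).
  setTimeout(
    () => jobs.set(jobId, { status: "done", etaSeconds: 0, resultUrl: `/files/${jobId}.xlsx` }),
    5_000
  );
}

app.listen(3000);
```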
I have the below requirement
A large text file, around 10 MB to 25 MB (with 50,000 to 100,000 lines of data), is uploaded into the web application. I have to validate the file line by line, write the output to another location, and then display a message to the user.
The app server is WebLogic, and it is accessed through a web server via the Apache Bridge. The Apache Bridge times out pretty quickly during the upload + processing activity. Is there any way to solve this issue without changing the timeout of the Apache Bridge?
What is the best possible solution? Below are my current thoughts.
Soln 1: Upload the file and return to the page. Then trigger an Ajax request to run the validation in a separate thread and check its status through further Ajax requests (a client-side polling sketch follows after Soln 2).
Soln 2: Use the SC_PARTIAL_CONTENT (206) HTTP status code to keep the connection alive.
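For Soln 1, the client side could be as simple as the polling loop below (the endpoint path, element id, status values and the 5-second interval are all assumptions, not from the question):

```typescript
async function pollValidationStatus(jobId: string): Promise<void> {
  const res = await fetch(`/validation/${jobId}/status`); // hypothetical status endpoint
  const { status } = await res.json();
  if (status === "IN_PROGRESS") {
    setTimeout(() => pollValidationStatus(jobId), 5000); // check again in 5 seconds
  } else {
    const el = document.getElementById("validation-result");
    if (el) el.textContent = status === "DONE" ? "Validation complete" : "Validation failed";
  }
}

// After the upload request returns a job id:
// pollValidationStatus(returnedJobId);
```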