I'm working on a WordPress project that takes entries from a third-party API and adds them as individual posts on the site, with the data saved as metadata. Up until now it was going well, but with the number of entries on the API increasing I'm starting to run into issues with the server timing out while making the API request.
What I have done so far is write the response from the API (all the entries, in JSON format) to a file, then create the posts on the site from that file. But I'm still running into timeout issues.
That brings me to my question.
I want to break the data from the JSON file up into smaller, manageable requests to the server, only processing, let's say, 20 entries at a time out of the 1000+. But I need to find out how to select the entry number in the JSON file: for example, after processing the first 20 entries I want the function to go back to the JSON file but this time start at the 21st entry, if that is at all possible. Or is there a better method of programmatically creating posts from a JSON file with a large number of entries?
// Just some more info.
I'm using wp_remote_post() with blocking set to false to run the function that creates the posts in batches in the background.
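What you describe is basically offset-based batching: each background request gets an offset, processes a fixed-size slice of the decoded JSON array, and schedules the next request with the next offset until the array is exhausted (in PHP that would be json_decode() plus array_slice()). A rough sketch of the slicing logic, shown in Java only to illustrate the idea; createPost() is a stand-in for the WordPress post creation:

import com.google.gson.JsonArray;
import com.google.gson.JsonObject;
import com.google.gson.JsonParser;
import java.io.FileReader;
import java.io.IOException;

public class BatchImport {
    private static final int BATCH_SIZE = 20;

    // Processes one batch starting at `offset` and returns the next offset,
    // or -1 once every entry in the file has been handled.
    public static int processBatch(String jsonPath, int offset) throws IOException {
        JsonArray entries;
        try (FileReader reader = new FileReader(jsonPath)) {
            entries = JsonParser.parseReader(reader).getAsJsonArray();
        }

        int end = Math.min(offset + BATCH_SIZE, entries.size());
        for (int i = offset; i < end; i++) {
            JsonObject entry = entries.get(i).getAsJsonObject();
            createPost(entry);                      // stand-in for wp_insert_post() + metadata
        }
        return end < entries.size() ? end : -1;     // pass this back as the next batch's offset
    }

    private static void createPost(JsonObject entry) {
        System.out.println("creating post for " + entry);
    }
}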
Currently I am doing a synchronous call to MuleSoft which returns a raw image (no encoding is done) and then storing the image in a document. So whenever we get bigger images, more than 6 MB, we hit the governor limit for maximum size. So I wanted to know whether there is a way to get a reduced or compressed image.
I have no idea if Mule has anything to preprocess images, compress...
In Apex you could try to make the operation asynchronous to benefit from the 22 MB limit. But there will be no UI element for it anymore; your component / user would have to periodically check whether the file got saved, or something along those lines.
You could always change the direction: make Mule push to Salesforce over the standard API instead of Apex code pulling from Mule. From what I remember, the standard Files API is good for up to 2 GB.
Maybe send some notification to Mule that you want file XYZ attached to account 123; Mule would insert the ContentVersion and ContentDocumentLink, and Apex would periodically check.
And when the file is not needed any more: a nightly job to delete files created by "Mr Mule" over a week ago?
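If Mule ends up doing the push, the insert itself is a single REST call. A rough sketch in Java, purely for illustration (the instance URL, token, account id, and file name are all placeholders); setting FirstPublishLocationId to the account id makes Salesforce create the ContentDocumentLink automatically:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Base64;

public class PushFileToSalesforce {
    public static void main(String[] args) throws Exception {
        // All of these values are placeholders.
        String instanceUrl = "https://yourInstance.my.salesforce.com";
        String accessToken = "00D...oauth-or-session-token";
        String accountId   = "001xx000003DGb0AAG";

        byte[] image = Files.readAllBytes(Path.of("photo.jpg"));

        // ContentVersion insert; FirstPublishLocationId links the file to the account,
        // so the ContentDocumentLink is created for us.
        String body = "{"
            + "\"Title\":\"photo\","
            + "\"PathOnClient\":\"photo.jpg\","
            + "\"FirstPublishLocationId\":\"" + accountId + "\","
            + "\"VersionData\":\"" + Base64.getEncoder().encodeToString(image) + "\""
            + "}";

        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create(instanceUrl + "/services/data/v58.0/sobjects/ContentVersion"))
            .header("Authorization", "Bearer " + accessToken)
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(body))
            .build();

        HttpResponse<String> response =
            HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}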
I have switched from accessing my media files over a network share to a regular HTTP server, since VLC can issue HTTP range requests and that way I can jump right into the middle of a movie.
Lately I have wanted to organise these files a bit more and thought about putting them into Jackrabbit.
Uploading even large files works just fine, but getting them back out is more of a problem, as Jackrabbit's HTTP access to the media files does not seem to handle HTTP range requests. Bummer.
How difficult would it be to implement this, provided of course that the files are stored in a file system?
Günther
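For reference, the core of byte-range support over a plain file is fairly small. Something along these lines, as a generic servlet sketch that is not Jackrabbit-specific (the file path and content-type are placeholders, and a production version would also need to validate the requested range):

import java.io.IOException;
import java.io.OutputStream;
import java.io.RandomAccessFile;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class RangeServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        RandomAccessFile file = new RandomAccessFile("/media/movie.mkv", "r"); // placeholder path
        long length = file.length();
        long start = 0, end = length - 1;

        String range = req.getHeader("Range");        // e.g. "bytes=1000-" or "bytes=1000-2000"
        if (range != null && range.startsWith("bytes=")) {
            String[] parts = range.substring(6).split("-", 2);
            if (!parts[0].isEmpty()) start = Long.parseLong(parts[0]);
            if (parts.length > 1 && !parts[1].isEmpty()) end = Long.parseLong(parts[1]);
            resp.setStatus(HttpServletResponse.SC_PARTIAL_CONTENT);   // 206
            resp.setHeader("Content-Range", "bytes " + start + "-" + end + "/" + length);
        }
        resp.setHeader("Accept-Ranges", "bytes");
        resp.setContentType("video/x-matroska");                      // placeholder content-type
        resp.setContentLengthLong(end - start + 1);

        // Seek to the requested offset and stream only that slice of the file.
        file.seek(start);
        byte[] buffer = new byte[8192];
        long remaining = end - start + 1;
        OutputStream out = resp.getOutputStream();
        while (remaining > 0) {
            int read = file.read(buffer, 0, (int) Math.min(buffer.length, remaining));
            if (read == -1) break;
            out.write(buffer, 0, read);
            remaining -= read;
        }
        file.close();
    }
}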
I have a ~2MB file that my Google AppEngine server must use (not serve) as part of a computation for a service request.
That is, a client makes a particular request, my GAE server must first get the data from this ~2MB file, do some computations using this data, then serve a small response back to the client.
Where best do I store this data so that it can be quickly read and used by the server in the computation?
If the following assumptions hold true:
the file is not going to require updates outside of App Engine code updates
the file is read-only
Then deploy the file with your code and read it into memory during startup (ideally using warmup requests), and just operate on it from memory. If your code has to have file-based semantics to access the data (read, seek, etc.), then read the file contents and wrap them in a StringIO.
You will need to assign the value read from the file to a module-level variable; that way, whenever you get a new request, you can just get the file's contents by importing the module and referencing the name, i.e. mymodule.filecontents.
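That advice is for the Python runtime. If you happen to be on the Java runtime instead, the same pattern is a static field initialised once from a resource bundled with the deployment; a sketch, with the resource name made up:

import java.io.IOException;
import java.io.InputStream;

public class LookupData {
    private static byte[] contents;

    // Loads the bundled, read-only data file the first time it is needed
    // (e.g. during a warmup request) and reuses it for every later request.
    public static synchronized byte[] get() throws IOException {
        if (contents == null) {
            try (InputStream in = LookupData.class.getResourceAsStream("/data/lookup.bin")) {
                contents = in.readAllBytes();
            }
        }
        return contents;
    }
}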
Objective: Suppose the client submits a string or text file to the server (Google App Engine) using a web form. I want the server to modify the original file and serve it back to the client.
I think the only way to serve files from GAE is using the Blobstore, right? Then, as we cannot modify blobs, I believe a solution would be:
Client uploads a file using HttpRequest
Server reads the uploaded file and copies it to a temp buffer (not sure if there is a method to do this)
Server deletes original blob
Server modifies data in the temp buffer
Server writes the modified buffer to the Blobstore
Server serves the new blob to the client
Would this work? Can you think of any other solution?
Thanks
I think the only way to serve files from GAE is using the Blobstore, right?
Wrong. A 'file' is just a way of storing data on disk; there's nothing about serving them from a webserver that requires the data come from an actual, writable disk file. You can simply accept the user's data via a form upload, modify it, and serve it back to them, without it having to ever touch disk, the blobstore, or any other permanent storage medium.
This only becomes a problem if the user's data is too large to fit in memory, in which case you will have to store the data somewhere while you work on it, such as in the blobstore.
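To make that concrete, here is a minimal sketch of the round trip for a plain text submission, with a trivial uppercase transformation standing in for whatever modification you actually need (the "content" field name is made up):

import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class TransformServlet extends HttpServlet {
    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        // "content" is a hypothetical form field holding the submitted text.
        String original = req.getParameter("content");
        if (original == null) original = "";

        String modified = original.toUpperCase();   // placeholder for the real modification

        // Serve the result straight back; nothing touches the Blobstore or disk.
        resp.setContentType("text/plain; charset=UTF-8");
        resp.setHeader("Content-Disposition", "attachment; filename=\"modified.txt\"");
        resp.getWriter().write(modified);
    }
}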
http://code.google.com/appengine/kb/java.html#fileforms
shows you how to do it for a file upload, which has to be performed through multipart form-data.
Similarly for non-file data, where you read straight from the request stream.
You don't even have to store the file/input stream. Just spit the processed data out into the response output stream while you read the input FileItemStream or request input stream.
If your file/input processing requires look-ahead, determine the maximum look-ahead distance and use that distance as your buffer size.
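A sketch of that streaming approach using the Commons FileUpload streaming API (the pass-through loop is a placeholder for the real processing, and the 8 KB buffer stands in for whatever your look-ahead distance requires):

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.apache.commons.fileupload.FileItemIterator;
import org.apache.commons.fileupload.FileItemStream;
import org.apache.commons.fileupload.FileUploadException;
import org.apache.commons.fileupload.servlet.ServletFileUpload;

public class StreamingTransformServlet extends HttpServlet {
    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        try {
            ServletFileUpload upload = new ServletFileUpload();   // streaming API, no temp files
            FileItemIterator items = upload.getItemIterator(req);

            resp.setContentType("application/octet-stream");
            OutputStream out = resp.getOutputStream();

            while (items.hasNext()) {
                FileItemStream item = items.next();
                if (item.isFormField()) {
                    continue;                                     // skip ordinary form fields
                }
                InputStream in = item.openStream();
                byte[] buffer = new byte[8192];                   // size this for your look-ahead
                int read;
                while ((read = in.read(buffer)) != -1) {
                    // placeholder "modification": pass the bytes through unchanged
                    out.write(buffer, 0, read);
                }
            }
        } catch (FileUploadException e) {
            throw new IOException(e);
        }
    }
}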
Further Edits
To respond to the client with a file type, set the response content-type or mime-type.
e.g., I've had apps which dynamically generated GIFs, JPGs, XLS, CSV, etc.
There isn't any difference whether the source of the response stream is a file you read or a stream you generate dynamically, because even if you had a stored file that needed to be sent as the response to the client, you would still have to write it into the response stream and flag the content-type appropriately.
For dynamically generated content, unless you need to cache the output, you need not generate the file into a web URL-visible location, then generate a new HTML page with the link and send that HTML page to the browser. You don't need the user's browser to refresh itself just to get that link.
You would simply send the "file" directly with the response stream. You could design your GWT client to accept the "file", perhaps in a named frame, where the named frame's src URL is the app that performs the dynamic generation of the file.
Read http://en.wikipedia.org/wiki/Mime-type to find the content-type you need.
If the target client's browser does not have a content handler set up for the response's content-type, it will ask the user how to handle it or treat it as a file download.
I have frequently used JSP or JSPX to generate charts or spreadsheets dynamically. No stored files are involved; the response is written while the request is being read. Here is the JSP page directive that sets the content-type so a CSV is opened in MS Excel:
<%@ page language="java" contentType="application/vnd.ms-excel; charset=UTF-8"
pageEncoding="UTF-8"%>
For a servlet, ServletResponse.setContentType(String)
is the method to set the content-type.
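For example, a minimal servlet version of the same CSV-to-Excel response might look like this (the rows are made up for illustration; a real app would generate them from its own data):

import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class CsvReportServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        // Same content-type as the JSP directive above, plus a suggested file name.
        resp.setContentType("application/vnd.ms-excel; charset=UTF-8");
        resp.setHeader("Content-Disposition", "attachment; filename=\"report.csv\"");

        PrintWriter out = resp.getWriter();
        out.println("Month,Total");      // header row
        out.println("January,1200");     // made-up rows for illustration
        out.println("February,980");
    }
}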