Error trying to transform image < 1 MB in App Engine - google-app-engine

So I know that App Engine prevents working with images greater than 1 MB in size, but I'm getting a RequestTooLargeError when I call images.resize on a JPEG that is 400 KB on disk. The dimensions of the JPEG are 1600 x 1200, so is it that App Engine can't handle resizing images over 1 megapixel, even if the image file itself is in a compressed format smaller than 1 MB?

This is a best guess... not a real answer.
Based on what I have read here and in some other threads, it seems like the Images API has decompressed your image into a form that is larger than 1 MB and is then complaining about the image it created itself.
About the only way to prevent that is to cut your original image into chunks no bigger than roughly 640x520... but that will require some pretty heavy lifting on the client side.
Added: This App Engine issue regarding image size limits may have some helpful pointers.
Added: You can probably leverage the finding from your initial revision of this question... you said that crop worked but resize did not... That would let you keep most of the processing on the server side.
Added: Another thread about a small JPG that expands into a much larger image once decompressed.
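Added (illustration only): a rough sketch of that crop-first idea against the legacy google.appengine.api.images module. The helper name and the 2x2 tile layout are my own invention, it is untested, and it assumes the per-transform limit is what trips resize on the full image:

from google.appengine.api import images

def shrink_in_quadrants(jpeg_data, out_w, out_h):
    # Crop the source into four quadrants (crop() takes fractional
    # coordinates from 0.0 to 1.0), resize each piece, then stitch the
    # results back together with composite().  Assumes out_w/out_h keep
    # the source aspect ratio so the quadrants tile exactly.
    half_w, half_h = out_w // 2, out_h // 2
    tiles = []
    for ix, (x0, x1) in enumerate([(0.0, 0.5), (0.5, 1.0)]):
        for iy, (y0, y1) in enumerate([(0.0, 0.5), (0.5, 1.0)]):
            quadrant = images.crop(jpeg_data, x0, y0, x1, y1)
            small = images.resize(quadrant, half_w, half_h)
            # composite() takes (data, x_offset, y_offset, opacity, anchor)
            tiles.append((small, ix * half_w, iy * half_h, 1.0, images.TOP_LEFT))
    return images.composite(tiles, out_w, out_h, output_encoding=images.JPEG)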

The Images API would not raise a RequestTooLargeError (see the exceptions the Images API can raise here); that error indicates that your total request size is too big.
What other data are you sending over with the request? It would probably be hard to push the total request over 10 MB (the maximum request size) if it's a fairly simple request (i.e. just uploading a single image).

Related

Storing Images on a Database vs Fileserver vs Zip file on Server

I am creating a simple system where users can view a small image by entering the name of that image into a search box. I am having trouble deciding how to store the images in the most efficient way. I have thought of 3 solutions, and I am wondering which one is the best solution for this problem:
Storing the images as blobs or base64 strings in a database and loading the image based on the user input with a simple query. Doing it this way will increase the load on the database and will result in longer load times.
Storing the images as separate files on a file server and loading them by assigning the image.src attribute based on the user input: image.src = "./server/images/" + userInput; Doing it this way, however, will increase the number of file requests to my server, so it will be more expensive.
Or lastly, I could store the images in a single zip file on the file server and download them all at once when the program starts. The advantage of this is that there will only be a single request when loading the page. However, it will take some time to load all the files.
Note that each image is around 1-3 KB in size, and all images will be placed on the server/DB manually. Furthermore, there will only be around 100-200 images max, so all these considerations may not matter too much. But I want to know what the recommended way of doing this is.
NOTE: The server I am running is an AWS server, and I found that having too many requests will increase the cost of the server. This is why I am sceptical about approach 2.
I too manage stored images and retrieve them from AWS EC2. My solution, and my suggestion to you, is similar to option 2, but with caching added as a way to reduce server requests.
Keep the images in a folder, or better in S3 storage, and look them up by name with a database query that returns the URL or just the image name. Then place the result in a placeholder in the HTML.
SELECT url FROM img.images WHERE image_name = 'blue_ocean';
Then I bind it to a placeholder in the HTML:
<img src="/images/blue_ocean.jpg" alt="">
As for the many requests to the server, you can cache images. I suggest using Service Workers, a powerful web API that lets you cache images and therefore reduce the amount of data served.
Another approach is to use sprites: a single image sheet that contains all the required images, so instead of many requests there is just one, and each required image is extracted by its X,Y coordinates. This method is very efficient and is used in games to reduce the overhead of requesting many images in short spans of time.

Uploading Large Amounts of Data from C# Windows Service to Azure Blobs

Can someone please point me in the right direction.
I need to create a windows timer service that will upload files in the local file system to Azure blobs.
Each file (video) may be anywhere between 2GB and 16GB. Is there a limit on the size? Do I need to split the file?
Because the files are very large can I throttle the upload speed to azure?
Is it possible in another application (WPF) to see the progress of the uploaded file? i.e. a progress bar and how much data has been transferred and what speed it is transferring at?
The upper limit for a block blob, the type you want here, is 200GB. Page blobs, used for VHDs, can go up to 1TB.
Block blobs are so called because upload is a two-step process - upload a set of blocks and then commit that block list. Client APIs can hide some of this complexity. Since you want to control the uploads and keep track of their status you should look at uploading the files in blocks - the maximum size of which is 4MB - and manage that flow and success as desired. At the end of the upload you commit the block list.
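To make the flow concrete, here is a minimal sketch of the same stage-blocks/commit pattern using the Python azure-storage-blob SDK (the question is about C#, but the .NET client exposes the equivalent PutBlock/PutBlockList calls); the connection string, container name and block size are placeholders:

from azure.storage.blob import BlobServiceClient, BlobBlock

CHUNK = 4 * 1024 * 1024  # stage the file in 4 MB blocks

def upload_in_blocks(conn_str, container, blob_name, path):
    blob = BlobServiceClient.from_connection_string(conn_str) \
                            .get_blob_client(container, blob_name)
    block_ids = []
    with open(path, 'rb') as f:
        index = 0
        while True:
            data = f.read(CHUNK)
            if not data:
                break
            block_id = '{:032d}'.format(index)  # block ids must all be the same length
            blob.stage_block(block_id=block_id, data=data)
            block_ids.append(BlobBlock(block_id=block_id))
            index += 1
            # progress reporting / throttling (e.g. a short sleep) can go here
    blob.commit_block_list(block_ids)  # nothing becomes visible until this commit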
Kevin Williamson, who has done a number of spectacular blog posts, has a post showing how to do "Asynchronous Parallel Blob Transfers with Progress Change Notification 2.0."

Windows Phone 7.5 XAP Package Size requirement

Windows Phone 7.1/7.5/Mango Silverlight App.
Coming from here: Need clarification on using audio on button click, background audio, etc in Windows Phone
Our designer will be converting all the MP3s to .wav files. He's done a few and they are coming out to about 200 KB each.
The current estimate is we might have 100+ of those for our app.
I know the certification requirement is:
The maximum size of the XAP package file is 225 MB.
The designer said he will try to compress them down to about 100 KB while making sure the sound quality is still acceptable.
Though I am sure we won't exceed 225 MB, I think less is better, as it will affect the download time on the device as well. I don't want the user to quit the download halfway.
I read somewhere there is some time restriction for certification as well.
Is this acceptable, or am I missing any other strategies for keeping my audio files small other than compression? Are there any other considerations I need to take into account when certifying a large app?
Keep in mind that the number of video files and resources together shouldn't exceed 2000 files (plus the size requirement, of course). In my experience I had a lot of issues submitting XAP packages that contain many files. The last app was a video dictionary containing more than 2000 video files, all tiny, but that didn't work well even though the total size was just 90 megabytes. The responses from support were slow and we had to wait each time, only to finally learn that we had to respect this rule, which is not documented.
IMO, download times don't largely affect conversion rates, because they download in the background. I'll frequently download a few apps, then check back on them the next day or so.

Google App Engine Large File Upload

I am trying to upload data to Google App Engine (using GWT). I am using the FileUploader widget and the servlet uses an InputStream to read the data and insert directly to the datastore. Running it locally, I can upload large files successfully, but when I deploy it to GAE, I am limited by the 30 second request time. Is there any way around this? Or is there any way that I can split the file into smaller chunks and send the smaller chunks?
By using the Blobstore you have a 1 GB size limit and a special handler, called unsurprisingly BlobstoreUploadHandler, that shouldn't give you timeout problems on upload.
Also check out http://demofileuploadgae.appspot.com/ (sourcecode, source answer) which does exactly what you are asking.
Also, check out the rest of GWT-Examples.
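For reference, a minimal sketch of the Blobstore upload flow on the legacy Python runtime (the handler paths and form are made up; a GWT client would simply POST to the generated upload URL):

import webapp2
from google.appengine.ext import blobstore
from google.appengine.ext.webapp import blobstore_handlers

class StartPage(webapp2.RequestHandler):
    def get(self):
        # The browser POSTs the file straight to the Blobstore, not to the
        # app, so the 30-second request limit never sees the transfer.
        upload_url = blobstore.create_upload_url('/upload')
        self.response.write(
            '<form action="%s" method="POST" enctype="multipart/form-data">'
            '<input type="file" name="file"><input type="submit"></form>'
            % upload_url)

class UploadHandler(blobstore_handlers.BlobstoreUploadHandler):
    def post(self):
        blob_info = self.get_uploads('file')[0]  # BlobInfo for the stored file
        self.redirect('/serve/%s' % blob_info.key())

app = webapp2.WSGIApplication([('/', StartPage), ('/upload', UploadHandler)])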
Currently, GAE imposes a limit of 10 MB on file uploads (and response size) as well as 1 MB limits on many other things; so even if you had a network connection fast enough to push more than 10 MB within the 30-second window, it would be to no avail. Google has said (I heard Guido van Rossum mention it yesterday here at Pycon Italia Tre) that it plans to overcome these limitations in the future (at least for users of GAE who pay per-use to exceed quotas -- not sure whether the plans extend to users of GAE who are not paying and generally need to accept smaller quotas for their free use of GAE).
You would need to do the upload to another server - I believe the 30-second timeout cannot be worked around. If there is a way, please correct me! I'd love to know how!
If your request is running out of request time, there is little you can do. Maybe your files are too big and you will need to chunk them on the client (with something like Flash or Java, or an upload framework such as Plupload).
Once you get the file to the application there is another issue - the datastore limitations. Here you have two options:
You can use the Blobstore service, which has quite a nice API for handling uploads up to 50 megabytes.
You can use something like bigblobae, which can store virtually unlimited-size blobs in the regular App Engine datastore.
The 30 second response time limit only applies to code execution. So the uploading of the actual file as part of the request body is excluded from that. The timer will only start once the request is fully sent to the server by the client, and your code starts handling the submitted request. Hence it doesn't matter how slow your client's connection is.
Uploading file on Google App Engine using Datastore and 30 sec response time limitation
The closest you could get would be to split it into chunks as you store it in GAE and then when you download it, piece it together by issuing separate AJAX requests.
I would agree with chunking the data into smaller blobs and having two tables: one contains the metadata (filename, size, number of downloads, etc.) and the other contains the chunks, associated with the metadata table by a foreign key. I think it is doable...
Or, once you have uploaded all the chunks, you can simply put them together into one blob and keep a single table.
But the problem is, you will need a thick client to handle chunking the data, like a Java applet, which needs to be signed and trusted by your clients so it can access the local file system.
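A minimal sketch of the metadata/chunk split described above, using the (legacy) ndb datastore API; the model and property names are made up:

from google.appengine.ext import ndb

class FileMeta(ndb.Model):
    filename = ndb.StringProperty()
    size = ndb.IntegerProperty()
    downloads = ndb.IntegerProperty(default=0)

class FileChunk(ndb.Model):
    meta = ndb.KeyProperty(kind=FileMeta)  # the "foreign key" to the metadata entity
    index = ndb.IntegerProperty()          # position of this chunk within the file
    data = ndb.BlobProperty()              # keep each chunk safely under the ~1 MB entity limit

def reassemble(meta_key):
    # Filtering on meta and ordering by index needs a composite index in index.yaml.
    chunks = FileChunk.query(FileChunk.meta == meta_key).order(FileChunk.index).fetch()
    return b''.join(c.data for c in chunks)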

Elegant way to determine total size of website?

Is there an elegant way to determine the size of data downloaded from a website -- bearing in mind that not all requests will go to the same domain you originally visited, and that other browsers may be polling in the background at the same time? Ideally I'd like to look at the size of each individual page -- or, for a Flash site, the total downloaded over time.
I'm looking for some kind of browser plug-in or Fiddler script. I'm not sure Fiddler would work due to the issues pointed out above.
I want to compare sites similar to mine for total filesize - and keep track of my own site also.
Firebug and HttpFox are two Firefox plugins that can be used to determine the size of data downloaded from a website for a single page. While Firebug is a great tool for any web developer, HttpFox is a more specialized plugin for analyzing HTTP requests/responses (with their respective sizes).
You can install both and try them out; just be sure to disable one while the other is enabled.
If you need a website-wide measurement:
If the website is made of plain HTML and assets (like CSS, images, Flash, ...), you can check how big the folder containing the website is on the server (this assumes you can log in to the server).
You can mirror the website locally using wget, curl or a GUI-based application like SiteSucker and check how big the folder containing the mirror is.
If you know the website is huge but you don't know how much, you can estimate its size, e.g. www.mygallery.com has 1000 galleries; each gallery has an average of 20 images; every image is stored in 2 different sizes (thumbnail and full size) at an average of n KB per image; ...
Keep in mind that if you download or estimate a dynamic website, you are dealing with what the website produces, not with the real size of the website on the server. A small PHP script can produce tons of HTML.
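If you want a rough script-based number for a single page, something like the following Python sketch (assuming the requests and beautifulsoup4 packages are available) fetches the HTML, follows the directly referenced assets and sums the transferred bytes; it ignores anything loaded by JavaScript, so the browser tools above remain more accurate:

import urllib.parse
import requests                # assumed: pip install requests
from bs4 import BeautifulSoup  # assumed: pip install beautifulsoup4

def page_weight(url):
    page = requests.get(url)
    total = len(page.content)
    soup = BeautifulSoup(page.text, 'html.parser')
    refs = [tag.get('src') or tag.get('href')
            for tag in soup.find_all(['img', 'script', 'link'])]
    for ref in filter(None, refs):
        asset = requests.get(urllib.parse.urljoin(url, ref))
        total += len(asset.content)
    return total

print('%.1f KB' % (page_weight('https://example.com/') / 1024.0))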
Have you tried Firebug for Firefox?
The "Net" panel in Firebug will tell you the size and fetch time of each fetched file, along with the totals.
You can download the entire site and then you will know for sure!
https://www.httrack.com/
