Options for uploading files bigger than 2 GB using a web browser - Silverlight

Good day!
I'm looking for options for uploading really big files (over 2 GB) using a web browser. I know that Java applet solutions will work, and I know (and have tested myself) that Flash has an internal limitation of about 2 GB. What about Silverlight? Have I missed some way/technology of doing this?
Thanks in advance!

To my knowledge, upload size in .NET 4 is capped at 2 GB: maxRequestLength is measured in kilobytes, and its maximum value of 2097151 KB is roughly 2 GB. It can be set in web.config:
<system.web>
  <httpRuntime maxRequestLength="2097151" />
</system.web>
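Note that if the site runs on IIS 7 or later, there is a second, independent request-filtering limit that usually also has to be raised; unlike maxRequestLength it is measured in bytes (4294967295, about 4 GB, is its maximum):
<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="4294967295" />
    </requestFiltering>
  </security>
</system.webServer>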
OK, so there's another idea: you can upload files in chunks.
There's a project on CodePlex that might be of use to you.

(For Flash) Split the file into fixed-size chunks (maybe 10-50 MB each) of byte arrays in the Flash client; this is not too hard with the ByteArray class.
Now you can upload each chunk and the server can piece them together. Another plus is that if the client is ever disconnected, the server knows which parts of the file the user has already sent, so the user can resume from roughly where they left off.
You could even send multiple chunks at once (between 2 and 4; each browser has a different maximum connection count), gaining better network utilization.
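A minimal sketch of the server side of this scheme in Java, assuming the client sends the chunk index and file name as custom headers; the header names, chunk size, and upload directory are made up for illustration, not a fixed protocol:
import java.io.IOException;
import java.io.InputStream;
import java.io.RandomAccessFile;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class ChunkUploadServlet extends HttpServlet {
    // Must match the chunk size used by the client.
    private static final long CHUNK_SIZE = 10L * 1024 * 1024;

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        long index = Long.parseLong(req.getHeader("X-Chunk-Index"));
        String name = req.getHeader("X-File-Name"); // sanitize in real code!
        // Writing each chunk at its absolute offset lets chunks arrive in
        // any order and lets a disconnected client resume where it left off.
        try (RandomAccessFile out = new RandomAccessFile("/uploads/" + name, "rw");
             InputStream in = req.getInputStream()) {
            out.seek(index * CHUNK_SIZE);
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        }
        resp.setStatus(HttpServletResponse.SC_OK);
    }
}
Tracking which indexes have already arrived (in a database, say) is what makes the resume feature possible.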

You can split the file into parts using 7-Zip, then upload the parts as usual.

Related

Laravel Session File gigantic

I am using Laravel 5 for my web application.
Since running it for over a week, the sessions are stored as files of over 9 MB each, instead of the 1 KB they used to be.
The CPU is running at 99% all the time and the server is not responding anymore. What causes this enormous file size and what do I need to do to reduce it?
Thanks!
You can play around with the session settings in config/session.php; specifically, the lottery setting might help you out, since it controls how often old session files are garbage-collected (the default of [2, 100] means a 2 in 100 chance per request).
You can also switch the session driver if your system is unable to cope with the files. Depending on what you actually store in your sessions and the size of your application, it might be beneficial to switch to a different session driver. Available options can be found here: http://laravel.com/docs/5.1/session#introduction

Windows Phone 7.5 XAP Package Size requirement

Windows Phone 7.1/7.5/Mango Silverlight App.
Coming from here: Need clarification on using audio on button click, background audio, etc in Windows Phone
Our designer will be converting all the MP3s to .wav files. He has done a few and they come to about 200 KB each.
The current estimate is we might have like 100+ of those for our app.
I know the certification requirement is:
The maximum size of the XAP package file is 225 MB.
The designer said he will try to compress them down to about 100 KB while making sure the sound quality is still OK.
Though I am sure we won't exceed 225 MB, I think smaller is better, as it will affect the download time on the device as well. I don't want the user to quit the download halfway.
I read somewhere there is some time restriction for certification as well.
Is this acceptable, or am I missing any other strategies for keeping my audio files small other than compression? Are there any other considerations I need to take into account when certifying a large app?
Keep in mind that the number of video files and resources together shouldn't exceed 2000 files (plus the size requirement, of course). In my experience I had a lot of issues submitting XAP packages that contain a lot of files. The last app was a video dictionary containing more than 2000 video files, each tiny in size, and that didn't work well even though the total size was just 90 megabytes. The responses from support were slow, and we had to wait each time, only to finally find out that we had to respect this rule, which is not documented.
IMO, download times don't greatly affect conversion rates, because apps download in the background. I'll frequently download a few apps, then check back on them the next day or so.

Silverlight streaming upload

I have a Silverlight application that needs to upload large files to the server. I've looked at uploading using both WebClient and HttpWebRequest, but I don't see an obvious way to stream the upload with either option. Due to the size of the files, loading the entire contents into memory before uploading is not reasonable. Is this possible in Silverlight?
You could go with a "chunking" approach. The Silverlight File Uploader on Codeplex uses this technique:
http://www.codeplex.com/SilverlightFileUpld
Given a chunk size (e.g. 10 KB, 20 KB, 100 KB, etc.), you can split up the file and send each chunk to the server using an HTTP request. The server will need to handle each chunk and re-assemble the file as the chunks arrive. In a web farm scenario with multiple web servers, be careful not to use the local file system on the web server for this approach.
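For illustration, here is the client side of the same chunking idea sketched in plain Java with HttpURLConnection rather than Silverlight's HttpWebRequest; the loop is the same either way, and the header names match the hypothetical servlet sketch in the Flash answer above:
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class ChunkUploader {
    private static final int CHUNK_SIZE = 10 * 1024 * 1024; // 10 MB per request

    public static void upload(File file, String endpoint) throws IOException {
        byte[] buf = new byte[CHUNK_SIZE];
        try (InputStream in = new FileInputStream(file)) {
            long index = 0;
            int read;
            while ((read = readFully(in, buf)) > 0) {
                HttpURLConnection conn =
                        (HttpURLConnection) new URL(endpoint).openConnection();
                conn.setRequestMethod("POST");
                conn.setDoOutput(true);
                conn.setRequestProperty("X-Chunk-Index", Long.toString(index));
                conn.setRequestProperty("X-File-Name", file.getName());
                // Stream the chunk instead of buffering the whole body in memory.
                conn.setFixedLengthStreamingMode(read);
                try (OutputStream out = conn.getOutputStream()) {
                    out.write(buf, 0, read);
                }
                if (conn.getResponseCode() != HttpURLConnection.HTTP_OK) {
                    throw new IOException("Chunk " + index + " failed");
                }
                index++;
            }
        }
    }

    // Fill the buffer completely unless the stream ends first, so that
    // every chunk except the last has exactly CHUNK_SIZE bytes.
    private static int readFully(InputStream in, byte[] buf) throws IOException {
        int off = 0;
        while (off < buf.length) {
            int n = in.read(buf, off, buf.length - off);
            if (n == -1) break;
            off += n;
        }
        return off;
    }
}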
It does seem extraordinary that the WebClient in Silverlight fails to provide a means to pump a Stream to the server with progress events. It's especially surprising since this is offered for a string upload!
It is possible to write code that appears to do what you want with an HttpWebRequest.
In the callback for BeginGetRequestStream you can get the Stream for the outgoing request, then read chunks from your file's Stream and write them to the output stream. Unfortunately, Silverlight does not start sending the output to the server until the output stream has been closed. Where all this data is stored in the meantime, I don't know; it's possible that if it gets large enough, Silverlight might use a temporary file so as not to stress the machine's memory, but then again it might just store it all in memory anyway.
The only potential solution to this is to implement the HTTP protocol yourself via sockets.

Google App Engine Large File Upload

I am trying to upload data to Google App Engine (using GWT). I am using the FileUploader widget and the servlet uses an InputStream to read the data and insert directly to the datastore. Running it locally, I can upload large files successfully, but when I deploy it to GAE, I am limited by the 30 second request time. Is there any way around this? Or is there any way that I can split the file into smaller chunks and send the smaller chunks?
By using the Blobstore you have a 1 GB size limit and a special handler, called unsurprisingly BlobstoreUploadHandler, that shouldn't give you timeout problems on upload.
Also check out http://demofileuploadgae.appspot.com/ (sourcecode, source answer) which does exactly what you are asking.
Also, check out the rest of GWT-Examples.
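For the Java side, here is a minimal sketch of that Blobstore flow, assuming the App Engine Java SDK; the servlet path and form field name are invented for illustration:
import java.io.IOException;
import java.util.List;
import java.util.Map;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import com.google.appengine.api.blobstore.BlobKey;
import com.google.appengine.api.blobstore.BlobstoreService;
import com.google.appengine.api.blobstore.BlobstoreServiceFactory;

public class BlobUploadServlet extends HttpServlet {
    private final BlobstoreService blobstore =
            BlobstoreServiceFactory.getBlobstoreService();

    // Step 1: hand the browser a one-shot upload URL to POST the form to.
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        String uploadUrl = blobstore.createUploadUrl("/upload-done");
        resp.setContentType("text/html");
        resp.getWriter().printf(
            "<form action='%s' method='post' enctype='multipart/form-data'>"
            + "<input type='file' name='file'><input type='submit'></form>",
            uploadUrl);
    }

    // Step 2: Blobstore stores the file itself, then calls back here with
    // the resulting BlobKey, so the 30-second limit never sees the upload.
    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        Map<String, List<BlobKey>> blobs = blobstore.getUploads(req);
        BlobKey key = blobs.get("file").get(0);
        resp.getWriter().println("stored as " + key.getKeyString());
    }
}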
Currently, GAE imposes a limit of 10 MB on file uploads (and response size), as well as 1 MB limits on many other things; so even if you had a network connection fast enough to push more than 10 MB within a 30-second window, it would be to no avail. Google has said (I heard Guido van Rossum mention this yesterday here at Pycon Italia Tre) that it plans to overcome these limitations in the future, at least for GAE users who pay per-use to exceed quotas; I'm not sure whether the plans extend to GAE users who are not paying and who generally must accept smaller quotas for their free use of GAE.
You would need to upload to another server; I believe the 30-second timeout cannot be worked around. If there is a way, please correct me! I'd love to know how!
If your request is running out of request time, there is little you can do. Maybe your files are too big and you will need to chunk them on the client (with something like Flash or Java, or an upload framework like Plupload).
Once you get the file to the application, there is another issue: the datastore limitations. Here you have two options:
you can use the Blobstore service, which has quite a nice API for handling uploads as large as 50 megabytes
you can use something like bigblobae, which can store virtually unlimited-size blobs in the regular App Engine datastore.
The 30-second response time limit only applies to code execution, so the upload of the actual file as part of the request body is excluded from it. The timer only starts once the request has been fully sent to the server by the client and your code starts handling the submitted request. Hence it doesn't matter how slow your client's connection is.
Uploading file on Google App Engine using Datastore and 30 sec response time limitation
The closest you could get would be to split it into chunks as you store it in GAE and then when you download it, piece it together by issuing separate AJAX requests.
I would agree with chunking the data into smaller blobs and keeping two tables: one containing the metadata (filename, size, number of downloads, etc.) and another containing the chunks, associated with the metadata table by a foreign key. I think it is doable.
Or, once you have uploaded all the chunks, you can simply put them together in one blob, keeping one table.
But the problem is, you will need a thick client to do the chunking, like a Java applet, which needs to be signed and trusted by your clients so it can access the local file system.
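A rough sketch of that two-table layout with App Engine's low-level datastore API in Java; the entity kinds and property names are invented for illustration, and each chunk entity must stay under the datastore's 1 MB entity limit:
import com.google.appengine.api.datastore.Blob;
import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;
import com.google.appengine.api.datastore.Key;

public class ChunkedBlobStore {
    private final DatastoreService ds =
            DatastoreServiceFactory.getDatastoreService();

    public Key saveMetadata(String filename, long size, int chunkCount) {
        Entity meta = new Entity("FileMeta");
        meta.setProperty("filename", filename);
        meta.setProperty("size", size);
        meta.setProperty("chunkCount", chunkCount);
        return ds.put(meta); // this Key plays the role of the foreign key
    }

    public void saveChunk(Key metaKey, int index, byte[] data) {
        // Each chunk is a child entity of the metadata entity, so 'data'
        // must be comfortably under the 1 MB per-entity limit.
        Entity chunk = new Entity("FileChunk", metaKey);
        chunk.setProperty("index", index);
        chunk.setProperty("data", new Blob(data));
        ds.put(chunk);
    }
}
Re-assembling the file is then a matter of querying all FileChunk children of a FileMeta key and ordering them by index.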

Elegant way to determine total size of website?

Is there an elegant way to determine the size of data downloaded from a website, bearing in mind that not all requests will go to the same domain you originally visited, and that other browsers may be polling in the background at the same time? Ideally I'd like to look at the size of each individual page, or for a Flash site, the total downloaded over time.
I'm looking for some kind of browser plug-in or Fiddler script. I'm not sure Fiddler would work due to the issues pointed out above.
I want to compare sites similar to mine for total filesize - and keep track of my own site also.
Firebug and HttpFox are two Firefox plugins that can be used to determine the size of the data downloaded from a website for a single page. While Firebug is a great tool for any web developer, HttpFox is a more specialized plugin for analyzing HTTP requests and responses (with their respective sizes).
You can install both and try them out; just be sure to disable one while the other is enabled.
If you need a website-wide measurement:
If the website is made of plain HTML and assets (like CSS, images, Flash, ...), you can check how big the folder containing the website is on the server (this assumes you can log in to the server)
You can mirror the website locally using wget, curl, or some GUI-based application like SiteSucker, and check how big the folder containing the mirror is
If you know the website is huge but you don't know how much, you can estimate its size, e.g. www.mygallery.com has 1000 galleries; each gallery has an average of 20 images loaded; every image is stored in 2 different sizes (thumbnail and full size) at an average of n KB per image; ...
Keep in mind that if you download or estimate a dynamic website, you are dealing with what the website produces, not with the real size of the website on the server. A small PHP script can produce tons of HTML.
Have you tried Firebug for Firefox?
The "Net" panel in Firebug will tell you the size and fetch time of each fetched file, along with the totals.
You can download the entire site and then you will know for sure!
https://www.httrack.com/
