Windows Phone 7.5 XAP package size requirement - Silverlight

Windows Phone 7.1/7.5/Mango Silverlight App.
Coming from here: Need clarification on using audio on button click, background audio, etc in Windows Phone
Our designer will be converting all the MP3s to .wav files. He has done a few, and they come to about 200 KB each.
The current estimate is that we might have 100+ of those in our app.
I know the certification requirement is:
The maximum size of the XAP package file is 225 MB.
The designer said he will try to compress them down to about 100 KB each while making sure the sound quality stays acceptable.
I'm sure we won't exceed 225 MB, but less is better, since package size also affects download time on the device - I don't want users to quit the download halfway.
I also read somewhere that there is a time restriction for certification as well.
Is this acceptable, or am I missing any other strategies for keeping my audio files small besides compression? Are there any other considerations I need to take into account when certifying a large app?

Keep in mind that the number of video files and other resources together shouldn't exceed 2,000 files (on top of the size requirement, of course). In my experience, submitting XAP packages that contain a lot of files causes real problems. My last app was a video dictionary containing more than 2,000 video files, each tiny, and it didn't go through even though the whole package was only about 90 megabytes. Responses from support were slow, and we had to wait each time, only to finally discover that we had to respect this rule, which is not documented.
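Since a XAP is just a ZIP archive, you can sanity-check both limits (size and file count) on your build machine before submitting. A rough sketch using .NET 4.5's ZipFile class (the path handling is a placeholder):

using System;
using System.IO;
using System.IO.Compression;   // reference System.IO.Compression.FileSystem

class XapCheck
{
    static void Main(string[] args)
    {
        string xapPath = args[0];   // e.g. Bin\Release\MyApp.xap
        using (var zip = ZipFile.OpenRead(xapPath))
        {
            // The undocumented ceiling described above: keep entries under 2,000
            Console.WriteLine("Files in package: " + zip.Entries.Count);
            // The documented ceiling: the package must stay under 225 MB
            double mb = new FileInfo(xapPath).Length / (1024.0 * 1024.0);
            Console.WriteLine("Package size: " + mb.ToString("F1") + " MB");
        }
    }
}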

IMO, download times don't greatly affect conversion rates, because apps download in the background. I'll frequently start downloading a few apps, then check back on them the next day or so.

Related

Storing Serialized Video files to SQL Server

I'm currently faced with a need to host 20 small video files for my website. I know I could just host them in a folder within my project, but I came across this article:
http://www.kindblad.com/2008/04/how-to-store-files-in-ms-sql-server.html
The thought of storing the files in the DB had not occurred to me. My question is: would there be a performance increase or decrease when storing the files as binary data in the DB versus just streaming them from disk? I like the idea of having the data in the DB for portability and for controlling who gets access to the videos. Thanks in advance.
Unless you have a pressing need to store them in a database, I wouldn't, personally. You can still control who gets access to which files by using a handler to validate access to the file. One big problem with the method in that article is that it doesn't support reading a byte range - so if someone wants to seek to the middle of a video, for example, they would have to wait for the whole thing to download. You'd want it to support the Range header, as described in this question.
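As a sketch of what Range support could look like in an ASP.NET handler (the ~/Videos path and file parameter are assumptions; single-range requests only, no input validation, not production-hardened):

using System;
using System.IO;
using System.Web;

public class VideoHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext ctx)
    {
        // Access checks would go here before serving anything.
        string path = ctx.Server.MapPath("~/Videos/" + Path.GetFileName(ctx.Request["file"]));
        long length = new FileInfo(path).Length;
        long start = 0, end = length - 1;

        string range = ctx.Request.Headers["Range"];   // e.g. "bytes=1000-"
        if (range != null && range.StartsWith("bytes="))
        {
            string[] parts = range.Substring(6).Split('-');
            if (parts[0] != "") start = long.Parse(parts[0]);
            if (parts.Length > 1 && parts[1] != "") end = long.Parse(parts[1]);
            ctx.Response.StatusCode = 206;             // Partial Content
            ctx.Response.AddHeader("Content-Range", "bytes " + start + "-" + end + "/" + length);
        }

        ctx.Response.ContentType = "video/mp4";
        ctx.Response.AddHeader("Accept-Ranges", "bytes");
        ctx.Response.AddHeader("Content-Length", (end - start + 1).ToString());
        ctx.Response.TransmitFile(path, start, end - start + 1);   // send only the requested slice
    }

    public bool IsReusable { get { return true; } }
}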

Large File Advice

I have to deliver roughly 2 GB media files (zipped up) to customers after purchase. Any advice on how to deliver such big files to the general population (translated: novice internet users who won't be savvy enough to use FTP or the like)?
We can build a download manager for Windows users, but I doubt we'll be able to get one for Mac/Linux. Is there a standard solution I don't know about?
Thanks!
For most users on a high speed internet connection, novice or not, a direct HTTP download link is likely sufficient. Just be sure that your HTTP responses for both HEAD and GET return the Content-Length header so that users get an accurate progress bar for their download.
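For illustration, a hypothetical download endpoint that keeps HEAD and GET consistent (IIS already does this for plain static files; it only matters if you serve downloads through your own handler):

using System.IO;
using System.Web;

public class DownloadHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext ctx)
    {
        var file = new FileInfo(ctx.Server.MapPath("~/downloads/media.zip"));
        ctx.Response.ContentType = "application/zip";
        ctx.Response.AddHeader("Content-Length", file.Length.ToString());
        if (ctx.Request.HttpMethod != "HEAD")      // HEAD gets headers only, no body
            ctx.Response.TransmitFile(file.FullName);
    }

    public bool IsReusable { get { return true; } }
}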
In my opinion, the only other reasonable option for novices is probably a download manager. You could of course build your own (possibly using a product like REALbasic to quickly code for all 3 platforms).
There are a number of companies out there with off-the-shelf "download assistants" as well. You may want to take a look at what companies like Adobe are using for their software downloads.
EDIT: Turns out Adobe uses a custom AIR application for their "download assistant" which is a cross platform option as well.
I'd say offer them as a .torrent file. That way people can continue where they left off and don't have to start over. You can also divide the file into a set of RAR parts (.r01-.r20), which helps with distribution. The bottom line is that you don't want people to have to keep starting over; that gets frustrating. A .torrent is viable, especially if you don't want to use FTP.
Windows doesn't have a built-in .torrent handler, but I'm sure Linux does; I'm not sure about OS X.

Options for uploading files bigger than 2 GB using a web browser

Good day!
I'm looking for options for uploading really big files (over 2 GB) through a web browser. I know that Java applet solutions will work, and I know (and have tested myself) that Flash has an internal limitation of about 2 GB. What about Silverlight? Have I missed some way or technology of doing this?
Thanks in advance!
To my knowledge, uploads in .NET 4 are limited to 2 GB. The limit is set in web.config via maxRequestLength, which is specified in kilobytes and maxes out at 2097151 KB (about 2 GB):
<system.web>
  <httpRuntime maxRequestLength="2097151" />
</system.web>
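If you're running on IIS 7 or later, note that request filtering imposes its own cap via maxAllowedContentLength, which is specified in bytes and defaults to about 30 MB, so you would need to raise it as well:

<system.webServer>
  <security>
    <requestFiltering>
      <!-- value in bytes; default is 30000000 (~28.6 MB) -->
      <requestLimits maxAllowedContentLength="2147483647" />
    </requestFiltering>
  </security>
</system.webServer>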
OK, so there's another idea: you can upload files in chunks.
There's a project on CodePlex that might be of use to you.
(For Flash) Split the file into fixed-size chunks (maybe 10-50 MB each) as byte arrays in the Flash client; that's not too hard with the ByteArray class.
Now you can upload each chunk and the server can puzzle them back together. Another plus is that if the client is ever disconnected, the server knows which parts of the file the user has already sent, and the user can continue from almost where he left off; see the server-side sketch below.
You could even send multiple chunks at once (between 2 and 4; each browser has a different max connection count), gaining better network utilization.
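A minimal server-side sketch of that reassembly idea, as a hypothetical ASP.NET handler (the upload.ashx endpoint and the name query parameter are assumptions; chunks are appended in arrival order, so this simple version requires the client to send them sequentially):

using System.IO;
using System.Web;

public class ChunkUpload : IHttpHandler
{
    public void ProcessRequest(HttpContext ctx)
    {
        // e.g. POST /upload.ashx?name=big.iso with the chunk bytes as the body
        string name = Path.GetFileName(ctx.Request["name"]);
        string temp = Path.Combine(ctx.Server.MapPath("~/App_Data/uploads"), name + ".part");

        using (var dst = new FileStream(temp, FileMode.Append))
            ctx.Request.InputStream.CopyTo(dst);   // append this chunk

        // Report how many bytes we have so far, so a reconnecting client
        // knows exactly where to resume.
        ctx.Response.Write(new FileInfo(temp).Length.ToString());
    }

    public bool IsReusable { get { return true; } }
}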
You can split the file into parts using 7-Zip, then upload the parts as usual.

Convert PCL to image

I'm communicating with a logic analyzer (HP 1660A) over RS-232. I issue a command which tells the analyzer to print-screen its display and send it to the controller (my PC) over the serial connection. I'm saving the result (usually about 25 kB) to my computer, and I would like to view it as a TIFF or another image format. The problem is that the response from the analyzer comes in PCL format, suitable for sending straight to a printer but not for opening as an image. I have tried a few PCL-to-image converters; I found one which does the job properly, but I've only used the trial version and I am reluctant to purchase it. That's the background of my labour. I would appreciate any kind of help: a reference for the commands in PCL 1, and guidance on how to extract the data from the PCL file and format it properly. I have no experience with PCL or image processing whatsoever, so please give me a hand here. Thank you.
P.S. I've obtained the PCL file from the analyzer in both C# and MATLAB. I have one slight problem in C# with the serial port control: some images end up with uninterpreted characters when run through the above converters. I mention all this because I need an algorithm or some indications, no matter the programming language, so please feel free to post.
PCL is complex to read. There are only a handful of tools out there that do a good job of it. We have lots of PCL expertise and still often look to others to supply conversion to PDF and other formats. If the PCL is quite simple - that is, just text, a few fonts, and a graphic or two - a couple of regex passes could deal with extracting the text, and then you could mock up a new document using whatever tools you wish.
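For that simple, text-only case, a rough sketch of such a stripping pass (hypothetical, and for PCL 1 text streams only; it will mangle any stream that embeds binary raster or soft-font data, which would have to be skipped by honoring the byte counts in those commands):

using System.Text.RegularExpressions;

static string StripPclEscapes(string pcl)
{
    // Parameterized sequences: ESC, a char in '!'-'/', an optional group
    // char in '`'-'~', then value/parameter pairs ending in a terminator
    // in '@'-'^' (e.g. ESC&l1O, ESC*p100x200Y).
    const string parameterized = @"\x1B[!-/][`-~]?(?:[0-9+.\-]*[`-~])*[0-9+.\-]*[@-^]";
    // Two-character sequences: ESC plus one char in '0'-'~' (e.g. ESC E, the reset).
    const string twoChar = @"\x1B[0-~]";
    return Regex.Replace(pcl, parameterized + "|" + twoChar, string.Empty);
}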
Looking at these files on Stack Overflow might be tough. If you can put them on an FTP site and post a link, I can take a quick look and post my findings/thoughts here. The other option is to look to an outside tool. There are a few we've had success with. Our needs are broad, so I've settled on the one that works best with many different PCL streams (some PCL coding is better than others). As you're dealing with a known quantity of PCL, you may have a few options. Here are a few we've used and had some success with (in order of usefulness to us):
PCLWorks by PageTech (they have a GUI viewer and complete SDK)
VeryPDF PCL Converter (command line tool)
SwiftView
There are others, and even an open-source variant of Ghostscript that handles PCL (we've never had much luck with it, as the PCL we use often contains very custom fonts, symbol sets, and tons of macros, which seem to choke it):
GhostPCL
EDIT: Most recently we've been working with LincPDF (http://www.lincolnco.com/). This is also an excellent product which has one big benefit: deployment is simple. Some of the other tools have complex software installations. This solution is very easy for us to deploy as a feature in an application. It's also faster than any tool we've tested to date (at least with the PCL that we generate from our apps, which is quite complex, since it includes specialized fonts and macros).
According to the spec sheet for the HP 1660 series (PDF), it can also send TIFF, PCX, and PostScript.
Wouldn't it be easier to use TIFF?
The project was put on hold for a while, but I would like to offer a complete and usable solution.
#Adrian
You can save the image to a floppy disk - I've done that, saved it as TIFF, and everything worked fine. Unfortunately, the analyzer sends only PCL over RS-232. The idea of capturing the print-screen over the serial connection was to avoid overusing the floppy drive, which the device also needs in order to boot.
#Douglas
Thank you for your elaborate answer. I'll take a look at the tools you mention; however, my goal is to offer a complete front-end solution which yields the graphic directly. I've put some files from my tests here so you can see the complexity of the PCL constructions. Do you know of an API that I could integrate into my application to parse and interpret the PCL?
Regards,
Cosmin
We capture the serial input via a serial spooler that watches COM1:, called SSpool.exe. It redirects the PCL as input to PCLXForm, which converts it into any raster format (TIFF, JPG, PDF, BMP, etc.). We can also extract the text during the conversion, and we can extract individual raster objects from the PCL for re-arrangement in the downstream application. Our pricing model is positioned for licensees that need to convert up to 50,000 pages of invoices into indexed PDFs per month, so this type of application normally requires a custom license to get our pricing down to the required level. To do so, we often restrict the product to converting unlimited files, but only up to the 20th page of any one PCL print file. That provides enough page volume and lets us reduce the per-unit price. To demo, you would need the PCLTool SDK.

Elegant way to determine total size of website?

Is there an elegant way to determine the size of data downloaded from a website - bearing in mind that not all requests will go to the same domain you originally visited, and that other browsers may be polling in the background at the same time? Ideally I'd like to look at the size of each individual page - or, for a Flash site, the total downloaded over time.
I'm looking for some kind of browser plug-in or Fiddler script. I'm not sure Fiddler would work, due to the issues pointed out above.
I want to compare sites similar to mine for total file size - and keep track of my own site as well.
Firebug and HttpFox are two Firefox plugins that can be used to determine the size of data downloaded from a website for a single page. While Firebug is a great tool for any web developer, HttpFox is a more specialized plugin for analyzing HTTP requests and responses (with their sizes).
You can install both and try them out; just be sure to disable one while enabling the other.
If you need a site-wide measurement:
If the website is made of plain HTML and assets (CSS, images, Flash, ...), you can check how big the folder containing the website is on the server (this assumes you can log in to the server).
You can mirror the website locally using wget, curl, or a GUI-based application like SiteSucker, and check how big the folder containing the mirror is.
If you know the website is huge but not how huge, you can estimate its size. For example: www.mygallery.com has 1,000 galleries; each gallery loads an average of 20 images; every image is stored in 2 different sizes (thumbnail and full size) at an average of n kB per image; and so on.
Keep in mind that when you download or estimate a dynamic website, you are dealing with what the website produces, not with the real size of the website on the server. A small PHP script can produce tons of HTML.
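If you'd rather script it, here is a rough sketch that sums the Content-Length of a known list of resource URLs via HEAD requests (the URL list is an assumption - you'd collect it by parsing the pages - and servers that omit Content-Length or use chunked encoding report -1 here):

using System;
using System.Net;

class SiteSizeProbe
{
    static void Main()
    {
        string[] urls = { "http://example.com/", "http://example.com/style.css" };
        long total = 0;
        foreach (string url in urls)
        {
            var req = (HttpWebRequest)WebRequest.Create(url);
            req.Method = "HEAD";                       // headers only, no body
            using (var resp = (HttpWebResponse)req.GetResponse())
            {
                Console.WriteLine(url + ": " + resp.ContentLength + " bytes");
                if (resp.ContentLength > 0) total += resp.ContentLength;
            }
        }
        Console.WriteLine("Total: " + total + " bytes");
    }
}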
Have you tried Firebug for Firefox?
The "Net" panel in Firebug will tell you the size and fetch time of each fetched file, along with the totals.
You can download the entire site and then you will know for sure!
https://www.httrack.com/
