Understanding Silverlight, MEF, on-demand Xap loading and caching

I'm currently using MEF successfully to load Xaps on demand in my Silverlight 4 application. I am now looking into improving performance through caching. My understanding is that MEF uses WebClient to download the Xap, which in turn uses the browser's download mechanism and is therefore subject to its caching policy.
In my testing, the results are slightly confusing and differ between browsers. Obviously, on first access with my cache cleared, the on-demand Xaps are requested from the server, and for the duration of an application session each Xap is downloaded only once. All good so far.
However, I was expecting (or at least hoping) Xaps to be cached between browser sessions as well. But no - I observe the following (using Fiddler):
Internet Explorer
If I refresh the browser (Ctrl+F5), the on-demand Xaps are not requested from the server and are loaded from the local cache. But if I restart the browser, everything is downloaded again. Boo.
Firefox and Chrome
If I refresh the page (Ctrl+F5), the on-demand Xaps are requested from the server again - no caching occurs at all. Boo. And obviously, no caching occurs if I restart the browser either.
The ideal behavior for me is for the browser, when it needs to load a Xap, to query the server with an If-Modified-Since header to see if a new version exists, and if so download it, and if not, load it from its local cache. But in none of my testing did I see an If-Modified-Since header sent to the server. So my question:
Is there any way to achieve this transparently using MEF? Or another framework? Or do I have to roll my own caching layer using isolated storage (yuck)?
Seems like on-demand Xap loading should go hand in hand with caching, so I'm surprised this doesn't just work out of the box.

OK, I figured it out just after I posted this question. I thought I'd share the solution here in case anyone else has the same problem:
I was using the built-in Visual Studio web server to host my project. It appears that it doesn't support caching at all. But as soon as I hosted my project in IIS, I saw the exact behavior I desired, specifically:
The ideal behavior for me is for the browser, when it needs to load a Xap, to query the server with an If-Modified-Since header to see if a new version exists, and if so download it, and if not, load it from its local cache.
In Internet Explorer at least, I can now see it sending If-Modified-Since headers, and receiving the 304 Not Modified response for recently accessed Xaps. Perfect!
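For anyone curious, the conditional exchange Fiddler shows on subsequent loads looks roughly like this (the Xap path and date are just placeholders for illustration):

```
GET /ClientBin/MyModule.xap HTTP/1.1
If-Modified-Since: Mon, 18 Oct 2010 09:30:00 GMT

HTTP/1.1 304 Not Modified
```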

Related

How to do performance testing for multiple users using Chrome Dev Tools for an AngularJS web site

I have developed an AngularJS Web Console. The Web Console basically creates, retrieves, and deletes Users.
I want to do performance testing on it using Chrome Dev Tools or JMeter.
If I use JMeter, how can I actually monitor the behavior of the web console itself, given that from JMeter I can only check the response time of the API?
If I use Chrome Dev Tools, how can I test it for multiple users against POST and GET operations?
For example, I have a scenario where 10 users are registering or signing in at the same time. How can I test this behaviour?
OR
50 people are creating, deleting, or retrieving a user via a form at the same time.
OR
What will be the behavior of the web console if 50 users are using it at the same time?
NOTE: The Web Console is deployed on a server. I want to test it both locally and on the server.
Need help. Thanks in advance!
Server-side performance and client-side performance are different beasts, so you can break your performance testing requirements down into two major parts:
Generate the required load against your web console using JMeter HTTP Request samplers. Make sure you configure JMeter properly to handle cookies, cache, headers, and embedded resources (scripts, styles, images). See the How to Make JMeter Behave More Like a Real Browser article for a comprehensive explanation of how to configure JMeter properly. If you need the requests to be fired at exactly the same moment, also consider the Synchronizing Timer.
As JMeter neither actually renders pages nor executes client-side JavaScript, you can check client-side performance using one of the approaches below (or any combination):
Using YSlow software
Using aforementioned Chrome Dev Tools
Using the WebDriver Sampler (which provides Selenium and JMeter integration), so you will be able to measure page rendering time. If necessary, you can add custom scripting logic using the Navigation Timing API to get extended information on page loading events in an automated manner; a rough sketch follows below.
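For instance, a minimal WebDriver Sampler script (assuming the sampler's default JavaScript language; the URL is a placeholder for your own web console) could look like this:

```javascript
// Measures how long the page takes to load and render in a real browser.
WDS.sampleResult.sampleStart()                        // start the JMeter timer
WDS.browser.get('http://your-web-console/login')      // placeholder URL - open the page
WDS.sampleResult.sampleEnd()                          // stop the timer - elapsed time includes rendering

// Optionally pull Navigation Timing data out of the browser for extra detail
var loadMs = WDS.browser.executeScript(
    'return performance.timing.loadEventEnd - performance.timing.navigationStart')
WDS.log.info('Full page load took ' + loadMs + ' ms')
```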

PDF not opening in IE in production environment with load balancer

I have a Silverlight application with a browser control that needs to use Acrobat PDF Reader to display PDFs in the browser. I am using Acrobat Reader XI and Internet Explorer as the browser. When the application is in the Stage environment, everything works fine. However, when the application is in the Production environment, the PDF does not load, or partially loads and then stops. There is no difference between the two environments except that Production uses a load balancer. The even weirder thing is that Production works and the PDF loads when we hit a specific server URL instead of the load balancer URL. Why is this happening, and better yet, how do I fix it?
Remember that Silverlight is a client technology. Although you download the initial site through the load balancer, after that everything runs on the client.
So the question is: how do you load the PDF within Silverlight?
If it is a direct URL that does not go through the load balancer, the PDF request will never hit the load balancer.
Whether Acrobat Reader opens or not depends on the response MIME type, and with PDFs that is a discussion of its own.
Here is a good Stack Overflow question with an answer:
Proper MIME media type for PDF files
HTH

Chrome desktop application for web based product

Is it possible to build a Chrome desktop application for a web-based product using Chrome web apps?
Product has following items
AngularJS --- front-end framework
Rails --- JSON communication
I have created a Chrome desktop app, which opens the site directly from an icon. It feels more like a desktop application and runs on any OS. That part is working fine.
Problem:
It always re-downloads the JS and CSS files.
How I want the Chrome desktop app to work:
When launching the Chrome desktop app, save all the assets locally.
Whenever the Chrome desktop app is launched, it should refer to the locally saved assets (I mean the AngularJS files and CSS).
Before launching, the app should ask the server whether the assets have changed. If they have changed, delete the locally saved files and save the latest ones.
If the assets have not changed, use the old asset files. In this way we can avoid loading all the files from the server on every launch.
Has anybody done this before, or does Chrome provide any options for this?
Ideas are welcome!
It's totally possible.
Read these docs: https://developer.chrome.com/apps/offline_apps
Personally, I pack the CSS and JavaScript into the Chrome app, so they never have to be downloaded on startup. But in your context it's more like a webview app with caching functions.
You can use IndexedDB or other local storage APIs to store assets on the client computer; a rough sketch follows below.
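As a very rough sketch (the database, store, and asset names here are made up, and error handling is omitted), caching one downloaded asset in IndexedDB might look like this:

```javascript
// Download an asset once and keep it in IndexedDB for later launches.
var open = indexedDB.open('asset-cache', 1);

open.onupgradeneeded = function () {
  open.result.createObjectStore('assets');             // key = asset URL, value = file contents
};

open.onsuccess = function () {
  var db = open.result;
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/assets/app.js');                   // placeholder asset URL
  xhr.onload = function () {
    db.transaction('assets', 'readwrite')
      .objectStore('assets')
      .put(xhr.responseText, '/assets/app.js');        // store under the URL as the key
  };
  xhr.send();
};
```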
You can solve this on the web app side by employing ApplicationCache, which specifically fits what you describe.
Using the cache interface gives your application three advantages:
Offline browsing - users can navigate your full site when they're offline
Speed - resources come straight from disk, no trip to the network.
Resilience - if your site goes down for "maintenance" (as in, someone accidentally breaks everything), your users will get the offline experience
The Application Cache (or AppCache) allows a developer to specify which files the browser should cache and make available to offline users. Your app will load and work correctly, even if the user presses the refresh button while they're offline.
While it is primarily an offline-fallback technique, it allows you to cache resources locally just for speedup purposes. Actually having an offline fallback is a bonus in this case.
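For illustration only, a minimal manifest (file names are placeholders) referenced from the page as <html manifest="app.appcache"> might look like this; bumping the version comment is what forces clients to re-download the assets:

```
CACHE MANIFEST
# v1 - change this comment to invalidate the cache and re-download assets

CACHE:
app.js
app.css

NETWORK:
*
```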
Actually building a Chrome app for this will probably not help - you cannot update local resources from your web app side; you would need to update your app through Chrome Web Store channels only.

Angular/Breeze app connecting to Web API throws "Access Denied" error on IE11

The Application
In short, the application surfaces data from a backend onto a web page. The client application is a Single Page Application made using AngularJS. It calls an ASP.NET Web API service located on a different domain to retrieve the data. I utilize BreezeJS on both the client application and the web service to manage this data. The client is hosted on a SharePoint Online site, and the service is hosted on IIS7.
The Error
I encounter an "Access is Denied" error when AngularJS attempts to make an XMLHttpRequest.open() call targeting this web service. However, though this seems like a simple CORS issue, it has some peculiarities which have me stumped:
Foremost, this error only occurs when using Internet Explorer 11 (or a previous version). When the application is viewed in Chrome or Firefox, it can connect and retrieve data from the service.
The web service is configured to accept the calling origin.
No network traffic is detected either by the native IE development tools or by Fiddler.
Specifically, the line: xhr.open(method, url, true); of the angular.js file throws this error.
Does anyone have any insight as to why this error occurs?
Also, if more specific details are needed I certainly can provide them. I'm not even sure where the issue might be coming from and I don't want to dump tons of irrelevant lines of code.
I think this is due to using "localhost", i.e. attempting to access a resource in the "Local intranet" zone from an origin in the "Internet" zone.
See: Access denied in IE 10 and 11 when ajax target is localhost
This is not just a localhost issue, as previously suggested. I have a production AngularJS application that is trying to POST to a public Web API 2 on a different domain. IE11, Chrome, and Firefox work without a hitch when the site is accessed externally. When accessing from a subnet that can talk directly to those servers, I get Access Denied and IE doesn't even send the request (Chrome and Firefox work flawlessly, of course). One workaround (I refuse to call this a fix) is to add the site as a Trusted Site in IE11. Even when the security settings for the Internet zone mirror the settings for Trusted Sites, I get Access Denied; I have to add the site on each internal IE system to gain access.

AJAX with manifest not working

I have a strange problem with my web app. It is an app which loads data from a database and can then work offline (using the HTML5 database feature).
But when I added a manifest file to make offline mode more powerful, the AJAX calls that load data from the server to the client stopped working.
Does anybody know what the reason for that could be? Does adding a manifest file make AJAX calls unusable?
You likely need to include any server-side AJAX URLs in the NETWORK section of your cache manifest. This should let your AJAX calls work properly. Note, however, that this will ONLY work when the user is online. If they're running the application from cache and are offline, any AJAX calls will fail since the server is unavailable.
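For example, a manifest along these lines (the /api/ path is just a placeholder for your own server-side AJAX URLs) caches the static files while whitelisting the dynamic calls; NETWORK: * would also work if you simply want every non-cached request to go to the network:

```
CACHE MANIFEST
# v1

CACHE:
index.html
app.js

NETWORK:
/api/
```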
