I'm trying to cache Silverlight XAP files using modern browsers' localStorage - not Silverlight Isolated Storage - to save download time.
Any ideas?
XAP files are automatically cached by the browser and (assuming you don't clear the cache) not downloaded again when you next run the application.
You can improve download speed by checking the "Reduce XAP size by using application library caching" option in the project settings and making sure that any third-party assemblies you use have .extmap.xml files. This will create a separate XAP file for each Silverlight project in your solution and put the third-party assemblies into zip files shared by all of the XAP files.
This should a) reduce the size of each XAP file your application needs, and b) mean that only updated assemblies and/or projects need to be downloaded the next time your application runs.
I'm actually working on an AngularJS app which consumes a lot of independent resources.
The code and the resources are versioned with Git, and the resources (images, HTML, JSON...) are organized into modules by theme.
Here is my problem: the resources use a lot of disk space in our Git repo.
So, do you know of any free versioning tool that is more efficient than Git for storing this type of file?
Thanks
Most text-based files (HTML, JSON) should be fine with Git; they won't take up too much space. For binary files, such as images, that change often, you might want to consider using Git Large File Storage (for example, after installing it, run git lfs track "*.png" and commit the resulting .gitattributes file). This should limit the space used by old revisions of binary files to the Large File Storage itself, not the development machines.
How can I include .txt files and access them within the Play Framework? I need to load some text from .txt files depending on the user's request. I'm used to accessing files from inside jars, and I'm thinking of deploying the web app as a runnable jar.
You can do exactly the same thing in Play. It doesn't matter whether your application is packaged as a jar file or not.
The resource file just needs to be on the CLASSPATH; for example, you can put it under the conf/ directory and read it with getClass().getResourceAsStream(...).
When working with Google Cloud Endpoints in an App Engine project (Eclipse-based), some files that describe the API are automatically generated every time the endpoint is edited, and for every version.
The files are *-v1.api, *-v1-rest.discovery and *-v1-rpc.discovery (the version number may change) and are placed in WEB-INF.
Should these files be committed to source control?
My impression is that if the files are automatically generated, they will always be available and there is no need to track them.
Even if I add more versions of the endpoint in the future, I need to keep all of those versions for backwards compatibility, so the .api and .discovery files will still be generated for every version.
Personally, I don't version control (or even worry about backing up) any generated files. I only worry about source and published binaries. And in theory you don't need the binary either because you should be able to recreate the binary from those source files.
We have a .NET 4.0 WinForms application that we publish with ClickOnce to the client PCs. The installation is about 80 MB. The application is available offline, and the update occurs at startup of the app using
ApplicationDeployment.CurrentDeployment.Update
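(For reference, the kind of startup update described here is typically wired up roughly as follows; this is only a sketch using the System.Deployment.Application API, not the actual custom code from the question.)

using System.Deployment.Application;
using System.Windows.Forms;

// Run at application startup: check for, download and apply a new version.
if (ApplicationDeployment.IsNetworkDeployed)
{
    var deployment = ApplicationDeployment.CurrentDeployment;
    if (deployment.CheckForUpdate())
    {
        deployment.Update();      // downloads and applies the newer version
        Application.Restart();    // restart so the updated version is loaded
    }
}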
Each time we do an update of the application everything works fine and each client gets updated. However, the application cache keeps growing in size... We noticed that more than two versions are kept in the LocalAppData folder. The size of the ClickOnce installation folder is more than 1 GB.
ClearOnlineAppCache works only for online applications, and we can't find any information on cleaning LocalAppData for offline applications.
Is there any way to manage previous versions of our application in the LocalAppData folder on our clients' PCs?
Update:
We removed our custom update code and used the update mechanism of the ClickOnce framework instead. Now old versions are removed properly and only two versions are kept in LocalAppData. I still have no idea why all versions were kept when we updated through the custom update code.
I've seen this issue before, but I clarified with the ClickOnce lead at Microsoft before answering.
It keeps two versions of the deployment, plus there are extra folders for each assembly. When processing an update, ClickOnce figures out which files have changed by comparing against the assemblies it has already cached, and it only downloads the ones that have changed. The deployment folders contain hard links to the assemblies in those separate folders. So you might see additional files, but they are not actual copies; they are links to the files in the assembly-only folders. Explorer will show each link as a file, but it isn't one. So unless you're running out of disk space and are just concerned about the folder size, be aware that the information reported by Windows Explorer may not be accurate.
There is an answer to this problem here
I wrote a function to clean old ClickOnce versions on the client side.
On my machine I freed 6 GB of space. I don't even want to know the total space used by old versions org-wide...
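To give a sense of where that space lives: the offline ClickOnce cache sits under the user's LocalAppData folder. Below is a minimal sketch that only measures it; it deliberately deletes nothing, because deciding which obfuscated version folders are safe to remove is exactly what the linked answer handles. The "Apps\2.0" path is the conventional cache location and is an assumption here.

using System;
using System.IO;
using System.Linq;

// Returns the total size, in bytes, of the per-user ClickOnce cache.
static long GetClickOnceCacheSizeBytes()
{
    // Conventional ClickOnce cache root (assumption): %LocalAppData%\Apps\2.0
    string cacheRoot = Path.Combine(
        Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData),
        "Apps", "2.0");

    if (!Directory.Exists(cacheRoot))
        return 0;

    return new DirectoryInfo(cacheRoot)
        .EnumerateFiles("*", SearchOption.AllDirectories)
        .Sum(file => file.Length);
}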
I've noticed that when deploying a XAP to my device via Visual Studio 2010, it does not clear/erase the Isolated Storage for that application. However, when using the stand-alone Application Deployment program, it does erase the Isolated Storage.
The reason I ask is that I'm using Dotfuscator and Runtime Intelligence, so I need to build, dotfuscate, and then deploy with AppDep, but then my application data is gone. I realize that I could get around this by setting up Dotfuscator to run via the command line in a post-build script and then deploying with VS, but for now let's assume that I don't want to do that.
Does anybody know how to deploy a XAP the way VS 2010 does it, so that it doesn't erase Isolated Storage? I'm hoping there's a command line program I can run.
AFAIK there is no way around this. The deployment tool first uninstalls and then re-installs the app, so isolated storage gets wiped. The same is true when deploying from Visual Studio after a Rebuild All, or a Clean followed by a Build.
There are a few ways to work around this:
Use the Isolated Storage Explorer; it lets you browse, upload, and download files in isolated storage.
Set up a WCF or web service for the app to connect to, and transfer files to and from the app through it.
Write initialization code in the app that can be triggered to create all the files you need.
I've generally been checking for DEBUG and then running a method that sets up my test data, etc.
i.e.:
#if DEBUG
SetupTestData();
#endif
//load like normal now that test data's set up.
To make this really easy, on occasion I've also just used the app itself to generate the data I want to use from then on, and then dumped it as XML (or similar) with Debug.WriteLine(...).
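For completeness, here is a minimal sketch of what a SetupTestData() helper like the one above might do on Windows Phone. The file name and contents are placeholders, and the isolated storage calls are the standard Silverlight API.

using System.IO;
using System.IO.IsolatedStorage;

// Writes a single seed file into isolated storage so the app has data to load.
private static void SetupTestData()
{
    using (var store = IsolatedStorageFile.GetUserStoreForApplication())
    using (var stream = store.CreateFile("testdata.xml"))    // placeholder file name
    using (var writer = new StreamWriter(stream))
    {
        writer.Write("<data><item>sample</item></data>");    // placeholder content
    }
}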