I updated my app to initialize its routes in a callback after loading the model from a database call.
Since doing this, my browser (Chrome, for development) caches data very persistently. Any tips on how to force the browser to reload with fresh data, or how to clear the session?
I'm also concerned about ensuring that deployed apps use the latest data.
"clearing cache" is not as easy as it should be. Instead of clearing cache on my browsers, I realized that "touching" the server files cached will actually change the date and time of the source file cached on the server (Tested on Edge, Chrome and Firefox) and most browsers will automatically download the most current fresh copy of whats on your server (code, graphics any multimedia too). I suggest you just copy the most current scripts on the server and "do the touch thing" solution before your program runs, so it will change the date of all your problem files to a most current date and time, then it downloads a fresh copy to your browser:
<?php
touch('/www/control/file1.js');
touch('/www/control/file2.js');
touch('/www/control/file3.js');
?>
then ... the rest of your program...
It took me some time to resolve this issue (browsers react differently to different commands, but they all check the file times and compare them to the copy already downloaded; if the date and time differ, they do the refresh). If you can't go the supposedly right way, there is always another usable solution. Best regards and happy camping. By the way, touch() or an equivalent exists in many languages, including JavaScript, bash, sh and PHP, and you can include or call it from HTML.
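For instance, the same trick can be done from a shell on the server before (or as part of) a deploy script. This is only a sketch with placeholder paths:

touch /www/control/*.js   # bump the modification time of every script in the folder

The next request for those files then sees a newer modification time and fetches a fresh copy instead of reusing the stale one.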
Related
I have a Java/Spring MVC web app using Angular as the front end. The UI application is deployed as part of the web app in the src > main > webapp folder. The problem is that when I make any changes to the CSS or HTML files, they are not reflected instantaneously. I have tried clearing the cache and doing a hard reset, but to no avail.
I also tried running the app in incognito mode, but it does not work.
Please help.
Specific resources can be reloaded individually if you change the date and time of the files on the server. "Clearing the cache" is not as easy as it should be. Instead of clearing the cache in my browsers, I realized that "touching" the cached files on the server actually changes the date and time of the source files (tested on Edge, Chrome and Firefox), and most browsers will then automatically download a fresh copy of what's on your server (code, graphics, any multimedia too). I suggest you copy the most current scripts to the server and "do the touch thing" before your program runs, so that all your problem files get the current date and time; the browser will then download a fresh copy:
<?php
touch('/www/sample/file1.css');
touch('/www/sample/file2.css');
touch('/www/sample/file3.css');
?>
then ... the rest of your program...
It took me some time to resolve this issue (browsers react differently to different commands, but they all check the file times and compare them to the copy already downloaded; if the date and time differ, they do the refresh). If you can't go the supposedly right way, there is always another usable solution. Best regards and happy camping. By the way, touch() or an equivalent exists in many languages, including JavaScript, bash, sh and PHP, and you can include or call it from HTML.
I'm wondering, when I press "Deploy" in the Google App Engine Launcher, how does it sync my changes to the actual instance? Maybe it would be better to ask specific questions :)
1) Does it only upload the delta changes (as opposed to the entire file) for changed files?
2) Does it only upload new files and changed files (i.e. it does not copy pre-existing, unchanged files)?
3) Does it delete remote files that do not exist locally?
4) Does all of this happen instantaneously for the end user once the app has finished deploying? (i.e. let's say I accidentally uploaded an insecure file that sits at example.com/passwords.txt - if #3 is true, then once I remove it from the local directory and re-deploy, it should be gone - but can I be sure it is really gone and not cached on some edge somewhere?)
If you use only the Launcher or the appcfg util, as opposed to managing your code by means of git, App Engine will keep only one 'state' of that particular version of your app and will not store any past state. So:
1) Yes, it uploads only deltas, not full files.
2) Yes, only new, modified or deleted files.
3) Yes, it deletes them if you delete locally and deploy. As Ibrahim Arief suggested, it is a good idea to use appcfg so you can prove it to yourself.
4) Here there are some caveats. With your new deploy, your old instances are sent a kill signal, and until it actually gets executed there is a time span (seconds to minutes) during which new requests could hit your previous version.
The point Port Pleco has made is also very important. You have to be careful with caching of static files. If a file is served with Expires or Cache-Control headers, it can be cached in various places, so the existence of old copies of it is completely out of your control.
Happy coding!
I'm not a google employee so I don't have guaranteed accurate answers, but I can speak a little about your questions from my experience:
1) From what I've seen, it does upload all files each time
2) See 1, I'm fairly sure everything is uploaded
3) I'm not entirely sure whether it "deletes" the files, but I'm 99% sure that they're inaccessible if they don't exist in your current version. If you want to ensure that a file is inaccessible, then you can deploy your project with a new version number, and switch your app version to the new version in your admin panel. That will force google to use all your most recent files in that new version.
4) From what I've seen, changes that are rendered/executed, like hard-coded HTML text or controller changes, appear instantly. Static files might be cached, as is normal in web development, which means you might have old versions of files saved on users' machines. You can append a query string with the version to the file name to force an update.
For example, if I had a javascript file that I knew I would want to redeploy regularly, I would reference it like this:
<script type="text/javascript src="../javascript/file.js?version=1.2" />
Then I would just increment the version number manually whenever I needed to force my users to download the new JavaScript.
We have a project service deployed on GAE. It is working fine, but sometimes we need to change a message string in an ini/properties file or replace a particular image file, and for that we redeploy the whole application every time.
So, from a user's point of view, I think there should be a way to upload a particular component/file from the GAE interface.
Use the datastore/memcache.
There is no way to update a particular file without re-deploying the application's files.
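One way to avoid redeploying for a simple message change is to keep such strings in the datastore with memcache in front of them instead of in a properties file. A rough sketch on the Python runtime (the model, key prefix and timeout below are made up for illustration):

from google.appengine.api import memcache
from google.appengine.ext import ndb

class Setting(ndb.Model):
    # One entity per configurable string; the entity's key name is the setting name.
    value = ndb.StringProperty(indexed=False)

def get_message(name):
    # Try memcache first, fall back to the datastore.
    msg = memcache.get('setting:' + name)
    if msg is None:
        entity = Setting.get_by_id(name)
        msg = entity.value if entity else u''
        memcache.set('setting:' + name, msg, time=300)  # cache for five minutes
    return msg

Editing the Setting entity in the admin console's Datastore Viewer (and flushing memcache) then changes the message without touching any deployed file.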
FWIW the entire application does not redeploy every time. The appcfg.py tool is smart enough to work out only the files that have changed and push them up, not the entire thing.
Now, the problem you might face is that when you redeploy your app, new instances are started to load the updated files, and if you had a lot of in-memory state you'd lose it.
I've just updated most of my static files, but it seems that the old versions of those files are still being served. How long does it usually take for the new versions to be served? Are there any ways to speed that up?
Are you talking about the production server?
In my project, changes usually take effect immediately. Sometimes, due to the caching framework, old static files keep being served. I'm using Django-nonrel.
If you are using Google Chrome, you can use Inspect Element to see whether the file has a Cache-Control header or not.
Also, this link will help you change default_expiration on App Engine.
Maybe it will give you some clues.
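If the headers are the problem, they are controlled from app.yaml. A minimal sketch, with example paths and times only:

# app.yaml - keep static files cacheable for a short time only
default_expiration: "10m"

handlers:
- url: /static
  static_dir: static
  expiration: "1m"   # per-handler override for files that change often

With short expirations like these, updated static files should be picked up within minutes rather than being served from old cached copies for days.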
I've found that it's usually immediate but sometimes takes about 15 minutes or so. For CSS/JS, many people append a build number to the filenames to break the cache.
I have an application for a huge business, which needs many pages, controls, etc. The .xap file easily reaches 50 MB. I notice that every time I load the page, the .xap file gets downloaded to my machine. However, my users may connect over a 3G network, so it will be very slow if we download the app every time they open the page. So I was wondering if there is some way to do the deployment similar to WPF, which only downloads to the local machine when the version has changed...
Any other suggestions to improve the loading speed are welcome.
Thanks a lot.
First and foremost, get your web server's caching headers sorted. Typically you open the ClientBin folder in IIS Manager and go to the HTTP Response Headers section. Set the expiry to something like 1 day (or, if you update during normal working hours, to 15 minutes). Note that just because the content expires doesn't mean it will be re-downloaded, but it does mean it gets checked with the server before being used: once the copy has expired, the browser informs the server of the version it currently has, allowing the server to simply respond with "go ahead and use that, it hasn't changed since the last time you checked".
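If you prefer configuration over clicking through IIS Manager, roughly the same one-day expiry can be set in a web.config dropped into the ClientBin folder (assuming IIS 7 or later; the max-age value is just an example):

<configuration>
  <system.webServer>
    <staticContent>
      <!-- Cache-Control: max-age of one day for everything in this folder -->
      <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="1.00:00:00" />
    </staticContent>
  </system.webServer>
</configuration>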
For such a large system you should seriously consider dividing the app up into multiple DLL projects. Then use the Application Library Caching feature found in the main application project's properties. You need to create the appropriate .extmap.xml file for each of your DLLs; many of the SDK and Toolkit DLLs already have them. This results in separate .zip files for these DLLs being placed in the ClientBin folder instead of being incorporated into one large XAP. That lets you separate slow-moving / never-changing code into one set of zips and more frequently changing business code into another. When you update the app, only the changed zips need to be updated, which reduces the download burden of a new version. (Note this only works with in-browser apps.)
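For reference, an .extmap.xml for one of your own DLLs looks roughly like the sketch below; the assembly name, version and public key token are placeholders (the assembly has to be strong-named for library caching to pick it up):

<?xml version="1.0"?>
<manifest xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <assembly>
    <name>MyCompany.BusinessLogic</name>
    <version>1.0.0.0</version>
    <publickeytoken>1234567890abcdef</publickeytoken>
    <relpath>MyCompany.BusinessLogic.dll</relpath>
    <extension downloadUri="MyCompany.BusinessLogic.zip" />
  </assembly>
</manifest>

With that in place alongside the DLL, the build produces MyCompany.BusinessLogic.zip in ClientBin instead of embedding the DLL in the XAP.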
In the Silverlight project options, check "Reduce XAP size by using application library caching".