How to test if mod_cache works - apache2

Some information:
"Cache-Control: must-revalidate, max-age=300"
This tells mod_cache to cache the page for 300 seconds (max-age). Unfortunately, mod_cache does not understand the s-maxage directive (see http://www.mnot.net/cache_docs/#CACHE-CONTROL), which is why we must use max-age (which also tells your browser to cache).
If mod_cache understood the s-maxage option, we could use
"Cache-Control: must-revalidate, max-age=0, s-maxage=300"
which would tell mod_cache, but not the browser, to cache the page.
The question:
How do I know whether mod_cache is actually caching anything in memory at all?

With Firebug you can disable the cache entirely, so every CSS, JS, image, video, etc. is requested from the web server.
So if mod_cache is working, enable the Firebug cache again: the files that carry that header will only be requested from the web server once per max-age/s-maxage interval.

Cached 200 responses will contain an Age header, at least on Apache 2.2. I did not see the header on 304 responses.
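A quick way to check this from outside the browser is to request the same URL twice and watch the Age header grow. A minimal sketch in Python, assuming a hypothetical cached page on localhost (adjust the URL to your vhost):

    # Minimal sketch: verify mod_cache by watching the Age response header.
    # The URL is hypothetical; point it at a page your vhost caches.
    import time
    import urllib.request

    URL = "http://localhost/cached-page"

    def fetch_age(url):
        """Return the Age response header (None on a cache miss)."""
        with urllib.request.urlopen(url) as resp:
            return resp.headers.get("Age")

    first = fetch_age(URL)   # typically absent or "0" on the first hit
    time.sleep(5)
    second = fetch_age(URL)  # should be >= 5 if mod_cache served the entry
    print("first Age: %s, second Age: %s" % (first, second))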

Related

Understanding formation of Request Headers

I am running a Next.js application, and I observed that JavaScript resources are not getting gzipped on mobile (on desktop it works fine). On further debugging, I found that this is because the request headers on mobile do not include the Accept-Encoding header. These JS files are basically chunks (bundles of JS code) downloaded using a <script> tag. Now I understand that there is no way to add request headers to a <script> tag, but I can see that the Accept-Encoding header is present when running the app at desktop resolution.
So I want to understand where these request headers (specifically Accept-Encoding) are added to outgoing JavaScript file requests. Are they added automatically by the browser when making a request, or by Next.js, or by webpack (embedded when the HTML page is first sent to the client)? And how can I fix this issue? Note that I observe this behaviour even on my local machine running on localhost:3000.
That header is set by the client; it tells the server which content encodings (compression algorithms) the client can understand. So basically the client will get the gzipped files if it declares that it can accept them and the server is able to serve them (Next.js has gzip compression on by default).
MDN docs: Accept-Encoding
Next.js docs: gzip
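To confirm that it really is the request header that controls compression, you can fetch one of the JS chunks with and without Accept-Encoding and compare the Content-Encoding of the responses. A minimal Python sketch, assuming a dev server on localhost:3000 and a hypothetical chunk path (copy a real one from your network tab):

    # Fetch the same resource with and without Accept-Encoding: gzip.
    import urllib.request

    URL = "http://localhost:3000/_next/static/chunks/main.js"

    def content_encoding(accept_encoding):
        headers = {"Accept-Encoding": accept_encoding} if accept_encoding else {}
        req = urllib.request.Request(URL, headers=headers)
        with urllib.request.urlopen(req) as resp:
            return resp.headers.get("Content-Encoding", "identity")

    print(content_encoding("gzip"))  # expect "gzip" if compression is on
    print(content_encoding(None))    # expect "identity": no header, no gzip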

Angular, Chrome - all requests are stalled for about 450ms

We had an AngularJS application, then we updated our front end to Angular 7.1, and now every request in Chrome is stalled for about 450 ms, both static files and data requests.
There is no such issue in the local version of the application.
Firefox and IE do not block requests.
Do you have any idea about this? Thanks.
Tested with Cache-Control: no-store, no-cache, must-revalidate
Tested with only one request at a time
Tested with and without a proxy
(can't post images)
Old app chrome behavior:
AngularJS behavior image
New app chrome behavior:
Angular7 behavior image
New app IE behavior:
Angular7 IE behavior image
UPDATE:
Even the first request is stalled.
Waterfall looks like this:
Waterfall
Please read the following information from the link that @JonathanHamel posted:
A request being queued indicates that:
The request was postponed by the rendering engine because it's considered lower priority than critical resources (such as scripts/styles). This often happens with images.
The request was put on hold to wait for an unavailable TCP socket that's about to free up.
The request was put on hold because the browser only allows six TCP connections per origin on HTTP 1.
Time spent making disk cache entries (typically very quick.)
Additionally, Stalled/Blocking is the time the request spent waiting before it could be sent. It can be waiting for any of the reasons described for Queueing. Additionally, this time is inclusive of any time spent in proxy negotiation.
Please tell us which of these cases might apply to you.
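One way to narrow this down is to time the same request outside any browser; if the ~450 ms disappears, the delay is added by Chrome (queueing, proxy negotiation, socket limits) rather than by the server. A rough Python sketch, with a hypothetical asset URL:

    # Time the same request a few times without a browser in the loop.
    # The URL is hypothetical; use one of the affected static files.
    import time
    import urllib.request

    URL = "http://localhost:4200/main.js"

    for i in range(5):
        start = time.perf_counter()
        with urllib.request.urlopen(URL) as resp:
            status = resp.status
            resp.read()
        elapsed_ms = (time.perf_counter() - start) * 1000
        print("request %d: %.0f ms (status %d)" % (i + 1, elapsed_ms, status))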

'Access control allow origin' Not working on different system

I am using Django for the REST API and AngularJS for the front end.
I have set the header in Django to give API access to a domain (www.example.com).
The code is working fine; I am able to make AJAX calls from my system (OS: macOS; browsers: Firefox, Chrome, Safari, Android Chrome, native browser). Almost everything.
Now suddenly I am getting this error on a specific OS:
XMLHttpRequest cannot load http://www.apicalls.in/.
No 'Access-Control-Allow-Origin' header is present on the requested resource.
Origin 'http://www.example.com' is therefore not allowed access.
I am getting this problem in the Windows 10 Chrome browser (both OSes have the same Chrome version, 46.x.x.x). I am unable to understand this behavior, since the API is working fine on all the other devices with the same browser. Has anyone faced this problem before?
If your server is configured correctly (I mean, it respects the CORS specification), then it may be a cache problem.
If you have retrieved http://www.apicalls.in/ in your browser before your AJAX call (that is to say, you have triggered a GET request to http://www.apicalls.in/ with no Origin in your request header),
then your server would have served the page with no Access-Control-Allow-Origin header in the response. Your next AJAX request to the same URL would hit the browser cache and be blocked by the browser because of the same-origin restriction.
To fix this, you could add a random param to your request URL (e.g. http://www.apicalls.in/?_=123), or open Chrome's developer tools and check "Disable cache". Good luck :)
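You can reproduce the scenario outside the browser by requesting the URL once without and once with an Origin header and comparing the CORS response header. A Python sketch, using the URLs from the question:

    # Compare Access-Control-Allow-Origin with and without an Origin header.
    import urllib.request

    URL = "http://www.apicalls.in/"

    def acao(origin):
        headers = {"Origin": origin} if origin else {}
        req = urllib.request.Request(URL, headers=headers)
        with urllib.request.urlopen(req) as resp:
            return resp.headers.get("Access-Control-Allow-Origin")

    print(acao(None))                      # plain GET: header likely absent
    print(acao("http://www.example.com"))  # CORS GET: header should be set

If the two responses differ like this, having the server also send Vary: Origin tells caches not to reuse the non-CORS response.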

Google Edge Cache: is it compatible with HTTPS?

After some configuration:
setting response header Cache-control,
deploying app with a custom domain name,
I managed to leverage the server-side Edge Cache of Google Front-End for some HTTP traffic on a sample app.
The cache hits appear in the Logs console as 204, while non-cached responses are 200.
My question is: can I expect the same behavior for a company website which enforces HTTPS?
I guess it depends on how Google's distributed datacenter architecture works, and where the SSL certificates are stored, but my networking/security skills are limited.
I can confirm that the edge caching works for server-side requests served over HTTPS as well, even though I have no insight into how it works inside the GFE.
I just ran a quick query in the logs of one of our apps, with a filter set to status:204 and limited to hits on a specific servlet (so that we do not see all the static content).
I do not think there is a way to see whether the serving was HTTPS, or to add such a query filter on the logs, but I manually verified that some of these hits were served over HTTPS.
As you mentioned, cache control headers are required to get this working. Here are the cache control headers we set:
Cache-Control: public, max-age=3600
Pragma: Public
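For reference, a minimal handler sketch that sets those same headers, assuming the Python webapp2 runtime (adapt to your framework):

    # webapp2 sketch: mark a response as publicly cacheable for one hour.
    import webapp2

    class CachedPage(webapp2.RequestHandler):
        def get(self):
            # These headers make the response eligible for edge caching.
            self.response.headers['Cache-Control'] = 'public, max-age=3600'
            self.response.headers['Pragma'] = 'Public'
            self.response.write('hello, cached world')

    app = webapp2.WSGIApplication([('/cached', CachedPage)])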

App Engine Accept-Encoding

In the App Engine documentation, it is mentioned that if the request comes in with Accept-Encoding set, App Engine will automatically compress the response.
But when I look at the request, the header is not there, although the browser does set it. When I try to set the header explicitly (with the jQuery ajax function), I get the message:
Refused to set unsafe header "Accept-Encoding"
This does not happen when working on localhost - there, the request has the Accept-Encoding header. It happens only after publishing. Being unable to set Accept-Encoding explicitly, however, happens in both cases.
I have searched everywhere but could not find an explanation for the problem. It would be really helpful if someone could explain...
You have two different problems:
App Engine does not compress the reply. GAE uses a number of factors to determine whether the response needs to be compressed; it takes the content type and the user agent into account when deciding. See the answer by Nick Johnson (from the GAE team).
jQuery refuses to set the "Accept-Encoding" header. Note that this is a jQuery issue and has nothing to do with GAE. See this: Is it possible to force jQuery to make AJAX calls for URLs with gzip/deflate enabled?
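To see GAE's decision from outside the browser, you can send the request yourself with both headers GAE looks at and check the response. A small Python sketch against a hypothetical deployed app:

    # Probe compression: send Accept-Encoding plus a browser User-Agent.
    # The URL is hypothetical; use your deployed app.
    import urllib.request

    URL = "https://your-app.appspot.com/"

    req = urllib.request.Request(URL, headers={
        "Accept-Encoding": "gzip",
        # GAE also checks the User-Agent before compressing.
        "User-Agent": "Mozilla/5.0 (X11; Linux x86_64)",
    })
    with urllib.request.urlopen(req) as resp:
        print(resp.headers.get("Content-Encoding"))  # "gzip" if compressed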
I have a similar problem: in the HTTP request header, Accept-Encoding is null. As GAE explains, it looks at the Accept-Encoding and User-Agent headers when deciding whether to compress, but in my case there is no way GAE can recognize that it should compress.
The browser sets the header, but it does not appear in the request.
