I am running a Next.js application, and I observed that JavaScript resources are not getting gzipped on mobile (on desktop it works fine). On further debugging, I found that this is because the request headers on mobile do not include the Accept-Encoding header. These JS files are chunks (bundles of JS code) downloaded via a <script> tag. I understand that there is no way to add request headers to a <script> tag, but I can see that the Accept-Encoding header is present when running the app at desktop resolution.
So I want to understand: where is this request header (specifically Accept-Encoding) added to outgoing JavaScript file requests? Is it added automatically by the browser when making the request, or by Next.js, or by webpack (embedded when the HTML page is first sent to the client)? And how can I fix this issue? Note that I observe this behaviour even on my local machine running on localhost:3000.
That header is set by the client, and it tells the server which content encodings (compression algorithms) the client can understand. The client will only get the gzipped files if it declares that it can accept them and the server is able to serve them (Next.js has gzip compression on by default).
MDN docs: Accept-Encoding
Next.js docs: gzip
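A quick way to see this negotiation in action is to request the app with and without the header and compare the Content-Encoding of the responses. A minimal sketch, assuming a Next.js server on localhost:3000 (the path '/' is a placeholder; point it at one of your chunk URLs):

// Compare Content-Encoding with and without Accept-Encoding.
const http = require('http');

function check(acceptEncoding) {
  const headers = acceptEncoding ? { 'Accept-Encoding': acceptEncoding } : {};
  http.get({ host: 'localhost', port: 3000, path: '/', headers }, (res) => {
    console.log(
      `Accept-Encoding: ${acceptEncoding || '(none)'} -> ` +
      `Content-Encoding: ${res.headers['content-encoding'] || '(identity)'}`
    );
    res.resume(); // drain the body so the socket is released
  });
}

check('gzip'); // should come back gzipped
check(null);   // no header declared, so the server must send it uncompressed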
Related
I made a simple weather app using React. It was working well, but when I hosted it on GitHub it gives the following error and does not work properly. Does anyone know how to resolve this error?
Mixed Content: The page at '' was loaded over HTTPS, but requested an insecure resource ''. This request has been blocked; the content must be served over HTTPS.
To fix the error you need to make sure that all external resources are loaded over HTTPS.
So in this instance, you just need to change the openweathermap.com resource to load over https rather than http. If you leave it as http, it simply won't be loaded, and you'll end up with a ton of JS errors because the code never arrived.
Use https instead of http as the protocol of your base URL.
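For instance, assuming a typical OpenWeatherMap call (the query and API key here are placeholders), only the protocol needs to change:

// Blocked as mixed content when the page itself is served over HTTPS:
// fetch('http://api.openweathermap.org/data/2.5/weather?q=London&appid=KEY')

// Same request over HTTPS; the browser allows it:
fetch('https://api.openweathermap.org/data/2.5/weather?q=London&appid=KEY')
  .then((res) => res.json())
  .then((data) => console.log(data));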
I'm using React and Axios to send API requests to my server, but I keep getting the same error:
Failed to load http://***: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://localhost:3000' is therefore not allowed access. The response had HTTP status code 400.
I'm trying to perform a POST request. I've also tried a Chrome plugin that allows CORS; it worked for GET requests, but apparently not for POST.
If I make requests to https://jsonplaceholder.typicode.com/users they work fine, so I guess there's something wrong with the server.
My server runs Nginx on CentOS 7.
Q: How is it possible to enable CORS just for my local development (localhost) or specific websites?
EDIT: I have already tried using this config on my Nginx server - without luck: https://enable-cors.org/server_nginx.html
While I can't answer with specific code, here's what happens (at least what happened the last time I tried Angular and had similar issues):
Before sending any further requests, the browser sends a headers-only OPTIONS HTTP request (the preflight) to the server URL. When answering this call, the server is supposed to send an Access-Control-Allow-Origin header naming the origin(s) allowed to make further calls to the API. To whitelist all requests in your dev environment it should be enough to configure Nginx to answer with Access-Control-Allow-Origin: *.
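Whatever the server technology, the preflight response just has to carry those headers. Here is a minimal sketch of the idea as a plain Node server (dev only; the port and response body are illustrative, and the Nginx config linked above emits the equivalent headers):

const http = require('http');

http.createServer((req, res) => {
  // Dev only: allow any origin. Lock this down for production.
  res.setHeader('Access-Control-Allow-Origin', '*');
  if (req.method === 'OPTIONS') {
    // Answer the preflight: declare the allowed methods and headers.
    res.setHeader('Access-Control-Allow-Methods', 'GET, POST, OPTIONS');
    res.setHeader('Access-Control-Allow-Headers', 'Content-Type');
    res.writeHead(204);
    return res.end();
  }
  res.end('{"ok":true}');
}).listen(8080);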
For development I use the Firefox extension CORS Everywhere. It modifies all web traffic to include the correct CORS headers. (It works at least with the somewhat dated Firefox in openSUSE 42.3.)
https://addons.mozilla.org/en-US/firefox/addon/cors-everywhere/
Note that this subverts a security mechanism of the browser.
For deployment you must configure the server to send the correct CORS headers. (I never did this myself; the finished website is planned to run on a single host.)
If you have access to the server and the server is using Node.js, this should work for you:
cd into your server folder:
cd server-folder
Then run this command to install the cors package:
npm install cors
To access this package, open your server file and add, on the next available line:
const cors = require('cors');
Next, add this line to use the middleware (assuming you are using Express):
app.use(cors());
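Put together, a minimal sketch of the whole thing (the route and origin list are placeholders; restrict origins rather than allowing everything):

const express = require('express');
const cors = require('cors');

const app = express();

// Allow only the origins that should reach the API.
app.use(cors({ origin: ['http://localhost:3000', 'https://www.example.com'] }));

app.post('/api/data', (req, res) => {
  res.json({ ok: true }); // the middleware adds the CORS headers
});

app.listen(8080);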
The users of our website run our Chrome plugin which, amongst other things, performs cross-origin requests via XMLHttpRequest as described on the Chrome extension development pages. This has been running just fine for a few years now. However, ever since our users upgraded to the latest version of Chrome (v38), these requests have failed. Our site runs on HTTPS and some of the URLs loaded via our content script are on HTTP. The message is:
[blocked] The page at 'https://www.ourpage.com/' was loaded over HTTPS, but ran insecure content from 'http://www.externalpage.com': this content should also be loaded over HTTPS.
The reported line where the error occurred is in the content script where I'm issuing the HTTP call:
xhr.send(null);
I have no control over the external page and I would rather not remove SSL from our own page. Question: Is this a bug or is there a workaround that I am not aware of?
(Note: The permissions in the manifest were always set to <all_urls> which had worked for a long time. Setting it to http://*/ and https://*/ did not help.)
If possible, use the https version of that external page.
If that is not possible, use the background page to handle the AJAX request (example).
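A sketch of the background-page approach (Manifest V2 era, matching the Chrome version in the question; the message name and URL are illustrative):

// content-script.js: ask the background page to make the request.
chrome.runtime.sendMessage(
  { type: 'fetchExternal', url: 'http://www.externalpage.com/data' },
  (response) => console.log('got', response)
);

// background.js: not subject to the page's mixed-content rules.
chrome.runtime.onMessage.addListener((msg, sender, sendResponse) => {
  if (msg.type === 'fetchExternal') {
    const xhr = new XMLHttpRequest();
    xhr.open('GET', msg.url);
    xhr.onload = () => sendResponse(xhr.responseText);
    xhr.send(null);
    return true; // keep sendResponse usable for the async reply
  }
});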
I am using Django for the REST API and AngularJS for the front-end.
I have set the header in Django to give API access to a domain (www.example.com).
The code is working fine; I am able to make Ajax calls from my system (OS: macOS; browsers: Firefox, Chrome, Safari, Android Chrome, native browser). Almost everything works.
Now suddenly I am getting this error on a specific OS:
XMLHttpRequest cannot load http://www.apicalls.in/.
No 'Access-Control-Allow-Origin' header is present on the requested resource.
Origin 'http://www.example.com' is therefore not allowed access.
I am getting this problem on the Windows 10 Chrome browser (both OSes have the same Chrome version, 46.x.x.x). I am unable to understand this behavior, since the API works fine on all the other devices with the same browser. Has anyone faced this problem before?
If your server is configured correctly (I mean, it respects the CORS specification), then it may be a cache problem.
If you have retrieved http://www.apicalls.in/ in your browser before your Ajax call (that is to say, you have triggered a GET request to http://www.apicalls.in/ with no Origin header in the request),
then your server would have served the page with no Access-Control-Allow-Origin header in the response. Your next Ajax request to the same URL would hit the browser cache and be blocked by the browser because of the same-origin restriction.
To fix this, you could add a random param to your request URL (e.g. http://www.apicalls.in/?_=123), or open the Chrome developer tools and check "Disable cache". Good luck :)
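With axios, the cache-busting variant could look like this (a throwaway query param makes each URL unique, so the cached CORS-less response is never reused):

import axios from 'axios';

axios
  .get('http://www.apicalls.in/', {
    params: { _: Date.now() }, // becomes e.g. ?_=1446714000000
  })
  .then((res) => console.log(res.data));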
The App Engine documentation mentions that, if the request comes in with the Accept-Encoding header set, it will automatically compress the response.
But when I look at the request, the header is not there, even though the browser sets it. When I try to set the header explicitly (with jQuery's ajax function), I get this message:
Refused to set unsafe header "Accept-Encoding"
This situation does not occur when working on localhost: there the request does have the Accept-Encoding header. It happens only after publishing. The refusal to set Accept-Encoding explicitly, however, happens always.
I searched everywhere but couldn't find an explanation for the problem. It would be really helpful if someone could explain.
You have two different problems:
App Engine does not compress the reply. GAE uses a number of factors to determine whether a response needs to be compressed; it takes the content type and user agent into account when deciding. See the answer by Nick Johnson (from the GAE team).
jQuery refuses to set the Accept-Encoding header. Note that this is a restriction the browser enforces (Accept-Encoding is an "unsafe" request header that only the browser may control) and has nothing to do with GAE. See this: Is it possible to force jQuery to make AJAX calls for URLs with gzip/deflate enabled?
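For illustration, this is the kind of call the browser rejects; the assignment is simply ignored and Chrome logs the "Refused to set unsafe header" warning (the URL is a placeholder):

const xhr = new XMLHttpRequest();
xhr.open('GET', '/some/endpoint');
xhr.setRequestHeader('Accept-Encoding', 'gzip'); // ignored: browser-controlled header
xhr.send();
// The browser alone decides which Accept-Encoding goes on the wire.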
I have a similar problem: in the HTTP request header, Accept-Encoding is null. As GAE has explained, it looks at the Accept-Encoding and User-Agent headers when deciding whether to compress, but in my case there is no way GAE can recognize whether to compress.
In the browser the header is set, but in the request that reaches the server it is not.