Please look at this page
(https)[url]/cart
and compare with
(http)[url]/cart
You will notice console errors on the secure version, while there are none on the insecure version. I am using Angular custom directives, and they are not resolving over the secure protocol. My assumption is that a resource is being blocked, but I cannot find that resource... any ideas?
I have also tested this locally with a self signed certificate and it works fine.
The problem is that this partial is entirely different between your http and https servers:
/partials/bonuses.html
The https version of bonuses.html is just a page with a script that sets window.location to the non-https version. As far as I know, you cannot use a full HTML document (with an <html> tag) as a directive's template, which is why Angular is throwing errors. It wouldn't make any sense to do that anyway.
You need to make sure that the https server serves the full bonuses.html partial, and not just a redirect to http.
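For illustration, a minimal sketch of what that setup presumably looks like, assuming the directive pulls the partial in via templateUrl (the directive and module names here are made up):

```javascript
// Hypothetical directive that loads the partial over the page's own protocol.
angular.module('cartApp').directive('bonuses', function () {
  return {
    restrict: 'E',
    // Over HTTPS this URL must return the real partial markup.
    // If it instead returns a full HTML page that just redirects to the
    // http:// site, Angular tries to compile that whole document as the
    // template and throws errors.
    templateUrl: '/partials/bonuses.html'
  };
});
```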
I made a simple weather app using React. It was working well, but when I hosted it on GitHub it gives this error and does not work properly. Does anyone know how to resolve it?
Mixed Content: The page at '' was loaded over HTTPS, but requested an insecure resource ''. This request has been blocked; the content must be served over HTTPS.
To fix the error, you need to make sure that all external resources are loaded over HTTPS.
So in this instance, you just need to change the openweathermap.com resource to load from https rather than http. If you leave it as http, it simply won't be loaded, and you'll end up with a ton of JS errors because the code wasn't loaded properly.
Use https instead of http as the protocol of your base URL.
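As a rough sketch (assuming a fetch-based call; the endpoint path and API key below are placeholders), the fix is just switching the scheme of the base URL:

```javascript
// Before: blocked as mixed content when the page itself is served over HTTPS.
// const BASE_URL = 'http://api.openweathermap.org/data/2.5';

// After: the request is no longer insecure, so the browser allows it.
const BASE_URL = 'https://api.openweathermap.org/data/2.5';

// The API key and city are placeholders for illustration.
fetch(`${BASE_URL}/weather?q=London&appid=YOUR_API_KEY`)
  .then((res) => res.json())
  .then((data) => console.log(data))
  .catch((err) => console.error('Weather request failed', err));
```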
When I make an $http.post request with the "withCredentials" property set to true, the request works fine in Chrome and Firefox. However, I'm getting the error below in IE:
XMLHttpRequest: Network Error 0x80070005, Access is denied.
I noticed that if I enable the "Access data sources across domains" setting in IE, the error gets resolved. However, I need to find an alternative solution, because I obviously can't ask the users to enable that setting.
I noticed that an $http.get request to the same domain works in IE with no issue; the problem is only with the $http.post request. The OPTIONS request gets a 500 Internal Server Error, and I see the request and response headers below:
Note:
I do have the necessary custom headers, and I can see them in Chrome when the OPTIONS request succeeds. The headers that I see in Chrome are listed below:
Could you please let me know if I'm missing something that would make the request work in IE without having to enable Access data sources across domains?
Internet Explorer 9 doesn't support cookies in CORS requests. The withCredentials property of the $http arguments attempts to send cookies. I don't think there's any way to fix it with headers. IE10+ should work by default, just be sure that you are not in compatibility mode. CORS isn't fully implemented in IE10 either, but the type of request you are trying to do should work.
You didn't mention what the nature of your web app is, but it impacts the type of workaround you will need for IE9. If possible, see if you can refactor your code to use a GET request instead (again, I don't know what you are trying to do via AJAX so this may be impossible).
You may be able to use Modernizr or something similar to detect if the browser supports CORS. If it is not supported, send the request without AJAX and have a page refresh.
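A rough sketch of that kind of feature detection without Modernizr (the log messages stand in for whatever AJAX and non-AJAX paths you choose):

```javascript
// IE9 lacks XMLHttpRequest.withCredentials (it only offers the limited
// XDomainRequest object), so cookies cannot be sent on cross-origin requests.
var supportsCorsWithCredentials =
  typeof XMLHttpRequest !== 'undefined' &&
  'withCredentials' in new XMLHttpRequest();

if (supportsCorsWithCredentials) {
  console.log('Cross-origin $http.post with credentials should work.');
} else {
  // e.g. fall back to a normal form submission and page refresh, as above.
  console.log('No CORS-with-credentials support; use a non-AJAX fallback.');
}
```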
Another alternative if you really want to use AJAX is to set up a proxy on your web server, i.e. the server on the same domain. Instead of making the cross-origin request directly, you make the AJAX request to your same-origin server, which then makes the request to the cross-origin server for you. The server won't have CORS issues. This solution assumes, of course, that you have some server-side scripting going on such as PHP, Node or Java.
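A minimal sketch of such a same-origin proxy, assuming a Node/Express backend running Node 18+ (the route and upstream URL are made up for illustration):

```javascript
const express = require('express');
const app = express();

app.use(express.json());

// Same-origin endpoint the Angular app calls instead of the cross-origin API.
app.post('/api/proxy/orders', async (req, res) => {
  try {
    // The server forwards the request; CORS is enforced by browsers, not servers.
    const upstream = await fetch('https://api.example.com/orders', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(req.body),
    });
    res.status(upstream.status).json(await upstream.json());
  } catch (err) {
    res.status(502).json({ error: 'Upstream request failed' });
  }
});

app.listen(3000);
```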
The users of our website run our Chrome plugin which, amongst other things, performs cross-origin requests via XMLHttpRequest as described on the Chrome extension development pages. This has been running just fine for a few years now. However, ever since our users upgraded to the latest version of Chrome (v38), these requests have failed. Our site runs on HTTPS and some of the URLs loaded via our content script are on HTTP. The message is:
[blocked] The page at 'https://www.ourpage.com/' was loaded over
HTTPS, but ran insecure content from 'http://www.externalpage.com':
this content should also be loaded over HTTPS.
The reported line where the error occurred is in the content script where I'm issuing the HTTP call:
xhr.send(null);
I have no control over the external page and I would rather not remove SSL from our own page. Question: Is this a bug or is there a workaround that I am not aware of?
(Note: The permissions in the manifest were always set to <all_urls> which had worked for a long time. Setting it to http://*/ and https://*/ did not help.)
If possible, use the https version of that external page.
If that is not possible, use the background page to handle the AJAX request (example).
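A hedged sketch of that background-page approach for a classic (Manifest V2 style) extension, with a made-up message type and URL: the content script asks the background page to make the request, since the background page is not subject to the HTTPS page's mixed-content rules.

```javascript
// content-script.js
// Ask the background page to fetch the insecure resource on our behalf.
chrome.runtime.sendMessage(
  { type: 'fetchExternal', url: 'http://www.externalpage.com/data' },
  function (response) {
    if (response && response.ok) {
      console.log('Got data:', response.body);
    } else {
      console.error('Background fetch failed');
    }
  }
);

// background.js
// Runs outside the HTTPS page, so it may load the http:// URL
// (given matching host permissions in the manifest).
chrome.runtime.onMessage.addListener(function (message, sender, sendResponse) {
  if (message.type !== 'fetchExternal') return;
  var xhr = new XMLHttpRequest();
  xhr.open('GET', message.url, true);
  xhr.onload = function () {
    sendResponse({ ok: true, body: xhr.responseText });
  };
  xhr.onerror = function () {
    sendResponse({ ok: false });
  };
  xhr.send(null);
  return true; // keep the message channel open for the async sendResponse
});
```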
An AJAX request from http://localhost:8080 to http://foo.com in my AngularJS app failed because the browser doesn't allow the cross-site request, and it says:
"Request header field Content-Type is not allowed by
Access-Control-Allow-Headers."
There are other ways to solve this, such as JSONP, or perhaps writing extra code on the server side, but I don't want to do that during the development phase.
Is there any alternative way to make cross-site requests without any server-side configuration and without writing extra code during development (not production)?
There is a pure JavaScript alternative that uses a proxy to bypass this constraint.
It is called XDomain:
https://github.com/jpillora/xdomain
Step 1: Put this proxy.html file on the root of your server project:
<!DOCTYPE HTML>
<script src="//cdn.rawgit.com/jpillora/xdomain/0.7.3/dist/xdomain.min.js" master="*"></script>
Step 2: Add this script to the client:
<script src="//cdn.rawgit.com/jpillora/xdomain/0.7.3/dist/xdomain.min.js" slave="http://foo.com/proxy.html"></script>
And that's it!
Have fun.
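Once both scripts are in place, ordinary XHR-based calls to that origin should just work, because XDomain shims XMLHttpRequest and tunnels the request through the proxy iframe. A rough example from the AngularJS side (the endpoint is made up):

```javascript
// With the slave script loaded, this cross-origin call is transparently
// routed through http://foo.com/proxy.html instead of hitting CORS checks.
$http.post('http://foo.com/api/items', { name: 'test' })
  .then(function (response) {
    console.log('Created:', response.data);
  })
  .catch(function (error) {
    console.error('Request failed', error);
  });
```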
Finally, I found that an alternative is just installing this Chrome plugin. There is no need to write any extra code, but I don't know what the security implications of using the plugin are.
I'm trying to implement a simple interceptor that allows me to display a message along the lines of "cannot contact the server" in my Angular app. However as the API is on a different host I'm dealing with CORS pre-flight OPTIONS requests.
I've found that if the API is unavailable Chrome dev tools shows a 503 on the OPTIONS request but Angular's $http interceptor catches a 404 response to the subsequent GET request. I believe this is because the OPTIONS response did not contain the required CORS headers so the GET is actually never performed.
Is it possible to intercept the OPTIONS response? If all I see is a 404, I can't distinguish "server down" from "no such resource".
You can't intercept this request by design - the browser is "checking up" on you, making sure YOU should be allowed to make the request.
We've used three solutions to work around this:
If the problem is that you're using a development environment like NodeJS and your domain names aren't matching (that is, if you normally wouldn't need to deal with this in Production), you can use a proxy. The https://github.com/substack/bouncyBounceJS NodeJS module is an easy-to-use option. Then your web service request domain will match the domain your page is on, and the check won't be triggered. (You can also use tricks like this in Production, although it can easily be abused!)
Also for temporary use, you can use something like Fiddler or Charles to manipulate the request by faking the required headers, or tell your browser not to check them (--disable-web-security in Chrome).
If you have this problem in Production, you either need to legitimately fix it (adjust the Web service handler to add the required headers - there are only two), or find a way to make the request in a way that doesn't trigger the check. For instance, if you control both the source and target domains, you can put a script on the target that makes the requests to itself. Run this in an IFRAME, invisibly. Then you can use things like postMessage() to communicate back and forth. Large services like Facebook use "XHR bridges" like this for the same reason.
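For that last approach, here is a rough sketch of such an XHR bridge, assuming you control the target domain (the file name, origins, and message shapes are all made up for illustration):

```javascript
// bridge.html script on https://api.example.com (the target domain), loaded in
// a hidden <iframe> by the app page. It makes same-origin requests and relays
// the results back with postMessage.
window.addEventListener('message', function (event) {
  if (event.origin !== 'https://app.example.com') return; // only trust the app
  var request = event.data;
  var xhr = new XMLHttpRequest();
  xhr.open(request.method, request.path, true); // same-origin, no preflight
  xhr.onload = function () {
    event.source.postMessage(
      { id: request.id, status: xhr.status, body: xhr.responseText },
      event.origin
    );
  };
  xhr.send(request.body || null);
});

// On the app page (https://app.example.com): create the invisible iframe
// and talk to it instead of calling the API directly.
var frame = document.createElement('iframe');
frame.style.display = 'none';
frame.src = 'https://api.example.com/bridge.html';
document.body.appendChild(frame);

window.addEventListener('message', function (event) {
  if (event.origin !== 'https://api.example.com') return;
  console.log('Bridge response:', event.data.status, event.data.body);
});

frame.onload = function () {
  frame.contentWindow.postMessage(
    { id: 1, method: 'GET', path: '/status' },
    'https://api.example.com'
  );
};
```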