So basically I have an HTTP API that I can't change to HTTPS for business reasons,
and an HTTPS client that needs to stay HTTPS for the payment gateway integration and authentication.
The API calls fail due to CORS and mixed content, so I thought of two solutions:
one: proxy the API calls from the client, so the client stays HTTPS but the API sees its traffic coming in over plain HTTP. I'm not sure if that's doable or how to do it, and I need help with it.
two: wrap the backend with an HTTPS API, but I think that might slow the web app down.
Which would be better? And if it's option one, how do I implement it?
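For what it's worth, a minimal sketch of option one, assuming (purely for illustration) that the HTTPS client is served by a Node/Express server using the http-proxy-middleware package; the /api prefix and the target URL are placeholders, not anything from the original setup:

// Hypothetical Express server that serves the HTTPS client and proxies /api
const express = require('express');
const { createProxyMiddleware } = require('http-proxy-middleware');

const app = express();

// The browser only ever calls https://<client-origin>/api/*;
// the server forwards those calls to the legacy HTTP-only API.
app.use(
  '/api',
  createProxyMiddleware({
    target: 'http://legacy-api.example.com', // placeholder for the HTTP-only API
    changeOrigin: true,                      // rewrite the Host header to the target
    pathRewrite: { '^/api': '' }             // drop the /api prefix before forwarding
  })
);

app.listen(3000);

Because the browser only ever talks to a single HTTPS origin, neither mixed-content blocking nor CORS comes into play; the plain-HTTP hop happens server to server.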
Related
I have a React app served from "https://www.domain1.com" and the backend is served on a different domain, "https://www.domain2.com". API requests made from domain1.com to domain2.com are not simple HTTP requests, since we add an Authorization header to all of them. This triggers a CORS OPTIONS (preflight) request for each call, which increases the app's latency by a heavy margin.
1. Is there any way I can merge the two domains?
2. Can I avoid the CORS OPTIONS preflight requests for these non-simple HTTP requests if I keep the domains separate?
CORS has to be handled on the backend, not in your frontend. Wherever the API is hosted, the CORS policy must be configured there to allow your front-end origin; once that is done you can call the API even though it is served from another domain.
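As a rough illustration of that backend-side setup, here is a hypothetical Node/Express middleware (the Express stack and the exact values are assumptions, not something stated in the question); the Access-Control-Max-Age header lets the browser cache the preflight response so the OPTIONS round trip is not paid on every call:

const express = require('express');
const app = express();

// Hypothetical CORS middleware for the API served from domain2.com
app.use((req, res, next) => {
  res.setHeader('Access-Control-Allow-Origin', 'https://www.domain1.com');
  res.setHeader('Access-Control-Allow-Headers', 'Authorization, Content-Type');
  res.setHeader('Access-Control-Allow-Methods', 'GET, POST, PUT, DELETE, OPTIONS');
  res.setHeader('Access-Control-Max-Age', '600'); // let the browser cache preflights for 10 minutes
  if (req.method === 'OPTIONS') return res.sendStatus(204); // answer preflights immediately
  next();
});

Preflights cannot be avoided entirely for cross-origin requests that carry an Authorization header, but caching them this way (or putting both apps behind one origin with a reverse proxy) keeps the extra latency down.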
I am working on an application to turn it into a PWA. The issue is that it calls one API over HTTP instead of HTTPS (all the other APIs are HTTPS). The problem is that the service worker does not get registered because of this single HTTP request, so the Add to Home Screen prompt does not appear. I was wondering if there is any way to work around the HTTP request so the service worker can register. My current flow is as follows:
Clear all the registered service workers on app startup.
Register the service worker after that HTTP call.
Please suggest if there is any solution to this. As per the PWA checklist, we can't have an HTTP request, so basically I am looking for a cheat that can be done.
Consider calling the HTTP service from your web server, which is already secure.
Your server would just act as a middleman in the transaction, forwarding the browser's request to the HTTP service and returning the result.
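A minimal sketch of that middleman, assuming a Node 18+ / Express server (the route name and the legacy service URL are made up for illustration):

const express = require('express');
const app = express();
app.use(express.json());

// The PWA calls this endpoint on its own HTTPS origin...
app.post('/proxy/legacy', async (req, res) => {
  try {
    // ...and the server makes the plain-HTTP call on the browser's behalf.
    const upstream = await fetch('http://legacy-service.example.com/endpoint', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(req.body)
    });
    res.status(upstream.status).json(await upstream.json());
  } catch (err) {
    res.status(502).json({ error: 'upstream service unavailable' });
  }
});

app.listen(3000);

Every request the browser sees is then HTTPS, so there is no mixed content left to block the service worker registration.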
In an effort to move a web platform to HTTPS, I ran into a problem. Here is the setup that I am using:
Python Flask-based http server is served by gunicorn
Nginx directs SSL-encrypted HTTPS traffic to the http server
The http server serves a ReactJS page
Independent Python Flask-based API server is served by gunicorn
Now I am making PUT and GET requests from the ReactJS page to the API server. Because the ReactJS page is proxied via HTTPS, any request it makes is encrypted as well, so the plain-HTTP API server ends up receiving encrypted requests it isn't set up to handle.
My question is: how can I query this API server from within an HTTPS ReactJS page?
If I got it right, the question is "how do I override the browser security that blocks me from sending http:// requests from within a page hosted at https://?"
The answer is: no way to do that.
If it is not possible to enable SSL for the API server, you can make your own server act as a secure proxy. The flow would then look like: ReactJS -> HTTPS -> your server -> HTTP -> API server.
UPD: a similar case is sending requests from HTTPS to HTTP from a Chrome extension (spoiler: no workarounds are available there either).
I have an App Engine server hosting an AngularJS application that makes CORS requests to some Cloud Endpoints APIs on another App Engine server. As per the $http service documentation, I have enabled it to send credentials in cross-domain requests by setting a default:
$httpProvider.defaults.withCredentials = true;
The front-end server has an associated custom domain with SSL support, and it makes requests via HTTPS (so both ends are HTTPS).
My goal is sending an authentication cookie to the backend in order to manage resource access authorization, but for some reason this cookie never gets sent.
I do see the cookie in the request when the two servers are running locally (frontend: http://localhost:8081, backend: http://localhost:8080), but not when they're deployed.
What am I missing there?
This is the Angular $http documentation I followed.
Try adding this header to your HTTP calls: {'Content-Type': 'application/x-www-form-urlencoded'}
If I understand correctly, you want to send an authorization cookie over CORS to a different (sub)domain?
To do this, you need to permit CORS requests on the initial page load and use 'withCredentials' as you have detailed, and there has to be a cookie valid for the targeted cross-domain call. If it's a genuinely different domain, you'll have to write the cookie in JS code; if it's a sibling subdomain, make sure the cookie domain starts with a dot (.domain.com), and the cookie will then be shared across all subdomains of that domain.
Localhost can play havoc with this kind of testing because the domain relationships are different (i.e. not sibling subdomains) - you can try using a local proxy that maps subdomains to a loopback address to reproduce the deployed setup.
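To make the moving parts concrete (the domain names are placeholders, and the Express middleware is purely hypothetical since the real backend is Cloud Endpoints; it is only there to spell out the required response headers):

// Browser side (AngularJS app): write the cookie with a leading-dot domain
// so sibling subdomains can read it; 'secure' keeps it HTTPS-only.
document.cookie = 'authToken=abc123; domain=.domain.com; path=/; secure';

// Server side (hypothetical Express middleware): with credentials,
// Access-Control-Allow-Origin must be the exact origin, never '*'.
app.use((req, res, next) => {
  res.setHeader('Access-Control-Allow-Origin', 'https://www.domain.com');
  res.setHeader('Access-Control-Allow-Credentials', 'true');
  next();
});

If those headers are missing or use '*', the browser will reject the credentialed cross-origin response even though withCredentials is set.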
I have an architecture where I'm using SailsJS (a Node.js framework) as my API server and an Angular front-end server. Each app is served from a different domain.
How can I maintain an authenticated session in the sailsJS api app between requests?
Sails has an integrated req.session object for maintaining sessions, but it doesn't seem to work out of the box when the client is served from another domain.
You need to check two things when doing cross-origin requests from your front-end app to your Sails app:
Make sure your Sails app has CORS enabled; this will ensure that the browser doesn't block the requests for security reasons.
Make sure the withCredentials flag is set in your AJAX request; this will ensure that your cookie is sent to Sails and your session is maintained.
With Restangular, you can set withCredentials to be used on all requests by default using setDefaultHttpFields; see this answer for details.
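A sketch of both pieces, assuming a Sails 0.x-style config/cors.js (newer Sails versions move this under config/security.js) and Restangular on the Angular side; the origin is a placeholder:

// config/cors.js in the Sails app
module.exports.cors = {
  allRoutes: true,                        // apply CORS settings to every route
  origin: 'https://frontend.example.com', // the Angular app's origin (placeholder)
  credentials: true                       // allow the session cookie to be sent
};

// Angular side: make Restangular send the session cookie on every request
angular.module('app').config(function (RestangularProvider) {
  RestangularProvider.setDefaultHttpFields({ withCredentials: true });
});

With both in place, the browser attaches the Sails session cookie to cross-origin requests and req.session persists between them.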