Getting images from S3 with a cookie to display in React

I'm building an application with an Express backend and a React frontend. Users can upload images, which are stored in a private S3 bucket and served through CloudFront.
Currently, I'm using signed URLs as a two-step way to retrieve the images:
React asks the Express server for a signed URL for the image
Express responds with the URL
React then renders the <img src={signedUrl} /> tag.
This works, but I'd rather eliminate the signed-URL step and use something like cookies so the client can ask CloudFront for the image directly: <img src={cfUrl + cookies?} />. I've tested this with Postman and I can retrieve the image directly as long as I pass the cookies, so I know my cookies are set up correctly and the content is being served.
However, I'm not sure how to do it in React with axios... The application is currently on localhost, but it will be deployed to Heroku.
Server
const express = require('express');
const cors = require('cors');
// Paths assumed: awsS3 wraps the CloudFront signed-cookie generation, config holds the port.
const awsS3 = require('./awsS3');
const config = require('./config');

const app = express();

app.use(cors({
  origin: ['http://localhost:3000'],
  credentials: true,
}));

app.post('/cookies', async (req, res) => {
  // Generate the CloudFront signed cookies and set each one on the response
  const cookies = await awsS3.generateCookies();
  Object.entries(cookies).forEach(([key, value]) => {
    console.log(key, value);
    res.cookie(key, value);
  });
  res.json([]);
});

const port = process.env.PORT || config.port;
app.listen(port, () => {
  console.log(`Express server is running on port: ${port}...`);
});
What I want to do in React:
<img src={image_url} />
However, looking at the network call for the image request, the cookies are not included by default. I don't know if I need to do something special to include the cookies on that image request.
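For context, this is roughly the flow I'm attempting on the client (apiUrl, cfUrl and setImageUrl are placeholders):
// ask the backend to set the signed cookies in this browser first...
axios.post(`${apiUrl}/cookies`, {}, { withCredentials: true })
  // ...then point the <img> straight at CloudFront
  .then(() => setImageUrl(`${cfUrl}/images/photo.jpg`));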

Here is an overview of how CloudFront signed cookies work:
A user signs in to your website and either pays for content or meets some other requirement for access.
Your application returns the Set-Cookie headers in the response, and the viewer stores the name-value pairs.
The user requests a file, e.g. an image in your case.
The user's browser or other viewer gets the name-value pairs from step 2 and adds them to the request in a Cookie header. This is the signed cookie.
CloudFront uses the public key to validate the signature in the signed cookie and to confirm that the cookie hasn't been tampered with. If the signature is invalid, the request is rejected.
More details can be found here.
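For reference, a minimal sketch of what a generateCookies() helper like the one in the question could look like with the AWS SDK v3 CloudFront signer (the package, option names and resource URL are assumptions to verify against the SDK docs):
const { getSignedCookies } = require('@aws-sdk/cloudfront-signer');

// Signs a policy for the given CloudFront resource and returns the
// CloudFront-* cookie name/value pairs to set on the response.
function generateCookies() {
  return getSignedCookies({
    url: 'https://cloudfront.example.com/images/photo.jpg',          // resource covered by the policy (example)
    keyPairId: process.env.CF_KEY_PAIR_ID,                           // public key ID registered with CloudFront
    privateKey: process.env.CF_PRIVATE_KEY,                          // matching private key in PEM format
    dateLessThan: new Date(Date.now() + 3600 * 1000).toISOString(),  // expire in one hour
  });
}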
UPDATE
Setting cookies for a different domain is not allowed. However, you can scope a signed cookie to the apex domain, for example .example.com. Set this way, the cookie will be sent with every request to example.com and also to any of its subdomains, e.g. cloudfront.example.com. So we can take advantage of this and have our backend service, running on backend.example.com, create and set a signed cookie for .example.com. Later, when the frontend requests a file from cloudfront.example.com, the browser will automatically include the signed cookie with the request.
This approach requires a custom domain for the CloudFront distribution. The CloudFront docs describe how to set that up.
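A minimal sketch of that cookie-setting step in the /cookies route from the question (domain and attribute values are examples to adapt):
Object.entries(cookies).forEach(([key, value]) => {
  res.cookie(key, value, {
    domain: '.example.com', // apex domain: the browser will also send it to cloudfront.example.com
    httpOnly: true,
    secure: true,
    sameSite: 'lax',        // sufficient while the frontend and CloudFront share the same apex domain
  });
});
The request that obtains the cookies still needs withCredentials: true on the client (e.g. axios.post('/cookies', {}, { withCredentials: true })) so the browser actually stores them; after that, a plain <img src={...} /> pointing at the CloudFront domain will carry them automatically.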
Best, Stefan

Related

Is storing access token in cookie to allow for SSR dangerous?

I'm working on a project where I've got a central API server and multiple microservices for it, including a website. The website uses OpenID to handle authentication. To allow for server-side rendering on the website while keeping it stateless, I'm storing the access token in a cookie; on each page request, the server retrieves the access token from the cookie and appends it as an Authorization header. Are there any exploits that could result from this? As far as I'm aware I shouldn't have any problems with CSRF or any other exploit like it, but I haven't seen this way of handling authentication before.
Short answer: Yes
Long answer
CSRF exploits the fact that the authentication cookie is automatically attached whenever any request, from anywhere, is made to your website. You will always need to implement XSRF countermeasures on the backend plus the frontend.
Implementation
On each request the browser makes to the server, the server attaches a non-HttpOnly cookie to the response containing a CSRF token that identifies the currently signed-in user (NuGet).
public async Task Invoke(HttpContext httpContext)
{
    httpContext.Response.OnStarting((state) =>
    {
        var context = (HttpContext)state;
        //if (string.Equals(httpContext.Request.Path.Value, "/", StringComparison.OrdinalIgnoreCase))
        //{
        var tokens = antiforgery.GetAndStoreTokens(httpContext);
        httpContext.Response.Cookies.Append("XSRF-TOKEN", tokens.RequestToken, new CookieOptions() { Path = "/", HttpOnly = false });
        //}
        return Task.CompletedTask;
    }, httpContext);
    await next(httpContext);
}
Your frontend must be configured to read this cookie (this is why it's a non-HttpOnly cookie) and pass the CSRF token in the X-XSRF-TOKEN header on each request:
HttpClientXsrfModule.withOptions({
  cookieName: 'XSRF-TOKEN',
  headerName: 'X-XSRF-TOKEN'
}),
Then you need to add and configure the Antiforgery services to the ASP.NET Core application:
services.AddAntiforgery(options => options.HeaderName = "X-XSRF-TOKEN");
Now you can decorate your controller methods with the [ValidateAntiForgeryToken] attribute.
I'm using Angular, and Angular does not send the X-XSRF-TOKEN header when the request URL is absolute (e.g. starts with https://). The same may apply to React if it ships a built-in solution.
Now if you combine this with the cookie authentication provided by ASP.NET Core Identity (SignInManager.SignInAsync), you should be good to go.
Appendix
Note that all of the above is useless if you have an XSS vulnerability somewhere in your website. If you're not sanitizing (htmlspecialchars) your user input before rendering it in HTML, an attacker can inject a script into your HTML:
<div class="recipe">
<div class="title">{!! Model.UnsanitizedTitleFromUser !!}</div>
<div class="instructions">{!! Model.UnsanitizedInstructionsFromUser !!}</div>
</div>
The result could possibly be the following:
<div class="recipe">
<div class="title">Pancakes</div>
<div class="instructions">
<script>
// Read the value of the specific cookie
const csrfToken = document.cookie.split('; ').filter(function (item) { return item.startsWith('XSRF-TOKEN='); })[0].split('=')[1];
// Use it to forge a state-changing request from inside the victim's session
$.ajax({ url: '/posts/25', type: 'DELETE', headers: { 'X-XSRF-TOKEN': csrfToken } });
</script>
</div>
</div>
The injected script runs in the website context, so is able to access the csrf-cookie. The authentication cookie is attached to any webrequest to your website. Result: the webrequest will not be blocked.
Important links
ASP.NET Core docs
For React I cannot find documentation on CSRF, but the idea is explained in this answer; a sketch of the same pattern with axios follows below.
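As a rough equivalent for a React app, here is a hedged sketch using axios, whose documented defaults already match the cookie and header names used above (verify the behaviour against the axios docs for your version; older versions only attach the header on same-origin requests):
import axios from 'axios';

// axios reads the non-HttpOnly XSRF-TOKEN cookie and echoes it back
// as the X-XSRF-TOKEN header on each request it makes.
const api = axios.create({
  baseURL: '/',                  // same-origin API assumed
  withCredentials: true,         // include the authentication cookie
  xsrfCookieName: 'XSRF-TOKEN',
  xsrfHeaderName: 'X-XSRF-TOKEN',
});

api.delete('/posts/25'); // the antiforgery header is attached automatically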
More info
A hacker could try to send you an email with a link to a Facebook URL. You click the link, the browser opens, and the authentication cookie for facebook.com is automatically attached. If this GET request consequently deletes posts from your timeline, then the hacker made you do something without you realizing it.
Rule of thumb: Never change state (database, login, session, ...) on a GET-request.
A second way a hacker could try to trick you is by hosting a website with the following HTML:
<form action="https://facebook.com/posts" method="POST">
  <input type="hidden" name="title" value="This account was hacked">
  <input type="hidden" name="content" value="Hi, I'm a hacker">
  <input type="submit" value="Click here and earn 5000 dollars">
</form>
You only see a button on a random website with an appealing message; you decide to click it, but instead of receiving 5000 dollars, you're actually placing posts on your Facebook timeline.
As you can see, this is totally unrelated to whether you're hosting a single-page or MVC application.
Defense
MVC applications
In MVC websites, the usual practice is to add a hidden input containing a CSRF token. When the page is rendered, ASP.NET Core generates a CSRF token that represents your session (so if you're signed in, that's you). When the form is submitted, the CSRF token in the POST body must carry the same identity as the one in the cookie.
A hacker cannot generate this token from his website or his server, since he isn't signed in with your identity.
(However, I think a hacker would be perfectly capable of sending an AJAX GET request from his website while you're visiting it, then trying to extract the token returned by your website and appending it to the form. This can in turn be prevented by excluding the GET requests that return a form containing a CSRF token from CORS: basically, don't put Access-Control-Allow-Origin: * on any URL that returns a CSRF token.)
Single-page applications
This is explained at the top: on each request made to the server, the server attaches a non-HttpOnly cookie to the response containing the CSRF token for the current user session.
The SPA is configured to read this XSRF-TOKEN cookie and send the token back as the X-XSRF-TOKEN header. AFAIK, the cookie can only be read by scripts from the same website, so other websites cannot host a form containing this token field for someone else's identity.
Although the XSRF-TOKEN cookie is also sent along to the server, the server doesn't process it; the cookie value is not read by ASP.NET Core for anything. So when the header containing a correct token is present on the request, the backend can be sure that the request was sent by your React (or in my case Angular) app.
Spoiler
In ASP.NET Core, the Identity does not change during a request. So when you call your login endpoint, the middleware provided in this answer will return a CSRF token for the not-signed-in user. The same goes for when you log out: that response will contain a cookie with a CSRF token as if you were still signed in. You can solve this by creating an endpoint that does absolutely nothing and calling it each time after a sign-in/out is performed. Explained here.
Edit
I did a little test, and this image basically summarises everything from the test:
From the image you can read the following:
When visiting the index page from app4, a cookie is returned (non HttpOnly, SameSite.Strict)
app5 hosts a javascript file which can do anything this website owner wants
app4 references this script hosted by app5
The script is able to access the non HttpOnly cookie and do whatever it wants (send an ajax call to its server, or something rogue like that)
So storing the token in a non-HttpOnly cookie is only fine if the scripts you include (jQuery, AngularJS, React, Vue, Knockout, the YouTube iframe API, ...) do not read this cookie (they can, even when the script is included with the <script> tag) AND you are certain that your website is fully protected against XSS. If an attacker were somehow able to inject a script (which he hosts himself) into your website, he would be able to read all non-HttpOnly cookies of its visitors.
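For illustration, a hedged sketch of the point above (the script URL is hypothetical):
// widget.js, hosted by a third party and included in your page via
// <script src="https://cdn.third-party.example/widget.js"></script>.
// It runs in your page's origin, so it can read every non-HttpOnly cookie:
console.log(document.cookie); // e.g. "XSRF-TOKEN=...; theme=dark"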

How to allow express backend REST API to set cookie in a react frontend which is using axios?

Backend
I am trying to make JWT cookie-based authentication work. I am currently setting the following cookie as part of a login route in the backend API.
res.cookie('authCookie', token, {maxAge: 900000, httpOnly: true});
Later, when authenticating any other request, I read this cookie and verify it in a passport-jwt strategy.
I have gotten this to work in Postman - when I perform a login and then access a secured route, it works perfectly fine, and the cookie also gets set in Postman.
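For context, a minimal sketch of the strategy side (the cookie name is from this question; the secret and user lookup are placeholders):
const passport = require('passport');
const { Strategy: JwtStrategy } = require('passport-jwt');

// Pull the JWT out of the httpOnly cookie instead of the Authorization header
// (requires app.use(cookieParser()) so req.cookies is populated).
const cookieExtractor = (req) => (req && req.cookies ? req.cookies['authCookie'] : null);

passport.use(new JwtStrategy(
  { jwtFromRequest: cookieExtractor, secretOrKey: process.env.JWT_SECRET },
  (payload, done) => done(null, payload) // a real app would load the user here
));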
Frontend
Now, I am making the following calls in the frontend just to test that it works:
axios.post("http://localhost:3001/login", logInParams, config)
  .then(result => {
    // User is logged in so push them to the respective dashboard
    console.log(result);
    axios.get("http://localhost:3001/user/profile")
      .then(result => {
        console.log(result);
      })
      .catch(err => {
        console.log(err);
        return;
      });
  })
  .catch(err => {
    console.log(err);
    return;
  });
So basically, I log the user in and that works perfectly fine - I get a JSON response as intended - but the call is supposed to set a cookie, which it does not, and hence the next axios.get call does not return successfully. For some reason I do see a session cookie and a CSRF cookie; only this authCookie (the JWT cookie) is not getting set.
Some extra details
I am using cors with default parameters - could this be the cause? Or are there any changes I have to make to axios? I have seen some answers, and being new to MERN I don't really understand them. Has someone experienced this and solved it?
Are you running the server on a different port than the one that serves the client (i.e. webpack-dev-server on localhost:3000 and the Express server on localhost:3001)? This looks like a same-site cookie issue. In the latest versions of some browsers, such as Chrome, setting the cookie is blocked when it comes from a cross-origin site. This is due to security concerns; you can learn more about same-site cookies here.
The change made in the browsers is related to the default value they give to a cookie policy property called SameSite. The old behaviour was to treat cookies as SameSite=None by default, but last year it changed to treat them as SameSite=Lax when no SameSite policy is explicitly provided by the server. This is desirable behaviour from the browser because it helps prevent third-party sites from making use of the cookie provided by the server, but you can work around it in different ways:
Changing the default same-site policy of your browser settings (the article about same site cookies explains).
Sending SameSite=None from the server. Express has a way to do so, explained in its docs (see the sketch below). Keep in mind that browsers also have a new policy to ignore SameSite=None when the cookie is not marked as Secure, which demands the use of HTTPS (I guess this behaviour can be edited in your browser settings if you want to check while using HTTP).
Obviously, any strategy that demands users change their browser settings is a no-go, so running HTTPS with Secure cookies is mandatory for SameSite=None.
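For reference, a minimal sketch of that option in Express (the cookie name and token come from this question; the attribute values are what matters):
res.cookie('authCookie', token, {
  maxAge: 900000,
  httpOnly: true,
  secure: true,      // browsers require Secure (HTTPS) when SameSite is 'none'
  sameSite: 'none',  // allow the cookie to be sent on cross-site requests
});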
You always have the option of making the server and client same-origin, so you won't have any same-site issues at all (i.e. the Express server also returns the index.html of the production build of your client as its main static asset). I haven't found any way to configure the CORS module to set a default same-site cookie policy, or to use it solely as middleware to change it (likely not its purpose), but you can try adding a dynamic origin config.
As far as I've seen, Postman does not support the SameSite cookie property yet, which would explain why it works in Postman but not in the browser.
From the looks of it, this seems to be an issue with how cors works, and I am adding the following answer to help anyone else coming across it. Thank me later :)
Server Side
You will have a cors setup in your server that looks like this:
app.use(cors());
You will have to set credentials to true and set the allowedHeaders and origin as follows,
app.use(cors({
  credentials: true,
  allowedHeaders: ['Content-Type', 'Authorization'],
  origin: ['http://localhost:3000']
}));
This is needed because, by default, the browser will not accept cookies from a cross-origin response, and a server and client running on different ports count as different origins. The configuration above handles this on the server side.
Client Side
We also have to send the cookies along with each request. To do this with axios, just add the following in the index.js of your React app:
axios.defaults.withCredentials = true;
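If you prefer not to set it globally, the same flag can be passed per request, e.g. for the calls from the question:
axios.post("http://localhost:3001/login", logInParams, { ...config, withCredentials: true });
axios.get("http://localhost:3001/user/profile", { withCredentials: true });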
I think you should also call .send('cookies are set') at the end, i.e. res.cookie('authCookie', token, {maxAge: 900000, httpOnly: true}).send('cookies are set'), so that the response is actually sent.

Laravel 7 Sanctum: Same domain (*.herokuapp.com) but separate React SPA gets CSRF Token Mismatch

I've read a lot on this forum and watched a lot of tutorial videos on how to connect a separate React/Vue SPA to a Laravel API with Sanctum auth, but none of the solutions worked for me. This is for my school project.
So here's what I did so far.
I created two folders, one for the API and one for the frontend. I installed Laravel in the api folder and created the React app in the frontend folder. Both are Git-initialized and have their own GitHub repositories. Also, both are deployed to Heroku.
API
Repository: https://github.com/luchmewep/jarcalc_api
Website: https://jarcalc-api.herokuapp.com
Front-end
Repository: https://github.com/luchmewep/jarcalc_front
Website: https://jarcalculator.herokuapp.com
On local, everything runs fine. I can set error messages on the email and password fields on the frontend, which means I have received and sent the laravel_session and XSRF-TOKEN cookies. I have also displayed the authenticated user's information on a dummy dashboard, so everything works fine locally.
On the internet, both my apps run but won't communicate with each other. According to the official documentation, they must at least be on the same domain, and in this case they are subdomains of the same domain, which is .herokuapp.com.
Here are my environment variables for each Heroku app.
API
SANCTUM_STATEFUL_DOMAINS = jarcalculator.herokuapp.com
(I've tried adding "SESSION_DRIVER=cookie" and "SESSION_DOMAIN=.herokuapp.com" but still not working!)
Update
I found out that axios is not carrying the XSRF-TOKEN cookie when making the POST request to /login. It is carried automatically in local testing.
Here is the relevant code:
api.tsx
import axios from "axios";

export default axios.create({
  baseURL: `${process.env.REACT_APP_API_URL}`,
  withCredentials: true,
});
Login.tsx
...
const handleSubmit = (e: any) => {
  e.preventDefault();
  let login = { email: email.value, password: password.value };
  api.get("/sanctum/csrf-cookie")
    .then((res) => {
      api.post("/login", login)
        .then((res) => {
          /**
           * goes here if login succeeds...
           */
          console.log("Login Success");
          ...
        })
        .catch((e) => {
          console.log("Login failed...");
        });
    })
    .catch((e) => {
      console.log("CSRF failed...");
    });
};
UPDATE
".herokuapp.com is included in the Mozilla Foundation’s Public Suffix List. This list is used in recent versions of several browsers, such as Firefox, Chrome and Opera, to limit how broadly a cookie may be scoped. In other words, in browsers that support the functionality, applications in the herokuapp.com domain are prevented from setting cookies for *.herokuapp.com."
https://devcenter.heroku.com/articles/cookies-and-herokuapp-com
(Screenshots: cookies on local vs. cookies on deployed.)
Explanation: Although the API and the frontend both end in .herokuapp.com, that does not put them on the same domain. It is explained in Heroku's article above: all requests between *.herokuapp.com apps are considered cross-site instead of same-site.
SOLUTION
Since the laravel_session cookie is being carried by axios, the only problem left is the XSRF-TOKEN cookie. To solve the problem, you must get your own domain name and set a subdomain for each app. In my case, my React frontend is now at www.jarcalculator.me while my Laravel backend is now at api.jarcalculator.me. Since they are now same-site regardless of where they are deployed (React moved to GitHub Pages while Laravel stayed on Heroku), the cookie can be set automatically.
I finally fixed my problem by claiming my free domain name via the GitHub Student Pack. I set my React app's domain to www.jarcalculator.me and my Laravel app's domain to api.jarcalculator.me. Since they are now subdomains of the same domain, jarcalculator.me, passing the cookies that contain the CSRF token and the laravel_session token is automatic. No modification of the axios settings is needed; setting axios' withCredentials to true is all you need to do.
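Presumably the analogous config vars with the custom domain look something like this (an assumption to verify against the Sanctum SPA docs; adjust the domains to your own):
SESSION_DOMAIN=.jarcalculator.me
SANCTUM_STATEFUL_DOMAINS=www.jarcalculator.me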

Unable to set cookies in Chrome using Flask-JWT-Extended, React, and Axios

Background and Issues
I have a Flask back-end running in localhost:5000 and a React SPA running on localhost:3000.
I was able to make them talk to each other, but when trying to store the token generated by Flask in the browser's cookies: 1) the response headers do not contain any cookies when doing console.log(response) after a successful POST from axios, and 2) the cookies are not being set. However, when inspecting Network > Login.js headers, I can actually see the Set-Cookie key in the response headers. I've tried multiple solutions from Google and StackOverflow, but none seems to work here, and I really can't figure out what is going on, as the request is made successfully and Chrome allows third-party software to set cookies. I can even see the tokens in the Network > Login.js headers.
Steps
1) Users enters in their username and password and hit login.
2) Axios POST call is made to Flask's back-end.
3) Process the data and generates a couple of tokens and set them into cookies.
4) The browser's cookies are set with a few tokens. <- this part is not working.
Code
Flask back-end token generation using flask-jwt-extended
# app_config related to flask-jwt-extended
CORS_HEADERS = "Content-Type"
JWT_TOKEN_LOCATION = ["cookies"]
JWT_COOKIE_SECURE = False
JWT_COOKIE_CSRF_PROTECT = True

# post method from flask-restful for the LoginAPI class
def post(self):
    email = request.json.get("email")
    password = request.json.get("password")
    # some processing here.....
    payload = {
        "email": email
    }
    access_token = create_access_token(identity=payload)
    refresh_token = create_refresh_token(identity=payload)
    response = jsonify({"status": True})
    set_access_cookies(response, access_token)
    set_refresh_cookies(response, refresh_token)
    return response
CORS using flask-cors
# In the code below, I had some issues with putting a wildcard (*) into origin,
# so I've specified the React SPA's host and port instead.
CORS(authentication_blueprint,
     resources={r"/authentication/*": {"origins": "http://localhost:3000"}},
     supports_credentials=True)
React SPA - making a post call using axios
// also tried `axios.defaults.withCredentials = true;` but same result.
export const login = (email, password, cookies) => {
  return dispatch => {
    const authData = {
      email: email,
      password: password
    };
    let url = 'http://127.0.0.1:5000/authentication/login/';
    axios.post(url, authData, { withCredentials: true })
      .then(response => {
        console.log(response);
      })
      .catch(err => {
        console.log(err);
      });
    dispatch(authSuccess(email, password));
  };
};
The image below shows the response from the successful POST call in axios.
I'm not sure whether it is normal, but the response headers are not showing any of the cookies that I'm setting from the back end.
And the image below that is from Network > headers for login/.
As shown, you can clearly see the token information with the Set-Cookie key. I've also checked that they aren't marked Secure.
And finally, when I check the cookie tab under Application > Cookies, I do not see anything.
So the issue was coming from localhost.
I have a Flask back-end running in localhost:5000 and a React SPA running on localhost:3000.
To be very specific about the statement above: I was running the back end on localhost:5000 and the React SPA on 127.0.0.1:3000.
Once I changed 127.0.0.1 to localhost, it worked like a charm.
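In other words, make sure the origin shown in the address bar and the host used in the axios URL match exactly; a sketch of the change:
// the SPA is opened at http://localhost:3000, so call the API on localhost as well
let url = 'http://localhost:5000/authentication/login/';
axios.post(url, authData, { withCredentials: true });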
As a side note, after playing around with CORS, I think it is a lot easier to use Nginx with proxy_pass to forward requests from the React SPA to the back end and avoid CORS completely. If you have to support CORS in different environments such as test and staging, you would have to configure it at the web server level (e.g. Nginx) anyway, and that requires a slightly different configuration from the one I set up for the local environment.

Frontend/Backend separation: Safari not storing cookies from API which is hosted on a separate domain than its Frontend SPA client

I have a setup which - as far as I can tell - is fairly common nowadays: a backend REST API that lives on its own domain, say myapi.com, and a single page frontend application that is served somewhere else, say myapp.com.
The SPA is a client to the API and the API requires users to authenticate before they can do things.
The backend API uses cookies to store session data for some allowed origins, among which is myapp.com. This is in order to have a safe channel to transmit and store auth data without having to worry about it client-side.
In Chrome, Opera and Firefox, this works just fine: an API call is made to authenticate the user, cookies are returned and stored in the browser, and they are then sent along with the next call.
Safari, on the other hand, does receive the cookies but refuses to store them:
I suspect Safari sees the API domain as a 3rd party cookie domain and therefore blocks the cookies from being stored.
Is this the expected behaviour in Safari? If so, what are some best practices to get around it?
Perpetuating a tradition of answering your own question on this one.
TL;DR this is desired behaviour in Safari. The only way to get around it is to bring the user to a webpage hosted on the API's domain (myapi.com in the question) and set a cookie from there - anything really, you can write a small poem in the cookie if you like.
After this is done, the domain will be "whitelisted" and Safari will be nice to you and set your cookies in any subsequent call, even coming from clients on different domains.
This implies you can keep your authentication logic untouched and just introduce a dumb endpoint that would set a "seed" cookie for you. In my Ruby app this looks as follows:
class ServiceController < ActionController::Base
  def seed_cookie
    cookies[:s] = {value: 42, expires: 1.week, httponly: true} # value can be anything at all
    render plain: "Checking your browser"
  end
end
Client side, you might want to check whether the browser making the request is Safari and defer your login logic until after that ugly popup has been opened:
const doLogin = () => {
  if (/^((?!chrome|android).)*safari/i.test(navigator.userAgent)) {
    const seedCookie = window.open(`http://myapi.com/seed_cookie`, "s", "width=1, height=1, bottom=0, left=0, toolbar=no, location=no, directories=no, status=no, menubar=no, scrollbars=no, resizable=no, copyhistory=no");
    setTimeout(() => {
      seedCookie.close();
      // your login logic;
    }, 500);
  } else {
    // your login logic;
  }
}
UPDATE: The solution above works fine for logging a user in, i.e. it correctly "whitelists" the API domain for the current browser session.
Unfortunately, though, it appears that a user refreshing the page will make the browser reset to the original state where 3rd party cookies for the API domain are blocked.
I found that a good way to handle a page refresh is to detect it in JavaScript on page load and redirect the user to an API endpoint that does the same as the one above, which then redirects the user back to the original URL they were navigating to (the page being refreshed):
if (performance.navigation.type == 1 && /^((?!chrome|android).)*safari/i.test(navigator.userAgent)) {
  window.location.replace(`http://myapi.com/redirect_me`);
}
To complicate things, it turns out Safari won't store cookies if the response's HTTP status is a 30X (redirect). Therefore, a Safari-friendly solution involves setting the cookies and returning a 200 response together with a JS snippet that handles the redirect within the browser.
In my case, the backend being a Rails app, this is how the endpoint looks:
def redirect_me
  cookies[:s] = {value: 42, expires: 1.week, httponly: true}
  render body: "<html><head><script>window.location.replace('#{request.referer}');</script></head></html>", status: 200, content_type: 'text/html'
end
What worked for me (as mentioned by others and in other similar questions too) is to put my frontend and backend under the same domain, e.g.:
frontend.myapp.com
backend.myapp.com
Then both Safari on macOS Monterey and Safari on iOS 15 started to allow Set-Cookie from backend.myapp.com (with Secure, HttpOnly, SameSite=None), and those cookies are sent along with requests made from frontend.myapp.com.
