I'm building the login for a React app with a Flask backend, but the session is not working.
The Flask app is running on a remote server on PythonAnywhere, and I'm testing the React app on my localhost:3000.
Here's the code for the Flask app:
from datetime import timedelta

from flask import Flask, request, session
from flask_cors import CORS
from flask_session import Session

app = Flask(__name__)
app.config["SESSION_PERMANENT"] = True
app.config["SESSION_TYPE"] = "filesystem"
app.config["PERMANENT_SESSION_LIFETIME"] = timedelta(hours=5)
Session(app)
CORS(app, supports_credentials=True, resources={r"/webapp/*": {"origins": "http://localhost:3000/"}})


@app.route("/webapp/login", methods=["POST"])
def login():
    user = request.get_json(force=True)
    email = user.get("email", "")
    password = user.get("password", "")
    result = Login.login(email, password)
    if result["Success"]:
        session["email"] = email
    return result


@app.route("/webapp/checksession", methods=["GET"])
def checksession():
    session_email = session.get("email", "")
    if session_email == "":
        return utils.returnResult(False, "Session not valid")
    return utils.returnResult(True, "")
In the React app I use axios to log in and to check the session on the server, for example:
axios.get(APIURL + CHECK_SESSION, { withCredentials: true })
  .then(response => {
    console.log(response.headers);
    setSession(response.data.success);
    setCheckedSession(true);
  })
  .catch(err => {
    setSession(false);
    setCheckedSession(true);
    if (err.response) {
      // the server responded with an error status
    } else {
      window.alert('Could not establish a connection to the server!');
    }
  });
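The login request is made the same way; a simplified sketch of it (LOGIN here is a placeholder constant for the /webapp/login route shown above):

// Hypothetical login call; APIURL and setSession come from the code above,
// LOGIN is an assumed constant for "/webapp/login".
axios.post(APIURL + LOGIN, { email, password }, { withCredentials: true })
  .then(response => {
    // the Flask view returns the Login.login() result, which has a "Success" key
    setSession(response.data.Success === true);
  })
  .catch(() => setSession(false));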
In the dev tools I can see that the server is sending the session cookie, but when I check the cookies stored for the app there's nothing there (screenshots omitted; sorry, they were in Portuguese). And when I print the headers of the response in the axios promise handler, the set-cookie header doesn't show up there either.
So every time the React app checks whether the user has a valid session on the server, the server creates a new session, which means that the React app is not saving the cookie and sending it back to the server.
Also, when I test this with Postman everything works fine.
I searched all over the place and I can't find an answer for this. Can anyone help me figure out what I'm doing wrong, please?
It turned out I was missing these settings in the Flask session configuration:
app.config["SESSION_COOKIE_SAMESITE"] = "None"
app.config["SESSION_COOKIE_SECURE"] = True
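With SESSION_COOKIE_SAMESITE set to "None", browsers also require the Secure flag, so the cookie is only stored when the Flask app is served over HTTPS (which PythonAnywhere is). On the React side the credentials flag still has to be sent with every request; a small sketch of setting it globally instead of per request:

// Apply withCredentials to every axios request so the Flask session cookie
// is stored by the browser and sent back on subsequent calls.
axios.defaults.withCredentials = true;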
Related
We are integrating DRF (dj_rest_auth) and allauth with a frontend application based on React. Recently, social login was added to handle login through LinkedIn, Facebook, Google and GitHub. Everything was working well on localhost with each of the providers. After the staging deployment, I updated the secrets and social applications for the new domain. Generating the URL for social login works fine, the user gets redirected to the provider's login page and grants our application access, but after being redirected back to the frontend page responsible for logging in, it results in an error (example for LinkedIn; it happens for all of the providers):
allauth.socialaccount.providers.oauth2.client.OAuth2Error:
Error retrieving access token:
b'{"error":"invalid_redirect_uri","error_description":"Unable to retrieve access token: appid/redirect uri/code verifier does not match authorization code. Or authorization code expired. Or external member binding exists"}'
Our flow is:
go to frontend page -> click on provider's icon ->
redirect to {BACKEND_URL}/rest-auth/linkedin/url/ to make it a POST request (user submits the form) ->
login on provider's page ->
go back to our frontend page {frontend}/social-auth?source=linkedin&code={the code we are sending to rest-auth/$provider$ endpoint}&state={state}->
confirm the code & show the profile completion page
The adapter definition (same for every provider):
class LinkedInLogin(SocialLoginView):
    adapter_class = LinkedInOAuth2Adapter
    client_class = OAuth2Client

    @property
    def callback_url(self):
        return self.request.build_absolute_uri(reverse('linkedin_oauth2_callback'))
Callback definition:
def linkedin_callback(request):
    params = urllib.parse.urlencode(request.GET)
    return redirect(f'{settings.HTTP_PROTOCOL}://{settings.FRONTEND_HOST}/social-auth?source=linkedin&{params}')
URLs:
path('rest-auth/linkedin/', LinkedInLogin.as_view(), name='linkedin_oauth2_callback'),
path('rest-auth/linkedin/callback/', linkedin_callback, name='linkedin_oauth2_callback'),
path('rest-auth/linkedin/url/', linkedin_views.oauth2_login),
Frontend call to send the access_token/code:
const handleSocialLogin = () => {
  postSocialAuth({
    code: decodeURIComponent(codeOrAccessToken),
    provider: provider
  }).then(response => {
    if (!response.error) return history.push(`/complete-profile?source=${provider}`);
    NotificationManager.error(
      `There was an error while trying to log you in via ${provider}`,
      "Error",
      3000
    );
    return history.push("/login");
  }).catch(_error => {
    NotificationManager.error(
      `There was an error while trying to log you in via ${provider}`,
      "Error",
      3000
    );
    return history.push("/login");
  });
}
Mutation:
const postSocialUserAuth = builder => builder.mutation({
  query: (data) => {
    const payload = {
      code: data?.code,
    };
    return {
      url: `${API_BASE_URL}/rest-auth/${data?.provider}/`,
      method: 'POST',
      body: payload,
    };
  }
});
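For completeness, this mutation is registered in an RTK Query API slice roughly like the sketch below (simplified; the slice name, base query and hook name are placeholders, not our actual code):

import { createApi, fetchBaseQuery } from '@reduxjs/toolkit/query/react';

// Hypothetical api slice showing where the mutation above plugs in.
export const authApi = createApi({
  reducerPath: 'authApi',
  baseQuery: fetchBaseQuery(),
  endpoints: (builder) => ({
    postSocialUserAuth: postSocialUserAuth(builder),
  }),
});

// The generated hook backs the postSocialAuth call used in handleSocialLogin above.
export const { usePostSocialUserAuthMutation } = authApi;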
Callback URLs and client credentials are set for the staging environment both in our admin panel (Django) and in the provider's panel (e.g. developers.linkedin.com).
Again, everything in this setup works fine in the local environment.
IMPORTANT
We are using two different domains for the backend and frontend; the frontend is served from a different domain than the backend.
The solution was to completely change the callback URL generation.
For anyone looking for a solution in the future:
class LinkedInLogin(SocialLoginView):
    adapter_class = CustomAdapterLinkedin
    client_class = OAuth2Client

    @property
    def callback_url(self):
        callback_url = reverse('linkedin_oauth2_callback')
        site = Site.objects.get_current()
        return f"{settings.HTTP_PROTOCOL}://{site}{callback_url}"
Custom adapter:
class CustomAdapterLinkedin(LinkedInOAuth2Adapter):
    def get_callback_url(self, request, app):
        callback_url = reverse(self.provider_id + "_callback")
        site = Site.objects.get_current()
        return f"{settings.HTTP_PROTOCOL}://{site}{callback_url}"
It is therefore important to change your route for URL generation accordingly:
path('rest-auth/linkedin/url/', OAuth2LoginView.adapter_view(CustomAdapterLinkedin))
I am leaving this open since I think this is not expected behaviour.
I'm really new to OAuth2, so I could really use some help. I have a site where users register and log in via standard means. However, once they register, I want to connect their Google account so they can view/edit/modify their Google calendars. To this end, I installed react-google-login and have a component on the frontend that logs them into their account. That works fine (here's the code). Please note that the JSX uses styled-components, which is why the element names look unusual.
return (
  <GoogleContainer>
    <Logo src={GoogleLogo} />
    <GoogleLogin
      clientId={process.env.REACT_APP_CLIENT_ID}
      render={(renderProps) => (
        <GoogleBtn
          onClick={renderProps.onClick}
          disabled={renderProps.disabled}
          style={styleObj}
        >
          Connect to Google
        </GoogleBtn>
      )}
      // buttonText='Sign in to Google Calendar'
      onSuccess={responseGoogle}
      isSignedIn={true}
      onFailure={responseError}
      cookiePolicy={"single_host_origin"}
      responseType='code'
      accessType='offline'
      scope='openid email profile https://www.googleapis.com/auth/calendar'
    />{" "}
  </GoogleContainer>
);
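For context, the onSuccess handler simply forwards the one-time authorization code to the backend route that runs createTokens below; a simplified sketch (the endpoint path is a placeholder):

// Simplified onSuccess handler; '/api/google/create-tokens' is a placeholder route
// that maps to the createTokens controller shown further down.
const responseGoogle = async (googleResponse) => {
  // with responseType='code', the response contains a one-time code instead of tokens
  const { data } = await axios.post('/api/google/create-tokens', { code: googleResponse.code });
  console.log(data.tokenObj); // access token + expiry sent back by the backend
};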
On the backend, I have code that grabs the refresh_token, stores it in a database, and then builds a token object that I can send back to the frontend. Here is the code for that:
// This next function is used in the createTokens function called by Google Login
// to identify the user by the email captured in the scope.
const fetchInfo = async (accessToken) => {
  const response = await axios.get(
    `https://www.googleapis.com/oauth2/v2/userinfo?access_token=${accessToken}`
  );
  let email = "";
  if (response) {
    email = response.data.email;
  }
  return email;
};
// Get authorization tokens from Google Calendar when signing into Google
const createTokens = async (req, res, next) => {
  try {
    const { code } = req.body;
    const { tokens } = await oauth2Client.getToken(code);
    const accessToken = tokens.access_token;
    const expiryDate = tokens.expiry_date;
    const id_token = tokens.id_token;
    // Make an object with the access token and expiry data and send it to the frontend
    const tokenObj = {
      accessToken,
      expiryDate,
      id_token,
    };
    // The refresh token goes to the database
    const refreshToken = tokens.refresh_token;
    // We find the user by the email captured in the scope from Google Login (frontend), see fetchInfo above
    const email = await fetchInfo(accessToken);
    if (refreshToken) {
      // Update the user record by storing the refresh token in the database
      const filter = { email: email };
      const update = { refreshToken: refreshToken };
      await User.findOneAndUpdate(filter, update, { new: true });
    }
    res.send({ tokenObj });
  } catch (error) {
    next(error);
  }
};
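For reference, oauth2Client above is a Google OAuth2 client; a minimal sketch of how it might be constructed (env var names are placeholders), using 'postmessage' as the redirect URI since the one-time code comes from the react-google-login popup flow:

const { google } = require('googleapis');

// Placeholder env var names; 'postmessage' is the redirect URI used when
// exchanging a code obtained from the client-side popup flow.
const oauth2Client = new google.auth.OAuth2(
  process.env.GOOGLE_CLIENT_ID,
  process.env.GOOGLE_CLIENT_SECRET,
  'postmessage'
);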
That also works fine: I get the refresh_token and store it in the database by user, and the token object with the access token gets sent back to the frontend.
Here's where I'm confused and could use some help. First of all, I thought I needed to send the token to the frontend to store it, but now pretty much every time I refresh my page, the frontend logs a boatload of information to the console (with tons of information from Google, like the profile, tokens, etc.). I don't know what code I wrote that is causing this, or whether it's a good thing or not. If it's automatically generated, do I even need backend code to get the token?
Also, I'm getting another message that says: "react_devtools_backend.js:3973 Your client application uses libraries for user authentication or authorization that will soon be deprecated. See the Migration Guide for more information." I thought this setup was up to date and I'm not sure which part is deprecated. Sorry, I'm so new to this and very confused. Any help would be much, much appreciated!
I am having the same issue as CORS Error: Google OAuth from React to Express (PassportJS validation), but I am unable to get the solution offered by @Yazmin to work.
I am attempting to create a React, Express/Node.js, MongoDB stack with Google authentication and authorization. I am currently developing the stack on Windows 10 using VS Code (React on localhost:3000, Node.js on localhost:5000 and MongoDB on localhost:27017).
The app's purpose is to display Urban Sketches (images) on a map using Google Maps, the Google Photos API and the Gmail API. I may in the future also require similar access to Facebook Groups to access Urban Sketches, but for now I have only included the profile and email scopes for authorization.
I want to keep all requests for third-party resources in the backend, as architecturally I understand this is best practice.
The Google authorization process from origin http://localhost:5000 works just fine and returns the expected results. However, when I attempt to do the same from the client (origin http://localhost:3000), the following error is returned in the developer tools console after the first attempt to access the Google OAuth2 API. Although the scheme and domain are the same, the port is different, so the response from the third party (https://accounts.google.com) has been rejected by the browser.
Access to fetch at 'https://accounts.google.com/o/oauth2/v2/auth?response_type=code&redirect_uri=http%3A%2F%2Flocalhost%3A5000%2Fauth%2Fgoogle%2Fcallback&scope=profile%20email%20https%3A%2F%2Fmail.google.com%2F&client_id=' (redirected from 'http://localhost:3000/auth/google') from origin 'http://localhost:3000' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
No matter what I try the error message is the same.
I think that google is sending the reply to the client (localhost:3000) rather than to the server.
Among other solutions, I attempted to implement Yilmaz's suggestion: "Create setupProxy.js file in client/src. No need to import this anywhere. create-react-app will look for this directory." I had already created my client by running create-react-app previously, so I added setupProxy.js inside my src folder.
Question: I assume I am correct that the new setupProxy.cjs file containing my settings will be included by webpack after I restart the client.
It seems to me that the flow I am getting is not BROWSER ==> EXPRESS ==> GOOGLE-SERVER but BROWSER ==> EXPRESS ==> GOOGLE-SERVER ==> BROWSER, where it stops with the CORS error shown above.
To test this theory, I put some console log messages in the client\node_modules\http-proxy-middleware\lib\index.js functions "shouldProxy" and "middleware", but could not detect any activity for the /auth/google endpoint coming from the Google authorization server response (https://accounts.google.com/o/oauth2/v2/auth).
So I think my theory is wrong, and I don't know how I will get this working.
The console log messages displayed in the VS Code terminal following a request to the /auth/google endpoint from the React client are as follows:
http-proxy-middleware - 92 HttpProxyMiddleware - shouldProxy
context [Function: context]
req.url /auth/google
req.originalUrl /auth/google
Trace
at shouldProxy (C:\Users\User\github\GiveMeHopev2\client\node_modules\http-proxy-middleware\lib\index.js:96:13)
at middleware (C:\Users\User\github\GiveMeHopev2\client\node_modules\http-proxy-middleware\lib\index.js:49:9)
at handle (C:\Users\User\github\GiveMeHopev2\client\node_modules\webpack-dev-server\lib\Server.js:322:18)
at Layer.handle [as handle_request] (C:\Users\User\github\GiveMeHopev2\client\node_modules\express\lib\router\layer.js:95:5)
at trim_prefix (C:\Users\User\github\GiveMeHopev2\client\node_modules\express\lib\router\index.js:317:13)
at C:\Users\User\github\GiveMeHopev2\client\node_modules\express\lib\router\index.js:284:7
at Function.process_params (C:\Users\User\github\GiveMeHopev2\client\node_modules\express\lib\router\index.js:335:12)
at next (C:\Users\User\github\GiveMeHopev2\client\node_modules\express\lib\router\index.js:275:10)
at goNext (C:\Users\User\github\GiveMeHopev2\client\node_modules\webpack-dev-middleware\lib\middleware.js:28:16)
at processRequest (C:\Users\User\github\GiveMeHopev2\client\node_modules\webpack-dev-middleware\lib\middleware.js:92:26)
http-proxy-middleware - 15 HttpProxyMiddleware - prepareProxyRequest
req localhost
The Google callback URI is http://localhost:5000/auth/google/callback
This is a listing of my Node.js server code:
import dotenv from 'dotenv';
import express from 'express';
import cors from 'cors';
import morgan from 'morgan';
import mongoose from 'mongoose';
import session from 'express-session';
import MongoStore from 'connect-mongo';
import passport from 'passport';
import { exit } from 'process';

dotenv.config();

// express
const app = express();

// cors
app.use(cors());

// passport config
require('./config/passport')(passport);

// logging
if (process.env.NODE_ENV! !== 'production') {
  app.use(morgan('dev'));
}

const conn = process.env.MONGODB_LOCAL_URL!;

/**
 * dbConnection and http port initialisation
 */
const dbConnect = async (conn: string, port: number) => {
  try {
    let connected = false;
    await mongoose.connect(conn, { useNewUrlParser: true, useUnifiedTopology: true });
    app.listen(port, () => console.log(`listening on port ${port}`));
    return connected;
  } catch (error) {
    console.log(error);
    exit(1);
  }
};

const port = process.env.SERVERPORT as unknown as number;
dbConnect(conn, port);

// index 02
// Pre Middleware
app.use(express.json());
app.use(express.urlencoded({ extended: true }));

const mongoStoreOptions = {
  mongoUrl: conn,
  collectionName: 'sessions'
};

app.use(
  session({
    secret: process.env.SESSIONKEY as string,
    resave: false,
    saveUninitialized: false,
    store: MongoStore.create(mongoStoreOptions),
  })
);

app.use(passport.initialize());
app.use(passport.session());

// Authentication and Authorisation
const emailScope: string = process.env.GOOGLE_EMAIL_SCOPE as string;
// GOOGLE_EMAIL_SCOPE=https://www.googleapis.com/auth/gmail/gmail.compose
const scopes = [
  'profile',
  emailScope
].join(" ");

app.get('/auth/google', passport.authenticate('google', {
  scope: scopes
}));

app.get('/auth/google/callback', passport.authenticate('google', { failureRedirect: '/' }),
  (req, res) => {
    res.send('Google Login Successful');
  }
);

app.get('/', (req, res) => {
  res.send('Hello World');
});
The http-proxy-middleware setupProxy.cjs file is shown below. Note the cjs extension; I assume this is because I am using TypeScript. It is in the client src folder.
const createProxyMiddleware = require('http-proxy-middleware');

module.exports = function (app) {
  app.use(createProxyMiddleware('/auth', { target: 'http://localhost:5000' }));
};
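I'm also not sure whether this require shape matches the installed http-proxy-middleware version: it matches v0.x, but from v1 onward the package exposes a named export instead, so the equivalent file would look roughly like this (a sketch assuming the same /auth prefix and target):

// http-proxy-middleware v1+/v2 style: named export instead of the default export.
const { createProxyMiddleware } = require('http-proxy-middleware');

module.exports = function (app) {
  app.use('/auth', createProxyMiddleware({ target: 'http://localhost:5000', changeOrigin: true }));
};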
And finally the fetch command from the client
async function http(request: RequestInfo): Promise<any> {
  try {
    const response = await fetch('/auth/google');
    const body = await response.json();
    return body;
  } catch (err) {
    console.log(`Err SignInGoogle`);
  }
}
And the passport config...
import { PassportStatic } from 'passport';
import { format, addDays } from 'date-fns';
import { IUserDB, IUserWithRefreshToken, ProfileWithJson } from '../interfaces/clientServer';

const GoogleStrategy = require('passport-google-oauth20').Strategy;
const User = require('../models/User');

module.exports = function (passport: PassportStatic) {
  const clientID: string = process.env.GOOGLE_CLIENTID as string;
  const clientSecret: string = process.env.GOOGLE_SECRET as string;
  const callbackURL: string = process.env.GOOGLE_AUTH_CALLBACK as string;

  const strategy = new GoogleStrategy(
    {
      clientID: clientID,
      clientSecret: clientSecret,
      callbackURL: callbackURL,
      proxy: true
    },
    async (_accesstoken: string, _refreshtoken: string,
      profile: ProfileWithJson,
etc
You can't make a fetch call to the /auth/google route!
Here's my solution in JavaScript...
// step 1:
// The handler function should use window.open instead of fetch.
const loginHandler = () => window.open("http://[server:port]/auth/google", "_self");

// step 2:
// On the server's redirect route, add this successRedirect option with the correct URL.
// Remember! It's your client's root URL!!!
router.get(
  '/google/redirect',
  passport.authenticate('google', {
    successRedirect: "[your CLIENT root url / example: http://localhost:3000]"
  })
);

// step 3:
// Create a new server route that will send back the user info when called after the authentication
// is completed. You can use a custom authenticate middleware to make sure that the user has indeed
// been authenticated.
router.get('/getUser', authenticated, (req, res) => res.send(req.user));

// Here is an example of a custom authenticate Express middleware:
const authenticated = (req, res, next) => {
  const customError = new Error('you are not logged in');
  customError.statusCode = 401;
  (!req.user) ? next(customError) : next();
};
// step 4:
// On your client's App.js component, make the axios or fetch call to get the user from the
// route that you have just created. This bit could be done many different ways... your call.
const [user, setUser] = useState();

useEffect(() => {
  axios.get('http://[server:port]/getUser', { withCredentials: true })
    .then(response => response.data && setUser(response.data));
}, []);
Explanation:
Step 1 loads your server's auth URL in the browser and makes the auth request.
Step 2 then reloads the client URL in the browser when the authentication is complete.
Step 3 makes an API endpoint available to collect the user info that updates the React state.
Step 4 makes a call to that endpoint, fetches the data and updates the user state.
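Note that for the withCredentials call in step 4 to actually carry the session cookie, the Express server also has to allow credentialed CORS from the client origin; a minimal sketch, assuming the client runs on http://localhost:3000:

// Credentialed CORS so the session cookie set by passport/express-session is accepted by the browser.
const cors = require('cors');

app.use(cors({
  origin: 'http://localhost:3000', // the client root url from step 2
  credentials: true
}));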
I am developing a typical MERN application and I've completed the authentication cycle. My Node.js/Express backend uses 'express-session' and 'connect-mongodb-session' to create and handle sessions. The React frontend uses 'axios' to communicate with the API. The authentication cycle works on all browsers except Chrome. For all other browsers, a session is successfully created in MongoDB, cookies are set in the browser and I am successfully logged into a session.
But when testing this with Chrome, everything works perfectly except for the part where cookies are set. I've tested this rigorously over the span of a day and I can trace the cookie to the point where it's sent from the back-end. But Chrome refuses to save the cookie.
Here is my code for maintaining sessions:
server/app.js
var store = new MongoDBStore({
  uri: DB,
  collection: 'sessions'
});

// Catch errors
store.on('error', function (error) {
  console.log(error);
});

app.use(require('express-session')({
  secret: process.env.SESSION_SECRET,
  saveUninitialized: false, // don't create session until something stored
  resave: false, // don't save session if unmodified
  store: store,
  cookie: {
    maxAge: parseInt(process.env.SESSION_LIFETIME), // 1 week
    httpOnly: true,
    secure: !(process.env.NODE_ENV === "development"),
    sameSite: false
  },
}));
// Mongo Session Logic End

app.enable('trust proxy');

// 1) GLOBAL MIDDLEWARES
// Implement CORS
app.use(cors({
  origin: [
    process.env.CLIENT_ORIGINS.split(',')
  ],
  credentials: true,
  exposedHeaders: ['set-cookie']
}));
The CLIENT_ORIGINS are set to https://localhost:3000 and http://localhost:3000, where my React client runs.
Some things I've tried:
Trying all combinations of secure: true and secure: false with all combinations of sameSite: false and sameSite: 'strict'
Setting the domain to null or an empty string
Changing the path randomly
Here's my code for setting the cookies on login at the back-end:
exports.signIn = async (req, res, next) => {
  const { email, password } = req.body;

  if (signedIn(req)) {
    res.status(406).json('Already Signed In');
    return;
  }

  const user = await User.findOne({ email: email });
  if (!user) {
    res.status(400).json('Please enter a correct email.');
    return;
  }
  if (!(await user.matchPassword(password))) {
    res.status(400).json('Please enter a correct password.');
    return;
  }

  req.session.userId = user.id;
  res.status(200).json({ msg: 'Signed In', user: user });
};
This is the generic request model I use for calling my API from React using Axios:
import axios from "axios";
import CONFIG from "../Services/Config";

axios.defaults.withCredentials = true;

const SERVER = CONFIG.SERVER + "/api";

let request = (method, extension, data = null, responseTypeFile = false) => {
  // setting up headers
  let config = {
    headers: {
      "Content-Type": "application/json",
    },
  };

  // let token = localStorage["token"];
  // if (token) {
  //   config.headers["Authorization"] = `Bearer ${token}`;
  // }

  // POST Requests
  if (method === "post") {
    // if (responseTypeFile) {
    //   config['responseType'] = 'blob'
    // }
    // console.log('request received file')
    // console.log(data)
    return axios.post(`${SERVER}/${extension}`, data, config);
  }
  // PUT Requests
  else if (method === "put") {
    return axios.put(`${SERVER}/${extension}`, data, config);
  }
  // GET Requests
  else if (method === "get") {
    if (data != null) {
      return axios.get(`${SERVER}/${extension}/${data}`, config);
    } else {
      return axios.get(`${SERVER}/${extension}`, config);
    }
  }
  // DELETE Requests
  else if (method === "delete") {
    if (data != null) {
      return axios.delete(`${SERVER}/${extension}/${data}`, config);
    } else {
      return axios.delete(`${SERVER}/${extension}`, config);
    }
  }
};

export default request;
Some more things that I have tested:
I have double-checked that credentials are set to true on both sides.
I have made sure that the authentication cycle works on other browsers.
I have also made sure that the authentication cycle works on Chrome when I run React over http instead of https.
I have also added my self-signed certificate to the trusted root certificates on my local machine. Chrome no longer shows me a warning but still refuses to save cookies.
I have made sure that the authentication cycle works if I run an instance of Chrome with web security disabled.
I've tried using 127.0.0.1 instead of localhost in the address bar, to no avail.
No errors are logged in either side's console.
Any and all help would be appreciated.
Chrome is always doing crazy stuff with cookies and localStorage...
It seems that since Chrome 80, Chrome rejects any cookie that hasn't specifically set SameSite=None and Secure when it is used in cross-site requests. That issue, https://github.com/google/google-api-javascript-client/issues/561, is still open and being discussed there. I also think that using https while not setting Secure will get the cookie rejected as well.
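In concrete terms, any cookie that has to survive a cross-site request now needs both attributes. Set from an Express backend that would look roughly like this (the cookie name and value are just examples):

// Example only: a cookie intended for cross-site use must carry SameSite=None and Secure.
res.cookie('access_token', token, {
  httpOnly: true,
  secure: true,      // required by Chrome when SameSite is 'none'
  sameSite: 'none'   // allow the cookie on cross-site requests
});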
I have faced this same issue once and I solved it by specifically setting the attributes mentioned below:
document.cookie = "access_token=" + "<YOUR TOKEN>" + ";path=/;domain=."+ "<YOUR DOMAIN NAME>" +".com;secure;sameSite=none";
Make sure:
Your Path is set to /.
Your Domain is set to .<YOUR DOMAIN NAME>.com (note: the leading dot is a necessary part).
Your secure attribute is set.
Your sameSite attribute is set to none.
So I figured out the solution to my issue. My client side was running on an https connection (even during development), because the nature of my project required it.
After much research, I was sure that the settings to be used for express-session were these:
app.use(require('express-session')({
  secret: process.env.SESSION_SECRET,
  saveUninitialized: false, // don't create session until something stored
  resave: false, // don't save session if unmodified
  store: store,
  cookie: {
    maxAge: parseInt(process.env.SESSION_LIFETIME), // 1 week
    httpOnly: true,
    secure: true,
    sameSite: "none"
  },
}));
Keep in mind that my client side runs on an https connection even in development. However, despite using these settings, my login cycle did not work on Chrome and my cookies weren't being set.
express-session refused to send cookies back to the client because, despite my client running over https, it contacted my server over an http connection (my server was still running on http in development), which made the connection insecure.
So I added the following code to my server:
const https = require('https');
const fs = require('fs');

var key = fs.readFileSync("./certificates/localhost.key");
var cert = fs.readFileSync("./certificates/localhost.crt");
var credentials = {
  key,
  cert
};

const app = express();
const port = process.env.PORT || 3080;

const server = process.env.NODE_ENV === 'development' ? https.createServer(credentials, app) : app;

server.listen(port, () => {
  console.log(`App running on port ${port}...`);
});
I used a self-signed certificate to run my server over https during development. This, along with sameSite: "none" and secure: true, resolved the issue on Chrome (and all other browsers).
I am trying to test my React (v16.10.2) application with Cypress (v4.5.0). Our application uses Okta for authentication (with the client @okta/okta-react 1.3.1).
I can use my app from the browser without any issues. The first time I log in, I get the Okta login screen. I enter my user ID and password, the React client calls the authn endpoint with my credentials and then calls the authorization endpoint to get the token, and I am then taken to the first screen of our application.
When I try to log in from my Cypress test, my user ID and password are entered into the login screen and the authn endpoint is called successfully, but the authorization endpoint returns a 403 error. Unfortunately, there is no other information about why I am getting the 403.
I have compared the authorization requests between the one that works in the browser, and the one that doesn't work from Cypress. The only real difference I see is that the working browser request has an origin header, whereas the failing one does not.
Question #1: Could the missing origin header be the cause of my problem?
In order to avoid a bunch of CORS and cross-site issues, I had to install a couple of Chrome extensions (ignore-x-frame-headers and Access-Control-Allow-Origin-master). I load them with the following code in cypress/plugins/index.js:
const path = require('path');

module.exports = (on, config) => {
  on('before:browser:launch', (browser = {}, launchOptions) => {
    // The following code comes from https://medium.com/@you54f/configuring-cypress-to-work-with-iframes-cross-origin-sites-afff5efcf61f
    // We were getting cross-origin errors when trying to run the tests.
    if (browser.name === 'chrome') {
      const ignoreXFrameHeadersExtension = path.join(__dirname, '../extensions/ignore-x-frame-headers');
      launchOptions.args.push(`--load-extension=${ignoreXFrameHeadersExtension}`);
      const accessControlAllowOriginMasterExtension = path.join(__dirname, '../extensions/Access-Control-Allow-Origin-master');
      launchOptions.args.push(`--load-extension=${accessControlAllowOriginMasterExtension}`);
      launchOptions.args.push("--disable-features=CrossSiteDocumentBlockingIfIsolating,CrossSiteDocumentBlockingAlways,IsolateOrigins,site-per-process");
      launchOptions.args.push('--disable-site-isolation-trials');
      launchOptions.args.push('--reduce-security-for-testing');
      launchOptions.args.push('--out-of-blink-cors');
    }
    if (browser.name === 'electron') {
      launchOptions.preferences.webPreferences.webSecurity = false;
    }
    return launchOptions;
  });
};
I also added the following to cypress.json:
{
"chromeWebSecurity": false
}
Here is my cypress test:
describe('Order Lookup Test', () => {
  const UI_URL: string = 'http://localhost:3000/';
  const ORDER_NUMBER: string = '10307906234';

  beforeEach(() => {
    Cypress.config('requestTimeout', 50000);
    cy.visit(UI_URL);
    cy.get('#okta-signin-username', { timeout: 10000 }).type('xxxxxxxx');
    cy.get('#okta-signin-password', { timeout: 10000 }).type('xxxxxxxx');
    cy.get('#okta-signin-submit', { timeout: 10000 }).click();
  });

  it('should return an order', () => {
    cy.get('.number-input', { timeout: 10000 }).type(ORDER_NUMBER);
    cy.get('.order-lookup-buttons-search-valid').should('be.visible').click();
  });
});
Does anyone have any idea what might be going on? What other information should I be including in order to help narrow this down?