Why doesn't Create React App's .env for API keys work when uploaded to AWS Amplify - reactjs

I added a dotenv file for an API key in my React app, following this tutorial: https://www.pluralsight.com/guides/hiding-secret-keys-in-create-react-app as well as this one: https://create-react-app.dev/docs/adding-custom-environment-variables/ I prefixed the variable as REACT_APP_API_KEY. The app works locally but not when uploaded to AWS.
This is the on-load fetch:
useEffect(() => {
  const fetchData = async () => {
    try {
      const url = `https://api.themoviedb.org/3/search/movie/?api_key=${process.env.REACT_APP_API_KEY}&language=en-US&query=all`;
      const response = await fetch(url);
      const data = await response.json();
      setMovieList(data.results);
    } catch (err) {
      console.log(err);
    }
  };
  fetchData();
}, []);
This is the search submit button handler:
const handleSearch = useCallback(async () => {
  try {
    const url = `https://api.themoviedb.org/3/search/movie/?api_key=${process.env.REACT_APP_API_KEY}&language=en-US&query=${query}&page=1&include_adult=false`;
    const response = await fetch(url);
    const jsondata = await response.json();
    await setMovies(jsondata.results);
  } catch (err) {
    console.log(err);
  }
}, [query]);
Here's the local working app and here's the app uploaded to AWS (screenshots omitted).

Typically, whenever you deploy a React app created with Create React App (CRA), you run yarn build and then upload the build directory to some hosting solution. There is no actual process binary that you run (from React's perspective) after that.
During development, CRA's toolchain (the Webpack dev server) is a running process that handles serving and building the content, which means it has access to the variables from .env. Once you have run yarn build, CRA is hands off and all the execution is handled by some other server that CRA is oblivious to. Any variables that did exist are hard-injected into the built app at build time (see "Referencing Environment Variables in the HTML" in the CRA docs).
If you need these variables to change based on a deployment, you should either have another .env file that gets used by your build tooling, or make them regular environment variables that are available to your build script. Just about every continuous integration/deployment (CI/CD) tool has this capability.
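To make "hard-injected at build time" concrete, here is a minimal sketch (the key values are hypothetical; in Amplify's case the variable has to be defined in the app's environment settings so it exists when yarn build runs in the build container):
// Source before `yarn build` (same variable name as in the question):
const url = `https://api.themoviedb.org/3/search/movie/?api_key=${process.env.REACT_APP_API_KEY}&query=all`;

// What ends up in the production bundle if the variable WAS set at build time:
// const url = "https://api.themoviedb.org/3/search/movie/?api_key=abc123&query=all";

// What ends up in the bundle if it was NOT set (the Amplify symptom):
// const url = "https://api.themoviedb.org/3/search/movie/?api_key=undefined&query=all";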

Related

Accessing root sibling file from React static site in AWS S3

I have a React (Vite) app that I'm hosting from S3/CloudFront. It gets there via AWS CDK.
In the CDK (CloudFormation) process, I'm generating a deploy-time JSON file, `config.json`, that I'm also putting into the same S3 bucket as the React app (image of the resulting bucket contents omitted).
What I would like to do is access the values in this config.json at runtime in my React app. My running React app served from CloudFront doesn't seem to be aware of the config.json file, though.
To be clear, what I would like to do is do something like the following:
// App.tsx
import * as config from "./config.json"
fetch(config.url, ....)
Note: CloudFront does have privileges for all files in this bucket:
siteBucket.addToResourcePolicy(
  new iam.PolicyStatement({
    actions: ["s3:GetObject"],
    resources: [siteBucket.arnForObjects("*")],
    principals: [
      new iam.CanonicalUserPrincipal(
        cloudfrontOAI.cloudFrontOriginAccessIdentityS3CanonicalUserId
      ),
    ],
  })
);
new CfnOutput(this, "Bucket", { value: siteBucket.bucketName });
I found this article which outlined a similar pattern.
By fetching the file on mount, I can read the appropriate config that's generated at deploy time at the top level of the React app:
// App.tsx
function App() {
  const [data, setData] = useState([]);

  useEffect(() => {
    fetch("/config.json")
      .then(function (res) {
        return res.json();
      })
      .then(function (data) {
        setData(data);
      })
      .catch(function (err) {
        console.log(err, " error");
      });
  }, []);

  return (
    <div>{JSON.stringify(data, null, 2)}</div>
  );
}
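If you'd rather not repeat the fetch logic in components, a small typed loader can wrap the same runtime request. This is a sketch, not from the original answer: the AppConfig shape and the loadConfig/cached names are assumptions, and it still relies on config.json being deployed next to index.html in the bucket.
// configLoader.ts (hypothetical helper)
type AppConfig = { url: string }; // shape assumed from the question's config.json

let cached: AppConfig | undefined;

export async function loadConfig(): Promise<AppConfig> {
  if (!cached) {
    // Served from the same S3/CloudFront origin as the app itself
    const res = await fetch("/config.json");
    if (!res.ok) {
      throw new Error(`config.json fetch failed: ${res.status}`);
    }
    cached = (await res.json()) as AppConfig;
  }
  return cached;
}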

How should I be using playwright's toHaveScreenshot() within a cucumber test in a React Typescript project?

I'm wanting to implement visual regression testing in a ReactJS app. I already have Playwright set up, called through Cucumber, for some other BDD UI tests, and wanted to make use of the built-in toHaveScreenshot method for visual regression. However, whenever I run the test it throws this error:
Error: toHaveScreenshot() must be called during the test
Here's the test script definition:
package.json excerpt
"test:e2e": "cucumber-js --require cucumber.conf.js --require features/step_definitions/**/*.js --format #cucumber/pretty-formatter",
Here's an example of the code:
cucumber.conf.js
const {
  Before,
  BeforeAll,
  AfterAll,
  After,
  setDefaultTimeout,
} = require("@cucumber/cucumber");
const { chromium } = require("playwright");

// in milliseconds
setDefaultTimeout(60000);

// launch the browser
BeforeAll(async function () {
  global.browser = await chromium.launch({
    headless: false,
    slowMo: 1000,
  });
});

// close the browser
AfterAll(async function () {
  await global.browser.close();
});

// Create a new browser context and page per scenario
Before(async function () {
  global.context = await global.browser.newContext();
  global.page = await global.context.newPage();
});

// Cleanup after each scenario
After(async function () {
  await global.page.close();
  await global.context.close();
});
homepage.feature
Feature: Homepage
  A simple homepage

  Scenario: Someone visiting the homepage
    Given a new visitor to the site
    When they load the homepage
    Then they see the page
homepage.js
const { Given, When, Then } = require("@cucumber/cucumber");
const { expect } = require("@playwright/test");

Given("a new visitor to the site", function () {});

When("they load the homepage", async () => {
  await page.goto("http://localhost:3000/");
});

Then("they see the page", async () => {
  const locator = page.locator('img[alt="An image you expect to see"]');
  await expect(locator).toBeVisible();
  await expect(locator).toHaveScreenshot();
});
I think the error is complaining that I'm not writing my tests inside Playwright's usual test() function, but I haven't come across anything similar in my searches and don't know how to provide that context (assuming that is the problem) while using Cucumber.
Can anyone suggest a solution? I'm at a bit of a loss.

Proxy error: Could not proxy request /time from localhost:3000 to http://localhost:5000

I am working on a simple React-Flask App which aims to fetch the current time from the Back-end and display it on the Front-end.
I have the Flask Back-end and the React Front-end both running together at the same time.
The back-end is working perfectly fine on port 5000.
The fetch call to '/time' from the front-end is unable to fetch the current time, even though I have my proxy defined in package.json:
"proxy": "http://localhost:5000"
Front-end:
function App() {
  const [currentTime, setCurrentTime] = useState(0);

  const getCurrentTime = async (API) => {
    const response = await fetch(API);
    const jsonData = await response.json();
    setCurrentTime(jsonData.time);
    console.log(jsonData);
  };

  useEffect(() => {
    // getCurrentTime('http://localhost:5000/time');
    getCurrentTime('/time');
  }, []);
I have tried the methods discussed here, but none of them seem to work for me.
"proxy": "http://127.0.0.1:5000". this solution worked for me. The reason why I was getting this error is that I didn't know that I have to restart the development server after making changes in the package.json.
This worked for me

How to execute external API calls with React + Express on Heroku

I'm still fairly new to full stack development and I've been stuck for days. I could really use some help from anyone who is familiar with React + Express + external API projects. Right now I have my front end and back end running concurrently, and everything works fine on my local machine.
https://mernaddonsapp.herokuapp.com
On Heroku, the login and register are working and connected to MongoDB, but the external API is not working.
I've tried with heroku local, which I think runs the code locally the way Heroku would, and everything works just fine. As for my client, I served the files locally with "yarn build" and then "serve -d build", and it works.
But when I deployed the MERN app to Heroku, no data from the external API in the back end reaches the front end.
This is the back-end code in Express:
router.get('/filterByValue', (req, res) => {
  let url = `https://restcountries.eu/rest/v2/all`;
  let search = req.query.search;
  axios({
    method: 'get',
    url,
  })
    .then((response) => {
      var listofcountries = filterByValue(response.data, search);
      console.log(listofcountries);
      res.status(200).json(listofcountries);
    })
    .catch((error) => {
      console.log(error);
    });
});
And this is the front-end code in React:
filtersearch = async (e) => {
  console.log(e.target.value)
  if (e.key === "Enter") {
    try {
      const response = await axios.get(
        `/api/countries/filterByValue/?search=${this.state.s}`
      )
      this.setState({
        specificCountry: response.data
      })
      this.props.history.push('/FilterListOfCountries', { response: this.state.specificCountry })
      console.log(response)
    } catch (e) {
      console.log(e.message)
      alert(e.message)
    }
  }
}

Next.js - Error: only absolute urls are supported

I'm using Express as my custom server for Next.js. Everything is fine when I navigate to the list of products:
Step 1: I click the product link.
Step 2: It shows the products from the database.
However, if I refresh the /products page, I get this error.
Server code (look at the /api/products endpoint):
app.prepare()
  .then(() => {
    const server = express()

    // This is the endpoint for products
    server.get('/api/products', (req, res, next) => {
      // I'm using Mongoose to return the data from the database
      Product.find({}, (err, products) => {
        res.send(products)
      })
    })

    server.get('*', (req, res) => {
      return handle(req, res)
    })

    server.listen(3000, (err) => {
      if (err) throw err
      console.log('> Ready on http://localhost:3000')
    })
  })
  .catch((ex) => {
    console.error(ex.stack)
    process.exit(1)
  })
Pages - products.js (a simple layout that loops over the products JSON data):
import Layout from '../components/MyLayout.js'
import Link from 'next/link'
import fetch from 'isomorphic-unfetch'

const Products = (props) => (
  <Layout>
    <h1>List of Products</h1>
    <ul>
      {props.products.map((product) => (
        <li key={product._id}>{product.title}</li>
      ))}
    </ul>
  </Layout>
)

Products.getInitialProps = async function () {
  const res = await fetch('/api/products')
  const data = await res.json()
  console.log(data)
  console.log(`Showed data fetched. Count ${data.length}`)
  return {
    products: data
  }
}

export default Products
As the error states, you will have to use an absolute URL for the fetch you're making. I'm assuming it has something to do with the different environments (client & server) on which your code can be executed. Relative URLs are just not explicit & reliable enough in this case.
One way to solve this would be to just hardcode the server address into your fetch request, another to set up a config module that reacts to your environment:
/config/index.js
const dev = process.env.NODE_ENV !== 'production';
export const server = dev ? 'http://localhost:3000' : 'https://your_deployment.server.com';
products.js
import { server } from '../config';
// ...
Products.getInitialProps = async function () {
  const res = await fetch(`${server}/api/products`)
  const data = await res.json()
  console.log(data)
  console.log(`Showed data fetched. Count ${data.length}`)
  return {
    products: data
  }
}
Similar to @Shanker's answer, but if you prefer not to install an additional package for this, here is how to do it.
async getInitialProps({ req }) {
  const protocol = req.headers['x-forwarded-proto'] || 'http'
  const baseUrl = req ? `${protocol}://${req.headers.host}` : ''
  const res = await fetch(baseUrl + '/api/products')
}
It sounds silly but worth mentioning. If you're using SSR in your webapp the fetch call will work with a relative link on the client but will fail on the server. Only the server needs an absolute link!
If you want to prevent the server from making the request just wrap it in logic
if (global.window) {
  const req = fetch('/api/test');
  ...
}
You could utilize environment variables if your project is hosted on a provider that supports it.
.env.local
# Local
URL="http://localhost:3000"
# Production
URL="https://prod.com"
Then you can use the following.
const { URL } = process.env;
const data = await fetcher(URL + '/api');
This simple solution worked for me without having to add an additional config file.
Install
npm install --save next-absolute-url
Usage
import absoluteUrl from "next-absolute-url";
async getInitialProps({ req }) {
  const { origin } = absoluteUrl(req, req.headers.host);
  console.log('Requested URL ->', origin);
  // (or) the other way
  const host = absoluteUrl(req, req.headers.host);
  console.log('Requested URL ->', host.origin);
}
Case 1. It's not an error. isomorphic-unfetch is running in SSR mode, so Node.js needs to know the absolute URL to fetch from, because the back end doesn't know your browser's location.
Case 2. Another reason is to prevent HTTP host header poisoning attacks, where applications append secret keys and tokens to links built from the host header:
<a href="http://$_SERVER['HOST']?token=topsecret"> (Django, Gallery, others)
...and even directly import scripts from it:
<script src="http://$_SERVER['HOST']/misc/jquery.js?v=1.4.4">
Case 3. isomorphic-unfetch is the library we are going to use to fetch data. It's a simple implementation of the browser fetch API, but it works in both client and server environments.
Read more about it:
Isomorphic unfetch - Switches between unfetch & node-fetch for client & server
Prevent http host headers attack
Fetching Data for Pages
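To tie Case 1 back to code, here is a hedged sketch (it restates the x-forwarded-proto idea from an earlier answer rather than introducing any new API): only the server-side render needs the absolute URL, because req exists only there.
// Sketch: during SSR `req` is defined, so build an absolute URL from the host
// header; in the browser an empty prefix keeps the relative path working.
// Protocol handling is simplified to http for brevity.
Products.getInitialProps = async function ({ req }) {
  const baseUrl = req ? `http://${req.headers.host}` : '';
  const res = await fetch(`${baseUrl}/api/products`);
  return { products: await res.json() };
};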
In Next.js 9.5+, we can also use process.cwd().
process.cwd() gives you the directory where Next.js is being executed.
import path from "path";
import fs from "fs";
import { GetStaticProps } from "next";

export const getStaticProps: GetStaticProps = async () => {
  const dataFilePath = path.join(process.cwd(), "jsonFiles", "data.json");
  console.log(dataFilePath); // will be YourProject/jsonFiles/data.json

  const fileContents = fs.readFileSync(dataFilePath, "utf8");
  const data: TypeOfData[] = JSON.parse(fileContents);

  return { props: { data } };
};
Ref: https://nextjs.org/docs/basic-features/data-fetching#reading-files-use-processcwd
Putting this out there because this showed up in google results for my problem, even though the question itself isn't really related (outside of the fact that the same dependency is throwing the same error message, albeit in a different context for a different reason).
I got this issue from using Hardhat while attempting to verify (verify:verify) my contract on Etherscan. The problem was that in the Hardhat config I didn't have a full URL under rinkeby (I happened to be verifying on Rinkeby; the same applies to mainnet, etc.). While quickly copy/pasting some config into a project I had cloned from someone else, I missed that they kept the full URL in their .env, whereas I had the URL in the config and stored only my API key in my .env.
Figuring this out, though, was straightforward: go into node_modules, find the node-fetch folder, then lib (this is from memory; just find the line that is blowing up in your stack trace), then the line number, and put a console.log there to see what the "bad" URL is. Usually that's enough of a clue; in my case it was an API key, obviously not a URL, and that made it straightforward to solve.
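For context, here is a hedged sketch of what the corrected config can look like (the provider URL, plugin, and variable names are illustrative, not taken from the original post): the network entry needs a full RPC URL, not just an API key.
// hardhat.config.js (illustrative sketch; INFURA_PROJECT_ID, PRIVATE_KEY and
// ETHERSCAN_API_KEY are assumed .env names, and hardhat-etherscan is assumed installed)
require("@nomiclabs/hardhat-etherscan");
require("dotenv").config();

module.exports = {
  solidity: "0.8.4",
  networks: {
    rinkeby: {
      // A full URL goes here; passing only an API key is what makes node-fetch
      // report "only absolute urls are supported".
      url: `https://rinkeby.infura.io/v3/${process.env.INFURA_PROJECT_ID}`,
      accounts: [process.env.PRIVATE_KEY],
    },
  },
  etherscan: {
    apiKey: process.env.ETHERSCAN_API_KEY,
  },
};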
If you have absolute path issues, try using SWR to access the data.
Note: this is a React hook, so you must call it inside a component.
import useSWR from 'swr';

// optionally you can use the unfetch package from npm or write your own fetcher to handle the promise
const fetcher = (...args) => fetch(...args).then(res => res.json())

export const AllProducts = () => {
  const { data, error } = useSWR('/api/products', fetcher)

  if (error) return <div>failed to load</div>
  if (!data) return <div>loading...</div>

  return (
    <div>{JSON.stringify(data)}</div>
  );
};
Exporting or deploying in production
Whenever you are trying to deploy on Vercel you might encounter an error, for instance:
warn - Statically exporting a Next.js application via `next export` disables API routes
It means you are trying to export the app, and Next.js does not support fetching data from the pages/api/* directory in that mode. To avoid errors, it's better to separate the build and export commands.
// package.json
{
  "scripts": {
    "dev": "next",
    "build": "next build", // no `next export` command
    "start": "next start"
  }
}
Thanks, folks, for the great contributions; I hope the answer shared here will help somebody too.
Make sure you know what the value of your API URL actually is.
In my case, I was using POST but my URL was undefined.
Use console.log to see where your request is going.
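As a quick illustration (the API_URL variable and /items endpoint are placeholders, not from the original answer):
async function createItem() {
  const apiUrl = process.env.API_URL; // hypothetical variable name
  console.log('POSTing to:', `${apiUrl}/items`); // logs "undefined/items" if the variable is missing
  const res = await fetch(`${apiUrl}/items`, { method: 'POST' });
  return res.json();
}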
This is a way to get the base hostname to fetch data from an external endpoint without getting that error:
function return_url(context) {
  if (process.env.NODE_ENV === "production") {
    return `https://${context.req.rawHeaders[1]}`;
  } else {
    return "http://localhost:3000";
  }
}
And in your getServerSideProps or getStaticProps functions you use:
export async function getServerSideProps(context) {
  let url = return_url(context);
  const data = await fetch(`${url}/yourEndPoint`).then((res) => res.json());

  return {
    props: {
      data: data,
    },
  };
}
If you are using Next's environment config, prefix your variables with NEXT_PUBLIC_, as mentioned in Exposing Environment Variables to the Browser.
Use NEXT_PUBLIC_STRAPI_URL="http://localhost:1337" instead of
NEXT_PUBLIC_STRAPI_URL=http://localhost:1337
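A hedged usage sketch (the /api/articles path is an example, not from the original answer), showing the variable being read after it has been inlined at build time:
export async function getStaticProps() {
  // NEXT_PUBLIC_ variables are inlined into the bundle at build time
  const res = await fetch(`${process.env.NEXT_PUBLIC_STRAPI_URL}/api/articles`);
  const articles = await res.json();
  return { props: { articles } };
}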
Use .log(console.log) after nock, so you will get the exact unmatched and expected URL.
Example:
nock("http://localhost")
.log(console.log)
.persist()
.get("/api/config")
.reply(200, { data: 1234 })
