How to proxy with Gatsby and custom Webpack config (netlify-lambda) - webpack-dev-server

I'm using Gatsby with netlify-lambda, which runs a dev server for functions on port 9000:
http://localhost:9000/myFunctionName
In production the address of the functions is:
/.netlify/functions/myFunctionName
So I would like a dev-mode proxy that forwards calls to /.netlify/functions to http://localhost:9000/.
My custom Webpack config in gatsby-node.js:
exports.modifyWebpackConfig = ({ config, stage }) => {
  if (stage === 'develop') {
    config.merge({
      devServer: {
        proxy: {
          '/.netlify/functions': {
            target: 'http://localhost:9000',
            pathRewrite: {
              '^/\\.netlify/functions': ''
            }
          }
        }
      }
    })
  }
}
This does not work.
I also tried https://www.gatsbyjs.org/docs/api-proxy/#api-proxy, but I need to rewrite the URL, not just forward a prefix.
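For reference, the simple proxy option from those docs looks roughly like the sketch below; as far as I can tell it only forwards requests whose path starts with prefix and keeps the path as-is:
// gatsby-config.js - the simple proxy option from the linked docs (sketch).
// It forwards matching requests to `url` but does not rewrite the path,
// so /.netlify/functions/myFunctionName is not turned into /myFunctionName.
module.exports = {
  proxy: {
    prefix: "/.netlify/functions",
    url: "http://localhost:9000",
  },
}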
What's the best way to use netlify-lambda with Gatsby?
Thanks

Update: Gatsby now supports Express middleware, added in this merged PR. This does not expose the webpack dev server proxy configuration, but it allows using ordinary Express proxy middleware.
To use it with netlify-lambda, just add this to your gatsby-config.js:
const proxy = require("http-proxy-middleware")

module.exports = {
  developMiddleware: app => {
    app.use(
      "/.netlify/functions/",
      proxy({
        target: "http://localhost:9000",
        pathRewrite: {
          "/.netlify/functions/": "",
        }
      })
    )
  }
}
https://www.gatsbyjs.org/docs/api-proxy/#advanced-proxying
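With that in place, the dev workflow is just running both servers side by side. A minimal package.json sketch (the functions source directory name is an assumption; adjust it to your project and netlify.toml):
{
  "scripts": {
    "develop": "gatsby develop",
    "start:lambda": "netlify-lambda serve src/lambda",
    "build:lambda": "netlify-lambda build src/lambda"
  }
}
netlify-lambda serve listens on port 9000 by default, which is what the proxy above targets.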
Unfortunately, the Gatsby development proxy configuration is not at all extensible. Since Gatsby does not use the webpack dev server, its proxy options are not available. See https://github.com/gatsbyjs/gatsby/issues/2869#issuecomment-378935446
I achieve this use case by putting an nginx proxy in front of both my Gatsby development server and the netlify-lambda server. This is a very simplified version of the proxy server config (trailing slash is important):
server {
  listen 8001;

  location /.netlify/functions {
    proxy_pass http://0.0.0.0:9000/;
  }

  location / {
    proxy_pass http://0.0.0.0:8000;
  }
}
I'm using an nginx proxy anyhow, so that's not an undesirable amount of extra dev setup, but it would certainly be better if Gatsby supported this common case.

Related

Vite serving shader file with wrong (none) MIME type

I'm developing a BabylonJS application. The BabylonJS PostProcess class appends .fragment.fx to a given file name and requests that from the server. When my local Vite (version 4.0.4) dev server serves this file, the content-type header is empty. This causes Firefox to interpret it as type xml and fail. Chrome fails through a different, but I think related, mechanism.
How do you configure Vite to serve the *.fragment.fx static files as text/plain? I assume I need to disable the default middleware and write some custom code instead, like this: https://vitejs.dev/config/server-options.html#server-middlewaremode, but I wanted to check first whether there was something else going on or a simpler way to configure or fix this.
The vite dev server is started using vite --host --port 3000 --force and the config in vite.config.js is:
import { defineConfig } from 'vite';

export default defineConfig(({ command, mode }) => {
  // if (command === 'serve') {
  //   return {
  //     // dev specific config
  //   }
  // } else {
  //   // command === 'build'
  //   return {
  //     // build specific config
  //   }
  // }
  return {
    resolve: {
      alias: {
        "babylonjs": mode === "development" ? "babylonjs/babylon.max" : "babylonjs",
      }
    },
    base: "",
    // assetsInclude: ['**/*.fx'],
  };
});
* edit 1 *
I have seen there's a ?raw parameter that can be added to the URL, but I don't control how BabylonJS forms the URL, so I can't see how to make this work in this situation.
I followed these instructions and set up a dev server using express. I added this block of code above the call to app.use(vite.middlewares):
app.use("**/*.*.fx", async (req, res, next) => {
const url = req.originalUrl
const file_path = path.resolve(__dirname, "." + url)
const file = fs.readFileSync(file_path, "utf-8")
res.status(200).set({ "Content-Type": "text/plain" }).end(file)
})
I now start the dev server using the following script line in package.json: "dev": "node server".
I could not find a way to solve this by configuring the default vite dev server.
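For context, here is a minimal sketch of the Express + Vite middleware-mode server described above (the file layout, the port, and the dynamic import of Vite are assumptions, not the exact setup):
// server.js - sketch only: Express serves *.fx shaders as text/plain,
// everything else falls through to Vite's middleware.
const path = require("path")
const fs = require("fs")
const express = require("express")

async function start() {
  const app = express()

  // Vite in middleware mode, per the linked server-options docs.
  const { createServer } = await import("vite")
  const vite = await createServer({ server: { middlewareMode: true } })

  // Serve shader files with an explicit content type before Vite sees them.
  app.use("**/*.*.fx", (req, res) => {
    const filePath = path.resolve(__dirname, "." + req.originalUrl)
    res.status(200).set({ "Content-Type": "text/plain" }).end(fs.readFileSync(filePath, "utf-8"))
  })

  // Let Vite handle every other request (module transforms, HMR, index.html).
  app.use(vite.middlewares)

  app.listen(3000)
}

start()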

Why is my amplify environment variable appearing for just a moment then disappearing again [duplicate]

So I'm using the Contentful API to get some content from my account and display it in my Next.js app (I'm using Next 9.4.4). Very basic here. Now, to protect my credentials, I'd like to use environment variables (I've never used them before and I'm new to all of this, so I'm a little bit lost).
I'm using the following to create the Contentful client in my index.js file:
const client = require('contentful').createClient({
  space: 'MYSPACEID',
  accessToken: 'MYACCESSTOKEN',
});
MYSPACEID and MYACCESSTOKEN are hardcoded, so I'd like to put them in a .env file to protect them and not make them public when deploying on Vercel.
I've created a .env file and filled it like this:
CONTENTFUL_SPACE_ID=MYSPACEID
CONTENTFUL_ACCESS_TOKEN=MYACCESSTOKEN
Of course, MYACCESSTOKEN and MYSPACEID contain the right keys.
Then in my index.js file, I do the following:
const client = require('contentful').createClient({
  space: `${process.env.CONTENTFUL_SPACE_ID}`,
  accessToken: `${process.env.CONTENTFUL_ACCESS_TOKEN}`,
});
But it doesn't work when I use yarn dev; I get the following console error:
{
  sys: { type: 'Error', id: 'NotFound' },
  message: 'The resource could not be found.',
  requestId: 'c7340a45-a1ef-4171-93de-c606672b65c3'
}
Here is my homepage and how I retrieve the content from Contentful and pass it as props to my components:
const client = require('contentful').createClient({
  space: 'MYSPACEID',
  accessToken: 'MYACCESSTOKEN',
});

function Home(props) {
  return (
    <div>
      <Head>
        <title>My Page</title>
        <link rel="icon" href="/favicon.ico" />
      </Head>
      <main id="page-home">
        <Modal />
        <NavTwo />
        <Hero item={props.myEntries[0]} />
        <Footer />
      </main>
    </div>
  );
}

Home.getInitialProps = async () => {
  const myEntries = await client.getEntries({
    content_type: 'mycontenttype',
  });
  return {
    myEntries: myEntries.items
  };
};

export default Home;
Where do you think my error comes from?
Researching my issue, I've also tried to understand how API routes work in Next.js, as I've read it could be better to make API requests in pages/api/, but I don't understand how to get the content there and then pass the response into my page components like I did here.
Any help would be much appreciated!
EDIT:
So I've fixed this by adding my env variables to my next.config.js like so:
const withSass = require('@zeit/next-sass');

module.exports = withSass({
  webpack(config, options) {
    const rules = [
      {
        test: /\.scss$/,
        use: [{ loader: 'sass-loader' }],
      },
    ];
    return {
      ...config,
      module: { ...config.module, rules: [...config.module.rules, ...rules] },
    };
  },
  env: {
    CONTENTFUL_SPACE_ID: process.env.CONTENTFUL_SPACE_ID,
    CONTENTFUL_ACCESS_TOKEN: process.env.CONTENTFUL_ACCESS_TOKEN,
  },
});
If you are using the latest version of Next.js (above 9), then follow these steps:
Create a .env.local file in the root of the project.
Add the prefix NEXT_PUBLIC_ to all of your environment variables.
eg: NEXT_PUBLIC_SOMETHING=12345
Use them in any JS file through process.env, keeping the prefix:
eg: process.env.NEXT_PUBLIC_SOMETHING
You can't make this kind of request from the client-side without exposing your API credentials. You have to have a backend.
You can use Next.js /pages/api to make a request to Contentful and then pass it to your front-end.
Just create a .env file, add your variables, and reference them in your API route as follows:
process.env.CONTENTFUL_SPACE_ID
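A minimal sketch of such an API route (the file name, content type, and response shape here are assumptions based on the question, not a prescribed implementation):
// pages/api/entries.js - hypothetical API route; the credentials stay server-side.
import { createClient } from 'contentful'

const client = createClient({
  space: process.env.CONTENTFUL_SPACE_ID,
  accessToken: process.env.CONTENTFUL_ACCESS_TOKEN,
})

export default async function handler(req, res) {
  // Fetch the entries on the server and return only the data the page needs.
  const entries = await client.getEntries({ content_type: 'mycontenttype' })
  res.status(200).json(entries.items)
}
The page can then fetch /api/entries in getInitialProps (or getStaticProps) and pass the result down as props.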
Since Next.js 9.4 you don't need next.config.js for that.
By adding the variables to next.config.js you've exposed the secrets to client-side. Anyone can see these secrets.
New Environment Variables Support
Create a Next.js App with Contentful and Deploy It with Vercel
Blog example using Next.js and Contentful
I recommend updating to Next.js 9.4 or above and using this example:
.env.local
NEXT_PUBLIC_SECRET_KEY=i7z7GeS38r10orTRr1i
and in any part of your code you could use:
.js
const SECRET_KEY = process.env.NEXT_PUBLIC_SECRET_KEY
Note that you must use the full key name "NEXT_PUBLIC_SECRET_KEY", not just "SECRET_KEY".
And when you run it, make sure the log says:
$ next dev
Loaded env from E:\awesome-project\client\.env.local
ready - started server on http://localhost:3000
...
To read more about environment variables, see this link.
Don't put sensitive things in next.config.js. However, in my case I have some env variables that aren't sensitive at all, and I need them on the server side as well as the client side. In that case you can do:
// .env file:
VARIABLE_X=XYZ

// next.config.js
module.exports = {
  env: {
    VARIABLE_X: process.env.VARIABLE_X,
  },
}
You have to make a simple change in next.config.js
const nextConfig = {
  reactStrictMode: true,
  env: {
    MYACCESSTOKEN: process.env.MYACCESSTOKEN,
    MYSPACEID: process.env.MYSPACEID,
  }
}

module.exports = nextConfig
Change it like this.
Refer to the docs.
You need to add a next.config.js file in your project. Define env variables in that file and those will be available inside your app.
npm i --save dotenv-webpack@2.0.0 // version 3.0.0 has a bug
Create a .env.development.local file in the root and add your environment variables there:
AUTH0_COOKIE_SECRET=eirhg32urrroeroro9344u9832789327432894###
NODE_ENV=development
AUTH0_NAMESPACE=https:ilmrerino.auth0.com
Create next.config.js in the root of your app:
const path = require("path");
const Dotenv = require("dotenv-webpack");

module.exports = {
  webpack: (config) => {
    config.resolve.alias["@"] = path.resolve(__dirname);
    config.plugins.push(new Dotenv({ silent: true }));
    return config;
  },
};
However, those env variables are only going to be accessible on the server. If you want to use any of them on the client, you have to add one more piece of configuration:
module.exports = {
  webpack: (config) => {
    config.resolve.alias["@"] = path.resolve(__dirname);
    config.plugins.push(new Dotenv({ silent: true }));
    return config;
  },
  env: {
    AUTH0_NAMESPACE: process.env.AUTH0_NAMESPACE,
  },
};
For me, the solution was simply restarting the local server :)
It gave me a headache, and then I fixed it by accident.
It did not occur to me that env variables are loaded when the server is starting.

localhost:3000 This site can’t be reached after installing http-proxy-middleware

I am building a newsletter sign-up form that uses netlify-lambda to send my form submission to Mailchimp. I installed http-proxy-middleware so the front end can reach the netlify-lambda functions. After writing the proxy setup code below, my React start script stopped working. It appears the proxy setup below is interfering with localhost:3000.
My proxy setup looks like this:
const proxy = require('http-proxy-middleware');

module.exports = function(app) {
  console.log('Using proxy...')
  app.use(proxy('/.netlify/functions/', {
    target: 'http://localhost:9000/',
    "pathRewrite": {
      "^\\.netlify/functions": ""
    }
  }));
};
If the target is localhost:9000 why is it interfering with localhost:3000?
When I start my Lambda server it says: Lambda server is listening on 9000.
I am also getting this error when trying to compile my client app.
crbug/1173575, non-JS module files deprecated
Short answer (for @lachnroll and anyone who might be encountering the same problem):
Please use const { createProxyMiddleware } = require("http-proxy-middleware") and app.use(createProxyMiddleware('/.netlify/functions/', ...)) instead of const proxy = require('http-proxy-middleware'); and app.use(proxy("/.netlify/functions/", ...)); it should work.
Long one:
I've come across the same "can't be reached" thing in a React project when using http-proxy-middleware (2.0.3), until I changed const proxy = require('http-proxy-middleware'); and proxy("/.netlify/functions/", ...) to const { createProxyMiddleware } = require("http-proxy-middleware"); and app.use(createProxyMiddleware('/.netlify/functions/', ...)). I think the default proxy export has been removed; see: https://github.com/chimurai/http-proxy-middleware#readme
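Put together, a sketch of the corrected setup file (the paths and ports mirror the question; src/setupProxy.js is the usual CRA convention):
// src/setupProxy.js - sketch for http-proxy-middleware v2 with CRA.
const { createProxyMiddleware } = require('http-proxy-middleware');

module.exports = function (app) {
  app.use(
    createProxyMiddleware('/.netlify/functions/', {
      target: 'http://localhost:9000/',
      // Strip the /.netlify/functions prefix before forwarding to netlify-lambda.
      pathRewrite: {
        '^/\\.netlify/functions': '',
      },
    })
  );
};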

What is the correct syntax for adding the xframe header module to cra rewired?

I'm trying to return the X-Frame-Options header in my create-react-app (rewired), but I'm not sure of the correct syntax for adding the function to my existing override. How do I do this properly?
module.exports = override(
  fixBabelImports("react-bulma-components", {
    libraryName: "react-bulma-components",
    libraryDirectory: "lib/components"
  }),
  {
    devServer: function(configFunction) {
      return function(proxy, allowedHost) {
        const config = configFunction(proxy, allowedHost)
        config.headers = {
          'X-Frame-Options': 'Deny'
        }
        return config
      }
    }
  },
  ...addCompressions()
);
The front end is a CRA (rewired) non-static web app.
The backend is a Node app hosted separately.
I've also tried changing the buildpack to this buildpack in order to add the configuration in a static.json file, but that breaks a lot of different things, uses a different server, etc.
The proper way of doing this is by not doing it... it is useless, don't waste your time. Remember that your CRA page is executed in the browser, and it won't send you headers or anything; the header has to be sent by whatever serves the page (Nginx/Apache/Node.js or something else).
MDN even says the same: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/X-Frame-Options
Note: Setting X-Frame-Options inside the <meta> element (for instance, <meta http-equiv="X-Frame-Options" content="deny">) is useless and has no effect. Do not use it! X-Frame-Options works only when set through the HTTP header, as in the examples below.
Heroku
You can configure different options for your static application by writing a static.json in the root folder of your application.
Custom Headers
Using the headers key, you can set custom response headers. It uses the same operators for pathing as Custom Routes.
{
  "headers": {
    "/": {
      "Cache-Control": "no-store, no-cache"
    },
    "/assets/**": {
      "Cache-Control": "public, max-age=512000"
    },
    "/assets/webfonts/*": {
      "Access-Control-Allow-Origin": "*"
    }
  }
}
https://blog.heroku.com/using-http-headers-to-secure-your-site
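Applied to this question, a hedged sketch of a static.json that sets the header for every route (the "root" value is an assumption about the CRA build output directory):
{
  "root": "build/",
  "headers": {
    "/**": {
      "X-Frame-Options": "DENY"
    }
  }
}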

Webpack dev server sockjs-node returns 404 error

I am running a simple Vue app with webpack that I created with the vue-cli. When I run the dev server with npm run serve, it shows several errors in the client console related to sockjs-node. I believe this module is used by webpack for hot reloading (HMR).
The first error is:
Access to XMLHttpRequest at 'http://192.168.1.4:8080/sockjs-node/info?t=1615330207390' from origin 'http://localhost:8080' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource.
I can solve this in two ways by editing the devServer in my vue.config.js. The first method is by setting public: 'localhost:8080'; and the second is by setting headers: {'Access-Control-Allow-Origin': 'http://192.168.1.4:8080', 'Access-Control-Allow-Credentials': 'true'}.
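For reference, a sketch of those two devServer tweaks (only one is needed at a time; the values are taken from the description above):
// vue.config.js - sketch of the two CORS workarounds described above.
module.exports = {
  devServer: {
    // Option 1: make the client connect back to localhost:8080.
    public: 'localhost:8080',
    // Option 2: allow the cross-origin request instead.
    // headers: {
    //   'Access-Control-Allow-Origin': 'http://192.168.1.4:8080',
    //   'Access-Control-Allow-Credentials': 'true',
    // },
  },
}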
In both cases, I then see the following two errors:
POST http://localhost:8080/sockjs-node/690/qvgqwbdo/xhr_streaming?t=1615330686134 404 (Not Found)
GET http://localhost:8080/sockjs-node/690/zympwfsc/eventsource 404 (Not Found)
How do I resolve these errors so that the hot reloader will connect?
In the function I set as devServer.before in my vue.config.js file, I created my own websockets using Socket.io on the same port as my devServer. When the function returned, the devServer could not use that port for websockets, so it failed to launch sockjs-node. Therefore, when the frontend client tried to connect to the devServer, the requests were going to my sockets instead of the devServer sockets, and they were ignored. Hence the 404 errors.
Here is my original code:
// server.js
const { createServer } = require('http')
const io = require('socket.io')
const addSocketEvents = require('./socket-api')

const port = process.env.PORT || 8080

module.exports = {
  port,
  configure(app) {
    // `app` is an instance of express
    // add the websockets
    const httpServer = createServer(app)
    const socket = io(httpServer, {
      path: '/socket-api'
    })
    addSocketEvents(socket)
    // starts the server
    // cannot use app.listen() because that will not start the websockets
    httpServer.listen(port)
  }
}
// vue.config.js
const { port, configure } = require('./server')

module.exports = {
  devServer: {
    before: configure,
    public: `localhost:${port}`,
  },
}
To fix this issue, I needed to allow the devServer to use the original port for sockjs-node, and launch my sockets on a different port. However, because I need to use the same port in production (due to restrictions by my current hosting provider), I only want my sockets to use a different port when running the devServer. To do this, I simply created a different httpServer and launched it on a different port, then created a proxy in the devServer config for that port. In my configure function, I just check to see if it is running in dev or prod, and act accordingly.
My production server is a simple express instance which calls the same configure function after it is created. This allows me to put all my startup code in one place.
Here is my new code:
// server.js
const { createServer } = require('http')
const io = require('socket.io')
const addSocketEvents = require('./socket-api')

const port = process.env.PORT || 8080
const proxyPort = 8081

module.exports = {
  port,
  proxyPort,
  configure(app) {
    // `app` is an instance of express
    // add the websockets
    let httpServer, socketPort
    if (process.env.NODE_ENV === 'development') {
      httpServer = createServer()
      socketPort = proxyPort
    } else {
      httpServer = createServer(app)
      socketPort = port
    }
    // adds the socket-api to be used via websockets
    const socket = io(httpServer, {
      path: '/socket-api'
    })
    addSocketEvents(socket)
    // starts the server
    httpServer.listen(socketPort)
  }
}
// vue.config.js
const { port, proxyPort, configure } = require('./server')

module.exports = {
  devServer: {
    before: configure,
    public: `localhost:${port}`,
    proxy: {
      '/socket-api': {
        target: `http://localhost:${proxyPort}`,
        ws: true
      }
    }
  },
}
