I am working on a project with a technology combination of React + PostGraphile (GraphQL) + axios (HTTP requests to the PostGraphile server).
It has lots of GraphQL queries. Initially I started with the queries in the same file as the other JavaScript and rendering code, but it became messy as soon as more specific queries were added.
While searching I learned that queries can be detached into separate files - .graphql or .gql.
To allow this, I would have to integrate a Webpack module.
I wanted to know if there is a simpler (kind of out-of-the-box) way to achieve the same thing without using Webpack, as it needs lots of configuration to be in place.
Any pointers or examples will be really helpful.
Thank you.
On the client, we typically create a mirrored tree of the pages directory under graphql, then create .js or .ts files with the query exported. We then import it wherever it's needed (in our case, the GraphQL client request body).
So for example:
export const GET_TEAM_QUERY = `
  query {
    # your query here
  }
`
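For example, the exported query can then be wired into an axios request body. Here is a minimal sketch (the endpoint URL, query fields, and helper name are assumptions for illustration):

```javascript
// Hypothetical module: graphql/team/getTeam.js
const GET_TEAM_QUERY = `
  query GetTeam {
    allTeams {
      nodes {
        id
        name
      }
    }
  }
`;

// PostGraphile accepts a standard GraphQL POST body: { query, variables }
function buildGraphqlBody(query, variables = {}) {
  return { query, variables };
}

const body = buildGraphqlBody(GET_TEAM_QUERY);
// axios.post('http://localhost:5000/graphql', body).then(res => res.data)
```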
Hope that helps!
Since it's non-standard functionality to import/require a file that isn't .js or .json, you need some kind of plugin to tell the bundler how to interpret it so it doesn't crash. The two ways I know of are:
using the graphql-tag/loader in webpack, with only a single loader rule:
module.exports = {
  // ...
  module: {
    rules: [
      // babel-loader, css imports, etc.
      {
        test: /\.(graphql|gql)$/,
        exclude: /node_modules/,
        loader: 'graphql-tag/loader'
      },
    ],
  },
  // ...
}
or, with babel-plugin-import-graphql in babel, though this plugin just mimics the functionality of the above webpack loader, and in fact suggests using it alongside graphql-tag to reduce the size of the compiled query. It is useful if you need to run your code with babel-node, but I would suggest the above webpack loader in most cases.
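For the Babel route, the plugin's README suggests a minimal .babelrc along these lines (a sketch; check the babel-plugin-import-graphql docs for available options):

```json
{
  "plugins": ["import-graphql"]
}
```

With that in place, importing a .graphql file from a JavaScript module should work under babel-node as well.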
I found a very neat solution to this problem: graphql-code-generator. Using this tool, you can write your queries and fragments in separate files and it will compile them into typescript files, fully type-safe and ready to be imported into your components.
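As a sketch, a minimal codegen.yml for that setup might look like this (the schema URL and file paths are assumptions for illustration; the plugin names come from the graphql-code-generator docs):

```yaml
schema: http://localhost:5000/graphql   # your PostGraphile endpoint
documents: ./src/**/*.graphql           # your detached query files
generates:
  ./src/generated/graphql.ts:
    plugins:
      - typescript
      - typescript-operations
```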
I'm experimenting with ejecting from a CRA 4 app to enable simple sharing of React components among multiple modules in a monorepo:
|-react_project_a
|-react_project_b
|-shared
After ejecting, all I had to do was add the shared module path in two places in the webpack.config file:
new ModuleScopePlugin(paths.appSrc, [
  paths.appPackageJson,
  paths.clientShared, // <-- here
  reactRefreshOverlayEntry,
]),
and
test: /\.(js|mjs|jsx|ts|tsx)$/,
include: [paths.appSrc, paths.clientShared],
loader: require.resolve('babel-loader')
works a treat, and I even get hot reload. The only problem is that only the main app gets linted, so compilation errors and warnings for the shared module appear only in the browser developer tools console.
The EslintWebpackPlugin used by CRA doesn't support multiple contexts. Is there an alternative? I'm trying to give my developers an experience as close as possible to what they had with CRA.
Since my goal was sharing JSX components among CRA/React apps, a far simpler solution was to create a symbolic link in each React project pointing to the shared directory:
|-react_project_a
|-shared=>shared
|-react_project_b
|-shared=>shared
|-shared
and to add this to the webpack.config:
resolve: {
symlinks: false
}
Since this was the only change I required, I wound up using CRACO instead of ejecting to handle the override. I'd close the question, but think folks might find this useful.
I am trying to do this as simply as possible. I studied Yarn Workspaces for a while, but that solution currently doesn't work with Electron; there were simply too many issues.
I have an Electron project here: ./electron/
I have a directory with components here: ./common/
The components are developed in React/JSX, there is nothing really fancy about them. That said, I am using hooks (useXXX).
I tried many ways to include those components (ideally, I wanted to use Yarn Workspaces, but it only multiplied the number of issues), but they all failed. That's why I would like to avoid using yarn link or workspaces, or making common a library, etc. I just want my Electron project to behave as if the files were under ./electron. That's it.
The closest I came to a solution is by using electron-webpack, and overriding it with this config:
module.exports = function(config) {
  config = merge.smart(config, {
    module: {
      rules: [
        {
          test: /\.jsx?$/,
          // include: /node_modules/,
          include: Path.resolve(__dirname, '../common'),
          loaders: ['react-hot-loader/webpack', 'babel-loader?presets[]=@babel/preset-react']
        },
      ]
    },
    resolve: {
      alias: {
        '@common': Path.resolve(__dirname, '../common')
      }
    }
  })
  return config
}
I can import modules, and they work... except if I use hooks, in which case I get the "Invalid Hook Call Warning": https://reactjs.org/warnings/invalid-hook-call-warning.html.
I feel like that /common folder is not being compiled properly by babel, but the reality is that I have no idea where to look or what to try. I guess there is a solution for this, through that webpack config.
Thanks in advance for your help :)
I found the solution. That happens because the instance of React is different between /common and /electron.
The idea is to add an alias, like this:
'react': Path.resolve('./node_modules/react')
Of course, the same can be done for other modules that need to be exactly the same instance. Don't hesitate to comment if this answer is not perfectly right.
I wrestled more than a day with a similar problem. My project has a dependency on a module A that is itself bundled by Webpack (one that I authored myself). I externalised React from A (declaring it to be a commonjs2 module). This will exclude the React files from the library bundle.
My main program, running in the Electron Renderer process, uses React as well. I had Webpack include React into the bundle (no special configuration).
However, this produced the 'hooks' problem because of two instances of React in the runtime environment.
This is caused by these facts:
module A 'requires' React and this is resolved by the module system of Electron. So Electron takes React from node_modules;
the main program relies on the Webpack runtime to 'load' React from the bundle itself.
both Electron and the Webpack runtime have their own module cache...
My solution was to externalise React from the main program as well. This way, both the main program and module A get their React from Electron - a single instance in memory.
I tried any number of aliases, but that does not solve the problem, as an alias only answers the question of where to find the module code. It does nothing about the problem of multiple module caches!
If you run into this problem with a module that you cannot control, find out if and how React is externalised. If it is not externalised, I think you cannot solve this problem in the context of Electron. If it is externalised as a global, put React into your .html file and make your main program depend on that as well.
I'm working in a project where we want to integrate Webpack into our workflow. The problem is, we have over 1000 AngularJS files and adding import/export to all of them in one go is not an option for us. We'd like to bundle all of them and slowly incorporate the import/exports as we work on each file over time.
How would you approach that problem? Any specific best practices when doing this?
We literally had the same problem. Essentially you want to create "entry point files" that perform requires for all your files, since this is how webpack works (it follows the dependency tree). Then point webpack at these "entry point files".
The original example of this approach uses TypeScript, but you can easily use ES5 like this:
// ./entry-points/feature1.js
var importAll = function(r) {
  r.keys().forEach(r);
};

importAll(require.context('./app/feature1', true, /module\.js$/));
importAll(require.context('./app/feature1', true, /(^(?!.*(spec|module)\.js).*\.js)$/));
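Those two regular expressions split the `*.module.js` definitions (loaded first, so the AngularJS modules exist before anything registers against them) from the remaining implementation files. As a quick sanity check, here is how they behave in plain JavaScript (no webpack needed):

```javascript
// Matches only files ending in module.js
const moduleFiles = /module\.js$/;

// Matches every other .js file, excluding spec and module files
const otherFiles = /(^(?!.*(spec|module)\.js).*\.js)$/;

moduleFiles.test('./feature1/users.module.js');     // true
otherFiles.test('./feature1/users.controller.js');  // true
otherFiles.test('./feature1/users.spec.js');        // false
```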
If you need to support older browsers, you can grab polyfills for Object.keys and Array.forEach.
Then point to this file from your webpack config like this:
entry: {
  'feature1': './entry-points/feature1.js'
}
You can read more details here
I have a need for both inline SVG (currently handled with a sprite using <svg><use xlinkHref="icons.svg#info" /></svg>) and SVG used as a background in CSS (background-image: url(/assets/svg/info.svg);).
I want to implement hashing of filenames to help with cache busting, which works fine in the case of CSS, using the following:
{
  test: /\.svg$/i,
  loader: 'file-loader',
  query: {
    name: 'svg/[name]-[sha512:hash:base64:7].[ext]',
    publicPath: PATHS.public
  }
}
However, I want to also be able to refer to the individual SVG icons inline, whilst ensuring that the source files have a hashed filename.
Does anybody have a foolproof approach to getting the best of both worlds?
Ultimately I want to have a source folder of SVG files which during a build are:
Individually have their filenames hashed
All compiled to a sprite which has a hashed filename and is then available for use with an <svg> tag?
Thanks,
Dan
It turns out that the problem was fairly simple: I wasn't running the sass loader over both the client and server JavaScript.
While I've been working this problem out I've been keeping a separate demo repository to build out a good starting place for server and client rendered React with Webpack and a good production ready set of processing for assets.
https://github.com/danielrosewarne/webpack-demo
Hope that helps!
I am following the React Router docs, but I have encountered an obstacle that is not really related to the router itself: Babel transpiles `import` to `require`, which would be used by Express or Node.js on the server, but from what I understand from the docs, the code is actually intended for client-side rendering.
Of course, the JSX file with the router, transpiled using Babel and included in an HTML browser page, does not work, since require is only available on the Express/Node server side.
May I ask how is it actually supposed to work in the browser?
Thank you
Babel's transpile of import produces code relying on CommonJS require; you're correct.
You're also correct that Node offers a native require implementation, whereas browsers do not.
There are tools - such as webpack, browserify, and requirejs (among others) - which each do at least two things:
to package up source into a single bundle
to expose that source in a way that satisfies require to match node, allowing you to use the same code at either side.
To that end, what you need to do is to pair babel with one of the packaging tools.
Webpack is more powerful; browserify is easier to use.
Here's a tiny gulpfile where I've automated the process. The relevant source clip is this:
gulp.task('browserify', ['babel'], function() {
var browserifyConfig = {},
bpack = browserify(browserifyConfig, { 'debug' : !production });
return bpack
.require('./dist/pbar.es5.js', { 'expose' : 'pbar' })
.bundle()
.on('error', errorHandler)
.pipe(source('pbar.es5.js'))
.pipe(gulp.dest('./dist'));
});
In order for CommonJS-style require statements to work in a browser environment, you will need to look into a bundling solution like:
https://webpack.github.io/
http://browserify.org/
A bundler will statically parse your commonjs files and their dependencies to create a bundle which can be used in the browser.
The internet is full of great examples of how they work.
Browserify is easier to get started with than Webpack; however, I would suggest you learn Webpack over Browserify. Webpack provides much, much more than just loading JS files, thanks to its extensive loaders. For example, you can do something like:
const imgSrc = require('images/test.svg')
Magical, right?