I'm stuck on a problem with my project. I am using gulp + webpack to compile the client side. During development, I want to use nodemon to watch for file changes in the server directory, and I also need a suitable way to watch the client-side part of the project and re-run the webpack task.
Here is my gulpfile
gulp.task('clean:tmp', (cb) => {
  del([paths.tmp]).then(paths => {
    plugins.util.log('[clean]', paths);
    cb();
  });
});

gulp.task('serve', ['clean:tmp'], () => {
  const config = require('./webpack.dev.config');
  config.entry.app = paths.entry;
  return gulp.src(paths.entry)
    .pipe(webpack(config))
    .pipe(gulp.dest('.tmp'));
});

gulp.task('watch', ['serve'], () => {
  return nodemon({
    script: `${rootServer}/`,
    watch: ['server/*'],
  });
});
The problem is that if I run gulp watch with watch: true set in the webpack config, the webpack stream never signals completion, which breaks gulp's pipe logic.
I also checked out this answer: Watch webpack.config.js and re-run webpack command in response to a file change, but I could not apply its solution.
Any suggestions?
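One workaround (a sketch under the gulp 3 task syntax used above, not something from the original question) is to let gulp do the client-side watching and re-run the webpack task itself, rather than enabling webpack's own watch mode; the 'client/**/*' glob is a placeholder for the real client source paths:
gulp.task('watch', ['serve'], () => {
  // Re-run the webpack bundling task whenever a client file changes.
  gulp.watch('client/**/*', ['serve']);
  // Meanwhile, nodemon restarts the server on server-side changes.
  return nodemon({
    script: `${rootServer}/`,
    watch: ['server/*'],
  });
});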
I am working on a React app bootstrapped with create-react-app a few years ago.
The app has a .env.dev file with many variables.
The start script is "start": "env-cmd -f .env.dev --use-shell \"react-scripts start\"",
react-scripts version: "react-scripts": "^4.0.1"
When I console.log("gggg", process.env); inside the app, I get all the variables.
When I run this in a Cypress spec:
describe('Login via API', () => {
  it('Test login', () => {
    console.log('teeest', process.env)
    cy.login()
  })
})
I get an empty object instead.
I read the question How to use process.env variables in browser running by Cypress, but it does not answer my question of how to make the process.env variables available to Cypress test files.
It also suggests installing dotenv, but dotenv ships with react-scripts, so there is no need to install it if the app was created by create-react-app.
I also tried this. In cypress.config.js I added:
const { defineConfig } = require("cypress");

module.exports = defineConfig({
  e2e: {
    setupNodeEvents(on, config) {
      config.env = process.env
      return config
    }
  }
})
And in the spec I try to get a variable defined in the .env.dev file:
it('Test login', () => {
  console.log('new', Cypress.env('REACT_APP_USERNAME'))
  cy.login()
})
Still getting undefined.
Can anyone please help me understand what's wrong? How can I make it work?
Edit:
Following an answer here, I tried installing dotenv:
npm install dotenv --save
and imported it in the test:
import 'dotenv/config'
describe('Login via API', () => {
  it('Test login', () => {
    console.log('newwwww', Cypress.env('REACT_APP_USERNAME'))
    console.log('teeest', process.env)
    cy.login()
  })
})
Then I ran:
npm start
npm run cypress:open
Result:
newwwww undefined
login-test.cy.js:7 teeest {}
Thanks
When you use "start": "env-cmd -f .env.dev --use-shell \"react-scripts start\"", the env-cmd wrapper applies only to the React app's process.
You need the same wrapper to run before Cypress starts its own process:
package.json
{
  ...
  "dependencies": {
    ...
  },
  "scripts": {
    "cy:open": "env-cmd -f .env.dev cypress open",
    ...
  }
}
Avoiding conflicts with other env settings
I also recommend using the spread operator as shown below; otherwise you would lose any env vars added in other ways, e.g. command-line additions.
const { defineConfig } = require("cypress");

module.exports = defineConfig({
  e2e: {
    setupNodeEvents(on, config) {
      config.env = {
        ...process.env, // add all process env vars here
        ...config.env   // plus any command line overrides
      }
      return config     // return the altered config
    },
  },
  env: {
    login_url: '/login', // define some specific env vars here
    products_url: '/products'
  }
});
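With that merge in place, values from the process environment and from the env block are both readable in a spec via Cypress.env(). For example (a sketch: REACT_APP_USERNAME is assumed to be defined in .env.dev and loaded by env-cmd before Cypress started, and a baseUrl is assumed so the relative visit works):
// in a spec file
cy.visit(Cypress.env('login_url'));                   // from the env block above
cy.log('user: ' + Cypress.env('REACT_APP_USERNAME')); // from process.env via the spread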
Avoiding pollution of Cypress settings
If you take a look at Settings/Project Settings in the Cypress runner, you'll see a huge number of unnecessary settings that come from the machine's general environment variables.
To pick just those with the prefix REACT_:
const { defineConfig } = require("cypress");

module.exports = defineConfig({
  e2e: {
    setupNodeEvents(on, config) {
      const reactEnv = Object.keys(process.env).reduce((obj, key) => {
        if (key.startsWith('REACT_')) {
          obj[key] = process.env[key];
        }
        return obj;
      }, {});
      config.env = {
        ...reactEnv,  // add REACT_ process env vars here
        ...config.env // plus any command line overrides
      }
      return config   // return the altered config
    },
  },
  env: {
    login_url: '/login', // define some specific env vars here
    products_url: '/products'
  }
});
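With this filter in place, only the REACT_-prefixed keys show up under Settings/Project Settings, and a spec can still read them; for instance the variable from the question (again assuming env-cmd loaded .env.dev before Cypress started):
// in a spec file
const user = Cypress.env('REACT_APP_USERNAME'); // present because it passes the REACT_ filter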
You're right about dotenv being included with react-scripts, but to access it in your test files you'll have to explicitly import it.
npm install dotenv
then add this at the top of your Cypress code:
import 'dotenv/config'
See the usage instructions here: https://www.npmjs.com/package/dotenv
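One caveat worth noting (an observation added here, not part of the original answer): dotenv reads a file named .env by default, so to pick up the variables from the question's .env.dev file you would pass the path explicitly:
// load .env.dev instead of the default .env
require('dotenv').config({ path: '.env.dev' });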
I'm trying to create a React PWA from scratch. So far my project outputs the minified files to a dist/js folder.
In my service worker file I'm using Workbox to precache the app. This is my setting so far:
importScripts("./node_modules/workbox-sw/build/workbox-sw.js");

const staticAssets = [
  "./",
  "./images/favicon.png",
]

workbox.precaching.precacheAndRoute(staticAssets);
Currently if I enable offline from dev tools > Service Workers, it throws these errors and the app fails to load:
3localhost/:18 GET http://localhost:8080/js/app.min.js net::ERR_INTERNET_DISCONNECTED
localhost/:1 GET http://localhost:8080/manifest.json net::ERR_INTERNET_DISCONNECTED
3:8080/manifest.json:1 GET http://localhost:8080/manifest.json net::ERR_INTERNET_DISCONNECTED
logger.mjs:44 workbox Precaching 0 files. 2 files are already cached.
5:8080/manifest.json:1 GET http://localhost:8080/manifest.json net::ERR_INTERNET_DISCONNECTED
How can I fix this?
This means your resources are not getting cached properly. You need to add them to the cache before accessing them, and Workbox does this for you by default, but only for the entries you list. The log shows 2 files already cached because those two are present in your array, which is the expected result. Do the same for all the remaining assets too:
const staticAssets = [
  "./",
  "./images/favicon.png",
  "./js/app.min.js",
  "./manifest.json",
  { url: '/index.html', revision: '383676' }
]
Alternatively, you can add the cache lifecycle event listeners yourself:
self.addEventListener('install', event => {
  console.log('Attempting to install service worker and cache static assets');
  event.waitUntil(
    caches.open("staticCacheName")
      .then(cache => {
        return cache.addAll(staticAssets);
      })
  );
});

self.addEventListener('fetch', function(event) {
  event.respondWith(caches.match(event.request)
    .then(function(response) {
      if (response) {
        return response;
      }
      return fetch(event.request);
    })
  );
});
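A further note (an addition, not from the original answer): precacheAndRoute accepts { url, revision } entries for every asset, which lets Workbox refresh stale copies when a file changes. The revision strings below are placeholders you would normally generate with workbox-build or a bundler plugin:
workbox.precaching.precacheAndRoute([
  { url: './js/app.min.js', revision: 'abc123' }, // placeholder revision
  { url: './manifest.json', revision: 'def456' }, // placeholder revision
]);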
I am trying to make a boilerplate for a webpack 2 + Babel 6 + gulp + react-hot-loader project. I started by forking react-hot-loader-minimal-boilerplate, then added a gulpfile to it in this branch.
If you read the code, you'll find I've only added one commit, which adds the gulp package, the gulpfile, and the npm run gulp script. You'd want to take a look at gulpfile.babel.js, which currently looks like this:
const gulp = require('gulp');
const webpack = require('webpack');
const cfg = require('./webpack.config');
const devServer = require('webpack-dev-server');
const path = require('path');
const util = require('gulp-util');

gulp.task('dev', () => {
  cfg.plugins = [
    new webpack.HotModuleReplacementPlugin(),
  ];
  cfg.entry = {
    'app': [
      'babel-polyfill',
      'react-hot-loader/patch',
      'webpack-dev-server/client?http://localhost:8080',
      './src/index',
    ],
  };
  new devServer(webpack(cfg), {
    //contentBase: path.join(__dirname, 'dist'),
    hot: true,
    historyApiFallback: true,
    //publicPath: cfg.output.publicPath,
    stats: {
      colors: true,
    },
  }).listen(8080, 'localhost', function (err) {
    // was `gutil.PluginError`, but no `gutil` is defined; `util` is the gulp-util import
    if (err) throw new util.PluginError("webpack-dev-server", err);
    util.log(`'${util.colors.cyan('dev:server')}' http://localhost:8080/webpack-dev-server/index.html`);
  });
});
Supposedly, the commands npm run dev and npm run gulp should have the same effect, but in reality the gulp command is not working.
If I change my React code, the code in the browser should update accordingly, and under npm run dev it does (the console logs the hot update). Under npm run gulp, although the browser does receive the update signal from webpack-dev-server, the DOM is not updated along with it.
Any suggestions on how to fix this boilerplate?
Setup
We use protractor, protractor-cucumber, and angular-mocks to handle our e2e tests. Builds are managed by grunt.
We have two build configs, pool and local. pool is the configuration picked up by our CI environment; local is for debugging purposes and is set up to run on a local machine. The primary difference between the two is that local runs everything locally, whereas pool uses a remote Selenium server (outside the CI server itself).
protractor.conf.js:
exports.config = {
  params: {
    widgetUrl: 'http://localhost:9980',
    vudUrl: 'xxx',
    testDelay: 3000,
  },

  // The timeout for each script run on the browser. This should be longer
  // than the maximum time your application needs to stabilize between tasks.
  allScriptsTimeout: 150000,

  // A base URL for your application under test. Calls to protractor.get()
  // with relative paths will be prepended with this.
  baseUrl: 'http://localhost:' + (process.env.PORT || '9082'),

  // list of files / patterns to load in the browser
  specs: ['./e2e/features/*.feature'],

  cucumberOpts: {
    require: ['./e2e/steps/*.js', './e2e/pageObjects/*.js', './e2e/support/*.js'],
    format: 'pretty',
    tags: ['#context-test']
  },

  // Patterns to exclude.
  exclude: [],

  // ----- Capabilities to be passed to the webdriver instance ----
  //
  // For a full list of available capabilities, see
  // https://code.google.com/p/selenium/wiki/DesiredCapabilities
  // and
  // https://code.google.com/p/selenium/source/browse/javascript/webdriver/capabilities.js
  capabilities: {
    browserName: 'chrome',
    loggingPrefs: {
      'driver': 'INFO',
      'server': 'INFO',
      'browser': 'INFO'
    }
  },

  // ----- The test framework -----
  //
  // Jasmine and Cucumber are fully supported as a test and assertion framework.
  // Mocha has limited beta support. You will need to include your own
  // assertion framework if working with mocha.
  framework: 'custom',
  frameworkPath: 'node_modules/protractor-cucumber-framework',

  mocks: {
    dir: "mocks", // path to directory with mocks
    default: []
  },

  onPrepare: function() {
    // Chai config
    var chai = require('chai');
    var chaiAsPromised = require('chai-as-promised');
    chai.use(chaiAsPromised);
    global.expect = chai.expect;

    console.log('params: ' + JSON.stringify(browser.params));
    //browser.driver.manage().window().maximize();
  }
};
We configure our mocks in a Before hook (snipped here for brevity):
this.Before(function(scenario, callback) {
  // ...
  let data = require(`${TEST_DATA_DIR}/default.js`);
  let httpMocker = function() {
    angular.module('httpMocker', ['ngMockE2E'])
      .run(function($httpBackend) {
        $httpBackend.whenPOST(...);
      }); // closes .run() (the original snippet was missing this brace)
  };
  browser.addMockModule('httpMocker', httpMocker, {data});
  // ...
  callback();
});
Problem
Despite identical test setups, ngMockE2E is not being called when run in the CI environment. This can be demonstrated by the following test:
test.feature:
#Test
Feature: Test and debug

  #context-test
  Scenario: Get console output
    Given I access the page
    Then I get the log output
test.steps.js
module.exports = function() {
  this.Given(/^I access the page$/, () => {
    util.waitAndDo(() => true);
  });

  this.Then(/^I get the log output$/, () => {
    util.waitAndDo(() => {
      browser.manage().logs().get('browser').then(function(browserLog) {
        console.log('log: ' + require('util').inspect(browserLog));
      });
    });
  });
};
This test will dump the browser log rather than actually testing anything. When it is run locally, the logs are empty. However, when the same test is run in the CI environment, the logs show errors for failed calls to the URLs being mocked.
We have verified that the URLs used in the CI environment correctly match the regex in our mocks, so it is not a match-miss. The module is simply not being called.
To reiterate - the only major difference in configuration is that the pool config makes use of a remote Selenium hub. How could this affect how our tests are run, and why would it prevent our mocks from working?
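One way to narrow this down (a diagnostic sketch, not from the original post) is to log what Protractor itself believes is registered before the page loads; getRegisteredMockModules() returns the mock-module sources queued for injection on the next browser.get():
// e.g. inside the Before hook, right after browser.addMockModule(...)
console.log('registered mock modules: ' +
  browser.getRegisteredMockModules().length);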
Update, current problem:
It seems that the webpack hot loader goes wrong: when I run the plain webpack command, the build works as usual, but when I run "dev": "webpack-dev-server --color --hot --progress && node ./server.js", webpack does not generate the built files for me.
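(A side observation added for clarity, not part of the original question: with &&, the shell starts node ./server.js only after webpack-dev-server exits, so the two never run at the same time. Running them in parallel would look like this sketch:)
"dev": "webpack-dev-server --color --hot --progress & node ./server.js"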
My webpack.config is as follows:
module.exports = {
  entry: getEntries(),
  .....

function getEntries() {
  var routeDir = path.join(SRC_DIR, "javascripts", "routes");
  var routeNames = routeDir ? fs.readdirSync(routeDir) : [];
  var nameMaps = {};
  routeNames.forEach(function(routeName) {
    var match = routeName.match(/(.+)\.js$/);
    var filename = match && match[1]; // guard: indexing [1] directly would throw on non-.js files
    console.log("filename in entry ", filename);
    if (filename) {
      var devEntryPath = [
        'webpack-dev-server/client?http://127.0.0.1:3001', // WebpackDevServer host and port
        'webpack/hot/only-dev-server',
        path.join(routeDir, filename)
      ];
      nameMaps[filename] = devEntryPath;
    }
  });
  return nameMaps;
}
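For illustration (file names assumed, not from the original post): with routes/page1.js and routes/page2.js present, getEntries() returns one multi-file entry per route, so webpack emits a separate hot-reloadable bundle for each page:
// Illustrative result of getEntries():
// {
//   page1: ['webpack-dev-server/client?http://127.0.0.1:3001',
//           'webpack/hot/only-dev-server',
//           '<routeDir>/page1'],
//   page2: [ /* same two dev-server entries */ '<routeDir>/page2']
// }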
server.js
var server = new WebpackDevServer(webpack(config), {
  publicPath: config.output.publicPath,
  hot: true,
  historyApiFallback: true
}).listen(3001, 'localhost', function(err, result) {
  if (err) console.log(err);
  console.log("webpack listening at port 3001");
});

var app = express();

app.get("/monitor/index", function(req, res) {
  res.sendFile(__dirname + "/src/views/" + "page1.html");
});

app.get("/monitor/category/*", function(req, res) {
  res.sendFile(__dirname + "/src/views/" + "page2.html");
});

app.use(express.static(__dirname))
  .listen(9090, 'localhost', function(err, result) {
    if (err) console.log(err);
    console.log('Listening at localhost:9090');
  });
Finally, I found where the problem was, and I now understand the relationship between webpack-dev-server and my express server.
When using the hot loader with webpack-dev-server:
Step 1: webpack builds the input files, and webpack-dev-server serves the bundles from the publicPath designated in the "output" section of webpack.config.js.
Step 2: the node server sends the HTML to the front end, and the browser then requests the related assets (js, images, etc.). But from where? We have to point the script paths in the HTML at webpack-dev-server, which serves the bundles generated in step 1, so the node server effectively asks webpack-dev-server for help.
To sum up, I modified 3 places:
the publicPath of WebpackDevServer
the webpack output publicPath, equal to the above
the script paths in the HTML
That's all, and now my project runs as expected.
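For illustration, a minimal sketch of those three changes (host, port, asset path, and file names are assumptions, not the project's actual values):
// 1 + 2: webpack.config.js — output.publicPath points at webpack-dev-server,
// and server.js reuses the same value for WebpackDevServer's publicPath:
output: {
  filename: "[name].js",
  publicPath: "http://localhost:3001/assets/"
},
// server.js
new WebpackDevServer(webpack(config), {
  publicPath: config.output.publicPath, // same value as above
  hot: true
}).listen(3001, 'localhost');
// 3: in the HTML, point the script tags at webpack-dev-server:
// <script src="http://localhost:3001/assets/page1.js"></script>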