Yesterday it worked fine. Today, when I use Ctrl + F8 to add a breakpoint and press Ctrl + P, then type x d, there is no "Xdebug Start Debugging (Launch Browser)" entry any more, only "Xdebug Start Debugging".
Why is that, and what should I change? Any information will be appreciated!
Just remove the current .sublime-project file and save the project again.
Project -> Save Project As...
Then insert the code below into the .sublime-project file again:
{
"folders":
[
{
"follow_symlinks": true,
"path": "."
}
],
"settings": {
"xdebug": {
"url": "http://my.local.website/",
}
}
}
Note: be sure to change the "url" to your site's base URL.
Sounds like you need to provide a URL for Xdebug.
For example, say your project's document root is served at http://localhost/myproject/public_html; then in Sublime Text 3 go to Project -> Edit Project and enter a url value for xdebug.
For example, my project file currently looks like:
{
"folders":
[
{
"follow_symlinks": true,
"path": "/var/www/html/myproject"
}
],
/* below is the important part ;) */
"settings": {
"xdebug": {
"url": "http://localhost/myproject/public_html",
}
}
}
Once this is saved (you may need to restart Sublime Text), the "Xdebug Start Debugging (Launch Browser)" option should appear.
You can also set the url field in the user settings for Xdebug (Preferences -> Package Settings -> Xdebug -> Settings - User), but then the URL would apply to all projects rather than just the current one.
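For reference, a minimal sketch of that user settings file (this assumes the SublimeTextXdebug package, whose user settings file is named Xdebug.sublime-settings; only the "url" key is taken from this answer):
{
    // Applies to every project; for a per-project URL use the .sublime-project file shown above instead.
    "url": "http://my.local.website/"
}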
I am using ESLint in my Vue (Nuxt) project in VSCode. When I save, I would like ESLint to run automatically and fix all the warnings for me.
This is my settings.json file in VSCode:
{
"editor.codeActionsOnSave": {
"source.fixAll.eslint": true
},
"vetur.format.defaultFormatter.js": "vscode-typescript",
"vetur.validation.template": false,
"vetur.completion.scaffoldSnippetSources": {},
"vetur.completion.useScaffoldSnippets": false,
"vetur.format.defaultFormatter.html": "none",
"workbench.iconTheme": "material-icon-theme",
"git.autofetch": true,
"git.defaultCloneDirectory": "",
"gitlens.views.repositories.files.layout": "list",
"editor.tabSize": 2,
"editor.detectIndentation": false,
}
And this is my .eslintrc.js file:
module.exports = {
root: true,
env: {
browser: true,
node: true,
},
extends: [
"#nuxtjs",
"plugin:nuxt/recommended",
"../.eslintrc.js"
],
rules: {
//Add custom rules for the frontend here.
//Rules that are common for shared, frontend and backend should go in the parent file
"nuxt/no-cjs-in-config": "off",
},
}
The linked ../.eslintrc.js file contains the following:
module.exports = {
parserOptions: {
parser: 'babel-eslint',
},
plugins: ['jest'],
rules: {
'prefer-const': 'off',
'comma-dangle': ['error', 'always-multiline'],
'prefer-template': 'error',
},
env: {
'jest/globals': true
}
}
Whenever I save the file the warnings just show up and will not automatically fix themselves.
EDIT:
I've turned on verbose output and I'm seeing this in the output:
(node:6832) UnhandledPromiseRejectionWarning: Error: Failed to load plugin 'import' declared in 'frontend\.eslintrc.js » @nuxtjs/eslint-config': Cannot find module 'eslint-plugin-import'
Require stack:
I then ran yarn add eslint-plugin-import and tried again, but it still returns the same error.
Install the ESLint extension and add this code to your settings.json:
{
"editor.codeActionsOnSave": {
"source.fixAll.eslint": true
},
"eslint.validate": ["javascript"]
}
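Since the question is about a Vue/Nuxt project, you may also need to tell the ESLint extension to validate .vue files. A hedged variant of the snippet above; the "vue" language identifier is the only addition:
{
    "editor.codeActionsOnSave": {
        "source.fixAll.eslint": true
    },
    "eslint.validate": ["javascript", "vue"]
}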
Launch VSCode, press Command + Shift + P, type "settings" and hit Enter, then paste and save the following:
{
"editor.codeActionsOnSave": {
"source.fixAll.eslint": true,
}
}
You're good to go!
I've managed to fix the issue.
The problem was that there were multiple working directories in my solution, which all have their own eslint config.
Putting the following line in the settings.json file of VSCode solved my issue:
"eslint.workingDirectories": [{ "mode": "auto" }]
I tried those solutions and others, and it still didn't work. It turned out that ESLint simply had to be approved for use in VSCode. I clicked on the ESLint item in the editor's status bar, which opened a popup asking me to approve ESLint. After approval, auto-fix on save ran as expected.
Install the ESLint extension from the VSCode marketplace.
Once the ESLint extension is installed, you can use Ctrl + Shift + P to open the Command Palette. Search for "ESLint: Fix all auto-fixable Problems" and press Enter.
This command runs ESLint and fixes all auto-fixable problems in the current file.
In my case I was getting ESLint errors and, despite saving the file, the auto-fixable problems were not getting fixed by my eslint/prettier setup.
So I tried pressing Ctrl + Shift + P and selecting Prettier as the default formatter, and also tried restarting the ESLint server, but that didn't work.
I then noticed that VSCode was showing some notifications in the bottom right corner (bell icon). Clicking it brought up a list of pop-ups stating that multiple formatters were installed for the same file extension; for .jsx files, for example, there were two formatters, Prettier and VSCode's built-in formatter. I clicked the Configure button, selected Prettier as the default, and when I saved the file it worked!
If this doesn't work for you: I think it worked for me because I had the ESLint npm packages that integrate with Prettier installed in my project to enforce the Prettier rules (eslint-config-prettier and eslint-plugin-prettier).
I ran into a similar problem: ESLint was not formatting certain, seemingly random, files on save. Running eslint --fix would fix the formatting errors, but saving would not. Adding this line to our workspace settings.json fixed the problem:
"eslint.format.enable": true,
Making all our formatter settings look like this:
"editor.formatOnSave": true,
"editor.codeActionsOnSave": {
"source.fixAll": true
},
"eslint.format.enable": true,
You can also go into the ESLint extension settings and tick the checkbox labeled ESLint > Format: Enable. Worked like a charm!
What fixed it for me was adding this to settings.json, because VSCode didn't know which formatter I wanted to use on save:
"[javascript]": {
"editor.defaultFormatter": "dbaeumer.vscode-eslint"
},
Check whether other formatters are enabled in settings.json; in my case I had this set by mistake:
"[javascript]": {
"editor.defaultFormatter": "vscode.typescript-language-features"}
After installing the ESLint extension, you need to add this to settings.json:
{
"editor.codeActionsOnSave": {
"source.fixAll.eslint": true
},
}
Still not working? Check whether ESLint itself runs correctly by running this in the terminal:
eslint --ext .js,.vue --ignore-path .gitignore .
If it fails with exit code 2, try removing node_modules and installing again.
After running this command you should see the ESLint errors.
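If the command works from the terminal, it can help to keep it (plus a --fix variant) in package.json so it is easy to rerun. A sketch; the script names are arbitrary:
{
    "scripts": {
        "lint": "eslint --ext .js,.vue --ignore-path .gitignore .",
        "lint:fix": "eslint --ext .js,.vue --ignore-path .gitignore . --fix"
    }
}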
I was dealing with the same issue and nothing seemed to help, even though I had set up all the configuration correctly and ESLint was marking the problems in my files correctly.
For me the solution was to move the .vscode folder to the project root.
Now everything works as intended.
A normal Docker container does not keep changes after the container exits. But with vscode-remote, when I open my project in a container using devcontainer settings, the changes still exist every time I reopen VSCode.
My project is based on an image, and I just want to open VSCode Remote to edit my project files. When I change the files that come from the image, my expectation is that the changes are reverted when I reopen VSCode.
{
"name": "xxx",
"service": "xxx-service",
"dockerComposeFile": "./docker-compose.yml",
"workspaceFolder": "/home/rde",
"postCreateCommand": "rde docker:run && yarn",
"extensions": [],
"settings": {
"javascript.implicitProjectConfig.experimentalDecorators": true,
"files.exclude": {}
}
}
I have a Protractor spec running from a Jenkins job that connects to SauceLabs. It clicks a button to download a PDF and checks that the file downloaded successfully. I am unable to get the Chrome browser to NOT open a "Save As" prompt when using an absolute path. I AM able to avoid the "Save As" prompt if I use '~/Downloads' instead, but then the browser.wait that waits for the file to exist hangs forever.
// spec.js
import fs from 'fs'
import path from 'path'
fs.mkdirSync('./downloads')
describe('Clicking DOWNLOAD button', () => {
it('should download a proposal', () => {
const filename = path.resolve(__dirname, './downloads/proposal.pdf')
if (fs.existsSync(filename)) {
fs.unlinkSync(filename)
}
page.downloadProposalBtn.click()
browser.wait(() => fs.existsSync(filename), 180000)
.then(() => {
expect(fs.existsSync(filename)).toBe(true)
})
}, 180000)
})
Below is the relevant portion of my conf file. I would expect the prompt_for_download setting to make the prompt not show, but it does..
// conf.js
capabilities: {
platform: 'macOS 10.12',
browserName: 'chrome',
version: '59.0',
screenResolution: '1400x1050',
chromeOptions: {
args: ['--no-sandbox', '--test-type=browser', 'disable-infobars=true'],
prefs: {
download: {
prompt_for_download: false,
directory_upgrade: true,
default_directory: path.resolve(__dirname, './downloads'),
},
credentials_enable_service: false,
},
},
},
Am I missing something? I feel like I might be misunderstanding where SauceLabs is running these tests, but it would seem that, given I fs.mkdir the ./downloads folder and then path.resolve it, this should work.
Unsure if this constitutes an "answer", but this is how I'm proceeding. After a lot of research, I don't think testing whether a file downloaded on a remote VM (like SauceLabs) is possible. What I am doing instead is breaking the test into two parts:
Test the download button: click the button and assert that no error occurred
Make a GET request to the underlying api/download endpoint that the button uses, write the response to a folder, and from there assert (using Node.js) that the file exists in my Jenkins project's workspace; a sketch follows below. This feels hacky, but given SauceLabs doesn't seem to give much access to the VM that the WebDriver is running on, I don't see another way.
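A rough sketch of the second part, assuming the tests run under Node and that api/download serves the PDF over HTTPS (the host name and file paths below are placeholders):
// download-check.js -- fetch the PDF directly and assert it was written to the workspace
const fs = require('fs')
const path = require('path')
const https = require('https')

const target = path.resolve(__dirname, 'downloads/proposal.pdf')
fs.mkdirSync(path.dirname(target), { recursive: true })

https.get('https://my-app.example.com/api/download', (res) => {
  const out = fs.createWriteStream(target)
  res.pipe(out)
  out.on('finish', () => {
    // In a Jasmine/Protractor spec this would be an expect(); console.assert keeps the sketch self-contained.
    console.assert(fs.existsSync(target), 'expected the PDF to be written to the workspace')
  })
}).on('error', (err) => {
  throw err
})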
You will get the prompt, even though the option is set to false, if the default download directory doesn't exist or isn't valid. "./downloads" is a relative path, which the Chrome driver probably doesn't resolve to the directory you expect.
Try this instead:
default_directory: __dirname + '/downloads'
I am new to the service worker concept, so forgive me if I am overlooking something in the documentation. I have an Angular application already running in production and I am trying to introduce a service worker using sw-precache.
To start with, I am trying to precache all images/fonts and a couple of JS files to see if it works, so my precache config is like this:
{
"cacheId": "static-cache",
"importScripts": [
"sw-toolbox.js"
],
"stripPrefix": "dist/",
"verbose": true,
"staticFileGlobs": [
"dist/img/**.*",
"dist/javascripts/require.js",
"dist/stylesheets/**.*",
"dist/webfonts/**.{ttf, eot, woff}",
"sw-toolbox.js",
"service-worker.js"
]
}
Now I can see the service worker registered and installed properly, and Cache Storage shows all the URLs with _sw-precache hashes.
But when I load the application and look at the Network tab, all static content is still served from memory/disk, not from the service worker, and I am unable to work out why. Am I missing something here?
UPDATE:
More information: I had the wrong configuration, since I have a dynamic URL and server-side rendered HTML. On the server side it's test.jsp which gives me the initial shell.
For now I have removed all other static files from the cache and kept only show.css.
So the updated config now is:
{
"importScripts": [
"sw-toolbox.js"
],
"stripPrefix": "dist/",
"verbose": true,
"staticFileGlobs": [
"dist/stylesheets/show.css"
],
"dynamicUrlToDependencies": {
"/developers": ["dist/stylesheets/show.css"]
},
"navigateFallback": "/developers"
}
The web root folder is named differently; it is:
- dashboard
-- img
-- javascripts
-- service-worker.js
-- sw-toolbox.js
- test.jsp
And I see the /developers URL as an entry in the cache storage, but it's still not served from the service worker on the next refresh. I have put all my energy into fixing this, but I desperately need a clue as to what's missing here. TIA.
Let me know if you need more info.
It seems that whitespace in your file extension list is not allowed. Your definition for webfonts should be:
"dist/webfonts/**.{ttf,eot,woff}",
I cloned the sw-precache repo and added a unit test comparing two generated files with two different staticFileGlobs, one with whitespace and one without:
it('should handle multiple file extensions', function(done) {
var config = {
logger: NOOP,
staticFileGlobs: [
'test/data/one/*.{txt,rmd}'
],
stripPrefix: 'test'
};
var configPrime = {
logger: NOOP,
staticFileGlobs: [
'test/data/one/*.{txt, rmd}'
],
};
generate(config, function(error, responseString) {
assert.ifError(error);
generate(configPrime, function(error, responseStringPrime) {
assert.ifError(error);
console.log('responseStringPrime',responseString);
assert.strictEqual(responseString, responseStringPrime);
done();
});
});
});
and it failed. The second config didn't include the .rmd file:
-var precacheConfig = [["/data/one/a.rmd","0cc175b9c0f1b6a831c399e269772661"],["/data/one/a.txt","933222b19ff3e7ea5f65517ea1f7d57e"],["/data/one/c.txt","fa1f726044eed39debea9998ab700388"]];
versus
+var precacheConfig = [["test/data/one/a.txt","933222b19ff3e7ea5f65517ea1f7d57e"],["test/data/one/c.txt","fa1f726044eed39debea9998ab700388"]];
I want to use the PayPal Adaptive Payments and PayPal Adaptive Accounts libraries in my CakePHP 2.4.x application. I am loading them via Composer. My composer.json file looks like this:
{
"require": {
"paypal/adaptivepayments-sdk-php":"v3.6.106",
"paypal/adaptiveaccounts-sdk-php":"v3.6.106"
},
"config": {
"vendor-dir": "Vendor"
}
}
Both libs contain PayPal/Types/Common/RequestEnvelope.php, and the two versions are different. I'm running into a conflict with this class name where the right one isn't being used. I believe the solution is to use autoload in my composer.json. I've read the documentation but don't believe I'm using it correctly. Here is what I'm attempting:
{
"require": {
"paypal/adaptivepayments-sdk-php":"v3.6.106",
"paypal/adaptiveaccounts-sdk-php":"v3.6.106"
},
"config": {
"vendor-dir": "Vendor"
},
"autoload": {
"psr-4": {
"AdaptivePaymentsLib\\": "Vendor/paypal/adaptivepayments-sdk-php/lib",
"AdaptiveAccountsLib\\": "Vendor/paypal/adaptiveaccounts-sdk-php/lib"
}
}
}
And in my controller I'm attempting to call RequestEnvelope like this:
$requestEnvelope = new AdaptivePaymentsLib\PayPal\Types\Common\RequestEnvelope("en_US");
It is not being found. Adaptive Accounts was only recently added to the project. Previously, getting the request envelope worked fine with $requestEnvelope = new PayPal\Types\Common\RequestEnvelope("en_US"); so it was only the addition of the accounts library that introduced the conflict and caused the breakage.
You should not define autoloading for your dependencies; that is their task to solve.
If you look at the composer.json file for paypal/adaptivepayments-sdk-php, you see:
"autoload": {
"psr-0": {
"PayPal\\Service": "lib/",
"PayPal\\Types": "lib/"
}
}
If you look at the same file in paypal/adaptiveaccounts-sdk-php, you see:
"autoload": {
"psr-0": {
"PayPal\\Service": "lib/",
"PayPal\\Types": "lib/"
}
}
After installing, Composer creates a file vendor/composer/autoload_namespaces.php with this content:
return array(
'PayPal\\Types' => array($vendorDir . '/paypal/adaptivepayments-sdk-php/lib', $vendorDir . '/paypal/adaptiveaccounts-sdk-php/lib'),
'PayPal\\Service' => array($vendorDir . '/paypal/adaptivepayments-sdk-php/lib', $vendorDir . '/paypal/adaptiveaccounts-sdk-php/lib'),
'PayPal' => array($vendorDir . '/paypal/sdk-core-php/lib'),
);
So both libraries are included here, and I have no doubt the autoloading will work.
You cannot really do anything about the duplicate classes with different content. Did you open an issue on GitHub? Without making the developer team aware of this problem, it will never get solved.
As a hack, you could define a post-install and post-update script that deletes one of these files. See the Composer documentation for more details. Composer accepts either an arbitrary shell command or a static call to a PHP class; I'd go with the shell command here.
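A sketch of that hack in composer.json, assuming you want to keep the payments copy and drop the accounts copy (the exact path to the duplicate file is an assumption; check it in your Vendor directory first):
{
    "scripts": {
        "post-install-cmd": "rm -f Vendor/paypal/adaptiveaccounts-sdk-php/lib/PayPal/Types/Common/RequestEnvelope.php",
        "post-update-cmd": "rm -f Vendor/paypal/adaptiveaccounts-sdk-php/lib/PayPal/Types/Common/RequestEnvelope.php"
    }
}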