I have searched quite a bit on how to measure login time but could not find a definitive answer. I do not want to introduce any timers in my scripts. My aim is to find out exactly how long login and logout take in my Selenium script.
This is what I have so far. I capture a start time and a finish time and compute the login time as follows:
public void testLogin() {
    String csvFile = "C:\\Users\\users.csv";
    BufferedReader br = null;
    String line = "";
    String csvSplitBy = ",";
    try {
        br = new BufferedReader(new FileReader(csvFile));
        while ((line = br.readLine()) != null) {
            // use comma as separator
            String[] value = line.split(csvSplitBy);
            WebDriver driver = new HtmlUnitDriver();
            //WebDriver driver = new FirefoxDriver();
            driver.get("http://www.test.com");
            long start = System.currentTimeMillis();
            driver.findElement(By.id("txt-username")).sendKeys(value[0]);
            driver.findElement(By.id("pwd-password")).sendKeys(value[1]);
            driver.findElement(By.id("login-widget-submit")).click();
            long finish = System.currentTimeMillis();
            long overallTime = finish - start;
            System.out.println("Total time for login - " + overallTime);
            driver.close();
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}
If you want to measure only the time taken to log in (not the time to load the login page and then log in), add a WebDriverWait after driver.get() and wait for a specific element so you know the login page is fully loaded. Add another wait after clicking Submit to make sure the page after login has loaded completely; that is a better test of login time. As it stands, you start the timer potentially before the login page is loaded and stop it when you click the Submit button, but at that point the user isn't actually logged in yet.
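A rough sketch of that idea (the login form IDs are taken from your code; the post-login id "logout-link" is just a placeholder for whatever element only appears once the user is logged in, and the usual org.openqa.selenium.support.ui imports for WebDriverWait and ExpectedConditions are assumed):
WebDriverWait wait = new WebDriverWait(driver, 30);

driver.get("http://www.test.com");
// make sure the login page is fully loaded before starting the timer
wait.until(ExpectedConditions.visibilityOfElementLocated(By.id("txt-username")));

long start = System.currentTimeMillis();
driver.findElement(By.id("txt-username")).sendKeys(value[0]);
driver.findElement(By.id("pwd-password")).sendKeys(value[1]);
driver.findElement(By.id("login-widget-submit")).click();
// wait for an element that only exists after a successful login (placeholder id)
wait.until(ExpectedConditions.visibilityOfElementLocated(By.id("logout-link")));
long finish = System.currentTimeMillis();

System.out.println("Total time for login - " + (finish - start) + " ms");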
I personally use the StopWatch class from Apache Commons (commons-lang) for timings.
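A minimal sketch with StopWatch (assuming org.apache.commons.lang3.time.StopWatch is on the classpath; it reuses the wait and the placeholder post-login element from the sketch above):
StopWatch stopWatch = new StopWatch();
stopWatch.start();
driver.findElement(By.id("txt-username")).sendKeys(value[0]);
driver.findElement(By.id("pwd-password")).sendKeys(value[1]);
driver.findElement(By.id("login-widget-submit")).click();
wait.until(ExpectedConditions.visibilityOfElementLocated(By.id("logout-link"))); // placeholder post-login element
stopWatch.stop();
System.out.println("Total time for login - " + stopWatch.getTime() + " ms");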
I get the error "org.openqa.selenium.ElementClickInterceptedException: element click intercepted: Element is not clickable at point (209, 760)" when I run the piece of code below in headless mode. When it is run with the browser displayed, there is no error and the test passes fine. As you can see below, I tried waiting, a JavaScript executor, and Actions moveToElement, but still without a good result. I am using XPath to locate the element, not coordinates. Why is this happening and how can I solve it? Thanks in advance.
@Test(priority = 1)
public void verifyAddUserWithMarkedMandatoryFields() {
    // accessing add user webpage / functionality
    userListObject.getAddUserButton().click();
    // inserting data to complete form
    addOrEditUserPageObject.insertCredentials(userModel.getUsername(), userModel.getEmail(), "", userModel.getPassword());
    // clicking Submit when it becomes enabled
    WebDriverWait myWaitVariable = new WebDriverWait(driver, 5);
    myWaitVariable.until(ExpectedConditions.elementToBeClickable(addOrEditUserPageObject.getSubmitButtonAddOrEdit()));
    // Actions actions = new Actions(driver);
    // actions.moveToElement(addOrEditUserPageObject.getSubmitButtonAddOrEdit()).click().perform();
    JavascriptExecutor jse = (JavascriptExecutor) driver;
    // jse.executeScript("scroll(209, 760)"); // if the element is on top.
    jse.executeScript("scroll(760, 209)"); // if the element is on bottom.
    addOrEditUserPageObject.getSubmitButtonAddOrEdit().click();
}
You should set a screen size for headless mode, something like this:
Map<String,String> prefs = new HashMap<>();
prefs.put("download.default_directory", downloadsPath); // Bypass default download directory in Chrome
prefs.put("safebrowsing.enabled", "false"); // Bypass warning message, keep file anyway (for .exe, .jar, etc.)
ChromeOptions opts = new ChromeOptions();
opts.setExperimentalOption("prefs", prefs);
opts.addArguments("--headless", "--disable-gpu", "--window-size=1920,1080","--ignore-certificate-errors","--no-sandbox", "--disable-dev-shm-usage");
driver = new ChromeDriver(opts);
I have included more than you need here; the only relevant point is "--window-size=1920,1080", which should resolve your problem.
The rest is just to show how things can be managed, including other settings that are useful in headless mode.
I am writing a Selenium test script which navigates to a URL, say https://www.flipkart.com/ (this is just an example website).
When you navigate to the home page for the first time, a message about cookies is displayed along with an "Accept Cookies" button.
Whenever my Selenium script runs and navigates to the home page, it gets this cookie message every time. My question is: what needs to be done so that the script does not encounter the cookie consent message?
I have managed to store the cookies in a file. It looks like this:
_gut_UB-97923818-1;1;.mycompany.com;/;Fri Mar 29 18:12:07 EET 2019;false
I have also tried to restore the cookie and set its expiry with the code below:
public void retrieveCookie()
{
    try {
        File file = new File("Cookie.data");
        FileReader fileReader = new FileReader(file);
        BufferedReader Buffreader = new BufferedReader(fileReader);
        String strline;
        while ((strline = Buffreader.readLine()) != null) {
            StringTokenizer token = new StringTokenizer(strline, ";");
            while (token.hasMoreTokens()) {
                String name = token.nextToken();
                String value = token.nextToken();
                String domain = token.nextToken();
                String path = token.nextToken();
                Date expiry = null;
                String val;
                if (!(val = token.nextToken()).equals("null")) { // Thu Mar 28 23:26:39 EET 2019
                    expiry = new Date(val);
                }
                Boolean isSecure = new Boolean(token.nextToken()).booleanValue();
                Cookie ck = new Cookie(name, value, domain, path, expiry, isSecure);
                BaseDriver.getDriver().manage().addCookie(ck); // This will add the stored cookie to our current session
            }
        }
    } catch (Exception ex) {
        ex.printStackTrace();
    }
    BaseDriver.getDriver().get("https://www.flipkart.com/");
}
However, I get a java.lang.IllegalArgumentException at the line
expiry = new Date(val);
because it is not able to parse the date.
Can someone share code so that the date can be parsed?
My only intention is that whenever the test script runs, it should not encounter the cookie consent message. If there is any other way to achieve this, please suggest it.
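(For reference, the stored timestamp, e.g. Fri Mar 29 18:12:07 EET 2019, is in the default java.util.Date.toString() format, so one possible sketch is to parse it with java.text.SimpleDateFormat instead of the deprecated Date(String) constructor; the pattern below is an assumption based on that sample value:)
// parse e.g. "Fri Mar 29 18:12:07 EET 2019" (default Date.toString() format)
SimpleDateFormat sdf = new SimpleDateFormat("EEE MMM dd HH:mm:ss zzz yyyy", Locale.ENGLISH);
Date expiry = null;
if (!val.equals("null")) {
    expiry = sdf.parse(val); // throws a checked ParseException rather than IllegalArgumentException
}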
Selenium/Java with PhantomJS:
On the first run, driver.get(loginURL) opens the login URL.
On the second run, driver.get(loginURL) goes to the home page instead of the login page, and of course it then doesn't find the elements of the login page.
(A scenario execution might fail, so logout may not be performed at the end.)
Any help? Any ideas why this is happening?
Thanks
Code Part and exception:
...
WebDriver driver;
PhantomJSDriverService phantomService = PhantomJSDriverService.createDefaultService();
System.setProperty("phantomjs.binary.path", "browserDrivers/phantomjs.exe");
driver = new PhantomJSDriver();
driver.get("https://xxxxxx/yy/");
System.out.println("Url :" + driver.getCurrentUrl());
WebElement loginField = driver.findElement(By.id("txt-username"));
WebElement passwdField = driver.findElement(By.id("txt-password"));
...
Results of the console printout:
1st run: Url :https://xxxxxx/yy/login (correct, and it also finds the next WebElements), fails in later steps
2nd run: Url :https://xxxxxx/yy/home (incorrect, it should be the login page again, https://xxxxxx/yy/login)
Exception thrown: org.openqa.selenium.NoSuchElementException:
{"errorMessage":"Unable to find element with id 'txt-username'"
For a proper cleanup (logout after a failure, in your case) you can implement a teardown method which is executed after each test in your test class, e.g. like this (using JUnit 4):
@org.junit.After
public void tearDown() {
    // your code for performing logout
    // ....
    // Close the current window, quitting the browser
    // if it's the last window currently open.
    if (driver != null) {
        driver.close();
    }
}
My problem:
I am running PHPUnit with Selenium to test a website on a server that is on the other side of the world, so there is a delay of a few seconds for things like clicking on a tab or loading a new page. I start Selenium Server with ChromeDriver.
For example:
public function setUp()
{
    $this->setHost('localhost'); // Set the hostname for the connection to the Selenium server.
    $this->setPort(4444); // set port # for connection to selenium server
    $this->setBrowser('chrome'); // set the browser to be used
    $this->setBrowserUrl('https://www.*.com'); // set base URL for tests
    $this->prepareSession()->currentWindow()->maximize(); // Maximize the window when the test starts
    $this->timeouts()->implicitWait(30000); // Wait up to 30 seconds for all elements to appear
}

public function testLoginToeSeaCare()
{
    $this->timeouts()->implicitWait(10000); // Wait up to 10 seconds for all elements to appear
    $url = 'https://www.*.com';
    $loginName = 'Ned';
    $loginPassword = 'Flanders';
    $this->url($url); // Load this url
    $this->timeouts()->implicitWait(30000); // Wait up to 30 seconds for all elements to appear
    $username = $this->byId('username'); // Search page for input that has an id = 'username' and assign it to $username
    $password = $this->byId('password'); // Search page for input that has an id = 'password' and assign it to $password
    $this->byId('username')->value($loginName); // Enter the $loginName text in username field
    $this->byId('password')->value($loginPassword); // Enter the $loginPassword in password field
    $this->byCssSelector('form')->submit(); // submit the form
    $tab1Link = $this->byLinkText("Tab1"); // Search for the textlink Tab1
    $this->assertEquals('Tab1', $tab1Link->text()); // assert tab text is present
    $this->timeouts()->implicitWait(10000); // Wait up to 10 seconds for all elements to appear
    $tab2Link = $this->byLinkText("Tab2");
    $tab2Link->click(); // Click 'Tab2' tab
}
An error is reported when the above is run; I capture it in an XML file:
********::testSearch PHPUnit_Extensions_Selenium2TestCase_WebDriverException: unknown error: Element ... is not clickable at point (430, 139). Other element would receive the click: (Session info: chrome=57.0.2987.133) (Driver info: chromedriver=2.29.461591 (62ebf098771772160f391d75e589dc567915b233)
What I am trying to do is wait for the DOM to be completely loaded before clicking on a button, but I get the above error intermittently. Does anyone know a way around this? It's driving me nuts!
Try explicit waits.
"An explicit wait is the code you define to wait for a certain condition to occur before proceeding further in the code. There are some convenience methods provided that help you write code that will wait only as long as required. WebDriverWait in combination with ExpectedCondition is one way this can be accomplished."
For example,
// Wait for the page title to be 'My Page'.
// Default wait (= 30 sec)
$driver->wait()->until(WebDriverExpectedCondition::titleIs('My Page'));
// Wait for at most 10s and retry every 500ms if the title is not correct.
$driver->wait(10, 500)->until(WebDriverExpectedCondition::titleIs('My Page'));
There are many prepared conditions you can pass to the until() method; they are provided by WebDriverExpectedCondition and include elementToBeClickable() (https://github.com/facebook/php-webdriver/wiki/HowTo-Wait).
I don't know if this will help you, but in Java there is a method to wait for a certain element to be visible.
Here is how it is written:
WebDriverWait wait = new WebDriverWait(driver, 10);
wait.until(ExpectedConditions.visibilityOf(driver.findElement(By.id("your-element-id")))); // replace with your element's locator
Sorry, I don't know how to write it in PHP.
I'm developing an application on GAE that fetches a web page and searches it for a link.
The page is updated every morning, so a cron job runs every 15 minutes for a couple of hours each morning to fetch the current day's page.
Here's the problem: if, at its first execution, the cron job finds the older page (yesterday's one), it keeps fetching that one, even though a new page is available at the same URL.
It seems that a cache is used somewhere, but I can't disable it.
The code the application uses to download the page is plain Java I/O:
InputStream input = null;
ByteArrayOutputStream output = null;
HttpURLConnection conn = null;
URL url = new URL("http://www.page.url.net");
try {
    conn = (HttpURLConnection) url.openConnection();
    conn.setReadTimeout(0);
    conn.setUseCaches(false);
    int httpResponseCode = conn.getResponseCode();
    if (httpResponseCode == HttpURLConnection.HTTP_OK) {
        input = conn.getInputStream();
        output = writeByteArrayOutputStreamFromInputStream(input);
    } else {
        throw new IOException("response code " + httpResponseCode);
    }
} finally {
    if (input != null) {
        input.close();
    }
    if (conn != null) {
        conn.disconnect();
    }
}
What's wrong?
To avoid caching, I suggest using a simple trick: add a "fake" query parameter to the end of the query string. For example, if the page you are fetching is
http://www.page.url.net
add a parameter named dummy= so the URL becomes:
http://www.page.url.net?dummy=2013-05-25
Just be sure the "dummy" parameter is not actually interpreted by the remote server.
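A minimal sketch of that in the Java code above (the dummy name comes from this example and the remote server is assumed to ignore it; any value that changes on every request will do):
// append a throwaway query parameter so any intermediate cache sees a new URL each time
String cacheBuster = "dummy=" + System.currentTimeMillis();
URL url = new URL("http://www.page.url.net?" + cacheBuster);
HttpURLConnection conn = (HttpURLConnection) url.openConnection();
conn.setUseCaches(false);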
Hope this helps.