Running failed test using RetryAnalyzer - not working as expected for test using data provider - selenium-webdriver

I am using IRetryAnalyzer to re-run failed test cases and IAnnotationTransformer to set the annotation at run time. For a @Test that uses a data provider, it gives strange results.
I have set the retry limit to 3, i.e. a failed test should be re-run 3 times. The issue is:
If the test fails for the first data set, it is retried 3 times (as expected). For every remaining data set the re-run count is 2, and I am not sure whether that is 2 retries or 1 run plus 1 retry.
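For context, the retry wiring I am using looks roughly like this (a sketch only; class names are illustrative and the actual implementation is in the attached project):
// Sketch only: shown together for brevity, each class goes in its own file in the real project.
import java.lang.reflect.Constructor;
import java.lang.reflect.Method;

import org.testng.IAnnotationTransformer;
import org.testng.IRetryAnalyzer;
import org.testng.ITestResult;
import org.testng.annotations.ITestAnnotation;

class RetryAnalyzer implements IRetryAnalyzer {
    private static final int MAX_RETRIES = 3;
    private int attempt = 0;

    @Override
    public boolean retry(ITestResult result) {
        // Re-run the failed invocation until the retry limit is reached
        if (attempt < MAX_RETRIES) {
            attempt++;
            return true;
        }
        return false;
    }
}

class AnnotationTransformer implements IAnnotationTransformer {
    @Override
    public void transform(ITestAnnotation annotation, Class testClass,
                          Constructor testConstructor, Method testMethod) {
        // Attach the retry analyzer to every @Test at run time
        annotation.setRetryAnalyzer(RetryAnalyzer.class);
    }
}
The transformer is registered as a listener in testng.xml.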
Here is the test class using the data provider:
import org.testng.Assert;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

// Class name is illustrative; the question does not show the original class declaration.
public class ToolbarActionsTest {

    @Test(dataProvider = "data-source")
    public void toolbarActionsOnShapes(String selectShape)
            throws InterruptedException {
        // Fails unconditionally so every data set exercises the retry logic
        Assert.assertTrue(false);
    }

    @DataProvider(name = "data-source")
    public Object[][] allShapes() {
        return new Object[][] { { "Rectangle" }, { "Circle" }, { "Triangle" } };
    }
}
On running this, I get the following output:
https://drive.google.com/open?id=1FxercluPinPiOOUAZKe_dMa6NvVMCE0j
For every data set, if the test fails there should be 3 retries. A dummy project zip is attached for reference:
https://drive.google.com/open?id=1Mt7V2TO4TWRKU9dN4FIFzprkDingUKaE
Thanks!!

This is due to a bug that exists in TestNG 7.0.0-beta1. Please see GITHUB-1946 for more details.
I went ahead and fixed this as part of my pull request PR-1948.
Please use TestNG 7.0.0-SNAPSHOT to get past this problem. The fix should be part of the upcoming TestNG 7.0.0-beta2 or the final 7.0.0 release; which one has not been decided yet.

Related

How to use the RefreshDatabase trait with a SQL Server database?

I am trying to use a real SQL Server connection to run my PHPUnit tests in Laravel because SQLite does not have the function I am trying to test.
The database name is "Test" and the connection details have all been added to database.php.
I have changed my phpunit.xml env variables to refer to the new test database.
<env name="DB_CONNECTION" value="test_sqlserver"/>
<env name="DB_DATABASE" value="Test"/>
Now, when I try to run a simple test with a class that uses the RefreshDatabase trait, or even the DatabaseMigrations trait, it results in the following error:
Symfony\Component\Console\Exception\InvalidOptionException: The "--drop-views" option does not exist.
It can be reproduced with the following test:
<?php

namespace Tests\Unit\Actions;

use Illuminate\Foundation\Testing\RefreshDatabase;
use Tests\TestCase;

class GetResourcesExceedingHoursByPayPeriodWeeksActionTest extends TestCase
{
    use RefreshDatabase;

    protected $dropViews = false;

    /** @test */
    public function it_can_get_resources_exceeding_hours_by_pay_period_weeks()
    {
    }
}
Edit
The solution below proposed by Daniel did not solve the issue. Taking inspiration from it, however, I added the following method:
protected function migrateFreshUsing()
{
    $seeder = $this->seeder();

    return array_merge([
        // '--drop-views' => $this->shouldDropViews(),
        // '--drop-types' => $this->shouldDropTypes(),
    ],
        // $seeder ? ['--seeder' => $seeder] : ['--seed' => $this->shouldSeed()]
    );
}
It resolved the error; however, PHPUnit now does not output any result. (I added a simple assertion, $this->assertTrue(false);, to check.)
Full class code:
<?php

namespace Tests\Unit\Actions;

use Illuminate\Foundation\Testing\RefreshDatabase;
use Tests\TestCase;

class GetResourcesExceedingHoursByPayPeriodWeeksActionTest extends TestCase
{
    use RefreshDatabase;

    protected $dropViews = false;

    protected function migrateFreshUsing()
    {
        $seeder = $this->seeder();

        return array_merge([
            // '--drop-views' => $this->shouldDropViews(),
            // '--drop-types' => $this->shouldDropTypes(),
        ],
            // $seeder ? ['--seeder' => $seeder] : ['--seed' => $this->shouldSeed()]
        );
    }

    /** @test */
    public function it_can_get_resources_exceeding_hours_by_pay_period_weeks()
    {
        $this->assertTrue(false);
    }
}
Edit 13:00
Adding the new tag in phpunit.xml did not help get output on the console.
However, I found the line causing the issue: when it is commented out, PHPUnit does output to the console.
To test correctly, I extended the base TestCase class instead of the one shown above.
Then I added $this->seed(DatabaseSeeder::class); as the first line of my test, which resulted in no output. Commenting it out lets PHPUnit produce output, but then the test only works with the dummy assertion and not with anything that requires seeding.
On further investigation, it seems the issue is caused by 3 seeders that use a third-party package to seed from a CSV file.
Every other seeder runs fine.
I will keep investigating this path and update my question with the solution.
How can I test using a real SQL Server database?
I'm not familiar enough with using SQL Server with Laravel, but this answer might be what you're looking for.
It's often the case that there's a default method that doesn't mesh well with a specific use case.
Writing this into your Tests\TestCase class might solve the issue.
/**
 * Refresh a conventional test database.
 *
 * @return void
 */
protected function refreshTestDatabase()
{
    // Requires: use Illuminate\Contracts\Console\Kernel;
    //           use Illuminate\Foundation\Testing\RefreshDatabaseState;
    if (! RefreshDatabaseState::$migrated) {
        $this->artisan('migrate:fresh', [
            '--drop-views' => $this->shouldDropViews(),
            '--drop-types' => $this->shouldDropTypes(),
        ]);

        $this->app[Kernel::class]->setArtisan(null);

        RefreshDatabaseState::$migrated = true;
    }

    $this->beginDatabaseTransaction();
}
EDIT Jan 13 2023 12:23 pm
For the PHPUnit-not-outputting-anything issue, see this answer. Let me know in the comments if you don't get an explicit error displayed on your terminal.

How can I have the soft asserts that I put in my test script be reported into the ExtentReports Report, under each failed step?

I am working with a testing framework that uses Selenium, TestNG, Java and ExtentReports for reporting.
I have a Test script which is divided into several steps and at the end of each step I have hard asserts to validate the existence of elements that I am interacting with.
I would like to use soft asserts so that subsequent steps can continue executing, but I would also like the ExtentReports report to show a failure indication for each failed step, not just for the step where the script finally stops.
For example, I would like to see in the report something like: step 1 - passed; step 2 - failed (with the exception logged as the cause of the error); step 3 - passed, etc.
Currently, if I add a soft assert for an element that cannot be found at step 2 from the example above, that step is marked as passed. I'd like it to be marked as failed but still continue on to steps 3, 4, etc.
Does anyone know how I can do that, or provide some documentation? Any help would be much appreciated.
Assuming these steps are part of one long test method, try using the concept of child nodes. You can set their status to fail, error, etc. when an assertion fails. The catch is that you will need a hard assertion inside a try-catch block, catch the AssertionError, and then set the status.
ExtentTest test = extent.startTest("Hello", "Yeah");
extent.loadConfig(ExtentReports.class, "extent-config.xml");
test.log(LogStatus.PASS, "Before Step details");

ExtentTest child1 = extent.startTest("Child 1");
try {
    // Assertion to be placed here
    child1.log(LogStatus.PASS, "Pass");
} catch (AssertionError e) {
    child1.log(LogStatus.FAIL, "Fail");
}
// Add to soft assertion

ExtentTest child2 = extent.startTest("Child 2");
try {
    // Assertion to be placed here
    child2.log(LogStatus.PASS, "Pass");
} catch (AssertionError e) {
    child2.log(LogStatus.FAIL, "Fail");
}
// Add to soft assertion

test.appendChild(child1).appendChild(child2);
test.log(LogStatus.PASS, "After Step details");
You get a report with a pass/fail entry for each child step (screenshot omitted).
Updated
Add this method to the ExtentTestManager class and call the static method from the TestNG test. (The class could also be written more simply using ThreadLocal; see http://extentreports.com/docs/versions/3/java/#testng-examples.)
public static synchronized void updateStepResult(String childNodeDesc, Object actual, Object expected) {
    // Assumes a static import of org.testng.Assert.assertEquals
    ExtentTest test = extentTestMap.get((int) (long) (Thread.currentThread().getId()));
    ExtentTest cn = test.appendChild(extent.startTest(childNodeDesc));
    try {
        assertEquals(actual, expected);
        cn.log(LogStatus.PASS, "Pass");
    } catch (AssertionError e) {
        cn.log(LogStatus.FAIL, "Fail");
    }
}
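A test could then report each step as its own child node. The following is only a sketch; it assumes the ExtentTestManager above has already started an ExtentTest for the current thread (for example in a listener or @BeforeMethod), and the step names and expected values are illustrative:
@Test
public void toolbarStepsAreReportedIndividually() {
    // Each call appends a PASS/FAIL child node; execution continues even after a failed step
    ExtentTestManager.updateStepResult("Step 1 - open toolbar", "open", "open");
    ExtentTestManager.updateStepResult("Step 2 - select shape", "Circle", "Rectangle"); // logged as FAIL
    ExtentTestManager.updateStepResult("Step 3 - delete shape", "deleted", "deleted");
}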

Timeout for AndroidJUnitRunner + ActivityInstrumentationTestCase2?

The setup:
An older project I've inherited has a lot of legacy instrumentation tests, and I would like to impose a timeout on them, since many of them can hang indefinitely, which makes it hard to get a test report. I'm in the process of updating the tests to JUnit 4 style, but at the moment they all extend ActivityInstrumentationTestCase2.
Tried so far:
In the documentation for AndroidJUnitRunner it says to set this flag:
Set timeout (in milliseconds) that will be applied to each test: -e timeout_msec 5000
...
...
All arguments can also be specified in the AndroidManifest via a meta-data tag
I've tried adding AndroidJUnitRunner configuration to the app manifest and the test manifest, but the timeout_msec meta-data item has had no effect so far.
You can use a rule to provide a timeout for each test in the class as shown below.
@Rule
public Timeout timeout = new Timeout(120000, TimeUnit.MILLISECONDS);
You can also specify per-test timeouts using the following:
@Test(timeout = 100) // Exception: test timed out after 100 milliseconds
public void test1() throws Exception {
    Thread.sleep(200);
}
You can read more about the differences in this answer:
https://stackoverflow.com/a/32034936/2128442
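Note that both the @Rule and the @Test(timeout = ...) approaches only take effect for JUnit 4-style tests; they have no effect on ActivityInstrumentationTestCase2 subclasses, so each test needs to be migrated first. A rough sketch of a migrated test using the support-library runner (the activity and class names are illustrative, not from the original project):
import java.util.concurrent.TimeUnit;

import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.Timeout;
import org.junit.runner.RunWith;

import android.support.test.rule.ActivityTestRule;
import android.support.test.runner.AndroidJUnit4;

@RunWith(AndroidJUnit4.class)
public class MainActivityTimeoutTest {

    // Hypothetical activity under test
    @Rule
    public ActivityTestRule<MainActivity> activityRule =
            new ActivityTestRule<>(MainActivity.class);

    // Two-minute timeout applied to every test in this class
    @Rule
    public Timeout globalTimeout = new Timeout(120000, TimeUnit.MILLISECONDS);

    @Test(timeout = 5000) // tighter bound for this particular test
    public void activityStartsWithinFiveSeconds() {
        // assertions against activityRule.getActivity() would go here
    }
}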

How to have a Selenium test locate elements generated by Angular?

I'm currently working through the Angular tutorial using the Wisdom framework as the back end. As a consequence, I run end-to-end tests using FluentLenium, as the Wisdom framework documentation suggests.
My test for step 3, although dead simple, doesn't pass.
The full test can be found on GitHub: Step03IsImplementedIT
However, here is the offending extract (around line 30):
@Test
public void canTestPageCorrectly() {
    if (getDriver() instanceof HtmlUnitDriver) {
        HtmlUnitDriver driver = (HtmlUnitDriver) getDriver();
        if (!driver.isJavascriptEnabled()) {
            driver.setJavascriptEnabled(true);
        }
        Assert.assertTrue("Javascript should be enabled for Angular to work!", driver.isJavascriptEnabled());
    }
    goTo(GoogleShopController.LIST);
    // And load the list of phones
    FluentWebElement phones = findFirst(".phones");
    assertThat(phones).isDisplayed();
    FluentList<FluentWebElement> items = find(".phone");
    assertThat(items).hasSize(3); // <-- this is the assert that fails
}
Failure message :
canTestPageCorrectly(org.ndx.wisdom.tutorial.angular.Step03IsImplementedIT) Time elapsed: 2.924 sec <<< FAILURE!
java.lang.AssertionError: Expected size: 3. Actual size: 1.
at org.fluentlenium.assertj.custom.FluentListAssert.hasSize(FluentListAssert.java:60)
at org.ndx.wisdom.tutorial.angular.Step03IsImplementedIT.canTestPageCorrectly(Step03IsImplementedIT.java:33)
From that failure, I guess the Angular controllers weren't loaded.
How can I make sure they are? And how can I get a working test?
It turned out the error wasn't the expected one... well, it was, but in a hidden fashion.
HtmlUnitDriver, as one may be aware, is a pure Java implementation of a browser and, as such, has some limitations.
One of its limitations is JavaScript interpretation, which seems to go awfully wrong with Angular.
To make a long story short, the simplest way to fix that is to replace the default driver with the Firefox one, which implies:
- setting fluentlenium.browser to firefox
- making sure the driver loads correctly (firefox.exe must be on the PATH when using its driver) by adding a small assert at the beginning of the test
The final test is then:
assertThat(getDriver()).isInstanceOf(FirefoxDriver.class);
goTo(GoogleShopController.LIST);
FluentList<FluentWebElement> items = find("li");
FluentLeniumAssertions.assertThat(items).hasSize(3);
fill("input").with("nexus");
await();
items = find(".phone");
FluentLeniumAssertions.assertThat(items).hasSize(1);
fill("input").with("motorola");
await();
items = find(".phone");
FluentLeniumAssertions.assertThat(items).hasSize(2);

Using SolrNet to query Solr from a console application?

I'm trying to use SolrNet in a command line application (or more accurately, from LINQPad) to test some queries, and when trying to initialize the library, I get the following error:
Key 'SolrNet.Impl.SolrConnection.UserQuery+Resource.SolrNet.Impl.SolrConnection' already registered in container
However, if I catch this error and continue, the ServiceLocator gives me the following error:
Activation error occured while trying to get instance of type ISolrOperations`1, key ""
With the inner exception:
The given key was not present in the dictionary.
My full code looks like this:
try
{
    Startup.Init<Resource>("http://localhost:8080/solr/");
    Console.WriteLine("Initialized\n");
}
catch (Exception ex)
{
    Console.WriteLine("Already Initialized: " + ex.Message);
}

// This line causes the error if Solr is already initialized
var solr = ServiceLocator.Current.GetInstance<ISolrOperations<Resource>>();

// Do the search
var results = solr.Query(new SolrQuery("title:test"));
I'm running Tomcat 7 on Windows 7x64 with Solr 3.4.0 installed.
There's another message about the same problem on StackOverflow, though the accepted answer of putting the Startup.Init code in Global.asax is only relevant to ASP.NET.
Restarting the Tomcat7 service resolves the problem, but having to do this after every query is a pain.
What is the correct way to use the SolrNet library to interact with Solr from a C# console application?
The correct way to use SolrNet in a console application is to only execute the line
Startup.Init<Resource>("http://localhost:8080/solr/");
once for the life of your console application. I typically put it as the first line in my Main method as shown below...
static void Main(string[] args)
{
    Startup.Init<Resource>("http://localhost:8080/solr/");

    // Call method or do work to query from Solr here...
    // Using your code in a method...
    QuerySolr();
}

private static void QuerySolr()
{
    var solr = ServiceLocator.Current.GetInstance<ISolrOperations<Resource>>();

    // Do the search
    var results = solr.Query(new SolrQuery("title:test"));
}
Your error comes from the fact that you are trying to initialize the SolrNet connection multiple times. You only need to initialize it once when the console application starts, and then look it up via the ServiceLocator when needed.
My solution is to clear the Startup container before calling Init:
Startup.Container.Clear();
Startup.InitContainer();
Startup.Init<Resource>("http://localhost:8080/solr/");
