Why You Should Worry About Expected Fails and Unexpected Passes in Tests

Gaurav Khattar - Oct 20 '22 - - Dev Community

Building software is a rigorous process in which every step of the development lifecycle plays a vital role in the final outcome, including testing. Whether you are testing your web portal for features or for browser compatibility, ensuring the reliability of the tests themselves is important.

To make sure our testing is comprehensive, we deliberately create test cases that we know are going to fail. For example, we test what happens if we enter the wrong email and password in a login form.

These test cases are expected to fail; sometimes, however, they pass, unexpectedly. That is exactly what an unexpected pass means!
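Test frameworks have a name for this situation. In pytest, for instance, a test marked `xfail` that fails is reported as XFAIL (expected failure), while one that passes is reported as XPASS, the unexpected pass described above. A minimal sketch (the `login` function is a hypothetical stand-in for real login logic):

```python
import pytest

def login(email, password):
    # Hypothetical stand-in: accepts exactly one valid credential pair.
    return email == "user@example.com" and password == "s3cret"

@pytest.mark.xfail(reason="wrong credentials should be rejected")
def test_login_with_wrong_credentials():
    # We expect this assertion to fail (XFAIL). If login() wrongly
    # accepts bad credentials, the test passes and pytest reports XPASS.
    assert login("user@example.com", "wrong-password")
```

With `@pytest.mark.xfail(strict=True)`, pytest goes further and treats an XPASS as a failure of the whole run, which is a good default if unexpected passes worry you.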

Software testing professionals who deal with a large number of test cases face these situations every day. Those less familiar with them can prepare now, before encountering one soon.

Let’s take a deeper look at the reasons that lead to expected fails and unexpected passes.

Poor Analysis

Analyzing, planning, and scheduling the work properly is important and demands the utmost attention. Often the test plans are not efficient enough; they are more or less incomplete summaries with poorly detailed descriptions of use cases. Test plan documentation is ignored after it is written, and testers often make erroneous judgements about the overall plan.

Testing is often deliberately performed within a limited amount of time, especially in rigorous manual testing. Testers tend to postpone significant testing until late in the development phase, which leads to much more time spent detecting bugs and testing cases towards the end.

Test communication is another issue that needs to be addressed with great care. It usually manifests as scanty test documentation, and such problems occur when test documents and test communication are inadequately maintained.

Check this out: UI Testing Tutorial: A Comprehensive Guide With Examples and Best Practices

Incomplete Test Coverage

One of the most troublesome problems a software testing team needs to work on is insufficient test coverage. Even when a team puts its best foot forward to cover most scenarios, poor coverage of real-world use cases often leads to false positives or false negatives.

Every team should aim for decent code coverage, around 80%, rather than wasting time covering trivial code such as getters and setters, which may not contain any logic.

Testers should rigorously cover every combination and permutation described in the use cases, and collect all the tests required for each run; nothing should be neglected at the tester’s end. Teams often have an insufficient understanding of what should and should not be tested, since full coverage is not always possible at the lower levels. The team should therefore do a proper analysis and shift its focus onto this metric.
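Covering every combination by hand is error-prone; generating the combinations mechanically is safer. A sketch using `itertools.product` with pytest’s parametrization (the `validate` helper and the input lists are hypothetical):

```python
import itertools
import pytest

EMAILS = ["user@example.com", "bad-email"]
PASSWORDS = ["s3cret", ""]

def validate(email, password):
    # Hypothetical validation: well-formed email and a non-trivial password.
    return "@" in email and len(password) >= 6

# Generate one test per (email, password) pair so no combination is skipped.
@pytest.mark.parametrize("email,password", itertools.product(EMAILS, PASSWORDS))
def test_all_combinations(email, password):
    # Only the single valid pair should be accepted.
    expected = email == "user@example.com" and password == "s3cret"
    assert validate(email, password) == expected
```

With two values per input this yields four tests; adding a value to either list grows the matrix automatically, so coverage keeps pace with the use-case descriptions.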

Check this out: Exploratory Testing Tutorial: A Comprehensive Guide With Examples and Best Practices

Security At Risk

A false positive is far more dangerous than a false negative, as it creates a false sense of security. Testers should think about how to strengthen security use cases so that the user’s sense of security is genuinely earned.

It is not only the tester who should worry about the security of use cases; major effort should also be invested during the application design stage. Developers can find easier and cheaper ways to build secure applications and software by addressing cross-site scripting flaws and applying other remedies.

Your testing team may not be running code scans frequently enough; scanning should happen not just at the beginning of the project but also during the quality assurance stage. A lack of threat modelling techniques can likewise let vulnerabilities or design flaws creep into the application undetected. Both testers and developers should run the software and monitor it from time to time, so that insecure activity reported at this phase can be caught and avoided.

Check this out: User Acceptance Testing (UAT) Tutorial: A Comprehensive Guide With Examples and Best Practices

Vulnerable Test Tools And Test Environment

Often there are simply too few test environments. And the environments that do exist either have poor quality, resulting in excessive defects, or have unsatisfactory fidelity to the actual system being tested.

Another issue that may require your attention is a difference between the behaviour of the system under test and its behaviour in operation. If a company shares vital information about the test environment, tools, and setup, it eases situations where tests fail to be delivered or where configuration control of test data, test software, and test environments is lacking.
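A cheap guard against such environment drift is to diff the configurations of the two environments automatically. A sketch, assuming configs have already been loaded into plain dictionaries (the keys and values here are hypothetical):

```python
PROD = {"db_pool_size": 20, "tls": True, "feature_x": False}
TEST = {"db_pool_size": 5, "tls": False, "feature_x": False}

def config_drift(a, b):
    # Return the set of keys whose values differ (or exist in only
    # one environment), so fidelity gaps are visible at a glance.
    return {k for k in a.keys() | b.keys() if a.get(k) != b.get(k)}

drift = config_drift(PROD, TEST)
```

Running such a check in CI turns "the test environment silently diverged from production" into an explicit, reviewable report of exactly which settings differ.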
