prosee wrote: How often do regression tests fail, do you think, based on your experience?
The answer to this one is pretty hard to provide in general.
Failure rates vary with the number of tests in a regression suite, and how much change there was in the application being tested.
The bigger the suite in terms of number of tests, and the more diverse it is, the more tests are going to show a problem. That's a GOOD sign...it means the regression testing is working.
The opposite case, a small, poorly designed suite, will seldom show that a regression happened...but we think this can give you a very false sense of security. So we recommend more tests, more-detailed tests, and more-focused tests...
We maintain a large test suite for eValid itself, and we usually find one or two things wrong after each build. In many cases the change is small or even unimportant; we then either modify the test script or make a small change to the product.
Our best guess on this is the following: if your tests are showing more than a 10% failure rate, then you don't have a stable enough product to justify maintaining a test suite for it. With that much change, the economics that strongly favor regression testing don't work out very well.
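The 10% rule of thumb above can be sketched as a small check. This is just an illustration in Python; the function names and the threshold constant are ours, not part of any eValid tool, and the post gives only the heuristic, not an implementation.

```python
# Hypothetical helper illustrating the "more than 10% failures" rule of thumb.
FAILURE_RATE_THRESHOLD = 0.10  # above this, the product may be too unstable

def failure_rate(failed: int, total: int) -> float:
    """Fraction of regression tests that failed in one run."""
    if total <= 0:
        raise ValueError("total must be positive")
    return failed / total

def suite_worth_maintaining(failed: int, total: int) -> bool:
    """Apply the rule of thumb: a suite failing more than 10% of its tests
    suggests the application is changing too fast for regression testing
    to pay off economically."""
    return failure_rate(failed, total) <= FAILURE_RATE_THRESHOLD

# Example: 2 failures out of 150 tests (~1.3%) is well under the bar.
print(suite_worth_maintaining(2, 150))   # True
print(suite_worth_maintaining(20, 150))  # False: ~13% failure rate
```

A run with one or two failures in a large suite, as described above, lands comfortably under this threshold.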
eValid Technical Support