Acceptance Testing
Business people and testers collaborating
Top 5(ish) reasons why teams fail with acceptance testing
- No collaboration
- Focusing on 'how' rather than 'what' (see the sketch after this list)
- Tests unusable as living documentation
- Acceptance testing is not seen as a 'value-adding' activity
- Expecting acceptance tests to be a full regression suite
- Focusing on tools
- Automation code is not treated as seriously as production code - 'it's only test code', so normal code rules are not applied and the test code is not maintained 'with love'
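
The 'how' vs 'what' point is easiest to see in code. Below is a minimal, self-contained Python sketch; the `FreeDeliveryPolicy` class and its rule (free delivery for orders of five or more books) are invented purely for illustration, and the 'how'-style test only simulates UI steps in comments so the example stays runnable.

```python
# A minimal sketch of 'how'-focused vs 'what'-focused acceptance tests.
# FreeDeliveryPolicy and its rule (free delivery for orders of five or more
# books) are invented here purely for illustration.

class FreeDeliveryPolicy:
    """Toy domain rule: orders of five or more books ship for free."""
    THRESHOLD = 5

    def qualifies(self, number_of_books: int) -> bool:
        return number_of_books >= self.THRESHOLD


def test_free_delivery_via_checkout_screens():
    # 'How'-focused: scripted around workflow mechanics (imagine: open the
    # home page, log in, click 'add to basket' five times, open the basket,
    # read the delivery-cost label). The business rule is buried in
    # incidental detail and the test breaks whenever the screens change.
    policy = FreeDeliveryPolicy()
    books_in_basket = 0
    for _ in range(5):
        books_in_basket += 1  # stands in for five 'add to basket' clicks
    assert policy.qualifies(books_in_basket)


def test_orders_of_five_or_more_books_get_free_delivery():
    # 'What'-focused: states the agreed business rule with concrete examples,
    # so the test doubles as readable documentation of the requirement.
    policy = FreeDeliveryPolicy()
    assert policy.qualifies(5)
    assert not policy.qualifies(4)


if __name__ == "__main__":
    test_free_delivery_via_checkout_screens()
    test_orders_of_five_or_more_books_get_free_delivery()
    print("both examples pass")
```

The second test reads as a statement of the requirement; the first reads as a script, and only incidentally documents anything.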
Acceptance tests are a specification of the system - to be a good specification they should be exemplars, and they don't need to cover every single edge case (if they are to remain readable and usable as documentation).
You could split out more exhaustive testing into a separate section, a separate suite, or (better?) a separate tool - see the sketch below.
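
One way to make that split concrete (a sketch, assuming pytest is available): keep a handful of exemplar tests as the living documentation and tag the exhaustive edge-case checks with a custom marker so they can be excluded. `parse_quantity` and the `exhaustive` marker name are invented for illustration; a real project would register the marker in `pytest.ini` or `pyproject.toml` to avoid warnings.

```python
# A sketch of separating exemplar acceptance tests from exhaustive edge-case
# checks using a custom pytest marker. parse_quantity and the 'exhaustive'
# marker name are invented for illustration.
import pytest


def parse_quantity(text: str) -> int:
    """Toy function under test: parse a quantity field, rejecting non-positive values."""
    value = int(text)
    if value <= 0:
        raise ValueError("quantity must be positive")
    return value


# Exemplar acceptance tests: a few representative examples that read as
# documentation of the agreed behaviour.
def test_a_plain_number_is_accepted():
    assert parse_quantity("3") == 3


def test_zero_is_rejected():
    with pytest.raises(ValueError):
        parse_quantity("0")


# Exhaustive checks live in their own marked group (or a separate suite or
# tool entirely) so they never clutter the living documentation.
@pytest.mark.exhaustive
@pytest.mark.parametrize("text", ["1", "7", "250", "999999"])
def test_many_valid_quantities(text):
    assert parse_quantity(text) == int(text)
```

Running `pytest -m "not exhaustive"` then executes only the exemplar specification; the exhaustive group can be run separately (for example, only on CI).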
Don't reject acceptance testing because you don't like the tool - start with the tasks you need to achieve. If something is difficult to automate, that doesn't mean it can be ignored - it is still an 'acceptance test' and it still needs to be run.
Definition of 'acceptance test': whatever you've agreed with the client (not just what can be automated).
Some dislike the term 'acceptance testing' - it could mean the definition above (Test-Driven Requirements, Example-Driven Requirements, etc.), but it is often assumed to mean 'UAT', which is *not* what is being discussed here. 'Specification Workshop' has been successful as an alternative term.