AcceptanceTesting
Top 5(ish) reasons why teams fail with acceptance testing
- No collaboration
- Focusing on 'how' not on 'what'
- Tests unusable as live documentation
- Acceptance testing is not considered a 'value-adding' activity
- Expecting acceptance tests to be a full regression suite
- Focusing on tools
- "test code" is not maintained with love - automation code is not considered as important as 'production code' - 'it's only test code' - normal code rules are not applied
- objectives of team members not aligned
- no management buy-in
- underestimating the skill required to do this well
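As a side note on the 'test code is not maintained with love' point, here is a minimal, hypothetical sketch (JUnit 4; the Site class, tests, and helper are invented, not from the session) of applying ordinary code rules - intention-revealing names, no duplicated setup - to automation code:

```java
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;
import org.junit.Test;

// Invented example: treating test code like production code.
public class RegistrationAcceptanceTest {

    @Test
    public void registeredUserCanLogIn() {
        Site site = siteWithRegisteredUser("ada", "s3cret");
        assertTrue(site.logIn("ada", "s3cret"));
    }

    @Test
    public void logInFailsWithWrongPassword() {
        Site site = siteWithRegisteredUser("ada", "s3cret");
        assertFalse(site.logIn("ada", "wrong"));
    }

    // Shared, intention-revealing helper instead of copy-pasted setup in every
    // test - the same 'normal code rules' we would apply to production code.
    private Site siteWithRegisteredUser(String name, String password) {
        Site site = new Site();
        site.register(name, password);
        return site;
    }

    // Minimal stand-in for the system under test so the sketch is self-contained.
    static class Site {
        private final java.util.Map<String, String> users = new java.util.HashMap<>();
        void register(String name, String password) { users.put(name, password); }
        boolean logIn(String name, String password) { return password.equals(users.get(name)); }
    }
}
```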
Acceptance tests are a specification of a system - in order to be a good specification they should be exemplars, but they don't need to deal with every single edge case (if they are to remain readable/usable as documentation)
You could split out more exhaustive testing into a separate section, separate suite, or (better?) a separate tool.
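A sketch of what that split might look like (JUnit 4; the shipping rule, numbers, and class names are invented for illustration): the acceptance tests keep one readable exemplar per rule, and exhaustive boundary checks are pushed into a separately named suite.

```java
import static org.junit.Assert.assertEquals;
import org.junit.Test;

// Invented example: one representative case per rule, so the tests read as a
// specification rather than an exhaustive checklist.
public class ShippingAcceptanceTest {

    @Test
    public void ordersOverFiftyEurosShipForFree() {
        assertEquals(0.00, Shipping.costFor(60.00), 0.001);
    }

    @Test
    public void smallOrdersPayTheFlatShippingRate() {
        assertEquals(4.95, Shipping.costFor(20.00), 0.001);
    }

    // The exhaustive edge cases (exactly 50.00, zero, negative totals, rounding)
    // would live in a separate suite, e.g. a ShippingBoundaryTest, so they don't
    // clutter the specification-style tests above.

    // Minimal stand-in for the production code so the sketch is self-contained.
    static class Shipping {
        static double costFor(double orderTotal) {
            return orderTotal >= 50.00 ? 0.00 : 4.95;
        }
    }
}
```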
Don't reject acceptance testing because you don't like the tool - start with the tasks you need to achieve. If something is difficult to automate, that doesn't mean it can be ignored - it is still an 'acceptance test' and it still needs to be run.
Definition of 'acceptance test': whatever you've agreed with the client (not just what can be automated)
Dislike of the term 'acceptance testing' - it could mean the definition above (Test-Driven Requirements, Example-Driven Requirements, etc.), but it is often taken to mean 'UAT', which is *not* what is being discussed here. 'Specification workshop' has been successful as a term.
Using tools during a 'specification workshop' is too slow (esp. with senior sponsors) - use a whiteboard instead, then formalise the examples afterwards and send them out as "meeting minutes"
Business people and testers collaborating
A team currently in its 2nd sprint
- Results with the earlier approach were not good
- 50k hours in one release
- Siloed teams
- It was 9 months before the software was finished
- now switching to 6 scrum teams (2 designers, 1 tester, 4 developers)
- (but also switching to a new application)
- positive results so far
Collaboration from a sysadmin's perspective - the team used to just 'hand over' the application ...
- lots of arguing during deployment
- team started to ask sysadmin to verify things 'up front'
- then brought sys admin into team
- eventually contributing to prioritisation of stories
Another story
- Waterfall model, siloed
- To help a move to agile, have management showcase the project
- Writing requirements is a collaborative activity, involving the whole team
- Everyone can voice an opinion + help define the acceptance criteria
- Try to automate as much as possible
The way the F-15 was designed
- The customer said 'we want a Mach 2.5 airplane'
- The designers attempted it, and couldn't (for the right cost)
- So they went back and asked 'why?'
- 'We need to get away from Russian planes really quickly'
- 'Would a more agile plane work?'
- 'Yes, yes - that would be fine!'
- Developers know the technical limitations - tell them what the problem is, and maybe they'll come up with a different/better solution - get everyone in the same room to discuss it
If you have a waterfall project with lots of specifications, should you throw them away?
- Yes - but be mindful of the political ramifications - perhaps suggest that they need 'clarification'?
- If you write specifications as tests, you are a 'test analyst', not a 'business analyst' (but you win out in the end! :) )
A sad story
- Project was successful, working in a collaborative way, delivering value
- The BA left the team
- The new BA refused to sit with the team - and was supported in this by management (as it was not part of the 'global process' to work in this way)
- ...
Some things about writing stories
- backlog items tend to look like 'tasks' not 'stories' - aim for 'things we want the system to do'
- story is 'intentionally vague' - 'a promise/invitation for a conversation'
- important factor is the 'shared understanding'
- acceptance criteria are the examples that are an output of the conversation, and limit the scope (or are the specification) - see the sketch after this list
- Ron Jeffries - "the most important thing on a requirements document is the phone number of the person who wrote it"
- 3 C's: 'card', 'conversation', 'confirmation'
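To make the card/conversation/confirmation flow concrete, here is a hypothetical sketch (JUnit 4; the story, domain, and numbers are invented): the card stays intentionally vague, and the concrete examples agreed in the conversation become the confirmation, written here as an executable test.

```java
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;
import org.junit.Test;

// Card (intentionally vague): "A customer can withdraw cash from their account."
// The examples below are the acceptance criteria that came out of the
// conversation - they limit the scope and act as the specification.
public class WithdrawCashExamplesTest {

    @Test
    public void withdrawalWithinTheBalanceIsAllowed() {
        Account account = new Account(100.00);
        assertTrue(account.withdraw(80.00));
    }

    @Test
    public void withdrawalBeyondTheBalanceIsRefused() {
        Account account = new Account(100.00);
        assertFalse(account.withdraw(120.00));
    }

    // Minimal stand-in so the example is self-contained.
    static class Account {
        private double balance;
        Account(double balance) { this.balance = balance; }
        boolean withdraw(double amount) {
            if (amount > balance) return false;
            balance -= amount;
            return true;
        }
    }
}
```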
For software vendors with 1000s of customers - how do you manage 'the customer'?
- e.g. iPlayer - the customer is 'the British public', 50M users! - as with TV, use focus groups - and the 'producers' = product owners
- affordable sessions - just members of the public (who belong to a specified group) - for an hour at a time
How to decide how something should work, vs. whether something is 'in' or 'out'?
- need more than just a single 'truth'
- it is a conversation that needs to happen
- involve wider stakeholders - e.g. a financial controller, who can estimate a cost/value
"collaboration doesn't happen when people have different objectives"
A failure: people only collaborated during the workshop; afterwards, each role had its own objective:
- BA - deliver specification on a certain date
- PM - deliver a project on a certain date
- Tester - test what is built by a certain date
- no-one had the objective of building a quality product
A success: everyone was able to share the same tool (FitNesse) - see the fixture sketch after this list
- everyone was working on the same 'document' - with the same goal
- nothing 'lost in translation'
- but it was a different team (so perhaps it's more of a people thing)
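As an illustration of 'working on the same document', here is a minimal sketch in the style of a classic FIT/FitNesse column fixture (the table, class, and discount rule are invented, not taken from the team in question): business people and testers edit the wiki table, and a small Java fixture wires it to the code.

```java
import fit.ColumnFixture;

// The shared FitNesse page would contain a table like this (invented example);
// the first row names the fixture class, input columns bind to public fields,
// and the "discount()" column calls the method of the same name:
//
//   |DiscountFixture                          |
//   |customer type |order total |discount()   |
//   |gold          |120.00      |12.00        |
//   |standard      |120.00      |0.00         |
//
public class DiscountFixture extends ColumnFixture {

    public String customerType;
    public double orderTotal;

    public double discount() {
        // In a real project this would delegate to the production code under test.
        return "gold".equals(customerType) && orderTotal > 100.00
                ? orderTotal * 0.10
                : 0.00;
    }
}
```

The point is less the tool itself and more that the examples and the glue code live in one 'document' that every role can read and edit.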
In the beginning, there is often a lot of resistance to collaborating
- but sometimes the arguments win through
- and this earns respect
- and then people will approach you beforehand, not just after the fact
Resistance Factors
- Your work is being questioned
- Code ownership
- need "evidence" of the value-add of collaboration, to engender buy-in
How to make this visible
- Track time fixing issues, doing rework, cost of bugs
Note that some companies actually benefit from charging for time spent on maintenance work, so reducing rework is a harder sell :(
Other notes
- not easy - because of silo-culture
- the problem of 'fractional people' (people split part-time across several projects)
- it is an accounting issue - and you should refuse to carry the burden of the cost of a partial resource when you aren't getting the benefit
- should be looking at throughput of delivered value, not the costs of utilisation of resources
- also, risk of resources being pulled, 'unreliable resources'
- don't take a requirements document, then write acceptance tests from that
- 'translation problem': http://www.brokenpicturetelephone.com
- If you are in a situation where 70-80% of projects fail, "I did my bit" is a way to maintain sanity
- You need to invest time in learning and practicing
- Transition to collaborative acceptance testing involves a large organisational change
- Lots of people have new/changed roles/habits
- You can't 'hide it' from management
- Calling it 'specification with concrete examples' only gets you so far
Write-ups on the web
- http://gojko.net/2009/09/24/top-10-reasons-why-teams-fail-with-acceptance-testing/
- http://www.touilleur-express.fr/2009/09/20/citcon-2009-user-acceptance-test/
- http://social.hortis.ch/2009/09/24/citcon-paris-2009-le-compte-rendu/
- http://warzee.fr/entry/opening_session_citcon_09_europe