Agile Insider reality bytes…

20 Apr 2010

Top Tips – User Centred Automated Acceptance Tests

Following on from my previous article, I thought I would share some top tips for creating automated acceptance tests on projects using user centred design.  There is nothing inherent in user centred design that prevents an agile approach being adopted; indeed, agile adds huge value to user centred design by shortening the feedback loops during the "design".

The output from the "design" part of user centred design is usually a set of wireframes, sometimes highly graphical so they can be used for marketing and for soliciting customer feedback.  Low-fidelity wireframes, such as those produced by Balsamiq, are fantastic for exploring the solution space with minimal commitment; they retain that back-of-a-napkin look and feel, which stops customers and the business misjudging how much work remains to make them real.  At some point, developers will get their hands on these wireframes and be tasked with "making them real".  Of course, if the developers can help create the wireframes, so much the better, and what a great way to shorten that feedback loop 😉

An example wireframe...

So, we're a delivery team and we have the wireframes available; we want to start carving them up into stories (ideally with a customer), executable tests and working code.  Where to start...

First, we want to express what we're seeing as stories so that we can assign value to each story and prioritise based on ROI.  Depending on the technology choices, these wireframes also help with an initial design of the application in terms of data, services, components, etc.

The stories we write can be high level stories capturing the user experience as they navigate through the site, or they can be as low level as which fields are visible within a given area of the site.

My preferred approach is to write stories and tests at the high level and, as we uncover the detail, capture it as smaller stories and more detailed tests.  This fits naturally with user centred design: it allows planning and release activities to focus on the high-level stories around the user experience (which is probably why user centred design was chosen in the first place), while development and build activities focus on delivering the smaller stories.
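To illustrate the split, a high-level test can assert the user journey while smaller tests pin down the detail as it is uncovered.  This is a minimal Python sketch with entirely hypothetical page names and a stubbed `visit` helper; a real suite would drive a browser or call fixture methods behind a natural-language spec:

```python
# Hypothetical sketch: one high-level journey test plus smaller,
# detailed tests uncovered later. All names are illustrative only.

def visit(page_name):
    # Stand-in for driving a browser; returns what a user would see.
    site = {"homepage": {"sections": ["News", "Weather"]},
            "news":     {"fields": ["headline", "date", "body"]}}
    return site[page_name]

# High-level story: the user experience across the site
def test_visitor_can_browse_from_homepage_to_news():
    homepage = visit("homepage")
    assert "News" in homepage["sections"]

# Detailed stories, captured as smaller tests as they are uncovered
def test_news_article_shows_headline():
    assert "headline" in visit("news")["fields"]

def test_news_article_shows_date():
    assert "date" in visit("news")["fields"]

test_visitor_can_browse_from_homepage_to_news()
test_news_article_shows_headline()
test_news_article_shows_date()
```

The high-level test survives redesigns of the article page; only the small tests need to change when the detail does.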

My top tips would be...

  • Write your stories and tests based on what you see in the wireframes
    • e.g. When a user (who is not logged in) visits the homepage, they can see the "News" in the main area.  Notice that the wireframe has a single article, but there could be many articles in the future.  There is also no reason to assume this is the "Latest News", "Featured News", "Breaking News" or anything else.
  • Be constantly aware of the context (explicit or not) within the wireframe when writing your test
    • e.g. In the latest news section, we may see a "Read More" link and assume that we can simply find this "Read More" link on the page.  Of course, in the future there may be more than one news article (so we want to find the "Read More" link related to a particular article), and there may be more sections that contain "Read More" links, such as user comments.  So when we implement a test around the news, we want to ensure that we find the correct "Read More", which means we look for it in the news section and in relation to the article(s) of interest.
  • Don't assume the wireframe is what the final implementation will be so think in abstract terms
    • e.g. If there is a button in the wireframe, this may be a clickable image later, or a tab panel.  So word your tests not around the button, but around the capability the button provides: instead of "...and the user can click <Help>" we might have "...and the user is able to obtain <Help>".
  • Envisage alternative implementations while writing the tests and validate your test would still be valid
    • e.g. If I change this from being a form to being editable inline, does the test still work?
  • Keep your tests discrete: it is better to have lots of small, simple tests than a small number of tests that each test too many things.
    • e.g.
      • when... can they see today's weather
      • when... can they see today's temperature
      • when... can they see tomorrow's weather
  • Use "Given, When, Then... and" to think about the scenarios you need, but try expressing them in natural english.
    • e.g.  When someone is viewing the school holidays calendar (Given) and they enter their postcode (When), the calendar should only display holidays for schools within the catchment radius (Then) and the user should be able to remove the filter (And).
    • I like to think of "Thens" as being the assertions, and with UCD this is the stuff I want to see on the screen as a result of me doing something.
    • The "Ands", which are an extension to Dan North's, I like to think of as the stuff I can now do in addition and in UCD this usually relates to new functionality which has been exposed or links that have appeared.
  • Refer to the Concordion Techniques page for some great examples of how to write robust tests in natural language.
  • Separate site aspects (navigation, login) from functional or contextual tests
  • Refactor your tests as you go to evolve a testing DSL to simplify writing future tests
  • Use personas (or metaphors) to encapsulate data
    • Bring your personas to life
      • Write them a CV, or post their key characteristics on the team wall.
    • You can validate key aspects of your persona within your test (in the "Given" section) to provide better tests, by clarifying which aspects of the persona you are actually testing
      • e.g. "When Joe (who is on legal aid) views the list of local lawyers, the lawyers who provide legal aid are highlighted".  Notice that "highlighted" could mean anything, from promoting them to the top of the list to changing their colour or adding a nice icon...
  • Don't create too many personas
    • Personas are very powerful, but when abused they become difficult to manage.  To test functional aspects related to simple data variations, use dynamic personas (e.g. when someone, who has not yet entered their postcode, accesses their profile...)
  • Think like a customer
    • Ask yourself how much money you would pay for a given feature or piece of functionality
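Several of these tips can be pulled together in a small testing DSL.  The sketch below is illustrative Python with hypothetical names throughout (a real suite would drive a browser via a tool such as Selenium, and Concordion specs would call Java fixture methods rather than code like this); it shows a context-scoped "Read More" lookup, a capability expressed without naming the widget, and a persona encapsulating data:

```python
# Hypothetical sketch of a testing DSL; all class and method names
# are illustrative, not taken from any real framework.

class Persona:
    """Encapsulates test data behind a memorable name."""
    def __init__(self, name, logged_in=False, legal_aid=False):
        self.name = name
        self.logged_in = logged_in
        self.legal_aid = legal_aid

class HomePage:
    """Stand-in for a page driver; real code would query the DOM."""
    def __init__(self, articles):
        # Each article has a title and a "Read More" link target.
        self.news = articles

    def news_articles(self):
        return [a["title"] for a in self.news]

    def read_more_link(self, article_title):
        # Scope the lookup: find "Read More" within the news section,
        # in relation to a particular article -- never just the first
        # "Read More" anywhere on the page.
        for a in self.news:
            if a["title"] == article_title:
                return a["read_more"]
        raise AssertionError(f"No 'Read More' for {article_title!r}")

    def can_obtain_help(self):
        # Expressed as a capability, not "click the Help button": the
        # implementation may become an image or a tab panel later.
        return True

# --- the tests read like the stories they came from ---
joe = Persona("Joe", legal_aid=True)                      # Given
page = HomePage([{"title": "Local school wins award",
                  "read_more": "/news/1"},
                 {"title": "Roadworks on High St",
                  "read_more": "/news/2"}])               # When

assert len(page.news_articles()) >= 1   # Then: News visible (1 or many)
assert page.read_more_link("Roadworks on High St") == "/news/2"
assert page.can_obtain_help()           # capability, not widget
```

Because the tests speak in terms of sections, articles and capabilities rather than buttons and CSS selectors, they survive the wireframe evolving into a very different final implementation.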

I'm sure I'll refine this example in the future, so watch this space...

29 May 2009

Agile Requires Skilled Developers

In a recent tweet, @EstherDerby states:

Some ppl complain agile only works w/ highly skilled developers. Never been clear 2 me that ANY dev. method works w/o highly skilled devs.

I think the subtle distinction is that agile REQUIRES skilled developers to be successful, whereas some of the "heavier" methodologies would benefit from skilled developers but don't actually require them.  Looking at the many agile successes, it would also be interesting to determine whether the developers on those projects were more skilled than those on non-successful and/or non-agile projects.

And taking this just a little bit further...  Would a small group of skilled developers be successful regardless of the methodology?  Truly skilled developers tend to be very pragmatic and will always find ways to simplify the complexity around them, so I'm sure that if you took 8 highly skilled, highly successful agile developers and stuck them on a waterfall project, they would deliver a successful result, at least in the eyes of the customer...

What's also quite interesting about this dynamic is that once a developer "sees the light" and becomes "agile" they can't imagine going back to waterfall, despite the fact they can add immense value by being part of a waterfall project and improving the processes.  There is something very selfish about this which has not yet been picked up in the mainstream...  This is also possibly one of the reasons why agile suffers an identity crisis, often being regarded as a cult.

And no, I'm not advocating waterfall, I'm just wondering whether skilled developers have more impact on success compared to the methodology.