My current contract ends in a few more days, so I'm taking the opportunity to dust off my worn copy of Rework by 37signals. I also owe a long-overdue thank you to Craig Davidson, an outstanding agile developer I encountered on a previous engagement.
It's not a traditional agile book by any means, but the ideas presented in it resonate strongly with my agile values, and I find it has helped me immensely to re-ground myself between contracts. I am now constantly surprised by just how many paper cuts I have personally accepted at each engagement, and equally surprised at my own newfound intolerance of them. I'm actually thinking of requesting a discount from the authors, since I now give this book as a gift almost routinely...
I challenge anyone to read it and not find it invaluable for challenging their current view of the world.
So, once more, and I must apologise profusely for the tardiness, thank you so much, Craig...
This article largely follows on from some previous entries, in particular my entry on user-centred test-driven development.
A common complaint is that large organisations trundle along painfully slowly. Work can't start without following some process or other until you have sign-off. Part of this sign-off will probably involve agreement to follow certain standards and guidelines, but if these standards don't yet exist, how can we start?
To challenge this and present an alternative approach, why not make the "standards" part of the delivery itself? Make it clear up front that rather than wait for the standards to be released (the normal mode of attack in large organisations), you will actively work with whichever standards body exists in the organisation to evolve just enough standards to support the actual work you are doing as you work through the backlog.
To make this work, COURAGE is imperative... Someone has to have the courage to put a stake in the ground early, recognising there is a small risk it may change. Developers should embed the standards into their automated testing as early as possible; this means that if and when a standard does change, there are tests in place to help developers bring all work to date back into line...
The result of this is a design language that everyone can understand: when someone says they are writing a test that looks for the jobs tag in the currently featured news article, everyone should know what that refers to in the wireframes, and also know how it will be identified and marked up in the implementation. This allows tests to be written before any code, and even lets the final "Look And Feel" progress alongside development.
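As a sketch of what such a design-language test might look like (the element id and class name below are invented for illustration, not taken from any real standard), a test can assert against the agreed markup before any implementation exists:

```python
# Hypothetical test-first check: the "featured" news article must carry
# the agreed "tag-jobs" marker. Only Python's stdlib HTML parser is used.
from html.parser import HTMLParser

class TagCollector(HTMLParser):
    """Records every (tag, attributes) pair seen in the markup."""
    def __init__(self):
        super().__init__()
        self.seen = []

    def handle_starttag(self, tag, attrs):
        self.seen.append((tag, dict(attrs)))

def assert_featured_article_has_jobs_tag(html):
    collector = TagCollector()
    collector.feed(html)
    assert ("article", {"id": "featured"}) in collector.seen, "no featured article"
    assert any(attrs.get("class") == "tag-jobs"
               for _, attrs in collector.seen), "no jobs tag"

# Sample markup that satisfies the agreed standard (hard-coded here so the
# example is self-contained; in practice it would come from the implementation).
assert_featured_article_has_jobs_tag(
    '<article id="featured"><span class="tag-jobs">Jobs</span></article>'
)
```

If the standards body later renames the marker, the test fails immediately and points at every page that needs bringing into line.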
Of course, you're always free to continue in the traditional model and wait three months for the standards body within the organisation to produce a 300-page guidelines document before even starting that killer new feature that will storm the market... Or make totally random guesses, which are much more likely to be wrong, safe in the knowledge that you have the traditional saviour of projects: hope and prayer!
I'll admit it, I'm not 100% agile since I tend to like solutions. I would prefer a problem to solve, but if a problem is intangible I often find a solution is a great way to explore the potential and help express the underlying problem.
It is worth noting that innovation and invention are solutions, not problems. When you apply for a patent, you don't patent the problem, you patent the solution. Why? If there are multiple solutions to any given problem, then wouldn't we want to patent the problem?
I also find that quite often the solutions I am exploring are the result of a problem that I have assumed was well known and in actuality is not; it is in these moments that using the solution to derive the lowest common denominator of a problem stated as a user story can ensure that everyone is speaking the same language.
But given a solution, do we always need to know the problem? My pragmatism (and a few lean principles, i.e. waste) says that if the solution brings value, run with it; if not, ditch it. If, on the other hand, the solution seems too constrained or lacking in features/functionality, then why not exploit the solution as the basis for a vision? Indeed, many visions are just that: solutions to problems, expressed in a language alien to most developers.
So, given a vision expressed in an abstract form versus a vision pinned against an actual solution, which is easier to understand and less likely to be misinterpreted? I know which I'd prefer...
Do solutions have value? Obviously. Must we always mine problems? Hmmm...
Test Driven is a loaded term and means different things to different people. I much prefer the term Test First which clearly states that the test comes before the implementation. However, for me, the value is not necessarily in creating an executable test, but in the thought processes that Test First brings out.
One of my colleagues at Emergn is constantly reminding me that when presented with a solution, you need to ask yourself what the problem is. This is what led me to question TDD and its variants. If you search the web for Test Driven Development, you'll uncover a wealth of information from many of the authors in my BlogRoll, as well as many variants on a theme. I think the Wikipedia entry is a particularly good summary of Test Driven as it is currently understood by the community, but for the real meat and bones you need to look at the articles from the thought leaders behind the practices.
However, when I look at this wealth of information, I'm now faced with a question... OK, these are all solutions, but what is the fundamental problem they are solving? Is it:
- Code quality/design is poor
- Code is overly complex/difficult to test
- Unused functionality within the code
- Too many bugs
It is only when we understand the underlying problem fully that we can then evaluate the applicability/suitability of a particular approach. Indeed, it is this lack of a clearly defined problem which makes it impossible to determine which approach is best, since we can't define any tests up front... This is actually a little bit of a paradox, but highlights for me one of the most important points about TDD. TDD is a solution to too many problems and in certain cases is not a very good solution.
Testing should be about proving functionality, but unfortunately we too often see TDD trying to address non-testing issues like design and code quality. Of course, code needs to be testable, but it is not the responsibility of the testing to enforce this; it is the responsibility of the design, and this is (unfortunately, IMHO) the missing piece in most methodologies... Indeed, the approach of writing the "simplest possible thing" to pass a test is possibly my biggest bugbear with TDD. This approach often means you can pass tests with no functionality other than hard-coded return values; now that has to be the biggest process smell ever.
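To make that smell concrete, here is a deliberately bad, entirely invented example: the "simplest possible thing" that turns the test green is a hard-coded constant, so the passing test tells us nothing about the general behaviour.

```python
def total_price(items):
    # The "simplest possible thing" that passes the test below:
    # a hard-coded return value, with no real logic at all.
    return 42

def test_total_price():
    assert total_price([20, 22]) == 42  # green, yet the function is useless

test_total_price()
# The smell: any input "passes" too, e.g. total_price([]) is also 42.
assert total_price([]) == 42
```

The test suite is green, yet no pricing logic exists at all.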
I'm going to stop using TDD, or Test Driven, now and instead use the term I prefer, which is Test First. For me, this means that before I do anything, I will determine ahead of time how I would test it. The immediate benefit (a.k.a. value) I get from determining how to test something is that I'm also starting to think about (dare I suggest design?) things like APIs/design/interaction/responsibilities. This is not the same as BDUF (Big Design Up Front); instead, it is just enough thought at just the right moment to (hopefully) prevent a disaster. I can then apply some cost/benefit analysis, with a pinch of risk analysis, to decide whether to actually write all the tests that TDD would have me write, or just a small percentage covering the important or more complex functionality.
Of course, in many cases I will actually create executable tests for many of those uncovered during this thought exercise, but will I go through the whole Red/Green/Refactor cycle? Possibly not. For me, Red/Green/Refactor is like micro context switching. I prefer a slightly longer period of focus, so I may write quite a few tests at the same time before applying that context switch to go into coding mode. This is, of course, my personal preference and would undoubtedly be scorned by the dogmatic TDD zealots, but this is what makes me more productive, and without a baseline problem statement I challenge them to prove that it is not the best way to do your testing.
Test First allows me to:
- Understand the problem I'm trying to solve
- Think about how I will solve it (just enough design)
- Uncover any unknowns or risks hidden in the initial problem statement
- Produce high quality code which has just enough tests
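The flow above can be sketched as a small example. It is entirely invented (the function, its name and its behaviour are mine for illustration, not the author's); the point is simply that writing the test first pins down the API and its responsibilities before any implementation exists.

```python
# Step 1: decide how you'd test it. Writing this first forces decisions
# about the API: the function name, its parameters and its contract.
def test_shorten_strips_scheme_and_truncates():
    assert shorten("https://example.com/a/very/long/path", max_len=20) == \
        "example.com/a/ver..."
    assert shorten("http://a.b", max_len=20) == "a.b"

# Step 2: only then write the implementation the test demands.
def shorten(url, max_len):
    bare = url.split("://", 1)[-1]          # drop the scheme
    if len(bare) <= max_len:
        return bare
    return bare[:max_len - 3] + "..."       # truncate with an ellipsis

test_shorten_strips_scheme_and_truncates()
```

Notice that by the time the test is written, the design questions (what is the input, what is the contract, where does truncation happen) have already been answered.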
Many of the Test Driven approaches do indeed address many of the initial problems stated earlier, but are they the only solution to these problems? Indeed, are these problems actually just symptoms of even deeper problems? If the problem is simply that your code is very tightly coupled and difficult to test, then the best ROI will probably come not from TDD but from up-skilling your development team on fundamental design principles (unskilled developers were the real problem in the first place, and TDD can't solve that).
- Wikipedia entry for Test Driven Development: http://en.wikipedia.org/wiki/Test-driven_development
- Introducing BDD by Dan North: http://dannorth.net/introducing-bdd (highly recommended reading)
- C2 wiki entry for Test Driven: http://c2.com/cgi/wiki?TestDriven
Here's my little user story:
As a blogger,
I want to use Threely as my URL shortener,
so that I get more letters on Twitter.
Pretty simple story, and also very simple to implement, but impossible to test? I can certainly test the API to twitter, but given I use a wordpress plugin (developed by someone else) to do the auto-posting is it worth the effort for the addition of 6 lines of code?
For me, the pragmatic solution is to take a risk-based approach. I'm going live with Threely (http://3.ly), but I have an escape plan (extremely important when there are known risks): the ability to revert and revisit my approach if anything untoward happens.
So, here goes, pressing the big red button...
Just what exactly is it that distinguishes a good developer from an average developer? Certification in a particular language or technology demonstrates the ability to be "average", but certainly doesn't demonstrate good. I believe a good developer is someone who has an aptitude for developing, which is inherently extremely difficult to measure or quantify. However, there are possibly a few things that can help you identify your good developers:
- They are capable of using several languages to get things done
- They are pragmatic in their approach
- They understand the concepts as well as the solutions
- They can think at multiple levels of abstraction
- They can get things moving despite uncertainty
- They champion quality and continuous improvement
- They like to share their knowledge and expertise
Of course, these are very subjective measures, hard to qualify or quantify. It can be very hard to demonstrate an ability to use several languages if the working environment dictates a single language, and red tape may prevent pragmatism. If the environment prevents these qualities from being expressed, then it is very likely the most important qualities of your best developers are being suppressed.
If you wish to get the best from your best developers, and achieve that 10x productivity that is so often quoted, make sure you have provided them with an environment that allows them to demonstrate (through action) the characteristics above. If you can think of anyone right now with some or most of these characteristics, why not take the opportunity to ask them how you can help them improve?
You've never done "agile" before, and you work on a legacy system which is extremely large, complex, fragile and bug-ridden. I'm guessing quite a few developers will identify with this scenario. Does this mean "agile" is a no-go? On the contrary, applying "agile" techniques will probably make it easier and faster to fix those bugs, improving the code a little in the process.
Here are my tips (prerequisites) before attempting "agile" within a legacy project:
- Remove blockers
  - Legacy builds tend to be painfully long - set up an environment that allows you to build and test the changes you will make quickly.
- Protect yourself
  - The changes you'll want to make are probably buried extremely deep within existing methods - use refactoring to pull out the code you want to change into a new method/class.
  - Write tests that capture the existing functionality of the code you just extracted.
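A hedged sketch of those two "protect yourself" steps, using an invented legacy class (the names and the discount rule are purely illustrative):

```python
class InvoiceProcessor:
    """Stand-in for a sprawling legacy class."""
    def process(self, invoice):
        # ...imagine hundreds of lines here...
        # Step 1: the code we need to change has been pulled out of this
        # long method into its own, separately testable method.
        invoice["total"] = self._apply_discount(invoice["total"],
                                                invoice["customer_type"])
        # ...and hundreds more lines here...
        return invoice

    def _apply_discount(self, total, customer_type):
        # Extracted as-is: behaviour deliberately unchanged.
        if customer_type == "trade":
            return total * 0.9
        return total

# Step 2: characterisation tests capture the existing behaviour of the
# extracted code before we dare change anything.
def test_trade_customers_get_ten_percent_off():
    assert InvoiceProcessor()._apply_discount(100, "trade") == 90

def test_other_customers_pay_full_price():
    assert InvoiceProcessor()._apply_discount(100, "retail") == 100

test_trade_customers_get_ten_percent_off()
test_other_customers_pay_full_price()
```

With the behaviour pinned down, the actual bug fix or change can now be made inside the small extracted method, with fast feedback instead of a full legacy build.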
If it is too difficult to perform any of the above steps, then don't bother with "agile" just yet (although by all means apply pragmatism). Choose your "agile" battles carefully and you'll live to fight another day.