Software Craftsmanship conference wrap-up

Here is the wrap-up I wrote yesterday evening on the Software Craftsmanship conference in London while waiting for my flight back home. I would like to thank Jason Gorman for organizing this conference. It was impressive and one of my favorite conference experiences so far. I also have to thank Gojko for his Specification Workshops talk; I'll try them out. Micah Martin's Kata & Sparring session additionally made me aware that I need to practice more. You can read about the rest of the conference in the extended body.

Introduction

Jason Gorman organized this first conference on Software Craftsmanship. In the weeks before the conference took place there was an ongoing discussion on the Software Craftsmanship Google group about the definition of Software Craftsmanship and its distinction from the Agile movement. Being a practical conference, many of the sessions focused on practices relevant to the intended audience: software craftsmen.

Sessions attended

Here are some brief insights based on the notes I took during the sessions I attended. For each session I will first describe briefly what it was about and then note my particular insights.

Mapping Personal Practices

Adewale Oshineye from Google briefly presented two diagrams he had created during similar workshops in the past and showed the audience two styles that could be used. The goal was to build a mind map of the particular practices each participant regularly applies. Everyone then had 10 minutes to write down their practices. After that, an introduction round started in which every participant described the contents of their mind map to the others in 90 seconds.

This technique was quite impressive. One question that arose during the starting phase was how to scope the map: Adewale had included both technical and personal practices in his maps, so the question was reasonable. The clue behind the technique is that you can define the context and focus as you see fit; you can decide to focus just on technical practices or just on collaboration practices. Right before attending this session I had already thought about using this technique to identify development opportunities for the people in my group and to use it as a visual tool as well.

During the presentation of each participant's map I made notes about practices I apply but forgot to include in my own map, other practices that sounded interesting, and things I would like to try out next. The list of interesting practices included e-mails to oneself as a to-do list; building a simpler solution yourself rather than wrapping up third-party frameworks; and using post-it notes to move things from short-term memory to long-term memory. One participant described how, as a design aid, he first comes up with a name for a function and then decides, based on that name, which class the new method should belong to. Another participant described that she likes to get some distance from the topic when struggling with a hard problem; taking some time to think it over can lead to a better solution in the end. Another regularly uses the revision history to get a clue about what has happened to a particular class. There was also mention of help seekers who do not voice their need directly but rather indirectly; the participant who brought this up had noticed the pattern and made himself aware of it. Traceability was mentioned in that round as well. One participant described that he regularly asks stupid questions to provoke thinking in his collaborators. The last thing I noted on my list of interesting things was the ability of one participant's customer to actually read the source code. Having customers participate full-time on the project can lead to more direct customer conversation, if technical staff and customers speak the same ubiquitous language.

Ruby Kata & Sparring

Micah Martin from 8th Light drew parallels between martial arts and software development. He described how, while learning and practicing martial arts, he did the same things over and over again, sometimes practicing movements that are not actually used in a fight, just to move better during the fight. Micah concluded that the combination of practice, experimentation and reflection leads to mastery in software development as well. After making his mission clear, he showed the audience an actual kata he had practiced over and over again: a live implementation of Langton's Ant in Ruby.

In roughly 10 minutes Micah used test-driven development to come up with a usable class, which he then showed running in a prepared evaluation program over several steps. After the session I got the opportunity to talk to Micah about his presentation. He stated that he had practiced solving the problem 30-40 times before. Since the solution takes only about 10 minutes to implement, he was able to fit those practice sessions into his daily work. The solution seemed very quick to me, though not rushed. He ended up with roughly 20-30 lines of code and about as much unit test code. Since seeing his kata I'm planning to practice more small katas over the next few weeks. I am also thinking about showing some of these as a prepared kata during a coding dojo session.
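Langton's Ant is small enough to sketch here. What follows is my own minimal take on the kata, not Micah's actual solution; the class and method names are mine. The ant turns right on a white cell, left on a black cell, flips the cell's colour, and moves one step forward.

```ruby
# Minimal Langton's Ant kata sketch (illustrative, not Micah's code).
class LangtonsAnt
  # Facing directions in turn order: up, right, down, left.
  DIRECTIONS = [[0, -1], [1, 0], [0, 1], [-1, 0]].freeze

  attr_reader :x, :y, :black_cells

  def initialize
    @x = @y = 0
    @direction = 0     # index into DIRECTIONS, starts facing up
    @black_cells = {}  # sparse grid: only black cells are stored
  end

  def step
    if @black_cells[[x, y]]
      @black_cells.delete([x, y])        # black -> white, turn left
      @direction = (@direction - 1) % 4
    else
      @black_cells[[x, y]] = true        # white -> black, turn right
      @direction = (@direction + 1) % 4
    end
    dx, dy = DIRECTIONS[@direction]
    @x += dx
    @y += dy
  end
end
```

Running `ant = LangtonsAnt.new` and then `10_000.times { ant.step }` produces the famous "highway" pattern, which is why the kata lends itself so well to a step-by-step visual evaluation program.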

Specification Workshops

Gojko Adzic from Neuri gave a great talk on Specification Workshops, a technique he introduces in his latest book, Bridging the Communication Gap. He opened the session with an e-mail from an angry customer describing business conditions with obvious inconsistencies. He then described the technique as an activity at the start of each iteration, right after picking the stories to implement in the next two to four weeks. Bringing business people, developers and testers together to discuss new stories through examples builds a shared understanding of the business among the relevant people on the project and helps build a common ubiquitous language.

Initially it seemed that only two people would attend this session, and Gojko nearly suggested we visit one of the sessions running in parallel. Then a crowd of nearly 30 people came in, filling the room completely; there were not even enough chairs for everyone. During the talk I recognized several concepts I had recently read about in other books. Gojko included in his reasoning the need to focus on the what and why during these sessions; the Poppendiecks describe this as set-based discussion rather than point-based negotiation. Developers and testers need to focus on what to implement rather than how, and business people should be asked about the whys behind proposed solutions in order to understand the underlying business problem and be able to cope with it. He told a story about printing from Java in the early days, which had troubled one of the projects he was on. After asking the customer why the software had to support printing, which was rather complicated at the time, the team found out that rearranging the UI navigation would make printing support obsolete, and would be more user-friendly besides. The customer had kept asking for printing only because the company that had built the previous system had not been able to rearrange the UI flow. After the talk I had a few words with Gojko. I am very pleased that he gave me a copy of his book Bridging the Communication Gap, and I immediately asked him to sign it for me. After reading Lisa Crispin's review comments on it I had already put it on my Amazon order list; now I can drop it from that list.

Gojko has already put up the resources for his course on his blog. The slides should give you a good start on the topic.
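The core of the technique, discussing stories through concrete examples until the whole team agrees on them, can be hinted at in code. Here is a sketch with an entirely made-up business rule (free delivery for VIP customers ordering at least five books), chosen purely for illustration and not taken from Gojko's talk:

```ruby
# Hypothetical business rule, used only for illustration.
def free_delivery?(customer_type:, books:)
  customer_type == :vip && books >= 5
end

# Concrete examples as a team might agree on them in a specification
# workshop. Each row is readable by business people, developers and
# testers alike, and each row is directly checkable against the code.
FREE_DELIVERY_EXAMPLES = [
  { customer_type: :vip,     books: 5, free_delivery: true  },
  { customer_type: :vip,     books: 4, free_delivery: false },
  { customer_type: :regular, books: 5, free_delivery: false },
].freeze
```

The point of the workshop is the table, not the code: inconsistencies like the ones in Gojko's angry-customer e-mail surface as soon as someone tries to fill in a concrete row.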

Responsibility-driven Design with Mock Objects

Willem van den Ende and Marc Evers from Quality Without A Name led this session. They introduced the idea of using mock objects together with CRC card design in order to incorporate the results of CRC design sessions directly into the software and its unit tests. In a rotational pairing session, including people from the audience, they showed how to implement this for a text adventure game using rubyspec.

Willem and Marc gave some guidance before kicking off the demo session. One of the main goals of responsibility-driven design is to avoid train wrecks, where a client needs to call a function on one class to get another object, call a function on that one, and so on, until the actual work finally happens. This distracts the reader of the source code and leads to high maintenance costs. One of the participants was obviously very familiar with responsibility-driven design, since he gave several hints and remarked on a lack of up-front thinking during the live demo. Overall the concept seemed very interesting, though the technique looks harder to learn.
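A train wreck and its responsibility-driven alternative can be sketched in a few lines of Ruby. The domain and all names here are my own, not from the session:

```ruby
# Train wreck: the client reaches through several objects to do its work,
# e.g.  order.customer.wallet.balance  -- the client now depends on three
# classes and their internal structure.
#
# Responsibility-driven alternative: push the behaviour to the object
# that owns the data, so the client talks to one collaborator only.
class Wallet
  attr_reader :balance

  def initialize(balance)
    @balance = balance
  end

  def withdraw(amount)
    raise ArgumentError, "insufficient funds" if amount > @balance
    @balance -= amount
  end
end

class Customer
  def initialize(wallet)
    @wallet = wallet
  end

  # The customer pays; no client ever touches the wallet directly.
  def pay(amount)
    @wallet.withdraw(amount)
  end
end
```

This is exactly where mock objects meet CRC cards: the responsibilities written on the `Customer` card become the messages a test mocks out, instead of a chain of getters.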

Empirical Experiences of Refactoring in Open Source

Steve Counsell from Brunel University presented empirical results from seven open source software projects. He examined the occurrence of 15 refactorings in these projects and showed some very insightful graphs of their usage and their correlations with other refactorings and code smells. Among the conclusions he drew were a tendency towards simpler refactorings, avoidance of encapsulation for testing purposes, and interrelationships between, for example, Move Field and Move Method.

His graphs also showed that many more refactorings are applied during the first few revisions of a class than at later stages. The examination of code smells showed that seemingly easy code smells usually lead to a chain of up to 200 refactorings; among these costly code smells were Duplicate Code, Large Class and Large Method. He closed with the advice not to give in to seemingly easy smell removals, since they can evidently pile up. In the discussion afterwards it was noted that there might also be considerable value in the costly smell removals. The overall study seemed very interesting to me. Maybe someone will also combine version control, bug tracking and refactoring data and examine their interrelationships. Steve also stated that the software they used to measure the refactorings can be made available to other companies.

5 Reasons to have a Coding Dojo at your company

Ivan Sanchez led this session. It was a quick introduction to Coding Dojos with some easy-to-follow rules. He then asked for volunteers for a pairing session, rotating every 5 minutes, on a Minesweeper implementation. Due to time pressure he had given it a head start with some existing implementation, to focus on the overall experience rather than on 20 minutes of initial design decisions. Ivan also enforced rules for the audience, such as only commenting while the tests are green during the TDD pairing session.

The session was very impressive, and as a side effect I picked up insights from the "20 favorite keyboard shortcuts" session, which I could not attend since it took place in parallel with Gojko's session on Specification Workshops. (Alt-Shift-Up/Down is an impressive shortcut in Eclipse; I'll have to memorize that one.) During the last three weeks I have already started some Coding Dojo sessions at work. What I learned during this session is that I need to enforce the rules behind it more strictly: less discussion and more implementation after an initial discussion of 20 minutes.

Conclusion

During the next few months I will deepen Coding Dojos with my team and our developers. Among the practices that were new to me and that I will definitely try out are Specification Workshops and Personal Practice Maps. Specification Workshops seem a great opportunity, since just this week at work I realized there had been a misunderstanding between some developers and the stakeholder of some functionality where a Specification Workshop would have helped. The Personal Practice Maps I will use in April, when I have a development meeting with one of my junior colleagues, to get a better shared understanding of his personal practices and help him on his way to becoming a journeyman. The Ruby Kata session made me excited about exercising TDD more in my leisure time. The session on responsibility-driven design made me curious about CRC card design; I think I will get a book on that as well. All together it was a great conference, and 95% of the attendees raised their hands when Jason asked who would come to the Software Craftsmanship Conference 2010. I'm looking forward to it, too.

Testing focus of Software Craftsmanship – Principles I

While the values might seem to be just a list of higher-level goals with no direct effect on a particular project, there is a lower level of definition in Agile methodologies: principles. From Elisabeth Hendrickson and from Lisa Crispin & Janet Gregory I was able to compile a list of useful principles. Here is the first half of this list of testing principles.

  • Provide Continuous Feedback
  • Feedback is one of the most important values for successful projects. As Poppendieck & Poppendieck point out in Lean Software Development: An Agile Toolkit, feedback is vital to software development. They compare software development and manufacturing with cooking: while development or engineering means coming up with a recipe, manufacturing is just applying an existing recipe over and over. Since software development is a recipe-building activity, it needs iterations and feedback to be successful.

    From this perspective, testers feed the customer's point of view back into the programming activities. Likewise, they provide feedback as a substitute for the end consumer by exercising tests, whether manual or automated. Like the memory hierarchy in current computer systems, testers can be thought of as the cache in front of production for each program increment. This does not mean that testers are the gatekeepers for every software delivery; rather, it means they can provide timely feedback to the programmers instead of opening or closing the gate to production. By addressing issues before the software gets into production, the programming team receives feedback in a way that is efficient for them to learn from.

  • Deliver Value to the Customer
  • Testers are the bridge between the programming team, with its technical terms, and the customers, with their business language. By incorporating a solid ubiquitous language into the test cases, these two worlds can be brought closer together, guaranteeing transparency for the customer. This higher-level goal can be achieved by working directly with the customer on test cases and likewise giving feedback to the team members who are developing the software the customer ordered. A test twisted up with complex logic is of no value to the customer; a test case that turns business terms green when it passes, on the other hand, is. Testers provide the service of filling the communication gap between technical software development and the business of the end user.

  • Enable Face-to-Face Communication
  • Alistair Cockburn points out that two people at a whiteboard communicate more efficiently than two people over e-mail or videotape, and he visualizes this principle in an easily understandable manner. When it comes to software testing, a tester should therefore motivate and enable direct forms of communication wherever possible. When two people talk face-to-face about a difficult requirement, they efficiently reach a common understanding of it. Testers should therefore enable direct communication with customers, developers, department heads, project management, …

    Lisa Crispin and Janet Gregory introduce the Power of Three in their book on Agile Testing. It states that any design decision or requirement discussion needs to involve one person from each of three groups: customers, developers and testers. Whenever a discussion starts, the decision needs to involve all affected groups. Direct face-to-face communication supports this by building efficient communication channels between the three.

  • Keep it Simple
  • Simplicity is key in Agile software development. Without the waste of typical software processes, teams maintain their ability to be agile. This directly means keeping your test cases as clear as possible. Simplicity also enables your team to understand your tests and adapt them if the business demands it. Just like the Collective Code Ownership practice in eXtreme Programming, testing teams need collective test case ownership. Only through simplicity do you enable not only your co-testers but also the business analysts, developers and customers to adapt test cases when the situation asks for it.

  • Practice Continuous Improvement
  • This principle is directly demanded by the craft aspect of software development. Just as with any new programming language or technique, testers should keep their skills up to date. By visiting conferences on test techniques or the latest test automation tools, a tester can improve her skills in these areas. It also means that learning a new programming language, or a test technique such as performance or load testing, is a way to improve one's abilities. Learning a common language to express testing patterns is another example. By sharing this pattern language with your teammates, you enable them to learn from you, and you improve your communication skills as well. These are just some parts of your daily work where you can practice improvement for yourself and your team; look out for additional possibilities.

  • Reduce Test Documentation Overhead
  • As Mary and Tom Poppendieck point out in Lean Software Development: An Agile Toolkit, wasteful tasks should be avoided in order to get the highest possible return on investment. This means reducing test plans, test design documents and test protocols to a minimum, using tool support and the ubiquitous language. It does not mean dropping all test documentation. Rather, your team and your customer should be brought into the discussion about the necessary level of documentation. If you work on safety-critical software, or your organisation demands it due to CMMI or ISO 9000 compliance, you will be forced to produce more extensive test documentation. Look for ways to reduce it to a minimum and avoid wasteful documentation that needs to be adapted to the current situation every week or even daily.
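The "test case that turns business terms green" idea from the Deliver Value principle above can be sketched in code. The billing rule and all names here are hypothetical, invented only for illustration:

```ruby
# Hypothetical production rule, used only for illustration.
def invoice_total(net, vat_rate)
  (net * (1 + vat_rate)).round(2)
end

# A test twisted up with loops and computed expectations is of no value
# to a customer. A check phrased in business terms, on the other hand,
# can be read and confirmed by the customer directly:
def invoice_check(net:, vat_rate:, expected_total:)
  invoice_total(net, vat_rate) == expected_total
end
```

Each call like `invoice_check(net: 100.0, vat_rate: 0.19, expected_total: 119.0)` reads like a row in a business-facing table, which is exactly the transparency the principle asks for.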

Testing symbiosis

Elisabeth Hendrickson raised a point on her blog about test automation. She makes a very good point about the time value of information: just as money today is worth more than money tomorrow, information known today is more valuable than knowing the same thing tomorrow. Agile practices such as failing fast and iterative development enable teams to know more when making a decision. This is of course also true for (A)TDD.

While reading through the comments on Elisabeth's blog entry, I noticed James Bach's thoughts and felt I had to respond immediately. One thing I have realised lately is that there seems to be an ongoing discussion between testers from the Agile school of testing and the context-driven school about whether or not to automate testing. While trying to answer Bret Pettichord's question of whether I have to pick a school of testing, I personally feel that both schools are right to some degree. This does not mean, of course, that they are wrong to some other degree. The point is that this is not a well-structured problem; you are faced with well-structured problems in elementary school. Software, and with it software testing, is an ill-structured problem of our adult world. There is no "test automation is always right" or "sapient testing is always right" out there.

Instead of thinking in just two categories, I encourage you to think of test automation and manual testing as a symbiosis. You can do a lot more manual testing if you have covered the tedious-to-repeat tests with automation. On the other hand, you can automate software testing more easily if the software is built to support automation: low coupling of classes, high cohesion, easy dependency injection and an entry point behind the GUI. You will then still have to test the GUI manually, but you won't need to exercise every test through the slow GUI with the complete database as a backend. There are ways to speed yourself up. If you're lucky, they are open to you; this seems to be the case on most Agile projects. If you're not lucky, you have to find places where scripting or automation can speed up your tedious manual test cases. But having both in place together is, from my point of view, the best choice you can make.
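What "easy dependency injection and an entry point behind the GUI" looks like can be shown in a small sketch. All names here are illustrative (loosely inspired by the rating and billing domain I work in), not from any real system:

```ruby
# The service takes its repository as a constructor argument
# (dependency injection), so automated tests can drive it below the
# GUI against an in-memory fake instead of the real database.
class TariffService
  def initialize(repository)
    @repository = repository
  end

  # Returns the customer's rate, or 0.0 when no tariff is on file.
  def rate_for(customer_id)
    tariff = @repository.find_tariff(customer_id)
    tariff ? tariff[:rate] : 0.0
  end
end

# In-memory fake used by the automated tests -- no database, no GUI.
class FakeTariffRepository
  def initialize(tariffs)
    @tariffs = tariffs
  end

  def find_tariff(customer_id)
    @tariffs[customer_id]
  end
end
```

In production the same `TariffService` would be constructed with a database-backed repository; only the tedious, repeatable checks run against the fake, leaving the GUI for manual, sapient testing.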

Black-Belt Testing

Matt Heusser gave me a black belt for my reply to his black-belt testing challenge, and we exchanged some thoughts via e-mail. If you would like to know what to do to get one too, you should read his blog entries on it: Black-Belt Testing Challenge. Maybe I will post my replies to his challenge some weeks from now, but since the challenge is ongoing I have decided to keep them private for the moment.

Matt’s profile was interesting to me:

I’m a software craftsman with an interest in testing, project management, development, how people learn and systems improvement.

I just realised that his statement almost fully describes the view I have of myself as well. The ideas we exchanged via e-mail were pretty interesting, and I'm now looking forward to meeting him in person.

Testing focus of Software Craftsmanship – Values

Some weeks ago I first became aware of Software Craftsmanship. Around the same time I had come across Bob Martin's fifth Agile value and Alistair Cockburn's New Software Engineering. Just today I stumbled over a topic Joshua Kerievsky opened on the XP mailing list, The Whole Enchilada, only to find another blog entry from Dave Rooney on the underlying thoughts. It seems to me that something is coming, something is in the air, and I would like to share my current picture of the whole from a testing perspective. Even this week a discussion started on the Agile Testing group, triggered by Lisa Crispin's blog entry on The Whole Team. I have decided to organise these thoughts into a series of postings over the next few weeks, starting this time with values from Agile methodologies. First of all, I haven't yet read every book on every topic around Agile Testing and Software Craftsmanship; there are technical books on my bookshelf as well as managerial books that I would like to get into. Additionally, I have not had the opportunity to see an Agile team in action, though my team did a really good job moving our whole test suite from a shell-script-based approach to a business-facing test automation tool during the last year. My company came up with a new project structure during that time, and I only lately noticed that, similar to Scrum, the new project structure lacks technical factors. The new organisation seems to focus only on managerial aspects of software development, without advice on how to achieve technical success.

That said, I read a lot about Agile methodologies during the last year. The idea of Agile values and principles still fascinates me, so much so that I would like to compile a list of factors to pay attention to during day-to-day work. Since Agile methodologies use the practices, principles and values scheme to describe their underlying concepts (during the last year I noticed a parallel to ShuHaRi), I would like to come up with a similar structure. Here are the values from eXtreme Programming and Scrum combined into a single list:

  • Communication
  • Human interaction relies on a large amount of communication. This item on the list relates particularly to the first value from the Agile Manifesto: individuals and interactions over processes and tools. Likewise, Alistair Cockburn introduced the concept of information radiators to combine communication and feedback on publicly visible whiteboards or flipchart pages. In the software development business, the right way to communicate can avoid a lot of effort wasted on assumed functionality. Communication therefore also serves the Lean Software Development principle of eliminating waste.

  • Simplicity
  • The case for simplicity first arose for me when my team was suffering from a legacy test approach with too many dependencies and a chained-test syndrome. Simply spoken: the tests violated the simplicity value. Changing one bit in one function forced changes to several tests on the other side of the test framework. Due to the high coupling caused by the absence of design rules and by no effort spent paying down technical debt, adapting test cases to the business needs was not simple. The high complexity of the test code base was the starting point of this lack of simplicity. By incorporating design patterns, refactoring and test-driven development, we were able to address it in our current approach. Additionally, one thing I learned from Tom DeMarco's The Deadline is that when you build complex interfaces into a software system, you make the interfaces between the humans involved equally complex. This directly results in a higher amount of communication needed to compensate, a fact Frederick Brooks noticed nearly forty years ago in The Mythical Man-Month.

  • Feedback
  • As described under communication, information radiators that spread information, for example about the current status of the build in Continuous Integration or about remaining story points on a burndown chart, make feedback visible. But there is more to feedback. Continuous Integration is a practice that provides continuous feedback from repetitive tasks such as unit tests, code inspections and the like. The retrospective at each iteration's end is another point at which to exercise continuous improvement of the whole team. When feedback is gathered quickly, the individuals involved can take action accordingly.

  • Courage
  • Every functional team has to address problems and come up with proposed solutions. To state underlying problems, you need the courage to bring up topics that might throw the project behind schedule. Staying quiet about these problems, on the other hand, may directly result in Technical Debt, which Dave Smith defines as follows:

    TechnicalDebt is a measure of how untidy or out-of-date the development work area for a product is.
    Definition of Technical Debt

    There are other areas where each team member needs courage. Here is an incomplete list to give you an idea:

  • when organizational habits are counter-productive to team or project goals
  • when working in a dysfunctional team
  • when the code complexity rises above a manageable threshold
  • Respect
  • Respecting each team member for their particular achievements and technical skills is, to my understanding, part of the definition of a functional team. In a respectful work environment, a tester is more likely to have the courage to raise flaws in code metrics or violations of coding conventions. When raising issues with others' work habits, it is then more likely that problems are solved constructively by sending a Congruent Message. When each team member respects the technical skills of every other team member, this holds even in problematic situations. In a respectful atmosphere, criticism does not take the form of accusation and therefore leads directly to constructive and creative solutions rather than defensive behaviour from the opposing parties.

  • Commitment
  • The whole team commits to the customer to deliver valuable quality at the end of each iteration. Without each team member's commitment to delivering value to the customer, the project's success is put at stake. Leaving out unit tests on critical functions may blow up the application once it is in production. Likewise, left-out acceptance tests may plant a time bomb that explodes during a customer presentation amid some exploratory tests. The team also commits to its organisation to produce the best product it can, so that the organisation can make money with it; the flip side of that coin is distrust from organisational management. Committing to the vision of the project and the goal of the iteration is a basic value for every team.

  • Focus
  • Focus is an essential value in testing. If you let yourself be distracted by follow-up testing, you may find yourself having exercised a bunch of test cases without following your initial mission of finding critical bugs fast. Sure, it is relevant to do some follow-up testing on bug fixes from the latest changes, but if you lose focus on your particular testing mission, you also fail to deliver the right value to your customer. Face-to-face communication and continuous improvement help you keep your focus on the right items, while a simple system supports your ability to focus on the harder-to-test cases of your software.

  • Openness
  • If you would like to provide your customer with the best business value possible, it may turn out that you need to be open to new ideas. Business demands change due to several factors: market conditions, cultural or organisational habits. As Kent Beck points out in eXtreme Programming eXplained:

    The problem isn’t change, because change is going to happen; the problem, rather, is our inability to cope with change.

    Without openness, a tester is not able to cope with the change that is going to happen, whether it stems from technical needs or from the business demanding the underlying change.

Testing Libraries vs. Frameworks

Lisa Crispin reported last week on the difference between test libraries and frameworks. When reading her blog entry, I felt the urge to comment on the approach we use at work for our two-part testing setup. In the end I figured that I had not quite understood where the difference between test libraries and frameworks lies. Here is the comment I made on the blog entry:

    Since I’m working in the specialized business of mobile phone rating and billing, we introduced our own framework for testing. During the last year we switched our legacy test cases, which were fragile and suffered from high maintenance costs, to a FIT-based approach. We came up with two parts for this.

    We have built a toolkit of reusable tools that are lightweight and will most probably be reused on the next project. These additionally need to be independent of particular customer terminology. From my point of view, this portion is a test library for our product.

    The other part is the incorporating one. We build fixtures that are customer-dependent, varying from project to project. We use the low-level toolkit components for this as far as possible and resort to subclassing or interface implementation where needed.

    Historically these grew from the customer-specific fixture part towards reusable components in the toolkit. So we start off with working software and peel out the details we are likely to reuse, once we have a working version.

    Together, both form a testing framework. I don't fully get your point about the difference between a library and a framework. Our low-level toolkit seems to be a test library; combined with customer terminology we get a framework, in which we simply need to write down the test cases in some tables.

Today I took the time to look up the definitions of a software library and a software framework on Wikipedia. After reading them and reflecting on my own words in the comment on Lisa's blog, I would now claim that we came up with a test library (our toolkit) which, combined with FitNesse and other third-party libraries (e.g. JDom, J2EE) and our customer-dependent fixture code (we call these our fixtures), gives us a testing framework for our software product. Any other thoughts on this?
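One way to phrase the difference from those definitions is inversion of control: your code calls a library, while a framework calls your code. A minimal sketch, with all names invented for illustration (this is not our actual toolkit):

```ruby
# A library: customer-independent, reusable tools that YOUR code calls.
module ToolKit
  # Strips formatting from a phone number and normalizes the
  # international prefix (illustrative, simplified rule).
  def self.normalize_msisdn(number)
    number.gsub(/\D/, "").sub(/\A00/, "+")
  end
end

# A framework: it owns the control flow and calls back into YOUR
# customer-specific code (here, a fixture object per test table).
class TableRunner
  def initialize(fixture)
    @fixture = fixture
  end

  def run(rows)
    rows.map { |row| @fixture.execute(row) }  # framework calls the fixture
  end
end

# Customer-dependent fixture, built on the library's low-level tools.
class CustomerFixture
  def execute(row)
    ToolKit.normalize_msisdn(row[:phone]) == row[:expected]
  end
end
```

The `ToolKit` alone is the library; once `TableRunner` drives `CustomerFixture` over tables of examples, the whole thing behaves as a framework, which seems to match the two-part toolkit-plus-fixtures setup described above.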